Hello, I'm looking for an example, in C#, of how to use the SentencePieceTokenizer.
I've seen there is a Python example here: https://github.com/microsoft/onnxruntime-extensions/blob/main/docs/custom_ops.md#sentencepiecetokenizer, but I haven't managed to find a proper way to convert that code into a C# implementation (I'm not even sure that Python code works, to be honest).
Besides fetching the model, what does onnx.helper.make_node translate to in C#?
I tried with something like this:

string url = "https://github.com/microsoft/ort-customops/raw/main/test/data/test_sentencepiece_ops_model__6.txt";

// Download the base64-encoded file and decode it into raw bytes
byte[] model;
using (HttpClient client = new HttpClient())
{
    var response = await client.GetAsync(url);
    response.EnsureSuccessStatusCode();
    var content = await response.Content.ReadAsByteArrayAsync();
    model = Convert.FromBase64String(System.Text.Encoding.UTF8.GetString(content));
}

// Create the inputs
var inputs = new DenseTensor<string>(new[] { "Hello world", "Hello world louder" }, new[] { 2 });
var nbest_size = new DenseTensor<float>(new[] { 0.0f }, new[] { 1 });
var alpha = new DenseTensor<float>(new[] { 0.0f }, new[] { 1 });
var add_bos = new DenseTensor<bool>(new[] { false }, new[] { 1 });
var add_eos = new DenseTensor<bool>(new[] { false }, new[] { 1 });
var reverse = new DenseTensor<bool>(new[] { false }, new[] { 1 });

// Create the named inputs
var namedInputs = new NamedOnnxValue[]
{
    NamedOnnxValue.CreateFromTensor("inputs", inputs),
    NamedOnnxValue.CreateFromTensor("nbest_size", nbest_size),
    NamedOnnxValue.CreateFromTensor("alpha", alpha),
    NamedOnnxValue.CreateFromTensor("add_bos", add_bos),
    NamedOnnxValue.CreateFromTensor("add_eos", add_eos),
    NamedOnnxValue.CreateFromTensor("reverse", reverse)
};

using SessionOptions sessionOptions = new();
sessionOptions.RegisterOrtExtensions(); // RegisterCustomOpLibraryV2(extensionsDllName, out handle);
sessionOptions.AppendExecutionProvider_CPU();

// Load the model
using var session = new InferenceSession(model, sessionOptions);

// Run inference
using var results = session.Run(namedInputs);

// Extract the results
var tokens = results.First(r => r.Name == "tokens").AsTensor<int>();
var indices = results.First(r => r.Name == "indices").AsTensor<long>();
Console.WriteLine("Tokens: " + string.Join(", ", tokens.ToArray()));
Console.WriteLine("Indices: " + string.Join(", ", indices.ToArray()));
but it fails with Microsoft.ML.OnnxRuntime.OnnxRuntimeException: '[ErrorCode:InvalidArgument] No graph was found in the protobuf.'
How am I supposed to load the SentencepieceTokenizer in C#?
How am I supposed to create that node?
Thanks
P.S.
If you help me sort it out, I can do a PR with working C# code for some of the extensions.
MithrilMan changed the title from "poor c# documentation" to "poor c# documentation, need help on implementing some py examples" on Jun 6, 2024.
The Python code creates the ONNX node proto and then tests it by wrapping it in a graph proto and a model proto before finally passing that into ORT. You'd need to build the node proto in C#, write it out to an ONNX file, and then load that file. Alternatively, you can write the tokenizer model out from Python and use it from C#. ML.NET has examples of writing ONNX files from C#.
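To illustrate the second option, here is a minimal sketch of the C# side, assuming the tokenizer node has already been wrapped in a graph/model and saved from Python as an ONNX file. The file name sentencepiece_tokenizer.onnx is made up for the example, and the input/output names (inputs, nbest_size, alpha, add_bos, add_eos, reverse, tokens, indices) are simply reused from the attempt above; adjust both to match the model you actually export.

// Minimal sketch (not verified): run a SentencepieceTokenizer model exported from Python.
// Assumes "sentencepiece_tokenizer.onnx" exists on disk and that the ONNX Runtime
// Extensions NuGet package (which provides RegisterOrtExtensions) is referenced,
// as in the attempt above.
using System;
using System.Linq;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

using var sessionOptions = new SessionOptions();
sessionOptions.RegisterOrtExtensions();      // register the onnxruntime-extensions custom ops
sessionOptions.AppendExecutionProvider_CPU();

// Load the exported ONNX model from disk instead of downloading raw bytes
using var session = new InferenceSession("sentencepiece_tokenizer.onnx", sessionOptions);

// Build the six inputs the tokenizer node expects (names reused from the attempt above)
var namedInputs = new[]
{
    NamedOnnxValue.CreateFromTensor("inputs",
        new DenseTensor<string>(new[] { "Hello world" }, new[] { 1 })),
    NamedOnnxValue.CreateFromTensor("nbest_size",
        new DenseTensor<float>(new[] { 0.0f }, new[] { 1 })),
    NamedOnnxValue.CreateFromTensor("alpha",
        new DenseTensor<float>(new[] { 0.0f }, new[] { 1 })),
    NamedOnnxValue.CreateFromTensor("add_bos",
        new DenseTensor<bool>(new[] { false }, new[] { 1 })),
    NamedOnnxValue.CreateFromTensor("add_eos",
        new DenseTensor<bool>(new[] { false }, new[] { 1 })),
    NamedOnnxValue.CreateFromTensor("reverse",
        new DenseTensor<bool>(new[] { false }, new[] { 1 })),
};

// Run the model and read back the two outputs
using var results = session.Run(namedInputs);
var tokens = results.First(r => r.Name == "tokens").AsTensor<int>();
var indices = results.First(r => r.Name == "indices").AsTensor<long>();
Console.WriteLine("Tokens: " + string.Join(", ", tokens.ToArray()));
Console.WriteLine("Indices: " + string.Join(", ", indices.ToArray()));

The only real difference from the attempt above is where the model comes from. If I read the linked Python example correctly, the downloaded .txt file is the base64-encoded SentencePiece model that gets attached to the node as an attribute, not an ONNX model, which would explain the "No graph was found in the protobuf" error when it is passed straight to InferenceSession.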