New version 6.0.4
which updates Semantic Kernel to 1.0.0-beta8
#15
Conversation
…iew" />` to version `1.0.0-beta8`.
- Version of ENMARCHA changes from `6.0.3.20` to `6.0.4.0`.
- Added a CHANGELOG.md.
- Removed the `NotSemanticFunction` exception message, since `Semantic Kernel` no longer differentiates between Semantic and Native functions. The `ArgumentException` is also no longer thrown. This affects the following methods:
  * GetSemanticFunctionPromptAsync
  * GetSemanticFunctionUsedTokensAsync
- Extension method `ImportQuestionAnsweringPlugin` from `Encamina.Enmarcha.SemanticKernel.Plugins.QuestionAnswering` now expects an `ISemanticTextMemory` as part of the required services.
- Extension method `ImportMemoryPlugin` from `Encamina.Enmarcha.SemanticKernel.Plugins.Memory` now expects a valid instance of `ISemanticTextMemory` as an input parameter. This breaks the previous signature of the method.
- The constructor of the `MemoryQueryPlugin` class now expects a valid instance of `ISemanticTextMemory` instead of `IKernel`. This breaks the previous signature of the class constructor.
- Due to changes from Semantic Kernel, the class `ChatRequestSettings` is replaced by `OpenAIRequestSettings`, which breaks the previous signature of the `ChatRequestSettings` property of the `ChatWithHistoryPluginOptions` class in `Encamina.Enmarcha.SemanticKernel.Plugins.Chat`.
- The `LengthByTokenCount` function from the `ILengthFunctions` mixin interface in `Encamina.Enmarcha.SemanticKernel.Abstractions` now uses `SharpToken` instead of the `GPT3Tokenizer` class, which has been removed from Semantic Kernel.
- Added a new length-by-tokens function called `LengthByTokenCountUsingEncoding` in the `ILengthFunctions` mixin interface.
- Removed extension method `ValidateAndThrowIfErrorOccurred`, since the properties `ErrorOccurred` and `LastException` are removed from `SKContext` in the new version of Semantic Kernel. This is a breaking change.
- Some boy scouting.
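The `MemoryQueryPlugin` constructor change above can be illustrated with a minimal sketch. This is not the repository's C# code; it is a hypothetical Python analogue, with `SemanticTextMemory` and `MemoryQueryPlugin` as stand-in names, showing how a plugin that previously received the whole kernel now receives only the memory abstraction it actually uses:

```python
class SemanticTextMemory:
    """Stand-in for Semantic Kernel's ISemanticTextMemory."""

    def __init__(self):
        self._store = {}

    def save(self, collection, key, text):
        self._store.setdefault(collection, {})[key] = text

    def search(self, collection, query):
        # Naive substring "search"; real implementations use embeddings.
        return [t for t in self._store.get(collection, {}).values() if query in t]


class MemoryQueryPlugin:
    # Before 6.0.4 (conceptually): __init__(self, kernel) took the whole IKernel.
    # After 6.0.4: only the narrower memory dependency is required.
    def __init__(self, memory: SemanticTextMemory):
        self._memory = memory

    def query(self, collection, query):
        return self._memory.search(collection, query)
```

Depending only on the memory keeps the plugin decoupled from the kernel and easier to construct in tests, which is the general shape of this breaking change.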
…g NuGet packages of these projects.
…SemanticKernel.Connectors.Memory` to add a semantic text memory (i.e., `ISemanticTextMemory`) to the dependency container.
…from `Encamina.Enmarcha.SemanticKernel` to `Encamina.Enmarcha.SemanticKernel.Abstractions`.
src/Encamina.Enmarcha.SemanticKernel.Connectors.Document/Connectors/TxtDocumentConnector.cs
...ncamina.Enmarcha.SemanticKernel.Connectors.Memory/Extensions/IServiceCollectionExtensions.cs
src/Encamina.Enmarcha.SemanticKernel/Extensions/IKernelExtensions.cs
…rcha.SemanticKernel.Connectors.Document`.
...s/SemanticKernel/Encamina.Enmarcha.Samples.SemanticKernel.QuestionAnswering/appsettings.json
Before:
/// </summary>
public static Func<string, int> LengthByTokenCount => (text) => string.IsNullOrEmpty(text) ? 0 : GPT3Tokenizer.Encode(text).Count;

After:
/// <seealso href="https://platform.openai.com/tokenizer"/>
public static Func<string, int> LengthByTokenCount => (text) => string.IsNullOrEmpty(text) ? 0 : GptEncoding.GetEncoding("cl100k_base").Encode(text).Count;
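This diff swaps the removed `GPT3Tokenizer` for SharpToken's `cl100k_base` encoding. As a rough cross-language sketch (not the repository's code), the same length-by-token-count pattern looks like this in Python, using a trivial whitespace split as a stand-in for a real BPE encoder:

```python
def encode(text):
    # Stand-in tokenizer; a real implementation would apply BPE merges,
    # e.g. SharpToken's cl100k_base in C# or tiktoken in Python.
    return text.split()


def length_by_token_count(text):
    # Mirrors `string.IsNullOrEmpty(text) ? 0 : ...Encode(text).Count`.
    if not text:
        return 0
    return len(encode(text))
```

With a real encoder the counts differ from a whitespace split, but the shape is the same: guard against empty input, encode, return the token count.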
I'll leave you a URL that can help you understand the token model. https://github.com/openai/openai-cookbook/blob/main/examples/How_to_count_tokens_with_tiktoken.ipynb
Good idea. I added it as another `seealso`.
Done!
…ernel.QuestionAnswering` sample.
…ons` on `Encamina.Enmarcha.SemanticKernel.Abstractions`.