[Bug]: Global (and local) encoding_model option is being ignored #1499
Labels
awaiting_response, bug, triage
Do you need to file an issue?
Describe the bug
The global encoding_model option is ignored when constructing the LLMParameters for the root llm and for all sub llm options. In fact, the option is ignored even when it is set explicitly in each section, which leads to the wrong tokenizer being used during indexing.

Possible fix: the create_graphrag_config function needs to apply the encoding_model precedence (local > global > default) when initializing the LLMParameters models.
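A rough sketch of the precedence being requested (local > global > default). The function and field names below are illustrative only, not graphrag's actual internals:

```python
# Hypothetical sketch of the requested precedence, not graphrag's actual code.
from __future__ import annotations
from dataclasses import dataclass

DEFAULT_ENCODING_MODEL = "cl100k_base"


@dataclass
class LLMParameters:
    model: str = "gpt-4o-mini"  # illustrative default
    encoding_model: str = DEFAULT_ENCODING_MODEL


def resolve_encoding_model(
    local: str | None,
    global_: str | None,
    default: str = DEFAULT_ENCODING_MODEL,
) -> str:
    """Apply the local > global > default precedence the issue asks for."""
    return local or global_ or default


def build_llm_parameters(section_cfg: dict, global_cfg: dict) -> LLMParameters:
    # Today the global value is dropped; the fix would thread it through here.
    return LLMParameters(
        model=section_cfg.get("model", "gpt-4o-mini"),
        encoding_model=resolve_encoding_model(
            section_cfg.get("encoding_model"),
            global_cfg.get("encoding_model"),
        ),
    )
```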
Steps to reproduce
1. Init a new project via the latest (0.9.0) graphrag version.
2. Edit the global encoding_model option in the settings.yaml.
3. Run the index command and look at the config dump in the console.

All encoding_model options (except the global one) will still show the default "cl100k_base" entry. (A small inspection sketch follows.)
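One way to check the resolved values without running a full index pass. The exact create_graphrag_config signature and the attribute paths printed here are assumptions based on the 0.9.0 layout and may need adjusting:

```python
# Quick check of the resolved encoding_model values.
# NOTE: the create_graphrag_config call and attribute paths are assumptions.
import yaml

from graphrag.config import create_graphrag_config

with open("settings.yaml", "r", encoding="utf-8") as f:
    values = yaml.safe_load(f)

config = create_graphrag_config(values, root_dir=".")

print("global:               ", config.encoding_model)
print("root llm:             ", config.llm.encoding_model)
print("entity extraction llm:", config.entity_extraction.llm.encoding_model)
```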
Expected Behavior
The global option should override the default one if set.
GraphRAG Config Used
Logs and screenshots
No response
Additional Information