[bitnami/elasticsearch] No processor type exists with name [inference] #72437
Comments
Hi @alimoezzi, thank you for opening this issue. Can you describe how you are running the Elasticsearch stack? Are you using the Helm chart, or do you run the containers directly? Please give us some more details so we can try to reproduce the issue on our side.
I have tested both the containers and the Helm chart, and neither works.
To reproduce the issue, create a pipeline with an inference processor. This processor is also missing from the cluster info.
I think it is related to ML being disabled by default in both the container and the Helm chart.
Hi @alimoezzi, sorry for the delay in getting back to you. Can you provide us with more information on how to reproduce the issue and how to enable the ML features so we can test it on our side? According to the official docs linked below, you have to download and install your own ML model into the stack first, and then create the inference-type processor using the installed model. Did you find any error following those steps? I have found the "Inference" processor type under the "Stack Management" > "Ingest pipelines" > "Create pipeline" > "Add a processor" menu.
Exactly. At that stage, continue by clicking "Add processor" and then create the pipeline; you will get the same error.
This feature is not enabled by adding a model, but by enabling the feature flag.
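For reference, a minimal sketch of what enabling the flag would look like, assuming the setting in question is Elasticsearch's standard `xpack.ml.enabled` option (an assumption based on this thread; the exact knob used by the Bitnami initialization logic is discussed below):

```yaml
# elasticsearch.yml — assumed sketch: Elasticsearch's standard flag for
# enabling/disabling machine learning features, which the thread suggests
# is off by default in the Bitnami build
xpack.ml.enabled: true
```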
@alimoezzi thanks for the detailed information. I have found a related configuration in our initialization logic: `containers/bitnami/elasticsearch/7/debian-12/rootfs/opt/bitnami/scripts/libelasticsearch.sh`, line 753 (commit 2de4377).
According to it, we found some issues when enabling it by default in some scenarios, but you can customize the setting on your side via an initialization script: https://github.com/bitnami/containers/tree/main/bitnami/elasticsearch#initializing-a-new-instance
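A minimal sketch of such an initialization script, assuming the Bitnami custom-init mechanism described in the README linked above and assuming the standard `xpack.ml.enabled` flag is the relevant setting (the script name and config path are illustrative, not verified):

```shell
#!/bin/bash
# Hypothetical init script, e.g. mounted as enable-ml.sh per the Bitnami
# "Initializing a new instance" docs linked above.
# Appends the ML flag to elasticsearch.yml unless it is already present.
CONF=/opt/bitnami/elasticsearch/config/elasticsearch.yml
grep -q '^xpack\.ml\.enabled' "$CONF" || echo 'xpack.ml.enabled: true' >> "$CONF"
```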
The bigger problem is that the Bitnami build has a corrupted xpack-ml. Enable the option and you will see the instance throw an error.
@alimoezzi thanks for your message. I have created an internal task to further investigate this issue. We will keep you posted.
Name and Version
bitnami/elasticsearch:8.15.0
What architecture are you using?
amd64
What steps will reproduce the bug?
Although the inference processor is available in Kibana, creating an ingest pipeline with an inference processor results in the error from the issue title: `No processor type exists with name [inference]`.
The same processor works with no problem on docker.elastic.co/elasticsearch/elasticsearch:8.15.0.
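The reproduction can be sketched with a direct API call instead of the Kibana UI. The pipeline name and `model_id` below are placeholders; against the Bitnami image this request is reported to fail with the error above, while it succeeds against the official image:

```shell
# Create an ingest pipeline that uses the inference processor.
# "my-model" is a hypothetical model ID; any deployed model exercises
# the same code path.
curl -X PUT "localhost:9200/_ingest/pipeline/test-inference" \
  -H 'Content-Type: application/json' \
  -d '{
        "processors": [
          { "inference": { "model_id": "my-model" } }
        ]
      }'
```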
What is the expected behavior?
The Bitnami build should include this processor, as the official image contains it in both Kibana and Elasticsearch.
What do you see instead?
Kibana includes the option, but upon saving, the `No processor type exists with name [inference]` error is raised.
Additional information
Checked 8.15.1 and the issue still persists.