-
Questions:
1 - How to force the local-ai community container to use an Nvidia GPU-enabled tag (e.g. v2.9.0-cublas-cuda12-ffmpeg)?
2 - How to configure Nextcloud AIO so that the Nvidia GPU can be used by a container based on that image, as described in https://github.com/mudler/LocalAI/blob/master/docs/content/docs/features/GPU-acceleration.md?

Note: the v2.9.0-cublas-cuda12-ffmpeg tag is more than 16GB in size.
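For reference, a prerequisite for either question is that the Nvidia container toolkit works on the host at all. A minimal sketch of checking that with the Docker Python SDK (not part of AIO; the CUDA base image tag is an assumption, any CUDA image with nvidia-smi would do):

```python
# Minimal sketch (not part of AIO): check that containers on this host can
# see the Nvidia GPU, i.e. the equivalent of
# `docker run --rm --gpus all nvidia/cuda:... nvidia-smi`.
import docker

client = docker.from_env()

output = client.containers.run(
    "nvidia/cuda:12.2.0-base-ubuntu22.04",  # assumed tag
    command="nvidia-smi",
    remove=True,
    device_requests=[
        # count=-1 with the "gpu" capability requests all GPUs,
        # the same as `--gpus all` on the docker CLI.
        docker.types.DeviceRequest(count=-1, capabilities=[["gpu"]])
    ],
)
print(output.decode())
```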
-
Hi, in order to make this work, a capability would first need to be added to AIO for attaching all GPUs to a container. See #1659 as inspiration. Once that is done, one (you?) could add an additional community container, named e.g. local-ai-gpu or similar, that is itself built from the GPU-enabled LocalAI image.
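To make concrete what "attaching all GPUs to a container" would mean at the Docker API level, here is a minimal sketch using the Docker Python SDK; this is not how AIO itself creates containers, and the image name is an assumption based on the tag mentioned in the question:

```python
# Minimal sketch, not AIO's actual implementation: start a GPU-enabled
# LocalAI container with all Nvidia GPUs attached, i.e. the API-level
# equivalent of `docker run --gpus all`.
import docker

client = docker.from_env()

container = client.containers.run(
    "quay.io/go-skynet/local-ai:v2.9.0-cublas-cuda12-ffmpeg",  # assumed image name
    detach=True,
    ports={"8080/tcp": 8080},  # LocalAI's default API port
    device_requests=[
        # Request every available GPU, which is what `--gpus all` does.
        docker.types.DeviceRequest(count=-1, capabilities=[["gpu"]])
    ],
)
print(container.id)
```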
-
Nvidia GPU passthrough is now being worked on in #5132