Using Fabric with Openrouter.ai? #495
-
Is this possible? It would be amazing to instantly connect to the whole list of models like Llama, Mistral, and Phi. Thank you!
Replies: 4 comments
-
Hi, I've been using openrouter successfully via this method (if I'm remembering my process correctly). All that was necessary was to input the relevant information into the ~/.config/fabric/.env file.

Example ~/.config/fabric/.env:
DEFAULT_MODEL=meta-llama/llama-3-8b-instruct:free
OPENAI_BASE_URL=https://openrouter.ai/api/v1
OPENAI_API_KEY=sk-or-v1-yourOpenrouterKeyHere

A few notes:
- OPENAI_API_KEY takes an openrouter key you generate on the openrouter account keys page.
- The model name comes from openrouter's model list (e.g. meta-llama/llama-3-70b-instruct). This value can be used as I have it here in the .env as DEFAULT_MODEL, or you can pass it to fabric via the -m arg (fabric -m meta-llama/llama…).
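As a quick sanity check (my own sketch, not part of fabric), you can exercise the same base URL and key directly with curl against openrouter's OpenAI-compatible API before pointing fabric at it. The key below is the same placeholder as in the .env example; substitute your real one:

```shell
# Sanity-check the OpenRouter credentials outside of fabric.
# OPENAI_API_KEY is a placeholder -- substitute your real sk-or-v1-... key.
OPENAI_BASE_URL=https://openrouter.ai/api/v1
OPENAI_API_KEY=sk-or-v1-yourOpenrouterKeyHere

# List the models available to your key; the "id" fields in the JSON
# response are the identifiers you can use as DEFAULT_MODEL or with -m.
curl -s "$OPENAI_BASE_URL/models" \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```

If this returns a JSON model list rather than an auth error, the same values should work from fabric's .env.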
-
Thank you! That worked great.
-
After the rewrite in Go, I do not think this works anymore.
-
Hello, I added OpenRouter as a new vendor because I use it too ;)