
Using Fabric with Openrouter.ai? #495

Closed · Answered by Gerkinfeltser
exponentiallyio asked this question in Q&A

Hi, I've been using OpenRouter with Fabric successfully this way (if I'm remembering my process correctly). All that was necessary was to put the relevant information into the ~/.config/fabric/.env file.

A few notes:

  • OPENAI_API_KEY takes an OpenRouter key, which you generate on the OpenRouter account keys page.
  • Correct model names can be grabbed from the OpenRouter model pages. For instance, on the llama-3-70b-instruct page, you can copy the name by clicking the little clipboard icon under the model title (which gives you meta-llama/llama-3-70b-instruct). This value can be set in the .env as DEFAULT_MODEL, or you can pass it to fabric via the -m argument (fabric -m meta-llama/llama-3-70b-instruct); see the sketch after this list.
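
For reference, here is a minimal sketch of what the ~/.config/fabric/.env entries described above might look like. The key value is a placeholder, and the exact set of variables Fabric expects can vary by version, so treat this as an assumption to check against your own install rather than a definitive config:

    # ~/.config/fabric/.env
    # OpenRouter key generated on the OpenRouter account keys page (placeholder value)
    OPENAI_API_KEY=sk-or-v1-xxxxxxxxxxxxxxxxxxxx
    # Model identifier copied from the OpenRouter model page
    DEFAULT_MODEL=meta-llama/llama-3-70b-instruct

To override the default model for a single run, pass it with -m. The -p/--pattern flag and the summarize pattern used here are assumptions about your Fabric setup:

    echo "Some text to condense." | fabric -m meta-llama/llama-3-70b-instruct -p summarize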
