
use OLLAMA_HOST when connecting? #150

Open
mhdawson opened this issue Oct 15, 2024 · 3 comments

@mhdawson
Contributor

I'm wondering whether it's been discussed if supporting OLLAMA_HOST on the client side makes sense.

There have been a number of instances where I've had to figure out how to pass the IP of my remote system running ollama down through libraries that use ollama-js under the covers. For example: i-am-bee/bee-agent-framework#83

In these cases it would be easier if we could specify OLLAMA_HOST (or something used only by the client, like OLLAMA_SERVER) without having to figure out how to get the host down to the client through each library's APIs.

If that makes sense, I'd be happy to investigate and open a PR, but I want to make sure it's something that would be acceptable before investing the time to do that.
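
Roughly, the fallback could look something like this (a minimal sketch only; createClient is just an illustrative wrapper and the default URL is an assumption, not necessarily how it would be implemented in the library):

import { Ollama } from 'ollama'

// Hypothetical wrapper: fall back to OLLAMA_HOST when no host is passed explicitly.
function createClient(host?: string): Ollama {
  return new Ollama({
    host: host ?? process.env.OLLAMA_HOST ?? 'http://127.0.0.1:11434',
  })
}

// Libraries built on ollama-js could then pick up OLLAMA_HOST automatically.
const ollama = createClient()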

@mhdawson
Contributor Author

I'll add that even on the same host I ran into an issue: I had set OLLAMA_HOST to an address that did not include localhost (i.e. not 0.0.0.0), and then had to pass the host in explicitly even though I was running on the same machine. In that case, having the client support OLLAMA_HOST would have made things more straightforward.

ParthSareen self-assigned this Nov 7, 2024
@nise

nise commented Dec 11, 2024

Where do you specify the host on the client side? I am trying to use an OpenWebUI instance on a remote server, which seems to be impossible.

@BruceMacD
Collaborator

@nise Ollama uses an OLLAMA_HOST environment variable. Depending on your scenario, try something like this:

import { Ollama } from 'ollama'

// Read the host from the OLLAMA_HOST environment variable, falling back to the local default.
const ollama = new Ollama({
  host: process.env.OLLAMA_HOST || 'http://127.0.0.1:11434'
})

const response = await ollama.chat({
  model: 'llama3.1',
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
})
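
With that pattern the host can be supplied when launching the process, for example OLLAMA_HOST=http://192.168.1.10:11434 node app.js (the address here is only a placeholder).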

Or in open-webui:
https://docs.openwebui.com/getting-started/advanced-topics/env-configuration#ollama
