title | description | image |
---|---|---|
Quick Start | Getting started with LangStream on Mac | /images/chat-with-bot-web-ui.gif |
Getting started on Mac with LangStream is easy. You need to have Java, Homebrew and Docker installed. Then follow these steps:
```bash
brew install LangStream/langstream/langstream
```

```bash
export OPEN_AI_ACCESS_KEY=your-key-here
```

```bash
langstream docker run test \
  -app https://github.com/LangStream/langstream/blob/main/examples/applications/openai-completions \
  -s https://github.com/LangStream/langstream/blob/main/examples/secrets/secrets.yaml
```
This will download the `openai-completions` example from the LangStream repository and run it using the default secrets file, which expects the secrets to be provided as environment variables. The application provides a complete backend for a chatbot that uses OpenAI's GPT-3.5-turbo to generate responses. It includes a WebSocket gateway for easy access to the streaming AI agents from application environments, including web browsers.
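As an illustration of what the test UI and the CLI client do under the hood, a browser page could open WebSocket connections to those gateways directly. The sketch below is only a rough outline: the gateway port (8091), the `default` tenant, the `/v1/produce/...` and `/v1/consume/...` URL layout, the `param:sessionId` query parameter, and the JSON payload shape are assumptions about the default docker setup, so check your application's gateway configuration and the gateway API reference before relying on them.

```typescript
// Rough browser-side sketch (assumed endpoints; see the note above).
const base = "ws://localhost:8091/v1";   // assumed default gateway port
const app = "default/test";              // assumed tenant / application id
const sessionId = crypto.randomUUID();   // the example gateways are parameterized by sessionId

// One socket to send questions, one to receive the streamed answer chunks.
const producer = new WebSocket(
  `${base}/produce/${app}/produce-input?param:sessionId=${sessionId}`
);
const consumer = new WebSocket(
  `${base}/consume/${app}/consume-output?param:sessionId=${sessionId}`
);

consumer.onmessage = (event) => {
  // Assumed shape: each message wraps one record whose value is a chunk of the LLM response.
  const msg = JSON.parse(String(event.data));
  console.log(msg?.record?.value);
};

producer.onopen = () => {
  // Assumed shape: produced messages are JSON objects with a "value" field.
  producer.send(JSON.stringify({ value: "What is LangStream?" }));
};
```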
After LangStream initializes, it automatically opens the test UI in your browser. Click the Connect button to connect to the gateway.
Alternatively, you can use the text-based client to connect to the gateway. Run the following command in a different terminal to start the client:
```bash
langstream gateway chat test -cg consume-output -pg produce-input -p sessionId=$(uuidgen)
```
The gateway supports multiple concurrent users, each identified by a unique `sessionId`.
Type a question in the Message box and hit Enter. The response will be streamed in real time as it is generated by the LLM.
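If you connect from your own code instead of the CLI, the same session-per-user idea applies: generate a fresh `sessionId` for each user and pass it as a gateway parameter on both the produce and the consume connection, so each user only receives their own responses. A small sketch reusing the assumed URL scheme from the earlier example (the helper below is hypothetical, not part of LangStream):

```typescript
// Hypothetical helper: build per-user gateway URLs (same assumed scheme as above).
function gatewayUrls(sessionId: string) {
  const base = "ws://localhost:8091/v1";
  const app = "default/test"; // assumed tenant / application id
  return {
    produce: `${base}/produce/${app}/produce-input?param:sessionId=${sessionId}`,
    consume: `${base}/consume/${app}/consume-output?param:sessionId=${sessionId}`,
  };
}

// Two concurrent users, each with an isolated session.
const alice = gatewayUrls(crypto.randomUUID());
const bob = gatewayUrls(crypto.randomUUID());
```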