AI Swiss Legal Assistant 🇨🇭 👩‍⚖️ 🤖

This is a simple conversational-UI RAG (retrieval-augmented generation) chatbot based on the Swiss Code of Obligations.
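The RAG flow behind such a chatbot can be sketched in a few lines: embed the user's question, retrieve the most similar stored articles, and build an augmented prompt for the LLM. A minimal, self-contained TypeScript sketch with toy vectors (the real app uses an embedding model and Qdrant; the article texts and vectors below are made up for illustration):

```typescript
// Toy RAG retrieval: cosine similarity over pre-computed vectors.
type Doc = { text: string; vector: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Retrieve the k documents most similar to the query vector.
function retrieve(query: number[], docs: Doc[], k: number): Doc[] {
  return [...docs]
    .sort((x, y) => cosine(query, y.vector) - cosine(query, x.vector))
    .slice(0, k);
}

// Build the augmented prompt handed to the LLM.
function buildPrompt(question: string, context: Doc[]): string {
  const articles = context.map((d) => d.text).join("\n");
  return `Answer using only these articles:\n${articles}\n\nQuestion: ${question}`;
}

// Made-up example data; real vectors come from an embedding model.
const docs: Doc[] = [
  { text: "Art. 319: employment contract basics", vector: [1, 0, 0] },
  { text: "Art. 184: sales contract basics", vector: [0, 1, 0] },
];
const top = retrieve([0.9, 0.1, 0], docs, 1);
console.log(buildPrompt("What is an employment contract?", top));
```

In the actual app, the vector search is delegated to the Qdrant collection set up below.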

It was created as the starting point for the Ginetta Challenge at the women++ Hack'n'Lead hackathon in November 2023.

🙋‍♀️ Team

Ana R Correia
Karin Sim
Sanaz Reinhardt
Sirinya Richardson
Yaiza Aragonés-Soria

👩‍💻 Contributions

We improved the initial chatbot by introducing the following functionalities:

  1. The user is asked three onboarding questions (language, familiarity with the law, and location) so that the chatbot's answers are tailored to their profile.
  2. The user can interact with the chatbot in English, German, or French.
  3. A redesigned user interface.
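As an illustration of point 1, the onboarding answers can be folded into the system prompt that steers the model. A hypothetical sketch (the `Profile` type, field names, and wording are illustrative, not the app's actual code):

```typescript
// Hypothetical sketch: turn the three onboarding answers into a system
// prompt. Names and phrasing are illustrative, not the app's code.
type Profile = {
  language: "English" | "German" | "French";
  lawFamiliarity: "none" | "some" | "expert";
  location: string;
};

function systemPrompt(p: Profile): string {
  // Adjust the tone to the user's familiarity with the law.
  const tone =
    p.lawFamiliarity === "none"
      ? "Explain in plain, non-technical terms."
      : p.lawFamiliarity === "some"
        ? "Explain clearly and cite article numbers."
        : "Answer concisely with precise legal references.";
  return [
    `Answer in ${p.language}.`,
    tone,
    `The user is located in ${p.location}.`,
  ].join(" ");
}

console.log(
  systemPrompt({ language: "German", lawFamiliarity: "none", location: "Zurich" })
);
```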
Demo video: Emilie.2.mp4

Screenshots: Home page · Onboarding - Step 1 · Onboarding - Step 2 · Onboarding - Step 3 · Chat - Question · Chat - Full Response

ℹ️ Instructions

  1. Use this repository as a template (or Fork it)
  2. Add your team members as contributors
  3. Put your presentation in the docs/ folder
  4. This repository must be open source (and licensed) in order to submit
  5. Add the tag hack-n-lead to the repo description

▶️ Setup

There are two ways to set up this project:

  1. Use the Docker Compose file to run Ollama & Qdrant in containers (just run it in a terminal in the project directory). This is the easier setup, but Ollama will run on the CPU.
  2. Install Ollama & Qdrant locally (the Ollama desktop app is currently only available for Mac and Linux). Ollama will take advantage of your GPU to run the model.

Option 1: 🐳 Run Docker Compose

  1. `docker compose up -d` to pull & run the containers
  2. `docker compose exec ollama ollama run mistral` to download & install the Mistral model

Option 2: 🖐🏼 Manual installation

  1. 🦙 Download Ollama and install it locally
  2. `ollama run mistral` to download and install the model locally (requires a 4.1 GB download and 8 GB of RAM)
  3. Open http://localhost:11434 to check that Ollama is running
  4. `docker pull qdrant/qdrant` to pull the Qdrant image
  5. `docker run -p 6333:6333 qdrant/qdrant` to start Qdrant

Both options continue with the following setup:

💾 Setup Qdrant Vector Database

  1. Open the Qdrant dashboard console http://localhost:6333/dashboard#/console
  2. Create a new collection running this:
    PUT collections/swiss-or
    {
      "vectors": {
        "size": 384,
        "distance": "Cosine"
      }
    }
    
  3. Download the snapshot file
  4. Unzip the file from the terminal (⚠️ not with Finder on Mac ⚠️) with `unzip <file_name>`
  5. Upload the snapshot using the following command. Adapt the fields as needed and run it from the directory where your snapshot lies:
curl -X POST 'http://localhost:6333/collections/swiss-or/snapshots/upload' \
    -H 'Content-Type:multipart/form-data' \
    -F 'snapshot=@swiss-code-of-obligations-articles-gte-small-2023-10-18-12-13-25.snapshot'
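For reference, the two REST calls above can also be assembled programmatically. A small TypeScript sketch that only builds plain request descriptors (a hypothetical shape; actually sending them would use `fetch` or `curl` as above):

```typescript
// Sketch: assemble the Qdrant REST requests from the setup steps above.
// Collection name and snapshot filename match this README; the objects
// are plain descriptors, nothing is sent here.
const QDRANT_URL = "http://localhost:6333";

// Step 2: collection config (vector size 384, cosine distance).
const collectionConfig = {
  vectors: { size: 384, distance: "Cosine" },
};

const createCollectionRequest = {
  method: "PUT",
  url: `${QDRANT_URL}/collections/swiss-or`,
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(collectionConfig),
};

// Step 5: snapshot upload target. The body is multipart form data,
// which is why the curl command above uses -F.
const uploadSnapshotRequest = {
  method: "POST",
  url: `${QDRANT_URL}/collections/swiss-or/snapshots/upload`,
  snapshotFile:
    "swiss-code-of-obligations-articles-gte-small-2023-10-18-12-13-25.snapshot",
};

console.log(createCollectionRequest.body);
console.log(uploadSnapshotRequest.url);
```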

👩🏽‍💻 Run the App

  1. Copy the file `.env.local.example` in the project and rename it to `.env`. Verify that all environment variables are correct
  2. `yarn install` to install the required dependencies
  3. `yarn dev` to launch the development server
  4. Go to http://localhost:3000 and try out the app

👩🏽‍🏫 Learn More

To learn more about LangChain, OpenAI, Next.js, and the Vercel AI SDK, take a look at the following resources: