
InstructLab Fine-tuning Guide

A quick setup guide for fine-tuning models with InstructLab.

Prerequisites

  • micromamba or conda
  • CUDA-capable GPU (recommended)
  • At least 16GB RAM
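These prerequisites can be sanity-checked from a shell before creating the environment. This is an illustrative sketch: the RAM check assumes a Linux `/proc/meminfo`, and the GPU check simply looks for the NVIDIA driver tools.

```shell
# Check that micromamba or conda is on the PATH
command -v micromamba || command -v conda || echo "install micromamba or conda first"

# Check for an NVIDIA GPU (prints a note instead of failing when absent)
if command -v nvidia-smi >/dev/null; then
  nvidia-smi --query-gpu=name --format=csv,noheader
else
  echo "no CUDA-capable GPU detected; fine-tuning on CPU will be slow"
fi

# Report total RAM against the 16 GB recommendation (Linux only)
if [ -r /proc/meminfo ]; then
  awk '/MemTotal/ {printf "RAM: %.1f GB\n", $2/1024/1024}' /proc/meminfo
fi
```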

Environment Setup

  1. Create and activate the environment (with micromamba or conda):

micromamba create -f environment_cuda.yaml
micromamba activate instructlab

# Verify installation
which ilab

  2. Initialize the InstructLab configuration:

ilab config init

Configuration Tips

When running ilab config init:

  • Set the taxonomy path to ./instructlab/taxonomy within your project directory
  • Choose the appropriate system profile based on your hardware (NVIDIA, AMD, etc.)
  • Specify the path to your model file
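After `ilab config init` completes, it helps to confirm the generated settings match the tips above. The config location varies by InstructLab version (working directory vs. `~/.config/instructlab/`), so the paths below are assumptions to adjust:

```shell
# Locate the generated config.yaml (path differs between InstructLab versions)
CONFIG=./config.yaml
[ -f "$CONFIG" ] || CONFIG="$HOME/.config/instructlab/config.yaml"

# Confirm the taxonomy path and model path were recorded as intended
grep -n -E "taxonomy|model" "$CONFIG" 2>/dev/null || echo "config not found at $CONFIG"
```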

Required Models

You'll need two models for fine-tuning:

  1. Base Model (to be fine-tuned):

  2. Training Assistant Model:
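If you do not already have model files on disk, recent InstructLab releases can fetch their defaults with `ilab model download`; which repositories it pulls depends on your version and config, so treat this as a sketch and verify with `ilab model download --help`:

```shell
# Download the default models into the configured models directory
# (guarded so the snippet degrades gracefully outside the instructlab env)
if command -v ilab >/dev/null; then
  ilab model download
else
  echo "ilab not found; activate the instructlab environment first"
fi
```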

Usage

  1. Test model serving:

ilab model serve --model-path ./models/granite-7b-lab.Q4_K_M.gguf

  2. Validate taxonomy data:

ilab taxonomy diff
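With the server from step 1 running, responses can be sanity-checked interactively. `ilab model chat` is the standard companion command; the `--model` value mirrors the serve example above (verify flag names with `ilab model chat --help`):

```shell
# In a second terminal, chat with the model being served
# (guarded so the snippet degrades gracefully outside the instructlab env)
if command -v ilab >/dev/null; then
  ilab model chat --model ./models/granite-7b-lab.Q4_K_M.gguf
else
  echo "ilab not found; activate the instructlab environment first"
fi
```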

Web Interface

For a graphical interface, follow the Playground Chat setup instructions in the InstructLab UI repository.

To set it up locally:

  1. Install a JavaScript package manager (pnpm, npm, or deno)

  2. Clone and run the UI:

git clone https://github.com/instructlab/ui
cd ui
npm install
npm run dev
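Once `npm run dev` is running, the dev server typically listens on port 3000 (a framework default; the terminal output shows the exact URL, so treat the port as an assumption). A quick reachability check:

```shell
# Probe the local UI dev server (port 3000 is an assumption; check npm's output)
if command -v curl >/dev/null; then
  curl -s -o /dev/null http://localhost:3000 && echo "UI is up" || echo "UI not reachable yet"
else
  echo "curl not installed; open http://localhost:3000 in a browser instead"
fi
```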

Additional Resources
