A simple Python script to download multiple Ollama models sequentially. Set and forget. Helpful when your internet connection can't support multiple concurrent downloads, or when downloading concurrently causes the downloads to fail.
- Python 3.x
- Ollama installed and working on your system (Install Ollama)
- Download the `ollamaDownloader.py` script
- Ensure you have Ollama installed and working (`ollama --version` should work in your terminal)
Run this command in your terminal or PowerShell with a comma-separated list of model names:
python ollamaDownloader.py "model1,model2,model3"
Example:
python ollamaDownloader.py "phi3:14b-medium-128k-instruct-q5_0,qwen2.5-coder:14b-instruct-q4_0"
The script will:
- Download each model sequentially
- Show download progress in real-time
- Wait 5 seconds between downloads
- Report success or failure for each model
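For reference, the core logic looks roughly like the sketch below. This is an illustration only, not the exact contents of `ollamaDownloader.py`: it parses the comma-separated model list, runs `ollama pull` once per model so Ollama's own progress output streams to the terminal, waits five seconds between pulls, and prints a summary at the end.

```python
# Minimal sketch of a sequential Ollama downloader -- the real
# ollamaDownloader.py may differ in details.
import subprocess
import sys
import time


def main() -> None:
    if len(sys.argv) < 2:
        print('Usage: python ollamaDownloader.py "model1,model2,model3"')
        sys.exit(1)

    # Split the comma-separated argument into individual model names.
    models = [m.strip() for m in sys.argv[1].split(",") if m.strip()]
    results = {}

    for i, model in enumerate(models):
        print(f"Downloading {model} ...")
        # "ollama pull" prints its own real-time progress to the terminal.
        proc = subprocess.run(["ollama", "pull", model])
        results[model] = proc.returncode == 0

        # Pause between downloads (skipped after the last model).
        if i < len(models) - 1:
            time.sleep(5)

    print("\nSummary:")
    for model, ok in results.items():
        print(f"  {model}: {'success' if ok else 'FAILED'}")


if __name__ == "__main__":
    main()
```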
This script was generated by AI (Claude 3 Sonnet). I use it all the time and it works fine.
MIT