This is a prototype of a backend code generator powered by an LLM.
It uses Ollama to run the LLM locally and provides a web interface for easy use.
- This is a prototype.
- The generated code may not work well.
- Install Ollama
- Pull the CodeLlama model: `ollama pull codellama`
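Under the hood, generating code against a local Ollama instance goes through Ollama's REST API (`POST http://localhost:11434/api/generate`). A minimal sketch of how a request might be built — the helper name and prompt wording here are illustrative assumptions, not this project's actual code:

```typescript
// Shape of Ollama's /api/generate request body (documented Ollama REST API).
interface GenerateRequest {
  model: string;
  prompt: string;
  stream: boolean;
}

// Hypothetical helper: wraps a user-supplied spec into a prompt for the model.
function buildRequest(model: string, spec: string): GenerateRequest {
  return {
    model,
    prompt: `Generate backend code for the following spec:\n${spec}`,
    stream: false, // return one complete response instead of a token stream
  };
}

// Usage (requires a running Ollama instance with the model pulled):
// const res = await fetch("http://localhost:11434/api/generate", {
//   method: "POST",
//   body: JSON.stringify(buildRequest("codellama", "a REST API for todos")),
// });
// const { response } = await res.json(); // generated code as text
```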
git clone https://github.com/cloudmatelabs/backend-generator-ai.git
cd backend-generator-ai
bun install
bun run dev
Then open http://localhost:5173 in your browser.
If you don't want to use Bun, you can use npm, yarn, or pnpm instead.
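For example, with npm the setup would look like this (assuming the project's standard `package.json` scripts):

```shell
# Same steps as above, using npm instead of Bun
npm install     # install dependencies
npm run dev     # start the dev server
```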
- Fill in the form and click the "Generate" button.
- Wait a moment and the generated code will appear.