This is the repository for the LinkedIn Learning course _AI Memory: Exploring and Building LLM Memory Systems_. The full course is available from LinkedIn Learning.
AI systems like ChatGPT appear to have memory, but language models can’t learn anything new, so what’s going on? In this course, we’ll explore how systems using language models mimic memory to create consistent conversations. Starting with the ChatGPT memory feature and ending with a custom-built app with memory management, we’ll explore everything from context windows and token limits to assistant thread management and user-centered AI memory design.
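To make that core idea concrete before you open the demos: a chat app typically mimics memory by storing the conversation itself and sending the whole transcript back to the model on every turn. The sketch below illustrates that pattern using the official OpenAI Node SDK; the model name, message shape, and SDK choice are illustrative assumptions, not necessarily how the course apps are built.

```typescript
// Minimal sketch: the language model is stateless, so the app keeps the
// conversation history and resends all of it with every request.
// Assumes the official `openai` Node SDK and an OPENAI_API_KEY in the
// environment (both are assumptions for illustration).
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

// The "memory" lives entirely in this array, not in the model.
const messages: OpenAI.Chat.Completions.ChatCompletionMessageParam[] = [
  { role: "system", content: "You are a helpful assistant." },
];

async function chat(userInput: string): Promise<string> {
  messages.push({ role: "user", content: userInput });

  // Every request carries the full history; the model learns nothing
  // between calls, it only sees what we send it each time.
  const completion = await client.chat.completions.create({
    model: "gpt-4o-mini", // model name is an assumption for illustration
    messages,
  });

  const reply = completion.choices[0].message.content ?? "";
  messages.push({ role: "assistant", content: reply });
  return reply;
}
```

Because the full history is resent each turn, the context window and token limits covered in the course become the practical constraints on how much "memory" a conversation can hold.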
This repository has two folders:

- `./memory-demos/`
- `./better-chat/`

`./memory-demos/` contains a Next.js web app with four demonstrations of different aspects of LLM memory. The app is provided for demonstration purposes and is used in the course.
To use the Memory Demos app in GitHub Codespaces:
- Navigate to the `./memory-demos/` folder: `cd memory-demos`
- Run the app in development mode: `npm run dev`
To use the app in a local environment, follow the instructions in `./memory-demos/README.md`.
`./better-chat/` contains a Next.js-based AI chat app. As you follow along with the videos in the chapter, you'll learn about each feature in the app and how to implement it.
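One memory-management idea the chapter works toward is keeping the conversation inside the model's context window. As a rough illustration (not the app's actual implementation), the sketch below drops the oldest turns when an estimated token count exceeds a budget; the `trimToBudget` helper, the budget, and the 4-characters-per-token estimate are assumptions for illustration.

```typescript
// Illustrative sketch of trimming conversation history to a token budget.
// The heuristics here are assumptions, not the Better Chat app's code.
type Message = { role: "system" | "user" | "assistant"; content: string };

// Rough token estimate: roughly 4 characters per token for English text.
const estimateTokens = (text: string): number => Math.ceil(text.length / 4);

function trimToBudget(messages: Message[], maxTokens: number): Message[] {
  // Assumes messages[0] is the system prompt, which is always kept.
  const [system, ...rest] = messages;
  const trimmed = [...rest];

  const total = (msgs: Message[]) =>
    msgs.reduce((sum, m) => sum + estimateTokens(m.content), 0);

  // Drop the oldest non-system turns until the estimate fits the budget.
  while (trimmed.length > 0 && total([system, ...trimmed]) > maxTokens) {
    trimmed.shift();
  }
  return [system, ...trimmed];
}
```

Trimming is only one possible strategy; the chapter videos walk through the approach the app actually takes.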
To use the Better Chat app in GitHub Codespaces:
- Navigate to the `./better-chat/` folder: `cd better-chat`
- Run the app in development mode: `npm run dev`
To use the app in a local environment, follow the instructions in `./better-chat/README.md`.
Morten Rand-Hendriksen
Senior Staff Instructor, Speaker, Web Designer, and Software Developer
Check out my other courses on LinkedIn Learning.