AI Memory: Exploring and Building LLM Memory Systems

This is the repository for the LinkedIn Learning course AI Memory: Exploring and Building LLM Memory Systems. The full course is available from LinkedIn Learning.

Course Description

AI systems like ChatGPT appear to have memory, but language models can’t learn anything new, so what’s going on? In this course, we’ll explore how systems using language models mimic memory to create consistent conversations. Starting with the ChatGPT memory feature and ending with a custom-built app with memory management, we’ll explore everything from context windows and token limits to assistant thread management and user-centered AI memory design.
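To make the core idea concrete: because the model itself retains nothing between requests, apparent memory usually comes from the application resending the conversation so far with every call. The TypeScript sketch below illustrates that pattern in general terms; it is not code from the course, and the callModel helper is a hypothetical stand-in for whichever chat-completion API you use.

```ts
// Minimal sketch of how chat "memory" typically works: the model remembers
// nothing between calls, so the app resends the running conversation history
// with every request.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Hypothetical stand-in for a real chat-completion API call (OpenAI, Anthropic,
// etc.); replace with your provider's SDK.
async function callModel(messages: ChatMessage[]): Promise<string> {
  const lastUser = messages.filter((m) => m.role === "user").at(-1);
  return `You said: ${lastUser?.content ?? ""}`;
}

const history: ChatMessage[] = [
  { role: "system", content: "You are a helpful assistant." },
];

async function chat(userInput: string): Promise<string> {
  // The "memory" is nothing more than the accumulated history array.
  history.push({ role: "user", content: userInput });
  const reply = await callModel(history);
  history.push({ role: "assistant", content: reply });
  return reply;
}
```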

Instructions

This repository has two folders:

  • ./memory-demos/
  • ./better-chat/

Memory Demos (chapter 1)

./memory-demos/ contains a Next.js web app with four demonstrations of different aspects of LLM memory. The app is used throughout chapter 1 of the course and is provided for demonstration purposes only.

To use the Memory Demos app in GitHub Codespaces:

  1. Navigate to the ./memory-demos/ folder:
     cd memory-demos
  2. Run the app in development mode:
     npm run dev

To use the app in a local environment, follow the instructions in ./memory-demos/README.md.

Better Chat (chapter 2)

./better-chat/ contains a Next.js-based AI chat app. As you work through the videos in this chapter, you'll learn about each feature in the app and how to implement it.

To use the Better Chat app in GitHub Codespaces:

  1. Navigate to the ./better-chat/ folder:
     cd better-chat
  2. Run the app in development mode:
     npm run dev

To use the app in a local environment, follow the instructions in ./better-chat/README.md.
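One memory-management concern the course description mentions is fitting a growing conversation into the model's context window. The TypeScript sketch below shows one common approach, trimming the oldest turns to stay under a token budget; it is an illustration only, not the app's actual implementation, and the four-characters-per-token estimate and the 3,000-token budget are rough placeholder assumptions.

```ts
// Sketch of one common memory-management technique: trim old turns so the
// conversation fits a model's context window. The token estimate is a rough
// heuristic, not a real tokenizer.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

const estimateTokens = (text: string): number => Math.ceil(text.length / 4);

function trimHistory(messages: ChatMessage[], maxTokens = 3000): ChatMessage[] {
  const system = messages.filter((m) => m.role === "system");
  const turns = messages.filter((m) => m.role !== "system");

  // Keep the most recent turns, dropping the oldest until we fit the budget.
  const kept: ChatMessage[] = [];
  let used = system.reduce((sum, m) => sum + estimateTokens(m.content), 0);
  for (const msg of [...turns].reverse()) {
    const cost = estimateTokens(msg.content);
    if (used + cost > maxTokens) break;
    kept.unshift(msg);
    used += cost;
  }
  return [...system, ...kept];
}
```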

Instructor

Morten Rand-Hendriksen

Senior Staff Instructor, Speaker, Web Designer, and Software Developer

Check out my other courses on LinkedIn Learning.