AI Memory: Exploring and Building LLM Memory Systems
.MP4, AVC, 1280x720, 30 fps | English, AAC, 2 Ch | 53m | 163 MB
Instructor: Morten Rand-Hendriksen
AI systems like ChatGPT appear to have memory, but language models can't actually learn anything new, so what's going on? In this course, we'll explore how systems built on language models mimic memory to create consistent conversations. Starting with the ChatGPT memory feature and ending with a custom-built app that manages its own memory, we'll cover everything from context windows and token limits to assistant thread management and user-centered AI memory design.
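The core trick the course examines can be stated in a few lines: the model itself retains nothing between calls, so the app stores the conversation and replays it with every request. Below is a minimal Python sketch of that pattern under stated assumptions: `call_model` is a hypothetical stand-in for any LLM completion API, not a real library call.

```python
# The app's "memory" lives entirely outside the model.
history = []

def call_model(messages):
    # Hypothetical placeholder: a real app would send `messages`
    # to an LLM API here and return its completion.
    return f"(model reply based on {len(messages)} messages)"

def chat(user_input):
    history.append({"role": "user", "content": user_input})
    # The model only "remembers" what we resend on each call.
    reply = call_model(history)
    history.append({"role": "assistant", "content": reply})
    return reply

chat("My name is Ada.")
print(chat("What's my name?"))  # works only because both turns were replayed
```

Because memory is just replayed text, the illusion holds only as long as the history keeps being sent, which is why context windows and token limits matter so much in the lessons that follow.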
Learning objectives
- Identify why language models cannot remember information on their own.
- Explain how AI chat apps mimic memory through custom data storage and retrieval.
- Produce API-powered apps with editable memory storage.
- Outline how token limits and context windows impact AI memory (see the sketch after this list).
- Discover how to build a user-centered chat memory with editable entries.
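On the token-limit objective: replayed history has to fit inside the model's context window, so older turns eventually must be dropped or summarized. Here is a hedged Python sketch of the simplest strategy, trimming the oldest messages to an assumed budget; the 4-characters-per-token estimate and the 3000-token budget are illustrative assumptions, not a real tokenizer or a real model limit.

```python
def estimate_tokens(text):
    # Rough heuristic (~4 characters per token); a real app would
    # use the model's actual tokenizer.
    return len(text) // 4

def trim_history(history, budget=3000):
    # Drop the oldest messages until the history fits the budget,
    # so the conversation "forgets" its earliest turns first.
    trimmed = list(history)
    while trimmed and sum(estimate_tokens(m["content"]) for m in trimmed) > budget:
        trimmed.pop(0)
    return trimmed
```

Smarter variants (summarizing dropped turns, pinning user-edited facts) build on this same idea and are the kind of memory management the course's custom app explores.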