Becoming an AI Engineer with LangChain
Published 1/2025
MP4 | Video: h264, 1920x1080 | Audio: AAC, 44.1 KHz
Language: English | Size: 615.94 MB | Duration: 1h 41m
Develop your Generative AI Application with LangChain
What you'll learn
Learn to use LangChain to develop generative AI applications
Learn to use LangChain and its platforms to develop RAG applications
Learn to use LangChain and LangGraph to develop LLM agents
Learn the fundamentals of LLM application development and prompting techniques
Requirements
Have a basic understanding of Python programming
Have an OpenAI API account and API key (a brief setup sketch follows this list)
Have an Anthropic API account and API key
A free or premium LangSmith account
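As a quick orientation (this snippet is illustrative only and not taken from the course materials), one common way to make the required API keys available to LangChain is through environment variables; the placeholder key values below are assumptions you must replace with your own.

    # Minimal setup sketch: LangChain's OpenAI and Anthropic integrations
    # read their credentials from these environment variables.
    import os

    os.environ["OPENAI_API_KEY"] = "sk-..."         # replace with your OpenAI key
    os.environ["ANTHROPIC_API_KEY"] = "sk-ant-..."  # replace with your Anthropic key

    # With the keys set, the chat model classes pick them up automatically, e.g.:
    # from langchain_openai import ChatOpenAI
    # llm = ChatOpenAI(model="gpt-4o-mini")

In practice you would export these variables in your shell or a .env file rather than hard-coding them in source.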
Description
About the Course
"Becoming an AI Engineer with LangChain" is a hands-on course designed to provide a thorough understanding of LangChain, a robust framework for developing applications with large language models (LLMs). Led by Mark Chen, founder of Mindify AI, this course is crafted to take you from the basics of generative AI to advanced LangChain components and integrations. By the end, you'll have practical experience building applications that use LangChain to streamline data handling, model interactions, and AI deployment processes.

About the Instructor
Mark Chen, the founder of Mindify AI, is an experienced AI engineer and entrepreneur dedicated to creating generative AI solutions. His expertise spans building LLM-driven applications, developing AI agent-based applications, and navigating the LangChain framework. Mark's background in developing real-world AI applications gives this course a unique, practical focus that combines foundational knowledge with insights from the cutting edge of AI technology.

Course Outline
- Chapter 1: Introduction to Generative AI and LangChain
- Chapter 2: Working with LLMs – From Embedding to Chat Models
- Chapter 3: Document Handling – Using Document Loaders in LangChain
- Chapter 4: Data Storage – Vector Data Stores and Context Retrieval
- Chapter 5: Essential Tools – LangChain Tooling and Code Integration
- Chapter 6: Agents and Decision-Making – LangGraph Agent Applications
- Chapter 7: LangChain on Platforms – Integrating LLMs Across Platforms
- Chapter 8: Building Applications – LangChain APIs for Chatbots, RAG, and Agentic Models

What You Will Learn from This Course
- Understand the Architecture of LangChain: Get familiar with its structure, components, and modular integrations.
- Master Prompt Engineering: Learn zero-shot, few-shot, and chain-of-thought prompting to improve model accuracy and utility.
- Implement Real-World Applications: Create LLM applications that handle documents, search data, and interact through custom agents.
- Build and Deploy AI Models: Learn how to use LangChain's APIs for chat models, data stores, and agents in deployable applications.

Who This Course Is For
This course is ideal for:
- Aspiring AI engineers and developers who want hands-on experience with LLM-driven applications.
- Software engineers interested in transitioning to AI by building practical applications with a comprehensive framework.
- Tech enthusiasts and researchers looking to deepen their understanding of generative AI and LangChain's framework.
- Anyone interested in AI development who wants to leverage the power of LLMs and AI agents to build robust, scalable applications.

Take this course to kickstart your journey as an AI engineer and gain the skills to create real-world applications that push the boundaries of what AI can achieve.
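To give a flavor of the kind of code the course covers (this sketch is illustrative only and not taken from the course), here is a minimal few-shot prompting example using LangChain's ChatOpenAI and ChatPromptTemplate; the model name "gpt-4o-mini" and the sentiment-classification task are assumptions chosen for the demo.

    # Few-shot prompting sketch with LangChain (assumes OPENAI_API_KEY is set).
    from langchain_openai import ChatOpenAI
    from langchain_core.prompts import ChatPromptTemplate

    llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

    # A couple of worked examples precede the real query, guiding the model's output format.
    prompt = ChatPromptTemplate.from_messages([
        ("system", "Classify the sentiment of the movie review as positive or negative."),
        ("human", "The plot dragged and I nearly fell asleep."),
        ("ai", "negative"),
        ("human", "A delightful surprise from start to finish."),
        ("ai", "positive"),
        ("human", "{review}"),
    ])

    # Compose the prompt and model into a runnable chain and invoke it.
    chain = prompt | llm
    print(chain.invoke({"review": "Visually stunning, but the dialogue felt flat."}).content)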
Overview
Section 1: Part 0 - Course Introduction / Overview
Lecture 1 Introduction to the Course
Lecture 2 Setting up Cloud Development Environment (CDE) with GitHub Codespace
Lecture 3 Python Environment Set-up for LangChain
Lecture 4 Setting up your OpenAI API
Lecture 5 Course Materials and Supplementary Materials
Lecture 6 About the Instructor - Mark Chen
Section 2: Part 1 - Introduction to Generative AI and LangChain
Lecture 7 What is Generative AI?
Lecture 8 What is a Large Language Model (LLM)?
Lecture 9 What is Prompt Engineering?
Lecture 10 What is LangChain?
Lecture 11 Section 1 Summary
Section 3: Part 2 - Chat and Embedding Models
Lecture 12 OpenAI Embedding Models
Lecture 13 OpenAI Chat Models
Lecture 14 Anthropic Chat Models
Lecture 15 Section 2 Summary
Section 4: Part 3 - Documents and Loaders
Lecture 16 LangChain Document and Document Loaders - PDF
Lecture 17 LangChain Markdown Loader
Lecture 18 LangChain HTML Loader
Lecture 19 LangChain JSON and CSV Loaders
Lecture 20 Section 3 Summary
Section 5: Part 4 - Data Stores
Lecture 21 Introduction to Embedding and Vector Search
Lecture 22 Chroma Vector Store
Lecture 23 PGVector Vector Store
Lecture 24 Milvus Vector Store
Lecture 25 Section 4 Summary
Section 6: Part 5 - Tools
Lecture 26 Brave Search
Lecture 27 Rize.io Code Interpreter
Lecture 28 Bash Shell
Lecture 29 Section 5 Summary
Section 7: Part 6 - Agents
Lecture 30 Introduction to LLM and AI Agents
Lecture 31 Agent Architecture - ReAct
Lecture 32 Agent Architecture - Reflection
Lecture 33 Agent Architecture - Plan and Solve (Execute)
Lecture 34 Agent Architecture - Multi-agent System
Lecture 35 Section 6 Summary
Section 8: Part 7 - Platforms
Lecture 36 LLM Application Observability and Evaluation
Lecture 37 Introduction to LangSmith and Tracing
Lecture 38 Introduction to LangChain Chat
Lecture 39 Section 7 Summary
Section 9: Part 8 - Applications / Summary
Lecture 40 Introduction to Context-aware AI Applications
Lecture 41 Naive Retrieval-Augmented Generation (RAG) Application
Lecture 42 Agentic Retrieval-Augmented Generation (RAG) Application
Lecture 43 Course Summary
Lecture 44 Future Learning
Who this course is for
People who need to develop context-aware AI applications, computer science students, people with a deep interest in generative AI, and anyone who wants to become an AI engineer in the future