Ollama Crash Course: Build Local LLM powered Apps
English | 2025 | ISBN: B0DXB4SWG9 | Pages: 103 | Epub | 12.12 MB
In this book, we take you on a fun, hands-on, and pragmatic journey to learning Ollama. You'll start building your first Ollama app within minutes. Every chapter is bite-sized and straight to the point, because I don't want to waste your time (and most certainly mine) on content you don't need.
In the course of this book, we will cover:
Chapter 1 - Introduction
Chapter 2 - Ollama Setup
Chapter 3 - How to Run Different Ollama Models Locally
Chapter 4 - Customizing Models with the Modelfile
Chapter 5 - Ollama REST API
Chapter 6 - Interacting with Ollama Models Using Msty, a UI-Based Tool for RAG
Chapter 7 - Introduction to Python Library for Building LLM Applications Locally
Chapter 8 - Build a Real-World Use Case Application – Introduction
Chapter 9 - Overview of RAG Systems with Ollama and Langchain Crash Course
Chapter 10 - Uploading Custom Documents
Chapter 11 - Loading Different File Types
The goal of this book is to teach you Ollama in a manageable way without overwhelming you. We focus only on the essentials and cover the material in a hands-on manner so you can code along.
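To give a flavor of the kind of app the early chapters build toward, here is a minimal sketch using the official ollama Python library (the library introduced in Chapter 7). It assumes the Ollama server is running locally and that a model such as llama3 has already been pulled with `ollama pull llama3`; the model name is only an example.

# Minimal sketch: chat with a locally running Ollama model.
# Assumes the Ollama server is running and "llama3" has been pulled.
import ollama

# Send a single chat message to the local model.
response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Explain what Ollama is in one sentence."}],
)

# The reply text is stored under the "message" key of the response.
print(response["message"]["content"])

The same request can also be made against Ollama's local REST API (covered in Chapter 5), which listens on port 11434 by default.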
Working Through This Book
This book is purposely broken down into short chapters, each centered on a different essential topic. It takes a practical, hands-on approach to learning, and you will learn best when you code along with the examples in the book.