Building Neo4j-Powered Applications with LLMs
English | 2025 | ISBN: 1836206232 | 312 pages | True PDF | 5.94 MB
A comprehensive guide to building cutting-edge generative AI applications using Neo4j's knowledge graphs and vector search capabilities
Key Features
- Design vector search and recommendation systems with LLMs using Neo4j GenAI, Haystack, Spring AI, and LangChain4j
- Apply best practices for graph exploration, modeling, reasoning, and performance optimization
- Build and consume Neo4j knowledge graphs and deploy your GenAI apps to Google Cloud
- Purchase of the print or Kindle book includes a free PDF eBook
Book Description
Embark on an expert-led journey into building LLM-powered applications using Retrieval-Augmented Generation (RAG) and Neo4j knowledge graphs. Written by Ravindranatha Anthapu, Principal Consultant at Neo4j, and Siddhant Agrawal, a Google Developer Expert in GenAI, this comprehensive guide is your starting point for exploring alternatives to LangChain, covering frameworks such as Haystack, Spring AI, and LangChain4j.
As large language models (LLMs) reshape how businesses interact with customers, this book helps you develop intelligent applications using RAG architecture and knowledge graphs, with a strong focus on overcoming one of AI's most persistent challenges: mitigating hallucinations. You'll learn how to model and construct Neo4j knowledge graphs with Cypher to enhance the accuracy and relevance of LLM responses.
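As an illustration of the idea (a minimal sketch, not an excerpt from the book), the snippet below creates a tiny product graph with Cypher through the official neo4j Python driver; the connection details and the Product/Category schema are assumptions made for this example.

```python
# Minimal sketch: building a small knowledge graph with Cypher via the
# official neo4j Python driver. URI, credentials, and the Product/Category
# schema are illustrative assumptions, not the book's own example.
from neo4j import GraphDatabase

URI = "neo4j://localhost:7687"   # assumed local Neo4j instance
AUTH = ("neo4j", "password")     # assumed credentials

CREATE_GRAPH = """
MERGE (p:Product {name: $product})
MERGE (c:Category {name: $category})
MERGE (p)-[:IN_CATEGORY]->(c)
"""

with GraphDatabase.driver(URI, auth=AUTH) as driver:
    with driver.session() as session:
        # MERGE keeps the write idempotent: re-running does not create duplicates.
        session.run(CREATE_GRAPH, product="Trail Runner 2", category="Running Shoes")
```

Grounding an LLM against explicit nodes and relationships like these, rather than free text alone, is the core of the hallucination-mitigation approach the book develops.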
Through real-world use cases like vector-powered search and personalized recommendations, the authors help you gain hands-on experience with Neo4j GenAI integrations across Haystack and Spring AI. With access to a companion GitHub repository, you'll work through code-heavy examples to confidently build and deploy GenAI apps on Google Cloud.
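To give a flavour of what vector-powered retrieval over a knowledge graph looks like (again, an illustrative sketch rather than the book's code), the example below calls Neo4j's `db.index.vector.queryNodes` procedure; the index name `product_embeddings`, the node properties, and the `embed` callable are assumptions.

```python
# Sketch of RAG-style retrieval: assumes a vector index named
# "product_embeddings" already exists on :Product(embedding), and that
# `embed` is whatever embedding model turns the question into a vector.
from neo4j import GraphDatabase

VECTOR_SEARCH = """
CALL db.index.vector.queryNodes('product_embeddings', $k, $query_embedding)
YIELD node, score
RETURN node.name AS name, score
ORDER BY score DESC
"""

def retrieve_context(driver, embed, question, k=5):
    """Fetch the k most similar products to ground an LLM answer."""
    with driver.session() as session:
        result = session.run(VECTOR_SEARCH, k=k, query_embedding=embed(question))
        return [(record["name"], record["score"]) for record in result]

# Usage (assumed connection details; `my_embedder` stands in for any embedding model):
# driver = GraphDatabase.driver("neo4j://localhost:7687", auth=("neo4j", "password"))
# hits = retrieve_context(driver, my_embedder, "waterproof trail shoes", k=3)
```

The retrieved nodes and scores are then passed to the LLM as context, which is the pattern the book applies with Haystack, Spring AI, and LangChain4j.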
By the end of this book, you'll have the skills to ground LLMs with RAG and Neo4j, optimize graph performance, and strategically select the right cloud platform for your GenAI applications.