├── .gitignore
├── README.md
└── books
├── ai-engineering.jpeg
├── ai-engineering.md
├── build-a-large-language-model.jpeg
├── build-a-large-language-model.md
├── build-llm-applications.jpeg
├── build-llm-applications.md
├── building-llm-powered-applications.jpeg
├── building-llm-powered-applications.md
├── building-llms-for-production.jpeg
├── building-llms-for-production.md
├── cover.png
├── creating-production-ready-llms.jpeg
├── creating-production-ready-llms.md
├── developing-apps-with-gpt-4-and-chatgpt.jpeg
├── developing-apps-with-gpt-4-and-chatgpt.md
├── generative-ai-on-aws.md
├── generative-ai-on-aws.png
├── generative-ai-with-langchain.jpeg
├── generative-ai-with-langchain.md
├── hands-on-large-language-models.jpeg
├── hands-on-large-language-models.md
├── langchain-crash-course.jpeg
├── langchain-crash-course.md
├── large-language-models.jpeg
├── large-language-models.md
├── llm-engineer's-handbook.jpeg
├── llm-engineer's-handbook.md
├── llms-in-production.jpeg
├── llms-in-production.md
├── natural-language-processing-with-transformers.jpeg
├── natural-language-processing-with-transformers.md
├── prompt-engineering-for-generative-ai.md
├── prompt-engineering-for-generative-ai.png
├── prompt-engineering-for-llms.jpeg
├── prompt-engineering-for-llms.md
├── quick-start-guide-to-large-language-models.jpeg
├── quick-start-guide-to-large-language-models.md
├── rag-driven-generative-ai.jpeg
├── rag-driven-generative-ai.md
├── super-study-guide.jpeg
├── super-study-guide.md
├── the-developer's-playbook-for-large-language-model-security.jpeg
├── the-developer's-playbook-for-large-language-model-security.md
├── what-is-chatgpt-doing....jpeg
└── what-is-chatgpt-doing....md
/.gitignore:
--------------------------------------------------------------------------------
1 | .DS_Store
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Awesome LLM Books
2 |
3 | Some of us learn best by reading high-quality books on technical topics.
4 |
5 | This is a **curated list** of **books** for **engineers** on **development** with **Large Language Models** (LLMs).
6 |
7 | ## Books:
8 |
9 | Alphabetical list of books on LLMs. Each cover/title links to more information about the book.
10 |
11 | | Cover | Details |
12 | |-------|---------|
13 | | [](books/ai-engineering.md) | [AI Engineering](books/ai-engineering.md)
**Subtitle**: Building Applications with Foundation Models
**Authors**: Chip Huyen
**Publisher**: O'Reilly, 2025
**Star Rating**: 4.6 on Amazon, 4.56 on Goodreads
**Links**: [Amazon](https://a.co/d/hAI9OXl), [Goodreads](https://www.goodreads.com/book/show/216848047-ai-engineering), [Publisher](https://www.oreilly.com/library/view/ai-engineering/9781098166298/), [GitHub Project](https://github.com/chiphuyen/aie-book) |
14 | | [](books/build-a-large-language-model.md) | [Build a Large Language Model](books/build-a-large-language-model.md)
**Subtitle**: (From Scratch)
**Authors**: Sebastian Raschka
**Publisher**: Manning, 2024
**Star Rating**: 4.7 on Amazon, 4.64 on Goodreads
**Links**: [Amazon](https://a.co/d/bXGGLyC), [Goodreads](https://www.goodreads.com/book/show/219388329-build-a-large-language-model), [Publisher](https://www.manning.com/books/build-a-large-language-model-from-scratch), [GitHub Project](https://github.com/rasbt/LLMs-from-scratch) |
15 | | [](books/build-llm-applications.md) | [Build LLM Applications](books/build-llm-applications.md)
**Subtitle**: (from Scratch)
**Authors**: Hamza Farooq
**Publisher**: Manning, 2025
**Links**: [Publisher](https://www.manning.com/books/build-llm-applications-from-scratch) |
16 | | [](books/building-llm-powered-applications.md) | [Building LLM Powered Applications](books/building-llm-powered-applications.md)
**Subtitle**: Create intelligent apps and agents with large language models
**Authors**: Valentina Alto
**Publisher**: Packt, 2024
**Star Rating**: 4.6 on Amazon, 3.35 on Goodreads
**Links**: [Amazon](https://a.co/d/e6rt1da), [Goodreads](https://www.goodreads.com/book/show/201054993-building-llm-powered-applications), [Publisher](https://www.packtpub.com/en-au/product/building-llm-powered-applications-9781835462317), [GitHub Project](https://github.com/PacktPublishing/Building-LLM-Powered-Applications) |
17 | | [](books/building-llms-for-production.md) | [Building LLMs for Production](books/building-llms-for-production.md)
**Subtitle**: Enhancing LLM Abilities and Reliability with Prompting, Fine-Tuning, and RAG
**Authors**: Louis-François Bouchard and Louie Peters
**Publisher**: Independently published, 2024
**Star Rating**: 4.4 on Amazon, 4.10 on Goodreads
**Links**: [Amazon](https://a.co/d/grz7eTc), [Goodreads](https://www.goodreads.com/book/show/213731760-building-llms-for-production), [Publisher](https://www.oreilly.com/library/view/building-llms-for/9798324731472/) |
18 | | [](books/creating-production-ready-llms.md) | [Creating Production-Ready LLMs](books/creating-production-ready-llms.md)
**Subtitle**: A Comprehensive Guide to Building, Optimizing, and Deploying Large Language Models for Production Use
**Authors**: TransformaTech Institute
**Publisher**: Independently published, 2024
**Star Rating**: 4.5 on Amazon, 0.00 on Goodreads
**Links**: [Amazon](https://a.co/d/7nVhfVT), [Goodreads](https://www.goodreads.com/book/show/219981025-creating-production-ready-llms), [Publisher](https://www.amazon.com.au/stores/author/B0DJRMJX76/about) |
19 | | [](books/developing-apps-with-gpt-4-and-chatgpt.md) | [Developing Apps with GPT-4 and ChatGPT](books/developing-apps-with-gpt-4-and-chatgpt.md)
**Subtitle**: Build Intelligent Chatbots, Content Generators, and More
**Authors**: Olivier Caelen and Marie-Alice Blete
**Publisher**: O'Reilly, 2023
**Star Rating**: 4.2 on Amazon, 3.67 on Goodreads
**Links**: [Amazon](https://a.co/d/8aDJJvi), [Goodreads](https://www.goodreads.com/book/show/181704874-developing-apps-with-gpt-4-and-chatgpt), [Publisher](https://www.oreilly.com/library/view/developing-apps-with/9781098152475/), [GitHub Project](https://github.com/malywut/gpt_examples) |
20 | | [](books/generative-ai-on-aws.md) | [Generative AI on AWS](books/generative-ai-on-aws.md)
**Subtitle**: Building Context-Aware Multimodal Reasoning Applications
**Authors**: Chris Fregly, Antje Barth and Shelbee Eigenbrode
**Publisher**: O'Reilly, 2023
**Star Rating**: 4.4 on Amazon, 4.50 on Goodreads
**Links**: [Amazon](https://a.co/d/f6xUdNI), [Goodreads](https://www.goodreads.com/book/show/197525483-generative-ai-on-aws), [Publisher](https://www.oreilly.com/library/view/generative-ai-on/9781098159214/), [GitHub Project](https://github.com/generative-ai-on-aws/generative-ai-on-aws) |
21 | | [](books/generative-ai-with-langchain.md) | [Generative AI with LangChain](books/generative-ai-with-langchain.md)
**Subtitle**: Build large language model (LLM) apps with Python, ChatGPT, and other LLMs
**Authors**: Ben Auffarth
**Publisher**: Packt, 2023
**Star Rating**: 4.3 on Amazon, 3.50 on Goodreads
**Links**: [Amazon](https://a.co/d/8kVpV3T), [Goodreads](https://www.goodreads.com/book/show/185125672-generative-ai-with-langchain), [Publisher](https://www.packtpub.com/en-us/product/generative-ai-with-langchain-9781835083468), [GitHub Project](https://github.com/benman1/generative_ai_with_langchain) |
22 | | [](books/hands-on-large-language-models.md) | [Hands-On Large Language Models](books/hands-on-large-language-models.md)
**Subtitle**: Language Understanding and Generation
**Authors**: Jay Alammar and Maarten Grootendorst
**Publisher**: O'Reilly, 2024
**Star Rating**: 4.6 on Amazon, 4.36 on Goodreads
**Links**: [Amazon](https://a.co/d/hXs5jDF), [Goodreads](https://www.goodreads.com/book/show/210408850-hands-on-large-language-models), [Publisher](https://www.oreilly.com/library/view/hands-on-large-language/9781098150952/), [GitHub Project](https://github.com/HandsOnLLM/Hands-On-Large-Language-Models) |
23 | | [](books/langchain-crash-course.md) | [LangChain Crash Course](books/langchain-crash-course.md)
**Subtitle**: Build OpenAI LLM powered Apps: Fast track to building OpenAI LLM powered Apps using Python
**Authors**: Greg Lim
**Publisher**: Independently Published, 2024
**Star Rating**: 4.2 on Amazon, 4.07 on Goodreads
**Links**: [Amazon](https://a.co/d/ibgu6jy), [Goodreads](https://www.goodreads.com/book/show/198671257-langchain-crash-course), [Publisher](https://greglim.gumroad.com/l/langchain) |
24 | | [](books/large-language-models.md) | [Large Language Models](books/large-language-models.md)
**Subtitle**: A Deep Dive: Bridging Theory and Practice
**Authors**: Uday Kamath, Kevin Keenan, Garrett Somers, and Sarah Sorenson
**Publisher**: Springer, 2024
**Star Rating**: 4.1 on Amazon, 4.00 on Goodreads
**Links**: [Amazon](https://a.co/d/6IMNpkX), [Goodreads](https://www.goodreads.com/book/show/214355031-large-language-models), [Publisher](https://link.springer.com/book/10.1007/978-3-031-65647-7), [GitHub Project](https://github.com/springer-llms-deep-dive/llms-deep-dive-tutorials) |
25 | | [](books/llm-engineer's-handbook.md) | [LLM Engineer's Handbook](books/llm-engineer's-handbook.md)
**Subtitle**: Master the art of engineering large language models from concept to production
**Authors**: Paul Iusztin and Maxime Labonne
**Publisher**: Packt, 2024
**Star Rating**: 4.6 on Amazon, 3.54 on Goodreads
**Links**: [Amazon](https://a.co/d/5H3ufht), [Goodreads](https://www.goodreads.com/book/show/216193554-llm-engineer-s-handbook), [Publisher](https://www.packtpub.com/en-au/product/llm-engineers-handbook-9781836200062), [GitHub Project](https://github.com/PacktPublishing/LLM-Engineers-Handbook) |
26 | | [](books/llms-in-production.md) | [LLMs in Production](books/llms-in-production.md)
**Subtitle**: From language models to successful products
**Authors**: Christopher Brousseau and Matthew Sharp
**Publisher**: Manning, 2025
**Star Rating**: 4.4 on Amazon, 4.08 on Goodreads
**Links**: [Amazon](https://a.co/d/gF1w56V), [Goodreads](https://www.goodreads.com/book/show/215144443-llms-in-production), [Publisher](https://www.manning.com/books/llms-in-production), [GitHub Project](https://github.com/IMJONEZZ/LLMs-in-Production) |
27 | | [](books/natural-language-processing-with-transformers.md) | [Natural Language Processing with Transformers](books/natural-language-processing-with-transformers.md)
**Subtitle**: Building Language Applications with Hugging Face
**Authors**: Lewis Tunstall, Leandro von Werra and Thomas Wolf
**Publisher**: O'Reilly, 2022
**Star Rating**: 4.6 on Amazon, 4.41 on Goodreads
**Links**: [Amazon](https://a.co/d/5WIiVAC), [Goodreads](https://www.goodreads.com/book/show/60114857-natural-language-processing-with-transformers), [Publisher](https://www.oreilly.com/library/view/natural-language-processing/9781098136789/), [GitHub Project](https://github.com/nlp-with-transformers/notebooks) |
28 | | [](books/prompt-engineering-for-generative-ai.md) | [Prompt Engineering for Generative AI](books/prompt-engineering-for-generative-ai.md)
**Subtitle**: Future-Proof Inputs for Reliable AI Outputs
**Authors**: James Phoenix and Mike Taylor
**Publisher**: O'Reilly, 2024
**Star Rating**: 4.5 on Amazon, 3.56 on Goodreads
**Links**: [Amazon](https://a.co/d/52xLb9K), [Goodreads](https://www.goodreads.com/book/show/204133880-prompt-engineering-for-generative-ai), [Publisher](https://www.oreilly.com/library/view/prompt-engineering-for/9781098153427/), [GitHub Project](https://github.com/BrightPool/prompt-engineering-for-generative-ai-examples) |
29 | | [](books/prompt-engineering-for-llms.md) | [Prompt Engineering for LLMs](books/prompt-engineering-for-llms.md)
**Subtitle**: The Art and Science of Building Large Language Model–Based Applications
**Authors**: John Berryman and Albert Ziegler
**Publisher**: O'Reilly, 2024
**Star Rating**: 4.4 on Amazon, 4.00 on Goodreads
**Links**: [Amazon](https://a.co/d/eyWEQ4A), [Goodreads](https://www.goodreads.com/book/show/213739653-prompt-engineering-for-llms), [Publisher](https://www.oreilly.com/library/view/prompt-engineering-for/9781098156145/) |
30 | | [](books/quick-start-guide-to-large-language-models.md) | [Quick Start Guide to Large Language Models](books/quick-start-guide-to-large-language-models.md)
**Subtitle**: Strategies and Best Practices for ChatGPT, Embeddings, Fine-Tuning, and Multimodal AI
**Authors**: Sinan Ozdemir
**Publisher**: Addison-Wesley, 2024
**Star Rating**: 4.9 on Amazon, 3.64 on Goodreads
**Links**: [Amazon](https://a.co/d/aUsDJ7e), [Goodreads](https://www.goodreads.com/book/show/126850297-quick-start-guide-to-large-language-models), [Publisher](https://www.pearson.com/en-us/subject-catalog/p/quick-start-guide-to-large-language-models-2nd-edition/P200000012793), [GitHub Project](https://github.com/sinanuozdemir/quick-start-guide-to-llms) |
31 | | [](books/rag-driven-generative-ai.md) | [RAG-Driven Generative AI](books/rag-driven-generative-ai.md)
**Subtitle**: Build custom retrieval augmented generation pipelines with LlamaIndex, Deep Lake, and Pinecone
**Authors**: Denis Rothman
**Publisher**: Packt, 2024
**Star Rating**: 4 on Amazon, 3.90 on Goodreads
**Links**: [Amazon](https://a.co/d/2zjaDK4), [Goodreads](https://www.goodreads.com/book/show/214330235-rag-driven-generative-ai), [Publisher](https://www.packtpub.com/en-us/product/rag-driven-generative-ai-9781836200918), [GitHub Project](https://github.com/Denis2054/RAG-Driven-Generative-AI) |
32 | | [](books/super-study-guide.md) | [Super Study Guide](books/super-study-guide.md)
**Subtitle**: Transformers & Large Language Models
**Authors**: Afshine Amidi and Shervine Amidi
**Publisher**: Independently published, 2024
**Star Rating**: 4.6 on Amazon, 4.58 on Goodreads
**Links**: [Amazon](https://a.co/d/aE3pz72), [Goodreads](https://www.goodreads.com/book/show/217141763-super-study-guide), [Publisher](https://superstudy.guide/transformers-large-language-models/) |
33 | | [](books/the-developer's-playbook-for-large-language-model-security.md) | [The Developer's Playbook for Large Language Model Security](books/the-developer's-playbook-for-large-language-model-security.md)
**Subtitle**: Building Secure AI Applications
**Authors**: Steve Wilson
**Publisher**: O'Reilly, 2024
**Star Rating**: 5 on Amazon, 3.88 on Goodreads
**Links**: [Amazon](https://a.co/d/d3rJVkn), [Goodreads](https://www.goodreads.com/book/show/210408897-the-developer-s-playbook-for-large-language-model-security), [Publisher](https://www.oreilly.com/library/view/the-developers-playbook/9781098162191/) |
34 | | [](books/what-is-chatgpt-doing....md) | [What Is ChatGPT Doing...](books/what-is-chatgpt-doing....md)
**Subtitle**: ...and Why Does It Work?
**Authors**: Stephen Wolfram
**Publisher**: Wolfram Media Inc., 2023
**Star Rating**: 4.2 on Amazon, 3.86 on Goodreads
**Links**: [Amazon](https://a.co/d/79xVzR5), [Goodreads](https://www.goodreads.com/book/show/123451665-what-is-chatgpt-doing-and-why-does-it-work), [Publisher](https://writings.stephenwolfram.com/2023/02/what-is-chatgpt-doing-and-why-does-it-work/) |
35 |
36 | ### On Curation
37 |
38 | The list above is not "all books on LLM development"; instead, it is filtered using the following procedure:
39 |
40 | 1. Create a [master list](https://docs.google.com/spreadsheets/d/1AGExX1aYINy_FsBRmr9z8OyalJcPso2mo28GraHqhSQ/edit?usp=sharing) of all known books on LLM development (Amazon, Goodreads, Google Books, etc.).
41 | 2. Read the book's blurb and table of contents to confirm relevance (for "engineers doing LLM development").
42 | 3. Read reviews and check star ratings (quality check).
43 | 4. Read comments and discussion about the book on social media (Twitter/Reddit/etc.).
44 | 5. Acquire the ebook version of the book, if possible (final read/skim to confirm relevance and quality).
45 | 6. Final judgement call (publisher, gut check).
46 |
47 | Note that I update the list as new books are published and as I receive emails about new books. Additionally, listed star ratings are updated periodically.
48 |
49 | ### Make The List Better
50 |
51 | Do you have ideas on how to make this list more awesome?
52 |
53 | Email any time: Jason.Brownlee05@gmail.com
--------------------------------------------------------------------------------
/books/ai-engineering.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/ai-engineering.jpeg
--------------------------------------------------------------------------------
/books/ai-engineering.md:
--------------------------------------------------------------------------------
1 | # AI Engineering
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: AI Engineering
10 | * **Subtitle**: Building Applications with Foundation Models
11 | * **Authors**: Chip Huyen
12 | * **Publication Date**: 2025
13 | * **Publisher**: O'Reilly
14 | * **ISBN-13**: 978-1098166304
15 | * **Pages**: 532
16 | * **Amazon Rating**: 4.6 stars
17 | * **Goodreads Rating**: 4.56 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/hAI9OXl) |
21 | [Goodreads](https://www.goodreads.com/book/show/216848047-ai-engineering) |
22 | [Publisher](https://www.oreilly.com/library/view/ai-engineering/9781098166298/) |
23 | [GitHub Project](https://github.com/chiphuyen/aie-book)
24 |
25 | ## Blurb
26 |
27 | Recent breakthroughs in AI have not only increased demand for AI products, they've also lowered the barriers to entry for those who want to build AI products. The model-as-a-service approach has transformed AI from an esoteric discipline into a powerful development tool that anyone can use. Everyone, including those with minimal or no prior AI experience, can now leverage AI models to build applications. In this book, author Chip Huyen discusses AI engineering: the process of building applications with readily available foundation models.
28 |
29 | The book starts with an overview of AI engineering, explaining how it differs from traditional ML engineering and discussing the new AI stack. The more AI is used, the more opportunities there are for catastrophic failures, and therefore, the more important evaluation becomes. This book discusses different approaches to evaluating open-ended models, including the rapidly growing AI-as-a-judge approach.
30 |
31 | AI application developers will discover how to navigate the AI landscape, including models, datasets, evaluation benchmarks, and the seemingly infinite number of use cases and application patterns. You'll learn a framework for developing an AI application, starting with simple techniques and progressing toward more sophisticated methods, and discover how to efficiently deploy these applications.
32 |
33 | * Understand what AI engineering is and how it differs from traditional machine learning engineering
34 | * Learn the process for developing an AI application, the challenges at each step, and approaches to address them
35 | * Explore various model adaptation techniques, including prompt engineering, RAG, fine-tuning, agents, and dataset engineering, and understand how and why they work
36 | * Examine the bottlenecks for latency and cost when serving foundation models and learn how to overcome them
37 | * Choose the right model, dataset, evaluation benchmarks, and metrics for your needs
38 |
39 | Chip Huyen works to accelerate data analytics on GPUs at Voltron Data. Previously, she was with Snorkel AI and NVIDIA, founded an AI infrastructure startup, and taught Machine Learning Systems Design at Stanford. She's the author of the book Designing Machine Learning Systems, an Amazon bestseller in AI.
40 |
41 | AI Engineering builds upon and is complementary to Designing Machine Learning Systems (O'Reilly).
42 |
43 | ## Contents
44 |
45 | 1. Introduction to Building AI Applications with Foundation Models
46 | 2. Understanding Foundation Models
47 | 3. Evaluation Methodology
48 | 4. Evaluate AI Systems
49 | 5. Prompt Engineering
50 | 6. RAG and Agents
51 | 7. Finetuning
52 | 8. Dataset Engineering
53 | 9. Inference Optimization
54 | 10. AI Engineering Architecture and User Feedback
55 |
--------------------------------------------------------------------------------
/books/build-a-large-language-model.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/build-a-large-language-model.jpeg
--------------------------------------------------------------------------------
/books/build-a-large-language-model.md:
--------------------------------------------------------------------------------
1 | # Build a Large Language Model
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: Build a Large Language Model
10 | * **Subtitle**: (From Scratch)
11 | * **Authors**: Sebastian Raschka
12 | * **Publication Date**: 2024
13 | * **Publisher**: Manning
14 | * **ISBN-13**: 978-1633437166
15 | * **Pages**: 368
16 | * **Amazon Rating**: 4.7 stars
17 | * **Goodreads Rating**: 4.64 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/bXGGLyC) |
21 | [Goodreads](https://www.goodreads.com/book/show/219388329-build-a-large-language-model) |
22 | [Publisher](https://www.manning.com/books/build-a-large-language-model-from-scratch) |
23 | [GitHub Project](https://github.com/rasbt/LLMs-from-scratch)
24 |
25 | ## Blurb
26 |
27 | Learn how to create, train, and tweak large language models (LLMs) by building one from the ground up!
28 |
29 | In Build a Large Language Model (from Scratch) bestselling author Sebastian Raschka guides you step by step through creating your own LLM. Each stage is explained with clear text, diagrams, and examples. You’ll go from the initial design and creation, to pretraining on a general corpus, and on to fine-tuning for specific tasks.
30 |
31 | Build a Large Language Model (from Scratch) teaches you how to:
32 |
33 | * Plan and code all the parts of an LLM
34 | * Prepare a dataset suitable for LLM training
35 | * Fine-tune LLMs for text classification and with your own data
36 | * Use human feedback to ensure your LLM follows instructions
37 | * Load pretrained weights into an LLM
38 |
39 | Build a Large Language Model (from Scratch) takes you inside the AI black box to tinker with the internal systems that power generative AI. As you work through each key stage of LLM creation, you’ll develop an in-depth understanding of how LLMs work, their limitations, and their customization methods. Your LLM can be developed on an ordinary laptop, and used as your own personal assistant.
40 |
41 | Purchase of the print book includes a free eBook in PDF and ePub formats from Manning Publications.
42 |
43 | About the technology
44 |
45 | Physicist Richard P. Feynman reportedly said, “I don’t understand anything I can’t build.” Based on this same powerful principle, bestselling author Sebastian Raschka guides you step by step as you build a GPT-style LLM that you can run on your laptop. This is an engaging book that covers each stage of the process, from planning and coding to training and fine-tuning.
46 |
47 | About the book
48 |
49 | Build a Large Language Model (From Scratch) is a practical and eminently-satisfying hands-on journey into the foundations of generative AI. Without relying on any existing LLM libraries, you’ll code a base model, evolve it into a text classifier, and ultimately create a chatbot that can follow your conversational instructions. And you’ll really understand it because you built it yourself!
50 |
51 | What's inside
52 |
53 | * Plan and code an LLM comparable to GPT-2
54 | * Load pretrained weights
55 | * Construct a complete training pipeline
56 | * Fine-tune your LLM for text classification
57 | * Develop LLMs that follow human instructions
58 |
59 | About the reader
60 |
61 | Readers need intermediate Python skills and some knowledge of machine learning. The LLM you create will run on any modern laptop and can optionally utilize GPUs.
62 |
63 | ## Contents
64 |
65 | 1. Understanding large language models
66 | 2. Working with text data
67 | 3. Coding attention mechanisms
68 | 4. Implementing a GPT model from scratch to generate text
69 | 5. Pretraining on unlabeled data
70 | 6. Fine-tuning for classification
71 | 7. Fine-tuning to follow instructions
72 | 8. Appendix A: Introduction to PyTorch
73 | 9. Appendix B: References and further reading
74 | 10. Appendix C: Exercise solutions
75 | 11. Appendix D: Adding bells and whistles to the training loop
76 | 12. Appendix E: Parameter-efficient fine-tuning with LoRA
77 |
--------------------------------------------------------------------------------
/books/build-llm-applications.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/build-llm-applications.jpeg
--------------------------------------------------------------------------------
/books/build-llm-applications.md:
--------------------------------------------------------------------------------
1 | # Build LLM Applications
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: Build LLM Applications
10 | * **Subtitle**: (from Scratch)
11 | * **Authors**: Hamza Farooq
12 | * **Publication Date**: 2025
13 | * **Publisher**: Manning
14 | * **ISBN-13**: 9781633436527
15 | * **Pages**: 325
16 |
17 |
18 | **Links**: [Publisher](https://www.manning.com/books/build-llm-applications-from-scratch)
19 |
20 | ## Blurb
21 |
22 | Create your own LLM applications without using a framework like LlamaIndex or LangChain.
23 |
24 | In Build LLM Applications (From Scratch), you'll learn to create applications powered by large language models (LLM) from the ground up. In this practical book, you'll build several fully functioning, real-world AI tools—including a search engine, semantic caching for RAG, and autonomous AI agents.
25 |
26 | In Build LLM Applications (From Scratch), you'll learn how to:
27 |
28 | * Design and implement efficient search algorithms for LLM applications
29 | * Develop custom Retrieval Augmented Generation (RAG) systems
30 | * Master deep customization techniques for every aspect of search and RAG components
31 | * Understand and overcome the limitations of popular LLM frameworks
32 | * Create end-to-end LLM solutions by integrating multiple components cohesively
33 | * Apply advanced fine-tuning techniques for task-specific models and domain adaptation
34 | * Deploy quantized versions of open-source LLMs using vLLMs and Ollama
35 |
36 | Build LLM Applications (From Scratch) shows you just how customizable LLM applications can be when you create your own without using opinionated tools like LangChain and LlamaIndex. You'll learn the fundamentals of AI development hands-on, all without any proprietary tools. Soon you'll have the skills you need to build LLM applications, tailor them to your specific needs, and ensure you have control over your entire system.
37 | About the book
38 |
39 | Build LLM Applications (From Scratch) is a practical and comprehensive handbook for creating custom LLM applications without relying on premade frameworks. You'll start by mastering the fundamentals of search systems and RAG. Then you'll apply this knowledge to real-world projects, including building a hotel search engine using TripAdvisor review data, implementing semantic caching in RAG production systems, and deploying a full RAG application using Hugging Face and Gradio. By the end of the book, you'll have the skills to build AI agents from scratch, deploy open source LLMs with advanced quantization techniques, and create innovative, specialized LLM applications designed for your specific needs.
40 |
41 | ## Contents
42 |
43 | PART 1: THE FUNDAMENTALS
44 | * 1. The World of Large Language Models
45 | * 2. An in-depth look into the soul of the Transformer Architecture
46 | * 3. Encoder models in action: Semantic-Based Retrieval Systems
47 |
48 | PART 2: RETRIEVAL SYSTEMS
49 | * 4. Semantic Search from Scratch
50 | * 5. Combining Encoder & Decoder Model to Create RAG Applications
51 | * 6. Advanced RAG techniques with knowledge graphs and Semantic Cache
52 |
53 | PART 3: BUILDING ENTERPRISE LLM APPLICATIONS
54 | * 7. Introducing Agents, the next Generation of AI
55 | * 8. Fine-Tuning and Domain Adaptation
56 | * 9. Deploying Language Models as APIs
57 | * 10. Open-Source Large Language Models
58 |
--------------------------------------------------------------------------------
/books/building-llm-powered-applications.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/building-llm-powered-applications.jpeg
--------------------------------------------------------------------------------
/books/building-llm-powered-applications.md:
--------------------------------------------------------------------------------
1 | # Building LLM Powered Applications
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: Building LLM Powered Applications
10 | * **Subtitle**: Create intelligent apps and agents with large language models
11 | * **Authors**: Valentina Alto
12 | * **Publication Date**: 2024
13 | * **Publisher**: Packt
14 | * **ISBN-13**: 978-1835462317
15 | * **Pages**: 342
16 | * **Amazon Rating**: 4.6 stars
17 | * **Goodreads Rating**: 3.35 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/e6rt1da) |
21 | [Goodreads](https://www.goodreads.com/book/show/201054993-building-llm-powered-applications) |
22 | [Publisher](https://www.packtpub.com/en-au/product/building-llm-powered-applications-9781835462317) |
23 | [GitHub Project](https://github.com/PacktPublishing/Building-LLM-Powered-Applications)
24 |
25 | ## Blurb
26 |
27 | Get hands-on with GPT 3.5, GPT 4, LangChain, Llama 2, Falcon LLM and more, to build sophisticated LLM-powered AI applications
28 |
29 | Key Features
30 | * Embed LLMs into real-world applications
31 | * Use LangChain to orchestrate LLMs and their components within applications
32 | * Grasp basic and advanced techniques of prompt engineering
33 |
34 | Book Description
35 | Building LLM Powered Applications delves into the fundamental concepts, cutting-edge technologies, and practical applications that LLMs offer, ultimately paving the way for the emergence of large foundation models (LFMs) that extend the boundaries of AI capabilities.
36 |
37 | The book begins with an in-depth introduction to LLMs. We then explore various mainstream architectural frameworks, including both proprietary models (GPT 3.5/4) and open-source models (Falcon LLM), and analyze their unique strengths and differences. Moving ahead, with a focus on the Python-based, lightweight framework called LangChain, we guide you through the process of creating intelligent agents capable of retrieving information from unstructured data and engaging with structured data using LLMs and powerful toolkits. Furthermore, the book ventures into the realm of LFMs, which transcend language modeling to encompass various AI tasks and modalities, such as vision and audio.
38 |
39 | Whether you are a seasoned AI expert or a newcomer to the field, this book is your roadmap to unlock the full potential of LLMs and forge a new era of intelligent machines.
40 |
41 | What you will learn
42 | * Explore the core components of LLM architecture, including encoder-decoder blocks and embeddings
43 | * Understand the unique features of LLMs like GPT-3.5/4, Llama 2, and Falcon LLM
44 | * Use AI orchestrators like LangChain, with Streamlit for the frontend
45 | * Get familiar with LLM components such as memory, prompts, and tools
46 | * Learn how to use non-parametric knowledge and vector databases
47 | * Understand the implications of LFMs for AI research and industry applications
48 | * Customize your LLMs with fine tuning
49 | * Learn about the ethical implications of LLM-powered applications
50 |
51 | Who this book is for
52 | Software engineers and data scientists who want hands-on guidance for applying LLMs to build applications. The book will also appeal to technical leaders, students, and researchers interested in applied LLM topics.
53 |
54 | We don’t assume previous experience with LLMs specifically, but readers should have core ML/software engineering fundamentals to understand and apply the content.
55 |
56 | ## Contents
57 |
58 | 1. Introduction to Large Language Models
59 | 2. LLMs for AI-Powered Applications
60 | 3. Choosing an LLM for Your Application
61 | 4. Prompt Engineering
62 | 5. Embedding LLMs within Your Applications
63 | 6. Building Conversational Applications
64 | 7. Search and Recommendation Engines with LLMs
65 | 8. Using LLMs with Structured Data
66 | 9. Working with Code
67 | 10. Building Multimodal Applications with LLMs
68 | 11. Fine-Tuning Large Language Models
69 | 12. Responsible AI
70 | 13. Emerging Trends and Innovations
71 |
--------------------------------------------------------------------------------
/books/building-llms-for-production.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/building-llms-for-production.jpeg
--------------------------------------------------------------------------------
/books/building-llms-for-production.md:
--------------------------------------------------------------------------------
1 | # Building LLMs for Production
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: Building LLMs for Production
10 | * **Subtitle**: Enhancing LLM Abilities and Reliability with Prompting, Fine-Tuning, and RAG
11 | * **Authors**: Louis-François Bouchard and Louie Peters
12 | * **Publication Date**: 2024
13 | * **Publisher**: Independently published
14 | * **ISBN-13**: 979-8324731472
15 | * **Pages**: 463
16 | * **Amazon Rating**: 4.4 stars
17 | * **Goodreads Rating**: 4.10 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/grz7eTc) |
21 | [Goodreads](https://www.goodreads.com/book/show/213731760-building-llms-for-production) |
22 | [Publisher](https://www.oreilly.com/library/view/building-llms-for/9798324731472/)
23 |
24 | ## Blurb
25 |
26 | With amazing feedback from industry leaders, this book is an end-to-end resource for anyone looking to enhance their skills or dive into the world of AI and develop their understanding of Generative AI and Large Language Models (LLMs). It explores various methods to adapt "foundational" LLMs to specific use cases with enhanced accuracy, reliability, and scalability. Written by over 10 people on our Team at Towards AI and curated by experts from Activeloop, LlamaIndex, Mila, and more, it is a roadmap to the tech stack of the future.
27 |
28 | The book aims to guide developers through creating LLM products ready for production, leveraging the potential of AI across various industries. It is tailored for readers with an intermediate knowledge of Python.
29 |
30 |
31 | What's Inside this 470-page Book (Updated October 2024)?
32 |
33 | * Hands-on Guide on LLMs, Prompting, Retrieval Augmented Generation (RAG) & Fine-tuning
34 | * Roadmap for Building Production-Ready Applications using LLMs
35 | * Fundamentals of LLM Theory
36 | * Simple-to-Advanced LLM Techniques & Frameworks
37 | * Code Projects with Real-World Applications
38 | * Colab Notebooks that you can run right away
39 | * Community access and our own AI Tutor
40 |
41 | Whether you're looking to enhance your skills or dive into the world of AI for the first time as a programmer or software student, our book is for you. From the basics of LLMs to mastering fine-tuning and RAG for scalable, reliable AI applications, we guide you every step of the way.
42 |
43 | ## Contents
44 |
45 | 1. Introduction to Large Language Models
46 | 2. LLM Architectures & Landscape
47 | 3. LLMs in Practice
48 | 4. Introduction to Prompting
49 | 5. Retrieval-Augmented Generation
50 | 6. Introduction to LangChain & LlamaIndex
51 | 7. Prompting with LangChain
52 | 8. Indexes, Retrievers, and Data Preparation
53 | 9. Advanced RAG
54 | 10. Agents
55 | 11. Fine-Tuning
56 | 12. Deployment and Optimization
57 |
--------------------------------------------------------------------------------
/books/cover.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/cover.png
--------------------------------------------------------------------------------
/books/creating-production-ready-llms.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/creating-production-ready-llms.jpeg
--------------------------------------------------------------------------------
/books/creating-production-ready-llms.md:
--------------------------------------------------------------------------------
1 | # Creating Production-Ready LLMs
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: Creating Production-Ready LLMs
10 | * **Subtitle**: A Comprehensive Guide to Building, Optimizing, and Deploying Large Language Models for Production Use
11 | * **Authors**: TransformaTech Institute
12 | * **Publication Date**: 2024
13 | * **Publisher**: Independently published
14 | * **ISBN-13**: 979-8341060043
15 | * **Pages**: 546
16 | * **Amazon Rating**: 4.5 stars
17 | * **Goodreads Rating**: 0.00 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/7nVhfVT) |
21 | [Goodreads](https://www.goodreads.com/book/show/219981025-creating-production-ready-llms) |
22 | [Publisher](https://www.amazon.com.au/stores/author/B0DJRMJX76/about)
23 |
24 | ## Blurb
25 |
26 | Master the Art of Building, Optimizing, and Deploying Large Language Models for Production
27 |
28 | The ONLY resource you will need to develop LLMs that can thrive in real-world applications.
29 |
30 | This 500+ page guide covers everything from foundational concepts to advanced techniques, offering a clear roadmap for:
31 | * Understanding the core architectures behind LLMs, including transformers, GPT, BERT, and T5.
32 | * Training models from scratch, optimizing performance, and implementing distributed training with multi-GPU and TPU setups.
33 | * Fine-tuning pre-trained models for specific tasks and ensuring they are reliable, scalable, and efficient.
34 | * Practical strategies for integrating LLMs into business workflows, including case studies from industries like healthcare, finance, and education.
35 | * Addressing key challenges such as debugging, handling edge cases, and ensuring robust security and ethical compliance.
36 | * Mastering prompt engineering to enhance model performance, generate precise outputs, and unlock the full potential of LLMs in real-world applications.
37 |
38 | By the end of this book, you will have the expertise to:
39 | * Take LLMs from concept to production use with confidence.
40 | * Deploy LLMs in high-demand, real-world environments.
41 | * Solve challenges in scaling, optimizing, and maintaining LLMs in production.
42 | * Understand key ethical considerations and how to mitigate bias in LLM deployments.
43 |
44 | This book goes beyond theory, providing hands-on examples, case studies, and real-world insights that will help you apply LLMs effectively in your projects. Whether you're an AI engineer, data scientist, researcher, or business leader, Production-Ready LLMs equips you with the tools to stay ahead in the fast-paced world of AI.
45 |
46 | If you’re ready to move beyond experimentation and develop LLMs that deliver results in real-world scenarios, Production-Ready LLMs is your essential companion.
47 |
48 | ## Contents
49 |
50 | 1. Understanding Language Models
51 | 2. Architectures and Frameworks
52 | 3. The Mathematics behind LLMs
53 | 4. Data Collection and Preprocessing
54 | 5. Training LLMs from Scratch
55 | 6. Fine-Tuning Pre-trained Models
56 | 7. Prompt Engineering
57 | 8. Retrieval-Augmented Generation (RAG)
58 | 9. Model Optimization for Production
59 | 10. Debugging and Troubleshooting LLMs
60 | 11. Production-Ready LLMs
61 | 12. Security and Ethical Considerations
62 | 13. Integrating LLMs with Business Applications
63 | 14. LLMs in Healthcare
64 | 15. LLMs in Finance
65 | 16. LLMs in Education
66 | 17. Emerging Trends and Technologies
67 | 18. Conclusion and Final Thoughts
68 |
--------------------------------------------------------------------------------
/books/developing-apps-with-gpt-4-and-chatgpt.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/developing-apps-with-gpt-4-and-chatgpt.jpeg
--------------------------------------------------------------------------------
/books/developing-apps-with-gpt-4-and-chatgpt.md:
--------------------------------------------------------------------------------
1 | # Developing Apps with GPT-4 and ChatGPT
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: Developing Apps with GPT-4 and ChatGPT
10 | * **Subtitle**: Build Intelligent Chatbots, Content Generators, and More
11 | * **Authors**: Olivier Caelen and Marie-Alice Blete
12 | * **Publication Date**: 2023
13 | * **Publisher**: O'Reilly
14 | * **ISBN-13**: 978-1098152482
15 | * **Pages**: 155
16 | * **Amazon Rating**: 4.2 stars
17 | * **Goodreads Rating**: 3.67 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/8aDJJvi) |
21 | [Goodreads](https://www.goodreads.com/book/show/181704874-developing-apps-with-gpt-4-and-chatgpt) |
22 | [Publisher](https://www.oreilly.com/library/view/developing-apps-with/9781098152475/) |
23 | [GitHub Project](https://github.com/malywut/gpt_examples)
24 |
25 | ## Blurb
26 |
27 | This minibook is a comprehensive guide for Python developers who want to learn how to build applications with large language models. Authors Olivier Caelen and Marie-Alice Blete cover the main features and benefits of GPT-4 and ChatGPT and explain how they work. You'll also get a step-by-step guide for developing applications using the GPT-4 and ChatGPT Python library, including text generation, Q&A, and content summarization tools.
28 |
29 | Written in clear and concise language, Developing Apps with GPT-4 and ChatGPT includes easy-to-follow examples to help you understand and apply the concepts to your projects. Python code examples are available in a GitHub repository, and the book includes a glossary of key terms. Ready to harness the power of large language models in your applications? This book is a must.
30 |
31 | You'll learn:
32 |
33 | * The fundamentals and benefits of ChatGPT and GPT-4 and how they work
34 | * How to integrate these models into Python-based applications for NLP tasks
35 | * How to develop applications using GPT-4 or ChatGPT APIs in Python for text generation, question answering, and content summarization, among other tasks
36 | * Advanced GPT topics including prompt engineering, fine-tuning models for specific tasks, plug-ins, LangChain, and more
37 |
38 | ## Contents
39 |
40 | 1. GPT-4 and ChatGPT Essentials
41 | 2. A Deep Dive into the GPT-4 and ChatGPT APIs
42 | 3. Building Apps with GPT-4 and ChatGPT
43 | 4. Advanced GPT-4 and ChatGPT Techniques
44 | 5. Advancing LLM Capabilities with the LangChain Framework and Plug-ins
45 |
--------------------------------------------------------------------------------
/books/generative-ai-on-aws.md:
--------------------------------------------------------------------------------
1 | # Generative AI on AWS
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: Generative AI on AWS
10 | * **Subtitle**: Building Context-Aware Multimodal Reasoning Applications
11 | * **Authors**: Chris Fregly, Antje Barth and Shelbee Eigenbrode
12 | * **Publication Date**: 2023
13 | * **Publisher**: O'Reilly
14 | * **ISBN-13**: 978-1098159221
15 | * **Pages**: 309
16 | * **Amazon Rating**: 4.4 stars
17 | * **Goodreads Rating**: 4.50 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/f6xUdNI) |
21 | [Goodreads](https://www.goodreads.com/book/show/197525483-generative-ai-on-aws) |
22 | [Publisher](https://www.oreilly.com/library/view/generative-ai-on/9781098159214/) |
23 | [GitHub Project](https://github.com/generative-ai-on-aws/generative-ai-on-aws)
24 |
25 | ## Blurb
26 |
27 | Companies today are moving rapidly to integrate generative AI into their products and services. But there's a great deal of hype (and misunderstanding) about the impact and promise of this technology. With this book, Chris Fregly, Antje Barth, and Shelbee Eigenbrode from AWS help CTOs, ML practitioners, application developers, business analysts, data engineers, and data scientists find practical ways to use this exciting new technology.
28 |
29 | You'll learn the generative AI project life cycle including use case definition, model selection, model fine-tuning, retrieval-augmented generation, reinforcement learning from human feedback, and model quantization, optimization, and deployment. And you'll explore different types of models including large language models (LLMs) and multimodal models such as Stable Diffusion for generating images and Flamingo/IDEFICS for answering questions about images.
30 |
31 | * Apply generative AI to your business use cases
32 | * Determine which generative AI models are best suited to your task
33 | * Perform prompt engineering and in-context learning
34 | * Fine-tune generative AI models on your datasets with low-rank adaptation (LoRA)
35 | * Align generative AI models to human values with reinforcement learning from human feedback (RLHF)
36 | * Augment your model with retrieval-augmented generation (RAG)
37 | * Explore libraries such as LangChain and ReAct to develop agents and actions
38 | * Build generative AI applications with Amazon Bedrock
39 |
40 | ## Contents
41 |
42 | 1. Generative AI Use Cases, Fundamentals, Project Lifecycle
43 | 2. Prompt Engineering and In-Context Learning
44 | 3. Large-Language Foundation Models
45 | 4. Quantization and Distributed Computing
46 | 5. Fine-Tuning and Evaluation
47 | 6. Parameter-efficient Fine Tuning (PEFT)
48 | 7. Fine-tuning using Reinforcement Learning with RLHF
49 | 8. Optimize and Deploy Generative AI Applications
50 | 9. Retrieval Augmented Generation (RAG) and Agents
51 | 10. Multimodal Foundation Models
52 | 11. Controlled Generation and Fine-Tuning with Stable Diffusion
53 | 12. Amazon Bedrock Managed Service for Generative AI
54 |
--------------------------------------------------------------------------------
/books/generative-ai-on-aws.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/generative-ai-on-aws.png
--------------------------------------------------------------------------------
/books/generative-ai-with-langchain.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/generative-ai-with-langchain.jpeg
--------------------------------------------------------------------------------
/books/generative-ai-with-langchain.md:
--------------------------------------------------------------------------------
1 | # Generative AI with LangChain
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: Generative AI with LangChain
10 | * **Subtitle**: Build large language model (LLM) apps with Python, ChatGPT, and other LLMs
11 | * **Authors**: Ben Auffarth
12 | * **Publication Date**: 2023
13 | * **Publisher**: Packt
14 | * **ISBN-13**: 978-1835083468
15 | * **Pages**: 360
16 | * **Amazon Rating**: 4.3 stars
17 | * **Goodreads Rating**: 3.50 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/8kVpV3T) |
21 | [Goodreads](https://www.goodreads.com/book/show/185125672-generative-ai-with-langchain) |
22 | [Publisher](https://www.packtpub.com/en-us/product/generative-ai-with-langchain-9781835083468) |
23 | [GitHub Project](https://github.com/benman1/generative_ai_with_langchain)
24 |
25 | ## Blurb
26 |
27 | Key Features
28 | * Learn how to leverage LangChain to work around LLMs’ inherent weaknesses
29 | * Delve into LLMs with LangChain and explore their fundamentals, ethical dimensions, and application challenges
30 | * Get better at using ChatGPT and GPT models, from heuristics and training to scalable deployment, empowering you to transform ideas into reality
31 |
32 | Book Description
33 | ChatGPT and the GPT models by OpenAI have brought about a revolution not only in how we write and research but also in how we can process information. This book discusses the functioning, capabilities, and limitations of LLMs underlying chat systems, including ChatGPT and Gemini. It demonstrates, in a series of practical examples, how to use the LangChain framework to build production-ready and responsive LLM applications for tasks ranging from customer support to software development assistance and data analysis – illustrating the expansive utility of LLMs in real-world applications.
34 |
35 | Unlock the full potential of LLMs within your projects as you navigate through guidance on fine-tuning, prompt engineering, and best practices for deployment and monitoring in production environments. Whether you're building creative writing tools, developing sophisticated chatbots, or crafting cutting-edge software development aids, this book will be your roadmap to mastering the transformative power of generative AI with confidence and creativity.
36 |
37 | What you will learn
38 | * Create LLM apps with LangChain, like question-answering systems and chatbots
39 | * Understand transformer models and attention mechanisms
40 | * Automate data analysis and visualization using pandas and Python
41 | * Grasp prompt engineering to improve performance
42 | * Fine-tune LLMs and get to know the tools to unleash their power
43 | * Deploy LLMs as a service with LangChain and apply evaluation strategies
44 | * Privately interact with documents using open-source LLMs to prevent data leaks
45 |
46 | Who this book is for
47 | The book is for developers, researchers, and anyone interested in learning more about LangChain. Whether you are a beginner or an experienced developer, this book will serve as a valuable resource if you want to get the most out of LLMs using LangChain.
48 |
49 | Basic knowledge of Python is a prerequisite, while prior exposure to machine learning will help you follow along more easily.
50 |
51 | ## Contents
52 |
53 | 1. What Is Generative AI?
54 | 2. LangChain for LLM Apps
55 | 3. Getting Started with LangChain
56 | 4. Building Capable Assistants
57 | 5. Building a Chatbot like ChatGPT
58 | 6. Developing Software with Generative AI
59 | 7. LLMs for Data Science
60 | 8. Customizing LLMs and Their Output
61 | 9. Generative AI in Production
62 | 10. The Future of Generative Models
63 |
--------------------------------------------------------------------------------
/books/hands-on-large-language-models.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/hands-on-large-language-models.jpeg
--------------------------------------------------------------------------------
/books/hands-on-large-language-models.md:
--------------------------------------------------------------------------------
1 | # Hands-On Large Language Models
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: Hands-On Large Language Models
10 | * **Subtitle**: Language Understanding and Generation
11 | * **Authors**: Jay Alammar and Maarten Grootendorst
12 | * **Publication Date**: 2024
13 | * **Publisher**: O'Reilly
14 | * **ISBN-13**: 978-1098150969
15 | * **Pages**: 425
16 | * **Amazon Rating**: 4.6 stars
17 | * **Goodreads Rating**: 4.36 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/hXs5jDF) |
21 | [Goodreads](https://www.goodreads.com/book/show/210408850-hands-on-large-language-models) |
22 | [Publisher](https://www.oreilly.com/library/view/hands-on-large-language/9781098150952/) |
23 | [GitHub Project](https://github.com/HandsOnLLM/Hands-On-Large-Language-Models)
24 |
25 | ## Blurb
26 |
27 | AI has acquired startling new language capabilities in just the past few years. Driven by rapid advances in deep learning, language AI systems are able to write and understand text better than ever before. This trend is enabling new features, products, and entire industries. Through this book's visually educational nature, readers will learn practical tools and concepts they need to use these capabilities today.
28 |
29 | You'll understand how to use pretrained large language models for use cases like copywriting and summarization; create semantic search systems that go beyond keyword matching; and use existing libraries and pretrained models for text classification, search, and clustering.
30 |
31 | This book also helps you:
32 |
33 | * Understand the architecture of Transformer language models that excel at text generation and representation
34 | * Build advanced LLM pipelines to cluster text documents and explore the topics they cover
35 | * Build semantic search engines that go beyond keyword search, using methods like dense retrieval and rerankers
36 | * Explore how generative models can be used, from prompt engineering all the way to retrieval-augmented generation
37 | * Gain a deeper understanding of how to train LLMs and optimize them for specific applications using generative model fine-tuning, contrastive fine-tuning, and in-context learning
38 |
39 | ## Contents
40 |
41 | 1. Introduction to Language Models
42 | 2. Tokens and Embeddings
43 | 3. Looking Inside Transformer LLMs
44 | 4. Text Classification
45 | 5. Text Clustering and Topic Modeling
46 | 6. Prompt Engineering
47 | 7. Advanced Text Generation Techniques and Tools
48 | 8. Semantic Search and Retrieval-Augmented Generation
49 | 9. Multimodal Large Language Models
50 | 10. Creating Text Embedding Models
51 | 11. Fine-tuning Representation Models for Classification
52 | 12. Fine-tuning Generation Models
53 |
--------------------------------------------------------------------------------
/books/langchain-crash-course.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/langchain-crash-course.jpeg
--------------------------------------------------------------------------------
/books/langchain-crash-course.md:
--------------------------------------------------------------------------------
1 | # LangChain Crash Course
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: LangChain Crash Course
10 | * **Subtitle**: Build OpenAI LLM powered Apps: Fast track to building OpenAI LLM powered Apps using Python
11 | * **Authors**: Greg Lim
12 | * **Publication Date**: 2024
13 | * **Publisher**: Independently Published
14 | * **ISBN-13**: 978-9819411474
15 | * **Pages**: 88
16 | * **Amazon Rating**: 4.2 stars
17 | * **Goodreads Rating**: 4.07 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/ibgu6jy) |
21 | [Goodreads](https://www.goodreads.com/book/show/198671257-langchain-crash-course) |
22 | [Publisher](https://greglim.gumroad.com/l/langchain)
23 |
24 | ## Blurb
25 |
26 | In this short course, we take you on a fun, hands-on and pragmatic journey to learn how to build LLM powered apps using LangChain. You'll start building your first Generative AI app within minutes. Every section is recorded in a bite-sized manner and straight to the point as I don’t want to waste your time (and most certainly mine) on the content you don't need.
27 |
28 | In this course, we will cover:
29 |
30 | * What is LangChain
31 | * How does LangChain Work
32 | * Installation, Setup and Our First LangChain App
33 | * Building a Medium Article Generator App
34 | * Connecting to OpenAI LLM
35 | * Prompt Templates
36 | * Simple Chains
37 | * Sequential Chains
38 | * Agents
39 | * Chat with a Document
40 | * Adding Memory (Chat History)
41 | * Outputting the Chat History
42 | * Uploading Custom Documents
43 | * Loading Different Document Types (e.g. PDF, txt, docs)
44 | * Chat with YouTube and more...
45 |
46 | The goal of this course is to teach you LangChain development in a manageable way without overwhelming you. We focus only on the essentials and cover the material in a hands-on manner so you can code along.
47 |
48 | Working Through This Course
49 |
50 | This course is purposely broken into short sections, each centered on a different essential topic. The course takes a practical, hands-on approach to learning through practice. You learn best when you code along with the examples.
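
As a rough illustration of the prompt-template-plus-chain pattern the course covers, here is a minimal sketch in the current LangChain style (my own example, not code from the book, and the API may differ from the version the author uses; it assumes the `langchain-openai` package and an `OPENAI_API_KEY` in the environment, and the model name is a placeholder):

```python
# Minimal "article generator" chain sketch (illustrative, not taken from the book).
# Assumes: pip install langchain langchain-openai, and OPENAI_API_KEY set in the environment.
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.7)  # placeholder model choice

# A prompt template with a single input variable (the "Prompt Templates" topic).
prompt = ChatPromptTemplate.from_template(
    "Write a short Medium-style article about {topic}."
)

# A simple chain: prompt piped into the model (the "Simple Chains" topic).
chain = prompt | llm

print(chain.invoke({"topic": "vector databases"}).content)
```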
51 |
52 | ## Contents
53 |
54 | 1. Introduction
55 | 2. What Is LangChain
56 | 3. How Does LangChain Work
57 | 4. Installation, Setup And Our First LangChain App
58 | 5. Connecting To OpenAI LLM
59 | 6. Prompt Templates
60 | 7. Simple Chains
61 | 8. Sequential Chains
62 | 9. Agents
63 | 10. Chat With A Document
64 | 11. Adding Memory (Chat History)
65 | 12. Outputting The Chat History
66 | 13. Uploading Custom Documents
67 | 14. Loading Different File Types
68 | 15. Chat With YouTube
69 |
--------------------------------------------------------------------------------
/books/large-language-models.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/large-language-models.jpeg
--------------------------------------------------------------------------------
/books/large-language-models.md:
--------------------------------------------------------------------------------
1 | # Large Language Models
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: Large Language Models
10 | * **Subtitle**: A Deep Dive: Bridging Theory and Practice
11 | * **Authors**: Uday Kamath, Kevin Keenan, Garrett Somers, and Sarah Sorenson
12 | * **Publication Date**: 2024
13 | * **Publisher**: Springer
14 | * **ISBN-13**: 978-3031656460
15 | * **Pages**: 506
16 | * **Amazon Rating**: 4.1 stars
17 | * **Goodreads Rating**: 4.00 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/6IMNpkX) |
21 | [Goodreads](https://www.goodreads.com/book/show/214355031-large-language-models) |
22 | [Publisher](https://link.springer.com/book/10.1007/978-3-031-65647-7) |
23 | [GitHub Project](https://github.com/springer-llms-deep-dive/llms-deep-dive-tutorials)
24 |
25 | ## Blurb
26 |
27 | Large Language Models (LLMs) have emerged as a cornerstone technology, transforming how we interact with information and redefining the boundaries of artificial intelligence. LLMs offer an unprecedented ability to understand, generate, and interact with human language in an intuitive and insightful manner, leading to transformative applications across domains like content creation, chatbots, search engines, and research tools. While fascinating, the complex workings of LLMs―their intricate architecture, underlying algorithms, and ethical considerations―require thorough exploration, creating a need for a comprehensive book on this subject.
28 |
29 | This book provides an authoritative exploration of the design, training, evolution, and application of LLMs. It begins with an overview of pre-trained language models and Transformer architectures, laying the groundwork for understanding prompt-based learning techniques. Next, it dives into methods for fine-tuning LLMs, integrating reinforcement learning for value alignment, and the convergence of LLMs with computer vision, robotics, and speech processing. The book strongly emphasizes practical applications, detailing real-world use cases such as conversational chatbots, retrieval-augmented generation (RAG), and code generation. These examples are carefully chosen to illustrate the diverse and impactful ways LLMs are being applied in various industries and scenarios.
30 |
31 | Readers will gain insights into operationalizing and deploying LLMs, from implementing modern tools and libraries to addressing challenges like bias and ethical implications. The book also introduces the cutting-edge realm of multimodal LLMs that can process audio, images, video, and robotic inputs. With hands-on tutorials for applying LLMs to natural language tasks, this thorough guide equips readers with both theoretical knowledge and practical skills for leveraging the full potential of large language models.
32 |
33 | This comprehensive resource is appropriate for a wide audience: students, researchers and academics in AI or NLP, practicing data scientists, and anyone looking to grasp the essence and intricacies of LLMs.
34 |
35 | Key Features:
36 |
37 | * Over 100 techniques and state-of-the-art methods, including pre-training, prompt-based tuning, instruction tuning, parameter-efficient and compute-efficient fine-tuning, end-user prompt engineering, and building and optimizing Retrieval-Augmented Generation systems, along with strategies for aligning LLMs with human values using reinforcement learning
38 | * Over 200 datasets compiled in one place, covering everything from pre-training to multimodal tuning, providing a robust foundation for diverse LLM applications
39 | * Over 50 strategies to address key ethical issues such as hallucination, toxicity, bias, fairness, and privacy. Gain comprehensive methods for measuring, evaluating, and mitigating these challenges to ensure responsible LLM deployment
40 | * Over 200 benchmarks covering LLM performance across various tasks, ethical considerations, multimodal applications, and more than 50 evaluation metrics for the LLM lifecycle
41 | * Nine detailed tutorials that guide readers through pre-training, fine-tuning, alignment tuning, bias mitigation, multimodal training, and deploying large language models using tools and libraries compatible with Google Colab, ensuring practical application of theoretical concepts
42 | * Over 100 practical tips for data scientists and practitioners, offering implementation details, tricks, and tools to successfully navigate the LLM lifecycle and accomplish tasks efficiently
43 |
44 | ## Contents
45 |
46 | 1. Large Language Models: An Introduction
47 | 2. Language Models Pre-training
48 | 3. Prompt-based Learning
49 | 4. LLM Adaptation and Utilization
50 | 5. Tuning for LLM Alignment
51 | 6. LLM Challenges and Solutions
52 | 7. Retrieval-Augmented Generation
53 | 8. LLMs in Production
54 | 9. Multimodal LLMs
55 | 10. LLMs: Evolution and New Frontiers
56 |
--------------------------------------------------------------------------------
/books/llm-engineer's-handbook.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/llm-engineer's-handbook.jpeg
--------------------------------------------------------------------------------
/books/llm-engineer's-handbook.md:
--------------------------------------------------------------------------------
1 | # LLM Engineer's Handbook
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: LLM Engineer's Handbook
10 | * **Subtitle**: Master the art of engineering large language models from concept to production
11 | * **Authors**: Paul Iusztin and Maxime Labonne
12 | * **Publication Date**: 2024
13 | * **Publisher**: Packt
14 | * **ISBN-13**: 978-1836200079
15 | * **Amazon Rating**: 4.6 stars
16 | * **Goodreads Rating**: 3.54 stars
17 |
18 |
19 | **Links**: [Amazon](https://a.co/d/5H3ufht) |
20 | [Goodreads](https://www.goodreads.com/book/show/216193554-llm-engineer-s-handbook) |
21 | [Publisher](https://www.packtpub.com/en-au/product/llm-engineers-handbook-9781836200062) |
22 | [GitHub Project](https://github.com/PacktPublishing/LLM-Engineers-Handbook)
23 |
24 | ## Blurb
25 |
26 | Step into the world of LLMs with this practical guide that takes you from the fundamentals to deploying advanced applications using LLMOps best practices
27 |
28 | Purchase of the print or Kindle book includes a free eBook in PDF format
29 |
30 | “This book is instrumental in making sure that as many people as possible can not only use LLMs but also adapt them, fine-tune them, quantize them, and make them efficient enough to deploy in the real world.” - Julien Chaumond, CTO and Co-founder, Hugging Face
31 |
32 | Book Description
33 | This LLM book provides practical insights into designing, training, and deploying LLMs in real-world scenarios by leveraging MLOps' best practices. The guide walks you through building an LLM-powered twin that’s cost-effective, scalable, and modular. It moves beyond isolated Jupyter Notebooks, focusing on how to build production-grade end-to-end LLM systems.
34 |
35 | Throughout this book, you will learn data engineering, supervised fine-tuning, and deployment. The hands-on approach to building the LLM twin use case will help you implement MLOps components in your own projects. You will also explore cutting-edge advancements in the field, including inference optimization, preference alignment, and real-time data processing, making this a vital resource for those looking to apply LLMs in their projects.
36 |
37 | What you will learn
38 | * Implement robust data pipelines and manage LLM training cycles
39 | * Create your own LLM and refine it with the help of hands-on examples
40 | * Get started with LLMOps by diving into core MLOps principles like IaC
41 | * Perform supervised fine-tuning and LLM evaluation
42 | * Deploy end-to-end LLM solutions using AWS and other tools
43 | * Explore continuous training, monitoring, and logic automation
44 | * Learn about RAG ingestion as well as inference and feature pipelines
45 |
46 | Who this book is for
47 | This book is for AI engineers, NLP professionals, and LLM engineers looking to deepen their understanding of LLMs. Basic knowledge of LLMs, the Gen AI landscape, Python, and AWS is recommended. Whether you are new to AI or looking to enhance your skills, this book provides comprehensive guidance on implementing LLMs in real-world scenarios.
48 |
49 | ## Contents
50 |
51 | 1. Understanding the LLM Twin Concept and Architecture
52 | 2. Tooling and Installation
53 | 3. Data Engineering
54 | 4. RAG Feature Pipeline
55 | 5. Supervised Fine-tuning
56 | 6. Fine-tuning with Preference Alignment
57 | 7. Evaluating LLMs
58 | 8. Inference Optimization
59 | 9. RAG Inference Pipeline
60 | 10. Inference Pipeline Deployment
61 | 11. MLOps and LLMOps
62 | 12. MLOps Principles
63 |
--------------------------------------------------------------------------------
/books/llms-in-production.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/llms-in-production.jpeg
--------------------------------------------------------------------------------
/books/llms-in-production.md:
--------------------------------------------------------------------------------
1 | # LLMs in Production
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: LLMs in Production
10 | * **Subtitle**: From language models to successful products
11 | * **Authors**: Christopher Brousseau and Matthew Sharp
12 | * **Publication Date**: 2025
13 | * **Publisher**: Manning
14 | * **ISBN-13**: 978-1633437203
15 | * **Pages**: 456
16 | * **Amazon Rating**: 4.4 stars
17 | * **Goodreads Rating**: 4.08 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/gF1w56V) |
21 | [Goodreads](https://www.goodreads.com/book/show/215144443-llms-in-production) |
22 | [Publisher](https://www.manning.com/books/llms-in-production) |
23 | [GitHub Project](https://github.com/IMJONEZZ/LLMs-in-Production)
24 |
25 | ## Blurb
26 |
27 | Learn how to put Large Language Model-based applications into production safely and efficiently.
28 |
29 | Large Language Models (LLMs) are the foundation of AI tools like ChatGPT, Llama and Bard. This practical book offers clear, example-rich explanations of how LLMs work, how you can interact with them, and how to integrate LLMs into your own applications. In LLMs in Production you will:
30 |
31 | * Grasp the fundamentals of LLMs and the technology behind them
32 | * Evaluate when to use a premade LLM and when to build your own
33 | * Efficiently scale up an ML platform to handle the needs of LLMs
34 | * Train LLM foundation models and finetune an existing LLM
35 | * Deploy LLMs to the cloud and edge devices using complex architectures like RLHF
36 | * Build applications leveraging the strengths of LLMs while mitigating their weaknesses
37 |
38 | LLMs in Production delivers vital insights into MLOps for LLMs. You’ll learn how to operationalize these powerful AI models for chatbots, coding assistants, and more. Find out what makes LLMs so different from traditional software and ML, discover best practices for working with them out of the lab, and dodge common pitfalls with experienced advice.
39 | About the Book
40 | LLMs in Production is the comprehensive guide you’ll need to take LLMs into production use. It takes you through the entire lifecycle of an LLM, from initial concept, to creation and fine-tuning, all the way to deployment. You’ll discover how to effectively prepare an LLM dataset, cost-efficient training techniques like LoRA and RLHF, and how to evaluate your models against industry benchmarks.
41 |
42 | Learn to properly establish deployment infrastructure and address common challenges like retraining and load testing. Finally, you’ll go hands-on with three exciting example projects: a cloud-based LLM chatbot, a code-completion VS Code extension, and an LLM deployed to edge devices like the Raspberry Pi. By the time you’re done reading, you’ll be ready to start developing LLMs and effectively incorporating them into software.
43 |
44 | ## Contents
45 |
46 | 1. Word’s awakening: Why large language models have captured attention
47 | 2. Large language models: A deep dive into language modeling
48 | 3. Large language model operations: Building a platform for LLMs
49 | 4. Data engineering for large language models: Setting up for success
50 | 5. Training large language models: How to generate the generator
51 | 6. Large language model services: A practical guide
52 | 7. Prompt engineering: Becoming an LLM whisperer
53 | 8. Large language model applications: Building an interactive experience
54 | 9. Creating an LLM project: Reimplementing Llama 3
55 | 10. Creating a coding copilot project: This would have helped you earlier
56 | 11. Deploying an LLM on a Raspberry Pi: How low can you go?
57 | 12. Production, an ever-changing landscape: Things are just getting started
58 |
59 | Appendices
60 | * Appendix A: History of linguistics
61 | * Appendix B: Reinforcement learning with human feedback
62 | * Appendix C: Multimodal latent spaces
63 |
--------------------------------------------------------------------------------
/books/natural-language-processing-with-transformers.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/natural-language-processing-with-transformers.jpeg
--------------------------------------------------------------------------------
/books/natural-language-processing-with-transformers.md:
--------------------------------------------------------------------------------
1 | # Natural Language Processing with Transformers
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: Natural Language Processing with Transformers
10 | * **Subtitle**: Building Language Applications with Hugging Face
11 | * **Authors**: Lewis Tunstall, Leandro von Werra and Thomas Wolf
12 | * **Publication Date**: 2022
13 | * **Publisher**: O'Reilly
14 | * **ISBN-13**: 978-9355420329
15 | * **Pages**: 406
16 | * **Amazon Rating**: 4.6 stars
17 | * **Goodreads Rating**: 4.41 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/5WIiVAC) |
21 | [Goodreads](https://www.goodreads.com/book/show/60114857-natural-language-processing-with-transformers) |
22 | [Publisher](https://www.oreilly.com/library/view/natural-language-processing/9781098136789/) |
23 | [GitHub Project](https://github.com/nlp-with-transformers/notebooks)
24 |
25 | ## Blurb
26 |
27 | Since their introduction in 2017, transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you're a data scientist or coder, this practical book shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library.
28 |
29 | Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, authors Lewis Tunstall, Leandro von Werra, and Thomas Wolf, among the creators of Hugging Face Transformers, use a hands-on approach to teach you how transformers work and how to integrate them in your applications. You'll quickly learn a variety of tasks they can help you solve.
30 |
31 | * Build, debug, and optimize transformer models for core NLP tasks, such as text classification, named entity recognition, and question answering
32 | * Learn how transformers can be used for cross-lingual transfer learning
33 | * Apply transformers in real-world scenarios where labeled data is scarce
34 | * Make transformer models efficient for deployment using techniques such as distillation, pruning, and quantization
35 | * Train transformers from scratch and learn how to scale to multiple GPUs and distributed environments
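
To give a sense of how little code the core tasks listed above require with the Hugging Face pipeline API, here is a minimal sketch (my own illustration, not code from the book; it assumes the `transformers` and `torch` packages are installed and that default models are downloaded on first use):

```python
# Minimal Hugging Face Transformers sketch (illustrative, not taken from the book).
# Assumes: pip install transformers torch; default pretrained models download on first run.
from transformers import pipeline

# Text classification (sentiment) with a default pretrained model.
classifier = pipeline("text-classification")
print(classifier("Transformers finally made sense after reading this chapter."))

# Named entity recognition, another core task the book covers.
ner = pipeline("ner", aggregation_strategy="simple")
print(ner("Hugging Face is based in New York City."))
```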
36 |
37 | ## Contents
38 |
39 | 1. Introduction
40 | 2. Text Classification
41 | 3. Transformer Anatomy
42 | 4. Multilingual Named Entity Recognition
43 | 5. Text Generation
44 | 6. Summarization
45 | 7. Question Answering
46 | 8. Making Transformers Efficient in Production
47 | 9. Dealing with Few to No Labels
48 | 10. Training Transformers from Scratch
49 | 11. Future Directions
50 |
--------------------------------------------------------------------------------
/books/prompt-engineering-for-generative-ai.md:
--------------------------------------------------------------------------------
1 | # Prompt Engineering for Generative AI
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: Prompt Engineering for Generative AI
10 | * **Subtitle**: Future-Proof Inputs for Reliable AI Outputs
11 | * **Authors**: James Phoenix and Mike Taylor
12 | * **Publication Date**: 2024
13 | * **Publisher**: O'Reilly
14 | * **ISBN-13**: 978-1098153434
15 | * **Pages**: 422
16 | * **Amazon Rating**: 4.5 stars
17 | * **Goodreads Rating**: 3.56 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/52xLb9K) |
21 | [Goodreads](https://www.goodreads.com/book/show/204133880-prompt-engineering-for-generative-ai) |
22 | [Publisher](https://www.oreilly.com/library/view/prompt-engineering-for/9781098153427/) |
23 | [GitHub Project](https://github.com/BrightPool/prompt-engineering-for-generative-ai-examples)
24 |
25 | ## Blurb
26 |
27 | Large language models (LLMs) and diffusion models such as ChatGPT and Stable Diffusion have unprecedented potential. Because they have been trained on all the public text and images on the internet, they can make useful contributions to a wide variety of tasks. And with the barrier to entry greatly reduced today, practically any developer can harness LLMs and diffusion models to tackle problems previously unsuitable for automation.
28 |
29 | With this book, you'll gain a solid foundation in generative AI, including how to apply these models in practice. When first integrating LLMs and diffusion models into their workflows, most developers struggle to coax reliable enough results from them to use in automated systems. Authors James Phoenix and Mike Taylor show you how a set of principles called prompt engineering can enable you to work effectively with AI.
30 |
31 | Learn how to empower AI to work for you. This book explains:
32 |
33 | * The structure of the interaction chain of your program's AI model and the fine-grained steps in between
34 | * How AI model requests arise from transforming the application problem into a document completion problem in the model training domain
35 | * The influence of LLM and diffusion model architecture—and how to best interact with it
36 | * How these principles apply in practice in the domains of natural language processing, text and image generation, and code
37 |
38 | ## Contents
39 |
40 | 1. Five Pillars of Prompting
41 | 2. Intro to Text Generation Models
42 | 3. Standard Practices for Text Generation
43 | 4. Advanced Techniques for Text Generation with LangChain
44 | 5. Vector Databases
45 | 6. Autonomous Agents with Memory and Tools
46 | 7. Intro to Diffusion Models for Image Generation
47 | 8. Standard Practices for Image Generation
48 | 9. Advanced Techniques for Image Generation
49 | 10. Building AI-powered Applications
50 |
--------------------------------------------------------------------------------
/books/prompt-engineering-for-generative-ai.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/prompt-engineering-for-generative-ai.png
--------------------------------------------------------------------------------
/books/prompt-engineering-for-llms.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/prompt-engineering-for-llms.jpeg
--------------------------------------------------------------------------------
/books/prompt-engineering-for-llms.md:
--------------------------------------------------------------------------------
1 | # Prompt Engineering for LLMs
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: Prompt Engineering for LLMs
10 | * **Subtitle**: The Art and Science of Building Large Language Model–Based Applications
11 | * **Authors**: John Berryman and Albert Ziegler
12 | * **Publication Date**: 2024
13 | * **Publisher**: O'Reilly
14 | * **ISBN-13**: 978-1098156152
15 | * **Pages**: 280
16 | * **Amazon Rating**: 4.4 stars
17 | * **Goodreads Rating**: 4.00 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/eyWEQ4A) |
21 | [Goodreads](https://www.goodreads.com/book/show/213739653-prompt-engineering-for-llms) |
22 | [Publisher](https://www.oreilly.com/library/view/prompt-engineering-for/9781098156145/)
23 |
24 | ## Blurb
25 |
26 | Large language models (LLMs) are revolutionizing the world, promising to automate tasks and solve complex problems. A new generation of software applications is using these models as building blocks to unlock new potential in almost every domain, but reliably accessing these capabilities requires new skills. This book will teach you the art and science of prompt engineering, the key to unlocking the true potential of LLMs.
27 |
28 | Industry experts John Berryman and Albert Ziegler share how to communicate effectively with AI, transforming your ideas into a language model-friendly format. By learning both the philosophical foundation and practical techniques, you'll be equipped with the knowledge and confidence to build the next generation of LLM-powered applications.
29 |
30 | * Understand LLM architecture and learn how to best interact with it
31 | * Design a complete prompt-crafting strategy for an application
32 | * Gather, triage, and present context elements to make an efficient prompt
33 | * Master specific prompt-crafting techniques like few-shot learning, chain-of-thought prompting, and RAG
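
To make the few-shot idea in the last bullet concrete, here is a minimal sketch (my own, not the authors' code; the ticket labels and model name are made-up examples, and it assumes the `openai` Python package with an `OPENAI_API_KEY` in the environment):

```python
# Minimal few-shot prompt sketch (illustrative; not code from the book).
# Assumes: pip install openai, and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

# A handful of worked examples steers the model toward the desired labels and format.
few_shot = [
    {"role": "user", "content": "Ticket: 'App crashes on login.' Label?"},
    {"role": "assistant", "content": "bug"},
    {"role": "user", "content": "Ticket: 'Please add dark mode.' Label?"},
    {"role": "assistant", "content": "feature-request"},
]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "system", "content": "Answer with a single label."}]
    + few_shot
    + [{"role": "user", "content": "Ticket: 'Export to CSV is broken.' Label?"}],
)
print(response.choices[0].message.content)
```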
34 |
35 | ## Contents
36 |
37 | 1. Introduction to Prompt Engineering
38 | 2. Understanding LLMs
39 | 3. Moving to Chat
40 | 4. Designing LLM Applications
41 | 5. Prompt Content
42 | 6. Assembling the Prompt
43 | 7. Taming the Model
44 | 8. Conversational Agency
45 | 9. LLM Workflows
46 | 10. Evaluating LLM Applications
47 | 11. Looking Ahead
48 |
--------------------------------------------------------------------------------
/books/quick-start-guide-to-large-language-models.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/quick-start-guide-to-large-language-models.jpeg
--------------------------------------------------------------------------------
/books/quick-start-guide-to-large-language-models.md:
--------------------------------------------------------------------------------
1 | # Quick Start Guide to Large Language Models
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: Quick Start Guide to Large Language Models
10 | * **Subtitle**: Strategies and Best Practices for ChatGPT, Embeddings, Fine-Tuning, and Multimodal AI
11 | * **Authors**: Sinan Ozdemir
12 | * **Publication Date**: 2024
13 | * **Publisher**: Addison-Wesley
14 | * **ISBN-13**: 978-0135346563
15 | * **Pages**: 384
16 | * **Amazon Rating**: 4.9 stars
17 | * **Goodreads Rating**: 3.64 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/aUsDJ7e) |
21 | [Goodreads](https://www.goodreads.com/book/show/126850297-quick-start-guide-to-large-language-models) |
22 | [Publisher](https://www.pearson.com/en-us/subject-catalog/p/quick-start-guide-to-large-language-models-2nd-edition/P200000012793) |
23 | [GitHub Project](https://github.com/sinanuozdemir/quick-start-guide-to-llms)
24 |
25 | ## Blurb
26 |
27 | The Practical, Step-by-Step Guide to Using LLMs at Scale in Projects and Products
28 |
29 | Large Language Models (LLMs) like Llama 3, Claude 3, and the GPT family are demonstrating breathtaking capabilities, but their size and complexity have deterred many practitioners from applying them. In Quick Start Guide to Large Language Models, Second Edition, pioneering data scientist and AI entrepreneur Sinan Ozdemir clears away those obstacles and provides a guide to working with, integrating, and deploying LLMs to solve practical problems.
30 |
31 | Ozdemir brings together all you need to get started, even if you have no direct experience with LLMs: step-by-step instructions, best practices, real-world case studies, and hands-on exercises. Along the way, he shares insights into LLMs' inner workings to help you optimize model choice, data formats, prompting, fine-tuning, performance, and much more. The resources on the companion website include sample datasets and up-to-date code for working with open- and closed-source LLMs such as those from OpenAI (GPT-4 and GPT-3.5), Google (BERT, T5, and Gemini), X (Grok), Anthropic (the Claude family), Cohere (the Command family), and Meta (BART and the LLaMA family).
32 |
33 | * Learn key concepts: pre-training, transfer learning, fine-tuning, attention, embeddings, tokenization, and more
34 | * Use APIs and Python to fine-tune and customize LLMs for your requirements
35 | * Build a complete neural/semantic information retrieval system and attach it to conversational LLMs to build retrieval-augmented generation (RAG) chatbots and AI agents
36 | * Master advanced prompt engineering techniques like output structuring, chain-of-thought prompting, and semantic few-shot prompting
37 | * Customize LLM embeddings to build a complete recommendation engine from scratch with user data that outperforms out-of-the-box embeddings from OpenAI
38 | * Construct and fine-tune multimodal Transformer architectures from scratch using open-source LLMs and large visual datasets
39 | * Align LLMs using Reinforcement Learning from Human and AI Feedback (RLHF/RLAIF) to build conversational agents from open models like Llama 3 and FLAN-T5
40 | * Deploy prompts and custom fine-tuned LLMs to the cloud with scalability and evaluation pipelines in mind
41 | * Diagnose and optimize LLMs for speed, memory, and performance with quantization, probing, benchmarking, and evaluation frameworks
42 |
43 | ## Contents
44 |
45 | Part I - Introduction to Large Language Models
46 | * Chapter 2: Semantic Search with LLMs
47 | * Chapter 3: First Steps with Prompt Engineering
48 | * Chapter 4: The AI Ecosystem: Putting the Pieces Together
49 |
50 | Part II - Getting the Most Out of LLMs
51 | * Chapter 5: Optimizing LLMs with Customized Fine-Tuning
52 | * Chapter 6: Advanced Prompt Engineering
53 | * Chapter 7: Customizing Embeddings and Model Architectures
54 |
55 | Part III - Advanced LLM Usage
56 | * Chapter 9: Moving Beyond Foundation Models
57 | * Chapter 10: Advanced Open-Source LLM Fine-Tuning
58 | * Chapter 11: Moving LLMs into Production
59 | * Chapter 12: Evaluating LLMs
60 |
--------------------------------------------------------------------------------
/books/rag-driven-generative-ai.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/rag-driven-generative-ai.jpeg
--------------------------------------------------------------------------------
/books/rag-driven-generative-ai.md:
--------------------------------------------------------------------------------
1 | # RAG-Driven Generative AI
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: RAG-Driven Generative AI
10 | * **Subtitle**: Build custom retrieval augmented generation pipelines with LlamaIndex, Deep Lake, and Pinecone
11 | * **Authors**: Denis Rothman
12 | * **Publication Date**: 2024
13 | * **Publisher**: Packt
14 | * **ISBN-13**: 978-1836200918
15 | * **Pages**: 334
16 | * **Amazon Rating**: 4 stars
17 | * **Goodreads Rating**: 3.90 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/2zjaDK4) |
21 | [Goodreads](https://www.goodreads.com/book/show/214330235-rag-driven-generative-ai) |
22 | [Publisher](https://www.packtpub.com/en-us/product/rag-driven-generative-ai-9781836200918) |
23 | [GitHub Project](https://github.com/Denis2054/RAG-Driven-Generative-AI)
24 |
25 | ## Blurb
26 |
27 | Minimize AI hallucinations and build accurate, custom generative AI pipelines with RAG using embedded vector databases and integrated human feedback
28 |
29 | Purchase of the print or Kindle book includes a free eBook in PDF format
30 |
31 | Key Features
32 | * Implement RAG’s traceable outputs, linking each response to its source document to build reliable multimodal conversational agents
33 | * Deliver accurate generative AI models in pipelines integrating RAG, real-time human feedback improvements, and knowledge graphs
34 | * Balance cost and performance between dynamic retrieval datasets and fine-tuning static data
35 |
36 | Book Description
37 | RAG-Driven Generative AI provides a roadmap for building effective LLM, computer vision, and generative AI systems that balance performance and costs.
38 |
39 | This book offers a detailed exploration of RAG and how to design, manage, and control multimodal AI pipelines. By connecting outputs to traceable source documents, RAG improves output accuracy and contextual relevance, offering a dynamic approach to managing large volumes of information. This AI book shows you how to build a RAG framework, providing practical knowledge on vector stores, chunking, indexing, and ranking. You’ll discover techniques to optimize your project’s performance and better understand your data, including using adaptive RAG and human feedback to refine retrieval accuracy, balancing RAG with fine-tuning, implementing dynamic RAG to enhance real-time decision-making, and visualizing complex data with knowledge graphs.
40 |
41 | You’ll be exposed to a hands-on blend of frameworks like LlamaIndex and Deep Lake, vector databases such as Pinecone and Chroma, and models from Hugging Face and OpenAI. By the end of this book, you will have acquired the skills to implement intelligent solutions, keeping you competitive in fields from production to customer service across any project.
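
To ground the terminology above (vector store, chunking, indexing, ranking, traceable retrieval), here is a deliberately library-free sketch of the retrieve-then-generate loop; this is my own illustration, not the book's code, and the `embed` stub stands in for a real embedding model such as the OpenAI or Hugging Face models the book uses:

```python
# Library-free sketch of a retrieve-then-generate (RAG) loop (illustrative only;
# the book builds the real pipelines with LlamaIndex, Deep Lake, Pinecone, and Chroma).
import numpy as np

def embed(text: str) -> np.ndarray:
    """Stand-in embedding: a pseudo-random unit vector per text.
    A real system would call an OpenAI or Hugging Face embedding model here,
    so the ranking below is meaningless until this stub is replaced."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=384)
    return v / np.linalg.norm(v)

documents = [
    "Pinecone and Deep Lake store dense vectors for similarity search.",
    "Chunking splits long documents into retrievable passages.",
    "Knowledge graphs can complement vector retrieval.",
]
index = np.vstack([embed(d) for d in documents])  # the "vector store"

def retrieve(query: str, k: int = 2) -> list[str]:
    scores = index @ embed(query)  # cosine similarity, since all vectors are unit-norm
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]  # traceable: you know which sources were used

context = "\n".join(retrieve("Where are the vectors kept?"))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: Where are the vectors kept?"
# `prompt` would then be sent to an LLM; grounding it in retrieved text is what curbs hallucination.
print(prompt)
```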
42 |
43 | What you will learn
44 | * Scale RAG pipelines to handle large datasets efficiently
45 | * Employ techniques that minimize hallucinations and ensure accurate responses
46 | * Implement indexing techniques to improve AI accuracy with traceable and transparent outputs
47 | * Customize and scale RAG-driven generative AI systems across domains
48 | * Find out how to use Deep Lake and Pinecone for efficient and fast data retrieval
49 | * Control and build robust generative AI systems grounded in real-world data
50 | * Combine text and image data for richer, more informative AI responses
51 |
52 | Who this book is for
53 | This book is ideal for data scientists, AI engineers, machine learning engineers, and MLOps engineers. If you are a solutions architect, software developer, product manager, or project manager looking to enhance the decision-making process of building RAG applications, then you’ll find this book useful.
54 |
55 | ## Contents
56 |
57 | 1. Why Retrieval Augmented Generation?
58 | 2. RAG Embedding Vector Stores with Deep Lake and OpenAI
59 | 3. Building Index-Based RAG with LlamaIndex, Deep Lake, and OpenAI
60 | 4. Multimodal Modular RAG for Drone Technology
61 | 5. Boosting RAG Performance with Expert Human Feedback
62 | 6. Scaling RAG Bank Customer Data with Pinecone
63 | 7. Building Scalable Knowledge-Graph-Based RAG with Wikipedia API and LlamaIndex
64 | 8. Dynamic RAG with Chroma and Hugging Face Llama
65 | 9. Empowering AI Models: Fine-Tuning RAG Data and Human Feedback
66 | 10. RAG for Video Stock Production with Pinecone and OpenAI
67 |
--------------------------------------------------------------------------------
/books/super-study-guide.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/super-study-guide.jpeg
--------------------------------------------------------------------------------
/books/super-study-guide.md:
--------------------------------------------------------------------------------
1 | # Super Study Guide
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: Super Study Guide
10 | * **Subtitle**: Transformers & Large Language Models
11 | * **Authors**: Afshine Amidi and Shervine Amidi
12 | * **Publication Date**: 2024
13 | * **Publisher**: Independently published
14 | * **ISBN-13**: 979-8836693312
15 | * **Pages**: 247
16 | * **Amazon Rating**: 4.6 stars
17 | * **Goodreads Rating**: 4.58 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/aE3pz72) |
21 | [Goodreads](https://www.goodreads.com/book/show/217141763-super-study-guide) |
22 | [Publisher](https://superstudy.guide/transformers-large-language-models/)
23 |
24 | ## Blurb
25 |
26 | This book is a concise and illustrated guide for anyone who wants to understand the inner workings of large language models, whether for interviews, for projects, or to satisfy their own curiosity.
27 |
28 | It is divided into 5 parts:
29 |
30 | * Foundations: primer on neural networks and important deep learning concepts for training and evaluation
31 | * Embeddings: tokenization algorithms, word-embeddings (word2vec) and sentence embeddings (RNN, LSTM, GRU)
32 | * Transformers: motivation behind its self-attention mechanism, detailed overview of the encoder-decoder architecture and related variations such as BERT, GPT and T5, along with tips and tricks on how to speed up computations
33 | * Large language models: main techniques to tune Transformer-based models, such as prompt engineering, (parameter-efficient) finetuning and preference tuning
34 | * Applications: most common problems including sentiment extraction, machine translation, retrieval-augmented generation and many more
35 |
36 | ## Contents
37 |
38 | 1. Foundations
39 | 2. Embeddings
40 | 3. Transformers
41 | 4. Large language models
42 | 5. Applications
43 |
--------------------------------------------------------------------------------
/books/the-developer's-playbook-for-large-language-model-security.jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/the-developer's-playbook-for-large-language-model-security.jpeg
--------------------------------------------------------------------------------
/books/the-developer's-playbook-for-large-language-model-security.md:
--------------------------------------------------------------------------------
1 | # The Developer's Playbook for Large Language Model Security
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: The Developer's Playbook for Large Language Model Security
10 | * **Subtitle**: Building Secure AI Applications
11 | * **Authors**: Steve Wilson
12 | * **Publication Date**: 2024
13 | * **Publisher**: O'Reilly
14 | * **ISBN-13**: 978-1098162207
15 | * **Pages**: 200
16 | * **Amazon Rating**: 5 stars
17 | * **Goodreads Rating**: 3.88 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/d3rJVkn) |
21 | [Goodreads](https://www.goodreads.com/book/show/210408897-the-developer-s-playbook-for-large-language-model-security) |
22 | [Publisher](https://www.oreilly.com/library/view/the-developers-playbook/9781098162191/)
23 |
24 | ## Blurb
25 |
26 | Large language models (LLMs) are not just shaping the trajectory of AI, they're also unveiling a new era of security challenges. This practical book takes you straight to the heart of these threats. Author Steve Wilson, chief product officer at Exabeam, focuses exclusively on LLMs, eschewing generalized AI security to delve into the unique characteristics and vulnerabilities inherent in these models.
27 |
28 | Complete with collective wisdom gained from the creation of the OWASP Top 10 for LLMs list—a feat accomplished by more than 400 industry experts—this guide delivers real-world guidance and practical strategies to help developers and security teams grapple with the realities of LLM applications. Whether you're architecting a new application or adding AI features to an existing one, this book is your go-to resource for mastering the security landscape of the next frontier in AI.
29 |
30 | You'll learn:
31 |
32 | * Why LLMs present unique security challenges
33 | * How to navigate the many risk conditions associated with using LLM technology
34 | * The threat landscape pertaining to LLMs and the critical trust boundaries that must be maintained
35 | * How to identify the top risks and vulnerabilities associated with LLMs
36 | * Methods for deploying defenses to protect against attacks on top vulnerabilities
37 | * Ways to actively manage critical trust boundaries on your systems to ensure secure execution and risk minimization
38 |
39 | ## Contents
40 |
41 | 1. Chatbots Breaking Bad
42 | 2. The OWASP Top 10 for LLM Applications
43 | 3. Architectures and Trust Boundaries
44 | 4. Prompt Injection
45 | 5. Can Your LLM Know Too Much?
46 | 6. Do Language Models Dream of Electric Sheep?
47 | 7. Trust No One
48 | 8. Don’t Lose Your Wallet
49 | 9. Find the Weakest Link
50 | 10. Learning from Future History
51 | 11. Trust the Process
52 | 12. A Practical Framework for Responsible AI Security
53 |
--------------------------------------------------------------------------------
/books/what-is-chatgpt-doing....jpeg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Jason2Brownlee/awesome-llm-books/7aad1f488e993fdf011149a22b2d8a7bae98b007/books/what-is-chatgpt-doing....jpeg
--------------------------------------------------------------------------------
/books/what-is-chatgpt-doing....md:
--------------------------------------------------------------------------------
1 | # What Is ChatGPT Doing...
2 |
3 | [home](../)
4 |
5 | 
6 |
7 | ## Details
8 |
9 | * **Title**: What Is ChatGPT Doing...
10 | * **Subtitle**: ...and Why Does It Work?
11 | * **Authors**: Stephen Wolfram
12 | * **Publication Date**: 2023
13 | * **Publisher**: Wolfram Media Inc.
14 | * **ISBN-13**: 978-1579550813
15 | * **Pages**: 102
16 | * **Amazon Rating**: 4.2 stars
17 | * **Goodreads Rating**: 3.86 stars
18 |
19 |
20 | **Links**: [Amazon](https://a.co/d/79xVzR5) |
21 | [Goodreads](https://www.goodreads.com/book/show/123451665-what-is-chatgpt-doing-and-why-does-it-work) |
22 | [Publisher](https://writings.stephenwolfram.com/2023/02/what-is-chatgpt-doing-and-why-does-it-work/)
23 |
24 | ## Blurb
25 |
26 | Nobody expected this—not even its creators: ChatGPT has burst onto the scene as an AI capable of writing at a convincingly human level. But how does it really work? What's going on inside its "AI mind"? In this short book, prominent scientist and computation pioneer Stephen Wolfram provides a readable and engaging explanation that draws on his decades-long unique experience at the frontiers of science and technology. Find out how the success of ChatGPT brings together the latest neural net technology with foundational questions about language and human thought posed by Aristotle more than two thousand years ago.
27 |
28 | ## Contents
29 |
30 | 1. It's Just Adding One Word at a Time
31 | 2. Where Do the Probabilities Come From?
32 | 3. What Is a Model?
33 | 4. Models for Human-Like Tasks
34 | 5. Neural Nets
35 | 6. Machine Learning, and the Training of Neural Nets
36 | 7. The Practice and Lore of Neural Net Training
37 | 8. "Surely a Network That’s Big Enough Can Do Anything!""
38 | 9. The Concept of Embeddings
39 | 10. Inside ChatGPT
40 | 11. The Training of ChatGPT
41 | 12. Beyond Basic Training
42 | 13. What Really Lets ChatGPT Work?
43 | 14. Meaning Space and Semantic Laws of Motion
44 | 15. Semantic Grammar and the Power of Computational Language
45 | 16. So ... What Is ChatGPT Doing, and Why Does It Work?
46 |
--------------------------------------------------------------------------------