LLM Application Development With Python

Learning Path · Skills: OpenAI, Ollama, OpenRouter, Prompt Engineering, LangChain, LlamaIndex, ChromaDB, MarkItDown, RAG, Embeddings, Pydantic AI, LangGraph, MCP

Two people operating a large machine with conveyor belts and panels labeled RAG, Agents, and MCP, alongside a robotic arm, a Python logo, and API key icons.

Large language models can do much more than answer questions in a chat window. This learning path teaches you to integrate LLMs into Python applications, from API calls to autonomous agents.

By completing this path, you’ll be able to:

  • Call LLM APIs from OpenAI, Ollama, and OpenRouter in your Python code
  • Write effective prompts that produce reliable, structured results
  • Build retrieval-augmented generation (RAG) pipelines with LlamaIndex, ChromaDB, and LangChain
  • Convert documents into LLM-ready formats with MarkItDown
  • Create stateful AI agents using Pydantic AI and LangGraph
  • Connect agents to external tools and data sources using MCP servers

This path is for Python developers who want to build applications on top of language models. You should be comfortable with Python basics and working with APIs.

You’ll start by calling model APIs directly, then move into prompt engineering, RAG pipelines, agent frameworks, and finish by connecting your agents to external tools through MCP.

LLM Application Development With Python

Learning Path ⋅ 13 Resources

Connect to LLM APIs

Start by learning how to call large language models from Python, whether through cloud APIs or local inference.

Tutorial

How to Integrate ChatGPT's API With Python Projects

Learn how to use the ChatGPT Python API with the openai library to build AI-powered features in your Python applications.

Tutorial

How to Integrate Local LLMs With Ollama and Python

Learn how to integrate your Python projects with local large language models (LLMs) using Ollama for enhanced privacy and cost efficiency.

Tutorial

How to Use the OpenRouter API to Access Multiple AI Models via Python

Access models from popular AI providers in Python through OpenRouter's unified API with smart routing, fallbacks, and cost controls.

Craft Effective Prompts

Learn how to write prompts that get reliable, structured results from language models.

Tutorial

Prompt Engineering: A Practical Example

Learn prompt engineering techniques with a practical, real-world project to get better results from large language models. This tutorial covers zero-shot and few-shot prompting, delimiters, numbered steps, role prompts, chain-of-thought prompting, and more. Improve your LLM-assisted projects today.
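As a small taste of the techniques listed above, here's a sketch of few-shot prompting with delimiters: each labeled example is wrapped in `>>>`/`<<<` markers so the model can clearly separate instructions from data. The sentiment-classification task and delimiter choice are illustrative, not from the tutorial:

```python
def build_few_shot_prompt(examples: list[tuple[str, str]], task_input: str) -> str:
    """Assemble a few-shot prompt with delimiters around each text sample."""
    lines = ["Classify the sentiment of the delimited text as positive or negative.", ""]
    for text, label in examples:
        lines.append(f">>>{text}<<<")        # Delimited example input
        lines.append(f"Sentiment: {label}")  # Expected label for the example
        lines.append("")
    lines.append(f">>>{task_input}<<<")      # The text to classify
    lines.append("Sentiment:")               # Cue the model to complete the label
    return "\n".join(lines)

examples = [("I love this!", "positive"), ("Terrible service.", "negative")]
prompt = build_few_shot_prompt(examples, "The docs were clear and helpful.")
print(prompt)
```

The resulting string is what you'd pass as the user message to any of the APIs from the previous section.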

Interactive Quiz

Practical Prompt Engineering

Work With LLM Frameworks

Use LangChain to build reusable chains and pipelines around language models.

Course

First Steps With LangChain

Large language models (LLMs) have taken the world by storm. In this step-by-step video course, you'll learn to use the LangChain library to build LLM-assisted applications.

Interactive Quiz

First Steps With LangChain

Add Retrieval-Augmented Generation (RAG)

Ground your LLM apps in real data using embeddings, vector databases, and retrieval pipelines.

Tutorial

LlamaIndex in Python: A RAG Guide With Examples

Learn how to set up LlamaIndex, choose an LLM, load your data, build and persist an index, and run queries that return grounded, reliable answers, with worked examples along the way.

Tutorial

Embeddings and Vector Databases With ChromaDB

Vector databases are a crucial component of many NLP applications. This tutorial will give you hands-on experience with ChromaDB, an open-source vector database that's quickly gaining traction. Along the way, you'll learn what's needed to understand vector databases with practical examples.

Tutorial

Python MarkItDown: Convert Documents Into LLM-Ready Markdown

Get started with Python MarkItDown to turn PDFs, Office files, images, and URLs into clean, LLM-ready Markdown in seconds.

Build AI Agents

Go beyond single prompts and build agents that reason, maintain state, and use tools.

Tutorial

Pydantic AI: Build Type-Safe LLM Agents in Python

Learn how to use Pydantic AI to build type-safe LLM agents in Python with structured outputs, function calling, and dependency injection patterns.

Tutorial

LangGraph: Build Stateful AI Agents in Python

LangGraph is a versatile Python library for building stateful, cyclic, and multi-actor large language model (LLM) applications. This tutorial gives you an overview of LangGraph fundamentals through hands-on examples, along with the tools you need to build your own LLM workflows and agents.

Connect Agents to External Tools With MCP

Use the Model Context Protocol to give your agents access to databases, APIs, and files.

Tutorial

Python MCP Server: Connect LLMs to Your Data

Learn how to build a Model Context Protocol (MCP) server in Python. Connect tools, prompts, and data to AI agents like Cursor for smarter assistants.

Tutorial

Build a Python MCP Client to Test Servers From Your Terminal

Follow this Python project to build an MCP client that discovers MCP server capabilities and feeds an AI-powered chat with tool calls.

Congratulations on completing this learning path! You can now call LLM APIs, build RAG pipelines, create AI agents, and connect them to external tools using MCP.
