Episode 284: Running Local LLMs With Ollama and Connecting With Python
The Real Python Podcast
Would you like to learn how to work with LLMs locally on your own computer? How do you integrate your Python projects with a local model? Christopher Trudeau is back on the show this week with another batch of PyCoder’s Weekly articles and projects.
Episode Sponsor:
We cover a recent Real Python step-by-step tutorial on installing local LLMs with Ollama and connecting them to Python. It begins by outlining the advantages this strategy offers, including reducing costs, improving privacy, and enabling offline-capable AI-powered apps. We talk through the steps of setting things up, generating text and code, and calling tools.
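As a taste of what the tutorial covers, here is a minimal sketch of generating text from a local model over Ollama's REST API. It assumes Ollama is running on its default port (11434) and that a model such as `llama3.2` has already been pulled locally; the model name and helper function names are illustrative, not from the tutorial.

```python
# Sketch: text generation against a locally running Ollama server.
# Assumes `ollama serve` is running and the "llama3.2" model is pulled.
import json
from urllib.request import Request, urlopen

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt, model="llama3.2"):
    """Build the JSON body for a non-streaming /api/generate request."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt, model="llama3.2"):
    """POST the prompt to the local Ollama server and return its reply."""
    body = json.dumps(build_payload(prompt, model)).encode("utf-8")
    request = Request(
        OLLAMA_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urlopen(request) as response:
        return json.loads(response.read())["response"]

if __name__ == "__main__":
    print(generate("Explain list comprehensions in one sentence."))
```

The tutorial itself uses the official `ollama` Python package, which wraps this same HTTP API in a friendlier interface.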
We also share other articles and projects from the Python community, including the 2026 Python Developers Survey, creating callable instances with Python’s .__call__(), creating maps and projections with GeoPandas, ending 15 years of subprocess polling, discussing backseat software, a retry library that classifies errors, and a peer-to-peer encrypted CLI chat project.
This episode is sponsored by Honeybadger.
Course Spotlight: Tips for Using the AI Coding Editor Cursor
Learn Cursor fast: Use AI-powered coding with agents, project-aware chat, and inline edits to supercharge your VS Code workflow.
Topics:
- 00:00:00 – Introduction
- 00:02:37 – Take the Python Developers Survey 2026
- 00:03:07 – How to Integrate Local LLMs With Ollama and Python
- 00:08:15 – Sponsor: Honeybadger
- 00:09:01 – Create Callable Instances With Python’s .__call__()
- 00:12:13 – GeoPandas Basics: Maps, Projections, and Spatial Joins
- 00:16:03 – Ending 15 Years of subprocess Polling
- 00:18:57 – Video Course Spotlight
- 00:20:23 – Backseat Software – Mike Swanson
- 00:39:06 – cmd-chat: Peer-to-Peer Encrypted CLI Chat
- 00:41:58 – redress: A Retry Library That Classifies Errors
- 00:43:56 – Thanks and goodbye
News:
- Take the Python Developers Survey 2026
- The State of Python 2025: Trends and Survey Insights - The PyCharm Blog
Topics:
- How to Integrate Local LLMs With Ollama and Python – Learn how to integrate your Python projects with local models (LLMs) using Ollama for enhanced privacy and cost efficiency.
- Create Callable Instances With Python’s .__call__() – Learn about Python callables, including what “callable” means, how to use .__call__(), and how to build callable objects with step-by-step examples.
- GeoPandas Basics: Maps, Projections, and Spatial Joins – Dive into GeoPandas with this tutorial covering data loading, mapping, CRS concepts, projections, and spatial joins for intuitive analysis.
- Ending 15 Years of subprocess Polling – Python’s standard library subprocess module relies on busy-loop polling to determine whether a process has completed. Modern operating systems provide callback mechanisms for this, and Python 3.15 will take advantage of them.
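The `.__call__()` topic above can be illustrated with a short sketch: defining `.__call__()` on a class makes its instances behave like functions while still carrying state. The `Counter` class here is a hypothetical example, not one from the tutorial.

```python
class Counter:
    """A callable instance that counts how many times it is invoked."""

    def __init__(self):
        self.count = 0

    def __call__(self, step=1):
        # Calling the instance like a function runs this method.
        self.count += step
        return self.count

counter = Counter()
counter()        # → 1
counter()        # → 2
counter(step=3)  # → 5
```

Because the state lives on the instance, callables like this are a common alternative to closures when the accumulated data needs to be inspected later (here via `counter.count`).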
Discussion:
Projects:
Additional Links:
- Ollama
- Python’s .__call__() Method: Creating Callable Instances – Real Python
- Quiz: GeoPandas Basics: Maps, Projections, and Spatial Joins