- Virtual environments keep your project dependencies isolated.
- They help you avoid version conflicts between packages and between different versions of the Python runtime.
- As a best practice, all of your Python projects should use virtual environments to store their dependencies.
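The whole lifecycle covered in this module fits in a few commands. A minimal sketch (the `.venv` folder name and the `requests` package are just illustrative choices; on Windows, the activate script lives at `.venv\Scripts\activate` instead):

```shell
# Create an isolated environment inside the project folder
python3 -m venv .venv

# Activate it so python and pip point into .venv
source .venv/bin/activate

# Packages now install into .venv, not the system site-packages
python -m pip install requests

# Leave the environment when you're done
deactivate

# Destroying the environment is just deleting its folder
rm -rf .venv
```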
Recap and Summary
00:00 Congratulations on completing the virtual environments module in the course! Here’s what you covered: you learned what virtual environments are, how to create and activate them, and how to install packages into a virtual environment.
00:14 And you’ve also learned how to deactivate and even destroy a virtual environment. And last, I showed you some tricks on how you can optimize your virtual environment workflow and make it a little bit more efficient.
00:26 In the last few lectures you learned about virtual environments and how they can help you keep your project dependencies under control. Virtual environments help keep your project dependencies isolated so that you can avoid version conflicts between packages and different versions of the Python runtime.
00:43 So in a nutshell, a virtual environment allows you to use different versions of the same package, and different versions of Python, depending on the project that you’re working on.
00:54 And as a best practice, all of your Python projects should use virtual environments to store their dependencies.

Geir Arne Hjelle RP Team on Sept. 17, 2019
I usually do not use a virtual env inside of Docker. However, there are a few scenarios where it might still be useful to do so. This SO answer, and the links within it, highlights some of them: stackoverflow.com/a/29359760
- If your container depends on some Python-based OS tools, you might still want to isolate your code from the container’s system Python (although I believe the Python images on DockerHub already do this).
- If you do multistage builds pulling out just the production code, it might help to have everything isolated in a virtual environment. This would be more common and more powerful with compiled code where you have many build tools, though.
Finally, if you actually use virtual environments inside Docker, there are a few traps to be aware of. In particular, each command in a Dockerfile is run as a separate process, so special care must be taken to keep the environment active. This blog post has some good solutions: pythonspeed.com/articles/activate-virtualenv-dockerfile/
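The workaround that blog post recommends can be sketched as a Dockerfile fragment. Because `RUN source .venv/bin/activate` only affects that single `RUN` process, you instead put the environment's `bin` directory first on `PATH`, which "activates" it for every later instruction (the image tag, paths, and filenames below are illustrative):

```dockerfile
FROM python:3.12-slim

# Create the virtual environment once
RUN python -m venv /opt/venv

# Equivalent to activation for all later instructions:
# the venv's python and pip now shadow the system ones
ENV PATH="/opt/venv/bin:$PATH"

COPY requirements.txt .
RUN pip install -r requirements.txt

COPY . /app
WORKDIR /app
CMD ["python", "main.py"]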
Zarata on Oct. 19, 2020
My motivation in finally taking this (excellent) course was that I recall somewhere (I think another RealPython course) it was stated that one does not usually wish to globally upgrade Python on their Linux install, which generally utilizes a given Python version already. I’m wishing to upgrade to v3.9, and it appears Ubuntu (much less CentOS) hasn’t caught up yet. I suspect the actual “how to” make an upgrade in a venv is covered in that half-remembered or another RealPython “somewhere”. A link to such a tutorial, or inclusion of a special lesson, from within this course might be nice :) [True, the RP search is very excellent … I’m just lazy]

Bartosz Zaczyński RP Team on Oct. 20, 2020
@Zarata You might be talking about Managing Multiple Python Versions With pyenv tutorial.
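For reference, the basic pyenv workflow that tutorial covers looks roughly like this (a sketch, assuming pyenv is already installed; the version number is illustrative):

```shell
# Build and install a newer interpreter alongside the system one
pyenv install 3.9.1

# Use it for the current project directory only
pyenv local 3.9.1

# Then create a venv with that interpreter as usual
python -m venv .venv
source .venv/bin/activate
```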
aravind on Sept. 17, 2019
Hi, if we are packaging the Python .py files/project as well as the Python runtime binaries in a Docker container, then I assume we don’t need to worry about venv because the container will be an immutable deployment?