Separating Development and Production Dependencies
00:00 Let’s discuss separating development and production dependencies for your programs. A common challenge with Python dependency management in the real world is that your development and continuous integration environments need additional dependencies that the production environment doesn’t need.
00:19 For example, your development environment might need additional testing frameworks, debuggers or profiling tools that are not really necessary in production, and might actually slow things down.
00:30 The same would be true for your continuous integration environment. If you’re running any kind of automated tests, you probably want to include testing frameworks and all kinds of other supporting tools for that on your build server environment.
00:44 But again, with all of this, the goal is for the production environment to still run lean and mean. Now the good news is that there is actually a requirements files workflow that is well known and often used in the Python community that solves this problem.
00:59 The solution here is to create two different requirements files, one for development dependencies, and one for production dependencies.
Typically, these would be called requirements-dev.txt for the development dependencies, and requirements.txt for the production dependencies. There is a feature in pip that allows you to link up requirements files, which makes this workflow very convenient for developers: they only need to remember to install the requirements-dev.txt file, and that will automatically pull in and install all of the production dependencies as well.
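As a sketch, the split might look like this (the package names and versions here are placeholders for illustration, not taken from any particular project):

```
# requirements.txt — production dependencies only, pinned to exact versions
requests==2.24.0
Flask==1.1.2

# requirements-dev.txt — pulls in the production deps, then adds dev-only tools
-r requirements.txt
pytest==6.1.1
ipdb
```

The `-r requirements.txt` line in the dev file is the pip feature that links the two files together.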
01:36 Let’s take a look at a real-world example of this technique. I am looking at Love here, a Python web application created by the fine folks at Yelp, and you are going to see in a minute that they are using exactly this pattern for managing their development and production dependencies.
So right now, I am in the root folder of this repository, and when I scroll down, I’ll find those two requirements files: there is a requirements.txt and then there is also a requirements-dev.txt. And this is exactly the split that I talked about earlier. When you look at requirements.txt, you can see that it includes all of the production dependencies pinned to their exact versions. Nicely enough, they actually put a little comment here to explain what is going on. I think this is a really great use of the commenting feature that you can use in requirements files, by the way.
02:33 So, these are just the production dependencies that would be necessary to run the application in a production environment. Now let’s look at the other requirements file.
So there is also this requirements-dev.txt, and what it does is first pull in the production requirements and then install a bunch of additional development dependencies: it pulls in a testing framework, the mock module, and some other modules that are useful during development but not really on a production machine.
03:08 And what’s interesting here is that for some of the dependencies they actually pin them to a specific version. Whenever you specify an exact version like that, it is called version pinning, and that is exactly what they do here.
But then for this other dependency called ipdb, which is an enhanced version of the Python debugger, they do not pin the version. And this makes a lot of sense, because the ipdb debugger would never actually run on an automated test server; it’s more something that a developer would run on their local machine.
03:39 So it might make sense to leave that dependency unpinned so it will always upgrade to the latest version. Now, again, you can see that they put a little header explaining what is going on in this file, and I think this is really helpful.
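In requirements-file syntax, the difference looks like this (hypothetical packages and versions, just to illustrate the specifier forms):

```
pytest==6.1.1        # pinned: pip always installs exactly this version
requests>=2.20,<3.0  # constrained: any 2.x release from 2.20 onward
ipdb                 # unpinned: pip installs the latest available version
```

Pinning gives you reproducible installs; leaving a purely local tool like a debugger unpinned lets each developer pick up the newest release.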
Now these two requirements files cover the two use cases we discussed earlier. If a developer wants to set this up on a new machine, they would clone the repository and then run pip install -r requirements-dev.txt, which brings in all of the development dependencies.
But every time the application gets deployed onto a production system, all that happens is a pip install -r requirements.txt, so that only the production dependencies are brought in.
If you’re building a web application, this workflow is specifically supported by platforms like Heroku. When you deploy a web application to Heroku, it will look for a requirements.txt file in the root of your source repository and install those dependencies. So if you follow the requirements.txt / requirements-dev.txt split, it will only install the production dependencies and not the testing dependencies, which is exactly what you want.
04:53 I like this workflow because it is simple but also highly flexible.
Whenever you have a make <target_name> command in a project, you can typically look up the definition of that make target in a text-based configuration file called a Makefile. So what this make lib command most likely does is just call pip install with a list of requirements or a requirements file.
When you open the Makefile included with your project, you’ll probably see something like this:

lib:
	pip install -r requirements.txt
It’ll (very likely) just call the pip command behind the scenes to handle the actual Python dependency install :)
Here’s more info on the make command and makefiles: en.wikipedia.org/wiki/Makefile
Two observations / questions (with sub questions) at this point:
You detailed the slick pip freeze built-in workflow to get the requirements-dev.txt. However, does one keep separate dev and production projects to do this, or is there a way of keeping two-in-one (dev and prod) and then “filtering” the freeze? (And if two separate, how are they kept synchronized?)
“freeze” captures only the third-party dependency tree, but how does one specify the root Python distribution that must be used? (I know some who re-translated a bunch of v2.x code by hand… if they had known the content of this course, maybe they wouldn’t have needed to do that :) )
@Zarata You can combine multiple requirements files using the
-r flag inside one of them. For example, consider the following two files:
$ cat requirements.txt
requests==2.24.0
click==7.1.2
Flask==1.1.2

$ cat requirements-dev.txt
-r requirements.txt
bpython==0.20
pytest==6.1.1
When you install the development requirements, it’ll automatically install all the production ones too:
$ pip install -r requirements-dev.txt
Collecting requests==2.24.0
  Using cached requests-2.24.0-py2.py3-none-any.whl (61 kB)
Collecting click==7.1.2
  Using cached click-7.1.2-py2.py3-none-any.whl (82 kB)
Collecting Flask==1.1.2
  Using cached Flask-1.1.2-py2.py3-none-any.whl (94 kB)
Collecting bpython==0.20
  Using cached bpython-0.20-py2.py3-none-any.whl (189 kB)
Collecting pytest==6.1.1
  Using cached pytest-6.1.1-py3-none-any.whl (272 kB)
(...)
To lock third-party dependency versions and specify a particular Python interpreter, you can take advantage of Pipenv or poetry.
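With Pipenv, for instance, the required interpreter version lives in the Pipfile alongside the regular and development dependencies. A minimal sketch (the package names and versions are placeholders):

```
# Pipfile
[packages]
requests = "==2.24.0"

[dev-packages]
pytest = "==6.1.1"

[requires]
python_version = "3.8"
```

Pipenv then refuses to build the environment with an interpreter that doesn’t match the `[requires]` section, which addresses the “root Python distribution” part of the question.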
Could you please tell me why I received the error when installing requirements-dev.txt? I went through the following steps:

$ pip install requests
$ pip freeze
requests
$ pip freeze > requirements.txt
$ pip install -r requirements-dev.txt
ERROR: Could not open requirements file: [Errno 2] No such file or directory: 'requirements-dev.txt'
You’re getting the error because you haven’t created your requirements-dev.txt file yet. You did make a requirements.txt. If you want to freeze your dependencies to your dev file, then you should use:
$ pip freeze > requirements-dev.txt
Hope that makes sense.
That is the point, you are right. I appreciate your prompt support so much.
tuliochiodi on Aug. 24, 2020
It says “Run ‘make lib’ to install these dependencies in this project’s lib directory” in the requirements.txt file. Is this another way to restore captured dependencies? How does it work? Thank you :D