Creating a Bucket
To start off, you need an S3 bucket. To create one programmatically, you must first choose a name for it. Remember that this name must be unique throughout the whole AWS platform, as bucket names are DNS compliant. If you try to create a bucket but another user has already claimed your desired bucket name, your code will fail. Instead of success, you will see the following error: botocore.errorfactory.BucketAlreadyExists. You can increase your chance of success by picking a random name, and you can write your own function that does that for you.
00:00 Now that your SDK is configured and you understand the Boto3 interfaces, it’s time to create an S3 bucket. Buckets are what make S3 S3, so being able to create them using Python is a big step forward.
00:14 Before you can create a bucket, you need to come up with a name for it. These names must be unique across all of AWS, and if your name is already chosen, the code will fail. To have a better chance at making a bucket, you can pick a random name. For this example, you can use the uuid module to help out. A UUID4’s string representation is 36 characters long, and you can tag a prefix to the start of it to identify what the bucket is for.
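Just to illustrate what that looks like, here’s a quick sketch in a Python session (the exact UUID is random and will be different every time you run it):

import uuid

name_suffix = str(uuid.uuid4())
print(name_suffix)       # something like 5db905a0-b49d-4fa5-9d43-4f990c403de5
print(len(name_suffix))  # 36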
00:41 Let’s go ahead and make an example function to do this for us. You’re going to import uuid and then define a new function called create_bucket_name(), and you’ll want to add in a bucket_prefix here.
00:57 Now note, to be valid, your generated bucket name must be between 3 and 63 characters long.
01:11 All right! To do this, you can go ahead and return a joined list, and in here you’re going to pass in your bucket_prefix, and then this uuid.uuid4().
01:29 Let’s see how this works. You can just go ahead and print(), do create_bucket_name(), and let’s say 'test_name' for the prefix.
01:39 And go ahead and run this. All right! So you can see you’ve got your prefix here and then this random string of 36 characters. Perfect. I’m going to close this out.
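Written out, the helper from this step looks something like the following sketch (the names match the ones used in the video):

import uuid

def create_bucket_name(bucket_prefix):
    # The generated bucket name must be between 3 and 63 characters long.
    return ''.join([bucket_prefix, str(uuid.uuid4())])

print(create_bucket_name('test_name'))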
01:55 Now that you have a name for the bucket, you’re halfway there. The other thing you need to do is be aware of your geographic region. If you’re in the US like me, you don’t have to worry about this, but otherwise you’ll need to specify a region when you create the bucket. Using our resource interface from earlier, let’s take a look at an example of this. You could call .create_bucket(),

02:19 you’re going to pass in your bucket name,

02:26 and then you’ll have to add in this CreateBucketConfiguration,

02:38 and then set this attribute 'LocationConstraint' to whatever region you need. So in this case, let’s do something like 'eu-west-1'. While this works, it’s not ideal.
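As a rough sketch, the hard-coded version described here looks like this (it assumes an s3_resource created with boto3.resource('s3'); the bucket name is just a placeholder):

import boto3

s3_resource = boto3.resource('s3')
s3_resource.create_bucket(
    Bucket='examplebucketname',  # placeholder: use your generated bucket name
    CreateBucketConfiguration={'LocationConstraint': 'eu-west-1'})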
02:52 You’ve hard-coded in the region, so if you deploy this to different regions, you’ll now have to maintain all of these different values. Fortunately, Boto3 solves this using sessions.
03:04 A Session object is created from the user’s credentials, so you can return the right location no matter where your code is deployed. Let’s go ahead and make a create_bucket() function that’ll do this for us.
03:18 You’re going to define create_bucket() and you’ll want to pass in the bucket_prefix and then also the s3_connection.
03:29 Go ahead and get that Session object by calling boto3.session.Session(). You can now get your current region by calling that session object and getting the .region_name.
03:47 Go ahead and make your bucket_name by using that function from earlier and pass in your bucket_prefix. You’ll want to save the bucket response, which will just be whatever response comes from AWS, by using your s3_connection and calling .create_bucket() from that.
04:13 Pass in Bucket=bucket_name, and then CreateBucketConfiguration,

04:25 and make a dictionary that has 'LocationConstraint' set equal to your current_region.
04:39 Okay. Now is a good time to print out the bucket_name and the current_region, and then return the bucket_name and the bucket_response.
04:54 All right! So now you have a function where you can pass in your bucket_prefix, and then your s3_connection. Note that this does not specify a client or resource interface, and this code will actually work with either of them. From here, you’ll generate a session from the user’s credentials and then find out what that current region is. You’ll generate a bucket name using the prefix, and then you’ll create that bucket. Now, there is one issue here, and it comes into play because I’m in us-east-1. For whatever reason, Boto3 and Amazon S3 do not like having 'us-east-1' passed in as this current_region for the 'LocationConstraint'. Unfortunately, you can’t even pass in an empty dictionary here for this CreateBucketConfiguration.
05:41 You just need to leave it out completely. So to get around this, you can use a little if statement and say something like if current_region == 'us-east-1':, and I’m just going to copy this
06:04 and paste that in there. Otherwise, go ahead and do this code here. Now, if it’s 'us-east-1', you need to remove this CreateBucketConfiguration, and that should do it! I’m not sure why Boto3 behaves this way; there are a number of issues on GitHub discussing it, but it doesn’t seem like there’s been any real progress. But anyway, now that this is set up, we can go ahead and try it out.
06:38 Go ahead and open up a Python interpreter. To use the function there, we can actually import it from whatever script you’ve been writing it in. So just right off the bat, I’m going to import boto3, and then I’ve named this boto3_guide, so I’ll just say from boto3_guide import create_bucket, and then you’re going to want to make a resource.
07:03 So say s3_resource and set this equal to boto3.resource(),

07:11 and pass in 's3'. And let’s make a first bucket using the client interface, so I’ll say first_bucket_name, first_response = create_bucket().
07:30 For your bucket_prefix, go ahead and pass in 'firstpythonbucket', and for your s3_connection you can pass in s3_resource.
07:45 And if you remember, you can call .meta.client. All right, let’s run this. Awesome! You’ve got your prefix here and this is your first bucket name, and then there’s the current region. That printed out as expected, and let’s take a look at what that response looked like.
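For reference, the interpreter session up to this point looks roughly like this (a sketch that assumes the script above was saved as boto3_guide.py):

>>> import boto3
>>> from boto3_guide import create_bucket
>>> s3_resource = boto3.resource('s3')
>>> first_bucket_name, first_response = create_bucket(
...     bucket_prefix='firstpythonbucket',
...     s3_connection=s3_resource.meta.client)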
08:06 Okay, there’s quite a bit here. You can see it’s a dictionary that was returned. And you’ve got a lot of information in here, but we’re not going to worry about that just yet.
08:17 Instead, let’s make a second bucket, so second_bucket_name, second_response = create_bucket(). Pass in a new bucket_prefix, which you can say is 'secondpythonbucket'.
08:37 And for s3_connection, just pass in s3_resource. This one will just be made with the resource interface. Awesome! That worked too. You’ve got your prefix here, the rest of the randomized name, and the current region.
08:54 And let’s take a look at what second_response looks like with the resource interface. Okay, so you can see here that this has returned an s3.Bucket object, which is not the dictionary response that you got up here.
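The second bucket follows the same pattern, this time passing the resource itself (again a sketch of the interpreter session):

>>> second_bucket_name, second_response = create_bucket(
...     bucket_prefix='secondpythonbucket',
...     s3_connection=s3_resource)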
09:10 All right! Now you’ve got a function that you can use to create S3 buckets using Python, and that’s pretty cool! Now you’re going to learn how to add files to them, but before that, let’s go ahead and save these bucket names so that we can use them later.
09:27 I’m just going to open up a new text file up here and just copy and paste these.
09:41 And there’s the secondpythonbucket. Copy and paste. All right! In the next video, you’ll learn how to name files. Thanks for watching.
bdorsey327 on Feb. 2, 2020
I’m getting errors regarding boto3 not being able to locate credentials even though I’ve configured the profile, credentials and config with the correct info. Also, I’ve had to modify the script because of the us-east API issue in order to get it to create a bucket. The script provided in the video results in an error even with the if/else logic provided.
Joe Tatusko RP Team on Feb. 5, 2020
Hi bdorsey, are you located in the us-east region?
bdorsey327 on Feb. 13, 2020
yes I am
pksccbt on March 3, 2020
Hi I am getting error Invalid type for parameter CreateBucketConfiguration.LocationConstraint, value: None, type: <class 'NoneType'>, valid types: <class 'str'>
Joe Tatusko RP Team on March 4, 2020
Hi pksccbt, it looks like boto3 is not able to read your region. Did you set up a config file at the end of the Installation and Setup video?
Should be located at ~/.aws/config and contain:
[default]
region = YOUR_PREFERRED_REGION
pksccbt on March 4, 2020
Yes I had set my region. It is region = us-east-2
pksccbt on March 6, 2020
Hi it is still not working.
Joe Tatusko RP Team on March 6, 2020
Interesting, I’m not sure why it wouldn’t collect that info correctly.
Can you post the entire error traceback? I’d like to see if it’s your S3 connection or the bucket creation line that is causing the problem. Somewhere your ‘us-east-2’ region is not being passed to boto3 the right way.
Thanks!
purplepython on May 9, 2020
I’m getting an error when I try to run s3_resource = boto3.resource('s3') with AttributeError: module boto3 has no attribute 'resource'. I’m running python3 and boto3 is up to date. This is when I try the part in the interactive shell to create the bucket.
crystal9563 on May 29, 2020
I get the following error message in my python session.
>>> first_bucket_name, first_response = create_bucket(
...
... bucket_prefix='firstpythonbucket',
... s3_connection=s3_resource.meta.client)
Traceback (most recent call last):
File "<stdin>", line 4, in <module>
NameError: name 's3_resource' is not defined
Has anyone else got this same issue?
barringermargaret on June 11, 2020
Was getting errors as well. The following worked for me after looking at video & tutorial:

# s3 bucket using a client
import boto3
from boto3_guide import create_bucket
s3_resource = boto3.resource('s3')
first_bucket_name, first_response = create_bucket(
    bucket_prefix='firstpythonbucket',
    s3_connection=s3_resource.meta.client)
firstpythonbucket5db905a0-b49d-4fa5-9d43-4f990c403de5 us-east-1

# s3 bucket using a resource
second_bucket_name, second_response = create_bucket(
    bucket_prefix='secondpythonbucket',
    s3_connection=s3_resource)
secondpythonbucketf2039215-b1d1-4f2c-99ea-7ed5d4696858 us-east-1
gllanes005 on March 31, 2022
Hi, I am getting an error on the first step where you print out the bucket name with the create_bucket_name function. I am using VSCode and it looks like it’s having an issue finding boto3 although I already pip installed it.
import boto3
import uuid #universal unique identifier for unique bucket name
def create_bucket_name(bucket_prefix):
    # bucket name must be between 3-63 chars long
    return ''.join([bucket_prefix, str(uuid.uuid4())])
# test bucket name creation
print(create_bucket_name('test_name'))
Error:
> PS C:\Users\gabriel.llanes\boto3_tutorial> python boto3_guide.py
> File "C:\Users\gabriel.llanes\boto3_tutorial\boto3_guide.py", line 0
>
> ^
> SyntaxError: unknown parsing error
gllanes005 on April 1, 2022
Disregard my previous comment, I fixed the issue. It was with VSCode python interpreter.
sam on Jan. 14, 2024
Hi Joe,
Can you please explain the use of session in more detail?
I found the below code snippet that also creates an S3 bucket without using “Sessions”:
import logging
import boto3
from botocore.exceptions import ClientError
def create_bucket(bucket_name, region=None):
"""Create an S3 bucket in a specified region
If a region is not specified, the bucket is created in the S3 default
region (us-east-1).
:param bucket_name: Bucket to create
:param region: String region to create bucket in, e.g., 'us-west-2'
:return: True if bucket created, else False
"""
# Create bucket
try:
if region is None:
s3_client = boto3.client('s3')
s3_client.create_bucket(Bucket=bucket_name)
else:
s3_client = boto3.client('s3', region_name=region)
location = {'LocationConstraint': region}
s3_client.create_bucket(Bucket=bucket_name,
CreateBucketConfiguration=location)
except ClientError as e:
logging.error(e)
return False
return True
bob993 on Jan. 15, 2024
JSON output from S3 boto client.
Hi, I’m getting JSON-like output as the result of printing the list of buckets using the boto3 client. Is there any way to catch just the part of the output that lists the buckets? In reality, such JSON output is more obscure than the result of the aws s3 ls command.