Making Parallel HTTP Requests With aiohttp

Learn how to use asyncio.gather() to make parallel HTTP requests in a real-world application.
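For reference, the core pattern the lesson builds on looks roughly like this (a minimal sketch of aiohttp plus asyncio.gather(); the URLs and function names here are placeholders, not the lesson's exact code):

    import asyncio

    import aiohttp


    async def fetch(session, url):
        # Sending the request yields control to the event loop while we
        # wait on the network; we resume once the response headers arrive.
        async with session.request(method='GET', url=url) as response:
            # Reading the body is also asynchronous, hence a second await.
            return await response.text()


    async def main(urls):
        async with aiohttp.ClientSession() as session:
            # Schedule every request concurrently and wait for all results.
            return await asyncio.gather(*(fetch(session, url) for url in urls))


    if __name__ == '__main__':
        pages = asyncio.run(main(['https://example.com', 'https://example.org']))
        print([len(page) for page in pages])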

Comments & Discussion

toosto on April 28, 2019

Hi,

I have a couple of questions:

1) What is the state of the Python thread while we are awaiting I/O? Is the thread technically sleeping and woken up later by the operating system, or is it just waiting/polling?

2) Why do we need the await keyword when getting the text with response.text()? Is it because response becomes a coroutine object rather than a normal Python object?
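For context on the second question, this can be checked directly: response itself is a plain object, but calling response.text() returns a coroutine, which is why it must be awaited. A minimal sketch, assuming aiohttp (the URL is a placeholder):

    import asyncio

    import aiohttp


    async def main():
        async with aiohttp.ClientSession() as session:
            response = await session.get('https://example.com')
            body = response.text()  # no await yet: this is a coroutine object
            print(type(body))       # <class 'coroutine'>
            text = await body       # awaiting it actually reads the body
            print(len(text))


    asyncio.run(main())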

toosto on May 16, 2019

Guys, any updates?

Parijatha Kumar Pasupuleti on May 22, 2019

Where exactly does the HTTP GET request get executed? Is it in the response = await session.request(method='GET', url=url) statement, or in the next one, i.e., the value = await response.text() statement? If the request happens in only one of these statements, then why do we need await for both?
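For context, the GET is split across the two awaits: the first await returns once the request has been sent and the status line and headers have arrived, while the body may still be in transit, so reading it is a separate awaitable step. A sketch assuming aiohttp (the URL is a placeholder):

    import asyncio

    import aiohttp


    async def main():
        async with aiohttp.ClientSession() as session:
            # First await: the GET is sent; we resume once status + headers arrive.
            response = await session.request(method='GET', url='https://example.com')
            print(response.status)
            # Second await: the body is streamed in separately, so reading
            # it is its own coroutine that must also be awaited.
            value = await response.text()
            print(len(value))


    asyncio.run(main())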

Par Akerstrom on June 1, 2019

Hi there Chyld. Great session, really liked it.

I was just wondering: is there a way to define a generator function, like we did in the theory session, and use it for the async HTTP calls, as opposed to a one-line generator expression?

Say that the generator function would look like this:

    from random import randint

    def gen_workers(stop):
        for member in range(1, stop + 1):
            yield member, randint(1, 10)

I played around with it but couldn’t figure out how to make the calls asynchronous. The code below works, but synchronously, not async!

    members = list(gen_workers(1))
    for member in members:
        # awaiting inside the loop completes each request before the next starts
        responses = await asyncio.gather(worker(member[0], member[1], session))
    print(responses)

Thanks again /Par

Par Akerstrom on June 1, 2019

Ah, never mind. I read the tutorial on concurrency and figured out that we can use a different kind of syntax for iterating over an already generated list.

Generator function stays the same:

    def gen_workers(stop):
        for member in range(1, stop + 1):
            yield member, randint(1, 10)

We do this for the async function:

    async def alternate_worker_pool():
        async with aiohttp.ClientSession() as session:
            # Build the member list synchronously from the external generator
            members = list(gen_workers(10))
            tasks = []
            for member in members:
                task = asyncio.ensure_future(worker(member[0], member[1], session))
                tasks.append(task)
            # Every task is now scheduled; gather waits for them all at once
            await asyncio.gather(*tasks, return_exceptions=True)

And we run the event loop with:

    if __name__ == '__main__':
        start = time.perf_counter()
        asyncio.get_event_loop().run_until_complete(
            alternate_worker_pool())
        elapsed = time.perf_counter() - start
        print(f'executed in {elapsed:0.2f} seconds')
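On Python 3.7+, asyncio.run() is a shorter way to drive the same coroutine; a sketch, assuming the alternate_worker_pool() defined above, with behavior equivalent to the run_until_complete() version:

    import asyncio
    import time

    if __name__ == '__main__':
        start = time.perf_counter()
        # asyncio.run() creates the event loop, runs the coroutine to
        # completion, and closes the loop for us.
        asyncio.run(alternate_worker_pool())
        elapsed = time.perf_counter() - start
        print(f'executed in {elapsed:0.2f} seconds')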

Thanks for a great video. Cheers

Pygator on Sept. 14, 2019

So the speedup comes from hoping that the website you sent the request to will parallelize the task on its machine? Isn’t Python running the tasks on different cores asynchronously?
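For context, the speedup needs neither extra cores nor help from the server: a single thread overlaps the waiting, because each await hands control back to the event loop while the network is busy. A minimal illustration, using asyncio.sleep() as a stand-in for network latency:

    import asyncio
    import time


    async def fake_request(i):
        # Stand-in for network I/O: while this task waits, the event loop
        # runs the other tasks on the same thread.
        await asyncio.sleep(1)
        return i


    async def main():
        # Ten one-second "requests" finish in about one second total,
        # because the waits overlap; no extra cores are involved.
        return await asyncio.gather(*(fake_request(i) for i in range(10)))


    start = time.perf_counter()
    asyncio.run(main())
    print(f'elapsed: {time.perf_counter() - start:0.2f} seconds')  # ~1.00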
