Generators in Python

Use cases of Generators

Swati Sinha
WiCDS
3 min read · Aug 17, 2021


In this blog post, we will discuss an interesting topic in Advanced Python. Generators and decorators offer simpler ways to write code that would otherwise be complex and lengthy, and they are an asset for Python developers :)

Why Generators

Generators: a generator is a function that contains a yield statement. Generators work like iterators but with better memory performance. A function can contain multiple yield statements, and unlike a return statement, a yield does not terminate the function; it pauses it until the next value is requested.
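As a minimal sketch of this pause-and-resume behavior, a function with several yield statements stops at each yield and continues from that exact point on the next call to next():

```python
def count_up():
    # Each yield pauses the function; execution resumes
    # right after it on the next call to next()
    yield 1
    yield 2
    yield 3

gen = count_up()
print(next(gen))  # 1
print(next(gen))  # 2
print(next(gen))  # 3
```

Calling count_up() does not run the function body at all; it only creates the generator object, and the body runs lazily as values are requested.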

Sample use-cases

Let’s say we have to fetch 100K item-number records from an inventory. The beauty of a generator is that it does not hold the entire set of records in memory; it yields one item number at a time. A normal function, by contrast, would load all 100K item numbers into memory before iterating over them. This is how generators improve memory efficiency.
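A sketch of that idea, assuming a hypothetical per-record lookup fetch_item_number (not a real inventory API): the generator yields one item number per request instead of building a 100K-element list up front.

```python
def fetch_item_number(i):
    # Hypothetical stand-in for a real inventory lookup
    return 1000 + i

def iter_item_numbers(total):
    # Yields one item number at a time; never holds all
    # `total` records in memory at once
    for i in range(total):
        yield fetch_item_number(i)

for item in iter_item_numbers(3):
    print(item)
```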

Example 1: Using generator, find the squares of numbers

def square_of_numbers(n):
    for i in n:
        yield i * i

nums = square_of_numbers([5, 25, 30, 50])
print(next(nums))
print(next(nums))
print(next(nums))
print(next(nums))
print(next(nums))  # the generator is exhausted: this call raises StopIteration

Below is the output of the above execution. The fifth next() call raises StopIteration because the generator only has four values:

25
625
900
2500
Traceback (most recent call last):
  ...
StopIteration
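The same squares can also be produced with a generator expression, which builds an equivalent lazy iterator without a named function (a minor variation on the example above):

```python
# Parentheses instead of brackets create a generator, not a list
nums = (i * i for i in [5, 25, 30, 50])
print(next(nums))  # 25
print(list(nums))  # the remaining values: [625, 900, 2500]
```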

Example 2 : Compare the Memory performance of Normal functions and Generators

# Check the memory performance using a normal function
import memory_profiler as mem_profile

brand_name = ["Apple", "Sony", "Samsung"]
price = ["90K", "40K", "30K"]

def product_list(n):
    output = []
    for i in range(n):
        product = {"id": i,
                   "Brand_name": brand_name[0],
                   "Price": price[0]}
        output.append(product)
    return output

print("Using Normal function")
print('Memory (Before): ' + str(mem_profile.memory_usage()) + 'MB')
list_of_order = product_list(10000000)
print('Memory (After): ' + str(mem_profile.memory_usage()) + 'MB')
Using Normal function
Memory (Before): [51.55078125]MB
Memory (After): [2757.78515625]MB
Base memory was around 51 MB, and after the program ran it was approximately 2757 MB.

# Check the memory performance using generators
import memory_profiler as mem_profile

print("Using generators")
brand_name = ["Apple", "Sony", "Samsung"]
price = ["90K", "40K", "30K"]

def product_list(n):
    for i in range(n):  # xrange is Python 2 only; range is the Python 3 equivalent
        product = {
            "id": i,
            "Brand_name": brand_name[0],
            "Price": price[0]
        }
        yield product

print('Memory (Before): ' + str(mem_profile.memory_usage()) + 'MB')
list_of_order = product_list(10000000)  # creating the generator allocates no records
print('Memory (After): ' + str(mem_profile.memory_usage()) + 'MB')
Using generators
Memory (Before): [51.6953125]MB
Memory (After): [51.6953125]MB
Base memory was around 51 MB, and after the program ran it was approximately the same.

In the above example, the normal function builds all 10 million records in memory before the next step runs, whereas the generator yields one record at a time, only when it is requested.

Which one to choose:

Generators

Generators are definitely a powerful tool, and you can write multiple yield statements in one function. Whenever the function executes a yield, it is paused and control is transferred back to the caller.

For memory efficiency they are definitely a good choice; however, they can become expensive if the same data is needed many times, because the values must be regenerated on every pass.
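One concrete example of that cost: a generator is exhausted after a single pass, so code that consumes the data repeatedly must recreate the generator (and recompute every value) each time, whereas a list built once can be reused. A small sketch:

```python
def squares(n):
    for i in range(n):
        yield i * i

gen = squares(4)
print(sum(gen))  # 14 -> 0 + 1 + 4 + 9
print(sum(gen))  # 0  -> the generator is already exhausted

nums = list(squares(4))  # materialize the values once
print(sum(nums))  # 14
print(sum(nums))  # 14 -> the list can be reused freely
```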

Normal functions

A normal function can use a return statement only once; when execution hits the return statement, the function returns its value to the caller and terminates.

It can be less expensive for repeated access, as the records are available in memory after the first call of the function.


Hope you liked this post and that it helps you decide which one to use in your development work.
