Generators and Iterators in Python

Updated: Mar 09, 2026

Iterators and generators are powerful concepts in Python that allow developers to handle large datasets efficiently. Instead of loading all data into memory at once, Python can generate values one at a time when needed. This technique is called lazy evaluation.

Understanding iterators and generators is essential for writing memory-efficient Python programs, especially when working with large data streams, file processing, and data pipelines.

What is an Iterator?

An iterator is an object that allows you to iterate through a collection of data one element at a time. Python uses iterators internally in loops such as the for loop.

An iterator must implement two methods:

  • __iter__() – Returns the iterator object
  • __next__() – Returns the next value in the sequence
```python
numbers = [1, 2, 3]
iterator = iter(numbers)

print(next(iterator))  # 1
print(next(iterator))  # 2
print(next(iterator))  # 3
```

The iter() function converts a collection into an iterator, and the next() function retrieves the next value.
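Once an iterator runs out of values, any further call to next() raises a StopIteration exception — this is the signal the for loop uses internally to know when to stop. A small sketch:

```python
numbers = [1, 2, 3]
iterator = iter(numbers)

for _ in range(3):
    print(next(iterator))  # prints 1, 2, 3

try:
    next(iterator)  # the iterator is exhausted
except StopIteration:
    print("No more values")
```

You rarely call next() and catch StopIteration by hand; the for loop does both for you.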

Creating a Custom Iterator

You can create your own iterator by implementing the required methods.

```python
class Counter:
    def __init__(self, limit):
        self.limit = limit
        self.current = 0

    def __iter__(self):
        return self

    def __next__(self):
        if self.current < self.limit:
            self.current += 1
            return self.current
        else:
            raise StopIteration

counter = Counter(3)
for num in counter:
    print(num)
```

This custom iterator prints numbers from 1 to the specified limit.
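One detail worth knowing: an iterator written this way is single-use. After StopIteration has been raised, iterating again produces nothing, because the internal counter is never reset. A quick sketch using a Counter class like the one above:

```python
class Counter:
    def __init__(self, limit):
        self.limit = limit
        self.current = 0

    def __iter__(self):
        return self

    def __next__(self):
        if self.current < self.limit:
            self.current += 1
            return self.current
        raise StopIteration

counter = Counter(3)
print(list(counter))  # [1, 2, 3]
print(list(counter))  # [] -- already exhausted
```

If you need to iterate more than once, create a fresh Counter each time.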

What is a Generator?

A generator is a special type of iterator that simplifies the process of creating iterators. Instead of defining a class with __iter__() and __next__(), generators use the yield keyword.

Generators produce values one at a time and pause execution after each value until the next value is requested.

```python
def count_up_to(n):
    count = 1
    while count <= n:
        yield count
        count += 1

for num in count_up_to(5):
    print(num)
```

The yield keyword returns a value and pauses the function, preserving its local state until the next value is requested.
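You can see this pause-and-resume behavior directly by calling next() on a generator yourself, using the same count_up_to function:

```python
def count_up_to(n):
    count = 1
    while count <= n:
        yield count
        count += 1

gen = count_up_to(3)
print(next(gen))  # 1 -- runs until the first yield, then pauses
print(next(gen))  # 2 -- resumes right after the previous yield
print(next(gen))  # 3
```

Each next() call resumes execution exactly where the previous yield left off, with count still holding its value.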

Generator vs Normal Function

| Feature | Normal Function | Generator |
|---|---|---|
| Return value | Uses return | Uses yield |
| Execution | Runs to completion | Pauses after each yield |
| Memory usage | Higher (all results at once) | Lower (one value at a time) |
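The table above can be illustrated with two versions of the same computation — one as a normal function, one as a generator (the names squares_list and squares_gen are illustrative, not from the original):

```python
def squares_list(n):
    # normal function: builds the entire list before returning
    return [x * x for x in range(n)]

def squares_gen(n):
    # generator: yields one square at a time, pausing between values
    for x in range(n):
        yield x * x

print(squares_list(4))       # [0, 1, 4, 9]
print(list(squares_gen(4)))  # [0, 1, 4, 9] -- same values, produced lazily
```

Both produce the same values, but the generator only ever holds one value in memory at a time.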

Generator Expressions

Python also supports generator expressions, which are similar to list comprehensions but more memory efficient.

```python
numbers = (x * x for x in range(5))
for num in numbers:
    print(num)
```

This generates values one at a time instead of storing them in a list.
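The memory difference is easy to observe with sys.getsizeof (exact byte counts vary by Python version, so the numbers below are only indicative):

```python
import sys

squares_list = [x * x for x in range(1_000_000)]   # stores a million values
squares_gen = (x * x for x in range(1_000_000))    # stores only its state

print(sys.getsizeof(squares_list))  # several megabytes
print(sys.getsizeof(squares_gen))   # a couple hundred bytes
```

The generator expression's size is constant regardless of how many values it will produce.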

Advantages of Generators

  • Memory efficient for large datasets
  • Results are available immediately when streaming data, without waiting for the whole dataset
  • Simpler syntax compared to custom iterators
  • Useful in data processing pipelines
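The pipeline advantage comes from chaining generators, where each stage lazily consumes the previous one. A hypothetical three-stage sketch (the function names are illustrative):

```python
def numbers(n):
    for i in range(n):
        yield i

def evens(seq):
    # filter stage: pass through only even values
    for x in seq:
        if x % 2 == 0:
            yield x

def squared(seq):
    # transform stage: square each value
    for x in seq:
        yield x * x

pipeline = squared(evens(numbers(10)))
print(list(pipeline))  # [0, 4, 16, 36, 64]
```

No intermediate lists are built: each value flows through all three stages before the next one is generated.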

Real-World Example

Generators are commonly used when reading large files.

```python
def read_large_file(filename):
    with open(filename) as file:
        for line in file:
            yield line

for line in read_large_file("data.txt"):
    print(line)
```

This approach reads the file line by line instead of loading the entire file into memory.
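The same pattern combines naturally with a generator expression to filter lines as they stream past. The sketch below uses io.StringIO to simulate a file so it runs without data.txt on disk:

```python
import io

def read_lines(file_obj):
    # yields one stripped line at a time; the file is never read into a list
    for line in file_obj:
        yield line.rstrip("\n")

fake_file = io.StringIO("alpha\nbeta\nalphabet\n")
matches = (line for line in read_lines(fake_file) if line.startswith("alpha"))
print(list(matches))  # ['alpha', 'alphabet']
```

Only matching lines are ever kept, which matters when the file is gigabytes in size.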

Best Practices

  • Use generators when dealing with large datasets.
  • Avoid storing large lists if data can be generated dynamically.
  • Use generator expressions for efficient iteration.

Conclusion

Iterators and generators are essential concepts in Python that enable efficient data processing. Iterators allow sequential access to data, while generators provide a simpler way to create iterators using the yield keyword.

By using generators, developers can build scalable and memory-efficient Python applications, especially when working with large datasets or streaming data.

In the next tutorial, we will explore Async Programming in Python and learn how Python handles asynchronous tasks and concurrency.
