polars.DataFrame.iter_rows
DataFrame.iter_rows(*, named: Literal[False] = False, buffer_size: int = 500) → Iterator[tuple[Any, ...]]
DataFrame.iter_rows(*, named: Literal[True], buffer_size: int = 500) → Iterator[dict[str, Any]]
Returns an iterator over the DataFrame of rows of python-native values.
- Parameters:
- named
Return dictionaries instead of tuples. The dictionaries are a mapping of column name to row value. This is more expensive than returning a regular tuple, but allows for accessing values by column name.
- buffer_size
Determines the number of rows that are buffered internally while iterating over the data; you should only modify this in very specific cases where the default value is determined not to be a good fit for your access pattern, as the speedup from using the buffer is significant (~2-4x). Setting this value to zero disables row buffering (not recommended); a short usage sketch follows below.
- Returns:
- iterator of tuples (default) or dictionaries (if named) of python row values
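As a minimal sketch of how these parameters might be combined (the frame and values below are illustrative only):
>>> import polars as pl
>>> df = pl.DataFrame({"a": [1, 3, 5], "b": [2, 4, 6]})
>>> # Default: buffered iteration yielding plain tuples.
>>> sum(a for a, b in df.iter_rows())  # expected: 9
>>> # named=True yields dictionaries keyed by column name (more expensive per row).
>>> [row["a"] + row["b"] for row in df.iter_rows(named=True)]  # expected: [3, 7, 11]
>>> # Illustrative only: buffer_size=0 disables the internal row buffer.
>>> next(df.iter_rows(buffer_size=0))  # expected: (1, 2)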
Warning
Row iteration is not optimal as the underlying data is stored in columnar form; where possible, prefer export via one of the dedicated export/output methods that deal with columnar data.
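As an illustration of this warning, reading a single column via a columnar accessor such as Series.to_list typically replaces an explicit row loop; this is a sketch of one such alternative, not a rule for every access pattern.
>>> import polars as pl
>>> df = pl.DataFrame({"a": [1, 3, 5], "b": [2, 4, 6]})
>>> # Row-wise: every row is converted to a tuple just to read one column.
>>> [row[0] for row in df.iter_rows()]  # [1, 3, 5]
>>> # Columnar: read the column directly, avoiding per-row conversion.
>>> df["a"].to_list()  # [1, 3, 5]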
See also
rows
Materialises all frame data as a list of rows (potentially expensive).
rows_by_key
Materialises frame data as a key-indexed dictionary.
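For comparison, a brief sketch of the related methods listed above; the commented outputs are indicative only, and the exact shape returned by rows_by_key depends on its options.
>>> import polars as pl
>>> df = pl.DataFrame({"a": [1, 3], "b": [2, 4], "c": ["x", "y"]})
>>> # Eagerly materialise every row as a list of tuples (potentially expensive).
>>> df.rows()  # [(1, 2, 'x'), (3, 4, 'y')]
>>> # Index rows by the values of column "a".
>>> df.rows_by_key("a")  # roughly {1: [(2, 'x')], 3: [(4, 'y')]}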
Notes
If you have ns-precision temporal values you should be aware that Python natively only supports up to μs-precision; ns-precision values will be truncated to microseconds on conversion to Python. If this matters to your use-case you should export to a different format (such as Arrow or NumPy).
Examples
>>> df = pl.DataFrame(
...     {
...         "a": [1, 3, 5],
...         "b": [2, 4, 6],
...     }
... )
>>> [row[0] for row in df.iter_rows()]
[1, 3, 5]
>>> [row["b"] for row in df.iter_rows(named=True)]
[2, 4, 6]
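To illustrate the note on ns-precision above, the sketch below builds a nanosecond-precision timestamp from an epoch-nanosecond integer; the cast and the expected truncation follow the behaviour described in the notes, though the exact repr may differ between versions.
>>> import polars as pl
>>> # 2023-01-01 00:00:00 UTC plus 500 nanoseconds, as an epoch-ns integer.
>>> df = pl.DataFrame({"ts": [1_672_531_200_000_000_500]}).with_columns(
...     pl.col("ts").cast(pl.Datetime("ns"))
... )
>>> # Python datetimes stop at microsecond precision, so the trailing 500 ns are dropped.
>>> [ts for (ts,) in df.iter_rows()]  # expected: [datetime.datetime(2023, 1, 1, 0, 0)]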