read_csv followed by type conversion panics above a certain number of rows #21006

Open
2 tasks done
laurentS opened this issue Jan 30, 2025 · 0 comments
Open
2 tasks done
Labels
bug Something isn't working needs triage Awaiting prioritization by a maintainer python Related to Python Polars

Comments

@laurentS
Copy link
Contributor

Checks

  • I have checked that this issue has not already been reported.
  • I have confirmed this bug exists on the latest version of Polars.

Reproducible example

import polars as pl

df = pl.read_csv("buggy3.csv")
# df = pl.read_csv("buggy3.csv", rechunk=True)  # does not panic
# df = pl.scan_csv("buggy3.csv")  # panics when .collect() is called after the code below


df.with_columns(
    date_added=pl.col("date").str.to_date("%Y-%m-%d"),
    account_verified=pl.col("verification").replace_strict({"yes": True, "no": False}),
)

The file looks like the sample below (a single record has a non-date value, a unix timestamp); the panic only happens above roughly 300 records.

date,verification
2023-10-09,no
2023-08-10,no
1723359600,no
2024-06-12,no

buggy3.csv (full file attached to the issue)
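
Since the attached file is not reproduced inline, a script along these lines can generate a file of the same shape (a minimal sketch: the row count and the position of the bad record are assumptions, and whether a small generated file gets chunked the same way by the reader is not guaranteed):

import polars as pl

# Sketch of a generator for a buggy3.csv-shaped file.
# Assumptions: 400 rows (above the ~300-record threshold mentioned above) and
# an arbitrary position for the single bad record; the real attachment may differ.
n_rows = 400
data = {
    "date": ["2023-10-09"] * n_rows,
    "verification": ["no"] * n_rows,
}
data["date"][5] = "1723359600"  # the single non-date value (a unix timestamp)

pl.DataFrame(data).write_csv("buggy3.csv")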

Log output

thread 'polars-0' panicked at crates/polars-core/src/series/iterator.rs:88:9:
assertion `left == right` failed: impl error
  left: 2
 right: 1
stack backtrace:
   0: rust_begin_unwind
   1: core::panicking::panic_fmt
   2: core::panicking::assert_failed_inner
   3: core::panicking::assert_failed
   4: polars_core::series::iterator::<impl polars_core::series::Series>::iter
   5: polars_core::fmt::<impl polars_core::series::Series>::fmt_list
   6: polars_core::utils::series::handle_casting_failures
   7: polars_plan::dsl::function_expr::strings::to_date
   8: polars_plan::dsl::function_expr::strings::strptime
   9: <F as polars_plan::dsl::expr_dyn_fn::ColumnsUdf>::call_udf
  10: polars_expr::expressions::apply::ApplyExpr::eval_and_flatten
  11: <polars_expr::expressions::apply::ApplyExpr as polars_expr::expressions::PhysicalExpr>::evaluate
  12: <polars_expr::expressions::alias::AliasExpr as polars_expr::expressions::PhysicalExpr>::evaluate
  13: rayon::iter::plumbing::bridge_producer_consumer::helper
  14: rayon_core::join::join_context::{{closure}}
  15: rayon::iter::plumbing::bridge_producer_consumer::helper
  16: rayon_core::thread_pool::ThreadPool::install::{{closure}}
  17: <rayon_core::job::StackJob<L,F,R> as rayon_core::job::Job>::execute
  18: rayon_core::registry::WorkerThread::wait_until_cold
note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.
Traceback (most recent call last):
  File "bug.py", line 7, in <module>
    df.with_columns(
  File ".venv/lib/python3.12/site-packages/polars/dataframe/frame.py", line 9586, in with_columns
    return self.lazy().with_columns(*exprs, **named_exprs).collect(_eager=True)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".venv/lib/python3.12/site-packages/polars/lazyframe/frame.py", line 2056, in collect
    return wrap_df(ldf.collect(callback))
                   ^^^^^^^^^^^^^^^^^^^^^
pyo3_runtime.PanicException: assertion `left == right` failed: impl error
  left: 2
 right: 1

Issue description

This seems to be the same issue as #14390 (the rechunk suggestion there works here as well).

If either of the two type conversions inside with_columns is removed, the bug does not occur.

As explained in that issue, the problem seems to lie with the chunked data produced by the CSV reader.
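
As a quick check of that hypothesis (a sketch, not part of the original report), the chunk count of the loaded frame can be inspected, and rechunking before the conversions avoids the panic, matching the commented-out rechunk=True line in the reproducible example:

import polars as pl

df = pl.read_csv("buggy3.csv")
print(df.n_chunks())  # presumably > 1 for inputs large enough to trigger the panic

# Rechunking into a single contiguous chunk before the conversions avoids the
# panic; the conversion then fails with the expected InvalidOperationError for
# the bad date instead of crashing the worker thread.
df.rechunk().with_columns(
    date_added=pl.col("date").str.to_date("%Y-%m-%d"),
    account_verified=pl.col("verification").replace_strict({"yes": True, "no": False}),
)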

Expected behavior

When running the code with an input of 272 records including the invalid date (I did not work out the exact threshold, as it is probably irrelevant), the following error is raised:

polars.exceptions.InvalidOperationError: conversion from `str` to `date` failed in column 'date' for 1 out of 271 values: ["1723359600"]

You might want to try:
- setting `strict=False` to set values that cannot be converted to `null`
- using `str.strptime`, `str.to_date`, or `str.to_datetime` and providing a format string

I would expect the same error to be raised with more records, instead of the panic.
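
For completeness, the workaround hinted at by the error message itself (an untested sketch, not a fix for the panic): passing strict=False to str.to_date turns unconvertible values into nulls, which should bypass the casting-failure reporting path that the backtrace shows panicking.

import polars as pl

df = pl.read_csv("buggy3.csv")

# Workaround sketch: strict=False yields null for the bad value instead of
# raising, so the error-formatting path in handle_casting_failures is not hit.
df.with_columns(
    date_added=pl.col("date").str.to_date("%Y-%m-%d", strict=False),
    account_verified=pl.col("verification").replace_strict({"yes": True, "no": False}),
)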

Installed versions

--------Version info---------
Polars:              1.21.0
Index type:          UInt32
Platform:            Linux-6.12.11-amd64-x86_64-with-glibc2.40
Python:              3.12.8 (main, Jan 11 2025, 09:42:09) [GCC 14.2.0]
LTS CPU:             False

----Optional dependencies----
Azure CLI            <not installed>
adbc_driver_manager  <not installed>
altair               <not installed>
azure.identity       <not installed>
boto3                1.36.9
cloudpickle          <not installed>
connectorx           <not installed>
deltalake            <not installed>
fastexcel            <not installed>
fsspec               <not installed>
gevent               <not installed>
google.auth          <not installed>
great_tables         <not installed>
matplotlib           <not installed>
numpy                2.2.2
openpyxl             <not installed>
pandas               2.2.3
pyarrow              19.0.0
pydantic             <not installed>
pyiceberg            <not installed>
sqlalchemy           2.0.37
torch                <not installed>
xlsx2csv             <not installed>
xlsxwriter           <not installed>