| field | value | timestamp |
|---|---|---|
| author | Noah D. Brenowitz <nbren12@gmail.com> | 2021-04-23 00:50:38 -0700 |
| committer | Noah D. Brenowitz <nbren12@gmail.com> | 2021-04-23 17:57:01 -0700 |
| commit | ceeaf2d0669cc5a4d86e6313cc0fdb6774659884 (patch) | |
| tree | f2f84390c74ea70d4a610f0bcfaa833af360e74a /pkgs/development/python-modules/python-mapnik/python-mapnik_std_optional.patch | |
| parent | c21475e7e8aeaa38b13af380c64da59690056086 (diff) | |
python3Packages.dask: fix sandboxed builds
Importing dask.dataframe in a sandboxed build results in a TypeError like
this:

```
  File "/nix/store/nv60iri29bia4szhhcvsdxgsci4wxvp6-python3.8-dask-2021.03.0/lib/python3.8/site-packages/dask/dataframe/io/csv.py", line 392, in <module>
    AUTO_BLOCKSIZE = auto_blocksize(TOTAL_MEM, CPU_COUNT)
  File "/nix/store/nv60iri29bia4szhhcvsdxgsci4wxvp6-python3.8-dask-2021.03.0/lib/python3.8/site-packages/dask/dataframe/io/csv.py", line 382, in auto_blocksize
    blocksize = int(total_memory // cpu_count / memory_factor)
TypeError: unsupported operand type(s) for //: 'int' and 'NoneType'
```
This occurs because dask.dataframe has a non-deterministic component: at
import time it computes an automatic chunk size from system information
(total memory and CPU count), and the CPU count is unavailable (None) in
the build sandbox. This went unnoticed because the dask tests were
disabled.
Changes:
- add a patch making the chunk-size inference more robust
- re-enable the tests
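To illustrate the failure mode, the sketch below models the heuristic shown in the traceback. The `memory_factor` value and the 64 MB cap are assumptions for illustration, and `auto_blocksize_robust` is a hypothetical hardened variant, not necessarily the patch this commit applies:

```python
def auto_blocksize(total_memory, cpu_count):
    # Heuristic as seen in the traceback: divides total memory by the
    # CPU count. Raises TypeError when cpu_count is None, as happens
    # in the sandboxed build.
    memory_factor = 10  # assumed value for illustration
    blocksize = int(total_memory // cpu_count / memory_factor)
    return min(blocksize, int(64e6))  # assumed cap for illustration

def auto_blocksize_robust(total_memory, cpu_count):
    # Hypothetical robust variant: fall back to a single CPU when the
    # count cannot be determined, so importing no longer crashes.
    memory_factor = 10
    cpu_count = cpu_count or 1
    blocksize = int(total_memory // cpu_count / memory_factor)
    return min(blocksize, int(64e6))

# With 8 GiB of memory and an undetermined CPU count, the robust
# variant returns the capped blocksize instead of raising.
print(auto_blocksize_robust(8 * 1024**3, None))
```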
Resolves #120307
