pytables error when resuming training with robustness repo #4

@dapello

Description

Hello,
First, thanks for making this and sharing it, it seems very helpful!

I'm having an issue when I resume training with the robustness library. I wasn't sure whether to post it here or there, but it seems to be an issue with the store, so I figured I'd throw it here.

In short, let's say I start training with:

python -m robustness.main --dataset restricted_imagenet --data /data/ImageNet/ILSVRC2012 --adv-train=0 --arch resnet50 --out-dir ./logs/ --exp-name test_resnet

When I resume training, the args string being stored is longer than the original, i.e.:

python -m robustness.main --dataset restricted_imagenet --data /data/ImageNet/ILSVRC2012 --adv-train=0 --arch resnet50 --out-dir ./logs/ --exp-name test_resnet --resume ./logs/test_resnet/0_checkpoint.pt

I get the following error:

Traceback (most recent call last):
  File "/nobackup/users/dapello/projects/condas/p36torch/miniconda3/envs/p36torch/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/nobackup/users/dapello/projects/condas/p36torch/miniconda3/envs/p36torch/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/nobackup/users/dapello/projects/neurips2020/robustness/robustness/main.py", line 112, in <module>
    store = setup_store_with_metadata(args)
  File "/nobackup/users/dapello/projects/neurips2020/robustness/robustness/main.py", line 103, in setup_store_with_metadata
    store['metadata'].append_row(args_dict)
  File "/nobackup/users/dapello/projects/neurips2020/cox/cox/store.py", line 296, in append_row
    self.flush_row()
  File "/nobackup/users/dapello/projects/neurips2020/cox/cox/store.py", line 417, in flush_row
    self._HDFStore.append(self._name, df, format='table')
  File "/nobackup/users/dapello/projects/condas/p36torch/miniconda3/envs/p36torch/lib/python3.6/site-packages/pandas/io/pytables.py", line 1182, in append
    errors=errors,
  File "/nobackup/users/dapello/projects/condas/p36torch/miniconda3/envs/p36torch/lib/python3.6/site-packages/pandas/io/pytables.py", line 1709, in _write_to_group
    data_columns=data_columns,
  File "/nobackup/users/dapello/projects/condas/p36torch/miniconda3/envs/p36torch/lib/python3.6/site-packages/pandas/io/pytables.py", line 4143, in write
    data_columns=data_columns,
  File "/nobackup/users/dapello/projects/condas/p36torch/miniconda3/envs/p36torch/lib/python3.6/site-packages/pandas/io/pytables.py", line 3813, in _create_axes
    errors=self.errors,
  File "/nobackup/users/dapello/projects/condas/p36torch/miniconda3/envs/p36torch/lib/python3.6/site-packages/pandas/io/pytables.py", line 4823, in _maybe_convert_for_string_atom
    eci = existing_col.validate_col(itemsize)
  File "/nobackup/users/dapello/projects/condas/p36torch/miniconda3/envs/p36torch/lib/python3.6/site-packages/pandas/io/pytables.py", line 2036, in validate_col
    f"Trying to store a string with len [{itemsize}] in "
ValueError: Trying to store a string with len [997] in [values_block_0] column but
this column has a limit of [941]!
Consider using min_itemsize to preset the sizes on these columns

Any thoughts? I quickly skimmed the code for a way to set min_itemsize, but didn't see an obvious way to pass it in through the args.
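For reference, the same behavior reproduces with plain pandas/pytables, outside of cox entirely. A minimal sketch (the key/column names here are made up for illustration): the first table-format append fixes the width of the string column, and `min_itemsize={"values": ...}` on that first append is the workaround the error message suggests.

```python
import os
import tempfile

import pandas as pd

with tempfile.TemporaryDirectory() as tmp:
    # First append fixes the string column width; a later, longer
    # string then raises the ValueError above.
    path = os.path.join(tmp, "store.h5")
    with pd.HDFStore(path) as store:
        store.append("metadata", pd.DataFrame({"args": ["short args string"]}),
                     format="table")
        try:
            store.append("metadata", pd.DataFrame({"args": ["x" * 1000]}),
                         format="table")
        except ValueError as e:
            print(e)  # "Trying to store a string with len [...] ..."

    # Presetting min_itemsize on the first append avoids the error.
    path_ok = os.path.join(tmp, "store_ok.h5")
    with pd.HDFStore(path_ok) as store:
        store.append("metadata", pd.DataFrame({"args": ["short args string"]}),
                     format="table", min_itemsize={"values": 2000})
        store.append("metadata", pd.DataFrame({"args": ["x" * 1000]}),
                     format="table")  # now succeeds
```

So if cox exposed a way to pass `min_itemsize` through to `HDFStore.append` (or preset a generous width for the metadata table), resuming with a longer args string would presumably work.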
