Fix bugs from past refactoring that went unnoticed
All threads resolved!
- Resolved by Danilo Enoque Ferreira de Lima
`one_key()` was not defined for me either. Is that because I am using an older version of `extra-data`? Not sure if this was because of my hack to fix the issue or an issue in itself.
- Resolved by Danilo Enoque Ferreira de Lima
I also tried it and got another issue. I tested with:
/gpfs/exfel/data/scratch/danilo/envs/pes_to_spec/bin/exdf-reduce /gpfs/exfel/exp/SA3/202330/p900331/raw/r0070 --remove-sources SA3_XTD10_VAC/MDL/GATT_PHYSICS_UNIT SA3_XTD10_VAC/MDL/GATT_TRANSMISSION_MONITOR SCS_XTD10_IMGES/MOTOR/SWITCH_OPTICS SA3_XTD10_VAC/GAUGE/G30490D_IN SCS_XTD10_IMGES/ACTRL/MCP_GAIN SCS_XTD10_ESLIT/MDL/MAIN SCS_XTD10_IMGES/MOTOR/NAVITAR_ZOOM SA3_XTD10_VAC/MDL/GATT_P_CELL SA3_XTD10_VSLIT/MDL/BLADE SA3_XTD10_MONO/MDL/PHOTON_ENERGY SA3_XTD10_MIRR-2/MDL/FOCUS_POSITION SCS_XTD10_IMGES/MOTOR/SCREEN SA3_XTD10_VAC/GAUGE/G30480D_IN SA3_XTD10_VAC/DCTRL/AR_MODE_OK SA3_XTD10_UND/DOOCS/PHOTON_ENERGY_COLOR3 SA3_XTD10_VAC/DCTRL/D6_APERT_IN_OK SA3_XTD10_MONO/MOTOR/GRATING_AX SCS_XTD10_IMGES/CAM/BEAMVIEW_NAVITAR:daqOutput SA3_XTD10_UND/DOOCS/PHOTON_ENERGY SCS_XTD10_HSLIT/MDL/BLADE SCS_XTD10_IMGES/PROC/BEAMVIEW_NAVITAR SA3_XTD10_VAC/GAUGE/G30470D_IN SA3_XTD10_VAC/DCTRL/D12_APERT_IN_OK SA3_BR_UTC/MDL/BUNCHPATTERN_DECODER SA3_XTD10_VAC/DCTRL/N2_MODE_OK SA3_XTD10_PES/GAUGE/G30300F SA3_XTD10_MIRR-2/MOTOR/BENDER SQS_XTD10_ESLIT/MDL/MAIN SCS_XTD10_IMGES/DCTRL/LED_POWER SCS_XTD10_IMGES/MOTOR/NAVITAR_FOCUS SA3_XTD4_UND/DOOCS/UNDULATOR_CELLS SCS_XTD10_IMGES/CAM/BEAMVIEW_NAVITAR SCS_XTD10_IMGES/PROC/BEAMVIEW_NAVITAR:output SA3_XTD10_VAC/GAUGE/G30510C SA3_XTD10_UND/DOOCS/PHOTON_ENERGY_COLOR2 -o /gpfs/exfel/data/scratch/danilo/vspaper_data/dataset_A_test
Traceback:
Traceback (most recent call last):
  File "/gpfs/exfel/data/scratch/danilo/envs/pes_to_spec/bin/exdf-reduce", line 8, in <module>
    sys.exit(main())
  File "/gpfs/exfel/data/scratch/danilo/envs/pes_to_spec/lib/python3.9/site-packages/exdf/cli/reduce.py", line 254, in main
    writer.write_collection(output_path)
  File "/gpfs/exfel/data/scratch/danilo/envs/pes_to_spec/lib/python3.9/site-packages/exdf/data_reduction/red_writer.py", line 242, in write_collection
    self.write_item(
  File "/gpfs/exfel/data/scratch/danilo/envs/pes_to_spec/lib/python3.9/site-packages/exdf/data_reduction/red_writer.py", line 315, in write_item
    self.write_sequence(seq_path, seq_sources, seq_no)
  File "/gpfs/exfel/data/scratch/danilo/envs/pes_to_spec/lib/python3.9/site-packages/exdf/write/sd_writer.py", line 118, in write_sequence
    self.write_base(f, sources, sequence)
  File "/gpfs/exfel/data/scratch/danilo/envs/pes_to_spec/lib/python3.9/site-packages/exdf/data_reduction/red_writer.py", line 320, in write_base
    super().write_base(f, sources, sequence)
  File "/gpfs/exfel/data/scratch/danilo/envs/pes_to_spec/lib/python3.9/site-packages/exdf/write/sd_writer.py", line 151, in write_base
    train_ids, *index_dsets = get_index_root_data(sources)
  File "/gpfs/exfel/data/scratch/danilo/envs/pes_to_spec/lib/python3.9/site-packages/exdf/write/sd_writer.py", line 378, in get_index_root_data
    sel_timestamps = np.array(fa.file['INDEX/timestamp'][sel_rows])
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "/gpfs/exfel/data/scratch/danilo/envs/pes_to_spec/lib/python3.9/site-packages/h5py/_hl/dataset.py", line 841, in __getitem__
    selection = sel.select(self.shape, args, dataset=self)
  File "/gpfs/exfel/data/scratch/danilo/envs/pes_to_spec/lib/python3.9/site-packages/h5py/_hl/selections.py", line 82, in select
    return selector.make_selection(args)
  File "h5py/_selector.pyx", line 282, in h5py._selector.Selector.make_selection
  File "h5py/_selector.pyx", line 215, in h5py._selector.Selector.apply_args
TypeError: Indexing elements must be in increasing order
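For context, the final `TypeError` comes from a general h5py restriction: fancy indexing with an array of row indices requires the indices to be strictly increasing. A minimal sketch reproducing the error and the usual sort-then-unsort workaround (the file and dataset names here are illustrative, not from `exdf`):

```python
import numpy as np
import h5py

# In-memory HDF5 file (core driver, nothing written to disk).
with h5py.File("demo.h5", "w", driver="core", backing_store=False) as f:
    dset = f.create_dataset("INDEX/timestamp", data=np.arange(10))

    rows = np.array([5, 2, 7])  # unsorted selection, like sel_rows in the traceback
    try:
        dset[rows]  # h5py rejects out-of-order fancy indexing
    except TypeError as exc:
        print(exc)  # "Indexing elements must be in increasing order"

    # Workaround: read with sorted indices, then restore the requested order.
    order = np.argsort(rows)
    data_sorted = dset[rows[order]]      # sorted read succeeds
    data = np.empty_like(data_sorted)
    data[order] = data_sorted            # undo the sort
    print(data)                          # values at rows 5, 2, 7
```

So one plausible fix on the `exdf` side is to sort `sel_rows` before the read at `sd_writer.py:378` and reorder afterwards, as sketched above.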
mentioned in merge request !16 (merged)
mentioned in commit 584a2ffc
mentioned in merge request !17 (merged)