
[LPD][Correct] Fix axis order for LPD-1M RelativeGain constant

Merged Thomas Kluyver requested to merge fix/lpd-relgain-axis-order into master

Description

From discussion with @yousefh last week, we realised that the RelativeGain constant is being interpreted in the wrong axis order. It should be (slow_scan, fast_scan, memory_cells, gain_stage) like the other gain constants, whereas we were treating it as (fast_scan, slow_scan, memory_cells, gain_stage) like the dark constants.
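The change amounts to swapping the first two axes when the constant is loaded. A minimal sketch with NumPy (variable names and shapes here are illustrative assumptions, not the actual pycalibration code):

```python
import numpy as np

# Hypothetical RelativeGain constant as previously (mis)interpreted, with the
# dark-constant axis order: (fast_scan, slow_scan, memory_cells, gain_stage).
n_fast, n_slow, n_cells, n_gains = 128, 256, 512, 3
rel_gain_old_order = np.ones((n_fast, n_slow, n_cells, n_gains), dtype=np.float32)

# The fix: interpret the constant as (slow_scan, fast_scan, memory_cells,
# gain_stage), like the other gain constants, i.e. swap the first two axes.
rel_gain = rel_gain_old_order.transpose(1, 0, 2, 3)

assert rel_gain.shape == (n_slow, n_fast, n_cells, n_gains)
```

Note that for an all-ones constant the transpose is a no-op, which is why older data is unaffected.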

This is not as bad as it sounds: RelativeGain was apparently set to all 1s until recently (2023-05-12), making it effectively a no-op, and swapping the axes of an all-ones array changes nothing. So the fix shouldn't change anything for older data.

Assuming this change is correct, we need to make the same adjustment in calng (cc @hammerd).

How Has This Been Tested?

Prepared new corrected data in /gpfs/exfel/exp/FXE/202304/p003338/scratch/r0125_recal_relgain_axisorder, which Hazem compared with corrected data in proc/ and with his own corrections.

xfel-calibrate lpd CORRECT --karabo-da LPD00 LPD01 LPD02 LPD03 LPD04 LPD05 LPD06 LPD07 LPD08 LPD09 LPD10 LPD11 LPD12 LPD13 LPD14 LPD15 \
    --creation-time "2023-04-21 04:02:41" \
    --mem-cells 512 --bias-voltage 250.0 --capacitor 5pf --photon-energy 9.2 --category 0 \
    --offset-corr --rel-gain --ff-map --gain-amp-map --chunks-data 1 --num-workers 6 --use-cell-order \
    --in-folder /gpfs/exfel/exp/FXE/202304/p003338/raw --out-folder /gpfs/exfel/exp/FXE/202304/p003338/scratch/r0125_recal_relgain_axisorder \
    --karabo-id FXE_DET_LPD1M-1 --run 125

Types of changes

  • Bug fix (non-breaking change which fixes an issue)

Checklist:

  • My code follows the code style of this project.

Reviewers

@schmidtp
