Commit 5b6da8a7 authored by Loïc Le Guyader's avatar Loïc Le Guyader

Merge branch 'DevelopmentRG' into 'master'

Introduce package structure, generalized binning principle, ...

See merge request !87
parents 0bd1497f f82b4a4e
Showing with 8743 additions and 95 deletions
.ipynb*
src/*.egg*
*.pyc
*__pycache__*
tmp/
###########
SCS ToolBox
###########
Kernel
######
The SCS ToolBox is designed to work in the exfel_anaconda3 environment. This can
be selected on the online cluster by:
`module load exfel exfel_anaconda3`
before launching the jupyter-notebook, or on max-jhub by selecting the 'xfel'
kernel instead of the 'Python 3' anaconda environment maintained by DESY.
Installation
############
As long as the ToolBox is not yet added to the exfel_anaconda3 environment, it needs to be installed locally.
Activate the environment mentioned above and check the installation of scs_toolbox:
.. code:: bash

    pip show toolbox_scs

If the toolbox has been installed in your home directory previously, everything is set up. Otherwise it needs to be installed (only once). In that case enter the following command in the directory where the *setup.py* script is located:

.. code:: bash

    pip install --user .

If you intend to develop code in the toolbox, use the -e flag for installation. This creates a symbolic link to the source code you are working on.

.. code:: bash

    pip install --user -e .
# SCS ToolBox
## Kernel
The SCS ToolBox is designed to work in the exfel_anaconda3 environment. This can
be selected on the online cluster by:
`module load exfel exfel_anaconda3`
before launching the jupyter-notebook, or on max-jhub by selecting the 'xfel'
kernel instead of the 'Python 3' anaconda environment maintained by DESY.
1.1.2rc1
from ToolBox.Load import *
from ToolBox.xgm import *
from ToolBox.XAS import *
from ToolBox.knife_edge import *
from ToolBox.Laser_utils import *
from ToolBox.DSSC import DSSC
from ToolBox.azimuthal_integrator import *
from ToolBox.DSSC1module import *
from ToolBox.bunch_pattern import *
from ToolBox.FastCCD import *
import numpy as np
class azimuthal_integrator(object):
    def __init__(self, imageshape, center, polar_range, dr=2, aspect=204/236):
        '''
        Create a reusable integrator for repeated azimuthal integration of
        similar images. Calculates array indices for a given parameter set
        that allow fast recalculation.

        Parameters
        ==========
        imageshape : tuple of ints
            The shape of the images to be integrated over.
        center : tuple of ints
            Center coordinates in pixels.
        polar_range : tuple of ints
            Start and stop polar angle (in degrees) to restrict integration
            to wedges.
        dr : int, default 2
            Radial width (in pixels) of the integration rings.
        aspect : float, default 204/236 for DSSC
            Aspect ratio of the pixel pitch; the default accounts for the
            non-square DSSC pixels.

        Returns
        =======
        ai : azimuthal_integrator instance
            Instance can directly be called with image data:
            > az_intensity = ai(image)
            Radial distances and the polar mask are accessible as attributes:
            > ai.distance
            > ai.polar_mask
        '''
        self.shape = imageshape
        cx, cy = center
        print(f'azimuthal center: {center}')
        sx, sy = imageshape
        xcoord, ycoord = np.ogrid[:sx, :sy]
        xcoord -= cx
        ycoord -= cy

        # distance from center, hexagonal pixel shape taken into account
        dist_array = np.hypot(xcoord * aspect, ycoord)

        # array of polar angles
        if np.abs(polar_range[1] - polar_range[0]) > 180:
            raise ValueError('Integration angle too wide, should be within 180 degrees')
        if np.abs(polar_range[1] - polar_range[0]) < 1e-6:
            raise ValueError('Integration angle too narrow')
        tmin, tmax = np.deg2rad(np.sort(polar_range)) % np.pi
        polar_array = np.arctan2(xcoord, ycoord)
        polar_array = np.mod(polar_array, np.pi)
        self.polar_mask = (polar_array > tmin) * (polar_array < tmax)

        self.maxdist = max(sx - cx, sy - cy)

        ix, iy = np.indices(dimensions=(sx, sy))
        self.index_array = np.ravel_multi_index((ix, iy), (sx, sy))

        self.distance = np.array([])
        self.flat_indices = []
        for dist in range(dr, self.maxdist, dr):
            ring_mask = self.polar_mask * (dist_array >= (dist - dr)) * (dist_array < dist)
            self.flat_indices.append(self.index_array[ring_mask])
            self.distance = np.append(self.distance, dist)

    def __call__(self, image):
        assert self.shape == image.shape, 'image shape does not match'
        image_flat = image.flatten()
        return np.array([np.nansum(image_flat[indices])
                         for indices in self.flat_indices])
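The precompute-once idea used by this class can be sketched independently of the ToolBox. The following is a toy example (made-up 64x64 image, made-up center and ring width, plain numpy): label every pixel with an integer ring index once, then reuse the labels for each new image via np.bincount.

```python
import numpy as np

# toy parameters, for illustration only (not the ToolBox API)
shape = (64, 64)
cx, cy = 32, 32
dr = 2  # radial width of each ring, in pixels

# precompute: one integer ring label per pixel
ycoord, xcoord = np.ogrid[:shape[0], :shape[1]]
dist = np.hypot(xcoord - cx, ycoord - cy)
ring = (dist // dr).astype(int)

# reuse: per-ring sums and pixel counts in one pass each
image = np.ones(shape)
ring_sum = np.bincount(ring.ravel(), weights=image.ravel())
ring_count = np.bincount(ring.ravel())
radial_mean = ring_sum / ring_count  # mean intensity per ring
```

For a flat image of ones, every ring mean is 1, which makes this a convenient sanity check of the labelling.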
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
SOURCEDIR = .
BUILDDIR = _build
# Put it first so that "make" without argument is like "make help".
help:
	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
.. code:: ipython3

    import toolbox_scs as tb
    import toolbox_scs.misc as tbm

    proposalNB = 2511
    runNB = 176

**Option 1**

This method uses functions implemented in the ToolBox.

.. code:: ipython3

    fields = ["bunchPatternTable"]
    run = tb.load(fields, runNB, proposalNB)
    bpt = run['bunchPatternTable']

    bpt_dec = tbm.extractBunchPattern(
        run['bunchPatternTable'], 'scs_ppl')
**Option 2**

This method uses functions from the euxfel_bunch_pattern package.

.. code:: ipython3

    run = tb.load_run(proposalNB, runNB)
    mnemonic = tb.mnemonics["bunchPatternTable"]
    bpt = run.get_array(*mnemonic.values())

    bpt_is_laser = tbm.is_ppl(bpt)
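Conceptually, each entry of the bunch pattern table is an integer bit field per pulse, and decoding means testing specific bits. A minimal sketch of that idea with a made-up bit position (the real encoding is what euxfel_bunch_pattern implements):

```python
import numpy as np

# LASER_BIT is a made-up flag position, for illustration only;
# it is NOT the real European XFEL bunch-pattern encoding.
LASER_BIT = 1 << 5

# one integer per pulse; set bits flag which systems fired at that pulse
table = np.array([0, LASER_BIT, LASER_BIT | 1, 0])

# boolean mask of pulses where the (hypothetical) laser bit is set
is_laser = (table & LASER_BIT) != 0
```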
# -*- coding: utf-8 -*-
#
# Configuration file for the Sphinx documentation builder.
#
# This file does only contain a selection of the most common options. For a
# full list see the documentation:
# http://www.sphinx-doc.org/en/master/config
# -- Path setup --------------------------------------------------------------
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
import sys
sys.path.insert(0, os.path.abspath('..'))
import sphinx_rtd_theme
# -- Project information -----------------------------------------------------
project = 'SCS Toolbox'
copyright = '2021, SCS'
author = 'SCS'
# The short X.Y version
version = ''
# The full version, including alpha/beta/rc tags
release = ''
# -- General configuration ---------------------------------------------------
# If your documentation needs a minimal Sphinx version, state it here.
#
# needs_sphinx = '1.0'
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
'sphinx.ext.mathjax',
'sphinx.ext.viewcode',
'sphinx.ext.coverage',
'sphinx.ext.napoleon',
'autoapi.extension',
'sphinx_rtd_theme',
]
autoapi_dirs = ['../src/toolbox_scs']
autoapi_ignore = ['*/deprecated/*']
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
#
# source_suffix = ['.rst', '.md']
source_suffix = '.rst'
# The master toctree document.
master_doc = 'index'
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
language = None
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This pattern also affects html_static_path and html_extra_path.
exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = None
# -- Options for HTML output -------------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
#html_theme = 'alabaster'
html_theme = 'sphinx_rtd_theme'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
#
# html_theme_options = {}
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static']
# Custom sidebar templates, must be a dictionary that maps document names
# to template names.
#
# The default sidebars (for documents that don't match any pattern) are
# defined by theme itself. Builtin themes are using these templates by
# default: ``['localtoc.html', 'relations.html', 'sourcelink.html',
# 'searchbox.html']``.
#
# html_sidebars = {}
# -- Options for HTMLHelp output ---------------------------------------------
# Output file base name for HTML help builder.
htmlhelp_basename = 'SCSToolboxdoc'
# -- Options for LaTeX output ------------------------------------------------
latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
#
# 'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
#
# 'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
#
# 'preamble': '',
# Latex figure (float) alignment
#
# 'figure_align': 'htbp',
}
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
(master_doc, 'SCSToolbox.tex', 'SCS Toolbox Documentation',
'SCS', 'manual'),
]
# -- Options for manual page output ------------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
(master_doc, 'scstoolbox', 'SCS Toolbox Documentation',
[author], 1)
]
# -- Options for Texinfo output ----------------------------------------------
# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
(master_doc, 'SCSToolbox', 'SCS Toolbox Documentation',
author, 'SCSToolbox', 'One line description of project.',
'Miscellaneous'),
]
# -- Options for Epub output -------------------------------------------------
# Bibliographic Dublin Core info.
epub_title = project
# The unique identifier of the text. This can be a ISBN number
# or the project homepage.
#
# epub_identifier = ''
# A unique identification for the text.
#
# epub_uid = ''
# A list of files that should not be packed into the epub file.
epub_exclude_files = ['search.html']
# -- Extension configuration -------------------------------------------------
Basic principle
===============
In a DSSC binner object we define maps, here called 'binners'. A binner specifies into which bucket each value along a certain dimension is put:

map: coordinate array along specified dimension -> array of buckets
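A minimal sketch of this principle in plain numpy (toy names and values, independent of the ToolBox): the bucket array has one label per coordinate value, and binning collapses the dimension to the set of unique labels.

```python
import numpy as np

# coordinate array along the 'pulse' dimension
pulse = np.arange(6)
# the 'binner': one bucket label per coordinate value
buckets = np.array(['image', 'dark'] * 3)

# binning reduces the dimension to the set of unique buckets
data = np.ones(6)  # one value per pulse
binned = {b: data[buckets == b].sum() for b in np.unique(buckets)}
```

Here six pulses collapse into two buckets, just as the 60 pulses in Example 1 collapse into 'image' and 'dark'.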
Example 1
=========
Bin static run
~~~~~~~~~~~~~~
.. code:: ipython3

    import os
    import logging
    import importlib
    import numpy as np
    import xarray as xr
    import pandas as pd

    import extra_data as ed
    import toolbox_scs as tb
    import toolbox_scs.detectors as tbdet
    import toolbox_scs.detectors.dssc_plot as dssc_plot

    #logging.basicConfig(level=logging.INFO)

.. code:: ipython3

    # run settings
    proposal_nb = 2711
    run_nb = 97

.. code:: ipython3

    # create an extra_data run object and collect detector information
    run = tb.load_run(proposal_nb, run_nb)
    dssc_info = tbdet.load_dssc_info(proposal_nb, run_nb)

.. code:: ipython3

    # create toolbox bin object
    bin_obj = tbdet.DSSCBinner(proposal_nb, run_nb)

.. code:: ipython3

    # create an array that maps the pulse dimension into the
    # buckets 'image' and 'dark' -> resulting dimension length = 2
    buckets_pulse = ['image', 'dark'] * 30  # put all 60 pulses into buckets 'image' and 'dark'

    # create an array that maps the trainId dimension into
    # bucket [0] -> resulting dimension length = 1
    buckets_train = np.zeros(len(run.train_ids)).astype(int)  # put all train ids into the same bucket, called 0

.. code:: ipython3

    # create binners (xarray DataArrays that act as a map)
    fpt = dssc_info['frames_per_train']
    binnertrain = tbdet.create_dssc_bins("trainId", run.train_ids, buckets_train)
    binnerpulse = tbdet.create_dssc_bins("pulse",
                                         np.linspace(0, fpt-1, fpt, dtype=int),
                                         buckets_pulse)

.. code:: ipython3

    # add binners to the bin object
    bin_obj.add_binner('trainId', binnertrain)
    bin_obj.add_binner('pulse', binnerpulse)

.. code:: ipython3

    # get a prediction of what the binned data will look like
    # (minus the module dimension)
    bin_obj.get_info()

.. parsed-literal::

    Frozen(SortedKeysDict({'trainId': 1, 'pulse': 2, 'x': 128, 'y': 512}))

.. code:: ipython3

    # bin 2 modules in parallel
    mod_list = [0, 15]
    bin_obj.bin_data(chunksize=248, modules=mod_list, filepath='./')

.. code:: ipython3

    # save metadata into a file
    fname = 'testfile.h5'
    tbdet.save_xarray(fname, binnertrain, group='binner1', mode='a')
    tbdet.save_xarray(fname, binnerpulse, group='binner2', mode='a')
Example 2
=========
Bin pump-probe data
~~~~~~~~~~~~~~~~~~~
.. code:: ipython3

    # run settings
    proposal_nb = 2212
    run_nb = 235

.. code:: ipython3

    # collect information about the run
    run_obj = tb.load_run(proposal_nb, run_nb)
    detector_info = tbdet.load_dssc_info(proposal_nb, run_nb)

.. code:: ipython3

    bin_obj = tbdet.DSSCBinner(proposal_nb, run_nb)

.. code:: ipython3

    # define buckets
    buckets_trainId = (tb.get_array(run_obj, 'PP800_PhaseShifter', 0.03)).values
    buckets_pulse = ['pumped', 'unpumped'] * 10

    # create binners
    binnerTrain = tbdet.create_dssc_bins("trainId",
                                         detector_info['trainIds'],
                                         buckets_trainId)
    binnerPulse = tbdet.create_dssc_bins("pulse",
                                         np.linspace(0, 19, 20, dtype=int),
                                         buckets_pulse)

.. code:: ipython3

    bin_obj.add_binner('trainId', binnerTrain)
    bin_obj.add_binner('pulse', binnerPulse)

.. code:: ipython3

    # get a prediction of what the binned data will look like
    # (minus the module dimension)
    bin_obj.get_info()

.. parsed-literal::

    Frozen(SortedKeysDict({'trainId': 271, 'pulse': 2, 'x': 128, 'y': 512}))

.. code:: ipython3

    # bin two modules, using the joblib module to process them in parallel
    mod_list = [0, 15]
    bin_obj.bin_data(chunksize=248, modules=mod_list, filepath='./')
doc/dssc/hist1D.png (10.1 KiB)
doc/dssc/plot1D.png (30.3 KiB)
doc/dssc/xgm_threshold.png (38.4 KiB)
``Getting started``
~~~~~~~~~~~~~~~~~~~
Installation
------------
The ToolBox may be installed in any environment. However, it depends on the extra_data and the euxfel_bunch_pattern packages, which are not standard third-party Python modules. In environments where they are not present, they need to be installed by hand.
Furthermore, as long as the ToolBox is not yet added to one of our custom environments, it needs to be installed locally. Activate your preferred environment and check the installation of scs_toolbox by typing:
.. code:: bash

    pip show toolbox_scs

If the toolbox has been installed in your home directory previously, everything is set up. Otherwise it needs to be installed (only once). In that case enter the following command from the ToolBox top-level directory:

.. code:: bash

    pip install --user .

Alternatively, use the -e flag to install the package in development mode.

.. code:: bash

    pip install --user -e .
``How to's``
~~~~~~~~~~~~
top
---
* :doc:`load run and data <load>`.
misc
----
* :doc:`bunch pattern decoding <bunch_pattern_decoding>`.
detectors (dssc)
----------------
Most of the functions within toolbox_scs.detectors can be accessed directly. This is useful during development, or when working in a non-standardized way, which is often necessary during data evaluation. For frequent routines there is the possibility to use dssc objects that guarantee a consistent data structure and reduce the amount of recurring code within the notebook.
* bin data using toolbox_scs.tbdet -> *to be documented*.
* :doc:`bin data using the DSSCBinner <dssc/DSSCBinner>`.
* post processing, data analysis -> *to be documented*
routines
--------
* *to do*
Welcome to SCS Toolbox's documentation!
=======================================
.. toctree::
   :maxdepth: 2
   :caption: Contents:
   :numbered:
   :titlesonly:
   :glob:
   :hidden:

   getting_started.rst
   howtos.rst
``Contribute``
~~~~~~~~~~~~~~
For reasons of readability, contributions preferably comply with the PEP8_ code style guidelines.
.. _PEP8: https://www.python.org/dev/peps/pep-0008/#a-foolish-consistency-is-the-hobgoblin-of-little-minds
The associated code checker, called 'flake8', can be installed via PyPI.
Module index
============
*to do* (automated doc generation)
**toolbox_scs**: Top-level entry point
**detectors**: detector specific routines
**routines**: Automated evaluations involving several instruments
**misc**: Various sub-routines and helper functions
**test**: Test environment
**util**: Package related routines
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
**Option 1**:

.. code:: python3

    import toolbox_scs as tb

    # optional, check the available mnemonics
    # print(tb.mnemonics)

    # fields is a list of available mnemonics, representing the data
    # to be loaded
    fields = ["FastADC4raw", "scannerX"]
    proposalNr = 2565
    runNr = 19

    run_data = tb.load(fields, runNr, proposalNr)

run_data is an xarray DataArray. It has an attribute called 'run' containing the underlying extra_data DataCollection.
**Option 2**:

.. code:: python3

    import toolbox_scs as tb

    # get the entry for a single data source defined in the mnemonics
    mnemonic = tb.mnemonics["scannerX"]

    proposalNr = 2565
    runNr = 19

    run = tb.load_run(proposalNr, runNr)
    run_data = run.get_array(*mnemonic.values())

run is an extra_data DataCollection and run_data an xarray DataArray for a single data source.
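The mnemonics mechanism can be illustrated with a toy table. The field names and the source string below are invented for illustration only; they are not the real ToolBox entries. The point is that each mnemonic bundles the arguments needed to fetch one data source, and its values are unpacked positionally.

```python
# hypothetical mnemonics table; keys and source string are made up
mnemonics = {
    "scannerX": {
        "source": "SOME/DEVICE/NAME",      # made-up device name
        "key": "actualPosition.value",     # made-up property key
        "dim": None,
    },
}

mnemonic = mnemonics["scannerX"]
# the values are unpacked positionally, e.g. run.get_array(*mnemonic.values())
args = tuple(mnemonic.values())
```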
@ECHO OFF
pushd %~dp0
REM Command file for Sphinx documentation
if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=.
set BUILDDIR=_build
if "%1" == "" goto help
%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
echo.installed, then set the SPHINXBUILD environment variable to point
echo.to the full path of the 'sphinx-build' executable. Alternatively you
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.http://sphinx-doc.org/
exit /b 1
)
%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
goto end
:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
:end
popd