diff --git a/docs/source/changelog.rst b/docs/source/changelog.rst
index 5cd240c18df808915a1dd3fa0d2e03ae4363c44d..e06418d01a4bcd486964eac9e4c0865ff3a0bbf9 100644
--- a/docs/source/changelog.rst
+++ b/docs/source/changelog.rst
@@ -1,31 +1,108 @@
 Release Notes
 =============
 
+3.5.0a0
+-------
+
+- [AGIPD] Support correcting only specific trains.
+- [AGIPD] Use PPU device to select trains to correct.
+- [AGIPD] Faster transposition of constants.
+- [AGIPD] Add option to skip plots.
+- Add option to skip report generation in xfel-calibrate.
+- Add option to skip env freezing in xfel-calibrate.
+- Add warning if xfel-calibrate output may not be reproducible.
+
+3.4.3
+-----
+- Update pyDetLib tag.
+- Add explicit dependencies on matplotlib, scipy.
+- Remove outdated matplotlib rcParams setting.
+- Update EXtra-geom to 1.6.
+- Remove cluster_profile parameter from notebooks which don't use it.
+- Fix checking availability for the concurrency parameter.
+- Fix launching work directly (not via Slurm).
+- Fix recreation of the `sphinx-rep` temp folder if it already existed.
+- Fix missing string conversion for slurm-scheduling argument.
+- Fix report titles for multiple detectors per run folder.
+- Append to .out files for preemptable finalize job.
+- [AGIPD] [Correct] Reuse previously found constants.
+- [AGIPD] Fix missing memory cell index in SlopesPC constant sanitization.
+- [AGIPD] Only use bad pixels from darks in agipdutils.baseline_correct_via_stripes.
+- [AGIPD] [Dark] Use function to get list of karabo_da from run for making Slurm jobs.
+- [EPIX100][CORRECT] Set absolute_gain to false if relative gain was not retrieved.
+- [JUNGFRAU] Fix running for multiple modules and flip logic for do_relative_gain.
+- [JUNGFRAU] Style changes for Dark and Correct notebooks.
+- [REMI] Add notebook to reconstruct detector hits from raw data.
+- [webservice] Check run migration status using MyMDC.
+- Resolve "Skip ZMQ tests if zmq connection for calibration DB not available".
+- Reproducibility, step 1.
+
+
+3.4.2
+-----
+
+- Remove driver=core from all notebooks.
+- [webservice] Make use of Dynaconf for managing secrets.
+- [webservice] Make use of dedicated slurm partitions.
+- [webservice] Handle missing migration information (missing user.status fattr).
+- [webservice] Implement, raise, and catch, migration errors to send mdc messages.
+- [webservice] Simplify handling of user notebook paths.
+- [webservice] Update princess to 0.4 (use Unix sockets).
+- [webservice] Update MyMDC with begin and end times.
+- [webservice] Create output folder before copying slow data.
+- [AGIPD] [CORRECT] Read acq_rate from slow data.
+- [AGIPD][CORRECT] Set default memory cells to 352.
+- [AGIPD] [CORRECT] Set maximum pulses to correct based on file content.
+- [AGIPD] [FF] Correctly label legends in figures.
+- [AGIPD] [FF] Add HIBEF AGIPD500K and fix some issue with retrieval of conditions.
+- [Jungfrau] Add Gain setting to Jungfrau notebooks.
+- [Jungfrau] Fix max gain plot in LPD correct notebook.
+- [JUNGFRAU] [DARK] Clearer error message when no suitable files are found for Jungfrau Dark notebooks.
+- [LPD] [CORRECT] Fix max gain plot.
+- [EPIX100] [CORRECT] Solve conflict between gain correction and clustering.
+
+
+3.4.1
+-----
+
+- Update h5py to 3.3.
+- Stop execution on notebook errors.
+- [AGIPD] Add integration time as operating condition to all notebooks.
+- [webservice] Add blocklist pattern when copying untouched files in webservice.
+- [webservice] Expose dark configurations in update_config.py.
+- Fix MetadataClient.get_proposal_runs arguments call.
+- Fix: Use snapshot for injecting constants for old PDU mappings.
+- Fix the old time-summary (creation time for retrieved constants).
+- Update documentation notes on venv installation.
+- Ignore all .so files in gitignore.
+
+
 3.4.0
 -----
 
-- Update to Python 3.8
-- Update report upload parameter key
-- Try to make pre-commit pass
-- [AGIPD][DARK] Fix / Avoid processing empty sequence files
-- Bump numpy to 1.20.3 and remove fabio
-- Fix/filename lineno webservice logs
-- Override locale to always use UTF-8
-- Fix plotting-related warnings
-- Fix user notebook path for REMI correct notebook provisionally
-- Fix data paths in LPD notebook
-- Test update config
-- Test get_from_db and send_to_db
-- Fix typo in log message
-- [AGIPD][JF]Fix: Use all available sequences for processing darks
-- add common mode correction to epix correction notebook
-- Parallelise gain/mask compression for writing corrected AGIPD files
-- remove PyQT dependency
-- Disable dark requests from serve overview
-- Remove line that creates a run folder within the run output folder
-- Show clearer messages when running webservice in sim mode
-- Use full hostname for webservice overview
-- Use argparse only if name is main, call main with args dict
-- Assorted cleanup of xfel-calibrate
-- [AGIPD][DARK] Correctly retrieve old constants for comparison
-- [AGIPD][PC][FF] FIX/Update pc and ff notebooks with new calcat mapping
\ No newline at end of file
+- Update to Python 3.8.
+- Bump numpy to 1.20.3 and remove fabio.
+- remove PyQT dependency.
+- Disable dark requests from serve overview.
+- Update report upload parameter key.
+- Override locale to always use UTF-8.
+- Assorted cleanup of xfel-calibrate.
+- Fix pre-commit.
+- Use argparse only if name is main, call main with args dict.
+- [webservice] Use full hostname for webservice overview.
+- [webservice] Show clearer messages when running webservice in sim mode.
+- [webservice] Fix filename lineno and typos in webservice logs.
+- [webservice] Fix creating an extra run folder in run output folder.
+- [AGIPD] Parallelize gain/mask compression for writing corrected AGIPD files.
+- [AGIPD][DARK] Fix processing empty sequence files.
+- [AGIPD][PC][FF] Update notebooks with new CALCAT mapping.
+- [AGIPD][JUNGFRAU] Use all available sequences for processing darks.
+- [AGIPD][LPD][DSSC] Fix retrieval of old constants for comparison for modular detectors.
+- [LPD] Fix data paths in LPD notebook.
+- [REMI] Fix user notebook path for REMI correct notebook provisionally.
+- [EPIX][CORRECT] Add Common mode correction.
+- Fix plotting-related warnings.
+- Test update config.
+- Test get_from_db and send_to_db.
\ No newline at end of file
diff --git a/docs/source/configuration.rst b/docs/source/configuration.rst
index d7be92673758d13af39df4102b7c8a841dbe5bea..bd508db2afe3408d5f5b20e32420ebd9768513f4 100644
--- a/docs/source/configuration.rst
+++ b/docs/source/configuration.rst
@@ -61,8 +61,8 @@ The configuration is given in the form of a python dictionary::
          }
      }
 
-The first key is the detector, e.g. AGIPD. The second key is the calibration action name, e.g. DARK or PC.
-A dictionary is expected for each action with a notebook path and concurrency configuration.
+The first key is the detector, e.g. AGIPD. The second key is the calibration type name, e.g. DARK or PC.
+A dictionary is expected for each calibration type with a notebook path and concurrency configuration.
 For the concurrency three values are expected. Key `parameter` with a value name of type list, which is defined in the first notebook cell.
 The key `default concurrency` to define the range of values for `parameter` in each concurrent notebook, if it is not defined by the user.
 e.g. `"default concurrency": 16` leads to running 16 concurrent jobs, each processing one module with values of [0,1,2,...,15].
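+
+A minimal sketch of such a dictionary (the path and values are illustrative,
+not the deployed configuration)::
+
+    notebooks = {
+        "AGIPD": {
+            "DARK": {
+                "notebook": "notebooks/AGIPD/Characterize_AGIPD_Dark.ipynb",
+                "concurrency": {
+                    "parameter": "modules",
+                    "default concurrency": 16,
+                },
+            },
+        },
+    }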
diff --git a/docs/source/how_it_works.rst b/docs/source/how_it_works.rst
deleted file mode 100644
index 8b137891791fe96927ad78e64b0aad7bded08bdc..0000000000000000000000000000000000000000
--- a/docs/source/how_it_works.rst
+++ /dev/null
@@ -1 +0,0 @@
-
diff --git a/docs/source/index.rst b/docs/source/index.rst
index 7deab968aa2d7cdd403c8d184bbe177188379b60..cc3e055ea6c82ffbd12c31a34b446115a380c458 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -4,7 +4,7 @@
    contain the root `toctree` directive.
 
 European XFEL Offline Calibration
-=================================================
+=================================
 
 The European XFEL Offline Calibration (pyCalibration) is a python package that consists of
 different services, responsible for applying most of the offline calibration
@@ -13,9 +13,10 @@ and characterization for the detectors.
 Running a calibration
 ---------------------
 
-The tool utilizes tools such as nbconvert_ and nbparameterise_
+It utilizes tools such as nbconvert_ and nbparameterise_
 to expose Jupyter_ notebooks to a command line interface.
 In the process reports are generated from these notebooks.
+
 The general interface is::
 
     % xfel-calibrate DETECTOR TYPE
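+
+For example, assuming the AGIPD DARK calibration type is configured::
+
+    % xfel-calibrate AGIPD DARK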
diff --git a/docs/source/installation.rst b/docs/source/installation.rst
index 90bce9c185204de528a1ed2320fdce354e384008..594f54ada19351e1d9efd9cc39b16d75f5338e95 100644
--- a/docs/source/installation.rst
+++ b/docs/source/installation.rst
@@ -1,3 +1,5 @@
+.. _installation:
+
 ************
 Installation
 ************
diff --git a/docs/source/tutorial.rst b/docs/source/tutorial.rst
index 609d97bf6119c8ef034eede79a943f7a2a88202e..73018ff118c4c350ea3e6b17e6a398f5264da329 100644
--- a/docs/source/tutorial.rst
+++ b/docs/source/tutorial.rst
@@ -25,41 +25,7 @@ This will open a jupyter kernel running in your browser where you can then open
   ipcluster start --n=4 --profile=tutorial
 
 you can step through the cells and run them.
-If you run this notebook using the xfel-calibrate command as explaind at the end of this tutorial you do not need to start the cluster yourself, it will be done by the framework.
-
-
-Installation and Configuration
-------------------------------
-
-The offline calibration tool-chain is optimised to run on the maxwell cluster.
-For more information refer to the Maxwell_ documentation.
-
-.. _Maxwell: https://confluence.desy.de/display/IS/Running+Jobs+on+Maxwell
-
-In order to use the offline calibration tool a few steps need to be carried out
-to install the necessary packages and setup the environment:
-
-1. Log into max-exfl with you own user name/account.
-
-2. Install karabo in your home directory or under /gpfs/exfel/data/scratch/username
-   by typing the following commands on you shell::
-
-     wget http://exflserv05.desy.de/karabo/karaboFramework/tags/2.2.4/karabo-2.2.4-Release-CentOS-7-x86_64.sh
-
-     chmod +x karabo-2.2.4-Release-CentOS-7-x86_64.sh
-
-     ./karabo-2.2.4-Release-CentOS-7-x86_64.sh
-
-     source karabo/activate
-
-3. Get the package pycalibration which contains the offline calibration tool-chain::
-
-     git clone https://git.xfel.eu/gitlab/detectors/pycalibration.git
-
-4. Install the necessary requirements and the package itself::
-
-     cd pycalibration
-     pip install -r requirements.txt .
+If you run this notebook using the xfel-calibrate command as explained at the end of this tutorial you do not need to start the cluster yourself, it will be done by the framework.
 
 
 Create your own notebook
@@ -87,7 +53,7 @@ Running the notebook
 
    You can see your job in the queue with::
 
-     squeue -u username
+     squeue -u $USERNAME
 
 3. Look at the generated report in the chosen output folder.
 4. More information on the job run on the cluster can be found in the temp folder.
diff --git a/docs/source/workflow.rst b/docs/source/workflow.rst
index 0b9f0c6ff2d8870b18b0f85959c62d548f1c0ca5..a9369b4fb016aab521711f070809daa328e4116a 100644
--- a/docs/source/workflow.rst
+++ b/docs/source/workflow.rst
@@ -4,7 +4,7 @@ Development Workflow
 ====================
 
 The following walkthrough will guide you through a possible workflow
-when developing new offline calibration tools.
+when developing a new notebook for offline calibration.
 
 Fresh Start
 -----------
@@ -12,7 +12,7 @@ Fresh Start
 If you are starting a blank notebook from scratch you should first
 think about a few preconsiderations:
 
-* Will the notebook performan a headless task, or will it also be
+* Will the notebook perform a headless task, or will it also be
   an important interface for evaluating the results in form of a
   report.
 * Do you need to run concurrently? Is concurrency handled internally,
@@ -25,7 +25,7 @@ cells in the notebook. You should also structure it into appropriate
 subsections.
 
 If you plan on running concurrently on the cluster, identify which variable
-should be mapped to concurent runs. For autofilling it an integer list is
+should be mapped to concurrent runs. For autofilling, an integer list is
 needed.
 
 Once you've clarified the above points, you should create a new notebook,
@@ -139,7 +139,7 @@ to the following parameters being exposed via the command line::
 
 .. note::
 
-    Nbparameterise can only parse the mentioned subset of variable types. An expression
+    nbparameterise_ can only parse the mentioned subset of variable types. An expression
     that evaluates to such a type will note be recognized: e.g. `a = list(range(3))` will
     not work!
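+
+For example, a first cell that nbparameterise_ can handle contains only plain
+literals (the names and values here are illustrative)::
+
+    in_folder = ""  # input folder, a plain string literal
+    modules = [0, 1, 2, 3]  # integer list, suitable for concurrent mapping
+    sequences = [-1]  # plain list literal; list(range(3)) would not be parsed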
 
@@ -170,7 +170,7 @@ Best Coding Practices
 In principle there a not restrictions other than that parameters that are exposed to the
 command line need to be defined in the first code cell of the notebook.
 
-However, a few guidelines should be observered to make notebook useful for display as
+However, a few guidelines should be observed to make notebooks useful for display as
 reports and usage by other.
 
 External Libraries
@@ -182,8 +182,8 @@ wanting to run the tool will need to install these requirements as well. Thus,
 * do not use a specialized tool if an accepted alternative exists. Plots e.g. should usually
   be created using matplotlib_ and numerical processing should be done in numpy_.
 
-* keep runtimes and library requirements in mind. A library doing its own parallelism either
-  needs to programatically be able to set this up, or automatically do so. If you need to
+* keep runtime and library requirements in mind. A library doing its own parallelism either
+  needs to programmatically be able to set this up, or automatically do so. If you need to
   start something from the command line first, things might be tricky as you will likely
   need to run this via `POpen` commands with appropriate environment variable.
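+
+A sketch of such a launch (the tool name and environment variable are
+hypothetical)::
+
+    import os
+    import subprocess
+
+    env = dict(os.environ, OMP_NUM_THREADS="1")  # constrain the tool's own parallelism
+    with subprocess.Popen(["external_tool", "--input", "run0001.h5"], env=env) as proc:
+        proc.wait()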
 
@@ -195,9 +195,9 @@ such that it is available as soon as possible. Detailed plotting and inspection
 possibly done later on in a notebook.
 
 Also consider using HDF5 via h5py_ as your output format. If you correct or calibrated
-input data, which adhears to the XFEL naming convention, you should maintain the convention
+input data, which adheres to the XFEL naming convention, you should maintain the convention
 in your output data. You should not touch any data that you do not actively work on and
-should assure that the `INDEX` and identifier entries are syncronized with respect to
+should assure that the `INDEX` and identifier entries are synchronized with respect to
 your output data. E.g. if you remove pulses from a train, the `INDEX/.../count` section
 should reflect this.
 
@@ -233,10 +233,10 @@ a context. Make sure to label your axes.
 
 Also make sure the plots are readable on an A4-sized PDF page; this is the format the notebook
 will be rendered to for report outputs. Specifically, this means that figure sizes should not
-exeed approx 15x15 inches.
+exceed approx 15x15 inches.
 
 The report will contain 150 dpi png images of your plots. If you need higher quality output
-of individual plot files you should save these separetly, e.g. via `fig.savefig(...)` yourself.
+of individual plot files you should save these separately, e.g. via `fig.savefig(...)` yourself.
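+
+A sketch (figure size and filename are illustrative)::
+
+    import matplotlib.pyplot as plt
+
+    fig, ax = plt.subplots(figsize=(8, 6))  # well under the ~15x15 inch limit
+    ax.plot(range(10))
+    ax.set_xlabel("pulse index")
+    ax.set_ylabel("signal (ADU)")
+    fig.savefig("signal.png", dpi=300)  # separate high-resolution copy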
 
 
 Calibration Database Interaction
@@ -245,7 +245,7 @@ Calibration Database Interaction
 Tasks which require calibration constants or produce such should do this by interacting with
 the European XFEL calibration database.
 
-In terms of developement workflow it is usually easier to work with file-based I/O first and
+In terms of development workflow it is usually easier to work with file-based I/O first and
 only switch over to the database after the algorithmic part of the notebook has matured.
 Reasons for this include:
 
@@ -261,7 +261,7 @@ documentation.
 Testing
 -------
 
-The most important test is that your notebook completes flawlessy outside any special
+The most important test is that your notebook completes flawlessly outside any special
 tool chain feature. After all, the tool chain will only replace parameters, and then
 launch a concurrent job and generate a report out of notebook. If it fails to run in the
 normal Jupyter notebook environment, it will certainly fail in the tool chain environment.
@@ -274,11 +274,11 @@ Specifically, you should verify that all arguments are parsed correctly, e.g. by
 
     xfel-calibrate DETECTOR NOTEBOOK_TYPE --help
 
-From then on, check include if parallel slurm jobs are exectuted correctly and if a report
+From then on, check if parallel Slurm jobs are executed correctly and if a report
 is generated at the end.
 
 Finally, you should verify that the report contains the information you'd like to convey and
-is inteligable to people other than you.
+is intelligible to people other than you.
 
 .. note::
 
diff --git a/docs/source/xfel_calibrate_conf.rst b/docs/source/xfel_calibrate_conf.rst
index 54e6211c627da473dc3815782654fad71afbcbb5..52d1bdf5cf0fe28f1a299e5e057d36927d1de97b 100644
--- a/docs/source/xfel_calibrate_conf.rst
+++ b/docs/source/xfel_calibrate_conf.rst
@@ -3,4 +3,7 @@ xfel_calibrate
 
 .. module:: xfel_calibrate.calibrate
 
-.. autofunction:: balance_sequences
\ No newline at end of file
+.. autofunction:: balance_sequences
+
+
+:py:mod:`xfel_calibrate.notebooks`
\ No newline at end of file