# Offline Calibration
The offline calibration (`pycalibration`) package consists of different services
responsible for applying most of the offline calibration and characterization
for the detectors.
## Offline Calibration Installation
It's recommended to install the offline calibration (pycalibration) package on
Maxwell, using the suggested Python virtual environment.
The following instructions clone from the EuXFEL GitLab instance using SSH
remote URLs; this assumes that you have already set up SSH keys for use with
GitLab. If you have not, read the appendix section on
[SSH Key Setup for GitLab](#ssh-key-setup-for-gitlab) for instructions on how to do this.
### Installation using python virtual environment - recommended
`pycalibration` uses Python 3.11. Currently the default Python installation on
Maxwell is still Python 3.9, so Python 3.11 needs to be loaded from a different
location.
Therefore `pyenv` is used. We provide a pyenv installation at
`/gpfs/exfel/sw/calsoft/.pyenv`, which we use to manage different versions of
Python. It can be activated with `source /gpfs/exfel/sw/calsoft/.pyenv/bin/activate`.
A quick setup would be:
1. `source /gpfs/exfel/sw/calsoft/.pyenv/bin/activate` - activate the pyenv installation
2. `git clone ssh://git@git.xfel.eu:10022/detectors/pycalibration.git && cd pycalibration` - clone the offline calibration package from EuXFEL GitLab
3. `pyenv shell 3.11.9` - load the required version of Python
4. `python3 -m venv .venv` - create the virtual environment
5. `source .venv/bin/activate` - activate the virtual environment
6. `python3 -m pip install --upgrade pip` - upgrade the version of pip
7. `python3 -m pip install .` - install the pycalibration package (add the `-e` flag for an editable development installation)
Copy/paste script:
```bash
source /gpfs/exfel/sw/calsoft/.pyenv/bin/activate
git clone ssh://git@git.xfel.eu:10022/detectors/pycalibration.git
cd pycalibration
pyenv shell 3.11.9
python3 -m venv .venv
source .venv/bin/activate
python3 -m pip install --upgrade pip
python3 -m pip install . # `-e` flag for editable install, e.g. `pip install -e .`
```
### Installation into user home directory
This is not recommended: `pycalibration` has pinned dependencies for stability,
and if you install it directly into your home environment it will
downgrade/upgrade your local packages, which may cause major issues and may
**break your local environment**. It is highly recommended to use the venv
installation method instead.
1. `source /gpfs/exfel/sw/calsoft/.pyenv/bin/activate` - activate the pyenv installation
2. `git clone ssh://git@git.xfel.eu:10022/detectors/pycalibration.git && cd pycalibration` - clone the offline calibration package from EuXFEL GitLab
3. `pyenv shell 3.11.9` - load the required version of Python
4. `pip install --user .` - install the pycalibration package into your user environment (add the `-e` flag for an editable development installation)
5. `export PATH=$HOME/.local/bin:$PATH` - make sure that `$HOME/.local/bin` is in the PATH environment variable
Copy/paste script:
```bash
source /gpfs/exfel/sw/calsoft/.pyenv/bin/activate
git clone ssh://git@git.xfel.eu:10022/detectors/pycalibration.git
pyenv shell 3.11.9
cd pycalibration
pip install --user . # `-e` flag for editable install, e.g. `pip install -e .`
export PATH=$HOME/.local/bin:$PATH
```
### Working with Jupyter Notebooks
If you plan to work with Jupyter notebooks interactively, you have two main options:
#### Option 1: Install Jupyter Notebook locally
If you prefer to run Jupyter notebooks on your local machine or on Maxwell, you can install the `notebook` package in your virtual environment:
```bash
python3 -m pip install notebook
```
After installation, you can start a Jupyter notebook server by running:
```bash
jupyter notebook
```
#### Option 2: Use max-jhub (Recommended)
Alternatively, we recommend using max-jhub, a JupyterHub instance available at DESY
(<https://max-jhub.desy.de/hub/login>). This option provides a convenient web-based
environment for running Jupyter notebooks without needing to set up everything locally.
For detailed instructions on how to use max-jhub, please refer to the following documentation:
- [Max-jhub DESY Documentation](https://confluence.desy.de/display/MXW/JupyterHub+on+Maxwell)
- [Max-jhub EuXFEL User Documentation](https://rtd.xfel.eu/docs/data-analysis-user-documentation/en/latest/jhub/#via-max-jhub-recommended)
To use max-jhub effectively with pycalibration, make sure you've created an ipython kernel as
described in the [Creating an ipython kernel for virtual environments](#creating-an-ipython-kernel-for-virtual-environments) section below.
### Creating an ipython kernel for virtual environments
To create an ipython kernel with pycalibration available, activate the virtual
environment first (if using a venv) and then run:
```bash
python3 -m pip install ipykernel # If not using a venv add `--user` flag
python3 -m ipykernel install --user --name pycalibration --display-name "pycalibration" # If not using a venv pick different name
```
## Offline Calibration Configuration
The offline calibration package is configured with three configuration files:
Note that the order of priority is:
- user configuration - e.g. `~/.config/pycalibration/webservice/webservice.yaml`
- environment variables - e.g. `export CAL_WEBSERVICE_*=...`
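As a rough illustration of the environment-variable layer (the specific variable
name below is hypothetical; only the `CAL_WEBSERVICE_` prefix comes from the
example above, and the actual key naming follows dynaconf's rules):

```bash
# Hypothetical override of a single setting via the environment;
# confirm the resolved value with the `dynaconf ... list` command shown below
export CAL_WEBSERVICE_PORT=5556
```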
### Examples
For example, `webservice/config/webservice.yaml` has a `metadata-client:` section
whose secret values are placeholder notes (e.g. "add this to secrets file") to be
overridden by the user configuration or environment variables.
Alternatively, this file could be placed at `~/.config/pycalibration/webservice/webservice.yaml`
### Checking Configurations
Having multiple nested configurations can get a bit confusing, so `dynaconf`
includes a command to help view what a configuration will be resolved to. Once
you have activated the python environment pycalibration is installed in, you
can run the command `dynaconf -i webservice.config.webservice list` to list the
current configuration values:
```bash
> dynaconf -i webservice.config.webservice list
Working in main environment
WEBSERVICE_DIR<PosixPath> PosixPath('/home/roscar/work/git.xfel.eu/detectors/pycalibration/webservice')
CONFIG-REPO<dict> {
    'local-path': '/home/roscar/calibration_config',
    'url': 'https://haufs:AAABBBCCCDDDEEEFFF@git.xfel.eu/gitlab/detectors/calibration_configurations.git'
}
WEB-SERVICE<dict> {
    'allowed-ips': '131.169.4.197, 131.169.212.226',
    'bind-to': 'tcp://*',
    'job-db': '/home/roscar/work/git.xfel.eu/detectors/pycalibration/webservice/webservice_jobs.sqlite',
    'job-timeout': 3600,
    'job-update-interval': 60,
    'port': 5556
}
METADATA-CLIENT<dict> {
    'auth-url': 'https://in.xfel.eu/test_metadata/oauth/authorize',
    'base-api-url': 'https://in.xfel.eu/metadata/api/',
    'metadata-web-app-url': 'https://in.xfel.eu/test_metadata',
    'refresh-url': 'https://in.xfel.eu/test_metadata/oauth/token',
    'scope': '',
    'token-url': 'https://in.xfel.eu/test_metadata/oauth/token',
    'user-email': 'calibration@example.com',
    'user-id': 'AAABBBCCCDDDEEEFFF',
    'user-secret': 'AAABBBCCCDDDEEEFFF'
}
KAFKA<dict> {
    'brokers': [
        'it-kafka-broker01.desy.de',
        'it-kafka-broker02.desy.de',
        'it-kafka-broker03.desy.de'
    ],
    'topic': 'xfel-test-offline-cal'
}
CORRECT<dict> {
    'cmd': 'python -m xfel_calibrate.calibrate {detector} CORRECT '
           '--slurm-scheduling {sched_prio} --slurm-mem 750 --request-time '
           '{request_time} --slurm-name '
           '{action}_{instrument}_{detector}_{cycle}_p{proposal}_{runs} '
           '--report-to /gpfs/exfel/exp/{instrument}/{cycle}/p{proposal}/usr/Reports/{runs}/{det_instance}_{action}_{proposal}_{runs}_{time_stamp} '
           '--cal-db-timeout 300000 --cal-db-interface '
           'tcp://max-exfl-cal001:8015#8044',
    'in-folder': '/gpfs/exfel/exp/{instrument}/{cycle}/p{proposal}/raw',
    'out-folder': '/gpfs/exfel/d/proc/{instrument}/{cycle}/p{proposal}/{run}',
    'sched-prio': 80
}
DARK<dict> {
    'cmd': 'python -m xfel_calibrate.calibrate {detector} DARK --concurrency-par '
           'karabo_da --slurm-scheduling {sched_prio} --request-time '
           '{request_time} --slurm-name '
           '{action}_{instrument}_{detector}_{cycle}_p{proposal}_{runs} '
           '--report-to /gpfs/exfel/d/cal/caldb_store/xfel/reports/{instrument}/{det_instance}/{action}/{action}_{proposal}_{runs}_{time_stamp} '
           '--cal-db-interface tcp://max-exfl-cal001:8015#8044 --db-output',
    'in-folder': '/gpfs/exfel/exp/{instrument}/{cycle}/p{proposal}/raw',
    'out-folder': '/gpfs/exfel/u/usr/{instrument}/{cycle}/p{proposal}/dark/runs_{runs}',
    'sched-prio': 10
}
```
Here you can see that `metadata-client: user-id:` now contains the ID instead of
the note "add this to secrets file", so the substitution has worked correctly.
## Contributing
### Guidelines
Development guidelines can be found on the GitLab Wiki page here: <https://git.xfel.eu/gitlab/detectors/pycalibration/wikis/GitLab-Guidelines>
### Basics
If you are installing the package for development purposes then you should
install the optional dependencies as well. Follow the instructions as above, but
instead of `pip install .` use `pip install ".[test,dev]"` to install both sets
of extras.
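For example, an editable development install with the extras enabled might look
like this (assuming you are inside the cloned repository with your virtual
environment activated):

```bash
# Editable install of pycalibration plus the test and dev extras
python3 -m pip install -e ".[test,dev]"
```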
The installation instructions above assume that you have set up SSH keys for use
with GitLab to allow for passwordless clones; this way it's possible to run
`pip install git+ssh...` commands and install packages directly from GitLab.
To set this up, check the settings page here: <https://git.xfel.eu/gitlab/profile/keys>
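For instance, with working SSH keys a package can be installed straight from its
GitLab SSH remote (a sketch; substitute the repository you actually need - the
URL below is the pycalibration remote used in the clone steps above):

```bash
# Install directly from the EuXFEL GitLab over SSH (no password prompt needed)
python3 -m pip install "git+ssh://git@git.xfel.eu:10022/detectors/pycalibration.git"
```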
### Pre-Commit Hooks
This repository uses pre-commit hooks to automatically run some code quality and
standard checks, including the following:
a. `identity` - The 'identity' meta hook prints off a list of files that the hooks will execute on
b. 'Standard' file checks
1. `check-added-large-files` - Ensures no large files are committed to repo
2. `check-ast` - Checks that the python AST is parseable
3. `check-json` - Checks json file formatting is parseable
4. `check-yaml` - Checks yaml file formatting is parseable
5. `check-toml` - Checks toml file formatting is parseable
6. `rstcheck` - Checks rst file formatting is parseable
7. `end-of-file-fixer` - Fixes EoF to be consistent
8. `trailing-whitespace` - Removes trailing whitespaces from lines
9. `check-merge-conflict` - Checks no merge conflicts remain in the commit
10. `mixed-line-ending` - Fixes mixed line endings
c. Code checks
1. `flake8` - Code style checks
2. `isort` - Sorts imports in python files
3. `check-docstring-first` - Ensures docstrings are in the correct place
d. Notebook checks
1. `nbqa-flake8` - Runs flake8 on notebook cells
2. `nbqa-isort` - Runs isort on notebook cells
3. `nbstripoutput` - Strips output from ipynb files
To install these checks, set up your environment as mentioned above and then run
the command:
```bash
pre-commit install-hooks
```
This will set up the hooks in git locally, so that each time you run
`git commit` the hooks are executed on the **staged files only**. Beware that if
the pre-commit hooks find required changes, some of them will **modify your
files**; however, they only modify the current working files, not the ones you
have already staged. This means that you can look at the diff between your
staged files and the modified ones to see what changes are suggested.
#### Run Checks Only On Diffs
Typically `pre-commit` is run with `--all-files` within a CI, however as this
is being set up on an existing codebase these checks would always fail with a
substantial number of issues. Using some creative workarounds, the CI has been
set up to only run on files which have changed between a PR and the target
branch.
If you want to run the pre-commit checks as they would run on the CI, then you
can use the `bin/pre-commit-diff.sh` script to execute the checks as on the CI
pipeline.
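For example, from the repository root (assuming the script is executable):

```bash
# Run the pre-commit checks only on files that differ from the target branch,
# mirroring what the CI pipeline does
bin/pre-commit-diff.sh
```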
A side effect of this is that the checks will run on **all** of the differences
between your branch and the target branch, which can include changes that are
not yours if your branch has fallen behind. If this happens and the hooks in the
CI (or via the script) run on the wrong files, then you should **rebase onto the
target branch** to prevent the checks from running on the wrong files/diffs.
#### Skipping Checks
If the checks are failing and you want to ignore them on purpose then you have two options:
- use the `--no-verify` flag on your `git commit` command to skip them, e.g. `git commit -m "Commit skipping hooks" --no-verify`
- use the variable `SKIP=hooks,to,skip` before the git commit command to list hooks to skip, e.g. `SKIP=flake8,isort git commit -m "Commit skipping only flake8 and isort hooks"`
In the CI pipeline the pre-commit check stage has `allow_failure: true` set so
that it is possible to ignore errors in the checks, and so that subsequent
stages will still run even if the checks have failed. However there should be a
good reason for allowing the checks to fail, e.g. checks failing due to
unmodified sections of code being looked at.
## Python Scripted Calibration
To launch correction or characterisation jobs, run something like this:
```bash
xfel-calibrate AGIPD CORRECT \
--in-folder /gpfs/exfel/exp/SPB/202131/p900215/raw --run 591 \
--out-folder /gpfs/exfel/data/scratch/kluyvert/agipd-calib-900215-591 \
--karabo-id SPB_DET_AGIPD1M-1 --karabo-id-control SPB_IRU_AGIPD1M1 \
--karabo-da-control AGIPD1MCTRL00 --modules 0-4
```
The first two arguments refer to a *detector* and an *action*, and are used to
find the appropriate notebook to run. Most of the optional arguments are
translated into parameter assignments in the notebook, e.g. `--modules 0-4`
sets `modules = [0, 1, 2, 3]` in the notebook.
This normally submits jobs to Slurm to do the work; you can check their status
with `squeue --me`. If you are working on a dedicated node, you can use the
`--no-cluster-job` option to run all the work on that node instead.
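For example (a sketch reusing the correction command from above; `--no-cluster-job`
only makes sense on a dedicated node):

```bash
# Check the status of your submitted Slurm jobs
squeue --me

# On a dedicated node, run the same correction locally instead of via Slurm
xfel-calibrate AGIPD CORRECT --no-cluster-job \
    --in-folder /gpfs/exfel/exp/SPB/202131/p900215/raw --run 591 \
    --out-folder /gpfs/exfel/data/scratch/kluyvert/agipd-calib-900215-591 \
    --karabo-id SPB_DET_AGIPD1M-1 --karabo-id-control SPB_IRU_AGIPD1M1 \
    --karabo-da-control AGIPD1MCTRL00 --modules 0-4
```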
The notebooks will be used to create a PDF report after the jobs have run.
This will be placed in `--out-folder` by default, though it can be overridden
with the `--report-to` option.
### Reproducing calibration
The information to run the calibration code again is saved to a directory next to
the PDF report, named starting with `slurm_out_`. It can be run as a new job
like this:
```bash
python3 -m xfel_calibrate.repeat \
    /gpfs/exfel/data/scratch/kluyvert/agipd-calib-900215-591/slurm_out_AGIPDOfflineCorrection \
    --out-folder /gpfs/exfel/data/scratch/kluyvert/agipd-calib-900215-591-repro
```
The information in the directory includes a Pip `requirements.txt` file
listing the packages installed when this task was first set up. For better
reproducibility, use this to create a similar environment, and pass
`--python path/to/bin/python` to run notebooks in that environment.
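A possible way to do this is sketched below (the environment name and the
`slurm_out_` path are placeholders; the `requirements.txt` file lives inside the
`slurm_out_` directory as described above):

```bash
# Recreate a similar environment from the saved requirements file
python3 -m venv repro-venv
repro-venv/bin/pip install -r /path/to/slurm_out_AGIPDOfflineCorrection/requirements.txt

# Re-run the calibration with the recreated environment's interpreter
python3 -m xfel_calibrate.repeat /path/to/slurm_out_AGIPDOfflineCorrection \
    --python repro-venv/bin/python \
    --out-folder /path/to/output-repro
```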
Future work will automate this step.
> **Note**
> Our aim here is to run the same code as before, with the same parameters, in
> a similar software environment. This should produce essentially the same
> results, but not necessarily exactly identical. The code which runs may use
> external resources, or involve some randomness, and even different hardware
> may make small differences.
## Appendix
Important information that doesn't really fit in as part of the readme.
TODO: Place this into the docs? Also, improve docs (out of scope for PR !437)
### SSH Key Setup for GitLab
It is highly recommended to set up SSH keys for access to GitLab as this
simplifies the setup process for all of our internal software present on GitLab.
To set up the keys:
1. Connect to Maxwell
2. Generate a new keypair with `ssh-keygen -o -a 100 -t ed25519`. You can
either leave this in the default location (`~/.ssh/id_ed25519`) or place it
into a separate directory to make management of keys easier if you already
have multiple ones. If you are using a password for your keys please check
this page to learn how to manage them: <https://docs.github.com/en/github/authenticating-to-github/generating-a-new-ssh-key-and-adding-it-to-the-ssh-agent#adding-your-ssh-key-to-the-ssh-agent>
3. Add the public key (`id_ed25519.pub`) to your account on GitLab: <https://git.xfel.eu/gitlab/profile/keys>
4. Add the following to your `~/.ssh/config` file
```bash
# Special flags for gitlab over SSH
Host git.xfel.eu
User git
Port 10022
ForwardX11 no
IdentityFile ~/.ssh/id_ed25519
```
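To verify that the key and config work, something like the following can be used
(a sketch; the exact greeting depends on the GitLab server):

```bash
# With the config block above in place, GitLab should greet you by username
# instead of prompting for a password
ssh -T git.xfel.eu
```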
Once this is done you can clone repositories you have access to from GitLab
without having to enter your password each time. As `pycalibration`
requirements are installed from SSH remote URLs, having SSH keys set up is a
requirement for installing pycalibration.
### GitLab Access for `xcaltst` and `xcal`
To make it easier to work with and deploy software via `xcaltst`/`xcal`, we
have created an xcal account for gitlab with the following details:
- Full Name: ReadOnly Gitlab Calibration External
- User ID: 423
- Username: `xcalgitlab`
- Password: ask Robert Rosca
This account is intended to be used as a read only account which can be given
access to certain repos to make it easier to clone them when using our
functional accounts on Maxwell.
The `xcaltst` account has an ed25519 keypair under `~/.ssh/gitlab/`, and the
public key has been added to `xcalgitlab`'s approved SSH keys.

Additionally, this block has been added to `~/.ssh/config`:
```bash
# Special flags for gitlab over SSH
Host git.xfel.eu
User git
Port 10022
ForwardX11 no
IdentityFile ~/.ssh/gitlab/id_ed25519
```
Now any repository that `xcalgitlab` has read access to, e.g. if it is added as
a reporter, can be cloned on Maxwell without having to enter a password.
For example, `xcalgitlab` is a reporter on the pycalibration repository
(<https://git.xfel.eu/gitlab/detectors/pycalibration>), so now `xcalgitlab` can
do passwordless clones with SSH:
```bash
[xcaltst@max-exfl-cal002 tmp]$ git clone ssh://git@git.xfel.eu:10022/detectors/pycalibration.git
Cloning into 'pycalibration'...
remote: Enumerating objects: 9414, done.
remote: Counting objects: 100% (9414/9414), done.
remote: Compressing objects: 100% (2858/2858), done.
remote: Total 9414 (delta 6510), reused 9408 (delta 6504)
Receiving objects: 100% (9414/9414), 611.81 MiB | 54.87 MiB/s, done.
Resolving deltas: 100% (6510/6510), done.
```