{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Characterize Dark Images #\n",
    "\n",
    "Author: S. Hauf, Version: 0.1\n",
    "\n",
    "The following code analyzes a set of dark images taken with the AGIPD detector to deduce detector offsets , noise, bad-pixel maps and thresholding. All four types of constants are evaluated per-pixel and per-memory cell. Data for the detector's three gain stages needs to be present, separated into separate runs.\n",
    "\n",
    "The evaluated calibration constants are stored locally and injected in the calibration data base.\n",
    "\n",
    "**The offset** ($O$) is defined as the median ($M$) of the dark signal ($Ds$) over trains ($t$) for a given pixel ($x,y$) and memory cell ($c$). \n",
    "\n",
    "**The noise** $N$ is the standard deviation $\\sigma$ of the dark signal.\n",
    "\n",
    "$$ O_{x,y,c} = M(Ds)_{t} ,\\,\\,\\,\\,\\,\\, N_{x,y,c} = \\sigma(Ds)_{t}$$\n",
    "\n",
    "**The bad pixel** mask is encoded as a bit mask.\n",
    "\n",
    "**\"OFFSET_OUT_OF_THRESHOLD\":**\n",
    "\n",
    "Offset outside of bounds:\n",
    "\n",
    "$$M(O)_{x,y} - \\sigma(O)_{x,y} * \\mathrm{thresholds\\_offset\\_sigma} < O < M(O)_{x,y} + \\sigma(O)_{x,y} * \\mathrm{thresholds\\_offset\\_sigma} $$\n",
    "\n",
    "or offset outside of hard limits\n",
    "\n",
    "$$ \\mathrm{thresholds\\_offset\\_hard}_\\mathrm{low} < O < \\mathrm{thresholds\\_offset\\_hard}_\\mathrm{high} $$\n",
    "\n",
    "**\"NOISE_OUT_OF_THRESHOLD\":**\n",
    "\n",
    "Noise outside of bounds:\n",
    "\n",
    "$$M(N)_{x,y} - \\sigma(N)_{x,y} * \\mathrm{thresholds\\_noise\\_sigma} < N < M(N)_{x,y} + \\sigma(N)_{x,y} * \\mathrm{thresholds\\_noise\\_sigma} $$\n",
    "\n",
    "or noise outside of hard limits\n",
    "\n",
    "$$\\mathrm{thresholds\\_noise\\_hard}_\\mathrm{low} < N < \\mathrm{thresholds\\_noise\\_hard}_\\mathrm{high} $$\n",
    "\n",
    "**\"OFFSET_NOISE_EVAL_ERROR\":**\n",
    "\n",
    "Offset and Noise both not $nan$ values \n",
    "\n",
    "Values: $\\mathrm{thresholds\\_offset\\_sigma}$, $\\mathrm{thresholds\\_offset\\_hard}$, $\\mathrm{thresholds\\_noise\\_sigma}$, $\\mathrm{thresholds\\_noise\\_hard}$ are given as parameters."
   ]
  },
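  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a minimal, self-contained sketch of the definitions above (the array name `ds`, its shape and the numeric bit values are illustrative only and are not used elsewhere in this notebook; the actual evaluation below uses the `BadPixels` enum for the bit values and applies the sigma cuts per memory cell):\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "np.random.seed(0)\n",
    "ds = np.random.normal(6000., 10., size=(8, 8, 4, 100))  # toy dark data: (x, y, cell, train)\n",
    "\n",
    "offset = np.median(ds, axis=3)  # O_{x,y,c}: median over trains\n",
    "noise = np.std(ds, axis=3)      # N_{x,y,c}: standard deviation over trains\n",
    "\n",
    "bad = np.zeros(offset.shape, np.uint32)\n",
    "off_mn, off_sd = np.nanmedian(offset), np.nanstd(offset)\n",
    "bad[np.abs(offset - off_mn) > 3. * off_sd] |= 1       # OFFSET_OUT_OF_THRESHOLD (sigma cut)\n",
    "bad[(offset < 4000) | (offset > 8500)] |= 1           # OFFSET_OUT_OF_THRESHOLD (hard limits)\n",
    "noi_mn, noi_sd = np.nanmedian(noise), np.nanstd(noise)\n",
    "bad[np.abs(noise - noi_mn) > 5. * noi_sd] |= 2        # NOISE_OUT_OF_THRESHOLD (sigma cut)\n",
    "bad[(noise < 4) | (noise > 20)] |= 2                  # NOISE_OUT_OF_THRESHOLD (hard limits)\n",
    "bad[~np.isfinite(offset) | ~np.isfinite(noise)] |= 4  # OFFSET_NOISE_EVAL_ERROR\n",
    "```"
   ]
  },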
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2019-02-20T12:42:51.255184Z",
     "start_time": "2019-02-20T12:42:51.225500Z"
    }
   },
   "outputs": [],
   "source": [
    "cluster_profile = \"noDB\" # The ipcluster profile to use\n",
    "in_folder = \"/gpfs/exfel/d/raw/SPB/202030/p900138/\" # path to input data, required\n",
    "out_folder = \"/gpfs/exfel/data/scratch/ahmedk/test/AGIPD3\" # path to output to, required\n",
    "sequences = [0] # sequence files to evaluate.\n",
    "\n",
    "run_high = 199 # run number in which high gain data was recorded, required\n",
    "run_med = 200 # run number in which medium gain data was recorded, required\n",
    "run_low = 201 # run number in which low gain data was recorded, required\n",
    "\n",
    "mem_cells = 0 # number of memory cells used, set to 0 to automatically infer\n",
    "local_output = True # output constants locally\n",
    "db_output = False # output constants to database\n",
    "bias_voltage = 300 # detector bias voltage\n",
    "cal_db_interface = \"tcp://max-exfl016:8020\" # the database interface to use\n",
    "cal_db_timeout = 3000000 # timeout on caldb requests\"\n",
    "interlaced = False # assume interlaced data format, for data prior to Dec. 2017\n",
    "rawversion = 2 # RAW file format version\n",
    "dont_use_dir_date = False # don't use the dir creation date for determining the creation time\n",
    "\n",
    "thresholds_offset_sigma = 3. # thresholds in terms of n sigma noise for offset deduced bad pixels\n",
    "thresholds_offset_hard = [4000, 8500] # thresholds in absolute ADU terms for offset deduced bad pixels\n",
    "\n",
    "thresholds_noise_sigma = 5. # thresholds in terms of n sigma noise for offset deduced bad pixels\n",
    "thresholds_noise_hard = [4, 20] # thresholds in absolute ADU terms for offset deduced bad pixels\n",
    "\n",
    "instrument = \"SPB\"\n",
    "high_res_badpix_3d = False # set this to True if you need high-resolution 3d bad pixel plots. Runtime: ~ 1h\n",
    "modules = [0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15]  # module to run for\n",
    "acq_rate = 0. # the detector acquisition rate, use 0 to try to auto-determine\n",
    "gain_setting = 0.1 # the gain setting, use 0.1 to try to auto-determine\n",
    "\n",
    "h5path_ctrl = '/CONTROL/SPB_IRU_AGIPD1M1/MDL/FPGA_COMP_TEST' # path to control information\n",
    "karabo_da_control = 'AGIPD1MCTRL00' # karabo DA for control infromation"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2019-02-20T12:42:52.599660Z",
     "start_time": "2019-02-20T12:42:51.472138Z"
    }
   },
   "outputs": [],
   "source": [
    "# imports and things that do not usually need to be changed\n",
    "from datetime import datetime\n",
    "import dateutil.parser\n",
    "import warnings\n",
    "warnings.filterwarnings('ignore')\n",
    "from collections import OrderedDict\n",
    "import os\n",
    "import h5py\n",
    "import numpy as np\n",
    "import matplotlib\n",
    "import tabulate\n",
    "\n",
    "matplotlib.use('agg')\n",
    "import matplotlib.pyplot as plt\n",
    "from IPython.display import display, Markdown, Latex\n",
    "%matplotlib inline\n",
    "\n",
    "from cal_tools.tools import (gain_map_files, parse_runs, \n",
    "                             run_prop_seq_from_path, get_notebook_name, \n",
    "                             get_dir_creation_date, save_const_to_h5,\n",
    "                             get_random_db_interface, get_from_db)\n",
    "from cal_tools.influx import InfluxLogger\n",
    "from cal_tools.enums import BadPixels\n",
    "from cal_tools.plotting import show_overview, plot_badpix_3d, create_constant_overview\n",
    "from cal_tools.agipdlib import get_gain_setting\n",
    "\n",
    "# make sure a cluster is running with ipcluster start --n=32, give it a while to start\n",
    "from ipyparallel import Client\n",
    "\n",
    "view = Client(profile=cluster_profile)[:]\n",
    "view.use_dill()\n",
    "\n",
    "from iCalibrationDB import ConstantMetaData, Constants, Conditions, Detectors, Versions\n",
    "\n",
    "gains = np.arange(3)\n",
    "\n",
    "# no need to change this\n",
    "\n",
    "QUADRANTS = 4\n",
    "MODULES_PER_QUAD = 4\n",
    "DET_FILE_INSET = \"AGIPD\"\n",
    "\n",
    "IL_MODE = interlaced\n",
    "max_cells = mem_cells\n",
    "   \n",
    "offset_runs = OrderedDict()\n",
    "offset_runs[\"high\"] = parse_runs(run_high)[0]\n",
    "offset_runs[\"med\"] = parse_runs(run_med)[0]\n",
    "offset_runs[\"low\"] = parse_runs(run_low)[0]\n",
    "\n",
    "creation_time=None\n",
    "if not dont_use_dir_date:\n",
    "    creation_time = get_dir_creation_date(in_folder, run_high)\n",
    "\n",
    "\n",
    "run, prop, seq = run_prop_seq_from_path(in_folder)\n",
    "logger = InfluxLogger(detector=\"AGIPD\", instrument=instrument, mem_cells=mem_cells,\n",
    "                      notebook=get_notebook_name(), proposal=prop)\n",
    "\n",
    "print(\"Using {} as creation time of constant.\".format(creation_time))\n",
    "\n",
    "cal_db_interface = get_random_db_interface(cal_db_interface)\n",
    "print('Calibration database interface: {}'.format(cal_db_interface))\n",
    "\n",
    "loc = None\n",
    "if instrument == \"SPB\":\n",
    "    loc = \"SPB_DET_AGIPD1M-1\"\n",
    "    dinstance = \"AGIPD1M1\"\n",
    "else:\n",
    "    loc = \"MID_DET_AGIPD1M-1\"\n",
    "    dinstance = \"AGIPD1M2\"\n",
    "print(\"Detector in use is {}\".format(loc))    "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "control_fname = '{}/r{:04d}/RAW-R{:04d}-{}-S00000.h5'.format(in_folder, run_high, \n",
    "                                                             run_high, karabo_da_control)\n",
    "\n",
    "if gain_setting == 0.1:\n",
    "    if creation_time.replace(tzinfo=None) < dateutil.parser.parse('2020-01-31'):\n",
    "        print(\"Set gain-setting to None for runs taken before 2020-01-31\")\n",
    "        gain_setting = None\n",
    "    else:\n",
    "        try:\n",
    "            gain_setting = get_gain_setting(control_fname, h5path_ctrl)\n",
    "        except Exception as e:\n",
    "            print(f'Error while reading gain setting from: \\n{control_fname}')\n",
    "            print(e)\n",
    "            print(\"Gain setting is not found in the control information\")\n",
    "            print(\"Data will not be processed\")\n",
    "            sequences = []"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2019-02-20T12:42:52.608214Z",
     "start_time": "2019-02-20T12:42:52.601257Z"
    }
   },
   "outputs": [],
   "source": [
    "print(\"Parameters are:\")\n",
    "print(\"Proposal: {}\".format(prop))\n",
    "print(\"Memory cells: {}/{}\".format(mem_cells, max_cells))\n",
    "print(\"Runs: {}\".format([ v for v in offset_runs.values()]))\n",
    "print(\"Sequences: {}\".format(sequences))\n",
    "print(\"Interlaced mode: {}\".format(IL_MODE))\n",
    "print(\"Using DB: {}\".format(db_output))\n",
    "print(\"Input: {}\".format(in_folder))\n",
    "print(\"Output: {}\".format(out_folder))\n",
    "print(\"Bias voltage: {}V\".format(bias_voltage))\n",
    "print(\"Gain setting: {}\".format(gain_setting))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The following lines will create a queue of files which will the be executed module-parallel. Distiguishing between different gains."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2019-02-20T12:42:54.024731Z",
     "start_time": "2019-02-20T12:42:53.901555Z"
    }
   },
   "outputs": [],
   "source": [
    "# set everything up filewise\n",
    "if not os.path.exists(out_folder):\n",
    "    os.makedirs(out_folder)\n",
    "\n",
    "gmf = gain_map_files(in_folder, offset_runs, sequences, DET_FILE_INSET, QUADRANTS, MODULES_PER_QUAD)\n",
    "gain_mapped_files, total_sequences, total_file_size = gmf\n",
    "\n",
    "print(\"Will process at total of {} sequences: {:0.2f} GB of data.\".format(total_sequences, total_file_size))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Calculate Offsets, Noise and Thresholds ##\n",
    "\n",
    "The calculation is performed per-pixel and per-memory-cell. Offsets are simply the median value for a set of dark data taken at a given gain, noise the standard deviation, and gain-bit values the medians of the gain array."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2019-02-20T10:50:55.839958Z",
     "start_time": "2019-02-20T10:50:55.468134Z"
    }
   },
   "outputs": [],
   "source": [
    "import copy\n",
    "from functools import partial\n",
    "def characterize_module(il_mode, cells, bp_thresh, rawversion, loc, acq_rate, inp):\n",
    "    import numpy as np\n",
    "    import copy\n",
    "    import h5py\n",
    "    from cal_tools.enums import BadPixels\n",
    "    from cal_tools.agipdlib import get_num_cells, get_acq_rate\n",
    "    \n",
    "    filename, filename_out, channel = inp\n",
    "    \n",
    "\n",
    "    if cells == 0:\n",
    "        cells = get_num_cells(filename, loc, channel)\n",
    "\n",
    "    print(\"Using {} memory cells\".format(cells))\n",
    "    \n",
    "    if acq_rate == 0.:\n",
    "        acq_rate = get_acq_rate(filename, loc, channel)\n",
    "    \n",
    "    thresholds_offset_hard, thresholds_offset_sigma, thresholds_noise_hard, thresholds_noise_sigma = bp_thresh \n",
    "\n",
    "    infile = h5py.File(filename, \"r\", driver=\"core\")\n",
    "    if rawversion == 2:\n",
    "        count = np.squeeze(infile[\"/INDEX/{}/DET/{}CH0:xtdf/image/count\".format(loc, channel)])\n",
    "        first = np.squeeze(infile[\"/INDEX/{}/DET/{}CH0:xtdf/image/first\".format(loc, channel)])\n",
    "        last_index = int(first[count != 0][-1]+count[count != 0][-1])\n",
    "        first_index = int(first[count != 0][0])\n",
    "    else:\n",
    "        status = np.squeeze(infile[\"/INDEX/{}/DET/{}CH0:xtdf/image/status\".format(loc, channel)])\n",
    "        if np.count_nonzero(status != 0) == 0:\n",
    "            return\n",
    "        last = np.squeeze(infile[\"/INDEX/{}/DET/{}CH0:xtdf/image/last\".format(loc, channel)])\n",
    "        first = np.squeeze(infile[\"/INDEX/{}/DET/{}CH0:xtdf/image/first\".format(loc, channel)])\n",
    "        last_index = int(last[status != 0][-1]) + 1\n",
    "        first_index = int(first[status != 0][0])\n",
    "    im = np.array(infile[\"/INSTRUMENT/{}/DET/{}CH0:xtdf/image/data\".format(loc, channel)][first_index:last_index,...])    \n",
    "    cellIds = np.squeeze(infile[\"/INSTRUMENT/{}/DET/{}CH0:xtdf/image/cellId\".format(loc, channel)][first_index:last_index,...]) \n",
    "    \n",
    "    infile.close()\n",
    "\n",
    "    if il_mode:\n",
    "        ga = im[1::2, 0, ...]\n",
    "        im = im[0::2, 0, ...].astype(np.float32)\n",
    "        cellIds = cellIds[::2]\n",
    "    else:\n",
    "        ga = im[:, 1, ...]\n",
    "        im = im[:, 0, ...].astype(np.float32)\n",
    "        \n",
    "    im = np.rollaxis(im, 2)\n",
    "    im = np.rollaxis(im, 2, 1)\n",
    "\n",
    "    ga = np.rollaxis(ga, 2)\n",
    "    ga = np.rollaxis(ga, 2, 1)\n",
    "\n",
    "    mcells = cells #max(cells, np.max(cellIds)+1)\n",
    "    offset = np.zeros((im.shape[0], im.shape[1], mcells))\n",
    "    gains = np.zeros((im.shape[0], im.shape[1], mcells))\n",
    "    noise = np.zeros((im.shape[0], im.shape[1], mcells))\n",
    "    \n",
    "    for cc in np.unique(cellIds[cellIds < mcells]):\n",
    "        cellidx = cellIds == cc\n",
    "        offset[...,cc] = np.median(im[..., cellidx], axis=2)\n",
    "        noise[...,cc] = np.std(im[..., cellidx], axis=2)\n",
    "        gains[...,cc] = np.median(ga[..., cellidx], axis=2)\n",
    "        \n",
    "    # bad pixels\n",
    "    bp = np.zeros(offset.shape, np.uint32)\n",
    "    # offset related bad pixels\n",
    "    offset_mn = np.nanmedian(offset, axis=(0,1))\n",
    "    offset_std = np.nanstd(offset, axis=(0,1))    \n",
    "    \n",
    "    bp[(offset < offset_mn-thresholds_offset_sigma*offset_std) |\n",
    "       (offset > offset_mn+thresholds_offset_sigma*offset_std)] |= BadPixels.OFFSET_OUT_OF_THRESHOLD.value\n",
    "    bp[(offset < thresholds_offset_hard[0]) | (offset > thresholds_offset_hard[1])] |= BadPixels.OFFSET_OUT_OF_THRESHOLD.value\n",
    "    bp[~np.isfinite(offset)] |= BadPixels.OFFSET_NOISE_EVAL_ERROR.value\n",
    "    \n",
    "    # noise related bad pixels\n",
    "    noise_mn = np.nanmedian(noise, axis=(0,1))\n",
    "    noise_std = np.nanstd(noise, axis=(0,1))    \n",
    "    \n",
    "    bp[(noise < noise_mn-thresholds_noise_sigma*noise_std) |\n",
    "       (noise > noise_mn+thresholds_noise_sigma*noise_std)] |= BadPixels.NOISE_OUT_OF_THRESHOLD.value\n",
    "    bp[(noise < thresholds_noise_hard[0]) | (noise > thresholds_noise_hard[1])] |= BadPixels.NOISE_OUT_OF_THRESHOLD.value\n",
    "    bp[~np.isfinite(noise)] |= BadPixels.OFFSET_NOISE_EVAL_ERROR.value\n",
    "\n",
    "\n",
    "    return offset, noise, gains, bp, cells, acq_rate\n",
    "        \n",
    "        \n",
    "offset_g = OrderedDict()\n",
    "noise_g = OrderedDict()\n",
    "gain_g = OrderedDict()\n",
    "badpix_g = OrderedDict()\n",
    "gg = 0\n",
    "\n",
    "start = datetime.now()\n",
    "all_cells = []\n",
    "all_acq_rate = []\n",
    "for gain, mapped_files in gain_mapped_files.items():\n",
    "    \n",
    "    inp = []\n",
    "    dones = []\n",
    "    for i in modules:\n",
    "        qm = \"Q{}M{}\".format(i//4 +1, i % 4 + 1)    \n",
    "        if qm in mapped_files and not mapped_files[qm].empty():\n",
    "            fname_in = mapped_files[qm].get()            \n",
    "            dones.append(mapped_files[qm].empty())\n",
    "        else:\n",
    "            continue\n",
    "        fout = os.path.abspath(\"{}/{}\".format(out_folder, (os.path.split(fname_in)[-1]).replace(\"RAW\", \"CORR\")))\n",
    "        inp.append((fname_in, fout, i))\n",
    "    first = False\n",
    "    p = partial(characterize_module, IL_MODE, max_cells,\n",
    "               (thresholds_offset_hard, thresholds_offset_sigma,\n",
    "                thresholds_noise_hard, thresholds_noise_sigma), rawversion, loc, acq_rate)\n",
    "    results = list(map(p, inp))\n",
    "    #results = view.map_sync(p, inp)\n",
    "    for ii, r in enumerate(results):\n",
    "        i = modules[ii]\n",
    "        offset, noise, gain, bp, thiscell, thisacq = r\n",
    "        all_cells.append(thiscell)\n",
    "        all_acq_rate.append(thisacq)\n",
    "        qm = \"Q{}M{}\".format(i//4 +1, i % 4 + 1)\n",
    "        if qm not in offset_g:\n",
    "            offset_g[qm] = np.zeros((offset.shape[0], offset.shape[1], offset.shape[2], 3))\n",
    "            noise_g[qm] = np.zeros_like(offset_g[qm])\n",
    "            gain_g[qm] = np.zeros_like(offset_g[qm])\n",
    "            badpix_g[qm] = np.zeros_like(offset_g[qm], np.uint32)\n",
    "        \n",
    "        offset_g[qm][...,gg] = offset\n",
    "        noise_g[qm][...,gg] = noise\n",
    "        gain_g[qm][...,gg] = gain\n",
    "        badpix_g[qm][...,gg] = bp\n",
    "    gg +=1\n",
    "\n",
    "duration = (datetime.now()-start).total_seconds()\n",
    "logger.runtime_summary_entry(success=True, runtime=duration,\n",
    "                                   total_sequences=total_sequences,\n",
    "                                   filesize=total_file_size)\n",
    "logger.send()\n",
    "max_cells = np.max(all_cells)\n",
    "print(\"Using {} memory cells\".format(max_cells))\n",
    "acq_rate = np.max(all_acq_rate)\n",
    "print(\"Using {} MHz acquisition rate\".format(acq_rate))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The thresholds for gain switching are then defined as the mean value between in individual gain bit levels. Note that these thresholds need to be refined with charge induced thresholds, as the two are not the same."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2018-12-06T09:38:18.220833Z",
     "start_time": "2018-12-06T09:38:17.926616Z"
    }
   },
   "outputs": [],
   "source": [
    "thresholds_g = {}\n",
    "for qm in gain_g.keys():\n",
    "    thresholds_g[qm] = np.zeros((gain_g[qm].shape[0], gain_g[qm].shape[1], gain_g[qm].shape[2], 5))\n",
    "    thresholds_g[qm][...,0] = (gain_g[qm][...,1]+gain_g[qm][...,0])/2\n",
    "    thresholds_g[qm][...,1] = (gain_g[qm][...,2]+gain_g[qm][...,1])/2\n",
    "    for i in range(3):\n",
    "        thresholds_g[qm][...,2+i] = gain_g[qm][...,i]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2018-12-06T09:38:18.234582Z",
     "start_time": "2018-12-06T09:38:18.222838Z"
    }
   },
   "outputs": [],
   "source": [
    "res = OrderedDict()\n",
    "for i in modules:\n",
    "    qm = \"Q{}M{}\".format(i//4+1, i%4+1)\n",
    "    res[qm] = {'Offset': offset_g[qm],\n",
    "               'Noise': noise_g[qm],\n",
    "               'ThresholdsDark': thresholds_g[qm],\n",
    "               'BadPixelsDark': badpix_g[qm]    \n",
    "               }\n",
    "    \n",
    "if local_output:\n",
    "    for qm in offset_g.keys():\n",
    "        ofile = \"{}/agipd_offset_store_{}_{}.h5\".format(out_folder, \"_\".join(offset_runs.values()), qm)\n",
    "        store_file = h5py.File(ofile, \"w\")\n",
    "        store_file[\"{}/Offset/0/data\".format(qm)] = offset_g[qm]\n",
    "        store_file[\"{}/Noise/0/data\".format(qm)] = noise_g[qm]\n",
    "        store_file[\"{}/Threshold/0/data\".format(qm)] = thresholds_g[qm]\n",
    "        store_file[\"{}/BadPixels/0/data\".format(qm)] = badpix_g[qm]\n",
    "        store_file.close()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Retrieve existing constants for comparison\n",
    "clist = [\"Offset\", \"Noise\", \"ThresholdsDark\",\"BadPixelsDark\"]\n",
    "old_const = {}\n",
    "old_mdata = {}\n",
    "detinst = getattr(Detectors, dinstance)\n",
    "\n",
    "print('Retrieve pre-existing constants for comparison.')\n",
    "\n",
    "for qm in res:\n",
    "    for const in res[qm]:\n",
    "        metadata = ConstantMetaData()\n",
    "        dconst = getattr(Constants.AGIPD, const)()\n",
    "        dconst.data = res[qm][const]\n",
    "        metadata.calibration_constant = dconst\n",
    "\n",
    "        # Setting conditions\n",
    "        condition = Conditions.Dark.AGIPD(memory_cells=max_cells,\n",
    "                                          bias_voltage=bias_voltage,\n",
    "                                          acquisition_rate=acq_rate,\n",
    "                                          gain_setting=gain_setting)\n",
    "        device = getattr(detinst, qm)\n",
    "        data, mdata = get_from_db(device,\n",
    "                                  getattr(Constants.AGIPD, const)(),\n",
    "                                  condition,\n",
    "                                  None,\n",
    "                                  cal_db_interface, creation_time=creation_time,\n",
    "                                  verbosity=2, timeout=cal_db_timeout)\n",
    "\n",
    "        old_const[const] = data\n",
    "        if mdata is not None and data is not None:\n",
    "            time = mdata.calibration_constant_version.begin_at\n",
    "            old_mdata[const] = time.isoformat()\n",
    "            os.makedirs('{}/old/'.format(out_folder), exist_ok=True)\n",
    "            save_const_to_h5(mdata, '{}/old/'.format(out_folder))\n",
    "        else:\n",
    "            old_mdata[const] = \"Not found\""
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "proposal = list(filter(None, in_folder.strip('/').split('/')))[-2]\n",
    "file_loc = 'proposal:{} runs:{} {} {}'.format(proposal, run_low, run_med, run_high)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2018-12-06T09:49:32.449330Z",
     "start_time": "2018-12-06T09:49:20.231607Z"
    }
   },
   "outputs": [],
   "source": [
    "for qm in res:\n",
    "    for const in res[qm]:\n",
    "        metadata = ConstantMetaData()\n",
    "        dconst = getattr(Constants.AGIPD, const)()\n",
    "        dconst.data = res[qm][const]\n",
    "        metadata.calibration_constant = dconst\n",
    "\n",
    "        # set the operating condition\n",
    "        condition = Conditions.Dark.AGIPD(memory_cells=max_cells,\n",
    "                                          bias_voltage=bias_voltage,\n",
    "                                          acquisition_rate=acq_rate,\n",
    "                                          gain_setting=gain_setting)\n",
    "        detinst = getattr(Detectors, dinstance)\n",
    "        device = getattr(detinst, qm)\n",
    "        \n",
    "        metadata.detector_condition = condition\n",
    "\n",
    "        # specify the a version for this constant\n",
    "        if creation_time is None:\n",
    "            metadata.calibration_constant_version = Versions.Now(device=device)\n",
    "        else:\n",
    "            metadata.calibration_constant_version = Versions.Timespan(device=device,\n",
    "                                                                      start=creation_time)\n",
    "\n",
    "        metadata.calibration_constant_version.raw_data_location = file_loc\n",
    "\n",
    "        if db_output:\n",
    "            try:\n",
    "                metadata.send(cal_db_interface, timeout=cal_db_timeout)\n",
    "                msg = 'Const {} for module {} was injected to the calibration DB. Begin at: {}'\n",
    "                print(msg.format(const, qm,\n",
    "                                 metadata.calibration_constant_version.begin_at))\n",
    "            except Exception as e:\n",
    "                print(e)\n",
    "\n",
    "        if local_output:\n",
    "            save_const_to_h5(metadata, out_folder)\n",
    "            print(\"Calibration constant {} is stored locally.\".format(const))\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "scrolled": false
   },
   "outputs": [],
   "source": [
    "qm = \"Q{}M{}\".format(modules[0]//4+1, modules[0] % 4+1)\n",
    "display(Markdown('## Position of the module {} and it\\'s ASICs##'.format(qm)))\n",
    "\n",
    "fig, ax = plt.subplots(1, figsize=(10, 10))\n",
    "ax.set_axis_off()\n",
    "\n",
    "ax.set_xlim(0, 90)\n",
    "ax.set_ylim(0, 75)\n",
    "\n",
    "asic_pos = 5\n",
    "\n",
    "q_st = 8\n",
    "\n",
    "l_y = 6\n",
    "l_x = 5\n",
    "\n",
    "for iq, q_x in enumerate([[43,66],[45,34],[4,32],[2,64]]):\n",
    "    for im in range(4):\n",
    "        x = q_x[0]\n",
    "        for i_as in range(8):\n",
    "            ax.add_patch(matplotlib.patches.Rectangle((x,q_x[1]-q_st*im), l_x, l_y, linewidth=2, edgecolor='c',\n",
    "                                           facecolor='lightblue', fill=True))\n",
    "\n",
    "            if iq*4+im == modules[0]:\n",
    "                ax.add_patch(matplotlib.patches.Rectangle((x,q_x[1]-q_st*im), l_x, l_y/2, linewidth=2,edgecolor='c',\n",
    "                                           facecolor='sandybrown', fill=True))\n",
    "                ax.add_patch(matplotlib.patches.Rectangle((x,(q_x[1]-q_st*im+3)), l_x, l_y/2, linewidth=2,edgecolor='c',\n",
    "                                           facecolor='sandybrown', fill=True))\n",
    "            x += asic_pos\n",
    "            \n",
    "        if iq*4+im == modules[0]:\n",
    "            # Change Text for current processed module.\n",
    "            ax.text(q_x[0]+13, q_x[1]-q_st*im+1.5, 'Q{}M{}'.format(\n",
    "                iq+1, im+1),  fontsize=28, color='mediumblue',weight='bold')\n",
    "        else:\n",
    "            ax.text(q_x[0]+14, q_x[1]-q_st*im+1.5, 'Q{}M{}'.format(\n",
    "                                        iq+1, im+1),  fontsize=25, color='k')\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Single-Cell Overviews ##\n",
    "\n",
    "Single cell overviews allow to identify potential effects on all memory cells, e.g. on sensor level. Additionally, they should serve as a first sanity check on expected behaviour, e.g. if structuring on the ASIC level is visible in the offsets, but otherwise no immediate artifacts are visible."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### High Gain ###"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2018-12-06T09:49:14.540552Z",
     "start_time": "2018-12-06T09:49:13.009674Z"
    },
    "scrolled": false
   },
   "outputs": [],
   "source": [
    "cell = 3\n",
    "gain = 0\n",
    "out_folder = None\n",
    "show_overview(res, cell, gain, out_folder=out_folder, infix=\"_\".join(offset_runs.values()))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Medium Gain ###"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "scrolled": false
   },
   "outputs": [],
   "source": [
    "cell = 3\n",
    "gain = 1\n",
    "show_overview(res, cell, gain, out_folder=out_folder, infix=\"_\".join(offset_runs.values()))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Low Gain ###"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "scrolled": true
   },
   "outputs": [],
   "source": [
    "cell = 3\n",
    "gain = 2\n",
    "show_overview(res, cell, gain, out_folder=out_folder, infix=\"_\".join(offset_runs.values()))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Global Bad Pixel Behaviour ##\n",
    "\n",
    "The following plots show the results of bad pixel evaluation for all evaluated memory cells. Cells are stacked in the Z-dimension, while pixels values in x/y are rebinned with a factor of 2. This excludes single bad pixels present only in disconnected pixels. Hence, any bad pixels spanning at least 4 pixels in the x/y-plane, or across at least two memory cells are indicated. Colors encode the bad pixel type, or mixed type."
   ]
  },
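  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a conceptual illustration only (the actual rebinning criterion is implemented inside `plot_badpix_3d` and may differ), one way to rebin a boolean bad-pixel mask by a factor `r` in x/y such that isolated single bad pixels drop out of the coarse view:\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "def rebin_badpix(mask, r=2):\n",
    "    # keep a coarse x/y voxel only if every pixel of its r x r block is flagged\n",
    "    nx, ny, nc = mask.shape\n",
    "    blocks = mask[:nx - nx % r, :ny - ny % r].reshape(nx // r, r, ny // r, r, nc)\n",
    "    return np.all(blocks, axis=(1, 3))\n",
    "\n",
    "coarse = rebin_badpix(np.zeros((128, 512, 32), dtype=bool), r=2)  # toy mask\n",
    "```"
   ]
  },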
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### High Gain ###"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "scrolled": false
   },
   "outputs": [],
   "source": [
    "cols = {BadPixels.NOISE_OUT_OF_THRESHOLD.value: (BadPixels.NOISE_OUT_OF_THRESHOLD.name, '#FF000080'),\n",
    "        BadPixels.OFFSET_NOISE_EVAL_ERROR.value: (BadPixels.OFFSET_NOISE_EVAL_ERROR.name, '#0000FF80'),\n",
    "        BadPixels.OFFSET_OUT_OF_THRESHOLD.value: (BadPixels.OFFSET_OUT_OF_THRESHOLD.name, '#00FF0080'),\n",
    "        BadPixels.OFFSET_OUT_OF_THRESHOLD.value | BadPixels.NOISE_OUT_OF_THRESHOLD.value: ('MIXED', '#DD00DD80')}\n",
    "\n",
    "rebin = 8 if not high_res_badpix_3d else 2\n",
    "\n",
    "gain = 0\n",
    "for mod, data in badpix_g.items():\n",
    "    plot_badpix_3d(data[...,gain], cols, title=mod, rebin_fac=rebin)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Medium Gain ###"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "gain = 1\n",
    "for mod, data in badpix_g.items():\n",
    "    plot_badpix_3d(data[...,gain], cols, title=mod, rebin_fac=rebin)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Low Gain ###"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "gain = 2\n",
    "for mod, data in badpix_g.items():\n",
    "    plot_badpix_3d(data[...,gain], cols, title=mod, rebin_fac=rebin)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Aggregate values, and per Cell behaviour ##\n",
    "\n",
    "The following tables and plots give an overview of statistical aggregates for each constant, as well as per cell behavior."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "scrolled": false
   },
   "outputs": [],
   "source": [
    "create_constant_overview(offset_g, \"Offset (ADU)\", max_cells, 4000, 10000,\n",
    "                         out_folder=out_folder, infix=\"_\".join(offset_runs.values()),\n",
    "                         badpixels=badpix_g)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "scrolled": false
   },
   "outputs": [],
   "source": [
    "create_constant_overview(noise_g, \"Noise (ADU)\", max_cells, 0, 100,\n",
    "                         out_folder=out_folder, infix=\"_\".join(offset_runs.values()),\n",
    "                        badpixels=badpix_g)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "scrolled": false
   },
   "outputs": [],
   "source": [
    "create_constant_overview(thresholds_g, \"Threshold (ADU)\", max_cells, 4000, 10000, 3,\n",
    "                         out_folder=out_folder, infix=\"_\".join(offset_runs.values()),\n",
    "                        badpixels=badpix_g)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "bad_pixel_aggregate_g = OrderedDict()\n",
    "for m, d in badpix_g.items():\n",
    "    bad_pixel_aggregate_g[m] = d.astype(np.bool).astype(np.float)\n",
    "create_constant_overview(bad_pixel_aggregate_g, \"Bad pixel fraction\", max_cells, 0, 0.10, 3,\n",
    "                         out_folder=out_folder, infix=\"_\".join(offset_runs.values()))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Summary tables ##\n",
    "\n",
    "The following tables show summary information for the evaluated module. Values for currently evaluated constants are compared with values for pre-existing constants retrieved from the calibration database."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "table = []\n",
    "gain_names = ['High', 'Medium', 'Low']\n",
    "for qm in badpix_g.keys():\n",
    "    for gain in range(3):\n",
    "\n",
    "        l_data = []\n",
    "        l_data_old = []\n",
    "\n",
    "        data = np.copy(badpix_g[qm][:,:,:,gain])\n",
    "        datau32 = data.astype(np.uint32)\n",
    "        l_data.append(data)\n",
    "        l_data.append(datau32 - np.bitwise_or(datau32,BadPixels.NOISE_OUT_OF_THRESHOLD.value))\n",
    "        l_data.append(datau32 - np.bitwise_or(datau32,BadPixels.OFFSET_OUT_OF_THRESHOLD.value))\n",
    "        l_data.append(datau32 - np.bitwise_or(datau32,BadPixels.OFFSET_NOISE_EVAL_ERROR.value))\n",
    "\n",
    "        if old_const['BadPixelsDark'] is not None:\n",
    "            dataold = np.copy(old_const['BadPixelsDark'][:, :, :, gain])\n",
    "            datau32old = dataold.astype(np.uint32)\n",
    "            l_data_old.append(dataold)\n",
    "            l_data_old.append(datau32old - np.bitwise_or(datau32old,BadPixels.NOISE_OUT_OF_THRESHOLD.value))\n",
    "            l_data_old.append(datau32old - np.bitwise_or(datau32old,BadPixels.OFFSET_OUT_OF_THRESHOLD.value))\n",
    "            l_data_old.append(datau32old - np.bitwise_or(datau32old,BadPixels.OFFSET_NOISE_EVAL_ERROR.value))\n",
    "\n",
    "        l_data_name = ['All bad pixels', 'NOISE_OUT_OF_THRESHOLD', \n",
    "                       'OFFSET_OUT_OF_THRESHOLD', 'OFFSET_NOISE_EVAL_ERROR']\n",
    "\n",
    "        l_threshold = ['', '{}'.format(thresholds_noise_sigma), '{}'.format(thresholds_offset_sigma),\n",
    "                      '{}/{}'.format(thresholds_offset_hard, thresholds_noise_hard)]\n",
    "\n",
    "        for i in range(len(l_data)):\n",
    "            line = ['{}, gain {}'.format(l_data_name[i], gain_names[gain]),\n",
    "                      l_threshold[i],\n",
    "                      len(l_data[i][l_data[i]>0].flatten())\n",
    "                   ]\n",
    "\n",
    "            if old_const['BadPixelsDark'] is not None:\n",
    "                line += [len(l_data_old[i][l_data_old[i]>0].flatten())]\n",
    "            else:\n",
    "                line += ['-']\n",
    "\n",
    "        table.append(line)\n",
    "\n",
    "display(Markdown('### Number of bad pixels ###'.format(qm)))\n",
    "if len(table)>0:\n",
    "    md = display(Latex(tabulate.tabulate(table, tablefmt='latex', \n",
    "                                     headers=[\"Pixel type\", \"Threshold\", \"New constant\", \"Old constant \"])))  "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "\n",
    "header = ['Parameter', \n",
    "          \"New constant\", \"Old constant \", \n",
    "          \"New constant\", \"Old constant \", \n",
    "          \"New constant\", \"Old constant \",\n",
    "          \"New constant\", \"Old constant \"]\n",
    "\n",
    "for const in ['Offset', 'Noise', 'ThresholdsDark']:\n",
    "    table = [['','High gain', 'High gain', 'Medium gain', 'Medium gain', 'Low gain', 'Low gain']]\n",
    "    for qm in res.keys():\n",
    "\n",
    "        data = np.copy(res[qm][const])\n",
    "        if const == 'ThresholdsDark':\n",
    "            data[...,0][res[qm]['BadPixelsDark'][...,0]>0] = np.nan\n",
    "            data[...,1][res[qm]['BadPixelsDark'][...,1]>0] = np.nan\n",
    "            data[...,2:][res[qm]['BadPixelsDark']>0] = np.nan\n",
    "        else:\n",
    "            data[res[qm]['BadPixelsDark']>0] = np.nan\n",
    "\n",
    "        if old_const[const] is not None and old_const['BadPixelsDark'] is not None:\n",
    "            dataold = np.copy(old_const[const])\n",
    "            if const == 'ThresholdsDark':\n",
    "                dataold[...,0][old_const['BadPixelsDark'][...,0]>0] = np.nan\n",
    "                dataold[...,1][old_const['BadPixelsDark'][...,1]>0] = np.nan\n",
    "                dataold[...,2:][old_const['BadPixelsDark']>0] = np.nan\n",
    "            else:\n",
    "                dataold[old_const['BadPixelsDark']>0] = np.nan\n",
    "\n",
    "        f_list = [np.nanmedian, np.nanmean, np.nanstd, np.nanmin, np.nanmax]\n",
    "        n_list = ['Median', 'Mean', 'Std', 'Min', 'Max']\n",
    "\n",
    "        for i, f in enumerate(f_list):\n",
    "            line = [n_list[i]]\n",
    "            for gain in range(3):\n",
    "                if const == 'ThresholdsDark':\n",
    "                    gain += 2\n",
    "                line.append('{:6.1f}'.format(f(data[...,gain])))\n",
    "                if old_const[const] is not None and old_const['BadPixelsDark'] is not None:\n",
    "                    line.append('{:6.1f}'.format(f(dataold[...,gain])))\n",
    "                else:\n",
    "                    line.append('-')\n",
    "\n",
    "            table.append(line)\n",
    "\n",
    "    display(Markdown('### {} [ADU], good pixels only ###'.format(const)))\n",
    "    md = display(Latex(tabulate.tabulate(table, tablefmt='latex', headers=header)))  "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": []
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.6.7"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 1
}