2023 KLF Simulations

Installing Software

The reconstruction and analysis software is built on the same software stack used for the analysis of photon beam measurements with the GlueX detector.

  • You can find some general information on GlueX software and computing at this page.
    • If you are running the software at JLab, you should use the instructions on this page to set up the GlueX software, and possibly this page as well.
  • A general overview of the GlueX software can be found here.

There are a couple of different ways to install the GlueX software stack.

Once the GlueX stack is installed, you can install the KLF-specific branches using the my_halld_update.py script from the build_scripts package together with the XML configuration file given below, e.g., my_halld_update.py -x /path/to/version.klf.xml.

Example version.klf.xml configuration file:

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="https://halldweb.jlab.org/halld_versions/version7.xsl"?>
<gversions file="version_5.11.0.xml" date="2023-06-14">
<package name="amptools" version="0.14.5"/>
<package name="ccdb" version="1.06.09"/>
<package name="cernlib" version="2005" word_length="64-bit"/>
<package name="diracxx" version="2.0.2"/>
<package name="evio" version="4.4.6"/>
<package name="evtgen" version="01.07.00"/>
<package name="geant4" version="10.04.p02"/>
<package name="gluex_MCwrapper" version="v2.7.0"/>
<package name="gluex_root_analysis" version="1.24.0" dirtag="hdr441"/>
<package name="halld_recon" branch="sdobbs_klong_beam"/>
<package name="halld_sim" version="4.45.0"/>
<package name="hdds" version="4.15.0"/>
<package name="hdgeant4" version="2.35.0" dirtag="hdr441"/>
<package name="hd_utilities" version="1.45"/>
<package name="hepmc" version="2.06.10"/>
<package name="jana" version="0.8.2" dirtag="ccdb169"/>
<package name="lapack" version="3.9.0"/>
<package name="photos" version="3.61"/>
<package name="rcdb" version="0.07.01"/>
<package name="root" version="6.24.04"/>
<package name="sqlitecpp" version="3.1.1"/>
<package name="sqlite" version="3.36.0" year="2021"/>
<package name="xerces-c" version="3.2.3"/>
</gversions>

A version of the dedicated KLF event generator can be downloaded from https://github.com/sdobbs/KLGenerator_hddm_V3 and installed using the following commands:

git clone https://github.com/sdobbs/KLGenerator_hddm_V3
cd KLGenerator_hddm_V3
scons install

Note that KLGenerator_hddm_V3 is installed under $HALLD_MY/bin, so make sure that you have that directory in your PATH.
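
For example, in a bash shell this can be done as follows (a minimal sketch; adjust for your shell and login scripts):

export PATH=$HALLD_MY/bin:$PATH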

Easy Configuration at JLab

If you are on the JLab CUE, you can load a KLF environment using the following script:

source /work/halld/home/sdobbs/KLF/load_KLF_sw.sh

If you want to set up a development environment, you can copy the .sh and .xml files from that directory, check out your own copy of halld_recon, and edit the XML file so that its halld_recon entry has the attribute home=/wherever/you/put/your/halld_recon.
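
A rough sketch of that workflow is given below, assuming bash; the ~/KLF_dev working directory and the repository URL are assumptions, and the branch name is taken from the XML file above:

mkdir -p ~/KLF_dev && cd ~/KLF_dev
cp /work/halld/home/sdobbs/KLF/*.sh /work/halld/home/sdobbs/KLF/*.xml .
git clone -b sdobbs_klong_beam https://github.com/JeffersonLab/halld_recon
# then edit the copied .xml so its halld_recon entry points at this checkout, e.g.
# <package name="halld_recon" branch="sdobbs_klong_beam" home="/home/you/KLF_dev/halld_recon"/>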


Apptainer (Singularity) Container

These instructions can be used to run the KLF software anywhere that CVMFS is available, including the JLab farm. If you do not have CVMFS installed, please follow these instructions.

To download the software, follow these steps:

  1. Download the container: https://halldweb.jlab.org/dist/KLF/gluex_almalinux-9.2_sng1.3.sif
  2. Download the KLF software: https://halldweb.jlab.org/dist/KLF/KLF_soft.tar.gz
  3. Unpack the KLF software: tar xzf KLF_soft.tar.gz

You can then start the container with a command similar to the following:

apptainer shell --bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld,/d/home --bind [directory you downloaded the KLF software into]/KLF:/sw/KLF gluex_almalinux-9.2_sng1.3.sif

Generally, you want to make sure whatever directory you unpacked the KLF software into is mounted into the container as /sw/KLF. Don't forget to also bind any local directories that you need, like your home directory and where any analysis or data files are stored.

Inside the container, you can load the KLF software by running the command source /sw/KLF/load_KLF_sw.sh, and you should be good to go!
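
Putting these steps together, here is a hedged end-to-end sketch; the working directory /home/youruser/klf and the extra home-directory bind are placeholders to adjust for your system:

cd /home/youruser/klf
wget https://halldweb.jlab.org/dist/KLF/gluex_almalinux-9.2_sng1.3.sif
wget https://halldweb.jlab.org/dist/KLF/KLF_soft.tar.gz
tar xzf KLF_soft.tar.gz
apptainer shell --bind /cvmfs/oasis.opensciencegrid.org/gluex/group/halld:/group/halld --bind /home/youruser/klf/KLF:/sw/KLF --bind /home/youruser:/home/youruser gluex_almalinux-9.2_sng1.3.sif
source /sw/KLF/load_KLF_sw.sh    # run this inside the container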


CCDB settings

Note that right now there is a problem loading the geometry from the main MySQL CCDB. For now, please use the following settings:

JANA_CALIB_URL=sqlite:////work/halld/home/sdobbs/KLF/klong.sqlite
JANA_CALIB_CONTEXT="variation=klong"
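
For example, in a bash session these can be set as follows (a minimal sketch using the values above):

export JANA_CALIB_URL=sqlite:////work/halld/home/sdobbs/KLF/klong.sqlite
export JANA_CALIB_CONTEXT="variation=klong"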

Generating Events

The KLGenerator_hddm_V3 program has some good online help through the -h switch.

As an example, let's try generating 1M K_L p -> K_S p events.

KLGenerator_hddm_V3 -M1000000 -Fgenerated.root -Ekaon:histo:1.0:4.0 -Rkl2

This writes events out to a file named generated.hddm.

Here are some useful switches to consider for these studies (a combined example follows the list):

  • -t disables the KL beam propagation calculation; this effectively forces hdgeant4 to treat the event as a photon beam reaction.
  • -c disables simulation of the KPT z size, so all KLs are produced exactly 24 m from the target center.
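
For illustration, these switches can be combined with the generation command above (a hedged sketch; check KLGenerator_hddm_V3 -h for the exact semantics in your version):

KLGenerator_hddm_V3 -M1000000 -Fgenerated.root -Ekaon:histo:1.0:4.0 -Rkl2 -t -c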

There are two next steps: running the generated events through the GEANT4-based detector simulation, and applying the detector response (smearing). When running all of these pieces of software, the environment variable JANA_CALIB_CONTEXT must be set to "variation=klong", as described in the CCDB settings above.

The GEANT4-based simulation package is called hdgeant4. It relies on a configuration file called control.in. There is thorough documentation at this location; however, a simple control.in file to start out with is given below.

Example control.in file:

INFILE 'generated.hddm'
TRIG 10
RUNG 30730
OUTFILE 'hdgeant4_output.hddm'
TOFMAX 1e-5
HADR 1
CKOV 1
LABS 1

You can then run the program as, for example, hdgeant4 -t6 (the -t6 option says to run with 6 threads; change this to whatever is convenient for your system). The output is then saved as hdgeant4_output.hddm, as specified by the OUTFILE line above.

You can then apply the detector smearing by running the command mcsmear hdgeant4_output.hddm. The output file will be hdgeant4_output_smeared.hddm.
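
Putting the whole simulation chain together, here is a hedged sketch that uses the file names and settings from the examples above (it assumes control.in is in the current directory):

export JANA_CALIB_URL=sqlite:////work/halld/home/sdobbs/KLF/klong.sqlite
export JANA_CALIB_CONTEXT="variation=klong"
KLGenerator_hddm_V3 -M1000000 -Fgenerated.root -Ekaon:histo:1.0:4.0 -Rkl2
hdgeant4 -t6                    # reads control.in, writes hdgeant4_output.hddm
mcsmear hdgeant4_output.hddm    # writes hdgeant4_output_smeared.hddm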

Analyzing simulated data

There are two main ways to go about analyzing the simulated data: inspecting events with hd_dump, or producing histograms and trees with hd_root.

Running hd_dump

The program hd_dump prints out the values of various reconstructed objects event by event.

Here is an example of how to run it:

hd_dump -DDVertex:KLong -DDBeamKLong -DDBeamKLong:MCGEN -DDTrackTimeBased -DDChargedTrackHypothesis -DDMCThrown -PVERTEX:USEWEIGHTEDAVERAGE=1 hdgeant4_output_smeared.hddm

Running hd_root

To generate histograms or trees, you can use the program hd_root. Here is an example of how to run it:

hd_root --nthreads=8 -PEVENTRFBUNCH:USE_TAG=KLong -PVERTEX:USEWEIGHTEDAVERAGE=1 -PVERTEX:USE_KLONG_VERTEX=1 -PTRIG:BYPASS=1 -PPLUGINS=monitoring_hists hdgeant4_output_smeared.hddm

Some notes:

  • hd_root runs very well multithreaded! Use as many threads as you like via the --nthreads flag.
  • Most options can be set using the -P switch, as -P[key]=[value]. Use the -Pprint option to get a list of the different flags that can be set (see the example after this list).
  • -PPLUGINS lets you specify a comma-separated list of plugins to use. There are many standard plugins, but monitoring_hists is a good place to start - it makes a ton of standard histograms.
  • You should include the -PTRIG:BYPASS=1 flag if you are making selections on the simulated trigger or using the ReactionFilter plugin.
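
For example, a minimal sketch of dumping the available configuration parameters with -Pprint (the exact output format depends on the software version):

hd_root -Pprint hdgeant4_output_smeared.hddm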


Note that you should always include the flags -PEVENTRFBUNCH:USE_TAG=KLong and -PVERTEX:USE_KLONG_VERTEX=1!

You can also write your own plugins; instructions are to come.

Creating analysis trees

The ReactionFilter plugin is the easiest way to generate ROOT trees to analyze. It automatically performs some loose particle ID selections, combinatorics for a specific final state, and an optional kinematic fit.

Some notes:

  • Here is a summary page with additional documentation on how to use the ReactionFilter Plugin
  • For current KLF analysis, it would be useful to understand how the results depend on different kinematic fit settings, specifically: no kinematic fit (F0), a vertex-only kinematic fit (F2), and a combined 4-momentum and vertex kinematic fit (F4). An example of adding the fit flag is given below.
  • It would also be useful to study the effect of different particle ID selections. The default cuts are described on this page, and the way to modify them is given on this page.
  • For general software reference, look at these documentation wiki pages and also the slides from these tutorials.

Here is an example of how to run it for K_L p -> K_S p (the Reaction1 string encodes beam_target__final-state using GEANT particle IDs: 10 = K_L, 14 = proton, 16 = K_S):

hd_root --nthreads=8 -PTRIG:BYPASS=1 -PEVENTRFBUNCH:USE_TAG=KLong -PVERTEX:USEWEIGHTEDAVERAGE=1 -PVERTEX:USE_KLONG_VERTEX=1 -PPLUGINS=monitoring_hists,ReactionFilter -PReaction1=10_14__16_14 -PReaction1:Flags=B0_M16 hdgeant4_output_smeared.hddm
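
To turn on one of the kinematic fits listed above, add the corresponding F flag to the Flags string. A hedged sketch for a vertex-only fit (F2), assuming the flag tokens are simply concatenated with underscores:

hd_root --nthreads=8 -PTRIG:BYPASS=1 -PEVENTRFBUNCH:USE_TAG=KLong -PVERTEX:USEWEIGHTEDAVERAGE=1 -PVERTEX:USE_KLONG_VERTEX=1 -PPLUGINS=monitoring_hists,ReactionFilter -PReaction1=10_14__16_14 -PReaction1:Flags=B0_F2_M16 hdgeant4_output_smeared.hddm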

Here is an example of how to make a tree of just the thrown particles:

hd_root --nthreads=8 -PPLUGINS=mcthrown_tree -PMCTHROWN:TAGCHECK=0 hdgeant4_output_smeared.hddm

Interesting things to look at

  • How do the vertex spatial and time resolutions depend on the number of charged particles in the event, and on where those particles go in the detector (BCAL vs. TOF)?
  • Is the time-of-flight (Δt_RF) properly centered at zero, and what is its resolution in the various detectors?

Relevant objects to consider

  • DVertex:KLong - uses charged tracks to determine the primary vertex position and time. There are several interesting flags associated with this:
    • VERTEX:USEWEIGHTEDAVERAGE=(0/1) - to get the event time, we take the simple average of the track times propagated to the vertex. We can also weight this average by the resolution of these times (i.e. the track times as determined by fast timing detectors).
    • VERTEX:MINTRACKINGFOM - to get the overall event vertex, we only want to use reasonably good tracks that come from near the target. This is one quality-control cut we can apply, on the track fit FOM.
    • VERTEX:MINTRACKNDF - similar, but this cuts (implicitly) on the number of hits on the track (number of hits = NDF + 5); junk and beam-background tracks tend to have few hits.
  • DEventRFBunch:KLong - uses the DVertex:KLong object to determine the event timing
  • DBeamKLong - the reconstructed KLong beam object (momentum and timing)
  • DBeamKLong:MCGEN - the thrown KLong beam parameters

Next steps

  • Determine some loose default TOF PID selections.
  • Implement the K_L beam into the kinematic fitting framework.