Digitization and Reconstruction
Marlin
The next two steps, digitization and reconstruction, are managed with the Marlin framework. Marlin stands for Modular Analysis and Reconstruction for the LINear Collider.
Marlin tasks are implemented as "processors". Each processor can access the data of an event, and it can optionally create new data. For example, a clustering processor could read calorimeter hits, execute a clustering algorithm, and create new calorimeter clusters. A Marlin user defines which processors are executed, and their order of execution.
Our Marlin processors are run with the k4run executable in the Key4hep software stack, which uses the Gaudi event processing framework. If you're familiar with the Marlin executable, an introduction to using the k4run executable is available in the Key4hep documentation.
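Under the hood, k4run reads a Python steering file that defines which processors run and in which order. The following is a rough sketch of such a steering file, following the k4MarlinWrapper pattern; the processor type, parameters, and collection name are purely illustrative and not the ones used in this tutorial.
from Gaudi.Configuration import *
from Configurables import EventDataSvc, LcioEvent, MarlinProcessorWrapper, ApplicationMgr

algList = []
evtsvc = EventDataSvc()

# Read events from an existing LCIO file
read = LcioEvent()
read.Files = ["output_sim.slcio"]
algList.append(read)

# Wrap a Marlin processor (type, parameters, and collection name are illustrative)
example = MarlinProcessorWrapper("ExampleProcessor")
example.ProcessorType = "SomeMarlinProcessor"
example.Parameters = {"InputCollection": ["ECalBarrelCollection"]}
algList.append(example)

# Processors are executed in the order they appear in TopAlg
ApplicationMgr(TopAlg=algList, EvtSel="NONE", EvtMax=10, ExtSvc=[evtsvc], OutputLevel=INFO)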
Digitization
The next step of simulating data is called "digitization". This is the process of converting simulated hits into realistic hits, closer to what an actual particle detector and its front-end electronics would record. Typically this involves requiring a minimum deposited energy for a hit to be kept, and smearing the hit energy to a realistic detector resolution.
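As a toy illustration of the idea (not the actual digitizer used here, and with an arbitrary threshold and resolution), digitizing a single simulated hit energy could look like this:
import random

def digitize_hit(sim_energy, threshold=0.0002, resolution=0.10):
    # Toy digitization: smear the energy and drop hits below threshold (energies in GeV)
    smeared = random.gauss(sim_energy, resolution * sim_energy)
    return smeared if smeared > threshold else None
To run the actual digitization for this tutorial: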
# Runs in about 1 minute
cd /scratch/$USER/tutorial2024
k4run mucoll-benchmarks/digitisation/k4run/digi_steer.py --LcioEvent.Files output_sim.slcio
ls
A few output files will be produced:
- output_digi.slcio - contains all the collections produced by the processors
- output_digi_light.slcio - contains a light subset of the output collections
The files can be inspected using the usual slcio tools:
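For example, assuming the standard LCIO command-line tools anajob and dumpevent are available in the container:
# Print the run headers and the list of collections in each event
anajob output_digi.slcio
# Dump the full contents of a single event (here, the first one)
dumpevent output_digi.slcio 1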
Note that the names of the collections can change depending on how Marlin is configured. For example, in previous tutorials the digitized barrel ECAL hits were called ECALBarrelHits, whereas in this tutorial they are called EcalBarrelCollectionDigi.
You can also inspect individual digitized hits with a Python script using pyLCIO:
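For instance, a minimal pyLCIO sketch that prints the digitized ECAL barrel hits of the first event could look like the following (the collection name is the one used in this tutorial; adjust it if your configuration differs):
from pyLCIO import IOIMPL

# Open the digitized LCIO file
reader = IOIMPL.LCFactory.getInstance().createLCReader()
reader.open("output_digi.slcio")

# Look at the digitized ECAL barrel hits of the first event
event = reader.readNextEvent()
hits = event.getCollection("EcalBarrelCollectionDigi")
for hit in hits:
    pos = hit.getPosition()
    print(f"E = {hit.getEnergy():.4f} GeV at ({pos[0]:.1f}, {pos[1]:.1f}, {pos[2]:.1f}) mm")

reader.close()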
Reconstruction
The final step is called "reconstruction", which itself is composed of multiple steps:
- Further filtering the list of tracker and calorimeter hits
- Creating tracks from tracker hits and clusters from calorimeter hits
- Creating particles (PFOs, particle flow objects) from the tracks and clusters
Finally, we reconstruct particles from the simulated measurements of the detector. Our reconstruction processors are driven by ACTS for tracking and by Pandora for clustering and particle creation. Nice presentations about ACTS and Pandora are available online.
Our interfaces with these projects are MuonColliderSoft/ACTSTracking and MuonColliderSoft/DDMarlinPandora. The container includes data necessary for ACTS tracking, and their locations are stored as environment variables:
env | grep ACTS_
# Should give:
# ACTS_TGeoFile=/opt/spack/opt/spack/linux-almalinux9-x86_64/gcc-11.3.1/actstracking-1.2.2-tjfu4av5xb6ivzyihvi2a3djbpnqx5nk/share/ACTSTracking/data/MuColl_v1.root
# ACTS_MatFile=/opt/spack/opt/spack/linux-almalinux9-x86_64/gcc-11.3.1/actstracking-1.2.2-tjfu4av5xb6ivzyihvi2a3djbpnqx5nk/share/ACTSTracking/data/material-maps.json
We can then run the muon collider reconstruction:
# Runs in about 1 minute
cd /scratch/$USER/tutorial2024
cp -a mucoll-benchmarks/reconstruction/k4run/PandoraSettings ./
k4run mucoll-benchmarks/reconstruction/k4run/reco_steer.py \
--LcioEvent.Files output_digi.slcio \
--MatFile ${ACTS_MatFile} \
--TGeoFile ${ACTS_TGeoFile}
ls
A few more output files will be produced:
- output_reco.slcio - contains all the collections produced by the processors
- output_reco_light.slcio - contains a light subset of the output collections
The files can be inspected using the usual slcio tools:
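For example, again assuming the LCIO anajob tool is available:
# Print the list of collections in each reconstructed event
anajob output_reco.slcio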
Note the new collections of clusters, tracks, and reconstructed particles.
You can also inspect reconstructed objects with a Python script using pyLCIO:
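For instance, a minimal pyLCIO sketch for the reconstructed particles of the first event could look like the following (the collection name PandoraPFOs is an assumption; check the collection list printed by anajob for the exact name in your file):
from pyLCIO import IOIMPL

# Open the reconstructed LCIO file
reader = IOIMPL.LCFactory.getInstance().createLCReader()
reader.open("output_reco.slcio")

# Print the particle flow objects of the first event
event = reader.readNextEvent()
pfos = event.getCollection("PandoraPFOs")
for pfo in pfos:
    p = pfo.getMomentum()
    print(f"PDG = {pfo.getType()}, E = {pfo.getEnergy():.2f} GeV, p = ({p[0]:.2f}, {p[1]:.2f}, {p[2]:.2f}) GeV")

reader.close()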