
Running Event Reconstruction with the Beam Induced Background

Info

This guide is current as of October 2025, and was written for version 2.9.7 of the software.

This tutorial extends the "Running MAIA" guide to event reconstruction with the beam induced background (BIB). Running with BIB is computationally expensive. Per event, overlaying BIB takes just under an hour, tracking takes around an hour, and reconstruction can take several hours even with Pandora optimization. For this reason, we recommend using Condor to submit the reconstruction jobs in batches of a few events each.

These instructions are written for the Open Science Grid (OSG). It should be straightforward to adapt them to other machines, with the caveat that you will have to work out how to access the ~60 GB of simulated BIB files from the worker node. Worker nodes on the OSG can access the BIB files through /cvmfs/public-uc.osgstorage.org/ospool/uc-shared/public/futurecolliders/.

Feedback on these instructions is welcome!

Splitting signal slcio files

We assume simulated signal samples have already been made according to the "Running MAIA" instructions. For batch submission, you should split your simulated signal samples into chunks. You can do so with:

lcio_split_file <infilename> <outfilename> <sizeInBytes>

File sizes of ~1 MB are probably sufficient, but the optimal size depends on the task. You may find that you can make the files larger (if you're not running reconstruction) or that you need to make them smaller (if you're submitting a complex reconstruction task). Condor jobs on the OSG are limited to 20 hours of execution time each. The split files will follow a naming pattern like: file_name.001.slcio, file_name.002.slcio, ... , file_name.N.slcio.
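To sanity-check a chunk size, you can budget events per job against the 20-hour limit using the per-event costs quoted in the introduction. This is only a sketch: the ~3 h reconstruction figure below is an assumption standing in for "several hours", and all figures are rough.

```shell
# Back-of-the-envelope events-per-job budget under the 20 h OSG limit.
# Per-event costs in hours; the reconstruction figure (~3 h) is an
# assumption standing in for "several hours".
OVERLAY_H=1
TRACKING_H=1
RECO_H=3
LIMIT_H=20
PER_EVENT_H=$((OVERLAY_H + TRACKING_H + RECO_H))
echo $((LIMIT_H / PER_EVENT_H))   # events that safely fit in one job; prints 4
```

If even a handful of events per job risks exceeding the limit, split the slcio files into smaller chunks instead.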

Dependencies: Condor submission scripts

The condor submission script templates live in the following repository (they will be moved to a central location soon):

https://github.com/gregory-penn/MuColl-PFlow/tree/main/condor_scripts/alma9/reconstruction_BIB

There are several files here:

  • reco.sub, the condor submission script. This file requires editing.
  • digi_reco_condor.py, the BIB overlay + digi + reconstruction script.
  • reco.sh, a bash script that runs the job on the worker node.
  • joblist.txt, a text file that directs which files are to be submitted. This file requires editing.
  • PandoraSettings, the directory that contains the configuration file for the Pandora particle flow algorithm. These files correspond to the configurations used for analysis of the v7 BIB samples.

reco.sub steers the condor job submission: the worker node executes the bash script reco.sh, which sets up the proper environment and runs BIB overlay + digitization + event reconstruction via digi_reco_condor.py.

Editing the condor submission script

You will need to update a few paths in reco.sub. The first group should match the name and location of your simulated signal samples and the location where you installed the reconstruction scripts:

-    input_file_name = split_sim_pions.$(sample).slcio
+    input_file_name = <outfilename>.$(sample).slcio
-    input_file_path = /scratch/gregorypenn/muColl-Taus/taus/MuColl-PFlow/samples/bib_v5_pions/sim/$(input_file_name)
+    input_file_path = </your/path/to>/$(input_file_name)
-    digi_reco_script = /scratch/gregorypenn/muColl-Taus/taus/MuColl-PFlow/condor_scripts/alma9/reconstruction_BIB/digi_reco_condor.py
+    digi_reco_script = </your/path/to>/digi_reco_condor.py
-    pandoraSettings_path = /scratch/gregorypenn/muColl-Taus/taus/MuColl-PFlow/condor_scripts/alma9/reconstruction_BIB/PandoraSettings
+    pandoraSettings_path = </your/path/to>/PandoraSettings

The following paths are to MAIA geometry files. For more details, see "Running MAIA":

-    maia_GEO = /scratch/gregorypenn/lccontent/maia_Geometry/MAIA_versions/v1_jun27/detector-simulation/geometries/MAIA_v0
+    maia_GEO = </your/path/to>/detector-simulation/geometries/MAIA_v0
-    myBIBUtils_path = /scratch/gregorypenn/MyBIBUtils
+    myBIBUtils_path = </your/path/to>/MyBIBUtils
-    ACTSTracking_path = /scratch/gregorypenn/ACTSTracking
+    ACTSTracking_path = </your/path/to>/ACTSTracking

These are all of the required edits!
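If you prefer to script these substitutions rather than edit reco.sub by hand, a sed one-liner per path works. The sketch below patches a one-line stand-in file, not the real reco.sub, and `/your/path/to` is a placeholder:

```shell
# Patch a reco.sub-style line with sed. 'reco.sub.demo' is a
# stand-in file and '/your/path/to' is a placeholder path.
cat > reco.sub.demo <<'EOF'
digi_reco_script = /scratch/gregorypenn/muColl-Taus/taus/MuColl-PFlow/condor_scripts/alma9/reconstruction_BIB/digi_reco_condor.py
EOF
sed -i 's|= .*/digi_reco_condor.py|= /your/path/to/digi_reco_condor.py|' reco.sub.demo
cat reco.sub.demo   # digi_reco_script = /your/path/to/digi_reco_condor.py
```

Repeat the same pattern for the other paths (input_file_path, pandoraSettings_path, and so on), substituting your own locations.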

Submitting condor jobs

The condor submission script will queue one job per line in joblist.txt. This can be disabled by commenting out the last line:

-    Queue sample from joblist.txt  
+    # Queue sample from joblist.txt  

The script will then submit only the job for the file numbered "1". We strongly urge you to test the submission of a single job first. You can submit the job (note that you must not be in the singularity environment) with:

condor_submit reco.sub

and monitor the submission with:

condor_q <yourUsername>

The output logs should also be streamed to your directory so you can monitor the jobs in real time.

Once you have verified that the condor submission works and the output is sensible, you can submit more jobs. If you want to submit the first three files, joblist.txt should contain one file number per line, zero-padded to match the split-file names:

000
001
002

You can generate an appropriate joblist.txt for a given number of jobs with the command:

seq 0 3333 | awk '{ if ($1 < 1000) printf "%03d\n", $1; else printf "%d\n", $1 }' > joblist.txt

This generates entries numbered 000 through 3333, zero-padded to three digits below 1000, matching the file numbering produced by lcio_split_file. Adjust the upper bound of seq to your number of split files.
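The padding rule the awk program applies (three digits below 1000, plain decimal above) can be seen with a few sample indices in plain shell:

```shell
# Demonstrate the joblist padding rule: indices below 1000 get
# three digits, larger indices print as-is.
for i in 8 42 999 1000 2500; do
  if [ "$i" -lt 1000 ]; then
    printf '%03d\n' "$i"
  else
    printf '%d\n' "$i"
  fi
done
# prints: 008 042 999 1000 2500 (one per line)
```

If your split-file names use a different padding width, adjust the `%03d` format accordingly.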

Happy BIB submitting!