SAMPA SRO
Project dependencies
Installation
NB. Before installation you should define the ERSAP_HOME environment variable.
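For example, in a bash shell (the path below is only an assumption; point it at your actual ERSAP installation directory, and add the line to your shell startup file if you like):
> export ERSAP_HOME=$HOME/ersap/install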
SAMPA SRO diagram
Building the SAMPA DAQ codebase
NB. The SAMPA SRO package is kindly provided by the ALICE collaboration and was modified by the EPSCI SRO group to make it streaming. The modified package can be found at /home/gurjyan/Devel/stream/exp-sampa
- log in to alkaid.jlab.org
- copy the modified ALICE package into your own directory
- follow the instructions in the package README to build it
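A minimal sketch of these steps (the destination directory ~/exp-sampa is an assumption; the actual build commands are the ones documented in the package README):
> ssh alkaid.jlab.org
> cp -r /home/gurjyan/Devel/stream/exp-sampa ~/exp-sampa
> cd ~/exp-sampa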
Configuration and running
ATTN. The order of the following instructions is important.
NB. We recommend defining and creating the $ERSAP_USER_DATA directory. Do not worry about its internal layout: after the first ERSAP execution the $ERSAP_USER_DATA directory will be populated with the proper structure. Use ersap-shell (the ERSAP CLI) to run the services locally. The CLI provides a high-level interface to configure and start the different ERSAP components required to run an application.
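For example, in a bash shell (the path is an assumption; any writable location works):
> export ERSAP_USER_DATA=$HOME/ersap-user-data
> mkdir -p $ERSAP_USER_DATA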
- Start the ERSAP shell:
- $ERSAP_HOME/bin/ersap-shell
- Define the application in a services.yaml file. An example of the file can be found in the ersap-jana installation manual, and an illustrative skeleton follows this list. NB: The default location for the application definition file is the $ERSAP_USER_DATA/config directory.
- Optionally you can change the number of parallel threads used by the services to process requests
- ersap> set threads <NUM_THREADS>
- Start the data processing. This will start the main Java DPE (and a C++ DPE if a C++ service is listed in services.yaml), and it will run the streaming orchestrator to process the data stream.
- ersap> run local
- Run the SAMPA FE (in a separate terminal; NB: use a bash shell)
- >source [modified ALICE code directory]/dist/trorc/trorc-operator/setenv.sh
- >treadout --data-type 1 --frames 2000 --mode das --mask 0x1 --events 0
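An illustrative skeleton of the services.yaml file, written to its default location, is shown below. The YAML layout and every class name here are placeholders/assumptions, not the actual SAMPA application definition; take the real engine classes from the example in the ersap-jana installation manual.
> cat > $ERSAP_USER_DATA/config/services.yaml << 'EOF'
io-services:
  reader:
    class: org.example.SampaSourceEngine    # placeholder: stream-source engine
    name: Source
  writer:
    class: org.example.FileWriterEngine     # placeholder: writer engine
    name: Writer
services:
  - class: org.example.SampaDecoderEngine   # placeholder: processing engine
    name: Decoder
mime-types:
  - binary/data-sampa                       # placeholder: data type exchanged between engines
EOF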
ALICE code instructions
Open 2 terminal sessions on alkaid
Terminal 1 (configure the front-end boards; this needs to be done ONCE unless you change or modify the C++ code):
NB: Use bash shell.
> cd /daqfs/gurjyan/trorc
> python init/go-trorc.py -mode das -mask 0x1F -nbp 5 -c
- '-mode das' selects raw ADC sample mode; substitute 'dsp' for threshold zero-suppression mode
- '-mask 0x1F' together with '-nbp 5' configures all five front-end cards; do this even if you do not read them all out
- '-c' compiles and copies the modules; needed if you change modes, and harmless to include to be safe (takes a few seconds)
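For example, to configure all five front-end cards in zero-suppression mode instead, the same script would be invoked as follows (only the mode changes relative to the command above):
> python init/go-trorc.py -mode dsp -mask 0x1F -nbp 5 -c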
Terminal 2 (run and analyze data):
NB: Use bash shell.
> source /usr/local/trorc/trorc-operator/setenv.sh
> cd /daqfs/gurjyan/trorc
> treadout --data-type 1 --frames 2000 --mode das --mask 0x1 --events 0
- '--data-type 1' - always use this value
- '--frames 4000' - collect 4000 GBT frames; with one board read out (see --mask) this number can be large (e.g. 100000). If a large number of frames is chosen with multiple boards read out, some disk files may be truncated because file writing cannot keep up with the data volume (the files remain readable)
- '--mode das' - raw ADC sample mode; substitute 'dsp' for threshold zero-suppression mode (MUST be the same as in terminal 1)
- '--mask 0x1F' - reads out all 5 front-end cards; substitute '0x1' to read out card #0, '0x3' to read out cards #0 and #1, etc.
<CTRL C> to end the run.
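As an example built from the flag descriptions above, a run reading out cards #0 and #1 in zero-suppression mode (assuming the boards were configured with '-mode dsp' in terminal 1) would be:
> treadout --data-type 1 --frames 2000 --mode dsp --mask 0x3 --events 0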