SAMPA SRO

From epsciwiki


Project dependencies

  1. ersap-java
  2. ersap-cpp
  3. ersap-sampa
  4. ersap-jana

Installation

NB. For installation you should define the ERSAP_HOME environment variable.
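A minimal sketch of setting the variable; the path below is an assumption, so point it at your actual ERSAP installation:

```shell
# Add to ~/.bashrc (or your shell's rc file).
# $HOME/ersap is only an example location -- use your real install path.
export ERSAP_HOME="$HOME/ersap"

# Create the directory if needed and confirm the variable is set.
mkdir -p "$ERSAP_HOME"
echo "ERSAP_HOME=$ERSAP_HOME"
```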

  1. ersap-java instructions
  2. ersap-cpp instructions
  3. ersap-sampa instructions
  4. ersap-jana instructions

SAMPA SRO diagram

  1. diagram

Building the SAMPA DAQ codebase

NB. The SAMPA SRO package is kindly provided by the ALICE collaboration and was modified by the EPSCI SRO group to make it streaming. The modified package can be found at /home/gurjyan/Devel/stream/exp-sampa

  1. Log in to alkaid.jlab.org
  2. Copy the modified ALICE package into your own directory
  3. Follow the instructions in the README to build the package

Configuration and running

ATTN. Keeping the order of these instructions is important.
NB. We recommend defining and creating a $ERSAP_USER_DATA directory. Do not worry about its internal directory structure: after the first ERSAP execution, $ERSAP_USER_DATA will be given the proper structure. Use ersap-shell (the ERSAP CLI) to run the services locally. The CLI provides a high-level interface to configure and start the different ERSAP components required to run an application.
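Defining and creating the directory can be sketched as follows; the chosen location is an assumption, so use any writable directory you like:

```shell
# $HOME/ersap-user is only an example location.
export ERSAP_USER_DATA="$HOME/ersap-user"
mkdir -p "$ERSAP_USER_DATA"
# ERSAP populates this directory on its first run; the application
# definition file is looked up under $ERSAP_USER_DATA/config.
echo "ERSAP_USER_DATA=$ERSAP_USER_DATA"
```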

  1. Start the ERSAP shell:
    • $ERSAP_HOME/bin/ersap-shell
  2. Define the application within a services.yaml file. An example of the file can be found in the ersap-jana installation manual. NB: The default location for the application definition file is the $ERSAP_USER_DATA/config directory
    • ersap> set servicesFile services.yaml
  3. Optionally, change the number of parallel threads used by the services to process requests
    • ersap> set threads <NUM_THREADS>
  4. Start the data processing. This starts the main Java DPE and, if a C++ service is listed in services.yaml, a C++ DPE; it then runs the streaming orchestrator to process the data stream.
    • ersap> run local
  5. Run the SAMPA DAQ (in a separate terminal; use a bash shell)
    • >source [modified ALICE code directory]/dist/trorc/trorc-operator/setenv.sh
    • >treadout --data-type 1 --frames 2000 --mode das --mask 0x1 --events 0
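As a rough illustration of the services.yaml referenced in step 2, the sketch below shows the typical shape of such a file; every class and name here is a placeholder, not a real ERSAP service, so consult the example in the ersap-jana installation manual for the actual contents:

```yaml
# Illustrative sketch only -- all classes and names are placeholders.
io-services:
  reader:
    class: org.jlab.ersap.examples.SampaSourceService   # placeholder
    name: SampaSource
  writer:
    class: org.jlab.ersap.examples.FileSinkService      # placeholder
    name: FileSink
services:
  - class: org.jlab.ersap.examples.SampaDecoderService  # placeholder
    name: SampaDecoder
```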

ALICE code instructions

Open 2 terminal sessions on alkaid


Terminal 1 (configure the front-end boards; this needs to be done ONCE unless you change or modify the C++ code):

NB. Use bash shell.

> cd /daqfs/gurjyan/trorc
> python init/go-trorc.py -mode das -mask 0x1F -nbp 5 -c

  1. 'das' selects raw ADC sample mode; substitute 'dsp' for threshold zero-suppression mode
  2. '0x1F' together with '5' configures all five front-end cards; do this even if you don't read them all out
  3. '-c' compiles and copies modules, which is needed if you change modes; it takes only a few seconds, so do it to be safe


Terminal 2 (run and analyze data):


NB. Use bash shell.

> source /usr/local/trorc/trorc-operator/setenv.sh
> cd /daqfs/gurjyan/trorc
> treadout --data-type 1 --frames 2000 --mode das --mask 0x1 --events 0

  1. --data-type 1 - always use this value
  2. --frames 4000 - collect 4000 GBT frames. With one board read out (see --mask) this number can be large (e.g. 100000). If a large number of frames is chosen with multiple boards read out, some disk files may be truncated because file writing cannot keep up with the data volume (the files remain readable).
  3. --mode das - raw ADC sample mode; substitute 'dsp' for threshold zero-suppression mode (MUST be the same as in Terminal 1)
  4. --mask 0x1F - reads out all 5 front-end cards; substitute '0x1' to read out card #0, '0x3' to read out cards #0 and #1, etc.
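The mask values above follow one bit per front-end card (bit 0 = card #0, bit 1 = card #1, and so on), which can be checked with shell arithmetic:

```shell
# Bit i of --mask selects front-end card #i.
printf '0x%X\n' $(( 1 << 0 ))               # card #0 only      -> 0x1
printf '0x%X\n' $(( (1 << 0) | (1 << 1) ))  # cards #0 and #1   -> 0x3
printf '0x%X\n' $(( (1 << 5) - 1 ))         # all five cards    -> 0x1F
```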

Press <CTRL-C> to end the run