SRGS 2022

== General Info ==
 
* [[Media:Mini_research_projects_v2.pdf | Potential Projects Summaries]]
 
* [https://jeffersonlab.sharepoint.com/:f:/r/sites/SciEdStudents/Shared%20Documents/2022_Summer/David%20Lawrence,%20Nathan%20Brei,%20Carl%20Timmer?csf=1&web=1&e=btBNDn Sharepoint Link]
 
== PHASM: neural net models of PDE solvers ==

Students:

* Dhruv Bejugam
* Hari Gopal
* Colin Wolfe

Useful links:

* Project slides:
* Background reading material
* PHASM repository: [https://github.com/nathanwbrei/phasm]
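
The project title names the technique: train a neural network as a surrogate for a PDE solver. As a rough illustration only (not PHASM code; see the repository for the real interfaces), here is a minimal numpy sketch that generates (source term, solution) training pairs from a finite-difference solve of a 1D Poisson problem; a surrogate network would then be fit to map f to u:

```python
import numpy as np

def solve_poisson_1d(f, n=64):
    """Finite-difference solve of u''(x) = f(x) on [0, 1] with
    u(0) = u(1) = 0; returns u at the n interior grid points."""
    h = 1.0 / (n + 1)
    # Standard tridiagonal second-difference operator
    A = (np.diag(-2.0 * np.ones(n))
         + np.diag(np.ones(n - 1), 1)
         + np.diag(np.ones(n - 1), -1)) / h**2
    return np.linalg.solve(A, np.asarray(f, dtype=float))

def make_training_pairs(n_samples=100, n=64, seed=0):
    """Generate (source term f, solution u) pairs that a surrogate
    network could be trained on, one row per sample."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, 1.0, n + 2)[1:-1]
    fs, us = [], []
    for _ in range(n_samples):
        # Random smooth source: a few low-frequency sine modes
        coeffs = rng.normal(size=3)
        f = sum(c * np.sin((k + 1) * np.pi * x)
                for k, c in enumerate(coeffs))
        fs.append(f)
        us.append(solve_poisson_1d(f, n))
    return np.array(fs), np.array(us)
```

A network trained to map each row of fs to the corresponding row of us would stand in for the linear solve at inference time; the 1D Poisson choice here is just the simplest self-contained example.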
 

Revision as of 19:10, 27 June 2022

== AI Feature Recognition: Extract Spectrometer Angle from Image ==

Students:

* Anna Rosner
* William Savage

Useful links/info:

* angle-cam-image-recognition.pdf
* Location of example images: /work/hallc/shms/spring17_angle_snaps/
** The time the image was acquired is embedded in the image file
** The numbers in the snapshot filenames are the run numbers
** 4,265 images; ~92 kB/file; 391 MB total
* The values of the encoders are stored in the MYA EPICS archive
** PV names are:
*** ecSHMS_Angle
*** ecHMS_Angle
* Example logbook entry
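
As a first step toward pairing snapshots with their archived encoder values, the files can be indexed by run number. The directory path comes from the list above; the filename pattern used here is an assumption and should be checked against the real snapshot names:

```python
import re
from pathlib import Path

# Directory from the list above (on the JLab work filesystem)
SNAP_DIR = Path("/work/hallc/shms/spring17_angle_snaps")

def run_number(name):
    """Extract the run number from a snapshot filename.
    ASSUMPTION: the run number is the first digit group in the
    filename stem -- verify against the actual naming convention."""
    m = re.search(r"\d+", Path(name).stem)
    return int(m.group(0)) if m else None

def index_snapshots(directory=SNAP_DIR):
    """Map run number -> list of snapshot paths in the directory."""
    index = {}
    for p in sorted(Path(directory).glob("*")):
        run = run_number(p.name)
        if run is not None:
            index.setdefault(run, []).append(p)
    return index
```

With such an index, each image can be looked up against the ecSHMS_Angle / ecHMS_Angle values stored in MYA for the matching run.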

=== Initial thoughts from Brad ===

   I had been imagining splitting the photos into two regions: one with
   the digits, and a second with the vernier scale. Each region would be
   evaluated/interpreted separately with some 'optimized' algorithms.
   
   'Real' errors/discrepancies would be best indicated by scanning for a
   mismatch between MYA and the analysis database record and/or the value
   flagged in the logbook, which has generally been vetted and updated by a
   human. The simplest way to test 'bad' angles would be just to
   (randomly) shift the truth angle by a small amount -- that would be
   indistinguishable from an observed drift in the EPICS encoder system.
   
   I (or the students) can also look for angle shifts in the 'real' data,
   but that will take some poking around. It should be indicated by a
   sharp (small) jump in the MYA value as an offset is changed to bring
   the EPICS value in agreement with the camera readback.
   
   One other dataset that I could obtain is a movie of the angle changing
   over a range (the movie is just a compilation of frame grabs). The
   individual frames could be pulled out of the mp4 and evaluated
   individually over a continuously varying range of angles.
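
The "randomly shift the truth angle" test described above can be sketched as follows. This is a hypothetical helper, not existing analysis code; the 10% fraction and the 0.01–0.05 degree shift range are made-up placeholders:

```python
import numpy as np

def make_bad_angle_testset(true_angles, fraction=0.1, max_shift=0.05, seed=1):
    """Randomly shift a fraction of the truth angles by a small amount
    (in degrees), mimicking a drift in the EPICS encoder readback.
    Returns (shifted angles, boolean mask of which entries were shifted).
    NOTE: fraction and shift range are placeholder values."""
    rng = np.random.default_rng(seed)
    angles = np.asarray(true_angles, dtype=float).copy()
    n_bad = max(1, int(round(fraction * len(angles))))
    bad = rng.choice(len(angles), size=n_bad, replace=False)
    # Shift magnitude bounded away from zero so every flagged entry
    # really differs from its truth value
    shifts = rng.uniform(0.01, max_shift, size=n_bad)
    signs = rng.choice([-1.0, 1.0], size=n_bad)
    angles[bad] += signs * shifts
    mask = np.zeros(len(angles), dtype=bool)
    mask[bad] = True
    return angles, mask
```

A recognizer run over the corresponding images should flag exactly the masked entries as disagreeing with the (shifted) encoder values.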