SRGS 2022


General Info

Zoom Connection Info:

You can connect using https://jlab-org.zoomgov.com (Meeting ID: 160 808 1515):

Join Zoom Meeting
One tap mobile: US: +16692545252,,1608081515# or +16468287666,,1608081515#
Meeting URL: https://jlab-org.zoomgov.com/j/1608081515?pwd=c0pNakJVT0NHVUtSQWFxWlhHN3ZiUT09&from=addon
Meeting ID: 160 808 1515
Passcode: 798688

Join by Telephone
For higher quality, dial a number based on your current location.
Dial:
US: +1 669 254 5252 or +1 646 828 7666 or +1 669 216 1590 or +1 551 285 1373 or 833 568 8864 (Toll Free)
Meeting ID: 160 808 1515
International numbers
Join from an H.323/SIP room system
H.323: 161.199.138.10 (US West) or 161.199.136.10 (US East)
Meeting ID: 160 808 1515
Passcode: 798688
SIP: 1608081515@sip.zoomgov.com
Passcode: 798688


Presentation Schedule

     June 27 - orientation
     June 28 - Will S.
     June 29 - Dhruv B.
     June 30 - Anna R.
     July  1 - Hari G.
     July 4 - Independence Day
     July 5 - Will S.
     July 6 - Colin W.
     July 7 - Anna R.
     July 8 - Dhruv B.
     July 11 - Hari G.
     July 12 - Will S.
     July 13 - Colin W.
     July 14 - Anna R.
     July 15 - Dhruv B.
     July 18 - Hari G.
     July 19 - Will S.
     July 20 - Colin W.
     July 21 - Anna R.
     July 22 - Dhruv B.


Useful Links (General)

PHASM: neural net models of PDE solvers

Students:

  • Dhruv Bejugam
  • Hari Gopal
  • Colin Wolfe

Useful links:



AI Feature Recognition: Extract Spectrometer Angle from Image

Students:

  • Anna Rosner
  • William Savage

Useful links/info:

Initial thoughts from Brad

Brad's initial thoughts on approaching the problem:

   I had been imagining splitting the photos into two regions: one with
   the digits, and a second with the vernier scale. Each region would then
   be evaluated/interpreted separately with its own 'optimized' algorithm
   (a rough sketch of this split appears below).
   
   'Real' errors/discrepancies would be best indicated by scanning for a
   mismatch between MYA and the analysis database record and/or the value
   flagged in the logbook, which has generally been vetted and updated by a
   human. The simplest way to generate 'bad' angles for testing would be to
   (randomly) shift the truth angle by a small amount -- that would be
   indistinguishable from an observed drift in the EPICS encoder system
   (sketched below).
   
   I (or the students) can also look for angle shifts in the 'real' data,
   but that will take some poking around. Such a shift should show up as a
   sharp (small) jump in the MYA value at the moment an offset is changed
   to bring the EPICS value into agreement with the camera readback (see
   the jump-scan sketch below).
   
   One other dataset I could obtain is a movie of the angle changing over
   a range (the movie is just a compilation of frame grabs). The individual
   frames could be pulled out of the mp4 and evaluated one by one across a
   continuously varying range of angles (frame extraction is sketched
   below).
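
A minimal sketch of the region split described in the first paragraph above, using OpenCV (cv2). The pixel boxes here are placeholders and would have to be measured from an actual spectrometer camera frame:

   import cv2

   # Placeholder crop boxes (x, y, width, height) -- these values are made
   # up and must be measured from a real camera image.
   DIGIT_BOX = (100, 50, 300, 120)
   VERNIER_BOX = (100, 200, 300, 80)

   def split_regions(image_path):
       """Crop the digit readout and the vernier scale into separate images."""
       img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
       x, y, w, h = DIGIT_BOX
       digits = img[y:y+h, x:x+w]
       x, y, w, h = VERNIER_BOX
       vernier = img[y:y+h, x:x+w]
       return digits, vernier  # each region then goes to its own algorithm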
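
The 'bad angle' test from the second paragraph amounts to perturbing the truth label. A minimal sketch, assuming a drift scale of a few hundredths of a degree (the realistic scale would come from the EPICS encoder behavior):

   import random

   def make_bad_angle(truth_angle_deg, max_shift_deg=0.05):
       """Shift a truth angle by a small random amount to mimic encoder drift.
       max_shift_deg is an assumed scale, not a measured one."""
       return truth_angle_deg + random.uniform(-max_shift_deg, max_shift_deg)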
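
Scanning the 'real' MYA data for the sharp jump mentioned in the third paragraph can start as a simple threshold on sample-to-sample differences; the threshold value here is a placeholder to be tuned against real data:

   import numpy as np

   def find_offset_jumps(mya_values, threshold_deg=0.02):
       """Return indices where consecutive MYA samples differ by more than
       threshold_deg -- candidate points where an encoder offset was changed."""
       steps = np.abs(np.diff(np.asarray(mya_values)))
       return np.where(steps > threshold_deg)[0] + 1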
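
Pulling the individual frames out of the mp4 described in the last paragraph is straightforward with OpenCV; this sketch writes every step-th frame to a numbered PNG:

   import cv2

   def extract_frames(movie_path, out_pattern="frame_{:05d}.png", step=1):
       """Save every step-th frame of the movie as a PNG; return the count."""
       cap = cv2.VideoCapture(movie_path)
       index = saved = 0
       while True:
           ok, frame = cap.read()
           if not ok:
               break
           if index % step == 0:
               cv2.imwrite(out_pattern.format(index), frame)
               saved += 1
           index += 1
       cap.release()
       return saved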