MODS Overview
MODS1 and MODS2 are UV-to-near-IR Multi-Object Double Spectrographs mounted at the Direct Gregorian focal stations of the LBT. They were designed and built by The Ohio State University Department of Astronomy in Columbus, Ohio.
Each MODS is a low- and medium-resolution spectrograph (R=100-2300) and imager working across the entire 330-1100nm band with a 6x6-arcminute field of view. Multi-object spectroscopy is accomplished using laser-machined focal-plane slit masks fed into the beam by a 24-position mask cassette. A beam selector below the slit carries a dichroic that splits the incoming beam into separate red- and blue-optimized channels (the dichroic cross-over wavelength is 575nm), each with its own collimator, grating, camera, and detector, allowing simultaneous operation across the full 330-1100nm band. The beam selector can also direct light into the red or blue channel alone, providing red-only and blue-only modes that extend wavelength coverage across the dichroic cross-over wavelength as required. The MODS CCDs are custom-built 8Kx3K monolithic e2v CCDs: blue-coated standard silicon on the blue channel and extended-red-coated 40-micron deep-depletion silicon on the red channel.
The following image shows both MODS1 and MODS2 mounted on the LBT (viewed from the back of the telescope).
MODS machines and how to login
The following table lists the main machines associated with the MODS system. The usual login account can be obtained from the shared Google Drive document "Anonymous Accounts" (https://docs.google.com/spreadsheets/d/1fYW8KX5Wf86jkDvDjfyHWt-hsAxljN9WaeykR24IrDY/edit?ts=5c0eaa87#gid=0).
MODS1 Computers:
mods1 | Owns the MODS processes (CentOS Linux)
mods1data | Stores data (FC3 Linux)
blue ccd | DOS machine for the blue-channel CCD
red ccd | DOS machine for the red-channel CCD
MODS2 Computers:
mods2 | Owns the MODS processes (CentOS Linux)
mods2data | Stores data (FC3 Linux)
blue ccd | DOS machine for the blue-channel CCD
red ccd | DOS machine for the red-channel CCD
Others:
APC | Network-connected power management (192.168.52.38 and 192.168.52.39)
Raritan/KVM | Remote access (192.168.139.19)
Note that the APC unit is a network-connected power management unit (similar to the dataprobe units on the LBC). You can log in to the APC unit and then remotely restart or shut down instruments (see the section on the APC PDU, #APC_PDU). The MODS CCD machines run DOS and are usually accessed via the KVM.
MODS Software Overview
The figure below shows the software components and architecture for a single MODS instance (substitute the "#" in the figure with either 1 or 2). Both MODS have identical underlying systems that operate independently. The dashed boxes enclose programs running on particular host computers, with the exception of "LBTO Systems", which is a blanket container for all LBTO subsystems that the MODS data-taking system interacts with.
Interface protocols are shown in the key at the lower-left corner. "Ethernet" is a mix of TCP/IP and UDP/IP depending on the system. Optical fibers are either Fiber Ethernet (the one into the IUB) or dedicated data I/O fibers for the MODS science CCDs and the AGw guide and WFS cameras. "Virtual C&C" is a catch-all indicating "other" virtual command-and-control protocols: for example, the modsTCS and LBT IIF processes communicate via the ZeroC Ice interface, whereas the LBT GCS and the MODS AGw and Az Cam servers communicate using TCP/IP socket interfaces.
The MODS GUI program is shown running on an obsN workstation, but note that this is just a window into the "modsUI" program running on the mods1/mods2 Linux host. The GUI is displayed on the obsN workstation via X tunneling through an ssh session between the observer account on obsN and the relevant mods# host computer.
In binocular
MODS mode, the acqMODS and execMODS programs are run via a wrapper script that makes sure that the appropriate binocular context flag is set, and which permits both sides to be running either the same acquisition or observation script (called the "identical twin mode") or different acq/obs scripts for each
MODS (called the "fraternal twin mode").
ISIS = Integrated Science Instrument Server; see http://www.astronomy.ohio-state.edu/~pogge/Software/ISIS/server/. The ISIS server is the message-passing server that mediates all interprocess communication between the MODS instrument subsystems via the IMPv2.5 messaging protocol. It also records all command-and-control messaging traffic in a runtime communications log stored on this machine.
The following is a list of abbreviated names used in the figure. We do not give details of them in this document, as they relate mainly to hardware.
- IEB: Instrument Electronics Box, mainly for power connection and management.
- IUB: Instrument Utility Box.
- LLB: Lamp/Laser Box.
- HEB: CCD Head Electronics Box.
Software Builds
Currently the only MODS software built by LBTO is the library libagwutils.a used by GCS. In the repository, the top-level makefile (Makefile.build) builds only the API and app directories; the app directory builds the libagwutils.a used by GCS.
Log in to a 64-bit CentOS machine in Tucson (for example, tcs@tcs-test), export/checkout the version number you are building, then make and install.
For instance, for build version 1.2.3:
ssh tcs@tcs-test
cd modsagw
git clone git@github.com:LBTO/mods.git
cd mods/agw
# edit Makefile.build to set the VERSION, SUBLEVEL, and PATCHLEVEL numbers (at the top of the file) to 1.2.3
make
make install    # copies to /lbt/tcs_devel
ssh tcs@tcs1.mountain.lbto.org
cd /lbt/tcs_devel
scp -pr tcs@tcs-test.tucson.lbto.org:/lbt/tcs_devel/mods-1.2.3 .
- Note that GCS links in the static library, so the TCS file .build/environment has to be modified to point at the path of the latest MODS library.
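As a sketch of that edit, the following rewrites whatever /lbt/tcs_devel/mods-<version> path appears in a line. The variable name MODS_LIB in the demo is invented for illustration; the actual keyword in .build/environment may differ.

```shell
# Hedged sketch: bump the MODS library path recorded in the TCS build
# environment file. "MODS_LIB" below is a made-up example variable;
# only the /lbt/tcs_devel/mods-<version> path pattern is from the text.
bump_mods_path() {
    sed 's%/lbt/tcs_devel/mods-[0-9][0-9.]*%/lbt/tcs_devel/mods-1.2.3%' "$@"
}
# Demo on a sample line rather than the real .build/environment file:
echo 'MODS_LIB=/lbt/tcs_devel/mods-1.2.2' | bump_mods_path
```

To apply it in place, you would run the same sed expression with -i on .build/environment after backing the file up.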
Software/System Use and Management
Power Up and Down
Normally the support astronomers will perform the power-up. If you are wondering about the steps, you can follow the instructions at: https://wiki.lbto.org/Instrumentation/MODSStartUp.
If an emergency (or unanticipated) power outage is impending, the automatic UPS script will take care of putting the instrument into a safe configuration and powering off the mechanisms and computers (see https://wiki.lbto.org/Instrumentation/InstrumentsShutdownForPowerOutage). Since this procedure does not home all of the mechanisms, a full reset of the MODS system is required when power is restored: home all mechanisms, then send them to their nominal configuration positions.
If, however, you want to manually power down the MODS system for some reason, follow the instructions at: https://wiki.lbto.org/Instrumentation/MODSSupportUtils#Power_Outage_45_modsShutdown. Note that the scripts and operations mentioned in the instructions are located/performed on any obsN machine (e.g., obs2, obs3, etc.).
Also see Olga's notes on post-shutdown startup: https://wiki.lbto.org/Instrumentation/MODSPostShutdownStartup.
Start/stop status checking scripts
On the obs3 machine (note: it must be obs3), there are scripts that periodically query the MODS status and send out email alerts if status problems are detected. These scripts are located in the directory /home/telescope/bin/. There are two scripts there, emailMODSErrors.sh and emailMODS2Errors.sh, run by the cron daemon. To disable them (e.g., after the MODS have been turned off or removed from the telescope), log in to obs3 as user telescope, run "crontab -e" from the command line, and comment out the lines corresponding to these jobs (by putting "#" in front of each line). To re-enable them, run "crontab -e" and uncomment the lines.
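A scriptable alternative to hand-editing with "crontab -e" is sketched below. The cron schedules shown are invented placeholders; only the script paths come from the text above.

```shell
# Hedged sketch: comment out the MODS alert jobs in a copy of the
# crontab. The schedules below are made-up placeholders; only the
# script paths come from the text. On obs3 you would generate the
# file with "crontab -l" and reinstall it with "crontab <file>".
cron_file=$(mktemp)
cat > "$cron_file" <<'EOF'
0 * * * * /home/telescope/bin/emailMODSErrors.sh
5 * * * * /home/telescope/bin/emailMODS2Errors.sh
EOF
# Prefix "#" to any uncommented line invoking one of the alert scripts.
sed -i.bak 's%^\([^#].*emailMODS2\{0,1\}Errors\.sh\)%# \1%' "$cron_file"
cat "$cron_file"
```

Review the edited file before reinstalling it; the .bak copy preserves the original.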
APC PDU
MODS hardware is connected to PDUs (Power Distribution Units) manufactured by APC. They are similar to the dataprobe units used by the LBC. The PDUs have internal addresses http://192.168.52.38 and http://192.168.52.39 (one for each MODS). You can access these PDUs in two ways:
- Use telnet to log in to a PDU (the user name is mods and the password is the same). You can then issue commands there (much the same as with the dataprobe units used by the LBC).
- You can also use your web browser to log in to the PDUs (use the addresses above, though you need to be on the LBTO intranet). Log in as mods and you will be presented with an interface screen similar to the figure shown below. You can then click to operate the PDU.
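Before attempting a telnet or browser login, it can be worth confirming that the PDUs answer on the network; a minimal sketch:

```shell
# Sketch: verify the two MODS PDUs answer on the intranet before a
# telnet or browser login (addresses from the text above). Outside
# the LBTO network both will report "not reachable".
for pdu in 192.168.52.38 192.168.52.39; do
    if ping -c 1 -W 2 "$pdu" > /dev/null 2>&1; then
        echo "$pdu reachable"
    else
        echo "$pdu not reachable"
    fi
done
```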
All the essential MODS-related processes run on the MODS machines (mods{1,2}, mods{1,2}data, and the DOS CCD control machines). See the notes on power up above (#Power_Up_and_Down) for these processes and services. Regular users and observers will, however, use the observation machines (obs{2,3,4}, for example) to launch GUIs and execute scripts, which communicate with the MODS processes and services on the MODS machines. The following list is a summary of the common commands and scripts.
- The main user interface is launched with mods1 start gui on an obs machine (e.g., obs3). This brings up the MODS control panel GUI. Refer to the OSU MODS manual (http://www.astronomy.ohio-state.edu/MODS/Manuals/MODSManual.pdf).
- On any obs machine, the directory /home/MODSeng/modsScripts/Support/ contains various support scripts that an observer will likely use. The OSU MODS script reference (http://www.astronomy.ohio-state.edu/MODS/Manuals/MODSScripts.pdf) is a detailed manual for everything about MODS scripts (in particular acqMODS and execMODS).
- modsView is a visualization and guide-star selection tool. Refer to http://www.astronomy.ohio-state.edu/MODS/ObsTools/modsView/ for help.
- modsDisplay displays raw data (see the MODS manual for details).
- modsAlign is an interactive mask alignment tool (see the MODS manual for details).
- mods1 status and mods2 status bring up a summary of the status of the major MODS services. You can run the command on an obs machine or on the mods{1,2}data machine. The following screen shows an example of what you'd expect to see.
lbto@obs3:3 % mods1 status
MODS1 Status at 2018 Dec 19 23:24:29
Instrument Server:
Service State Owner
--------------------------------------
IE Running mods (IE = Instrument Electronics Command & Control)
AGw Running mods (Acquisition, Guiding, and Wavefront Sensing)
redIMCS Running mods (IMCS = Image Motion Compensation System)
blueIMCS Running mods
modsenv Running mods (Environmental Monitor)
lbttcs Running mods (TCS IIF Interface)
modsUI Stopped (MODS Control Panel GUI)
--------------------------------------
Data Server:
Service State Owner
--------------------------------------
isis Running mods (Integrated Science Instrument Server)
Red Caliban Running mods (Caliban Data Transfering Service)
Blue Caliban Running mods
--------------------------------------
- isisCmd sends a command string to the ISIS client (detailed description and documentation missing; mainly seen used to power-cycle instruments).
- execISIS executes an ISIS script (detailed description and documentation missing).
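For scripted health checks, the output of mods1 status can be filtered for stopped services. This sketch assumes the State value appears in the second whitespace-delimited column, as in the example listing above.

```shell
# Hedged sketch: report any MODS service whose State column reads
# "Stopped". On an obs machine you would pipe the real output:
#   mods1 status | check_stopped
# Note: two-word service names such as "Red Caliban" would need
# extra handling, since their State lands in the third column.
check_stopped() {
    awk '$2 == "Stopped" {print $1, "is not running"}'
}
# Demo on two sample lines in the same format as the listing above:
printf 'IE Running mods\nmodsUI Stopped\n' | check_stopped
```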
A quick reference for typical night operation of MODS can be found at: https://wiki.lbto.org/PartnerObserving/MODSQuickRefWiki.
Data Management
MODS uses the "Caliban" data handling program developed at OSU to transfer raw FITS images from the CCD control computers to the /lhome/data storage drive on the host machines mods{1,2}data. The OSU team has produced a help page for the "Caliban" program (http://www.astronomy.ohio-state.edu/~prospero/Caliban/).
The data transfer occurs in three steps:
Step 1: CCD Controller to MODS Data Server
A MODS CCD image is first read off the CCD into memory on the CCD control computer (e.g., M1.RC for the MODS1 red channel, M1.BC for the MODS1 blue channel). From memory it is written onto a transfer disk shared between the DOS-based CCD control computer and the MODS data server machine (the Linux workstation mods1data for MODS1, and mods2data for MODS2). Once on the transfer disk, a dedicated instance of the Caliban data-transfer daemon running on the data server copies it from the transfer disk and writes it onto the /lhome/data staging disk in FITS format. Data from the red and blue channels are written to the same data-server staging disk. Once written, the FITS header is checked and augmented with additional archive and engineering keywords, the header is scanned and the running data log in /Logs/ is updated, and the image is ready to be copied to the LBTO data archive.
Copying the raw image from the transfer disk onto the data server's staging disk takes between 4 and 12 seconds, depending on the size of the image (this is a known bottleneck in the system and is on the list of improvements for future hardware upgrades). Post-processing of the image header and logging takes less than a second. This final transfer-and-process step is asynchronous: if a sequence of images is being acquired, the next image in the sequence will be started as soon as the last file is written to the inter-machine transfer disk, and the final transfer-and-process step will occur while the next image starts.
IMPORTANT NOTE: It is at the data-transfer step between the CCD control computer and the MODS data server that the data-transfer queue can stall. The symptom is that the LastFile counter fails to increment and images stop appearing in the modsDisp windows, despite the fact that the NextFile counter is incrementing. If the difference between LastFile and NextFile keeps growing, the data-transfer queue has stalled. If you notice that the transfer queue has stalled, type the command fitsflush in the Command window on the MODS dashboard GUI. This should restart ("flush") the FITS data-transfer queue, and you'll start seeing the LastFile counter increment and images appearing on the disk.
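The counter comparison described above can be expressed as a toy check. The threshold of 2 frames is an arbitrary choice for this illustration, not an official limit, and the GUI counters are not actually readable from the shell.

```shell
# Toy illustration of the stall symptom: NextFile advancing while
# LastFile stands still. Threshold of 2 is arbitrary for this sketch.
check_queue() {
    nextfile=$1
    lastfile=$2
    if [ $((nextfile - lastfile)) -gt 2 ]; then
        echo "queue stalled - issue fitsflush in the dashboard Command window"
    else
        echo "queue ok"
    fi
}
check_queue 42 41   # normal: counters close together
check_queue 42 37   # stalled: gap keeps growing
```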
Step 2: Data Server to the LBTO Archive Staging Disk (/newdata)
After a MODS FITS image on the MODS data server's staging disk has been processed for archiving, it is copied across the network to the LBTO "new data" archive staging disk named, appropriately enough, /newdata. The transfer from the MODS data server to the LBTO /newdata disk usually takes around 1 second for unbinned 8K×3K images. Data written to /newdata are read-only, but may be copied onto the observing workstation disks for further analysis. The /newdata disk is where new MODS images first become available to observers on the observing workstations. This disk is where modsDisp watches for new raw images to display, and where modsAlign gets thru-slit and field images for mask alignment.
Step 3: /newdata to LBTO archive (/Repository and beyond)
Each image that appears on /newdata is immediately ingested by the LBTO data archive software, triggering a sequence of events that typically takes no more than a couple of minutes to complete. These steps include (in approximately this order):
- Image FITS header keywords are read and the archive database is updated.
- The image is copied to /Repository/ for access on subsequent days.
- The image is gzip-compressed and filed on the /archive disk (no user access).
- The image is copied to the Tucson archive machine.
- The Tucson archive repeats steps 1 through 4 above.
- The Tucson archive sends copies as needed to the Germany & Italy archives.
Within a few minutes, multiple copies of each image are distributed across a network of RAID6 data arrays in the observatory archive machines. The images stay on the /newdata disk until noon the following day, when they are deleted to make room for the next night's data. Images copied to the /Repository disk are available (read-only) for months, stored in subdirectories organized by date. For example, data from UTC 2018 Dec 24 are stored in /Repository/20181224/. Guider and WFS images taken during that same night are stored in a GCS subdirectory of this same folder. The /Repository disk is kept organized by the archive software; older data are automatically deleted to make room for the newest data. Both /newdata and /Repository are available (read-only) to observers logged into the observing workstations at the LBT.
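The date-based layout can be summarized in a small helper; the path scheme follows the example above.

```shell
# Sketch: build the date-based /Repository paths described above.
# Layout: /Repository/YYYYMMDD/ with guider/WFS frames in GCS/.
repo_path() {
    echo "/Repository/$1"
}
repo_path 20181224                 # data from UTC 2018 Dec 24
echo "$(repo_path 20181224)/GCS"   # guider and WFS images from that night
```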
The PARTNER, PROPID, and PI_NAME FITS header keywords are used to assign ownership of the data, primarily via the PARTNER keyword. The PARTNER is defined by the observatory to be one or more of these reserved values:
Partner | Code
LBT Observatory Staff | LBTO
LBTB (Germany) Partner Observing | LBTB
INAF (Italy) Partner Observing | INAF
Ohio State & Research Corp Partners | OSURC
Arizona Partner Observing | AZ
These codes are used for regular science operations. Additional PARTNER IDs (e.g., COMMISSIONING and CALIBRATION) are used for technical observing and special applications. The PROPID and PI_NAME are used differently by different partners and are user-definable (or at least defined within a partner group; for example, the OSURC partner block has internal conventions for how to assign PROPID and PI_NAME values for its observing queue). If multiple partners are sharing data, PARTNER can be a comma-separated list, for example PARTNER AZ,LBTB with no spaces before or after the comma. This allows observers from either AZ or LBTB to access the data from the archive. Note that the CALIBRATION partner ID is a special flag that should be used for all calibration data (biases, flats, comparison lamps, etc.); it allows all partners access to calibration images.
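The "no spaces" rule for multi-partner lists is easy to sanity-check in a script; a toy sketch:

```shell
# Toy check of the comma-list convention: multiple partners are
# comma-separated with no spaces before or after the comma.
valid_partner() {
    case "$1" in
        *" "*) echo "invalid: PARTNER must not contain spaces" ;;
        *)     echo "ok" ;;
    esac
}
valid_partner "AZ,LBTB"    # ok
valid_partner "AZ, LBTB"   # invalid
```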
Log Files
Log | Machine | Description
/home/mods/Logs/mmc.log | mods{1,2} | communication between devices
/home/mods/Logs/agw.log | mods{1,2} | all communication between the AGw interface and the LBT GCS instance
/home/mods/Logs/Env/mods1.YYYYMMDD.log, /home/mods/Logs/Env/mods2.YYYYMMDD.log | mods{1,2} | environmental sensor data
/Logs/Caliban/cb_blue.log, /Logs/Caliban/cb_red.log | mods{1,2}data | Caliban data-transfer logs
/Logs/Data/YYYYMMDD.log | mods{1,2}data | image header data
/Logs/Env | mods{1,2}data | old MODS1 env logs
/Logs/CCD/ | mods{1,2}data | CCD environmental monitor logs
/Logs/ISIS/isis.YYYYMMDD | mods{1,2}data | communication between devices (as routed through ISIS)
Config Files
Most of the config files are located on the
mods1
and
mods2
machine under the directory
/home/mods/Config/
. The config files for mods data server are located under the directory
/lhome/dts/Config/
on machine
mods1data
and
mods2data
("dts" is a user that stands for "data-taking system").
Troubleshooting
So far, most of the troubleshooting procedures are intended for the support astronomers, operators, and observers. Refer to:
As usual, many of the first steps are to try to power-cycle the various instruments (via the isisCmd command line).