Telemetry Data Visualization

Link to the visualization page:

Link to graph_local:



In 2013, the LBTO telemetry data from the telescope control system moved to HDF5 files. Although HDF5 files are easily processed with a variety of tools, there are many "standard" operations graphs that should be available to everyone on a daily basis. This data is valuable, but hardly used; we would like web-based access to the LBTO telemetry data with easy graphing capabilities.

In 2015, work was done to lay the foundation for a user-friendly (web-based) graphical presentation of HDF5 telemetry using Dygraphs. In 2016, additional work proceeded to refine the capabilities: merging of two data streams, time paring, scatter plots. In 2017: populating specific graphs of interest, and transitioning some other logging output (syslog and component logs) to HDF5 telemetry for use by the web-based presentation tools as needed. The goal of the effort is to provide user-friendly access to a wide variety of telemetry products in a web-based graphical format.

Use Cases

  1. Daily plot of last night's standard parameters for observing performance
  2. Plot a single telemetry stream for a given day
  3. Plot multiple days of a single telemetry stream
  4. Plot selectable columns of a telemetry stream
  5. Plot combinations of telemetry streams
  6. Plot any time-series CSV with column headers

Should it just be a Quick Look tool?

The next level is much more complicated and will require more collapsing of data - it cannot be done on a whole stream. It will also require more sophisticated column extraction and merging of streams.
  • need to implement the column dictionary to allow selection of columns from multiple streams
  • regression lines should be limited to a graph with only 1 to 4 columns
  • multi-stream data merging should only be done on specific columns of streams - the TCS Merge Logs Tool will likely help with that
    Note: the Merge Logs Tool wasn't the right thing - it just interleaves records, a sort by time rather than a merge of data
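To make the distinction concrete, a column-wise merge on specific columns might look like this minimal NumPy sketch (the function and array names are invented for illustration, not part of the TCS tools):

```python
import numpy as np

def merge_nearest(t1, v1, t2, v2):
    """For each sample of stream 1, take the stream-2 value with the
    nearest earlier timestamp - a true data merge, as opposed to the
    sort-by-time interleaving that the Merge Logs Tool does."""
    idx = np.searchsorted(t2, t1, side='right') - 1
    idx = np.clip(idx, 0, len(t2) - 1)
    # columns: time, stream-1 value, matched stream-2 value
    return np.column_stack((t1, v1, v2[idx]))
```

For example, merging stream-1 samples at t=5 and t=15 against a stream 2 sampled at t=0, 10, 20 pairs each row with the stream-2 sample at or before it.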

Description of our HDF5 Configuration

All files contain two fields: timestamp in MJD nanoseconds and a nanoseconds-between-MJD-and-TAI field.
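Converting those MJD-nanosecond timestamps to UTC can be sketched as follows (a minimal example that ignores the TAI-offset field; the function name is ours, not the telemetry library's). MJD 40587 is 1970-01-01, the Unix epoch:

```python
from datetime import datetime, timezone

# seconds from the MJD epoch (1858-11-17) to the Unix epoch (1970-01-01)
MJD_UNIX_OFFSET_S = 40587 * 86400

def mjd_ns_to_utc(mjd_ns):
    """Convert nanoseconds since the MJD epoch to a UTC datetime."""
    unix_s = mjd_ns / 1e9 - MJD_UNIX_OFFSET_S
    return datetime.fromtimestamp(unix_s, tz=timezone.utc)
```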

Files are stored in a directory structure based on subsystem/stream/date.
This has required the interface software (telemToCSV function) to know the details of our directory structure. The web side and the low-level h5csv process don't know anything about our directory structure.

Streams can be modified at any time by the source subsystem/instrument code, so we will require tools to mine the telemetry streams for fields/units/descriptions.
In the short term we've been able to ignore this by accessing fields of a stream by name rather than by position in the file (i.e., column number).
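By-name field access with h5py looks like this (a self-contained sketch; the stream and field names here are invented stand-ins):

```python
import h5py
import numpy as np

# build a tiny stand-in stream file with a compound dtype
dt = np.dtype([('time_stamp', '<i8'), ('temperature', '<f4')])
data = np.zeros(3, dtype=dt)
data['temperature'] = [10.5, 10.6, 10.4]
with h5py.File('demo_stream.h5', 'w') as f:
    f.create_dataset('lbt_weather', data=data)

# read a column by field name rather than position - robust when a
# stream gains or reorders columns over time
with h5py.File('demo_stream.h5', 'r') as f:
    temps = f['lbt_weather']['temperature'][:]
```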

Our data is written at fairly high rates, in large streams (for example, up to 400 fields in a stream at 20Hz).
The operations stream created by the DDS subsystem has given us a way to decimate high-frequency, important data to 1Hz. Its contents are a conglomeration of multiple streams.
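The same kind of decimation can be sketched in NumPy - keep the first sample in each whole second (the function name is ours; timestamps are assumed to be in MJD nanoseconds as described above):

```python
import numpy as np

def decimate_to_1hz(t_ns, values):
    """Keep one sample per second: the first sample in each whole-second bin."""
    seconds = t_ns // 10**9
    # True wherever the integer second changes; always keep the first sample
    keep = np.concatenate(([True], seconds[1:] != seconds[:-1]))
    return t_ns[keep], values[keep]
```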


The diagram at the top of the page represents the implementation.

Initial Design Thoughts -- historical notes.

Front Page Prototype March 2016 (2 stream correlation, field selection, time paring)



Four functions:

  1. User only cares about "telescope parameter" to graph and a timeframe. This should be presented as a form to fill in on the web.
    (Along with canned graphs we always want to see.)
  2. Program (or dictionary) needs to take that info and come up with stream/field combinations and files that map to the timeframe
  3. (Same?) Program converts the stream/field combinations from the files into a single CSV file - or returns an error string.
  4. Web graph interface only has to know about CSV files.

This keeps the interfaces clean to allow almost any graphing interface on the web side since the interface is strictly a CSV file.
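The front half of steps 2 and 3 - mapping a parameter and timeframe to stream files under the subsystem/stream/date layout described above - could be sketched like this (the dictionary entries, field names, and paths are invented for illustration):

```python
import glob

# hypothetical dictionary from user-facing parameter to (subsystem, stream, field)
PARAM_MAP = {
    'temperature': ('env', 'lbt_weather', 'temperature'),
    'seeing':      ('gcs', 'guiding', 'fwhm'),
}

def files_for(param, dates, teldir='/lbt/telemetry'):
    """Turn a parameter plus a list of 'YYYY/MM/DD' dates into the field
    name and the HDF5 files that cover the timeframe."""
    subsys, stream, field = PARAM_MAP[param]
    files = []
    for d in dates:
        files.extend(glob.glob('%s/%s/%s/%s/*.h5' % (teldir, subsys, stream, d)))
    return field, sorted(files)
```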


Application Issues

  • synchronization issues between web, server, php, executable that creates CSV file
  • browser memory limitations on some machines
  • what makes sense to live where between web, server, Python, JavaScript, C/C++ - the CSV generation, time conversion from microseconds MJD to UTC
  • if we use a CSV file directly for the graphing, instead of building the CSV file with python - do we have to read the file twice into the server to get the header row and then the data?
  • cleanup issues - /tmp for CSV files
  • precision issues with the CSV creation - using %.9g but str is not using that -- check again!
  • The dygraphs time class interprets the unix timestamp nicely for the graph; however, it uses the local machine's locale (not the webserver's) to interpret the time - so if you use a browser with MST, it converts the time to MST.
  • We will be limited with this tool for how much data it can process. When we use multiple days for the 20+ columns of data, it gets bogged down. Maybe this tool is a "Quick Look" tool as opposed to an analysis tool? Maybe column compression will help this - it's not bad to get a month's worth of say 3 columns. But that brings up the issue of streams changing. Some of these streams have changed over time and the columns are not the same - how do we handle that kind of column compression?
  • nested compound structures are handled by using the structure name with underscore to flatten it to a CSV file - this is what the table view does in HDFView - but it uses an arrow instead of underscore.
  • Checkboxes for subsystems didn't work in the initial prototype for Safari, Chrome, IE. Turns out to be a caching issue - Firefox automatically caches the state for you, but the other browsers don't - had to implement cookies
  • time is a major issue -- if we use a human-readable time, we can only get seconds precision (dygraphs wants a timestamp like "YYYY/MM/DD HH:MM:SS") - looks nice (see below), but if we merge multiple dates of data, which timestamp should we use? And how can we get better than seconds resolution?
    graph using simple dygraphs with "YYYY/MM/DD .." timestamp:
  • Allowing a generic graphing capability requires an upload of any CSV file. This is a security issue allowing uploads to your server. Solved by moving the read and graphing of non-telemetry data to the browser side - using Papa Parse parser, can do it all on the browser side - no uploads to the server.
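The underscore flattening of nested compound structures mentioned above can be sketched against a NumPy dtype (the `motor` structure below is invented; per the notes, HDFView's table view does the same thing but with an arrow instead of an underscore):

```python
import numpy as np

def flatten_names(dtype, prefix=''):
    """Recursively flatten nested compound field names with underscores
    to produce flat CSV column headers."""
    names = []
    for name in dtype.names:
        sub = dtype[name]
        if sub.names:  # nested compound type - recurse with prefix
            names.extend(flatten_names(sub, prefix + name + '_'))
        else:
            names.append(prefix + name)
    return names
```

For a dtype with a nested `motor` compound holding `pos` and `vel`, this yields columns `time_stamp`, `motor_pos`, `motor_vel`.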

Data Issues

  • SMT weather data from ENV is written even when we know it's out of date, we're even saving the flag that says it's bad
  • LBT weather data is more complicated - there are different flags for different data - but we definitely shouldn't write anything if all the data is bad
  • PCS is writing some initialization data to the guiding stream? sometimes that's all you get - why do we need this bogus data?
  • Too many columns in this data for quick-look visualization, for instance:
       columns  stream
       122      PMC actuator_NNN_NNN
       410      PMC thermal
       237      mcspu az and el servo_data
       107      PSF primary collimation
       116      PSF secondary collimation
       382      PSF wavefront
  • GCS has mostly "none" for units, but obviously some of these have units: imgtimeout, imgheight, imgweight, relhotspotx, relhotspoty
  • Many streams are written at much higher rates than required for visualization. Our tools should cull the data down to reasonable sizes for visualization.

Notes from pushing the tool's limits:
  • A month's worth of weather data gave an error in the console window: "allocation size overflow" (dygrap...:3:25954)
  • June 1-15 of lbt_weather gave: "CSV is out of order; order it correctly to speed loading" (dygraph...:3:21539)
  • uncaught exception: out of memory
  • Even with just June 1-10, I get the complaint about "out of order"
  • h5csv takes 100% of CPU during generation
  • Just 4 days of weather data caused my Firefox to crash - but I did it again and it worked

Front Page Prototype Aug-2015 (single stream, no field selection)

Initially, we support a single-stream plot for any chosen day - not yet dealing with fields/columns within the streams, or with multiple streams selected. When you select a subsystem and left/right/both, the stream names are populated. The web page allows selection of a start and end date. When a stream is selected, the "Graph" button sends the info to the CSV generation program, and a CSV file is generated for that single stream (multiple days are supported if you select an End Date).


Selecting a date and clicking Graph gives you an empty graph with all the available columns of the stream to select from:


Selecting temperature and dewpoint, and "points" at the bottom of the page, graphs like this:


Web Prototype July-2015

Initial prototype using a Python direct interface to create CSV files
(also using XML, but I'm not sure how).
It allows you to see all the fields of the stream on the left and choose any (or all) to graph.

  • Seeing data graph - version 1:

The CSV generation time was prohibitive - Stephen ran standalone versions of the Python (44 secs) and the C version (3 secs) for a standard weather file.

Building the graph of even a single column also took a long time.

    import os
    import h5py

    # _teldir and telemetry_time() come from elsewhere in the prototype
    f = h5py.File(_teldir + os.environ['PATH_INFO'])
    for name in f:
      dset = f[name]
      print('<' + name + ' file="' + os.environ['PATH_INFO'] + '">')

      # figure out what we are doing with the column headers
      i = 0
      timestamp = False
      timestamp_idx = 0
      units = {}

      for field in dset.dtype.names:
        if field == 'time_stamp':
          timestamp = True
          timestamp_idx = i

        print('<' + field, end = '')

        # print out the units if we have them
        if 'Units' in dset.attrs:
          print(' units="' + dset.attrs['Units'][i].decode('UTF-8') + '"', end = '')
        print('>')
        i += 1

      # print out the timestamp if we have one
      if timestamp:
        ts = dset[0][timestamp_idx]
        timeval = telemetry_time(ts)

graph.js :
function draw_graph(name) {
  // collect the checked column checkboxes (the selector was left blank in
  // the original notes - this is a guess at what it looked like)
  var checkboxes = document.querySelectorAll('#form_' + name + ' input:checked');
  var columns = new Array();
  columns.push(name + '.time_stamp');
  for (var i = 0; i < checkboxes.length; i++)
    columns.push(name + '.' + checkboxes[i].value);
  graph = document.getElementById('graph_' + name);
  csv =
    location.pathname.replace('/graph', '/csv') + '?columns=' + columns.join();
  new Dygraph(graph, csv, { drawPoints: true, strokeWidth: 0.0 });
}

CSV Generation

Revisited the h5dump version of h5csv that we had created a couple of years ago. This has been modified to allow specified columns and is quicker than the python version Stephen initially used.

Still to do:
  • units should be part of the CSV file so that the web side can yank them - it works for a full stream, but not for individual column selections

See HDF5 CSV and TelemToCSV Tool wiki pages for more notes.

Attachments:
  • 2StreamFrontPage.png (48 K, 30 Mar 2016) - 2 stream correlation front page prototype
  • DIMMGraphExample1.png (40 K, 15 Jul 2015) - Seeing data graph - version 1
  • DyGraphs20150825.png (31 K, 25 Aug 2015) - Telemetry Viz main page
  • IMG_20150717_084357243.jpg (2 MB, 20 Jul 2015) - Custom telemetry graph page
  • IMG_20150717_092057824.jpg (2 MB, 20 Jul 2015) - Notional telemetry entry page, version 1
  • IMG_20150721_084649827.jpg (2 MB, 21 Jul 2015) - Updated notional telemetry main page
  • LBTWeather-TempDpt.png (69 K, 25 Aug 2015) - lbt_weather stream graph - temp and dewpoint graphed
  • LBTWeatherGraph-empty.png (26 K, 25 Aug 2015) - lbt_weather stream graph - empty
  • TelemVisGraphTest.png (41 K, 07 Aug 2015) - graph using simple dygraphs with "YYYY/MM/DD .." timestamp
  • TelemetryDictionary.csv (59 K, 21 Jul 2015) - first cut at telemetry dictionary as CSV file
  • TelemetryVisualization-vsn2.png (30 K, 06 Aug 2015) - Telemetry Viz main page, version 1
  • VizProcessDiagram-Color.png (1 MB, 19 Jan 2017) - Visualization Process Diagram
Topic revision: r29 - 19 Jan 2017, KelleeSummers