
Data Art ⁄ Assignment 3

A friend of mine is a lighting artist. Last week I had the chance to go see one of his works in person:

It consists of three suspended rings with moving light mechanisms projecting out from the insides of the rings. To me this piece was the epitome of the “technological sublime,” which Lev Manovich’s contemporary Toby Miller describes as the merging of the sublime and the beautiful:

“the sublime (the awesome, the ineffable, the uncontrollable, the powerful) and the beautiful (the approachable, the attractive, the pliant, the soothing) are generally regarded as opposites. The unique quality of electronic technology has been its ability to combine them.” [1]

A smoke machine was pumping a steady stream of fog into the room so that the lights would project visible rays across the space. I began considering how I might apply this use of light in 3d space to a visualization I’d begun working on of the North American Aerospace Defense Command’s (NORAD) satellite data, which displays each satellite’s apogee and perigee around a circle, with visual similarities to the light installation:

This visual also incorporates an interface to let the viewer explore different aspects of the dataset:

Similar to the Museum of Natural History’s The Known Universe visual tour of space, my intent with this visualization has simply been to enable exploration of the vast physical scale the data describes. Of course there are countless other visualizations built on this same data. I think that’s because it’s a quintessential example of a dataset describing something beyond our capacity to see, while also describing the physicality of a system that happens to be easy to visualize, thanks to its near-perfect geometries and the abundance of data describing it. But satellites are also loaded with symbolism for modern-day communication issues around surveillance, privacy, and censorship: they are essentially cameras and communication devices looking down at us from high in the sky, far beyond our reach and control. They can see us; we can’t see them.

Bringing this visualization into tangible space could help make this spatial understanding a little more accurate.

How will this be different from slapping on a Google Cardboard with Google Earth, or going to the planetarium?

1 ) Looking at the data again

The NORAD satellite data is recorded in the form of a two-line element set (TLE):

1    16U 58002A   17102.55134124 -.00000137 +00000-0 -16915-3 0  9996
2    16 034.2628 269.3658 2028494 018.9069 347.6942 10.48649171336856
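
Those two lines encode the orbital elements needed to propagate the satellite’s position with the standard SGP4 model. A minimal sketch of how I might do that, using the third-party sgp4 Python package (my tooling choice, not something prescribed by the data source):

# Sketch: propagate the TLE shown above with the sgp4 package (pip install sgp4).
# Position comes back in km (TEME frame), velocity in km/s.
from sgp4.api import Satrec, jday

line1 = "1    16U 58002A   17102.55134124 -.00000137 +00000-0 -16915-3 0  9996"
line2 = "2    16 034.2628 269.3658 2028494 018.9069 347.6942 10.48649171336856"

satellite = Satrec.twoline2rv(line1, line2)

# Julian date for this element set's epoch, 2017-04-12 13:13:55 UTC.
jd, fr = jday(2017, 4, 12, 13, 13, 55)

err, position, velocity = satellite.sgp4(jd, fr)
print(err, position)  # err == 0 means the propagation succeeded

That returned position is what a geographically accurate 3d view mode would ultimately need in order to place each point of light.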

space-track.org is a free, publicly accessible API that can serve this data as JSON:

{
    "ORDINAL": "1",
    "COMMENT": "GENERATED VIA SPACETRACK.ORG API",
    "ORIGINATOR": "JSPOC",
    "NORAD_CAT_ID": "16",
    "OBJECT_NAME": "VANGUARD R/B",
    "OBJECT_TYPE": "ROCKET BODY",
    "CLASSIFICATION_TYPE": "U",
    "INTLDES": "58002A",
    "EPOCH": "2017-04-12 13:13:55",
    "EPOCH_MICROSECONDS": "883136",
    "MEAN_MOTION": "10.48649171",
    "ECCENTRICITY": "0.2028494",
    "INCLINATION": "34.2628",
    "RA_OF_ASC_NODE": "269.3658",
    "ARG_OF_PERICENTER": "18.9069",
    "MEAN_ANOMALY": "347.6942",
    "EPHEMERIS_TYPE": "0",
    "ELEMENT_SET_NO": "999",
    "REV_AT_EPOCH": "33685",
    "BSTAR": "-0.00016915",
    "MEAN_MOTION_DOT": "-1.37e-06",
    "MEAN_MOTION_DDOT": "0",
    "FILE": "2165150",
    "TLE_LINE0": "0 VANGUARD R/B",
    "TLE_LINE1": "1    16U 58002A   17102.55134124 -.00000137 +00000-0 -16915-3 0  9996",
    "TLE_LINE2": "2    16 034.2628 269.3658 2028494 018.9069 347.6942 10.48649171336856",
    "OBJECT_ID": "1958-002A",
    "OBJECT_NUMBER": "16",
    "SEMIMAJOR_AXIS": "8816.882",
    "PERIOD": "137.319",
    "APOGEE": "4227.246",
    "PERIGEE": "650.247"
  }
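
The record also ships with derived fields, and they can be reproduced from the raw elements: MEAN_MOTION and ECCENTRICITY determine SEMIMAJOR_AXIS, APOGEE, and PERIGEE. Below is a sketch of both the fetch and the check; it assumes the requests library, a (free) space-track account, and my reading of space-track’s REST-style query paths, so treat the exact URL as an assumption rather than gospel:

# Sketch only: the login fields and query path reflect my understanding of the
# space-track API and should be double-checked against its documentation.
import math
import requests

BASE = "https://www.space-track.org"
session = requests.Session()
session.post(BASE + "/ajaxauth/login",
             data={"identity": "you@example.com", "password": "********"})

# Latest element set for NORAD catalog number 16 (the Vanguard rocket body above).
records = session.get(
    BASE + "/basicspacedata/query/class/tle_latest/"
           "ORDINAL/1/NORAD_CAT_ID/16/format/json").json()
sat = records[0]

# Derived fields: mean motion n (rev/day -> rad/s) gives the semi-major axis
# via a = (mu / n^2)^(1/3); apogee and perigee are the ellipse extremes minus
# Earth's equatorial radius.
MU = 398600.4418      # km^3/s^2, Earth's gravitational parameter
R_EARTH = 6378.135    # km, equatorial radius used by NORAD

n = float(sat["MEAN_MOTION"]) * 2 * math.pi / 86400.0
e = float(sat["ECCENTRICITY"])
a = (MU / n ** 2) ** (1.0 / 3.0)    # ~8816.88 km, matches SEMIMAJOR_AXIS
apogee = a * (1 + e) - R_EARTH      # ~4227.2 km, matches APOGEE
perigee = a * (1 - e) - R_EARTH     # ~650.2 km, matches PERIGEE
print(round(a, 3), round(apogee, 3), round(perigee, 3))

Those last two numbers, apogee and perigee, are exactly what the circular visualization above plots for each satellite.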

2 ) Mapping possible interactions

  • Zoom (orbit regime; see the code sketch after this list)
    • HEO
    • MEO
    • LEO
  • View mode
    • 2d (side-by-side comparison)
    • 3d (geographically accurate)
      • Panoptic view (entire visualization and Earth scaled down in the space)
      • “Real” location view (show satellites positioned directly above viewer in real-time)
  • Sort
    • Chronologically (launch date)
    • OBJECT_NAME (alphabetically)
    • Apogee altitude
    • Perigee altitude
  • Filter
    • Function
      • Military / surveillance
      • Communication
      • Navigation
      • Science
      • Engineering
      • Unknown / junk
    • Time period (of launch)
      • Decade
      • Year
      • OBJECT_ID numerical value (e.g. 1999-002ABC)
      • OBJECT_ID letter value (e.g. 1999-002ABC)
    • Object type
      • Rocket body (junk?)
      • Debris
      • Payload (carrying humans or cargo)
    • Source / ownership
      • https://celestrak.com/satcat/sources.asp
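
Most of these interactions map onto fields already present in the space-track records. Here is a rough sketch of the data side, assuming a list of dicts shaped like the JSON object above; the orbit-regime thresholds and the grouping logic are my own placeholders, not values from the dataset:

def orbit_regime(sat):
    # Bucket by apogee altitude (km); thresholds are rough conventional cutoffs
    # (my assumption), with anything above geostationary altitude treated as HEO.
    apogee = float(sat["APOGEE"])
    if apogee < 2000:
        return "LEO"
    if apogee < 35786:
        return "MEO"
    return "HEO"

def launch_year(sat):
    # OBJECT_ID is a COSPAR designator (e.g. 1958-002A); its year prefix
    # stands in for launch date here.
    return int(sat["OBJECT_ID"].split("-")[0])

def filtered(sats, object_type=None, decade=None):
    # object_type is one of the OBJECT_TYPE values ("ROCKET BODY", "DEBRIS",
    # "PAYLOAD"); decade is e.g. 1950 for the 1950s.
    for sat in sats:
        if object_type and sat["OBJECT_TYPE"] != object_type:
            continue
        if decade and launch_year(sat) // 10 * 10 != decade:
            continue
        yield sat

# Example: 1950s rocket bodies sorted by apogee altitude.
# `catalog` would be the full list of records fetched from space-track.
# selection = sorted(filtered(catalog, "ROCKET BODY", 1950),
#                    key=lambda s: float(s["APOGEE"]))

Sorting by OBJECT_NAME, launch date, or perigee altitude is the same pattern with a different key function.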

3 ) Mapping out the space

  • What kind of space?
    • Enclosed and climate-controlled (to contain the fog and keep the room properly dark)
    • Publicly accessible and able to hold multiple viewers, but with only one user interacting at any given moment
  • How will users interact?
    • Gesture?
    • Portable controller?
    • Stationary controller?
    • Smartphone app?
  • What will the light/projection mechanisms look like, how will they fit in the space?

[1] http://www.tobymiller.org/images/techenviro/OurDirtyLoveAffair.pdf
