Posted on

Live Web ⁄ Class 1 ⁄ Self Portrait

[Live url here]

My self portrait consists of a series of videos that together compose an image of my head and collectively follow the cursor as it moves left and right across the window. To achieve this, I first recorded a roughly 25-second video of myself turning from one profile position to the other. After recording, I split the video vertically into 10 even pieces and added them to the DOM as 10 separate `<video>` elements. Then I used JavaScript to map the length of the video in seconds to the width of the window in pixels, and had each video fast-forward or rewind to the time that corresponds to the current mouse position whenever mouse movement is detected.

I split the portrait into several pieces so that I could play around with the variables that determine the number of frames and the time each video transition would take to fast-forward/rewind from one point to another. I like the resulting jumpy effect and the way my face gets abstracted at certain moments and then slowly pieces back together.

The html/css…

<!DOCTYPE html>
<html>
	<head>
		<style>
			video {
				object-fit: fill;
			}
			#portrait video {
				display: inline;
				position: relative;
				float: left;
			}
			#portrait {
				height: 800px;
			}
		</style>
		<script type="text/javascript" src="main.js"></script>
	</head>
	<body>
		<br>
		<br>
		<div id="portrait">
			<video id="vid11" src="vid/Comp7_1.mp4" width="9%" height="100%" loop></video>
			<video id="vid12" src="vid/Comp7_2.mp4" width="9%" height="100%" loop></video>
			<video id="vid13" src="vid/Comp7_3.mp4" width="9%" height="100%" loop></video>
			<video id="vid14" src="vid/Comp7_4.mp4" width="9%" height="100%" loop></video>
			<video id="vid15" src="vid/Comp7_5.mp4" width="9%" height="100%" loop></video>
			<video id="vid16" src="vid/Comp7_6.mp4" width="9%" height="100%" loop></video>
			<video id="vid17" src="vid/Comp7_7.mp4" width="9%" height="100%" loop></video>
			<video id="vid18" src="vid/Comp7_8.mp4" width="9%" height="100%" loop></video>
			<video id="vid19" src="vid/Comp7_9.mp4" width="9%" height="100%" loop></video>
			<video id="vid20" src="vid/Comp7_10.mp4" width="9%" height="100%" loop></video>
		</div>
	</body>
</html>

…and the JS.

//src:
//[1] https://stackoverflow.com/a/7790764 - capturing mouse pos
//[2] https://stackoverflow.com/a/10756409 - range conversion
//[3] https://stackoverflow.com/a/36731430 - FF/RW video to given time

function init() { //all js that needs to happen after page has loaded
    document.onmousemove = handleMouseMove; //[1]

    function handleMouseMove(event) {
        var eventDoc, doc, body;

        event = event || window.event; //IE

        // If pageX/Y aren't available and clientX/Y are,
        // calculate pageX/Y - logic taken from jQuery.
        // (For old IE)
        if (event.pageX == null && event.clientX != null) {
            eventDoc = (event.target && event.target.ownerDocument) || document;
            doc = eventDoc.documentElement;
            body = eventDoc.body;

            event.pageX = event.clientX +
                (doc && doc.scrollLeft || body && body.scrollLeft || 0) -
                (doc && doc.clientLeft || body && body.clientLeft || 0);
            event.pageY = event.clientY +
                (doc && doc.scrollTop || body && body.scrollTop || 0) -
                (doc && doc.clientTop || body && body.clientTop || 0);
        }

        var width, vid_length, vid_timeToSkipTo;
        width = window.innerWidth;
        vid_length = 25; // total length of the source video, in seconds
        vid_timeToSkipTo = convertToRange(event.pageX, [0, width], [0, vid_length]);
        goToTime(Math.floor(vid_timeToSkipTo));
    }
}

window.addEventListener('load', init);

function convertToRange(value, srcRange, dstRange) { //[2]
    // if value is outside the source range, return NaN
    if (value < srcRange[0] || value > srcRange[1]) {
        return NaN;
    }
    var srcMax = srcRange[1] - srcRange[0],
        dstMax = dstRange[1] - dstRange[0],
        adjValue = value - srcRange[0];

    return (adjValue * dstMax / srcMax) + dstRange[0];

}

function goToTime(time) { //[3]
    // Milliseconds between frames in fast-forward/rewind, per video (vid11–vid20).
    // The values ramp up toward the middle strips and back down, which staggers
    // the seek speeds and produces the jumpy, abstracted effect.
    var frms = [100, 150, 200, 250, 300, 350, 300, 250, 200, 150];
    var ticks = 10; // number of frames during fast-forward/rewind

    for (let v = 0; v < frms.length; v++) {
        const vid = document.getElementById('vid' + (11 + v));
        const startTime = vid.currentTime;
        const tdelta = (time - startTime) / ticks; // seconds to move per frame
        // fast-forward/rewind the video to the end time, one seek per interval
        for (let i = 0; i < ticks; ++i) {
            setTimeout(function() {
                vid.currentTime = startTime + tdelta * i;
            }, i * frms[v]);
        }
    }
}

The code is pretty sloppy and could probably be optimized a lot given more time. The videos perform horribly in Firefox and Chrome, while working fine in Safari. In general, playing 10 videos at once causes average computers to heat up at an unacceptable rate and would probably eventually crash the browser. A better approach would probably be to use a single video file that is duplicated and cropped as necessary.
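One way that single-video idea might look (this is my own sketch, not something I built): decode one hidden `<video>` and copy a different vertical strip of each frame onto ten canvases, so only one decode runs at a time. The element ids and function names here are assumptions.

```javascript
// Sketch: one decoded <video>, ten canvas "slices" (ids are hypothetical).

// Pure helper: source rectangle for strip i of n equal vertical slices.
function sliceRect(i, n, videoWidth, videoHeight) {
    var sw = videoWidth / n;
    return { sx: i * sw, sy: 0, sw: sw, sh: videoHeight };
}

function startSlices(video, canvases) {
    var ctxs = canvases.map(function(c) { return c.getContext('2d'); });
    (function draw() {
        ctxs.forEach(function(ctx, i) {
            var r = sliceRect(i, ctxs.length, video.videoWidth, video.videoHeight);
            // copy strip i of the current frame onto canvas i
            ctx.drawImage(video, r.sx, r.sy, r.sw, r.sh,
                          0, 0, ctx.canvas.width, ctx.canvas.height);
        });
        requestAnimationFrame(draw);
    })();
}
```

Seeking would then only touch the one source video's `currentTime`, and the staggered per-strip timing could be faked by delaying each canvas's draw instead.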


Understanding Networks ⁄ Class 1 ⁄ Ball Drop Game

The controller I built consists of a two-axis joystick (`l`, `r`, `u`, `d`) and two buttons (one for `x`, and one that prints `in=andrew\n` on the first press and just `i` on every subsequent press). I was unable to implement a WiFi101 or Ethernet shield, so the controller must be plugged in to function. All of this fit nicely into an iPhone case:
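The join button's two-state behavior can be sketched as a tiny stateful function. This is a JavaScript sketch of the logic only (the actual firmware runs on the microcontroller, and the names here are mine):

```javascript
// Sketch of the join button's output logic (my naming, not the real firmware):
// the first press announces the player, every later press just sends 'i'.
function makeJoinButton(name) {
    var pressed = false;
    return function press() {
        if (!pressed) {
            pressed = true;
            return 'in=' + name + '\n';
        }
        return 'i';
    };
}
```

So `makeJoinButton('andrew')` returns a function whose first call yields `'in=andrew\n'` and whose later calls all yield `'i'`.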

A video of it working:

The shell command is something like the following, but with the appropriate USB device name and IP address:

cat /dev/cu.usbmodem1421 | nc 172.16.234.202 8080


Live Web ⁄ Class 1 ⁄ Analyzing a Live Online Platform: Second Life

Second Life is a massively multiplayer online role-playing sandbox game. Whether it's to be considered a game at all is questionable, as many users seem to take it as seriously as they would their normal lives, even going so far as to make and spend real money on virtual goods and services. One of the defining features of SL is that it's a 3D-rendered environment that functions in real time, similar to many other contemporary MMOs.

SL provides real-time visual and sound-related interactions with the environment and between users. A user can have their avatar perform actions with objects (sit in chairs, open doors, etc), and perform actions with other users (chat via text or microphone, have their avatars perform actions together, swap items, teleport to each other, etc).

But what’s most interesting about SL is the way it connects various parts of the WWW to its environments. Owners of a specific piece of land (a “parcel”) on a server (a “sim”) can stream music, in the form of a radio station, into the simulation for all visitors to hear. Some sims are designated shopping centers containing walls of objects that can be clicked on to purchase and download goods and services from https://marketplace.secondlife.com.

Seemingly any kind of web media can be embedded into the environment:

How does SL use live data streams?

http://wiki.secondlife.com/wiki/Live_Data_Feeds#Statistics

http://wiki.secondlife.com/wiki/Streaming_Video_in_Second_Life


Subtraction ⁄ Week 14

At home I have a small aquarium with a bio-filtration system I made using a mason jar and some airline tubing:

A biofilter cleans the aquarium water by passing it through small plastic or ceramic pellets, aka filter medium (in this case ceramic), which provide ideal conditions for beneficial bacteria to colonize and break down the toxic ammonia in the water flowing through. My current filter setup is attached on one end to a pump inside the aquarium, which pumps the aquarium water into the bottom of the jar; the water then rises through the filter medium and out through another tube at the top, back into the aquarium:

I want to rebuild this filter:

  • to make it look like a home product by a mock internet infrastructure technology service called “EcoNet” that I’m working on
  • to optimize the filtration process (it’s not very good at the moment) by designing separate chambers into the piece, enabling more thorough filtration and more controlled ability to measure / quantify water purity

I was initially hesitant to build this because I need the material to be transparent, and I couldn’t find suitable recycled materials; the best I could do at the moment was use virgin plastic, even though ultimately I’d like to avoid the hypocrisy of using such materials for projects that advocate environmental awareness. So although this is to be a finished, functioning filter, I consider it a prototype for a final version made of glass or recycled/reclaimed materials.

So I picked out a 12″x18″ sheet of 1/8″ clear colorless acrylic and another of 1/4″. My design for the 1/8″ sheet is as follows (the smaller square is the bottom piece, the larger the lid):

For the 1/4″ sheet (the overlaid rectangles are pockets): I’m cutting two of the filter’s walls out of the 1/4″ sheet so that I have room to carve pocketed ledges that delimit the chambers with screens. I’ll design or pick out the screens later; the only requirements are that they don’t float and are semi-flexible, so they can be placed inside:

I designed in the rounded legs as a place to screw the piece down while cutting.

I also bought a 1/4″ hard plastic bit from McMaster-Carr.

I started cutting the 1/8″ acrylic sheet, but hadn’t realized the pieces would come loose after just the first cut, which cracked some corners. I should have used the laser cutter for these from the start, so I ended up cutting the top and bottom squares on the laser cutter:

I also then added the product logo to the front face:

For the 1/4″ acrylic sheet, I made sure I cut small portions and ran the machine at a faster speed:

Pocket depth (less than half of ~0.25″ total thickness):

The sheet:

Second issue: I’d forgotten to set the pocket depth to negative; it took me a good long while to figure this out:

This time the machine was correctly making multiple passes, which allowed me to see the position of the legs after a few:

So I paused the machine and carefully screwed them in:

But on the other face, the screw cracked the plastic:

It turns out the pockets were too deep, as the sheet was warped / not completely flat:

So I screwed down the center of the sheet and tried again with a new Z-origin:

I broke my bit by confusing the Y and Z axis controls at fast tracking speed, but thankfully I’d just finished 🙂

 

With all my parts cut, the next step is to drill the holes for the water entry and exit ducts. First some practice drills:

Then the real thing:

Both ducts drilled and tube adapters inserted:

Now to attach the faces. Since this container is to be filled with water, I need hermetically sealed corners. For this I used silicone aquarium sealant:

Taped the front face into position:

Added my first strip of sealant:

Placed the side face, making sure it was 90° and added some more strips of sealant:

Placed the bottom and the other side faces, used the throwaway square cutout to prop the faces at 90º:

Then taped down the back face:

Added sealant to the sides of the other faces:

And secured it into position:

This sealant needs to cure overnight.

 


Temporary Expert ⁄ Week 13 ⁄ EcoNet Progress

I went back to Pacific Aquarium and bought 3 silvertip tetras. Although the tank label says “RARE,” silvertips, along with other tetras, are very common aquarium fish. They’re labeled as rare because they don’t breed in large numbers, but they aren’t rare in the wild. They’re native to the Amazon, so the water shouldn’t drop below 70°F. They’re somewhat aggressive, and they school together. They’re omnivores, so regular fish food is fine, supplemented with the occasional larvae or other small creatures:

The food and water primers (I used spring water, then added the Prime and Stability conditioners a few hours before introducing the fish):

Acclimating to the temperature:

Supplies purchased for the filtration system: a water pump, mason jar, airline tubing, ceramic biofilter medium (little ceramic blocks that house beneficial bacteria as water travels through), and aquarium sealant caulk, which I used to seal the holes I drilled into the lid of the mason jar for the airline tubing:

The pump pushes water through a tube that introduces it at the bottom of the jar; the water then rises through the filter medium and back out through an exit tube at the lid:


Subtraction ⁄ Week 11 ⁄ 3D on the 4-Axis

I made a copy of the bit that kept going missing (after it was found) 🙂

First iteration — I designed in the tabs but they were too long and thin, so I ended up going back to make them just over .5in in length and a bit wider, removed the tab at the back of the bit, then manually added margins to the tip and the back and removed the side margins:

Final iteration with fixed tabs — the tip took hours to figure out; it’s a combination of a stretched cone, a cone with parts subtracted from it, and a custom shape extruded upward then distorted into a helix:

Cut a 7″ piece of poplar:

Done:

After very carefully removing the shape from the block on the band saw and sanding:


Temporary Expert ⁄ Week 12 ⁄ EcoNet Progress

I went back to Pacific Aquarium & Plant and spoke with an employee about what I should include in my tank and how to go about my first steps. He insisted that I not buy fish just yet, and that I have my tank set up before doing so. I made sure I left with something, so I had him help me pick out easy-to-maintain plants that would inhabit freshwater at room temperature. An anubias plant:

Java moss (in the black container):

Gravel as substrate (gravel is easier for planting than sand; it holds the roots down much more easily):

2.5 gallon tank, washed:

Substrate rinsed and put into the tank:

Planted anubias and moss and filled with bottled water (room temperature, not refrigerated) from the grocery store:

The moss is held down with a couple of pebbles, which probably won’t be sufficient once a pump system introduces current. I will probably have to replant as shown in this video:

Plants also need light. I knew I would go with an LED fixture because LEDs consume far less energy and are generally cheaper. Lighting aquatic plants seems fairly complicated though:

“When concerned with supporting photosynthetic aquatic life, hobbyists should look for PAR values of LED fixtures. PAR or Photosynthetically Active Radiation designates a spectral range of light that photosynthetic organisms utilize during photosynthesis. Keep in mind that PAR values vary at different depths and distances from the LED light source. In other words, the same LED fixture will have multiple PAR values capable of supporting different species with different light requirements. Due to the relatively complex nature of expressing PAR levels and a lack of standardization, not all manufactures will provide PAR information the same way. To determine which LED aquarium light fixture is right for you, please refer to our handy LED Lighting Comparison Guide.” [1]

Since my tank is small and doesn’t require too much light, I used the abundance of LEDs I have at home to make my own fixture, adding a potentiometer so I could experiment with various colors and brightness over time. Daylight (more reds):

Night (more blues):

Next steps:

  • Pump (not required right away but would be nice to have)
  • Filter: could start with a DIY biofilter, similar to the ones they had bobbing around in some of the tanks at the store

Eventually I want to build out a custom filtration system in a way that will facilitate easy/accessible, automated water content observation/measurement.

  • Water testing kits
    • ammonia test kit
    • nitrate test kit
    • co2 test kit
    • thermometer
  • Finally introduce a fish or two: possibly a small school of guppies or similar micro fish, but not a betta in case another fish is to be added later on.

 


[1] http://www.liveaquaria.com/PIC/article.cfm?aid=42


Data Art ⁄ Assignment 3

A friend of mine is a lighting artist. Last week I had the chance to go see one of his works in person:

It consists of three suspended rings with moving light mechanisms projecting out from the insides of the rings. To me this piece was the epitome of the “technological sublime,” in the sense that Lev Manovich’s contemporary, Toby Miller, describes as the merging of the sublime and the beautiful:

“the sublime (the awesome, the ineffable, the uncontrollable, the powerful) and the beautiful (the approachable, the attractive, the pliant, the soothing) are generally regarded as opposites. The unique quality of electronic technology has been its ability to combine them.”

[1]

A smoke machine was pumping a steady stream of fog into the room so that the lights would project visible rays across space. I began considering how I might apply this use of light in 3D space to a visualization of the North American Aerospace Defense Command’s (NORAD) satellite data I’d begun working on, which displays each satellite’s apogee and perigee around a circle, with visual similarities to the light installation:

This visual also incorporates an interface to let the viewer explore different aspects of the dataset:

Similar to the Museum of Natural History’s The Known Universe visual tour of space, my intent with this visualization has been simply to enable exploration of the vast physical scale the data describes. Of course there are countless other visualizations using this same data. I think that’s because it’s a quintessential dataset: it describes something beyond our capacity to see, yet the physicality of the system happens to be easy to visualize, thanks to its near-perfect geometries and the abundance of data describing it. But I think satellites are also loaded with symbolism for modern-day communication issues around surveillance, privacy, and censorship, as they are essentially cameras and communication devices looking down at us from high in the sky, far beyond our reach and control. They can see us; we can’t see them.

Bringing this visualization into tangible space could help make this spatial understanding a little more accurate.

How will this be different from slapping on a Google Cardboard with Google Earth, or going to the planetarium?

1 ) Looking at the data again

The NORAD satellite data is recorded in the form of a TLE:

1    16U 58002A   17102.55134124 -.00000137 +00000-0 -16915-3 0  9996
2    16 034.2628 269.3658 2028494 018.9069 347.6942 10.48649171336856

space-track.org is a free, publicly accessible API that can serve this data as JSON:

{
    "ORDINAL": "1",
    "COMMENT": "GENERATED VIA SPACETRACK.ORG API",
    "ORIGINATOR": "JSPOC",
    "NORAD_CAT_ID": "16",
    "OBJECT_NAME": "VANGUARD R/B",
    "OBJECT_TYPE": "ROCKET BODY",
    "CLASSIFICATION_TYPE": "U",
    "INTLDES": "58002A",
    "EPOCH": "2017-04-12 13:13:55",
    "EPOCH_MICROSECONDS": "883136",
    "MEAN_MOTION": "10.48649171",
    "ECCENTRICITY": "0.2028494",
    "INCLINATION": "34.2628",
    "RA_OF_ASC_NODE": "269.3658",
    "ARG_OF_PERICENTER": "18.9069",
    "MEAN_ANOMALY": "347.6942",
    "EPHEMERIS_TYPE": "0",
    "ELEMENT_SET_NO": "999",
    "REV_AT_EPOCH": "33685",
    "BSTAR": "-0.00016915",
    "MEAN_MOTION_DOT": "-1.37e-06",
    "MEAN_MOTION_DDOT": "0",
    "FILE": "2165150",
    "TLE_LINE0": "0 VANGUARD R/B",
    "TLE_LINE1": "1    16U 58002A   17102.55134124 -.00000137 +00000-0 -16915-3 0  9996",
    "TLE_LINE2": "2    16 034.2628 269.3658 2028494 018.9069 347.6942 10.48649171336856",
    "OBJECT_ID": "1958-002A",
    "OBJECT_NUMBER": "16",
    "SEMIMAJOR_AXIS": "8816.882",
    "PERIOD": "137.319",
    "APOGEE": "4227.246",
    "PERIGEE": "650.247"
  }
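The derived fields in this record are internally consistent: apogee and perigee follow from the semi-major axis and eccentricity (minus Earth’s radius), and the semi-major axis itself follows from the mean motion. A quick check in JavaScript (the constants are standard textbook values, not part of the API response):

```javascript
// Verify the derived orbital fields in the record above.
var MU = 398600.4418;   // Earth's gravitational parameter, km^3/s^2
var R_EARTH = 6378.137; // Earth's equatorial radius, km

// Semi-major axis (km) from mean motion (revs/day): a = (mu / n^2)^(1/3)
function semiMajorAxis(meanMotionRevPerDay) {
    var n = meanMotionRevPerDay * 2 * Math.PI / 86400; // rad/s
    return Math.pow(MU / (n * n), 1 / 3);
}

// Apogee/perigee altitude (km) above Earth's surface
function apogee(a, e)  { return a * (1 + e) - R_EARTH; }
function perigee(a, e) { return a * (1 - e) - R_EARTH; }

var a = semiMajorAxis(10.48649171); // ~8816.88, matches SEMIMAJOR_AXIS
console.log(apogee(a, 0.2028494));  // ~4227.2, matches APOGEE
console.log(perigee(a, 0.2028494)); // ~650.2, matches PERIGEE
```

So for the visualization, only `MEAN_MOTION` and `ECCENTRICITY` are strictly needed; the ring radii can be recomputed from those two fields.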

2 ) Mapping possible interactions

  • Zoom
    • HEO
    • MEO
    • LEO
  • View mode
    • 2d (side-by-side comparison)
    • 3d (geographically accurate)
      • Panoptic view (entire visualization and Earth scaled down in the space)
      • “Real” location view (show satellites positioned directly above viewer in real-time)
  • Sort
    • Chronologically (launch date)
    • OBJECT_NAME (alphabetically)
    • Apogee altitude
    • Perigee altitude
  • Filter
    • Function
      • Military / surveillance
      • Communication
      • Navigation
      • Science
      • Engineering
      • Unknown / junk
    • Time period (of launch)
      • Decade
      • Year
      • OBJECT_ID numerical value (i.e. 1999-002ABC)
      • OBJECT_ID letter value (i.e. 1999-002ABC)
    • Object type
      • Rocket body (junk?)
      • Debris
      • Payload (carrying humans or cargo)
    • Source / ownership
      • https://celestrak.com/satcat/sources.asp

3 ) Map out the space

  • What kind of space?
    • Enclosed and climate controlled (in order to house fog and proper darkness)
    • Publicly accessible and able to be inhabited by multiple viewers, but controlled by one user at any given moment
  • How will users interact?
    • Gesture?
    • Portable controller?
    • Stationary controller?
    • Smartphone app?
  • What will the light/projection mechanisms look like, how will they fit in the space?

 

 


[1] http://www.tobymiller.org/images/techenviro/OurDirtyLoveAffair.pdf

 


Temporary Expert ⁄ Week 10 ⁄ EcoNet Planning

1) Establish a container which houses a simple ecosystem that generates toxic ammonia (produced by fish waste, excess food, and decaying organic matter)

B.O.M.:

2) Use a biofilter to convert ammonia into nitrate

3) Measure nitrate levels with a colorimeter; nitrate/ammonia/pH levels mapped to internet speed

Automatic colorimetric aquarium monitors exist and would relay data to a computer, but they’re brand new, expensive, and not accessible enough yet:

  • https://reefbuilders.com/2015/10/28/colorimeter-testing-finally-ready-breakthrough-aquarium-hobby/#
  • https://www.seneye.com/

Manual process:

I spoke to someone who used to own an aquarium with a few different species of fish, and she claimed that maintenance consisted only of changing a filter inside a motorized pump and periodically cleaning the tank. In terms of data collection, the following quantifiable elements are possible:

4) Feed data into internet speed visualization

I currently don’t know of a way to gauge my internet speed within my time/money/technical-feasibility constraints, so my options are:

  • Consider possibilities of browser plugins to map some aspect of web browsing experience to aquarium data
  • Feed data into a mock browsing speed visualization, possibly modeled after popular http://www.speedtest.net/ interface:

I can incorporate geographic ecological data and internet submarine cable infrastructure into this visualization.
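As a sketch of the mock visualization’s core mapping (the thresholds, names, and max speed here are my placeholder assumptions, not measured values), a nitrate reading in ppm could throttle the displayed download speed linearly:

```javascript
// Sketch: map a nitrate reading (ppm) to a mock download speed (Mbps).
// MAX_SPEED_MBPS and NITRATE_MAX_PPM are placeholder assumptions for the demo.
var MAX_SPEED_MBPS = 100; // displayed speed when the water is pristine
var NITRATE_MAX_PPM = 80; // reading at which the "connection" fully stalls

function mockSpeed(nitratePpm) {
    // clamp the reading into [0, NITRATE_MAX_PPM]
    var n = Math.min(Math.max(nitratePpm, 0), NITRATE_MAX_PPM);
    // linear inverse mapping: more nitrate, slower "internet"
    return MAX_SPEED_MBPS * (1 - n / NITRATE_MAX_PPM);
}
```

For example, `mockSpeed(0)` gives 100, `mockSpeed(40)` gives 50, and any reading past the cutoff stalls at 0, which a speedtest-style dial could then animate.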