
Detourning the Web | Assignment 3

https://youtu.be/tdn8ubicWsY

The youtube-dl commands look like this (the name of each output video is an incremented integer: 1.mp4, 2.mp4, etc.):

youtube-dl -f worstvideo --max-downloads 5 -o /Users/a/Documents/of_v0.9.8_osx_release/apps/myApps/20180318_detourningTheWeb_assignment3/bin/data/vids_sky/1.mp4 "https://www.youtube.com/results?sp=CAISAhgB&search_query=sky+timelapse"

youtube-dl -f worstvideo --max-downloads 5 -o /Users/a/Documents/of_v0.9.8_osx_release/apps/myApps/20180318_detourningTheWeb_assignment3/bin/data/vids_grass/1.mp4 "https://www.youtube.com/results?search_query=grass+timelapse&sp=CAISAhgB"
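The per-video increment is easy to script. Here's a minimal sketch in Python that builds one command list per video; the helper name and the relative output directory are mine, not the exact invocation above:

```python
def build_commands(search_url, out_dir, count):
    """Build one youtube-dl invocation per video, naming files 1.mp4, 2.mp4, ...

    Each command downloads a single result, stepping through the search
    results with --playlist-start.
    """
    cmds = []
    for i in range(1, count + 1):
        cmds.append([
            "youtube-dl",
            "-f", "worstvideo",
            "--max-downloads", "1",
            "--playlist-start", str(i),
            "-o", "%s/%d.mp4" % (out_dir, i),
            search_url,
        ])
    return cmds

cmds = build_commands(
    "https://www.youtube.com/results?search_query=sky+timelapse",
    "bin/data/vids_sky", 5)
print(cmds[0])
```

Each list can then be handed to `subprocess.call()` to actually run the download.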

 

I called this shell command from within my oF app with a `system()` call in a separate process (created with the `fork()` function):

void ofApp::downloadVid(int vidType, bool _isVidDownloaded, string dir, string vidUrl, string vidId){
    if(!_isVidDownloaded){
        if(!forkOnce_downloadVid){
            string vidName = "/Users/a/Documents/of_v0.9.8_osx_release/apps/myApps/20180318_detourningTheWeb_assignment3/bin/data/" + dir + vidId + ".mp4";
            
            cout << "vidName = " << vidName << endl;
            
            //youtube-dl -f worstvideo --max-downloads 5 -o /Users/a/Documents/of_v0.9.8_osx_release/apps/myApps/20180318_detourningTheWeb_assignment3/bin/data/vids/sky.mp4 "https://www.youtube.com/results?sp=CAISAhgB&search_query=sky+timelapse"
            
            string cmd = "/usr/local/bin/youtube-dl -f worstvideo --max-downloads 1 --playlist-start " + vidId + " --recode-video mp4 -o " + vidName + " \"" + vidUrl + "\"";
            
            cout << "cmd = " << cmd << endl;
            
            int pid = fork();
            switch(pid){
                case -1:{
                    perror("fork");
                    _exit(EXIT_FAILURE);
                    break;
                }
                case 0:{ // child process
                    system(cmd.c_str());
                    _exit(0);
                    break;
                }
                default:{ // parent process
                    // non-blocking status check on the child (WNOHANG returns immediately)
                    int status = 0;
                    waitpid(-1, &status, WNOHANG);
                    printf("child status: %d\n", status);
                    break;
                }
            }
            forkOnce_downloadVid = true;
        } else {
            if(vidType == 0) {
                signal(SIGCHLD,signalHandler_sky);
            } else {
                signal(SIGCHLD,signalHandler_grass);
            }
        }
        
    }
}
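For comparison, the same launch-without-blocking pattern outside of C++: Python's `subprocess.Popen` forks and execs in one step and returns immediately. This is a hypothetical sketch, not part of the app:

```python
import subprocess

def spawn(cmd):
    """Launch cmd without blocking the caller, like fork() + system().

    Popen returns as soon as the child is started; proc.poll() is the
    non-blocking status check, analogous to waitpid(..., WNOHANG).
    """
    return subprocess.Popen(cmd)

proc = spawn(["echo", "downloading..."])
print(proc.poll())   # likely None: the child may still be running
proc.wait()          # block only when the result is actually needed
```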


Detourning the Web | Assignment 2: AirBnb Experiences, pt 1

For the list capture assignment I’ve chosen the Experiences section of the AirBnb website:

This section utilizes an infinite scrolling feature, fetching 40 items each time the bottom of the page is hit. In order to programmatically capture all elements I found the specific fetch request in the inspector panel:

which looks something like this:

https://www.airbnb.com/api/v2/explore_tabs?version=1.3.4&_format=for_explore_search_web&experiences_per_grid=20&items_per_grid=18&guidebooks_per_grid=20&auto_ib=true&fetch_filters=true&has_zero_guest_treatment=false&is_guided_search=true&is_new_cards_experiment=true&luxury_pre_launch=false&query_understanding_enabled=true&show_groupings=true&supports_for_you_v3=true&timezone_offset=-300&metadata_only=false&is_standard_search=true&tab_id=experience_tab&section_offset=7&items_offset=120&recommendation_item_cursor=&refinement_paths[]=/experiences&last_search_session_id=&federated_search_session_id=a86f985a-7bac-46e4-8cfe-0e1988808c5a&screen_size=large&_intents=p1&key=d306zoyjsyarp7ifhu67rjxn52tv0t20&currency=USD&locale=en

In this URL the `items_offset` var is incremented by 40 each time the bottom of the page is hit, so iterating it in Python would let me fetch all items:

import requests

def get_page(_offset):
	url = "https://www.airbnb.com/api/v2/explore_tabs?version=1.3.3&_format=for_explore_search_web&experiences_per_grid=20&items_per_grid=18&guidebooks_per_grid=20&auto_ib=true&fetch_filters=true&is_guided_search=true&is_new_cards_experiment=true&luxury_pre_launch=false&query_understanding_enabled=false&show_groupings=true&supports_for_you_v3=true&timezone_offset=-300&metadata_only=false&is_standard_search=true&tab_id=experience_tab&section_offset=3&items_offset=" + str(_offset) + "&recommendation_item_cursor=&refinement_paths[]=/experiences&query=&last_search_session_id=&federated_search_session_id=320016fd-d09c-48c0-b7ed-2786432d35fb&screen_size=large&_intents=p1&key=d306zoyjsyarp7ifhu67rjxn52tv0t20&currency=USD&locale=en"
	responses = requests.get(url).json()
	return responses

offset = 0
while offset <= 280: #280

	#add all results from this json pull to results var
	results = get_page(offset)
	items = results['explore_tabs'][0]['sections'][0]['trip_templates']
	for item in items:
		print item['title'].encode('utf-8')
		print item['kicker_text'].encode('utf-8')
		print item['country'].encode('utf-8')
		print item['picture']['large_ro']
		print item['star_rating']
		print item['lat']
		print item['lng']
		print '\n'

	#update offset
	offset = offset + 40

In the above script I printed the following items from each Experience post:

  • title: “Paris’ Best Kept Secrets Tour”
  • kicker_text: “history walk · Paris”
  • country: “France”
  • picture: “https://a0.muscache.com/im/pictures/571e0e7b-6867-4c44-bd91-776a5d698fae.jpg”
  • star_rating: 5.0
  • lat: 48.8674018702
  • lng: 2.32934203089

I piped these results to a .txt file via the command line:

python get_airbnb.py > output.txt

The full output.txt can be seen here.
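Hard-coding the final offset (280) works for a one-off snapshot, but paging until the API returns an empty batch is more robust. This is a hypothetical refactor, with a fake fetcher standing in for the `requests` call:

```python
def paginate(fetch_page, step=40, max_offset=10000):
    """Yield items page by page until a page comes back empty.

    fetch_page(offset) -> list of items; step matches the 40-item
    increment the site uses per infinite-scroll fetch.
    """
    offset = 0
    while offset <= max_offset:
        items = fetch_page(offset)
        if not items:
            break
        for item in items:
            yield item
        offset += step

# Fake pages standing in for get_page(offset):
pages = {0: ["a", "b"], 40: ["c"], 80: []}
print(list(paginate(lambda o: pages.get(o, []))))  # → ['a', 'b', 'c']
```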


Research: The Enduring Ephemeral, or the Future Is a Memory (Wendy Hui Kyong Chun)

src


Not “what is new media?” but rather “What *was* new media? What will it be?”

constant repetition, tied to an inhumanly precise and unrelenting clock, points to a factor more important than speed—a nonsimultaneousness of the new, which I argue sustains new media as such (148)

The Future, This Time Around

this paper argues these dreams of superhuman digital programmability create, rather than solve, archival nightmares (149)

“vapor theory”

speed makes criticism / reflection difficult

Speed and variability apparently confound critical analysis. According to Lovink, “because of the speed of events, there is a real danger that an online phenomenon will already have disappeared before a critical discourse reflecting on it has had the time to mature and establish itself as institutionally recognized knowledge.”

Paul Virilio … has argued that cyberspace has implemented a real time that is eradicating local spaces and times [threatening] “a total loss of the bearings of the individual” and “a loss of control over reason,” as the interval between image and subject disappears. (151)

malleability also makes criticism / reflection difficult

“malleability also makes criticism difficult by troubling a grounding presumption of humanities research: the reproducibility of sources.” (152)

my words: a text can disappear instantly, or move, or change, making information that cites or builds or otherwise relies upon this text difficult to trust

Digital media, through the memory at its core, was supposed to solve, if not dissolve, archival problems such as degrading celluloid or scratched vinyl, not create archival problems of its own. (154)

ephemerality is not new to new media (153)

so what defines “new media?”

The major characteristic of digital media is memory. (154)

Memory allegedly makes digital media an ever-increasing archive in which no piece of data is lost. (154)

By saving the past, it was supposed to make knowing the future easier. (154)

As a product of programming, it was to program the future. (155)

As we may think

Bush, in “As We May Think,” writing at the end of World War II, argues the crucial problem facing scientists and scientific progress is access (156)

the memex sought to provide people with “the privilege of forgetting” by storing and indexing memories for them to be accessed later

“Memex Revisited” saw the failure of this: “We are being buried in our own product. Tons of printed material are dumped out every week. In this are thoughts, certainly not often as great as Mendel’s, but important to our progress. Many of them become lost; many others are repeated over and over and over” (158)

Thus the scientific archive, rather than pointing us to the future, is trapping us in the past, making us repeat the present over and over again. Our product is burying us and the dream of linear additive progress is limiting what we may think (158)

“The difficulty supposedly lies in selecting the data, not in reading it” (159)

The pleasure of forgetfulness is to some extent the pleasure of death and destruction. It is thus no accident that this supplementing of human memory has also been imagined as the death of the human species in so many fictions and films and déjà vu as the mark of the artificial in The Matrix. (160)

Moving memory

an instruction or program is functionally equivalent to its result… this conflation grounds programming, in which process in time is reduced to process in space (161)

By making genes a form of memory, von Neumann also erases the difference between individual and transgenerational memory, making plausible Lamarckian transmission; if chromosomes are a form of secondary memory, they can presumably be written by the primary. This genetic linkage to memory makes clear the stakes of conflating memory with storage— a link from the past to the future. (164)

A memory must be held in order to keep it from moving or fading. Memory does not equal storage. (165)

digital media is truly a time-based medium, which, given a screen’s refresh cycle and the dynamic flow of information in cyberspace, turns images, sounds, and text into discrete moments in time. These images are frozen for human eyes only. (166)

without cultural artifacts, civilization has no memory and no mechanism to learn from its successes and failures. And paradoxically, with the explosion of the Internet, we live in what Danny Hillis has referred to as our “digital dark age.” (168)

the internet, which is in so many ways about memory, has, as Ernst argues, no memory— at least not without the intervention of something like the IWM (wayback machine). (169)

This belief in the internet as cultural memory, paradoxically, threatens to spread this lack of memory everywhere and plunge us negatively into a way way back machine: the so-called digital dark age (169)

Virilio’s constant insistence on speed as distorting space-time and on real time as rendering us susceptible to the dictatorship of speed has generated much good work in the field, but it can blind us to the ways in which images do not simply assault us at the speed of light. Just because images flash up all of a sudden does not mean that response or responsibility is impossible or that scholarly analysis is no longer relevant. As the new obsession with repetition reveals, an image does not flash up only once. The pressing questions are, Why and how is it that the ephemeral endures? And what does the constant repetition and regeneration of information effect? What loops and what instabilities does it introduce into the logic of programmability? (171)

Reliability is linked to deletion; a database is considered to be unreliable (to contain “dirty data”) if it does not adequately get rid of older inaccurate information. (171)

Rather than getting caught up in speed, then, we must analyze, as we try to grasp a present that is always degenerating, the ways in which ephemerality is made to endure. What is surprising is not that digital media fades but rather that it stays at all and that we stay transfixed by our screens as its ephemerality endures. (171)

 


Thesis week 1: ideas

Idea 1:

I’m interested in exploring topics in the area of scientific imaging and data visualization based on research I’ve done in the areas of the history of scientific objectivity (see notes here) and information visualization. Epistemological historian Lorraine Daston’s research introduces a notion called “mechanical objectivity” which describes the attempt of modern imaging technologies (from scientific atlas drawing, to the camera obscura, then the modern photograph) to create representations that achieve “truth-to-nature” (what’s depicted in the image accurately describes the actual phenomena). Modern photography for scientific imaging was an attempt to remove human intervention from the process altogether, but what it really did was shift the faculties of human judgement from the expert (the scientist) to the layperson.

I think Daston’s ideas have far-reaching implications regarding today’s technological landscape, particularly with the rise of big data over the past several decades, which has opened the floodgates for any individual to gain seemingly infinite perspective on any given phenomenon.

“Seeing the whole world is a fantasy that Michel de Certeau calls the ‘totalizing eye’ and Donna Haraway calls ‘the God Trick’.” (src)

Achieving truth-to-nature through a visual representation seems like a problem of expertise (the doctor must know what a healthy hip bone looks like in order to notice what is unusual or what aspects of a hip bone are missing from the X-ray), perspective, self-surveillance (when selecting the data to be captured), and of interpretation (to analyze aesthetic and moral faculties when making judgement).

So how do we make up our understanding of the world today, then? How do our understandings differ? What does “scientific objectivity” look like when every individual has the capacity to be their own expert, have their own perspective, and draw their own conclusions?

One approach to exploring these questions could be to create an interface that would let users explore a given dataset themselves. I’m drawn to geospatial data aggregated by satellites because the prospect of visualizing physical phenomena much, much larger than us is a very concrete way of illustrating the “god trick” and how the technologies that give us this “view from nowhere” are a massive infrastructural network full of complexity that can only be understood holistically.

Idea 2:

I’m also interested in the idea of cultural memory and how modern media affects our understanding of the present. The content feed and attention economics have been focal points for me. What is our relationship to the past in a landscape constantly feeding us the “now”? What are the aesthetics and politics of infinite distraction and relentless forgetting? I want to attempt to uncover our current cultural zeitgeist and explore current archival nightmares, for example:

“because of the speed of events, there is a real danger that an online phenomenon will already have disappeared before a critical discourse reflecting on it has had the time to mature and establish itself as institutionally recognized knowledge.” (src)

For this idea I’m thinking of making some sort of immersive feed experience similar to Akira Wakita’s INFOTUBE (and probably many other works) but with a focus on the passage of time (performative?) and the contents’ subject matter, more so than a speculative exercise in information architecture.


Research: Sabotage! Glitch Politix Man[ual/ifesto] (Curt Cloninger & Nick Briz)

src


“Computers don’t make mistakes, people do”

“a glitch as an interruption in a system”

“technologies are not neutral tools, but rather are symptoms of our worldview and cultural norms”

“We need not create glitches in sterile glitch laboratories… The glitch naturally arises from within our currently existent… systems”

“aestheticized breakage”

“something that looks like it’s breaking” vs “something which is ‘honestly’ breaking.”

“Honest informatic failures” = “failures of function”; unexpected

“A glitch is experienced when a human mis-expects one thing and winds up with something else”

“The last mile of all media is analog… The glitch happens when the media runs (last mile) on human bodies.”

“The goal of the ‘political’ glitch artist is to stage/wire/infuse/pre-load her glitch event so that it purposefully unfolds from ‘a-ha’ to ‘oh shit.’ The ‘political’ glitch artist is (eventually) dis-satisfied with the production of trippy looking shit. The ‘political’ glitch artist (eventually) seeks to glitch out shit in ways that lead to an implicit awareness of our human/system entanglements, and an implicit onus on the part of the ‘viewer’ to move those entanglements higher up and further in. Political glitch art means to endow its ‘viewer’ with a feeling of her own agency and the heaviness/obligation that accompanies it.”

an event must implicate its “viewer” in order for it to be considered a glitch

“The ‘political’ glitch artist” seeks “fulcrums of agency” within a system; that is, places where the smallest intervention results in the largest impact (see Dana Meadows’ Leverage Points: Places to Intervene in a System)

“Political glitch art” can happen in any medium (not just digital, not just visually)… or it has to involve machines in some way as per the etymological definition of “glitch”… this man[ual/ifesto] deliberately and clearly contradicts itself perhaps to illustrate the point within the contents of the document with the document itself; is it a glitch?

“A glitch ceases to be political when it becomes a [mere] image.”; a political glitch must be an actual glitch not a representation of one; it must not be symbolic of a political stance but actively practice one

“How might glitch artists purposefully load/lade their glitches and release them back upstream into and beyond the entangled systems that initially birthed them?

What short-circuited explosions might be triggered? What system modulations might such chain-reactions engender?

How might glitches be intentionally rigged to {buffer} overflow their containing systems?

Within (and without) the machine?”

“The most efficacious glitch art begins to eat away its own frames”; similar to institutional critique or avant-garde in general, good political glitch art subverts its medium and context (new media that resists archiving conventions, art criticizing the nature of the gallery that supports it, etc)… these are systems interventions

“The glitch image… makes visible the violence inherent in the system.”


Thesis Research: The Image of Objectivity (Lorraine Daston & Peter Galison)

src


p82

“Modern objectivity mixes rather than integrates disparate components” which each have their own history and are conceptually distinct

Noninterventionist aka mechanical objectivity = one element of our current notion of objectivity

mechanical objectivity combats “the subjectivity of scientific and aesthetic judgment, dogmatic system building, and anthropomorphism.”

“texts so laconic that they threaten to disappear entirely”

mechanical objectivity in science “attempts to eliminate the mediating presence of the observer”

 

p83

“the vastness and variety of nature require that observations be endlessly repeated”

mechanical objectivity “is a vision of scientific work that glorifies the plodding reliability of the bourgeois rather than the moody brilliance of the genius.”

mechanized vs moralized science; these two seemingly disparate types are actually closely related

 

p85

“The atlas aims to make nature safe for science; to replace raw experience— the accidental, contingent experience of specific individual objects— with digested experience.”

 

p86

“the faithful drawing, like nature, outlives ephemeral theories— a standing reproach to all who would, whether ‘by their error or bad faith,’ twist a fact to fit a theory.”

drawings (in atlases) were in this way ^ one of the first signs of mechanical objectivity to come (in mid/late 1800s)

“in order to decide whether an atlas picture is an accurate rendering of nature, the atlas maker must first decide what nature is.”
– the problem of choice:
1. “which object should be presented as the standard phenomena of the discipline…”
2. “…and from which viewpoint?”

 

p87

“the mere idea of an archetype in general implies that no particular animal can be used as our point of comparison; the particular can never serve as a pattern for the whole”

p88

“the observer never sees the pure phenomenon with his own eyes; rather, much depends on his mood, the state of his senses, the light, air, weather, the physical object, how it is handled, and a thousand other circumstances.”
– the experience of an object derived from an individual instance (not the archetype which is derived by a series of individual experiences) is entirely dependent on the environment, nothing is experienced within a vacuum

before mechanical objectivity the observer had to manually distill the typical from the variable and accidental
– this is how archetypes were formed

two major variants of “typical” images:
1. ideal = not merely the typical but the “perfect”
2. characteristic = locates the typical in an individual

these 2 variants had different ontologies and aesthetics

p89

the ideal as “the best pattern of nature”

p90

Bernhard Albinus’s pictures of skeletons “are pictures of an ideal skeleton, which may or may not be realized in nature, and of which this particular skeleton is at best an approximation.”

“nature is full of diversity, but science cannot be. [the atlas maker] must choose [their] images, and [their] principle of choice is frankly normative”
– example of normative choice: Albinus picked a male skeleton of “middle stature” without blemish or deformity, with pleasing proportions, etc… he took artistic liberties to make the image of the ideal skeleton “more perfect”

p91

“Whenever the artist alone, without the guidance and instruction of the anatomist, undertakes the drawing, a purely individual and partly arbitrary representation will be the result, even in advanced periods of anatomy. Where, however, this individual’s drawing is executed carefully and under the supervision of an expert anatomist, it becomes effective through its individual truth, its harmony with nature, not only for purposes of instruction, but also for the development of anatomic science; since this norm, which is no longer individual but has become ideal, can only be attained through an exact knowledge of the countless peculiarities of which it is the summation.”
– an expert (not just an artist) is required to make an accurate “ideal” image because one must know what’s peculiar and what’s common about the object/phenomena in order to address/eliminate/correct/alter it

p93

“Hunter, like Albinus, considered the beauty of the depiction [(aesthetic judgement)] to be part and parcel of achieving that accuracy, not a seduction to betray it.”

“scientific naturalism and the cult of individuating detail long antedated the technology of the photograph.”

naturalism ≠ rejection of aesthetics

p95

“statistical essentialism”

p96

mid-19th-century atlases (scientific imaging) marked the transition from “truth to nature” via the typical (ideal, characteristic exemplar, average) to mechanical objectivity

“the asceticism of noninterventionist objectivity” (withholding internal aesthetic judgement)

“the dangers of excessive accuracy”

p98

“images of individuals came to be preferred to those of types, and … techniques of mechanical reproduction seemed to promise scientists salvation from their own worst selves” (late 1800s)

Objectivity & Mechanical Reproduction

“photographic depiction entered the fray along with X-rays, lithographs, photoengravings, camera obscura drawings, and ground glass tracings as attempts— never wholly successful— to extirpate human intervention between object and representation.”

“Interpretation, selectivity, artistry, and judgment itself all came to appear as subjective temptations requiring mechanical or procedural safeguards.”

“the image, as standard bearer of objectivity, is inextricably tied to a relentless search to replace individual volition and discretion in depiction by the invariable routines of mechanical reproduction.”

p100

“the reproduction of nature by man will never be a reproduction and imitation, but always an interpretation… since man is not a machine and is incapable of rendering objects mechanically.”

p103

“what characterized the creation of late nineteenth-century pictorial objectivism was *self*-surveillance, a form of self-control at once moral and natural-philosophical.”

p107

“the most a picture could do was to serve as a signpost, announcing that this or that individual anatomical configuration stands in the domain of the normal.”

“while in the early nineteenth century, the burden of representation was supposed to lie in the picture itself, now it fell to the audience. the psychology of pattern recognition in the audience had replaced the metaphysical claims of the author.”

p109

“Caught between the infinite complexity of variation and their commitment to the representation of individuals, the authors must cede to the psychological. Selection and distillation, previously among the atlas writer’s principal tasks, now were removed from the authorial domain and laid squarely in that of the audience. Such a solution preserved the purity of individual representation at the cost of acknowledging the essential *role* of the reader’s response: the human capacity to render judgement, the electroencephalographers cheerfully allow, is ‘exceedingly serviceable.’”
– judgement becomes the role of the reader not the atlas maker

p110

“the X-ray operator either by wilfulness or negligence in fastening the plate and making the exposure may exaggerate any existing deformity and an unprejudiced artist should be insisted upon.”

p111

“It is not that a photograph has more resemblance than a handmade picture, but that our belief guarantees its authenticity… We tend to trust the camera more than our own eyes.”

p112

“Photographs lied.”

“The photograph… did not end the debate over objectivity; it entered the debate.”

Attempts to make photographs (x-rays in this case) achieve 100% truth to nature by viewer interpretation:
– “demand witnesses to the production of the image”
– “require experts to mediate between picture and the public”
– “recommend that the surgeons themselves learn the techniques necessary to eliminate their dependence on the intermediate readers.”

This is a problem of expertise. In order for data to be turned into information a data analyst needs to vet the data then the communication designer needs to translate it for a given audience. The layperson doesn’t have time to search through datasets and adopt these areas of expertise themselves.

What if an area of expertise existed which sought to supply laypeople with the tools to be their own expert?
– Can an interface be built for a dataset that would let the user turn the data into knowledge themselves?
– What role or impact on the data would the interface designer have?
– How would this impact be different than that of an infographic designer, which expresses one specific point using cherry-picked data from the larger set?
– More generalized audience (?)
– Not cherry-picked data, not one point to be made with the data
– Modes of viewing/understanding the dataset would still be under complete control of the designer. The designer would still have to make decisions regarding *how* the data *should* be explored by designing-in specific user experiences.

“Knowledge obtained by long experience and positive indications is far more valuable than any representation visible alone to the eye.”
– me doing research for this project, becoming an expert on the topic of truth-to-nature in scientific imaging
– what’s the significance of this?
– make the point of the importance of expertise and raise the question of whether it can be appropriated temporarily or “handed-off” to the end-recipient of the selected information

p117

“authenticity before mere similarity” – the benefit/offering of mechanical objectivity

Objectivity Moralized

“no atlas maker could dodge the responsibility of presenting figures that would teach the reader how to recognize the typical, the ideal, the characteristic, the average, or the normal.”


Live Web | Feeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeed

take1:

take2:

screen version:

For the remainder of this class I wanted to explore the possibilities in creating some sort of online performative visual experience. I wanted to see what could be done by harnessing the masses of content available on the web. I was initially interested in thinking about the internet as something that was “collapsing” time from a cultural perspective facilitated by its massive and cheap capacity to preserve and distribute media—

What was your first experience of going online? What about Google image search? “I specifically remember my first image search. I searched for trees. Like, “green trees.” And it was really overwhelming because there were so many pictures of trees, all at once. Before, you could only look at books.” (src)

art history has been “flattened” for artists of a certain age. “When they typed in ‘tree’ ” in a search engine, … “they got a thousand pictures of a tree: a picture of a tree made in the 18th century, a tree made last year, a cartoon of a tree. You have this flattening of time.” (src)

 

But what if this isn’t actually the case? When we Google something or look at one of our many content feeds on various social or news platforms, it’s all about *now*:

Digital media is truly a time-based medium, which, given a screen’s refresh cycle and the dynamic flow of information in cyberspace, turns images, sounds, and text into discrete moments in time. These images are frozen for human eyes only. (src)


References:

Edward Shenk “Theorist” — Visual artist leveraging right-leaning conspiracy theorist type of content that he apparently finds often on facebook and can probably be recognized to be more or less prevalent on the web by most people. He breaks this style of image-making down to its formal qualities, revealing the sort of sinister underlying tone… “There’s this manic connection-making like in the darker parts of A Beautiful Mind. If something looks like something else then that is proof enough. There is no such thing as pure coincidence, and that’s a hallmark of paranoia.” (src)

XTAL fSCK video performance — The visuals these performers create exemplify the “default” visual effects inherent in the MacOS UI: mashing [ctrl+alt+cmd+8] to use the color-inversion accessibility feature as a strobe effect, using Quicktime’s screen recording feature to create endless recursion, swiping between desktop spaces, opening and minimizing windows with the genie effect in slow motion, with emojis and iconography overlaid and mashed together ad nauseam. The way the MacOS UI is exposed in this formal performative context reveals and accentuates visual qualities that were hidden in plain sight, forcing the audience to confront age-old questions of artistic merit and technological control in a context we find ourselves in every day but perhaps overlook.

Anne Hirsch “Horny Lil Feminist”

Aaron David Ross “Deceptionista”


Live Web | Group Project: 360 Livestream

Tiri, Lin and I really wanted to use the 9th floor studio (or a similar setup — we ended up using the micro studio) and decided to explore the possibility of livestreaming with 360 video in this context. The idea we thought would be most interesting was the first that came to mind: presenting a how-to for what we were doing in the form of a Coding Train episode.

Our final production pipeline ended up being pretty simple: Theta S (360 camera) -> OBS (streaming client) -> YouTube. Before settling on this we tried many different avenues. Our first approach was to embed the stream into our own webpage and use a JavaScript library called A-Frame to VR-ify the stream content. The first problem with this was cross-origin header issues while embedding. We spoke to Rubin about this, and he explained that YouTube delivers streams over GET as HLS, which doesn’t have native support in Chrome; more importantly, solving the cross-origin issue would require each user to install browser extensions, which isn’t ideal. We were able to embed Twitch streams, but we couldn’t block ads on load and couldn’t get a proper .mp4 or .m3u8 (HLS) stream URL to work with A-Frame, because Twitch embeds a webpage via iframe, effectively obscuring the actual video URL. At the end of the day, YouTube has a built-in VR feature and essentially plug-and-play 360-video streaming capabilities, so it made no sense to build our own custom implementation.

The livestream imagery is composed of several layers of images chroma-keyed together with several green screens.

Our first test, simply getting a livestream going:

Then managing a 360 livestream:

The final output (viewer’s POV):

Final output (flattened, OBS POV):

Posted on and Updated on

Live Web | Week 3 | Canvas

I really appreciate the non-antialiased, jagged style of digital imagery, so my goal for this assignment was to create a collaborative tool for exploring glitchy, abstract visuals that take advantage of this style.

The user draws horizontal or vertical lines along the small 200px canvas by dragging with the mouse button held down (users with even-numbered connection counts are assigned horizontal lines; odd counts, vertical). For users drawing vertical lines, at each mouse position along the x-axis of the canvas, a for loop iterates through each pixel in that column and fills it with a 1px black rectangle. To keep the canvas from just being filled completely black, I made it so that pixels that are already black are turned white instead.
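The black/white toggle rule can be sketched as a pure function over a flat RGBA array (the same layout `ImageData.data` uses). This is just an illustration; `togglePixel` and `toggleColumn` are hypothetical helper names, not code from the app:

```javascript
// Toggle rule: an opaque black pixel (0,0,0,255) becomes white;
// anything else (including transparent pixels) becomes opaque black.
function togglePixel(data, i) {
	var isBlack = data[i] === 0 && data[i + 1] === 0 &&
	              data[i + 2] === 0 && data[i + 3] === 255;
	var v = isBlack ? 255 : 0;
	data[i] = v;
	data[i + 1] = v;
	data[i + 2] = v;
	data[i + 3] = 255;
}

// Toggle every pixel in column x of a w-by-h RGBA buffer:
// the vertical-line case described above.
function toggleColumn(data, w, h, x) {
	for (var y = 0; y < h; y++) {
		togglePixel(data, (y * w + x) * 4);
	}
}
```

After the first pass over a fresh (transparent) canvas, each recrossed pixel alternates between black and white, which contributes to the flickering look as users retrace the same rows and columns.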

The result is a lot of unexpected behavior, rendering each canvas image unique, even between users on the same canvas.

The algorithm for filling pixels is as follows:

canvas.addEventListener('mousemove', function(evt) {

	//when mouse is pressed
	if (isMouseDown) {

		//draw pixel at current mousepos:
		if (numUsers % 2 == 0) { //if this user's number is even
			//draw a horizontal line
			drawLineHor_(evt.clientY, context);
		} else { //if this user's number is odd
			//draw a vertical line
			drawLineVer_(evt.clientX, context);
		}

		//send drawing
		var objtosend = {
			x: evt.clientX,
			y: evt.clientY,
			px: px,
			py: py,
			userNum: numUsers
		};
		socket.emit('drawing', objtosend);
	}
});

//receive drawing: registered once, outside the mousemove handler,
//so a duplicate listener isn't added on every mouse move
socket.on('drawing', function(receivedData) {
    if (receivedData.userNum % 2 == 0) { //if the sending user is even
        //horizontal line
        drawLineHor(receivedData.y, context);
    } else { //if the sending user is odd
        //vertical line
        drawLineVer(receivedData.x, context);
    }
});

//these functions could probably all be consolidated somehow
function drawLineHor(y, ctx) {
    for (var x = 0; x < canvas.width; x++) {

        //get this pixel's RGBA values
        var p = ctx.getImageData(x, y, 1, 1).data;

        //the Uint8ClampedArray coerces to a comma-separated
        //string for the comparison, e.g. "0,0,0,255"
        if (p == "0,0,0,255") { //if black
            //toggle to white
            ctx.fillStyle = white;
        } else { //otherwise
            ctx.fillStyle = black;
        }

        //fill this pixel
        ctx.fillRect(x, y, 1, 1);
    }
}

function drawLineVer(x, ctx) {

    for (var y = 0; y < canvas.height; y++) {

        //get this pixel's RGBA values
        var p = ctx.getImageData(x, y, 1, 1).data;

        //the Uint8ClampedArray coerces to a comma-separated
        //string for the comparison, e.g. "0,0,0,255"
        if (p == "0,0,0,255") { //if black
            //toggle to white
            ctx.fillStyle = white;
        } else { //otherwise
            ctx.fillStyle = black;
        }

        //fill this pixel
        ctx.fillRect(x, y, 1, 1);
    }
}

function drawLineHor_(y, ctx) {
    for (var x = 0; x < canvas.width; x++) {
        ctx.fillStyle = black;
        ctx.fillRect(x, y, 1, 1);
    }
}

function drawLineVer_(x, ctx) {

    for (var y = 0; y < canvas.height; y++) {
        ctx.fillStyle = black;
        ctx.fillRect(x, y, 1, 1);
    }
}

I found that there is no way to get perfect 1px, non-antialiased lines on a canvas with the built-in drawing methods. One trick is to translate the entire canvas half a pixel (ctx.translate(0.5, 0.5)) to offset the interpolation that causes 1px lines to fill 2px, but this still doesn’t keep lines 1px when they’re drawn at an angle. So what I did instead (which also gave me individual pixel control) was draw 1px-sized rects with ctx.fillRect(x, y, 1, 1). The canvas is as small as it is because this approach requires large for loops, with a getImageData call per pixel, which is detrimental to performance.
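One way to claw back some of that performance would be to batch the pixel work: read the whole line with a single getImageData call, toggle pixels in the returned buffer, and write it back with one putImageData call, replacing hundreds of per-pixel getImageData/fillRect round-trips with two context calls per line. A sketch of the horizontal case, assuming a standard 2D canvas context (drawLineHorFast is a hypothetical name, not code from the app):

```javascript
// Toggle a full horizontal line with one read and one write.
// ctx is a 2D canvas context; width is the canvas width in pixels.
function drawLineHorFast(y, ctx, width) {
	var img = ctx.getImageData(0, y, width, 1); // one read for the whole row
	var d = img.data;                           // flat RGBA array
	for (var x = 0; x < width; x++) {
		var i = x * 4;
		var isBlack = d[i] === 0 && d[i + 1] === 0 &&
		              d[i + 2] === 0 && d[i + 3] === 255;
		var v = isBlack ? 255 : 0;              // black -> white, else -> black
		d[i] = v;
		d[i + 1] = v;
		d[i + 2] = v;
		d[i + 3] = 255;
	}
	ctx.putImageData(img, 0, y);                // one write for the whole row
}
```

The vertical case would work the same way with getImageData(x, 0, 1, height). The toggle behavior is identical to the per-pixel version; only the number of context calls changes.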