Posted on

Research: The Enduring Ephemeral, or the Future Is a Memory (Wendy Hui Kyong Chun)

src


Not “what is new media?” but rather “What *was* new media? What will it be?”

constant repetition, tied to an inhumanly precise and unrelenting clock,
points to a factor more important than speed—a nonsimultaneousness of
the new, which I argue sustains new media as such (148)

The Future, This Time Around

this paper argues these dreams of superhuman digital programmability create, rather than solve, archival nightmares (149)

“vapor theory”

speed makes criticism / reflection difficult

Speed and variability apparently confound critical analysis. According to Lovink, “because of the speed of events, there is a real danger that an online phenomenon will already have disappeared before a critical discourse reflecting on it has had the time to mature and establish itself as institutionally recognized knowledge.”

Paul Virilio … has argued that cyberspace has implemented a real time that is
eradicating local spaces and times [threatening] “a total
loss of the bearings of the individual” and “a loss of control over reason,”
as the interval between image and subject disappears. (151)

malleability also makes criticism / reflection difficult

“malleability also makes criticism difficult by troubling a grounding presumption of humanities research: the reproducibility of sources” (152)

my words: a text can disappear instantly, or move, or change, making information that cites or builds or otherwise relies upon this text difficult to trust

Digital media, through the memory at its core, was supposed to
solve, if not dissolve, archival problems such as degrading celluloid or
scratched vinyl, not create archival problems of its own. (154)

ephemerality is not new to new media (153)

so what defines “new media?”

The major characteristic of digital media is memory. (154)

Memory allegedly makes digital media an ever-increasing archive in which no
piece of data is lost. (154)

By saving the past, it was supposed to make knowing the future easier. (154)

As a product of programming, it was to program the future. (155)

As we may think

Bush, in “As We May Think,” writing at the end of World War II, argues the crucial problem facing scientists and scientific progress is access (156)

the memex sought to provide people with
“the privilege of forgetting”
by storing and indexing memories for them to be accessed later

memex revisited saw the failure of this:
“We are being buried in our own product. Tons of printed material are dumped out every week. In this are thoughts, certainly not often as great as Mendel’s, but important to our progress. Many of them become lost; many others are repeated over and over and over” (158)

Thus the scientific archive, rather than pointing us to the future, is trapping us in the past, making us repeat the present over and over again. Our product is burying us and the dream of linear additive progress is limiting what we may think (158)

“The difficulty supposedly lies in selecting the data, not in reading it” (159)

The pleasure of forgetfulness is to some extent the pleasure of death and destruction. It is thus no accident that this supplementing of human memory has also been imagined as the death of the human species in so many fictions and films and déjà vu as the mark of the artificial in The Matrix. (160)

Moving memory

an instruction or program is functionally equivalent to its result… this conflation grounds programming, in which process in time is reduced to process in space (161)

By making genes a form of memory, von Neumann also erases the difference between individual and transgenerational memory, making plausible Lamarckian transmission; if chromosomes are a form of secondary memory, they can presumably be written by the primary. This genetic linkage to memory makes clear the stakes of conflating memory with storage— a link from the past to the future. (164)

A memory must be held in order to keep it from moving or fading. Memory does not equal storage. (165)

digital media is truly a time-based medium, which, given a screen’s refresh cycle and the dynamic flow of information in cyberspace, turns images, sounds, and text into discrete moments in time. These images are frozen for human eyes only. (166)

without cultural artifacts, civilization has no memory and no mechanism to learn from its successes and failures. And paradoxically, with the explosion of the Internet, we live in what Danny Hillis has referred to as our “digital dark age.” (168)

the internet, which is in so many ways about memory, has, as Ernst argues, no memory— at least not without the intervention of something like the IWM (wayback machine). (169)

This belief in the internet as cultural memory, paradoxically, threatens to spread this lack of memory everywhere and plunge us negatively into a way way back machine: the so-called digital dark age (169)

Virilio’s constant insistence on speed as distorting space-time and on real time as rendering us susceptible to the dictatorship of speed has generated much good work in the field, but it can blind us to the ways in which images do not simply assault us at the speed of light. Just because images flash up all of a sudden does not mean that response or responsibility is impossible or that scholarly analysis is no longer relevant. As the new obsession with repetition reveals, an image does not flash up only once. The pressing questions are, Why and how is it that the ephemeral endures? And what does the constant repetition and regeneration of information effect? What loops and what instabilities does it introduce into the logic of programmability? (171)

Reliability is linked to deletion; a database is considered to be unreliable (to contain “dirty data”) if it does not adequately get rid of older inaccurate information. (171)

Rather than getting caught up in speed, then, we must analyze, as we try to grasp a present that is always degenerating, the ways in which ephemerality is made to endure. What is surprising is not that digital media fades but rather that it stays at all and that we stay transfixed by our screens as its ephemerality endures. (171)

 

Posted on and Updated on

Live Web | Feeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeeed

take1:

take2:

screen version:

For the remainder of this class I wanted to explore the possibilities in creating some sort of online performative visual experience. I wanted to see what could be done by harnessing the masses of content available on the web. I was initially interested in thinking about the internet as something that was “collapsing” time from a cultural perspective facilitated by its massive and cheap capacity to preserve and distribute media—

What was your first experience of going online? What about Google image search? “I specifically remember my first image search. I searched for trees. Like, ‘green trees.’ And it was really overwhelming because there were so many pictures of trees, all at once. Before, you could only look at books.” (src)

art history has been “flattened” for artists of a certain age. “When they typed in ‘tree’ ” in a search engine, … “they got a thousand pictures of a tree: a picture of a tree made in the 18th century, a tree made last year, a cartoon of a tree. You have this flattening of time.” (src)

 

But what if this isn’t actually the case? When we Google something or look at one of our many content feeds on various social or news platforms, it’s all about *now*:

Digital media is truly a time-based medium, which, given a screen’s refresh cycle and the dynamic flow of information in cyberspace, turns images, sounds, and text into discrete moments in time. These images are frozen for human eyes only.(src)


References:

Edward Shenk “Theorist” — Visual artist who appropriates the style of right-leaning conspiracy-theorist content often found on Facebook, a style most people will recognize from around the web. He breaks this mode of image-making down to its formal qualities, revealing its sinister underlying tone… “There’s this manic connection-making like in the darker parts of A Beautiful Mind. If something looks like something else then that is proof enough. There is no such thing as pure coincidence, and that’s a hallmark of paranoia.” (src)

XTAL fSCK video performance — The visuals these performers create exemplify the “default” visual effects inherent in the macOS UI: mashing [ctrl+alt+cmd+8] to strobe the color-inversion accessibility feature, using QuickTime’s screen recording to create endless recursion, swiping between desktop spaces, opening and minimizing windows with the genie effect in slow motion, and overlaying emoji and iconography ad nauseam. Exposing the macOS UI in this formal, performative context reveals and accentuates visual qualities hidden in plain sight, forcing the audience to confront age-old questions of artistic merit and technological control in a context we inhabit every day but perhaps overlook.

Anne Hirsch “Horny Lil Feminist”

Aaron David Ross “Deceptionista”

Posted on and Updated on

Live Web | Group Project: 360 Livestream

Tiri, Lin and I really wanted to use the 9th floor studio (or a similar setup — we ended up using the micro studio) and decided to explore the possibility of livestreaming with 360 video in this context. The idea we thought would be most interesting was the first that came to mind: presenting a how-to for what we were doing in the form of a Coding Train episode.

Our final production pipeline ended up being pretty simple: Theta S (360 camera) -> OBS (streaming client) -> YouTube. Before settling on this we tried many different avenues. Our first approach was to embed the stream into our own webpage and use a JavaScript library called A-Frame to VR-ify the stream content. The first problem was cross-origin header issues while embedding. We spoke to Rubin about this and he explained that YouTube delivers streams via GET as HLS, which has no native support in Chrome; more importantly, solving the cross-origin issue would require each user to install browser extensions, which isn’t ideal. We were able to embed Twitch streams but couldn’t block ads on load and couldn’t get a proper .mp4 or .m3u8 (HLS) stream URL to work with A-Frame, because Twitch embeds a webpage via iframe, effectively obscuring the actual video URL. At the end of the day, YouTube has a built-in VR feature and essentially plug-and-play 360-video streaming capabilities, so it made no sense to build our own custom implementation.
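For reference, the abandoned A-Frame approach would have looked roughly like the sketch below. The stream URL is a placeholder, an `.m3u8` source would additionally need something like hls.js in Chrome, and the `crossorigin` attribute is exactly where the header issue surfaced:

```html
<a-scene>
  <a-assets>
    <!-- placeholder URL; crossorigin is required to use the video as a WebGL texture -->
    <video id="stream" src="https://example.com/live.m3u8"
           autoplay crossorigin="anonymous"></video>
  </a-assets>
  <!-- wraps the equirectangular 360 video onto the inside of a sphere -->
  <a-videosphere src="#stream"></a-videosphere>
</a-scene>
```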

The livestream imagery is composed of several layers of images chroma-keyed together with several green screens.

Our first test, simply getting a livestream going:

Then managing a 360 livestream:

The final output (viewer’s POV):

Final output (flattened, OBS POV):

Posted on and Updated on

Live Web | Week 3 | Canvas

I really appreciate the non-antialiased, jagged style of digital imagery so my goal for this assignment was to create some sort of collaborative tool for exploring glitchy, abstract visuals taking advantage of this style.

The user draws horizontal or vertical lines (even-numbered connection counts are assigned horizontal lines, odd ones vertical) along the small 200px canvas by dragging with the mouse button held down. So for users creating vertical lines, at each mouse position along the x-axis of the canvas, a for loop iterates through each pixel in that column and fills it with a 1px black rectangle. To keep the canvas from simply filling up with black, pixels that are already black are flipped back to white.

The result is a lot of unexpected behavior, rendering each canvas image unique, even between users on the same canvas.

The algorithm for filling pixels is as follows:

//receive drawing (registered once; putting this inside the mousemove
//handler would attach a new duplicate listener on every mouse movement)
socket.on('drawing', function(receivedData) {
    if (receivedData.userNum % 2 == 0) { //if the sending user is even
        //that user draws horizontal
        drawLineHor(receivedData.y, context);
    } else { //if the sending user is odd
        //that user draws vertical
        drawLineVer(receivedData.x, context);
    }
});

canvas.addEventListener('mousemove', function(evt) {

    //when mouse is pressed
    if (isMouseDown) {

        //draw pixel at current mousepos:
        if (numUsers % 2 == 0) { //if even number of users
            //set the user to horizontal
            drawLineHor_(evt.clientY, context);
        } else { //if odd number of users
            //set the user to vertical
            drawLineVer_(evt.clientX, context);
        }

        //send drawing (px, py are the previous mouse coordinates, tracked elsewhere)
        var objtosend = {
            x: evt.clientX,
            y: evt.clientY,
            px: px,
            py: py,
            userNum: numUsers
        };
        socket.emit('drawing', objtosend);
    }
});

//these functions could probably all be consolidated somehow
function drawLineHor(y, ctx) {
    for (var x = 0; x < canvas.width; x++) {

        //get color of this pixel (RGBA bytes)
        var p = ctx.getImageData(x, y, 1, 1).data;

        //compare color (explicit string conversion; `black`/`white` are color strings defined elsewhere)
        if (p.toString() == "0,0,0,255") { //if black
            //set to white
            ctx.fillStyle = white;
        } else { //if not black
            //set to black
            ctx.fillStyle = black;
        }

        //fill this pixel
        ctx.fillRect(x, y, 1, 1);
    }
}

function drawLineVer(x, ctx) {

    for (var y = 0; y < canvas.height; y++) {

        //get color of this pixel (RGBA bytes)
        var p = ctx.getImageData(x, y, 1, 1).data;

        //compare color (explicit string conversion; `black`/`white` are color strings defined elsewhere)
        if (p.toString() == "0,0,0,255") { //if black
            //set to white
            ctx.fillStyle = white;
        } else { //if not black
            //set to black
            ctx.fillStyle = black;
        }

        //fill this pixel
        ctx.fillRect(x, y, 1, 1);
    }
}

function drawLineHor_(y, ctx) {
    for (var x = 0; x < canvas.width; x++) {
        ctx.fillStyle = black;
        ctx.fillRect(x, y, 1, 1);
    }
}

function drawLineVer_(x, ctx) {

    for (var y = 0; y < canvas.height; y++) {
        ctx.fillStyle = black;
        ctx.fillRect(x, y, 1, 1);
    }
}

I found that there is no way to get perfect 1px non-antialiased lines on a canvas with the built-in line-drawing methods. One trick is to translate the entire canvas half a pixel (ctx.translate(0.5, 0.5)) to offset the interpolation that causes 1px lines to fill 2px, but this still doesn’t keep lines 1px when drawn at angles. So what I did instead (which also gave me individual pixel control) was draw 1px rects with ctx.fillRect(x, y, 1, 1). The canvas is so small because this approach requires long for loops with a getImageData call per pixel, which is detrimental to performance.
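One way to mitigate that per-pixel cost, sketched here under the assumption of the same black/white toggle rule: read a whole column with a single getImageData call, flip the bytes in the returned Uint8ClampedArray, and write it back with one putImageData. `toggleColumn` is an illustrative helper name, not part of the project code.

```javascript
// Toggle every pixel in one column between black and white, in place.
// `data` is the Uint8ClampedArray from ctx.getImageData(x, 0, 1, canvas.height):
// four bytes (R, G, B, A) per pixel.
function toggleColumn(data) {
    for (var i = 0; i < data.length; i += 4) {
        var isBlack = data[i] === 0 && data[i + 1] === 0 &&
                      data[i + 2] === 0 && data[i + 3] === 255;
        var v = isBlack ? 255 : 0; // black -> white, anything else -> black
        data[i] = data[i + 1] = data[i + 2] = v;
        data[i + 3] = 255; // fully opaque
    }
    return data;
}

// Usage sketch (one read and one write per column instead of one per pixel):
// var img = ctx.getImageData(x, 0, 1, canvas.height);
// toggleColumn(img.data);
// ctx.putImageData(img, x, 0);
```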

Posted on and Updated on

Live Web | Final Project Idea

For the remainder of this class I’ll attempt to discover the sort of “hidden style” not just of the structures of interfaces in digital media (windows, buttons, menus, scrolling… GUI paradigms in general) but of the content itself that fills these structures, specifically on the web. Because content on the web (image, video, and text, interactive or not) encapsulates objects, people, places, and ideas from arguably every period of human history, could the resulting aesthetic vernacular really be considered an overarching style of everything, a mess of clashing styles and perspectives from wildly different paradigms? Is this what defines our current zeitgeist, and if so, what does that mean for the future? What could possibly come after this massive collapse of time and space in aesthetic understanding?

This kind of flattening seems to have an effect on many things: political perspectives, cultural diversity (global homogenization in fashion, language, icons, etc.), and maybe more. So not only could the web be considered an “artistic medium,” it’s also arguably the primary source of information dissemination and media consumption for many people all over the world.

I will realize this using appropriated media, facilitated by web APIs for various content aggregators and media platforms. I’ll try to focus on content “hubs” to minimize bias, but it will be interesting to see the bias inherent in my choices regardless; that bias may even be instrumental in helping to discover to what degree the web today acts as a window into any sort of objective paradigm of everything.

References:

What was your first experience of going online? What about Google image search? “I specifically remember my first image search. I searched for trees. Like, ‘green trees.’ And it was really overwhelming because there were so many pictures of trees, all at once. Before, you could only look at books.” (src)

 

art history has been “flattened” for artists of a certain age. “When they typed in ‘tree’ ” in a search engine, … “they got a thousand pictures of a tree: a picture of a tree made in the 18th century, a tree made last year, a cartoon of a tree. You have this flattening of time.” (src)

Edward Shenk “Theorist” — Visual artist who appropriates the style of right-leaning conspiracy-theorist content often found on Facebook, a style most people will recognize from around the web. He breaks this mode of image-making down to its formal qualities, revealing its sinister underlying tone… “There’s this manic connection-making like in the darker parts of A Beautiful Mind. If something looks like something else then that is proof enough. There is no such thing as pure coincidence, and that’s a hallmark of paranoia.” (src)

XTAL fSCK video performance — The visuals these performers create exemplify the “default” visual effects inherent in the macOS UI: mashing [ctrl+alt+cmd+8] to strobe the color-inversion accessibility feature, using QuickTime’s screen recording to create endless recursion, swiping between desktop spaces, opening and minimizing windows with the genie effect in slow motion, and overlaying emoji and iconography ad nauseam. Exposing the macOS UI in this formal, performative context reveals and accentuates visual qualities hidden in plain sight, forcing the audience to confront age-old questions of artistic merit and technological control in a context we inhabit every day but perhaps overlook.

Posted on

Design for Discomfort | Final Project

A major day-to-day struggle for me is retaining self-control at my computer. For someone who works primarily in the digital media space, distractions are not only infinitely abundant but very easy to get lost in. Not only is everything “a few clicks away,” these distractions are efficient traps by design.

In the grand scheme of things the stress and discomfort induced by such an ecosystem of intentional distraction (decades of attention economics at work) may seem somewhat petty, but maybe it’s worth investigating. Behavioral psychology and the marketing industry as a whole have been around for a while now, but the digital context in which we experience them today, at such massive scale, is still fairly new. How can the discomfort caused by these systems, and the feedback loops that psychologically impact users for clicks and eyeballs, be recreated in a designed experience that enlightens rather than obscures and distracts?

I figured the best way to address this question would be to replicate such a web experience with exaggerated effects. The experience I designed has a progressive structure: the user follows a linear path while the uncomfortable elements build. From page to page the user is slowly conditioned to increasingly intense visual, auditory and intellectual stimuli. In the future I intend to also incorporate more heavily political content to attempt to emulate the “philosophical dizziness” brought forth by an overload of information/“soft propaganda”/conspiracy-theorist fanaticism from all sides of the political spectrum at once. Ideally the experience would go on until the user is too uncomfortable to progress any further.

The experience would be contained in an embraced magic circle. The circle involves an already existing reality within which it exists (the web and its capacity to make you procrastinate), but the difference within the magic circle is that you are forced into an active awareness that you’re wasting time, of how you waste time in general, and of how it makes you feel.

Posted on

Design for Discomfort | Final Project Pitch

Option 2. A Difficult Conversation

Taking a cue from Chris Crawford, conceptualize your design for an interactive device, application, or experience as a literal difficult conversation. Script out the conversation as the starting point for the design. Test throughout the process to determine if participants are hearing what you intend from your design, and aim to have a device or experience that can keep a person in the challenging conversation long enough to have a meaningful experience.


Concept/purpose: The idea is to build a series of interconnected web pages filled with visual tactics commonly found all over the web to invoke extreme sensory discomfort toward its users. I also want to address our interactions in these systems and the resulting feedback loops that psychologically impact users. I aim for it to serve as somewhat of a timestamp for the current state of our digital landscape, and my hope is that it will prompt people to think about this kind of emotional/behavioral stimuli we all subject ourselves to.

Specific tactics to invoke:

  • Hook users with an infinite stream of memes / low-quality content with “can’t look away” qualities (excluding x-rated content for this assignment because I feel that the likelihood of dealing with institutional politics here would hijack the assigned purpose, but more so because there is absolutely plenty of anodyne content which is just as effective at garnering and holding attention)
  • Invoke anxiety in users by juxtaposing the attention-grabbing content stream with the recognition that they are wasting their time, perhaps procrastinating even
  • Sensory overload (as I learned from my journey prototype, it should be SUBTLE and long-form; blasting users immediately would make intentions immediately obvious and possibly incentivize leaving before being hooked)

Conversation:

user’s internal monologue (1) while interacting with anthropomorphized website (2)

1: “I’m experiencing a fleeting desire to escape from reading the next paragraph of this essay, I’m craving some quick mindless satisfaction”

*Opens new tab, goes to the site*

2: “I have a new visitor. Let me show them content that will bring them immediate satisfaction so that they stay.”

*Pulls the most viral post from imgur.com’s API and displays them.*

1: “Oh look at this adorable puppy!”

*Scrolls down*

2: “User wants more.”

*Pulls the next most viral posts from the imgur API and begins an infinite scrolling mechanism.*

“Perhaps user might be interested in shopping.”

*Displays an ad in the screen’s corner*

*Some time passes*

*Shows “You might be interested in…” link to Forbes 30 Under 30 and similar articles*

1: “Wish I had that much success. What the hell am I doing with my life? I think I’ve wasted enough time here.”

*Begins scrolling back up to the top of the page*

2: “User has abandoned the meme feed, let’s introduce them to other things they might like”

*Displays link to chat room*

1: “Wow I haven’t seen one of these in ages, I wonder what goes on here.”

*Enters chat room page*

2:”User has entered chat room page, let’s make sure this opportunity isn’t wasted.”

*Displays pop-up modal ad*

*Loads eye-catching background*

*Enables emojis in chat*

*Loads some more “You might also like” links to articles about anxiety, procrastination, rare success stories, global issues, luxury goods*

1: ”Oh look Trump said a thing on Twitter and embarrassed himself in front of a prime minister, there are riots happening in Charlottesville and a supervolcano is going to erupt any day now in Yellowstone, am I even going to live to retirement age? The people in this chat room aren’t very friendly, I think they’re trolling me. I can’t get caught up in internet arguments but this person is pissing me off. This background is giving me a headache. Maybe I should see if I can find a decent shirt on eBay, I don’t have any shirts that go with my pants. I can’t afford to buy new clothes anyway. This page runs really slowly. I might go back to the imgur feed after reading that article that’s tangentially related to my work— oh, I forgot I’d distracted myself with this stupid site. I should go back now”

*User closes tab and reflects on the experience*

 

Posted on

Live Web | Mid-Term Proposal

I’d like to take this opportunity to put the tools I’ve learned in this class thus far toward a project I’m starting to develop for Design for Discomfort. The idea is to build a series of interconnected web pages filled with visual tactics commonly found all over the web to invoke extreme sensory discomfort toward its users. I also want to address our interactions in these systems and the resulting feedback loops that affect our actions. I aim for it to serve as somewhat of a timestamp for the current state of our digital landscape, and my hope is that it will prompt people to think about this kind of emotional/behavioral stimuli we all subject ourselves to.  What comprises our contemporary digital landscape? Is it different from, say, 10 years ago? If so, how is it different? What has changed, why, and what does it say about us in terms of what we are using the web for, what we want the web to be, and what it might become in the future?

Posted on and Updated on

Live Web | Class 4 | Camera & WebRTC

For this assignment I was most interested in understanding how image data was encoded into the base64 format. I spent a great deal of time attempting to understand exactly how this was done so that I could alter the image pixel data with code. So far I’ve been unsuccessful, and I have no idea why my current code doesn’t work, but perhaps I’m close, or maybe I’ve led myself completely astray.

First of all I learned that images are a binary data format. In the simplest case the color of each pixel could be represented by a single bit (black or white), but the PNG I am sending as base64 uses 24 bits per pixel (I think), representing a whole spectrum of RGB values. Knowing this, I then read up on how base64 encodes such binary data. The way I understand it is that the encoder takes the binary data in 24-bit (3-byte) chunks, divides each into four 6-bit chunks, and maps each 6-bit value (64 possible values) to an ASCII character via the base64 index table. In the end the long ASCII string representing all the data in the image is “padded” with = or == to indicate that the final chunk contained only 2 or 1 of the usual 3 input bytes.
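This understanding is easy to sanity-check in Node with `Buffer` (the helper names here are illustrative, not part of the project code):

```javascript
// Encode raw bytes to base64: each 3-byte (24-bit) group becomes four
// ASCII characters; "=" / "==" padding marks a final group of 2 / 1 bytes.
function toBase64(bytes) {
    return Buffer.from(bytes).toString('base64');
}

// Decode a base64 string back to raw bytes.
function fromBase64(str) {
    return new Uint8Array(Buffer.from(str, 'base64'));
}

console.log(toBase64([77, 97, 110])); // "Man" -> "TWFu" (3 bytes, no padding)
console.log(toBase64([77, 97]));      // 2 bytes -> "TWE=" (one "=")
console.log(toBase64([77]));          // 1 byte  -> "TQ==" (two "=")
```

A useful debugging step this enables: decoding and immediately re-encoding without the shuffle should reproduce the incoming string exactly, which isolates whether the breakage is in the round trip itself or in what the shuffle does to the bytes.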

So in order for me to alter the pixel data I thought I’d need to

  1. Decode it from base64 back to raw binary
  2. Do stuff to the binary (here I applied a random shuffle function)
  3. Encode it back to base64

I am able to do this successfully, except that the resulting base64 data just returns a broken image. I’m not sure why this would happen, assuming the new base64 data should contain the exact same number of bits (right?). (One suspicion worth checking: even with the bit count preserved, shuffling the decoded bytes also scrambles the PNG’s signature, headers, and chunk checksums, so the result may no longer parse as a PNG at all.) I also made sure that the final base64 string begins with `data:image/png;base64,` and ends with an = or ==. Here’s the corresponding code:

index.html:

<!DOCTYPE html>
<html>
	<head>
		<title></title>
		<script type="text/javascript" src="/socket.io/socket.io.js"></script>
		<script type="text/javascript" src="skrypt.js"></script>
		<style>
			#imgcontainer {
				position:relative;
				float:left;
				width:100%;
				height:100%;
				margin:0 auto;
				border:solid 1px #000;
			}
			#imagecontainer img{
				position:relative;
				float:left;
			}
			#txtbox{
				width:100%;
				word-wrap: break-word;
			}
		</style>
	</head>
	<style>
		canvas{
			border:solid 1px #000;
		}
	</style>
	<body>
		<video id="thevideo" width="320" height="240"></video>
		<canvas id="thecanvas" width="320" height="240" style="display:none"></canvas>
		<div id="imgcontainer">
			<img id="receive" width="320" height="240">
		</div>
	</body>
</html>


server.js (this is where it’s all done):

// HTTP Portion
// var http = require('http');
var https = require('https');
var fs = require('fs'); // Using the filesystem module
// var httpServer = http.createServer(requestHandler);

const options = {
    key: fs.readFileSync('my-key.pem'),
    cert: fs.readFileSync('my-cert.pem')
};

var httpServer = https.createServer(options, requestHandler);
var url = require('url');
httpServer.listen(8080);

function requestHandler(req, res) {

    var parsedUrl = url.parse(req.url);
    console.log("The Request is: " + parsedUrl.pathname);

    fs.readFile(__dirname + parsedUrl.pathname,
        // Callback function for reading
        function(err, data) {
            // if there is an error
            if (err) {
                res.writeHead(500);
                return res.end('Error loading ' + parsedUrl.pathname);
            }
            // Otherwise, send the data, the contents of the file
            res.writeHead(200);
            res.end(data);
        }
    );
}


// WebSocket Portion
// WebSockets work with the HTTP server
var io = require('socket.io').listen(httpServer);

// Register a callback function to run when we have an individual connection
// This is run for each individual user that connects
io.sockets.on('connection',
    // We are given a websocket object in our function
    function(socket) {

        console.log("We have a new client: " + socket.id);

        // When this user emits, client side: socket.emit('otherevent',some data);
        socket.on('image', function(data) {
            // Data comes in as whatever was sent, including objects
            console.log("Received at server: " + data);

            var buf = Buffer.from(data, 'base64'); // decode the base64 string into raw binary

            // copy the buffer's bytes into a Uint8Array
            var uint8 = new Uint8Array(buf);
            console.log("uint8 array: " + uint8);

            // shuffle the byte array
            var shuffledArr = shuffle(uint8);
            console.log("shuffled: " + shuffledArr);

            // re-encode the shuffled bytes as base64
            var newB64str = Buffer.from(shuffledArr).toString("base64");
            console.log("newB64str = " + newB64str);

            //try adding another `=` to end of string
            var finalB64Str = newB64str + "=";

            socket.broadcast.emit('image', finalB64Str); //send to all except sender
        });

        socket.on('disconnect', function() {
            console.log("Client has disconnected " + socket.id);
        });
    }
);

function shuffle(array) { //https://stackoverflow.com/a/2450976/1757149
    var currentIndex = array.length,
        temporaryValue, randomIndex;

    // While there remain elements to shuffle...
    while (0 !== currentIndex) {

        // Pick a remaining element...
        randomIndex = Math.floor(Math.random() * currentIndex);
        currentIndex -= 1;

        // And swap it with the current element.
        temporaryValue = array[currentIndex];
        array[currentIndex] = array[randomIndex];
        array[randomIndex] = temporaryValue;
    }

    return array;
}
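Appending an extra `=` at the end is guesswork, and base64 padding doesn't need to be guessed: an encoded string's length must be a multiple of 4, and `=` fills whatever gap remains. Here's a minimal sketch of computing the padding instead (`padBase64` is a hypothetical helper, not part of the project code):

```javascript
// A base64 string's length must be a multiple of 4; "=" fills the gap.
// padBase64 (hypothetical helper) computes how many "=" are missing
// instead of always appending one and hoping for the best.
function padBase64(s) {
    var missing = (4 - (s.length % 4)) % 4; // 0, 1, or 2 for valid base64
    return s + "=".repeat(missing);
}

console.log(padBase64("QUJ"));  // "QUJ="  -> one "=" needed
console.log(padBase64("QUJD")); // "QUJD"  -> already a multiple of 4
```

Note that `Buffer.prototype.toString("base64")` already emits correct padding, so padding only needs repairing after characters have been stripped or rearranged by hand.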

I also tried performing the shuffle on the base64 string itself, without first decoding it to binary:

// HTTP Portion
// var http = require('http');
var https = require('https');
var fs = require('fs'); // Using the filesystem module
// var httpServer = http.createServer(requestHandler);

const options = {
    key: fs.readFileSync('my-key.pem'),
    cert: fs.readFileSync('my-cert.pem')
};

var httpServer = https.createServer(options, requestHandler);
var url = require('url');
httpServer.listen(8080);

function requestHandler(req, res) {

    var parsedUrl = url.parse(req.url);
    console.log("The Request is: " + parsedUrl.pathname);

    fs.readFile(__dirname + parsedUrl.pathname,
        // Callback function for reading
        function(err, data) {
            // if there is an error
            if (err) {
                res.writeHead(500);
                return res.end('Error loading ' + parsedUrl.pathname);
            }
            // Otherwise, send the data, the contents of the file
            res.writeHead(200);
            res.end(data);
        }
    );
}


// WebSocket Portion
// WebSockets work with the HTTP server
var io = require('socket.io').listen(httpServer);

// Register a callback function to run when we have an individual connection
// This is run for each individual user that connects
io.sockets.on('connection',
    // We are given a websocket object in our function
    function(socket) {

        console.log("We have a new client: " + socket.id);

        // When this user emits, client side: socket.emit('otherevent',some data);
        socket.on('image', function(data) {
            // Data comes in as whatever was sent, including objects
            console.log("Received at server: " + data);

            var slicedData = data.slice(22); // slice off the "data:image/png;base64," prefix (22 characters)
            console.log("sliced data: " + slicedData);

            var b64string, numEquals;
            // check whether the padding is `==` (second-to-last char is `=`) or a single `=`
            if (slicedData.charAt(slicedData.length - 2) === "=") {
                b64string = slicedData.substr(0, slicedData.length - 2); // remove trailing `==`
                numEquals = 2;
            } else {
                b64string = slicedData.substr(0, slicedData.length - 1); // remove trailing `=`
                numEquals = 1;
            }
            console.log("sliced data without == : " + b64string);

            var b64array = Array.from(b64string);
            var shuffledb64array = shuffle(b64array);
            var newB64str = shuffledb64array.join('');
            console.log("newB64str: " + newB64str);

            //try adding another `=` to end of string
            var finalB64Str;
            if (numEquals == 1) {
                finalB64Str = newB64str + "=";
            } else if (numEquals == 2) {
                finalB64Str = newB64str + "==";
            } else {
                console.log("error calculating number of equal signs");
            }

            socket.broadcast.emit('image', finalB64Str); //send to all except sender
        });

        socket.on('disconnect', function() {
            console.log("Client has disconnected " + socket.id);
        });
    }
);

function shuffle(array) { //via https://stackoverflow.com/a/2450976/1757149
    var currentIndex = array.length,
        temporaryValue, randomIndex;

    // While there remain elements to shuffle...
    while (0 !== currentIndex) {

        // Pick a remaining element...
        randomIndex = Math.floor(Math.random() * currentIndex);
        currentIndex -= 1;

        // And swap it with the current element.
        temporaryValue = array[currentIndex];
        array[currentIndex] = array[randomIndex];
        array[randomIndex] = temporaryValue;
    }

    return array;
}
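Shuffling the base64 characters directly works because base64 is just text over a 64-character alphabet: any permutation of the (unpadded) characters is still decodable, as long as the padding keeps the total length at a multiple of 4. It simply decodes to different bytes, which is exactly the glitch. A small demo of that idea, with a deterministic `scramble` (a hypothetical stand-in for the random `shuffle` above, so the result is reproducible):

```javascript
// Any string of base64-alphabet characters whose length is a multiple
// of 4 decodes cleanly -- it just decodes to different bytes.
// `scramble` is a deterministic stand-in for shuffle().
function scramble(str) {
    return str.split('').reverse().join('');
}

var original = Buffer.from('hello').toString('base64'); // "aGVsbG8="
var body = original.replace(/=+$/, '');                 // strip the padding
var glitched = scramble(body) + '=';                    // rearrange, then re-pad

console.log(glitched.length % 4);                    // 0 -> still valid base64
console.log(Buffer.from(glitched, 'base64').length); // 5 bytes, same count as "hello"
```

This is also why the `=` bookkeeping in the code above matters: drop or miscount the padding and the browser may refuse to render the data URL at all, rather than rendering it glitched.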