E-180: The Decay of Digital Things @ the d.school

Posted on Mar 16, 2016 in Decay

Repost from E-180 Magazine – The Decay of Digital Things visits the d.school

The Decay of Digital Things started life as a series of essays exploring the role that time and decay play in networked objects. What happens when the networks and businesses that support connected devices shut their doors? How do we design for both our users’ death and the death of the system? It was a plea for designers and developers of digital things to take responsibility for the role their creations play over time. I set out to explore this space through essays and some artistic explorations, but wanted to find an opportunity to bring more people into the conversation. Being (incredibly) new to San Francisco though, I was at a bit of a loss as to how.

While I was exploring this project, a colleague at IDEO suggested that the topic might be explored at Stanford’s Hasso Plattner Institute of Design, also known as the d.school. The d.school was founded in 2004 by IDEO’s founder, David Kelley, and has since become a major port of call for students of many disciplines at Stanford (and elsewhere) who are looking to approach cross-disciplinary problems using the tools and framing of design practice.

The d.school offers a “popup” class format that allows faculty and d.school fellows to propose new classes outside the standard curriculum, exploring new topics in non-traditional formats. A call for these popups goes out to the d.school fellow/faculty community every few months, and the proposals range from highly structured, credit-based courses that factor into the existing curriculum, to somewhat gonzo uncredited classes, to somewhere in-between. Popup classes are presented to the student body during the “Popup Fair,” which sees the different popup classes vying for student attention and signup. A class’s survival depends on the quality of its pitch and how many students it can pull in. Upon learning all this, I jumped at the chance.

Overall, the creation of the Decay of Digital Things popup class at Stanford took three broad arcs: negotiating the d.school as an outsider, collaborating with my co-teachers to design the curriculum, and working with the students to create outcomes.

Negotiation

In pitching the class to the d.school, I was given a fairly key constraint: I was required to find a co-teacher from the d.school itself. For the d.school, this serves as a layer of defence against mismatches in subject matter and method. Admittedly, I had an easier time gaining access to the d.school as an IDEO employee, but without that connection to a d.school faculty member, the class would likely have fallen short.

After a few cold calls to names on the faculty site, I was eventually introduced to Maryanna Rogers, a graduate of Stanford herself and a teacher at the d.school who was interested in the speculative design angle of the class. Once Maryanna was brought into the thesis of the class and was interested in moving forward, she helped me navigate the approval process, and we got the class on the roster for the coming semester.

Collaboration

The next step was to start designing the class itself. What did we want out of it? How did we want our students to grow? What kind of outcomes fit both the theme, and the frame of speculative design?

While thinking this through, we decided to reach out to a third co-teacher, Liz Goodman, a thinker on design practice and connected objects. One of the first things we settled on was some form of tangible outcome: something fitting the design backgrounds of the teachers, as well as the need to push folks towards translating an idea or provocation into a tangible object.

We started looking for a gallery space where we could host the students, and eventually found one in Workshop Residence, a gallery and craft space in San Francisco. Despite not knowing how many students we might get, what their skillsets might be, or how they might engage with the class, we had resolved at least to ask them to create something tangible and physical to articulate an otherwise abstract idea about time and digital materialism. This desire to challenge the students became a strong asset once the class was underway.

Co-creation

Going into the course, we had designed the class time to focus on co-creation and fast prototyping of concepts as a route to understanding. We had created a small digital card game (playable at http://cards.decay.io) to help students brainstorm scenarios for their final project, which would be displayed in the gallery. By playing the card game, the students were able to connect the abstractions underpinning the class to events or situations in their own lives much more quickly, which in turn helped them rapidly come up with ideas for the final work for the gallery show. The added benefit was that the students challenged our understanding of the class and subject matter as well. While we had initially been focused on IoT-enabled objects, for example, several of the students focused heavily on the notion of “decay” within social networks, and how to celebrate or cope in situations of lost love or death. In a different vein, one of our students (who now teaches at the d.school) explored the role that ritual might play around objects with “intelligence,” and how we might be emotionally affected by the death of an object that was also our companion.

The class ended on a very strong note, with the students rising to the challenge of creating a series of individual and team-based installation pieces that were showcased at the Workshop Residence gallery. The popup format at the d.school gave me an incredible opportunity to glimpse inside, learn from, and contribute to an institution that otherwise can be pretty difficult to parse. Popup classes are a fascinating and accessible form of low-risk experimentation for schools. Especially in design programs where prototyping is the word of the day, having an accessible way to prototype new subjects and teaching methods goes a long way.

Decay: Bacterial Computing

Posted on Apr 24, 2014 in Decay


Computers are ecosystems. Tiny Unix programs support the workings of massive GUI-based applications, hidden processes spin in tandem to create useful (or useless) heat and information, and deeply buried systems check from time to time to return computers to some factory-defined state.

But since we created these artificial jungles of transistors and computation, we’ve also had the virus. Slipping across networks and self-replicating within systems, computer viruses are at once a stunning example of engineering virtuosity and a stark reminder of the strengths and weaknesses of the network.

My suspicion is that the computer virus, despite being a product of human creativity like all computational things, is a natural and perhaps necessary part of the networked computational ecosystem. Viruses slow down processes, corrupt data, can cause physical damage (as in the case of w32.Stuxnet), and persist in PRAM and BIOS chips long after one might think the system had been wiped to a blank slate. They can cross ecological boundaries, as in the case of Stuxnet’s USB jumping or the 2006 case of Apple iPods accidentally shipping with a Windows virus. They grow more resilient through their ongoing fight with the personal computer’s leucocytes: viruses equipped with polymorphic or metamorphic engines obfuscate the lens of anti-virus software, enabling a broader spread and slowing meaningful response on the part of those who craft these defences.
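
To make the polymorphic trick concrete without building anything dangerous, here’s a toy C sketch; everything in it is invented for illustration, and it replicates nothing. The same sixteen-byte payload is re-encoded with a fresh random key each “generation,” so its stored byte signature never repeats, even though decoding always recovers identical content.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

#define LEN 16

/* XOR a buffer in place with a one-byte key: applying it
   twice returns the original bytes. */
static void recode(unsigned char *buf, size_t len, unsigned char key) {
    for (size_t i = 0; i < len; i++)
        buf[i] ^= key;
}

int main(void) {
    unsigned char payload[LEN] = "do something odd"; /* a benign stand-in */
    srand((unsigned)time(NULL));

    for (int gen = 0; gen < 3; gen++) {
        unsigned char key = (unsigned char)(rand() % 255 + 1); /* never zero */
        unsigned char copy[LEN];
        memcpy(copy, payload, LEN);

        recode(copy, LEN, key); /* encoded form: different bytes every generation */
        printf("gen %d key 0x%02x signature: %02x %02x %02x ...\n",
               gen, key, copy[0], copy[1], copy[2]);

        recode(copy, LEN, key); /* decoded form: identical content again */
        printf("  decoded: %.16s\n", (char *)copy);
    }
    return 0;
}

Real engines mutate the decoder itself as well, but even this sketch shows why a fixed byte signature is a fragile thing for a defender to rely on.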

Even with these analogies to the natural world, though, the computer virus is a conscious creation on the part of an individual or organization: something released into the world to accomplish a task, to execute some mischief, or to further some malicious intent. With this frame, how can the virus be something both natural and valuable in a networked and constructed ecosystem?

Viralmimicracy

I want to explore two very small things: viruses and bacteria. Viruses operate by infecting a host and taking over its mechanisms for reproduction and other purposes. This process can kill or damage the host, but it can also benefit the host through greater overall resilience and other qualities that emerge from contact with the viral infection.

In contrast, bacteria are single-celled organisms that thrive in different organic environments, causing a range of effects within them. Bacteria serve as a primary mechanism of decay in biological systems, for example, and some of the same bacteria help us digest food.

Contemporary human craft often rebels against the notion that these qualities might be positive. In physical products, decay can manifest through material degradation or obsolescence: the cultural context for the object has shifted. In computational systems, decay often manifests through informational entropy: the value of information changing, becoming lost, becoming unsearchable, or losing its connection to its network and its context. In most cases, the effect is treated as almost entirely deleterious to the host.

So what happens when we try to create beneficial viruses (which is where I’m going with all of this)? In 2003, a hidden battle (of sorts) propagated across millions of systems worldwide. The Blaster worm (W32/Lovsan.worm) and the Welchia worm were two pieces of malware with different intentions. Blaster was malicious, infecting Windows systems through a vulnerability in Windows’ remote procedure call service, but Welchia was different. Welchia infected systems through the same mechanism, but the payload it released on successful infection installed beneficial Windows security patches. It also closed the hole it had entered through, and deleted itself after a time.

Despite this good intent, Welchia was a big problem for a lot of people: messing up system configurations, tying up network traffic, and restarting systems once its well-meaning payload was installed. One can’t help but wonder, then: is a beneficial virus possible? Or might we use the self-replicating and invasive qualities of the virus as a template for what might be a natural component of the lifecycle of a system? Perhaps we can understand such a thing as serving some of the roles that bacteria hold in living systems: enabling healthy decay, building resilience in the host against its environment, and partnering in the host’s core functions.

Computational Bacteria

I wonder how a viralmimicracy movement in computer science and systems design might emerge. Perhaps we might develop applications and operating systems that were susceptible to viruses, if only to combat the glut of legacy systems emerging with cheaper computation and greater access. Perhaps this becomes more aggressive: applications shipped infected and ready to infect, doing harm to competitors and setting up an environment habitable to the new application. Perhaps we might do as Stuxnet did: target specific problems by blanketing networks of generalized machines? (This seems a recipe for disaster.)

Let’s explore computational bacteria. Might we shift away from the modern monolithic application environment, returning to the earlier days of command-line programs and piped (|) information transfer? An application is no longer a single thing, but a fog of information, computation, and modular utility. Could a network of computers pass good health and maintenance practices to each other, making everyone stronger as a result? Such programs would be viral in their capacity to spread, but bacterial in their neutrality, emergent utility, and otherwise banal qualities.
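
To gesture at what that fog might be built from, here is a sketch of one such organism in C (the framing is mine): a filter that reads lines from stdin, stamps each with the current time, and passes it along to whatever tool sits next in the pipe.

#include <stdio.h>
#include <time.h>

int main(void) {
    char line[1024];
    while (fgets(line, sizeof line, stdin)) {
        char stamp[32];
        time_t now = time(NULL);
        strftime(stamp, sizeof stamp, "%H:%M:%S", localtime(&now));
        printf("[%s] %s", stamp, line); /* each line keeps its own newline */
    }
    return 0;
}

Compiled as, say, stamp, it composes with its neighbours (du -sk * | sort -n | ./stamp); no single program is the application. The pipeline is.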

With that in mind, I created a small bacterium. “OSX/mom” will make sure things don’t get too messy around here, but might frustrate you a bit in the process. It’s a simple C program that is fired manually and does a little cleanup right now, but it might one day be fired by launchd or exist as an XPC service in the OS X environment. It might be bundled with other tools tied to the intent of better system management: memory cleaners, disk space watchers, etc. It might be one amongst hundreds or thousands of tiny programs: piping data into each other, serving little functions, and making your computer, your network, your computational ecosystem just a little bit stronger every day.

Explore OSX/Mom here: https://github.com/readywater/osx-mom (I’m still tweaking it, so it might not run…)
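
For flavour, here’s a minimal sketch of the idea rather than the repository code (the week-long threshold and the nagging behaviour are assumptions of mine): sweep a folder and complain about anything left untouched too long.

#include <stdio.h>
#include <dirent.h>
#include <sys/stat.h>
#include <time.h>

#define WEEK (7 * 24 * 60 * 60)

int main(int argc, char *argv[]) {
    const char *dir = (argc > 1) ? argv[1] : "."; /* folder to mother over */
    DIR *d = opendir(dir);
    if (!d) { perror(dir); return 1; }

    time_t now = time(NULL);
    struct dirent *entry;
    while ((entry = readdir(d)) != NULL) {
        if (entry->d_name[0] == '.')
            continue; /* leave hidden files alone */

        char path[1024];
        snprintf(path, sizeof path, "%s/%s", dir, entry->d_name);

        struct stat st;
        if (stat(path, &st) == 0 && now - st.st_mtime > WEEK)
            printf("mom says: '%s' has been sitting here for %ld days.\n",
                   entry->d_name, (long)((now - st.st_mtime) / 86400));
        /* a bolder bacterium might call remove(path) here */
    }
    closedir(d);
    return 0;
}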

And a few links and resources I found or was pointed to!

Thanks to @admsyn, @LadyOniyide, @clintonfein, @ilesinge, @trullyhansen, and others for pointing me at some compelling examples of computer virus art.

Decay: Stanford d.school class and GAFFTA talk

Posted on Mar 16, 2014 in Decay

First, I will be giving a short talk this Monday at the GAFFTA Creative Code meetup on the Decay of Digital Things. I’m hoping to have things turn into a conversation around the project and where it can go!

Second, we’re excited to announce that the Decay of Digital Things will be a popup class this May at the Stanford d.school. Updates will be posted at Decay.io/dschool.

Together with Maryanna Rogers and Elizabeth Goodman, we’ll explore the Decay of Digital Things with Stanford students in a studio workshop format, culminating in a one-night show of the results in San Francisco.


Decay: Haunted Machines

Posted on Jan 30, 2014 in Decay


The world keeps tabs on us. The recent revelations about the NSA and other organizations reinforced the impressive scale of this tracking, and showed the contrast between modern techniques and those of the past. The expiration date on a driver’s license is a tool for biometrics. Registering to vote is a mechanism for observation. But now, we are so much better at it.

I want to explore whether these systems can rot from within. One of the most lauded qualities of software is its ability to be “upgraded”, often without physical change. New software is written, drivers are updated, techniques are refined, etc. But the reality is one of legacy systems holding a controlling interest in our interactions: old infrastructure running outdated software designed for calcified human systems.

What happens when our new software – with massive databases and sophisticated learning models – is left for ten, twenty, thirty years or more? Core to the human experience is that we grow and learn while having only limited foresight into our future state. What if we asked the same of computational systems?

A human baby learns to recognize faces early, and there is some debate as to whether facial recognition is hard-wired into the human brain, or something that’s entirely learned. In either case, our ability to recognize faces means understanding key physical form (that a nose goes here and not there), understanding how these forms are abstracted across categories (this is a human nose and not a llama nose), and tying meaning to specific members of that category based on unique traits (your nose is different from my mother’s nose, so you can’t be my mother). These abilities make us highly specialized systems for understanding and reacting to the state of other human beings, and form some of the underlying mechanisms for navigating community, others’ emotions, and relationships.

Computers are not as good at this. A specially-configured computer might recognize a human face in an image, or a differently configured one might recognize – or even learn to recognize – a number of specific faces, but neither approaches the kind of adaptable specialization that humans demonstrate.

Implicit in the language of configuration is a kind of permanence that learning navigates around. The human condition is one of constant, often monumental, change that manifests itself in countless external and internal ways. A computer might be configured to interpret the world like a professional forensics artist, for example, and project how features might change over time, but it would be ill-suited to speculating on the details of that change, and likely not even considered for that task. I wonder if we can truly build computers that can grow along with us, even if we configure them with those core capabilities.

Paranoid Computers in a Haunted World

I suspect that “learning” can be a kind of decay within a system. Learning might manifest itself through unexpected behaviour that drifts from a configured state, as if by dead reckoning. Just as an analog synthesizer needs to be tuned like the instrument it is, maybe the learning machine can drift out of tune through its own functionality.

Let’s imagine a system that could learn to interact on a pseudo-emotional level, had a database for remembering people, and could improve its interactions by “learning” from each encounter with a user. The system tries to identify a person visually, interprets their emotional response, self-corrects, and tries again. Like a person, it learns by re-configuring itself based on experience.

Let’s say it’s a concierge system for a condo parking lot.

But what if it was mostly unable to recognize faces as it was designed to: its lens is distorted, or maybe its image processor is poorly coded. Maybe it’s simply old. This distortion causes the computer to collect false or semi-formed experiences. A series of cameras in a parking lot – now a normal thing – algorithmically squint at detected movement, trying to make sense of the known and the unknown. The sun sets, its lens flares, and suddenly everyone looks like that one visitor two weeks ago who was allowed in that time, right?

How would that computer – confused and confronted by aggressive users reacting to false negatives – reconfigure itself, and react to its inability to make sense of the world it was configured for, but can’t quite see? By trying to absorb a context based on a skewed understanding of the world, the computer is experiencing a kind of decay. It slowly drifts away from its original configuration and inadvertently finds itself in a state of obsolescence.
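
A toy simulation makes that drift visible. In this hedged C sketch, where every constant is invented, the system keeps a single match threshold and dutifully “self-corrects” it toward whatever it measures; because the distorted lens biases its readings low, ten years of earnest learning walk the threshold away from its factory setting.

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    double threshold = 0.80;   /* factory-configured match threshold */
    const double rate = 0.01;  /* how hard it "learns" per encounter */
    srand(42);                 /* fixed seed: the same drift every run */

    for (int day = 1; day <= 3650; day++) {
        /* the distorted lens: similarity readings biased below 0.80 */
        double u = rand() / (double)RAND_MAX;
        double measured = 0.80 + (u - 0.65) * 0.2;

        /* dutiful self-correction toward the skewed evidence */
        threshold += rate * (measured - threshold);

        if (day % 365 == 0)
            printf("year %2d: threshold has drifted to %.3f\n",
                   day / 365, threshold);
    }
    return 0;
}

Nothing in it is broken in the conventional sense. The decay is the learning rule doing exactly what it was configured to do, against a world it can’t quite see.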

I find myself feeling a weird empathy for these kinds of machines. It’s an empathy for a simple thing caught unawares, almost like Charlie Stross’ semi-sentient Lobsters serving as confused artificial intelligence models for the human-centered internet. There is no easy solution for legacy software and systems. So how do we deal with them?

Maybe we exist with old systems the way that we will (hopefully) exist with our aging human population: with respect and inclusivity. An aged facial recognition system might be bolstered by proximity, the new media pose, or a patient pause of one’s facial expression. Or perhaps by an understanding that some names need to be spelled phonetically, or pronounced as such.

Bruce Sterling suggested our future is one of “old people in big cities who are afraid of the sky.” I think this prediction might be applied to old cities with too many people that struggle for purpose, as well. As we continue to grapple with the challenges of legacy systems and their moldering infrastructure, perhaps we ought to also develop a practice of patient interaction design to deal with this future gracefully on both fronts.

Decay: Sensor Ghosts

Posted on Jan 16, 2014 in Decay

In high school, a friend once told me that a pair of sneakers hanging from a power line marked a place where drugs were sold. I later found out that this (mostly) wasn’t the case, but I still notice them dangling in the wind, as though they had meaning.

Lately, our shoes have become a vector for reading our behaviours. Tools like Nike+ currently exist as small sensor pods that attach to our shoes and talk to our phones, but in proposals and patent drawings, a future is being sketched of washable computers and fabric-like sensors, with these once-detachable objects embedded directly into our clothing. And like any other fashion, if it wears out in style or in structure, it is discarded.

The MIT Senseable City Lab explored the strange life of the discarded with their project Trash|Track. For this project, the lab designed a circuit called Trash Tag to be embedded in the refuse – including an old shoe – which worked by responding to movement and broadcasting a cellular signal to local towers. This signal was triangulated by the service providers, and sent back to the lab for analysis.

I’m wondering what the “fully embedded” future might look like, versus one that needs to be attached. If the seams of a piece of clothing can be designed as an antenna, then anything from a shirt to a shoe can be made to record and report. Presumably this will be in no way nefarious: these sensor-clothes will analyze our posture, or track how many calories we’ve burned today. They’ll measure heart rates, buzz when they detect an open Wi-Fi network, or change shape depending on the temperature.

The electronics sewn into our clothes will be cheap, might be powered by microelectromechanical systems (MEMS) or inductive laundry baskets, and will be utterly discardable, like our clothing is now.

Recycle lost sensor networks

So what happens after a wearable is discarded?

In my neighbourhood, there are a fair number of forgotten shoes jostling each other from their wiry perch. Since starting to think about this project, I’ve been wondering if they could talk to each other, or teach us something from their vantage point, were they suitably enabled with software and sensors.

What if the networked waste we’re creating continues to broadcast into the ether? What if their MEMS-enabled batteries and ultra-low-power processors kept doing what they were designed to do: reporting on location, or movement, or localized heat, or brightness? They might send that data to a networked server, or ping helplessly for a phone or router they think is nearby.

I want to think about recycling digital things as not just reusing their components, but recycling or upcycling their purpose. I’m wondering if there’s some way to turn the detritus of our vogues and narcissism into a localized story. Maybe we could create a beacon for these discarded smart things: giving dead wearables a new life through access, organization, and transparency. A cheap computer, some clever routing, and a community catalogue of wireless protocols and APIs might be all we need to turn once-private products into a localized and public sensor network. Like a digital version of Jane Jacobs’ eyes on the street, these discarded smart things just might teach us something incredible.
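
As a sketch of how such a catalogue might behave (every beacon ID, protocol name, and capability below is made up), a small C program could match overheard IDs against a community-maintained table that says how to listen and what each orphan can still offer.

#include <stdio.h>
#include <string.h>

struct catalogue_entry {
    const char *prefix;    /* beacon ID prefix a vendor uses */
    const char *protocol;  /* how to listen to it */
    const char *offers;    /* what it can still report */
};

/* a tiny, hypothetical community-maintained catalogue */
static const struct catalogue_entry catalogue[] = {
    { "SHOE-", "ble-adv-v1",  "step counts, rough motion" },
    { "TAG-",  "cell-ping",   "coarse location"           },
    { "SHRT-", "nfc-passive", "temperature at last read"  },
};

static const struct catalogue_entry *lookup(const char *beacon_id) {
    for (size_t i = 0; i < sizeof catalogue / sizeof catalogue[0]; i++)
        if (strncmp(beacon_id, catalogue[i].prefix,
                    strlen(catalogue[i].prefix)) == 0)
            return &catalogue[i];
    return NULL;
}

int main(void) {
    /* pretend we overheard these from the power lines */
    const char *heard[] = { "SHOE-00214", "TAG-98f2", "UNKN-7777" };

    for (size_t i = 0; i < sizeof heard / sizeof heard[0]; i++) {
        const struct catalogue_entry *e = lookup(heard[i]);
        if (e)
            printf("%s speaks %s and can still offer: %s\n",
                   heard[i], e->protocol, e->offers);
        else
            printf("%s is a ghost we have no record of yet\n", heard[i]);
    }
    return 0;
}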


Decay: Printing Things

Posted on Jan 13, 2014 in Decay


I make things. A restlessness sets in when I’m not using my body to manipulate some tool – be it a computer, a kitchen knife, or my bicycle. And I’m far from alone in this compulsion. What I hadn’t expected was to share this compulsion with machines.

Machines like the BERG Little Printer and the MakerBot genus share this compulsion, and project their compulsion onto us. More than any other type of tool, they evoke a style of creation that is specific to their nature, instead of complementary to ours. This makes it easy to get started within the frame they create. Whether by API or CAD, these tools are made to make, and we become their managers.

Constructor

The Little Printer is a powerful case study in managed creation, or “instancing.”

The Little Printer emerged from the theory of a Social Printer as articulated by Matt Webb, suggesting that the printer itself isn’t what’s valuable, but rather the appearance of an artefact tied to my tribe. The mail suddenly becomes mediated by the mailbox, and the mailbox’s role is to describe and manage the mail. The mail becomes an instance of the mailbox.

You subscribe to a variety of feeds via Berg’s web app, and from that point on, your Little Printer will print out the contents of those feeds on whatever schedule you’ve set. The printer uses thermal paper, which has information printed on its specially-treated surface by a heated print head. Berg points out that they use recycled and BPA-free thermal paper, and they’ve made clear efforts to be environmentally conscious.

~Destructor

The question is what happens after the printing. There is no method to remove this instance from the world. The Little Printer is a thing that creates without giving an option to destroy the creation when its usefulness is over.

What I want is a Little Printer with a Destructor Function. I want the Little Printer to take responsibility for its waste – not burden me with it. Just as well-written software cleans up after itself by freeing the memory it was using on my computer, I want the well-designed Making Machine to clean up its own mess.
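
The memory analogy translates almost directly into code. In this minimal C sketch (the print_job type is hypothetical), every function that brings an instance into the world is paired with one that takes responsibility for removing it.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

struct print_job {
    char *content;  /* the artefact this machine instanced */
};

/* constructor: bring an instance into the world */
struct print_job *print_job_create(const char *content) {
    struct print_job *job = malloc(sizeof *job);
    if (!job) return NULL;
    job->content = malloc(strlen(content) + 1);
    if (!job->content) { free(job); return NULL; }
    strcpy(job->content, content);
    printf("printed: %s\n", job->content);
    return job;
}

/* destructor: the machine takes responsibility for its waste */
void print_job_destroy(struct print_job *job) {
    if (!job) return;
    printf("reclaimed: %s\n", job->content);
    free(job->content);
    free(job);
}

int main(void) {
    struct print_job *job = print_job_create("today's headlines");
    /* ... the artefact's usefulness ends ... */
    print_job_destroy(job);  /* no orphaned instance left behind */
    return 0;
}

A Little Printer with a destructor would be print_job_destroy made physical.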

So let’s start with the quintessential “Bad Idea.” I present to you, Little Printer: self-powered edition.


Decay: Designing for Decay

Posted on Jan 11, 2014 in Decay


What happens when things are capable of navigating their lifespans? I wonder what those things would be like.

Would they be buildings that pay their own property taxes with their energy savings?

Or smartphones that knew where their recycling plant lived, and guessed at when they would arrive there? What if that recycling plant knew when to expect them?

Decay

In the biological world, decay is a physical phenomenon in which materials break down into simpler forms of material and energy.

But in the culture of making objects, decay is a multidimensional problem. The physical decay of an object fails to sync up with the behavioural, cultural, or digital decay of that thing.

Decay becomes the natural output of an ecosystem of use, disuse, and obsolescence dictated not by material, but by software and consumer expectations of software behaviour. This decay is taking the form of obsolescence and apathy: a world of forgotten things with short lifespans and nowhere to go afterwards.

The danger is that culture rot claims the utility of objects before material rot ever does, and the physical casings that held the once-functional circuits and software can take an eternity to decay.

To combat this, decay must be reframed as inherent to the value of an object. This can be done by situating time as something that adds value (or detracts by its absence), and by challenging the emerging anonymity and replaceability of network-connected objects.

We want to enable a graceful ecosystem of creation, decay, and rebirth in a software-infested and thing-saturated world.

Viridian Inspiration

I suspect that designing for decay means designing Viridian things.

In 1998, Bruce Sterling set out on a decade-long journey called the Viridian Design Movement. One of its many goals was to address the failures in the communication design of existing environmentalist movements, and to develop an interface for designers to become not green, but Viridian (i.e. effective).

His 2005 book, Shaping Things, synthesized much of the Viridian movement into a manifesto reminiscent of Machiavelli’s The Prince. Razor-focused in its intentions, accessible, and action-oriented, Shaping Things gives its (possibly unwitting) readers a toolkit for parsing the past and present through Viridian goggles. For designers in particular, this means changing the way we make decisions for our world.

Viridian things are shaped by many principles, and I’ve come to focus on two in particular:

“Avoid the Timeless, Embrace Decay” and “Planned Evanescence”.

A third principle, “Be When You Are”, serves to ground this exploration. I’ll explore decay through the lens of computationally-enabled things and contemporary technology: a subset of “Gizmo” culture that Sterling references in Shaping Things.

I’m looking forward to this journey with you.