Here’s the edited transcript of a small new talk I gave this week at the delightful WordHack, a three-year-running event at BabyCastles in New York. The first formal presentations were by Taeyoon Choi, co-founder of SFPC (School for Poetic Computation) and an inspiring artist and thinker I look up to, and Andrew of HAWRAF, the design firm behind the gorgeous Poetic Computation: Reader.
Empty Models/Flattened Language
(Thanks to Todd, BabyCastles, and WordHack for inviting and hosting me).
So I was a research resident at Eyebeam last year, working on a series (nearing its close!) about chatbots and narrative design and engineering, which led me down a ton of rabbit holes: interface design, how the design of comment threads and interfaces reflects ideology, and how these interfaces flatten us.
So my background is as a fiction writer, and my start in critical writing was in games and game design, so those two seemingly dissimilar things have often played off of one another. The limits of games and character design and the limits of poetry frequently run parallel. I like to look for moments when software and interface design produce something like poetic computation (Taeyoon!), where limits produce generativity, or beauty, or newness.

This presentation came out of a conversation I had at DLD, a massive gathering of ex- and present-Silicon Valley folks and tech entrepreneurs from throughout Europe and the world. I got into a conversation with someone from the Netherlands who had spent a year living in San Francisco and found the whole experience appalling. She wasn’t familiar with homelessness or poverty or addiction, or the degree to which it is on display there. I told her about an experience I had going to speak at a festival, full of lovely art and tech people, all very earnest, and passing block after block of homeless encampments, and feeling that very shameful twitch of: I should be doing something, and something is wrong here, going to talk about the future while passing the abyss of the present. Something is definitely wrong.

And she asked me for my opinion of the city and of homelessness and poverty and addiction in America, small topics, and how it was that the startups didn’t see what was next door, and we got into this conversation about neoliberalism and the history of Silicon Valley. She asked, how would you get someone rich in a city like S.F. to understand the lost context of, say, city zoning and history in American cities – so that they’d understand opportunity and success are not a pure matter of willpower? An augmented reality map that, depending on which neighborhood you went to, chronicled the history of zoning, of business investment, of the divestment of public education funds? So the facts are visualized?
And she said, we have those maps, but people still choose to model the world as they want. What she was saying, to me, is that modeling, or the assumed baseline model, affects language – affects, shapes, generates how we speak about the people around us. The model shapes the story we tell about them. The model, in much game design and social interface design, is this:
It is the body without history and the body without politics. The 3D model, empty when you cut away, is without history or politics or backstory. This body exists on an equal plane with all other bodies, all with essentially the same ability, or, if deficient, the ability, through sheer will, to conquer that “lack” and become a “real,” full model.
I think of this amazing quote from an interview with Fred Turner that traces some of the history of this modeling. He notes that the ideology driving engineers is one of a world without politics.

In this world, you can have different-enough-looking models, a whole palette of representation: so visually, yes, different ages, ethnicities, genders, orientations, weights, heights, levels of ability, all of which slot fluidly into the game’s mechanics. There is no friction. You have no history. You are just a body moving through space.
A young girl, though, is on the same plane as a grown woman,
Who is on the same plane as a woman from another ethnicity of more or less privilege, which is hidden, which isn’t factored into the model, the way it is carried in our actual flesh.
And we all are on the same plane, ready to go, loaded up with willpower and strength and a good attitude to pull ourselves up by the bootstraps.

The XYZ plane extends out as far as the eye can see, asking to be filled up with action, movement. No trace is left behind.
I see this idea of the model translating into much of interface design and how we interact with strangers online. I spend a lot of time on social media – and, before that, on game forums – and I think about how all those accumulated hours have affected my writing and speech style. The interface demands you speak about yourself, position yourself, and that produces a particular kind of narcissism and self-involvement that makes you want to either disappear or claim yourself. All of these additional layers we pass through to come to one another show up in how you speak about yourself.
So how do we keep this blank model in mind – what is lost – as we enter these social spaces? Digital interfaces orchestrate affect and emotion, and careful design can potentially intervene in digital communication along the front of the interface. How complexity leaks out through the design of the interface weighs on me – namely, the complexity of the human being you are speaking to. There is the spectator, passive, cesspool feeling you get in threads that struggle with “the political” – where the flattening of people out into identity markers, camps, types of language and language use reveals that, of course, we’re not living in an apolitical world. And the primary political and social violence and eruption we are seeing often plays out in these interfaces, between camps that believe we live on an XYZ plane and those that insist and know we don’t.
Example One: a thread on the appropriation of dreadlocks, hosted by someone we might all know, who is a meme master and genius. This thread is public, but I’ve blocked out the names, just for reasons. The conversation is between a white woman who claims “we are all just people,” and multiple individuals of color – scholars, academics, activists – who, with way more grace and patience than anyone should ever be expected to have, parse facts for her, history for her, as a kind of public service. I come back to this thread sometimes because I can’t get over her final reply (not seen in this image) – that those coming at her with history have “hate in their heart” – and how the structure of the thread allows for that escape.
When I look at these threads, I think of how they flatten us out, because we have access only to the avatar of this clownish OP (“OP is a clown”), and then I think about how people become avatars in real life. I go from this interface out into the world, and people become flat. What comes out is this desperate need to explain one’s history, one’s background, one’s experiences – you know, all the things that make you a person, through narrative – which then also become circulated, flattened, fodder for ridicule. That history becomes “hate in one’s heart” is enough to make you throw your hands up in the air and want to disappear.
So the resentment I see in this thread, the counter-defense, is a response to the fear of losing control over one’s identity. Could nontoxic antidotes to this fear be designed? Currently, design decisions for major platforms are driven to maximize efficiency. But how a platform’s interface changes our capacity for nuanced communication amongst diverse social groups is much less explored. Can interfaces be designed to encourage deep narrative dives into others’ lives?
In a proposal for an upcoming project I wrote with designer Aiwen Yin, we came up with the following questions to explore. I’m not an engineer or a designer, but I’d encourage both to think about the possibility of nuance, digression, lingering, and slowing down in your questions:
Can better conversation chemistry, flow, and nuance between two radically different people be found through non-punitive design for more just emotions: love, compassion, empathy?
Could we map interface possibilities in which interlocutors at odds find the space to make an effort to understand each other?
Could we map counter-dialogues that value non-economically beneficial goals, like productive digression, or the valuable lingering which we encourage, even privilege, in offline social interactions, for their power to create social bonds across ideology?
We had originally created the proposal for a contest around the theme of Resentment, for Triple Canopy. We wrote of how “online political wars bleed out or absent vital historical context like proof of systemic and institutional oppression, such that all resentments appear on the same plane,” and how “the alt-right’s conspiracies of a global racialized war waged through identity politics are tendered in the same space as those fighting for justice for police brutality.” Is the flattened plane the issue? How do you make anyone listen to facts? Are all resentments created equal?
And what design choices would encourage lingering? There’s the possibility of slowed-down replies – a ten- or twenty-minute hold before you can tap into a thread. There’s the possibility of a slowed-down scroll, or of punishments for violent language. I think of the friction produced by rougher, free communication apps – my use of Signal on Android stalled when I couldn’t bear the sharp edges and orange emergency coloring, which made conversation flow less smoothly than in, say, Instagram DMs. We might think about how radical intolerance, how trolling and aggression, are valued and rewarded by the feed, the scrolling in one direction, the spartan brutality of threading.
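To make the first of those choices a little more concrete, here is a minimal sketch of what a reply hold might look like – everything in it (the Thread shape, canReply, HOLD_MINUTES) is invented for illustration, not any real platform’s API:

```typescript
// Hypothetical sketch of a "slowed-down reply": the interface holds a reply
// until the reader has sat with the thread for a while. Thread, canReply, and
// HOLD_MINUTES are invented for illustration, not any real platform's API.

const HOLD_MINUTES = 15; // somewhere in the ten-to-twenty-minute range above

interface Thread {
  id: string;
  openedAt: Date; // when this reader first opened the thread
}

function canReply(thread: Thread, now: Date = new Date()): boolean {
  const minutesWithThread = (now.getTime() - thread.openedAt.getTime()) / 60000;
  return minutesWithThread >= HOLD_MINUTES;
}

function submitReply(thread: Thread, text: string): string {
  if (!canReply(thread)) {
    // Instead of posting immediately, the interface asks the writer to linger.
    return "Your reply is being held – sit with the thread a little longer.";
  }
  return `Posted to thread ${thread.id}: ${text}`;
}
```

The point of the sketch isn’t the timer itself but that the delay, rather than the post, becomes the designed object.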
So Fred Turner then goes on to say that building out our expressive abilities won’t be the solution, nor will making better technologies for expression. Only attacking social infrastructure will do anything ever. He writes:

I think I disagree, in that how we speak and think about each other online is exactly what bleeds into the social sphere, the one-to-one mapping of identity to politics, of what you look like to what you are capable of, which is based on that empty model among others on the XYZ plane. How we are flattened out through language and models is political, too, and changing technology changes how we think and speak to each other. If corporate interface design favors the transactional over the non-utilitarian, then we can design alternatives. If the flat conception of identity as a static object leads to flat communication interface design, then we have to render fuller conceptions of identity.
That said, I think the way these interfaces and systems flatten is also extremely productive, because even as I tell my story, I am aware of its poetic resonance with a thousand others. In the flattening is a pressure to keep describing yourself compulsively, to tell others about what you have been through to get to where you are. This is a pretty American obsession; we’re obsessed with origins and the making of stories. So for me:
I am a daughter of parents from a Third-World nation called Bangladesh which you know as desperately poor and they came here with no money to go to school in the middle of a war and then graduate school and then built their lives and mine piece by piece through unimaginable … ETCETERA UGH, UGH, UGH, I’m sick of myself for even trying to attempt telling their story but why shouldn’t I but why should I when it will be material for curators or employers or schools or culture industry to frame me in words that aren’t my own narrative that isn’t my own (LOOP INDEFINITELY)
Let’s try again. I am a writer and I love digital art and fiction and poetry and computers and games and flirting with the intense communities around these interests. I like yoga and trap and a combination of the two. I obviously live in Brooklyn. This also sucks, ugh
AGAIN. I am a dot of light spinning in a void. I am fifteen thousand years old. I am flotsam on the river of time
I am no one. You don’t really know me. Please leave me alone. Fin
There’s something like poetic pressure in those limits – a move to be opaque, to evade, to actually push language and self-naming into complexity, and to insist on complexity and unknowability. You pressure a poet to define herself and she will give you back something infuriating, something that can’t be slotted into the interface very well, or into a button click. “I am no one” is not an acceptable interface answer, but to me, in my mind, it is perfectly true. It depends on how I feel from day to day. From minute to minute.
Another productive limit scenario is image captioning. Google’s image captions being lauded as “properly captioning to 94%” what is in an image is fun marketing, but it’s also pretty good humor and material for writing, which I’d like to use in a class down the line. You can take these banal descriptions – the feat of incredible technical research – and construct further stories.
A dog is sitting on the beach next to a dog. (They are lost, left behind by their owners, who lost their Vineyard property in the 2008 crash, and could not afford to take them with, so they now wander the beaches looking for food, and have learned to survive together).
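If I were sketching that exercise as a tiny tool – purely hypothetical, and not Google’s actual captioning API – it would just pair a machine caption with the backstory a writer constructs around it:

```typescript
// Hypothetical sketch of the caption-as-writing-prompt exercise: pair a banal
// machine caption with a human-written backstory. The caption is a stand-in,
// not real model output.

interface CaptionStory {
  caption: string; // what the model says is in the image
  backstory: string; // the further story a writer constructs around it
}

const stories: CaptionStory[] = [
  {
    caption: "A dog is sitting on the beach next to a dog.",
    backstory:
      "They are lost, left behind by owners who lost their Vineyard property in the 2008 crash.",
  },
];

for (const story of stories) {
  console.log(`${story.caption} (${story.backstory})`);
}
```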
Or, I love this script from Blade Runner 2049, which is based on an acting exercise called “Dropping In.” Definition: “The process was developed by theater legends Tina Packer from Shakespeare & Co. and Kristen Linklater, though I’m not sure if it has deeper roots. It involves pulling specific words from a piece of acting text and asking a series of questions related to the word to the actor while they repeat that word back to the questioner. The goal is to load up the actor’s internal life for the character as much as possible by creating mental associations for each word, making them inherently more meaningful.”
The script is unbelievably powerful read aloud: a near-human, a replicant, masking emotion and feeling, or history or memory, to seem non-human, an effort that breaks him and cracks him at many points in the film. And even in the bot-like coda – Cells, or Interlinked – even where it seems superficially not to connect, you start to reformat the meaning of the phrase before. The juxtaposition of context creates new meaning for the statement. The automated generative game creates meaning through forced context.
I want to close with a video [pause, scramble] from Sondra Perry’s It’s All in The Game. In a side-monitor version of the work, she has the model of her brother, which was used in an NCAA basketball video game that was the subject of a famous lawsuit.
So chosen language is linked to the assumed model you start with. You could change the angle of the plane; you can look more closely at what you think is inside the avatar you are speaking to, and what you know of its story, and how much time you’ve actually spent speaking to it. Have we imagined what’s inside this person’s head? What do we think is in their brain? Do we imagine them as having a brain, or an inner life, at all? Have we lingered with this model? Have we tried to fill it in on its own terms, with its own language, or have we told its story before it spoke?