2100.02 ~ Architecting Law
The Politics of Technology, Black Lives Matter, Nuclear Powerplants
Chris Neels | Jun 15, 2020
Over the past two weeks, we've seen a significant volume of commentary on the role of social media companies in political discourse. Amidst the surge in political activism around Black Lives Matter, the flood of communications flowing through social media platforms has underscored the political nature of these platforms themselves, as well as of the people who operate them.
I don't normally discuss current events in this newsletter. However, the events of these past weeks teach an important lesson about the politics of technology.
This post will unpack the argument that by building technology, you, dear reader, are a political actor.
The technology you make is imbued with your values, and this technology exerts those values on others.
It might seem strange to suggest that founders, designers, developers, managers, and executives are political actors, especially when many such people profess to be apolitical. Yet the reality is that the things you release into the world can influence, afford, or even dictate certain behaviours, whether intentionally or not. The protocols around the use and effects of technologies become social protocols.
This influence can come from technologies of all kinds, regardless of their profundity. At the macro scale, Langdon Winner argues that nuclear power plants served as a centralizing societal force. For people to reap the benefits of abundant energy, they had to accept rigid state authority over the procurement of nuclear materials and the safety and operation of massive power plants. And, like industrial systems of production, workers needed to be organised into large-scale, hierarchical forms of organisation. Failing to do so would risk nuclear materials literally spilling over into unsafe or nefarious uses. Winner contrasts nuclear energy with solar power, a decentralized form of power in both the literal and figurative sense.
A nuclear power plant in Doel, Belgium. Credit: @fredography
Amongst digital products, Lawrence Lessig argues that code is law:
[The] regulator is code — the software and hardware that make cyberspace as it is. This code, or architecture, sets the terms on which life in cyberspace is experienced. It determines how easy it is to protect privacy, or how easy it is to censor speech. It determines whether access to information is general or whether information is zoned. It affects who sees what, or what is monitored.
Whereas the power of nuclear energy resides in the state, the power behind software resides in its architects. From low-level internet protocols to the high-rise apartments listed on rental apps, code regulates how we navigate the world. Paradoxically, the unregulability of the internet is a matter of perspective: the gaps where governments cannot exert sovereignty are spaces governed by non-state actors.
Not since the rise of the fourth estate (the press) has so much political influence over communications rested with a single class of actors as it now does with social media companies. The digital infrastructures of social media platforms serve as the rails on which information flows: algorithms mediate what content appears in "feeds"; buttons (e.g., "like" and "retweet") mediate engagement with that content; messaging facilitates action and reaction to it; ads harvest users' behavioural data to keep the content flowing.
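The value-laden nature of these rails can be made concrete. As a hypothetical sketch (the weights and the moderation rule below are invented for illustration, not any platform's actual algorithm), a feed-ranking function encodes editorial values directly in code:

```python
from dataclasses import dataclass

@dataclass
class Post:
    recency_hours: float
    likes: int
    reshares: int
    flagged_as_violent: bool

def rank_score(post: Post) -> float:
    """Hypothetical ranking heuristic. Every coefficient is a value
    judgment: rewarding reshares amplifies virality; the moderation
    branch encodes a speech policy."""
    score = 100.0 / (1.0 + post.recency_hours)       # favour fresh content
    score += 2.0 * post.likes + 5.0 * post.reshares  # favour engagement
    if post.flagged_as_violent:
        score *= 0.0  # one platform's choice; another might omit this line
    return score

posts = [
    Post(recency_hours=1, likes=10, reshares=2, flagged_as_violent=False),
    Post(recency_hours=1, likes=500, reshares=80, flagged_as_violent=True),
]
feed = sorted(posts, key=rank_score, reverse=True)
```

Changing a single coefficient, or deleting the moderation branch, changes what millions of users see. That is the sense in which code functions as law.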
While social media companies share a number of similarities, there was a bifurcation in how they responded to the tensions that arose during the Black Lives Matter protests. Perhaps the epicentre of this was a message from Donald J. Trump warning protestors that "when the looting starts, the shooting starts."
Twitter and Facebook responded differently. Twitter permitted the tweet to remain accessible, but added a notice that it violated the platform's policy on the glorification of violence and "turned off" options for users to engage with it. Facebook left the post unmodified. In a town hall, Mark Zuckerberg defended the decision, saying he didn't believe the post “read as a dog whistle for vigilante supporters to take justice into their own hands.”
That each company has the ability to mediate difficult decisions on free speech, including those involving messages from the President of the United States, illustrates their political power, however apolitical they may profess to be.
A number of other social media companies made political decisions during this period. For instance, Zoom's CEO stated the company would not encrypt free video calls "because we also want to work together with FBI, with local law enforcement in case some people use Zoom for a bad purpose.” Additionally, the dating app Grindr removed the ethnicity filter from its app. Such decisions reveal how extrajurisdictional "laws" are programmed into the interfacial layers of digital services. The layers of sovereignty that private companies are able to establish appear to be permitted as a price the public is willing to pay for innovation. But with public trust in institutions declining, this is a privilege and a responsibility you should not take for granted.
Overall, we see that the makers of technology exert power proportionate to the reach of the technology. Tremendous power is now placed in the hands of those who design our most ubiquitous tools. As Eric Weinstein put it, “we are now gods, but for the wisdom.” Part of this should excite. The other part should terrify.
Your organization may adhere to existing laws, but the act of creating technology, in a sense, writes new ones. Acknowledge that views on these “laws” may not be universally held and that, just like the legal system, they should be built to adapt.
Questions to ponder
What sorts of behaviours does my technology afford, reward, require, or prohibit?
How might my technology look if the opposite values were mediated?
What are ways in which my own values are reflected—whether consciously or not—into my technology?
How might I bring in the perspectives of people with different, or even competing, values? Do I have the values to act on this information?
Thanks for reading. Remember: all technology is political. The question is not how to remove politics from technology, but how to consciously acknowledge which politics are present and to what end.
Regular housekeeping ~
A special thank you to those who shared their feedback and compliments from the first newsletter.
If you or someone you know is working on an emerging technology and has an interest in contemplating its long-term implications to society, send me a note.