Technology Is Political. No Matter How Much People Try to Say It's Not.
The Inherent Politics of Tech: Don't Be Bamboozled
Talofa Reader,
In my career and in life, I've repeatedly heard the following statements, and each time they make me want to headbutt a brick wall (safer, legally, than headbutting the speaker):
"Don't bring politics into x, y, z!"
"I don't do office politics."
"I don't care about politics, I just want to write code!"
Sure, they're all worded differently but essentially all say the same thing, which is, "I don't understand how society and human beings work and I'm basically an underdeveloped human being."
I mean, even look at the recent uproar over a women's rugby team doing a Haka that was critical of the government.
"Keep politics out of sports," scream the same crowd that tells everyone else to "harden up, change the channel if you don't like it".
Weakest demographic of human beings to ever exist.
Politics is in everything because people are in everything and people are inherently political beings shaped by the ideologies, values, and power structures around them.
This is just a fundamental understanding of people, so it pains me to think this is missing from anyone's consciousness.
When it comes to technology (not just how people who work in tech think about politics, but how anyone views technology at all), the most dangerous misunderstanding is that it's not political.
People think the technology is objective; it's just doing what it's programmed to do, absolving it of any responsibility.
But the truth of the matter is, technology is very much political: in how it's created and developed, by whom, for whom, and why.
Technology doesn't exist in a vacuum.
It's developed within socio-political contexts that influence how it's designed, where it's deployed, and how it's used.
A great example of this is the development and debate around encryption technologies.
You have governments and law enforcement agencies on one side arguing that encryption helps criminals and that there should exist backdoors in the technology.
On the other side, you have privacy advocates and technologists counter that backdoors in encryption technologies inherently compromise the security of the technology, thereby making them vulnerable not just to law enforcement agencies but to malicious actors as well.
Here we’re talking about the mathematical algorithms that encrypt and secure data: how they're developed (with or without backdoors), and who they're developed for and against.
It’s very much the proverbial football being kicked between two sets of political ideologies.
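The privacy advocates' argument is easy to demonstrate in code. Here's a toy sketch (this is NOT real cryptography, just an illustration using a hash-based XOR stream) of a "lawful access" key-escrow scheme, where every user key is also wrapped under a hypothetical master key. The point is structural: whoever holds the master key, agency or attacker alike, can read everything.

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream from the key.
    # Toy construction for illustration only, NOT real cryptography.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(message: bytes, key: bytes) -> bytes:
    ks = keystream(key, len(message))
    return bytes(m ^ k for m, k in zip(message, ks))

decrypt = encrypt  # XOR stream cipher: the same operation both ways

# A "backdoored" scheme escrows every user's key under a master key,
# so the ciphertext bundle itself carries a second way in.
MASTER_KEY = b"lawful-access-master-key"  # hypothetical escrow key

def encrypt_with_escrow(message: bytes, user_key: bytes):
    wrapped_key = encrypt(user_key, MASTER_KEY)  # user key, recoverable by master
    return wrapped_key, encrypt(message, user_key)

# Anyone holding MASTER_KEY can recover the user key, then the plaintext:
wrapped, ct = encrypt_with_escrow(b"private message", secrets.token_bytes(32))
recovered_key = decrypt(wrapped, MASTER_KEY)
print(decrypt(ct, recovered_key))  # b'private message'
```

The security of every message now rests on one secret that, by design, must be shared and retained. That's the "inherently compromised" part of the argument in a nutshell.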
The encryption debate has raged for a long time now, but tech has developed and evolved a great deal over the years, and the same political fault lines that run through encryption run through these newer topics as well.
Let's have a look at some of these topics, shall we?
How Political? Let Me Count The Ways
If you're already familiar with how terrible the world is in various ways, these won't surprise you.
For everyone else—and it genuinely surprises me how many people are still in the "else" group, even in and after their 30s—let's have a quick look through:
Algorithmic Bias in Recruitment Tools
AI-driven recruitment tools have been caught out being biased against certain demographic groups. Amazon abandoned an AI recruiting tool because it showed bias against women.
From the Reuters article:
"…Amazon's computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period.”
I wonder what could go wrong here?
“…Most came from men, a reflection of male dominance across the tech industry… In effect, Amazon's system taught itself that male candidates were preferable."
Just one example of how tech inherits societal biases.
Facial Recognition and Racial Bias
This specific issue seems to be the gold standard of outright bias that I can remember coming up time and again in the news.
Examples include the 2015 incident where Google's image recognition labelled photos of black people as "Gorillas" (which they haven't really fixed1); facial recognition software failing to detect black faces2; and so on.
Studies3 and reports have shown that facial recognition technologies can show racial biases, leading to higher rates of misidentification for people of colour.
How do you think this impacts privacy, fairness, and discrimination in surveillance and law enforcement practices?
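Part of the answer is that headline accuracy numbers can hide exactly this disparity. A toy sketch with invented numbers (loosely echoing the kind of per-group gaps reported in the studies above, but not taken from any real evaluation):

```python
from collections import defaultdict

# Hypothetical face-matching evaluation: (demographic group, match correct?)
results = (
    [("lighter-skinned men", True)] * 99
    + [("lighter-skinned men", False)] * 1
    + [("darker-skinned women", True)] * 65
    + [("darker-skinned women", False)] * 35
)

totals = defaultdict(int)
errors = defaultdict(int)
for group, correct in results:
    totals[group] += 1
    errors[group] += not correct

# The aggregate figure looks tolerable; the per-group breakdown does not.
overall_error = sum(errors.values()) / sum(totals.values())
print(f"overall error: {overall_error:.1%}")  # overall error: 18.0%
for group in totals:
    print(f"{group}: {errors[group] / totals[group]:.1%}")  # 1.0% vs 35.0%
```

A vendor quoting only the aggregate number is, in effect, making a political choice about whose misidentifications count.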
I'll keep my final two examples brief, but they are by no means any less impactful on society.
Internet Sovereignty and Censorship
You don't have to look far to see how an entire nation uses technology to enforce political ideologies, restricting free speech and access to information4. The Great Firewall of China and its history of controlling citizens via access to information, as well as social scoring and its impact on daily life, are prime examples.
Deepfakes and Misinformation
Last but not least—Deepfakes. The ability to create misleading or entirely false information, in audio and video form, that is indistinguishable from the real thing has huge implications on things like political elections, public discourse, and trust in media.
Earlier this year came the story of a "Fake Joe Biden robocall tells New Hampshire Democrats not to vote on Tuesday" - our new technology capabilities, literally playing with politics.
What’s the worst that could happen?
I think it's clear what's at risk if we don't acknowledge that technology is not some agnostic, objective public servant, programmed with the best of humanity and immune to human faults.
The problem seems to be that we're either too naive or wilfully apathetic, because we live in the privileged position of not being directly or severely impacted by the negative outcomes should things go "worst-case scenario".
But that's what the people who are in the firing line have to think about—my mum always used to tell my brothers and me, "Prevention is better than cure."
Don't mess it up in the first place, and there won't be a big cleanup to do.
And so it falls to the demographics historically vulnerable to the negative effects of technology, policy, laws, and the like to stay vigilant and vocal about these issues. The vigilance is necessary because, typically, once shitty things are in place, these "out" groups suffer first.
They then face the arduous task of working to remove those measures from the system.
If we accept this naive stance on the political nature of technology, we're going to sleepwalk ourselves into a bad time for people already not having as good a time as many others.
What do you think solves this?
I often think there must be a better way than fighting multiple battles on numerous fronts for equity, pay, housing, economics, or just being treated the same as everyone else.
Anarchists, communists, and socialists talk about dismantling the system.
I'm all for that—even from my position of privilege.
Why? you may ask…
It's hard to see a world with so few of us "with all this" and so many "without a lot" and think, "Yes, this is the world I want to exist in."
Pretending technology isn't political is a problem.
But I believe the issue goes deeper. Forgive my cynicism and on-brand distrust of everything, as it might weigh a bit heavily on my next points:
I see people who argue that technology isn't political as falling into two camps:
Those who don't want to face the truth that technology is political and its implications for them.
Those who know it's political and understand they will benefit regardless, but lack the courage to admit this stance.
And I consider both groups cowards.
There's power in the truth "setting you free", and acknowledging this reality exposes the cowards. Starting from a foundation of truth and honesty is crucial, no matter how unpleasant the truth or how difficult the honesty.
Accept that technology is political.
It's used, designed, developed, and deployed with the inherent biases of engineers, CEOs, and industries.
Starting from this acknowledgment, we can make some real moves and progress.
I don't expect those unaffected to care immediately.
Perhaps they'll become "allies" in time. But for those of us at risk of being directly impacted, vigilance is key. We must be educated, skilled, and active in the "design, development, and deployment" of technology.
Personal Update
On a personal note, things at work and on the community side have started to ramp up, but I'm determined to keep building things out here.
I'm thinking of maybe doing a shorter weekly edition focused mainly on tech commentary that comes up during the week. This will also help build up my writing skills more.
I might keep the long-form editions as they are, every fortnight.
I'm also changing the schedule of long-form posts to Saturday mornings. That seems like a better time for you to read a longer rant from me than during your ride to work.
At any rate, the goal is to keep reading, keep formulating my thoughts and ideas, and above all, keep writing.
Thanks for being here.
...and thanks for reading. See you in the next one!
Ia manuia,
Ron.
2018 Verge Article: “as a new report from Wired shows, nearly three years on and Google hasn’t really fixed anything. The company has simply blocked its image recognition algorithms from identifying gorillas altogether — preferring, presumably, to limit the service rather than risk another miscategorization.”
Buolamwini, J., & Gebru, T. (2018). "Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification." Proceedings of Machine Learning Research.
King, G., Pan, J., & Roberts, M. E. (2013). "How censorship in China allows government criticism but silences collective expression." American Political Science Review, 107(2), 326-343.