Left To Our Own Devices
A program for tech degrowth
We should immediately begin to degrow the technology industry.
Before you shut this webpage in disgust, let me stop you: Tech is not the most important thing to degrow. If it weren't for power consumption due to blockchain operations, it wouldn't even be top five. The reason we should do this thing, right now, is that tech is one of the few things we (literally you and I, not some mythical, wishful "we") already have the power to degrow.
The reason we have this power is that technology is not only dependent upon hardware, which is manufactured in the typical capitalist fashion, with the excess and the marketing and the planned obsolescence and the artificial scarcity. It is also dependent upon software, and software is a choke point of a very strange type: part of that excess, marketing, planned obsolescence, and artificial scarcity comes from the quality and features of the software that runs on the hardware. The capitalists keep telling us to learn to code? Very well then, let's learn to code.
The first code you need to know is what the tech industry thinks of its product—it's a durable good that they need to disguise as a non-durable good. What do I mean by that? Well, a durable good is a product that you buy and expect to use basically the same way for years on end. Your couch, your fridge, your washer, your car. If it breaks, it's usually cheaper and more worthwhile to fix it than to throw it out, until its long lifetime has run its course. This is not good enough for hardware manufacturers, because there's not very much money in computer repair.
So, then, you need to make people throw out old computers and phones and buy new ones. For a very long time, this just meant you made them faster and with more memory, better displays, and the like. "Build a better mousetrap", as they say. Perhaps you've heard of Moore's Law: that computing power (often taken to mean the calculating speed of the CPU, the thing that does the computing in the computer) doubles every 18-24 months. Well, what Moore actually said was that the amount of computer guts (transistors) we can fit on a chip doubles in that timeframe, and you and I can see that this is not going to go on forever. Eventually you get down to atomic scales and into sci-fi world. A lot of people think that Moore's Law died a long time ago.
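You can sketch that "eventually you hit atoms" arithmetic in a few lines. The numbers below (a ~100 nm feature shrinking toward a ~0.2 nm silicon atom, one density doubling every couple of years) are my own illustrative assumptions, not figures from this piece, but they show why the exponential has a hard floor:

```python
import math

# Illustrative assumptions, not real semiconductor roadmap figures:
feature_nm = 100.0   # assumed starting transistor feature size, in nanometers
atom_nm = 0.2        # rough diameter of a silicon atom

# If density doubles, linear feature size shrinks by a factor of sqrt(2).
doublings = 0
while feature_nm > atom_nm:
    feature_nm /= math.sqrt(2)
    doublings += 1

print(doublings)        # density doublings until features reach atomic scale
print(doublings * 2)    # years, at one doubling per ~2 years
```

Under these toy numbers you run out of atoms after about eighteen doublings—a few decades. Tweak the constants however you like; the wall arrives on the same order of magnitude.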
So then how are hardware companies supposed to get people to buy new stuff? It honestly seems to mimic a lot of what you might see in the toy industry, only at higher stakes:
- Marketing cultivates demand by engaging in brand warfare. (Not going to deviate into a comparison of "It's NERF or nothing" and the 1990s' Pentium craze, but you could!)
- R&D creates selling points and depreciates existing inventory by adding on attractive features of varying usefulness.
- A culture of enthusiasts creates social pressure to obtain goods with certain features, and the feedback loop begins again.
Even with such a hardened marketing-sales-development loop, the hardware industry would have stalled out a long time ago without parallel but much more insidious developments in the software industry. Of course, without software, hardware is not useful to the average consumer, but development companies did not have to go this hard. People like to imagine that the riches and fame of Gates, Jobs, Larry Ellison, the Zuck, etc., are due either to advanced technical prowess or to an unusual lucky strike (getting in on the ground floor of an entirely new industry), but no. No, they are remarkably successful because they and their corporations engaged in remarkably avaricious and opportunistic practices, the price of which is paid by the consumer.
The basic model for modern software isn't too shocking, though it does offer its industry quite a range of capital-accumulation possibilities:
- Software serves marketing—not just from the parent company, but from anyone willing to pay. This is especially relevant and useful on the internet.
- It also gathers data on the responses to this marketing, again for sale to anyone willing to pay. Here the paths to accumulation multiply, though: if you work it right, you can sell the same data to LOTS of different customers. (It is essential that you never ask what they're going to do with this until and unless your userbase threatens to walk, which is why "Don't be evil" was never an option, and why Apple doesn't actually have your back against the state.)
- It serves media that isn't just advertising. Once software companies realized you could use the internet as your CD collection, as your library, and as your movie store, the game was absolutely over for anyone who believed in the old "Information Wants to Be Free" model of Online. Money to and from studios and publishers, money from subscribers, more ads, and pressure to build machines that could replicate all this in the highest friggin fidelity.
- And this leads into the last and worst. Capitalist software dev drives obsolescence. If you make flashier, fancier applications and websites, you need more resources to even interface with them. Hence more and more software companies are getting into the hardware game (through partners or just directly taking the Apple route of trying to lock users into your "walled garden" by baking reduced compatibility into your product).
These two feedback loops feed on each other, creating a "growth for the sake of growth" paradigm that makes Edward Abbey's cancer cell look tame.
Sometimes, when you're writing a computer program, you find a little annoying bug. Your machine is looping through the code, doing things just a tiny bit wrong. This compounds and compounds and soon enough, it's crashed or given you ridiculous data or mangled output. Debugging eventually reveals the culprit—you made an assumption at the very beginning of development that was just...incorrect. A number was off by one, a filename was misspelled, you meant to append instead of extend, whatever. The end result was a runaway train of computational catastrophe. So it goes.
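As a concrete (and entirely hypothetical) illustration of that append-versus-extend slip, here's a sketch in Python, where those are both real list methods. The function, names, and data are mine, not the author's:

```python
# A tiny wrong assumption at the start compounds into mangled output.
# The intent: store each reading as one [timestamp, value] pair.
readings = []

def record(timestamp, value):
    # Bug: extend() splats the pair into the flat list, one element at
    # a time; append() was what we meant.
    readings.extend([timestamp, value])

record(1, 20.5)
record(2, 21.0)

# Downstream code that assumes one list element per reading now sees
# twice as many "readings", none of them pairs:
print(readings)       # [1, 20.5, 2, 21.0] instead of [[1, 20.5], [2, 21.0]]
print(len(readings))  # 4 "readings" where only 2 were recorded
```

Run that in a loop for a few million iterations and you have your runaway train.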
Capitalist software has made a similar mistake. Long ago in its history, it baked into its money-making program something it thought was a feature, that will eventually become a bug that destroys it, and it is the same glitch that makes modern computing so frustrating for so many. The code: "embrace, extend, extinguish".
This strategy is largely associated with Microsoft, but they're really just the company who got caught saying it. The plan: there's some software technology that belongs to a community (even if that community may be heavily composed of private corporations). No one company can corner this market. A single, devious private company adopts the technology and joins the community of interested developers (embrace). Then the company adds features that are only compatible with its own software. As an example: internet coding standards cover the basic features of certain web languages, but the private company comes along and adds a non-standard feature that can only be accessed in its own browser. This is the "extend" part. Then, as the technology gains market share, the company can "extinguish" both the competition and the commons by converting the original tech into something it can own, patent, and profit from. This is just enclosure for nerds.
Of course, in all this hustle and bustle, capitalists have lost the plot on the one thing most people do with their computing devices that they can't do anywhere else: communicate with each other through text (and images, sounds, and video). Fundamentally, all of the social media, all of the blogs, all of the wikis, all of the office software as a service, all of the forums, are iterations of the original use of networked computers: to talk to each other in that new, everyone-to-everyone way. It's really useful!
In the long term, though, it is becoming harder and harder to do it well. Everything seems so bloated, so unnecessarily slow and flashy. This is of course by design—gotta get you to order a new computer one day, after all—but there's absolutely no good technical reason for it. Most computers that have been thrown out as "too slow" can and should be useful machines for communication, learning, and development. If we can make this a reality, we can take a small but meaningful step in the direction of degrowth.
Let me start out discussing this program with the foreword of "I fucking hate political programs". Modern leftist political programs are like the appendix: everyone has one, none of them do anything, and if they get too large you have to get them excised or they explode inside of you and kill you. The only reason I'm bothering to sketch the barest outline of a political program for tech degrowth is that I think it will make the discussion of tech degrowth more real and prevent backlash via strawman.
The core motivation for any usable program for autonomous tech degrowth is as follows: home users and "home coders" (various people with a wide range of usable technology production skills, not limited to writing production code) have the power to plant a seed that may eventually force a drawdown of software and hardware production by the bourgeoisie. This could be done by developing and using low-power, encrypted, free, and open source software for their most common text- and image-based computing functions.
Let's unpack that. Capitalist production of hardware and software relies on planned obsolescence, and software that eats up more and more computing resources is the most cost-effective way of ensuring that users' hardware will eventually stop working. Providing software that does what users need without unnecessary resource use therefore extends the useful lifetimes of computing devices, breaking the artificial demand feedback loop that drives both software and hardware production. This in turn should eventually create an untenable situation for capitalists in the hardware manufacturing, software development, and rare earth mining industries. Harming capital is an inherent good, because it is fun as hell, but tech also contributes to the coming environmentally-triggered societal collapse, so kneecapping it right now may give the exploited more time to prepare to divert that collapse energy into building a better world.
From this starting place, the general points of any autonomous degrowth program flow more or less naturally:
- Identify software or software types that should be replaced (basically any Software as a Service will do)
- Create a low-powered, encrypted replacement for that software
  - Using an autonomous, structured, horizontal development system
  - With the goals of familiarity and usability as top priority
- And use existing social graphs to drive adoption
- When a usable product comes into existence, avoid the temptation to enter an eternal dev loop. Add features and fix bugs only as obviously necessary.
- Allow forks to take care of "new and improved versions". (Sure, you may be part of that fork, but we don't want to replicate the system we seek to replace.)
- Move on to the next thing
This all drives toward an idea called "Permacomputing", also described here. I'll try to put together a synthesis of these ideas that makes sense in the framework of this piece. Because it takes so many resources and so much energy to create computing components, especially boards and chips, we should treat these components as extremely valuable no matter their market price. (Money is fake, friends.) Basically, throwing away etched silicon should be a very, very last resort. The inroads that computing has made into our personal lives are a result of capitalist excess and marketing rather than of these devices' actual improvements in our material condition, and tech degrowth will necessarily be a part of any plan to divert or survive societal collapse due to climate degradation. So, while repair is going to be an essential part of this dynamic (hence one of these commentators taking the name 'solderpunk'), software will also be part of it. The web and the rest of the software infrastructure are unsustainable.
The problem with Permacomputing as it's formulated right now is the problem with any other "if we could all just do X" idea from any random person with a wish for a better world—how? Too often we divide into camps: one that says "it's all the capitalists' fault" and one that says "it's everyone's fault". And quite frankly, it doesn't fucking matter who's right. It really does not. I err more on the side of the former than the latter because I think guilt is a terrible motivator, but in reality it does not make one bit of difference whose fault this is. The only relevant question is: how do we fix this?
There have been some fantastic pieces on what society should do, and there have been some technically impressive developments that show us a glimpse into an ecosystem of sufficiently fast, low-weight, secure, and free software. What's missing is the thing that drives adoption. You can write about what should be done all you want, and you can build all you want, but in the world of Permacomputing, you can't make the difference, you can't actually change the planet, without convincing people to switch from the world they know to a more sustainable world they don't.
So, then the program must include its own evangelism, and not from the shiny release party stage, but in my very humble opinion, through the veins of those social justice and anticapitalist activist networks we find ourselves attached to. We therefore need to build an ecosystem that includes repair, software that appeals to activists, and training. There are some groups that are already doing some of this, I say as I type this out on Riseup Pad! We need more, and we need technology that will not depend on a browser ecosystem that the capitalists intend to use for more enclosure.
We also need to be with our people. The degrowth tech-builders, the permacompute-heads, are one part of a vibrant anticapitalist ecosystem, and if we cloister ourselves, we will rightly lose that community. But it does not follow that those of us with certain skills should use them only for our bosses. The capitalists have sold us the rope, so to speak. Let's see what we can do with it.