(2023-10-17) Berjon The Web Is For User Agency
Robin Berjon: The Web Is For User Agency. As Joseph Weizenbaum noted in 1976: “I think the computer has from the beginning been a fundamentally conservative force. It has made possible the saving of institutions pretty much as they were, which otherwise might have had to be changed.”
You can pile layers of indirection on top of it, and pile indirections we sure have, but at the end of the day a computer is just a tool to automate tedious, clerical, bureaucratic tasks. And how we automate things matters a lot.
Whenever something is automated, you lose some control over it. Sometimes that loss of control improves your life because exerting control is work, and sometimes it worsens your life because it reduces your autonomy. Unfortunately, it's not easy to know which is which.
when we're designing new parts of the Web and need to articulate how to make them good even though they will be automating something, I think that we're better served (for now) by a principle that is more rule-of-thumby and directional, but that can nevertheless be grounded in both solid philosophy and established practices that we can borrow from an existing pragmatic field.
That principle is user agency. I take it as my starting point that when we say that we want to build a better Web our guiding star is to improve user agency and that user agency is what the Web is for.
there are very good reasons to take the time to align on what it is that we're trying to build when we build the Web, and good reasons why this has to import at least a little bit of conceptual engineering from the vast and alien lands of philosophy.
First, people who claim not to practice any philosophical inspection of their actions are just sleepwalking someone else's philosophy
Second, in the same way that the more abstract forms of computer science can indeed help us produce better architectures, philosophy can (and should) be applied to the creation of better technology
There is no useful future for the Web that doesn't wrestle with harder problems and social externalities
Third, we have a loose, vernacular notion that we are doing this "for the user"
but we could benefit from being a bit more precise about what we mean by "putting users first".
the idea of user agency ties in well with the capabilities approach, an approach to human welfare that is concrete, focused on real, pragmatic improvements, and that has been designed to operate at scale. (Bad-Ass)
Three points on which it's important to share some minimal foundations:
Working towards ethical outcomes doesn't mean relying on vapid grand principles but rather has to be focused on concrete recommendations;
When considering agency we need to be thinking about real agency rather than theoretical freedoms; and
Counterintuitively, giving people greater agency sometimes means making decisions for them, and that's okay if it's done properly.
Let's look at these three points in turn.
Capabilities
We're all familiar with vaporware freedoms: you are given a nominal right to do something but the world is architected in such a way as to effectively prevent the exercise of that right. Everyone can start a business or own a house!
they aren't new. Global economic development struggled for decades with comparable issues in which people may have acquired rights they couldn't really put to work or saw economic improvements that didn't translate to more fulfilling lives. In response to this, Martha Nussbaum and Amartya Sen developed a pragmatic understanding of quality-of-life and social justice known as the capabilities approach.
it supports an unfailing commitment to people's ability to solve their own problems if not prevented from doing so
One important architectural aspect of this project is the need to remove external authority from the system so as to prevent chokepoints of capture from emerging
what Jay Graber aptly described as "user-generated authority, enabled by self-certifying web protocols."
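To make that phrase concrete, here is a minimal sketch (my own illustration, not code from the article): the identifier is derived from the user's own public key, so any peer can verify signed records against it without consulting a central registry. It uses Node's built-in crypto module; the hash-based id scheme and the record contents are assumptions for illustration only.

```ts
// Sketch of "user-generated authority": identity is self-certifying because it
// is derived from the user's own key, not granted by a server or registry.
import { generateKeyPairSync, sign, verify, createHash } from "node:crypto";

// The user mints their own keypair; no external authority is consulted.
const { publicKey, privateKey } = generateKeyPairSync("ed25519");

// A self-certifying identifier: a hash of the public key (a did:key-like
// scheme, assumed here purely for illustration).
const publicDer = publicKey.export({ type: "spki", format: "der" });
const selfCertifyingId = createHash("sha256").update(publicDer).digest("hex");

// The user signs a record naming that identifier.
const record = Buffer.from(JSON.stringify({ id: selfCertifyingId, post: "hi" }));
const signature = sign(null, record, privateKey); // Ed25519: algorithm is null

// Any peer can check authority without trusting a host:
// 1. the presented key really hashes to the claimed id,
// 2. the signature over the record verifies under that key.
const keyMatchesId =
  createHash("sha256").update(publicDer).digest("hex") === selfCertifyingId;
const signatureValid = verify(null, record, publicKey, signature);
console.log(keyMatchesId && signatureValid); // true
```

Because the identifier commits to the key rather than to a host, moving the record to a different server changes nothing about who holds authority over it.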
Capabilities were designed with development in mind: they are meant to change people's actual lives, not to check theoretical items off a list.
User authority doesn't mean that people have to build their own thing. Not everyone can run their own server for the same reason that not everyone can eat exclusively from their own pet organic garden. Not all control is agency; control has to be proportionate to bounded rationality and reasonable cost
Ethics in the Trenches
ethical tech documents. These seem to mostly be lists of lofty principles that exude a musty scent of meeting room detergent and commit to all manner of good-behavior requirements
When working on Web standards, we only consider requirements that can be verified with a test to be meaningful — everything else is filler.
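As a hypothetical illustration of that distinction (mine, not the article's): a spec sentence like "the URL parser removes dot segments from paths" can be checked by a machine, in the spirit of the web-platform-tests suites, whereas "put users first" stated as a bare principle cannot.

```ts
// A testable requirement: WHATWG URL parsing normalizes dot segments, so the
// claim can be verified by running it against a conformant implementation.
import { strict as assert } from "node:assert";

const url = new URL("https://example.com/a/../b");
assert.equal(url.pathname, "/b"); // fails loudly in a non-conformant engine
console.log("requirement verified:", url.pathname);
```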
At a high level, the question to always ask is "in what ways does this technology increase people's agency?"
This can take place in different ways, for instance by increasing people's health, supporting their ability for critical reflection, developing meaningful work and play, or improving their participation in choices that govern their environment. The goal is to help each person be the author of their life, which is to say to have authority over their choices.
Technologists trying to maximise user agency often fall into the trap of measuring agency by looking only at time saved
Even things that many would consider chores aren't always best automated or delegated away: you may not wish to clean your house, but you might want a say in the chemicals introduced into your home, or in how your things are organised
Consider:
The great level of detail that has gone (and continues to go) into specifying how to make the Web and Web content accessible.
An equally-impressive trove of actionable principles can be found in the Internationalization work
It's hard to act freely if you can't act safely, which makes work on security core to the agency project (see also RFC 8890, "The Internet is for End Users").
And the same can be said about privacy, which is key to trust as well. Privacy further matters (as discussed in the Privacy Principles) in that it includes the right to decide what identity you present to others in which contexts.
These shared foundations for Web technologies (which the W3C refers to as "horizontal review," though they have broader applicability in the Web community beyond standards) are all specific, concrete implementations of the Web's goal of developing user agency.
A great way to build the future of the Web is to work through a gap analysis of the ways in which we could be developing user agency.
we need to think beyond the individual.
Agency is Collective
focusing on user agency is not an individualistic position.
the Web is (as per Aurora Apolito) “a form of ‘collectivity’ that everywhere locally maximizes individual agency, while making collective emergent structures possible and interesting.”
Much of the Web that exists today rests on the assumption that users exist on someone's server: essentially as guests on someone else's property.
No matter how you set things up, the server can ultimately change the rules on a whim. In order to protect user agency and to imagine the Web as “a global community thoroughly structured by non-domination” we need to shape Web technology so that it shifts power to people rather than away from them. Doing so requires the return of protocols to the fore so as to push power to the edges
At a purely technical level, it requires a user agent more powerful than what browsers alone can provide, and at the policy level it necessitates a legal framework to prevent user agents from abusing the trust users have to place in them.