Unraveling Techno-Solutionism: How I Fell Out of Love with “Ethical” Machine Learning

At the recent QCon San Francisco, Katharine Jarmul, a privacy activist and principal data scientist at Thoughtworks, gave a talk on uncovering techno-solutionism: the ingrained bias among technologists to assume that there must be a technological solution to almost every problem, and that the solution will benefit humanity. She discussed ways to spot techno-solutionism and the questions technologists need to consider when developing products.

She began by discussing how bias is introduced into the training data sets used by AI systems through the labels supplied by human labelers, many of whom are among the lowest-paid workers in the tech industry. As an example, she showed footage of a man and a woman talking, labeled both as a worker being scolded by their boss and as a blonde woman being criticized by her boss. The footage itself cannot establish which description is correct, yet the labels go into a database used to train an AI system.
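
The mechanism described here, in which human-supplied labels flow into a training database, can be sketched in a few lines of Python. This is a hypothetical illustration rather than anything shown in the talk: the majority-vote aggregation scheme, the aggregate_labels function, and the identifiers are all assumptions.

```python
from collections import Counter

def aggregate_labels(annotations: list[str]) -> str:
    """Pick the most common label; ties and minority readings are simply discarded."""
    return Counter(annotations).most_common(1)[0][0]

# Two labelers describe the same footage in very different terms,
# much like the conflicting captions in the talk's example.
annotations = [
    "a worker being scolded by their boss",
    "a blonde woman being criticized by her boss",
    "a worker being scolded by their boss",
]

# Whichever framing wins the vote becomes the "ground truth" the model
# is trained on; the losing reading, and the disagreement itself, are
# never recorded.
training_row = {
    "image_id": "frame_0042",  # hypothetical identifier
    "label": aggregate_labels(annotations),
}
print(training_row)
# {'image_id': 'frame_0042', 'label': 'a worker being scolded by their boss'}
```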

She defines techno-solutionism as the naive belief that any problem can be solved by throwing a magic box of technology at it, and that introducing that technology will change society for the better. Techno-solutionism treats technological progress as intrinsically good. As a counterexample, she cited the first written formula for gunpowder, discovered in ninth-century China during research into an elixir of life. Is technology good, neutral, or bad?

In fact, almost all technological advances bring both benefits and harms, and these are often unevenly distributed: one party may receive most or all of the benefits, while another bears most or all of the harm.

She points out that the computer industry is steeped in techno-solutionism, tracing it back to the early mythology of Silicon Valley and, further still, to the mindset of the early California settlers, who set out to conquer challenges and to improve and transform the land. From this grew the Silicon Valley belief that a good idea can change the world and make you rich.

She quotes Joseph Weizenbaum, who developed what is considered to be the first AI system, as saying that computer technology was, from the very beginning:

a fundamentally conservative force, entrenching existing hierarchies and power dynamics that might otherwise have had to change.

This conservatism means that social change is stifled and that the benefits of technological progress accrue disproportionately to a small portion of humanity.

She offers guidance on how to spot techno-solutionism in action: if you find yourself making any of the following claims, think carefully about the broader implications of what you are building.

  • I optimized a metric someone handed me.
  • Everyone agrees how great everything is.
  • Everything will be fine if only we have ______.
  • Mythological language: revolution, change, progress.
  • Potential dissenters are excluded.
  • No non-technical solution to the problem has been tried.

She then offered five specific lessons engineers should keep in mind when developing products:

1) Put the technology in context

Ask what existed before this technology, what would have happened if it had never appeared, and what we would do without it.

2) Investigate the impact, not just the technology

Consider the potential impact of the technology in the short, medium, and long term. Look beyond the immediate consequences to determine who and what may be affected.

3) Create space for, and learn from, those who know

Identify and listen to the individuals, communities, and groups affected. Make sure their voices are heard, and if you have privilege, use it to amplify the voices of others.

4) Recognize systemic change and describe it accurately

Use language carefully and with foresight. She used the example of “revolutionary” e-commerce, which in practice was a fairly small change in how people buy things online. Exaggeration and hyperbole are often used to obscure the impact of change on disadvantaged communities.

5) Fight for justice, not just architecture

She talked about the researchers fired by Google after exposing bias in its algorithms. Lend your voice to those who have been silenced.

She then talked about her decision to focus on data privacy as an area where she both wants to and is able to make a difference.

She concluded with a series of questions for listeners to ponder:

  • What can you do if you cannot change who you are now?
  • If you focus on change, what can you change besides technology?
  • What if we took collective responsibility for the future of the world instead of the future of technology?
