The terrors of the patriarchy in tech

A worry expressed in The Guardian: AI in general, and the use of big data to train it in particular, risks coding the attitudes of current society into the systems:

But this can create problems when the world is not exactly as it ought to be. For instance, researchers have experimented with one of these word-embedding models, Word2vec, a popular and freely available model trained on three million words from Google News. They found that it produces highly gendered analogies. For instance, when asked “Man is to woman as computer programmer is to ?”, the model will answer “homemaker”. Or for “father is to mother as doctor is to ?”, the answer is “nurse”. Of course the model reflects a certain reality: it is true that there are more male computer programmers, and nurses are more often women. But this bias, reflecting social discrimination, will now be reproduced and reinforced when we engage with computers using natural language that relies on Word2vec. It is not hard to imagine how this model could also be racially biased, or biased against other groups.

Yes indeed, that will happen. If you train something (anything; this applies to guide or gun dogs as much as it does to a translation program) to operate within what exists, then it will operate by the rules which currently exist.
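For the curious, the analogy queries the Guardian describes are easy enough to reproduce. A minimal sketch, assuming the gensim library and its downloadable pretrained Google News vectors; the exact token names (for instance "computer_programmer") are assumptions about that model's vocabulary rather than anything confirmed in the article:

```python
# Minimal sketch of the word2vec analogy queries described above,
# assuming gensim and the pretrained Google News vectors are available.
import gensim.downloader as api

# Loads (and on first use downloads) the pretrained vectors.
vectors = api.load("word2vec-google-news-300")

# "Man is to woman as computer programmer is to ?"
# Vector arithmetic: computer_programmer - man + woman, then nearest neighbours.
print(vectors.most_similar(
    positive=["computer_programmer", "woman"],
    negative=["man"],
    topn=3,
))

# "Father is to mother as doctor is to ?"
print(vectors.most_similar(
    positive=["doctor", "mother"],
    negative=["father"],
    topn=3,
))
```

The answers come straight from the training data: the nearest neighbours of that vector arithmetic reflect whatever associations the Google News corpus contains, which is exactly the point being complained about.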

The important part here being, of course, that bit about the world not being exactly as it ought to be. This starts to smack rather of New Soviet Man: that luscious planned economy would start working right around the time we'd changed humans so that they worked within that luscious planned economy. Which isn't, quite, how it turned out, was it?

But perhaps it should be different this time? To which there is an answer, the answer being the same even as the questions differ: we use markets and competition to work this out. Some section of society would prefer that such structured genderism not be incorporated into those AIs. Others (very few, perhaps) would like to see more of it, most of them possibly just wanting it to reflect the real world, not the dreamed one. All of which is absolutely fine. There is no shortage of capital out there, and there are no legal or cultural constraints upon anyone building an AI absolutely any way they wish to.

We'll find out which people really prefer when they're offered the various available alternatives and, well, use them. That's what we've done with every other invention and innovation in history, after all, and look how much better off we are than those societies which didn't - the New Soviets, for example.

At which point:

Products that are more responsive to the needs of women would be a great start. 

Well, get on with it then. For surely you're not insisting that the men should do it for you, are you?
