algorithms are made by human beings
Apr. 17th, 2009 02:00 pm
The issue with #AmazonFail isn’t that a French employee pressed the wrong button or could affect the system by changing “false” to “true” in filtering certain “adult” classified items, it’s that Amazon’s system has assumptions such as: sexual orientation is part of “adult.” And “gay” is part of “adult.” In other words, #AmazonFail is about the subconscious assumptions of people built into algorithms and classification that contain discriminatory ideas. When other employees use the system, whether or not they themselves agree with the underlying assumptions of the algorithms and classification system, or even realize the system has these points of view built in, they can put those assumptions into force, as the Amazon France employee apparently did according to Amazon.
From Why Amazon didn't just have a glitch by Mary Hodder
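To make Hodder's point concrete: the discriminatory assumption doesn't live in the button someone pressed, it lives in the classification data the button acts on. Here's a minimal, purely hypothetical sketch (not Amazon's actual code, data, or category names) of how a category table with "gay & lesbian" pre-marked as "adult" means a single false-to-true flip by any employee puts the baked-in assumption into force, regardless of what that employee believes or knows.

```python
# Hypothetical illustration only -- invented names, not Amazon's system.

# Someone decided, in advance, which subject categories count as "adult".
CATEGORY_IS_ADULT = {
    "erotica": True,
    "gay & lesbian": True,   # the baked-in assumption under discussion
    "health": False,
    "romance": False,
}

# A single site-wide switch any operator can toggle ("false" to "true").
EXCLUDE_ADULT_FROM_SEARCH_AND_RANKS = True


def is_visible(book_categories):
    """Return whether a book keeps its search visibility and sales rank."""
    if not EXCLUDE_ADULT_FROM_SEARCH_AND_RANKS:
        return True
    return not any(CATEGORY_IS_ADULT.get(c, False) for c in book_categories)


# With the switch on, a memoir shelved under "gay & lesbian" disappears,
# whether or not the person who flipped the switch shares, or even knows
# about, the assumption encoded in the table above.
print(is_visible(["gay & lesbian"]))  # False
print(is_visible(["romance"]))        # True
```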
In both the Amazon glitch and structural social groups, the impact of system-driven automatic choices is often irrefutable: a category of books and a category of people suffer from discrimination that has a clear negative impact on their opportunity to succeed.
In both cases, the causes of the problem are constructs - one technological, one sociological - creations by human beings that have no inherent malice, but that result in discrimination because bias seeds the way the systems make choices.
From How the Amazon glitch relates to structural discrimination and racism by poster Keith, which continues the discussion of Hodder's piece.
The comments on Hodder's piece are amazing, disheartening examples of ignorance and unconscious bigotry in motion.