-
WillOremus undersequoias I think it's because machine learning takes time to train, and they never bothered to train it against white meme-throwing terrorists because they have a cultural blind spot.
-
WillOremus undersequoias It's like the face-recognition algorithm that couldn't see Black people, or the Google one that misclassified them as monkeys and couldn't be fixed. Or Google surfacing Holocaust denial in search. These are all problems that stem from a lack of diversity and forward thinking in the ML group.
-
WillOremus undersequoias The model is wrong, their incentive structure is wrong, and the lack of diverse employees in positions senior enough to influence project planning and testing creates an enormous blind spot in their QA.
-
WillOremus undersequoias Together it creates a huge blind spot that lets their ML models treat white nationalism and misogyny as acceptable, letting both bloom on their platform in ways that are nearly impossible to unwind from their algorithms.