Chronotope’s Twitter Archive—№ 93,063

                                  1. They're already editors of the internet dangillmor/1019736861768585218
                                1. …in reply to @Chronotope
                                  There's this weird idea, and I'm gonna be blunt here, that's really gd stupid and blind to reality: that because machines are making editorial choices for Google and Facebook, they are not editorial choices...
                              1. …in reply to @Chronotope
                                The machines are making editorial choices. Those are not neutral choices. These choices are driven by decisions made by programmers who, *incredibly dangerously*, don't think they are making decisions.
                            1. …in reply to @Chronotope
                              Those machine editorial choices are not somehow totally driven by the wisdom of the crowd, and even if they were, that's *still* bad, because being an editor is equal parts listening to the crowd and telling the crowd what is important. Crowds are bad at making decisions. REALLY BAD AT IT.
                          1. …in reply to @Chronotope
                            But even so, these machines are making editorial choices based wholly on the worldview of the people who developed them, without any thought about what that means or what impact it has.
                        1. …in reply to @Chronotope
                          Facebook leadership's thinking informs how its algo makes decisions. Like, Zuckerberg hand-wrote the first draft of Facebook, his code reflects his way of thinking and his sense of ethics, and *everything* at Facebook is built on that.
                      1. …in reply to @Chronotope
                        The fact that these people built machines to make editorial choices without even thinking about the implications of it is honestly way more frightening than if they had thought about being editors, because unconscious bias is way nasty.
                    1. …in reply to @Chronotope
                      The problem is that it is perfectly possible and OK to treat Holocaust deniers as bad actors in the system and have the system disincentivize them, yet that's not how Facebook or Google thought about it. We can see how they thought about it now, and it is bad news.
                  1. …in reply to @Chronotope
                    Code reflects the people who built it. It isn't neutral. There isn't some elegant, perfect way. Facebook and Google have both developed entire programming languages and frameworks because there isn't Just One Way to do something on a computer.
                1. …in reply to @Chronotope
                  When they coded those algorithmic editors, they described their worldview in the code, in how they solved problems w/ it, in what tools they used, in the options and pathways they gave it. They may have thought those decisions were neutral, but people don't make neutral decisions.
              1. …in reply to @Chronotope
                They can spend a lot of time considering, analyzing, and trying to eliminate their bias from their systems, but that is a significant project which Twitter, Google, and Facebook never attempted or even considered attempting.
            1. …in reply to @Chronotope
              Facebook and Google already make editorial choices at a huge scale and really, really badly. The fact that they need to make them better isn't even worth arguing about. OBVIOUSLY they need to make their machines make better editorial choices b/c LOOK AROUND
          1. …in reply to @Chronotope
            This discussion about 'should Big Tech be editors' is a misdirection, because that is an answered question. The real discussion is: why do we allow a world where their editorial decisions have such a huge impact?
        1. …in reply to @Chronotope
          The problem isn't 'should they make editorial decisions over the internet?' but 'how do we stop their editorial decisions from having an unfair and unbalanced impact on the internet?' because that's the real problem...
      1. …in reply to @Chronotope
        Core internet principle 1: decentralization. These operations shouldn't have the power, capacity, or reach to make 'editorial decisions about the internet'. They are too large, their power too significant, their influence too unwieldy.
    1. …in reply to @Chronotope
      You can argue over their status as monopolies, and maybe they aren't monopolies as we traditionally see them, but they are entities to which too much power is being given, and the only real answer to the questions we are facing right now is to take it away. They must break up or be broken up.
  1. …in reply to @Chronotope
    B/c at the end of the day, tech companies whose editorial decisions hold that much sway are analogous to one thing: cancer. The internet has cancers; they are Facebook, Google, and Twitter. We need to cure ourselves of them, not argue about whether they're blocking the right blood vessels.
    1. …in reply to @Chronotope
      Either you choose to exercise the power that remains to you as citizens of a government whose very existence is being continually eroded by these algorithmic editors, or Big Tech will eventually be responsible for your demise.

