Chronotope’s Twitter Archive—№ 89,064

                                    1. I mean the irony here is that Google is doing the *exact same thing*, no? fchollet/976569949157179392
                                  1. …in reply to @Chronotope
                                    I mean Google News is still a major factor and search engine pages are doing the exact same thing with political views and algorithmic suggestions. The only difference is that it is Google instead of Facebook that's blind to its own role here.
                                1. …in reply to @Chronotope
                                  And that's not even counting Google's ad technology and exchanges all of which operate on fundamentally the same types of user data and under the same principles.
                              1. …in reply to @Chronotope
                                At the end of the day the problem is that everyone in Silicon Valley is fundamentally blind to their own part in building a massive technological environment that operates on user data for the purpose of convincing people to do things.
                            1. …in reply to @Chronotope
                              Personally, I don't think there's an undo button here. We can restrict user data collection and use legally, and we should. See: Chronotope/976093665004269568 ...
                          1. …in reply to @Chronotope
                            But we're not going to escape a future where machines attempt to convince us of things based on data about where we are on the internet. That would be like trying to un-invent the gun, or the nuclear ICBM. We're past a fundamental shift.
                        1. …in reply to @Chronotope
                          Instead of trying to sink persuasive technology--now impossible--we should be thinking about how we can build tools that leverage that technology to educate actively and effectively. This isn't a disarmament situation, this is our generation's space race.
                      1. …in reply to @Chronotope
                        The 'war against fake news' is essentially facing the same strategic obstacles as a war against drugs or a war against homelessness. This isn't something you can declare war against, fight effectively, or win. You have to address the underlying problems constructively.
                    1. …in reply to @Chronotope
                      And the underlying problems are people, a lack of media literacy, a lack of education, a lack of legal reinforcement, and an issue of incentives. None of those are engineering problems (though some of them might have technical solutions).
                  1. …in reply to @Chronotope
                    As fchollet notes - these are problems of our DNA, our brain. But I disagree with the idea that they can't be 'patched'. Because I disagree with the Silicon Valley viewpoint that all problems are engineering problems...
                1. …in reply to @Chronotope
                  The human mind *isn't* a static system. Solutions exist but they come through education. You have to change human minds, and that means radical alterations to how we understand and prioritize education, digital and physical-space.
              1. …in reply to @Chronotope
                It also means changing the assumptions of the people who build these systems to include viewpoints outside of tech, men, whiteness, and America, where the systems are heavily biased towards a particular viewpoint b/c their creators are.
            1. …in reply to @Chronotope
              These systems are dangerous because they magnify the human assumptions of their creators. As a result we will *never* find a solution working within those assumptions. Chronotope/975720452042772482
          1. …in reply to @Chronotope
            The type of person who looks at the current situation and sees only danger, and not the possibility and promise of building out a new type of human education, sees only nails and owns only hammers. The problem is inside the Silicon Valley house. Solutions are outside.
        1. …in reply to @Chronotope
          When we acknowledge that the tools are unavoidable we take on a different responsibility. We must discover how to deform them. To use them for good. To use them in fundamentally different ways. And to empower others to do so.
      1. …in reply to @Chronotope
        Specifically to empower others who are *unlike us* to do so.
    1. …in reply to @Chronotope
      (Either that or we destroy their incentive structure. I'm down to eliminate global capitalism as a system of societal organization if you've got a good idea on how to do so, but I'm at a loss there. And SV seems obsessed with making it worse, so no leads from that area.)
  1. …in reply to @Chronotope
    Feel free to be angry at the people who built these things irresponsibly. But honestly Facebook was a follower here, not innovative but improving on existing technological and economic structures. The problem started with Ad Tech and arguably Google's AdWords...
    1. …in reply to @Chronotope
      Google AdWords, which launched on October 23, 2000, has far more to do with the origins of our current predicament. Facebook's targeted advertising system launched in 2007.
      1. …in reply to @Chronotope
        This doesn't free either organization from fault. My point is that this is a much larger issue than just Facebook. One ingrained in the thinking of Big Tech and in its inability, or lack of interest, in thinking through the negative consequences...
        1. …in reply to @Chronotope
          But also, forces of bad intent will always attempt to take over tools of good intent. In some ways this may have been both 100% predictable and not at all predictable.
          1. …in reply to @Chronotope
            For example: doctorow imagined a society freed by social reputation currency. Others followed. In the present day China is using a version of that idea for oppression. gizmodo.com/chinese-citizens-with-bad-social-credit-to-be-blocked-f-1823845648
            1. …in reply to @Chronotope
              Information wants to be free. It's a basic precept of the Internet and I still believe it. But that means bad information too. Our fixes can't be subtractive, they must be additive.
              1. …in reply to @Chronotope
                .samplereality wrote about building a deformed humanities, this is the type of path we need to seek. Our next stage of addressing the problems of persuasive technologies must be one that "tears apart existing structures and uses the scraps." samplereality.com/2012/05/02/notes-towards-a-deformed-humanities/
                1. …in reply to @Chronotope
                  You want to fix Facebook? (Or any of these systems?) The solution won't come from standing back and asking them to fix it. The solution comes from attacking them. Deform them by not following the rules. By not using them as intended. Use them as tools of education. Post madly.
                  1. …in reply to @Chronotope
                    The only way to save us from persuasive technology is to educate with it and turn it into a playground in which *everyone*--especially the minority voices silenced in Big Tech--can persuade.
                    1. …in reply to @Chronotope
                      (Some additional important detail for my reference to social reputation is attached to this tweet. A better analogy might be how American tech culture imagines meritocracy vs how China implemented meritocracy) doctorow/976853314418638848

