Chronotope’s Twitter Archive—№ 128,021

    1. …in reply to @taubeneck
@taubeneck @acfou @swodinsky @robinberjon Reality would seem to prove otherwise. Are these systems designed to be exploitative? I can't read minds. Is the end product exploitative, seen as exploitative by some buyers and pursued to exploit? Intent is irrelevant when no action has been taken to stop the end result.
  1. …in reply to @Chronotope
@taubeneck @acfou @swodinsky @robinberjon If the endpoint of system optimization is a coup of a country's government (perhaps more than one country's government), then whatever you want to call your goals... they're the wrong goals, and the end result is the only thing worth examining.
    1. …in reply to @Chronotope
      taubeneck acfou swodinsky robinberjon "many engineers just make bad technology because they don’t think about humans when they’re making it. They’re obsessed w/the technical details—but you have to start to realize your actions have consequences. It is indefensible to lack morals in software" vice.com/en/article/7x5an4/why-i-quit-github