The Rise of the Robots: Recent Must- and Should-Reads as of May 15, 2018

Another piece worrying that human beings are simply unequipped to deal with an advertising-supported internet, in which money flows to those who hack your brain to glue your eyeballs to the screen: Ben Popken: As algorithms take over, YouTube's recommendations highlight a human problem: "A supercomputer playing chess against your mind to get you to keep watching...

...After YouTube built the system that recommends videos to its users, former employee Guillaume Chaslot, a software engineer in artificial intelligence who worked on the site's recommendation engine in 2010-2011, said he watched as it started pushing users toward conspiracy videos. Chaslot said the platform's complex "machine learning" system, which uses trial and error combined with statistical analysis to figure out how to get people to watch more videos, figured out that the best way to get people to spend more time on YouTube was to show them videos light on facts but rife with wild speculation.... This isn't just a YouTube problem. Chaslot's research on YouTube, which he released earlier this year, added to growing concerns about the pervasiveness of similar algorithms throughout modern society....
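The article characterizes the recommender only as "trial and error combined with statistical analysis" aimed at watch time. A minimal sketch of that idea, not YouTube's actual system, is an epsilon-greedy bandit that treats each candidate video as an arm and observed watch time as the reward; all class and variable names here are illustrative:

```python
import random

class WatchTimeBandit:
    """Toy epsilon-greedy recommender.

    Each candidate video is a bandit arm; the only reward signal is
    observed watch time in seconds. Illustrative sketch only, not
    YouTube's actual recommendation system.
    """

    def __init__(self, video_ids, epsilon=0.1):
        self.epsilon = epsilon                    # exploration rate
        self.plays = {v: 0 for v in video_ids}    # times each video was shown
        self.watch = {v: 0.0 for v in video_ids}  # cumulative watch time

    def recommend(self):
        # Trial and error: occasionally explore a random video...
        if random.random() < self.epsilon:
            return random.choice(list(self.plays))
        # ...otherwise exploit the highest average watch time so far.
        def avg(v):
            return self.watch[v] / self.plays[v] if self.plays[v] else 0.0
        return max(self.plays, key=avg)

    def record(self, video_id, watch_seconds):
        # The "statistical analysis": a running average per video.
        self.plays[video_id] += 1
        self.watch[video_id] += watch_seconds
```

Nothing in an objective like this distinguishes a documentary from wild speculation: whichever video keeps people watching longer accumulates the higher average and gets recommended more often, which is exactly the failure mode the article describes.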

This reporter was helping his son research outer space for his school project. When he searched for "Saturn," the first results were mostly documentaries. One of the recommended videos was "10 facts you didn't know about space." That video led to additional recommendations such as "can you believe it" videos, a synthesized voice reading Nostradamus predictions, and a clip "they don't want you to see" of pro-Putin propaganda. What had started out as a simple search for fun science facts for kindergartners had quickly led to a vast conspiracy ecosystem.... Chaslot worked on a project to introduce diversity to YouTube's video recommendations starting in 2010. It didn't do as well on watch time, he said, so it was shut down and not used. "This is dangerous because this is an algorithm that's gaslighting people to make them believe that everybody lies to them just for the sake of watch time," he said....
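Popken's article says only that Chaslot's project introduced "diversity" into recommendations and lost a head-to-head comparison on watch time. One standard way to make that trade-off explicit is a maximal-marginal-relevance-style rerank, sketched below with hypothetical names; the article does not say this is how Chaslot's project actually worked:

```python
def rerank_with_diversity(candidates, similarity, lam=0.7):
    """Greedy MMR-style reranking (illustrative sketch).

    candidates: list of (video_id, watch_time_score) pairs
    similarity: function (video_id, video_id) -> float in [0, 1]
    lam:        1.0 ranks purely by watch-time score, 0.0 purely by diversity
    """
    selected, pool = [], list(candidates)
    while pool:
        def mmr(item):
            vid, score = item
            # Penalize videos similar to what has already been picked.
            max_sim = max((similarity(vid, s) for s, _ in selected), default=0.0)
            return lam * score - (1.0 - lam) * max_sim
        best = max(pool, key=mmr)
        selected.append(best)
        pool.remove(best)
    return [vid for vid, _ in selected]
```

Any lam below 1.0 deliberately gives up some predicted watch time in exchange for variety, so when the only metric is watch time a reranker like this will lose the comparison, which matches Chaslot's account of why the project was shut down.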

The conspiracy videos are perfectly positioned to push our buttons and draw us in to consume more of them, exactly the engagement signals that YouTube's algorithm prioritizes, wrote Robert J. Blaskiewicz Jr., a columnist for the Committee for Skeptical Inquiry.... "Conspiracy stories hit our emotional fight-or-flight triggers," Blaskiewicz wrote. "And the stories rest on the unstated premise that knowledge of the conspiracy will protect you from being manipulated. This in itself compels people to watch and absorb as much as they can and to revisit videos."... YouTube has said it is simply reflecting what users want to see, and that videos are chosen based on each user's individual profile and viewing history....

YouTube has to balance protecting its profits with keeping the trust of its users; fail to walk that line and it can begin to undermine user value, said Kara Swisher, Recode executive editor and MSNBC contributor. "I think it's a problem not just throughout YouTube, but Google, Facebook, all these companies: they prioritize growth over anything else. They may not be meaning to do it, but if growth is the goal, then user experience is not the goal," said Swisher. "Real users, the ones you're trying to attract, go away. And so it's in all their interests from a business point of view to clean this place up and to have more control over it, and there's a moral responsibility to create a platform that isn't being abused by anybody." "Good advertisers don't wanna be next to these kinds of videos either," she added...

#shouldread
#riseoftherobots
#publicsphere
