
Monday, April 20, 2020

The New York Times


Take YouTube’s Dangers Seriously





Illustration: Timo Lenzen






By Shira Ovide





My colleague Kevin Roose excels at explaining how our behavior is shaped by the companies behind our favorite online hangouts.




In the first episode of Kevin’s new audio series, called “Rabbit Hole,” he tells us how Caleb Cain, a college dropout in West Virginia, found himself watching ever more extreme YouTube videos. Caleb said he started to believe the racism, misogyny and conspiracy theories he absorbed.




People believe in fringe ideas for complex reasons. But Kevin points some blame at YouTube and its feature that recommends one video after another. This can push people from relatively mainstream videos toward dangerous ideas.




Our conversation about this, and more:




Aren’t most of us on YouTube for cooking videos and kittens, not conspiracies?




Kevin: People watch more than a billion hours of YouTube videos daily. While we can’t know how much of that is disturbing or dangerous, it’s inevitably a huge amount. And for a long time, people like Alex Jones and propaganda networks like RT had millions of subscribers and hundreds of millions of views.




How much blame does YouTube deserve for people like Caleb developing extreme views?




It’s a hard question. When someone gets drawn into an extremist rabbit hole on YouTube, it’s often because of loneliness, economic conditions and the “alternative influence network” of people who spread these ideas by, essentially, being good at YouTube.




But YouTube bears responsibility. Part of what makes YouTube seductive — and successful as a business! — is its automated recommendations, along with the autoplay feature that starts the next video as soon as you finish one. That software plays a huge role in what people watch.




If someone goes to a library, checks out “Mein Kampf” and becomes a neo-Nazi, that’s not the library’s fault. But if there’s a robot librarian who greets them at the front door, steers them to the German history section and puts “Mein Kampf” in front of them …




Oof. Do you think it would help if YouTube turned off video recommendations?




I do.




What do we collectively do?




We need to decrease the influence these platforms have over us. For me, removing automated features — turning off autoplay on YouTube, making my own Spotify playlists, making it so Alexa doesn’t automatically choose the dog food brand I buy — helps me feel more in control.




And we journalists at big news organizations can help by figuring out how to make true, factual information as appealing to people on YouTube as conspiracy theories.




When people who are radicalized online commit crimes, or Alexa leads us to buy a certain pet food, are these our choices? Or is the internet warping us?




Both! The French researcher Camille Roth writes that the algorithms powering websites like YouTube and Facebook come in two flavors: “read our minds” and “change our minds.” If we’re aware of the machines working on us, and can feel the ways they’re steering our choices, we can decide whether to follow a recommendation or go our own way.




