Do you know where your news information is coming from? Do you know where it is sending you?

In the past 10 days, there have been two very troubling news reports about YouTube and the artificial intelligence (AI) algorithm it uses to recommend content to its users. These reports follow earlier research and reporting indicating that the AI may be making recommendations that send some YouTube users down some very dark and dangerous rabbit holes.

In April, the University of North Carolina sociologist Zeynep Tufekci wrote about the potential danger of YouTube’s recommendation algorithm. After you watched what you were actually looking for – whether a music video, a movie preview, or a DIY tutorial – the next recommendation might not be so helpful, or so pleasant.

At the beginning of June, Max Fisher and Amanda Taub published a long-form, deep-dive report on one of the harms YouTube’s algorithm is actually causing.

Fisher and Taub’s reporting found that YouTube’s recommendation algorithm was pushing user-posted videos of semi-nude girls, usually home videos uploaded by families. “Semi-nude” here means girls, no older than eight, wearing a two-piece swimsuit or a nightie, or mostly unclothed and sometimes briefly nude. Pushed by the recommendation algorithm, these videos play immediately after users have purposefully selected soft-core porn videos to watch. The concern is that YouTube’s recommendation algorithm has created a gateway to both the unintended exploitation of children and child pornography.

YouTube says it is aware of the issue, telling Fisher and Taub that it is working on a solution, and it has removed some of these user-posted videos of children from the recommendation rotation.

The U.S. Congress is now beginning to address the problem, working on legislation sponsored by Sen. Hawley (R-MO). Senators Blumenthal (D-CT) and Blackburn (R-TN) have jointly written a letter to YouTube’s CEO demanding answers as to why YouTube will not simply turn off recommendations for these videos.

As disturbing as this is, YouTube does the same thing with politics. In 2018, Fisher and a colleague, Katrin Bennhold, took a deep journalistic dive into how YouTube recommends and promotes political videos, specifically in Germany.

What they found was that YouTube’s recommendation algorithm was pushing extremist far-right videos promoting false information intended to anger and incite viewers and to move them toward political extremism.

Just last week, Kevin Roose published an in-depth article looking at one specific case of what YouTube’s promotion of these extremist videos, and the misinformation they often convey, can do to a vulnerable person in the U.S. It focuses on Caleb Cain, a college dropout who went looking for self-help videos on YouTube and, following the recommendation algorithm from video to video, ended up self-radicalized into extremism.

While Cain’s example is, pardon the expression, an extreme case – and he has, fortunately, walked himself back from the extremist edge with help from others – the larger problem is that these search and recommendation algorithms are responsible for a significant share of the information, including news, that we receive. They are everywhere – from Google search to Google News to Google-owned YouTube to Facebook, Twitter, Amazon, and Netflix – and they are making recommendations for all of us.

These recommendations may be “what to read or watch next,” as with the curated news feeds of Google News, YouTube, Facebook, and Twitter. They may be shaping how you feel about the news you have just read, if you are getting your news, or some of it, through Facebook’s curated feed. And we’ll just leave what the algorithms behind online dating may or may not be doing to the rest of your life for another day…

In order to be Thinking Security in our fast-accelerating brave new digital world, we all have to be responsible consumers of news and information. We have to think about where our information comes from and who is providing or promoting it. We have to consider its purpose: whether the point is actually to inform, or to stoke a strong emotional response that could send any of us, as it sent Caleb, down a digital rabbit hole of extremism and hate.

Not everyone who goes down these rabbit holes emerges in as good a place as Caleb Cain did. Lane Davis, who both consumed and generated the kind of content that helped Cain self-radicalize, came to a far more tragic end. To avoid that, we need to be thinking smartly about the news and information we consume in our digital age.