An investigation by The Wall Street Journal found that when a viewer picked a video reflecting a political bias, the recommended videos tended toward more extreme content conforming to the same viewpoint.
YouTube, which is owned by Google, ran into trouble last year: beginning in March, major advertisers pulled their ads after a report revealed that they were appearing on videos containing racist, sexist, extremist and anti-Semitic content.
In November last year, it was reported that clips of scantily clad children were running ads from brands such as Mondelez, Lidl and Mars.
The WSJ report quoted Northeastern University computer-science professor Christo Wilson, who studies the impact of algorithms, as saying: "The editorial policy of these new platforms is to essentially not have one.
"That sounded great when it was all about free speech and ‘in the marketplace of ideas, only the best ones win.’ But we’re seeing again and again that that’s not what happens. What’s happening instead is the systems are being gamed and people are being gamed.”
Johanna Wright, YouTube's product-management chief for recommendations, acknowledged that the recommendations were a problem. Engineers at the site said the algorithm did not seek out extreme videos but favored highly trafficked ones, which often turned out to be extreme.