Professor Jonathan Albright said he had used several hundred "seed videos" returned by a search of YouTube's API for the term "crisis actor".
He chose that term because it has been applied to a number of students from the Parkland, Florida, school where a shooting recently took place, as part of a disinformation effort claiming they are not genuinely concerned about preventing a repeat of such events.
Albright said the "next up" recommendations for each of his results generated the 9000-odd videos referred to earlier.
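Albright has not published his exact crawler, but the two steps he describes, searching for seed videos and then expanding through each result's recommendations, can be sketched against the YouTube Data API v3. The endpoint and parameter names below are real API features; the helper function names, the API key handling and the result limits are illustrative assumptions, not his actual code.

```python
# Sketch of a two-step "seed then expand" crawl, assuming the
# YouTube Data API v3. Only builds request URLs; fetching and
# parsing the JSON responses is left out for brevity.
import urllib.parse

API_BASE = "https://www.googleapis.com/youtube/v3"

def search_url(query, api_key, max_results=50):
    """Step 1: build a search request for seed videos matching a query
    such as 'crisis actor'."""
    params = {
        "part": "snippet",
        "q": query,
        "type": "video",
        "maxResults": max_results,
        "key": api_key,
    }
    return f"{API_BASE}/search?" + urllib.parse.urlencode(params)

def related_url(video_id, api_key, max_results=50):
    """Step 2: build a request for videos YouTube relates to a seed
    video -- a rough proxy for the 'next up' recommendations."""
    params = {
        "part": "snippet",
        "relatedToVideoId": video_id,
        "type": "video",  # required when relatedToVideoId is set
        "maxResults": max_results,
        "key": api_key,
    }
    return f"{API_BASE}/search?" + urllib.parse.urlencode(params)
```

Running step 2 over every seed returned by step 1, and deduplicating the results, would yield a network of the kind Albright describes, on the order of thousands of videos from a few hundred seeds.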
Some of the headings on videos unearthed by Albright.
"Themes include rape game jokes, shock reality social experiments, celebrity paedophilia, 'false flag' rants, and terror-related conspiracy theories dating back to the Oklahoma City attack in 1995," Albright wrote.
Earlier this month, iTWire reported that an investigation by The Wall Street Journal had found that the algorithm YouTube uses to recommend videos related to the one a user is viewing often appeared to lead to conspiracy theories, partisan viewpoints and misleading clips, even when the viewer had shown no interest in that kind of content.
Along with his article, Albright has posted snapshots of the network generated from the 9000 "crisis actor" videos.
"I’m not finished with this part of the analysis yet, so these are very rough drafts. But they are interesting to look through to get a sense of the quality and breadth of content that’s being hosted, monetised, and promoted on Youtube," he said.
"Every religion, government branch, geopolitical flashpoint issue, shadow organisation — and every mass shooting and domestic terror event — are seemingly represented in this focused collection."
Albright said it appeared that mass shooting, false flag and crisis actor conspiracy videos were a well-established, if not flourishing, genre on YouTube.
He said the number of views for 50 of the top mass shooting-related conspiracy videos was around 50 million.
"Not every single video overlaps directly with conspiracy-related subjects, but it’s worth pointing out that these 8842 videos have registered almost four billion (3,956,454,363) views," he added.
YouTube, which is owned by Google, ran into trouble last year when major advertisers began pulling their ads in March, after a report stated that the ads were appearing on videos with racist, sexist, extremist and anti-Semitic content.
In November last year, it was reported that clips of scantily clad children were carrying ads from brands like Mondelez, Lidl and Mars.