YouTube purging chemistry videos

In summary, YouTube appears to be indiscriminately removing chemistry videos from the site and banning their creators, eliminating a valuable resource for chemistry education.
  • #1
Ygggdrasil
Chemistry World (a magazine published by the UK Royal Society of Chemistry) is reporting that YouTube appears to be indiscriminately removing chemistry videos from the site and banning the creators of those videos:
‘Seven years I ran the channel for, slowly getting it to 8000 subscribers and 1 million total views. It was gone in less than two hours and I could not stop it,’ says chemistry PhD student Tom from Explosions&Fire about his collection of over 90 videos. In late 2017, he received a strike for a video published four years prior. Strikes are given for content that violates YouTube’s community guidelines. Tom removed the flagged video, but received strikes for another two videos. The channel was suspended and not reinstated despite multiple appeals. He created a new channel, but says that ‘it is only a matter of time before it goes again’.

While Tom’s focus on energetic compounds might have put him in a grey zone of YouTube’s rules around dangerous content, other areas of synthetic chemistry have not been left untouched. In early 2018, a channel called ChemPlayer, run by an anonymous group of chemists, was terminated. It had received three strikes in quick succession for videos on phenylacetic acid synthesis, Grignard reactions and chocolate cake making.
https://www.chemistryworld.com/news...oul-of-youtubes-content-purge/3009206.article

Writers at Chemistry World argue that the removal of these videos is problematic as it removes a valuable resource for chemistry education and reinforces (and likely stems from) public misconceptions about chemistry:
We now know of a number of chemistry channels, some containing hundreds of videos, that have been taken down. It seems that censors with an itchy banning finger have pulled them because they contravene the video sharing platform’s community guidelines in some way. Exactly how is unknown as YouTube doesn’t provide detailed reasoning when it removes a video. It seems that these videos have been swept up in the media giant’s response to pressure from a number of governments – including the UK’s – to clean up its act and take down objectionable and illegal content. Chemistry videos being purged from video sites may sound like a trivial matter, but it’s important: these videos are inspiring the next generation of chemists.

While we may have been inspired by school lab demonstrations when we were growing up, this generation of schoolkids is lucky enough to also have a vast repository of videos covering every imaginable topic. These cover everything from the barking dog (an exothermic reaction I imagine most schools would consider too risky to perform) to experiments explaining everyday processes. It’s not just for schoolchildren either: hobby chemists can be found performing sophisticated syntheses too.
https://www.chemistryworld.com/news/the-ban-wagon-rolls-on/3009290.article
 
  • #2
"You Tube?" Manny to Diego in Ice Age, "Diego, spit that out. You don't know where it's been." Or something to that effect.
 
  • #3
YouTube has been inappropriately removing videos and banning channels these days. Seriously, they need better filtering AI.
 
  • #4
Maybe YouTube is not well suited to assessing some kinds of videos, but YouAlwaysHaveOtherOptions if you produce videos and want to upload them to other internet sites. The problem is that some inappropriate videos can still be uploaded onto those alternatives, too.
 
  • #5
Sigh. But what to expect if even PF is not free of occasional chemophobia?
 
  • #6
I suspect that the learning algorithm YouTube implemented to screen inappropriate content has been too aggressive in associating "chemistry" with dangerous materials and methods (I'm thinking specifically of bomb making, or chemical weapons such as sarin), and has thus been flagging educational material indiscriminately.

This, IMHO, highlights the risk of putting too much trust in black-box algorithms to filter content, and calls for more open-source, interpretable algorithms to be implemented where possible.
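To illustrate the kind of spurious association I mean, here is a toy sketch (nothing like YouTube's real system, which is not public, and all of the data below is made up): a naive word-frequency model trained on a handful of hypothetical reports ends up scoring any video that mentions "chemistry" as removal-worthy, even an innocuous educational one.

```python
from collections import Counter

# Hypothetical training set of (report text, video was removed) pairs.
# Because the dangerous videos in this tiny sample all happen to mention
# "chemistry", a naive frequency model associates the word itself with removal.
train = [
    ("chemistry bomb synthesis", True),
    ("chemistry dangerous explosive", True),
    ("chemistry grignard reaction tutorial", False),
    ("cat video compilation", False),
]

flagged = Counter()
total = Counter()
for text, removed in train:
    for word in set(text.split()):
        total[word] += 1
        if removed:
            flagged[word] += 1

def removal_score(text):
    """Average, over the words in `text`, of the fraction of flagged
    training examples containing each word (unseen words score 0)."""
    words = text.split()
    return sum(flagged[w] / total[w] for w in words if total[w]) / len(words)

# An innocuous educational video scores high purely because it says "chemistry".
print(removal_score("chemistry education video"))  # ≈ 0.22
print(removal_score("cat video compilation"))      # 0.0
```

The point of the sketch is that the model has no notion of *why* the flagged videos were dangerous; it just latches onto whatever word correlates with removal in its training data.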
 
  • #7
StatGuy2000 said:
I suspect that the learning algorithm YouTube implemented to screen inappropriate content has been too aggressive in associating "chemistry" with dangerous materials and methods (I'm thinking specifically of bomb making, or chemical weapons such as sarin), and has thus been flagging educational material indiscriminately.

This, IMHO, highlights the risk of putting too much trust in black-box algorithms to filter content, and calls for more open-source, interpretable algorithms to be implemented where possible.

Agreed. A lot of the problems with YouTube, Google, Facebook, etc. stem from their over-reliance on algorithms to curate and moderate content. The companies do not want to cut into their profits by hiring people to perform these tasks. Unfortunately, this leads to some rather bad situations:

The algorithm has been found to be promoting conspiracy theories about the Las Vegas mass shooting and incentivising, through recommendations, a thriving subculture that targets children with disturbing content such as cartoons in which the British children’s character Peppa Pig eats her father or drinks bleach.

Lewd and violent videos have been algorithmically served up to toddlers watching YouTube Kids, a dedicated app for children. One YouTube creator who was banned from making advertising revenues from his strange videos – which featured his children receiving flu shots, removing earwax, and crying over dead pets – told a reporter he had only been responding to the demands of Google’s algorithm.
https://www.theguardian.com/technology/2018/feb/02/how-youtubes-algorithm-distorts-truth

There are also many cases of the Google and Facebook algorithms aiding the spread of wrong or deliberately false information, as detailed in this nice piece from The Atlantic: https://www.theatlantic.com/technology/archive/2017/10/google-and-facebook-have-failed-us/541794/

Algorithms can also have severe problems: they not only learn useful information from their training sets, but often also pick up the biases and prejudices of society along the way:
Google’s translation program decided that soldiers, doctors and entrepreneurs were men, while teachers and nurses were women. Overwhelmingly, the professions were male. Finnish and Chinese translations had similar problems of their own, Quartz noted.

What was going on? Google’s Translate tool “learns” language from an existing corpus of writing, and the writing often includes cultural patterns regarding how men and women are described. Because the model is trained on data that already has biases of its own, the results that it spits out serve only to further replicate and even amplify them.

It might seem strange that a seemingly objective piece of software would yield gender-biased results, but the problem is an increasing concern in the technology world. The term is “algorithmic bias” -- the idea that artificially intelligent software, the stuff we count on to do everything from power our Netflix recommendations to determine our qualifications for a loan, often turns out to perpetuate social bias.
https://www.politico.com/agenda/story/2018/02/07/algorithmic-bias-software-recommendations-000631

The problem is worse in the case of the chemistry videos, as an untrained human would also have trouble distinguishing safe from unsafe chemistry content (e.g., even non-chemist moderators on PF have had trouble with this issue).
 
  • #8
ygggdrasil, post #7,

So disappointing that quality checking is a matter of some automated "algorithm" and a real person does not view, think, and assess. Technological Progress, maybe!
 
  • #9
symbolipoint said:
ygggdrasil, post #7,

So disappointing that quality checking is a matter of some automated "algorithm" and a real person does not view, think, and assess. Technological Progress, maybe!

@symbolipoint , the simple fact is that given the sheer volume of data uploaded to YouTube (among other services), it is simply impractical to expect humans to check and moderate all such content. Some form of automation is required -- that's just a fact.

The issue here (at least as I suspect it) is that the learning algorithm is learning not only the genuine patterns in its training set but also the random noise in the data, which drives training error down while increasing prediction error -- and those prediction errors lead to strikes on perfectly acceptable content. This is a classic case of what machine learning researchers and statisticians call "overfitting".

Overfitting is an issue that plagues the field of learning algorithms, and the engineers/scientists at YouTube should be paying closer attention to it.
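To make the overfitting point concrete, here is a toy sketch (pure illustration, with made-up data): a degree-9 polynomial fitted to ten noisy samples of a straight line achieves near-zero training error by memorising the noise, but predicts the true underlying rule worse than a simple linear fit does.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy observations of a simple underlying rule (y = x). The noise
# plays the role of wrong/false reports in the moderation analogy.
x_train = np.linspace(-1, 1, 10)
y_train = x_train + rng.normal(0, 0.2, size=x_train.size)
x_test = np.linspace(-1, 1, 200)
y_test = x_test  # the true, noise-free rule

def fit_and_errors(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

simple = fit_and_errors(1)   # matches the true rule's complexity
complex_ = fit_and_errors(9) # flexible enough to memorise the noise

# The flexible model drives training error toward zero by fitting the
# noise, yet predicts the true rule worse: the signature of overfitting.
assert complex_[0] < simple[0]  # lower training error...
assert complex_[1] > simple[1]  # ...but higher prediction error
```

In the analogy, a moderation model that "fits the noise" of mistaken reports will look excellent on its training data while handing out strikes to perfectly acceptable videos it has never seen before.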
 
  • #10
StatGuy2000 said:
@symbolipoint , the simple fact is that given the sheer volume of data uploaded to YouTube (among other services), it is simply impractical to expect humans to check and moderate all such content. Some form of automation is required -- that's just a fact.

The issue here (at least as I suspect it) is that the learning algorithm is learning not only the genuine patterns in its training set but also the random noise in the data, which drives training error down while increasing prediction error -- and those prediction errors lead to strikes on perfectly acceptable content. This is a classic case of what machine learning researchers and statisticians call "overfitting".

Overfitting is an issue that plagues the field of learning algorithms, and the engineers/scientists at YouTube should be paying closer attention to it.
This is such a good point and I couldn't agree more.

Reporting inappropriate content on YouTube is based purely on individual human perception. Sometimes these reports are wrong, but the learning algorithm probably takes them at face value and cannot perfectly distinguish genuine reports from false alarms. So, just as you say, it also fits the noise.

I've seen several occasions of channels being deleted because haters made up false reports, and the thing is, it works: accounts actually do get deleted over false reports.

I really hope YouTube finds a better algorithm for more robust fitting.
 

FAQ: YouTube purging chemistry videos

1. What is "YouTube purging chemistry videos"?

YouTube purging chemistry videos refers to the recent trend of YouTube removing or demonetizing videos related to chemistry and other scientific topics. This has sparked controversy and concern among creators and viewers who believe it is hindering the spread of educational content.

2. Why is YouTube purging chemistry videos?

The exact reasons for YouTube purging chemistry videos are not clear. However, it is believed that it is due to a combination of factors such as changes in YouTube's algorithm, stricter content policies, and pressure from advertisers.

3. Which chemistry channels are being affected by this purge?

Many popular chemistry channels have been affected by this purge, including NileRed, NurdRage, Cody'sLab, and Periodic Videos. However, smaller channels and individual videos have also been impacted.

4. Will YouTube continue to purge chemistry videos in the future?

It is uncertain if YouTube will continue to purge chemistry videos in the future. However, many creators are taking precautions by diversifying their content and creating backup channels on other platforms.

5. What can be done to address the issue of YouTube purging chemistry videos?

Many creators and viewers have been voicing their concerns to YouTube through social media and various petitions. It is also important for viewers to support creators by watching their videos and engaging with their content on other platforms. Additionally, creators can explore alternative video hosting platforms or collaborate with other creators to promote their content.
