YouTube purging chemistry videos

  • #1
Ygggdrasil
Science Advisor
Insights Author
Gold Member
2019 Award

Main Question or Discussion Point

Chemistry World (a magazine published by the UK's Royal Society of Chemistry) is reporting that YouTube appears to be indiscriminately taking chemistry videos off the site and banning the creators of those videos:
‘Seven years I ran the channel for, slowly getting it to 8000 subscribers and 1 million total views. It was gone in less than two hours and I could not stop it,’ says chemistry PhD student Tom from Explosions&Fire about his collection of over 90 videos. In late 2017, he received a strike for a video published four years prior. Strikes are given for content that violates YouTube’s community guidelines. Tom removed the flagged video, but received strikes for another two videos. The channel was suspended and not reinstated despite multiple appeals. He created a new channel, but says that ‘it is only a matter of time before it goes again’.

While Tom’s focus on energetic compounds might have put him in a grey zone of YouTube’s rules around dangerous content, other areas of synthetic chemistry have not been left untouched. In early 2018, a channel called ChemPlayer, run by an anonymous group of chemists, was terminated. It had received three strikes in quick succession for videos on phenylacetic acid synthesis, Grignard reactions and chocolate cake making.
https://www.chemistryworld.com/news/hobby-chemists-fall-foul-of-youtubes-content-purge/3009206.article

Writers at Chemistry World argue that the removal of these videos is problematic as it removes a valuable resource for chemistry education and reinforces (and likely stems from) public misconceptions about chemistry:
We now know of a number of chemistry channels, some containing hundreds of videos, that have been taken down. It seems that censors with an itchy banning finger have pulled them because they contravene the video sharing platform’s community guidelines in some way. Exactly how is unknown as YouTube doesn’t provide detailed reasoning when it removes a video. It seems that these videos have been swept up in the media giant’s response to pressure from a number of governments – including the UK’s – to clean up its act and take down objectionable and illegal content. Chemistry videos being purged from video sites may sound like a trivial matter, but it’s important: these videos are inspiring the next generation of chemists.

While we may have been inspired by school lab demonstrations when we were growing up, this generation of schoolkids is lucky enough to also have a vast repository of videos covering every imaginable topic. These cover everything from the barking dog (an exothermic reaction I imagine most schools would consider too risky to perform) to experiments explaining everyday processes. It’s not just for schoolchildren either: hobby chemists can be found performing sophisticated syntheses too.
https://www.chemistryworld.com/news/the-ban-wagon-rolls-on/3009290.article
 

Answers and Replies

  • #2
Bystander
Science Advisor
Homework Helper
Gold Member
"You Tube?" Manny to Diego in Ice Age, "Diego, spit that out. You don't know where it's been." Or something to that effect.
 
  • #3
HAYAO
Science Advisor
Gold Member
YouTube has been inappropriately removing videos and banning channels these days. Seriously, they need better filtering AI.
 
  • #4
symbolipoint
Homework Helper
Education Advisor
Gold Member
Maybe YouTube is not ready to be the best online site for assessing some kinds of videos, but YouAlwaysHaveOtherOptions if you produce videos and want to upload them to other internet sites. A problem is that some inappropriate videos can still be uploaded to the alternatives, too.
 
  • #5
Borek
Mentor
Sigh. But what to expect if even PF is not free of occasional chemophobia?
 
  • #6
StatGuy2000
Education Advisor
I suspect that the learning algorithm YouTube implemented for screening inappropriate content has formed too strong an association between "chemistry" and dangerous materials/methods (I'm thinking specifically of bomb making or chemical weapons such as sarin), and has thus been flagging educational material indiscriminately.

This, IMHO, highlights the risk of putting too much trust in black-box algorithms to filter content, and is a call for more open-source, interpretable algorithms to be implemented where possible.
 
  • #7
Ygggdrasil
Science Advisor
Insights Author
Gold Member
2019 Award
StatGuy2000 said:
I suspect that the learning algorithm YouTube implemented for screening inappropriate content has formed too strong an association between "chemistry" and dangerous materials/methods (I'm thinking specifically of bomb making or chemical weapons such as sarin), and has thus been flagging educational material indiscriminately.

This, IMHO, highlights the risk of putting too much trust in black-box algorithms to filter content, and is a call for more open-source, interpretable algorithms to be implemented where possible.
Agreed. A lot of the problems with YouTube, Google, Facebook, etc. stem from their over-reliance on algorithms to curate and moderate content. The companies do not want to cut into their profits by hiring people to perform these tasks. Unfortunately, this leads to some rather bad situations:

The algorithm has been found to be promoting conspiracy theories about the Las Vegas mass shooting and incentivising, through recommendations, a thriving subculture that targets children with disturbing content such as cartoons in which the British children’s character Peppa Pig eats her father or drinks bleach.

Lewd and violent videos have been algorithmically served up to toddlers watching YouTube Kids, a dedicated app for children. One YouTube creator who was banned from making advertising revenues from his strange videos – which featured his children receiving flu shots, removing earwax, and crying over dead pets – told a reporter he had only been responding to the demands of Google’s algorithm.
https://www.theguardian.com/technology/2018/feb/02/how-youtubes-algorithm-distorts-truth

There are also many cases of the Google and Facebook algorithms aiding the spread of wrong or deliberately false information, as detailed in this nice piece from The Atlantic: https://www.theatlantic.com/technology/archive/2017/10/google-and-facebook-have-failed-us/541794/

Algorithms can also have severe problems because they not only learn useful information from their training sets, but often pick up the biases and prejudices of society along the way:
Google’s translation program decided that soldiers, doctors and entrepreneurs were men, while teachers and nurses were women. Overwhelmingly, the professions were male. Finnish and Chinese translations had similar problems of their own, Quartz noted.

What was going on? Google’s Translate tool “learns” language from an existing corpus of writing, and the writing often includes cultural patterns regarding how men and women are described. Because the model is trained on data that already has biases of its own, the results that it spits out serve only to further replicate and even amplify them.

It might seem strange that a seemingly objective piece of software would yield gender-biased results, but the problem is an increasing concern in the technology world. The term is “algorithmic bias” -- the idea that artificially intelligent software, the stuff we count on to do everything from power our Netflix recommendations to determine our qualifications for a loan, often turns out to perpetuate social bias.
https://www.politico.com/agenda/story/2018/02/07/algorithmic-bias-software-recommendations-000631

The problem is worse in the case of the chemistry videos, as an untrained human would also have trouble distinguishing safe from unsafe content in chemistry (e.g. even non-chemist moderators on PF have had trouble with this issue).
 
  • #8
symbolipoint
Homework Helper
Education Advisor
Gold Member
ygggdrasil, post #7,

So disappointing that quality checking is a matter of some automated "algorithm" and a real person does not view, think, and assess. Technological Progress, maybe!
 
  • #9
StatGuy2000
Education Advisor
symbolipoint said:
ygggdrasil, post #7,

So disappointing that quality checking is a matter of some automated "algorithm" and a real person does not view, think, and assess. Technological Progress, maybe!
@symbolipoint , the simple fact is that, given the sheer volume of content uploaded to YouTube (among other services), it is simply impractical to expect humans to check and moderate all of it. Some form of automation is required -- that's just a fact.

The issue here (at least as I suspect it) is that the learning algorithm is not just capturing the genuine signal in its training set, but is also fitting the random noise in the data. This drives the training error down while driving the prediction error on new content up -- and those prediction errors lead to strikes on perfectly acceptable content. This is a classic case of what machine learning researchers and statisticians refer to as "overfitting".

Overfitting is a problem that plagues machine learning, and the engineers/scientists at YouTube should be paying closer attention to it.
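The training-vs-prediction error gap described above is easy to demonstrate. Below is a minimal, hypothetical sketch (nothing to do with YouTube's actual system): polynomials of low and high degree are fit to noisy samples of a simple function, and the high-degree model chases the noise, so its training error drops while its error on fresh data grows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy training samples of a simple underlying function, y = sin(x).
x_train = np.linspace(0, 3, 15)
y_train = np.sin(x_train) + rng.normal(0, 0.3, size=x_train.size)

# Clean held-out points for measuring prediction (test) error.
x_test = np.linspace(0, 3, 100)
y_test = np.sin(x_test)

def fit_and_errors(degree):
    # Least-squares polynomial fit, then mean-squared error on each set.
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

train3, test3 = fit_and_errors(3)     # modest model: smooths over the noise
train14, test14 = fit_and_errors(14)  # one coefficient per point: interpolates the noise

# Overfit model: lower training error, higher prediction error.
print(train3, test3)
print(train14, test14)
```

The degree-14 fit passes (nearly) through every noisy training point, so its training error is close to zero, yet it oscillates wildly between points and does far worse on the held-out data than the humble cubic.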
 
  • #10
HAYAO
Science Advisor
Gold Member
StatGuy2000 said:
@symbolipoint , the simple fact is that, given the sheer volume of content uploaded to YouTube (among other services), it is simply impractical to expect humans to check and moderate all of it. Some form of automation is required -- that's just a fact.

The issue here (at least as I suspect it) is that the learning algorithm is not just capturing the genuine signal in its training set, but is also fitting the random noise in the data. This drives the training error down while driving the prediction error on new content up -- and those prediction errors lead to strikes on perfectly acceptable content. This is a classic case of what machine learning researchers and statisticians refer to as "overfitting".

Overfitting is a problem that plagues machine learning, and the engineers/scientists at YouTube should be paying closer attention to it.
This is such a good point and I couldn't agree more.

Reporting inappropriate content on YouTube is based purely on individual human perception. Sometimes these reports are wrong, but the learning algorithm probably takes them at face value and cannot reliably distinguish genuine reports from false alarms. So, just as you say, it also fits the noise.

I've seen several occasions of channels being deleted by haters filing false reports, and the thing is, it works: accounts really do get deleted over false reports.

I really hope YouTube finds a better algorithm for more robust fitting.
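To put a number on how badly false reports can behave as label noise, here is a toy simulation (all probabilities and policies are made up for illustration, not YouTube's actual system): a "strike on any single report" rule punishes a large fraction of innocent channels, while requiring agreement among several independent reports filters out most false alarms.

```python
import numpy as np

rng = np.random.default_rng(1)

n_videos = 10_000
truly_bad = rng.random(n_videos) < 0.02  # 2% of videos genuinely violate the rules

# Each video receives 5 independent report/no-report decisions. Assumed rates:
# honest viewers flag bad videos 90% of the time; haters false-report
# perfectly fine videos 5% of the time.
p_report = np.where(truly_bad, 0.90, 0.05)
reports = rng.random((n_videos, 5)) < p_report[:, None]

# Naive policy: strike as soon as any single report arrives.
naive_strike = reports.any(axis=1)
# More robust policy: require a majority (3 of 5) of independent reports.
robust_strike = reports.sum(axis=1) >= 3

def false_positive_rate(strikes):
    # Fraction of genuinely fine videos that still get struck.
    good = ~truly_bad
    return strikes[good].mean()

print(false_positive_rate(naive_strike))   # roughly 1 - 0.95**5, i.e. over 20%
print(false_positive_rate(robust_strike))  # two orders of magnitude smaller
```

The point is not the specific numbers but the mechanism: treating each report as ground truth bakes the reporters' noise into the moderation decision, while aggregating independent signals before acting suppresses it.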
 
