Misinformation on YouTube

Recently I read an article, first found on Slashdot and later tracked down to the original in JAMA (The Journal of the American Medical Association), about a content analysis of vaccination videos on YouTube. Researchers at the University of Toronto took a sample of 153 YouTube videos found by searching for the keywords vaccination or immunization. They coded each video by whether it conveyed a positive or negative message about vaccination, and they coded the specific scientific claims being made as either substantiated or unsubstantiated, using the Canadian Immunization Guide as a reference. 48% of the videos had a positive message, 32% were negative, and 20% were ambiguous. Troublingly, the negative videos had a higher mean user rating (i.e. 1-5 stars) than the positive ones. 14.3% of the sample (22 videos) conveyed messages that were not substantiated with reference to the Canadian Immunization Guide.

The way this was reported in some news outlets (including the press release) was itself misleading. It was reported that 45% "of those videos" contradicted the Canadian reference guide; however, that 45% applied not to the total sample but only to the negative videos. As a percentage of the entire sample (22 / 153), the figure is 14.3%, considerably less alarming than what was reported. So misinformation is still an issue on YouTube, but the magnitude of the effect wasn't stated clearly; yet another reason to go back to the original source.
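A minimal sketch of the arithmetic behind the correction, using the figures summarized above (the variable names and rounding are mine, not the study's):

```python
# Figures from the study as summarized in the post.
total_videos = 153
negative_share = 0.32        # 32% of the sample carried a negative message
contradicting_share = 0.45   # 45% *of the negative videos* contradicted the guide

negative_videos = round(total_videos * negative_share)        # about 49
contradicting = round(negative_videos * contradicting_share)  # about 22

# The misleading reports applied the 45% figure to the whole sample;
# the correct denominator for a whole-sample claim is all 153 videos.
misreported = contradicting_share * 100          # 45%
actual = contradicting / total_videos * 100      # about 14%
print(f"misreported: {misreported:.0f}%, actual: {actual:.1f}%")
```

The point of the check: 45% of a 32% subsample is roughly 14% of the whole, which is the figure the post arrives at.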
