One jihadi clip on YouTube, in Arabic with French subtitles, lauds terrorists “martyred for Allah.” User comments include: “beautiful… may Allah give us all the knowledge and power to accelerate our imams.” In other words, the pictures of smiling terrorists and their dead bodies serve as an inspiration to young Muslims seeking Paradise through martyrdom. This is not theoretical. According to the website Wired UK, as of June 5, there had been 535 terrorist attacks around the world — with 3,635 fatalities — since the beginning of 2017 alone.
By hosting such videos, YouTube and its parent company, Google, are effectively accessories to murder. They are also inviting class-action lawsuits from families and individuals victimized by terrorism, and they need to be held criminally liable for aiding and abetting mass murder.
In mid-March this year, major companies began withdrawing or reducing advertising from Google Inc., the owner of YouTube, for allowing their brand names to pop up alongside videos promoting jihad, according to a new report released on June 15 by the Middle East Media Research Institute (MEMRI).
According to the report — which documents Google’s failure to remove jihadi content that MEMRI had volunteered to help flag — AT&T, Verizon, Johnson & Johnson, Enterprise Holdings and GSK are among the companies that have pulled their ads from the platform so far. Google responded by promising to be more aggressive about ensuring the brand safety of ad placements.
Then came the Westminster attack. On March 22, 2017, Khalid Masood rammed his car into pedestrians — killing four people and wounding dozens of others — then stabbed an unarmed police officer to death.
Exactly two months later, on May 22, Salman Ramadan Abedi detonated a shrapnel-laden homemade bomb at the Manchester Arena, after a concert by American singer Ariana Grande. The blast killed 22 people and wounded more than 100 others.
On June 3, ahead of Britain’s general election five days later, Khuram Shazad Butt, Rachid Redouane and Youssef Zaghba murdered eight people and wounded 48 others in a combined van-ramming and stabbing attack on London Bridge.
On June 6, Britain’s three main political parties pulled their campaign advertisements from YouTube, after realizing that the ads were being placed in or alongside jihadi videos.
If anyone still doubted at that point the connection between terrorism and Google’s video platform, the Daily Telegraph revealed that British counterterrorism police had been monitoring a cell of ISIS “wannabes” since March, and recorded its members discussing how to use YouTube to plot a vehicular ramming and stabbing attack in London.
Appallingly, the surveillance did nothing to prevent the carnage. It did provide further evidence, however, that jihadis purposely use the major online platform to spread their message and recruit soldiers in their war against the West and any Muslims deemed “infidels.” Terrorists have learned that YouTube can be as deadly a weapon as cars and knives.
Nor could Google claim that it is unaware of the increasingly pernicious use of its platform, or that it lacks the algorithmic tools to monitor YouTube’s massive traffic — involving 1.3 billion users and 300 hours of video uploaded every minute.
First, complaints about jihadi content have been lodged by individuals and organizations for years. Second, Google vowed to tackle the problem through a flagging feature that alerts YouTube to material that “promotes terrorism.” Furthermore, YouTube itself claims: “Our staff reviews flagged videos 24 hours a day, 7 days a week to determine whether they violate our Community Guidelines.”