Archived Blog

Web 2.0 and Cyberterrorism

05.28.2009 - 1:56 PM

Social networking over the Internet has boomed in recent years because it allows networks of like-minded individuals to collaborate and connect, regardless of geography. Often these groups are joined by common interests or passions. Precisely because of this ease of connection, however, Websense® Security Labs™ researchers are seeing an increase in those connected in another way: through Hate and Terror.

This issue has been at the forefront of media attention over the past several weeks: everyone from Michael Arrington of TechCrunch, to Brian Cuban (Mark Cuban's brother and company attorney), to the Simon Wiesenthal Center, to the Electronic Frontier Foundation has voiced an opinion about hate groups on Facebook, including the groups "1,000,000 for the TRUTH about the Holocaust", "Holocaust Is a Holohoax", and "Based on the facts … there was no Holocaust".

Recently, the Simon Wiesenthal Center reported a 25 percent growth last year in Web sites, social-networking communities, portals, blogs and online chat rooms promoting racial violence, anti-Semitism, homophobia, hate music, and terrorism.

Researchers at Websense Security Labs have also seen a substantial increase in the occurrence of hate or militant content residing on Facebook and other popular Web 2.0 sites such as YouTube, Yahoo! Groups, and Google Groups. In fact, from January through May 2009, Websense added three times as many sites to the “Militancy and Extremist” and “Racism and Hate” categories in the Websense master database as it added in the same period in 2008.

Discovering and classifying objectionable and offensive content, particularly on Web 2.0 sites like Facebook and YouTube, can be extremely challenging. The remainder of this blog entry is devoted to showing examples of “Militancy and Extremist” and “Racism and Hate” pages found on these sites, and examining how researchers find and classify this content.

Here are a few examples of what Websense Security Labs has found:

Facebook: [screenshot]

YouTube: [screenshot]

On Google Groups: [screenshot]

On Yahoo! Groups: [screenshot]

The challenges

Finding these sites can be challenging, primarily because the site owners often try to hide them, or make the sites accessible only to their selected “friends.” The combination of deep human knowledge and sophisticated machine learning algorithms makes it possible for Websense to find content like this on Web 2.0 sites.
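As a rough illustration of how curated human knowledge and automated scoring might be combined, here is a minimal sketch; the terms, weights, threshold, and function names are all hypothetical placeholders, not Websense's actual rules or models.

```python
# Minimal sketch: analyst-curated indicator terms feed a simple
# weighted scorer; a real system would combine many more signals
# in a trained model. All weights and thresholds are illustrative.
from collections import Counter

# Hand-curated indicator terms with analyst-assigned weights (hypothetical).
INDICATOR_WEIGHTS = {"rahowa": 3.0, "holohoax": 3.0, "88": 1.0}

def score_page(text: str) -> float:
    """Return a crude risk score: the weighted count of indicator terms."""
    tokens = Counter(text.lower().split())
    return sum(weight * tokens[term] for term, weight in INDICATOR_WEIGHTS.items())

def classify(text: str, threshold: float = 2.5) -> str:
    """Assign a category when the score crosses a (hypothetical) threshold."""
    return "Racism and Hate" if score_page(text) >= threshold else "Unclassified"
```

In practice the keyword score would be only one feature among many, which is exactly why the later examples in this post matter: the same signal can mean different things in different contexts.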

But that’s only the first step.

Once a YouTube video has been identified and classified (in this case, we identified a militant group video)...

[screenshot]

...Websense Security Labs goes further, applying our resources to finding more sites of a similar nature. A wide breadth of peripheral information sends a site to our “police lineup.” User profile, channel subscribers, links in and out of a page, connections among users, the architectural style of a page, textual and multimedia content, among other attributes, all feed further into processes that help us uncover their “buddies.” We have learned that birds of a feather flock together. Once we identify one such site, we almost always find many more.
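The "birds of a feather" expansion described above can be sketched as a breadth-first walk over a link graph, queuing neighboring sites for review; the graph structure, hop limit, and site names here are illustrative assumptions, not Websense's actual pipeline.

```python
# Sketch of "birds of a feather" expansion: starting from one confirmed
# site, follow connections (links, shared subscribers, etc.) to collect
# candidate sites for human review. Graph and parameters are made up.
from collections import deque

def expand_candidates(seed: str, neighbors: dict[str, list[str]],
                      max_hops: int = 2) -> set[str]:
    """Breadth-first walk of the connection graph, up to max_hops from seed."""
    seen = {seed}
    frontier = deque([(seed, 0)])
    while frontier:
        site, hops = frontier.popleft()
        if hops == max_hops:
            continue  # stop expanding beyond the hop limit
        for linked in neighbors.get(site, []):
            if linked not in seen:
                seen.add(linked)
                frontier.append((linked, hops + 1))
    return seen - {seed}  # candidates queued for the "police lineup"
```

The hop limit matters: each extra hop widens the net but also dilutes the "flock together" signal, so candidates found this way still go through classification rather than being categorized automatically.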

Here is another example we found through connections to the above page (in this case, a “Militancy and Extremist” category site):

[screenshot]

Websense keyword analysis reveals that militant and racist groups often use secret symbols or phrases that allow members to identify one another. “RAHOWA”, an acronym for “racial holy war”, is popular with many white supremacist groups.

[screenshot]

Numerical symbols such as “88” may also indicate a racist or militant site: “H” is the eighth letter of the alphabet, so “88” stands for “HH”, shorthand for the Nazi greeting “Heil Hitler.” Some findings also indicate that many white supremacist groups have adopted symbols from pagan folklore, such as the Othala Rune:

[screenshot]

The Othala Rune did not originally have any racist implications and is still used by practitioners of some Pagan religions.

Correct classification of a page is often a complex matter. Features that appear to be hard evidence in one context may give us an erroneous signal in another. For example, the Othala Rune symbol can be seen at the bottom of this page (“Racism and Hate”):

[screenshot]

It also appears in this person’s blog in a non-racist context. This site would fall under the Websense category “Non-Traditional Religions, Occult and Folklore”:

[screenshot]

Another challenge that Websense Security Labs takes on is identifying “hate” sites that at first glance appear to be news, shopping, or even philanthropy. There is often a fine line between sites that belong to news or media outlets and those whose real aim is propaganda rather than objective reporting. Our Websense area experts combine forces with our intelligent software systems to uncover this difficult-to-find yet offensive content.

This effort is just one example of the many areas in which Websense stays on top of the ongoing cyber game in our efforts to protect our customers. In this case, uncovering the terrorism content, which is hiding...guess where? Yes, within YOUR popular Web 2.0 sites!

Websense recently commissioned a survey of 1,300 IT managers about the implications, attitudes, and challenges of this and other Web 2.0 activities at work. You can see the results of the survey and learn how businesses can enable the safe use of Web 2.0 at http://www.websense.com/content/web20-at-work.aspx.

Ruth Mastron, Senior Web Analyst, Content Analysis and Response
Eva Cihalova, Supervisor, Content Analysis and Response