
Almost half of world's top Web sites risky

By Admire Moyo, ITWeb's news editor.
Johannesburg, 14 Dec 2016
Vulnerable software is the leading factor in classifying a site as risky, says Menlo Security.

Nearly half (46%) of the Internet's top one million Web sites, as ranked by Alexa, are risky.

This is according to a study released yesterday by US-based cyber security firm Menlo Security, which notes this is largely due to vulnerable software running on Web servers and on underlying ad network domains.

The firm says the results are significant because risky sites have never been easier to exploit, and traditional security products fail to provide adequate protection. Attackers have their veritable choice of half the Web to exploit, allowing them to launch phishing attacks from legitimate sites, it notes.

Menlo Security considers a site risky if the homepage or associated background sites are running vulnerable software, if the site is 'known-bad', or if it has had a security incident in the past 12 months.

Vulnerable software was the leading factor in classifying a site as risky, it points out. Of the one million sites, 355 804 were either running vulnerable software or accessing background domains running vulnerable software; 166 853 fell into known-bad categories; while 31 938 experienced a recent security incident.
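Menlo Security's three-criteria test can be sketched as a simple classification rule. This is a hypothetical illustration of the logic described in the article; the field and function names are assumptions, not Menlo Security's actual methodology:

```python
from dataclasses import dataclass

@dataclass
class SiteProfile:
    """Hypothetical per-site data gathered during a scan."""
    runs_vulnerable_software: bool   # on the homepage or a background domain
    in_known_bad_category: bool      # e.g. listed as phishing or malware
    incident_in_last_12_months: bool # recent security incident on record

def is_risky(site: SiteProfile) -> bool:
    """A site is classed risky if it meets any one of the three criteria."""
    return (site.runs_vulnerable_software
            or site.in_known_bad_category
            or site.incident_in_last_12_months)

# Example: outdated server software alone is enough to flag a site
print(is_risky(SiteProfile(True, False, False)))  # True
```

Note that the criteria are disjunctive: a single factor, most commonly vulnerable software, is enough for a site to count towards the 46%.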

Unknown territory

Another key finding was that background requests sending content to Web browsers outnumber user requests at a ratio of 25:1.

The culprit sites found in the study include destinations that are widely unknown by name; however, these large ad service networks are found hidden behind the world's most visited media sites and include sites for 24-hour news, weather sites and major metro newspapers.

"Browsing the Web is a leap into the unknown. We already knew that ad networks present risk to the public and businesses, but the extreme levels reached in 2016, affecting 46% of the most visited Web sites, mean that enterprises must address the problem," says Kowsik Guruswamy, CTO at Menlo Security.

Greg Maudsley, senior director of product marketing, and Jason Steer, solutions architect, EMEA at Menlo Security, point out that risk is not just associated with the sites that users visit. Each site a user visits generates, on average, 25 requests to background Web sites, such as social media and advertising.

They explain that business and economy Web sites feature highly however the data is sliced for risk (third highest for known-bad sites, highest for domains with a recent threat history, and highest for sites running known vulnerable software). Yet almost no business would block access to the business and economy category, lulling many into a false sense of security and leaving them with more cyber incidents to deal with.

The security experts say uncategorised Web sites are the category most often associated with being known-bad, followed by adult and pornography, and business and economy. Some of the vulnerable software found in the one million subset was as much as 16 years old.

Execution instructions

Menlo Security notes that, today, exploit kits are readily available to anyone, as are instructional videos that provide step-by-step execution instructions.

The expertise requirement has all but vanished. Underscoring this point, the average age of suspected cyber attackers dropped from 24 to 17.

Compounding the issue, it says the vast majority of malware prevention products attempt to prevent attacks by distinguishing between "good" and "bad" elements, and then implement policies intended to allow "good" content and block the "bad".

In every case, the firm explains, the detection is never perfect, and thus the policy choice involves a level of risk that the wrong decision is being made. Additionally, enterprises regularly allow access to popular Web sites for the sake of productivity. Given the risk associated with nearly half of popular sites, a Web security strategy based on categorisation is effectively useless, says Menlo Security.

It adds that although traditional phishing attacks involve the creation of a new imposter, or "spoofed", site, the sheer volume of vulnerable trusted sites makes it very easy for attackers to compromise a legitimate site and send that link as part of a phishing campaign.

With this approach, the firm notes, attackers no longer need to worry that URL filtering will thwart their efforts, and they avoid anomalies in the link address, such as misspellings, special characters, or numbers that might raise suspicions.

Clicking on what is a perfectly legitimate link within such a phishing e-mail can expose a user to a drive-by malware exploit that could deliver ransomware, or mark the beginning of a larger breach.

Risk mitigation

To mitigate the risks, Maudsley and Steer recommend new threat prevention techniques, such as isolation and remote browsing.

They add that Web site owners should make sure they are running up-to-date software, patch vulnerable software, and ensure only trusted resource files are loaded from third-party servers. Users should install browser extensions and plug-ins that stop execution of JavaScript, stop using Flash altogether, and use ad blockers to block ads on sites where possible, they note.
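One standard mechanism for ensuring only trusted resource files are loaded from third-party servers is Subresource Integrity (SRI), where the page pins a cryptographic hash of each external script and the browser rejects any file that no longer matches. The article does not name SRI specifically; the sketch below, with illustrative helper names, shows how such a hash is computed and checked:

```python
import base64
import hashlib

def sri_hash(content: bytes) -> str:
    """Compute an SRI-style integrity value: sha384 digest, base64-encoded."""
    digest = hashlib.sha384(content).digest()
    return "sha384-" + base64.b64encode(digest).decode("ascii")

def matches_pinned_hash(content: bytes, expected: str) -> bool:
    """True only if the fetched file matches the hash pinned by the site owner."""
    return sri_hash(content) == expected

script = b"console.log('hello');"
pinned = sri_hash(script)  # the value embedded in <script integrity="...">

print(matches_pinned_hash(script, pinned))            # True: untampered file
print(matches_pinned_hash(script + b"evil", pinned))  # False: modified file
```

A compromised ad server can then no longer silently swap in malicious JavaScript: any change to the file changes its hash, and the browser refuses to run it.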

Gartner recommends isolation as part of its adaptive security architecture: this approach does not try to distinguish between good and bad, but assumes all active content is risky.
