Automated attacks on web applications

Posted by HSSL Systems Integrator on Feb 19th 2021

Cybercrime has become big business, and scammers are increasingly turning to bots and automation to make their attacks more efficient and effective, and to help them avoid detection.

In December, Barracuda researchers analyzed a sample of two months of data on web application attacks blocked by Barracuda systems and found a massive number of automated attacks. The top five attacks were dominated by attacks performed using automated tools.

Nearly 20% of the attacks detected were fuzzing attacks, which use automation to find the points at which applications break so those weaknesses can be exploited. Injection attacks came next at about 12%, with most attackers using automated tools such as sqlmap to try to get into the applications. Many of these attacks were script-kiddie-level noise: attacks thrown at an application without any reconnaissance to customize them.
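The fuzzing described here boils down to throwing boundary-condition inputs at every parameter until something breaks. A minimal sketch of what such a payload generator looks like (the payload choices are illustrative, not a real fuzz corpus):

```python
# A minimal sketch of boundary-value fuzzing payloads of the kind
# automated tools spray at application parameters. Illustrative only.
def fuzz_payloads(max_len: int = 4096):
    yield ""                      # empty input
    yield "A" * max_len           # oversized input
    yield "0" * 20                # overlong number
    yield "-1"                    # negative boundary
    yield str(2**63)              # integer overflow boundary
    yield "%s%s%s"                # format-string probe
    yield "' OR '1'='1"           # classic injection probe
```

A fuzzer would feed each payload into every request parameter and watch for server errors or anomalous responses.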

Bots pretending to be a Google bot or similar were a close third, accounting for just over 12% of the web application attacks analyzed. Application DDoS (distributed denial of service) was surprisingly prevalent, making up more than 9% of the sample Barracuda researchers analyzed, and it was being executed across all geographies. A small portion of attacks (less than 2%) came from bots blocked by site admins.

Here’s a closer look at the trends Barracuda researchers found in web app attacks and the ways cybercriminals are using automated attacks.

Highlighted Threat

Automated attacks: Automated attacks use bots to try to exploit vulnerabilities in web applications. These attacks can range from fake bots posing as Google bots to avoid detection to application DDoS trying to crash a site by subtly overloading the application.
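Fake Googlebots can be unmasked because Google publishes a verification procedure: reverse-DNS the client IP, check that the hostname belongs to Google, then confirm that forward DNS resolves back to the same IP. A sketch under those assumptions (the helper names are ours):

```python
import socket

# Hostname suffixes per Google's published guidance for
# verifying Googlebot.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def is_google_hostname(hostname: str) -> bool:
    """Pure check: does the reverse-DNS name belong to Google?"""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def verify_googlebot(ip: str) -> bool:
    """Forward-confirmed reverse DNS: reverse-DNS the IP, check the
    hostname suffix, then confirm the hostname resolves back to the
    same IP. Returns False on any DNS failure."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
        if not is_google_hostname(hostname):
            return False
        return ip in socket.gethostbyname_ex(hostname)[2]
    except OSError:
        return False
```

A client claiming a Googlebot User-Agent that fails this check can be treated as a fake bot.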

Although bot traffic is a fast-growing problem, that doesn't mean cybercriminals are moving away from their old standbys. A large share of the attacks that Barracuda researchers analyzed were what could be considered classic web app attacks, such as injection attacks (12%) and cross-site scripting (XSS) (1%). Most of the attack traffic came from reconnaissance or fuzzing tools being used to probe applications, as noted above.

Injection attacks are the top attack in the latest OWASP Top 10 and have been present in every iteration since the first list. They show no sign of going away, given the relative ease of execution and the possibility of large returns for cybercriminals. Cross-site scripting (XSS) attacks were also quite popular, ranking as the third most common attack in this category.

The Details

A significant portion of the attack traffic analyzed targeted WordPress or PHP vulnerabilities (typically the phpMyAdmin pages), at 6.1% and 1.05% respectively. Quite a few of these attacks were sprayed against sites running neither WordPress nor PHP, leading researchers to conclude that some attackers are script kiddies. But they'll likely soon learn to perform better reconnaissance before running their attacks.

Request smuggling attacks had gone down to negligible amounts until the HTTP Desync attacks were recently revealed. Since then, smuggling attacks have come back in a big way. Barracuda research found that more than 60% of smuggling attacks used an invalid header, a third used multiple Content-Length headers, and 3% had a malformed Content-Length.
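Two of those three anomalies (duplicate and malformed Content-Length) are straightforward to flag before a request is forwarded; the invalid-header case depends on parser specifics. A simplified sketch (the function name is ours):

```python
from typing import List, Tuple

def content_length_anomalies(headers: List[Tuple[str, str]]) -> List[str]:
    """Flag Content-Length anomalies associated with request
    smuggling: more than one Content-Length header, or a value
    that is not a plain non-negative integer."""
    findings = []
    values = [v.strip() for k, v in headers
              if k.lower() == "content-length"]
    if len(values) > 1:
        findings.append("multiple Content-Length headers")
    for v in values:
        if not v.isdigit():
            findings.append(f"malformed Content-Length: {v!r}")
    return findings
```

A front-end proxy applying a check like this (and normalizing the request before forwarding it) removes the framing ambiguity that smuggling attacks depend on.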

Most of the attacks that Barracuda researchers observed against JSON APIs were testing out the boundary conditions, basically people attempting to fuzz the APIs. In 95% of these types of attacks, Max Number value was exceeded, and in nearly 4% of these attacks, Max Value length was exceeded. Researchers also saw other attack attempts in the traffic—XSS and SQL Injection attacks—but the volume of these types of attacks was very low to negligible in the sample. The researchers expect this to grow over the next year.
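The boundary conditions being probed correspond to limits a WAF enforces on JSON payloads, such as a maximum numeric value and a maximum value length. A sketch with hypothetical limit values:

```python
import json

# Hypothetical limits; real WAFs expose these as configuration
# (e.g. "Max Number value" and "Max Value length").
MAX_NUMBER = 10**9
MAX_VALUE_LENGTH = 1024

def json_limit_violations(payload: str) -> list:
    """Walk a JSON document and report values exceeding the
    configured numeric or length limits, i.e. the boundary
    conditions that API fuzzing traffic probes."""
    violations = []

    def walk(node, path="$"):
        if isinstance(node, dict):
            for k, v in node.items():
                walk(v, f"{path}.{k}")
        elif isinstance(node, list):
            for i, v in enumerate(node):
                walk(v, f"{path}[{i}]")
        elif isinstance(node, bool):
            pass  # bool is an int subclass in Python; skip it
        elif isinstance(node, (int, float)) and abs(node) > MAX_NUMBER:
            violations.append(f"{path}: number exceeds limit")
        elif isinstance(node, str) and len(node) > MAX_VALUE_LENGTH:
            violations.append(f"{path}: value length exceeds limit")

    walk(json.loads(payload))
    return violations
```

A request whose body produces any violations would be rejected before it reaches the API backend.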

Data leak attempts in the sample data focused primarily on attempting to leak sensitive data such as credit card numbers and Social Security numbers. An overwhelming share of the exfiltration attempts seen in the sample were for credit card numbers. Visa was the clear focus, accounting for more than three-quarters of these attacks, followed distantly by JCB at more than 20%, with Mastercard, Diners, and American Express at much smaller volumes.
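Card-number exfiltration detection typically pairs a digit-pattern match with a Luhn checksum to cut false positives; Visa numbers additionally start with 4. A sketch of the two checks (helper names are ours):

```python
def luhn_valid(digits: str) -> bool:
    """Luhn checksum, the standard sanity check applied before
    flagging a digit run in a response as a possible card number."""
    if not digits.isdigit() or not 13 <= len(digits) <= 19:
        return False
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:        # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def looks_like_visa(digits: str) -> bool:
    """Visa PANs start with 4 and are 13, 16, or 19 digits long."""
    return (digits.startswith("4")
            and len(digits) in (13, 16, 19)
            and luhn_valid(digits))
```

A data-leak-prevention rule would run checks like these over outbound response bodies and mask or block matches.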

The State of Encryption

Barracuda researchers also analyzed the current state of encryption. Traffic encryption prevents a variety of attacks, such as man-in-the-middle, and provides one layer of protection for users visiting websites. However, attacks can still occur within the stream.

Nearly 92% of the traffic Barracuda researchers analyzed over a two-month period between October and December 2020 was HTTPS; less than 10% was served over plain HTTP. This is encouraging progress and good news for the state of web application security.

Browser vendors have been prioritizing TLS1.3 as the preferred protocol, and this is starting to have an impact on adoption of these more secure protocols.

Barracuda researchers found that few organizations still have the older SSLv3 protocol turned on, because it is far too insecure, and even those that do see very little SSLv3 traffic. The same is true for TLS1.0 and TLS1.1: use of these protocols is declining rapidly, with each accounting for less than 1% of the traffic analyzed.
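Disabling the legacy protocols is usually a one-line configuration change in modern stacks. For example, with Python's standard `ssl` module, a server-side context can refuse SSLv3, TLS1.0, and TLS1.1 outright:

```python
import ssl

# Server-side context that refuses anything older than TLS1.2,
# matching the hardening trend described above.
# (ssl.TLSVersion requires Python 3.7+; defaults created by
# create_default_context already exclude SSLv3.)
ctx = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
```

After a handshake, `SSLSocket.version()` reports the negotiated protocol as a string such as `"TLSv1.3"` or `"TLSv1.2"`, which is how traffic analyses like this one classify connections.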

A full 65% of the total traffic analyzed for this report used TLS1.3, the most secure protocol available today. About a third of HTTPS traffic is still over TLS1.2, and that number is slowly dropping.

When looking at browsers using TLS1.3 (based on the reported User Agent), Chrome was the most popular browser, used for 47% of the traffic, followed by Safari, which accounted for 34% of TLS1.3 usage. Surprisingly, Edge edged out Firefox for third place with 16%. Firefox was used for just 3% of traffic. Firefox losing ground to Edge is likely due to factors such as:

Corporate systems that preferred Internet Explorer are moving on to Edge

TLS1.2 shows a more surprising trend: Internet Explorer usage is higher than Chrome's, with Internet Explorer accounting for more than half of the traffic and Chrome just below 40%. In comparison, Safari comes in at less than 10%, and Firefox usage is even smaller.

Barracuda researchers found that auto-updates for Chrome and Firefox are being applied widely. Most of the browser versions seen in this analysis were the latest release or within two versions of it.

There are still a good number of people using Internet Explorer, but IE11 was the version used in the vast majority of cases, which shows a trend in the right direction, toward more up-to-date and secure browsers.

In comparison, Barracuda researchers found that automated traffic isn’t using much TLS1.3; most of it uses TLS1.2. This includes site monitors, bots, and tools like curl.

How to protect against automated attacks

When it comes to protecting against newer attacks, such as bots and API attacks, defenders can be overwhelmed at times due to the number of solutions required. The good news is that these solutions are consolidating into WAF/WAF-as-a-Service solutions, also known as Web Application and API Protection services (WAAP).

As Gartner has stated in the 2020 WAF Magic Quadrant:

“Gartner defines WAAP services as the evolution of cloud WAF services. WAAP services combine cloud-delivered, as-a-service deployment of WAF, bot mitigation, DDoS protection and API security, with a subscription model.”

Organizations should look for a WAF-as-a-Service or WAAP solution that includes bot mitigation, DDoS protection, API security, and credential stuffing protection and make sure it is properly configured.

It is also important to stay informed about current threats and how they are evolving. For example, in a recent webinar (now available on demand), Barracuda shared our predictions for the top three attacks that we expect applications to face this year: automated bot attacks, attacks against APIs, and attacks against software supply chains. These newer attacks have fewer protections in place and tend to get through due to a lack of understanding and, in some cases, shadow applications deployed without appropriate protections.