
Tue, 11 Dec 2007

Rob Auger from OWASP/WASC/CGISecurity on Timing..

Rob had a rant on his site on the timing attack, with a CSRF twist.. We met him after our Vegas talk, but I'm not really sure how his attack differs from our published one..

My on-list response:

From: haroon meer 
Subject: Re: [WEB SECURITY] Performing Distributed Brute Forcing of CSRF
vulnerable login pages

Hi Robert..

Thanks for the kind words on the talk.. If you check out the visio at: you will see that it's pretty much the same attack.. In a shameless display of self-pimpage, check out the paper from page 12.. Figure 23, for example, shows the results in a victim/zombie's browser after he has visited our page.. Effectively he tries the userlist we send him (in this case on a standard SquirrelMail login page). Once he detects a timing diff (again, using a trivial algorithm to avoid latency disparity) he simply makes another request to the attacker to report his success..

We do give the important pieces of the script in the paper, but I suspect anyone with 2 minutes of time could have cobbled them together anyway..
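For the curious, the detection loop is roughly this idea (a Python sketch; the real thing runs as JavaScript in the victim's browser, and the simulated login function, baseline username and margin below are purely illustrative):

```python
import statistics
import time

def time_attempts(attempt_login, username, trials=5):
    """Median response time for several login attempts with one username.

    Taking the median over repeated trials is the trivial way to smooth
    out network-latency jitter.
    """
    samples = []
    for _ in range(trials):
        start = time.perf_counter()
        attempt_login(username)
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

def find_valid_users(attempt_login, userlist,
                     baseline_user="no_such_user_xyz", margin=0.02):
    """Flag usernames whose median response time clearly exceeds the
    baseline measured for a (presumably) nonexistent user.

    A valid username typically costs the server extra work (e.g. a
    password-hash check), so its response is measurably slower.
    """
    baseline = time_attempts(attempt_login, baseline_user)
    return [u for u in userlist
            if time_attempts(attempt_login, u) > baseline + margin]
```

The same logic ports to the browser: time cross-domain requests per username, then report back the slow (valid) ones to the attacker.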



Fri, 21 Sep 2007

BotNets not just for SPAM any more

The Symantec Security blog has an article titled "Botnets: not just for spamming anymore". Interestingly, we are now starting to see the use of botnets for more than just simple spamming (or simpler DoS attacks).

It's pretty cool (in a twisted sort of way), because this is one of those things we called out a long time ago, predicting that botnets were way under-used as a form of cheap distributed computing. We have been mentioning their potential for effectively minimizing the key-space of session-ids, and it looks like it's starting to rear its head..
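The arithmetic behind the worry is simple: a key-space that is hopeless for one machine becomes tractable once sliced across thousands of bots. A toy sketch (the sizes and bot counts here are hypothetical) of dividing the work:

```python
def shard_keyspace(keyspace_size, bot_count, bot_id):
    """Return the half-open range [start, end) of a session-id keyspace
    assigned to one bot, so the whole space is covered with no overlaps.

    Any remainder is spread one-extra-key-each over the first few bots.
    """
    base, extra = divmod(keyspace_size, bot_count)
    start = bot_id * base + min(bot_id, extra)
    end = start + base + (1 if bot_id < extra else 0)
    return start, end
```

Each bot then brute-forces only its own slice and phones home on a hit, which is exactly the "cheap distributed computing" angle.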

It's one of those combination "I expected this a long time ago" && "oh #$@#.. let's hope it doesn't catch on" moments..

Fri, 3 Aug 2007

Late BlackHat Update..

ok.. so I'm in my room finally catching up on sleep (or will be in a few minutes) while most people are finishing Microsoft's booze at the PURE Microsoft party.. BlackHat is over, which means tomorrow we are off to the Riviera for DefCon..

Marco and I got a lot of positive feedback from our talk, including from guys like Rob Auger of WASC fame and Andrew Bortz, whom we quote in our paper, so it was pretty cool.. All our demos went off smoothly (one of them used JavaScript (and timing) to create a distributed brute-forcing tool, which had every opportunity to go south), so we were happy..

Most of the SensePost'ers have been making notes so they can blog on talks they attended.. this will filter through in the next few days.. I caught the Erratasec talk this morning and have a few thoughts, but I'll wait till I have time to actually comment properly..

Last night we found a patch of parking lot to have the first SensePost vs. BlackHat crew (.za vs USA?) Soccer/Football game.. We started off being kicked out of the Palace Ballroom and ended up on a patch of ground outside, but it turned out to be awesome.. Ultimately, Bradley, Charl, Marco, Nick and I took on Grifter, DedHed, Joe Grand, Dave, and Chris(?) while j0hnny long was official photographer..

It was all-around awesomeness and probably the most fun I personally have had in Vegas in a while..

We promised to upload one of the tools we demo'd during the talk, so I'll do that in a few minutes..


Tue, 5 Jun 2007

Re: Jeremiah Grossman's "How to find your websites"

Jeremiah from WhiteHatSec has just written a quick piece on how to find your websites. Now Footprinting is obviously dear to our hearts, with 3 BlackHat talks on it (or applications of it) ("Automation - Deus ex Machina or Rube Goldberg Machine?", "Putting The Tea Back Into CyberTerrorism", "The Role of Non Obvious Relationships in the Foot Printing Process"), a commercial tool almost dedicated to it, and a full blown chapter on it in Open Source Penetration Testing by Charl and Gareth. Footprinting is a genuinely important part of a company's security assessment, because it doesn't matter if multi-layer firewalls and WAFs protect the web app on their main site when an old, barely used SQL-injectable form elsewhere lets you grab SA on their SQL server anyway..

(Now that the shameless self promotion is over..) I wanted to touch on an interesting aspect of webserver discovery that is often skipped, and that's the issue of multiple websites running as name-based virtual hosts on the same web-server. There was a time (not so long ago) when all of the popular scanning tools failed to take into account that scanning an IP address was not the same as scanning a site by name (or scanning another site which happens to live on the same IP address).

Quick Virtual Host Refresher:

An HTTP/1.1 compliant browser (you will struggle to find one that is not) sends along an additional required field when requesting a website, the Host: header.

So.. while a GET on a website (using www.example.com as a stand-in host here) looked like this using HTTP/1.0:

haroon$ telnet www.example.com 80
Connected to www.example.com.
Escape character is '^]'.
GET / HTTP/1.0

HTTP/1.1 200 OK

With HTTP/1.1 you also have to specify a Host header:

haroon$ telnet www.example.com 80
Trying 192.0.2.1...
Connected to www.example.com.
Escape character is '^]'.
GET / HTTP/1.1
Host: www.example.com

HTTP/1.1 200 OK

This allows the web-server to correctly route the request to the name-based virtual host running on it. What should be obviously apparent is that, in the above example, attacking the bare IP address != attacking www.example.com != attacking any other virtual host sharing that IP.

There is every possibility that a highly vulnerable CGI exists on one of those virtual hosts which will not exist under the others (or under the bare IP).

Therefore, if you _were_ running a cgi scanner like nikto or wikto against the IP, you would probably miss the cgi that (in many cases) would have allowed you to compromise the host.
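To make the point concrete, here is a trivial sketch (the hostnames and path are placeholders) showing that two probes aimed at the very same IP and path are nonetheless different requests once the Host header differs:

```python
def build_request(path, host):
    """Build a raw HTTP/1.1 request for `path`, routed to the virtual
    host named in the Host header (a required header in HTTP/1.1)."""
    return (f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            f"Connection: close\r\n"
            f"\r\n")

# Same socket, same IP, same path -- but the server will hand these to
# two different virtual hosts:
probe_a = build_request("/cgi-bin/test.cgi", "www.example.com")
probe_b = build_request("/cgi-bin/test.cgi", "mail.example.com")
```

A scanner that only ever sends the IP (or no Host header at all) effectively tests just one of the sites on the box.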

So.. 3 quick tips on this..

  1. use one of the many online sources that attempt to map IP addresses to the other websites running on them (you might just find, and get, the beta version of their website prior to them turning on their sanitization features)
  2. if you are using a scanner, make sure you are aiming it right..
  3. if you are hosting your site at some ISP, ensure that you know who you are hosted with. You could get owned just because someone else on the same box happened to have sloppy code (and the web-server setup doesn't segregate you properly).
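As a rough illustration of tips 1 and 2, one way to spot distinct virtual hosts on a single server is to fingerprint the response each candidate Host value gets back and group the ones that match (the hostnames and numbers below are made up):

```python
def group_vhosts(responses):
    """Group candidate hostnames by the response fingerprint their Host
    header produced, so distinct virtual hosts stand out from the
    default catch-all site.

    `responses` maps hostname -> (status_code, body_length).
    """
    groups = {}
    for host, fingerprint in responses.items():
        groups.setdefault(fingerprint, []).append(host)
    return groups

# Probing one IP with three different Host headers (illustrative data):
observed = {
    "www.example.com": (200, 5120),
    "mail.example.com": (200, 812),
    "bogus.example.com": (200, 5120),  # falls through to the default vhost
}
distinct = group_vhosts(observed)
```

Each fingerprint group that differs from the default-vhost response is a separate site worth pointing your CGI scanner at.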