Archive for the 'hacking' Category

Following the big dogs on web application security

Friday, December 21st, 2007

(This post originally appeared as part of the 2007 PHP Advent calendar)

At this time of year people are apt to get all warm and sentimental … Right up until their first trip to a mall on a Saturday when they go back to hating their fellow man and instituting an “If Amazon don’t sell it, you’re not getting it” policy on gift giving. December is very important to retail, and very important to retail sites.

I remember some good advice I read a long time ago. In Web Pages That Suck, Vincent Flanders and Michael Willis suggested you “follow the big dogs”; in other words, copy Amazon. Their reasoning was sound: you will likely get it wrong on your first try, you can’t afford to run usability studies of your own, and you don’t want to spend months and numerous iterations getting it right. Learning from other people’s mistakes is always less embarrassing than learning from your own.

I have had to paraphrase here, because I opted to recycle nearly all my old books rather than ship them halfway around the world. Had I wanted to check the accuracy of my quote, it would have cost me one cent to buy a second-hand copy of that book.

While the long-term relevance of most of the advice in old computer books is fairly accurately reflected by that valuation, it was good advice in 1998. If you were embarking on an ecommerce venture at a time when there was a shortage of people who knew what they were doing, best-practice conventions were not settled, and innovation was rapid, there were worse philosophies you could have than “What Would Amazon Do?”

The same idea is popular today, and for the same reason. There is always a shortage of people who really know what they are doing, so there are plenty of people making decisions by asking “What Would Google/Amazon/Microsoft/eBay/PayPal/Flickr/Yahoo/YouTube/Digg/Facebook Do?” If you are in a space where nobody really knows the best way yet, copying the segment leader is a low risk, low talent shortcut to making mainly good decisions, even if it does mean you are always three months behind.

The idea does not apply well to web application security. There are two main reasons for this: first, the big dogs make plenty of mistakes, and second, good security is invisible.

You might notice mistakes, you might read about exploited vulnerabilities, and you might notice PR-based attempts at the illusion of security, but you probably don’t notice things quietly being done well.

Common big dog mistakes include:

  • Inviting people to click links in email messages.
    You would think that, as one of the most popular phishing targets out there, PayPal would not want to encourage people to click links in emails. Yet, if you sign up for a PayPal account, the confirmation screen requests that you do exactly that.

    Paypal Confirmation Screen

  • Stupid validation rules.
    We all want ways to reject bad data, but it is usually not easy to define hard and fast rules to recognize it, even for data with specific formatting. Everybody wants a simple regex to check that email addresses are well formed. Unfortunately, to permit every email address that would be valid according to RFC 2822, a simple one is not going to cut it. That means many, many people add validation that is broken and rejects some real addresses. Most are not as stupid as the one AOL used to have for signing up for AIM, which insisted that all email addresses ended in .com, .net, .org, .edu or .mil, but many will reject + and other valid non-alphanumeric characters in the local part of an address (the bit before the @).
  • Stupid censorship systems
    Simple keyword based censorship always annoys people. Eventually, somebody named Woodcock is going to turn up.
    Xbox Live is infamous for rejecting gamertags and mottos after validating them against an extensive list of “inappropriate” words. Going far beyond comedian George Carlin’s notorious Seven Dirty Words, there is a list of about 2,700 words that are supposedly banned. By the time you add the regular seven, all possible misspellings thereof, most known euphemisms for body parts, racial epithets, drug-related terms, Microsoft brand names, and Microsoft competitors’ brand names, plus terms that sound official, and then start heading off into foreign languages, you end up catching a lot of innocent phrases.
  • Broken HTML filtering.
    Stripping all HTML from user submitted content and safely displaying the result is often done badly, but is not that difficult. On the other hand, allowing some HTML formatting as user input, but disallowing “dangerous” parts is not an easy problem, especially if you are trying to foster an ecosystem of third party developers.

    The MySpace Samy worm worked not because MySpace failed to filter input, but because of a series of minor cracks that, combined, allowed arbitrary JavaScript. Once you choose to allow CSS so that users can add what passes for style on MySpace, it becomes very hard to limit people to purely visual effects.

    eBay has had less well-known problems with a similar cause, but without a dramatic self-replicating worm. Earlier this year scammers were placing large transparent divs over their listings so that any click on the page triggered a mailto or loaded a page of their own. I could not see examples today, so I assume they have fixed that specific vector, but giving users a great deal of freedom to format the content that they upload makes ensuring that content is safe for others to view very difficult.

  • Stupidly long URLs
    The big dogs love long, complicated URLs.

          https://chat.bankofamerica.com/hc/LPBofA2/?visitor=&msessionkey=&cmd=file&file=chatFrame&site=LPBofA2&channel=web&d=1185830684250&referrer=%28engage%29%20https%3A//sitekey.bankofamerica.com/sas/signon.do%3F%26detect%3D3&sessionkey=H6678674785673531985-3590509392420069059K35197612

    Having let people get used to that sort of garbage from sites that they should be able to trust, you can’t really be surprised that normal people can’t tell the difference between an XSS attack hidden in URL-encoded JavaScript and a real, valid, safe URI. Even abnormal people who can decode a few common URL encodings in their heads are not really scrolling across the hidden nine tenths of the address bar to look at that lot.

  • Looking for simple solutions
    Security is not one simple problem, or even a set of simple problems, so looking for simple solutions such as the proposed .bank TLD is rarely helpful. This is not helped by the vendor-customer nature of much of the computer industry. The idea that you can write a check to somebody and a problem goes away is very compelling: buy a more expensive domain name, or a more expensive Extended Validation certificate, or run an automated software scan to meet PCI compliance, and you might sleep more soundly at night. But many users already don’t understand the URL and the other clues that their browser provides, so giving them more subtle clues is unlikely to help. Displaying a GIF in the corner of your web page bragging about your safety might create the illusion of security, and might well help sales, but it won’t actually improve safety on its own.
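To make the email validation point above concrete, here is a minimal PHP sketch. The naiveEmailCheck() function and its regex are hypothetical examples of the overly strict patterns described, not code from any of the sites mentioned; PHP’s built-in filter_var() validator does a far better job.

```php
<?php
// A hypothetical naive pattern of the kind described above: only
// alphanumerics, dots and underscores in the local part, plus a
// whitelist of "approved" TLDs.
function naiveEmailCheck($email)
{
  return (bool) preg_match(
    '/^[A-Za-z0-9._]+@[A-Za-z0-9.-]+\.(com|net|org|edu|mil)$/',
    $email);
}

$address = 'first.last+filter@example.co.uk';

// The naive check rejects this perfectly valid address...
var_dump(naiveEmailCheck($address));

// ...while PHP's built-in validator accepts it.
var_dump(filter_var($address, FILTER_VALIDATE_EMAIL) !== false);
```

The point is not that filter_var() is perfect, but that a home-grown regex is almost guaranteed to reject real addresses.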
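As the HTML filtering bullet above says, the strip-everything case is not that difficult. A minimal sketch in PHP, using the standard htmlspecialchars() and strip_tags() functions (the $comment value is invented for illustration):

```php
<?php
$comment = '<script>alert(document.cookie)</script> Nice <b>post</b>!';

// Escape everything: the markup is displayed as text, never executed.
echo htmlspecialchars($comment, ENT_QUOTES), "\n";

// Or strip the tags first if you do not want them shown at all.
// Note strip_tags() alone is not an escaping step; escape the result
// anyway in case the remaining text contains stray < or & characters.
echo htmlspecialchars(strip_tags($comment), ENT_QUOTES), "\n";
```

It is the allow-some-HTML case, not this one, where MySpace and eBay came unstuck.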

You can’t follow the public example of the big dogs. They still make some dumb decisions, they still make the small mistakes that allow the CSRF and XSS exploits that are endemic, and they are often not very responsive to disclosures. If a major site makes 99 good security decisions and one bad one, you won’t notice the 99. Unfortunately, with security you are still far better off seeing how others have been exploited, and critically evaluating what they say should be done, rather than trying to copy what they actually do.

Oh, and remember to stay away from malls on weekends in December.

I ♥ register_globals

Tuesday, March 13th, 2007

I am aware that there are some things so shocking that you are not supposed to say them in polite company. “Hitler had some good ideas”, “Tori Spelling is really pretty”, and “I think I look really good in a beret” are all ideas so confronting that they are best kept to yourself, regardless of how strongly you believe them.

I have a similarly shocking sentiment that I feel I have to share.

I really like register_globals in PHP.

There, I’ve said it. I can go away and order my “I ♥ register_globals” shirt now.

I (heart) register_globals

Sure, choosing to mingle untrusted user data and internal variables is a bad idea. Sure, if you are too lazy to initialise important variables with a starting value it gives you one extra way to shoot yourself in the foot. Sure, polluting global scope with form variables is going to be a mess in a larger app.

There remains something to be said for simple, elegant, readable ways to shoot yourself in the foot. PHP, like any reasonably complete programming language, provides a whole host of other ways, so removing one is not particularly useful.

I used to teach PHP to beginners as a first programming language. I have introduced a few thousand complete novices to programming via PHP.

With register_globals on, this example is a short step from the “Hello World!” example:

<?php
if($name)
{
 echo "Hello $name";
}
else
{
 echo
  '<form>
   Enter your name: <input type="text" name="name">
   <input type="submit">
  </form>';
}
?>

It flows nicely from a “Hello World!” example. It can introduce variables and control structure if you did not provide an even softer introduction to them. It can be turned into an example with a practical use without making the code more complex.

This version may not look very different to you:

<?php
if($_REQUEST['name'])
{
 echo "Hello {$_REQUEST['name']}";
}
else
{
 echo
  '<form>
   Enter your name: <input type="text" name="name">
   <input type="submit">
  </form>';
}
?>

To an experienced eye, the two versions are almost identical. The second requires a little more typing, but nothing to get excited over.

To a complete beginner though, the second is a couple of large leaps away from the first. To understand the second version, somebody has to understand arrays, and PHP string interpolation. Both of these are important topics that they will have to come to in their first few hours of programming, but without register_globals, they stand in the way of even the most trivial dynamic examples.

I miss being able to assume register_globals as default behaviour. It made the initial learning curve far less steep. It made little examples cleaner and more readable. Like most safety measures, disabling it does not really protect people who are determined to get themselves into trouble anyway. People who don’t understand the reasons behind it just run extract() or some code of their own to pull incoming variables out regardless. The user-submitted comments in the manual used to be full of sample code for doing exactly that.
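The extract() workaround mentioned above looks something like this. It is a sketch of the pattern, not anyone’s production code, with the incoming request simulated so the snippet is self-contained:

```php
<?php
// Simulate an incoming request (on a real page PHP fills $_REQUEST itself).
$_REQUEST['name'] = 'World';

// With register_globals off, people recreate its behaviour by hand:
extract($_REQUEST); // every request parameter becomes a local variable

// ...or with an explicit loop that does the same thing:
foreach ($_REQUEST as $key => $value)
{
  $$key = $value;
}

// Either way you now have $name, exactly as register_globals would
// have provided it, and with exactly the same risks.
echo "Hello $name"; // Hello World
```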

Oh, and a side note to all the beret-wearing, white-supremacist Tori Spelling fans: just because I am willing to speak up for one unpopular cause does not mean I am interested in yours. Sorry.

Fun with Alexadex

Monday, February 27th, 2006

In case you are not aware, Alexadex is a virtual stock market game, where the values of stocks depend on their Alexa reach ratings.

Because I have too much time on my hands, I wanted to track my portfolio value in the sidebar of my blog. Look over there somewhere —–> and you will probably see it.

In case it holds amusement value to somebody, here is the code. It relies on PHP and MySQL and just does some simple screen scraping.

The fact that this URL works:
http://alexadex.com/ad/api?&method=getQuote&url=lukewelling.com
hints that there might be an official API to do this at some point, but for now, I am screen scraping. (URL pulled from Cal Evans’ blog)

The database table looks like this:

CREATE TABLE alexadex (
  timestamp timestamp(14) NOT NULL,
  value int(11) NOT NULL default '0',
  PRIMARY KEY  (timestamp)
)

From a cron job I am running:

<?php
require('functions.php');

connectToDb();

$username = 'tangledweb';
$url = "http://alexadex.com/ad/user/$username";
$marker = 'total:</b></td><td align=right>$';

$current = scrape($url, $marker);
if($current!==false)
{
   echo "stored: ";
   storeCurrent($current);
}

echo $current; 

?>


In case it is not obvious, my Alexadex username is tangledweb.

In my blog sidebar I have:

<?php
require('functions.php');
echo '<li><a href = "http://alexadex.com/ad/user/tangledweb"
      >My current portfolio is $';
$temp = getMostRecentFromDb();
echo number_format($temp['value']).'</a>';
?>

The functions these rely on are:

function storeCurrent($value)
{
  $value = intval($value);
  $sql = "INSERT
          INTO alexadex
          VALUES (NOW(), $value)";
  $result = mysql_query($sql);
}

function getMostRecentFromDb()
{
  $sql = "SELECT *
          FROM alexadex
          WHERE 1
          ORDER BY `timestamp` DESC
          LIMIT 1";

  $result = mysql_query($sql);

  return mysql_fetch_array($result);
}

function scrape($url, $marker, $maxLength = 50)
{
  $page = file_get_contents($url);
  if($page === false)
  {
    return false;
  }
  $pos = strpos($page, $marker);
  if($pos === false)
  {
    return false;
  }
  $value = substr($page, $pos + strlen($marker), $maxLength);
  $value = str_replace(',', '', $value);
  $value = intval($value);
  return $value;
}

function connectToDb()
{
  $connection = mysql_connect("host",
                              "user",
                              "pass");
  mysql_select_db("dbname", $connection);
}

This code comes with no warranty of any kind. You can have it as public domain, but I would appreciate a link to this blog if you use it. I hope it still works. WordPress seems to really, really want to mess with it when it saves it.