Posted by & filed under /dev/random.

The Cuckoo’s Egg

This book was one of the first hacker books I read, and it still stands as the best. Following the discovery, tracing and eventual unmasking of a highly sophisticated computer espionage ring, this story is thrilling, and best of all: true. The book is still surprisingly relevant, and many of the methods Clifford Stoll used to track down the hackers would still be viable today. The story starts with a simple accounting error in the University of California, Berkeley’s computer systems and ends in a manhunt that lasted over a year, during which Clifford interfaced with a dozen or more three-letter agencies (CIA, FBI, NSA, CID and more). This book is one of the reasons I wanted to work in information security, and I’m looking forward to the day someone writes a similarly enthralling tale of a post-2000 security incident (hello, all wannabe writers, here’s a hint: Stuxnet).

The New School of Information Security

While this book is brief at a mere 160 pages, it carries an important message that I couldn’t agree with more: the information security profession needs to open up. Adam Shostack and Andrew Stewart challenge many of the dogmas the industry is built on, and also offer concrete advice on how to avoid falling into the same trap as everyone else. The pages of this book are jammed with insight and deeply troubling answers as to why the security profession has largely failed to protect the Internet’s users.

The Black Swan

OK, this is not a hacker book at all. It’s certainly not technological, and it’s primarily aimed at a non-scientific crowd: economists (*shudder*). But it’s highly relevant for anyone who works in security, as it’s about the consequences of highly unlikely but high-impact events, aka “Black Swans”. Most risk analyses concentrate on the risks that are treatable, but there are always some risks so improbable (even though the impact may be catastrophic) that no risk treatment is deemed necessary. Nassim Nicholas Taleb argues that the failure to acknowledge these black swans is not only irrational, but can also lead to some of the greatest catastrophes because of their high impact. A must-read not only for hackers, but for everyone who doesn’t want to be wiped out by an “unforeseeable” event.

The Codebreakers

Forget The Da Vinci Code: this brick of a book is an essential read if you like riddles, codes and mysteries. Who doesn’t? But be warned, the book is big, and you deserve a medal if you manage to read it from start to finish. The motivation should be clear, though: every good hacker should know his/her cryptology, and this book is the de facto non-technical reference, covering the history of codes and cryptography from ancient times to the age of the computer. The history of code-breaking is told in a surprisingly understandable and fascinating way. You’ll be surprised by the impact codes and code-breaking have had on human history, from Egyptian hieroglyphs to the Enigma machines to the facilitation of secure Internet services. Understanding cryptography will let you understand much of the fabric we take for granted today, and appreciate all the work that has been applied to securing your everyday card-and-PIN transactions.

The Girl with the Dragon Tattoo

I had to throw in a work of fiction here as well; hackers need to relax sometimes too. The Stieg Larsson books are not only well written, but also not totally unbelievable in terms of the hacking stunts that are pulled off. It’s fiction, of course, but the methods the main character Lisbeth Salander (aka “Wasp”) uses are very realistic and resemble those a real hacker would use to get even with her adversaries. And she uses a MacBook Pro; got to love that.

Have I forgotten something? Do you disagree? Do you have any additions? Use your voice in the comments below.

Posted by & filed under Hacks, Security News.

Wired‘s Threat Level blog has a very good article on how not to run a professional information security services firm. HBGary Federal, which was recently hacked by the loosely knit group of hacktivists called Anonymous (press release here), seems to have caught some unknown spy-movie virus in its attempts to unmask the group and the related site WikiLeaks. Based on the leaked e-mails from the firm, we can see how they plotted to make major dough by performing cloak-and-dagger operations that I would say (note: IANAL) constitute several federal crimes.

Some of the methods HBGary Federal suggested using against WikiLeaks are, well, frankly quite disturbing. Social mining leading to something that looks like extortion, and hacking the WikiLeaks servers in Sweden, are just some of the suggestions. Wow.

Posted by & filed under /dev/random.

A couple of days ago I wrote about how I was migrating this site away from Bluehost due to the abysmal response times lately. Well, the results are in, and I thought I’d share them with you. I went for HawkHost, and my experience so far has been great; just look at this:

I use pingdom.com to monitor my uptime and server responsiveness, and they’ve also got a sweet iPhone app. The graph shows Break & Enter’s response times over the last couple of months. Anyone want to guess when I migrated this blog?

Say no more.

Posted by & filed under Security News.

Researchers at the Fraunhofer Institute have done some interesting research on physical access attacks on iPhones and iPads. It turns out that if you have physical access to a powered-off, locked iOS device, the process of getting all the passwords on the phone boils down to three simple steps:

  1. Jailbreak it, thus gaining SSH access
  2. Copy a specially crafted keychain access script to the device
  3. Execute the script

No cracking or cryptanalysis involved.

The keychain is where iOS stores passwords, and it encrypts all its secrets using AES-256.

The attack works by using system calls to unlock different parts of the keychain. You would think this requires a user to enter his password, but for typical time-sensitive services such as network connections and VPN tunnels, a security tradeoff has been made, and no password is needed to decrypt the secrets.

As an example, Gmail passwords were found to be protected and required the user passcode, but WPA keys for wireless access and Microsoft Exchange passwords were not. It’s also worth mentioning that access to one password often means access to others, since users are prone to re-using passwords or storing them in other password-protected places such as email accounts.

Bottom line: if you lose your device, initiate a change of all your passwords.

Full paper here.

Posted by & filed under /dev/random.

I’m currently migrating this site away from Bluehost, and I can’t say I’m going to miss them. Repeated downtime (uptime at 99.7% and dropping), sharing a server with DDoS-targeted sites and response times from hell finally got me looking for a new host.

Although Bluehost was cheap, there’s plenty of (cheap) fish in the sea, and I’ve also learned a valuable lesson that I think many others can relate to: “unlimited” bandwidth is not a good thing. It simply means that performance is severely limited, because the hosting provider needs to capitalize on other things; in my case, by serving my site from an overloaded server.

Well, enough of that already; I’m going to spend the next few days tweaking this site to optimum performance. Stay tuned!

Posted by & filed under /dev/random.

While testing the small python banner grabbing script I made a couple of weeks ago, I noticed some strange headers from the Norwegian newspaper VG:

$ ./pybgrab.py http://www.vg.no
[+] http://www.vg.no:
[!] No 'Server' header received, the response contained the following headers:
X-VG-WebServer: leon
Cache-Control: must-revalidate
Vary: Accept-Encoding
Content-Type: text/html; charset=UTF-8
X-VG-WebCache: dexter
X-Rick-Would-Never: Run Around and Desert You
X-VG-Korken: http://www.youtube.com/watch?v=Fcj8CnD5188
X-VG-SolveMe: uggc://jjj.it.ab/ynxfrgngg.cuc
Content-Length: 317406
Date: Fri, 04 Feb 2011 19:17:01 GMT
Connection: close
X-Cache: HIT
X-Cache-Hits: 578
X-Age: 387

Those are not exactly RFC 2616 headers, to put it mildly. They contain a Rickroll, a YouTube video of the VG Multimedia team on the famous sled run “Korketrekkeren” in Oslo, and a riddle:

X-VG-SolveMe: uggc://jjj.it.ab/ynxfrgngg.cuc

The riddle is fairly easy to solve (hint: it’s a famous cipher). The solution leads you to a web page, but the last part of the riddle is missing; VG seems to have removed the base64-encoded headers that contained the URL to a job ad for the newspaper. Hard times in the news industry, maybe?
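If you want to check a guess at the cipher without pen and paper, Python’s standard library can help. One famous cipher to try first is ROT13; if that guess is right, the header’s URL scheme should decode to something recognizable:

```python
# Try ROT13 on the X-VG-SolveMe header value. The codecs module ships
# with a rot_13 text transform, so no hand-rolled cipher is needed.
import codecs

header = 'uggc://jjj.it.ab/ynxfrgngg.cuc'
decoded = codecs.decode(header, 'rot_13')
print(decoded[:4])  # prints 'http' -- the guess was right
```

If the first four characters come out as a valid URL scheme, the rest of the decoded string is the answer; I’ll leave actually running it as an exercise.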

Posted by & filed under Tools & Methodology.

Recently I’ve been checking the patch level on a LOT of Microsoft servers, mostly versions of Windows Server and Microsoft SQL Server. Microsoft has a great tool for this: the Microsoft Baseline Security Analyzer. It’s legacy software, but it’s free of charge and still works like a charm.

Obviously, not all administrators are too keen on installing an application on their precious servers, or on letting me plug my PC into their corporate network to perform a remote scan. So I’m often forced to use the standalone version of MBSA and run it through the command line. Since the CLI tool is kind of picky, I’ve created a quick tutorial on how to use it here.

  1. First, you need the MBSA, downloadable from here
  2. To make the tool standalone, install it somewhere and copy the mbsacli.exe and wusscan.dll files to a temporary folder (the default install location is C:\Program Files\Microsoft Baseline Security Analyzer 2\)
  3. You’ll also need Microsoft’s latest wsusscn2.cab file, which contains details on all the latest updates from Microsoft. You can download it here
  4. Select those three files (or zip them) and copy them to the server you wish to test
  5. Open a command window, cd into the directory containing the files (on the server), and issue the following command:
mbsacli /xmlout /catalog wsusscn2.cab /unicode /nvc >results.xml

The resulting results.xml file should contain all Microsoft patches that target that particular server, both installed and missing. So if the server is a Windows Server 2008 box running SQL Server 2005, the file will contain patches for both Windows Server 2008 and SQL Server 2005.

The easiest way of analyzing the results is to open the file in Excel. There you can easily filter out all installed patches, for example, and get a quick overview of whether the server is under a good patch management process or not.

In the example output below, the server is missing a lot of patches (sorted by severity), which makes it vulnerable to a wide range of exploits:

Example output from MBSA in Excel
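If Excel isn’t handy, the same filtering can be scripted with Python’s standard library. The sketch below parses an inline sample fragment instead of a real report; the element and attribute names (UpdateData, IsInstalled, Severity, Title) are my assumptions about the typical MBSA XML layout, so check them against your own results.xml before relying on it:

```python
# Rough alternative to Excel: pull the missing patches out of an MBSA
# XML report. SAMPLE is a made-up fragment standing in for results.xml.
import xml.etree.ElementTree as ET

SAMPLE = """<XMLOut>
  <Check Name="Windows Security Updates">
    <Detail>
      <UpdateData ID="MS11-002" IsInstalled="false" Severity="Critical">
        <Title>Security Update for Windows (KB2419632)</Title>
      </UpdateData>
      <UpdateData ID="MS10-070" IsInstalled="true" Severity="Important">
        <Title>Security Update for .NET Framework (KB2418241)</Title>
      </UpdateData>
    </Detail>
  </Check>
</XMLOut>"""

def missing_patches(xml_text):
    """Return (id, severity, title) tuples for patches not yet installed."""
    root = ET.fromstring(xml_text)
    results = []
    for update in root.iter('UpdateData'):
        if update.get('IsInstalled') == 'false':
            title = update.findtext('Title', default='')
            results.append((update.get('ID'), update.get('Severity'), title))
    return results

if __name__ == '__main__':
    # In practice: missing_patches(open('results.xml').read())
    for patch_id, severity, title in missing_patches(SAMPLE):
        print(patch_id, severity, title, sep='\t')
```

Sorting the tuples by severity before printing gives roughly the same prioritized view as the Excel filter above.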

I’ve had some issues running this on older installations of Windows such as Windows 2000 and Windows NT (yeah, I know, but some people still use unsupported Microsoft products), but for most properly configured servers the above command works as intended. Drop me a comment if you have any trouble!

Posted by & filed under Tools & Methodology.

I’m currently experimenting with Python 3, and made myself a simple HTTP/HTTPS banner grabber in Python. The code is included below; I hope someone finds it useful.

The code can take input URLs from a file or from the command line, and defaults to just printing the HTTP ‘Server’ header.

Update: Thanks to Jeff, who pointed out an obvious flaw in my first version of this script. I’ve also thrown in some error handling for when servers return unexpected headers. See below for an updated version of the script.


#!/usr/bin/env python3.1
'''
Copyright (c) 2011 Carsten Maartmann-Moe

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.

Created on Jan 22, 2011

@author: Carsten Maartmann-Moe <[email protected]>
'''
import getopt
import sys
import urllib.request
import locale

ALL_HEADERS = False
VERBOSE = False

def main(argv):
    encoding = locale.getpreferredencoding()
    input_list = None

    try:
        opts, args = getopt.getopt(argv, 'hi:va', ['help', 'input-list=',
                                                   'verbose', 'all-headers'])
    except getopt.GetoptError as err:
        print(err)
        usage()
        sys.exit(2)
    for opt, arg in opts:
        if opt in ('-h', '--help'):
            usage()
            sys.exit()
        elif opt in ('-i', '--input-list'):
            input_list = arg
        elif opt in ('-v', '--verbose'):
            global VERBOSE
            VERBOSE = True
        elif opt in ('-a', '--all-headers'):
            global ALL_HEADERS
            ALL_HEADERS = True
        else:
            assert False, 'Unhandled option: ' + opt

    if len(args) < 1 and not input_list: # Print usage and exit if no URLs are given
        usage()
        sys.exit(2)

    urls = args # The remainder of arguments are treated like URLs

    if input_list: # If the user supplied a input list, add those URLs
        read_file(input_list, encoding, urls)

    headers = {} # Dictionary for storing headers with URLs as keys
    for url in urls:
        headers[url] = grab_headers(url)

    print_results(headers)

def read_file(filename, preferred_encoding, urls):
    ''' Reads a file and appends each line to the 'urls' parameter '''
    with open(filename, encoding=preferred_encoding) as file:
        for line in file:
            urls.append(line.rstrip())

def grab_headers(url):
    ''' Grabs the headers from the response of a given URL '''
    response = None
    try:
        response = urllib.request.urlopen(url)
    except Exception as e:
        print("Could not complete HTTP request: " + str(e))
        sys.exit(1)
    return response.info()

def print_results(headers):
    for url in headers:
        print('[+] ' + url + ':')
        if ALL_HEADERS:
            print(headers[url])
        else:
            try:
                server_header = dict(headers[url])['Server']
                if server_header:
                    print(server_header)
            except KeyError:
                print('[!] No \'Server\' header received, the response ' +
                      'contained the following headers:')
                print(headers[url])

def usage():
    print('''Usage: ./pybgrab.py [OPTIONS] URL

Supply an URL to grab the web server's 'Server' HTTP Header.

    -a, --all-headers:    Returns all HTTP Headers instead of just the
                          'Server' header
    -h, --help:           Displays this message
    -i FILE, --input-list=FILE:
                          Specify a list of URLs in an input file
    -v/--verbose:         Verbose mode''')

if __name__ == '__main__':
    main(sys.argv[1:])

Posted by & filed under /dev/random, Tools & Methodology.

I needed a working whois for an ongoing project, so I quickly ported this snippet of whois code to Python 3; the result is below:

"""
Whois client for python

transliteration of:

http://www.opensource.apple.com/source/adv_cmds/adv_cmds-138.1/whois/whois.c

Copyright (c) 2010 Chris Wolf

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in
all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.

  Last edited by:  $Author$
              on:  $DateTime$
        Revision:  $Revision$
              Id:  $Id$
          Author:  Chris Wolf
"""
import sys
import socket
import optparse
import locale
#import pdb


class NICClient(object) :

    ABUSEHOST           = "whois.abuse.net"
    NICHOST             = "whois.crsnic.net"
    INICHOST            = "whois.networksolutions.com"
    DNICHOST            = "whois.nic.mil"
    GNICHOST            = "whois.nic.gov"
    ANICHOST            = "whois.arin.net"
    LNICHOST            = "whois.lacnic.net"
    RNICHOST            = "whois.ripe.net"
    PNICHOST            = "whois.apnic.net"
    MNICHOST            = "whois.ra.net"
    QNICHOST_TAIL       = ".whois-servers.net"
    SNICHOST            = "whois.6bone.net"
    BNICHOST            = "whois.registro.br"
    NORIDHOST           = "whois.norid.no"
    IANAHOST            = "whois.iana.org"
    GERMNICHOST         = "de.whois-servers.net"
    DEFAULT_PORT        = "nicname"
    WHOIS_SERVER_ID     = "Whois Server:"
    WHOIS_ORG_SERVER_ID = "Registrant Street1:Whois Server:"


    WHOIS_RECURSE       = 0x01
    WHOIS_QUICK         = 0x02

    ip_whois = [ LNICHOST, RNICHOST, PNICHOST, BNICHOST ]
    
    language, encoding = locale.getdefaultlocale()

    def __init__(self) :
        self.use_qnichost = False
        
    def findwhois_server(self, buf, hostname):
        """Search the initial TLD lookup results for the regional-specifc
        whois server for getting contact details.
        """
        nhost = None
        parts_index = 1
        start = buf.find(NICClient.WHOIS_SERVER_ID)
        if (start == -1):
            start = buf.find(NICClient.WHOIS_ORG_SERVER_ID)
            parts_index = 2
        
        if (start > -1):   
            end = buf[start:].find('\n')
            whois_line = buf[start:end+start]
            whois_parts = whois_line.split(':')
            nhost = whois_parts[parts_index].strip()
        elif (hostname == NICClient.ANICHOST):
            for nichost in NICClient.ip_whois:
                if (buf.find(nichost) != -1):
                    nhost = nichost
                    break
        return nhost
        
    def whois(self, query, hostname, flags):
        """Perform initial lookup with TLD whois server
        then, if the quick flag is false, search that result 
        for the region-specifc whois server and do a lookup
        there for contact details
        """
        #pdb.set_trace()
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.connect((hostname, 43))
        if (hostname == NICClient.GERMNICHOST):
            s.send(("-T dn,ace -C US-ASCII " + query + "\r\n").encode())
        else:
            s.send((query + "\r\n").encode())
        response = b""
        while True:
            d = s.recv(4096)
            response += d
            if not d:
                break
        s.close()
        #pdb.set_trace()
        nhost = None
        if (flags & NICClient.WHOIS_RECURSE and nhost == None):
            nhost = self.findwhois_server(response.decode(), hostname)
        if (nhost != None):
            response += self.whois(query, nhost, 0)
        return response
    
    def choose_server(self, domain):
        """Choose initial lookup NIC host"""
        if (domain.endswith("-NORID")):
            return NICClient.NORIDHOST
        pos = domain.rfind('.')
        if (pos == -1):
            return None
        tld = domain[pos+1:]
        if (tld[0].isdigit()):
            return NICClient.ANICHOST
    
        return tld + NICClient.QNICHOST_TAIL
    
    def whois_lookup(self, options, query_arg, flags):
        """Main entry point: Perform initial lookup on TLD whois server, 
        or other server to get region-specific whois server, then if quick 
        flag is false, perform a second lookup on the region-specific 
        server for contact records"""
        nichost = None
        result = b""  # default, in case no lookup path below is taken
        #pdb.set_trace()
        # this would be the case when this function is called by something other than main
        if (options == None):                     
            options = {}
     
        if ( ('whoishost' not in options or options['whoishost'] == None)
              and ('country' not in options or options['country'] == None)):
            self.use_qnichost = True
            options['whoishost'] = NICClient.NICHOST
            if ( not (flags & NICClient.WHOIS_QUICK)):
                flags |= NICClient.WHOIS_RECURSE
            
        if ('country' in options and options['country'] != None):
            result = self.whois(query_arg, options['country'] + NICClient.QNICHOST_TAIL, flags)
        elif (self.use_qnichost):
            nichost = self.choose_server(query_arg)
            if (nichost != None):
                result = self.whois(query_arg, nichost, flags)           
        else:
            result = self.whois(query_arg, options['whoishost'], flags)
            
        return result.decode()
#---- END OF NICClient class def ---------------------
    
def parse_command_line(argv):
    """Options handling mostly follows the UNIX whois(1) man page, except
    long-form options can also be used.
    """
    flags = 0
    
    usage = "usage: %prog [options] name"
            
    parser = optparse.OptionParser(add_help_option=False, usage=usage)
    parser.add_option("-a", "--arin", action="store_const", 
                      const=NICClient.ANICHOST, dest="whoishost",
                      help="Lookup using host " + NICClient.ANICHOST)
    parser.add_option("-A", "--apnic", action="store_const", 
                      const=NICClient.PNICHOST, dest="whoishost",
                      help="Lookup using host " + NICClient.PNICHOST)
    parser.add_option("-b", "--abuse", action="store_const", 
                      const=NICClient.ABUSEHOST, dest="whoishost",
                      help="Lookup using host " + NICClient.ABUSEHOST)
    parser.add_option("-c", "--country", action="store", 
                      type="string", dest="country",
                      help="Lookup using country-specific NIC")
    parser.add_option("-d", "--mil", action="store_const", 
                      const=NICClient.DNICHOST, dest="whoishost",
                      help="Lookup using host " + NICClient.DNICHOST)
    parser.add_option("-g", "--gov", action="store_const", 
                      const=NICClient.GNICHOST, dest="whoishost",
                      help="Lookup using host " + NICClient.GNICHOST)
    parser.add_option("-h", "--host", action="store", 
                      type="string", dest="whoishost",
                       help="Lookup using specified whois host")
    parser.add_option("-i", "--nws", action="store_const", 
                      const=NICClient.INICHOST, dest="whoishost",
                      help="Lookup using host " + NICClient.INICHOST)
    parser.add_option("-I", "--iana", action="store_const", 
                      const=NICClient.IANAHOST, dest="whoishost",
                      help="Lookup using host " + NICClient.IANAHOST)
    parser.add_option("-l", "--lcanic", action="store_const", 
                      const=NICClient.LNICHOST, dest="whoishost",
                      help="Lookup using host " + NICClient.LNICHOST)
    parser.add_option("-m", "--ra", action="store_const", 
                      const=NICClient.MNICHOST, dest="whoishost",
                      help="Lookup using host " + NICClient.MNICHOST)
    parser.add_option("-p", "--port", action="store", 
                      type="int", dest="port",
                      help="Lookup using specified tcp port")
    parser.add_option("-Q", "--quick", action="store_true", 
                     dest="b_quicklookup", 
                     help="Perform quick lookup")
    parser.add_option("-r", "--ripe", action="store_const", 
                      const=NICClient.RNICHOST, dest="whoishost",
                      help="Lookup using host " + NICClient.RNICHOST)
    parser.add_option("-R", "--ru", action="store_const", 
                      const="ru", dest="country",
                      help="Lookup Russian NIC")
    parser.add_option("-6", "--6bone", action="store_const", 
                      const=NICClient.SNICHOST, dest="whoishost",
                      help="Lookup using host " + NICClient.SNICHOST)
    parser.add_option("-?", "--help", action="help")

        
    return parser.parse_args(argv)
    
if __name__ == "__main__":
    flags = 0
    nic_client = NICClient()
    (options, args) = parse_command_line(sys.argv)
    if (options.b_quicklookup is True):
        flags = flags|NICClient.WHOIS_QUICK
    print(nic_client.whois_lookup(options.__dict__, args[1], flags))

Posted by & filed under Security News.

Update: Aftenposten now publishes all documents used to write related news articles in an RSS feed here: http://www.aftenposten.no/eksport/rss-1_0/?seksjon=spesial_wikileaksdokumenter&utvalg=siste

The Norwegian newspaper “Aftenposten” claims, according to several sources [Norwegian; in English here], that it has access to all the WikiLeaks cables. This would effectively mean that the carefully planned drips of information to selected newspapers that WikiLeaks is known for have, well, sprung a leak (pun intended).

Aftenposten news editor Ole Erik Almlid, who doesn’t want to give up the source of the leak, is quoted as saying:

“We’re free to do what we want with these documents. We’re free to publish the documents or not publish the documents, we can publish on the internet or on paper. We are handling these documents just like all other journalistic material to which we have gained access.”

WikiLeaks has only published 1,862 cables so far out of a total of 251,287, but Aftenposten claims to have access to the whole lot. The newspaper refuses to divulge its sources, so it’s still unclear whether the leak of the leak came from one of WikiLeaks’ media partners (The Guardian, El País, The New York Times and Der Spiegel) or from WikiLeaks itself.

WikiLeaks is known for carefully staging its releases of information, claiming to both protect its sources and prevent the loss of human lives as a consequence of the leaks. The fact that it has routinely published cables (mostly regarding Norwegian-American relations) during the last week seems to support this claim.

While the legitimacy of this leak is still unclear, a single newspaper with full access can publish articles at its own will and according to its own ethical guidelines, free from the WikiLeaks schedule. This may have consequences for WikiLeaks’ partners and WikiLeaks itself, and may create a rush to publish the rest of the cables.