Wednesday, September 4, 2013

Installing BinDiff on Linux Mint 14

I recently upgraded my system to Linux Mint 14 and went about re-installing all my software. When I got to Zynamics/Google BinDiff, I found I had an issue:
$ sudo dpkg -i bindiff401-debian50-amd64.deb

Selecting previously unselected package bindiff.
Unpacking bindiff (from bindiff401-debian50-amd64.deb) ...
dpkg: dependency problems prevent configuration of bindiff:
 bindiff depends on sun-java6-jre; however:
  Package sun-java6-jre is not installed.

Unfortunately, BinDiff requires sun-java6-jre, which is not in the Linux Mint repository, nor any other repository I could find. I could circumvent this by installing BinDiff with dpkg's --ignore-depends=sun-java6-jre option. However, every time I went to install updates, I would get an error message that BinDiff was broken and be prompted to uninstall it before I could continue.
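
For the record, that forced install would have looked like this:
$ sudo dpkg -i --ignore-depends=sun-java6-jre bindiff401-debian50-amd64.deb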

However, I found a work-around: create a dummy package named sun-java6-jre using the equivs tool. (There are some docs out there on this, but I was unable to find a non-Google-cached copy, so here is what I did.)

Linux Mint has equivs in its repository, so if it's not already installed, apt-get it.
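
In case it's missing:
$ sudo apt-get install equivs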

Next, run equivs-control sun-java6-jre and this will create a file named sun-java6-jre that you will need to modify.

At minimum, you'll need to uncomment and/or fill out the following fields:
  • Package
  • Version
  • Maintainer
I also filled out the description fields so I would remember what it was.
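
For reference, the relevant part of my finished control file looked something like this (the maintainer and description values are just examples; fill in your own):
Section: misc
Priority: optional
Standards-Version: 3.9.2
Package: sun-java6-jre
Version: 6.0
Maintainer: Your Name <you@example.com>
Architecture: all
Description: Dummy package to satisfy BinDiff's sun-java6-jre dependency
 Empty package generated with equivs; no actual Java is installed.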

After the file is modified, run equivs-build sun-java6-jre and you should see something similar to below:
$ equivs-build sun-java6-jre
dh_testdir
dh_testroot
dh_prep
dh_testdir
dh_testroot
dh_install
dh_installdocs
dh_installchangelogs
dh_compress
dh_fixperms
dh_installdeb
dh_gencontrol
dh_md5sums
dh_builddeb
dpkg-deb: building package `sun-java6-jre' in `../sun-java6-jre_6.0_all.deb'.

The package has been created.
Attention, the package has been created in the current directory,
not in ".." as indicated by the message above!
Once that has successfully completed, you should have a sun-java6-jre_6.0_all.deb file in your directory. If that failed, you probably forgot to modify one of the fields in the file.

Finally, dpkg -i the new .deb file, then install BinDiff:
$ sudo dpkg -i sun-java6-jre_6.0_all.deb
Selecting previously unselected package sun-java6-jre.
(Reading database ... 237677 files and directories currently installed.)
Unpacking sun-java6-jre (from sun-java6-jre_6.0_all.deb) ...
Setting up sun-java6-jre (6.0) ...
$ sudo dpkg -i bindiff401-debian50-amd64.deb
Selecting previously unselected package bindiff.
(Reading database ... 237681 files and directories currently installed.)
Unpacking bindiff (from bindiff401-debian50-amd64.deb) ...
bindiff license has already been accepted
Setting up bindiff (4.0.1) ...

$
Then you are good to go!

Sunday, May 19, 2013

My Take on the City of Akron Hack

On Thursday, May 16, 2013, a Turkish hacking group called Turkish Ajan hacked into the City of Akron and released files containing personal information on a number of Akron citizens. According to the city, the attackers were able to gain access to some internal systems, where they obtained tax information.

The news has died down on this for the moment, but from the information that has been released, there are some things we can infer:

1. The attackers compromised the city's public website. Based on the errors that were being displayed on the site, the information released by the city, and the way this group works, it was likely done through SQL injection (although this has not been specifically stated yet).

2. The attackers compromised the city's internal systems and obtained access to tax systems. It is unknown if they were able to do so from the city's public website, through the tax-paying system, or through some other server. In any case, this appears to be where the attackers got the files they posted.

3. Around 25K people are affected.

4. The FBI is involved and was called in quickly after the compromise was discovered. IMO, this is good.

Any additional information on what happened is pretty much speculation. Trust me, I've been speculating a lot and have a pretty good idea of what happened, but I have no proof. Hopefully whoever is doing the forensics for the city will have their findings released at some point. As an Akron citizen and taxpayer, I want to know this information.

However, there is one thing that I think needs to be brought up: why was this information stored unencrypted? And if it was encrypted, how did the attackers obtain access to the keys to decrypt it?

The information that was released contains Social Security numbers for both the taxpayer and their spouse, as well as credit card numbers. According to PCI standards (and my understanding), the credit card numbers should have been encrypted. The federal government is required to comply with PCI; what about the city of Akron's government?

As for the SSNs, I don't know of any specific regulation that requires that information to be encrypted (please let me know if there is one), but I can't imagine any reason it shouldn't be. I have a feeling there are at least 25,000 people who agree with me.

One final item of note. The press has been getting quotes from Deputy Mayor Rick Merolla. With all due respect, sir, shut up. I can only imagine your IT and information security people are cringing whenever they read your quotes pertaining to the security of the city of Akron's systems.

I'm sure you are very smart, but it's obvious you are not familiar with information security. Quotes such as "Our systems are all, all our virus protection, intrusion protection systems, all of our virus software is still up to date so we are still not sure how they got in" show this. Let those performing the investigation, or the talented IT personnel you employ, speak on these things.

If you like, I am personally offering to give you a training course on information security, hackers, and how attacks take place. This will at least give you an idea on why the things you have been quoted as saying are cringe-worthy.

Friday, April 19, 2013

MASTIFF 0.6.0 Released!

The latest version of MASTIFF, 0.6.0, has just been released! Run over to the download site and grab the latest version!

The official changelog is located here, but the major improvements are described below.

Upgrading MASTIFF to the latest version is easy. You can follow this process:
  1. Download and install pydeep.
  2. Download MASTIFF 0.6.0 and untar it.
  3. Run "make test" to ensure you are not missing any dependencies.
  4. Run "sudo make install" to install the latest version.
  5. Copy the analysis plug-ins (the plugins directory in the tarball) to your location of choice and ensure the config file is pointing to that directory.
  6. Add any new options to your MASTIFF config file. The easiest way may be to use sdiff (see the sketch below).
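
Roughly, the upgrade looks like this; the paths and the plug-in/config locations are examples, so adjust them for your setup:
$ git clone https://github.com/kbandla/pydeep.git
$ cd pydeep && python setup.py build && sudo python setup.py install && cd ..
$ tar xvzf mastiff-0.6.0.tar.gz
$ cd mastiff-0.6.0
$ make test                              # check for missing dependencies
$ sudo make install
$ cp -r plugins ~/mastiff-plugins        # example location; point your config here
$ sdiff mastiff.conf ~/mastiff.conf      # example paths; merge any new options into yours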

Queue

MASTIFF now has a queueing system so multiple files can be analyzed by the framework. To utilize this, give MASTIFF a directory instead of a file to analyze. It will find all files in that directory and its subdirectories, add them to the queue, and begin processing.

The queue is maintained within the MASTIFF database, so if you have to stop MASTIFF in the middle of a run, it will resume processing the queue when it's restarted. Some additional options have been added to allow you to work with the queue:
  • --clear-queue: This will clear the current queue.
  • --ignore-queue: This will ignore the queue and just process the file you give it.
Analysis plug-ins are also taking advantage of the queue. The pdf-parser and ZipExtract plug-ins have a new option ("feedback") which allows you to feed files from the plug-ins back into the queue for processing. For example, the ZipExtract plug-in will add all files extracted from an archive to the queue for processing.
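
For example, queue usage looks something like this (assuming mas.py is the command-line script on your install; check its --help output for the exact syntax):
$ mas.py /opt/samples                          # queue and process every file under the directory
$ mas.py --clear-queue                         # throw away the current queue
$ mas.py --ignore-queue /opt/samples/evil.exe  # skip the queue and analyze just this file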

Fuzzy Hashing

Fuzzy hashing is not something new within MASTIFF. However, we have changed the Python library used for it. Previously, we used pyssdeep, but found that it had a number of stability issues on OSX and when processing large numbers of files.

Therefore, we have switched to pydeep (https://github.com/kbandla/pydeep). Our testing has shown it to be much more stable thus far.
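
A quick way to sanity-check the new library (the file names here are just examples):
import pydeep

# fuzzy-hash two files and compare them; pydeep.compare() returns a
# similarity score from 0 (no match) to 100 (identical)
h1 = pydeep.hash_file('sample1.exe')
h2 = pydeep.hash_file('sample2.exe')
print(pydeep.compare(h1, h2))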

libmagic

There was some confusion about which Python libmagic library to use when installing MASTIFF. To help alleviate that, the framework has been modified to support two different Python libmagic libraries; if either one is installed, MASTIFF will utilize it.
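
As an illustration of why this matters, the two common Python libmagic bindings expose incompatible APIs. A sketch of the usual compatibility shim (this is illustrative, not MASTIFF's actual code) looks like:
import magic

def get_filetype(path):
    # the bindings shipped with the file(1) sources expose magic.open()
    if hasattr(magic, 'open'):
        ms = magic.open(magic.MAGIC_NONE)
        ms.load()
        return ms.file(path)
    # the python-magic package from PyPI exposes magic.from_file()
    return magic.from_file(path)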

Other Changes

A number of other bug fixes and improvements have been made. Please see the changelog file for a complete list.

As always, if you have any questions, please email mastiff-project@korelogic.com.

We have a lot of great things coming down the pipe for MASTIFF, but if you have any suggestions, enhancements, or plug-ins, let us know!

Thursday, February 21, 2013

MASTIFF: Automated Static Analysis Framework

Malware analysis is a process that begs to be automated. Messing up one step or running one tool incorrectly can cause you to have to restart the entire process. Fortunately, there are a number of automation frameworks or systems, such as Cuckoo or Threat Expert, that exist to help automate malware analysis.

While these automation frameworks are great, they tend to focus on dynamic analysis (behavioral analysis); static analysis (characteristic analysis) is mostly left out. The static analysis techniques that the frameworks do perform vary, but typically include hashing, strings extraction, some file-type specific tools, along with a couple other techniques. Additional static analysis programs or techniques usually have to be implemented on their own.

To fill this gap, analysts typically create a master static analysis script that runs all of the desired tools against a file. However, if an analysis tool is run against a file type it cannot analyze, such as a PE header analysis tool against a PDF, you run the risk of crashing the analysis program and, in turn, your automation script.

As an incident responder and malware analyst, I came up against these issues all the time, so I started to look for a solution. Nothing existed to automate the entire static analysis process and allow you to add in your own techniques.

That is why MASTIFF, an open source automated static analysis framework, was created. MASTIFF performs two functions for the analyst:
  • The file type of the file being analyzed is automatically determined.
  • Only those techniques which work on that file type are applied.
By automatically determining the file type for the analyst and ensuring that only the static analysis techniques that work on that file type are run, analysts can be assured that the risk of crashing the automated process is lessened, and that only relevant data is returned.


MASTIFF works by utilizing plug-ins for both file-type detection and static analysis techniques. The decision to utilize plug-ins was two-fold:
  • The types of files analyzed and the techniques available within MASTIFF can be easily expanded by adding new plug-ins.
  • MASTIFF is able to be "crowd-sourced".
The last reason was especially important. Anyone can create a new plug-in to add a new file type or analysis technique, and as more people add plug-ins, the more useful the framework becomes. To facilitate plug-in development, template (skeleton) plug-ins have been included with the project. In just a few minutes, someone can modify a few fields in a template and have a new plug-in ready to go.
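
To make the plug-in idea concrete, here is a hypothetical sketch of file-type-gated dispatch. This is purely illustrative and is not MASTIFF's actual plug-in API; see the bundled template plug-ins for the real thing:
import magic  # python-magic from PyPI, used here just for illustration

class StringsPlugin(object):
    # file-type tokens this plug-in is willing to handle
    handles = ('PE32', 'PDF')

    def applies_to(self, filetype):
        return any(token in filetype for token in self.handles)

    def analyze(self, path):
        print('running strings-style analysis on %s' % path)

def dispatch(path, plugins):
    # determine the file type once, then run only the matching plug-ins
    filetype = magic.from_file(path)
    for plugin in plugins:
        if plugin.applies_to(filetype):
            plugin.analyze(path)

dispatch('sample.exe', [StringsPlugin()])  # file name is an example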

In the coming weeks, I'll be posting information and tutorials related to MASTIFF: how to use it, how to create plug-ins for it, etc. Please let me know if you have any questions about the framework, or if there is something specific you would like me to focus on.

Finally, I want to state that MASTIFF was funded through KoreLogic, the company I work for, and the DARPA Cyber Fast Track (CFT) program. If you are unfamiliar with CFT, I highly recommend looking at their site and submitting a proposal. It's a great program, but you only have until April 1, 2013 to submit; after that, no further submissions will be taken.

Tuesday, February 19, 2013

ShmooCon 2013

This past weekend I went to my first ShmooCon in Washington, D.C. I have to say, this was an experience I was not expecting. I've been to many security conferences in the past, including RECon, BlackHat, GFIRST, and some SANS and OWASP conferences. ShmooCon ranks in my top two, if not as the very best I've been to.

The best thing about ShmooCon is that it has a small-con feel while having everything the big cons have (e.g., big-name speakers, contests, prizes, lots of smart people). It also has a small-con price: if you can get a ticket, it's only going to cost you around $150.

I was also lucky enough to be selected as a speaker this year, presenting a talk on my newly open-sourced tool, MASTIFF. As a speaker, I can say they have one of the best-run CFP processes I have ever been through. After selection, they are constantly available for questions, have excellent moderators, and are great at making sure you have what you need.

The talks at the conference were amazing. They are of the highest quality, and even the ones I didn't like were full of good information. Since I was releasing MASTIFF the first day I was there, and I was freaking out about my talk (I was in the last speaking slot of the tracks), I didn't get to see everything I would have liked. However, these stood out:

  • NSM and more with Bro Network Monitor by Liam Randall - This was the best talk of the conference IMO. Liam gave an excellent talk about what Bro is, how it works, and even how easy it is to extend it. His presentation was how all presentations should be - easy to follow and good at explaining a relatively complicated concept.
  • Crypto: You're doing it wrong by Ron Bowes - Ron gave an excellent talk about some crypto attacks, how they can be performed, and even did three live demos (that didn't fail) of those attacks. I'm not a crypto guy, but Ron's explanations of everything were easy to follow and entertaining. Plus, he used The Call of Cthulhu as some of his encrypted text.
There were a lot more talks I saw that were excellent, and some that I unfortunately missed. Luckily, ShmooCon makes all their recordings available online for free; they should be up in a couple of weeks. I look forward to next year!