Tuesday, November 26, 2013

Introducing MetaDiver!

I'm pleased to share a program I have been working on for the past few months. MetaDiver extracts metadata from known file types such as Office documents and PDFs using the Windows Shell; in theory, anything the Shell has a DLL registered for. It spits out a nice spreadsheet or delimited document with the attributes you are interested in. With more features planned for future releases, I'm really excited about the simplicity and power of this tool!

This isn't a forensic tool in the sense that it doesn't parse the file itself, but it's a great way to take metadata that is a pain to review in most forensic and IR tools and make it much easier to work with. It's also a great sanity check against your tools. You should be able to mount an image with your favorite tool and point MetaDiver at the files you are interested in; however, I have only tested against data on my hard drives that had already been exported using my favorite forensic tools.

If it's more than 10-20 documents it could take a while to finish while it percolates, so be patient: grab a beer or coffee and give it time to get through everything. It will tell you when it's done!

Download v1.0.7 or click here.

Info: Right now it's just zipped up without an installer; I hope to add one soon. Just download, unzip, and keep the files in the same directory as the exe.

Requirements: Windows 7 or later and .NET 4.0.

Please send feedback if you like it, hate it, whatever.


Sunday, November 17, 2013

Scripting with FTK Filters - Updated

This is an updated post about building AccessData's FTK filters outside of FTK. AccessData probably won't like this, since a badly built filter can crash the client. So let's build it with care.

If you are familiar with FTK then you have had to work with filters. You may even have broken a keyboard or two. One of the first things that came up when moving to FTK 2+ was how to make filters with a lot of items without getting "clickitis," as we call it. Building large filters by hand is tedious, time consuming, and prone to copy-and-paste errors. Out of sheer desperation I decided to write a script to automate building big filters so I could spend more time on analysis and less on copy-pasting. There might be a better way of getting this result without filters; if there is, please let me know!

The Script

Here is a quick and dirty example in Perl for creating a filter for FTK item numbers (don’t judge my syntax too harshly!). 

##<-Begin Script for FTK 5

#!/usr/bin/perl
use strict;
use warnings;

my $path = shift or die "Usage: $0 <item_number_list>\n";

#This filter will set a criterion of matching "any" item number in the list.
print "<?xml version=\"1.0\" encoding=\"UTF-8\"?>";
print "<exportedFilter xmlns=\"http://www.accessdata.com/ftk2/filters\"><filter name=\"Item #\" matchCriterion=\"any\" id=\"f_1000044\" read_only=\"false\" description=\"\">";

open(my $fh, '<', $path) or die "can't open $path: $!\n";
my @items = <$fh>;
close($fh);

my $i = 0;
foreach my $item_num (@items) {
    #Strip trailing whitespace; FTK will freak otherwise since it expects a bare integer.
    $item_num =~ s/\s+$//;
    next if $item_num eq '';    #Skip blank lines.
    print "<rule position=\"$i\" enabled=\"true\" id=\"a_9000\" operator=\"is\"><one_int value=\"$item_num\"/></rule>";
    $i++;
}

print "</filter><attribute id=\"a_9000\" type=\"int\"><table>cmn_Objects</table><column>ObjectID</column></attribute></exportedFilter>";
##<-End Script
The way it works: you feed in a text file with one item number per line. Make sure you strip out formatting and whitespace; the script attempts this, but it's always best to feed in the cleanest data possible! Nice and simple, right?
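For illustration, if the input file (call it items.txt, a name I'm making up here) contained the item numbers 12345 and 67890, the script would emit XML equivalent to the following. Line breaks and indentation are added for readability; the script actually prints it as one continuous stream.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<exportedFilter xmlns="http://www.accessdata.com/ftk2/filters">
  <filter name="Item #" matchCriterion="any" id="f_1000044" read_only="false" description="">
    <rule position="0" enabled="true" id="a_9000" operator="is"><one_int value="12345"/></rule>
    <rule position="1" enabled="true" id="a_9000" operator="is"><one_int value="67890"/></rule>
  </filter>
  <attribute id="a_9000" type="int"><table>cmn_Objects</table><column>ObjectID</column></attribute>
</exportedFilter>
```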

Not so fast. One catch is that the table name, column name, and IDs (a_9000, cmn_Objects, and ObjectID in the script above) can change from FTK version to version. The best way to check whether any values have changed after an FTK upgrade is to build a new dummy filter for the item type you want, export it, then check the XML. Sometimes FTK gets temperamental and I can't explain why; if anyone has ideas I'd love the feedback. I've been successfully building these filters since FTK 2.x for item numbers, hashes, etc.
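One quick way to pull those version-specific values out of an exported dummy filter is a grep. This is just a sketch; dummy_filter.xml is a made-up filename, and the first command fakes up a one-line sample (mirroring the FTK 5 values from the script above) so the snippet is self-contained:

```shell
# Stand-in for a real exported dummy filter (values mirror the FTK 5 script above).
cat > dummy_filter.xml <<'EOF'
<attribute id="a_9000" type="int"><table>cmn_Objects</table><column>ObjectID</column></attribute>
EOF

# Pull out the attribute id, table, and column to compare against your script.
grep -oE '(id="a_[0-9]+"|<table>[^<]+</table>|<column>[^<]+</column>)' dummy_filter.xml | sort -u
```

Run the grep against the real exported filter and update the script wherever the values differ.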

Possible uses

File name matches
Quick hash matches
File path matching (the equivalent of a LIKE '%%' in SQL)
Item number matching back to the original item when you get lists of item numbers during discovery.
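For string-valued attributes like file names, the rule element presumably carries a string value rather than one_int. The fragment below is purely a sketch: the element name (one_string), the operator, and the id/table/column values are placeholders I'm assuming, not confirmed FTK schema; export a dummy filter of that type, as described above, and copy the real elements from it.

```xml
<!-- HYPOTHETICAL: one_string, STRING_ATTR_ID, and the table/column are
     placeholders; take the real values from an exported dummy filter. -->
<rule position="0" enabled="true" id="STRING_ATTR_ID" operator="is">
  <one_string value="budget.xlsx"/>
</rule>
<attribute id="STRING_ATTR_ID" type="string"><table>TABLE_NAME</table><column>COLUMN_NAME</column></attribute>
```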

Exporting to an XML File

If you want to send the script's output to an XML file, run it with this syntax: "perl script.pl > filter.xml".
*Using > sends the console output to a file (overwriting any existing file with the same name).

Then you can just import the filter.xml into FTK! 

Wrap Up

There you have it: building FTK filters using Perl. I look forward to feedback based on your own experiences.

Hope you found this post interesting and useful!