Friday, July 15, 2011

Log2Timeline and Super Timelines

With the recent release of Kristinn Gudjonsson's Log2Timeline v0.60, oddly named "The Killer Dwarf" (Ya...you had to be there), generating super timelines has become easier than ever. However, before we get into the technical specifics of exactly HOW this is done, let's cover the two divergent theories about timelines.

For the purposes of this post, I will refer to the two groups as the Hogs and the Budgies. Yes...I know I am terrible at naming things, but after you hear my rationale behind these names, you will at least know my thought process. First of all, both sides agree that timelines should be made. In fact, I am not entirely sure how I ever conducted an investigation without making a timeline, and I am even less sure how anyone currently conducting investigations can think they are doing a comprehensive job without timelines! The separation in philosophies comes down to exactly which data elements to include in the timeline.

Hogs want to include everything...file system data, event logs, registry last write times, application logs...whatever you have, throw it in there. The theory is: I am not entirely sure what I will need or what I really want to see, so just show me everything and I will decide later.

Budgies are the exact opposite...they want to see a much smaller data sample. Presumably, they know precisely what it is that they want, and only want to see that data.

I categorize myself as a Flying Pig, because what I want to look at changes from case to case. Sometimes I only need data from the active file system, while other times I might want to see just the event logs and the system hive last write times.

I think it's OK to be a Flying Pig; in my opinion, it's a good marriage of the two approaches, including just the right data elements in your timeline. If you are new to making forensic timelines, my recommendation is to be a Hog. Gather all of the data you can and throw it into your super timeline. Hopefully, as you get more and more familiar with what data provides value to your investigations, you will get better at determining which elements to include. The fact that you are doing timelines at all sadly puts you in a very small (yet hopefully growing) group of investigators...so keep it up, however you choose to do it.

Now, on to the technical goodness!

Getting Log2Timeline to run properly in Windows was a bit of a challenge. I worked with Kristinn for about a month tweaking Perl modules until we finally had a product that worked properly.

To start with, go to www.log2timeline.net and download the latest version, and the Windows install guide. Once you have the files, unpack them into your tools directory and follow the install guide. I am not going to say much more about that here, other than I KNOW for a fact that it works...since I am the one that wrote it =). So if you follow it step by step, you should not have any problems.

What makes the newest release of Log2Timeline really powerful is the addition of the recurse option. This means that you can throw all of the data you want added to your timeline into a single directory, and use Log2Timeline to recurse through that directory and add any applicable files to the timeline.

Arguably just as important and powerful a change is the addition of file carving functionality combined with plugin grouping (much like Harlan Carvey uses in RegRipper).

For example...let's say you have acquired data from a Windows XP system. You have the event logs, the registry hives, a couple of ntuser.dat files, and the Master File Table. You can chunk (yes...that is an Oklahoma term) them all into a single directory and use the following command syntax.

c:\tools\log2timeline>perl log2timeline.pl -m "keyword" -z CST6CDT -r vol -f winxp -w c:\cases\timeline\super

Let's take a look at these options one by one.

The -m option allows you to put in a keyword. Normally, I use the hostname and the drive letter...for example...cpbeefcake_win7_c:\. This can be anything that will allow you to quickly and easily distinguish one timeline from another.
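
For example, sticking with the XP scenario above and my hostname/drive naming convention (the marker value here is just an illustration), the command would look like this:

c:\tools\log2timeline>perl log2timeline.pl -m "cpbeefcake_winxp_c" -z CST6CDT -r vol -f winxp -w c:\cases\timeline\super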

The -z option allows you to set the timezone for the timeline. This step cannot, and should not, be skipped. While I live in the central timezone, I work cases in multiple other timezones. By default, if you don't specify one, Log2Timeline will use the timezone of the localhost. Now, if the case you are working is, say, in Pacific Standard Time, and your timeline gets generated in Eastern Standard Time, your timeline will be off by three hours! That is a HUGE margin of error, and will no doubt mess with the accuracy of your findings.
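
So, as a quick sketch, if the box you are examining lived in Pacific time, you would say so explicitly and leave everything else alone:

c:\tools\log2timeline>perl log2timeline.pl -m "cpbeefcake_winxp_c" -z PST8PDT -r vol -f winxp -w c:\cases\timeline\super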

The -r option we talked about briefly; it is used to recurse through a directory. Log2timeline uses file carving to identify the header of each file in the directory. Once it obtains that data, it compares the headers to the known headers for the various plugin types. If the header is recognized, it will automatically load the appropriate plugin, parse the chronological data from the file, and put it into the timeline (pretty sweet!).
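
To give you an idea of what gets fed to -r (the file names below are just the usual XP artifacts, renamed where needed...not output from a real case), the "vol" directory from my example command might look something like this. Since the headers get carved, the renamed ntuser files still get picked up:

c:\tools\log2timeline>dir /b vol
$MFT
AppEvent.Evt
SecEvent.Evt
SysEvent.Evt
SAM
SECURITY
SOFTWARE
SYSTEM
ntuser_administrator.dat
ntuser_jsmith.dat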

The -f option identifies the file type. This can either be the specific file type (if you are only parsing a single file) or a set of plugins if you are parsing the files from a specific operating system. In my example, I used the "winxp" plugin, which automatically loads all of the plugins needed for a Windows XP system.
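
If you only have a single artifact to parse, you swap the plugin set for one specific module and point the tool at the file itself. The module name below (evt for the old-style XP event logs) is from memory, and the output name is a placeholder, so double check against the list of input modules that ships with the tool:

c:\tools\log2timeline>perl log2timeline.pl -m "cpbeefcake_winxp_c" -z CST6CDT -f evt -w c:\cases\timeline\secevent vol\SecEvent.Evt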

The -w is the write option. This tells the tool where to write the output file...pretty basic. By default, the tool writes the output in CSV format. DO NOT append the .csv file extension to the output file. I am not sure why this hoarks up the output file, but it does. For some reason, the column headers are left off the file and l2t_process will fail. I need to get with Kristinn on this.

If done correctly, your column headers should look like this...

c:\tools\test>strings 1

date,time,timezone,MACB,source,sourcetype,type,user,host,short,desc,version,filename,inode,notes,format,extra

Now, if you want to, you can append things like the contents of the Master File Table, or a timeline you created with mactime, to your initial output file. Again, since Log2Timeline outputs CSV format by default, you would need to append the final output from mactime, and not a bodyfile generated from fls.
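
If you go the mactime route, a rough sketch of that workflow (the image name and paths are placeholders) is to generate a bodyfile with fls from The Sleuth Kit, convert it to CSV with mactime, and then tack the result onto the end of the super file:

c:\tools\tsk>fls -r -m C: c:\cases\keyword\image.dd > c:\cases\timeline\bodyfile
c:\tools\tsk>mactime -b c:\cases\timeline\bodyfile -d -z CST6CDT > c:\cases\timeline\mft.csv
c:\tools\tsk>type c:\cases\timeline\mft.csv >> c:\cases\timeline\super

Keep in mind the mactime columns are not a one-for-one match with the Log2Timeline columns, so eyeball the combined file before you sort it.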

After you have your super file the way you want it, with all of the data you want in it, you will need to make sure the file is in chronological order, since Log2Timeline simply adds data to the super file in the order it was read (or appended).

To do this, use the following command...

c:\tools\log2timeline>perl l2t_process -b super > supertimeline

l2t_process will arrange the data from the super file into chronological order, with the earliest entry at the top and the latest entry at the bottom. Pretty nice!
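
Nothing fancy here, but a quick way to confirm the earliest entries bubbled up to the top is to page through the start of the sorted file:

c:\tools\log2timeline>more supertimeline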

Another great feature is the ability to use the MFT in the supertimeline! Check it...

date,time,timezone,MACB,source,sourcetype,type,user,host,short,desc,version,filename,inode,notes,format,extra

02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$SI [MACB] time,-,-,/$MFT,/$MFT,2,/$MFT,0, ,Log2t::input::mft,-

02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$FN [MACB] time,-,-,/$MFT,/$MFT,2,/$MFT,0, ,Log2t::input::mft,-

02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$SI [MACB] time,-,-,/$MFTMirr,/$MFTMirr,2,/$MFTMirr,1, ,Log2t::input::mft,-

02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$FN [MACB] time,-,-,/$MFTMirr,/$MFTMirr,2,/$MFTMirr,1, ,Log2t::input::mft,-

02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$SI [MACB] time,-,-,/$LogFile,/$LogFile,2,/$LogFile,2, ,Log2t::input::mft,-

02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$FN [MACB] time,-,-,/$LogFile,/$LogFile,2,/$LogFile,2, ,Log2t::input::mft,-

02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$SI [MACB] time,-,-,/$Volume,/$Volume,2,/$Volume,3, ,Log2t::input::mft,-

02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$FN [MACB] time,-,-,/$Volume,/$Volume,2,/$Volume,3, ,Log2t::input::mft,-

02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$SI [MACB] time,-,-,/$AttrDef,/$AttrDef,2,/$AttrDef,4, ,Log2t::input::mft,-

02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$FN [MACB] time,-,-,/$AttrDef,/$AttrDef,2,/$AttrDef,4, ,Log2t::input::mft,-

You see the $SI and $FN in column seven? That's right baby! Timestomping has NEVER been easier to detect! You will see...plain as day...when the chronological data has been manipulated, since the $SI and $FN attributes will be different! Provided you search by keyword, they will appear literally right on top of each other! Very nice addition!
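
For example, if you suspect a particular file has been stomped on (the file name below is made up), pull all of its rows out of the sorted timeline and compare the $SI lines to the $FN lines:

c:\tools\log2timeline>findstr /l /i /c:"suspicious.exe" supertimeline

If the $SI timestamps have been rolled back but the $FN timestamps still show the real times, you have your answer.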

I almost wish it were harder than that to create super timelines, but it's really not. Kristinn has done a fantastic job on the latest release of Log2Timeline. There are numerous other options the tool can use, and for the sake of brevity (not to mention the fact that you can read) I have not covered every potential option combination. My advice is to take some time and play with the tool. Get to know how it works, what the output looks like, and which commands are the most relevant for the timelines you are creating.

Serious props to Kristinn for making this extremely useful and powerful tool free to the forensic community. He has done an outstanding job, and honestly, like Harlan's RegRipper and Mandiant's Memoryze, this is a game changer.

Happy Hunting!

5 comments:

  1. Thanks for your efforts on getting log2timeline to work on Windows. I was testing it out last week and it's impressive.

    I agree the plugin functionality is a great new feature. I thought people would like to know that custom modules can be created in addition to the default ones like "winxp". All that needs to be done is to create a file with .lst as the file extension and list the modules to run. The file then gets placed in the directory where the other .lst files are located. For example, a custom file can be created to run the Windows shortcut, Internet Explorer history, and ntuser.dat modules against a specific user profile. The directory on Windows containing the .lst files is C:\Perl\lib\Log2t\input (if your instructions were used to install log2timeline).

    All the credit on creating your own module goes to Kristinn. All I did was email him to see if it's possible to create your own custom modules and he provided the answer.

  2. Great article. I'm looking for the Windows install guide on the log2timeline site but can't find it. Can you post it on your site (since you wrote it)?

  3. Like Mike Pilkington, I can't find the Windows install guide. Not a good start for a forensic investigator, can't even find an Install Guide. I'm a decided part-timer, Windows is my world :(

  4. Just download log2timeline from the site. Extract the files (using 7-Zip) and go to the docs folder. In there you'll find the instructions for installing on Windows. Not a good start for a forensic investigator indeed :@)

  5. Hmmmm... I'm quite new to this one. I'm starting to learn this from time to time. Thanks for posting anyway, I truly appreciate this.
