This blog is dedicated to Digital Forensics and Incident Response: tools, techniques, policies, and procedures.
Tuesday, December 6, 2011
Manipulating WFP and Residual IOCs
Friday, November 18, 2011
Pauldotcom Interview
Wednesday, November 16, 2011
Interview on Pauldotcom
Tuesday, November 15, 2011
Thursday, October 20, 2011
SecTor 2011 Huge Success
A special thanks to Melanie Wallis for handling the logistics for all of the speakers!
If you have not been to SecTor in the past, you are seriously missing out. The talks get better year after year, and the crowd continues to grow. This year Brian said they had over 1100 attendees, which is fantastic!
Also, Grayson Lenik, Jibran Ilyas, and I conducted a one-day forensics training seminar that went beautifully and was very well received.
Great job again to everyone at Black Arts LTD for making SecTor 2011 a huge success!
Monday, October 10, 2011
The Great Northern Invasion
On October 17th, I am presenting a day of Law Enforcement training in Toronto at SecTor, followed on the 18th by the debut of Sniper Forensics v3.0: Hunt.
Then, on October 25th, I am speaking at SecureTech Canada where I am sitting on a panel discussing Cyber Extortion and Protecting Critical Data.
I am really looking forward to a pile of poutine...and maybe a Keith's!
If anyone is going to SecTor, I will be at Joe Bidali's right across the street from the hotel on most nights. See you there eh!
Wednesday, September 21, 2011
Log2Timeline Install Guide
This has been tested on a Windows XP SP3 machine (32-bit) and a Windows 7 64-bit machine.
Download and install ActiveState Perl
Open command prompt and run the following commands (install dependencies):
ppm install datetime
ppm install win32::api
ppm install date::manip
ppm install xml::libxml
ppm install carp::assert
ppm install digest::crc
ppm install data::hexify
ppm install image::exiftool
ppm install file::mork
ppm install datetime::format::strptime
ppm install parse::win32registry
ppm install html::scrubber
Download the latest source code for log2timeline
Download two additional libraries:
"http://search.cpan.org/CPAN/
"http://search.cpan.org/CPAN/
Uncompress both archives
Copy the content of the lib/XML folder to c:/perl/lib/XML/
Inside the Mac-Propertylist:
Create the directory c:/perl/lib/Mac
Copy the content of the lib/* to c:/perl/lib/Mac
Inside the log2timeline directory
Delete the file lib/Log2t/input/pcap.pm
Copy the content of the lib/Parse/* to c:/perl/lib/Parse/
Copy the content of the folder lib/Log2t to c:/perl/lib/Log2t/*
Copy lib/Log2Timeline.pm to c:/perl/lib/
Copy log2timeline to c:/perl/bin/log2timeline.pl
Copy l2t_process to c:/perl/bin/l2t_process.pl
Copy timescanner to c:/perl/bin/timescanner.pl
Test and hope the best... ;)
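One way to test: the quick script below (a minimal sketch; the module names are taken straight from the steps above, and "check_install.pl" is just a name I made up) will tell you whether the dependencies and the Log2Timeline library itself can actually be loaded by your Perl install.
#!/usr/bin/perl
# check_install.pl - try to load the modules the steps above put in place and
# report anything that is still missing (a minimal sketch).
use strict;
use warnings;

my @modules = qw(
    DateTime
    Win32::API
    Date::Manip
    XML::LibXML
    Carp::Assert
    Digest::CRC
    Data::Hexify
    Image::ExifTool
    File::Mork
    DateTime::Format::Strptime
    Parse::Win32Registry
    HTML::Scrubber
    Log2Timeline
);

for my $mod (@modules) {
    if (eval "require $mod; 1") {
        print "OK       $mod\n";
    } else {
        print "MISSING  $mod\n";
    }
}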
Monday, August 29, 2011
CyberSpeak Interview Available!
Thanks to Ovie Carroll and George Starcher for taking the time to interview me! I hope your ratings don't drop too much =).
Tuesday, August 16, 2011
CyberSpeak Interview
I just finished an interview with Ovie Carroll on CyberSpeak! It should be posted in about two weeks! Give it a listen!
Talked about Sniper Forensics and how it rocks the hizzie!
Friday, August 12, 2011
Investigation Plans
Here is what I do...
First, I open Case Notes and open my custom tab that I have labeled, "Investigation Plan".
Second, I sit back and think about what it is that I have been asked to do. This will obviously change from case to case, agency to agency, and person to person, but the general goal should be the same. You have been asked to identify something for some reason. You are not conducting the investigation for the sake of the investigation itself.
Once I have my overall goal, I write it down in my Case Notes..."I have been asked to confirm blah."
Third, I brainstorm on the "stuff" I will likely need to accomplish my goal. Will I need logs, will I need to interview customer (victim) employees, will I need timeline data, registry data...whatever.
Fourth, I use my tab that I have labeled, "Questions", and I ask myself questions that, based on the data I just brainstormed, should help me to accomplish my overall goal. Throughout the investigation, I answer my questions. These answers will either terminate my line of thinking in that area and provide me with a new theory, or support my theory, enabling me to continue down the same path.
Following this brief but very useful exercise will give clarity to my investigation as well as provide success indicators so that I know I have found what I am looking for! Without a clear idea of what you have been asked to do, an investigator can easily become lost in the "Fog of Forensics" and his case can grind to a standstill.
If you are using Investigation Plans...Good on you! If you are not...start...I promise you will see significant and immediate benefits!
Now...that pretty much concludes my thoughts on Investigation Plans.
Monday, July 18, 2011
How Do I Get There From Here?
First of all, you need a good attitude. You need to leave your ego or any overinflated sense of superiority at the door. Some of the absolute BEST people in this industry...guys like Harlan Carvey, Rob Lee, Ovie Carroll, Cory Altheide, Hal Pomeranz, Chad Tilbury, Lenny Zeltser, Jesse Kornblum, Colin Sheppard, Chris Hague, Jibran Ilyas, Grayson Lenik and Eric Huber all share a common trait...Humility. I bet if you asked any one of them if they were good at what they do, you would likely get some variant of the response, "I sure try, but there is always so much to learn!"
They know they do not know everything, and work hard to keep current on emerging concepts and technologies. I have met them all, and there is absolutely NO pretense in any of these industry giants. Also, they are passionate about their work, and love what they do. They are the best because they work the hardest. Period.
You also need to be flexible. The slogan of this industry is "semper gumby" - always flexible. You need to be able to adapt to constantly changing situations, emerging evidence, difficult customers, challenging timetables, and extensive travel. Don't be too rigid, or get frustrated when things either change unexpectedly or don't turn out as planned.
And Travel...loooooots of travel. As an example, I am writing this in the airport during week three of a seven week travel spree. You will travel...a LOT...so get used to it.
Second, you have to be wired for this kind of work. By "wired", I mean you just have to "get" technology. You have to have a knack for computers beyond the skills and abilities of what would commonly be referred to as a "normal" end user. You cannot be scared by the command line, Linux, Master Boot Records, Master File Tables, the Windows Registry, the OSI model, Perl, Ruby, and/or Python (just to name a few). You need to be able to read, comprehend, and figure stuff out. You should know what you are looking at and why, and be able to explain it to anyone. In short, you need to be either inherently smart, or prepared to work really hard (I fall into the latter category - not the smartest dood in the room, but I think I work as hard as, or harder than, just about anyone). In my opinion, having a concrete foundational knowledge is essential for the job, and is really the difference maker between someone who is OK at the job and someone who is really good. So never stop learning!
Remember, knowing how to use a tool (any tool) no more makes you an investigator, than knowing how to use MS Word makes you Stephen King. It's a tool that does something...NOTHING more. It's the expert set of eyes on the screen and the expert fingers on the keyboard that make up the expert.
Third, you need a desire to find the truth. The evidence is there (usually), and it's up to you to find it and interpret it properly. Also, there is a famous quote by Dr. Carl Sagan, who stated, "The absence of evidence is not the evidence of absence." Remember, it is the job of the investigator to identify and properly interpret the evidence.
These are the precepts you should hang your "hat" on. Find the truth. Dig it out of every registry hive, file system, unallocated cluster, slack space, and network capture you can find.
Along those lines, Harlan and I were recently having a discussion over breakfast about context. The basic result was that many investigators will jump to conclusions based on a single data point without building appropriate context around that data point. Why is it there? What does it mean? Am I drawing conclusions based on theory or fact? Are there other data points that all indicate the same "thing" took place? For us, best practice is to identify at least three data points that all point in the same direction. This will give the investigator confidence in what they found (that it is indeed accurate), and give weight to the evidence.
This is something I touch on in Sniper Forensics. NEVER EVER form your opinion about what happened and try to make the data fit your theory. Let the data formulate your theory, and allow your investigation to flow with the evidence. You may change directions numerous times. Doing so doesn't mean you are wrong, or a bad investigator. It means you know enough to allow the evidence to guide the investigation. It's a complex, fluid combination of art and science, and if it were easy, everybody would do it and be good at it.
OK...so now that we have covered some of the basics regarding attitude, and some philosophical essentials, let's talk about education. You need it. Personally, I am not a huge fan of the forensic degree programs currently being taught at many universities. From what I have seen, they teach tool use, and maybe a little theory, which is good, but not something that is going to equip an investigator for a successful career in the field. I would LOVE to see them teach the history of forensic science, logic, investigative methodology, technical writing, research methodologies, public speaking, conflict resolution, and systems administration. These are the key components of a solid investigator...not knowing how to use a tool! If you have the opportunity to take any class that covers these topics, I would HIGHLY recommend doing so. You would be amazed if I told you how relevant my Pre-Socratic Philosophy class is to my job! Or how much better my reports are after taking a technical writing course. The independent research I have done on expert witness testimony has made me better prepared to speak on the stand. Taking a class that certifies you in how to use a certain tool...ya...not gonna teach you ANY of those things...I'm juuuuuuuuuuuust sayin...
In my opinion, if you are looking into a degree program, take something that is going to teach you what "normal" looks like. Get a general IT degree that is well rounded with courses in Windows, Linux, networking, midrange, and emerging technologies. You can learn the tools later; knowing the basics will serve you far better in the field.
I am a fan of technical certifications...sort of. I have several, and I feel like I got something out of studying for, taking, and passing the requisite examinations. I think the subject matter is relatively small (compared to the larger IT world), focused, and can help to contribute to your subject matter expertise in a specific area.
Now, I am only partially a fan of certifications for a couple of reasons. I know several people who have multiple certifications, and are crummy investigators. Alternatively, I know several people who have few or no technical certifications, who are fantastic investigators. Again, those little letters after your name don't make you a good investigator. They mean you paid some money, sat in a class, and passed an exam. Nothing more. If you have multiple certs...good for you...don't get a big head about it. If you don't have any...don't let it discourage you. They are what they are...indicators that you took a class and passed a test.
Don't get me wrong, from a business perspective, technical certifications go a long way in establishing you as a subject matter expert (some contracts I have worked on even required them). Also, they can show prospective employers that you are serious about your trade, and have taken steps to set yourself apart from other applicants. But don't ever think that just because you have a cert and someone else doesn't that you are "better" than they are. It's simply not the case...ever...and it's just going to make you look like a jerk. I recommend taking the approach that you love the trade and want to learn as much as you can about it. You are fortunate enough to have the resources necessary to attend the class and take the exam. It was a great experience, and you feel that you have benefitted from the knowledge you gained. BUT, you realize that the forensics/IR world is a big place with a LOT to learn, and you are eager to be engaged in any way you can (recognize your efforts without breaking your arm patting yourself on the back...good skill to have). If you are good at what you do, your actions will speak far louder than any certifications ever could.
Next, know that you are going to have to interact with customers....a lot. You are going to have to explain some very technical concepts to non-technical people - not stupid, just not technical. You are going to have to deal with angry lawyers, crying business owners, demands, fear, and uncertainty. Basically, every new case is everyone's worst day. You need to become skilled in situational analysis, leadership, public speaking, and incident management. You will have to learn how to walk the line (a very fine line sometimes) between confidence and arrogance. This is a difficult concept to learn, and honestly, after studying it in both my undergrad and graduate degree programs, at Warrant Officer Candidate School, and in books about it...it's something you are going to have to experience to get good at. At least by doing the research on it, you can better prepare yourself and decrease the time it's going to take you to become proficient.
I also recommend reading Dale Carnegie's How to Win Friends and Influence People at least once per year. Take good notes, and use them. It has a wealth of information and has been THE standard for interpersonal business relationships for almost 100 years. Also, realize that at the end of your contract is a person...a human being. This is their business, or their company...their livelihood. This is how they put a roof over their head, food on their table, and their kids through school. Be cognizant of that, and empathetic to their situation.
Finally, I will share some personal details about how I broke into the industry. When I was a sysadmin, I got bored. You can only make things work so well, and know how to troubleshoot so much, before it becomes mundane. That was the case with me...I was a Solaris and Windows admin at a decently sized IT shop and I was pretty good. My systems ran well, I could troubleshoot quickly and efficiently...and I was bored to tears. So, I searched internally for openings doing something different and I came across a posting for the Ethical Hacking Team. I had all of the required skills (networking, Linux, Windows), no different than any of the other applicants. But what I had that they did not was raw desire. I wanted this job more than anything. I read anything I could get my hands on that dealt with the subject, spent my own money setting up a makeshift lab to play with tools, and performed experiments. I ooozed enthusiasm. I ended up getting the job. After I was hired, I asked my new manager what it was about me that ended up landing me the job. She told me something I have never forgotten to this day...
"Chris, I can teach you how to use the tools. The other folks on the team can teach you how to go after certain targets, what to look for, and how to run exploits. What I can't teach is enthusiasm. I know that you will be one of my best pentesters in a year simply because you want to be. I firmly believe you wanted the job more than anyone else."
So, while being passionate may not land you the job, it will set you apart from other applicants. Read, research, study, conduct experiments. Learn something new every day. Learn how to use open source tools (which is like 99% of what I use). Learn about forensic theory, investigative methodology, and logic. Learn how to write reports, how to deal with difficult situations and difficult people, and how to LISTEN! Most of all, love the work!
I hope you find this information helpful. If you have any specific questions, please feel free to email me at any time. I am always willing to help!
Happy Hunting!
Friday, July 15, 2011
Log2Timeline and Super Timelines
For the purposes of this post, I will refer to the two groups as the Hogs and the Budgies. Yes...I know I am terrible at naming things, but after you hear my rationale behind these names, you will at least know my thought process. First of all, both sides agree that timelines should be made. In fact, I am not entirely sure how I ever conducted an investigation without making a timeline, and I am even less sure about how anyone currently conducting investigations can think they are doing a comprehensive job without timelines! The separation in philosophies comes from exactly what data elements to include in the timeline.
Hogs want to include everything...file system data, event logs, registry last write times, application logs...whatever you have, throw it in there. The theory is: I am not entirely sure what I will need, or what I really want to see, so just show me everything and I will decide later.
Budgies are the exact opposite...they want to see a much smaller data sample. Presumably, they know precisely what it is that they want, and only want to see that data.
I categorize myself as a Flying Pig, because what I want to look at changes from case to case. Sometimes I only need data from the active file system, while other times I may only want to see the event logs and just the SYSTEM hive last write times.
I think it's OK to be a Flying Pig, and in my opinion it strikes a good balance, including just the right data elements in your timeline. If you are new to making forensic timelines, my recommendation is to be a Hog. Gather all of the data you can and throw it into your super timeline. Hopefully, as you get more and more familiar with what data provides value to your investigations, you will get better at determining which elements to include. The fact that you are doing timelines at all sadly puts you in a very small (yet hopefully growing) group of investigators...so keep it up, however you choose to do it.
Now, on to the technical goodness!
Getting Log2Timeline to run properly in Windows was a bit of a challenge. I worked with Kristinn for about a month tweaking Perl modules until we finally had a product that worked properly.
To start with, go to www.log2timeline.net and download the latest version, and the Windows install guide. Once you have the files, unpack them into your tools directory and follow the install guide. I am not going to say much more about that here, other than I KNOW for a fact that it works...since I am the one that wrote it =). So if you follow it step by step, you should not have any problems.
What makes the newest release of Log2Timeline really powerful is the addition of the recurse option. This means that you can throw all of the data you want added to your timeline into a single directory, and use Log2Timeline to recurse through that directory and add any applicable files to the timeline.
Arguably just as important and powerful a change is the addition of file carving functionality with plugin grouping (much like Harlan Carvey uses in Reg Ripper).
For example...let's say you acquire volatile data from a Windows XP System. You have the event logs, the registry hives, a couple of ntuser.dat files, and the Master File Table. You can chunk (yes...that is an Oklahoma term) them all into a single directory and use the following command syntax.
c:\tools\log2timeline>perl c:\tools\log2timeline\log2timeline.pl -m "keyword" -z CST6CDT -r vol -f winxp -w c:\cases\
Let's take a look at these options one by one.
The -m option allows you to put in a keyword. Normally, I use the hostname and the drive letter...for example...cpbeefcake_win7_c:\. This can be anything that will allow you to quickly and easily distinguish one timeline from another.
The -z option allows you to set the time zone for the timeline. This step cannot, and should not, be skipped. While I live in the Central time zone, I work cases in multiple other time zones. By default, if you don't specify a time zone, Log2Timeline will use the time zone of the localhost. Now, if the case you are working is, say, in Pacific Standard Time, and your timeline gets generated in Eastern Standard Time, your timeline will be off by three hours! That is a HUGE margin of error, and will no doubt mess with the accuracy of your findings.
The -r option, we talked about briefly, but it is used to recurse through a directory. Log2timeline uses file carving to identify the header of all of the files in the directory. Once it obtains that data, it compares the headers to the known headers for the various plugin types. If the header is recognized, it will automatically load the appropriate plugin, parse the chronological data from the file, and put it into the timeline (pretty sweet!).
The -f option identifies the file type. This can either be the specific file type (if you are only parsing a single file) or a set of plugins if you are parsing the files from a specific operating system. In my example, I used the "winxp" plugin, which automatically loads all of the plugins needed for a Windows XP system.
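If you are only after a single artifact type, you can point -f at one input module instead of the winxp group. Something along these lines should work (the "prefetch" module name and the file path here are just illustrative examples; check the tool's documentation for the exact module names in your version):
c:\tools\log2timeline>perl log2timeline.pl -f prefetch -z CST6CDT -w c:\cases\prefetch_timeline c:\cases\evidence\CMD.EXE-087B4001.pf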
The -w is the write option. This tells the tool where to write the output file...pretty basic. By default, the tool writes the output in CSV format. DO NOT append the .csv file extension to the output file. I am not sure why this hoarks up the output file, but it does. For some reason, the column headers are left off the file and l2t_process will fail. I need to get with Kristinn on this.
If done correctly, your column headers should look like this...
c:\tools\test>strings 1
date,time,timezone,MACB,source,sourcetype,type,user,host,short,desc,version,filename,inode,notes,format,extra
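If you want to sanity check the output before moving on, a tiny script like this one (a minimal sketch; "check_header.pl" is just a name I made up) will confirm that the header row actually made it into the file:
#!/usr/bin/perl
# check_header.pl - confirm a log2timeline CSV output file begins with the
# expected column header (a minimal sketch; point it at your output file)
use strict;
use warnings;

my $file = shift or die "Usage: check_header.pl <timeline file>\n";
my $expected = 'date,time,timezone,MACB,source,sourcetype,type,user,host,'
             . 'short,desc,version,filename,inode,notes,format,extra';

open my $fh, '<', $file or die "Cannot open $file: $!\n";
my $first = <$fh>;
close $fh;

chomp $first;
$first =~ s/\r$//;    # strip a trailing carriage return on Windows

if ($first eq $expected) {
    print "Header looks good - safe to feed this file to l2t_process.\n";
} else {
    print "WARNING: header row is missing or mangled - l2t_process will likely fail.\n";
}

c:\tools\test>perl check_header.pl 1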
Now, if you want to, you can append things like the contents of the Master File Table, or a timeline you created with mactime, to your initial output file. Again, since Log2Timeline outputs in CSV format by default, you would need to append the final output from mactime, and not a bodyfile generated from fls.
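If you would rather script the append than do it by hand, something like the sketch below works (the file names are just examples; it concatenates a second CSV timeline onto the super file and skips that file's header row so you don't end up with two header lines):
#!/usr/bin/perl
# append_timeline.pl - append one CSV timeline onto an existing "super" file,
# skipping the header row of the file being appended (file names are examples)
use strict;
use warnings;

my ($super, $extra) = @ARGV;
die "Usage: append_timeline.pl <super file> <timeline to append>\n"
    unless defined $super and defined $extra;

open my $out, '>>', $super or die "Cannot open $super for append: $!\n";
open my $in,  '<',  $extra or die "Cannot open $extra: $!\n";

my $line_no = 0;
while (my $line = <$in>) {
    $line_no++;
    next if $line_no == 1 and $line =~ /^date,time,timezone/i;  # drop duplicate header
    print {$out} $line;
}

close $in;
close $out;
print "Appended $extra to $super.\n";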
After you have your super file the way you want it, with all of the data you want in it, you will need to make sure the file is in chronological order, since Log2Timeline will simply add the data to the super file in sequential order (in the order it was read, or appended).
To do this, use the following command...
c:\tools\log2timeline>perl l2t_process -b super > supertimeline
l2t_process will arrange the data from the super file chronologically, with the first entry at the top and the last entry at the bottom. Pretty nice!
Another great feature is the ability to use the MFT in the supertimeline! Check it...
date,time,timezone,MACB,source,sourcetype,type,user,host,short,desc,version,filename,inode,notes,format,extra
02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$SI [MACB] time,-,-,/$MFT,/$MFT,2,/$MFT,0, ,Log2t::input::mft,-
02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$FN [MACB] time,-,-,/$MFT,/$MFT,2,/$MFT,0, ,Log2t::input::mft,-
02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$SI [MACB] time,-,-,/$MFTMirr,/$MFTMirr,2,/$MFTMirr,1, ,Log2t::input::mft,-
02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$FN [MACB] time,-,-,/$MFTMirr,/$MFTMirr,2,/$MFTMirr,1, ,Log2t::input::mft,-
02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$SI [MACB] time,-,-,/$LogFile,/$LogFile,2,/$LogFile,2, ,Log2t::input::mft,-
02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$FN [MACB] time,-,-,/$LogFile,/$LogFile,2,/$LogFile,2, ,Log2t::input::mft,-
02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$SI [MACB] time,-,-,/$Volume,/$Volume,2,/$Volume,3, ,Log2t::input::mft,-
02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$FN [MACB] time,-,-,/$Volume,/$Volume,2,/$Volume,3, ,Log2t::input::mft,-
02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$SI [MACB] time,-,-,/$AttrDef,/$AttrDef,2,/$AttrDef,4, ,Log2t::input::mft,-
02/26/2009,20:51:34,CST6CDT,MACB,FILE,NTFS $MFT,$FN [MACB] time,-,-,/$AttrDef,/$AttrDef,2,/$AttrDef,4, ,Log2t::input::mft,-
You see the $SI and $FN in column seven? That's right baby! Timestomping has NEVER been easier to detect! You will see...plain as day...when the chronological data has been manipulated, since the $SI and $FN attributes will be different! Provided you search by keyword, they will appear literally right on top of each other! Very nice addition!
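If your supertimeline is large, eyeballing every $SI/$FN pair gets tedious fast. A rough sketch like the one below (it uses a naive comma split, so it assumes none of the fields in your output contain embedded commas, and the column positions match the header shown above) will flag files whose $SI and $FN timestamps disagree so you can review them:
#!/usr/bin/perl
# flag_timestomp.pl - compare $SI and $FN timestamps per file in a supertimeline
# CSV and print files where they disagree (a rough sketch; assumes no embedded
# commas in any field, which a naive split cannot handle)
use strict;
use warnings;

my $file = shift or die "Usage: flag_timestomp.pl <supertimeline csv>\n";
open my $fh, '<', $file or die "Cannot open $file: $!\n";

my %times;    # filename => { SI => { "date time" => 1 }, FN => { ... } }
while (my $line = <$fh>) {
    chomp $line;
    my @f = split /,/, $line;
    next unless @f > 12;
    my ($date, $time, $type, $filename) = @f[0, 1, 6, 12];
    next unless defined $type and $type =~ /\$(SI|FN)\b/;
    $times{$filename}{$1}{"$date $time"} = 1;
}
close $fh;

for my $name (sort keys %times) {
    my $si = join '|', sort keys %{ $times{$name}{SI} || {} };
    my $fn = join '|', sort keys %{ $times{$name}{FN} || {} };
    next unless length $si and length $fn;
    print "CHECK: $name\n  \$SI: $si\n  \$FN: $fn\n" if $si ne $fn;
}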
I almost wish it were harder than that to create super timelines, but it's really not. Kristinn has done a fantastic job on the latest release of Log2Timeline. There are numerous other options the tool can use, and for the sake of brevity (not to mention the fact that you can read) I have not covered all potential option combinations. My advice is to take some time and play with the tool. Get to know how it works, what the output looks like, and what commands you think are the most relevant for the timelines you are creating.
Serious props to Kristinn for making this extremely useful and powerful tool free to the forensic community. He has done an outstanding job, and honestly, like Harlan's Reg Ripper, and Mandiant's Memoryze, this is a game changer.
Happy Hunting!
Thursday, July 7, 2011
Wednesday, July 6, 2011
MBR Analysis
A few weeks ago, Harlan touched on the concept of analyzing the Master Boot Record (MBR or $BOOT) for signs of malware infestation. That got me to thinking, "what would that really look like"? So, I tested it and thought I would share my results.
To recap Harlan's post, basically the MBR contains the partition tables for a Windows system. On a typical NTFS host, the primary partition that contains the operating system starts at sector 63. This may vary based on the type of system or the configuration, but generally speaking, this is pretty consistent. An easy way to check an image for the offset values is The Sleuth Kit's tool, "mmls". By running mmls against an image, you will see the offset values for the partitions.
Now, how malware comes into play here is very interesting, and very clever. Let's take a "typical" Windows NTFS system and assume that the OS partition is located where we would expect to see it, at sector 63. But what if there was a partition table set at sector 62? Would you even recognize it, or if you did, would you even care? It's not sector 63, right, and when you mount sector 63 you see the NTFS file system...plain as day...so no harm no foul, right? Wrong, and here's why.
The malware creates a partition table at sector 62 and copies the MBR, with a jump statement. The OS boots and sees the MBR at sector 62 FIRST. It reads the data and, if malware is present, executes it. It then follows the jump to sector 63, the NTFS file system is recognized, and the normal boot process resumes. When the malware runs on the infected system, the traces are NOT in the primary file system, because they are stored in another partition! Pretty slick!
After some digging around, I found a pretty nice Perl script called MBRparser, by Gary Kessler. It's easy to use and shows you exactly what you need to see when looking for MBR infections. In the screenshot below, I used Gary's tool to parse the MBR from my local Windows 7 Dell laptop.
As you can see, since I have a typical NTFS file system, my first partition starts at sector 63, exactly what I would expect to see. What I would NOT expect to see is an entry prior to sector 63. If I exported the MBR (again, $BOOT) from a target system, parsed it with MBRparser, and saw a partition entry prior to sector 63, I would immediately become suspicious.
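If you want a quick second opinion without any extra tools, the MBR partition table is simple enough to parse by hand. The sketch below (it assumes you are feeding it a raw 512-byte MBR export, or the first 512 bytes of a raw physical disk image; "parse_mbr.pl" is just a name I picked) prints the four primary partition entries and flags anything that starts before sector 63:
#!/usr/bin/perl
# parse_mbr.pl - print the four primary partition table entries from a raw MBR
# (the first 512 bytes of a physical disk image) and flag any entry that starts
# before sector 63, the usual spot for the first NTFS partition on an XP-era
# system. A minimal sketch - it does not follow extended partitions or inspect
# the boot code itself.
use strict;
use warnings;

my $file = shift or die "Usage: parse_mbr.pl <raw mbr or disk image>\n";
open my $fh, '<:raw', $file or die "Cannot open $file: $!\n";
read($fh, my $mbr, 512) == 512 or die "Could not read 512 bytes from $file\n";
close $fh;

my $sig = unpack 'v', substr($mbr, 510, 2);
printf "Boot signature: 0x%04X %s\n", $sig, ($sig == 0xAA55 ? '(OK)' : '(unexpected!)');

for my $i (0 .. 3) {
    my $entry = substr($mbr, 446 + 16 * $i, 16);
    # byte 0 = boot flag, byte 4 = partition type,
    # bytes 8-11 = starting sector (LBA), bytes 12-15 = length in sectors
    my ($boot, $type, $start, $length) = unpack 'C x3 C x3 V V', $entry;
    next if $type == 0;    # empty slot
    printf "Entry %d: type 0x%02X  bootable=%s  start sector %u  length %u\n",
           $i, $type, ($boot == 0x80 ? 'yes' : 'no'), $start, $length;
    print  "  *** starts before sector 63 - take a closer look ***\n"
        if $start > 0 and $start < 63;
}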
Now, don't think that every time you have a partition table before the NTFS file system that you have MBR malware. Some systems intentionally put partitions with vendor tools, or other data, there. So, "Don't Panic"...at least not yet. If you see something there before the NTFS file system, you can either mount it with a tool like ImDisk or FTK Imager, or you can extract the data using The Sleuth Kit's "blkls". Then you can look at the data and decide for yourself whether it's just benign vendor stuff, or whether it's malware.
The real takeaway here is to actually start looking. By adding this step to your malware detection methodology, you will increase your chances of catching an infection of this nature. And since you were likely not doing this in the first place, you have just made yourself an exponentially better investigator.