COMPUTER FORENSICS
Computer forensics is a branch of forensic science pertaining to legal evidence found in computers and digital storage media. Computer forensics is also known as digital forensics.
The goal of computer forensics is to explain the current state of a digital artifact. The term "digital artifact" can include a computer system, a storage medium (such as a hard disk or CD-ROM), an electronic document (e.g. an email message or JPEG image), or even a sequence of packets moving over a computer network. The explanation can be as straightforward as "what information is here?" or as detailed as "what is the sequence of events responsible for the present situation?"
The field of computer forensics also includes sub-branches such as firewall forensics, network forensics, database forensics and mobile device forensics.
There are many reasons to employ the techniques of computer forensics:
* In legal cases, computer forensic techniques are frequently used to analyze computer systems belonging to defendants (in criminal cases) or litigants (in civil cases).
* To recover data in the event of a hardware or software failure.
* To analyze a computer system after a break-in, for example, to determine how the attacker gained access and what the attacker did.
* To gather evidence against an employee whom an organization wishes to terminate.
* To gain information about how computer systems work for the purpose of debugging, performance optimization, or reverse-engineering.
Special measures should be taken when conducting a forensic investigation if it is desired for the results to be used in a court of law. One of the most important measures is to ensure that the evidence has been accurately collected and that there is a clear chain of custody from the scene of the crime to the investigator---and ultimately to the court. To maintain the integrity of digital evidence, British examiners comply with the Association of Chief Police Officers (ACPO) guidelines, which are made up of the following four principles:
Principle 1: No action taken by law enforcement agencies or their agents should change data held on a computer or storage media which may subsequently be relied upon in court.
Principle 2: In exceptional circumstances, where a person finds it necessary to access original data held on a computer or on storage media, that person must be competent to do so and be able to give evidence explaining the relevance and the implications of their actions.
Principle 3: An audit trail or other record of all processes applied to computer based electronic evidence should be created and preserved. An independent third party should be able to examine those processes and achieve the same result.
Principle 4: The person in charge of the investigation (the case officer) has overall responsibility for ensuring that the law and these principles are adhered to.
The Forensic Process
There are five basic steps in the computer forensics process:
1. Preparation (of the investigator, not the data)
2. Collection (the data)
3. Examination
4. Analysis
5. Reporting
The investigator must be properly trained to perform the specific kind of investigation that is at hand.
Tools that are used to generate reports for court should be validated. Many tools are available for use in the process, and the proper tool should be chosen based on the requirements of the case.
Collecting Digital Evidence
Digital evidence can be collected from many sources. Obvious sources include computers, cell phones, digital cameras, hard drives, CD-ROMs, USB memory devices, and so on. Non-obvious sources include the settings of digital thermometers, black boxes inside automobiles, RFID tags, and web pages (which must be preserved as they are subject to change).
Special care must be taken when handling computer evidence: most digital information is easily changed, and once changed it is usually impossible to detect that a change has taken place (or to revert the data back to its original state) unless other measures have been taken. For this reason it is common practice to calculate a cryptographic hash of an evidence file and to record that hash elsewhere, usually in an investigator's notebook, so that one can establish at a later point in time that the evidence has not been modified since the hash was calculated.
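As an illustration of this practice, the digest of an evidence file can be computed with a few lines of Python using the standard hashlib module. This is only a minimal sketch: the file name is hypothetical, and the choice of SHA-256 is an example rather than a requirement.

    import hashlib

    def hash_evidence_file(path, algorithm="sha256", chunk_size=1024 * 1024):
        """Compute a cryptographic digest of an evidence file, reading it in chunks."""
        digest = hashlib.new(algorithm)
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Hypothetical file name; the resulting value would be recorded in the case notes.
    print(hash_evidence_file("evidence/disk_image.dd"))

Recomputing the digest at a later date and comparing it with the recorded value shows whether the evidence file has changed in the meantime.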
Other specific practices that have been adopted in the handling of digital evidence include:
* Imaging computer media using a write-blocking tool to ensure that no data is added to the suspect device.
* Establishing and maintaining the chain of custody (a minimal record-keeping sketch follows this list).
* Documenting everything that has been done.
* Using only tools and methods that have been tested and evaluated to validate their accuracy and reliability.
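The following is a minimal sketch of how such record keeping might be supported in software; the field names and the append-only JSON-lines log are illustrative assumptions, not an established standard.

    import json
    from datetime import datetime, timezone

    def record_custody_event(log_path, item_id, action, handler, notes=""):
        """Append one chain-of-custody entry to an append-only JSON-lines log."""
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "item_id": item_id,    # e.g. an exhibit or evidence-bag number
            "action": action,      # e.g. "seized", "imaged", "transferred"
            "handler": handler,    # the person taking the action
            "notes": notes,
        }
        with open(log_path, "a", encoding="utf-8") as log:
            log.write(json.dumps(entry) + "\n")

    # Hypothetical usage:
    record_custody_event("case_042_custody.log", "HDD-001", "imaged", "J. Smith",
                         "Imaged behind a hardware write blocker; SHA-1 recorded.")

In practice the log itself, like any other record, would be kept where it cannot be altered unnoticed, for example in the investigator's notebook or a protected case-management system.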
Some of the most valuable information obtained in the course of a forensic examination will come from the computer user. An interview with the user can yield valuable information about the system configuration, applications, encryption keys and methodology. Forensic analysis is much easier when analysts have the user's passphrases to access encrypted files, containers, and network servers.
In an investigation in which the owner of the digital evidence has not given consent to have his or her media examined (as in some criminal cases) special care must be taken to ensure that the forensic specialist has the legal authority to seize, copy, and examine the data. Sometimes authority stems from a search warrant. As a general rule, one should not examine digital information unless one has the legal authority to do so. Amateur forensic examiners should keep this in mind before starting any unauthorized investigation.
Live vs. Dead analysis
Traditionally computer forensic investigations were performed on data at rest---for example, the content of hard drives. This can be thought of as a dead analysis. Investigators were told to shut down computer systems when they were impounded for fear that digital time-bombs might cause data to be erased.
In recent years there has increasingly been an emphasis on performing analysis on live systems. One reason is that many current attacks against computer systems leave no trace on the computer's hard drive---the attacker exploits only information in the computer's memory. Another reason is the growing use of cryptographic storage: the only copy of the keys needed to decrypt the storage may reside in the computer's memory, and turning off the computer will cause that information to be lost.
Imaging electronic media (evidence)
The process of creating an exact duplicate of the original evidentiary media is often called imaging. Using a standalone hard-drive duplicator or software imaging tools such as dcfldd or IXimager, the entire hard drive is completely duplicated. This is usually done at the sector level, making a bit-stream copy of every part of the user-accessible areas of the hard drive that can physically store data, rather than duplicating the filesystem. The original drive is then moved to secure storage to prevent tampering. During imaging, a write-protection device or application is normally used to ensure that no information is introduced onto the evidentiary media during the forensic process.
The imaging process is verified using the SHA-1 message digest algorithm (with a program such as sha1sum) or another hash algorithm such as MD5. At critical points throughout the analysis, the media is verified again (a step known as "hashing") to ensure that the evidence is still in its original state. In corporate environments pursuing civil claims or internal disciplinary action, such steps are generally overlooked due to the time required to perform them. They are essential, however, for evidence that is to be presented in a courtroom.
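The sketch below illustrates the combined imaging-and-verification idea in Python: the source device is read in fixed-size blocks, a bit-stream copy is written, and both the source and the image are hashed so the digests can be compared. It is an illustration only, not a substitute for a validated imaging tool; the device and image paths are hypothetical, and a write blocker is assumed to be in place.

    import hashlib

    def image_and_verify(source_device, image_path, block_size=4096):
        """Create a bit-stream copy of source_device and verify it by comparing SHA-1 digests."""
        src_hash = hashlib.sha1()
        with open(source_device, "rb") as src, open(image_path, "wb") as img:
            for block in iter(lambda: src.read(block_size), b""):
                src_hash.update(block)
                img.write(block)

        img_hash = hashlib.sha1()
        with open(image_path, "rb") as img:
            for block in iter(lambda: img.read(block_size), b""):
                img_hash.update(block)

        return src_hash.hexdigest(), img_hash.hexdigest()

    # Hypothetical usage; a write blocker is assumed between the suspect drive and the workstation.
    source_digest, image_digest = image_and_verify("/dev/sdb", "case_042_image.dd")
    assert source_digest == image_digest, "Image does not match the source media"

Matching digests show that the copy is a faithful duplicate at the time of imaging; the recorded value is then used for the later re-verification described above.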
Collecting Volatile Data
If the machine is still active, any intelligence that can be gained by examining the currently running applications is recorded. If the machine is suspected of being used for illegal communications, such as terrorist traffic, not all of this information may be stored on the hard drive. Information stored solely in RAM that is not recovered before powering down may be lost. This creates the need to collect volatile data from the computer at the onset of the response.
Several open source tools are available to conduct an analysis of open ports, mapped drives (including through an active VPN connection), and open or mounted encrypted files (containers) on the live computer system. Utilizing open source tools and commercially available products, it is possible to obtain an image of these mapped drives and the open encrypted containers in an unencrypted format. Open source tools for PCs include Knoppix and Helix. Commercial imaging tools include AccessData's Forensic Toolkit and Guidance Software's EnCase application.
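As a rough sketch of what such a live collection step might record (using the third-party psutil library as an assumption; the tools named above do considerably more), the following Python enumerates open network connections and running processes on the live machine:

    import psutil  # third-party library: pip install psutil

    def snapshot_live_system():
        """Record open network connections and running processes on a live system."""
        connections = []
        for conn in psutil.net_connections(kind="inet"):
            connections.append({
                "local": f"{conn.laddr.ip}:{conn.laddr.port}" if conn.laddr else None,
                "remote": f"{conn.raddr.ip}:{conn.raddr.port}" if conn.raddr else None,
                "status": conn.status,
                "pid": conn.pid,
            })
        processes = [p.info for p in psutil.process_iter(["pid", "name", "username"])]
        return {"connections": connections, "processes": processes}

    # Hypothetical usage: the snapshot should be written to removable media, not the suspect drive.
    snapshot = snapshot_live_system()
    print(len(snapshot["connections"]), "connections,", len(snapshot["processes"]), "processes")

Writing the output to external media keeps the footprint on the suspect machine as small as possible, although any live collection inevitably alters the system state to some degree and should be documented accordingly.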
The aforementioned open source tools can also scan RAM and Registry information to show recently accessed web-based email sites and the login/password combinations used. These tools can also yield login/password combinations for recently accessed local email applications, including MS Outlook.
In the event that partitions encrypted with EFS (Encrypting File System) are suspected to exist, the encryption keys needed to access the data can also be gathered during the collection process. With Microsoft's Windows Vista and its use of BitLocker and the Trusted Platform Module (TPM), it has become necessary in some instances to image the logical hard drive volumes before the computer is shut down.
RAM can sometimes be analyzed for prior content after power loss, although as production methods become cleaner the impurities that indicate a particular cell's charge prior to power loss are becoming less common. Data held statically in an area of RAM for long periods of time is more likely to be detectable using these methods, and the likelihood of such recovery increases with the originally applied voltage, the operating temperature and the duration of data storage. Holding unpowered RAM below −60 °C helps preserve the residual data for roughly an order of magnitude longer, improving the chances of successful recovery. However, this can be impractical during a field examination.
Analysis
All digital evidence must be analyzed to determine the type of information that is stored upon it. For this purpose, specialty tools are used that can display information in a format useful to investigators. Such forensic tools include AccessData's FTK, Guidance Software's EnCase, and Brian Carrier's The Sleuth Kit. In many investigations, numerous other tools are used to analyze specific portions of information.
Typical forensic analysis includes a manual review of material on the media, reviewing the Windows registry for suspect information, discovering and cracking passwords, keyword searches for topics related to the crime, and extracting e-mail and images for review.
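As a simple illustration of the keyword-search step, the sketch below scans a raw image for a list of byte strings and reports the offsets at which they occur. It is a minimal example with hypothetical file and keyword values; unlike the forensic suites above, it does not handle character encodings, compression or deleted-file reconstruction.

    def keyword_search(image_path, keywords, chunk_size=1024 * 1024):
        """Scan a raw image for keyword byte strings; return {keyword: sorted list of offsets}."""
        hits = {kw: set() for kw in keywords}
        overlap = max(len(kw) for kw in keywords) - 1
        tail = b""
        tail_start = 0  # absolute offset of the first byte currently held in `tail`
        with open(image_path, "rb") as img:
            while True:
                chunk = img.read(chunk_size)
                if not chunk:
                    break
                data = tail + chunk
                for kw in keywords:
                    start = 0
                    while (pos := data.find(kw, start)) != -1:
                        hits[kw].add(tail_start + pos)
                        start = pos + 1
                # Keep the last `overlap` bytes so matches spanning two chunks are not missed.
                tail_start += (len(data) - overlap) if overlap else len(data)
                tail = data[-overlap:] if overlap else b""
        return {kw: sorted(offsets) for kw, offsets in hits.items()}

    # Hypothetical usage with case-specific keywords:
    print(keyword_search("case_042_image.dd", [b"invoice", b"password"]))

Because the search runs over the raw image rather than the mounted filesystem, it can also surface hits in unallocated space, which is one reason keyword searches are performed on the image rather than on the live volume.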
Reporting
Once the analysis is complete, a report is generated. This report may be a written report, oral testimony, or some combination of the two.
The increasing use of telecommunications, particularly the development of e-commerce, is steadily increasing the opportunities for crime in many guises, especially IT-related crime. Developments in information technology have begun to pose new challenges for policing. Most professions have had to adapt to the digital age, and the police profession must be particularly adaptive, because criminal exploitation of digital technologies necessitates new types of criminal investigation. More and more, information technology is becoming the instrument of criminal activity. Investigating these sophisticated crimes, and assembling the necessary evidence for presentation in a court of law, will become a significant police responsibility. The application of computer technology to the investigation of computer-based crime has given rise to the field of forensic computing, of which this paper has provided an overview.