Network forensics is a sub-branch of digital forensics relating to the monitoring and analysis of computer network traffic for the purposes of information gathering, legal evidence, or intrusion detection. Unlike other areas of digital forensics, network investigations deal with volatile and dynamic information. Network traffic is transmitted and then lost, so network forensics is often a proactive investigation. Network forensics generally has two uses. The first, relating to security, involves monitoring a network for anomalous traffic and identifying intrusions. An attacker might be able to erase all log files on a compromised host; network-based evidence might therefore be the only evidence available for forensic analysis. The second form relates to law enforcement. In this case analysis of captured network traffic can include tasks such as reassembling transferred files, searching for keywords and parsing human communication such as emails or chat sessions.
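Captured traffic for such after-the-fact analysis is usually stored in capture files. As a minimal, hypothetical sketch (the function name is illustrative; the field layout follows the classic libpcap file format), the following parses a pcap global header, the usual first step before reassembling streams or searching for keywords:

```python
import struct

PCAP_MAGIC_LE = 0xA1B2C3D4  # classic little-endian pcap magic number

def parse_pcap_header(buf: bytes) -> dict:
    """Parse the 24-byte pcap global header; raise on an unknown magic."""
    magic, = struct.unpack_from("<I", buf, 0)
    if magic != PCAP_MAGIC_LE:
        raise ValueError("not a little-endian classic pcap file")
    major, minor, _tz, _sig, snaplen, linktype = struct.unpack_from(
        "<HHiIII", buf, 4)
    return {"version": (major, minor), "snaplen": snaplen,
            "linktype": linktype}

# Synthetic header: version 2.4, snaplen 65535, linktype 1 (Ethernet)
hdr = struct.pack("<IHHiIII", PCAP_MAGIC_LE, 2, 4, 0, 0, 65535, 1)
info = parse_pcap_header(hdr)
```

After the header, a real capture file contains per-packet records, each with its own timestamped header; tools reassemble TCP streams from those records.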
The data collected from computer crime, cyber fraud and digital crime investigations require tools that facilitate efficient data management, analysis, validation, visualization and dissemination, while preserving the intrinsic value of the data and keeping the original copy unmodified. The magnitude of data generated and shared by businesses, public administrations, industrial and not-for-profit sectors, scientific research, social media sites, sensor networks, cyber-physical systems, and the Internet of Things has increased immeasurably. This flood of complex and heterogeneous data, from textual to multimedia content, pouring in from anywhere, at any time, and from any device, marks the era of Big Data, which has become an emerging data science paradigm of multi-dimensional information mining for scientific discovery and analytics.
This course is a new subject. At first it was called computer forensics, a complement to the information security specialty. In recent years, several universities all over the world began to set up dedicated computer forensics subjects and some teaching work has started. At the same time, some research institutions also joined this field. For example, Canterbury Christ Church University offers a Master of Science degree in Forensic Computing. It brings together every important aspect of digital forensic examination to support criminal investigation involving digital evidence. The subject areas covered in this outline achieve a balance between practice and its underpinning theory. As such it is ideally suited for those who are already engaged in, or are aiming to develop, a career in law enforcement or associated areas both in the UK and elsewhere. In the U.S.A., the center of security information systems provides some related courses at Master's level. California University established a laboratory for computer security and has begun some technical research.
Abstract: Digital forensics is the process of uncovering and interpreting electronic data. The goal of the process is to preserve any evidence in its most original form while performing a structured investigation by collecting, identifying and validating the digital information for the purpose of reconstructing past events. This dissertation will discuss the need for network forensics to be practised in a legal and effective way. This study also covers types of digital forensics as well as ideas for preventing online fraud, social networking crime, etc. IDS stands for intrusion detection system, a technique by which we can monitor our network traffic, take control over suspicious activity and alert the administrator of the network. In this dissertation I also try to describe how computers communicate with each other, how they share resources and how they use the same Internet connection. This paper defines types of intrusion detection systems and includes a practical implementation of packet transmission in order to sniff bad data packets and take control over transmission between computers which share resources. The full implementation of the sniffer application software captures network data and provides sufficient means for the decision-making process of an administrator. The aim of this application is to rewrite a C# language sniffer into .NET, and to develop an application that consumes little memory on the hard disk.
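The core of any such sniffer, whatever the implementation language, is decoding the raw packet bytes it captures. A minimal illustrative sketch in Python (fed a synthetic header here, since opening a raw socket requires administrator rights; the field layout follows the standard IPv4 header from RFC 791, not the dissertation's C#/.NET code):

```python
import socket
import struct

def parse_ipv4_header(pkt: bytes) -> dict:
    """Decode the fixed 20-byte IPv4 header a sniffer sees on a raw socket."""
    ver_ihl, _tos, _tot_len, _ident, _ff, ttl, proto, _csum, src, dst = \
        struct.unpack("!BBHHHBBH4s4s", pkt[:20])
    return {
        "version": ver_ihl >> 4,
        "ihl": ver_ihl & 0x0F,           # header length in 32-bit words
        "ttl": ttl,
        "protocol": proto,               # 6 = TCP, 17 = UDP
        "src": socket.inet_ntoa(src),
        "dst": socket.inet_ntoa(dst),
    }

# Synthetic header: a TCP packet from 10.0.0.1 to 10.0.0.2, TTL 64
sample = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 40, 1, 0, 64, 6, 0,
                     socket.inet_aton("10.0.0.1"),
                     socket.inet_aton("10.0.0.2"))
info = parse_ipv4_header(sample)
```

An administrator's decision logic (drop, log, alert) would then branch on fields such as the protocol and source address.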
If the data on the hard disks being analysed is encrypted, computer forensic analysis can become difficult or even impossible. Software encryption tools keep their encryption keys in memory, and the contents of memory may be saved to a hibernation file on disk when the machine hibernates. There are many tools available with which we can decrypt encrypted files directly or by using the hibernation file. This process of decrypting encrypted storage is discussed in our research.
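One common way such tools locate key material in a memory or hibernation image is to scan for high-entropy byte windows, since keys look random against mostly structured data. A minimal sketch of that idea (the window size, step and threshold here are illustrative assumptions, not any particular tool's values):

```python
import math
from collections import Counter

def shannon_entropy(window: bytes) -> float:
    """Shannon entropy in bits per byte of a byte window."""
    counts = Counter(window)
    n = len(window)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def key_candidates(dump: bytes, size: int = 32, threshold: float = 4.5):
    """Yield offsets of high-entropy windows -- possible symmetric keys."""
    for off in range(0, len(dump) - size + 1, size):
        if shannon_entropy(dump[off:off + size]) >= threshold:
            yield off

# Toy dump: zeros, then 32 distinct bytes (maximum entropy) at offset 64
dump = bytes(64) + bytes(range(32)) + bytes(64)
hits = list(key_candidates(dump))
```

Real key-recovery tools refine this with structure checks, for example validating candidate AES key schedules, to weed out compressed or already-encrypted regions that are also high-entropy.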
Collection is the first step in the forensics process: identifying potential sources and how the data is collected. Collection involves increasingly complex processes and methods due to rapid technological developments, multiple computers, a wide variety of storage media and many computer networks with all the technologies attached to them. Surely this complexity requires different handling. After conducting the data collection process, the next step is examination, including assessing and extracting relevant information from the data collected. Once the information is extracted, the examiner performs an analysis to formulate conclusions describing the data. The analysis in question certainly takes a methodical approach to generating quality conclusions based on the availability of data. Documentation and reporting are the final stages of computer forensics. In this stage, the information resulting from the analysis process is documented and reported.
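A safeguard that spans these stages is to hash the collected data at acquisition time and re-verify the digest before examination and again at reporting. A minimal sketch (the function name and chunk size are illustrative):

```python
import hashlib
import os
import tempfile

def acquire_hash(path: str, algo: str = "sha256", chunk: int = 1 << 20) -> str:
    """Hash an evidence file in chunks so large images fit in memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Record the digest at collection time, then recompute before analysis.
fd, path = tempfile.mkstemp()
os.write(fd, b"evidence bytes")
os.close(fd)
before = acquire_hash(path)   # stored with the collection documentation
after = acquire_hash(path)    # recomputed later; must match if unmodified
os.remove(path)
```

A digest mismatch at any stage signals that the working copy no longer matches what was collected.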
When a piece of evidence is to be presented in a court, the chain of custody of the evidence must be established to guarantee that it has not been tampered with. The process makes two assumptions that do not hold by default in the virtual world. The first is that the evidence was not altered from the time it was created to the time it was collected. In a world where data is rapidly combined to produce new content, it is likely that the data found during an investigation would have undergone editing operations before it was collected as evidence. The second erroneous assumption is that a piece of evidence was created by a single individual. A virtual object is much more likely to have multiple co-authors. Note that a co-author is a principal who owns one of the processes that operated on any of the data used to create the object in question.
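One way to make the chain of custody tamper-evident in the virtual world is to link each custody record to the hash of the previous one, so retroactive edits become detectable. A minimal, hypothetical sketch (the record fields are illustrative, not a legal standard):

```python
import hashlib
import json

def add_custody_entry(chain: list, actor: str, action: str,
                      evidence_hash: str) -> None:
    """Append a custody record that commits to the previous record's hash."""
    prev = chain[-1]["entry_hash"] if chain else "0" * 64
    record = {"actor": actor, "action": action,
              "evidence_hash": evidence_hash, "prev": prev}
    record["entry_hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    chain.append(record)

chain = []
add_custody_entry(chain, "officer_1", "collected", "ab" * 32)
add_custody_entry(chain, "examiner_2", "imaged", "ab" * 32)
# Editing an earlier record would change its hash and break every later link.
```

Such a log can also record multiple co-authors of an object, since each principal who operated on the data can appear as the actor of its own entry.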
The EnCase integrated environment means that the EnCase software acquires the evidence as a verifiable, proprietary bit-stream image (called an Evidence File, EF), mounts the EF as a read-only virtual drive, and reconstructs the file system structure utilizing the logical data in the image. This integrated procedure eliminates the time-consuming sequence of steps normally associated with traditional command-line-based imaging and ensures all the evidence and meta-evidence (such as timestamps) remains forensically unaltered. The acquired EF is available as a loss-less compressed image, and includes cyclic redundancy checks and an MD5 hash value to ensure data integrity. EnCase can image different forms of media, such as SCSI/IDE drives and Zip/Jaz drives as well as RAID disk sets. The investigator can also bypass the acquisition of an EF by prescanning an evidence drive using a parallel port or 10BaseT network cable between the investigator's computer and the target computer and invoking the remote preview feature. This makes it easy for an investigator to quickly undertake a perfunctory forensic analysis of the drive without incurring the overheads of EF creation. Previewing is useful when a preliminary look at the evidence storage media is warranted by time constraints, such as during on-site inspections. Unfortunately, in preview mode the investigator is unable to save any of his/her findings, such as search results, as all of these will be lost once the computers are disconnected.
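The integrity scheme described, per-block cyclic redundancy checks plus a whole-image MD5 hash, can be sketched in a few lines; the block size and function name here are illustrative assumptions, not EnCase's actual on-disk format:

```python
import hashlib
import zlib

def image_with_checks(data: bytes, block: int = 64 * 1024):
    """Per-block CRC32s for locating corruption, plus a whole-image MD5."""
    crcs = [zlib.crc32(data[i:i + block]) & 0xFFFFFFFF
            for i in range(0, len(data), block)]
    return crcs, hashlib.md5(data).hexdigest()

# Two full 64 KiB blocks of zeros plus a short trailing block
image = b"\x00" * (128 * 1024) + b"sector data"
crcs, digest = image_with_checks(image)
```

The two mechanisms are complementary: the MD5 proves the whole image is unchanged, while the per-block CRCs localize which block was damaged if verification ever fails.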
Fig. 1.7.2 shows one way in which it can be done. A ROM is programmed to contain one cycle of a digitized sine wave over its entire address range. If the ROM is addressed by a counter which is allowed to overflow, it will produce a continuous digital sine wave whose frequency is determined by the clock rate divided by the size of the ROM. Instead of a counter, the ROM is addressed by an accumulator which adds a constant to its count on each clock. The frequency now increases in proportion to the value of the constant. Adding a small modifier to the constant allows the frequency to be raised or lowered slightly, and the result is a digitally controlled oscillator which can be incorporated in a phase-locked loop so that it locks to reference subcarrier. The result is a very clean digital subcarrier having the same sampling rate as the component digital video data. A quadrature component is easily obtained from a second ROM containing a cosine wave. Clearly it is impossible for there to be any error in the quadrature. In PAL, V-switch is obtained by numerically inverting the subcarrier samples fed to the V multiplier. In PAL and NTSC, bursts are created by adding appropriate envelopes to the modulator inputs. In SECAM a frequency-modulated chroma signal is required. It will be seen that in the configuration of Fig. 1.7.2 the frequency is proportional to the input constant. If the constant is replaced with a variable sample stream, the result is a frequency modulator. Component digital interface signals do not carry conventional sync pulses but instead have reserved bit patterns for synchronizing. The sync generator in the encoder must recreate the original sync structure using look-up tables containing the sample values needed. The chroma, luminance and syncs are added numerically.
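The accumulator-plus-ROM oscillator described above can be modelled in a few lines. In this sketch the ROM holds an illustrative 256 entries, and the output frequency is the clock rate multiplied by the increment and divided by the ROM size, exactly as the text states:

```python
import math

ROM_SIZE = 256  # one complete sine cycle stored over the whole address range
rom = [math.sin(2 * math.pi * i / ROM_SIZE) for i in range(ROM_SIZE)]

def nco(increment: int, n_samples: int) -> list:
    """Phase accumulator addressing the sine ROM.

    Output frequency = clock_rate * increment / ROM_SIZE.
    """
    acc, out = 0, []
    for _ in range(n_samples):
        out.append(rom[acc])
        acc = (acc + increment) % ROM_SIZE   # accumulator wraps like overflow
    return out

# increment 1 -> one cycle per 256 clocks; increment 2 doubles the frequency
one_cycle = nco(1, 256)
two_cycles = nco(2, 256)
```

A quadrature output needs only a second table holding a cosine, addressed by the same accumulator, which is why the quadrature cannot be in error.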
The filtering and modulation processes extend the wordlength of the sample values, so after the final addition to produce a digital composite signal the result must be carefully rounded to a wordlength suitable for the DAC in use. Simple truncation cannot be used, as this will result in distortion. Following the output DAC, a low-pass analog filter removes the images due to the sampling spectrum and sets the overall bandwidth of the composite signal.
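Truncation distorts because it always rounds downwards, leaving an error that is correlated with the signal; adding dither before rounding decorrelates the error. A rough numerical sketch of the bias (the TPDF dither and the 16-to-8-bit reduction are illustrative choices, not the book's circuit):

```python
import random

def truncate(sample: int, drop_bits: int) -> int:
    """Discard low-order bits: always rounds down, so the error is biased."""
    return sample >> drop_bits

def dithered_round(sample: int, drop_bits: int, rng: random.Random) -> int:
    """Add TPDF dither spanning +/-1 output LSB, then round."""
    lsb = 1 << drop_bits
    d = rng.randint(0, lsb - 1) + rng.randint(0, lsb - 1) - lsb
    return (sample + d + lsb // 2) >> drop_bits

rng = random.Random(0)
samples = list(range(0, 65536, 37))     # a deterministic 16-bit ramp
drop = 8                                # reduce 16-bit words to 8 bits
trunc_err = sum(truncate(s, drop) * 256 - s
                for s in samples) / len(samples)
dith_err = sum(dithered_round(s, drop, rng) * 256 - s
               for s in samples) / len(samples)
```

The truncation error averages close to minus half of the new LSB step, a systematic offset heard as distortion, while the dithered rounding error averages near zero.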
Eric Shaw is a clinical psychologist who has spent the last 20 years specializing in the psychological profiling of political actors and forensic subjects. He has been a consultant supporting manager development and organizational change, a clinician aiding law enforcement and corporate security, an intelligence officer supporting national security interests and a legal consultant providing negotiation and litigation assistance. He has also provided cross-cultural profiling for the U.S. Government on the psychological state and political attitudes of figures such as Saddam Hussein, Iranian revolutionary leaders under Khomeini, senior Soviet military commanders, as well as Yugoslav, Laotian, Cuban and other military and political leaders. In 2000 he helped develop a tool designed to help analysts identify political, religious and other groups at risk for terrorist violence. This approach examines the group's cultural context, its relationship with allied and competitive actors in the immediate political environment, its internal group dynamics and its leadership. It utilizes a range of information on the group, including their publications, web sites and internal communications. Eric has recently published articles on cyber terrorism examining the likelihood of the use of cyber tactics by traditional and emerging forms of terrorist groups.
lpd is the UNIX facility for printing (Line Printer Daemon). It allows you to submit print jobs, run them through filters, manage the print queues, and so on. lpd can accept print jobs locally or over the network, and can access various parts of the system (printers, logging daemons, etc.), hence making it a potential security hole. Historically lpd has been the source of several nasty root hacks; while it seems to have been mostly fixed now, there are still many potential denial-of-service attacks due to its function (something simple like submitting huge print jobs and running the printer out of paper). Fortunately lpd is slowly being phased out with the advent of network-aware printers; however, a huge amount of printing is still done via lpd. lpd access is controlled via /etc/hosts.equiv and /etc/hosts.lpd. You should also firewall lpd from the outside world, and if you need to send print jobs across public networks, remember anyone can read them, so a VPN solution is a good idea. lpd runs on port 515 using TCP. The hosts.lpd file should contain a list of hosts (workstation1.yourdomain.org, etc.), one per line, that are allowed to use the lpd services on the server, you might as well use
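As described above, the hosts.lpd format is simply one allowed hostname per line; a hypothetical example (the hostnames are placeholders):

```
workstation1.yourdomain.org
workstation2.yourdomain.org
printserver.yourdomain.org
```

Hosts not listed here (or in /etc/hosts.equiv) are refused lpd service.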
Digital evidence can be defined as the clues which can be recovered from digital sources and which help in digital forensics investigations. Evidence is very delicate to deal with; if it is handled improperly it can be spoiled. Relevant evidence is any evidence that makes the existence of a fact that is of consequence to the case either more or less probable than it would be without the evidence. The evidence is accurate and reliable if the substance of the story the material tells is believed and is consistent, and there are no reasons for doubt. Digital evidence can be classified, compared, and individualized in several ways. One of those ways is by the contents of the evidence. For example, investigators use the contents of an e-mail message to classify it and to determine which computer it came from. Digital forensics includes various sub-branches relating to the investigation of various types of devices, media and artefacts. Recovering data is an important part of a computer investigation.
Generally the type of operating system a company uses dictates the timing and the manner in which a computer is powered down. With some operating systems, merely pulling the power plug is the preferred method. With other systems, disconnecting the power supply without allowing the operating system to initiate internal shutdown could result in the loss of files or, in rare instances, a hard drive crash. Potential evidence may reside in typical storage areas such as spreadsheet, database, or word processing files. However, potential evidence may also be in file slack (file slack is the unused space in a data cluster that's at the end of most files), erased files, and Windows swap files. Potential evidence in these locations is usually in the form of data fragments and can be easily overwritten by booting the computer and running the operating system. For example, when the Windows operating system boots up (loads), it generates new files and opens existing files. This has the potential to overwrite and destroy data or possible evidence previously stored in the Windows swap file. To use another example, when word processing or other program files are opened and viewed, temporary files are created and overwritten by updated versions of files, making potential evidence stored in these locations subject to loss. According to the U.S. Department of Energy's First Responder's Manual, the following are the basic characteristics and procedures (broken down by operating system) that should be followed when an operating system shutdown is warranted.
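The file-slack space mentioned above follows directly from the file size and the cluster size. A small illustrative sketch (the 4 KiB cluster size is an assumption; the real value depends on the file system and volume):

```python
def file_slack(file_size: int, cluster_size: int = 4096) -> int:
    """Bytes of slack between end-of-file and the end of its last cluster."""
    if file_size == 0:
        return 0
    remainder = file_size % cluster_size
    return 0 if remainder == 0 else cluster_size - remainder

# A 10,000-byte file on 4 KiB clusters occupies three clusters (12,288
# bytes), leaving 2,288 bytes of slack where residue of whatever data
# previously occupied those sectors may survive.
slack = file_slack(10000)
```

This is why booting the machine is risky: any write that extends a file into its slack, or reuses a cluster, silently overwrites that residue.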
To prove the note was not written by Guthrie, the defense called a computer specialist who testified that in his examination of the contents of the church computer’s hard drive, there were no traces of any such note ever having been created. However, prosecutors were reminded that there was a second computer. It had been in the Guthrie home. When officers had earlier examined the home in July with a search warrant, they saw the computer, but it appeared not to have been used. They decided not to take it. Guthrie had access to it until he was arrested and jailed on August 27. Sometime after his arrest, he asked his daughter and son-in-law, Suzanne and Les Hewitt, to store some of his household belongings, including this computer and the printer to which it was connected. Now on the revelation of a suicide note, the state asked Les Hewitt to bring in the computer. He agreed. Guthrie moved to suppress the evidence gained from this computer, asserting that it was seized illegally. The court denied the motion. From examining the home computer’s hard drive, the state’s expert found a document with conspicuous similarities to the note Guthrie gave to his attorney. This document had been created and modified on August 7, 1999. Like the document portrayed to the jury as Sharon’s “suicide note,” it was dated May 13. The font appeared similar, and the margin size and spacing between words appeared identical, even the lack of a space in the date between the comma and 1999.
In order to keep a step ahead of the criminals, various investigative agencies and detectives around the globe have strengthened their computer forensic branches and equipped them with the expertise to face such crimes. As the computer has evolved to become more powerful than ever, the area of digital forensics must evolve too. The computer investigator must be able to differentiate between genuine and bogus information, extract the evidence, and translate raw data into concrete evidence leading to conviction of the offender in a court of law.
Computer forensics has become an essential tool in the identification of misuse and abuse of systems. Whilst widely utilised within law enforcement, the rate of adoption by organisations has been somewhat slower, with many organisations focusing upon traditional security countermeasures to prevent an attack from occurring in the first place. Such an approach is certainly essential, but it is also well understood that no system or network is completely secure. Therefore, organisations will inevitably experience a cyberattack. Moreover, traditional countermeasures do little to combat the significant threat that exists from within the organisation. Computer forensics is an invaluable tool for an organisation in understanding the nature of an incident and being able to recreate the crime. The purpose of this pocket book is to provide an introduction to the tools, techniques and procedures utilised within computer forensics, and in particular to focus upon aspects that relate to organisations. Specifically, the book will look to:
Availability, as defined in an information security context, assures that access to data or computing resources needed by appropriate personnel is both reliable and available in a timely manner. The origins of the Internet itself come from the need to ensure the availability of network resources. In 1958, the United States Department of Defense (DOD) created the Advanced Research Projects Agency (ARPA) following the Soviet launch of Sputnik. Fearing loss of command and control over U.S. nuclear missiles and bombers due to communication channel disruption caused by nuclear or conventional attacks, the U.S. Air Force commissioned a study on how to create a network that could function despite the loss of access or routing points. Out of this, packet-switched networking was created, and the first four nodes of ARPANET were deployed in 1969, running at the then incredibly high speed of 50 kilobits per second.
With the development of innovative multimedia technologies, the realism of computer-generated characters has achieved a very high quality level. Non-existent subjects or situations can be easily generated. Thus, in a daily-life context, there is a need for advanced tools supporting users in the identification of artificial data which may not represent reality. Although many interesting methods have been proposed to discriminate between CG and natural multimedia data, most of these methodologies do not achieve satisfactory performance in the detection of CG characters. Hence, in this doctoral study, we propose efficient techniques to distinguish between CG and natural content for this particular kind of object. Our methods are developed based on geometric forensic techniques, which exploit measures of facial shape formation and the evolution of facial animations. These solutions can be applied both to images and to videos, in a wide range of situations and contexts.
The comparison is done among five tools: Award Key Logger, Recuva, USBDeview, OpenPuff and WinHex. Comparing the tools on the basis of features, the investigation process and the IDFPM model framework, we notice that among the five tools WinHex is the best. WinHex not only has all five properties of the investigation process but also the properties defined in the IDFPM process. WinHex is an advanced tool for everyday and emergency use which can inspect and edit all kinds of files, recover deleted files or lost data from hard drives with corrupt file systems, wipe data and clone disks. WinHex also analyzes data and compares files. So, in the comparison among the five tools, WinHex is the best in terms of utilization, characterization and performance.
Hash Collision – A hash function is an algorithm used to create a unique fixed-length string from any amount of data. This process is irreversible. In computer forensics it guarantees that digital evidence has not been changed during the investigation process. However, in 2005 a student from China created a hash collision from two different inputs of data: the student managed to produce the same hash output from two different sets of inputs. Having this ability, any user could efficiently undermine the credibility of digital evidence. During this research, this method was only considered from a theoretical perspective as a potential threat to the credibility of computer forensics software.
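A common mitigation in practice is to record more than one digest, pairing legacy MD5 with a hash for which no collisions are known, such as SHA-256, so a crafted MD5 collision alone cannot pass both checks. A minimal illustrative sketch:

```python
import hashlib

def evidence_digests(data: bytes) -> dict:
    """Record both MD5 (legacy) and SHA-256 digests of a piece of evidence."""
    return {"md5": hashlib.md5(data).hexdigest(),
            "sha256": hashlib.sha256(data).hexdigest()}

a = evidence_digests(b"disk image A")
b = evidence_digests(b"disk image B")
# Ordinary distinct inputs give distinct digests, but because crafted MD5
# collisions exist, matching MD5 alone no longer proves the data is unchanged;
# the SHA-256 digest must match as well.
```

Verifying evidence then means recomputing both digests and requiring both to match the values recorded at acquisition.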