A Comprehensive Digital Forensic Investigation Model and Guidelines for Establishing Admissible Digital Evidence

MPhil Thesis

Author: Ademu, Inikpi Onechojo
Type: MPhil Thesis
Abstract

Information technology systems are attacked by offenders who use digital devices and networks to facilitate their crimes and hide their identities, creating new challenges for digital investigators. Malicious programs that exploit vulnerabilities also pose threats to digital investigators. Since digital devices such as computers and networks are used by both organisations and digital investigators, malicious programs and risky practices that contaminate the integrity of digital evidence can lead to the loss of that evidence. Digital investigators therefore face a major challenge in preserving the integrity of digital evidence: not only is there no definitive, comprehensive model of digital forensic investigation for ensuring the reliability of digital evidence, but there has to date been no intensive research into methods of providing one.
To address the issue of preserving the integrity of digital evidence, this research improves upon existing digital forensic investigation models by creating a Comprehensive Digital Forensic Investigation Model (CDFIM), a model that improves the investigation process and adds security mechanisms and guidelines to be applied during investigation. The improvement is also effected by implementing Proxy Mobile Internet Protocol version 6 (PMIPv6) with improved buffering, based on the Open Air Interface PMIPv6 (OAI PMIPv6) implementation, to provide reliable service during Mobile Node (MN) handover and to improve performance measures that minimise the loss of data, which this research identified as a factor affecting the integrity of digital evidence. The aim is to show that the integrity of digital evidence can be preserved if loss of data is prevented.
This research supports the integration of security mechanisms and intelligent software into digital forensic investigation to assist in preserving the integrity of digital evidence, and it tests CDFIM with two different attack experiments. It found that when CDFIM combined security mechanisms and guidelines with the investigation process, it was able to identify the attack and ensure that the integrity of the digital evidence was preserved. It also found that the security mechanisms and guidelines incorporated in the investigative process are useless when digital investigators ignore them, which poses a threat to the integrity of digital evidence.

Year: 2013
Publication date: Jan 2013
Publication process date: 09 May 2016
Publisher's version: INIKPI_ADEMU_FINAL_THESIS V2.pdf

https://repository.uel.ac.uk/item/85xwz


Ethical and Legal Aspects of Digital Forensics Algorithms: The Case of Digital Evidence Acquisition

DOI: https://doi.org/10.1145/3560107.3560114. ICEGOV 2022: 15th International Conference on Theory and Practice of Electronic Governance, Guimarães, Portugal, October 2022

The first step that forensic examiners perform is identifying and acquiring data. Both are among the most critical segments in the forensic process since they are sine qua non for completing the examination and analysis phases. The evidence acquisition must be managed in a deliberate, ethical and legal manner. On many occasions, the outcome of the investigation depends mainly on the relevance and precision of the evidence acquired. The goal of this research is to identify both legal and ethical issues that forensic investigators face during evidence acquisition and to design a framework using design science, which recognises and resolves the problems identified. The framework must preserve the forensic soundness of the investigation, overall integrity, effectiveness, and efficiency. The elicitation of the requirements for the framework is based on a literature review and ex-ante expert interviews, while the validation and evaluation of the framework stem from ex-post expert interviews. The designed framework aims to minimise hazardous practices that lead to negative consequences and to effectively align the new technologies in digital forensics with human expertise for improved results during the phase of digital evidence acquisition.

ACM Reference Format: Maria Ioanna Maratsi, Oliver Popov, Charalampos Alexopoulos and Yannis Charalabidis. 2022. Ethical and Legal Aspects of Digital Forensics Algorithms: The Case of Digital Evidence Acquisition. In 15th International Conference on Theory and Practice of Electronic Governance (ICEGOV 2022), October 04-07, 2022, Guimarães, Portugal. ACM, New York, NY, USA, 14 pages. https://doi.org/10.1145/3560107.3560114

1 INTRODUCTION

The affordances provided by powerful forensic software and hardware tools, combined with data science technologies and automated decision-making, have significantly improved the forensic process by reducing the amount of data examiners need to look into during a criminal investigation. Cybercrime, an illegal activity conducted using computers and the Internet [1], was third among the most significant global threats in 2018, with ransomware taking centre stage [2]. The dynamics of cybercrime are relatively high, which requires security to be not only reactive but also proactive. It also implies an urgent need for better digital forensics technologies, more "intelligent", efficient, and rigorous, to cope with the enormity of cybercrime. In this light, forensics based on machine learning, behavioural forensics, crime prediction, profiling, and surveillance aim to better combat cybercrime and improve the forensic process as a whole.

The general steps of a forensic examination process according to NIST 800-86 [3] are shown in Figure 1 below.

Figure 1. The general steps of the forensic examination process (NIST 800-86).

The acquisition phase of the forensic investigation process is critical, since identifying and collecting potential sources of evidence may significantly affect the subsequent steps in the process. Forensic experts are bound by numerous codes of ethics associated with their profession [4, 5] to ensure that the forensic phases are conducted in a way that is fair, objective, ethical, unbiased and compliant with current legislation. Forensic soundness is also crucial in order to have evidence that is honest, informed, competent and complete, while at the same time admissible in a court of law. However, many of these new technology processes and systems are not designed with those codes of ethics in mind. Nowadays, more than ever, there is a pressing need to include ethical issues and principles in order to optimise the results and benefits of potent and transformative technologies while mitigating negative implications and consequences.

The forensic process ought to be conducted in an impartial, objective, legal and ethical manner, both to minimise human-induced error as well as errors introduced by digital tools, which could negatively affect the outcome of the investigation, and to maximise the effectiveness and efficiency of the new technologies utilised in the forensic field. This task is especially critical when the creation of artefacts is involved, as the adequacy of these artefacts, or rather their absence, can lead to serious and heavily multiparametric consequences. More specifically, the evidence data and the artefacts derived from the first stages of the forensic process must fulfil several criteria ensuring that the evidence's integrity, value, legality and chain of custody, among others, are preserved and not tampered with. This problem is of general interest, as these ethical and legal issues jeopardise the veracity and credibility of evidence while also affecting the smooth conduct of the forensic process and, on many occasions, even the outcome of the investigation itself. In addition, it is imperative to preserve the forensic soundness of the evidence. As per the Daubert Standard, a list of criteria used to determine the admissibility of expert witness testimony in federal court [6], "the judge is the gatekeeper" who rules on the expert's evidence. Consequently, if the evidence is not forensically sound, it can easily be disregarded and deemed useless before a court of law. Within the scope of this research is to adequately consider and analyse all the parameters necessary to identify and resolve the ethical and legal issues which arise in digital forensics investigations during the acquisition phase (the first step in Figure 1) of the forensic process.

The presented research is structured as follows: Introduction (this section), Background, Methodology, Results, Discussion and Conclusion. The first section is a short, rounded induction into the importance and relevance of the research area and the influence of powerful technologies that push the horizon of problem-solving, inter alia in digital forensics, yet create challenging ethical problems that need to be addressed. Related work and the body of research are central to the second section of the paper, along with the ideas and paradigms presented; one can also see how the novel ideas are grounded and how they might be extended. An actual definition of the problem precedes Section 3, covering different sources of error, such as the tools being used or the procedures exercised, which could render a digital forensic investigation unsound and incomplete. The research methods are presented in Section 4: design science, since we are building an artefact (a framework in our case) consisting of various modules and their intended functionality, and surveys to elicit the requirements built into the framework. In Section 5, the results of the research are presented, with a demonstration of each phase of the framework and the interpretation of the qualitative data, along with the identification of the associated legal and ethical issues. In Section 6, the positive perceptions and the experts' evaluation of the framework are analysed and contrasted with their expectations. Finally, the authors discuss the possibility of translating the framework from an abstraction into an implemented model which does not supersede human expertise but rather serves as an assistant that improves the correctness, efficiency and effectiveness of digital investigation processes.

2 BACKGROUND

To the best of the authors' knowledge, there has not been ample research using a similar approach to resolve ethical and legal issues in the field of digital forensics. There has, however, been research aimed at resolving ethical issues which arise when novel AI techniques are used in various areas of computer science and their applications. While this body of work is potentially relevant to the presented research, since the main focus here is digital forensic science, the authors believe that the results based on the proposed methodology, and the generality of the approach, are complementary to the growing volume of knowledge in this fairly young discipline. In the remainder of this section, some of the most relevant research is presented for context and intuition.

Ward & Syversen (2008) presented a framework covering the ethical tasks found in forensic or correctional work. The authors enumerated some of the challenges found in ethical forensic practice and reasoned about the justice and proportionality of criminal punishment, about the different levels of abstraction of ethical codes and their approaches, and about what is to be expected when codes of ethics from different grounds are in conflict. The levels of ethical abstraction the authors chose are "ethical theories and concepts, ethical principles, institutional ethical rules or codes and specific ethical judgements of forensic practitioners", all of which strongly involve the notion of human dignity. The main ethical principles the authors included in their framework were beneficence and non-maleficence, fidelity and responsibility, integrity, justice, and respect for people's rights and dignity, each examined in practice along with the common pitfalls it entails. They also discussed a model developed by Bush et al. (2006) and attempted to unify the various levels of ethical abstraction to help forensic practitioners think in a consistent rather than fragmented way. Ward & Willis (2010) continued this work from a slightly different perspective, creating a framework for ethical research in the forensic and correctional sciences to help researchers address certain difficult situations instead of relying only on professional codes of ethics.

According to Floridi (1999), "Physical objects may not be affected by their manipulation, but any cognitive manipulation of information is performative. It modifies the nature of information by automatically cloning it." This is a very sensitive subject not only for digital forensics but for forensics in general, as the entropy and unpredictability introduced by the human factor are a double-edged sword for the outcome of the investigation: they can introduce bias and distortion of information, which may render the outcome misleading or worthless. The situation may be rectified by exercising rational and realistic critical thinking. Models developed to deal with ethical issues in forensic acquisition in no way pretend to eliminate human bias or intervention, but rather aim to minimise hazardous practices and behaviour that may lead to negative consequences. The combination of the two has the potential to generate the best results, based on the synergy between new technologies and human expertise. Turilli & Floridi (2009) describe in detail the ethics of information transparency and demonstrate the process of deriving information from data, which bears a direct similarity to the extraction of information by forensic examiners or tools when they acquire data for examination. Figure 2 below shows the information creation process.

Figure 2. The information creation process.

These relationships between raw data, ethical principles and the information created are pivotal in planning and developing the model central to this research. Philip Brey argues that computer ethics should study ethical issues not only in the use of computer technology but also in the technology itself [12]. This is important since, by bringing AI and data mining into forensics, we are dealing with intelligent agents which on many occasions proceed to automatic decision-making and exercise a sort of autonomy without human intervention, although they remain guided by human design. Brey claims that certain artefacts (in our case, tools or procedures) can be associated with certain recurring consequences. If this statement is generalised, it can be claimed that "particular consequences may manifest themselves in all of the central uses of the artefact." This generalisation does not always hold, and it translates into an excessively deterministic view of the artefact itself; however, it suggests that with the usage of an artefact one can expect certain consequences to be necessary or unavoidable. What Brey termed an "embedded value" can be seen as a built-in consequence that makes this outcome more controlled and predictable. For example, spyware tends to break privacy no matter how it is used; thus one can claim that it has privacy invasion as its embedded "disvalue", the antonym of an embedded value.

Considering forensic tools as such artefacts, one can make similar analogies by breaking down their operation into smaller, fragmented procedures to determine the defined values one would like to preserve while minimising the respective disvalues. Various approaches to integrating values into the design process have been presented, under the name "value-sensitive design". John Arquilla [12] raised the ethical considerations that follow cyber surveillance and other intrusive means, stating that "individual liberty and privacy may come under sharp, devaluating pressure as a result of efforts to detect, deter or defend against the various forms of cyberattack". This relates directly to what was earlier mentioned as "cyber-sleuthing" and other similar means that forensic experts sometimes use in investigations, jeopardising the credibility and lawfulness of the evidence. According to Vincent Wiegel (2010), "Different applications will require different complexity in moral awareness and reasoning." Moor (2009) sketches a continuum of increasingly rich moral agents: the ethical impact agent, the implicit ethical agent, the explicit ethical agent and the full ethical agent, the last being the human level of moral awareness with full initiative and autonomy. In our case, the framework needs to be of the first type, as it measures the potential impact of its own existence. The framework will not be autonomous or have self-decision-taking capability; rather, it will present the forensic expert with a range of possible ethical implications of their actions, tools or procedures and their respective impact. As a result, we create an opportunity for human intervention that can avoid what has been defined as a "disvalue" relative to the value that needs to be preserved each time.

In a more recent article concerning human bias and the ethics of algorithms, Martijn van Otterlo (2018) draws an analogy with physical libraries and archives as information providers. This is very relevant to our case, as the same ethical issues can emerge in computer forensics techniques. Van Otterlo states that people, often wrongly, associate algorithms with infallibility, trustworthiness and, above all, objectivity. He in fact views algorithms as heavily biased; an example of this is the black-box phenomenon: especially with complex systems, humans cannot see into the functions, data and learning processes of the algorithm upon which its decisions are based. He also claims that, with rapidly evolving technology, legal developments are too slow to catch up with the shift in moral values: "The more intelligent, autonomous or conscious an algorithm will become, the more moral values will be attributed to it, and the more ethical reasoning and behaviour will be expected of it." Current literature on possible solutions mostly deals with how to make algorithms behave appropriately, yet most solutions inevitably include human involvement. The European General Data Protection Regulation (GDPR) covers forms of algorithmic decision-making, although its effectiveness will depend on its practical application in difficult cases. Van Otterlo proposed a research strategy called IntERMeDIUM (the acronym stands for intentional, executable, reward-based, moral, declarative, inductive, utilitarian and machine), a synthesis of preceding ideas and research which aims to develop ethical learning agents by including an executable code of ethics to guide an algorithm's decisions. These properties will be taken into consideration by the authors, as they might prove useful and directly applicable to our research case. Van Otterlo argues that this is a delicate balance, as the code of ethics will need to follow and comply with each of those properties and will also need to be checked regularly to evaluate the model's effectiveness and correctness. Considering the research presented above, one easily notices that substantial research spanning several decades has analysed computer ethics in the world of technology; yet technology advances so rapidly that ethical and legal progress always appears to lag behind. In the following sections, the authors analyse the ethical issues identified from the existing literature, from a desktop analysis of commonly used forensic tools, and from the qualitative approach (questionnaire) presented in the methodology section.

3 METHODOLOGY

A large number of research strategies exist, from which the authors must choose the most appropriate according to the goals and requirements of their research. Johannesson & Perjons (2014) proposed nine research strategies (experiments, surveys, case studies, grounded theory, ethnography, action research, simulation, mathematical and logical proof, and phenomenology) for achieving the goals of each of the five design science activities. For the purposes of this research, the five activities this model proposes were organised and modified, as Figure 3 demonstrates, in order to complete each part of the process accordingly.

Figure 3. The five design science activities as organised and modified for this research.

During the first activity, the authors used the questionnaire, relevant scientific research, and information about how digital forensics tools work in order to identify and clearly state the problem. A questionnaire including open-ended questions was delivered to expert participants (digital forensic analysts, police officers, lawyers, university professors and information security professionals) to help determine the next steps. In the next phase, the context and information derived from the questionnaire, using grounded theory and content analysis, were used to specify the requirements, which are the most critical aspect of the artefact created. At this phase, and based on the same information, the authors defined what were earlier called "values", which need to be preserved, and "disvalues", which need to be minimised during the forensic acquisition stage.

Throughout the design and development phase, the authors sketched multiple potential models and frameworks using the information derived so far, combined them with the support of existing literature and brainstorming, and finally assessed all of them in order to select the one which (1) covers the most aspects and (2) fulfils the requirements specified in the second activity. The usability and functionality of the produced artefact were then demonstrated, and the artefact was given to the expert participants for evaluation in order to test its performance. The last three activities of the method were conducted iteratively in order to integrate the feedback into the design and development process.

3.1 Data Collection

According to Denscombe (2010), there are four data collection methods: questionnaires, interviews, observations and documents. Data collection for this research was conducted mainly by means of a questionnaire, to gain access to qualitative data which assisted in forming the requirements of the artefact as well as in its design and evaluation. The questionnaire included both short-answer and open-ended questions to allow for more intuitive answers.

Interviews were not chosen because of the participants' limited availability. Observations were also rejected because of the limited time within which this research was conducted, since the short time span would raise issues concerning accuracy and how well the observations reflect reality. For the same reason, documents were excluded from the data collection methods used in this research. The questionnaire was thus deemed the most appropriate data collection method, given the aforementioned limitations. As mentioned earlier, the questionnaire was completed mainly by digital forensic experts with many years of experience in the field, including police digital forensics experts, academics and researchers, as well as law experts, to give insight from a different perspective on the same matter, as another side of the same coin.

3.2 Data Analysis

The analysis of the collected data was mainly based on content analysis, with an element of grounded theory, since the answers received from the participants were rather explicit and straightforward. These two similar but slightly different methods are used to extract meaningful context from qualitative data. According to Johannesson & Perjons (2014), "in contrast to experiments, the grounded theory does not start with a hypothesis to be tested but instead with data from which a theory can be generated". Hence, through the questionnaire, the authors extracted the information necessary to proceed with the design process of the artefact. Content analysis assisted in categorising the different kinds of information acquired through the open questions of the questionnaire, with the same purpose: it helps develop relevant categories and identify "keywords" associated with them. As per Denscombe (2010), content analysis helps the researcher spot the priorities, values and ideas conveyed in the data by measuring the frequency with which they appear, while also assigning positive or negative sentiment and gauging the contextual proximity of one text part to another.
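For intuition, a minimal sketch of this kind of keyword-frequency content analysis is shown below; the category keywords and sample answers are invented for illustration and are not the authors' actual coding scheme.

```python
from collections import Counter
import re

# Hypothetical coding scheme: categories and the keywords that signal them.
CATEGORIES = {
    "privacy": ["privacy", "personal data", "gdpr"],
    "integrity": ["integrity", "hash", "tamper"],
    "tooling": ["tool", "accredited", "license"],
}

def code_answers(answers):
    """Count how often each category's keywords appear across the answers."""
    counts = Counter()
    for text in answers:
        lowered = text.lower()
        for category, keywords in CATEGORIES.items():
            counts[category] += sum(len(re.findall(re.escape(kw), lowered))
                                    for kw in keywords)
    return counts

answers = [
    "User privacy and GDPR compliance are the biggest obstacles.",
    "Without hash verification the integrity of the image is questionable.",
]
print(code_answers(answers))  # e.g. Counter({'privacy': 2, 'integrity': 2, ...})
```

Real content analysis of course also involves sentiment and contextual proximity; the frequency count above only illustrates the category-and-keyword idea.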

4 RESULTS

To define the requirements for the framework design, the authors conducted the aforementioned survey (questionnaire). While interpreting the qualitative data with the help of content analysis, the authors identified some common ground among the experts' opinions. Among the legal issues identified by the participants were user privacy, data protection, the unregulated use of non-accredited tools (OS tools) in courts of law, and the lack of documentation on how the acquisition and analysis of evidence should take place, especially when it comes to live memory acquisition. Similarly, the ethical issues identified were, again, user privacy, the retrieval of more data than required for the case (often involving sensitive data) and the involvement of third parties.

Many of the participants believed it would be helpful to have a standard protocol or procedure for handling such issues when they arise. Part of the current standard procedure appears to be consulting the law department of the institution or the district attorney. From a law perspective, though, this can be problematic (as stated by a digital forensics expert), since the people who specialise in law should be more involved in knowing how the tools work and how they acquire evidence, so that they can anticipate potential issues and work better with the forensic experts. Things mentioned by the participants that could help their work were: support for Linux RAM acquisition; ACL-based (Access Control List) case sharing, so that certain people have access only to certain parts of a case file; a clear indication of the status of the analysis and/or acquisition; simple licence management procedures; and the legal right to store and use databases of forensic data instead of prohibiting this altogether.

Some of the desired properties of evidence data mentioned by the experts were integrity, completeness, accuracy, reproducibility, simplicity, clarity and timeliness. Nearly all the experts (83.3%) deemed both integrity and accuracy to be of great importance as properties that digital evidence should have. Legitimacy was rated medium on average, not a first priority. Relevance was also not rated highly among the participants, lower than legitimacy; when dealing with large amounts of data, though, this property can become useful, if the opportunity arises. It is one of the properties that data mining-based systems try to achieve in order to help forensic experts make sense of huge amounts of data and draw their attention to the data most relevant to their case. Impartiality was rated the lowest, possibly because it can be highly subjective and hard to prove, in which case it does not contribute much to the case and can even cause adverse effects if perceived wrongly.

4.1 Definition of Requirements

Before moving on to the requirements, it is important to define some notions, properties and concepts used in the requirements section, so that it is clear what each term refers to:

Integrity: Evidence integrity refers to "the validity of information derived from the examination of physical (in our case digital) evidence, and it depends entirely upon the care with which the evidence has been protected from contamination." [17]

Accuracy: Forensic evidence accuracy is closely related to integrity but different, as it denotes the proximity between the collected evidence and how the events occurred in reality [18]. In other words, it is a measure of how reliably a piece of data represents reality.

Data minimisation (or minimality principle): "A principle that states that data collected and processed should not be held or further used unless this is essential for reasons that were clearly stated in advance, to support data privacy." [19] This is one of the fundamental principles of the European GDPR.

According to the results of the content analysis of the questionnaire, the participants identified mainly three ethical issues arising during the acquisition of digital evidence: (1) the difficulty of preserving user privacy; (2) the difficulty of preserving the minimality principle, since in most investigations examiners inevitably get access to more data than may be required, or to data of third parties which in many cases is not related to the investigation (often sensitive personal data); and (3) the use of non-accredited forensic tools, which can jeopardise the credibility of the evidence or make it inadmissible in a court of law.

Among the main legal issues identified by the participants, the first, inter alia, was the lack of permission to store information about an investigation that could facilitate its outcome. Some of the forensic experts mentioned that it would be very helpful to be able to store data related to the investigation (with appropriate protection and the help of verifiers and supervisors). The second issue was the need to preserve the minimality principle, which is not only an ethical but also a legal issue; for instance, it is one of the main principles of the European GDPR. Quoting one of the participants: "The Swedish law does not allow researchers to collect and analyse personal data from the Web, and neither is it allowed to collect personal information about possible criminal activity. So, for instance, the usage of crawlers ("a program that visits websites and reads their pages and other information in order to create entries for a search engine index" [20]) or other similar bots cannot be used without permission from the regional ethical review board of the university's region." Another participant stated that, under Swedish law, they are not allowed to keep databases storing information about each investigation. Two other participants from another European country stated that, in their country, the main problem of the same nature is that the communications of the suspect are protected during digital evidence seizure. The properties of digital evidence desired by the experts were integrity, accuracy, timeliness (being acquired quickly) and legitimacy.

All things considered, the authors defined the requirements for the designed artefact so as to address the most important and persistent issues identified by the experts and the literature so far, but also to include and promote the desired properties of digital evidence data. As mentioned earlier, in the Cambridge Handbook of Information and Computer Ethics [12], an "embedded value" is a built-in consequence. For example, a crawler has the tendency to break privacy no matter how it is used; thus, its embedded disvalue is the invasion of privacy. For the requirements, one needs to minimise the embedded disvalues of the procedures in the forensic acquisition process and consequently preserve the embedded values. The artefact designed is an "ethical impact agent", as defined by Moor (2009), since it aims to inform the forensic examiner of the possible ethical and legal implications of their actions, tools or procedures so that they can intervene and avoid the disvalues. Following the same principle, the requirements defined are shown in Table 1:

According to Johannesson & Perjons (2014), the artefact will have internal and external properties, among which must be consistency (keeping conflicts at a minimum level), modularity (the model consists of many parts which interconnect and interact) and conciseness (keeping complexity and redundancy low).

4.2 The Artefact

In this section, the functionality, construction and environment of the framework are presented. According to Johannesson & Perjons (2014), the construction comprises the internal properties of the framework: it needs to be coherent, consistent (keeping conflicts at a minimum level), modular (consisting of modules which interact and can easily be separated and recombined), concise (keeping complexity and redundancy low), and also elegant, meaning its design is pleasing and well structured. The framework also has environmental (external) properties, which describe how it interacts with other artefacts or external entities such as users.

These properties, in turn, are divided into usage, management and generic external properties, which include, among others, usability, customisation, traceability, maintainability, accountability, autonomy and efficiency. The main activities of this piece of research were development and evaluation, and their output was a set of candidate frameworks; the most suitable one, in terms of how well it fulfilled the set requirements, was then chosen. The abstract framework for this case was inspired by Saleem et al. (2014) and adapted to focus exclusively on the phase of data acquisition. The abstract form of the designed framework can be seen in Figure 4.

Figure 4. Abstract form of the designed framework.

The authors added the sub-processes of Evaluation of the Acquired Data and an initial "sieving" of the data after acquisition. There, the forensic examiner (or the system, in the case of the development of a smart agent) evaluates the data collected so far to ensure that it satisfies the requirements set in the previous section. This process is called Evidence Assessment by the U.S. Department of Justice [22], under the rationale that digital evidence should be thoroughly assessed in order to specify the scope of the investigation and then determine the appropriate course of action. The authors also deemed this very important and thus decided to include it as a separate process. Once this is done, if needed, the process is iterated so that more data can be acquired along the same path. This is done mainly under two "umbrellas": (1) applying a protocol for handling digital evidence during forensic acquisition (divided into live and offline forensic acquisition to capture the special traits of each case) and (2) preserving the embedded values mentioned in the previous section.

Following the abstract framework, the sub-processes (1), (2), (3), (4) and (5) of Figure 4 are broken down to show what they include, and are demonstrated separately below.

4.2.1 Protocol for handling digital evidence during acquisition (1). To improve the forensic soundness and integrity of the data, the authors decided to include the PIDESC model (Protecting Digital Evidence Integrity by using Smart Cards), developed by Saleem & Popov (2011). They evaluated existing security methods and found that most were based on digital hashes alone, which provide weaker security against tampering. The proposed solution preserves integrity by combining smart cards, hashes and timestamps. The essence of the procedure is as follows:

If a computer is off: (1a) Protocol for offline forensic acquisition

  • Save the original material as it is.
  • Take photos of physical evidence and a screenshot of evidence content.
  • Document temporal information, date, time etc.
  • Inject the bit-by-bit copy into the forensic computer.
  • Apply PIDESC as shown in Figure 5 .
  • Implement an Access Control List (ACL) so that only certain people have access to certain data, according to their role in the investigation of the specific case. This increases accountability and decreases the chance of accidental mistakes by people who are not authorised to access the data (fewer people handle the data without absolutely needing to). The forensic system could operate on a reference-monitor architecture, where every operation (read, write, etc.) performed by subjects on the data is monitored to prevent unauthorised modification; a minimal sketch of this idea is given after the live-acquisition protocol below.
  • Document everything.

Figure 5. The PIDESC procedure.
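For intuition only, the sketch below illustrates the flavour of the PIDESC step: hash the acquired image, bind the hash to a timestamp, and seal the record. In PIDESC the seal is a signature computed inside a smart card; here a software HMAC key stands in for the card, which is an assumption of this sketch, not the published design.

```python
import hashlib
import hmac
import json
import time

# Stand-in for the key protected inside the smart card (assumption of this
# sketch; in PIDESC the signing key never leaves the card).
CARD_KEY = b"secret-key-held-by-the-smart-card"

def seal_evidence(image_path: str) -> dict:
    """Hash a disk image and bind the hash to a timestamp with a keyed seal."""
    digest = hashlib.sha256()
    with open(image_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # read 1 MiB at a time
            digest.update(chunk)
    record = {
        "image": image_path,
        "sha256": digest.hexdigest(),
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["seal"] = hmac.new(CARD_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_seal(record: dict) -> bool:
    """Recompute the seal over the record (minus the seal) and compare."""
    payload = json.dumps(
        {k: v for k, v in record.items() if k != "seal"}, sort_keys=True
    ).encode()
    expected = hmac.new(CARD_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["seal"])
```

The seal ties the hash and the timestamp together, so altering the image, the digest or the recorded time invalidates verification, which is the property PIDESC provides with stronger, card-backed signatures.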

If a computer is on: (1b) Protocol for live forensic acquisition

At this point, in the case of a live investigation, the protocol is modified as follows:

  • Freeze the current state of the computer in order to image the RAM. This helps ensure that no modifications occur during the acquisition (although it is not bulletproof).
  • As Jones (2007) suggested: swap the hard disk with forensic hardware on the principle of a shadow drive (place a drive between the motherboard and the hard disk).
  • Take the shadow drive and inject it into the forensic computer.
  • Follow the same last three steps as in protocol (1a); a sketch of the shared ACL step follows this list.
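The ACL and reference-monitor step shared by protocols (1a) and (1b) could, as a rough sketch, look like the following; the roles, object names and audit log are hypothetical illustrations rather than part of the published framework.

```python
from datetime import datetime, timezone

# Hypothetical per-case role assignments (case-specific, as in Section 4.2.3).
ACL = {
    "case-042/disk.img": {"lead_examiner": {"read"}, "verifier": {"read"}},
    "case-042/notes.txt": {"lead_examiner": {"read", "write"}},
}

AUDIT_LOG = []  # every mediated access is recorded for accountability

def access(subject_role: str, obj: str, operation: str) -> bool:
    """Reference monitor: mediate and log every operation on case data."""
    allowed = operation in ACL.get(obj, {}).get(subject_role, set())
    AUDIT_LOG.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "role": subject_role,
        "object": obj,
        "operation": operation,
        "allowed": allowed,
    })
    return allowed

print(access("verifier", "case-042/disk.img", "read"))    # True
print(access("verifier", "case-042/notes.txt", "write"))  # False (not authorised)
```

Logging denied attempts as well as granted ones is what makes the monitor useful for accountability, not just for prevention.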

4.2.2 Preservation of embedded values (2). The embedded values defined by the requirements of this artefact were:

  • User privacy/ user data protection
  • Data minimisation

To improve user privacy, we suggest the instrument of an external review, where a small team of forensic and law experts take the role of verifiers. This should increase the objectivity of the procedure and help the forensic examiners avoid pitfalls that could compromise the admissibility of the evidence in a court of law. Data can be pseudonymised so that external reviewers can review it without the user's identity being revealed to them; this preserves unlinkability with the true identity of the person while still allowing the reviewers to see how the procedure takes place (a minimal sketch follows). The external review must also comply with existing regulation, for example Article 10 of the GDPR, in order to have the appropriate permission and ensure the lawfulness of the processing. The rationale behind this sub-process came from a few survey participants who emphasised how useful closer cooperation and cross-education between law and forensic experts would be.
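A minimal sketch of the pseudonymisation step, assuming a keyed hash so that pseudonyms are stable within a case yet unlinkable to the real identity by reviewers who do not hold the key; the key handling shown is illustrative only.

```python
import hashlib
import hmac

# Per-case secret withheld from the external reviewers (assumption of this sketch).
CASE_KEY = b"per-case secret kept from the external reviewers"

def pseudonymise(identity: str) -> str:
    """Replace an identity with a stable pseudonym; without CASE_KEY the
    reviewers cannot link the pseudonym back to the real person."""
    tag = hmac.new(CASE_KEY, identity.encode(), hashlib.sha256).hexdigest()[:12]
    return f"subject-{tag}"

print(pseudonymise("alice@example.org"))  # same input always yields the same pseudonym
```

The stability of the pseudonym is what lets reviewers follow one subject across documents, while the keyed construction preserves the unlinkability property described above.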

To achieve data minimisation, the authors tie this to sub-process (1), where an Access Control List (ACL) is implemented to prevent access by people who do not need to lay eyes on the data. It is very important in this process to ensure that the forensic examiners have signed, and comply with, the respective forensic science code of ethics.

4.2.3 Evaluation of the acquired data (3). As mentioned earlier, this sub-process was inspired by the U.S. Department of Justice's "assessment of data", as well as by Grobler & von Solms (2007), who suggested a best-practice approach.

This sub-process includes two dimensions:

  • Laws & Regulations: where the forensic experts assess the nature of the evidence data according to cybercrime law, admissibility in court and other legislation with the help of a district attorney.
  • Scope: where the scope of the case is determined in order to avoid unnecessary hazards and include the particularities of each case. In this light, the assignment of roles to the experts is case-specific in order to implement the access control to the data (as mentioned in the protocol of evidence acquisition (1a) and (1b)).

4.2.4 "Sieving" of the data (4). At this point, the data is sieved to remove data which is irrelevant to the specific case. Protecting metadata, such as third-person information, is important: if it is irrelevant to the case, it should not be accessible to everyone. Such data must be anonymised so that it can still be worked with if it can contribute to the case, but without revealing the identity of the user, as this could undermine the admissibility of the evidence in court. A rough sketch is given below.
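As a rough sketch, and under the assumption of a simple keyword-based notion of case scope (the scope keywords and redaction pattern below are invented for illustration), sieving plus anonymisation might look like this:

```python
import re

SCOPE_KEYWORDS = {"invoice", "wallet", "transfer"}   # hypothetical case scope
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")      # third-party identifier pattern

def sieve(items: list[str]) -> list[str]:
    """Drop items outside the case scope; anonymise email addresses in the rest."""
    kept = []
    for text in items:
        if any(kw in text.lower() for kw in SCOPE_KEYWORDS):
            kept.append(EMAIL.sub("<redacted-email>", text))
    return kept

items = ["Invoice sent to carol@example.org", "Holiday photos from 2019"]
print(sieve(items))  # ['Invoice sent to <redacted-email>']
```

In practice the scope test would be defined in sub-process (3) rather than by a fixed keyword list, and the redaction could be replaced by the keyed pseudonymisation of Section 4.2.2 when the third-party data might still contribute to the case.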

4.2.5 Return (5). If there is a need, a return to the acquisition phase, following the protocols presented in (1) and the preservation of embedded values in (2), warrants a repeat of the process. Everything is documented, and if no further iteration is required, the investigation moves on to the forensic analysis stage.

4.3 Evaluation of the Artefact

One of the main goals of the evaluation is to determine the extent to which the artefact addresses the problem it was initially designed to solve [15]. In addition, the evaluation helps determine whether the requirements set before the design of the artefact were met. The first evaluation strategy for this piece of research is a naturalistic ex-ante evaluation, since the artefact has not yet been used. Ex-ante evaluation is the most suitable first type of evaluation for the artefact, as it is fast and can provide a good first impression of an early version. However, it is important to be cautious and ensure the artefact is re-evaluated at a later point, to avoid situations where an optimistic view suggests a much better artefact than exists in reality. As mentioned earlier, the evaluation strategy is also naturalistic, since the artefact aims to be used in real conditions by forensic experts. An ex-post evaluation of the artefact is being completed with the involvement of experts in the areas of forensics, information security and law enforcement.

5 DISCUSSION

The purpose of this piece of research was to create a framework for the acquisition phase of the forensic procedure that resolves issues of an ethical and legal nature which might jeopardise important properties of the digital evidence. The framework was positively perceived by all the forensics, information security, law and criminology experts during the evaluation phase (ex-ante and ex-post), and they all valued its potential contribution to improving forensic acquisition in terms of preserving the critical digital evidence properties and embedded values of each investigation. The designed framework is intended as a first step towards an innovative solution to this research problem, on the grounds that similar research on this specific topic is still in its infancy. As mentioned earlier, while there has been extensive research on the ethicality and legality of algorithms and computers, there has not been a clear line of progress in embedding this knowledge into digital and cyber forensics to resolve acquisition-specific issues.

While, according to the forensic and information security experts, the designed framework seems promising for delivering positive results and improvements, there are still many aspects to take into consideration before jumping to overly optimistic conclusions. Ethics is an area that requires special caution and treading lightly, as any differing definition or viewpoint can have a "walking on ice" effect. Any framework or research that involves ethics, even more so when the outcome could have tremendous consequences, ought to be thoroughly verified and tested, while making room for the flexibility and adaptability needed to accommodate the many-sided specificities of human nature. As one of the experts added: "a framework that is too rigorously applied could be just as damaging as the lack of one". For this reason, the designed framework by no means aims to eliminate human intervention altogether, but rather to minimise hazardous practices and behaviours that lead to negative consequences and to align the new knowledge and technologies effectively with human expertise. Another aspect, and at the same time a limitation of this research, is that, owing to the short timeframe within which the research was conducted, there has not yet been sufficient evaluation and improvement of the designed artefact. Continuous feedback and evaluation of the framework are pivotal to ensure that all possible failures and problematic areas are taken into consideration and, at the same time, to embed more knowledge into it in order to make it more adaptable, flexible and robust against misuse. However good the a priori analysis of the framework may be, emergent issues can show up at any time, and it is therefore very important to spot them early and address them through this iterative procedure.

The purpose of this research was to contribute to the body of knowledge on the ethical and legal aspects of forensic algorithms and procedures, focusing on the acquisition phase. As discussed earlier, the research work was subject to limitations of different kinds, mainly of a temporal nature: the subject presented is one of great complexity, with various interacting parameters involved, so the time dedicated to it was rather short considering the vastness of the topic. However, according to the evaluators of the designed framework, it marks a beginning and, with the right considerations in mind, has the potential to develop into a possible solution for the problem examined.

6 FUTURE PERSPECTIVE

Future research on this topic could address and include the implementation of a model which makes use of existing technologies, for instance supervised learning algorithms, and could improve the work of forensic experts in terms of effective selection of information and time consumption. The model would not pretend to supersede human expertise with machine decisions but would work in a complementary manner, for more efficient and rigorous investigations. This aspect was also pointed out by two of the ex-post evaluation experts, which underlines its significance and relevance.

In the same light, more research could be conducted on process (2), "Preservation of Embedded Values", where the implementation of access control would prevent unauthorised access to data that no forensic examiner needs to access. The framework needs further and more thorough evaluation, particularly in the areas which might be potential sources of conflict among the domain experts. In addition, practitioners and future researchers could analyse and compare different implementations of access control to identify and assess the most applicable, and eventually the best, solution, which can be either case-specific or general with a high degree of adaptability.

ACKNOWLEDGMENTS

Special thanks to my academic supervisors at Stockholm University and the University of the Aegean, as well as the Information Systems Laboratory for their valuable support, guidance, and resources.

REFERENCES

[1] What is cybercrime? Definition from SearchSecurity. Retrieved September 10, 2019, from https://www.techtarget.com/searchsecurity/definition/cybercrime
[2] World Economic Forum. 2019. The Global Risks Report 2018. Retrieved February 10, 2019, from https://www.weforum.org/reports/the-global-risks-report-2018
[3] National Institute of Standards and Technology. 2016. Guide to Integrating Forensic Techniques into Incident Response. Computer Security Division, Information Technology Laboratory, NIST, Gaithersburg, MD 20899-8930.
[4] California Association of Criminalists. 2009. A Model for a National Code of Ethics in the Forensic Sciences. Recommendation 7, NAS Report.
[5] Northwest Association of Forensic Scientists (NWAFS). 2007. The Code of Ethics of the Northwest Association of Forensic Scientists. Salt Lake City, UT.
[6] The Expert Institute. 2019. The Daubert Standard: A Guide to Motions, Hearings, and Rulings. Retrieved February 12, 2019, from https://www.theexpertinstitute.com/the-daubert-standard-a-guide-to-motions-hearings-and-rulings/
[7] Ward, T., Syversen, K. 2008. Human Dignity and Vulnerable Agency: An Ethical Framework for Forensic Practice. Elsevier (ISSN 1359-1789).
[8] Bush, S. S., Connell, M. A., Denney, R. L. 2006. Ethical Practice in Forensic Psychology: A Systematic Model for Decision Making. American Psychological Association, Washington, DC.
[9] Ward, T., Willis, G. 2010. Ethical issues in forensic and correctional research. Aggression and Violent Behavior 15(6), 399-409. https://doi.org/10.1016/j.avb.2010.07.002
[10] Floridi, L. 1999. Information Ethics: On the Philosophical Foundation of Computer Ethics. Kluwer Academic Publishers.
[11] Turilli, M., Floridi, L. 2009. The Ethics of Information Transparency. Springer Science+Business Media B.V.
[12] Floridi, L. (ed.). 2010. The Cambridge Handbook of Information and Computer Ethics. Cambridge University Press.
[13] Moor, J. 2009. Four kinds of ethical robots. Philosophy Now 72, 12-14.
[14] van Otterlo, M. 2018. Gatekeeping Algorithms with Human Ethical Bias: The Ethics of Algorithms in Archives, Libraries and Society.
[15] Johannesson, P., Perjons, E. 2014. An Introduction to Design Science. Springer. DOI: 10.1007/978-3-319-10632-8_13
[16] Denscombe, M. 2010. The Good Research Guide. McGraw-Hill/Open University Press, Maidenhead, England.
[17] Wilenet.org. 2017. Evidence Integrity. Retrieved April 30, 2019, from https://wilenet.org/html/crime-lab/physevbook/chapter1-evidence-integrity-2017.pdf
[18] McKemmish, R. 2008. In: Ray, I., Shenoi, S. (eds.), Advances in Digital Forensics IV. IFIP International Federation for Information Processing, Vol. 285. Springer, Boston, pp. 3-15.
[19] Experian.co.uk. 2019. What is data minimisation? Retrieved April 30, 2019, from https://www.experian.co.uk/business/glossary/data-minimisation/
[20] TechTarget.com. 2005. Crawler. Retrieved April 30, 2019, from https://searchmicroservices.techtarget.com/definition/crawler
[21] Saleem, S., Popov, O., Baggili, I. 2014. Extended Abstract Digital Forensics Model with Preservation and Protection as Umbrella Principles. Procedia Computer Science 35, 812-821.
[22] U.S. Department of Justice, Office of Justice Programs. 1994. Forensic Examination of Digital Evidence: A Guide for Law Enforcement. Retrieved May 2, 2019, from http://www.ojp.usdoj.gov/nij
[23] Saleem, S., Popov, O. 2011. Protecting Digital Evidence Integrity by Using Smart Cards. In: Digital Forensics and Cyber Crime. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, Vol. 53. Springer, pp. 110-119.
[24] Jones, R. 2007. Safer Live Forensic Acquisition. Computer Science Laboratory, University of Kent at Canterbury.
[25] Grobler, M., von Solms, S. 2007. Modelling Live Forensic Acquisition. Proceedings of the Fourth International Workshop on Digital Forensics & Incident Analysis (WDFIA 2009).
[26] Aditya, K., et al. 2018. Enabling Trust in Deep Learning Models: A Digital Forensics Case Study. School of Computer Science, University College.
[27] March, S., Smith, G. 1995. Design and Natural Science Research on Information Technology. Information and Decision Sciences Department, Carlson School of Management, University of Minnesota.
[28] Barger, R. 2008. Computer Ethics: A Case-Based Approach. Cambridge University Press.
[29] Bostrom, N., Yudkowsky, E. 2009. The Ethics of Artificial Intelligence. In: Frankish, K., Ramsey, W. (eds.), Cambridge Handbook of Artificial Intelligence. Cambridge University Press, New York.


ICEGOV 2022, October 04–07, 2022, Guimarães, Portugal

© 2022 Copyright held by the owner/author(s). ACM ISBN 978-1-4503-9635-6/22/10. DOI: https://doi.org/10.1145/3560107.3560114


An Insightful Analysis of Digital Forensics Effects on Networks and Multimedia Applications

  • Review Article
  • Published: 31 January 2023
  • Volume 4, article number 186 (2023)


  • Aishwarya Rajeev, ORCID: orcid.org/0000-0003-1938-3964
  • P. Raviraj


Humans have benefited greatly from technology, which has helped to raise standards of living and make important discoveries, but its use also carries many hazards. Digital video delivered through mobile smartphone applications such as WhatsApp and YouTube, as well as through web-based multimedia platforms, is becoming increasingly important, and global security issues are arising with it. These difficulties can cause significant problems, especially in cases where multimedia is a crucial factor in criminal decision-making, such as child pornography and movie piracy. Consequently, copyright protection and video authentication are required in order to strengthen the reliability of using digital video in daily life. In a legal dispute, a tampered video may supply the evidence that convicts someone of a violation or clears a guilty party of wrongdoing. Hence, it is crucial to develop reliable forensic techniques that would enhance justice administration systems and enable them to reach just verdicts. This article discusses numerous forensic analysis fields, including network forensics, audio forensics, and video forensics. In this study, algorithms such as Random Forest, Multilayer Perceptron (MLP), and Convolutional Recurrent Neural Networks (CRNN) are used to implement different types of forensic analysis. Image fusion is also used, since it can provide more information than a single image and extract features from the original images. The study concludes that Random Forest provides the best results for network forensic analysis, with an accuracy of 98.02 percent. A lot of work has been done in recent years on video source authentication, and through an analysis of current methods and machine-learning strategies the study aims to provide a thorough summary of that work.
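For intuition, the sketch below shows the general shape of such a Random Forest pipeline for network-flow classification using scikit-learn; the flow features and synthetic data are invented stand-ins, and the 98.02 percent figure above refers to the reviewed study's own dataset, not to this toy example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Invented flow features: [duration_s, bytes_sent, bytes_received, packet_count]
rng = np.random.default_rng(0)
X_benign = rng.normal([2, 5e3, 8e3, 40], [1, 2e3, 3e3, 15], size=(500, 4))
X_attack = rng.normal([0.2, 6e4, 1e3, 400], [0.1, 2e4, 5e2, 100], size=(500, 4))
X = np.vstack([X_benign, X_attack])
y = np.array([0] * 500 + [1] * 500)  # 0 = benign flow, 1 = attack flow

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(f"accuracy: {accuracy_score(y_test, clf.predict(X_test)):.4f}")
```

On real network forensic data the features would come from captured traffic (e.g. flow records) and the labels from annotated incidents, which is where the reported accuracy figures in the surveyed work originate.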




Funding: This article has not received any funding.

Author information

Authors and Affiliations

GSSSIETW, Affiliated to VTU Belagavi, Mysuru, Karnataka, India

Aishwarya Rajeev

Department of CSE, CIT, Affiliated to VTU Belagavi, Ponnampet, Karnataka, India

P. Raviraj

Corresponding author

Correspondence to Aishwarya Rajeev.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This article is part of the topical collection “Advances in Computational Intelligence for Artificial Intelligence, Machine Learning, Internet of Things and Data Analytics” guest edited by S. Meenakshi Sundaram, Young Lee and Gururaj K S.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article

Rajeev, A., Raviraj, P. An Insightful Analysis of Digital Forensics Effects on Networks and Multimedia Applications. SN COMPUT. SCI. 4, 186 (2023). https://doi.org/10.1007/s42979-022-01599-8

Received: 07 October 2022

Accepted: 17 December 2022

Published: 31 January 2023

DOI: https://doi.org/10.1007/s42979-022-01599-8

Share this article

Anyone you share the following link with will be able to read this content:

Sorry, a shareable link is not currently available for this article.

Provided by the Springer Nature SharedIt content-sharing initiative

Keywords: Web-based multimedia platforms · Copyright protection · Image fusion · Machine learning strategies

Forensic Focus


Digital Forensic Evidence And Artifacts: Recent News And Research


This month’s academic research reflects two aspects of the changing digital forensics industry: new ways to think not just about digital artifacts, but also about broader investigative processes — including interagency cooperation.

This round-up article includes three open-access articles from the in-progress August and December issues of the journal Forensic Science International: Digital Investigation:

  • Structured decision making in investigations involving digital and multimedia evidence (Ryser, Spichiger, Casey)
  • The role of evaluations in reaching decisions using automated systems supporting forensic analysis (Bollé, Casey, Jacquet)
  • Digital forensics as a service: Stepping up the game (van Beek, van den Bos, Boztas, van Eijk, Schramp, Ugen)

Also included are two pieces of research focusing on artifacts:

  • The paper “A Two-Stage Model for Social Network Investigations in Digital Forensics” (David, Morris, Appleby-Thomas), available from the Journal of Digital Forensics, Security and Law.
  • DFIR Review published “Parsing Google’s Now Playing History on Pixel Devices,” in which Kevin Pagano asks what information is recoverable from the use of the Now Playing feature on Google Pixel phones.

Bringing structure to forensic evidence evaluations

One of the most significant themes to have emerged in recent academic research is the need to improve transparency and, ultimately, trust in digital forensic evidence. Beyond standardizing some aspects of the field, other proposals aim to take a more structured scientific approach to digital forensic evidence.

Pointing out that the results of forensic examination support many different kinds of decision-making during an investigation — operational, legal, and so forth — “Structured decision making in investigations involving digital and multimedia evidence,” authored by Elénore Ryser, Hannes Spichiger, and Eoghan Casey, proposes “a logically structured framework for scientific interpretation.”


This kind of framework, applied at all stages of the investigative process, could reduce the risk of mistakes stemming from “information overload, inaccuracy, error and bias.” That’s because decisions throughout investigations are based on limited information at any given point.

However, the authors wrote, forensic examiners can manage the uncertainty introduced by these limits. Relying on a hypothetical case of source camera identification based on real-world investigations, their framework offers a way to evaluate different explanations for the presence or absence of information.

By assigning different values to uncertain data, their decision-making — or the data they provide to investigators to make decisions — can improve. It lessens the risk of overestimating the reliability of digital and multimedia evidence, and makes the evidence, well, more “forensic.”
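In practice, “assigning different values to uncertain data” often takes the form of a likelihood ratio over competing hypotheses. A minimal sketch of that evaluation, with invented probabilities standing in for the values an examiner would actually assign:

```python
# Sketch of structured evaluation: report a likelihood ratio over two
# competing hypotheses instead of a categorical "match"/"no match".
# The probabilities below are invented for illustration.

def likelihood_ratio(p_e_given_h1: float, p_e_given_h2: float) -> float:
    """LR = P(E | H1) / P(E | H2)."""
    return p_e_given_h1 / p_e_given_h2

# H1: the questioned image came from the seized camera.
# H2: the questioned image came from another camera of the same model.
lr = likelihood_ratio(p_e_given_h1=0.85, p_e_given_h2=0.02)
print(f"LR = {lr:.1f}")  # LR > 1 supports H1; LR < 1 supports H2
```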

That’s even more important as automation begins to be implemented, as authors Timothy Bollé, Eoghan Casey, and Maëlig Jacquet describe in “The role of evaluations in reaching decisions using automated systems supporting forensic analysis.”

Their paper extends the concept of structured evaluation from human analysis to automated systems (including, but not limited to, those relying on machine learning approaches; for example, the algorithms that classify child exploitation material, identify faces, or detect links between related crimes).

Besides reducing the risk of undetected errors or bias, evaluating these systems’ outputs could improve their performance, understandability, and the forensic soundness of decisions made using them. To that end, the authors provided a set of recommendations for automated forensic system design:

  • System performance should be evaluable based on whether the system is fit for purpose for a given forensic question. For example, the authors wrote, a facial recognition system might show whether an image or frame contains an object or a person, but cannot necessarily identify a specific object or person. 
  • Automated systems that support forensic analysis should be designed with understandability and transparency “baked in.”
  • One way to do this is for the system to “guide users through the forensic evaluation and decision making steps to be sure that they can understand and explain the result in a clear, complete, correct and consistent manner.”
  • The context for information should be retained throughout examination.
  • It should be possible to formulate explicit hypotheses, with any automated steps clearly described.

These recommendations provide a foundation for an automated system whose results are easier to evaluate in more structured ways.
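One way to read these recommendations is as a data-structure requirement: every automated result should travel with its hypotheses, context, and an examiner-readable evaluation alongside the raw score. A minimal sketch, with field names that are assumptions rather than any specific platform's schema:

```python
# Sketch of "understandability baked in": an output record that keeps the
# forensic question, competing hypotheses, context, and evaluation next to
# the raw model score, so a human can audit the decision later.
from dataclasses import dataclass, field

@dataclass
class ForensicFinding:
    artifact: str                      # e.g. path of the frame or file examined
    question: str                      # the forensic question being answered
    hypotheses: tuple[str, str]        # competing explanations (H1, H2)
    score: float                       # raw model output
    evaluation: str                    # examiner-readable interpretation
    context: dict = field(default_factory=dict)  # acquisition details, tool versions

finding = ForensicFinding(
    artifact="frame_0421.png",
    question="Does this frame depict person X?",
    hypotheses=("The frame depicts person X", "The frame depicts someone else"),
    score=0.91,
    evaluation="Score supports H1, but the system cannot identify a specific person",
    context={"model": "face-matcher v2.3 (hypothetical)", "source": "CCTV export, camera 7"},
)
print(finding)
```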

Digital forensics as a service (DFaaS) goes international

For FSI:DI’s December issue, in “Digital forensics as a service: Stepping up the game,” coauthors H.M.A. van Beek, J. van den Bos, A. Boztas, E.J. van Eijk, R. Schramp, and M. Ugen described how the Netherlands Forensic Institute (NFI) implemented “digital forensics as a service” (DFaaS) via the Hansken platform.

The paper — the third and final of three about DFaaS — highlights how DFaaS supports digital forensic knowledge sharing, digital traces’ contextualization, and standardization.

Implemented since 2010 — first under the name “XIRAF” — Hansken was designed to minimize case lead time, maximize coverage of seized digital data, and efficiently mobilize specialists, all in a centralized environment based on security, privacy and transparency principles.

As of 2019, Hansken includes law enforcement agencies from outside the Netherlands. Having withstood a 2016 judicial review, the platform is now used in over 1000 cases, 100 of which are being investigated concurrently. Among these are cases with data from more than 1000 devices and 100+ terabytes of raw material.

Lessons learned “from an organizational, operational and development perspective in a forensic and legal context” include:

  • The DFaaS business case is hard to make owing to hidden operational costs associated with transitioning from a traditional way of working to a centralized mode, making it difficult to measure and compare the costs.
  • All users’ needs must be taken into account so that they can experience the kinds of benefits that will enable them to embrace the changes to their process, especially when that involves relinquishing control over some parts of the process.
  • Working in an agile way in a broad governmental context is difficult owing to bureaucracy. Detailed documentation is needed to record benefits and mitigate risks, and decision making can be slow: the opposite of agile.
  • Continuous development — where new features become available once every few days or weeks, and changes are made to underlying third-party technologies — can frustrate forensic investigations.
  • A monolithic platform will never support all case-specific needs because the pace of technological change itself is too rapid to keep up with. The authors found that open source or commercial tools and case-specific scripts were still needed to process the digital evidence.
  • A DFaaS platform does not replace a digital forensics expert. Because Hansken “brings [tactical] digital evidence in reach of laymen,” the authors cautioned that critical review of the way forensic artifacts are interpreted and labeled remains necessary.
  • DFaaS must serve all stakeholders in a criminal case, including the defense, taking into account the need to limit access for contraband or seized digital currency.

New approaches to social media forensic analysis and an old(-ish) Google Pixel artifact

At the Journal of Digital Forensics, Security and Law, Cranfield University’s Anne David, Sarah Morris, and Gareth Appleby-Thomas propose “A Two-Stage Model for Social Network Investigations in Digital Forensics” to identify and contextualize features from social networking activity artifacts.

Their model focuses on understanding a user’s browser activity and the types of artifacts it can generate, and how activity and artifacts are linked.

First, URLs are identified and recovered from disk, and their features — such as the social network site visited or the actions performed by the user (search, follow), which “can be used to infer user activity or allude to the user’s intent” — are extracted from them.

In the second stage, artifacts are corroborated, adding supplementary information about the URL feature extraction artifacts.
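A minimal sketch of the first stage, URL feature extraction, assuming a couple of common URL shapes; the patterns and feature names are illustrative, not the authors' full feature set:

```python
# Sketch: extract investigation-relevant features from a recovered
# social-network URL (site visited, inferred action, search terms).
from urllib.parse import urlparse, parse_qs

def extract_url_features(url: str) -> dict:
    parsed = urlparse(url)
    features = {
        "site": parsed.netloc.removeprefix("www."),  # social network visited
        "action": None,                              # inferred user activity
        "query_terms": parse_qs(parsed.query).get("q", []),
    }
    path = parsed.path.lower()
    if "/search" in path or features["query_terms"]:
        features["action"] = "search"
    elif "/follow" in path:
        features["action"] = "follow"
    return features

print(extract_url_features("https://www.twitter.com/search?q=forensics"))
# {'site': 'twitter.com', 'action': 'search', 'query_terms': ['forensics']}
```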

The outcomes, the authors wrote, include:

  • prioritizing social network artifacts for further analysis through URL feature extraction
  • determining social connections or relationships in an investigative context
  • discovering how recovered artifacts came to be, and how they can successfully be used as evidence in court

At DFIR Review, Kevin Pagano explores in “Parsing Google’s Now Playing History on Pixel Devices” what information is recoverable from the use of the little-discussed Now Playing feature on Google Pixel phones.

Part of the Pixel 2 and Pixel 2 XL launch in 2017 and included in every Pixel phone release since, Now Playing is a “baked in app/feature” that allows Google to recognize music playing nearby. With this history stored locally, Now Playing can offer valuable pattern-of-life data.
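As a rough illustration of the pattern-of-life angle, the sketch below buckets recognition timestamps by hour of day. The SQLite path, table, and column names are hypothetical stand-ins; consult Pagano's write-up for the actual on-device artifact layout.

```python
# Sketch: hour-of-day histogram of Now Playing recognitions, as a simple
# pattern-of-life view. Database schema here is hypothetical.
import sqlite3
from collections import Counter
from datetime import datetime, timezone

conn = sqlite3.connect("now_playing_history.db")   # hypothetical export
rows = conn.execute("SELECT timestamp_ms, title FROM history").fetchall()

by_hour = Counter(
    datetime.fromtimestamp(ts / 1000, tz=timezone.utc).hour for ts, _ in rows
)
for hour in sorted(by_hour):
    print(f"{hour:02d}:00  {'#' * by_hour[hour]}")
```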

More from FSI: Digital Investigation

For subscribers, the August issue of FSI:DI also includes:

  • The challenge of identifying historic ‘private browsing’ sessions on suspect devices (Horsman)
  • A survey on digital camera identification methods (Bernacki)
  • Detecting child sexual abuse material: A comprehensive survey (Lee, Ermakova, Ververis, & Fabian)
  • Forensic speaker recognition: A new method based on extracting accent and language information from short utterances (Saleem, Subhan, Naseer, Bais, Imtiaz)
  • Smart contracts applied to a functional architecture for storage and maintenance of digital chain of custody using blockchain (Petroni, Gonçalves, de Arruda Ignácio, Reis, Martins) (Note: for more about the “blockchain of custody,” the Council of Europe’s Project LOCARD published a description of how blockchain technology is useful for digital evidence processing .)
  • Digital forensic tools: Recent advances and enhancing the status quo (Wu, Breitinger, O’Shaughnessy)

The December issue is in progress, but so far also includes:

  • Towards a conceptual model for promoting digital forensics experiments (Oliveira Jr, Zorzo, Neu)
  • A study on the decryption methods of telegram X and BBM-Enterprise databases in mobile and PC (G. Kim, M. Park, Lee, Y. Park, J. Kim)
  • A blockchain based solution for the custody of digital files in forensic medicine (Lusetti, Salsi, Dallatana)

Finally, JDFSL also included a paper on cryptography, passwords, and the U.S. Constitution’s Fifth Amendment; we’ll cover this in our upcoming quarterly Forensic Focus Legal Update.



Computer Science > Cryptography and Security

Machine Learning in Digital Forensics: A Systematic Literature Review

Abstract: Development and exploitation of technology have led to the further expansion and complexity of digital crimes. On the other hand, the growing volume of data and, subsequently, of evidence is a severe challenge in digital forensics. In recent years, the application of machine learning techniques to identify and analyze evidence has been on the rise across digital forensics domains. This paper offers a systematic literature review of the research published in major academic databases from January 2010 to December 2021 on the application of machine learning in digital forensics; to the best of our knowledge, no previous review has been as comprehensive. The review also identifies the domains of digital forensics and the machine learning methods that have received the most attention in previous papers, and introduces the remaining research gaps. Our findings demonstrate that image forensics has obtained the greatest benefit from machine learning methods, compared to other forensic domains, and that CNN-based models are the most important machine learning methods being increasingly used in digital forensics. We present a comprehensive mind map to provide a proper perspective on the analytical results. Furthermore, visual analysis has been conducted based on the keywords of the papers, yielding different thematic relevance topics. This research gives digital forensics investigators, machine learning developers, security researchers, and enthusiasts a broad view of the application of machine learning in digital forensics.
Comments: 99 pages, 10 figures, 3 tables
Subjects: Cryptography and Security (cs.CR)
Cite as: arXiv:2306.04965 [cs.CR]
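To make the review's headline finding concrete, here is a minimal sketch of the kind of small CNN classifier that dominates image-forensics work; the architecture, input size, and class count are assumptions for illustration only:

```python
# Illustrative-only PyTorch CNN: a small classifier over fixed-size image
# patches, e.g. for manipulated-vs-original decisions.
import torch
import torch.nn as nn

class PatchForensicsCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)  # for 64x64 input

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

model = PatchForensicsCNN()
logits = model(torch.randn(4, 3, 64, 64))  # batch of four 64x64 RGB patches
print(logits.shape)                        # torch.Size([4, 2])
```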


Digital Forensics: Recently Published Documents


AutoProfile: Towards Automated Profile Generation for Memory Analysis

Despite the considerable number of approaches proposed to protect computer systems, cyber-criminal activity is on the rise, and forensic analysis of compromised machines and seized devices is becoming essential in computer security. This article focuses on memory forensics, the branch of digital forensics that extracts artifacts from volatile memory. In particular, it looks at a key ingredient required by memory forensics frameworks: a precise model of the OS kernel under analysis, also known as a profile. By using the information stored in the profile, memory forensics tools can bridge the semantic gap and interpret raw bytes to extract evidence from a memory dump. A big problem with profile-based solutions is that a custom profile must be created for each and every system under analysis. This is especially problematic for Linux systems, because profiles are not generic: they are strictly tied to a specific kernel version and to the configuration used to build the kernel. Failing to create a valid profile means that an analyst cannot unleash the true power of memory forensics and is limited to primitive carving strategies. For this reason, this article presents a novel approach that combines source code and binary analysis techniques to automatically generate a profile from a memory dump, without relying on any non-public information. The experiments show that this is a viable solution and that profiles reconstructed by the framework can be used to run many plugins, which are essential for a successful forensic investigation.
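To illustrate what a profile buys an analyst, the sketch below decodes one hypothetical task structure from raw dump bytes using invented field offsets; a real profile would supply these offsets for the exact kernel build, which is precisely the step AutoProfile automates:

```python
# Sketch: a "profile" maps struct field offsets so raw memory bytes can be
# decoded into kernel objects. Offsets and layout below are invented.
import struct

# Hypothetical profile fragment: field offsets inside the kernel's task_struct.
PROFILE = {"task_struct": {"pid": 0x0, "comm": 0x8, "comm_len": 16}}

def read_task(dump: bytes, base: int) -> tuple[int, str]:
    off = PROFILE["task_struct"]
    (pid,) = struct.unpack_from("<i", dump, base + off["pid"])
    comm = dump[base + off["comm"] : base + off["comm"] + off["comm_len"]]
    return pid, comm.split(b"\x00", 1)[0].decode(errors="replace")

# Fake 24-byte "memory dump" holding one task: pid 4242, name "bash".
dump = struct.pack("<i4x16s", 4242, b"bash")
print(read_task(dump, base=0))  # (4242, 'bash')
```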

Other recently published documents on this topic include:

  • Cyber Security and Digital Forensics
  • An Insight Into Digital Forensics: History, Frameworks, Types and Tools
  • Digital Forensics
  • Digital Forensics as a Service: Analysis for Forensic Knowledge
  • Roadmap of Digital Forensics Investigation Process with Discovery of Tools
  • Wake Up Digital Forensics Community and Help Combat Ransomware
  • Privacy of Web Browsers: A Challenge in Digital Forensics
  • The Analysis and Implication of Data Deduplication in Digital Forensics
  • Digital Forensics Investigation on Xiaomi Smart Router Using SNI ISO/IEC 27037:2014 and NIST SP 800-86 Framework


Recent Dissertation Topics in Forensic Science

This article serves as a compass, guiding readers through a diverse array of recent dissertation topics that encapsulate the multifaceted nature of forensic research. From digital forensics to forensic psychology, the chosen dissertation topics reflect the evolving challenges and advancements in solving complex legal puzzles.

Forensic DNA Analysis:

  • “Next-Generation Sequencing (NGS) in Forensic DNA Profiling: Opportunities and Challenges”
  • “The Impact of DNA Transfer and Secondary DNA Transfer in Forensic Investigations”
  • “Ethical Implications of DNA Phenotyping: A Critical Analysis”

Digital Forensics:

  • “Artificial Intelligence in Digital Forensic Analysis: A Comprehensive Review”
  • “Cloud Forensics: Investigating Digital Crimes in Cloud Computing Environments”
  • “Deepfake Detection Techniques: Safeguarding Digital Evidence Integrity”

Forensic Anthropology:

  • “Facial Approximation in Forensic Anthropology: Integrating 3D Modeling Techniques”
  • “The Role of Forensic Anthropologists in Mass Graves Investigations”
  • “Advancements in Skeletal Trauma Analysis for Forensic Purposes”

Forensic Toxicology:

  • “Metabolomics in Forensic Toxicology: Profiling Endogenous and Exogenous Compounds”
  • “Designer Drugs: Analytical Approaches for the Detection of Novel Psychoactive Substances”
  • “Forensic Challenges in Analyzing Postmortem Fluids for Toxicological Investigations”

Forensic Psychology:

  • “The Impact of Jury Bias on Forensic Psychologists’ Testimonies: A Case Study Analysis”
  • “Virtual Reality Applications in Forensic Psychology Training: Enhancing Investigative Skills”
  • “Exploring the Ethical Dilemmas in Forensic Psychological Assessments”

Forensic Pathology:

  • “Cardiac Biomarkers in Forensic Pathology: Exploring their Role in Cause of Death Determination”
  • “The Use of Postmortem Imaging in Forensic Pathology: A Comparative Analysis”
  • “Forensic Aspects of Pediatric Traumatic Brain Injuries: Patterns and Challenges”

Forensic Odontology:

  • “Age Estimation in Subadults: Integrating Dental and Skeletal Methods in Forensic Odontology”
  • “Digital Methods in Bite Mark Analysis: Enhancing Accuracy and Reliability”
  • “Role of Dental Records in Disaster Victim Identification: A Global Perspective”

Forensic Entomology:

  • “Forensic Entomogenomics: Unraveling New Dimensions in Time of Death Estimation”
  • “Environmental Factors Influencing Insect Colonization on Decomposing Remains: A Forensic Study”
  • “The Use of Entomotoxicology in Forensic Investigations: Current Trends and Applications”



SOURCES

  1. PDF PhD Thesis Digital Forensics Practices: A Road Map for Building Digital

PhD Thesis. Digital Forensics Practices: A Road Map for Building Digital Forensics Capability. Ahmed Jasim Almarzooqi. A Doctoral Thesis Submitted in Partial Fulfilment of the Award of Doctor of Philosophy. Faculty of Technology. De Montfort University. Leicester, United Kingdom.

  2. PDF PhD Thesis Alleviating the Digital Forensic Backlog: A Methodology for

PhD Thesis. Alleviating the Digital Forensic Backlog: A Methodology for Automated Digital Evidence Processing. Xiaoyu Du. A thesis submitted in fulfilment of the degree of PhD in Computer Science. Supervisor: Dr. Mark Scanlon. Head of School: Assoc. Prof. Chris Bleakley. UCD School of Computer Science.

  3. PDF The Development of Current Digital Forensics Katherine Vreeland Snyder

Thesis directed by Associate Professor Catalin Grigoras. ABSTRACT: The last few years have shown a rapid technological development in the digital evidence ... "Digital Forensics" within the DOJ for training, research, and review of all training programs currently in existence (2). This bill was referred to the Subcommittee on Crime, Terrorism, and

  4. Digital forensics: an integrated approach for the ...

    Digital forensics is a new and developing field still in its infancy when compared to traditional forensics fields such as botany or anthropology. Over the years development in the field has been tool centered, being driven by commercial developers of the tools used in the digital investigative process. ... This thesis addresses issues ...

  5. PDF Title: Digital forensics: an integrated approach for the ...

    Digital forensics has become a predominant field in recent times and courts have had to deal with an influx of related cases over the past decade. As computer/cyber related criminal ... considered when dealing with digital evidence. This thesis addresses issues regarding digital forensics frameworks, methods, methodologies

  6. Current Challenges and Future Research Areas for Digital Forensic

    Cloud forensics also face a number of challenges associated with traditional digital forensic investigations. Encryption and other antiforensic techniques are commonly used in cloud-based crimes. The limited time for which forensically-important data is available is also an issue with cloud-based systems.

  7. PDF A THESIS PRESENTED Karthikeyan Shanmugam

    forensic model in the normal sense, it is what will be called a "meta-forensic" model. A meta-forensic approach is an approach intended to stop attempts to invalidate digital forensic evidence. This thesis proposes a formal procedure and guides forensic examiners to look at evidence in a meta-forensic way.

  8. PDF Research Trends, Challenges, and Emerging Topics in Digital Forensics

As a result, over the last four years, research in the area of digital forensics has slowly but steadily increased. This upward trend reflects the key public and policy impact of digital forensics nowadays. Figure 2 also shows the domain-specific distribution of the 109 review papers included in our analysis.

  9. Digital forensics research: The next 10 years

    STRIKE is a handheld forensics platform designed to process digital media and display what is found on its touch-screen user interface as new information is encountered (I.D.E.A.L., 2010). Unfortunately, STRIKE has not been demonstrated widely to either academic or commercial customers. 4.4. Scale and validation.

  10. A Comprehensive Digital Forensic Investigation Model and Guidelines for

    To address the issue of preserving the integrity of digital evidence, this research improves upon other digital forensic investigation model by creating a Comprehensive Digital Forensic Investigation Model (CDFIM), a model that results in an improvement in the investigation process, as well as security mechanism and guidelines during investigation.

  11. PDF Title A structured approach to malware detection and analysis in

    this thesis. I cannot imagine having a better advisor and mentor for my PhD study. To my second supervisor, Dr Gregory Epiphaniou: Although face-to-face meetings could be counted on one hand, your guidance was invaluable. Thank you for your time and effort. To the Director of the Digital Forensics Department of the Dubai Police, Captain Rashid

  12. Digital forensics and strong AI: A structured literature review

In the first step we used Google Scholar with search terms AI digital forensics and Artificial Intelligence digital forensics to identify fitting papers. Then, we scrutinized the more than 10,000 results and derived search terms to narrow down the research (Schmid et al., 2022). Thus, we finally combined ("digital forensics" OR "digital forensic") with "AI" as well as "Artificial ...

  13. Current Challenges and Future Research Areas for Digital Forensic

    David Lillis, Brett Becker, Tadhg O'Sullivan, Mark Scanlon. View a PDF of the paper titled Current Challenges and Future Research Areas for Digital Forensic Investigation, by David Lillis and 2 other authors. Given the ever-increasing prevalence of technology in modern life, there is a corresponding increase in the likelihood of digital devices ...

  14. Research Trends, Challenges, and Emerging Topics in Digital Forensics

    Due to its critical role in cybersecurity, digital forensics has received significant attention from researchers and practitioners alike. The ever increasing sophistication of modern cyberattacks is directly related to the complexity of evidence acquisition, which often requires the use of several technologies. To date, researchers have presented many surveys and reviews on the field. However ...

15. Research Trends, Challenges, and Emerging Topics in Digital Forensics: A Review of Reviews

    ...issues of digital forensics. However, from 2017 onwards, the number of reviews published in the scientific literature has risen to nearly 70. As a result, over the last four years, research in the area of digital forensics has slowly but steadily increased. This upward trend reflects the key public and policy impact.

  16. PDF Methods and Factors Affecting Digital Forensic Case Management

    A thesis submitted in partial fulfilment for the requirements for the degree of Doctor ... digital forensic investigations: Empirical evaluation of 12 years of Dubai police cases. Journal of Digital Forensics, Security and Law (JDFSL), 10(4), pp. 7-16. ADFSL Press.

  17. Ethical and Legal Aspects of Digital Forensics Algorithms: The Case of

    This is a very sensitive subject not only for digital forensics but also for forensics in general, as the entropy and unpredictability caused by the human factor are a double-edged sword for the outcome of the investigation. It can introduce bias and distortion of information which may render the outcome either misleading or worthless.

  18. An Insightful Analysis of Digital Forensics Effects on Networks and

    The difficulties and necessities of executing digital forensics with machine learning in numerous sectors were covered in this study. On the chosen feature set, several machine learning methods are used to find different anomalous events. In the paper, numerous techniques for network forensics, audio forensics, and video forensics were ...

  19. Digital Forensic Evidence And Artifacts: Recent News And Research

    1st September 2020 by Forensic Focus. This month's academic research reflects two aspects of the changing digital forensics industry: new ways to think not just about digital artifacts, but also about broader investigative processes — including interagency cooperation. This round-up article includes three open-access articles from the in ...

  20. [2306.04965] Machine Learning in Digital Forensics: A Systematic

    View PDF Abstract: Development and exploitation of technology have led to the further expansion and complexity of digital crimes. On the other hand, the growing volume of data and, subsequently, evidence is a severe challenge in digital forensics. In recent years, the application of machine learning techniques to identify and analyze evidence has been on the rise in different digital forensics ...

  21. digital forensics Latest Research Papers

This article focuses on memory forensics, a branch of digital forensics that extracts artifacts from the volatile memory. In particular, this article looks at a key ingredient required by memory forensics frameworks: a precise model of the OS kernel under analysis, also known as a profile. By using the information stored in the profile, memory ...

  22. Dissertations / Theses: 'Digital Forensic investigations'

Video (online). Consult the top 48 dissertations / theses for your research on the topic 'Digital Forensic investigations.' Next to every source in the list of references, there is an 'Add to bibliography' button. Press on it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA ...

  23. Recent Dissertation Topics in Forensic Science

    From digital forensics to forensic psychology, the chosen dissertation topics reflect the evolving challenges and advancements in solving complex legal puzzles. Forensic DNA Analysis: "Next-Generation Sequencing (NGS) in Forensic DNA Profiling: Opportunities and Challenges". "The Impact of DNA Transfer and Secondary DNA Transfer in ...