Digital forensics challenges to big data in the cloud

Abstract — As a new research area, digital forensics is a subject in rapid development, and cyber security for big data in the cloud is receiving more attention than ever. A computing breach requires digital forensics to seize the digital evidence, determine who committed the breach and what was done maliciously, and assess the possible risk and the damage the loss could lead to. Big data attack cases in particular present even greater challenges for digital forensics than conventional breach investigations. Because of its three "V" characteristics (Volume, Velocity, and Variety), big data is nowadays either synchronized with the cloud (as on smartphones) or stored in the cloud in order to resolve storage-capacity and related problems, which makes forensic investigation even more difficult. The big data forensics problem in the cloud is hard for several reasons; one of them is physically identifying the specific device of interest. Because data are distributed across the cloud, neither the customer nor the digital forensics practitioner can have the full access control that a traditional investigation provides.

Review Paper on Big Data Challenges and Cloud Computing

Abstract — The tremendous amount of information now being gathered from the IoT, biomedical fields, and online social networks is termed Big Data. Because this data is so huge in volume, it is very difficult to process and analyse. Big data is in such demand because of its many applications, including test data analysis (data mining), social media analytics, and mobile data analytics. Cloud computing is a dominant technology for performing large-scale and complex computation; it removes the need to maintain expensive computing hardware, dedicated space, and associated software. Massive growth in the size of data, or big data, generated through cloud computing has been identified, and processing big data is a difficult and time-demanding task that requires extensive computational capacity to ensure successful data preparation and analysis. This paper covers the definition, characteristics, and classification of big data, along with an introduction to cloud computing.

Greening cloud-enabled big data storage forensics: Syncany as a case study

Collectively, our research highlighted the challenges that could arise during forensic investigations of cloud-enabled big data solutions. While deduplication/chunking and encryption technologies benefit users by providing an efficient and secure means of managing big data, evidence collection and analysis may necessitate intercepting and collecting passwords and using other vendor-specific applications. This can be subject to abuse by cyber criminals seeking to hide their tracks, and without the cooperation of the user (suspect), forensic endeavours may end up an exercise in futility. We therefore suggest that vendors implement a forensically friendly logging mechanism (e.g., recording who accessed the data, what data was accessed, and from where and when it was accessed) that supports collection of the raw log data outside the encrypted data store.

Big Data and Cloud Computing: Challenges and Issues in Present Era

Cloud computing and big data are conjoined. Big data gives users the ability to use commodity computing to process distributed queries across multiple datasets and return result sets in a timely manner, while cloud computing provides the underlying engine through Hadoop, a class of distributed data-processing platforms. Large data sources from the cloud and Web are stored in a distributed, fault-tolerant database and processed through a programming model for large data sets with a parallel distributed algorithm in a cluster. Big data relies on distributed storage technology based on cloud computing rather than on local storage attached to a computer or electronic device, and big data evaluation is driven by fast-growing cloud-based applications developed using virtualized technologies. Therefore, cloud computing not only provides facilities for the computation and processing of big data but also serves as a service model. The use of cloud computing in big data is shown in the following figure. The main purpose of data visualization is to present analytical results visually, through different graphs, for decision making. Cloud computing is thus correlated with a new pattern for provisioning computing infrastructure and a big data processing method for all types of resources available in the cloud through data analysis.

Opportunities and Challenges in integrating Cloud Computing and Big Data Analytics to E-governance

Upon the tremendous success of the internet, every organization today is trying to expand its business through the web because of its popularity and high usage. In the government sector, accessing services manually was traditionally quite difficult, as it involved many procedures and formalities with a lot of paperwork [2]. Governments across the world have therefore started delivering their services through the web under the umbrella of "E-Governance". Initially the use of E-governance was very limited, but today it is very high, and existing technologies are no longer capable of meeting current demand. Governments therefore need to consider technological improvements that meet this demand at reduced cost and with increased efficiency. Technologies such as cloud computing and big data analytics have a huge capability to overcome the current challenges of E-governance and can readily be adopted. The initial part of this paper explains the basics of E-governance along with cloud computing and big data analytics; the later part proposes a model for cloud- and big data-enabled E-governance together with its opportunities and challenges.

BIG DATA AND MOBILE CLOUD COMPUTING: ISSUES AND CHALLENGES

The modern era is defined by the use of mobile devices such as smartphones. As technology has changed, we now use mobile phones for every application we used to run on our computer systems, but smartphones have their own limitations in terms of small screens, small batteries, and limited memory and processing power. At the same time, mobile applications are becoming more and more complex, and with the advent of the IoT (Internet of Things) demand has increased further, both in processing capability and in data-handling requirements. The massive (big) data collected from the various sensors in mobile devices also demands large computing infrastructure and processing power for data processing and analysis, so there is a gap between demand and available resources. Mobile cloud computing tries to overcome these big data handling issues, but many issues and challenges remain. This paper presents the issues, existing solutions, and approaches.

Big Data Services Security and Security challenges in cloud environment

System integrators have been developing solutions that incorporate the cloud and big data within an enterprise to build elastic, scalable private cloud solutions. The cloud has glorified the as-a-service model by hiding the complexity and challenges involved in building an elastic, scalable self-service application, and the same is required for big data processing. Cloud computing is a promising application and innovation of our times. This combination of findings provides some support for the conceptual premise that the main barriers and obstacles to the rapid development of cloud computing are issues of safety and security of information. Reducing data storage and processing costs is a mandatory requirement for any organization, while research into information and data reliability is now mandatory whenever organizations make choices. This research has raised many questions in need of further investigation; the cloud and its administration have some impact on the significance of what the vendor of cloud services provides for the certification of data.

Big Data Services Security and Security challenges in cloud environment

Big data represents a new era in data exploration and utilization. Current technologies such as cloud computing and business intelligence (BI) provide a platform for automating all processes of data collection, storage, processing, and visualization. Big data is defined by five "V" properties: volume, velocity, and variety, which constitute the original big data properties, together with veracity and validity. Velocity concerns the speed at which datasets, or large volumes of data, are processed. Veracity refers to noise and abnormality in the data, since not all data elements are required for analysis. Validity concerns the correctness and accuracy of the data considered for analysis. Chen et al. (2010) state that the economic case for cloud computing has brought unlimited attention to this technology: cloud computing providers can mount data centres easily owing to their ability to classify and provide computing assets. The emergence of the cloud and big data, however, comes with data security and privacy concerns. System integrators (SI) have been developing solutions that incorporate the cloud and big data within the enterprise.

Big data in cloud: a data architecture

Abstract. Nowadays, organizations have at their disposal a large volume of data of a wide variety of types. Technology-driven organizations want to capture, process, and analyze this data at a fast velocity in order to better understand and manage their customers, operations, and business processes. As data volume and variety increase, and as faster analytic results are needed, the demands placed on a data architecture grow. Such an architecture should enable collecting, storing, and analyzing big data in a cloud environment, since cloud computing ensures timeliness, ubiquity, and easy access for users. This paper proposes a data architecture to support big data in the cloud and validates the architecture with a proof of concept.

Spatial Big-Data Challenges Intersecting Mobility and Cloud Computing

However, the envisaged SBD-based next-generation mobility services pose several challenges for current routing techniques. First, SBD requires a change in frame of reference, moving from a global snapshot perspective to the perspective of an individual object traveling through a transportation network. Second, SBD magnifies the impact of partial information and ambiguity of traditional routing queries specified by a start location and an end location. For example, traditional routing identifies a unique (or a small set of) route(s), given historical and current travel-times. In contrast, SBD may identify a much larger set of solutions, e.g., one route each for thousands of possible start-times in a week, significantly increasing computational costs. Third, SBD challenges the assumption that a single algorithm utilizing a specific dataset is appropriate for all situations. The tremendous diversity of SBD sources substantially increases the diversity of solution methods. For example, methods for determining fuel efficient routes leveraging engine measurement and GPS track datasets may be quite different from algorithms used to identify minimal travel-time routes exploiting temporally detailed roadmaps. Newer algorithms will be needed as new SBD becomes available, creating demand for a flexible architecture to rapidly integrate new datasets and associated algorithms. Other challenges include geo-sensing, privacy, prediction, etc.

Cloud Data Management Big Data

shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort

Archiving and Sharing Big Data Digital Repositories, Libraries, Cloud Storage

Professor of Computer Science & Electrical Engineering; Director, Integrated Media Systems Center (IMSC); Director, VSoE Informatics; Viterbi School of Engineering, University of Southern California

Spatial Big-Data Challenges Intersecting Mobility and Cloud Computing

Third, the tremendous diversity of SBD sources substantially increases the need for diverse solution methods. For example, methods for determining fuel efficient routes that leverage engine measurement and GPS track datasets may be quite different from algorithms that identify minimal travel-time routes for a given start-time exploiting TD roadmaps. In addition, SBD data (e.g., TD roadmaps, GPS-tracks and engine-measurement datasets) differ in coverage, roadmap attributes and statistical details. For example, TD roadmaps cover an entire country, but provide only the mean travel-time for a road-segment for a given start-time in a week. In contrast, GPS-track and engine-measurement datasets have smaller coverage, limited to well-travelled routes and time-periods, but may provide a richer statistical distribution of travel-time for each road-segment, perhaps revealing newer patterns such as seasonality. New algorithms are likely to emerge as new SBD become available and, as a result, a new, flexible architecture will be needed to rapidly integrate new datasets and associated algorithms.
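The cost blow-up implied by start-time-dependent routing (one shortest-path evaluation per candidate start time, rather than one per query) can be sketched in a few lines. Everything below is invented for illustration: the toy graph, the two start-time slots, and the travel times are not from the paper, and the sketch simplifies real TD routing by holding the slot fixed for the whole trip.

```python
import heapq

# Hypothetical TD roadmap: each edge maps to a list of mean travel times,
# one per start-time slot (e.g., hourly slots in a week). Values invented.
TD_ROADMAP = {
    ("A", "B"): [4, 9],    # congested in slot 1
    ("B", "C"): [3, 3],
    ("A", "C"): [10, 8],   # an alternative that pays off in slot 1
}

def shortest_time(graph, src, dst, slot):
    """Dijkstra with edge weights fixed to one start-time slot
    (a simplification: real TD routing updates the slot en route)."""
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == dst:
            return d
        if d > dist.get(node, float("inf")):
            continue
        for (u, v), times in graph.items():
            if u == node:
                nd = d + times[slot]
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(pq, (nd, v))
    return float("inf")

# A TD query is evaluated once per candidate start-time slot,
# multiplying the work relative to a single snapshot query.
results = {slot: shortest_time(TD_ROADMAP, "A", "C", slot) for slot in (0, 1)}
best_slot = min(results, key=results.get)
```

Even in this toy setting, the work scales with the number of candidate start times; with thousands of weekly start-time slots, as the excerpt notes, one query multiplies into thousands of shortest-path evaluations.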

Digital Forensics as a Big Data Challenge

The simple quantity of evidence associated with a case is not the only measure of its complexity, and growth in size is not the only challenge that digital forensics is facing: evidence is becoming more and more heterogeneous in nature and provenance, following the evolving trends in computing. The workflow phase impacted by this new aspect is clearly analysis, where, even when proper prioritization is applied, it is necessary to sort through diverse categories and sources of evidence, structured and unstructured. Data sources themselves are much more differentiated than in the past: it is common now for a case to include evidence originating from personal computers, servers, cloud services, phones and other mobile devices, digital cameras, even embedded systems and industrial control systems. File formats

Digital Forensics: Challenges and Opportunities for Future Studies

Reza Montasari (PhD) is a Senior Lecturer in Cyber Security in the Department of Computer Science at the University of Huddersfield. Dr. Montasari has held a number of academic teaching and research positions over the past six years and has published widely in the fields of digital forensics, cyber security, cloud computing, and Internet of Things (IoT) security. He is a Fellow of the Higher Education Academy (FHEA), a Chartered Engineer (CEng) registered with the Engineering Council, and a Member of The Institution of Engineering and Technology (MIET). Simon Parkinson has an honours degree in secure and forensic computing and a PhD in the cross-disciplinary use of domain-independent artificial intelligence planning to autonomously produce measurement plans for machine tool calibration, work that made it possible to reduce both machine tool downtime and the uncertainty of measurement. His research interests are in developing intelligent systems for manufacturing and cyber security, including his continuing research on developing and utilising artificial intelligence for task automation. His cyber security interests cover access control, vulnerability and anomaly detection, learning domain knowledge, mitigation planning, and software tools to aid situation awareness. Dr Amin Hosseinian-Far is a Senior Lecturer in Business Systems and Operations and Chair of the research Centre for Sustainable Business Practices (CSBP) at the University of Northampton. Previously, Amin was a Staff Tutor at the Open University and, before that, a Senior Lecturer and Course Leader at Leeds Beckett University. He has held lecturing and research positions at the University of East London and at a number of private HE institutions and strategy research firms. Dr Hosseinian-Far has also worked as Deputy Director of Studies at a large private higher

Utilizing Cloud Computing to address big geospatial data challenges

We have collected big geospatial data with different spatiotemporal stamps and resolutions for environment and urban studies using various methods, e.g., the Global Positioning System (GPS), remote sensing, and Internet-based volunteering (Jiang and Thill, 2015; Yang et al., 2011). The increase in volume, velocity, and variety of this spatiotemporal data poses a grand challenge for researchers seeking to discover and access the right data for research and decision support (Yang et al., 2011). One method of addressing this Big Data discovery challenge is to mine knowledge from the big geospatial data and their usage (Vatsavai et al., 2012) for query expansion, recommendation, and ranking. The mined knowledge includes, but is not limited to, domain hot topics, research trends, metadata linkage, and geospatial vocabulary similarity. This process is itself challenged by Big Data volume, velocity, and variety. Such a mining process poses two challenges: a) how to divide Big Data into parallelizable chunks for processing with scalable computing resources; and b) how to utilize an adaptable number of computing resources for processing the divided Big Data. Take the MUDROD project for the NASA Physical Oceanography Distributed Active Archive Center (PO.DAAC) as an example: the 2014 web log (which contains geospatial data usage knowledge) was over 150 million records, and the mining task takes over 5 h to complete on a single server (6 cores, 12 GB memory, Windows 7 OS). For high-traffic websites with a large number of users sending requests concurrently, logs are produced at a much higher velocity, exceeding a single server's data-processing capability. In addition, logs are semi-structured or unstructured data stored in various formats (e.g., Apache HTTP, FTP, NGINX, IIS, or user-defined log formats); each format requires a specific processing protocol, complicating the integration of different formats for further processing. Uncertainty further affects the quality of mined knowledge: common noise (e.g., from web crawlers) requires computationally intensive crawler-detection algorithms to preprocess the original logs (Jiang et al., 2016).

CloudMe forensics: a case of big data investigation

Martini and Choo [51] proposed the first cloud forensic investigation framework, derived from the frameworks of McKemmish [52] and NIST [49], and used it to investigate ownCloud [53], Amazon EC2 [18], VMWare [54], and XtreemFS [55]. Quick et al. [23] extended and validated the four-stage framework using SkyDrive, Dropbox, Google Drive, and ownCloud. Chung et al. [56] proposed a methodology for cloud investigation on Windows, Mac OS X, iOS, and Android devices and used it to investigate Amazon S3, Google Docs, and Evernote. Scanlon et al. [57] outlined a methodology for remote acquisition of evidence from decentralized file synchronization networks and applied it to BitTorrent Sync [58]. In another study, Teing et al. [27] proposed a methodology to investigate the BitTorrent Sync application (version 2.0) and other third-party and Original Equipment Manufacturer (OEM) applications. Do et al. [59] proposed an adversary model for digital forensics and demonstrated how such a model can be used to investigate mobile devices and apps (e.g., an Android smartwatch; Do et al. [60]). Ab Rahman et al. [61] proposed a conceptual forensic-by-design framework to integrate forensic tools and best practices into the design and development of cloud systems.

A Survey on Digital Forensics to Address Big Data Challenges

Digital devices have also been utilized to commit crime, for example by using devices for Distributed Denial of Service (DDoS) attacks, controlling cloud-based CCTV units, and accessing Internet-connected printers [3]. It is reported that the Lizard Stressor malware accesses connected digital devices to dispatch DDoS attacks against telecommunication companies and government agencies. The malware is effective because it makes use of devices that often run embedded Linux-based operating systems, have no bandwidth limitations, have minimal security, and often share default passwords across multiple devices. Gartner has anticipated that by 2020 more than a quarter of cyber-attacks will involve disparate digital devices [4].

Research on Digital Forensics Based on Private Cloud Computing

IaaS is a basic service model in which cloud providers often offer virtual machines and other resources. The virtual machines are run as guests by a hypervisor, such as Xen or KVM. Management of pools of hypervisors by the cloud operational support system leads to the ability to scale to support a large number of virtual machines. Other resources in IaaS clouds include images in a virtual machine image library, firewalls, load balancers, virtual local area networks (VLANs), and software bundles. IaaS cloud providers supply these resources on demand from their large pools installed in data centers. To deploy their applications, cloud users then install operating system images on the machines as well as their application software.

Need of Digital Forensics in Cloud Computing Environment

provisioned and released with minimal management effort or service provider interaction. Cloud computing provides computing services and storage by using virtualization, which makes cloud resources invisible to the user. A common application of cloud computing is cloud storage: services such as Dropbox provide storage to individuals and businesses, who use them for backup and for sharing documents, pictures, videos, and multimedia files with others. The advent of cloud storage services has raised concerns about the degree of security of the cloud environment [15], [16], because current clouds offer proprietary solutions for dealing with security issues. Another concern is that computer forensics investigation in the cloud computing environment is more complicated. Challenges to cloud forensics have been categorized as technical, legal, and organizational [10], and it has been suggested that more research and development of methodologies and tools for evidence gathering from the cloud environment is needed [14].
