Real-Time Data Processing

Nanosurveyor: a framework for real-time data processing

the brightest sources of tunable X-rays, nanometer positioning, nanofocusing lenses, and faster detectors. Existing soft X-ray detector technology in use at the Advanced Light Source (ALS), for example, generates 350 MBytes/s per instrument [7]; commercial detectors for hard X-rays can record 6 GB/s of raw data per detector [8, 9], and a synchrotron light source can support 40 or more experiments simultaneously, 24 hours a day. Accelerator technologies such as the multi-bend achromat [10] are poised to increase brightness by two orders of magnitude around the globe [11, 12]. Next-generation microscopes may exploit multi-color sources, increased detector parallelism, increased frame rate, or stroboscopic structured illumination to extract higher-dimensional, higher-resolution, higher-frame-rate characterization of a specimen. There is a need to reduce data into meaningful images as rapidly as they are acquired, using low-cost algorithms and computational resources.
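As a back-of-the-envelope sense of scale, the per-instrument rates quoted above can be multiplied out to facility-wide figures; the 40-experiment factor comes from the text, the rest is plain unit conversion, not a result from the cited papers:

```python
# Back-of-the-envelope arithmetic only: scaling the per-instrument rates quoted
# above to a facility running 40 experiments around the clock.
SECONDS_PER_DAY = 24 * 3600
EXPERIMENTS = 40

rates = {
    "soft X-ray (ALS)": 350e6,   # bytes/s per instrument
    "hard X-ray": 6e9,           # bytes/s per detector
}

for name, rate in rates.items():
    facility = rate * EXPERIMENTS                      # aggregate bytes/s
    per_day_tb = facility * SECONDS_PER_DAY / 1e12     # terabytes per day
    print(f"{name}: {facility / 1e9:.1f} GB/s facility-wide, ~{per_day_tb:,.0f} TB/day")
```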

REAL-TIME BIG DATA ANALYTICAL ARCHITECTURE FOR IMAGE PROCESSING FRAMEWORK

Big Data analytics and the Hadoop architecture have significantly improved real-time data processing and storage. Data with a significant amount of velocity, veracity, and variety was earlier processed by data warehouses. The problem with the data warehouse technique is that it constructs a sample out of the entire data set, so at any given instant only part of the data is analyzed. With the introduction of HDFS and the Hadoop file system, however, we are now capable of storing and analyzing large amounts of data with near-real-time latency. Much past work has proposed the gathering, analysis, and interpretation of data from various sources, including the web, e-commerce data, physical sensor data, and more. Although certain image processing algorithms have been proposed over Big Data architectures in the past, the most significant work proposes an overall image processing framework in conjunction with Big Data, in which images with either similar or dissimilar properties can be analyzed through a Big Data architecture. In this work we propose a novel real-time approach for extracting features and performing processing operations on large volumes of images in a distributed Big Data framework. The proposed system distributes a large volume of images by splitting them into smaller parts through a broker node onto multiple physical data nodes; each node processes its chunk of images in parallel and extracts their features. These features are stored at the data node and also sent back to the broker (name node), which keeps track of all the splits through a metadata system. Results show that this significantly improves image processing speed and efficiency over serial execution, whether the data is processed on a single node or on multiple nodes.
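The broker/data-node split described here can be sketched in a few lines. The following is a toy illustration under stated assumptions (local worker processes standing in for physical data nodes, a trivial mean/histogram feature extractor), not the authors' implementation:

```python
# Illustrative sketch: a "broker" splits a batch of images into chunks and hands
# them to worker processes that extract simple features in parallel, then the
# broker gathers the per-chunk metadata.
from multiprocessing import Pool
import numpy as np

def extract_features(chunk):
    # Toy feature: mean intensity and a coarse histogram per image.
    chunk_id, images = chunk
    feats = [(img.mean(), np.histogram(img, bins=8)[0].tolist()) for img in images]
    return chunk_id, feats

def broker(images, n_workers=4):
    # Split the image list into roughly equal chunks, one per worker ("data node").
    chunks = [(i, images[i::n_workers]) for i in range(n_workers)]
    with Pool(n_workers) as pool:
        results = pool.map(extract_features, chunks)
    # Metadata kept by the broker: which chunk produced which features.
    return {chunk_id: feats for chunk_id, feats in results}

if __name__ == "__main__":
    fake_images = [np.random.randint(0, 256, (64, 64), dtype=np.uint8) for _ in range(16)]
    metadata = broker(fake_images)
    print({k: len(v) for k, v in metadata.items()})
```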

IMPROVEMENT OF PERFORMANCE INTRUSION DETECTION SYSTEM (IDS) USING ARTIFICIAL NEURAL NETWORK ENSEMBLE

Apache Spark [9], originated at Berkeley and now licensed under the Apache Foundation, offers much faster performance and a greater variety of features in comparison with the widely adopted Hadoop Big Data processing system. Though Hadoop is a mature batch processing system with many completed projects and much available expertise, it has its limitations. Hadoop is written in Java and relies mainly on two functions, Map and Reduce; all operations have to be expressed in terms of these two functions, which makes programming somewhat complicated. Spark programs can be written in Java, Python, or Scala; Spark offers more functions than just map and reduce and, above all, provides an interactive mode, the Spark shell, which makes programming much simpler than in Hadoop. Hadoop persists data back to disk after each map or reduce operation, while Spark performs in-memory data processing, so repeated operations on the same data run much faster. The memory requirement of Spark is therefore higher than that of Hadoop, but if the data fits in memory Spark works faster; otherwise it has to move data back and forth to disk, which degrades its performance. Because Hadoop is a batch processing system, its users have to depend on other platforms such as Storm [10] for real-time data processing, Mahout for machine learning, or Giraph for graph processing. The Spark ecosystem, however, includes Spark Streaming, MLlib, GraphX, and Spark SQL for real-time data processing, machine learning, graph processing, and SQL querying respectively, which gives Spark a competitive advantage over Hadoop.
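As an illustration of the points above, a minimal PySpark sketch of the classic word count shows the map and reduce steps plus in-memory caching; it assumes a local Spark installation with the pyspark package, and the input data is made up:

```python
# Minimal PySpark sketch illustrating that Spark exposes more than bare
# map/reduce and keeps intermediate data in memory between transformations.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("wordcount-sketch").master("local[*]").getOrCreate()
sc = spark.sparkContext

lines = sc.parallelize(["real time data processing", "data processing with spark"])
counts = (lines.flatMap(lambda line: line.split())   # one record per word
               .map(lambda word: (word, 1))          # classic map step
               .reduceByKey(lambda a, b: a + b))     # classic reduce step
counts.cache()                                       # keep the RDD in memory for reuse
print(counts.collect())
spark.stop()
```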

Towards provisioning of real time smart city services using clouds

Processing data in near real time using distributed technologies, e.g. cloud computing, is not new, and notable technologies exist, e.g. Apache Storm [14]. In our previous work [5] we tried to use Apache Hadoop and Apache Spark to show the potential of processing large city data in batch mode. However, there is limited evidence in the literature that real-time processing has been tested for smart city applications with the objective of processing big smart-city data in near real time to respond to application-specific queries. In this respect, this paper aims to extend the previously defined architectural framework to process real-time stream data from social networks using Apache Storm in a cloud environment and to present results in visual form using OpenStreetMaps. A use case on urban congestion is defined where real-time data is acquired from different sources and processed using Storm. A distributed data processing algorithm is also presented that processes real-time data from multiple sources. This algorithm illustrates a novel application of the architecture discussed in Section 3. It shows how the architecture can be used to develop a real-time data processing platform for user consumption. A basic proof of concept is implemented by combining a Twitter stream with other open data (e.g. Bristol open data and OpenStreetMaps). The aim here is to demonstrate a basic scenario that processes a real-time data stream and generates useful information for decision making.
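A very rough sketch of the proof-of-concept idea (combining a tweet stream with static open data to flag congestion) might look like the following; the road-segment table, keyword list, and tweets are invented for illustration, and this is not the paper's Storm topology:

```python
# Illustrative sketch only: join a simulated tweet stream with a static
# open-data lookup of road segments and emit simple congestion alerts.
import time

ROAD_SEGMENTS = {"M32": "Bristol M32 corridor", "A4": "Bath Road"}   # stand-in for open data
CONGESTION_WORDS = {"jam", "gridlock", "standstill", "congestion"}

def tweet_stream():
    # Stand-in for a real-time Twitter feed.
    yield {"text": "Total gridlock on the M32 again", "ts": time.time()}
    yield {"text": "Lovely morning for a ride on the A4", "ts": time.time()}

def process(stream):
    for tweet in stream:
        words = set(tweet["text"].lower().split())
        for segment, name in ROAD_SEGMENTS.items():
            if segment.lower() in words and words & CONGESTION_WORDS:
                yield {"segment": segment, "location": name, "ts": tweet["ts"]}

for alert in process(tweet_stream()):
    print("possible congestion:", alert)
```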

Real-Time Big Data Analytics using Hadoop

Abstract-- Recent technologies have generated a large volume of data in different areas (e.g. medical, aircraft, internet, and banking transactions) over the last few years. Big Data is the collection of this information, and it is characterized by high volume, velocity, and variety: data on the scale of GB to PB, in structured, unstructured, and semi-structured forms, that requires fast or real-time processing. Such real-time data processing is not an easy task, because Big Data consists of large data sets of various kinds, and the Hadoop system can handle only the volume and variety of data; for real-time analysis we must handle volume, variety, and the velocity of data. To achieve high velocity we use two popular technologies: Apache Kafka, a message broker system, and Apache Storm, a stream processing engine.
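A hedged sketch of wiring the two technologies named above together might look like the following; it assumes the kafka-python package and a broker on localhost:9092, and the topic name is made up:

```python
# Sketch: feeding high-velocity events through Kafka, the message broker named
# in the abstract. A downstream stream processor (Storm in the paper; a plain
# consumer here) would read the same topic and process records as they arrive.
import json
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("sensor-events", {"device": 42, "reading": 3.14})
producer.flush()

consumer = KafkaConsumer(
    "sensor-events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for record in consumer:
    print(record.value)
```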

198302 pdf

• Real-Time Data Processing Systems Architecture and Design
• Real-Time Computer Software Analysis and Sizing
• Local Network Analyses and Design
• System Acquisition Management
• Comput[r]

Monitoring activities of satellite data processing services in real-time with SDDS Live Monitor

Since SDDS runs on Linux, the first solution that comes to mind is to use an existing open source monitoring solution such as Zabbix [1], Nagios [2], or MMonit [3] to accomplish the task. However, a detailed functional analysis showed that these systems were mainly designed for monitoring IT infrastructure (servers, routers, switches, etc.) and system and network services. Zabbix and Nagios support application monitoring, but this function is commercial. To monitor a custom service (or application), both Zabbix and Nagios assume that one writes a wrapper which runs a number of tests to check the service and produces standardized output interpretable by the interface of the solution. The server component of Zabbix or Nagios then calls this wrapper directly, or via a client agent, on a regular basis to check the service. Such solutions are not suitable for SDDS for several reasons. Processing data from different satellites involves various components, the components can change dynamically, and the working states of each component also differ. Writing a wrapper for each satellite would lead to a large amount of source code to maintain, and additional checks on a regular basis for too many services would affect the overall performance of the operating system. Thus, a lightweight event-driven monitoring mechanism would be better.
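The event-driven alternative can be sketched as follows; this is only an illustration of the push-based idea (components report state changes to a monitor instead of being polled), not the actual SDDS Live Monitor mechanism, and the component names are invented:

```python
# Sketch: processing components push status events to a monitor as state
# changes happen, instead of the monitor polling every component on a schedule.
import queue
import threading
import time

events = queue.Queue()

def component(name, events):
    # A processing component reports only when something changes.
    events.put({"component": name, "state": "started", "ts": time.time()})
    time.sleep(0.1)  # pretend to process a satellite data granule
    events.put({"component": name, "state": "finished", "ts": time.time()})

def monitor(events):
    # The monitor reacts to events as they arrive; no periodic checks needed.
    while True:
        evt = events.get()
        if evt is None:
            break
        print(f"[monitor] {evt['component']} -> {evt['state']}")

threads = [threading.Thread(target=component, args=(n, events)) for n in ("MODIS", "NOAA-20")]
mon = threading.Thread(target=monitor, args=(events,))
mon.start()
for t in threads:
    t.start()
for t in threads:
    t.join()
events.put(None)   # sentinel to stop the monitor
mon.join()
```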

Introduction to Real-Time Processing in Apache Apex

As we know, big data handling along with real-time processing is a necessity today. One of the best-known big data platforms is Hadoop. Hadoop concentrates mainly on operations over big data: it not only allows storage and processing of big data but does so in a distributed network over a large number of clustered computers. Being an open-source framework, it is designed to scale from a single node to a large number of machines, each with its own RAM and storage [5].

Data protection and privacy issues concerning facial image processing in public spaces

Concerning the processing of personal data and protecting the fundamental right to privacy, the solution presented has significant properties: it involves real-time processing, faces are detected in an image but not recognised, only statistical information is retrieved, no images are ever kept in the system or transmitted to external devices, tracking over multiple frames does not require previous images, and any face image is kept in memory only for the time it takes to process a single frame. As soon as statistical information is obtained, all images are erased from memory. In this way, much of the personal data processing regulation does not apply: for example, security of the information held, legitimacy of data collection (as no personal data are collected), confidentiality, transparency, accountability, the right to access, request deletion of, or rectify personal data, and so on. Moreover, from the statistical information contained in each tag it is impossible to trace back to any living person.
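The privacy-preserving pattern described above (detect, aggregate, discard) can be sketched roughly as follows, assuming the opencv-python package; this is an illustration, not the authors' system:

```python
# Sketch: detect faces per frame, keep only a running count, and never store
# or transmit the frame itself.
import cv2

cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)          # live camera; frames stay in local memory only

stats = {"frames": 0, "faces_seen": 0}
while stats["frames"] < 100:       # bounded run for the sketch
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    stats["frames"] += 1
    stats["faces_seen"] += len(faces)
    del frame, gray                # frame data discarded as soon as stats are updated

cap.release()
print(stats)                       # only aggregate statistics leave the loop
```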

GeoSpatial IoT - the Need for an Event-Driven Spatial Data Infrastructure

Historically, the discoverability and accessibility of geospatial content was difficult, as data was not available online, was encoded in different proprietary formats, and was often not documented through metadata [1]. These were among the main technical arguments for the establishment of spatial data infrastructures (SDIs) during the 1990s. Nowadays, the situation is completely different. The rapid emergence of technologies such as the Internet of Things (IoT), combined with the increased importance of the private sector and citizens as users and creators of content, is generating unprecedented amounts of data. Not only are the heterogeneity and volume of data rapidly increasing, but they are produced under near-real-time conditions. Together, these developments pose completely new challenges regarding the utilisation and processing of data. Geospatial data is no exception. In particular, addressing the (spatio-)temporal dimension requires (i) new analytical methods, (ii) innovative means for encoding and exchanging spatio-temporal data from constrained devices, and (iii) new standards for data interoperability. That is

Applications of a Streaming Video Server in a Mobile Phone Live Streaming System

Because a non-blocking approach is used, in which the system does not need to open multiple threads to receive data, the second solution can reduce system overhead. However, because of the high real-time requirements of video, a single processing thread cannot satisfy the system performance requirements when multiple acquisition terminals are connected. On the other hand, although the first solution incurs more system overhead, every acquisition terminal uses a separate thread, so the system has good data independence and real-time performance. Moreover, the approach of negotiating a port explicitly publishes the single port connected to the server, so when multiple acquisition terminals connect to the same port, this solution avoids the port being occupied. Therefore, the second solution is adopted in this paper.
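A minimal sketch of the non-blocking, single-thread pattern discussed above is shown below, using Python's selectors module to multiplex many acquisition-terminal connections in one thread; the port number is arbitrary and this is not the paper's server:

```python
# Sketch: one thread multiplexes many connections with selectors instead of
# dedicating a receiving thread to each acquisition terminal.
import selectors
import socket

sel = selectors.DefaultSelector()

def accept(srv):
    conn, addr = srv.accept()
    conn.setblocking(False)
    sel.register(conn, selectors.EVENT_READ, data=addr)

def serve(host="0.0.0.0", port=5000):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen()
    srv.setblocking(False)
    sel.register(srv, selectors.EVENT_READ, data=None)
    while True:
        for key, _ in sel.select():
            if key.data is None:
                accept(key.fileobj)              # new acquisition terminal
            else:
                data = key.fileobj.recv(4096)    # non-blocking read for this terminal
                if data:
                    print(f"{key.data}: received {len(data)} bytes")
                else:
                    sel.unregister(key.fileobj)  # terminal disconnected
                    key.fileobj.close()

if __name__ == "__main__":
    serve()
```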

A Real Time System for the Analysis of Sickle Cell Anemia Blood Smear Images Using Image Processing

Digital image processing techniques are important in the analysis of medical images. They include image enhancement, image filtering, segmentation, image masking, edge detection, etc. Image enhancement consists of contrast stretching, unsharp masking, edge detection, etc. Image filtering is mainly used for removing noise from an image; common filters include the mean filter, median filter, Gaussian filter, Laplacian filter, etc. Segmentation includes region growing, splitting and merging, thresholding, etc.; it subdivides an image into its constituent parts. Medical fields like bioinformatics and biomedical imaging use several machine vision techniques that require image processing components of sufficient accuracy. This is especially true of biomedical image processing, which has experienced vigorous growth. Digital image processing techniques are used today in a wide range of applications that share a common need for methods capable of enhancing pictorial information for human interpretation and analysis. After acquiring a digital image, the main tasks are enhancement, segmentation, measurement, and data analysis. Image enhancement methods are often used to emphasise certain features and to remove artifacts. Two types of measurements are made: feature measurements, defined by a segmentation process, and field measurements, obtained globally from complete images. Finally, these feature and field measurements must be analysed.
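The basic pipeline named above (filtering, segmentation, measurement) can be sketched as follows, assuming SciPy and scikit-image and a synthetic stand-in image; it is illustrative only, not the authors' sickle-cell analysis system:

```python
# Sketch of filter -> threshold -> label -> measure on a stand-in image.
import numpy as np
from scipy import ndimage
from skimage import filters, measure

image = np.random.rand(256, 256)                 # stand-in for a grayscale blood smear

denoised = ndimage.median_filter(image, size=3)  # noise removal
threshold = filters.threshold_otsu(denoised)     # global threshold
mask = denoised > threshold                      # segmentation into objects/background
labels = measure.label(mask)                     # connected components

# Feature measurements per object; eccentricity captures elongation,
# which is what distinguishes sickle-shaped cells.
for region in measure.regionprops(labels):
    if region.area > 50:
        print(region.label, region.area, round(region.eccentricity, 2))
```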

Modified Algorithm for Real Time SAR Signal Processing

SAR is an important tool for the collection of high-resolution, all-weather earth images, from both airborne and space-borne platforms. The potential of SAR in a diverse range of applications such as sea and ice monitoring [2], mining [3], oil pollution monitoring [4], oceanography [5], snow monitoring [6], classification of earth terrain [7], etc. led to the development of a number of airborne and spaceborne SAR systems [8]. An extensive literature exists on various processing techniques for generating an image from the radar returns of a SAR [2, 9–14]. SAR signal processing can be broken into two phases: range processing and azimuth processing. Most coherent radars use some form of modulation or coding of the transmitted waveform to improve resolution. Resolution enhancement is achieved by two-dimensional signal processing of the radar data. Range (across-track) resolution is improved by correlation of the pulse echoes with the transmitted pulse in the range compression process. Azimuth (along-track) resolution is improved by synthetically generating a long antenna aperture, while the real aperture is relatively small; this operation is known as azimuth compression. Azimuth compression is based on the fact that each echo reflected from a single point target has a different phase shift. This phase shift appears to be quadratic in time and results in a linear frequency shift of the successive pulse echoes. The azimuth compression operation focuses the echo signal in such a way that zero phase shift remains, and integrates the focused echoes. As a result, the resolution is improved.
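Range compression by matched filtering, as described above, can be sketched in a few lines of NumPy; the chirp parameters, target delays, and noise level are arbitrary values chosen only to make the example self-contained:

```python
# Sketch of range compression: correlate the received echo with the
# transmitted linear-FM chirp (matched filter).
import numpy as np

fs = 100e6                     # sampling rate (Hz)
T = 10e-6                      # pulse duration (s)
B = 30e6                       # chirp bandwidth (Hz)
t = np.arange(0, T, 1 / fs)
chirp = np.exp(1j * np.pi * (B / T) * t**2)     # transmitted pulse

# Simulated echo: two point targets at different sample delays, plus noise.
echo = np.zeros(4096, dtype=complex)
for delay, amp in [(500, 1.0), (1200, 0.6)]:
    echo[delay:delay + len(chirp)] += amp * chirp
echo += 0.05 * (np.random.randn(echo.size) + 1j * np.random.randn(echo.size))

# Matched filter = convolution with the conjugated, time-reversed pulse.
compressed = np.convolve(echo, np.conj(chirp[::-1]), mode="same")
print("strongest compressed response at sample", int(np.argmax(np.abs(compressed))))
```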

State Vector Estimation Studies for ISRO’S New Launch Vehicle

Abstract: At any launch base, real-time tracking and trajectory estimation play a critical role during a satellite launch, for flight safety as well as mission monitoring. A linear Kalman filter with a novel method of model compensation is designed and successfully implemented for real-time processing of radar data at the Sriharikota Range. Verification and validation are carried out using simulated data and actual mission data from previous launch vehicles. After validation, the data processing techniques are deployed successfully for launch operations. This paper describes the studies carried out on the above work to implement real-time trajectory estimation for the third-generation launch vehicle LVM-3. Initially the basic algorithm is explained together with the model compensation technique; afterwards, the data processing method for the satellite launch vehicle is explained, followed by its application to nominal and off-nominal trajectories.
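A generic linear Kalman filter for noisy position measurements can be sketched as follows; it shows only the standard predict/update cycle with an assumed constant-velocity model, not the paper's model compensation technique or radar geometry:

```python
# Generic linear Kalman filter sketch (constant-velocity model, 1-D position
# measurements from a simulated radar-like sensor).
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition: position, velocity
H = np.array([[1.0, 0.0]])                # we measure position only
Q = 1e-3 * np.eye(2)                      # process noise covariance
R = np.array([[0.5]])                     # measurement noise covariance

x = np.zeros((2, 1))                      # initial state estimate
P = np.eye(2)                             # initial covariance

rng = np.random.default_rng(0)
true_pos = 0.0
for k in range(50):
    true_pos += 5.0 * dt                              # target moving at 5 m/s
    z = np.array([[true_pos + rng.normal(0, 0.7)]])   # noisy measurement

    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

print("estimated position, velocity:", x.ravel())
```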

Mobile Application-IoT Based EKG Monitoring System

Abstract: ECG is the most commonly performed cardiology test and provides vital information for understanding the state of a person's heart. It essentially traces the electrical activity of the heart as it pumps blood to the rest of the body and is very useful for determining the state of the heart and any symptoms. Conventionally, the patient visits the clinic for the ECG to be taken and is given a report in a day or two. The whole process is both time consuming and tedious for the patient. With the latest developments in the Internet of Things, cloud technology, and reliable, faster data transmission, this process can be made a lot more convenient. This paper proposes a cost-effective, remote system to read a patient's ECG data via a handy sensor (a Shimmer sensing device) and send it to an Android device via Bluetooth. The Android device in turn sends this data to the cloud for storage and analysis, and it is then transmitted to a doctor's Android device for observation. A unique feature of the proposed unit lies in the fact that it provides a single platform wherein the patient is directly connected to his or her healthcare provider for transmitting ECG data with little or no delay. Apart from ensuring secure transmission of ECG data from patient to doctor, this channel between the patient and the doctor lets them communicate with each other: it is used to receive valuable feedback or guidance from the doctor and gives the opportunity to constantly monitor the effects, symptoms, and responses to the medicines the patient is taking. The unit makes use of several cloud technologies, such as data processing, real-time data streaming, security, and user account sync, while making all of these available remotely through the Android device and the sensor.

An Efficient Collision Free Dynamic Multilevel Priority Packet Scheduling In WSN

In the DMP task scheduling approach, the source of a data packet is used to define the priority of non-real-time data packets. A non-real-time data packet has higher priority if it was sensed at a remote node rather than at the current sending node. Moreover, when no real-time tasks are available, pr3 tasks can preempt pr2 tasks if they have been starved for a long time. This allows different types of tasks to be processed with fairness. Memory is also dynamically allocated to the three queues, and the size of the highest-priority queue is usually smaller than that of the other two queues, since pr1 real-time tasks occur infrequently compared to non-real-time tasks. As the memory capacity of a sensor node is limited, this also balances memory usage. Moreover, tasks are mostly non-real-time and are processed in the pr2 and pr3 queues. Non-real-time tasks that a node x receives from lower-level nodes are known as non-real-time remote tasks and are processed with higher priority (pr2) than the non-real-time local tasks that x senses. Thus, non-real-time remote tasks incur less average waiting time. In addition, the average waiting time of real-time tasks, which are processed using FCFS scheduling, is not affected, since these tasks occur infrequently and have short processing times. Admittedly, one concern regarding the proposed DMP task scheduling scheme pertains to its energy requirements. Indeed, the DMP mechanism could be less energy efficient than the other two approaches, since it requires a few more processing cycles to categorize and place tasks into three different queues, as well as for context saving and switching (for preemption). However, given the increased demand for WSN-based solutions that efficiently support real-time emergency applications and ensure minimum average task waiting time and end-to-end delay, as shown in Figs. 7.4 and 7.5, the proposed DMP task scheduling mechanism can be regarded as highly efficient.
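A simplified sketch of the three-queue idea is given below; the starvation limit and task names are assumptions, and the real DMP scheme includes details (context switching, per-node memory allocation) not modelled here:

```python
# Sketch: pr1 (real-time) first, pr2 (non-real-time remote) next, pr3
# (non-real-time local) last, with a long-starved pr3 task served ahead of pr2.
from collections import deque
import time

STARVATION_LIMIT = 2.0   # seconds a pr3 task may wait before it jumps ahead (assumed value)

queues = {"pr1": deque(), "pr2": deque(), "pr3": deque()}

def submit(level, task):
    queues[level].append((time.monotonic(), task))

def next_task():
    # Real-time tasks always go first (FCFS within the queue).
    if queues["pr1"]:
        return queues["pr1"].popleft()[1]
    # A long-starved pr3 task may preempt pr2 work.
    if queues["pr3"] and time.monotonic() - queues["pr3"][0][0] > STARVATION_LIMIT:
        return queues["pr3"].popleft()[1]
    if queues["pr2"]:
        return queues["pr2"].popleft()[1]
    if queues["pr3"]:
        return queues["pr3"].popleft()[1]
    return None

submit("pr3", "local temperature reading")
submit("pr2", "remote aggregate from child node")
submit("pr1", "fire alarm event")
while (task := next_task()) is not None:
    print("processing:", task)
```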

Real Time Processing System of Massive Data Stream in Internet of Things Based on Apache Storm

Jie WANG, Shuai ZHAO and Bo CHENG, State Key Laboratory of Networking and Switching Technolo[r]

A survey on real time processing with spiking neural networks

Neural networks have emerged as one of the most powerful tools for real-time processing tasks such as pattern classification, recognition, prediction, and regression. In the last two decades, the high computational demands of neural networks have motivated researchers to explore and develop optimized hardware architectures [16]-[19]. An FPGA implementation of a large-scale neural network for pattern recognition using the Neural Engineering Framework (NEF) is reported in [20]. It is a standard three-layer feed-forward network (input, hidden, and output layers) constructed using subnetworks. The connections between the layers are all-to-all, with fixed random weights determined using a pseudoinverse operation. The main aim of the work is not to achieve the lowest test error but rather to develop fast hardware for real-time pattern recognition tasks. The system is implemented with fixed-point numbers instead of floating-point numbers to overcome the bottleneck of huge data storage requirements. Also, to reduce hardware cost, the gray level of the input pattern is reduced from a byte to binary with negligible performance loss. Spike rates in NEF are used to calculate the weights. To implement the desired function, a low-pass filter is used to sum the weighted outputs and compute the firing rate. Effectively, the neuron itself calculates the firing rate directly and becomes a non-spiking neuron. The high speed enables time multiplexing of neurons, which reduces the hardware resource overhead.
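The NEF-style readout mentioned above, with fixed random encoders and output weights solved by a pseudoinverse, can be sketched in NumPy as follows; the sizes, neuron model, and target function are arbitrary choices for illustration, not the FPGA design in [20]:

```python
# Sketch: random fixed encoders produce firing rates, and decoding weights
# are solved with a least-squares pseudoinverse.
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_samples = 200, 500

x = np.linspace(-1, 1, n_samples)                 # represented scalar value
encoders = rng.choice([-1.0, 1.0], n_neurons)     # fixed random encoders
gains = rng.uniform(1.0, 5.0, n_neurons)
biases = rng.uniform(-2.0, 2.0, n_neurons)

# Rectified-linear "rate neurons" standing in for spiking neurons' firing rates.
rates = np.maximum(0.0, gains[:, None] * encoders[:, None] * x[None, :] + biases[:, None])

target = x**2                                     # function to decode from the population
decoders = np.linalg.pinv(rates.T) @ target       # weights via pseudoinverse

estimate = rates.T @ decoders
print("RMS decoding error:", float(np.sqrt(np.mean((estimate - target) ** 2))))
```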

Audio Processing in Embedded Real-Time Linux Systems

For both audio capture and playback, the data stream has to be directed to a file. The mixer tool is used to set the level of the recording source. For ALSA, the package includes ALSAmixer, a text-mode mixer program for ALSA soundcard drivers.

Simulation and real time processing techniques for space instrumentation

The Science Data Processor takes the normalized data from the four detectors and combines them into one data stream to the CDHS (see Figure 3.6). The data from the detectors are combined on a 'first come, first served' basis. The normalized x and y anode data are presented as a tri-state bus which covers all four detectors. The selection of an 'active' detector is made with the four event lines: from an inactive state, the first to register an event enables the data bus associated with that detector while preventing the other three from becoming active. After a dead time of 600 ns (McCalden, 1992) to process the data, the bus is returned to its inactive state, ready for the next event from an active detector to be selected. The dead time of 600 ns has no effect on the overall processing, since it appears in parallel with that of the analogue processing, set at 2.1 µs (see Figure 3.5). However, it does limit the event-to-event resolution between detectors: events from different detectors are processed in parallel, as each detector has its own processing electronics with its own dead time, and any event arriving at the Science Data Processor before 600 ns have elapsed will be lost (McCalden, 1992).
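The first-come-first-served event selection with a dead time can be illustrated with a toy simulation; the 600 ns figure comes from the text, while the event list is invented:

```python
# Toy simulation: the first event grabs the bus, and events arriving from any
# detector during the dead time are lost.
DEAD_TIME_NS = 600

# (arrival time in ns, detector id) for a burst of events across four detectors.
events = [(0, 1), (150, 3), (700, 2), (1100, 2), (1250, 4)]

accepted, lost = [], []
bus_free_at = 0
for t, det in sorted(events):
    if t >= bus_free_at:
        accepted.append((t, det))        # event selected, bus enabled
        bus_free_at = t + DEAD_TIME_NS   # bus busy for the dead time
    else:
        lost.append((t, det))            # events during the dead time are lost

print("accepted:", accepted)
print("lost:", lost)
```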