Time constraints are the main factor in a real-time operating system, since they determine the deadlines of processes. To meet deadlines, a proper scheduling algorithm is required. In this paper an Adaptive scheduling algorithm is developed that combines Earliest Deadline First (EDF) and Ant Colony Optimization (ACO). The EDF algorithm places processes in a priority queue and executes them in deadline order; the priority of a process depends on its deadline, and the algorithm handles the under-loaded condition. The limitation of EDF is that it cannot handle the overloaded condition. The ACO algorithm schedules by execution time: the process with the minimum execution time is executed first. Its limitation is that it takes more time for execution than EDF. Therefore, to remove the limitations of both algorithms, an Adaptive scheduling algorithm is developed. It increases system performance, decreases system failures, and reduces the percentage of missed deadlines. The advantage of the Adaptive scheduling algorithm is that it handles both the over-loaded and the under-loaded condition. Its performance is measured in terms of Success Ratio (the number of processes scheduled) and CPU utilization, and its execution time is compared with the EDF and ACO scheduling algorithms. The goal of the Adaptive scheduling algorithm is to switch between the two scheduling algorithms so as to decrease system failures and increase system performance.
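The EDF policy described above, a priority queue keyed by absolute deadline, can be sketched as follows. This is a minimal non-preemptive illustration; the task tuples are invented for the example, not taken from the paper's workload.

```python
import heapq

def edf_schedule(tasks):
    """Schedule (name, arrival, burst, deadline) tasks in EDF order.

    At each step the ready task with the earliest absolute deadline
    runs to completion (non-preemptive simplification).
    """
    tasks = sorted(tasks, key=lambda t: t[1])         # by arrival time
    ready, order, time, i = [], [], 0, 0
    while i < len(tasks) or ready:
        while i < len(tasks) and tasks[i][1] <= time:
            name, arr, burst, dl = tasks[i]
            heapq.heappush(ready, (dl, name, burst))  # priority = deadline
            i += 1
        if not ready:
            time = tasks[i][1]                        # idle until next arrival
            continue
        dl, name, burst = heapq.heappop(ready)
        time += burst
        order.append((name, time, time <= dl))        # (task, finish, met?)
    return order
```

A preemptive EDF scheduler would re-evaluate the queue on every arrival; this sketch only shows how deadline-ordered dispatch handles the under-loaded case.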
The ACO algorithm was proposed by Dorigo and Gambardella in the early 1990s and has since been successfully applied to various combinatorial optimization problems . This scheduling algorithm is inspired by the behaviour of real ants, where each ant constructs a path and one or more ants are active simultaneously. The ACO scheduling algorithm mainly works in a time-slice manner. Its limitation is that it takes more time for execution than the EDF algorithm. In the ACO algorithm each solution component is called a "node", and each ant starts its journey from a different node. When ACO is applied to scheduling, each node is considered a task, and the probability of selecting a node depends on the pheromone value τ and the heuristic value η.
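The selection rule elided above is presumably the standard ACO transition probability of Dorigo and Gambardella; a common form, with α and β weighting the pheromone and heuristic terms and $N_i^k$ the set of nodes still available to ant $k$ at node $i$, is:

$$p_{ij}^{k} \;=\; \frac{[\tau_{ij}]^{\alpha}\,[\eta_{ij}]^{\beta}}{\displaystyle\sum_{l \in N_i^{k}} [\tau_{il}]^{\alpha}\,[\eta_{il}]^{\beta}}$$

In the scheduling setting, τ accumulates on tasks chosen in good solutions while η favours tasks with short execution time, matching the behaviour described in the abstract.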
The objective of a real-time task scheduler in a soft real-time system is to reduce deadline misses as much as possible . Real-time scheduling techniques can be broadly divided into two categories: off-line and on-line. Off-line algorithms assign all task priorities at design time, and priorities remain constant for the lifetime of a task. On-line algorithms assign priorities to tasks at runtime, based on the execution parameters of the tasks. On-line scheduling can use either static or dynamic priorities. Rate Monotonic (RM)  and Deadline Monotonic (DM)  are examples of on-line scheduling with static priority . Earliest Deadline First (EDF)  and Least Slack Time first (LST)  are examples of on-line scheduling with dynamic priority. EDF and LST are optimal under the conditions that jobs are preemptive, there is only one processor, and the processor is not overloaded [5, 6]. The limitation of these algorithms is that their performance decreases exponentially if the system becomes even slightly overloaded .
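The dynamic priority used by LST can be made concrete: a task's slack is its deadline minus the current time minus its remaining work, and the task with the least slack is most urgent. A minimal sketch (the task tuples are illustrative):

```python
def least_slack_first(tasks, now):
    """Pick the ready task with the least slack (LST/LLF rule).

    Each task is (name, abs_deadline, remaining_exec).
    Slack = deadline - now - remaining work.
    """
    def slack(task):
        _, deadline, remaining = task
        return deadline - now - remaining
    return min(tasks, key=slack)
```

A full LST scheduler would recompute slack at every scheduling point, which is why LST is classed as dynamic-priority.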
In this paper, a distributed P2P network was used to perform a clustering-based summarization process, with peer nodes associated with dynamic overlays. The proposed clustering-based summarization system works well on dynamic P2P networks. Traditional cluster-based summarization methods usually suffer from problems of computation speed, compression, peer selection and sentence clustering when generating high-quality summaries, and traditional document clustering and summarization methods assume node adjacency and neighborhood information to build clusters and summaries. The proposed approach provides a better way to cluster different overlay networks using a probabilistic k-representative clustering algorithm, and forms efficient summaries using a phrase-rank-based document summarization process. Experimental results show better performance in terms of execution time, entropy and similarity index.
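The paper's probabilistic k-representative algorithm is not specified in this excerpt; a generic k-representatives (k-medoids-style) sketch over document feature vectors, offered only as an illustration of the family of methods, might look like:

```python
import numpy as np

def k_representatives(X, k, iters=10, seed=0):
    """Generic k-representatives clustering sketch (k-medoids style).

    X is an (n, d) feature matrix. The paper's probabilistic variant
    and its peer-selection logic are not reproduced here.
    """
    rng = np.random.default_rng(seed)
    reps = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest representative
        d = np.linalg.norm(X[:, None, :] - reps[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each representative to its cluster's medoid
        for j in range(k):
            members = X[labels == j]
            if len(members):
                intra = np.linalg.norm(members[:, None] - members[None, :], axis=2)
                reps[j] = members[intra.sum(axis=1).argmin()]
    return labels, reps
```

Representatives stay on actual data points (medoids), which is what makes the scheme usable over peers that only exchange member documents rather than abstract centroids.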
A single-cloud model works well when the workload is moderate. When the number of incoming requests is high, the single-cloud model falls short on parameters such as execution time and response time, and it degrades when the workload becomes heavy. Small organizations cannot compete with large, well-known organizations even when they have enough resources to satisfy user requirements. To overcome these issues, the federated cloud model was proposed.
Our approach can be illustrated by a domain-specific case study with great adaptation potential; we chose the tax domain. Note that a tax information system is expected to be updated for each finance-law publication, so it must be as flexible, scalable and customizable as possible. This business change (functional dynamics) represents, according to Kelly , one of the essential points justifying the adoption of the DSM approach. We focus only on the calculation and restitution of corporation tax. The main services of our validation scenario are "TaxCalculation" and "TaxRestitution". The former is a generic service that is specialized by "CTCalculationService" (corporation tax calculation service), specific to calculating the value of corporation tax. The latter is a business process composed of several services that performs the restitution of the corporation tax. Based on our Tax meta-model (see section 6) we can generate models for each tax (corporation tax, income tax…).
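The specialization relationship between the generic service and its corporation-tax variant can be sketched in code. The class names come from the scenario; the flat-rate calculation is an illustrative assumption, not the paper's actual tax rules.

```python
from abc import ABC, abstractmethod

class TaxCalculation(ABC):
    """Generic tax-calculation service from the validation scenario."""

    @abstractmethod
    def calculate(self, taxable_base: float) -> float:
        ...

class CTCalculationService(TaxCalculation):
    """Specialization for corporation tax; the flat rate is illustrative."""

    def __init__(self, rate: float = 0.25):
        self.rate = rate

    def calculate(self, taxable_base: float) -> float:
        return taxable_base * self.rate
```

Each new finance law would then be absorbed by regenerating or reconfiguring the specialized service, leaving the generic contract untouched, which is the flexibility the DSM approach is after.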
2.1 Web Service Composition in Bioinformatics Web service composition scenarios have been studied in the bioinformatics field . Applying web service composition in bioinformatics allows new composite services to be offered. Most bioinformatics databases and tools (web services) are exposed publicly on the web . Biologists may want to perform experiments using distributed data and tools provided by many organizations, and using web services is convenient because so many exist in the life sciences . Typically, biologists have to search for the services that will help them and then choose the applicable one. Searching for accurate web services, however, falls under the web service discovery problem, which is not covered in this study . The services found by biologists are then used to perform experiments with the Basic Local Alignment Search Tool (BLAST) . In some cases biologists will combine several services to fulfill their requirements, so the output of one service must be fed manually as input to another. Figure 1 illustrates an example sequence of service combinations as discussed in . This task is time-consuming, and a domain expert is needed to assist biologists in the web services environment. Automatic web service composition can therefore help them achieve the objectives of their experiments.
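The manual chaining described above, where one service's output becomes the next service's input, can be sketched as a simple function composition. The service names below are placeholders, not real BLAST endpoints:

```python
def compose(*services):
    """Chain services so each one's output becomes the next one's input."""
    def pipeline(data):
        for service in services:
            data = service(data)
        return data
    return pipeline

# Placeholder services standing in for real bioinformatics tools.
fetch_sequence = lambda gene_id: f">{gene_id}\nATGC"
run_alignment  = lambda fasta: {"query": fasta, "hits": 3}

workflow = compose(fetch_sequence, run_alignment)
```

Automatic composition amounts to discovering which services' input and output types line up so that such a pipeline can be assembled without a domain expert doing the wiring by hand.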
Ants searching for food deposit a chemical substance called pheromone, and they also deposit pheromone on their way back home. The ants that follow are more likely to take the path that has a stronger pheromone trail rather than moving in a random fashion; these ants also deposit pheromone on the path, making it more attractive for other ants to follow. Thus the more ants follow a path, the more attractive that path becomes. Moreover, ants that take the shorter route deposit pheromone on it much faster than ants that take longer routes, which increases the probability that other ants follow the shorter route. Hence over a period of time all the ants follow the shortest route to the food source, leading to an optimal solution [11, 12]. Additionally, the pheromone evaporates over time, reducing the probability of settling on low-quality solutions. Although its convergence is slow, this algorithm finds an optimal solution. Many researchers have used ant colony optimization for different image processing tasks such as image segmentation and edge detection. This paper proposes a method for using the ant colony optimization technique for shadow segmentation.
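The deposit-and-evaporate dynamics described above are typically implemented as a per-iteration trail update; a minimal sketch, where the evaporation rate ρ and deposit constant Q are illustrative parameters:

```python
def update_pheromone(tau, ant_paths, rho=0.1, Q=1.0):
    """Evaporate all trails, then reinforce the edges each ant used.

    tau: dict mapping edge -> pheromone level.
    ant_paths: list of (path_edges, path_length) per ant; shorter
    paths deposit more pheromone, so they grow more attractive.
    """
    for edge in tau:
        tau[edge] *= (1.0 - rho)           # evaporation
    for edges, length in ant_paths:
        for edge in edges:
            tau[edge] += Q / length        # deposit, inverse to path length
    return tau
```

Evaporation is what prevents early, mediocre paths from locking the colony in, at the cost of the slow convergence the text mentions.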
Humans are usually difficult to manage in the context of information security. In fact, humans are not very predictable: they do not operate like machines, which, given the same situation, behave the same way time after time. The human challenge lies in accepting that individuals in an organization have a personal and social identity (i.e. unique attitudes, beliefs and perceptions) that they bring with them to work, as well as a work identity conferred by their role in that organization , . While information security management activities comprise processes and procedures, it
Automated orthogonal defect classification (AutoODC)  enhances the Relevance annotation framework; it automatically classifies software defects using textual features of defect reports. The semi-supervised text classification approach enriches a Naive Bayes (NB) classifier using expectation–maximization. Bug triage employs a semi-supervised classification method to compensate for the shortage of labeled bug reports . A string kernel  classifies text documents based on the length of feature sub-sequences, and kernel-based text categorization exploits the Support Vector Machine. The clustering-based classification (CBC) approach considers both the labeled and the unlabeled data in a dataset: it first clusters the labeled data, then labels the unlabeled data according to those clusters. The expanded set of labeled data is then the input to the classifier, improving classification accuracy .
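The CBC idea, cluster the labeled data and propagate labels to unlabeled points, can be sketched with one centroid per class. This is a deliberate simplification of the cited method, shown only to make the label-propagation step concrete:

```python
import numpy as np

def cluster_then_label(X_lab, y_lab, X_unlab):
    """Assign each unlabeled point the label of the nearest class centroid.

    X_lab: (n, d) labeled features; y_lab: their labels;
    X_unlab: (m, d) unlabeled features. Returns m propagated labels.
    """
    classes = sorted(set(y_lab))
    centroids = np.array([X_lab[np.array(y_lab) == c].mean(axis=0)
                          for c in classes])
    d = np.linalg.norm(X_unlab[:, None, :] - centroids[None, :, :], axis=2)
    return [classes[i] for i in d.argmin(axis=1)]
```

The newly labeled points would then join the training set of the final classifier, which is where the accuracy gain reported for CBC comes from.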
The experiment was applied to 50 moving objects: 25 of them were actually a man, and 25 were strange objects (non-man) of different shapes. We examine the impact of the parameter Y, which results from the similarity ratio produced by NCC. In other words, our goal is not only to capture and classify the 50 objects, but to determine that 25 of them are human beings and the rest are not; otherwise we obtain false-positive cases. Suppose the number of training samples (templates) is 10, and let us study the impact of the parameter Y, which may take values between 0 and 100. When its value is approximately 0, most human and non-human objects are classified as human; this means 50% false-positive cases and 50% correct classifications. We then gradually increase the parameter to 30. At this value, 41 of the 50 objects are classified as human and 9 as non-human, despite the fact that all 9 are moving non-human objects. The 41 objects classified as human divide into two parts, 25 humans and 16 non-humans, which means 16 objects are classified wrongly and 25 correctly; in addition, the 9 objects are correctly shown as non-human. The number of correctly classified objects is 25 + 9 = 34, giving a correct percentage of 68%. We continue increasing Y until Y = 60. Here we notice that 31 objects are classified as human (among them the 25 real humans, the rest wrongly classified), and 19 objects are classified as non-human and truly
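The Y = 30 figures above can be checked as a confusion matrix: 25 true positives, 16 false positives, 9 true negatives, and 0 false negatives over the 50 objects.

```python
def accuracy(tp, fp, tn, fn):
    """Fraction of correctly classified objects."""
    return (tp + tn) / (tp + fp + tn + fn)

# Counts reported for threshold Y = 30.
acc_30 = accuracy(tp=25, fp=16, tn=9, fn=0)   # 34 of 50 correct -> 0.68
```

The same arithmetic at Y = 60 (25 TP, 6 FP, 19 TN, 0 FN) would give (25 + 19) / 50 = 88%, which is the trend the sweep over Y is tracing out.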
Microstrip patch antennas play a major role in day-to-day life. In this paper a compact microstrip patch antenna with circular polarization is designed for RFID applications. Slots of different arbitrary shapes (square, circle, plus) are used, and parameters such as return loss, gain and frequency are observed for each patch. All of this is done using ANSOFT HFSS . The antenna is fabricated using Duroid as the dielectric substrate (relative permittivity = 2.2, loss tangent = 0.0004) with a coaxial feed. The designed antennas are fabricated and used in real-time applications.
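For a rectangular patch on the stated substrate, a first-cut width follows the standard design equation W = c / (2 f₀) · √(2 / (εr + 1)). The 915 MHz operating frequency used below is an assumption for the UHF RFID band, not a value taken from the paper:

```python
from math import sqrt

def patch_width(f0_hz, eps_r):
    """Standard first-cut width of a rectangular microstrip patch (metres)."""
    c = 3e8                                   # speed of light, m/s
    return c / (2 * f0_hz) * sqrt(2 / (eps_r + 1))

w = patch_width(915e6, 2.2)                   # assumed UHF RFID frequency
```

The length then follows from the effective permittivity and fringing-field correction; the slot shapes studied in the paper perturb this baseline geometry to obtain circular polarization.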
Various phenomena arose in Indonesia ahead of the ASEAN Economic Community (AEC) in 2015, notably a shortage of labor ready to fill the needs of industry . The business world still has a large need for a job-ready workforce, yet Indonesia has been unable to supply it, and although education produces more graduates year after year, the gap persists between the unmet demand for a ready workforce and the ever-increasing growth of vocational education. With the year-on-year growth of vocational education in Indonesia, polytechnics require financial autonomy, so the performance of their financial accounting information systems must improve. Higher education is also one of the sectors at very high risk of cyber crime and fraud. Colleges and universities report high levels of cyber attacks, with millions of hacking attempts against their information systems every week. Social-security and bank-account numbers are always at risk, but institutions are also susceptible to the loss of valuable intellectual property, such as patents granted to faculty and students, as well as the personal information of students, faculty and staff. Given the frequency of cyber attacks and fraud in higher-education institutions, the need to raise cyber-security awareness has never been greater . Such fraud affects the performance of financial accounting information systems in educational organizations, especially polytechnics.
the third level of DWT decomposition in the (LL3) sub-band, while Fig. 1(c) shows the second four watermarks embedded in the fourth level of DWT decomposition in the (LL4, HL4, LH4, HH4) sub-bands. The watermarks are distributed in different locations inside the image space, which improves the robustness and security of the system.
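A multi-level 2-D DWT of the kind referenced above can be sketched with a Haar wavelet in plain NumPy. The sub-band naming follows the usual LL/HL/LH/HH convention; the embedding step itself, and the paper's actual wavelet, are not reproduced:

```python
import numpy as np

def haar_dwt2(img):
    """One level of a 2-D Haar DWT; returns (LL, HL, LH, HH) sub-bands."""
    a = (img[0::2, :] + img[1::2, :]) / 2      # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2      # row differences
    LL = (a[:, 0::2] + a[:, 1::2]) / 2
    HL = (a[:, 0::2] - a[:, 1::2]) / 2
    LH = (d[:, 0::2] + d[:, 1::2]) / 2
    HH = (d[:, 0::2] - d[:, 1::2]) / 2
    return LL, HL, LH, HH

def decompose(img, levels):
    """Repeatedly transform the LL band to reach the requested level."""
    bands = None
    for _ in range(levels):
        bands = haar_dwt2(img)
        img = bands[0]                          # recurse into LL
    return bands
```

Embedding in LL3 versus the four level-4 sub-bands trades imperceptibility against robustness, since the deeper LL bands concentrate the image energy.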
Dissolved gas analysis (DGA) is a common method for diagnosing faults in electrical transformers and determining the type of fault early on, depending on the specific standard used. DGA methods can be applied both in diagnosis and in the evaluation process. Many methods, both traditional and intelligent, are used to diagnose faults in power transformers . An intelligent expert system relying on dissolved gas analysis with artificial neural networks gives excellent results in diagnosing faults, assessing the quality of insulating oil in service, and applying the appropriate treatment.
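Ratio-based DGA schemes (for example the IEC-style three-ratio method) feed a small set of gas ratios to the diagnostic model, whether that model is a lookup table or a neural network. Computing them from a gas-in-oil sample might look like this; the ppm values are illustrative, not measured data:

```python
def dga_ratios(ppm):
    """Three classic DGA ratios from gas concentrations in ppm.

    ppm: dict with keys H2, CH4, C2H2, C2H4, C2H6. These ratios are
    the usual inputs to ratio-based (or ANN-based) fault diagnosis.
    """
    return {
        "C2H2/C2H4": ppm["C2H2"] / ppm["C2H4"],
        "CH4/H2":    ppm["CH4"] / ppm["H2"],
        "C2H4/C2H6": ppm["C2H4"] / ppm["C2H6"],
    }

sample = {"H2": 100, "CH4": 120, "C2H2": 1, "C2H4": 50, "C2H6": 65}
ratios = dga_ratios(sample)
```

An ANN-based expert system typically takes such ratios (or the raw concentrations) as input features and outputs a fault class such as partial discharge, arcing, or thermal fault.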
Table 2 also shows that learner behaviours in the learning-content design features (prerequisites, flowchart, references, objectives and details), the help feature (FAQ), the support features (collaboration with teachers and experts), and the evaluation features (open questions and pre-quiz) have a significant effect on identifying the learning preferences of the Rational learning style. The findings also show that significant learner patterns include temporal behaviour with the prerequisites, flowchart and FAQ features, and navigation behaviour with the references, objectives, open questions, and teacher/expert communication-channel learning objects. Thus it can be inferred that rational learners tend to spend more time reading and reviewing prerequisite topics and skills before studying a new topic. More time is also spent on logical thinking when viewing flowcharts. These learners tend to consult the official help feature by reading the most common questions and their official answers. Furthermore, rational learners tend to navigate and browse official references such as books and articles to look for an intended topic, terminology or concept, and prefer to open more communication channels with teachers and experts when facing challenges. In essence, rational learners adopt the different WBES design features classified according to the main system components (Figure 6) as listed below:
A two-layer neural network was also successfully used for classifying connection records. Although the classification results were slightly better with the three-layer network, the less complicated neural network was more efficient in both computation and memory. From a practical point of view, the experimental results imply that there is more to do in the field of artificial-neural-network-based intrusion detection systems. The implemented system solved a three-class problem, but its extension to several classes is straightforward. As a possible future development of the present study, one could include more attack scenarios in the dataset, since practical IDSs should cover several attack types. To avoid unreasonable complexity in the neural network, an initial classification of connection records into normal traffic and general categories of attack can be the first step; the records in each category of intrusions can then be further classified into specific attack types.
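A two-layer (one hidden layer) softmax classifier of the kind described can be sketched in plain NumPy. This is a toy stand-in: the layer sizes, learning rate and cluster data are illustrative, not the paper's connection-record features.

```python
import numpy as np

def train_two_layer(X, y, hidden=8, classes=3, lr=0.3, epochs=800, seed=0):
    """Tiny two-layer softmax classifier trained by gradient descent."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, classes));    b2 = np.zeros(classes)
    Y = np.eye(classes)[y]                       # one-hot targets
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)                 # hidden layer
        Z = H @ W2 + b2
        P = np.exp(Z - Z.max(axis=1, keepdims=True))
        P /= P.sum(axis=1, keepdims=True)        # softmax output
        G = (P - Y) / len(X)                     # d(cross-entropy)/dZ
        GH = (G @ W2.T) * (1.0 - H ** 2)         # back-prop through tanh
        W2 -= lr * (H.T @ G);  b2 -= lr * G.sum(axis=0)
        W1 -= lr * (X.T @ GH); b1 -= lr * GH.sum(axis=0)

    def predict(Xq):
        return (np.tanh(Xq @ W1 + b1) @ W2 + b2).argmax(axis=1)
    return predict
```

The three output units mirror the three-class problem in the text; the proposed hierarchical scheme would chain two such networks, one for normal-vs-attack-category and one per category for the specific attack type.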
An effective approach for reinforcing IDSS performance is to develop an embedded simulation model that meets the desired objectives of the system [19-22]. Discrete-event simulation is a very powerful tool for evaluating alternative control policies in a manufacturing system [23-26]. Although the procedure for analyzing simulation results can rely on various guidelines and rules, decision-making still requires significant human expertise and computer resources. To use simulation efficiently in the decision process, integration of IDSS with simulation has been emphasized [27-30]. However, there have been limited investigations into integrating IDSS with modular simulation languages as a unified approach for controlling manufacturing systems. FMS control therefore appears to be an excellent area for applying an adaptive IDSS simulation-based controller.
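At its core, a discrete-event simulation of the kind referenced above revolves around a time-ordered event queue; a minimal engine, with illustrative event names standing in for shop-floor events, can be sketched as:

```python
import heapq

def run_simulation(initial_events, handlers, until=float("inf")):
    """Minimal discrete-event engine.

    Pop the earliest event, let its handler schedule follow-up events
    as (delay, event_name) pairs, and repeat until the time horizon.
    """
    queue = list(initial_events)          # (time, event_name) pairs
    heapq.heapify(queue)
    log = []
    while queue:
        time, name = heapq.heappop(queue)
        if time > until:
            break
        log.append((time, name))
        for dt, follow_up in handlers.get(name, lambda t: [])(time):
            heapq.heappush(queue, (time + dt, follow_up))
    return log

# Illustrative model: each arriving part finishes after 2 time units.
handlers = {"arrive": lambda t: [(2.0, "done")]}
log = run_simulation([(0.0, "arrive"), (1.0, "arrive")], handlers, until=10)
```

An IDSS would sit on top of such an engine, running the model under candidate control policies and comparing the resulting event logs before committing a decision to the real FMS.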
Transient stability evaluation focuses on the reactive power flow of the power system in response to a fault. In transient stability prediction, the progress of the power system transient due to the occurrence of a disturbance is monitored, and the key factor is the convergence or divergence of the transient swings. The problem is formulated as the insertion of an SVC in a real-time system, analyzed using ETAP simulation software, to enhance transient stability. SVCs, with an auxiliary injection of a suitable signal, can significantly improve the dynamic stability performance of a power system; Byerly et al. (1982) and Hammad (1986) presented a fundamental analysis of the application of SVCs for enhancing power system stability. The enhancement of low-frequency oscillation damping via SVC has also been analyzed (Padiyar and Varma 1991; Zhou 1993; De Oliveira 1994; Messina et al., 1999). The SVC enhances the system damping of local as well as inter-area oscillation modes. Messina and Barocio (2003) studied nonlinear modal interaction in stressed power systems with multiple SVC voltage support; it is observed that the SVC controller can significantly influence nonlinear system behavior, especially under high-stress operating conditions and increased SVC gains (8). The general representation of an SVC is shown in Fig. 1.1.
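The transient swings whose convergence or divergence is monitored above are governed by the classical swing equation; for machine $i$ a standard form is:

$$M_i \frac{d^2\delta_i}{dt^2} \;=\; P_{m,i} - P_{e,i} - D_i \frac{d\delta_i}{dt}$$

where $\delta_i$ is the rotor angle, $M_i$ the inertia constant, $P_{m,i}$ the mechanical input power, $P_{e,i}$ the electrical output power, and $D_i$ a damping coefficient. Transient stability hinges on whether the $\delta_i$ trajectories settle after the fault is cleared; the SVC improves the outcome by supporting voltage and hence $P_{e,i}$ during the swing.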
Abstract: This paper presents an analysis of scheduling algorithms in a real-time multiprocessor environment. Many studies of real-time scheduling algorithms already exist; these examine the algorithms from the viewpoint of different traits particular to the system characteristics. In this paper the EDF (Earliest Deadline First) and RM (Rate-Monotonic) algorithms in particular are studied. Numerous misconceptions exist about the properties of these two scheduling algorithms. This paper compares RM against EDF from several perspectives, using existing theoretical results or straightforward counterexamples to demonstrate that many common beliefs are either false or hold only in particular circumstances. The parameters that can be used to evaluate the algorithms are also defined. The paper concludes by discussing the outcomes of the study and future research on RM and EDF in real-time multiprocessor environments.
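One classic point of comparison between the two algorithms is the uniprocessor utilization bound: EDF schedules any periodic task set with total utilization U ≤ 1, while the sufficient Liu–Layland bound for RM is U ≤ n(2^(1/n) − 1). A quick check, with an illustrative task set that passes the EDF test but not the RM bound:

```python
def utilization(tasks):
    """Total utilization of periodic tasks given as (exec_time, period)."""
    return sum(c / t for c, t in tasks)

def edf_schedulable(tasks):
    """Necessary and sufficient EDF test for implicit-deadline tasks."""
    return utilization(tasks) <= 1.0

def rm_schedulable_llbound(tasks):
    """Sufficient (not necessary) Liu-Layland bound for rate-monotonic."""
    n = len(tasks)
    return utilization(tasks) <= n * (2 ** (1 / n) - 1)

# Illustrative set: U = 2/4 + 3/6 = 1.0.
tasks = [(2, 4), (3, 6)]
```

Failing the Liu–Layland bound does not prove an RM deadline miss (the bound is only sufficient), which is precisely the kind of misconception the comparison in the paper addresses.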