To establish the concept of high performance and low injury, and to gather initial feedback from professional athletes during the focus group on their expectations of the system, an overall display was added with various sections, as depicted in Fig. 6. The first section consists of a switch that the athlete presses to start before the run and presses to stop after the jump. The second section is the sensors part, where the maximum speed of the athlete during the run and the maximum force exerted at the ankle during the jump are captured between the start and stop switch presses. The last section is the logs part, which stores the athlete's data and shows the history of maximum speed and force at a given date and time. Hence, the display provides real-time capture of speed and force data for waist and ankle movement, compared against reference data from professional athletes.
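The capture step between the two switch presses can be sketched as a running maximum over the sampled readings, which is then stored as a log entry; the function, field names and units below are illustrative assumptions, not taken from the paper:

```python
from datetime import datetime

def capture_session(samples):
    """Track the maximum waist speed and ankle force seen between the
    start and stop presses.  `samples` is an iterable of
    (speed_m_s, force_N) readings taken while the session is active;
    names and units are illustrative, not from the paper."""
    max_speed = 0.0
    max_force = 0.0
    for speed, force in samples:
        max_speed = max(max_speed, speed)
        max_force = max(max_force, force)
    # A log entry mirrors the "logs" section: history kept by date/time.
    return {"time": datetime.now().isoformat(timespec="seconds"),
            "max_speed": max_speed, "max_force": max_force}

entry = capture_session([(3.1, 120.0), (6.8, 310.5), (5.2, 980.0)])
```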
Abstract: Samsung, Fitbit and Sony are advanced companies that have produced various wearable devices for years, especially in sports technology for the needs of athletes. The high-tech developments within the Internet of Things (IoT) have also given athletes and coaches a smart way to improve how the athlete trains and plays, which will contribute towards economic benefit. However, high-jump monitoring systems still use high-speed cameras, which are uneconomical for athletes and coaches. Besides, the camera itself needs technical expertise to set up, and the results are hard to understand and not available in real time. Therefore, the aim of this paper is to develop an IoT-based solution for a real-time monitoring system in the high-jump sport. The OpenHAB mobile application (app) is created to communicate between the wearable device and the server. This mobile app helps athletes monitor their performance on a smartphone through the IoT-based solution rather than using a costly high-speed camera. The outcomes have shown promising results: all data can be visualised and monitored in real time, the training history can be retrieved via the log files, and the benchmark data acts as a guide for intermediate athletes to improve their performance.
chip in  or the ADS1299 chip in ) or commercially available components to build application-specific analog front-ends (e.g. in this work and in ). It is clear that our instrument achieves a noise performance that is significantly better than the noise performance provided by the other three devices. It is important to emphasize here that although two of the devices presented in Table 5 (in [44, 45]) offer lower bandwidth than our device, they still provide an integrated noise value that is higher than the one offered by our instrument. The versatility of our AFE architecture and the real-time wireless biosignal transmission it can offer at 1 kSPS sampling frequency allow the proposed instrument to record a wider variety of biosignals compared to the other systems, with the lowest wireless transmission latency. In addition, it offers the highest input impedance, which allows it to efficiently interface with high-impedance electrodes (e.g. segmented electrodes in DBS), and the highest CMRR value, which can be very useful in applications where large common-mode disturbances (stemming from the application of strong stimulation pulses in simultaneous biosignal recording and stimulation setups) have to be rejected. The remaining features of our device are equally good or comparable to the features of the other wearable and wireless systems presented in Table 5.
The proposed SLA-based inter-cloud operations approach does not use simulation to investigate and evaluate the performance and efficiency of different SLA-aware matchmaking algorithms that support multiple SLA parameters. The SLA-oriented Dynamic Provisioning Algorithm supports the integration of market-based provisioning policies and virtualization technologies for flexible allocation of resources to applications.
Though research on shadow detection and removal has been going on for more than two decades, high accuracy has still not been achieved for complex scenes, making it an ongoing research topic. Polidorio proposed a method in which the difference between saturation and intensity is thresholded in the HSI color space . Huang used the Phong illumination model and thresholded the hue, the green-blue difference and the blue component . Tsai, in his method, considered various color models such as HSI, YCbCr, HCV, HSV and YIQ. He thresholded
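The saturation-minus-intensity thresholding attributed to Polidorio can be sketched as follows; the threshold value and helper name are illustrative assumptions, not taken from the cited work:

```python
import numpy as np

def shadow_mask(rgb, thresh=0.15):
    """Illustrative sketch of Polidorio-style shadow detection:
    threshold the saturation-minus-intensity difference in HSI space.
    `thresh` is a hypothetical value, not from the cited paper."""
    rgb = rgb.astype(np.float64) / 255.0
    intensity = rgb.mean(axis=-1)                 # I = (R+G+B)/3
    minimum = rgb.min(axis=-1)
    # S = 1 - min(R,G,B)/I, guarded against division by zero
    saturation = np.where(intensity > 0,
                          1.0 - minimum / np.maximum(intensity, 1e-9),
                          0.0)
    # Shadow pixels tend to have high saturation relative to intensity.
    return (saturation - intensity) > thresh
```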
After representing each sentence by a vector of features, the identification function can be trained in two different ways. The first is a discriminative approach using well-known techniques such as SVM . In , a mathematical regression model, a genetic algorithm, a probabilistic neural network, a Gaussian mixture model and a back-propagation system for text summarization have been investigated. This approach is a trainable summarizer that considers several features for each sentence, including positive keywords, sentence relative length, sentence position, sentence inclusion of named entities, sentence centrality, negative keywords, sentence resemblance to the title, sentence inclusion of numerical data, the bushy path of the sentence, and combined similarity, to produce summaries. As per Lefever et al. , the challenging feature of this task is that the number of clusters to generate is in general not known in advance, hence the use of a Fuzzy Ants clustering method that does not depend on prior knowledge of the number of clusters to be found in the documents. An analysis of benchmark data sets from the SemEval WePS1 and WePS2 competitions shows that the resulting approach is competitive with the agglomerative clustering method. Gorke et al.  implemented a procedure for clustering static or dynamic graphs. They use minimum cut-trees to determine optimal clusters and present a technique to update this data structure when the graph nodes change. However, tree processing is an expensive operation, and building the initial tree is a global operation; the proposed tree has a high runtime and depends only on the size of the graph.
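The trainable, feature-based summarizer described above can be sketched as a weighted combination of per-sentence features; the feature names and weights below are illustrative placeholders, not the weights any of the cited systems actually learn:

```python
def score_sentence(features, weights=None):
    """Hedged sketch of a feature-based summarizer score: combine
    per-sentence features (keyword presence, position, relative
    length, title similarity, ...) as a weighted sum.  In a trainable
    summarizer the weights would be learned, e.g. by a genetic
    algorithm or regression; here they default to 1.0."""
    if weights is None:
        weights = {k: 1.0 for k in features}
    return sum(weights.get(k, 0.0) * v for k, v in features.items())

s1 = score_sentence({"positive_keyword": 1.0, "position": 0.9,
                     "title_similarity": 0.4, "negative_keyword": -0.5})
```

Sentences would then be ranked by this score and the top-ranked ones kept for the summary.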
Humans are usually difficult to manage in the context of information security. In fact, humans are not very predictable: they do not operate like machines, which behave in the same way, time after time, when the same situation occurs. The human challenge lies in accepting that individuals in the organization have a personal and social identity (i.e. unique attitudes, beliefs and perceptions) that they bring with them to work, as well as the work identity conferred by their role in that organization , . While information security management activities comprise processes and procedures, it
4.4.5. Influence of External Systems on the Quality of Financial Information
External systems have a significant positive effect on the quality of financial information; thus the external system directly and significantly affects that quality. The external system has hardware and software indicators: hardware obtains the lowest loading factor, while software obtains the highest, with a high level of significance. This means that when the software is always kept up to date with the latest version and maintenance of the database systems is performed, the quality of financial information at private polytechnics in East Java will improve. All indicators of the external system exceed the required loading factor, so both hardware and software have a very strong and significant influence on the quality of financial information at private polytechnics in East Java. Hardware and software have developed so quickly that the private polytechnics in East Java must keep both up to date so that the financial information generated becomes more reliable. Thus the results of this study support the research , entitled Key Issues of Accounting Information Quality Management: Australian Case Study, which finds that external factors are a critical determinant of the quality of accounting information. In addition, this study does not support the research , Analysis of factors affecting the quality of local government financial information (an empirical study of the Semarang regency and municipal governments),
The proposed approach addresses the problem of constraint importance for identification using ontology-based constraint prioritization. The Semantic-HAC algorithm prioritizes the semantic constraints of a bug report based on the constructed BEME ontology. The root word includes constraints at various distances in the hierarchy. The proposed approach considers triple-wise constraints, and each constraint has three elements of the bug ontology. A constraint has the highest priority for clustering if two of its elements are direct children of (i.e. at equal distance from) the root element. The constraint priority depends on the shared features between the bug reports; it provides high-bonding constraint information for forming clusters. Initially, the constructed BEME ontology holds the related information of each keyword in the bug report. Hence, the proposed approach exploits the BEME ontology for prioritizing the constraints.
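The priority rule described, highest priority when both elements sit at equal distance directly under the root, can be sketched as a function of the two elements' depths in the ontology; the exact decay formula below is an assumption for illustration, not the paper's definition:

```python
def constraint_priority(depth_a, depth_b):
    """Illustrative priority rule: a triple-wise constraint whose two
    elements are direct children of the root (depth 1, equal distance)
    gets the highest priority; priority decays with depth and with a
    distance mismatch.  The scoring formula is an assumption, not the
    Semantic-HAC paper's exact definition."""
    if depth_a == 1 and depth_b == 1:
        return 1.0                       # both direct children of root
    penalty = abs(depth_a - depth_b)     # unequal distances bond less
    return 1.0 / (max(depth_a, depth_b) + penalty)

p_top = constraint_priority(1, 1)
p_deep = constraint_priority(3, 2)
```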
Microstrip patch antennas are widely used nowadays because of several advantages, such as compact size, ease of fabrication and low cost. A patch antenna is a narrowband, wide-beam antenna fabricated by etching the antenna element pattern in a metal trace bonded to an insulating dielectric substrate, such as a printed circuit board, with a continuous layer of metal bonded to the opposite side of the substrate forming a ground plane. The most commonly used microstrip antenna is the rectangular patch, because of its ease of analysis and fabrication. The major advantage of a rectangular patch is that low cross-polarization can be achieved. These microstrip patch antennas also have some operational disadvantages, such as low efficiency, narrow bandwidth and low power-handling capacity. The substrate used is RT-Duroid (relative permittivity = 2.94, loss tangent = 0.0012). It has several advantages, including a low dielectric constant, low loss tangent and low thermal coefficient.
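For the RT-Duroid substrate quoted above (eps_r = 2.94), the patch dimensions follow from the standard transmission-line design equations; the design frequency and substrate height below are illustrative assumptions, not values from the text:

```python
import math

C = 3e8  # speed of light, m/s

def rect_patch_dimensions(f0, eps_r=2.94, h=0.8e-3):
    """Standard transmission-line design equations for a rectangular
    microstrip patch.  eps_r matches the RT-Duroid substrate in the
    text; f0 and substrate height h are illustrative assumptions."""
    # Patch width for efficient radiation
    w = C / (2 * f0) * math.sqrt(2 / (eps_r + 1))
    # Effective permittivity accounting for fringing fields
    eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 / math.sqrt(1 + 12 * h / w)
    # Length extension due to fringing
    dl = 0.412 * h * ((eps_eff + 0.3) * (w / h + 0.264)
                      / ((eps_eff - 0.258) * (w / h + 0.8)))
    # Physical patch length
    l = C / (2 * f0 * math.sqrt(eps_eff)) - 2 * dl
    return w, l

w, l = rect_patch_dimensions(10e9)  # assumed 10 GHz design frequency
```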
clarification of the objects and deletion of the abnormal points; consequently, a higher classification percentage is linked with a higher processing time of about 0.6 seconds. When the block size increases beyond 5 or 7, the average value for a block of (7*7 = 49 pixels) is comparatively large, so some distortion occurs in the essential features of the image, and both the classification percentage and the execution time decrease. Whenever the block size increases, the distortion increases and the execution time decreases. If the block size becomes very large, (19*19), (21*21) or (23*23), a huge distortion occurs in the image; therefore all human bodies will be classified as non-human, since the classifiers will consider all incoming objects as non-human. Finally, the percentage becomes 50% because there are only two classes.
This paper proposes a new approach that extends the limitation of the previous algorithm by enabling it to generate a concurrent plan. The proposed algorithm, based on the Hierarchical Task Network (HTN), enhances the SHOP2 planning system to detect and generate a concurrent plan from the output of SHOP2 (a sequential plan). To trigger concurrent planning, allocation of resources based on web service inputs is used. The inputs (resources) are compared with the SHOP2 operators, and a concurrent plan is initialized if the input instances match a SHOP2 operator. To evaluate our approach, we perform two experiments using pathway information retrieval and the logistics dataset from the SHOP2 benchmark problems. The pathway information retrieval results show that this approach is able to find and generate a concurrent plan, but it takes longer computational time. Meanwhile, on the logistics dataset, the proposed algorithm handles concurrent tasks efficiently, reducing cost through some pruned operators. Therefore, in future work, we intend to examine the approach on other complex bioinformatics and systems biology workflows that widely use web services as their analysis tools.
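The idea of turning a sequential plan into concurrent steps when operator inputs do not conflict can be sketched as follows; the plan representation and operator names are illustrative, not SHOP2's actual output format:

```python
def to_concurrent(plan):
    """Hedged sketch of the described idea: walk a SHOP2-style
    sequential plan and group consecutive operators into one
    concurrent step whenever their input resources do not overlap.
    `plan` is a list of (operator_name, set_of_input_resources)
    pairs; this representation is an illustrative assumption."""
    groups, current, used = [], [], set()
    for op, inputs in plan:
        if used & inputs:          # resource conflict -> start new step
            groups.append(current)
            current, used = [], set()
        current.append(op)
        used |= inputs
    if current:
        groups.append(current)
    return groups

plan = [("load-truck", {"truck1"}), ("load-plane", {"plane1"}),
        ("drive", {"truck1"})]
concurrent = to_concurrent(plan)
```

Here the two loading operators touch disjoint resources and so run concurrently, while the drive operator reuses truck1 and must wait.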
problem-solving feature (group assignment), and support features (group discussion, communication channel with friends) have a significant effect on identifying the learning preferences of the interpersonal learning style. The findings also show that the significant learning patterns covered temporal behaviour with the introduction, example, try-it-by-yourself, video tour and group discussion features, and navigation behaviour with the group assignment, try-it-by-yourself, introduction and communicate-with-friends features. This means that interpersonal learners tend to spend more time and more behavioural (navigation) interactions on the practical learning methods, reading/reviewing the examples and practicing what they are taught via special editors or simulation tools. Furthermore, the results show that interpersonal learners tend to watch videos that take a tour of the whole system rather than read details/explanations. The group discussion feature, which lets friends collaborate with each other, is a significant feature that identifies the learning preferences of interpersonal learners. In essence, it is found that interpersonal learners adopt different WBES design features, classified according to the main system components (Figure 8) and listed as follows:
eLearning is a dynamic process rather than a static one; over time it has been redefined by technological developments. Open-source platforms for educational purposes appeared more than 15 years ago, but only recently have they been seen as a viable alternative to proprietary software. Rather, these platforms are frequently being modified by new demands in both technical and pedagogical aspects . Web-based learning resources are an alternative to meet the expectations and needs of students in line with current learning styles . eLearning moved towards automation and administration in the form of the Learning Management System, which has been supported by the Technology Acceptance Model along with Self-Directed Learning. Social networking sites have adopted this learning system by default, in that sharing resources and posting comments in online forums support the practice of collaborative learning.
Our proposed model loads real-time data received from the wireless sensors together with the historical data of the area to produce a first forecast for that new area. The model examines the historical data, keeping only the cases where floods occurred, to calculate the coefficients (a, b, c) used to produce the forecast and warning.
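Fitting the coefficients (a, b, c) from the flood-only history can be sketched as a least-squares fit; the quadratic form, variable names and synthetic data below are illustrative assumptions, since the text does not give the exact model:

```python
import numpy as np

def fit_flood_model(levels, times):
    """Fit coefficients (a, b, c) of a quadratic trend to sensor
    readings from past flood events, as a hedged illustration of the
    forecasting step described.  The quadratic form is an assumption;
    the paper does not specify the exact model."""
    a, b, c = np.polyfit(times, levels, deg=2)
    return a, b, c

# Synthetic "flood-only" history: level = 0.5*t^2 + 1.0*t + 2.0
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 0.5 * t**2 + 1.0 * t + 2.0
a, b, c = fit_flood_model(y, t)
```

The fitted polynomial a*t^2 + b*t + c would then be evaluated on incoming real-time readings to raise a warning when the predicted level crosses a flood threshold.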
The subject of transformer maintenance is important, and it should be considered both in economic terms and for the continuity of transformer operation at high efficiency. High-power transformers work more reliably when periodic, repeated maintenance procedures are performed on the system. In particular, oil-immersed transformers need more attention than dry transformers, as chemical reactions in the insulating oil can cause it to lose some of its chemical properties and efficiency .
A number of experiments are performed to evaluate the performance of the proposed watermarking algorithm on different 512x512 gray-scale images, such as mandrill, boxer, boat, man, lighthouse, goldhill, lena and barbara, using the MATLAB platform. The results for the gray-scale host images mandrill, boxer, boat and lighthouse, along with the watermark logo used, are shown in Fig. 3, Fig. 4, Fig. 5, Fig. 6 and Fig. 7 respectively. Fig. 3 (a)-(b) compare the first host image and its watermarked image, Fig. 4 (a)-(b) compare the second host image and its watermarked image, Fig. 5 (a)-(b) the third, Fig. 6 (a)-(b) the fourth, and Fig. 7 (a)-(b) the fifth, while Fig. 8 (a) and (b) present the first and second watermark logos.
Each service has many variation points . We have categorized these variation points into two categories: FunctionalVariation and TechnicalVariation. The former is divided into three classes: Logic, for the variation of the business logic; Interface, for the variation of the method signatures and the methods offered by the service; and SchedulingStrategy, for the variation of the invocation sequence of the services used by a process. TechnicalVariation is divided into three classes: Locality, for the machine that hosts the service; GraphicalPresentation, meaning that the graphical interface is adaptable to the device used by the service requester; and Persistence, for the variation of the schema or physical representation of persistent data. The Persistence attribute "type" specifies the way in which the data will be persisted (relational, XML, etc.). Also, SchedulingStrategy is specialized into two sub-strategies: SequentialSchedulingStrategy and ParallelSchedulingStrategy.
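The taxonomy above maps directly onto a class hierarchy; the sketch below follows the class names given in the text, while the attributes and defaults are illustrative assumptions:

```python
from dataclasses import dataclass

# Class names follow the taxonomy in the text; attributes are assumptions.
@dataclass
class VariationPoint:
    name: str

class FunctionalVariation(VariationPoint): pass
class TechnicalVariation(VariationPoint): pass

# Functional variations
class Logic(FunctionalVariation): pass
class Interface(FunctionalVariation): pass
class SchedulingStrategy(FunctionalVariation): pass
class SequentialSchedulingStrategy(SchedulingStrategy): pass
class ParallelSchedulingStrategy(SchedulingStrategy): pass

# Technical variations
class Locality(TechnicalVariation): pass
class GraphicalPresentation(TechnicalVariation): pass

@dataclass
class Persistence(TechnicalVariation):
    # "type" attribute: how the data is persisted (relational, XML, ...)
    type: str = "relational"
```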
Network-induced time delays can be constant, varying, or even random, depending on the medium access control (MAC) protocol of the network. Networks used for control with CSMA protocols include DeviceNet and Ethernet [1-2]. A node on a carrier sense multiple access (CSMA) network monitors the network before each transmission. When the network is idle, it begins transmitting immediately; otherwise, it waits until the network is no longer busy. When two or more nodes try to transmit simultaneously, a collision occurs, and the way the collision is resolved is protocol dependent. DeviceNet, which is a controller area network (CAN), uses CSMA with bitwise arbitration (CSMA/BA). Since CAN messages are prioritized, the message with the highest priority is transmitted without interruption when a collision occurs, while transmission of the lower-priority message is terminated and retried when the network is idle. Ethernet employs CSMA with collision detection (CSMA/CD). When there is a collision, all of the affected nodes back off, wait for a random number of time slots, and retransmit. Packets on these networks are affected by random delays, and the worst-case transmission time of a packet is unbounded; therefore, CSMA networks are generally considered nondeterministic. However, if network messages are prioritized, higher-priority messages have a better chance of timely transmission.
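The random CSMA/CD backoff that makes these delays unbounded in the worst case can be sketched as the classic truncated binary exponential backoff; the slot-count cap of 10 follows classic Ethernet, and the seeded generator is only for reproducibility here:

```python
import random

def backoff_slots(attempt, rng=random.Random(0)):
    """Truncated binary exponential backoff as used by CSMA/CD
    Ethernet after a collision: wait a random number of slot times
    drawn uniformly from [0, 2**min(attempt, 10) - 1].  The cap of
    10 follows classic Ethernet; the rng seed is for reproducibility."""
    k = min(attempt, 10)
    return rng.randrange(2 ** k)
```

Because each retry draws a fresh random slot count, the total delay of a repeatedly colliding packet has no deterministic bound, which is exactly why CSMA networks are considered nondeterministic.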
Nowadays, images ranging from satellite images to medical diagnostic images are stored on our computers. To get these images onto a computer, they must be transmitted over phone lines or other cables. Larger images take longer to compress and require more storage space. A common characteristic of most images is that neighboring pixels are correlated and therefore contain redundant information. Hence, to remove that redundant information, we must find a less correlated representation of the image. The two main components of compression are redundancy reduction and irrelevancy reduction. Redundancy reduction eliminates duplication within an image or video, while irrelevancy reduction eliminates the parts of the image or video that will not be noticed by the signal receiver, namely the Human Visual System (HVS).
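The neighboring-pixel correlation that compression exploits can be measured directly; the sketch below computes the correlation between each pixel and its right-hand neighbour on two synthetic images (a smooth gradient standing in for a natural image, and pure noise):

```python
import numpy as np

def horizontal_correlation(img):
    """Correlation coefficient between each pixel and its right-hand
    neighbour, illustrating the redundancy that compression removes.
    A smooth, natural-like image scores near 1; random noise near 0."""
    a = img[:, :-1].ravel().astype(np.float64)
    b = img[:, 1:].ravel().astype(np.float64)
    return float(np.corrcoef(a, b)[0, 1])

smooth = np.tile(np.arange(64), (64, 1))                    # smooth gradient
noise = np.random.default_rng(0).integers(0, 256, (64, 64)) # uncorrelated
```

The high correlation of the smooth image is precisely the duplication that redundancy reduction removes, e.g. by predicting each pixel from its neighbour and encoding only the residual.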