This paper presents a comparative study of the spectrum slicing techniques reported over the last few years. Different slicing methods based on WDM and AWG are explained. The crosstalk produced in conventional methods can be eliminated by using the spectrum slicing technique. The bit error rate (BER) and signal-to-noise ratio (SNR) of a spectrum-sliced network were analyzed and compared with those of other systems. Most of the reviewed methods use lasers as the light source for spectrum-sliced optical communication; using more energy-efficient light sources results in lower power consumption for the system. This spectrum slicing technique can therefore be incorporated into any optical system where low power consumption is preferred.
In this paper, various causes of cervical cancer are discussed, followed by the methods and algorithms used for detecting cervical cancer cells at an early stage. The common problems identified in most of the research are noise, over-segmentation, poor optimal solutions, unsuitability for overlapping cells, and classification issues. Research on classifying cervical cancer is not bound to a single problem; the complexity of cell behaviour needs to be studied in detail. This will help in the successful prediction of cervical cancer at its early stages.
Age estimation provides a valuable tool in forensic cases. Various methods, such as morphological age, secondary sexual characters, skeletal age indicators, and dental age indicators, are used in forensic investigations as well as in medicolegal cases. Morphological age and secondary sexual characters are not reliable factors for age estimation, whereas skeletal and dental age indicators provide valuable information. This article explains the different methods of age estimation.
Expansion of urban populations and increased coverage of domestic water supply and sewerage give rise to larger amounts of municipal wastewater. With the current stress on environmental health and water pollution concerns, there is an increasing awareness of the need to dispose of these wastewaters securely and beneficially. The use of wastewater in agriculture deserves particular attention where it would otherwise be disposed of in arid and semi-arid regions. The quantity of wastewater available in most countries will account for only a small fraction of the total irrigation water requirement, but wastewater use will result in the conservation of higher-quality water, since the cost of supplying good-quality water is generally higher in water-short areas. Properly planned use of municipal wastewater alleviates surface water pollution problems; it not only preserves valuable water resources but also takes advantage of the nutrients contained in sewage to grow crops. The nitrogen and phosphorus content of sewage reduces the requirement for commercial fertilizers. It is beneficial to consider effluent reuse at the same time as wastewater collection, treatment, and disposal are planned, so that the sewerage system design can be improved in terms of effluent transport and treatment methods.
Many methods of emotion recognition have been proposed throughout the past thirty years. Emotion recognition is difficult, but at the same time fascinating enough to attract researchers with different backgrounds: computer vision, pattern recognition, psychology, neural networks, and graphics. It is due to this fact that the literature on emotion recognition is vast and varied. Often, one system combines techniques driven by totally different principles. The use of a mix of techniques makes it difficult to classify these systems based strictly on the kinds of techniques they use for feature classification. To obtain a clear, high-level categorization, we instead follow a guideline suggested by the psychological study of how humans use holistic and local features. Recent approaches to facial feature detection are the template-based method and the feature-based method.
A fingerprint-based gender identification system takes digital fingerprint images as its input; each image is transformed into the frequency domain, compared with predetermined thresholds, and finally a gender is declared. Fig. 3 shows the block diagram of the proposed gender identification system based on frequency domain analysis of fingerprints. A fingerprint image from the database is applied as input to the system; the fundamental frequencies of various transforms are then obtained and used for gender classification. Threshold setting can be done manually by analysing the sample data. The proposed method can be implemented using MATLAB.
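As a minimal sketch of the frequency-domain step (in Python rather than MATLAB, using synthetic 1-D ridge profiles and a purely hypothetical decision threshold, since the actual thresholds come from analysing sample data), the pipeline might look like:

```python
import numpy as np

def fundamental_frequency(profile):
    """Dominant frequency (cycles/sample) of a 1-D ridge profile via the FFT."""
    spectrum = np.abs(np.fft.rfft(profile - np.mean(profile)))
    freqs = np.fft.rfftfreq(len(profile))
    return freqs[np.argmax(spectrum)]

def classify_gender(profile, threshold=0.11):
    """Hypothetical rule: finer (higher-frequency) ridges are classed female."""
    return "female" if fundamental_frequency(profile) > threshold else "male"

# Synthetic ridge profiles standing in for rows of a fingerprint image.
x = np.arange(256)
coarse_ridges = np.sin(2 * np.pi * 0.08 * x)  # lower ridge density
fine_ridges = np.sin(2 * np.pi * 0.14 * x)    # higher ridge density
```

A real system would extract such profiles from fingerprint images and calibrate the threshold separately for each transform used.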
Time-to-digital conversion (TDC) techniques have been widely used in various applications, e.g., time-of-flight (TOF) measurement, clock and data recovery, and laser range finding. This trend derives from the scaling of analog and mixed-signal circuits in deep-submicron technology: while the voltage level decreases, the noise does not scale, so the signal-to-noise ratio (SNR) decreases. Analog performance degrades remarkably below the 100 nm technology node, yet most applications have demanding requirements in terms of resolution and power.
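To make the resolution issue concrete, here is a toy model of the simplest counter-based TDC; the clock period and interval are illustrative values, not taken from any cited design:

```python
def tdc_counter(start, stop, clock_period):
    """Counter-based TDC model: count whole clock periods between start and
    stop events; the resolution (quantization step) is one clock period."""
    ticks = int((stop - start) // clock_period)
    return ticks, ticks * clock_period

# Measure a 10.3 ns interval with a 1 ns reference clock.
ticks, measured = tdc_counter(0.0, 10.3, 1.0)  # the 0.3 ns residue is lost
```

Finer resolution than the clock period is what delay-line and Vernier TDC architectures provide, at the cost of calibration effort.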
The development of information technology has generated a large number of databases and huge volumes of data in various areas. Research in databases and information technology has given rise to approaches for storing and manipulating this precious data for further decision making. Data mining is a process of extracting useful information and patterns from huge data; it is also called the knowledge discovery process, knowledge mining from data, knowledge extraction, or data/pattern analysis. Generating information requires a massive collection of data, which can range from simple numerical figures and text documents to more complex information such as spatial data, multimedia data, and hypertext documents. To take complete advantage of data, retrieval alone is not enough: tools are required for automatic summarization of data, extraction of the essence of the information stored, and discovery of patterns in raw data. With the enormous amount of data stored in files, databases, and other repositories, it is increasingly important to develop powerful tools for the analysis and interpretation of such data and for the extraction of interesting knowledge that could help in decision making. The answer to all of the above is data mining. Data mining is the extraction of hidden predictive information from large databases; it is a powerful technology with great potential to help organizations focus on the most important information in their data warehouses (Fayyad 1996). Data mining tools predict future trends and behaviors, helping organizations make proactive, knowledge-driven decisions (Fayyad 1996). The automated, prospective analyses offered by data mining move beyond the analyses of past events provided by the retrospective tools typical of decision support systems. Data mining tools can answer questions that traditionally were too time-consuming to resolve: they scour databases for hidden patterns, finding predictive information that experts may miss because it lies outside their expectations.
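As one concrete instance of pattern discovery (illustrative only; the text does not commit to a specific algorithm), counting frequently co-occurring item pairs over toy transactions captures the flavor of association-pattern mining:

```python
from collections import Counter
from itertools import combinations

def frequent_pairs(transactions, min_support=2):
    """Keep item pairs that co-occur in at least `min_support` transactions,
    the simplest form of association-pattern mining."""
    counts = Counter()
    for t in transactions:
        counts.update(combinations(sorted(set(t)), 2))
    return {pair: c for pair, c in counts.items() if c >= min_support}

baskets = [["milk", "bread", "eggs"],
           ["milk", "bread"],
           ["bread", "eggs"],
           ["milk", "eggs"]]
patterns = frequent_pairs(baskets)  # every pair here co-occurs twice
```

Full algorithms such as Apriori extend this idea to itemsets of arbitrary size by pruning candidates that contain an infrequent subset.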
Movement detection is a technology for detecting changes in the surroundings relative to an object. Security systems in use nowadays are not smart enough to provide real-time notification after sensing a problem. To overcome this, sensor-based applications can be used to view the activity and deliver notifications when movement is detected, saving both time and cost. This paper surveys various currently available techniques for movement detection, based on different previously proposed motion detection systems.
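A minimal sketch of one common movement detection technique, frame differencing, on synthetic grayscale frames (both thresholds are illustrative assumptions, not values from any surveyed system):

```python
import numpy as np

def detect_motion(prev_frame, curr_frame, diff_thresh=25, pixel_frac=0.01):
    """Flag motion when enough pixels change between consecutive frames."""
    # Cast to int so the uint8 subtraction cannot wrap around.
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    changed = np.count_nonzero(diff > diff_thresh)
    return changed > pixel_frac * diff.size

# Two synthetic 8-bit frames; a bright object appears in the second.
frame1 = np.zeros((64, 64), dtype=np.uint8)
frame2 = frame1.copy()
frame2[10:30, 10:30] = 200  # "moving" object
```

A deployed system would add background modelling and debouncing so that lighting changes and sensor noise do not trigger notifications.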
The transportation problem is considered a vitally important problem that has been studied across a wide range of operations research domains. As such, it has been used to model several real-life problems. The main objective of transportation problem solution methods is to minimize the cost or the time of transportation.
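As an illustration, the least-cost method below constructs an initial feasible shipping plan for a small instance with hypothetical supplies, demands, and unit costs; it allocates greedily and would normally be refined toward optimality by a method such as MODI or stepping-stone:

```python
def least_cost_allocation(supply, demand, cost):
    """Least-cost method: repeatedly fill the cheapest remaining cell."""
    supply, demand = supply[:], demand[:]
    alloc = [[0] * len(demand) for _ in supply]
    cells = sorted((cost[i][j], i, j)
                   for i in range(len(supply)) for j in range(len(demand)))
    for c, i, j in cells:
        qty = min(supply[i], demand[j])  # ship as much as both sides allow
        alloc[i][j] = qty
        supply[i] -= qty
        demand[j] -= qty
    return alloc

supply = [20, 30]           # units available at each source
demand = [10, 25, 15]       # units required at each destination
cost = [[8, 6, 10],         # hypothetical per-unit shipping costs
        [9, 12, 13]]
plan = least_cost_allocation(supply, demand, cost)
total = sum(cost[i][j] * plan[i][j] for i in range(2) for j in range(3))
```

The same balanced instance could equally be solved exactly as a linear program, since the transportation problem is a special-structure LP.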
In the last few years there have been great successes in the application of deep and machine learning for both object detection and classification. However, when only a limited amount of data is available for many different classes, accuracy is low and decent results often cannot be obtained. This research aims to show various case-specific methods of analysing the data and extracting important features to improve classification, such as the Hough transform and mean shift segmentation. A convolutional neural network, AlexNet, has been trained using both the raw data and the extracted features. When training and validating the network on the raw data, an accuracy of 28% was obtained. When applying the extracted features (the handles of the kitchen) to the same network, accuracy improved from 28% to 41%. This increase of thirteen percentage points shows that significant improvement is possible when extracting features before training a network.
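A minimal Hough line transform, one of the feature extractors mentioned, can be sketched on a synthetic edge map (this is a generic textbook implementation, not the code used in the research):

```python
import numpy as np

def hough_lines(edge_img, n_theta=180):
    """Minimal Hough line transform: each edge pixel votes for every
    (rho, theta) line it could lie on; peaks in the accumulator are lines."""
    h, w = edge_img.shape
    diag = int(np.ceil(np.hypot(h, w)))      # largest possible |rho|
    thetas = np.deg2rad(np.arange(n_theta))
    acc = np.zeros((2 * diag, n_theta), dtype=int)
    ys, xs = np.nonzero(edge_img)
    for x, y in zip(xs, ys):
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1
    return acc

# Synthetic edge map containing one horizontal line at y = 25.
img = np.zeros((50, 50), dtype=np.uint8)
img[25, :] = 1
acc = hough_lines(img)
rho_idx, theta_idx = np.unravel_index(np.argmax(acc), acc.shape)
```

The peak lands at theta = 90 degrees, the parameterization of a horizontal line; feeding such detected structures to a network is one way to hand-engineer features when training data is scarce.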
The physical mixtures (PM) of the drug with different carriers were prepared by the blending method. The physical mixtures were prepared at various drug-to-carrier ratios with all the carriers, in increasing order of carrier amount. The carriers used were sodium starch glycolate, crospovidone, and croscarmellose sodium.
The TSAF uses multiple filters, each adapted for filtering a particular portion of the interval between regeneration times, and is suitable for tracking signals whose statistical properties recur at various points in time. The filters are trained and the filter weights are obtained via an adaptive algorithm. Fig. 3 shows the conceptual realization of the TSAF. The averaged TSAF is shown in Fig. 4 together with the ensemble average (EA) computed over 2000 trials. The TSAF and EA outputs are very similar, with a correlation coefficient of 0.983, and the measurement time is greatly reduced using the TSAF. The tracking ability of the TSAF makes it possible for the clinician to observe the signal variation trace in every single ensemble.
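The adaptive algorithm is not specified here, but a common choice for obtaining filter weights is least mean squares (LMS); the sketch below adapts a single filter to track a synthetic delayed-and-scaled signal (all parameters are illustrative, and a TSAF would run one such filter per time slot):

```python
import numpy as np

def lms_filter(x, d, n_taps=4, mu=0.05):
    """LMS adaptation: weights are nudged so the filter output tracks d."""
    w = np.zeros(n_taps)
    y = np.zeros(len(x))
    for n in range(n_taps, len(x)):
        u = x[n - n_taps:n][::-1]   # most recent samples first
        y[n] = w @ u                # filter output
        e = d[n] - y[n]             # instantaneous error
        w += 2 * mu * e * u         # gradient-descent weight update
    return y, w

rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
d = 0.6 * np.roll(x, 1)            # desired signal: delayed, scaled input
y, w = lms_filter(x, d)            # w converges toward [0.6, 0, 0, 0]
```

Because the desired signal is exactly realizable by one tap, the learned weight vector converges to the true response, mirroring how each TSAF sub-filter settles on the statistics of its own portion of the interval.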
ABSTRACT: Supply Chain Management is all about managing the "flow" of materials and information among the respective departments in an industry. The key elements of the supply chain are people and processes; in fact, Supply Chain Management (SCM) is all about managing people and processes to ensure fulfillment of customer needs and desires. Whether it is procurement, production planning, manufacturing, inventory management, distribution, warehousing, waste management or logistics (including reverse logistics), it is absolutely imperative that a people-and-process focus helps achieve customer results. Thus, if Supply Chain Management is all about people and processes, there cannot be any better improvement model than Total Quality Management (TQM), which focuses on people and processes. The integration of Total Quality Management principles with Supply Chain Management would be a significant enabler for sigma-level improvements in Supply Chain Management performance. Thus, with these factors in mind, a general model is prepared in this thesis to understand the total production plan of any industry and to improve the industry's existing model by the application of various improvement methods, such as Six Sigma, Lean Management, Taguchi's Method, Quality Circles, Decision Theory, Method Study, Time Study, Market Survey, and Operational Research methods, in order to develop a model that gives optimal production in minimum time and increases the entire productivity of the organization with customer satisfaction.
A blacklist, as the name suggests, is a list of sites considered malicious, gathered through techniques such as user votes. Whenever a new website is visited, the browser checks whether it appears on a blacklist; if it does, the browser warns the user to stop sending personal information such as IDs and bank account numbers. Notably, the blacklist can be stored on the user's computer or, optionally, on a server queried whenever a URL is requested. Blacklists are compiled and updated at different frequencies: an estimated 50-80% of phishing URLs appear in blacklists 12 hours after their launch, while other blacklists, such as Google's, need on average 7 hours for an update. It is therefore understood that a blacklist must be updated promptly in the interest of users' safety, so that they do not become victims of phishing. The blacklist approach underlies various solutions; one of the most important is Google's Safe Browsing, in which files of predefined phishing URLs are used to trace fraudulent URLs. A different technique is followed by the protection built into Microsoft IE9, which works against phishing, and by the SiteAdvisor tool; these are database-backed solutions created to detect illegitimate attacks such as Trojan horses and spyware. They have crawlers that operate automatically, browse websites, establish threats, and rate the level of threat associated with an entered URL. Nevertheless, site advisors cannot locate or identify newly created threats. A third anti-phishing tool, from VeriSign, traces numerous websites and can recognize "clones", so that illegal websites can be identified. There is always a competition between attackers and users, so these approaches are not foolproof.
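A minimal sketch of the blacklist lookup itself, with hypothetical domain entries (production systems such as Google Safe Browsing instead distribute hashed URL prefixes and query a server):

```python
from urllib.parse import urlsplit

BLACKLIST = {"phish.example", "evil.test"}  # hypothetical blacklist entries

def is_blacklisted(url):
    """Warn when the URL's host, or any parent domain of it, is blacklisted."""
    host = (urlsplit(url).hostname or "").lower()
    parts = host.split(".")
    # login.phish.example matches the blacklisted parent phish.example.
    return any(".".join(parts[i:]) in BLACKLIST for i in range(len(parts)))
```

Checking parent domains is what lets one blacklist entry cover the throwaway subdomains phishers typically generate.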
There is a technique, titled Netcraft, implemented as a comparatively small process that runs in a web browser. It relies on a blacklist of known illegitimate websites, supplemented by sites reported by users and verified by Netcraft. Netcraft clearly shows the location of the server where a webpage is hosted; experienced users find this very useful in operation, since, to cite an example, a webpage with "ac.uk" in its address should be hosted nowhere other than the UK.
To mount an attack without being known to the enemy, the offender spoofs the source IP addresses by using the intermediate victims' IPs. UDP-based and TCP protocols can be used to mount the flood that hits the victim. To perform this simulation, several software packages and tools are used (see Table 1). The operating system and the OpenStack software are installed inside VirtualBox, and all of the monitoring and mitigation tools are installed on Ubuntu to monitor and mitigate the attacks. The performance of the proposed dynamic resource allocation technique for DDoS mitigation in a cloud is evaluated from numerous perspectives: first the performance for non-attack scenarios is studied, then the performance of the proposed mitigation technique against an ongoing DDoS attack is investigated, and finally the cost of the proposed mitigation methods is estimated. The effectiveness, as well as the impact, of the presented e-DDoS strategy is analyzed on a very simple private cloud testbed; the achieved results, properly scaled, can be used to estimate its effects on large-scale cloud infrastructures. When the auto-scaling condition is satisfied, a new VM is deployed in the cloud and the workload is distributed between the two VMs (by means of the load-balancing proxy); the CPU usage on the first VM decreases, while the CPU usage of the second VM increases. Meanwhile, the attack strength is continuously increased; hence, at the end of the time window in Fig. 3, the CPUs of both of the already active VMs are fully exhausted, and a new VM is scheduled to be deployed.
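The auto-scaling step can be caricatured as follows (a toy model with an assumed CPU threshold and ideal load balancing, not the paper's implementation):

```python
def autoscale(cpu_loads, threshold=0.8):
    """If mean CPU across the active VMs exceeds the threshold, deploy one
    more VM and share the total workload evenly (ideal load-balancing proxy)."""
    if sum(cpu_loads) / len(cpu_loads) > threshold:
        n = len(cpu_loads) + 1
        return [sum(cpu_loads) / n] * n
    return cpu_loads

scaled = autoscale([0.90, 0.95])  # both VMs near exhaustion -> third VM added
```

Under a growing attack this rule keeps adding VMs, which is precisely why estimating the monetary cost of mitigation, as done in the evaluation, matters.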
In this article we investigate and demonstrate various techniques for the synthesis of benzimidazole derivatives. A portion of these strategies are simple and inexpensive and can be carried out under ordinary laboratory conditions; on the other side, there are high-cost strategies, which need expensive raw materials, high temperatures, and long reaction times. The present review thus covers techniques for the synthesis of benzimidazole derivatives, including the contrasts between them. One of the most significant procedures in the preparation of benzimidazole derivatives is known as the Van Leusen strategy, which reacts aldimines with tosylmethyl isocyanide (TosMIC). The reaction was expanded in two steps and named the Van Leusen three-component reaction (VL-3CR).
Colloidal silica is a stable dispersion of solid silica particles. It has found various applications such as investment casting, semiconductor wafer polishing, coating, and textiles, and it has also been used as an inorganic binder, a nano-size filler, and a catalyst precursor [1,2]. Colloidal silica can be prepared by various methods and starting materials, as listed in Table 1. These methods include ion exchange [3-5], neutralization or electrodialysis of aqueous silicates, hydrolysis and condensation of silane, peptization or milling of silica gel or powder, and direct oxidation of silicon [8-10].