In this chapter, a new concept termed the non-stationary fuzzy set is defined. Non-stationary fuzzy sets have been created with the specific intention of modelling the variation (over time) of opinion, and they formalise the concept previously proposed by Garibaldi to model the variation in expert opinion. While apparently similar to type-2 fuzzy sets in some regards, non-stationary fuzzy sets possess some important distinguishing features. A non-stationary fuzzy set is, effectively, a collection of type-1 fuzzy sets in which there is an explicit, defined relationship between the fuzzy sets. Specifically, each of the instantiations (type-1 fuzzy sets) is derived by a perturbation of (making a small change to) a single underlying membership function. While each instantiation is somewhat reminiscent of an embedded type-1 set of a type-2 fuzzy set, there is no direct correspondence between these two concepts. It is also possible to view a standard type-1 fuzzy set either as a single instantiation, or as repeated instantiations of the underlying set with no perturbation. Again, a non-stationary fuzzy set does not have a secondary membership function; hence, there is no direct equivalent to the embedded type-2 sets of a type-2 fuzzy set. Similarly, there are no secondary grades. While distributions of membership grades do arise across 'vertical slices', these are still not, formally, the same as secondary membership functions, and the inference process is quite different. The crucial point is that, at any instant of time, a non-stationary fuzzy set is (instantiates) a type-1 fuzzy set. Hence, non-stationary inference is just repeated type-1 inference (albeit with slightly different type-1 sets at each time instant). In contrast, type-2 inference involves passing type-2 fuzzy sets through the process, resulting in type-2 output sets that require type reduction prior to defuzzification.
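To make the perturbation mechanism concrete, the following is a minimal sketch (illustrative only, not code from this chapter): it assumes a Gaussian underlying membership function whose centre receives a small random perturbation at each time instant; the function names, the perturbation scale, and the choice of perturbing the centre rather than the width are all assumptions made for illustration.

```python
import numpy as np

def gaussian_mf(x, centre, sigma):
    """Type-1 Gaussian membership function."""
    return np.exp(-0.5 * ((x - centre) / sigma) ** 2)

def instantiate(x, centre, sigma, rng, scale=0.05):
    # One instantiation of the non-stationary set: perturb the
    # underlying centre slightly, yielding an ordinary type-1
    # fuzzy set for this time instant.
    return gaussian_mf(x, centre + rng.normal(0.0, scale), sigma)

rng = np.random.default_rng(42)
x = np.linspace(0.0, 1.0, 101)

# Ten instantiations over time; each row is a type-1 fuzzy set.
instantiations = np.array(
    [instantiate(x, 0.5, 0.1, rng) for _ in range(10)]
)
print(instantiations.shape)  # (10, 101)
```

Each row of `instantiations` is an ordinary type-1 fuzzy set, which is why inference with a non-stationary set reduces to repeated type-1 inference, with no type reduction step.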
techniques". Filippo Piccinini et al.  described about "Advanced Cell ClassiïnˇA˛er, the graphical software package for phenotypic analysis". Boukaye et al.  discussed "the remote sensing satellite data processing by employing data mining procedures to discover the risk places of epidemic disease". S. Bandaru et al.  has given "some significant existing data mining procedures and categorizes them by methodology along with the type of knowledge discovered". S. Sengupta  has used "the concepts of particle swarm optimization technique and ARM to design a rule based classification system". Linear discriminate analysis(LDA) is mostly utilized in discriminate analysis for the prediction of the class. Jen et al.  employed "the LDA" in their respective work. Decision tree phenomenon is utilized by C T Su et al.  and R. Armaanzas . Yeh et al. , Fei 2010 and Abdi and Giveki  have employed "the swarm intelligence method to design their diagnosis model". W.C.Yeh , S.W.Fei  and Aahan et al.  used "k nearest neigh- bour in their predictive models". P.J.Garca et al.  and Samanta et al.  adopted "LR in their respective research work". Garca et al. , Zheng et al.  also utilized compelling method in their model for intent of medical diagnosis.
3.3 Singular Value Decomposition (SVD)

SVD is a long-established numerical analysis technique that serves many applications. Beltrami and Jordan introduced the SVD in the 1870s and used it for real square matrices. In 1902, Autonne extended it to complex matrices. Later, in 1939, Eckart and Young improved the SVD to include rectangular matrices, as cited. More recently, the SVD has become one of the most important numerical techniques used in image processing applications, such as image watermarking, image hiding, image compression and noise reduction. SVD can be applied to any medium, but it is most prominent in image processing. The widespread use of SVD is due to its important features and characteristics, which are the following:
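Before listing these features, a brief NumPy sketch may help show why the decomposition is useful in such applications; here a random matrix stands in for a small grayscale image, and the rank-k truncation is the operation underlying SVD-based compression and watermarking (the matrix and k are illustrative choices, not values from the paper).

```python
import numpy as np

# A small random matrix standing in for a grayscale image.
A = np.random.default_rng(0).random((8, 8))

# SVD: A = U @ diag(S) @ Vt, singular values in descending order.
U, S, Vt = np.linalg.svd(A, full_matrices=False)

# Rank-k truncation: keep only the k largest singular values.
k = 3
A_k = U[:, :k] @ np.diag(S[:k]) @ Vt[:k, :]

# By the Eckart-Young theorem, A_k is the best rank-k approximation
# of A in the Frobenius norm, which is what makes truncated SVD
# suitable for compression and noise reduction.
print(np.linalg.norm(A - A_k))
```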
Accuracy is obtained from comparing the FMADM results with data on the Bengawan Solo watershed condition. The conditions are measured through GIS analysis rules with buffering and query techniques, as applied in Figure 3 and Appendices A, B, and C. From the contrast between the FMADM results and the real conditions, an accuracy value can then be calculated, as in Table 4. The EWS was installed for almost 11 months, and the accuracy differs for each sample: the lowest is 69.7% when the sample was taken on 30/09/2017, while the highest, 80.1%, was measured on 09/12/2017. Of the 8126 total data records entered in the database, random samples were taken every month as in Table 4; the decision conditions that appear are Normal, Standby 2, and Standby 3, while Standby 1 does not appear in this sampling. The final accuracy test shows that the system is 76.3% accurate overall.
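For reference, the accuracy values above are of the kind obtained by matching each FMADM decision against the observed condition; the following minimal sketch uses hypothetical labels (the actual samples are in Table 4 and are not reproduced here).

```python
# Hypothetical sample: FMADM decisions vs. observed watershed
# conditions for a handful of records (labels are illustrative).
predicted = ["Normal", "Standby 2", "Normal", "Standby 3", "Normal"]
observed  = ["Normal", "Standby 2", "Standby 2", "Standby 3", "Normal"]

# Accuracy = proportion of decisions matching the real condition.
matches = sum(p == o for p, o in zip(predicted, observed))
accuracy = matches / len(observed) * 100
print(f"accuracy = {accuracy:.1f}%")  # 80.0% for this toy sample
```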
One of the most considerable needs of our world is safe communication. Many studies on encrypting data types such as text, images, audio and video have been carried out in order to meet this need. The general goal of audio encryption is to prevent the possession of data by undesired people. Nowadays, conversations in any place can easily be monitored with the help of certain technological devices, so it has become a necessity to take security precautions to protect such information. Although many encryption systems exist, it is generally accepted that the encrypted data can still be decrypted with some techniques and in a certain amount of time. Therefore, the complexity and other factors of the algorithms used during encryption have become important in the encryption process. Also, traditional encryption algorithms are efficient
With the rapid development of digital technology and communication media, data such as text, images, audio and video are growing in importance in day-to-day life. A large amount of data is transmitted over the internet, and there is always a threat of an intruder accessing private information, so a mechanism needs to be implemented to preserve the integrity and confidentiality of the information. This has led to explosive growth in the field of information hiding. Cryptography is the most common term used in information hiding: it means converting text from a readable format to an unreadable format. Cryptography applies encryption techniques to convert the message into non-readable form, but it does not hide the message, i.e., the encrypted message is visible. It would be great to have something that can embed the secret message into some media in such a way that
Practice and research prove that the use of multi-criteria decision-making techniques helps in making firm decisions that produce more beneficial outputs, as each technique in this regard imposes the use of analytical and effective comparative processes. Furthermore, embracing fuzzy logic to empower these techniques helps generate even more realistic solutions with high reliability, as it brings human experts' views into the loop for evaluations and judgments. In this study, an approach using a framework of multiple fuzzy multi-criteria decision-making techniques is implemented for solving display product selection problems. The approach is applied to a real problem case: suggesting a set of products to exhibit in a department store recently opened in Bursa province, Turkey, which sells modular home and office furniture. The proposed approach has been successfully applied and the results were verified against human expert views; a substantial list of display products was then suggested to the company for other stores.
The research was executed to enhance performance by using fuzzy techniques and to provide a systematic design procedure that takes many objectives into account and needs no interfacing with linguistic directions from human experts. The proposed methodology is a step towards the application of a systems approach to reliability estimation by integrating physics-of-failure models with classical statistical methods. The integration of imprecise subjective information into the reliability estimation process generally reduces the requirement for extensive testing.
Science fiction (Sci-Fi) offers many examples of modifications made to the human body, each with different goals in mind: improving or compromising intellectual, physical, or psychological abilities. Lately, with consistent advancements in the health field, mostly brought about by e-Health startups and the interdisciplinary combination of Biology, Medicine, Computer Science, and Engineering, many of the modifications seen on the big screen have become a reality, albeit from a weak-signal point of view and not yet as mainstream solutions to health issues. To define the scope of this research, since Sci-Fi works are abundant and take the form of movies, animes, mangas, and books, filtering all of them would be a herculean job; hence, for this paper, only movies and animes were assessed, according to the precepts established in the Methodology section. Taking our society's progress into consideration, the aims of this work are twofold: (i) knowing to what extent there has already been real scientific progress with regard to science fiction scenarios and predictions of human body transformations; (ii) understanding the repercussions of humans undergoing such modifications in several fields, such as Economics, Sociology and Ethics, pinpointing scenarios that should be discussed in preparation for future changes.
In fast handover, when the mobile node enters the coverage zone of another AP, it should quickly connect to that AP; some techniques aim to resolve this by scanning APs in advance or by using the same IP address. Prediction-based approaches are more promising in resolving the problem of latency. Bergh & Ventura implemented a data mining algorithm based on the user's mobility history between wireless subnets. Alexander Magnano et al. proposed a new handover model for vehicular networks that predicts the next AP destination for nodes and performs early registration based on probability analysis and vehicle movements. However, it lacks customization to dense environments and complex road topologies; this can be handled by developing location-aware prediction. Wanalertlak et al. study users' short-term behavior, including location information, group, time-of-day, and duration characteristics of mobile users, to provide accurate next-cell predictions. However, the performance varies according to the density. Sandonis et al. present a solution to provide Internet access from VANETs by combining Proxy Mobile IPv6 (PMIPv6) with the European Telecommunications Standards Institute Technical Committee Intelligent Transport System (ETSI TC ITS) and its GeoNetworking (GN) protocols. However, this handover scheme ignores prediction, which plays an important role in reducing latency. Asefi et al. propose a network mobility management scheme for seamless delivery of video packets in VANETs; it introduces an adaptation of PMIPv6 for multi-hop VANETs incorporating a handover prediction mechanism. The prediction mechanism is based on a fixed-velocity assumption, which does not hold when the vehicle is accelerating or decelerating; thus, it is not suitable for realistic conditions. Tin-Yu Wu et al. introduce a Quality Scan scheme to enhance handoff performance in VANETs. It is a pre-scanning method that considers the signal strength and the load balance of the nodes.
Finally, the top 10 ranked list is selected to be evaluated by human experts. Alksher et al. proposed a generic framework for the idea evaluation process that brings in human judgment: experts who authored these papers, or were familiar with them, identify ideas from the text patterns manually, without using idea mining techniques. Human experts assessed and rated 500 extracted ideas from 50 abstracts to manually identify the idea components. The experts check and score the top-ranked ideas, which are then defined as the ground truth for the evaluation; a statistical evaluation is then used to analyze the validity and reliability of the extracted ideas. The characteristics of this dataset are depicted in Table 4.2.
The competence factor in human resources requirements depends on how the government enforces minimum competence. In PP-PSTE, the government requires that every expert own a competence certificate, the details of which are set by the related authority. That regulation has been supplemented by the Ministry of Manpower (MMp) and MCIT. MMp has released Regulation of MMp 55/2015 on the national competency standard (SKKNI) in information security. It details the competencies that should be held by professionals in the information security field. Because certification practice has a strong relationship with information security, the Root CA must comply with this SKKNI and declare the compliance through CP and CPS Prov. 5.3.1, 5.3.2, 5.3.3, 5.3.4. The Root CA describes explicitly that the SKKNI in information security has been complied with as the competency standard, or describes the detailed items in the CPS. MCIT enforces SMPI, which specifies the required standard for an electronic system authority depending on its classification. Based on the SMPI exploration, the Root CA is classified as a strategic electronic system that must implement ISO/IEC 27001 in its system. As a consequence, the Root CA should require its personnel to have competencies in ISO/IEC 27001 (Prov. 5.3.1, 5.3.2, 5.3.3, and 5.3.4).
Finally, the IT consultant can apply three intervention techniques to each type of client. These techniques differentiate the way of identifying problems, conceptualised in the diagnostic elements previously specified: (1) the intervention technique as an expert, which establishes a series of conditions on the different types of clients with the objective of analysing the state of the diagnostic element; (2) the medical/patient intervention technique, a variant of the previous technique but with conditions adapted to the analysis of the client system, adequate when the system experiences clear symptoms, so that it is known which IT area is diseased and which ITCOS methodology can be applied; and (3) the intervention technique as a process, which places the IT client system as responsible for its problem and assigns the IT consultant the role of facilitating how to solve it.
In the final phase, the results are discussed and interpreted. The studied articles were selected based on two metrics: relevance to data visualization techniques and tools, and contribution of systems and visual techniques. The first metric focuses specifically on visualization systems, visualizations of code security, binary files, or visual cryptanalysis. For the second metric, the authors studied several visualization tools and functions and present the related work used to achieve the objectives of this work.
Abstract: In today's environment, every individual faces global competition. People handle multiple tasks and managerial decisions at several stages. A manual method of decision making becomes complex and requires intelligent modeling techniques to get the desired results. Such techniques also help us select the most suitable element from a set of alternatives, minimizing effort and maximizing benefit. If the decision makers find that the data are in imprecise form, involving vague and linguistic descriptions, then fuzzy set theory is a modeling tool for solving such complex systems. Fuzzy set theory is suitable when modeling of human knowledge and human evaluations is needed. A fuzzy program considers random parameters as fuzzy numbers, and constraints are treated as fuzzy sets. In a fuzzy mathematical program, a membership function is used to represent the degree of satisfaction of the decision makers' expectations about the objective function level and the range of uncertainty of the coefficients.
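As an illustration of that last point, the following minimal sketch shows a membership function expressing the decision maker's degree of satisfaction with an objective value; the linear form is a common choice in fuzzy mathematical programming, and the function name and numeric bounds here are illustrative assumptions.

```python
def satisfaction(z, z_min, z_max):
    """Linear membership function for the decision maker's degree
    of satisfaction with an objective value z: 0 below the worst
    acceptable level z_min, 1 at or above the aspiration level
    z_max, and linear in between."""
    if z <= z_min:
        return 0.0
    if z >= z_max:
        return 1.0
    return (z - z_min) / (z_max - z_min)

# A profit of 75, between a worst case of 50 and an aspiration
# level of 100, satisfies the decision maker to degree 0.5.
print(satisfaction(75, 50, 100))
```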
An offshore installation or a ship is a complex and expensive engineering structure composed of many systems and is usually different from other such designs/facilities. Offshore installations/ships need to constantly adopt new approaches, new technology, new hazardous cargoes, etc., and each element brings with it new hazards in one form or another. Therefore, safety assessment should cover all possible areas, including those where it is difficult to apply traditional safety assessment techniques. Such traditional techniques are considered mature in many application areas. Safety assessment techniques currently used in offshore/ship safety assessment need to be studied further, and criteria for their effective use need to be established. An effective way is one in which different safety assessment methods are applied individually or in combination, depending on the particular situation, to assess risks with respect to each phase of the offshore installation's/ship's life cycle and each accident category. Novel decision-making techniques based on safety assessment are also required to make design and operation decisions effectively and efficiently.
The use of mechanistic multiscale modeling and simulation (M&S) in biomedical research continues to expand, and direct application in healthcare will emerge in the not too distant future. The potential of M&S in personalized and precision medicine will require standardized data provenance and workflows to produce a credible end-product. However, there are currently no standards or guidelines to promote credible practice of M&S in academia or in industry. Lacking such standards, it will be still more difficult to promote these new techniques for direct clinical use or for regulatory approval of devices or clinical software adjuncts. The Committee on Credible Practice of Modeling & Simulation in Healthcare (CPMS) was established under the aegis of the federal Interagency Modeling and Analysis Group (IMAG) and the Multiscale Modeling (MSM) Consortium. The CPMS is developing and adapting guidelines and procedures for credible practice of M&S, in conjunction with cultivating consistent terminology and developing and demonstrating workflows for credible practice. These tasks are made more difficult by the fact that different organs and different experimental and clinical problems require very different kinds of models. In particular, study of the nervous system focuses on electrical phenomenology, which necessitates unique conceptual underpinnings, computational techniques, and scaling levels (e.g. dendritic processing) for consideration in multiscale M&S. In addition, study of the nervous system faces the unique challenge of understanding (and modeling) cognitive and motivational processes. Related to this is the fact that experimental data for the nervous system, and the brain in particular, remain very limited compared to levels of experimental data from other organs. Despite all of these differences, it is necessary and desirable that computational neuroscience begin to meet the challenges of credibility through replicability, reproducibility, reliability and provenance, which are beginning to be the expected norm in other branches of computational systems biology. As an initial effort, the CPMS has defined 10 preliminary rules for M&S credibility as follows: 1. Define context clearly; 2. Use appropriate data; 3. Evaluate within context; 4. List limitations explicitly; 5. Use version control; 6. Document adequately; 7. Disseminate broadly; 8. Get independent reviews; 9. Test competing implementations; 10. Conform to standards. Naturally, these guidelines will need to evolve with the further development of M&S practices, especially in the context of medical devices and clinical guidance (e.g., NIH SPARC program efforts, Deep Brain Stimulation, Neuroprosthetics, etc.), where numerous regulatory hurdles, as well as liability considerations, will shape credible M&S practice in the future.
An effective CDDM hinges on the availability of: (i) the information required for decision-making in a structured and machine-interpretable form, (ii) suitable machinery to interpret the information, and (iii) a method to help identify the relevant information, capture it in model form, and perform what-if analyses. The current practice of organisational decision-making, which relies heavily on human experts using primitive tools such as spreadsheets, word processors, and diagram editors, fares poorly on all three criteria mentioned above.
Previous studies have shown that optimization models perform well for both planning and real-time operation. However, real-time optimization models are more complex, as they involve time series data, which may be daily, monthly, or annual. Uncertainties, inaccuracies and seasonal variations also contribute to the complexity of real-time reservoir operation. Naresh and Sharma, for example, apply a hybrid of fuzzy logic and neural networks to maximize annual hydropower generation by finding the optimum water release. The optimum water release decision often conflicts with reservoir water usage. As most reservoirs are multi-purpose, finding efficient operating policies so that the optimum release can be achieved is vital in reservoir management.
Firstly, we describe the approach and define the indicators of the participation regulation platforms. Secondly, we present the modelling of these indicators by membership functions. Afterwards, we explain how to develop fuzzy decision rules.