of the elements covered by every granule in a fuzzy covering system. Then we use the corresponding belief function and plausibility function to reduce the information system while keeping the probabilities of every element and the mass function unchanged. Meanwhile, although many papers have studied evidence theory combined with rough set theory, most of them concentrate on attribute reduction based on evidence theory, or on generating belief and plausibility functions by lower and upper approximation operators. They do not discuss information fusion using the set of basic granules, or the set of all focal elements of the fused mass function may not be the set of basic information granules; that is, such information fusion is not based on rough set theory. Therefore, another motivation of this paper is to fuse multiple information systems based on rough set theory. If there is more than one fuzzy covering, we should consider multi-information fusion combined with features of multi-fuzzy-covering systems to ensure that the set of focal elements is a normalized fuzzy covering of the universe of discourse. From the fused mass function, a new belief function and plausibility function can be obtained by evidence theory, and we investigate how to generate fuzzy rough approximation operators from these new belief (plausibility) functions.
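The belief and plausibility functions named above can be sketched directly from a mass function. The following is a minimal illustration of the standard Dempster-Shafer definitions, not the paper's fuzzy-covering construction; the frame {a, b, c} and the mass values are invented for the example.

```python
# Illustrative Dempster-Shafer belief/plausibility, with a toy mass function.

def belief(mass, subset):
    """Bel(A) = sum of m(B) over focal elements B contained in A."""
    return sum(m for focal, m in mass.items() if set(focal) <= set(subset))

def plausibility(mass, subset):
    """Pl(A) = sum of m(B) over focal elements B intersecting A."""
    return sum(m for focal, m in mass.items() if set(focal) & set(subset))

# Hypothetical mass function on the frame {a, b, c}; masses sum to 1.
m = {frozenset('a'): 0.4, frozenset('ab'): 0.3, frozenset('abc'): 0.3}

print(belief(m, 'ab'))       # 0.4 + 0.3: both {a} and {a,b} lie inside {a,b}
print(plausibility(m, 'b'))  # 0.3 + 0.3: {a,b} and {a,b,c} intersect {b}
```

Note the duality Pl(A) = 1 - Bel(complement of A), which holds here: Bel({a, c}) = 0.4 and Pl({b}) = 0.6.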
In this paper a new method for threat assessment is proposed based on fuzzy evidence theory. The classical and intelligent methods most widely used in threat assessment systems are evidence (Dempster-Shafer) theory and fuzzy set theory. The disadvantage of both methods is their failure to quantify the uncertainty in the data from the sensors and the resulting poor reliability of the system. To fix this flaw in dynamic target threat assessment, fuzzy evidence theory is proposed as a combination of Dempster-Shafer and fuzzy set theories. In this model, the uncertainty in the input data from the sensors and in the whole system is measured using the best-suited uncertainty measure. A comprehensive comparison is also made between the uncertainty of the fuzzy model and that of the fuzzy-evidence model (the proposed method). The method is applied to a real-time scenario for air threat assessment. The simulation results show that it is reasonable, effective, accurate and reliable.
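The Dempster-Shafer half of such a combination rests on Dempster's rule for fusing two sensor reports. A minimal sketch follows; the frame {hostile, friendly} and the two mass functions are invented for illustration and are not the paper's scenario.

```python
# Dempster's rule of combination for two mass functions on one frame.

def dempster_combine(m1, m2):
    """Multiply masses of intersecting focal elements; discard conflict
    (empty intersections) and renormalize by 1 - conflict."""
    combined, conflict = {}, 0.0
    for f1, v1 in m1.items():
        for f2, v2 in m2.items():
            inter = f1 & f2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    k = 1.0 - conflict
    return {f: v / k for f, v in combined.items()}

# Two hypothetical sensor reports about one air target.
H, F = frozenset({'hostile'}), frozenset({'friendly'})
m1 = {H: 0.6, F: 0.1, H | F: 0.3}
m2 = {H: 0.5, F: 0.2, H | F: 0.3}
fused = dempster_combine(m1, m2)  # agreement on 'hostile' is reinforced
```

Because both sources lean toward "hostile", the fused mass on that hypothesis exceeds either input's.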
Rough Sets Theory is a new approach to data analysis which has attracted the attention of many researchers all over the world [2,3,4]. This theory overlaps with many other theories, such as fuzzy set theory, evidence theory and Boolean reasoning methods; nevertheless, it can be viewed in its own right as an independent discipline.
Data-level fusion (pixel-level fusion): In data-level fusion, information combination is performed at the level of pixels or intensity values of the images. For pixel-level fusion to take place, the source images should contain complementary information, and the pixels must represent the same physical properties of the field of view.
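The simplest instance of pixel-level fusion is per-pixel averaging of two registered images. The tiny 2x2 grayscale arrays below are toy data standing in for, say, a visible and an infrared view of the same scene.

```python
# Pixel-level fusion by per-pixel averaging of two registered
# grayscale images of equal size (toy 2x2 data).

def fuse_average(img_a, img_b):
    """Per-pixel mean of two equally sized grayscale images."""
    return [[(a + b) / 2 for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(img_a, img_b)]

visible = [[100, 120], [130, 110]]
infrared = [[60, 200], [90, 150]]
print(fuse_average(visible, infrared))  # [[80.0, 160.0], [110.0, 130.0]]
```

Averaging assumes the two sensors observe the same physical quantity at each pixel, which is exactly the registration requirement stated above.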
occurrence of printing errors. Broadly speaking, Fuzzy-Pattern-Classification is a well-known technique concerned with the description or classification of measurements. The idea behind Fuzzy-Pattern-Classification is to identify the common features or properties among a set of patterns (in this case the various behaviours a printing press can exhibit) and classify them into different predetermined classes according to a given classification model. Classic modelling techniques usually try to avoid vague, imprecise or uncertain descriptive rules. Fuzzy systems deliberately make use of such descriptive rules. Rather than following a binary approach wherein patterns are defined by “right” or “wrong” rules, fuzzy systems use relative “if-then” rules of the type “if parameter alpha is equal to (greater than, …less than) value beta, then event A always (often, sometimes, never) happens”. The descriptors “always”, “often”, “sometimes” and “never” in the above exemplary rule are typically designated “linguistic modifiers” and are used to model the desired pattern in a sense of gradual truth (Zadeh, 1965; Bezdek, 2005). This leads to simpler, more suitable models which are easier to handle and closer to human thinking. In the next sections we will highlight some Fuzzy-Pattern-Classification approaches which are suitable for sensor fusion applications.
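A fuzzy "if-then" rule of the kind just described can be sketched with a membership function for a linguistic term and a modifier applied to it. The variable "pressure", the breakpoints, and the rule itself are hypothetical; a press-monitoring system would fit its own curves to sensor data.

```python
# Sketch of a fuzzy rule with a linguistic modifier (toy membership curve).

def high(x, lo=40.0, hi=80.0):
    """Piecewise-linear membership degree for the linguistic term 'high'."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def very(mu):
    """Zadeh's 'very' modifier: concentration by squaring the degree."""
    return mu ** 2

# Rule: "if pressure is very high then misprint risk is high" --
# the rule fires to the degree given by the modified membership.
pressure = 70.0
firing_strength = very(high(pressure))
print(firing_strength)  # 0.5625, since high(70) = 0.75 and 0.75**2 = 0.5625
```

The modifier turns the graded truth value into a stricter one, which is the "gradual truth" idea attributed to the linguistic modifiers above.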
7 Resistance to foot traffic
Results of tests indicate that the systems can accept the limited foot traffic and light concentrated loads associated with installation and maintenance. Reasonable care should be taken to avoid puncture by sharp objects or concentrated loads. Where traffic in excess of this is envisaged, such as for the maintenance of lift equipment, a walkway must be
The paper has also presented the usage of the mathematical theory of evidence in evaluating the possibility of object detection by monitoring stations. The level of object detection allows for effortless conversion to an optimisation problem of monitored-area coverage. Developing such a task enables a distribution of monitoring stations that keeps the detection rate higher than the assumed value. The usage of software engineering methods in a simulation study on the distribution of shore monitoring stations is not limited solely to the possibility of realising a project of this type on the basis of a model derived from this field. The main purpose of the application is to determine the locations of shore observation points (taking into account facility location problems) in order to ensure control, management and maritime traffic control in areas of limited surface.
Fig. 3 shows the distribution of final scores for all the clicks with two versions of CCFDP. The lighter graph (L) corresponds to the first version of CCFDP, where only the rule-based module was used. The darker one (D) is the new version with multiple modules. Area I represents most of the valid clicks. This corresponds to records whose attributes have no presence in the fraudulent database and whose key attributes all satisfy the requirements defined in the algorithm for a legitimate click. The percentage of traffic present in Area I with system L is much higher than that with system D. With the inclusion of multiple models, the suspiciousness of clicks has increased and the graph has shifted toward Area II with system D, which is still in the safer region. Area III shows the suspected clicks. These are records with attributes present in the fraudulent database or attributes that exceed certain threshold values. It can be clearly seen in the graph how the scores have increased after fusing multiple pieces of evidence from different modules. Area IV includes invalid clicks. Blocked traffic is identified as clicks with highly suspicious scores, usually greater than 0.9. As shown in Table 5, with the traditional (rule-based) system we were able to block only 520 fraudulent clicks, but with the multi-model system it was 643, about 24% more. We believe that advertisers should not be billed for any of these clicks.
and detect the different defects. Wu et al.  proposed a method of Hilbert–Huang transform and instantaneous dimensionless frequency normalization and applied it to the gearbox system. However, with this method there is a certain overlap between the normal state and a range of various fault states. In other words, the ranges of the dimensionless indexes of normal equipment and of faulty equipment are difficult to distinguish, which makes the decision harder. To solve these problems, Xiong et al.  proposed a genetic programming method based on dimensionless indexes in the time domain, which has achieved positive results in rotating machinery classification. However, constructing new dimensionless indexes with this method presents many deficiencies. For instance, the operator set and the termination character set affect the complexity and convergence of the program. Therefore, when the search scope is expanded, it is easy to lose potentially useful fault information. All of these variables can affect fault diagnosis efficiency. To solve the problem of
The presented method, using the advantages of the fuzzy approach, maps the noisy data into fuzzy memberships, and even in this first step the effect of noise is reduced significantly. Then, eigenvector and covariance analysis decreases the number of features, and only the principal features are sent to the clustering methods. Thus, the resulting system is more efficient and more robust to environmental noise.
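The eigenvector-and-covariance step described above is essentially principal component analysis. A minimal sketch on random toy data (not the paper's data or exact pipeline) shows how the feature count is reduced before clustering:

```python
import numpy as np

# PCA sketch: covariance + eigendecomposition, keep top components.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))   # 100 samples, 5 features (toy data)
X[:, 1] = 2.0 * X[:, 0]         # make one feature fully redundant

Xc = X - X.mean(axis=0)                  # center the data
cov = np.cov(Xc, rowvar=False)           # 5x5 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)   # symmetric eigendecomposition
order = np.argsort(eigvals)[::-1]        # sort by variance, descending
components = eigvecs[:, order[:2]]       # keep 2 principal directions
reduced = Xc @ components                # features handed to clustering
print(reduced.shape)  # (100, 2)
```

Dropping the low-variance directions is what makes the downstream clustering both cheaper and less sensitive to noise.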
k-nearest neighbor and dimensionless indicators to improve the reliability of bearing fault diagnosis. It should be noted that, in the case of multiple sensors, differences usually exist among the sensors, such as their deployment places and their acquisition precision due to the manufacturing process. The usefulness and quality of the data collected from these sensors are not all the same. Hence, there is a strong need to assign an importance factor, namely an evidence weight, to each sensor in order to maximize the usefulness of the redundant information measured by these sensors, so that the efficiency of both the monitoring and the analytic system can be guaranteed and enhanced to the greatest extent. Recently, an improved D-S evidence theory method was proposed in , where weights were proactively assigned to each evidence source based on the authors' experience. Nevertheless, practical human experience may be limited and may not be in line with the real situation, given the complexity of industrial equipment and environments. Besides, it is more convenient and credible to acquire evidence weights from experimental data sets rather than from empirical decisions, according to the various application scenarios.
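One standard way to attach a reliability weight to an evidence source before fusion is Shafer's discounting operation; the sketch below uses it for illustration and is not necessarily the weighting scheme of the cited work. The frame, the mass function, and the weight value are invented.

```python
# Shafer discounting: weight a sensor's evidence by its reliability alpha.

def discount(mass, frame, alpha):
    """Scale every focal mass by alpha and move the remaining 1 - alpha
    onto the whole frame, i.e. onto total ignorance."""
    out = {f: alpha * v for f, v in mass.items()}
    out[frame] = out.get(frame, 0.0) + (1.0 - alpha)
    return out

FRAME = frozenset({'normal', 'fault'})
m_sensor = {frozenset({'fault'}): 0.9, FRAME: 0.1}

# A sensor judged 80% reliable: its confident 'fault' mass is softened
# (0.9 -> 0.72) and the withheld 0.2 becomes ignorance on the frame.
m_weighted = discount(m_sensor, FRAME, alpha=0.8)
```

A less reliable sensor thus contributes weaker, more non-committal evidence to the subsequent D-S combination, which is the intuition behind evidence weights.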
In this paper a new method based on fuzzy theory is developed to solve the project scheduling problem under a fuzzy environment. Assuming that the durations of activities are trapezoidal fuzzy numbers (TFNs), in this method we compute several project characteristics such as earliest times, latest times, and slack times in terms of TFNs. We introduce a new approach which we call the modified backward pass (MBP). This approach, based on a linear programming (LP) problem, removes the negative and infeasible solutions which can be generated by other methods in the backward-pass calculation. We derive the general form of the optimal solution of the LP problem, which enables practitioners to obtain the optimal solution by a simple recursive relation without solving any LP problem. Through a numerical example, the calculation steps of this method and the results are illustrated.
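The forward-pass part of such a computation can be sketched with componentwise TFN arithmetic. Note the componentwise maximum used here is a common simplification of the fuzzy maximum, not necessarily the paper's exact operator, and the three-activity network is invented.

```python
# Forward pass with trapezoidal fuzzy numbers (a, b, c, d), sketched
# with componentwise addition and an approximate componentwise maximum.

def tfn_add(x, y):
    """Sum of two TFNs (exact under the extension principle)."""
    return tuple(xi + yi for xi, yi in zip(x, y))

def tfn_max(x, y):
    """Approximate fuzzy maximum of two TFNs, taken componentwise."""
    return tuple(max(xi, yi) for xi, yi in zip(x, y))

# Toy network: activities A and B run in parallel, both precede C.
dur_A = (2, 3, 4, 5)
dur_B = (1, 2, 5, 6)
dur_C = (3, 3, 4, 4)

early_C = tfn_max(dur_A, dur_B)   # fuzzy earliest start of C: (2, 3, 5, 6)
finish = tfn_add(early_C, dur_C)  # fuzzy project finish: (5, 6, 9, 10)
print(finish)
```

The backward pass is where naive interval subtraction can produce the negative or infeasible values that the MBP is designed to remove.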
above, one can naively speculate as to whether a categorization of all covering systems with a fixed minimum modulus might be possible in some way. Admittedly, such a notion seems intractable, if not impossible. But perhaps a less ambitious task is possible. For example, could an enumeration be given of all coverings with a fixed set of moduli or a fixed least common multiple of the moduli? Recently , we accomplished this goal for a very specific situation involving primitive covering numbers, a notion introduced by Zhi-Wei Sun  in 2007. While the methods in  are purely combinatorial, we show in this article how certain group actions can be used to examine the set of all covering systems of the integers with a fixed set of distinct moduli.
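The defining property of a covering system, that every integer satisfies at least one congruence n ≡ r (mod m), can be checked mechanically: it suffices to test the residues modulo the least common multiple of the moduli. The system below, with distinct moduli 2, 3, 4, 6, 12, is the classical small example usually attributed to Erdős.

```python
# Verify the covering property of a system of congruences (r, m).

from math import lcm

def is_covering(system):
    """True if every residue modulo lcm of the moduli satisfies
    at least one congruence n % m == r in the system."""
    L = lcm(*(m for _, m in system))
    return all(any(n % m == r for r, m in system) for n in range(L))

erdos = [(0, 2), (0, 3), (1, 4), (5, 6), (7, 12)]
print(is_covering(erdos))  # True: the five classes cover all integers
```

For instance, n = 7 is caught only by 7 (mod 12), which is why no congruence can be dropped from this system.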
Until now, simple arithmetic fusion methods have been developed to improve the quality of TWR images . First, additive fusion for TWR images was used in  to compensate for the disturbances caused by unknown wall characteristics. But the drawback of additive fusion is that it retains most of the clutter and background noise. Later, multiplicative fusion was introduced in ,  to enhance polarimetric radar images. This approach also has a drawback: it tends to suppress targets with weak intensities.
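The trade-off between the two rules is easy to see on toy pixel intensities: addition preserves weak responses (including clutter), while multiplication crushes anything weak in either image. The values below are invented, not radar data.

```python
# Additive vs. multiplicative fusion of two co-registered image rows
# with intensities in [0, 1] (toy data).

def additive_fuse(a, b):
    """Per-pixel average: keeps weak targets but also keeps clutter."""
    return [(x + y) / 2 for x, y in zip(a, b)]

def multiplicative_fuse(a, b):
    """Per-pixel product: kills clutter seen in only one image,
    but also suppresses weak targets."""
    return [x * y for x, y in zip(a, b)]

# Pixel 0: weak target in both images; pixel 1: clutter in one image;
# pixel 2: strong target in both images.
img1 = [0.3, 0.2, 0.9]
img2 = [0.3, 0.0, 0.9]
print(additive_fuse(img1, img2))        # weak target survives at 0.3
print(multiplicative_fuse(img1, img2))  # weak target drops to 0.09
```

Pixel 1 shows why multiplicative fusion removes background noise (the clutter becomes 0), and pixel 0 shows the weak-target suppression noted above.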
Editing and post-processing operations are no longer limited to computer science laboratories and are not restricted to researchers. Now, with inexpensive software like Photoshop, it has become an easy task to make different kinds of changes to photos, even for people with limited knowledge of image processing. Most devices come with some sort of free image processing software that helps people convert and present a captured image as they intend. Software like 3D-max helps to create completely virtual images independent of other real photos. Most of the time, manipulation of images is done with the aim of improving them; however, it can sometimes be used to convey an untrue message or distort an existing fact. Figure 1.1 illustrates a very famous forged image that shows a well-known celebrity next to a politician. This image affected Senator John Kerry's fate in the 2004 United States presidential election. The person who created this image by compositing two different images was arrested, and the image itself was used as evidence of the crime in court.
The universality of ubiquitous networks has produced a series of basic safety issues, such as trust, confidentiality and privacy. As a result, “computing trust” emerged as a new field . The concept of “trust” comes from sociology, which makes models of reputation and trust very difficult to design. A trust model based on reputation and self-confidence, put forward by Ramchurn et al. in , processes historical interactions using a certain degree of fuzzy logic. In this model, self-confidence is derived from the direct interaction between the interacting nodes, while reputation is derived from indirect interaction and information collected from other nodes in the network. In , Schlager et al. use fuzzy cognitive maps and trust
Recent research in MANETs aims to detect and prevent specific attacks in the network. Kurosawa and Jamalipour (2007) proposed a mechanism for black-hole detection in AODV. A fuzzy-based genetic algorithm was proposed by Wang Yunwu (2009), which uses initial rules from a fuzzy algorithm and final rules from a genetic algorithm. A genetic-based intrusion detection system for TCP/IP networks was proposed by Wei Li (2010). Yi et al. (2005) considered the RREQ flooding attack and invented a new mechanism to prevent RREQ flooding attacks based on next-node supervision. Hu et al. (2003) showed how an attacker can mount a rushing attack in a DSR network and implemented a new rushing-attack prevention mechanism for MANETs. Though many analysts have tried to protect the network from such attacks, some researchers have suggested more general approaches. Jungwan Kim et al. (2001) proposed an artificial immune system for IDS, based on a hierarchical approach inspired by the human immune system. In a similar vein, Ariadne provides a mechanism for end-to-end delivery based on a shared key. More effort is needed to protect the network from attacks. The mechanisms in the methods above protect the network against attackers through routing.
These methods are limited to processing crisp data, which have to be set exactly, and the probabilities have to be known. Risk analyses are often specified only vaguely, and their probabilities are based on expert estimations. Given the complexity of IT systems, their interdependent interconnections, and the inherent uncertainty of all possibilities, fuzzy numbers are an appropriate mathematical description of imprecise data. Often only linguistic terms are used to describe the risk of systems, such as the HPEM risk assessment cube from Sabath , which is defined by the threat level, mobility (ML), and technological challenge. These terms can be transformed into fuzzy sets with different kinds of boundaries and membership functions. These soft or overlapping boundaries are an advantage of fuzzy probability theory. Fuzzy logic and set theory are used to determine the risk of a target system in a facility and its surrounding area. The approach (see Section II) turns all data into a computable mathematical form. For that, the failure propagation is estimated for series and parallel circuits (see Section III), and fuzzy rules (see Section IV) combine the linguistic terms. The risk analysis based on fuzzy theory (see Section V) divides the complexity into three parts: the breakdown failure probability of the victim system, the hazard of IEMI sources in zones of accessibility, and a classification of the whole environment. An exemplary scenario (see Section VI) is used to discuss the approach to improve protection against IEMI threats.
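Turning linguistic terms into fuzzy sets with soft, overlapping boundaries can be sketched with triangular membership functions. The term names and breakpoints below are illustrative; they are not the levels of the cited risk cube.

```python
# Linguistic risk terms as overlapping triangular fuzzy sets (toy values).

def tri(x, a, b, c):
    """Triangular membership: 0 at a, peak 1 at b, back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

terms = {
    'low':    (0.0, 0.25, 0.5),
    'medium': (0.25, 0.5, 0.75),
    'high':   (0.5, 0.75, 1.0),
}

risk = 0.6  # a normalized risk score from the analysis
memberships = {t: tri(risk, *p) for t, p in terms.items()}
print(memberships)  # 'medium' and 'high' both fire: the soft boundary
```

A crisp threshold would classify 0.6 as exactly one level; the fuzzy description keeps partial degrees in both adjacent terms, which is the advantage claimed above.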
The observed values in real-world problems are often imprecise or vague. Imprecise or vague data may be the result of unquantifiable, incomplete or unobtainable information, and are often expressed with bounded intervals, ordinal (rank-order) data or fuzzy numbers. In recent years, many researchers have formulated fuzzy DEA models to deal with situations where some of the input and output data are imprecise or vague, and there is a relatively large number of papers in the fuzzy DEA literature. Fuzzy set theory has been used widely to model uncertainty in DEA. The applications of fuzzy set theory in DEA are usually categorized into four groups (Lertworasirikul 2002, Lertworasirikul et al. 2003, Karsak 2008): the tolerance approach, the α-level based approach, the fuzzy ranking approach and the possibility approach. While most of these approaches are powerful, they usually have some theoretical and/or computational limitations and are sometimes applicable only to a very specific situation (e.g., Soleimani-damaneh et al. (2006)). The tolerance approach was one of the first fuzzy DEA models; it was developed by Sengupta (1992a) and further improved by Kahraman and Tolga (1998). The main idea in this approach is to incorporate uncertainty into the DEA models by defining tolerance levels on constraint violations. The α-level approach is perhaps the most popular fuzzy DEA model, as is evident from the number of α-level based papers published in the fuzzy DEA literature. The main idea here is to convert the fuzzy CCR model into a pair of parametric programs in order to find the lower and upper bounds of the α-level of the membership functions of the efficiency scores. The fuzzy ranking approach is another popular technique that has attracted a great deal of attention in the fuzzy DEA literature. Its main idea is to find the fuzzy efficiency scores of the DMUs using fuzzy linear programs, which require ranking fuzzy sets.
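The α-cut underlying the α-level approach can be sketched for a triangular fuzzy number: at each level α in [0, 1] the fuzzy datum collapses to a crisp interval, and the DEA model is then solved at the interval endpoints to bound the efficiency score. The fuzzy input below is invented for illustration.

```python
# The alpha-cut of a triangular fuzzy number (a, b, c): the crisp
# interval of values whose membership degree is at least alpha.

def alpha_cut(tfn, alpha):
    """Return the interval [lower, upper] at membership level alpha."""
    a, b, c = tfn
    return (a + alpha * (b - a), c - alpha * (c - b))

fuzzy_input = (8, 10, 13)  # e.g., an imprecise labor-hours input

print(alpha_cut(fuzzy_input, 0.0))  # (8.0, 13.0): the full support
print(alpha_cut(fuzzy_input, 1.0))  # (10.0, 10.0): the crisp core
print(alpha_cut(fuzzy_input, 0.5))  # (9.0, 11.5)
```

Sweeping α from 0 to 1 and solving the two parametric programs at each cut is what produces the lower and upper efficiency bounds described above.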
In this section, we also review a related method, called the “defuzzification approach”, proposed by Lertworasirikul (2002). In this approach, which is essentially a fuzzy ranking method, fuzzy inputs and fuzzy outputs are first defuzzified into crisp values. These crisp values are then used in a conventional crisp DEA model, which can be solved by an LP solver. The fundamental principles of possibility theory are rooted in Zadeh’s (1978) fuzzy set theory. In fuzzy LP models, fuzzy coefficients can be viewed as fuzzy variables, and constraints can be considered fuzzy events; hence, the possibilities of fuzzy events (i.e., fuzzy constraints) can be determined using possibility theory. Dubois and Prade (1988) provide a comprehensive overview of possibility theory. Lertworasirikul (2002) has proposed two approaches for solving the ranking problem in fuzzy DEA models, called the “possibility approach” and the “credibility approach.”
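The defuzzification step can be sketched with the centroid method on a triangular fuzzy number, for which the centroid is simply the mean of the three vertices. This is one common defuzzifier, not necessarily the one used by the cited approach, and the fuzzy datum is invented.

```python
# Centroid defuzzification of a triangular fuzzy number (a, b, c):
# for a triangle, the centroid reduces to the mean of the vertices.

def centroid_tri(tfn):
    """Crisp representative value of a triangular fuzzy number."""
    a, b, c = tfn
    return (a + b + c) / 3.0

fuzzy_output = (6.0, 9.0, 12.0)  # an imprecise DEA output measure
crisp_output = centroid_tri(fuzzy_output)
print(crisp_output)  # 9.0, which would then feed a conventional DEA model
```

After this step the problem contains only crisp numbers, which is why an ordinary LP solver suffices for the resulting DEA model.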