In this paper we presented a heuristic-based algorithm called LIME-FOLD. This novel algorithm can induce very concise nonmonotonic logic programs to explain the implicit rules captured by any sophisticated classifier such as XGBoost. LIME is used to provide explanations as to which features contribute most to the XGBoost model's prediction. However, these explanations are local and specific to a given data sample and do not represent the model's global behavior. We have shown that by filtering out the locally irrelevant features, a transformed dataset is created that, once given to the LIME-FOLD algorithm, yields a very concise non-monotonic logic program that is much more accurate than the hypotheses induced by ALEPH, a state-of-the-art ILP system. We have justified our claim by running LIME-FOLD and ALEPH on the original and transformed versions of standard UCI benchmarks. In terms of running time, our LIME-FOLD algorithm runs on average 80 times faster than ALEPH on the 10 datasets reported in this paper.
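The dataset transformation described above can be sketched as follows. This is only an illustrative sketch: `top_features` stands in for a LIME-style local explainer and is a hypothetical hook, not the authors' implementation.

```python
def filter_locally_irrelevant(samples, top_features, k):
    """Keep, for each sample, only the k features its local explanation
    ranks highest; drop the rest as locally irrelevant."""
    transformed = []
    for sample in samples:                    # sample: dict of feature -> value
        keep = set(top_features(sample)[:k])  # e.g. the explainer's top-k names
        transformed.append({f: v for f, v in sample.items() if f in keep})
    return transformed
```

The transformed dataset would then be handed to the rule inducer in place of the original one.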

Here the conclusion we would intuitively want to draw is that Tweety is a non-flying bird. However, using vanilla circumscription leaves open the option that Tweety is instead a flying penguin or a non-bird. Possible enrichments to the system to deal with this problem have been around for a long time and include, amongst others, circumscription policies and prioritized circumscription (e.g. [7]). Solutions revolve around the intuition that penguins are to be an exception to the rule that birds fly, creating means for (3) to overpower or temporarily disable (1). However, they fail to address what it is about this example that makes us want to make an exception to rule (1) rather than (2) or (3).
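The intended precedence, where the penguin exception (3) defeats the default (1), can be sketched procedurally; the function and predicate names below are illustrative and do not come from any of the cited systems:

```python
def is_bird(x, facts):
    # (2): penguins are birds
    return x in facts.get("bird", set()) or x in facts.get("penguin", set())

def flies(x, facts):
    # (3) overrides (1): the penguin exception is checked before the default
    if x in facts.get("penguin", set()):
        return False
    # (1): by default, birds fly
    return is_bird(x, facts)
```

With `facts = {"penguin": {"tweety"}}`, `is_bird("tweety", facts)` holds while `flies("tweety", facts)` does not, which is exactly the reading the enriched circumscription policies aim for.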


Answer Set Programming (ASP) [21] is an efficient unified formalism for both knowledge representation and reasoning in Artificial Intelligence (AI). It is a non-monotonic logic programming language that allows practical representation of, and reasoning with, incomplete data. ASP has an elegant and conceptually simple theoretical foundation and has proved useful for solving a wide range of problems in various domains [35]. Beyond its ability to formalize various problems from AI or to encode combinatorial problems [7, 32], ASP also provides an interesting way to practically solve such problems, since efficient solvers are available [20, 29].
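For a small ground normal program, the stable-model semantics underlying ASP can be checked by brute force via the Gelfond–Lifschitz reduct. The toy sketch below is for illustration only and bears no resemblance to how the efficient solvers cited above actually work:

```python
from itertools import combinations

def stable_models(rules, atoms):
    """rules: list of (head, pos_body, neg_body) triples of atom names."""
    def minimal_model(pos_rules):
        # least model of a negation-free program, by forward chaining
        m, changed = set(), True
        while changed:
            changed = False
            for head, body in pos_rules:
                if body <= m and head not in m:
                    m.add(head)
                    changed = True
        return m

    models = []
    for r in range(len(atoms) + 1):
        for cand in combinations(sorted(atoms), r):
            m = set(cand)
            # Gelfond-Lifschitz reduct: drop every rule whose negative
            # body intersects the candidate set, then drop negations
            reduct = [(h, set(pb)) for h, pb, nb in rules if not (set(nb) & m)]
            if minimal_model(reduct) == m:
                models.append(m)
    return models
```

For the classic bird example (`flies :- bird, not abnormal.` with the fact `bird.`), the unique stable model contains `bird` and `flies`, illustrating the non-monotonic default.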


The purposes of this work were to review the literature to identify NMDR relationships observed for some EDCs, and to develop a methodology to assess whether those dose-response relationships were sufficiently reliable for use in risk assessments. In this qualitative approach, a judgment on the quality of the reviewed studies was not introduced, except for the case study on BPA. Rather, the aim was to derive tools that allow consideration of NMDR relationships in risk assessments. In recent years, NMDR profiles have been reported in the literature with increasing frequency from a variety of in vitro and in vivo toxicological models involving substances that affect hormonal systems [10,74,75]. In an extensive review, Vandenberg et al. [10] reported hundreds of examples of possible NMDR relationships for more than 20 natural hormones and more than 70 putative EDCs. Those examples were from studies performed in cultured cells, on whole animals or on humans. It was not a surprise that such relationships were observed, because of the complexity resulting from the many modes of action through which EDCs may influence the actions of hormones. Along with the potential for complex pharmaco/toxicodynamic influences (e.g., expression of varying levels of multiple receptors and possible interactions with native ligands), the critical feedback mechanisms involved in the regulation of hormonal systems create an additional level of complexity. Dose-response relationships for EDCs would reflect this complexity and would likely be non-monotonic. Studies using several hormone-sensitive cell lines have shown that NMDR relationships can result from a variety of mechanisms [9,76]. However, that type of relationship is not exclusive to EDCs and is also observed


Abstract. The hydroclimatic process is changing non-monotonically and identifying its trends is a great challenge. Building on discrete wavelet transform theory, we developed a discrete wavelet spectrum (DWS) approach for identifying non-monotonic trends in hydroclimate time series and evaluating their statistical significance. After validating the DWS approach using two typical synthetic time series, we examined annual temperature and potential evaporation over China from 1961–2013 and found that the DWS approach detected both the "warming" and the "warming hiatus" in temperature, and the reversed changes in potential evaporation. Further, the identified non-monotonic trends showed stable significance when the time series was longer than 30 years or so (i.e. the widely defined "climate" timescale). The significance of trends in potential evaporation measured at 150 stations in China, with an obvious non-monotonic trend, was underestimated and was not detected by the Mann–Kendall test. Comparatively, the DWS approach overcame the problem and detected those significant non-monotonic trends at 380 stations, which helped understand and interpret the spatiotemporal variability in the hydroclimatic process. Our results suggest that non-monotonic trends of hydroclimate time series and their significance should be carefully identified,
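The core idea of separating scale-wise variability from a coarse component can be illustrated with a hand-rolled Haar transform. The DWS approach itself involves considerably more (notably the significance testing), so this is only a sketch of the decomposition step:

```python
def haar_step(x):
    # One level of the Haar DWT: pairwise averages (approximation)
    # and pairwise differences (detail), both scaled by 1/sqrt(2).
    s = 2 ** 0.5
    a = [(x[2 * i] + x[2 * i + 1]) / s for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / s for i in range(len(x) // 2)]
    return a, d

def wavelet_spectrum(x, levels):
    """Return the detail-coefficient energy per scale and the residual
    coarse component, which carries the (possibly non-monotonic) trend."""
    energies, a = [], list(x)
    for _ in range(levels):
        a, d = haar_step(a)
        energies.append(sum(c * c for c in d))
    return energies, a
```

A series with no variability at any scale yields zero detail energy everywhere, while a series with a reversal (e.g. warming then cooling) concentrates energy at the coarse scales.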


application in domains requiring non-monotonic reasoning activities. However, in the absence of conflicts, the inferential process can still be applied as it is. The analysis of the two reasoning approaches suggests that defeasible argumentation might lead to explanations that are easier for humans to understand, both for a domain expert and a lay person. In fact, through the comparison performed above, on the one hand, without some comprehension of fuzzy logic and its membership functions, the understandability/post-hoc interpretability and simulatability of non-monotonic fuzzy reasoning and the extendibility of its models are compromised. On the other hand, defeasible argumentation tends to use the same natural language terms, provided by the domain expert, throughout the whole inferential process, except in the conflict resolution layer (semantics). Semantics vary in computational complexity (linear or exponential in the number of arguments), allowing fuzzy reasoning to offer an equal or lower complexity, since its fuzzification-engine-defuzzification layers are always linear in the number of rules. However, Possibility Theory always requires the specification of a precedence order of exceptions in the inference engine of fuzzy reasoning. In contrast, acceptability semantics do not require any precedence order of attacks for resolving conflicts, and thus offer higher algorithmic transparency.


We also examine the conditions under which the presence of network effects improves welfare in this framework. This is particularly relevant given the ongoing trend towards e-commerce, under which consumers are isolated from the effects of physical crowding in stores. In-store sales are shown to improve welfare whenever the direct utility gained by consumers from the network at either firm is positive, which requires the total market size to be sufficiently small. This provides a welfare argument in favour of online-only sales when the total market size is sufficiently large, which is absent if network effects are specified in a monotonic fashion. Finally, we study the welfare-maximising firm locations. We find that, relative to monotonic network effect models, the case for a duopolistic market structure is strengthened. For sufficiently large consumer population sizes, splitting demand between two firms not only reduces transportation costs but also maximises the aggregate network effect.


ously up until 603 K. Therefore, the superelastic behavior may have disappeared because of ordering of the β phase when the specimen was aged at 503 K. However, the XRD measurement for the specimen aged at 503 K shows only diffraction peaks from the β phase (Fig. 3). No striking differences were observed between the patterns from the specimens aged at 453, 503 and 553 K. Therefore, ordering of the disordered β phase is not the cause of the non-monotonic aging-temperature dependence of superelasticity.

Epidemic models are significant tools in analyzing the spread and control of infectious diseases. Quarantine, treatment and vaccination are the most commonly used methods to control the spread of infectious diseases. Of these, vaccination is a proven prophylactic approach and is used in healthy individuals to prevent occurrence of infectious diseases [1, 2]. Many infectious diseases like measles, mumps, rubella, hepatitis B and influenza are reduced to a great extent by the use of vaccination. Several clinical results [3] have shown that vaccination does not give permanent resistance to the disease. Once the effect of a vaccine wanes from the body, susceptibility towards the disease increases again. Therefore, in order to prevent infection and eradicate the disease, vaccination in the population must reach its optimal level. A mathematical study [4] on a model for childhood diseases with non-permanent immunity has shown that the disease will persist in the population if the vaccination coverage level is below a definite value. An SEIV epidemic model [5] with a nonlinear incidence and a waning preventive vaccine was formulated, and it was proved that a backward bifurcation always exists as the rate at which infected individuals are treated increases. An epidemic model [6] has included partial temporary immunity and saturated incidence to obtain the critical vaccination coverage necessary for eradication of the disease.
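A minimal SIRV-type update with waning immunity can be sketched as a forward-Euler step. The parameter names (`nu` for the vaccination rate, `omega` for the waning rate) are illustrative and do not follow any one of the cited models:

```python
def sirv_step(s, i, r, v, beta, gamma, nu, omega, dt):
    """One forward-Euler step: vaccination moves S -> V, waning moves
    V -> S, so vaccine-induced immunity is only temporary."""
    new_inf = beta * s * i * dt
    ds = -new_inf - nu * s * dt + omega * v * dt
    di = new_inf - gamma * i * dt
    dr = gamma * i * dt
    dv = nu * s * dt - omega * v * dt
    return s + ds, i + di, r + dr, v + dv
```

Because the flow terms cancel in pairs, the total population is conserved at every step, which is a quick sanity check on any such scheme.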


The arc-eager algorithm is a monotonic parsing algorithm, i.e. once an action is performed, subsequent actions should be consistent with it (Honnibal et al., 2013). In monotonic parsing, if a word becomes a dependent of another word or acquires a dependent, other actions shall not change those dependencies that have been constructed for that word in the action history. Disfluency removal is an issue for monotonic parsing in that if an action creates a dependency relation, the other actions cannot repair that dependency relation. The main idea proposed by RT13 is to change the original arc-eager algorithm to a non-monotonic one so it is possible to repair a dependency tree while detecting disfluencies, by incorporating three new actions (one for each disfluency type) into a two-tiered classification process. The structure is shown in Figure 1(a). In short, at each state the parser first decides between the three new actions and a parse action (C1). If the latter is selected, another classifier (C2) is used to select the best parse action as in normal arc-eager parsing.
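A minimal (monotonic) arc-eager transition system, with ROOT as token 0, might look like the sketch below; RT13's non-monotonic extension adds its repair actions on top of these four, which is precisely what lets it undo arcs that this version can never retract:

```python
def arc_eager(n_words, actions):
    """Run a fixed action sequence over tokens 1..n_words (0 is ROOT).
    Arcs, once added, are never removed -- the monotonic property."""
    stack, buf, arcs = [0], list(range(1, n_words + 1)), set()
    for act in actions:
        if act == "SHIFT":
            stack.append(buf.pop(0))
        elif act == "LEFT-ARC":    # head = front of buffer, dep = top of stack
            arcs.add((buf[0], stack.pop()))
        elif act == "RIGHT-ARC":   # head = top of stack, dep = front of buffer
            arcs.add((stack[-1], buf[0]))
            stack.append(buf.pop(0))
        elif act == "REDUCE":
            stack.pop()
    return arcs                    # set of (head, dependent) pairs
```

For a two-word sentence like "John runs" (John a dependent of runs, runs attached to ROOT), the sequence SHIFT, LEFT-ARC, RIGHT-ARC yields the arcs (2, 1) and (0, 2).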

In the literature, various techniques have been proposed to develop context-aware systems, including rule-based techniques [10], [3], [11]. In rule-based techniques, a context-aware system is composed of a set of rule-based agents, and the firing of rules that infer new contexts determines context changes and represents the overall behavior of the system. In this work, we model context-aware systems as multi-agent rule-based defeasible reasoning systems. In order to model contexts and rules we use an ontological approach. An ontology can represent a model of a domain of discourse that introduces a vocabulary to specify the concepts relevant to the domain and their relationships. The logic behind ontological knowledge representation is known as description logic (DL). The ability to model a domain and the decidable computational characteristics make DLs the basis for widely accepted ontology languages such as OWL [12]. For context modeling we use OWL 2 RL, a profile of the new standardization OWL 2, and based on pD ∗

This paper investigates the effects of monetary policy on long-run economic growth via different cash-in-advance constraints on R&D in a Schumpeterian growth model with vertical and horizontal innovation. The relationship between inflation and growth is contingent on the relative extents of CIA constraints and diminishing returns to two types of innovation. The model can generate a mixed (monotonic or non-monotonic) relationship between inflation and growth, given that the relative strength of monetary effects on growth between different CIA constraints and that of R&D-labor-reallocation effects between different diminishing returns vary with the nominal interest rate. In the empirically relevant case where horizontal R&D suffers from greater diminishing returns than vertical R&D, inflation and growth can exhibit an inverted-U relationship when the CIA constraint on horizontal R&D is sufficiently larger than that on vertical R&D. Finally, the model is calibrated to the US economy, and we find that the growth-maximizing rate of inflation is around 2.8%, which is closely consistent with recent empirical estimates.


Chen and Wu [13] proposed a two-species commensal symbiosis model with non-monotonic functional response; they showed that the unique positive equilibrium of the system is globally asymptotically stable. Stimulated by the work of Chakraborty, Das and Kar [24], we further incorporate a harvesting term into the system; however, the harvesting is restricted to a limited area. Our study shows that the harvesting effort and the area open to harvesting play an essential role in the dynamic behaviors of the system. Indeed, depending on the parameter m, both species may be driven to extinction; or one of the species may be driven to extinction while the other is permanent; or both species may coexist in a stable state.
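A forward-Euler sketch of a commensal pair with a Holling type IV (non-monotonic) response is given below. The parameter values are arbitrary and the formulation is only indicative of this class of models, not the exact system studied in [13] or [24]:

```python
def commensal_step(x, y, dt, a1=1.0, b1=1.0, c1=0.5, a2=1.0, b2=1.0, d=1.0):
    # x benefits from y through y/(d + y*y), which first rises and then
    # falls in y (non-monotonic); y grows logistically, unaffected by x.
    dx = x * (a1 - b1 * x + c1 * y / (d + y * y))
    dy = y * (a2 - b2 * y)
    return x + dx * dt, y + dy * dt
```

With these default parameters the trajectory settles at y* = a2/b2 = 1 and x* = (a1 + c1·y*/(d + y*²))/b1 = 1.25, i.e. a stable coexistence state of the kind described above.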


In this paper, a framework for non-conscious ways of reasoning has been presented, based on fuzzy multivalued logic, fuzzy semantics and frame-oriented knowledge representation.

It is easy to construct examples where the suggested sigmoid function does not approximate the desired monotonic conditional probability function well. Figure 1 illustrates the conditional probability approximations (for a sample consisting of 96 random numbers that are evenly split between both classes) for Platt's approach and for the special one-dimensional case of the algorithm described further in this paper.
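For reference, a bare-bones one-dimensional sigmoid fit of the kind Platt proposed can be written as plain gradient descent on the log-loss. This sketch uses the standard logistic parameterization and omits Platt's target smoothing and second-order optimizer, so it is an approximation of the method rather than a faithful reimplementation:

```python
import math

def platt_fit(scores, labels, lr=0.1, iters=5000):
    """Fit p(y=1|s) = 1 / (1 + exp(-(a*s + b))) by gradient descent
    on the average log-loss over (score, 0/1-label) pairs."""
    a, b = 0.0, 0.0
    n = len(scores)
    for _ in range(iters):
        ga = gb = 0.0
        for s, y in zip(scores, labels):
            p = 1.0 / (1.0 + math.exp(-(a * s + b)))
            ga += (p - y) * s / n   # d(log-loss)/da
            gb += (p - y) / n       # d(log-loss)/db
        a -= lr * ga
        b -= lr * gb
    return a, b
```

Because the fitted map is a single sigmoid in s, it is always monotonic in the score, which is exactly why it can fail on the kinds of counterexamples mentioned above.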


(e.g., enhanced survival of fecal microorganisms in colder water temperatures, release of nitrogen through decomposition) and management activities (e.g., fertilizer applications, tillage) often contribute to seasonal variation. Thus, some techniques beyond controlling for the effects of a flow covariate are often necessary for water quality trend analysis. Some trend analysis techniques require you to define a "season" in advance. Examination of box plots of data by season or other graphical displays may help identify reasonable divisions. In general, seasons should be just long enough so that some data are available for most of the seasons in most years of monitoring. If data are collected or aggregated on a monthly frequency, for example, seasons should be defined representing each of the 12 months. If data are considered in quarterly blocks, there should be four seasons. In agricultural settings, it may make sense to consider either two or four "seasons": cropping and non-cropping, or non-cropping, seed preparation, cropping, and harvest.
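One common technique of this kind, the seasonal Kendall test, computes Kendall's S statistic within each season separately and sums the results, so that cross-season comparisons never enter the trend statistic. A minimal sketch of the S computation (without the variance and significance machinery) might look like:

```python
from collections import defaultdict

def seasonal_kendall_s(values, seasons):
    """Sum Kendall's S over the subseries of each season label.
    values are assumed to be in time order; seasons gives each
    observation's season label (e.g. month number or 'cropping')."""
    groups = defaultdict(list)
    for v, s in zip(values, seasons):
        groups[s].append(v)
    total = 0
    for series in groups.values():
        # Kendall's S: count concordant minus discordant time-ordered pairs
        for i in range(len(series)):
            for j in range(i + 1, len(series)):
                diff = series[j] - series[i]
                total += (diff > 0) - (diff < 0)
    return total
```

A series that increases within each season yields a large positive S even if, say, cropping-season values are always lower than non-cropping ones, which is precisely the seasonal confounding the test avoids.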


(rescaled to a general interval [a, b]) as well as its converse given in part (b) of Theorem .. Pečarić and Raşa [] extended the inequality by using the method of index set functions; in the process they weakened assumption () and obtained a monotonic refinement of the inequality.

The completely monotonic functions are useful in various branches, including mathematical analysis, probability theory and numerical analysis. Many papers have appeared providing inequalities for the gamma and various related functions. See, for example, (Alzer & Batir, 2007; Alzer & Felder, 2009; Alzer & Grinshpan, 2007; Gao, 2011; Mortici, 2011; Salem & Kamel, 2015). Kazarinoff (1956) proved that the function θ( ) which appeared in Wallis's formula

The latter were employed by Stanley [St] in 1981 to establish log-concavity for the sequence N_1, N_2, ..., N_{|P|}, where N_i counts the number of linear extensions in which an element x in P has rank i. This fundamental theorem motivated the parallel results by D.E. Daykin, J.W. Daykin and M.S. Paterson [DDP] in 1984 for both strict and non-strict order-preserving maps. In this case we were able to construct an explicit injection to prove the inequality. These results are presented in Chapter 4.2.


This abstract argumentation practice may be utilised to model a knowledge-base's rules more intuitively, in a way that can be appreciated by domain experts and logically followed by those familiar with the notation, while also being tractable to a coded implementation, providing a means both to automate such processes efficiently and to tackle relatively massive ontologies that would otherwise be a challenge to compute manually for every case possible from imported data. The process for developing such a structure starts with acquiring the arguments from the evidence within the knowledge-base, usually natural language propositions or more structured arguments in a particular language such as logic (Longo, 2014, p.63). The internal structure is then created via monological logic principles, with inference rules that link premises to conclusions, and these models may then be structured with one another via dialogical models, creating attacks. An argumentation framework is formed and the attacks amongst arguments are qualified as being successful or not, via preferentiality etc.; finally, the dialectical status of arguments is assessed to determine which arguments will ultimately be accepted or rejected under the chosen acceptability semantics, and these will lead to the final inferences generated by the framework. There can be multiple, varying extensions possible under the different semantics, but it may be prudent to select one depending on the designer's preference (Longo, 2014, p.68).
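One widely used acceptability semantics, the grounded extension, can be computed as the least fixed point of the characteristic function over an abstract framework. The sketch below works on bare argument names and attack pairs, abstracting away the internal structure discussed above:

```python
def grounded_extension(args, attacks):
    """attacks: set of (attacker, target) pairs. An argument is acceptable
    w.r.t. a set E if every one of its attackers is itself attacked by some
    member of E; iterating this from the empty set up to a fixed point
    yields the (unique) grounded extension."""
    attackers = {a: {b for (b, t) in attacks if t == a} for a in args}
    e = set()
    while True:
        new = {a for a in args
               if all(any((d, b) in attacks for d in e) for b in attackers[a])}
        if new == e:
            return e
        e = new
```

For the chain a attacks b, b attacks c, the grounded extension is {a, c}: a is unattacked, so it defeats b, which in turn reinstates c.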
