divergence measures

Top PDF divergence measures:

Assessing Divergence Measures for Automated Document Routing in an Adaptive MT System

To provide quick and accurate calculations of the Jensen-Shannon measure, we constructed an in-house software tool (Jaja et al., forthcoming). The tool has a “basic” mode where the user can upload two corpora and see the resulting Jensen-Shannon divergence measure (as well as other divergence measures) and the type and token counts. This mode can display a frequency-sorted or an alphabetically-sorted list of each corpus's types as well as a list of the unique and intersecting types between the two. Additionally, the tool has a “batch” mode where the user can upload multiple datasets to calculate the divergence measure scores for all pairwise comparisons and then average over these.
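A minimal Python sketch of the kind of computation the “basic” mode performs, assuming only that the tool compares relative type frequencies; the function names and toy corpora are hypothetical and are not taken from Jaja et al.:

    from collections import Counter
    from math import log2

    def type_distribution(tokens):
        # Relative frequency of each type (distinct token) in a corpus.
        counts = Counter(tokens)
        total = sum(counts.values())
        return {t: c / total for t, c in counts.items()}

    def jensen_shannon(p, q):
        # Base-2 Jensen-Shannon divergence between two type distributions.
        types = set(p) | set(q)
        m = {t: 0.5 * (p.get(t, 0.0) + q.get(t, 0.0)) for t in types}
        kl = lambda a: sum(a[t] * log2(a[t] / m[t]) for t in a if a[t] > 0)
        return 0.5 * kl(p) + 0.5 * kl(q)

    corpus_a = "the cat sat on the mat".split()
    corpus_b = "the dog lay on the rug".split()
    print(jensen_shannon(type_distribution(corpus_a), type_distribution(corpus_b)))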

ON CONVEXITY OF WEIGHTED FUZZY MEAN DIVERGENCE MEASURES

The complexity of life has given birth to uncertainties, and fuzzy uncertainty is among them. Zadeh [6] introduced the concept of fuzzy set theory, and thereafter a great deal of research work in this discipline has eased that complexity. In the present study, our main objective is to discuss fuzzy mean divergence measures. Recently, [2, 3] considered some means analogous to the information-theoretic mean divergence measures studied by [4, 5]. Because the importance of an event or experiment depends on the observer, we utilize the weighted distribution corresponding to the fuzzy set-theoretic distribution and consider the following fuzzy information scheme.

Solution to a Function Equation and Divergence Measures

As early as 1952, Chernoff [1] used the α-divergence to evaluate classification errors. Since then, the study of various divergence measures has attracted many researchers. So far, we know that the Csiszár f-divergence is the unique class of divergences having information monotonicity, from which the dual-α geometrical structure with the Fisher metric is derived, and that the Bregman divergence is another class of divergences that gives a dually flat geometrical structure, in general different from the α-structure. A divergence measure between two probability distributions or positive measures has proved a useful tool for solving problems in optimization, signal processing, machine learning, and statistical inference. For more information on the theory of divergence measures, see, for example, [2–5] and the references therein.
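For reference, the two classes mentioned are usually written as follows (standard textbook definitions, recalled here rather than quoted from the paper): the Csiszár f-divergence for a convex generator f with f(1) = 0, and the Bregman divergence for a differentiable strictly convex function φ,

\[
D_f(P \| Q) = \sum_i q_i \, f\!\left(\frac{p_i}{q_i}\right),
\qquad
B_\varphi(x, y) = \varphi(x) - \varphi(y) - \langle \nabla \varphi(y),\, x - y \rangle .
\]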

ON BOUNDS FOR LOGARITHMIC NON-SYMMETRIC WEIGHTED DIVERGENCE MEASURES

Divergence measures have played a vital role in testing the reliability of information, statements, or systems. Whether to minimize or maximize them depends upon the goal or strategy of the experimenter. Recently, Ruchi and Singh [3] applied different divergence measures to profit maximization in the share market. The same study has been extended to the decision-making process for world university ranking problems, where divergence measures have been tested to correlate the different ranking parameters.

A Refinement of Jensen's Inequality with Applications for f-Divergence Measures

The main aim of the present paper is to establish a different refinement of the Jensen inequality for convex functions defined on linear spaces. Natural applications for the generalised triangle inequality in normed spaces and for the arithmetic mean-geometric mean inequality for positive numbers are given. Further applications for f-divergence measures of Csiszár, with particular instances for the total variation distance, χ²-divergence, Kullback-Leibler and Jeffreys divergences, are provided as well.
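The particular instances named correspond to standard generator functions of the Csiszár f-divergence; the following pairings are well-known conventions (up to normalisation), listed here for orientation rather than quoted from the paper:

\[
f(t) = |t - 1| \;\rightarrow\; \text{total variation},
\qquad
f(t) = (t - 1)^2 \;\rightarrow\; \chi^2\text{-divergence},
\]
\[
f(t) = t \ln t \;\rightarrow\; \text{Kullback-Leibler},
\qquad
f(t) = (t - 1) \ln t \;\rightarrow\; \text{Jeffreys}.
\]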

Stolarsky and Gini Divergence Measures in Information Theory

One of the important issues in many applications of Probability Theory is finding an appropriate measure of distance (or difference or discrimination) between two probability distributions. A number of divergence measures for this purpose have been proposed and extensively studied by Jeffreys [14], Kullback and Leibler [18], Rényi [27], Havrda and Charvat [12], Kapur [15], Sharma and Mittal [28], Burbea and Rao [4], Rao [26], Lin [20], Csiszár [6], Ali and Silvey [1], Vajda [35], Shioya and Da-te [29] and others (see for example [15] and the references therein).

Preliminary Test Estimators and Phi-divergence Measures in Pooling Binomial Data

In this paper we focus on the problem of pooling proportions of two independent random samples taken from two possibly identical binomial distributions (see Ahmed (1991)). That author considered a preliminary test based on the restricted maximum likelihood estimator and the classical Pearson test statistic. In this paper, instead of the restricted maximum likelihood estimator we consider the restricted minimum phi-divergence estimator, and instead of Pearson's test statistic a family of phi-divergence test statistics. We therefore introduce a family of preliminary test estimators for the problem of pooling binomial data that contains, as a particular case, the preliminary test estimator considered by Ahmed.
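As rough orientation, the general form of a phi-divergence test statistic in the phi-divergence testing literature (e.g. Pardo), not necessarily the exact statistic used in this paper, compares two estimates p̂ and p̃ through

\[
T^{\phi}_n = \frac{2n}{\phi''(1)} \, D_{\phi}(\hat{p}, \tilde{p}),
\qquad
D_{\phi}(p, q) = \sum_i q_i \, \phi\!\left(\frac{p_i}{q_i}\right),
\]

which reduces to Pearson's statistic for the choice φ(t) = ½(t - 1)².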

Some Inequalities for f-Divergence Measures Generated by 2n-Convex Functions

Let (Ω, A, µ) be a measure space satisfying |A| > 2 and µ a σ-finite measure on Ω. Let P be the set of all probability measures on the measurable space (Ω, A) which are absolutely continuous with respect to µ. For P, Q ∈ P, let p = dP/dµ and q = dQ/dµ denote the Radon-Nikodym derivatives of P and Q with respect to µ. Two probability measures P, Q ∈ P are said to be orthogonal, and we denote this by Q ⊥ P, if
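The excerpt stops at the orthogonality condition; the standard definitions in this measure-theoretic setting (recalled here, not quoted from the paper) are that Q ⊥ P when the two measures are concentrated on disjoint sets, and that the Csiszár f-divergence is defined through the densities p and q:

\[
Q \perp P \iff \exists A \in \mathcal{A} : \; P(A) = 0 \ \text{and} \ Q(A) = 1,
\qquad
I_f(Q, P) = \int_{\Omega} p \, f\!\left(\frac{q}{p}\right) d\mu .
\]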

On bounds of some dynamic information divergence measures

However, in many applied problems, viz. reliability, survival analysis, economics, business, actuarial science, etc., one has information only about the current age of the systems, and the relevant distributions are thus dynamic. The discrimination information function between two residual lifetime distributions based on Rényi's information divergence of order α is then given by
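The excerpt ends before the display; one common form of this dynamic divergence in the reliability literature (an assumption about the missing formula, with density/survival pairs (f, F̄) and (g, Ḡ)) is

\[
I_{\alpha}(X, Y; t) = \frac{1}{\alpha - 1}
\log \int_{t}^{\infty}
\left(\frac{f(x)}{\bar{F}(t)}\right)^{\alpha}
\left(\frac{g(x)}{\bar{G}(t)}\right)^{1 - \alpha} dx,
\qquad \alpha > 0, \ \alpha \neq 1 .
\]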

Portfolio optimization based on divergence measures

where W is the set of weights that produces feasible portfolios for the investor, and c is the target value of the Reward quantity. Its dual formulation consists of maximizing the Reward measure given a target Risk value. In the MV framework, the risk measure is represented by the covariance matrix of the asset returns. However, this representation suffers from the fact that it treats both sides of the financial return distribution as equally risky. Over the years, new risk measures have emerged which consider only the downside of the return distribution. Examples are the semi-variance introduced by Markowitz [1959] and the lower partial risk measure introduced by Fishburn [1977] and further developed by Sortino and Van Der Meer [1991]. Roy [1952] introduced another method of portfolio allocation, known as the safety-first principle. It consists of finding the allocation that has the smallest probability of ruin. Over the years, this concept has evolved into the value-at-risk (VaR), advocated by Jorion [1997], and the conditional value-at-risk (CVaR), introduced by Rockafellar and Uryasev [2000]. Another famous portfolio selection framework is that of Black and Litterman [1992], where the investor can include his personal views on the evolution of the market in the portfolio criteria. The incorporation of views into portfolio selection has been extended by Meucci [2008] into the entropy pooling approach.
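The excerpt opens mid-sentence; the optimization problem it refers to is presumably the generic risk-reward formulation sketched below (the mean-variance instance on the right matches the MV framework mentioned in the text, with Σ the return covariance matrix and μ the mean return vector):

\[
\min_{w \in W} \ \mathrm{Risk}(w) \quad \text{subject to} \quad \mathrm{Reward}(w) \ge c,
\qquad\text{e.g.}\quad
\mathrm{Risk}(w) = w^{\top} \Sigma w, \ \ \mathrm{Reward}(w) = w^{\top} \mu .
\]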

Series of New Information Divergences, Properties and Corresponding Series of Metric Spaces

Abstract: Divergence measures are basically measures of distance between two probability distributions, or, equivalently, tools for comparing two probability distributions. Depending on the nature of the problem, different divergences are suitable, so it is always desirable to create new divergence measures.

Some Slater's Type Inequalities for Convex Functions Defined on Linear Spaces and Applications

The main aim of the present paper is to extend Slater's inequality for convex functions defined on general linear spaces. A reverse of the Slater inequality is also obtained. Natural applications for norm inequalities and f-divergence measures are provided as well.
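For orientation, the classical finite-dimensional Slater inequality being extended (as usually stated; this is not the paper's linear-space generalisation): for a convex f on an interval I, points x_i ∈ I and weights p_i ≥ 0 with P_n = Σ p_i > 0, provided Σ p_i f'_+(x_i) ≠ 0 and the argument on the right-hand side lies in I,

\[
\frac{1}{P_n} \sum_{i=1}^{n} p_i f(x_i)
\le
f\!\left( \frac{\sum_{i=1}^{n} p_i x_i f'_{+}(x_i)}{\sum_{i=1}^{n} p_i f'_{+}(x_i)} \right).
\]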

Empirical likelihood-based adjustment methods

Ideas from survey sampling will be used to generalize previously mentioned methods for nonparametric covariance adjustment. The proposed statistical methods will involve the use of criteria based on alternative divergence measures (other than Neyman’s MMCS implicitly used by the nonparametric covariance adjustment methods), the construction of confidence intervals based on test-inversion (as an alternative to the usual confidence intervals based on the asymptotic normality of the point estimator), the estimation of more general parameters of interest (other than differences between means), the use of more general side information constraints (other than the constraint of equal means for the covariates), and more general stratified versions.

Sharp Bounds for the Deviation of a Function from the Chord Generated by its Extremities and Applications

for all real orders α ≠ 0, α ≠ 1 (and continuously extended for α = 0 and α = 1) in [11], where the reader may find many inequalities valid for these divergences, both without and with some restrictions on p and q. For other examples of divergence measures, see the paper [9] and the books [11] and [15], where further references are given.

IPQpq …….(1.1)

New divergence measures and their relationships with well-known divergence measures have also been studied by Kumar and Chhina [14], Kumar and Hunter [13], and Kumar and Johnson [15]. The J-divergence equals the average of the two possible KL-distances between two probability distributions.
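In symbols, the stated relation (using the averaging convention of the excerpt; some authors define J as the sum rather than the average) is

\[
J(P, Q) = \tfrac{1}{2}\left[ D_{\mathrm{KL}}(P \| Q) + D_{\mathrm{KL}}(Q \| P) \right],
\qquad
D_{\mathrm{KL}}(P \| Q) = \sum_i p_i \log \frac{p_i}{q_i} .
\]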

A non symmetric divergence and kullback leibler divergence measure

Information divergence measures and their bounds are well known in the literature of information theory. A non-symmetric information divergence and a symmetric divergence measure expressed in terms of the Kullback-Leibler divergence measure have been studied. Numerical bounds of the new divergence measures are obtained.

New information inequalities on new f-divergence by using Ostrowski's inequalities and its application

Divergence measures are basically measures of distance between two probability distributions, i.e., tools for comparing two probability distributions; a divergence measure is directly proportional to the distance between the two distributions. It follows that any divergence measure must take its minimum value, zero, when the probability distributions are equal, and its maximum when the probability distributions are orthogonal to each other. So any divergence measure must increase as the probability distributions move apart.
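Stated compactly, the requirements described are (a restatement in symbols, not a quotation from the paper)

\[
D(P, Q) \ge 0,
\qquad
D(P, Q) = 0 \iff P = Q,
\qquad
D(P, Q) \ \text{maximal when } P \perp Q .
\]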

Vol 5, No 3 (2014)

Key Words: difference of generalized φ-divergence measures, convex and normalized function, new information inequalities, new divergence measures, bounds of new divergence measures.

Modified Kapur’s Measures of Entropy and Directed Divergence on Imposition of Inequality Constraints on Probabilities

The problem of the maximum entropy distribution was solved by Freund and Saxena [3], not merely by maximizing Shannon's entropy [2] but also in the absence of moment constraints, and they determined an algorithm for obtaining the MAXENT distribution. Kapur [1] provided event-conditional entropy and cross entropy, defined cross-entropy measures, and used them to solve a number of entropy and cross-entropy minimization and maximization problems incorporating inequality constraints on probabilities, as well as to obtain maximum entropy probability distributions subject to inequality constraints on probabilities. The objective of the present paper is to examine some modified versions of Kapur's [1] measures of entropy when inequality constraints are imposed on probabilities; we point out the infirmities of Kapur's [1] measures of entropy and then study the measures of entropy revised from them. The corresponding global measures of entropy, measures of directed divergence, measures of inaccuracy, symmetric directed divergence, measures of information improvement, and generalised information improvement are obtained corresponding to the new measures of entropy.

Some New Inequalities for Hermite-Hadamard Divergence in Information Theory

One of the important issues in many applications of Probability Theory is finding an appropriate measure of distance (or difference or discrimination) between two probability distributions. A number of divergence measures for this purpose have been proposed and extensively studied by Jeffreys [1], Kullback and Leibler [2], Rényi [3], Havrda and Charvat [4], Kapur [5], Sharma and Mittal [6], Burbea and Rao [7], Rao [8], Lin [9], Csiszár [10], Ali and Silvey [12], Vajda [13], Shioya and Da-te [40] and others (see for example [5] and the references therein).
