Kullback-Leibler relative entropy

Financial Portfolios based on Tsallis Relative Entropy as the Risk Measure

... Tsallis relative entropy, which is the generalization of Kullback-Leibler relative entropy to non-extensive systems, is investigated as a possible risk measure in constructing ...
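
For reference (an addition here, not quoted from the paper), one common form of the Tsallis relative entropy of order q is

    D_q(p \| r) = \frac{1}{q-1} \Big( \sum_i p_i^q \, r_i^{1-q} - 1 \Big),

which recovers the Kullback-Leibler relative entropy \sum_i p_i \ln(p_i / r_i) in the extensive limit q \to 1.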

New Estimates of the Kullback-Leibler Distance and Applications

... Definition 2. Consider two random variables X and Y with a joint probability mass function p(x, y) and marginal probability mass functions p(x) and q(y). The mutual information is the relative entropy ...
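
Spelled out in full (standard material, supplied here for context), the definition the snippet truncates is: the mutual information is the relative entropy between the joint distribution and the product of the marginals,

    I(X; Y) = D\big( p(x, y) \,\|\, p(x)\, q(y) \big) = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, q(y)}.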

Adverse Factors for the Technology Modernization: Local Public Management

... hand, relative entropy statistics may also be used in goodness-of-fit testing, alongside classical test statistics like Z scores, Z-square scores and the classical chi-square ... distribution, ...
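
The connection alluded to is the standard one: for observed counts O_i and expected counts E_i = N p_i from N observations, the log-likelihood-ratio (G) statistic is 2N times a relative entropy,

    G = 2 \sum_i O_i \ln \frac{O_i}{E_i} = 2N \, D(\hat{p} \,\|\, p), \qquad \hat{p}_i = O_i / N,

and, like the classical chi-square statistic, it is asymptotically chi-square distributed under the null.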

A New Upper Bound for the Kullback-Leibler Distance and Applications

... The relative entropy is a measure of the distance between two ... The relative entropy D(p || q) is a measure of the inefficiency of assuming that the distribution is q when the true ...
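
That inefficiency statement can be made exact (standard coding-theory material, added here for context): encoding symbols drawn from p with a code optimized for q, with ideal code lengths \log 1/q(x), costs

    \mathbb{E}_p[\ell] = \sum_x p(x) \log \frac{1}{q(x)} = H(p) + D(p \,\|\, q),

i.e. exactly D(p || q) bits above the entropy, up to the usual single bit lost to integer rounding of code lengths.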

Algorithms for Kullback-Leibler approximation of probability measures in infinite dimensions

... Improving the efficiency of MCMC algorithms is a topic attracting a great deal of current interest, as many important PDE-based inverse problems result in target distributions μ for which Φ_μ is computationally expensive ...

Some Inequalities For The Kullback-Leibler And χ²-Distances In Information Theory And Applications

... To be more precise, consider two random variables X and Y with a joint probability mass function r(x, y) and marginal probability mass functions p(x) and q(y), x ∈ X, y ∈ Y. The mutual information is the relative ...

Bounds for Kullback-Leibler divergence

... Abstract. Entropy, conditional entropy and mutual information for discrete-valued random variables play important roles in the information ... for relative entropy D(p||q) of two probability ...
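
A canonical bound of this kind, quoted here for reference, is Pinsker's inequality, which lower-bounds the divergence by the total variation distance:

    D(p \,\|\, q) \ge \frac{1}{2} \, \| p - q \|_1^2 \quad \text{(in nats)}.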

Gaussian approximations for probability measures on R^d

... by the famous Bernstein–Von Mises (BvM) theorem [28] in asymptotic statistics. Roughly speaking, the BvM theorem states that under mild conditions on the prior, the posterior distribution of a Bayesian procedure ...
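
One standard informal statement (added here for context): for a smooth finite-dimensional model with true parameter \theta_0, the posterior is asymptotically Gaussian,

    \Pi(\cdot \mid X_{1:n}) \approx N\big( \hat{\theta}_n, \; n^{-1} I(\theta_0)^{-1} \big) \quad \text{in total variation},

where \hat{\theta}_n is the maximum likelihood estimator and I(\theta_0) the Fisher information, so the influence of the prior washes out as n grows.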

Approximations of Shannon Mutual Information for Discrete Variables with Applications to Neural Population Coding

... response entropy H(R), because here one only needs to calculate the Kullback-Leibler divergence D(x || x̂), which may have a smaller estimation ...

Dimensional Reduction of Statistical Structure of a Paper by Information Geometry

... High dimensional data visualization and interpretation have become increasingly important for data mining, information retrieval, and information discrimination applications arising in areas such as search engines, ...

Discrimination between Gamma and Log-Normal Distributions by Ratio of Minimized Kullback-Leibler Divergence

... the Kullback-Leibler information is a measure of uncertainty between two functions, hence in this paper, we examine the use of Kullback-Leibler Divergence (KLD) in discriminating either the ...
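
Minimizing the divergence from the empirical distribution to a parametric family is equivalent to maximizing the likelihood within that family, so the discrimination rule reduces to comparing maximized log-likelihoods. A minimal Python sketch of that reduction (scipy and the floc=0 constraint are assumptions here, not the paper's exact procedure):

    import numpy as np
    from scipy import stats

    def discriminate(data):
        """Pick the family (gamma vs. log-normal) closer to the data in KL."""
        # Maximum-likelihood fits, location parameter fixed at 0.
        a, loc_g, scale_g = stats.gamma.fit(data, floc=0)
        s, loc_l, scale_l = stats.lognorm.fit(data, floc=0)
        # The larger maximized log-likelihood corresponds to the smaller
        # minimized KL divergence from the empirical distribution.
        ll_gamma = stats.gamma.logpdf(data, a, loc_g, scale_g).sum()
        ll_lognorm = stats.lognorm.logpdf(data, s, loc_l, scale_l).sum()
        return "gamma" if ll_gamma > ll_lognorm else "log-normal"

    # Toy check with data actually drawn from a gamma distribution.
    rng = np.random.default_rng(0)
    print(discriminate(rng.gamma(shape=2.0, scale=3.0, size=500)))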

On preferred point geometry in statistics

... Abstract. A brief synopsis of progress in differential geometry in statistics is followed by a note of some points of tension in the developing relationship between these disciplines. The preferred point nature of much ...

Zipf–Mandelbrot law, f-divergences and the Jensen-type interpolating inequalities

... Motivated by the method of interpolating inequalities that makes use of the improved Jensen-type inequalities, in this paper we integrate this approach with the well known Zipf–Mandelbrot law applied to various types of ...
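
For reference, the Zipf–Mandelbrot law over ranks i = 1, ..., N with parameters q \ge 0 and s > 0 is the probability mass function

    f(i; N, q, s) = \frac{1/(i + q)^s}{H_{N, q, s}}, \qquad H_{N, q, s} = \sum_{k=1}^{N} \frac{1}{(k + q)^s},

which reduces to Zipf's law at q = 0.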

Stochastic Wasserstein Autoencoder for Probabilistic Sentence Generation

... However, training VAEs in NLP is more difficult than in the image domain (Kingma and Welling, 2014). The VAE training involves a reconstruction loss and a Kullback-Leibler (KL) divergence between the ...
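
When the approximate posterior is a diagonal Gaussian N(\mu, \sigma^2) and the prior is standard normal, the KL term mentioned here has a closed form, 0.5 \sum (\mu^2 + \sigma^2 - \log \sigma^2 - 1). A minimal numpy sketch (the function name and shapes are illustrative):

    import numpy as np

    def kl_to_standard_normal(mu, logvar):
        """KL( N(mu, diag(exp(logvar))) || N(0, I) ), summed over latent dims."""
        return 0.5 * np.sum(mu**2 + np.exp(logvar) - logvar - 1.0, axis=-1)

    # A posterior identical to the prior has zero divergence.
    print(kl_to_standard_normal(np.zeros(8), np.zeros(8)))  # -> 0.0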

Derivation of the PHD and CPHD Filters Based on Direct Kullback-Leibler Divergence Minimization

... and γ(X′|X) is the PDF of the state X′ given the state X. As in single-target filtering, the prediction and update steps cannot be computed in closed form in general. In single-target filtering, a well-known ...

Sensitivity to Prior Specification in Bayesian Identification of Autoregressive Time Series Models

... the Kullback-Leibler (KL) divergence (Kullback and Leibler, 1951) to measure the distance between the posteriors of the AR model order, resulting from different types of priors, in order to ...
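
For a discrete quantity such as the AR model order, the divergence between two posteriors is a finite sum. A minimal Python sketch (the pmfs are toy numbers, not results from the paper):

    import numpy as np

    def kl_divergence(p, q):
        """D(p || q) = sum_k p_k log(p_k / q_k) for discrete pmfs p and q.

        Assumes q_k > 0 wherever p_k > 0; terms with p_k = 0 contribute 0.
        """
        p, q = np.asarray(p, float), np.asarray(q, float)
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    # Posteriors over the AR order under two different priors (toy numbers).
    print(kl_divergence([0.1, 0.6, 0.2, 0.1], [0.25, 0.25, 0.25, 0.25]))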

Strong Consistency of the Prototype Based Clustering in Probabilistic Space

... In this paper we formulate in general terms an approach to prove strong consistency of the Empirical Risk Minimisation inductive principle applied to prototype- or distance-based clustering. This approach was ...

Survey on Change Detection in SAR Images with Image Fusion and Image Segmentation

... the relative change in the average intensity between the two dates and not on a reference intensity ... the Kullback-Leibler (KL) divergence was used as change ...

Kullback-Leibler divergence based wind turbine fault feature extraction

... The paper addresses the problem of fault feature extraction and selection of monitoring variables by employing Kullback-Leibler divergence (KLD) and kernel support vector machine (KSVM). In this paper, ...

On the Importance of the Kullback-Leibler Divergence Term in Variational Autoencoders for Text Generation

... 2 Kullback-Leibler Divergence in VAE: We take the encoder-decoder of VAEs as the sender-receiver in a communication network. Given an input message x, a sender generates a compressed encoding of x, denoted by ...
