Matrix Decomposition and Its Application in Statistics NK

Imon, A. H. M. R. (2005). Identifying multiple influential observations in linear regression. Journal of Applied Statistics, 32.
Kumar, N., Nasser, M., and Sarker, S. C. (2011). A New Singular Value Decomposition Based Robust Graphical Clustering Technique and Its Application in Climatic Data. Journal of Geography and Geology, Canadian Center of Science and Education, Vol. 3, No. 1.
Ryan, T. P. (1997). Modern Regression Methods. Wiley, New York.

Advances in Nonnegative Matrix Decomposition with Application to Cluster Analysis

Hierarchical clustering organizes data into a cluster hierarchy, or binary tree known as a dendrogram, according to the proximity matrix. The root node of the dendrogram represents the whole dataset, and each leaf node is regarded as a single data object. Hierarchical clustering methods can be further divided into an agglomerative (bottom-up) mode and a divisive (top-down) mode. Agglomerative clustering starts with one-point (singleton) clusters and recursively merges two or more of the most similar clusters to form a cluster hierarchy, whereas divisive clustering starts with a single cluster containing all data points and recursively splits the most appropriate cluster into smaller clusters. The process continues until the stopping criterion (e.g., the requested number of clusters) is reached. To split or merge clusters, a linkage metric needs to be defined for measuring the distance between two clusters. The most popular metrics are single linkage, complete linkage, and average linkage (see the surveys in [135] and [40]), all of which can be derived from the Lance-Williams updating formula [106]. Typical agglomerative clustering algorithms include CURE (Clustering Using Representatives) by Guha et al. [69], CHAMELEON by Karypis et al. [94], and BIRCH (Balanced Iterative Reducing and Clustering using Hierarchies) by Zhang et al. [199]. CURE represents a cluster by a fixed set of points scattered around it, which makes it possible to handle clusters of arbitrary shapes. CHAMELEON uses a connectivity graph G sparsified by a K-nearest-neighbor model: only the edges to the K points most similar to any given point are preserved; the rest are dropped. BIRCH is designed for clustering very large databases, representing the data by statistical summaries instead of the original data features. For divisive clustering, a good example is the Principal Direction Divisive Partitioning (PDDP) algorithm presented by Boley [16], in which the author applied the SVD to hierarchical divisive clustering of document collections.
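The merge-until-stopping-criterion loop described above maps directly onto standard library calls. The following is a minimal sketch using SciPy; the toy two-blob dataset, the choice of average linkage, and the requested cluster count are illustrative assumptions rather than anything prescribed by the surveyed algorithms.

```python
# Minimal sketch of bottom-up (agglomerative) clustering with a linkage metric.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Toy data: two Gaussian blobs standing in for a real proximity structure.
X = np.vstack([rng.normal(0.0, 0.5, size=(20, 2)),
               rng.normal(3.0, 0.5, size=(20, 2))])

# Build the cluster hierarchy bottom-up; 'single', 'complete', and 'average'
# correspond to the linkage metrics discussed above.
Z = linkage(X, method="average", metric="euclidean")

# Cut the dendrogram at the requested number of clusters (the stopping criterion).
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)
```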

The Generalized Matrix Decomposition Biplot and Its Application to Microbiome Data

Biplots have gained popularity in the exploratory analysis of high-dimensional microbiome data. The traditional SVD-biplot is based on Euclidean distances between samples and cannot be directly applied when more general dissimilarities are used. Since Euclidean distances may not lead to an optimal low-dimensional representation of the samples, we have extended the concept of the SVD-biplot to allow for more general similarity kernels. The phylogenetically informed UniFrac distance, used in our examples, defines one such kernel. In settings where a general (possibly nonlinear) distance matrix is appropriate, our approach provides a mathematically rigorous and computationally efficient method, based on the GMD, that allows for plotting both the samples and variables with respect to the same coordinate system.
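The core step, turning a general dissimilarity matrix into sample coordinates, can be illustrated with classical multidimensional scaling (principal coordinates analysis). This is a simplified stand-in for the GMD-biplot described above, not the GMD itself, and the toy distance matrix standing in for UniFrac distances is an assumption.

```python
# Minimal sketch (not the paper's GMD): classical MDS / PCoA turns a general
# n x n dissimilarity matrix D into low-dimensional sample coordinates.
import numpy as np

def pcoa(D, k=2):
    """Embed an n x n dissimilarity matrix D into k dimensions."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                  # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)               # eigendecomposition (ascending)
    order = np.argsort(vals)[::-1][:k]
    vals, vecs = vals[order], vecs[:, order]
    vals = np.clip(vals, 0.0, None)              # guard against small negative eigenvalues
    return vecs * np.sqrt(vals)                  # sample coordinates

# Toy symmetric dissimilarity matrix standing in for a UniFrac distance matrix.
rng = np.random.default_rng(1)
P = rng.random((6, 3))
D = np.sqrt(((P[:, None, :] - P[None, :, :]) ** 2).sum(-1))
print(pcoa(D).shape)   # (6, 2)
```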

Recovery of Missing Values using Matrix Decomposition Techniques

1 and performs a bad recovery for the second block. None of the basic methods is a suitable technique for block recovery in regular time series where peaks and valleys follow a periodic model of varying amplitude or frequency, or in irregular time series. Kurucz et al. [MKT07] proposed a technique based on EM and the Singular Value Decomposition (SVD) [Mey00, Kal96, Bra02, ABB00] for comparing recommender systems where one of them contains missing values. A recovery of the missing values is performed before the comparison. Each recommender system is represented by one column of values in a rating matrix, which is decomposed using the SVD. The result of the decomposition is modified using a method called gradient boosting [SZB+11], and the EM algorithm is then applied to refine the result of the gradient boosting. The proposed solution dynamically discovers data dependencies from the coordinate axes that represent the recommender systems and uses more than one recommender system. However, applying gradient boosting to different recommender systems loses the dependencies among their original values. Therefore, this technique yields poor results for block recovery in cases where more than one recommender system contains missing blocks.
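As a generic illustration of SVD-based recovery of missing values (not the EM plus gradient-boosting pipeline of [MKT07]), the sketch below alternates between a truncated-SVD reconstruction and re-imputation of the missing entries; the rank and iteration count are illustrative assumptions.

```python
# Minimal sketch: EM-style imputation with a truncated SVD.
import numpy as np

def svd_impute(X, rank=2, n_iter=50):
    """Fill NaNs in X by alternating truncated-SVD reconstruction and re-imputation."""
    mask = np.isnan(X)
    filled = np.where(mask, np.nanmean(X, axis=0, keepdims=True), X)  # init with column means
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        low_rank = (U[:, :rank] * s[:rank]) @ Vt[:rank]   # rank-r reconstruction
        filled = np.where(mask, low_rank, X)              # keep observed entries fixed
    return filled

X = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [3.0, np.nan, 9.0],
              [4.0, 8.0, np.nan]])
print(svd_impute(X, rank=1).round(2))
```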

A Matrix Decomposition Method for Optimal Normal Basis Multiplication

whose elements are represented using a normal basis. Applications of finite field operations, particularly multiplication, are found in several areas, including cryptography, coding theory, and computer algebra. One of the most popular applications is in elliptic curve cryptography, which uses large values of k, usually from 160 to 521; however, smaller fields are also commonly used, e.g., in error-correcting codes.
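For reference, the sketch below shows the field multiplication itself in GF(2^k) using a polynomial basis, i.e. carry-less multiplication followed by reduction; the paper's method works in a normal basis, which is not reproduced here. The choice of k = 163 with the reduction polynomial x^163 + x^7 + x^6 + x^3 + 1 is an assumption for illustration.

```python
# Minimal sketch of multiplication in GF(2^163), polynomial basis.
K = 163
MOD = (1 << 163) | (1 << 7) | (1 << 6) | (1 << 3) | 1   # assumed reduction polynomial

def gf2k_mul(a: int, b: int) -> int:
    """Multiply two field elements given as k-bit integers (bit i = coefficient of x^i)."""
    # Carry-less multiplication of the two binary polynomials.
    prod = 0
    while b:
        if b & 1:
            prod ^= a
        a <<= 1
        b >>= 1
    # Reduce modulo the field polynomial.
    for i in range(prod.bit_length() - 1, K - 1, -1):
        if prod >> i & 1:
            prod ^= MOD << (i - K)
    return prod

a, b = 0b1011, 0b1101                      # small test elements
assert gf2k_mul(a, b) == gf2k_mul(b, a)    # multiplication is commutative
print(hex(gf2k_mul(a, b)))
```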

New Results on Hermitian Matrix Rank-One Decomposition

The first matrix rank-one decomposition method of this type was introduced by Sturm and Zhang in [25] as a means to characterize the matrix cone whose quadratic form is co-positive over a given domain. This naturally connects to the S-lemma of Yakubovich, since the S-lemma is concerned with the positivity of a quadratic form over the domain defined by the level set of another quadratic form. As a matter of fact, these results can be viewed as a duality pair. The matrix rank-one decomposition procedure proposed in [25] is easy to implement and can be considered a constructive proof of the S-lemma. For a survey on the S-lemma, we refer to Polik and Terlaky [24]. Applications of such matrix decomposition techniques can be quite versatile. For instance, Sturm and Zhang in [25] showed how one can solve the quadratic optimization problem whose constraint set is the intersection of an ellipsoid and a half-plane via a semidefinite programming (SDP) relaxation followed by a rank-one decomposition procedure. Following the approach adopted in Sturm and Zhang [25], Huang and Zhang [18] generalized the constructive technique to the domain of Hermitian PSD matrices, proving the complex version of Yakubovich's S-lemma and deriving an upper bound on the lowest rank among all optimal solutions of a standard complex SDP problem. Pang and Zhang [23] and Huang and Zhang [18] gave alternative proofs of some convexity properties of joint numerical ranges ([10, 4]) using matrix rank-one decomposition techniques. Up to that point, the matrix rank-one decomposition was meant to be a complete decomposition. Ai and Zhang [2] obtained a partial rank-one decomposition result for real symmetric positive semidefinite matrices and used it to fully characterize the condition under which strong duality holds for the (nonconvex) quadratic optimization problem over the intersection of two ellipsoids; this problem, known in the literature as the CDT subproblem, arises in the trust region method for nonlinear programming.
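For orientation, a complete rank-one decomposition of a Hermitian PSD matrix, A = sum_i p_i p_i^*, can always be read off from the spectral decomposition, as sketched below. The Sturm-Zhang procedure cited above builds the p_i so that each rank-one term additionally satisfies a prescribed quadratic equation; that refinement is not reproduced here, and the random test matrix is an assumption.

```python
# Minimal sketch: complete rank-one decomposition of a Hermitian PSD matrix.
import numpy as np

rng = np.random.default_rng(0)
B = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
A = B @ B.conj().T                      # a random Hermitian PSD matrix

vals, vecs = np.linalg.eigh(A)
vals = np.clip(vals, 0.0, None)         # numerical guard: PSD eigenvalues are >= 0
ps = [np.sqrt(v) * vecs[:, i] for i, v in enumerate(vals)]   # rank-one factors p_i

A_rebuilt = sum(np.outer(p, p.conj()) for p in ps)
print(np.allclose(A, A_rebuilt))        # True: A = sum_i p_i p_i^*
```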

Scalable Low-rank Matrix and Tensor Decomposition on Graphs

Fast tensor decomposition: Several methods proposing fast tensor decompositions have been presented in the past [117], [118], [119], [120], [121], [122], [123]. These methods can be divided into three types based on the technique involved: 1) fast alternating least squares (ALS) algorithms, which mostly rely on speeding up the convergence of various methods; 2) randomized sampling of the tensor in every dimension, where the tensor is iteratively sampled along all the modes except the one under consideration and ALS updates are then performed; and 3) sketching of the empirical moments of the tensor, an approach that has recently started appearing in the literature and requires the computation of higher-order moments of the dataset. However, none of these methods can be used for low-rank and sparse decomposition, and they do not use graphs to exploit intra-mode and inter-mode correlations. Thus, our goal here is to propose a scalable tensor low-rank decomposition framework that can exploit the intra- and inter-mode correlations. A straightforward extension would be to extend our FRPCAG framework to tensors as well. For simplicity we work with 3D tensors of the same size n and rank r in each dimension. Let Y ∈ R^{n×n×n} be a noisy tensor such that Y = X* + η, where X* ∈ MLT(P_{kμ}) and η models noise and errors. Furthermore, let Y_μ and X*_μ be the μ-th matricizations of Y and X*. Then, a straightforward extension of FRPCAG for recovering X would be to solve the following optimization problem:
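The mode-μ matricization Y_μ used in the formulation above can be illustrated in a few lines of NumPy; the sketch below shows one standard unfolding convention on an assumed toy tensor and does not reproduce the FRPCAG optimization itself.

```python
# Minimal sketch of mode-mu matricization (unfolding) of a 3-way tensor.
import numpy as np

def unfold(tensor, mode):
    """Return the mode-`mode` matricization: each column is a mode-`mode` fiber."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

Y = np.arange(2 * 3 * 4).reshape(2, 3, 4)   # toy 2 x 3 x 4 tensor
for mu in range(3):
    print(mu, unfold(Y, mu).shape)          # (2, 12), (3, 8), (4, 6)
```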

Simultaneous Pursuit of Sparseness and Rank Structures for Matrix Decomposition

Statistically, different structures have dramatically different interpretations. A low-rank property of a matrix describes global information across different tasks, whereas sparseness concerns local information of a specific task. For instance, for face images, the global information corresponds to the overall shape of a face, while the local information characterizes a specific facial expression such as laughing or crying. In a linear time-invariant (LTI) system, a low-rank property corresponds to a low-order LTI system, and a sparseness property captures an LTI system with a sparse impulse response (Porat, 1997). In a high-dimensional situation, betting on one type of structure may not be adequate to battle the curse of dimensionality. In this article, we seek a sparsest decomposition for the purpose of dimension reduction, from a class of overcomplete decompositions into simpler sparse and low-rank components. Specifically, a matrix Θ is decomposed as Θ₁ + Θ₂, for a sparse Θ₁ and a low-rank Θ₂.
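A generic way to compute such a split Θ = Θ₁ + Θ₂ is to alternate soft-thresholding (for the sparse part) with singular-value thresholding (for the low-rank part), as in RPCA-style heuristics. The sketch below illustrates that idea only; it is not the estimator proposed in the article, and the two penalty levels and the synthetic data are assumptions.

```python
# Minimal sketch: sparse + low-rank split by alternating proximal steps.
import numpy as np

def soft(X, t):
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def sparse_plus_lowrank(Theta, lam_sparse=0.1, lam_rank=1.0, n_iter=200):
    Theta1 = np.zeros_like(Theta)                 # sparse component
    Theta2 = np.zeros_like(Theta)                 # low-rank component
    for _ in range(n_iter):
        Theta1 = soft(Theta - Theta2, lam_sparse)             # update sparse part
        U, s, Vt = np.linalg.svd(Theta - Theta1, full_matrices=False)
        Theta2 = (U * np.maximum(s - lam_rank, 0.0)) @ Vt     # shrink singular values
    return Theta1, Theta2

rng = np.random.default_rng(0)
L = np.outer(rng.normal(size=20), rng.normal(size=15))        # rank-1 signal
S = np.zeros((20, 15))
S[rng.random((20, 15)) < 0.05] = 5.0                          # a few sparse spikes
T1, T2 = sparse_plus_lowrank(L + S)
print(np.linalg.matrix_rank(T2, tol=1e-6), (np.abs(T1) > 1e-6).sum())
```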

A study of the luminol chemiluminescent reaction and its application to glucoside detection in a wheat matrix

process. First, a glucosidase enzyme converts a glucoside to glucose and the aglycon. Glucose is then converted to hydrogen peroxide and gluconic acid via glucose oxidase. The reactions are shown in figure 13. Thus, the glucoside can be detected as hydrogen peroxide, which will react with luminol. The enzyme reactions can be incorporated into a flow injection or chromatographic assay by immobilizing the enzymes onto column packing material, producing an in-line immobilized enzyme reactor (IMER). Another technique coupled to the system is HPLC. This HPLC system achieves separation of matrix components in an analytical column and handles the back pressure resulting from a multi-column system.

The Financial Social Accounting Matrix for China, 2002, and Its Application to a Multiplier Analysis

The Financial Social Accounting Matrix for China, 2002, and Its Application to a Multiplier Analysis. Li, Jia. Graduate School of International Development, Nagoya University, Japan.

Salient Object Detection via Structured Matrix Decomposition

Difference with preliminary work. Some preliminary ideas in this paper appeared in the conference version [68]. Compared with [68], the proposed SMD model in this paper is more general and subsumes the version in [68] as a special case. The new SMD model not only inherits the major advantages of the preliminary model, i.e., it produces a decomposition of an observation matrix into structured parts with respect to the image structure, but it is also armed with the new capability to enlarge the separation between salient objects and background in the feature space. The experimental results (Sec. 5) show clearly that the new model is more robust and that the resulting saliency maps (Fig. 8) are more visually favorable.

N Summet k and Its Application in the Construction of Pascal Triangle and Pascal Matrix

The Summetor was first introduced in the research article "Jeevan-Kushalaiah Method to Find the Coefficients of Characteristic Equation of a Matrix and Introduction of Summetor" by the authors Neelam Jeevan Kumar and Neelam Kushalaiah [1]. The name of the Summetor operator is taken from the "sum" operator. The Summetor operation is the sum of all positive integers from one to n. N-summet-k is the sum of all positive integers summeted k times progressively.
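Under one reading of this definition, applying the "sum from one to n" operator k times to the sequence 1, 2, ..., n yields the binomial coefficient C(n + k, k + 1), which is where the columns of Pascal's triangle come in. The sketch below implements that reading; both the interpretation and the closed-form check are assumptions of this illustration, not taken from the article.

```python
# Minimal sketch: repeated "summet" as iterated prefix sums of 1..n.
from math import comb

def summet(n: int, k: int = 1) -> int:
    """Apply the 'sum of 1..n' operator k times, starting from the integers 1..n."""
    values = list(range(1, n + 1))
    for _ in range(k):
        running, prefix = 0, []
        for v in values:
            running += v
            prefix.append(running)
        values = prefix
    return values[-1]

# Assumed identity: n summet k equals C(n + k, k + 1).
for n in range(1, 6):
    for k in range(1, 4):
        assert summet(n, k) == comb(n + k, k + 1)
print(summet(4, 1), summet(4, 2))   # 10, 20
```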

Some new sharp bounds for the spectral radius of a nonnegative matrix and its application

In this paper, we give some new sharp upper and lower bounds for the spectral radius of a nonnegative irreducible matrix. Using these bounds, we obtain some new and improved bounds for the signless Laplacian spectral radius of a graph or a digraph which are better than the bounds in [, ].

Singular value decomposition and its applications

Hence we have found a way of computing J such that the resulting matrix equals the matrix that would be generated by the QR algorithm if we had used it on M. We now generate the sequence by repeating the process, resetting J accordingly, and partitioning the matrix into smaller blocks if an intermediate q or e becomes negligible. If e_n is negligible, we accept q_n as a singular value and work with a matrix of order n − 1. The successive M matrices of the QR algorithm with shifts, when applied to symmetric tridiagonal matrices, converge globally, with at least quadratic convergence, to a diagonal matrix [12]. Then, since any Hermitian matrix can be transformed by Givens plane rotations to a symmetric tridiagonal matrix, and since all the eigenvalues of M are real, the successive J matrices tend to diagonal form.
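The convergence behaviour referred to above, successive QR iterates of a symmetric tridiagonal matrix approaching diagonal form, can be seen in a few lines. The sketch below uses the unshifted, dense iteration on an assumed 3 x 3 test matrix; the algorithm in the text uses shifts and deflation for speed.

```python
# Minimal sketch: QR iteration A <- RQ driving a symmetric tridiagonal matrix
# toward diagonal form (each step is a similarity transform, so eigenvalues
# are preserved).
import numpy as np

M = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])     # symmetric tridiagonal test matrix

A = M.copy()
for _ in range(50):
    Q, R = np.linalg.qr(A)
    A = R @ Q

print(np.round(A, 6))                                  # nearly diagonal
print(np.round(np.sort(np.linalg.eigvalsh(M)), 6))     # matches the diagonal (sorted)
```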

Statistics: Its Utility, Its Risks

Now, after summing up the conclusions above in this way, we further see five warnings on the risks of using statistics, all quite powerful, quite sober, fundamental, and radically serious: one, human error, which cannot be stamped out, infests science and statistics; two, mathematics, the apodictic certainty that sires statistics, is bankrupt at its base; three, science, crafted by human hands and brains, is as fragile as humanity; four, contingency, the unpleasant surprise, stands against a statistical understanding of science; and five, the reality that science and statistics try to reach is never reachable at all.

Antieigenvalues and antisingularvalues of a matrix and applications to problems in statistics

We ask what x is such that cos θ, as defined in (i), is a minimum, i.e., the angle of separation between x and Ax is a maximum. Such a vector is called an anti-eigenvector and cos θ an anti-eigenvalue of A. This is the basis of the operator trigonometry developed by K. Gustafson and D.K.M. Rao (1997), Numerical Range: The Field of Values of Linear Operators and Matrices, Springer. We may define a measure of departure from condition (ii) as min[(x′Ax)(x′A⁻¹x)]⁻¹, which gives the same anti-eigenvalue. The same result holds if the maximum of the angle Φ between A^{1/2}x and A^{−1/2}x, as in condition (iii), is sought. We define a hierarchical series of anti-eigenvalues, and also consider optimization problems associated with measures of separation between an r (< p)-dimensional subspace S and its transform AS.
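For a symmetric positive definite A, the first anti-eigenvalue has the well-known closed form 2√(λ_min λ_max)/(λ_min + λ_max), attained by a specific combination of the extreme eigenvectors. The sketch below checks this against a direct evaluation of cos θ; the test matrix is an illustrative assumption, and the formula is Gustafson's standard result rather than something derived in this excerpt.

```python
# Minimal sketch: first anti-eigenvalue of an SPD matrix and its anti-eigenvector.
import numpy as np

A = np.diag([1.0, 4.0, 9.0])                 # SPD test matrix (eigenvalues 1, 4, 9)
vals, vecs = np.linalg.eigh(A)
l_min, l_max = vals[0], vals[-1]
v_min, v_max = vecs[:, 0], vecs[:, -1]

mu1 = 2 * np.sqrt(l_min * l_max) / (l_min + l_max)   # first anti-eigenvalue

# Anti-eigenvector: the unit vector most "turned" by A.
x = np.sqrt(l_max / (l_min + l_max)) * v_min + np.sqrt(l_min / (l_min + l_max)) * v_max
cos_theta = x @ A @ x / (np.linalg.norm(x) * np.linalg.norm(A @ x))

print(round(mu1, 6), round(cos_theta, 6))    # both 0.6 for eigenvalues 1 and 9
```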

Application of SPACE Matrix

One of the most important questions for managers of concrete construction firms is what the firm's current status is on the strategic position and action evaluation (SPACE) matrix. To answer this question for a concrete construction firm, the matrix is used to determine which of the four strategies against rivals, namely the aggressive, conservative, defensive, and competitive strategies, is best to pursue. The value of each factor mentioned in this matrix was obtained through questionnaire surveys. These factors were listed in four groups (Financial Strength, Industry Attractiveness, Environmental

Decomposition of Automata PDL and its Extension

is always an interesting issue which relates a specification language to states (systems) with structure. We prove that APDL enjoys a good decomposition property under a very large class of process contexts, while this problem is more complicated for regular PDL, which suffers an exponential blow-up. We introduce proposition identifiers into APDL to obtain a language, recAPDL, which strikes a good balance between expressiveness and ease of analysis. We prove that this extended specification language still has a good decomposition property for a large class of process contexts. After that, we present a way to solve weak (branching) bisimulation equations as an application of recAPDL, by combining the decomposition property with the decision procedure proposed in [1], which has a time complexity that is polynomial in the size of the programs in the formulas and exponential in the number of sub-formulas.

Sparse Non negative Matrix Factorization and its Application in Overlapped Chromatograms Separation

Step 1: The detector output of each experiment is taken as an individual column of matrix A. The experimental data were taken for mixtures of different concentration ratios to obtain pseudo second-order data. (Only one-way data can be acquired with the detector available to us.) Hence, the shape and area of the overlapped chromatogram depend on the chemical concentration; a shift in the position or shape of the chromatogram could not be differentiated here, so preprocessing was not needed. The main focus of this work is resolving the overlapping components, and it is proposed to use the ML-sNMF algorithm to perform the deconvolution of the data matrix A.
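The basic factorization A ≈ WH underlying this step can be sketched with the classical multiplicative updates of Lee and Seung (Frobenius loss). The ML-sNMF algorithm used in the article adds a sparseness constraint, which is not reproduced here, and the synthetic two-peak "chromatograms" mixed at different concentration ratios are an assumption for illustration.

```python
# Minimal sketch: non-negative matrix factorization with multiplicative updates.
import numpy as np

def nmf(A, r, n_iter=500, eps=1e-9):
    rng = np.random.default_rng(0)
    W = rng.random((A.shape[0], r))
    H = rng.random((r, A.shape[1]))
    for _ in range(n_iter):
        H *= (W.T @ A) / (W.T @ W @ H + eps)   # update mixing coefficients
        W *= (A @ H.T) / (W @ H @ H.T + eps)   # update component elution profiles
    return W, H

# Two overlapping Gaussian peaks mixed at different concentration ratios,
# one mixture per column of A (mimicking the detector output described above).
t = np.linspace(0, 1, 200)
peaks = np.stack([np.exp(-((t - 0.4) / 0.05) ** 2),
                  np.exp(-((t - 0.55) / 0.05) ** 2)])       # pure components (2 x 200)
C = np.array([[1.0, 0.2], [0.6, 0.7], [0.1, 1.0]])          # concentration ratios (3 mixtures)
A = (C @ peaks).T                                           # 200 x 3 data matrix

W, H = nmf(A, r=2)
print(np.linalg.norm(A - W @ H) / np.linalg.norm(A))        # small reconstruction error
```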