low-rank matrix estimation

Bootstrap-Based Regularization for Low-Rank Matrix Estimation

... These two examples already reveal a variety of different phenomena. On one hand, stable autoencoding with the Gaussian noise model (7) reduces to a singular-value shrinkage estimator (Section 2), and thus leads us back ...
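For orientation, singular-value shrinkage denoising can be sketched in a few lines. This is a generic soft-thresholding illustration, not the paper's bootstrap-derived rule; the threshold `tau` below is an arbitrary illustrative choice.

```python
import numpy as np

def shrink_singular_values(Y, tau):
    """Soft-threshold the singular values of Y and reassemble the estimate."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

# toy example: rank-2 signal observed in additive Gaussian noise
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))
Y = X + 0.5 * rng.standard_normal(X.shape)
X_hat = shrink_singular_values(Y, tau=5.0)
print(np.linalg.matrix_rank(X_hat, tol=1e-8))   # shrinkage also truncates small singular values
```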


Compressed Sensing, Sparse Approximation, and Low-Rank Matrix Estimation

... Bounds on the restricted isometry constant have been established in [42] and in [135] for partial DFT matrices, and by extension, for partial subsampled orthogonal transforms. For instance, [135] proves that if A is a ...
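For context, a partial DFT sensing matrix of the kind these restricted-isometry bounds concern is a random row-subsample of the DFT matrix. A minimal construction sketch follows; the normalisation (unit-norm columns in expectation) is my choice and may differ from the cited papers' convention.

```python
import numpy as np

def partial_dft(n, m, rng=None):
    """Sample m of the n rows of the unitary DFT matrix uniformly at random."""
    rng = np.random.default_rng(rng)
    F = np.fft.fft(np.eye(n)) / np.sqrt(n)       # unitary n x n DFT matrix
    rows = rng.choice(n, size=m, replace=False)  # m distinct rows
    return np.sqrt(n / m) * F[rows, :]           # rescale the subsampled rows

A = partial_dft(n=256, m=64, rng=0)
print(A.shape)   # (64, 256)
```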


Generalized SURE for optimal shrinkage of singular values in low-rank matrix denoising

... Low-rank matrix estimation is classically done in the setting where the additive noise is Gaussian with homoscedastic ...of low rank matrix recovery beyond the Gaussian ...


Probabilistic Low-Rank Matrix Completion from Quantized Measurements

... a low rank real-valued matrix M given a subset of noisy discrete (or quantized) ...likelihood estimation of M, under a constraint on the entry-wise infinity-norm of M and an exact ...
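To make the setup concrete, here is a toy sketch of maximum-likelihood fitting under an entry-wise infinity-norm constraint, assuming a logistic link and 1-bit (binary) observations; the paper's actual observation model, additional constraint (truncated in the snippet), and solver may differ.

```python
import numpy as np

def nll(M, Y, mask):
    """Negative log-likelihood of binary observations Y in {-1,+1} on observed entries."""
    return np.sum(np.log1p(np.exp(-Y[mask] * M[mask])))

def nll_grad(M, Y, mask):
    G = np.zeros_like(M)
    G[mask] = -Y[mask] / (1.0 + np.exp(Y[mask] * M[mask]))
    return G

def fit(Y, mask, alpha=1.0, step=0.1, iters=300):
    """Projected gradient descent; the projection enforces |M_ij| <= alpha.
    (Any additional low-rank constraint from the paper is omitted in this sketch.)"""
    M = np.zeros_like(Y, dtype=float)
    for _ in range(iters):
        M -= step * nll_grad(M, Y, mask)
        M = np.clip(M, -alpha, alpha)
    return M

# tiny synthetic example
rng = np.random.default_rng(0)
M_true = np.clip(rng.standard_normal((30, 2)) @ rng.standard_normal((2, 20)) / 2, -1, 1)
mask = rng.random(M_true.shape) < 0.6                      # observed entries
Y = np.where(rng.random(M_true.shape) < 1 / (1 + np.exp(-M_true)), 1.0, -1.0)
M_hat = fit(Y, mask)
print(nll(M_hat, Y, mask), nll(np.zeros_like(M_hat), Y, mask))   # fit lowers the NLL
```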


Lower bounds for the low rank matrix approximation

... the low-rank component and E ∈ R^{m×n} is the perturbation component of ...the matrix E is small, the classical principal components analysis (PCA [–]) can seek the best rank-r ...
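The classical PCA step the snippet alludes to is the best rank-r approximation obtained from a truncated SVD (Eckart-Young). A minimal sketch, with made-up sizes:

```python
import numpy as np

def best_rank_r(A, r):
    """Best rank-r approximation of A in Frobenius (and spectral) norm via truncated SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

# example: a rank-3 matrix plus a small perturbation E is recovered up to the noise level
rng = np.random.default_rng(0)
L = rng.standard_normal((40, 3)) @ rng.standard_normal((3, 30))
E = 0.01 * rng.standard_normal(L.shape)
print(np.linalg.norm(L - best_rank_r(L + E, 3)))
```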


Simultaneous Pursuit of Sparseness and Rank Structures for Matrix Decomposition

... the same time, the proposed method has high true positives ranging from 0.92 to 1.00 and low false positives between 0.00 and 0.01, as compared to true positives ranging from 0.04 to 0.44 and false positives between 0.03 ...


ISAR Imaging Based on MEMP Method and Low Rank Matrix Denoising Technique

... In inverse synthetic aperture radar (ISAR) imaging, it is often desired to estimate two-dimensional (2-D) frequencies from the echo matrix. The conventional method used to resolve the scatterers is the Discrete Fourier ...
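As a minimal illustration of the conventional 2-D DFT approach mentioned in the snippet (not the MEMP method or the low-rank denoising step), a synthetic echo matrix containing two scatterers produces two peaks in its 2-D spectrum; the frequencies and amplitudes below are made-up grid values.

```python
import numpy as np

n1, n2 = 64, 64
l, k = np.meshgrid(np.arange(n1), np.arange(n2), indexing="ij")    # row / column indices
echo = (1.0 * np.exp(2j * np.pi * (8 / 64 * l + 16 / 64 * k)) +     # scatterer 1
        0.5 * np.exp(2j * np.pi * (20 / 64 * l + 4 / 64 * k)))      # scatterer 2 (weaker)

spectrum = np.abs(np.fft.fft2(echo))
p, q = np.unravel_index(np.argmax(spectrum), spectrum.shape)
print((p / n1, q / n2))   # -> (0.125, 0.25): the 2-D frequency of the dominant scatterer
```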


Optimal Estimation of Low Rank Density Matrices

... full rank matrix (see Koltchinskii 2011a, the proof of Proposition ...of estimation of a density matrix of a quantum state by maximizing the von Neumann entropy subject to constraints based on ...
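For reference, the maximum-entropy formulation alluded to can be written in generic notation, with observables A_i and measured averages b_i standing in for whatever measurement constraints the paper uses:

```latex
\max_{\rho}\; -\operatorname{tr}(\rho\log\rho)
\quad\text{subject to}\quad
\rho \succeq 0,\qquad \operatorname{tr}(\rho)=1,\qquad \operatorname{tr}(\rho A_i)=b_i,\; i=1,\dots,m.
```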


Effective removal of Noise from Videos using Low Rank Matrix Completion

... the estimation accuracy of damaged pixels unavoidably decreases, as there exist other types of image noise in addition to impulsive ...using the low-rank matrix yields the most visually pleasant ...


Singular Value Decomposition for High Dimensional Data

... statistical estimation is not possible without the assumption of strong structure in the ...for matrix data which require assumptions such as low rank in addition to sparsity or ...


INFORMATION TECHNOLOGY GOVERNANCE USING COBIT 4.0 DOMAIN DELIVERY SUPPORT AND MONITORING EVALUATION

... nonlocal estimation step after the initial CS recovery for de-noising ...nonlocal estimation is based on the well-known nonlocal means (NL) filtering that takes advantage of self-similarity in ...nonlocal ...
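A minimal, deliberately unoptimised sketch of the nonlocal-means idea the snippet describes: each pixel is replaced by a weighted average of all pixels whose surrounding patches are similar, with weights decaying in the patch distance. The patch size and bandwidth `h` are arbitrary illustrative choices; practical implementations restrict the search window.

```python
import numpy as np

def nl_means(img, patch=3, h=1.0):
    """Each pixel becomes a weighted average of all pixels with similar patches."""
    pad = patch // 2
    P = np.pad(img, pad, mode="reflect")
    patches = np.array([P[i:i + patch, j:j + patch].ravel()
                        for i in range(img.shape[0]) for j in range(img.shape[1])])
    flat = img.ravel()
    out = np.empty_like(flat)
    for idx in range(flat.size):
        d2 = np.sum((patches - patches[idx]) ** 2, axis=1)   # patch distances to pixel idx
        w = np.exp(-d2 / h ** 2)                             # self-similarity weights
        out[idx] = np.sum(w * flat) / np.sum(w)
    return out.reshape(img.shape)

rng = np.random.default_rng(0)
clean = np.zeros((16, 16)); clean[4:12, 4:12] = 1.0          # simple block image
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
denoised = nl_means(noisy)
print(np.abs(noisy - clean).mean(), np.abs(denoised - clean).mean())
```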


Matrix Completion and Low-Rank SVD via Fast Alternating Least Squares

... We used an R implementation for each of these in order to make the fairest comparisons. In particular, algorithm softImpute requires a low-rank SVD of a complete matrix at each iteration. For this we ...
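A simplified Python sketch of the iteration the snippet refers to: fill in the missing entries with the current estimate, take an SVD of the completed matrix, soft-threshold the singular values, and repeat. The softImpute R package and the paper's ALS variant replace the full SVD with much cheaper low-rank updates.

```python
import numpy as np

def soft_impute(X, mask, lam, iters=100):
    """X: data (arbitrary values at unobserved entries); mask: True where observed."""
    Z = np.zeros_like(X)
    for _ in range(iters):
        filled = np.where(mask, X, Z)                  # complete matrix with current guess
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        Z = (U * np.maximum(s - lam, 0.0)) @ Vt        # soft-threshold singular values
    return Z

rng = np.random.default_rng(0)
X = rng.standard_normal((40, 3)) @ rng.standard_normal((3, 30))
mask = rng.random(X.shape) < 0.5
Z = soft_impute(np.where(mask, X, 0.0), mask, lam=1.0)
print(np.linalg.norm((Z - X)[~mask]) / np.linalg.norm(X[~mask]))   # error on unobserved entries
```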


Memory Efficient Kernel Approximation

... ∈ R^u contains the testing labels and G_te ∈ R^{u×l} contains the approximate kernel values between testing and training data. The covtype and mnist2m data sets are not originally designed for regression, and here we set the target ...


Characterization of Relationships Between the Domains of Two Linear Matrix-Valued Functions with Applications

... These facts show that the algebraic features and performance of the matrix set in (1.4) are worth investigating from both theoretical and applied points of view. In fact, a class of fundamental and ...


Image-Matching-Retrieval Procedure to Clean Interpretation

... summary of connected tags, the aim of tag ranking is to rank the tags based on their relevance to the image content. Both the Pascal VOC2007 and SUN Attribute datasets are used in this experiment ...


A Note on Quickly Sampling a Sparse Matrix with Low Rank Expectation

... Previously, Hagberg and Lemons (2015) studied a fast technique to generate sparse random kernel graphs. Under the random kernel graph model, nodes i and j connect with probability κ(i/n, j/n), where the function κ is ...
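For concreteness, the naive O(n^2) sampler for this model loops over all pairs and flips a coin with probability κ(i/n, j/n); the point of the cited work (and of the note itself) is to avoid this quadratic cost. The kernel below is an arbitrary example.

```python
import numpy as np

def sample_kernel_graph(n, kappa, rng=None):
    """Naive pairwise sampler: edge (i, j) appears with probability kappa(i/n, j/n)."""
    rng = np.random.default_rng(rng)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < kappa(i / n, j / n):
                edges.append((i, j))
    return edges

edges = sample_kernel_graph(200, lambda x, y: 0.05 * (x + y), rng=0)
print(len(edges))
```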


Harmonic Mean Iteratively Reweighted Least Squares for Low-Rank Matrix Recovery

... of low complexity linear ...promote low-rankness and carries three major strengths, in particular for the matrix completion ...the low-rank matrix for relevant, interesting ...


Low Rank Positive Approximants of Symmetric Matrices

... The current interest in positive approximants was initiated in Halmos’ paper [4], which considers the solution of (1.7) in the spectral norm. Rogers and Ward [5] considered the solution of (1.7) in the Schatten-p norm, ...
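To make the notion of a low-rank positive approximant concrete, a natural Frobenius-norm construction keeps the r largest nonnegative eigenvalues of the symmetric matrix and discards the rest; the paper itself analyses the spectral and Schatten-p norms, where the optimal approximant is characterised differently.

```python
import numpy as np

def rank_r_psd_approx(A, r):
    """Keep the r largest nonnegative eigenvalues of symmetric A; zero out the rest."""
    w, V = np.linalg.eigh(A)                 # eigenvalues in ascending order
    w = np.clip(w, 0.0, None)                # discard negative eigenvalues (positivity)
    keep = np.argsort(w)[::-1][:r]           # indices of the r largest remaining eigenvalues
    return (V[:, keep] * w[keep]) @ V[:, keep].T

A = np.array([[2.0, 1.0, 0.0], [1.0, 0.5, -1.0], [0.0, -1.0, -3.0]])
print(np.linalg.eigvalsh(rank_r_psd_approx(A, r=1)))   # one positive eigenvalue, rest zero
```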


Online Embedding Compression for Text Classification Using Low Rank Matrix Factorization

... For NLP-specific tasks, the word embedding matrix often accounts for most of the network size. The embedding matrix is typically initialized with pretrained word embeddings like Word2Vec (Mikolov et al. 2013) ...
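A minimal sketch of the underlying compression step: factor the vocab x d embedding matrix into two thin factors with a truncated SVD, so the lookup becomes a smaller lookup followed by a projection. The shapes and rank below are made up, and this one-off post-hoc factorization does not capture the online aspect in the paper's title.

```python
import numpy as np

def compress_embeddings(E, r):
    """Approximate E (vocab x d) by A @ B with A (vocab x r) and B (r x d)."""
    U, s, Vt = np.linalg.svd(E, full_matrices=False)
    A = U[:, :r] * s[:r]            # per-token factors
    B = Vt[:r, :]                   # shared projection back to dimension d
    return A, B

E = np.random.default_rng(0).standard_normal((10000, 300))
A, B = compress_embeddings(E, r=64)
print(A.shape, B.shape, (A.size + B.size) / E.size)   # compression ratio
```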


A new method based on the manifold alternative approximating for low rank matrix completion

... As is well known, Candès and Recht [7] replaced the rank objective in (1.1) with its convex relaxation, and they showed that the lowest-rank matrices could be recovered exactly from most sufficiently large ...
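The convex relaxation referred to replaces the rank objective with the nuclear norm (the sum of singular values), subject to agreement with M on the observed index set Ω:

```latex
\min_{X}\ \operatorname{rank}(X)\ \ \text{s.t.}\ \ X_{ij}=M_{ij},\ (i,j)\in\Omega
\qquad\text{is relaxed to}\qquad
\min_{X}\ \|X\|_{*}\ \ \text{s.t.}\ \ X_{ij}=M_{ij},\ (i,j)\in\Omega .
```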

