Sparse Regularization

Region-based convolutional neural network using group sparse regularization for image sentiment classification

... group sparse regularization (R-CNNGSR) for image sentiment classification, to utilize sentiment regions for ... group sparse regularization to keep the network in the ...

9
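
The R-CNNGSR excerpt above names group sparse regularization, but the truncated snippet does not show the penalty itself. As a rough illustration of the general idea (not the model in the paper), a group-lasso style penalty over per-filter weight groups can be written as below; the grouping, the variable names, and the weight lam are assumptions for illustration. Penalizing each group's L2 norm, rather than individual entries, pushes whole groups to zero, which is what allows structured parts of a network to be pruned.

    import numpy as np

    def group_sparse_penalty(weight_groups, lam=1e-3):
        """Group-lasso style penalty: lam * sum_g ||W_g||_2.

        weight_groups: list of arrays, one per group (e.g. all weights of
        one convolutional filter). Driving a whole group's norm to zero
        removes that filter, giving structured sparsity.
        """
        return lam * sum(np.linalg.norm(w.ravel()) for w in weight_groups)

    # toy usage: three "filters" of a hypothetical layer
    rng = np.random.default_rng(0)
    filters = [rng.normal(size=(3, 3)) for _ in range(3)]
    print(group_sparse_penalty(filters))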

Using sparse regularization for multi-resolution tomography of the ionosphere

... Figure 9a and b show the reconstructions obtained with SH and DM. The SH reconstruction (Fig. 9a) is quite sensitive to noise, which causes additional oscillations and artefacts. The DM reconstruction (Fig. 9b) shows a better ...

12

Analysis of Multi-stage Convex Relaxation for Sparse Regularization

... Note that by repeatedly refining the parameter v, we can potentially obtain better and better convex relaxations (Figure 1), leading to a solution superior to that of the initial convex relaxation. Since at each step the ...

27
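
The refinement loop in the excerpt, repeatedly updating a relaxation parameter and re-solving a convex problem, can be pictured in simplified form as an iteratively reweighted L1 procedure: a capped-L1 style reweighting switches the penalty off for coordinates whose previous estimate was already large. The sketch below is only a stand-in under that reading, not the exact multi-stage relaxation analyzed in the paper; the ISTA solver, the threshold eps, and all parameter values are assumptions.

    import numpy as np

    def weighted_ista(A, y, weights, lam=0.1, step=None, iters=500):
        """Solve min_x 0.5*||Ax - y||^2 + lam * sum_i weights_i * |x_i|
        by proximal gradient (ISTA) with per-coordinate soft-thresholding."""
        n = A.shape[1]
        x = np.zeros(n)
        if step is None:
            step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = sigma_max(A)^2
        for _ in range(iters):
            grad = A.T @ (A @ x - y)
            z = x - step * grad
            thr = step * lam * weights
            x = np.sign(z) * np.maximum(np.abs(z) - thr, 0.0)
        return x

    def multi_stage_relaxation(A, y, lam=0.1, stages=4, eps=0.05):
        """Each stage solves a convex weighted-L1 problem; coordinates whose
        previous estimate exceeded eps get their penalty switched off
        (a capped-L1 style reweighting)."""
        weights = np.ones(A.shape[1])
        x = np.zeros(A.shape[1])
        for _ in range(stages):
            x = weighted_ista(A, y, weights, lam=lam)
            weights = (np.abs(x) <= eps).astype(float)
        return x

    # toy usage: sparse recovery from noisy measurements
    rng = np.random.default_rng(1)
    A = rng.normal(size=(40, 100))
    x_true = np.zeros(100)
    x_true[:5] = 3.0
    y = A @ x_true + 0.01 * rng.normal(size=40)
    print(np.nonzero(np.abs(multi_stage_relaxation(A, y)) > 1e-3)[0])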

RNN Architecture Learning with Sparse Regularization

... Neural models for NLP typically use large numbers of parameters to reach state-of-the-art performance, which can lead to excessive memory usage and increased runtime. We present a structure learning method for learning ...

6

Sparse Regularization for Inverse Problems Governed by Evolution Equations.

... algorithm, which we will briefly describe here. The full details of the method are developed in the theses [6], [18]. The ω-k algorithm is a fast Fourier-domain technique and works well under certain assumptions. The ...

126

Discovering Phonesthemes with Sparse Regularization

... We train our two-stage model on the phonemicized vectors; the features that are assigned a nonzero weight are our model-predicted phonesthemes. The features of our morpheme-level model are binary indicator features ...

6
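
The phonestheme excerpt describes the selection step as: fit a sparse model over candidate phoneme features and keep the features with nonzero weights. A minimal sketch of that idea with an L1-penalized (lasso) regression on toy data follows; the candidate onsets, the 1-D target standing in for word vectors, and the value of alpha are invented for illustration and differ from the paper's two-stage model.

    import numpy as np
    from sklearn.linear_model import Lasso   # L1-penalized linear regression

    rng = np.random.default_rng(0)

    # toy data: binary indicator features for candidate onsets, and a 1-D
    # stand-in for each word's embedding dimension
    candidate_onsets = ["gl", "sn", "sp", "tr", "bl"]
    n_words = 200
    X = rng.integers(0, 2, size=(n_words, len(candidate_onsets))).astype(float)
    # pretend "gl" and "sn" carry meaning: they shift the target
    y = 1.5 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=n_words)

    model = Lasso(alpha=0.05).fit(X, y)
    predicted_phonesthemes = [
        onset for onset, w in zip(candidate_onsets, model.coef_) if abs(w) > 1e-8
    ]
    print(predicted_phonesthemes)   # the features with nonzero weight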

Radar Imaging of Sidelobe Suppression Based on Sparse Regularization

... of sparse representation and reconstruction. In Section 3, sparse SAR signal imaging is implemented with both the conventional method and the sparse method, and we analyze the imaging quality of different ...

8

NON-SEPARABLE REGULARIZATION BASED DECONVOLUTION

... with sparse regularization. This sparse regularization can be categorized into two ... the regularization term is called the penalty ... the regularization terms or penalty ...

8

Regularization of distributions

... Hence, ... is the subspace of ... formed by those distributions that admit extensions to ... It is also convenient to consider the space ... formed by those smooth functions defined ...

12

Conditional random fields and regularization for efficient label prediction

... max_w l(B | A; w) − λwᵀw   (2). The term λ here controls the amount of smoothing applied to the model: a higher value of λ means more smoothing, while λ = 0 results in no smoothing at all. The ...

5
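
Equation (2) in the excerpt is the usual L2-regularized (conditional) log-likelihood, with λ trading data fit against smoothing of the weights. The sketch below shows how λ enters the objective and its gradient, using a simple logistic model as a stand-in for the conditional random field; the data and the λ value are illustrative assumptions.

    import numpy as np

    def regularized_objective(w, X, y, lam):
        """l(y | X; w) - lam * w^T w for a logistic stand-in of p(B | A; w).
        Larger lam pulls w toward zero (more smoothing); lam = 0 means no
        smoothing at all, as the excerpt notes."""
        z = X @ w
        log_lik = np.sum(y * z - np.log1p(np.exp(z)))
        return log_lik - lam * (w @ w)

    def gradient(w, X, y, lam):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        return X.T @ (y - p) - 2.0 * lam * w

    # toy maximization by gradient ascent
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    y = (X[:, 0] + 0.3 * rng.normal(size=100) > 0).astype(float)
    w = np.zeros(3)
    for _ in range(200):
        w += 0.01 * gradient(w, X, y, lam=1.0)
    print(w, regularized_objective(w, X, y, lam=1.0))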

Sparse kernel density construction using orthogonal forward regression with leave one out test score and local regularization

... The subset model selection procedure can be carried out as follows: at the ...th stage of the selection procedure, a model term is selected among the remaining ... candidates if the resulting ...-term model produces the ...

10
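
The excerpt above describes a greedy forward procedure: at each stage the candidate kernel whose inclusion gives the best score is added, and selection stops when the score no longer improves. The sketch below illustrates that loop in heavily simplified form, scoring candidates by leave-one-out log-likelihood rather than the paper's orthogonal forward regression criterion with local regularization; the Gaussian bandwidth h and the stopping rule are assumptions.

    import numpy as np

    def gaussian_kernel(x, centres, h):
        d = x[:, None] - centres[None, :]
        return np.exp(-0.5 * (d / h) ** 2) / (h * np.sqrt(2 * np.pi))

    def loo_log_likelihood(data, centres, h):
        """Leave-one-out log-likelihood of a KDE built on `centres`,
        leaving a point out only when it is itself a centre."""
        K = gaussian_kernel(data, centres, h)            # (n, m)
        own = np.isclose(data[:, None], centres[None, :])
        K = np.where(own, 0.0, K)
        counts = np.maximum((~own).sum(axis=1), 1)
        dens = K.sum(axis=1) / counts
        return np.sum(np.log(dens + 1e-300))

    def greedy_sparse_kde(data, max_terms=10, h=0.3):
        selected, remaining = [], list(range(len(data)))
        best_score = -np.inf
        for _ in range(max_terms):
            scores = [(loo_log_likelihood(data, data[selected + [j]], h), j)
                      for j in remaining]
            score, j = max(scores)
            if score <= best_score:      # stop when the LOO score stops improving
                break
            best_score = score
            selected.append(j)
            remaining.remove(j)
        return data[selected]

    rng = np.random.default_rng(0)
    data = np.concatenate([rng.normal(-2, 0.5, 100), rng.normal(2, 0.5, 100)])
    print(greedy_sparse_kde(data))       # a handful of centres, not all 200 points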

Sparse kernel density construction using orthogonal forward regression with leave one out test score and local regularization

... of sparse modeling over several other state-of-the-art ... a sparse density estimate with comparable accuracy to that of the Parzen window ... constructing sparse and accurate kernel density ...

10

A numerical study of the SVDMFS solution of inverse boundary value problems in two-dimensional steady-state linear thermoelasticity

... SVD-based regularization methods, such as the TRM [36], the DSVD and the TSVD [37], while the regularization parameter or the truncation number was chosen according to the DP [38], GCV criterion [39] and ...

40
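
The excerpt refers to SVD-based regularization (TSVD and related methods) with the truncation number chosen by the discrepancy principle (DP). A minimal sketch of truncated-SVD regularization with a DP-style choice of the truncation level follows; the toy ill-conditioned matrix, the noise level, and the safety factor tau are assumptions for illustration.

    import numpy as np

    def tsvd_solve(A, b, k):
        """Truncated-SVD solution: keep only the k largest singular values."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        coef = (U.T @ b)[:k] / s[:k]
        return Vt[:k].T @ coef

    def tsvd_discrepancy(A, b, noise_level, tau=1.1):
        """Pick the smallest truncation number k whose residual drops below
        tau * noise_level (a discrepancy-principle style rule)."""
        for k in range(1, min(A.shape) + 1):
            x = tsvd_solve(A, b, k)
            if np.linalg.norm(A @ x - b) <= tau * noise_level:
                return x, k
        return x, k

    # toy ill-conditioned problem: a Hilbert matrix with noisy right-hand side
    n = 20
    A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)
    x_true = np.ones(n)
    noise = 1e-4 * np.random.default_rng(0).normal(size=n)
    b = A @ x_true + noise
    x, k = tsvd_discrepancy(A, b, np.linalg.norm(noise))
    print(k, np.linalg.norm(x - x_true))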

Sparse machine learning models in bioinformatics

... usually sparse. If it is sparse, then the graph Laplacian matrix L is ... is sparse, then its eigen-decomposition is very efficient ... fast sparse method for clustering large-scale ...

334
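
The point of the excerpt is that a sparse similarity graph gives a sparse graph Laplacian, whose leading eigenvectors can be computed cheaply with sparse eigensolvers, enabling fast spectral clustering of large data. A minimal sketch of that pipeline on toy data is shown below; the k-NN graph construction, the shift-invert setting of the eigensolver, and the cluster count are illustrative choices rather than the method described in the review.

    import numpy as np
    from scipy.sparse import csgraph
    from scipy.sparse.linalg import eigsh
    from sklearn.neighbors import kneighbors_graph
    from sklearn.cluster import KMeans

    # toy data: two well-separated blobs
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 0.3, size=(100, 2)),
                   rng.normal(3, 0.3, size=(100, 2))])

    # sparse k-NN similarity graph -> sparse graph Laplacian
    W = kneighbors_graph(X, n_neighbors=10, mode="connectivity", include_self=False)
    W = 0.5 * (W + W.T)                       # symmetrize
    L = csgraph.laplacian(W, normed=True)     # stays sparse because W is sparse

    # a few smallest eigenpairs via a sparse eigensolver
    # (shift-invert around a small negative value to target the smallest eigenvalues)
    vals, vecs = eigsh(L, k=2, sigma=-1e-3, which="LM")

    # cluster the spectral embedding
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vecs)
    print(np.bincount(labels))                # roughly 100 / 100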

Two new regularization methods for solving sideways heat equation

... the measured Cauchy data with measurement errors. This is a severely ill-posed problem: any small perturbation in the observation data can cause large errors in the solution u(x, t) for x ∈ [0, L). Therefore, most ...

17

Numerical Methods for Fredholm Integral Equations of the First Kind

... S. Yousefi et al. [12] utilized the CAS wavelet approximation method to reduce Fredholm integral equations to the solution of algebraic equations. Illustrative examples are included to demonstrate the validity and ...

6
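
The excerpt notes that the integral equation is reduced to a system of algebraic equations. Whatever basis is used (CAS wavelets in [12], plain quadrature below), the reduction has the same shape: the kernel becomes a matrix, and because that matrix is severely ill-conditioned, some regularization is needed before solving. The midpoint-rule discretization and the Tikhonov penalty below are illustrative stand-ins, not the CAS wavelet construction.

    import numpy as np

    # First-kind Fredholm equation: integral_0^1 K(s, t) f(t) dt = g(s).
    # The midpoint rule turns it into a linear algebraic system A f ≈ g.
    n = 100
    t = (np.arange(n) + 0.5) / n
    kernel = np.exp(-(t[:, None] - t[None, :]) ** 2 / 0.02)  # smooth kernel -> ill-posed
    A = kernel / n                                           # quadrature weight 1/n

    f_true = np.sin(np.pi * t)
    g = A @ f_true + 1e-4 * np.random.default_rng(0).normal(size=n)  # noisy data

    # A is severely ill-conditioned, so solving A f = g directly amplifies the noise;
    # a Tikhonov-regularized system (A^T A + alpha I) f = A^T g is solved instead.
    alpha = 1e-6
    f_reg = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ g)
    print(np.linalg.cond(A), np.linalg.norm(f_reg - f_true) / np.linalg.norm(f_true))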

Nonparametric Sparsity and Regularization

... Second-order methods, see, for example, Chan et al. (1999), could also be used to solve similar problems. These methods typically converge quadratically and allow accurate computations. However, they usually have a ...

50

New Regularization Algorithms for Solving the Deconvolution Problem in Well Test Data Interpretation

... new regularization algorithms for solving the first-kind Volterra integral equation, which describes the pressure-rate deconvolution problem in well test data interpretation, are developed in this ...

13

Kernels: Regularization and Optimization

... Since the kernel has to effectively capture the domain knowledge in an application, we study the problem of learning the kernel itself from training data. The proposed solution is a kernel on the space of kernels itself, ...

16

An optimal order yielding discrepancy principle for simplified regularization of ill posed problems in Hilbert scales

... simplified regularization in the setting of Hilbert scales, which eliminates the drawback of the method in [1], yielding the optimal order for a range of values of ...

13
