In this article we propose an integrated method, PCA-GMM, which yields improved segmentation compared with the conventional GMM, and combine it with Kalman filtering (KF). The combined method, PCA-GMM-KF, tracks multiple moving objects, estimating the size and position of each object along its image sequence in dynamic scenes. The experimental results illustrate successful tracking of multiple moving objects based on this robust combination.
Once the pixelwise GMM likelihood is obtained, the final binary mask is generated either by thresholding [4, 6, 7] or according to more sophisticated decision rules [8–10]. Although the Gaussian mixture model technique is quite successful, the resulting binary masks are often noisy and irregular, mainly because most approaches neglect spatial and temporal dependencies. Our detection unit therefore improves the standard GMM method by taking spatial and temporal dependencies into account and by integrating a lower limit on the standard deviation into the traditional method. While the spatial dependency and the limitation of the standard deviation lead to clean, noise-free object boundaries, false positive detections caused by shadows and by uncovered background regions (so-called "ghosts") are reduced through the temporal dependency. Combining this improved detection method with a fast shadow removal technique, inspired by an established approach, further enhances the detection result, and good binary masks are obtained without adding any complex and computationally expensive extensions to the method.
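A minimal sketch of the pixelwise model-plus-thresholding idea described above, under simplifying assumptions: a single running Gaussian per pixel stands in for a full mixture, and the standard deviation is clamped from below, mirroring the limitation discussed here. All names and parameter values are illustrative, not the authors' implementation.

```python
import math

def update_and_classify(frames, alpha=0.05, k=2.5, sigma_min=2.0):
    """Return a list of binary masks (1 = foreground) for a list of frames.

    Each frame is a flat list of grey-level pixel intensities. A pixel is
    foreground when it deviates from its per-pixel Gaussian by more than
    k standard deviations; sigma is clamped at sigma_min from below."""
    n = len(frames[0])
    mean = list(frames[0])            # initialise the model from frame 0
    var = [sigma_min ** 2] * n
    masks = []
    for frame in frames[1:]:
        mask = []
        for i, x in enumerate(frame):
            sigma = max(math.sqrt(var[i]), sigma_min)
            fg = 1 if abs(x - mean[i]) > k * sigma else 0
            mask.append(fg)
            if not fg:                # adapt the model to background pixels only
                mean[i] = (1 - alpha) * mean[i] + alpha * x
                var[i] = (1 - alpha) * var[i] + alpha * (x - mean[i]) ** 2
        masks.append(mask)
    return masks
```

A full per-pixel GMM would keep several (weight, mean, variance) triples per pixel and match against each in turn; the thresholding decision rule is the same.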
estimators. However, we show that this is not the case once we generalize the evidence to multifactor models. For example, when estimating risk premia, the Beta method has better properties than the SDF/GMM method in multifactor models across test portfolios and sample periods, while the main consensus is that there are no differences between the two methods. Moreover, our results are not driven by a simulation calibrated from a single-factor structure, as in JW (2002); they are obtained from actual realizations of historical data. This allows us to test the methodologies under more complex set-ups than variation according to a known distribution. The closest related paper is probably Shanken and Zhou (2007). However, although the objective of that paper is similar, they report empirical results based only on simulations rather than on real data sets.
Potential endogeneity of the independent variables, inclusion of the lagged dependent variable and the presence of country-specific effects make it impossible for us to estimate the model with standard panel estimators such as pooled OLS or fixed- and random-effects models. The aforementioned problems would introduce Nickell (1981) bias if we used those panel-data estimators. The generalized method of moments (GMM) proposed by Arellano and Bond (1991) has the capability to address these problems. The GMM method can remove the country-specific effects by taking first differences of equation (1); however, missing values in some explanatory variables would then create difficulties in the transformed data (Roodman, 2009). Therefore, we use the forward orthogonal deviation transformation proposed by Arellano and Bover (1995) to wipe out the country-specific effects. However, a new bias results from the forward orthogonal deviation, namely the correlation between the lagged dependent variable and the
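The forward orthogonal deviation transformation mentioned above has a standard closed form: each observation is replaced by its deviation from the mean of all *future* observations of the same unit, scaled so that homoskedastic, serially uncorrelated errors remain so. A small sketch of that formula (illustrative only; function name is assumed):

```python
import math

def forward_orthogonal_deviations(x):
    """Forward orthogonal deviations (Arellano and Bover, 1995) of one
    unit's time series x of length T. Returns T-1 transformed values:
    x*_t = c_t * (x_t - mean(x_{t+1}, ..., x_T)),
    with c_t = sqrt(m / (m + 1)) where m is the number of future values."""
    T = len(x)
    out = []
    for t in range(T - 1):
        future = x[t + 1:]
        c = math.sqrt((T - t - 1) / (T - t))
        out.append(c * (x[t] - sum(future) / len(future)))
    return out
```

Because a constant (the country-specific effect) has zero deviation from its own future mean, the transformation wipes it out, while, unlike first differencing, a missing observation only costs that one transformed value.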
We suspect endogeneity problems in the estimation equation related to the causality running from the exogenous variables to the dependent variable (especially the debt variable). Traditional econometric methods such as Ordinary Least Squares (OLS), fixed effects and Generalized Least Squares (GLS) therefore do not allow us to obtain efficient estimates of such a model. To solve this problem, we introduce the generalized method of moments (GMM) for panel data proposed by Arellano and Bond (1991), Arellano and Bover (1995) and Blundell and Bond (1998). This method can address simultaneity bias, reverse causality (especially between debt and profitability) and possible omitted variables. Moreover, it can control for individual and temporal specific effects. Indeed, the GMM method addresses endogeneity not only in the debt variable but also in the other explanatory variables, by using a series of instrumental variables generated from lagged variables.
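The core of using lagged variables as instruments can be illustrated with the simplest one-regressor, one-instrument case, where the IV slope estimate is cov(z, y) / cov(z, x) with z a lag of the endogenous regressor. A toy sketch (purely illustrative; the actual panel GMM estimator stacks many such moment conditions):

```python
def iv_slope(y, x, z):
    """One-instrument IV slope estimate: cov(z, y) / cov(z, x).

    In the panel setting above, z would be a lagged value of the endogenous
    regressor x, valid as an instrument if the lag is uncorrelated with the
    current error term."""
    n = len(y)
    my, mx, mz = sum(y) / n, sum(x) / n, sum(z) / n
    num = sum((zi - mz) * (yi - my) for zi, yi in zip(z, y))
    den = sum((zi - mz) * (xi - mx) for zi, xi in zip(z, x))
    return num / den
```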
In this stage, we identify the main pattern of the collected historical FOREX data using the GMM method. The GMM method is a widely used clustering algorithm that groups the data into clusters whose centers are the peaks of the Gaussian mixture distribution assumed by the GMM approach. The distribution of each observation is specified by a probability density function through a finite mixture model of G components, as represented in Equation (1). This can be achieved easily using the implementation of GMM-based clustering available in the R software for statistical computing (R Core Team, n.d.). The function "Mclust()" from the package "mclust" (Scrucca, Fop, Murphy, & Raftery, 2016) provides the required model-based clustering with Gaussian distributions.
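The fitting that Mclust() performs is expectation-maximization on the mixture density. A tiny two-component, one-dimensional EM sketch conveys the idea (this is not Mclust itself, which handles multivariate data, many covariance parameterizations and BIC-based model selection; all names and starting values here are assumptions):

```python
import math

def gmm_em_1d(data, mu=(-1.0, 1.0), n_iter=50):
    """Two-component 1-D Gaussian-mixture EM. Returns (weights, means,
    variances); cluster labels follow by taking the component with the
    larger responsibility for each point."""
    w = [0.5, 0.5]
    mu = list(mu)
    var = [1.0, 1.0]
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            p = [w[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
    return w, mu, var
```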
In the section above, the damage mechanisms of STCRC columns were preliminarily identified through the correlation analysis between cumulative energy and peak frequency. The failure mechanisms are further investigated with the RA-AF and GMM methods from the perspective of fracture modes. Tensile cracks lead to AE signals with higher average frequency and lower RA values; shear cracks, on the contrary, generate signals with higher RA values and lower average frequency (Aggelis, 2011). Therefore, the fracture modes of STCRC columns can be determined from the RA and AF values of the AE signals. Since this method is suited to crack classification in concrete damage (Ohno and Ohtsu, 2010), only the AE events in the low-frequency band (0-150 kHz) are used for RA-AF crack classification.
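The RA-AF classification described above amounts to placing each acoustic-emission event in the RA-AF plane and comparing it against a dividing line. A hedged sketch, with the dividing slope as an explicit assumption (the appropriate value is case-dependent and typically calibrated per the cited recommendations):

```python
def classify_ae_event(rise_time_us, amplitude_v, counts, duration_us,
                      slope=50.0):
    """Classify one AE event as 'tensile' or 'shear' from its RA and AF
    values. RA = rise time / peak amplitude (us/V); AF = ringing counts /
    duration (kHz). Events above the line AF = slope * RA are labelled
    tensile, below it shear. slope=50 is a placeholder, not a recommended
    value."""
    ra = rise_time_us / amplitude_v
    af = counts / duration_us * 1000.0   # counts per us -> kHz
    return "tensile" if af > slope * ra else "shear"
```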
Over the last decade, the GMM has become established as the standard classifier for text-independent speaker recognition. It operates on atomic units of speech and can be effective with very small amounts of speaker-specific training data. The primary focus of this work was a task domain for a real application, such as voice-mail labelling and retrieval. The Gaussian mixture speaker model was specifically evaluated for identification tasks using short-duration utterances from unconstrained conversational speech, possibly transmitted over noisy telephone channels.
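The identification rule with Gaussian mixture speaker models is simple: score the test utterance's feature frames under each enrolled speaker's GMM and pick the speaker with the highest total log-likelihood. A 1-D toy sketch of that scoring rule (real systems use multivariate cepstral features and diagonal-covariance mixtures; names here are illustrative):

```python
import math

def gmm_loglik(frames, gmm):
    """Total log-likelihood of 1-D feature frames under a GMM given as a
    list of (weight, mean, variance) components."""
    total = 0.0
    for x in frames:
        p = sum(w / math.sqrt(2 * math.pi * v)
                * math.exp(-(x - m) ** 2 / (2 * v)) for w, m, v in gmm)
        total += math.log(p)
    return total

def identify_speaker(frames, speaker_models):
    """Return the name of the speaker model with the highest likelihood."""
    return max(speaker_models,
               key=lambda name: gmm_loglik(frames, speaker_models[name]))
```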
construction and, therefore, valid instruments. However, if the variable whose lags are used as instruments is generated by a noncausal AR process, its lags may be endogenous and, hence, unsuitable as instruments, yielding an inconsistent GMM estimator. In a simple special case with lags of the explanatory variable used as instruments, we have shown that the OLS and 2SLS estimators even converge in probability to the same limit. Moreover, the J-test typically used to test for the exogeneity of the instruments may be inconsistent and, in general, has low power against endogenous instruments. In other words, the J-test cannot be relied on to reveal the endogeneity problem. Our finite-sample simulation experiments confirm these findings.
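For concreteness, the J statistic discussed above is J = n * gbar' * S^{-1} * gbar, where gbar is the sample mean of the per-observation moment vectors and S their sample second-moment matrix. A two-moment sketch (illustrative; real implementations handle arbitrary dimensions and HAC-robust S estimates):

```python
def j_statistic(moments):
    """Hansen J statistic from a list of per-observation two-element moment
    vectors, evaluated at the chosen parameter estimate. Under correct
    specification and exogenous instruments, J is asymptotically
    chi-squared with (number of moments - number of parameters) degrees
    of freedom."""
    n = len(moments)
    gbar = [sum(m[k] for m in moments) / n for k in (0, 1)]
    # sample second-moment matrix S (2x2) and its inverse
    s = [[sum(m[a] * m[b] for m in moments) / n for b in (0, 1)]
         for a in (0, 1)]
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    inv = [[s[1][1] / det, -s[0][1] / det],
           [-s[1][0] / det, s[0][0] / det]]
    quad = sum(gbar[a] * inv[a][b] * gbar[b]
               for a in (0, 1) for b in (0, 1))
    return n * quad
```

The point made in the text is that a small J does not certify exogeneity: with endogenous noncausal instruments the test can have low power, so J near zero is consistent with both valid and invalid instruments.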
We have introduced new moments in a GMM estimation of a spatial regression model. Given that some of the suggested moments are redundant when the spatial parameter equals zero, we have proposed to use only a subset of the moments in the estimation procedure. Our Monte Carlo experiments point to conditions (9)-(11) as those that yield the best performance of the GMM estimator.
Convex optimization is used to transform the image histogram into the flattest histogram subject to a mean-brightness (MB) constraint. An exact histogram specification (HS) method is used to preserve the image brightness. When the gray levels of the input image are equally distributed, FHSABP behaves very similarly to GHE. Because it is designed to preserve the average brightness, it may produce low-contrast results when the average brightness is either too low or too high. In the histogram modification framework (HMF), which is based on histogram equalization, contrast enhancement is formulated as an optimization problem that minimizes a cost function.
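The GHE baseline that these methods compare against maps each grey level through the scaled cumulative distribution of the input. A minimal sketch (function name assumed):

```python
def global_histogram_equalization(pixels, levels=256):
    """Classical global histogram equalization on a flat list of integer
    grey levels: each level is mapped through the scaled CDF, making the
    output histogram as flat as the input's level multiplicities allow."""
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf = []
    run = 0
    for h in hist:
        run += h
        cdf.append(run / n)
    return [round(cdf[p] * (levels - 1)) for p in pixels]
```

Note that GHE makes no attempt to preserve mean brightness, which is exactly the shortcoming the brightness-preserving variants above address.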
Change detection [4, 10] is the method of analyzing two images taken at different times over the same geographical area and identifying the changes that have taken place between the two acquisition times. A Synthetic Aperture Radar (SAR) system offers wide area coverage and is insensitive to weather and illumination conditions; hence it finds vast application in remote sensing. The drawback of SAR images is that they contain speckle noise, which makes change detection in SAR images more challenging than in optical images.
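A common first step in SAR change detection, hedged sketch only: compare the two co-registered intensity images through a log-ratio, which turns the multiplicative speckle into an additive term before thresholding. The threshold here is assumed given; in practice it is usually selected automatically (e.g. by a Kittler-Illingworth or EM-based criterion).

```python
import math

def log_ratio_change_map(img1, img2, threshold):
    """Pixelwise log-ratio change detection between two co-registered SAR
    intensity images given as flat lists of positive values. Returns a
    binary map (1 = changed)."""
    return [1 if abs(math.log(b / a)) > threshold else 0
            for a, b in zip(img1, img2)]
```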
This paper considers the first-order autoregressive panel model, a simple model for dynamic panel data (DPD). The generalized method of moments (GMM) gives efficient estimators for these models. This efficiency is affected by the choice of the weighting matrix used in GMM estimation. Conventional GMM estimators use non-optimal weighting matrices, which leads to a loss of efficiency. Therefore, we present new GMM estimators based on optimal or suboptimal weighting matrices. A Monte Carlo study indicates that the new estimators are more reliable than the conventional ones in terms of bias and efficiency.
To show the improvement in resolution that the proposed method achieves over conventional and state-of-the-art image resolution enhancement techniques, two satellite images with different features are used for comparison. Figure 4 shows that the high-resolution image produced by the proposed technique in (f) is much sharper than the original low-resolution image in (a) and the results of bilinear interpolation in (b), bicubic interpolation in (c), wavelet zero padding in (d), and DWT-based image decomposition with bicubic interpolation in (e). The proposed technique is evaluated in terms of Peak Signal-to-Noise Ratio (PSNR) and Quality Index (QI) and compared with the other techniques. It is clear that the proposed DWT-GMM technique outperforms bilinear interpolation, bicubic interpolation, wavelet zero padding, and DWT-based image decomposition with bicubic interpolation.
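The PSNR metric used in the comparison above has a standard definition, sketched here for flat pixel lists (the QI metric is a separate structural measure not shown):

```python
import math

def psnr(reference, test, max_val=255.0):
    """Peak signal-to-noise ratio (in dB) between two equally sized images
    given as flat pixel lists: 10 * log10(max_val^2 / MSE). Higher is
    better; identical images give infinity."""
    mse = sum((r - t) ** 2 for r, t in zip(reference, test)) / len(reference)
    if mse == 0:
        return float("inf")
    return 10 * math.log10(max_val ** 2 / mse)
```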
Abstract: This study aims to develop an elderly care system for improving the interpersonal relationships of elderly people with mild cognitive impairment (MCI) by employing speaker recognition and the association functionality of social network platforms. First, speaker recognition units based on the Gaussian Mixture Model (GMM) and the Gaussian Mixture Model-Universal Background Model (GMM-UBM) are implemented to identify a visitor from an individual input utterance. After the visitor is identified, the proposed system links to the private database and social network platforms to extract the messages associated with the two parties. Experimental results indicate that the speaker recognition unit based on GMM-UBM achieves the best performance. Finally, five elderly persons were invited to assess the usability of the proposed system. A questionnaire surveying the five participants indicates that the proposed system is potentially highly applicable to improving the interpersonal relationships of the elderly with MCI.
MALIČKÁ, LENKA. 2017. The Role of Immovable Property Taxes in the EU Countries – Taxes on Land, Buildings and Other Structure in Sub‑national Tax Revenues under the Conditions of Tax Decentralization. Acta Universitatis Agriculturae et Silviculturae Mendelianae Brunensis, 65(4): 1383–1392. The literature on fiscal federalism and fiscal decentralization promotes sub‑national responsibility for sub‑national resources and spending. In this paper sub‑national tax revenues are compared to total tax revenues, expressing the degree of tax decentralization, for a sample of the EU 28 countries. Besides this, the main component of sub‑national taxes, the immovable property tax (tax on land, buildings or other structures), is compared to total sub‑national tax revenues. Using system GMM estimation, the determinants of sub‑national tax revenues, real estate tax revenues and tax decentralization are investigated for the sample of EU countries. Results show a significant negative relation between GDP per capita growth, population density and the inflation rate and all the variables in question. In the case of sub‑national government real estate tax revenues, a positive relation with public debt is observed. Keywords: fiscal federalism, fiscal decentralization, tax decentralization, sub‑national government, local tax, immovable property tax, GMM dynamic panel model
The overall improvement achieved by applying the GMM is approximately 7.6% in R². This demonstrates the ability of the algorithm to compensate for bias by detecting the unmodelled stochastic (i.e. non-deterministic) behaviour. The rules after bias compensation are illustrated in Figure 8, where one can notice that the antecedents are similar to those presented in Figure 4 but the consequent is slightly different. The consequent mean values for Rule 1 and Rule 2 were refined by approximately 25 µm (to the left) and 150 µm (to the right), respectively. Such slight changes in the mean values did not actually change the linguistic forms of these rules. It is worth mentioning at this stage that the prediction performance of the modified GMM is superior to that of the traditional GMM presented in (Yang et al., 2012), with an overall improvement of approximately 2% in R². Moreover, as stated previously, these simple but informative rules can accurately represent the TSG process.
ric unknowns. A method to incorporate moment restrictions derived from economic theory into predictive distributions has also been proposed in the literature. Robertson et al. (2005) use an exponential tilting projection to obtain a refined predictive distribution of macroeconomic variables subject to Taylor-rule restrictions. Giacomini and Ragusa (2014) provide a formal justification of the method and show that, when the moment restrictions are correct, the resulting predictive distribution is indeed superior to the original one in terms of log-score. The method considered in this thesis differs from theirs in that the exponential projection is applied to the prior distribution of the underlying data distribution rather than to the posterior predictive distribution. Moreover, the underlying data distribution is flexibly modeled and can allow for nonlinearity, while they consider only a linear vector autoregressive model.
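The exponential tilting projection mentioned above has a simple scalar analogue: reweight the sample points by w_i proportional to exp(lambda * x_i), choosing lambda so the reweighted mean satisfies the moment restriction. This minimizes the KL divergence from the original distribution subject to the constraint. A hedged one-moment sketch (names and the bisection bounds are assumptions; real applications solve a multivariate version):

```python
import math

def exponential_tilting_weights(data, target_mean, tol=1e-10):
    """Exponentially tilted weights w_i proportional to exp(lambda * x_i),
    with lambda chosen so the reweighted sample mean equals target_mean.
    lambda is found by bisection, which works because the tilted mean is
    strictly increasing in lambda (its derivative is the tilted variance)."""
    def tilted_mean(lam):
        w = [math.exp(lam * x) for x in data]
        s = sum(w)
        return sum(wi * x for wi, x in zip(w, data)) / s

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if tilted_mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * x) for x in data]
    s = sum(w)
    return [wi / s for wi in w]
```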
This section presents the estimation using the system GMM estimator proposed by Arellano and Bover (1995) and Blundell and Bond (1998, 2000). The system GMM estimator (GMM SYS) allows researchers to address serial correlation, heteroskedasticity and endogeneity in some explanatory variables. These econometric problems were resolved by Arellano and Bond (1991), Arellano and Bover (1995) and Blundell and Bond (1998, 2000), who developed the first-differenced GMM (GMM DIF) estimator and the system GMM (GMM SYS) estimator. The GMM SYS estimator is a system containing both first-differenced and levels equations, and is an alternative to the standard first-differenced GMM estimator. To estimate the dynamic model, we applied the methodology of Blundell and Bond (1998, 2000) together with the small-sample correction of Windmeijer (2005) to correct the standard errors. The system GMM estimator is consistent if there is no second-order serial correlation in the residuals (M2 statistic). The dynamic panel data model is valid if the estimator
Given the range of displacement and compensation effects at work and their opposing impacts on innovation's employment effects, Vivarelli (2012) indicates that the overall effect of innovation on employment can be ascertained only empirically. This systematic review sets out to accomplish that task in the context of LICs, for which the evidence base is limited and highly heterogeneous. Given the heterogeneity of the existing work and the ambiguity implied by the opposing displacement and compensation mechanisms summarised above, the review adopts the mixed-method synthesis proposed by Harden and Thomas (2005). The method involves mapping the qualitative and quantitative evidence in a systematic manner. While the qualitative synthesis compensates for the limited extent to which contextual factors can be incorporated into quantitative meta-analysis, the latter allows synthesizing evidence from diverse studies after controlling for publication selection bias and observable sources of heterogeneity in the evidence base.