Blinder-Oaxaca Decomposition Technique

The Decomposition of Inter Group Differences in a Logit Model: Extending the Oaxaca Blinder Approach with an Application to School Enrolment in India

The Oaxaca (1973) and Blinder (1973) method of decomposing group differences in means into a “discrimination” and a “characteristics” component is, arguably, the most widely used decomposition technique in economics. This method has been extended from its original setting within regression analysis to explaining group differences in probabilities derived from models of discrete choice with a binary dependent variable, estimated using logit/probit methods (Gomulka and Stern, 1990; Blackaby et al., 1997, 1998, 1999; Nielsen, 1998). However, there are two restrictive aspects of this decomposition, and of its extension to logit/probit models, that are often overlooked.

Socioeconomic inequality in oral health behavior in Iranian children and adolescents by the Oaxaca-Blinder decomposition method: the CASPIAN-IV study

Methods: A representative sample of 13,486 school students aged 6 to 18 years was selected through a multistage random cluster sampling method from urban and rural areas of 30 provinces of Iran. Principal Component Analysis (PCA) was used to summarize correlated variables into a socioeconomic status (SES) index. The association of independent variables with tooth brushing was assessed through logistic regression analysis. The gap in tooth brushing between the first and fifth SES quintiles was decomposed using the counterfactual decomposition technique. To assess the relation between tooth brushing and each socioeconomic category, the concentration index (C) and the slope index of inequality (SII), the latter representing a linear regression coefficient, were used.

Regional inequalities in child malnutrition in Egypt, Jordan, and Yemen: a Blinder-Oaxaca decomposition analysis

The Oaxaca decomposition is a technique that decomposes inequities between any two groups and has been used extensively in explaining wage differentials between males and females, immigrants and natives, and black and white workers. The intuition behind the Oaxaca decomposition is that it splits the gap in the outcome between the two groups into two parts: a part that is explained by the gap in the level of the determinants, such as income or education level, and a part that is explained by the gap in the effect of the determinants on the outcome variable. For instance, rural children could be less healthy not only because they visit health care providers less frequently but also because health care providers in rural regions are less effective. The Oaxaca decomposition quantifies the contribution of each factor to the gap in the outcome, thus identifying which factors contribute most to generating inequality between the two groups [18]. To the best of our knowledge, the current study is the first to examine the disparity in child malnutrition using the Oaxaca decomposition in Egypt, Jordan, and Yemen.
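The explained/unexplained split has a compact algebraic form: with group-specific OLS fits, ȳ_A − ȳ_B = (x̄_A − x̄_B)′β_A + x̄_B′(β_A − β_B). A minimal NumPy sketch on synthetic data (all numbers and variable names below are illustrative, not taken from any of these studies):

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_ols(X, y):
    # OLS with an intercept column, solved by least squares
    Xc = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    return beta

# synthetic groups: A has both higher covariate levels and higher returns
n = 5000
X_a = rng.normal(1.0, 1.0, (n, 1))
y_a = 2.0 + 1.5 * X_a[:, 0] + rng.normal(0.0, 1.0, n)
X_b = rng.normal(0.5, 1.0, (n, 1))
y_b = 2.0 + 1.0 * X_b[:, 0] + rng.normal(0.0, 1.0, n)

beta_a, beta_b = fit_ols(X_a, y_a), fit_ols(X_b, y_b)
xbar_a = np.concatenate([[1.0], X_a.mean(axis=0)])
xbar_b = np.concatenate([[1.0], X_b.mean(axis=0)])

gap = y_a.mean() - y_b.mean()
explained = (xbar_a - xbar_b) @ beta_a    # gap in characteristics, at A's coefficients
unexplained = xbar_b @ (beta_a - beta_b)  # gap in coefficients, at B's mean characteristics
print(gap, explained + unexplained)       # the two parts sum exactly to the gap
```

Which group's coefficients are used to price the characteristics gap (the "index number" choice) changes the split, one of the well-known ambiguities of the method.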

Association Between Age and Obesity Over Time

The Blinder-Oaxaca decomposition was used to compute 2 counterfactuals. The first counterfactual quantifies how the obesity rate would have changed if the regression coefficients and intercept changed as they did between 2003–2004 and 2011–2012 but the population composition did not change (ie, the means of the covariates in 2011–2012 were the same as they had been in 2003–2004). The second counterfactual quantifies how the obesity rate would have changed if the population composition changed as it did between 2003–2004 and 2011–2012 but the regression coefficients did not change (ie, the regression coefficients for 2003–2004 and 2011–2012 were the same). The value of the first counterfactual represents the contribution of differences in the regression coefficients and intercepts between waves (ie, changes in associations) to the decline in the obesity rate, and the value of the second counterfactual represents the contribution of differences in the mean levels of the covariates between waves (ie, changes in population composition) to the decline in the obesity rate. In addition to estimating the overall contribution of these 2 components of change, the Blinder-Oaxaca regression decomposition technique may be used to estimate the contributions of individual variables to change. However, in the traditional Blinder-Oaxaca regression decomposition approach, the estimated contribution
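The wave-swapping logic reads directly as matrix algebra: fit each wave separately, then combine one wave's coefficients with the other wave's covariate means. A sketch using a linear probability model on synthetic data (the study's actual model and covariates are not reproduced here; every name and number below is illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_lpm(X, y):
    # linear probability model = OLS of a 0/1 outcome on covariates
    Xc = np.column_stack([np.ones(len(X)), X])
    b, *_ = np.linalg.lstsq(Xc, y, rcond=None)
    return b

n = 20000
X1 = rng.normal(0.0, 1.0, (n, 1))                    # "wave 1" composition
y1 = (rng.random(n) < 0.30 + 0.05 * X1[:, 0]).astype(float)
X2 = rng.normal(0.4, 1.0, (n, 1))                    # "wave 2": composition shifted
y2 = (rng.random(n) < 0.25 + 0.05 * X2[:, 0]).astype(float)

b1, b2 = fit_lpm(X1, y1), fit_lpm(X2, y2)
m1 = np.concatenate([[1.0], X1.mean(axis=0)])
m2 = np.concatenate([[1.0], X2.mean(axis=0)])

cf_coeffs_changed = m1 @ b2       # counterfactual 1: new coefficients, old composition
cf_composition_changed = m2 @ b1  # counterfactual 2: old coefficients, new composition
print(y1.mean(), y2.mean(), cf_coeffs_changed, cf_composition_changed)
```

With an intercept, each wave's fitted mean reproduces its observed rate exactly, so the counterfactuals isolate the coefficient and composition channels cleanly.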

Multilevel PWM Waveform Decomposition and Phase-Shifted Carrier Technique

Note that in this case the only difference between the surfaces is their reference point. Thus, each surface is shifted by π/2 along the y′ axis. Adding up these surfaces gives the resulting surface for the PSC technique, which is shown in Fig. 8(e,f). It is clear that the surface for the APO disposition and the one for the PSC technique are the same. The only difference visible in the figures is that four periods of the surface are shown for PSC, compared to one period for APO. This is because the y′ axis in PSC is four times greater in scale than the y axis in the APO disposition (recall that ω_c = 4ω′_c).

An Improved Decomposition Technique for Solving Integer Linear Programming Problems

The outline of this paper is as follows. In Section 2, some basic ideas and necessary definitions related to the work are mentioned. In Section 3, some existing techniques to solve ILPPs are presented. In Section 4, an improved algorithm based on Benders decomposition is developed to solve ILPPs. In Sections 5, 6 and 7, an ILPP is solved manually using the proposed algorithm, and a computer code is generated using AMPL. In Section 8, a tabular comparison between the manual output and the programming output is presented. Finally, in Section 9, the convergence of the master and subproblem values is shown graphically [3].

A Review on Multiphase Permanent Magnet Synchronous Motor Drive

In the next paper, by T. A. Lipo in 1996, a unified approach to the modeling and field-oriented control of a dual three-phase induction machine with one phase open is presented. Using the concept of vector space decomposition, the proposed technique is established on the basis of the asymmetrical winding structure directly, and thus provides a precise, physically insightful tool for the modeling and control of induction machines with structural unbalance [13]. Again by T. A. Lipo, in the year 2001, a technique of injecting third-harmonic zero-sequence current components into the phase currents (Fig. 2) is described, which greatly improves the machine torque density. Analytical, finite element and experimental results were presented to show the system operation and to demonstrate the improvement in torque density [14].

A PROFICIENT LOW COMPLEXITY ALGORITHM FOR PREEMINENT TASK SCHEDULING INTENDED FOR HETEROGENEOUS ENVIRONMENT

The slow response of thresholding, the failure to detect fast eye blinks and the lack of an effective de-noising technique forced researchers to study the frequency characteristics of the EEG as well. A method of dealing with ocular artifacts in the EEG is reviewed, focusing on the relative merits of a variety of EOG correction procedures. The distinction between frequency and time domain approaches, the number of EOG channels required for adequate correction, estimating correction coefficients from raw versus averaged data, differential correction of different types of eye movement, the most suitable procedure for estimating correction coefficients, the use of calibration trials for estimation of correction coefficients, and the distinction between ‘coefficient estimation’ and ‘correction phase’ error are also discussed [5].

A Lossless Multispectral Image Compression With Wavelet Band Decomposition And Binary Plane Technique

Multispectral image compression has attracted many researchers to conduct innovative work; a few studies related to the current work are presented in this section. In [5], Ruedin et al. proposed a 2D integer wavelet transform based band ordering mechanism with predictions using the wavelet coefficients, and presented a class-conditioned lossless image compressor with arithmetic encoding. The method was also compared against 3D-SPIHT and the KLT transform; however, the band decomposition method introduces artifacts and discontinuities, by which the compressed image loses its originality. Zhang et al. in [6] proposed a lossy-to-lossless coder for hyperspectral images, consisting of an integer KLT in the spectral dimension and a 2-D DWT in the spatial dimension, followed by a 3-D Tarp-based coder. In [7], Bhagyaraju et al. presented an improved SPIHT algorithm which is used to compress multispectral satellite images. Here, an interpolation-based super-resolution strategy is used to enhance the multispectral images and also to estimate a high-resolution (HR) image from a low-resolution (LR) image. Then, using the discrete wavelet transform (DWT), the decorrelated spectral bands are transformed. The improved SPIHT algorithm quantizes and encodes the spectral bands. The main advantage of this algorithm is the high compression ratio in bits per pixel per band and the achievement of maximum coding efficiency.

The Income-Health Relationship “Beyond the Mean” : New Evidence from Biomarkers

The relationship between income and health is one of the most explored topics in health economics but less is known about this relationship at different points of the health distribution. Analysis based solely on the mean may miss important information in other parts of the distribution. This is especially relevant when clinical concern is focused on the tail of the distribution and when evaluating the income gradient at different points of the distribution and decomposing income-related inequalities in health is of interest. We use the unconditional quantile regression approach to analyse the income gradient across the entire distribution of objectively measured blood-based biomarkers. We apply an Oaxaca-Blinder decomposition at various quantiles of the biomarker distributions to analyse gender differentials in biomarkers and to measure the contribution of income (and other covariates) to these differentials. Using data from the Health Survey for England we find a non-linear relationship between income and health and a strong gradient with respect to income at the highest quantiles of the biomarker distributions. We find that there is heterogeneity in the association of health to income across genders which accounts for a substantial percentage of the gender differentials in observed health.

The Mezcal de Oaxaca, A Natural Cluster in Growth Stage

Coordination among actors in the chain is still emerging and is framed exclusively around the purchase of supplies on a consolidated or sporadic basis. Maguey producers represent the link with the lowest economic benefits; among the causes attributed is the unwillingness to complete purchases and to agree on a fair price. This has led to increases in the price of maguey in periods of scarcity. Mezcal producers represent the link that has managed to formalize partnerships with educational institutions and social organizations, such as the Benito Juarez Autonomous University of Oaxaca (UABJO), the Mexican Regulatory Council for Mezcal Quality (COMERCAM), and the State Committee of the Sistema Producto Maguey-Mezcal (CESPMM), among others. It is worth noting that COMERCAM is the only organization supporting the mezcal industry that has economic autonomy, as the rest of the organizations remain dependent on government.

A New Decomposition Technique for Daily Peak Load Demand

Rasak et al. (2010) seek to find an appropriate forecasting technique for the effects of moving holidays in Malaysia. Such holidays include Eid al-Fitr, Chinese New Year and Deepavali. The paper notes that these moving holidays can overlap with fixed holidays and thereby increase the complexity of the load forecasting problem. To achieve the objectives, three methods were considered: SARIMA, constrained SARIMA and dynamic regression; the MAPE results obtained were 4.84%, 3.85% and 2.39%, respectively.
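MAPE, the accuracy measure quoted above, is the mean absolute forecast error expressed as a percentage of the actual values; a minimal sketch (the load figures below are made up):

```python
import numpy as np

def mape(actual, forecast):
    # mean absolute percentage error, in percent
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

actual = [100.0, 120.0, 90.0, 110.0]     # e.g. daily peak loads
forecast = [98.0, 125.0, 85.0, 112.0]
print(round(mape(actual, forecast), 2))  # → 3.39
```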

An Improved Decomposition Algorithm and Computer Technique for Solving LPs

Abstract - The Dantzig-Wolfe decomposition (DWD) principle relies on delayed column generation for solving large-scale linear programs (LPs). In this paper, we present an improved decomposition algorithm based on the Dantzig-Wolfe decomposition principle for solving LPs, giving the algorithm and its sequential steps using a flowchart. Numerical examples are given to demonstrate our method. A computer technique for solving LPs is also developed with proper instructions. Finally, we draw a conclusion stating the advantages of our method of computation.

A novel non-negative matrix factorization technique for decomposition of Chinese characters with application to secret sharing

The decomposition of Chinese characters is difficult and has rarely been investigated in the literature. In this paper, we propose a novel non-negative matrix factorization (NMF) technique to decompose a Chinese character into several graphical components without considering the strokes of the character or any semantic or phonetic properties of the components. Chinese characters can usually be represented as binary images. However, traditional NMF is only suitable for representing general gray-level or color images. To decompose a binary image using NMF, we force all of the elements of the two matrices (obtained by factorizing the binary image/matrix to be decomposed) to be as close to 0 or 1 as possible. As a result, a Chinese character can be efficiently decomposed into several components, where each component is semantically unreadable. Moreover, our NMF-based Chinese character decomposition method is suitable for applications in visual secret sharing by distributing the shares (different character components) among multiple parties, so that only when the parties are taken together with their respective shares can the secret (the original Chinese character(s)) be reconstructed. Experimental results have verified the
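The idea of factorizing a binary image into near-binary components can be sketched with standard multiplicative-update NMF followed by thresholding. This is a simplification of the paper's method (which pushes the factors toward 0/1 during optimization rather than rounding afterwards), and the toy "character" and all parameters below are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

def binary_nmf(V, r, iters=2000, eps=1e-9):
    # standard multiplicative updates (Frobenius NMF), then rescale each
    # H row to max 1 and threshold both factors at 0.5 to make them binary
    m, n = V.shape
    W = rng.random((m, r)) + 0.1
    H = rng.random((r, n)) + 0.1
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    for k in range(r):
        s = H[k].max()
        if s > eps:
            H[k] /= s
            W[:, k] *= s
    return (W > 0.5).astype(int), (H > 0.5).astype(int)

# toy binary "character": a top-half block plus a bottom-left stripe
V = np.zeros((6, 6))
V[:3, :] = 1.0
V[3:, :2] = 1.0
W, H = binary_nmf(V, r=2)
recon = ((W @ H) > 0).astype(int)
print((recon == V).mean())   # fraction of pixels recovered by the binary factors
```

Each row of H is one graphical component; W says which components each image row uses.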

An Effective Image Fusion Technique based on Multiresolution Singular Value Decomposition

Due to its multiresolution property, the discrete wavelet transform is widely used in image processing [13]. DWT is a technique which converts an image from the spatial domain to the frequency domain. It is used to analyze an image at different resolutions. We can obtain horizontal, vertical and diagonal information of the images using DWT. At first-level decomposition, DWT decomposes the image into two parts: an approximation part and detail parts. The approximation part contains one low-frequency subband (LL) and the detail parts contain three high-frequency subbands (LH, HL and HH), as shown in Fig. 1(a). Most of the information of the image is contained in the approximation part. For second-level decomposition, the approximation part is further decomposed into four frequency subbands, as shown in Fig. 1(b). The number of decomposition levels can be increased as per the requirement.
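The first-level split into LL/LH/HL/HH can be illustrated with the simplest wavelet, the Haar filter. A minimal NumPy sketch (real systems use longer filters via a library such as PyWavelets, and subband naming conventions vary between texts):

```python
import numpy as np

def haar_dwt2(img):
    # one level of a 2-D Haar DWT; image sides must be even
    img = np.asarray(img, dtype=float)
    lo = (img[:, 0::2] + img[:, 1::2]) / 2.0   # lowpass along rows
    hi = (img[:, 0::2] - img[:, 1::2]) / 2.0   # highpass along rows
    LL = (lo[0::2, :] + lo[1::2, :]) / 2.0     # approximation subband
    LH = (lo[0::2, :] - lo[1::2, :]) / 2.0     # detail subband
    HL = (hi[0::2, :] + hi[1::2, :]) / 2.0     # detail subband
    HH = (hi[0::2, :] - hi[1::2, :]) / 2.0     # diagonal detail subband
    return LL, LH, HL, HH

img = np.arange(16.0).reshape(4, 4)
LL, LH, HL, HH = haar_dwt2(img)
print(LL.shape)   # each subband is half the original size per dimension → (2, 2)
```

A second level, as in Fig. 1(b), is obtained by calling haar_dwt2(LL) on the approximation subband again.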

Image Dehazing Technique Based On DWT Decomposition and Intensity Retinex Algorithm

The proposed work studies different types of methods and technologies that have been used for image dehazing, and observes that low contrast and noise remain a barrier to visually pleasing images under hazy conditions. In that case, to obtain more accuracy in the image enhancement process, there is a need to detect and measure the intensity level of each individual pixel channel, as well as to apply an appropriate enhancement factor, so that an effective and efficient image enhancement process is created. A new method for image dehazing based on DWT decomposition and intensity Retinex is presented. The air-light of the image is estimated by decomposing the image using a non-symmetry and anti-packing model to refine the illumination value. Next, the scene transmission function is calculated using a combination of boundary constraints and contextual regularization. The proposed method produces high-quality dehazed pictures in most cases and decreases artifacts. Moreover, the tone of the image is natural. However, some points still need to be improved, such as the time consumption and the tone's

Implications of the selection of a particular modal decomposition technique for the analysis of shallow flows.

Under such conditions, it can be expected that, due to the still dominant two-dimensionality of the flow, the results will contain enough information on the dominant dynamics. As explained later in detail, this information can be found in the temporal coefficients resulting from a POD, which, after a Fourier spectral analysis, can guide the search for corresponding modes using the DMD technique. Thus, the aim of this paper is not only to demonstrate the potential utility of DMD in hydraulics research, but also to show how it can be complemented, and the interpretation of its results enhanced, through the use of information from POD. Overviews of these two techniques are provided in the next two sections, with further detail available in the literature. Results where DMD and POD largely return the same behaviours are then presented, before considering in greater detail a case where more complex flow dynamics requires the use of the two techniques in parallel.
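The POD-plus-Fourier step described here can be sketched with plain NumPy: the POD modes and temporal coefficients come from an SVD of the mean-subtracted snapshot matrix, and an FFT of a temporal coefficient exposes that mode's frequency. The synthetic "flow" below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# synthetic snapshot matrix: each column is the "flow field" at one instant,
# built from two spatial patterns oscillating at 5 Hz and 12 Hz plus noise
nx, nt, dt = 64, 256, 0.01
x = np.linspace(0.0, 2.0 * np.pi, nx)
t = np.arange(nt) * dt
snapshots = (np.outer(np.sin(x), np.cos(2.0 * np.pi * 5.0 * t))
             + 0.3 * np.outer(np.sin(2.0 * x), np.cos(2.0 * np.pi * 12.0 * t))
             + 0.01 * rng.normal(size=(nx, nt)))

# POD = SVD of the mean-subtracted snapshot matrix
fluct = snapshots - snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
modes = U                  # spatial POD modes (columns)
coeffs = s[:, None] * Vt   # temporal coefficient of each mode

# Fourier spectrum of the leading temporal coefficient picks out its frequency
spec = np.abs(np.fft.rfft(coeffs[0]))
freqs = np.fft.rfftfreq(nt, dt)
print(freqs[np.argmax(spec)])   # ≈ 5 Hz, the dominant mode's frequency
```

It is exactly this kind of spectral peak in a POD coefficient that can guide the search for a matching DMD mode.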

Singular Value Decomposition based Steganography Technique for JPEG2000 Compressed Images

where X̂ is a pixel of the stego image and X is a pixel of the cover image, and M and N are the height and width of the images, respectively. The larger the PSNR, the better the quality of the stego image. In general, a stego image is acceptable to human perception if its PSNR is greater than 30 dB [20, 21]. The PSNR is used for evaluating the imperceptibility of data hiding techniques. In order to show the effectiveness of the proposed technique, eight images, namely Lena, Boat, Baboon, Bridge, Couple, Crowd, Pepper and Airplane, are used as cover images, each of size 512×512. The Barbara image of size 256×256 is taken as the secret image. These cover images are compressed using different bit rates, namely 4 bits per pixel (bpp), 2 bpp, 1 bpp and 0.5 bpp. The SF is used in the embedding process and the optimal value of the SF is determined using the GA. Five generations with a population size of 20 are considered in the GA optimization process using the fitness of (2). The secret image is then embedded into the cover image while compressing with the JPEG2000 standard, using the optimal SF value. The same SF value is required in the extraction process to extract the bits of the secret image. These results are presented in Table 1.
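The PSNR criterion above is straightforward to compute: PSNR = 10·log10(255²/MSE). A minimal sketch, where additive noise stands in for an actual embedding algorithm:

```python
import numpy as np

def psnr(cover, stego, peak=255.0):
    # peak signal-to-noise ratio in dB between cover and stego images
    cover = np.asarray(cover, dtype=float)
    stego = np.asarray(stego, dtype=float)
    mse = np.mean((cover - stego) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(4)
cover = rng.integers(0, 256, (512, 512)).astype(float)
stego = cover + rng.normal(0.0, 2.0, cover.shape)  # stand-in embedding distortion
print(psnr(cover, stego) > 30.0)   # True: comfortably above the 30 dB threshold
```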

Inertial measurement techniques for human joints' movement analysis

In wavelet packet decomposition, a signal is successively decomposed by suitably chosen lowpass and highpass filters, and at each stage of decomposition the resulting signals are downsampled by a factor of 2. The outputs of the lowpass and highpass filters represent the coarse and detail information of the signal, respectively. The decomposition is repeated in a similar manner for the required number of stages (called levels). The noisy accelerometer and drifting gyroscope signals were both decomposed separately, using Matlab, to 8 levels using the Daubechies 20 wavelet family (filter). The coefficients of the terminal nodes were expressed as nodes 255 to 510. The values of these nodes for the decomposed noisy accelerometer and drifting gyroscope signals were compared for similarity by performing correlation. The two nodes with the largest correlation magnitude (i.e. closest similarity) were chosen, and the values of the nodes that were not selected were set to zero. The accelerometer signal was then reconstructed based on the new wavelet packet coefficients for the noisy accelerometer. The reconstructed accelerometer signal obtained using the above method was compared with the complementary filter approach for combining the noisy accelerometer and drifting gyroscope signals. The movement angle from the complementary filter was obtained from Eq. (1) as
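The node-selection step (correlate matching terminal nodes, keep the two most similar, zero the rest) can be sketched as follows; the arrays below are random stand-ins for real wavelet packet coefficients, since the actual 8-level Daubechies 20 decomposition is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(5)

# stand-ins for terminal-node coefficients of the two decomposed signals;
# the real pipeline would take these from a wavelet packet transform
n_nodes, n_coeffs = 8, 32
accel = rng.normal(size=(n_nodes, n_coeffs))
gyro = accel + rng.normal(scale=2.0, size=(n_nodes, n_coeffs))
gyro[2] = accel[2] + 0.05 * rng.normal(size=n_coeffs)  # two nodes made very similar
gyro[5] = accel[5] + 0.05 * rng.normal(size=n_coeffs)

# correlate corresponding nodes, keep the two most similar, zero the rest
corr = np.array([abs(np.corrcoef(a, g)[0, 1]) for a, g in zip(accel, gyro)])
keep = np.argsort(corr)[-2:]
filtered = np.zeros_like(accel)
filtered[keep] = accel[keep]
print(sorted(keep.tolist()))   # the two engineered similar nodes are selected
```

The filtered coefficient set would then feed the inverse wavelet packet transform to reconstruct the de-noised accelerometer signal.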

A Hybrid of EMD-SVM Based on Extreme Learning Machine for Crude Oil Price Forecasting

The SVMs were used to evaluate the nonlinear behavior of the prediction data set, because Gaussian kernels tend to give good performance under common efficiency assumptions (Scholkopf, 1997). The parameters C and γ for the SVM are estimated using a cross-validation technique. Cross-validation is a technique that can be used to estimate the quality of a model. When applied to several models with different free parameter values (such as the number of hidden nodes, the SVM parameters, and so on), the results of cross-validation can be used to select the best set of parameter values. Among the several types of cross-validation, k-fold cross-validation is commonly used. Empirical Mode Decomposition (EMD):
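The k-fold selection loop described above can be sketched as follows. To stay dependency-free this uses closed-form ridge regression, with the penalty λ standing in for the SVM's C and kernel-width parameters; the cross-validation mechanics (split, fit on k−1 folds, score on the held-out fold, pick the best grid value) are the same:

```python
import numpy as np

rng = np.random.default_rng(6)

def ridge_fit(X, y, lam):
    # closed-form ridge regression: (X'X + lam*I)^(-1) X'y
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def kfold_mse(X, y, lam, k=5):
    # average validation MSE of ridge(lam) over k folds
    folds = np.array_split(np.arange(len(y)), k)
    errs = []
    for fold in folds:
        train = np.setdiff1d(np.arange(len(y)), fold)
        w = ridge_fit(X[train], y[train], lam)
        errs.append(np.mean((X[fold] @ w - y[fold]) ** 2))
    return float(np.mean(errs))

X = rng.normal(size=(200, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.5, size=200)

grid = [0.01, 0.1, 1.0, 10.0, 100.0]
scores = {lam: kfold_mse(X, y, lam) for lam in grid}
best = min(scores, key=scores.get)
print(best, round(scores[best], 3))
```

The same loop, run over a (C, γ) grid with an SVM fit in place of ridge_fit, is the selection procedure the abstract describes.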
