In all cases, I studied the estimating process (including related company processes) using documentation about the case, such as project definitions, drawings used for estimating, and the results of the estimate. I also conducted informal interviews with people involved in the estimating process. Last, I was an observer of day-to-day practice at the company, Ballast Nedam. Altogether, this enabled me to gain integrated knowledge about the estimating process. I have structured my findings in a uniform manner. In each case, I described the estimating process from start (input) to finish (the estimate), focusing on process steps, the people involved, and estimating methods. I also addressed project-specific aspects to analyze complexity features. By doing this, I was able to list similarities and differences between cases more easily. According to Eisenhardt (1989), this led to a more sophisticated understanding of the processes I studied.
The process capability of a process is defined as the inherent variability of a process that is running under chance causes of variation only. A process capability index measures the ability of a process to meet the product specification limits. Generally, process capability is measured by 6σ, assuming that the product characteristic follows a normal distribution. In many practical situations, however, the product characteristics do not follow a normal distribution. In this paper, we describe an approach to estimating process capability assuming the generalised g-and-h distribution proposed by Tukey.
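Since the abstract contrasts the conventional 6σ-based index with the g-and-h approach, a minimal sketch of the conventional calculation may be helpful; the function name and sample data below are illustrative assumptions, and the g-and-h variant itself is beyond this sketch.

```python
import statistics

def process_capability(data, lsl, usl):
    """Conventional capability indices under the normality assumption:
    Cp compares the specification width to the 6-sigma process spread;
    Cpk additionally penalizes an off-center process mean."""
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Illustrative measurements centered at 10 with spec limits [8, 12].
cp, cpk = process_capability([9, 10, 11, 10, 10, 10, 9, 11, 10, 10],
                             lsl=8, usl=12)
```

When the characteristic is skewed, as in the g-and-h setting, these normal-theory indices can be misleading, which is what motivates the distribution-based approach the abstract describes.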
Therefore, this research is needed because in Malaysia the use of Cost Estimating systems was very limited. The most common and widespread methods for taking-off were still the traditional method and the spreadsheet method. Meanwhile, although most contractors have highly capable Cost Estimating systems, many of them still use spreadsheet programs. The reason is that spreadsheet programs actually save more time than the Cost Estimating production system. For these spreadsheet users, templates or standard sheets were developed to help with the Cost Estimating process. The users only have to key in the dimensions, and then they obtain quantities for multiple items in one single sheet.
Expert Judgement (EJ) is used extensively during the generation of cost estimates. Cost estimators have to make numerous assumptions and judgements about what they think a new product will cost. However, the use of EJ is often frowned upon and not well accepted or understood by non-cost estimators within a concurrent engineering environment. Computerised cost models have, in many ways, reduced the need for EJ, but by no means have they, or can they, replace it. The cost estimates produced by both algorithmic and non-algorithmic cost models can be wildly inaccurate and, as the work in this paper highlights, require extensive use of judgement in order to produce a meaningful result. Very little research tackles the issues of capturing and integrating EJ and its rationale into the cost estimating process.
The application of engineering practices and scientific principles to the creation of cost estimates along a product life cycle is one of the basic aspects of Cost Engineering. Cost estimates are used as fundamental criteria for design decisions in the development stage and for business decisions in collaborations between OEMs and their supply chains. Producing an estimate requires experience and knowledge of different techniques and methodologies. Key aspects of its creation are the adoption of a cost estimating process, the availability of the needed data, and the proper management of the information used during the process. Collaboration between an OEM and its supplier can be facilitated by a better common understanding of how the cost estimates have been created. The cost estimating process used is then a fundamental basis of trust. In this context, the V-CES project has developed a set of virtual tools and services around cost estimating processes to support the creation of cost estimates, the improvement of competences within the Cost Engineering community, and a common understanding of cost between OEMs and their supply chains. The main purpose of this paper is to present the research conducted in the definition of cost estimating processes and the virtual solutions developed around them.
“Benchmark’s ease of use and Windows-based feel has made estimating much more efficient in our Council. The system is also very flexible, and we can easily customise it to suit our needs. One of the features I use a lot is the integrated email; this makes the whole estimating process much easier and faster. The reports are also fantastic – we can instantly generate reports showing the total plant, labour, materials and subcontractor requirements for our works.”
Establishing an HR Planning framework is a major achievement that enables an organization to determine how best to use its human resources to achieve outputs and outcomes. Employees play a crucial role in the success of a business organization, so organizations should put consideration and careful planning into their human resource practices. It is imperative that human resource practices correspond with business plans. Human resource planning is an ongoing process: an organization must continually monitor and forecast personnel needs and concerns. Human resource planning is something that can be learned and improved through experience and effort. HRP helps to link the long-term purpose, goals, and objectives of the organization to the HR function and HR plans. It also examines what people are presently doing in their jobs in the organization, analyzes what kinds of people are doing the work at present, and assesses the present strengths and weaknesses of the HR policies. Finally, it compares present and future
In the case of individuals who are price- and value-conscious, if the reward or benefit sought after purchase lies in the future, it has a negative influence on purchase intention. Usually, in the case of green products, the benefits are realized at a point in the future. Price fairness has been identified as one psychological factor that exerts an important influence on consumers’ reactions to prices. Fairness is a judgment based on the significance of the value derived from a given product or service, whereas value is simply an assessment of the amount paid against some conception of quality, which may be based upon either the buyer’s reference price or their willingness to pay. For example, a value-conscious consumer would prefer ordinary vegetables or lentils over pesticide-free organic ones. Fairness has been conceptualized as “a judgment of whether an outcome and/or the process to reach an outcome are reasonable, acceptable, or just”. Consumers usually form an acceptable price range in their minds on the basis of their previous transactions and consider any deviation from this range unfair. Evidence from previous studies indicates that consumers attach importance to price fairness and do not accept ranges they consider unfavorable. In fact, they look for causality and try to understand why the deviation has happened. Since green products are priced higher than their non-green alternatives, consumers’ perceptions of the value they offer might be lower in the short run. However, the total utility of
The consistency of quasi-likelihood estimators is discussed in Chapter 3. It is obvious that the consistency of the quasi-likelihood estimators depends strongly on how much information we can get from the data or the process. In process inference problems, the usual way to establish an estimating function space associated with the inference problem we consider is to choose a martingale from the semimartingale representation of the given process. As we point out in Chapter 3, sometimes the quasi-likelihood estimators may not be consistent if the martingale from the given process does not contain information about the parameters which is unbounded as the sample size increases. How can we get more information from a given process or sample and when do the martingales contain all the information we can get from the sample? Those questions are discussed in Chapter 3. Furthermore, we also give several sufficient conditions for the consistency of the quasi-likelihood estimators. The discussions are based on two different approaches; one concerns the integral surface case and the other the asymptotic linear case.
proposed estimator is high compared with other estimators, whether classical or robust, so it is the more robust estimator. The MSE, as a measure of the efficiency of an estimator, is shown to be smaller for the proposed estimator than for the other estimators when local contamination exists. The sensitivity of the introduced estimator to changes in the mean and standard deviation of the outlying subgroups (a and sd, respectively) is roughly lower than that of the other estimators. So a control chart constructed on this estimator yields more precise control limits in practical situations, and the analyst can rely on such a chart to control a process more comfortably. This may be verified by comparing the power of the test for this control chart with that of the others.
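As a hypothetical illustration of this kind of MSE comparison (not the authors' estimator), the sketch below contrasts the Monte Carlo MSE of the sample mean and the sample median when each observation is locally contaminated with some probability; all parameter values are invented for the example.

```python
import random

def mc_mse(estimator, n_rep=2000, n=5, frac=0.1, shift=5.0, seed=1):
    """Monte Carlo MSE of a location estimator when each observation is
    contaminated with probability `frac` (drawn from N(shift, 1) instead
    of N(0, 1)); the true process mean is 0."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_rep):
        sample = [rng.gauss(shift if rng.random() < frac else 0.0, 1.0)
                  for _ in range(n)]
        total += estimator(sample) ** 2  # squared error versus true mean 0
    return total / n_rep

def sample_mean(xs):
    return sum(xs) / len(xs)

def sample_median(xs):
    s = sorted(xs)
    return s[len(s) // 2]  # middle value for odd n
```

Under contamination the robust estimator's MSE comes out clearly smaller, which is the pattern the passage describes; comparing the resulting control limits would follow the same logic.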
This paper considers the identification, estimation, and inferential theory of the FAVAR model. Three sets of identification restrictions are considered. We propose a likelihood-based two-step method to estimate the parameters. Consistency, convergence rates, asymptotic representations, and the limiting distributions are established. The impulse response function and its confidence intervals are also provided. An important result from our theory is that if the identification conditions are imposed on the population variance rather than on the sample variance of the factor process, an additional term, which arises from the distance between the sample variance and the population variance, enters the final asymptotic representations. Consequently, the limiting variances of the estimators are larger. We study the ways in which this distance affects the limiting distributions. Finite-sample Monte Carlo simulations confirm our theoretical results.
In recent years, effort estimation methodology is said to have advanced with the introduction of the term “agile”, an incremental and iterative approach that follows a set of values and principles and progresses through the collective effort of self-organizing, cross-functional teams. Agile methods break the product down into smaller incremental constructs and primarily focus on process adaptability and customer satisfaction through rapid delivery of working software. Agile methodologies include Extreme Programming (XP), Scrum, Feature-Driven Development (FDD), etc.
In order to confirm the performance of the proposed algorithm, two experiments were carried out with a mobile robot in an indoor environment. In the first experiment, the mobile robot moves to a goal while avoiding collision with a standing human; in the second, it avoids two walking humans. The computer used in the earlier experiment for estimating the positions and velocities of obstacles was also utilized in the collision-avoidance experiments. The resultant motions of the robot generated by the algorithm during the experiments are shown in Fig. 10. It was confirmed that the robot with the proposed algorithm can move to the goal position safely in situations including static and moving human obstacles. Examples of the computation results of the proposed algorithm for particular instants taken from the experiments are shown in Fig. 10 (b) and Fig. 10 (d). It can be observed that the algorithm estimates both the position and the velocity of objects near the robot and generates a safe direction.
In the normal blood-sucking process of a mosquito, the unevenness and stoppage of flow due to the non-Newtonian aspect of blood are disadvantageous, because the blood viscosity increases as the flow speed decreases or under reversed-flow conditions. The two pumps of a mosquito seem to contribute to maintaining the forward flow. As a result, unevenness and stoppage in the flow are minimized in the three phases of the systaltic motion of the two pumps. In phase 1, the PP sucks the liquid with forward inertia owing to its earlier expansion with a time shift (). The expansion of the PP contributes to reducing reverse flow, compared with the degree of reverse flow in phases 2 and 3. At the end of phase 3, the CP starts to expand while the PP contracts, accumulating blood in the CP and then transporting it to the PP.
DOI: 10.4236/ojs.2018.85051, Open Journal of Statistics. Generalized Estimating Equations (GEE) are quite popular for the analysis of non-Gaussian correlated data. Their main advantage is that one is only required to specify the mean structure of the response correctly for the parameter estimator to be consistent and asymptotically normal. In the presence of missing data, GEE is only valid under the strong assumption of MCAR. The first effort to make GEE applicable to the more realistic MAR scenario was Multiple Imputation Generalized Estimating Equations (MIGEE), proposed by Little and Rubin. Here, missing values are multiply imputed and the resulting completed datasets are analysed through standard GEE methods; following Rubin’s rule, the results obtained from the completed datasets are then combined into a single inference. Robins extended GEE by developing the Inverse Probability Weighted Generalized Estimating Equations (IPWGEE), which consist of weighting each subject’s contribution in the GEE by the inverse of the probability that the subject is still observed at that time. IPWGEE produces consistent estimates provided the weight model is correctly specified. Double Robust Generalized Estimating Equations (DRGEE) arise as a third generalisation of GEE for data subject to an MAR mechanism. The main idea is to supplement the IPWGEE with a predictive model for the missing quantities conditional on the observed ones. This method produces consistent estimates provided either the dropout model or the conditional model is correctly specified. Doubly robust methods have received wide attention in the literature in the last decade.
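As a toy illustration of the inverse-probability-weighting idea behind IPWGEE (a single-mean sketch, not a full GEE), the code below reweights observed outcomes by the inverse of their observation probability under an MAR mechanism; the data-generating values are invented for the example.

```python
import random

def ipw_mean(y, obs, p):
    """Horvitz-Thompson style mean: each observed outcome is weighted
    by 1 / P(observed), which corrects MAR selection on covariates."""
    return sum(yi / pi for yi, oi, pi in zip(y, obs, p) if oi) / len(y)

def complete_case_mean(y, obs):
    """Naive mean over observed cases only (biased under MAR)."""
    kept = [yi for yi, oi in zip(y, obs) if oi]
    return sum(kept) / len(kept)

# MAR simulation: outcome and dropout both depend on a binary covariate x.
rng = random.Random(7)
n = 20000
x = [rng.random() < 0.5 for _ in range(n)]
y = [2.0 * xi + rng.gauss(0, 1) for xi in x]  # true E[y] = 1.0
p = [0.3 if xi else 0.9 for xi in x]          # known observation probabilities
obs = [rng.random() < pi for pi in p]
```

Here the complete-case mean drifts toward the well-observed x = 0 group, while the weighted mean recovers the true value; a correctly specified weight model plays the same role inside the IPWGEE estimating equations.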
There has recently been some focus on so-called stochastic volatility models, that is, two-dimensional diffusions where one of the coordinate processes is completely unobserved. This of course complicates the analysis even further; see Sørensen (2000, Section 3.4) for a survey of estimation techniques. Estimating functions have been developed (Sørensen, 1999b), and the Bayesian approach as well as indirect inference and EMM have been applied (Andersen and Lund, 1997; Eraker, 2001; Gourieroux et al., 1993). It is not yet clear whether the approximate likelihood methods from Section 5 can be extended to cover such models, but there are other suggestions on approximate likelihood analysis (Sørensen, 2003). Still, inference for stochastic volatility models is far from fully explored.
The analytic process used in this empirical example involved: (1) estimating the propensity score using logistic regression to predict program participation status from pre-intervention covariates; (2) dividing the entire sample into 5 strata of the propensity score; (3) generating weights for each individual based on their study group and stratum; and (4) estimating treatment effects using weighted OLS (i.e., the weight was specified as a weight in the OLS model). The results indicated a statistically significant difference in favor of the control group (P = 0.031; 95% CI: 0.008, 0.174). In other words, the control group had a larger decrease in hospitalizations than the treatment group.
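The four steps can be sketched end to end on simulated data; the data-generating model, the weighting scheme (a marginal-mean-weighting-through-stratification variant), and all numbers below are illustrative assumptions, not the study's actual analysis.

```python
import math
import random
from collections import Counter

rng = random.Random(42)
n = 5000

# Simulated data: covariate x confounds treatment t and outcome y;
# the true treatment effect is 1.5 (assumed for illustration only).
x = [rng.gauss(0, 1) for _ in range(n)]
t = [1 if rng.random() < 1 / (1 + math.exp(-xi)) else 0 for xi in x]
y = [2.0 + 1.5 * ti + xi + rng.gauss(0, 1) for xi, ti in zip(x, t)]

# (1) Propensity score via logistic regression (plain gradient ascent).
a = b = 0.0
for _ in range(300):
    resid = [ti - 1 / (1 + math.exp(-(a + b * xi))) for xi, ti in zip(x, t)]
    a += 0.5 * sum(resid) / n
    b += 0.5 * sum(r * xi for r, xi in zip(resid, x)) / n
ps = [1 / (1 + math.exp(-(a + b * xi))) for xi in x]

# (2) Five strata from propensity-score quintiles.
cuts = sorted(ps)
edges = [cuts[n * q // 5 - 1] for q in range(1, 5)]
stratum = [sum(pi > e for e in edges) for pi in ps]

# (3) Weights from study group and stratum: w = n_s / n_{group,s}.
n_s = Counter(stratum)
n_gs = Counter(zip(t, stratum))
w = [n_s[s] / n_gs[(ti, s)] for ti, s in zip(t, stratum)]

# (4) Weighted OLS of y on treatment; with one binary regressor the
# slope equals the difference of weighted group means.
def wmean(pairs):
    return sum(wi * yi for wi, yi in pairs) / sum(wi for wi, _ in pairs)

eff = (wmean([(wi, yi) for wi, yi, ti in zip(w, y, t) if ti == 1])
       - wmean([(wi, yi) for wi, yi, ti in zip(w, y, t) if ti == 0]))
```

The unweighted difference of means is badly confounded in this simulation, while `eff` should land near the assumed effect of 1.5; the study's actual covariates, strata, and weight formula may differ from this sketch.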
To overcome the above drawbacks, a vibratory setup is introduced to enhance the mechanical properties of weldments. In this process, mechanical vibrations are applied to the weldments with a periodic force during the welding operation. Compared with the above techniques, vibration-assisted welding has advantages such as economy, flexibility in use, freedom from pollution, and shorter manufacturing time. Owing to these characteristics, vibratory welding techniques have been implemented with varying degrees of success [19-21]. Even so, the effect of vibratory parameters on the mechanical properties of weldments remains an open question for researchers. A dynamic solidification technique has been proposed to impart mechanical vibrations during the welding of butt-welded joints, and it was found that butt-welded joints prepared under vibratory conditions had higher hardness with no loss of ductility. Authors have utilized a vibratory setup to apply mechanical vibrations to the weld pool during welding, and improvements in mechanical properties were observed as a result of the vibratory welding process. It was inferred that the refined microstructure was responsible for the improvement in the impact strength, tensile strength, flexural strength, and hardness of butt-welded joints of mild steel plates. Other authors observed that post-weld vibratory treatment does not alter the crystal structure itself, attributing the increase in all properties to the resulting grain refinement. Finally, a General Regression Neural Network (GRNN) based tool has been developed for estimating the tensile strength, impact strength, flexural strength, and hardness for given input parameters [22-29].