Model-based approaches


Model based approaches for predicting gait changes over time

Recently, much attention has been devoted to the use of human gait patterns as a biometric. Gait recognition aims to discriminate individuals by the way they walk. Approaches to gait recognition can be broadly classified as model-based or model-free. Model-based methods [2, 3] model the human body structure, while model-free methods [1, 8] generally characterise the whole motion pattern of the human body. In this paper we employ the model-free (static) method of Veres et al. [6]. However, these works evaluated only databases recorded over a short time interval.


Welfare economics and bounded rationality: The case for model based approaches

Solution 3: Use of contextual information. The ancillary conditions themselves can form part of the evidence used to select a positive model of choice (this point refers to type 1 failure, too). For example, if ancillary conditions that make the problem more complex tend to produce more violations of WARP, this is prima facie evidence that a model based on complexity may be correct. BeRa argue that this type of evidence should only guide the selection of choice situations that are included in the domain of choice. They give the example of the choices of a blind person made in choice situations where information is presented visually, which should be discarded. But the same argument that Bernheim [5] makes regarding the use of ancillary evidence to exclude suspect choice situations can be tuned to select between positive models of choice. For example, ancillary conditions that affect the salience of alternatives may help interpret the primitives in the satisficing-plus model. Or consider a documented choice fact: the tendency of employees to place a large part of their savings in the pension fund of the company for which they work (e.g. Beshears et al. [7]), thus increasing the risk to their wealth to a level that is hard to justify rationally. Here, there is strong contextual evidence that the workers' choice of pension plan expresses status-quo bias rather than 'reasoned' choice. Then, rather than discarding choice situations, one could instead use positive choice models of status-quo bias, e.g. Masatlioglu and Ok [30].


Model based approaches for recognising people by the way they walk or run

motion and the moving objects was developed. Huang combined canonical space transformation based on Canonical Analysis with eigenspace transformation for feature extraction to extract a gait signature [Huang'99]. The potential of image self-similarity [Abdelkader'01], area-based metrics [Foster'01], static body parameters [Johnson'01], velocity moments [Shutler'01] and symmetry [Hayfron-Acquah'01] has been exploited to generate gait signatures. Recently, stride and cadence have been investigated as gait parameters for recognition [Abdelkader'02], and continuous Hidden Markov Models have been applied to explore the structural and transitional characteristics of gait [Kale'02]. Gait recognition is not limited to the computer vision community: an in-air sonar-based method has been deployed to recognise walking people [Sabatini'98]. Gait can also be combined with other, more established biometrics such as face [Shakhnarovich'01] or fingerprint to further improve performance. The approaches discussed so far are mainly based on statistical measurements. A statistical approach assumes a statistical basis to describe whole-body motion for pattern classification. Such an approach may not have in-built knowledge of the gait pattern or its characteristics, and may require much training to achieve good performance.


Analysis of different model based approaches for estimating dFRC for real-time application

Methods: Four model-based methods for estimating dFRC are compared based on their performance on two separate clinical data cohorts. The methods are derived from either stress-strain theory or a single compartment lung model, and use commonly controlled or measured parameters (lung compliance, plateau airway pressure, pressure-volume (PV) data). Population constants are determined for the stress-strain approach, which is implemented using data at both single and multiple PEEP levels. Estimated values are compared to clinically measured values to assess the reliability of each method for each cohort individually and combined.
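To make the single-compartment route concrete, here is a minimal sketch of how such a dFRC estimate can be computed from PV data. The linear compliance model, the variable names, and the example values are illustrative assumptions, not the paper's exact formulation or clinical data.

```python
# Hedged sketch: estimating dFRC from a single-compartment lung model,
# V = C * (P - PEEP_ref), so the volume retained by raising PEEP is
# roughly dFRC ~ C * delta_PEEP. All names and values are illustrative.
import numpy as np

def compliance_from_pv(volumes, pressures):
    """Least-squares slope of the PV data: C = dV/dP (L/cmH2O)."""
    slope, _intercept = np.polyfit(np.asarray(pressures, float),
                                   np.asarray(volumes, float), 1)
    return slope

def estimate_dfrc(compliance, peep, peep_ref=0.0):
    """Additional volume retained at `peep` relative to `peep_ref`."""
    return compliance * (peep - peep_ref)

# Example: PV points recorded during a breath at PEEP 10 cmH2O.
vols = [0.00, 0.15, 0.31, 0.44]      # litres above FRC
prs  = [10.0, 15.0, 20.0, 24.0]      # cmH2O
C = compliance_from_pv(vols, prs)
print(f"C = {C:.3f} L/cmH2O, dFRC ~ {estimate_dfrc(C, 10.0):.2f} L")
```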


WordNet Based Features for Predicting Brain Activity associated with meanings of nouns

The improvement of this combinatory scheme can be viewed from another aspect. Concept accuracy, defined as the classification accuracy of a concept paired with each of the other 59 concepts, shows the performance of the system for each concept (Figure 4). The concept accuracies of the linear combinatory scheme are compared with the Baseline and WordNet systems, with results illustrated in Figure 4. The accuracy for some ambiguous concrete nouns like 'saw' is improved in the WordNet-based model, and this improvement is maintained by the linear combinatory model; improvements are also observed for the combinatory model more generally. The second scheme uses cross validation over the remaining 58 concepts to train the system when deciding on each pair of concepts. After training, each system (WordNet and Base) is assigned a weight according to its accuracy, and the decision on the test pair is based on a weighted combination of the systems. The results of this scheme are shown in Table 4: it improves on the baseline model by 3.4%.
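The second scheme lends itself to a short sketch: each system is weighted by its accuracy on the held-out concepts, and the test pair is decided by a weighted vote. The prediction interface and names below are assumptions for illustration, not the paper's code.

```python
# Hedged sketch of the accuracy-weighted combination scheme described
# above: each system (Base, WordNet) is weighted by its accuracy on the
# held-out training pairs, and the test pair is decided by weighted vote.

def combine(systems, train_pairs, test_pair):
    # systems: dict name -> predict(pair), returning a label
    # train_pairs: list of (pair, true_label) from the remaining concepts
    weights = {}
    for name, predict in systems.items():
        correct = sum(predict(p) == label for p, label in train_pairs)
        weights[name] = correct / len(train_pairs)   # per-system accuracy

    # Weighted vote on the test pair.
    votes = {}
    for name, predict in systems.items():
        label = predict(test_pair)
        votes[label] = votes.get(label, 0.0) + weights[name]
    return max(votes, key=votes.get)
```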


Modeling, Construction, and Validation of a Simulator for a Nuclear Process Control Test Facility

In the NPCTF, an electric heater (Heater) mimics the fission energy production in a reactor, and a chiller (Chiller) simulates the process of heat exchange in a steam generator. The water level disturbed by vapor bubbles in a steam generator is simulated by a water tank called the heat-exchanger behind tank (HX-Tank). A pressurizer tank (Pressurizer) represents the pressurizer, and a turbine device (Turbine) demonstrates the turbine in NPPs; both are simplified based on their basic functions. To support multiple research areas, auxiliary components, that is, pumps, valves, pipes, and water-storage tanks, are incorporated into the facility to ensure the entire system works effectively. Specifically, thirty actuators and twenty-five sensors, as well as interface panels for connecting industry-grade remote control systems, are built into the facility [6].


Face Recognition under Pose Variations

A regression technique can be used because of the linear transformation between the features, which is used to obtain frontal face features from non-frontal probe images for matching [1]. Past studies in face recognition show that the biggest challenge is to reliably recognize people in the presence of the image/object variations that occur naturally in daily life. In practical applications, head pose variation is extremely important; rotation, scaling and lighting variations are the major challenges in face recognition. Various algorithms for tackling the pose variation problem have been developed over the last 30 years. J. Kumar, Aditya Nigam, Surya Prakash and Phalguni Gupta classify face recognition into three general approaches: (i) model-based, (ii) appearance-based and (iii) 3D-based [1]. Shan Du and Rabab Ward classify the algorithms that tackle pose variations into the following categories: (i) invariant feature extraction based, (ii) multiview based and (iii) 3D range image based approaches [4]. R. Rajlakshmi and M. K. Jeyakumar review face recognition using different classifiers and feature extraction methods: a) face recognition using ICA, b) face recognition using pose estimation and shadow compensation, c) face recognition based on a hybridization process, d) face recognition across pose and illumination, and e) face recognition by histogram fitting and AAM (Active Appearance Model) [3]. H. Zhou and A. H. Sadka have combined Gabor features within the scope of diffusion-distance calculation. This strategy starts from Gabor filtering with three scales and six orientations, followed by the calculation of diffusion distance based on a Bayesian model. The recognition rate of the proposed algorithm falls when handling occlusions due to dramatic pose changes [5]. Z. Liu, J. Yang and C. Liu use a fusion of color, local spatial and global frequency information for face recognition. In this method, multiple features of the input image are obtained from a hybrid color space, the Gabor image representation, local binary patterns (LBP) and the discrete cosine transform (DCT) [6].
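For the Gabor filtering step referenced for [5], a hedged sketch of a three-scale, six-orientation filter bank follows; the wavelengths, kernel size and sigma are illustrative assumptions, not the cited paper's settings.

```python
# Hedged sketch: a Gabor filter bank with three scales and six
# orientations, as referenced for [5]. Parameter values are illustrative.
import numpy as np
import cv2

def gabor_bank(ksize=31, scales=(4.0, 8.0, 16.0), n_orient=6):
    kernels = []
    for lambd in scales:                      # one wavelength per scale
        for k in range(n_orient):             # six evenly spaced angles
            theta = k * np.pi / n_orient
            kern = cv2.getGaborKernel(
                (ksize, ksize), sigma=0.56 * lambd, theta=theta,
                lambd=lambd, gamma=0.5, psi=0)
            kernels.append(kern)
    return kernels                            # 3 * 6 = 18 filters

def gabor_features(gray_image):
    # Stack all 18 filter responses into one feature matrix per image.
    return np.stack([cv2.filter2D(gray_image, cv2.CV_32F, k)
                     for k in gabor_bank()]).reshape(18, -1)
```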


Analysis of Probabilistic Parsing in NLP

In contrast, statistical systems allow broad coverage, and may be better able to deal with unrestricted text, handling the task at hand more effectively. Connectionist systems exhibit flexibility by dynamically acquiring appropriate behaviour based on the given input; for example, the weights of a connectionist network can be adapted in real time to improve performance. However, such systems may have difficulty representing the structures needed to handle complex conceptual relationships, which limits their ability to handle high-level NLP. Suitable tasks: Symbolic approaches seem suited to phenomena that exhibit identifiable linguistic behaviour, and can be used to model phenomena at all the linguistic levels described in earlier sections. Statistical approaches have proven effective in modelling language phenomena based on frequent use of language as reflected in text corpora; linguistic phenomena that are not well understood or do not exhibit clear regularity are candidates for statistical approaches. Like statistical approaches, connectionist approaches can also deal with linguistic phenomena that are not well understood, and are useful for low-level NLP tasks that are usually subtasks of a larger problem. To summarize, symbolic, statistical, and connectionist approaches exhibit different characteristics, so some problems may be better tackled with one approach and other problems with another. For some specific tasks one approach may prove adequate, while other tasks can be so complex that it is not possible to choose a single best approach. In addition, as Klavans and Resnik pointed out, there is no such thing as a "purely statistical" method: every use of statistics is based upon a symbolic model, and statistics alone are not adequate for NLP. Toward this end, statistical approaches are not at odds with symbolic approaches; in fact, they are complementary. As a result, researchers have begun developing hybrid techniques that utilize the strengths of each approach in an attempt to address NLP problems more effectively and flexibly.


Hybrid Model for Privacy Protection Using Profile and Click Based Approaches

The Hybrid Model procedure begins by combining profile-based, click-based, and location-based techniques, using the client's profile, history logs, and location to improve query search results on the web. The input to the system is a set of customer-defined preferences, held as a dynamic customer profile, together with a clicked-data module. In the profile, the customer expresses his or her preferences as sensitive nodes and non-sensitive nodes. Whenever the customer issues a web query to look for information, the online profiler checks the query against the customer's profile to determine whether the query word is present in it. If the query word is present in the customer's profile, the query proceeds through the standard search procedure.
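A minimal sketch of the profiler check just described, with sensitive and non-sensitive nodes held as sets; the class, the routing labels, and the example data are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch of the online profiler: the profile holds sensitive and
# non-sensitive preference nodes, and each query word is looked up
# before the standard search runs. Names and routing are illustrative.

class Profile:
    def __init__(self, sensitive, non_sensitive):
        self.sensitive = set(sensitive)
        self.non_sensitive = set(non_sensitive)

    def route(self, query):
        words = set(query.lower().split())
        if words & self.sensitive:
            return "generalize"       # protect privacy before searching
        if words & self.non_sensitive:
            return "personalize"      # safe to use the profile
        return "standard"             # query word not in the profile

profile = Profile(sensitive={"hiv", "bankruptcy"},
                  non_sensitive={"python", "cricket"})
print(profile.route("python tutorials"))   # -> personalize
```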


Estimation of Dynamic Stochastic Frontier Model using Likelihood based Approaches

Almost all the existing panel stochastic frontier models treat technical efficiency as static. Consequently there is no mechanism by which an inefficient producer can improve its efficiency over time. The main objective of this paper is to propose a panel stochastic frontier model that allows the dynamic adjustment of persistent technical inefficiency. The model also includes transient inefficiency which is assumed to be heteroscedastic. We consider three likelihood-based approaches to estimate the model: the full maximum likelihood (FML), pairwise composite likelihood (PCL) and quasi-maximum likelihood (QML) approaches. Moreover, we provide Monte Carlo simulation results to examine and compare the finite sample performances of the three above-mentioned likelihood-based estimators. Finally, we provide an empirical application to the dynamic model.
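For concreteness, a minimal sketch in standard composed-error notation of what such a dynamic specification can look like; the AR(1) law of motion for persistent inefficiency and the exponential heteroscedasticity function are assumptions, not necessarily the paper's exact model.

```latex
% Hedged sketch of a dynamic panel stochastic frontier specification:
% noise, persistent inefficiency with dynamic (AR(1)) adjustment, and
% heteroscedastic transient inefficiency.
\begin{align*}
  y_{it}    &= \mathbf{x}_{it}'\boldsymbol{\beta} + v_{it} - \eta_{it} - u_{it},
             & v_{it}  &\sim N(0,\sigma_v^2),\\
  \eta_{it} &= \rho\,\eta_{i,t-1} + \xi_{it},
             & \xi_{it} &\sim N^{+}(0,\sigma_\eta^2),\\
  u_{it}    &\sim N^{+}\!\bigl(0,\;\sigma_u^2(\mathbf{z}_{it})\bigr),
             & \sigma_u^2(\mathbf{z}_{it}) &= \exp(\mathbf{z}_{it}'\boldsymbol{\gamma}).
\end{align*}
```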


A comparison of data-driven and model-based approaches to quantifying railway risk

That two distinct models for fatality numbers give different results will come as no surprise; the basis for comparison is, however, difficult. Using confidence intervals to estimate the uncertainty around the mean number of fatalities per year is fraught with difficulties. The main difficulty is that such confidence intervals are model-based and hence do not reflect uncertainty outside the model class. Even in this case, where an appeal is made to the Central Limit Theorem, it is not clear whether the number of observations is large enough, which is particularly worrying with such a "thick tail" as seen in Figure 2. We found no statistically significant difference between SRM version 3 and the data.
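For reference, the CLT-based interval being questioned here is the familiar one below; its validity rests on the number of annual observations being large enough for the heavy-tailed fatality counts, which is exactly what is in doubt.

```latex
% The CLT-based confidence interval under discussion: sample mean of
% annual fatality counts plus/minus a normal quantile times the
% standard error. With heavy tails, n may be too small for this to hold.
\[
  \bar{x} \;\pm\; z_{1-\alpha/2}\,\frac{s}{\sqrt{n}},
  \qquad
  \bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i,\quad
  s^2 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i-\bar{x})^2 .
\]
```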


Performance Analysis of Roundabouts under Mixed Traffic Flow Conditions

The comparison shown in Fig. 7 is between the average values of circulating flow and entry capacity estimated for all the entry approaches combined at Krida Square over a week. The curves suggest that as the circulating flow increases, the entry (approach) capacity decreases. The polynomial curves have significant R² values, although the R² value for Tanner's capacity model differs noticeably from the other two. Comparing the entry capacities estimated by the N.C.H.R.P. Report 572 capacity model and the H.C.M. capacity model at Krida Square shows that the N.C.H.R.P. Report 572 model gave an entry (approach) capacity roughly 5% higher.
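A short sketch of how such capacity–flow curves and their R² values can be produced; the data points and the polynomial degree are illustrative assumptions, not the study's measurements.

```python
# Hedged sketch: fitting a polynomial entry-capacity vs. circulating-flow
# curve and reporting its R^2, as in Fig. 7. Data are illustrative.
import numpy as np

circ_flow = np.array([200., 400., 600., 800., 1000.])   # pcu/h
capacity  = np.array([1350., 1180., 990., 830., 640.])  # pcu/h

coeffs = np.polyfit(circ_flow, capacity, deg=2)   # quadratic trend
fitted = np.polyval(coeffs, circ_flow)

ss_res = np.sum((capacity - fitted) ** 2)         # residual sum of squares
ss_tot = np.sum((capacity - capacity.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(f"R^2 = {r2:.3f}")  # closer to 1 means a tighter fit
```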


Approaches to Model Query Interactions

Abstract. A typical database workload consists of several query instances of different query types running concurrently. The execution of each query may interact with the execution of the other queries, and it is well known that such interactions can have a significant impact on database system performance. In this article we propose three new approaches to model and measure query instance and query type interactions. Our approaches require no prior assumptions about the internal aspects of the database system, making them non-intrusive and hence portable across systems. Furthermore, to demonstrate the benefit of exploiting query interactions, we have developed a novel interaction-aware query scheduler for online workloads, called the Intelligent Scheduler for Multiple-query Execution Ordering (ISO, for short). To verify the efficiency of the proposed approaches for measuring query interaction, and of ISO, an experimental evaluation was carried out using TPC-H workloads running on PostgreSQL. The results show that the proposed approach has the potential to improve the efficiency of database tuning tools.


RECENT METHODS FOR OPTIMIZATION OF PLASTIC INJECTION MOLDING PROCESS – A RETROSPECTIVE AND LITERATURE REVIEW

Ren Shie et al. (2008) analyzed the contour distortions of polypropylene (PP) composite components applied to automobile interiors. Combining a trained radial basis network (RBN) and a sequential quadratic programming (SQP) method, an optimal parameter setting for the injection molding process was determined. The specimens were prepared under different injection molding conditions by varying the melting temperatures, injection speeds and injection pressures of three computer-controlled progressive strokes. Minimizing the contour distortions was the objective of the study. Sixteen experimental runs based on a Taguchi orthogonal array were used to train the RBN, and the SQP method was applied to search for an optimal solution. The proposed algorithm yielded better performance than the design of experiments (DOE) approach. In addition, an analysis of variance (ANOVA) was conducted to identify the factors significant for the contour distortions of the specimens.
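A hedged sketch of this train-a-surrogate-then-SQP loop, using a radial basis function interpolator in place of the trained RBN; the synthetic data, bounds, and variable names are illustrative assumptions, not the study's values.

```python
# Hedged sketch of the surrogate-plus-SQP loop described above: an RBF
# model is fitted to 16 Taguchi-style runs (inputs: melt temperature,
# injection speed, injection pressure; output: contour distortion), and
# SLSQP searches the surrogate for a minimum. All values illustrative.
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

rng = np.random.default_rng(0)
X = rng.uniform([200, 20, 50], [260, 80, 120], size=(16, 3))  # 16 runs
y = ((X - [230, 50, 85]) ** 2).sum(axis=1) / 1e3              # distortion

surrogate = RBFInterpolator(X, y, kernel="thin_plate_spline")

res = minimize(
    lambda x: float(surrogate(x[None, :])),   # surrogate prediction
    x0=X.mean(axis=0),                        # start at the centre point
    method="SLSQP",
    bounds=[(200, 260), (20, 80), (50, 120)],
)
print("optimal setting:", res.x, "predicted distortion:", res.fun)
```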


Model development for supporting advanced risk based approaches in air traffic management

One of the goals of this study is to support an advanced risk-based approach. A brief look at risk analysis in other fields did not yield an appropriate method. Many risk-based approaches focus on the (possible) damage incurred by occurrences, expressed as money, lives, and so on; this is done in [13], and [14] discusses multiple such models. That focus is not applicable to the situation of this study, since in most cases there is no measurable amount of damage. This makes risk hard to quantify, and it is the reason why the statistical characteristics of occurrence data are studied rather than the quantification of risk.


Verification, Testing, and Runtime Monitoring of Automotive Exhaust Emissions

state that an engine controller is robustly clean. We sketched how this property can be verified using hyperproperty model checking, and we explained how robust cleanness can be twisted to support black-box testing derived from standardized test procedures. Finally, we discussed a runtime monitoring approach for the latest regulations on real driving emissions (RDE). We have put the last two contributions into practice, using black-box testing on a Euro 6 diesel-powered Nissan NV200 Evalia fixed to a chassis dynamometer, and runtime monitoring on an Audi A7 equipped with a 3-litre Euro 6 diesel engine driving on normal roads. The latter uses low-cost infrastructure that could potentially be deployed massively to end customers for emissions monitoring, possibly combined with crowd-sourcing. In both concrete cases we were able to collect evidence that the emission cleaning mechanisms used within these cars are not acting as they should.


Systematic analysis of the evolution of electricity and carbon markets under deep decarbonization

endogenously: each generator continuously learns to improve performance on the profit objective, using the previous trading day's profit as a benchmark for evaluating the current day's performance. There are several reasons why companies would want to maintain a utilisation target: it could be part of their long-term market-share strategy, it could reflect prior contracting, or in some cases it could reflect availability obligations promised to the regulator. As will be seen later, assumptions about utilisation are critical to price formation; if a low utilisation hurdle is selected, it provides a basis for the company to substantially withdraw capacity or, if sustained, shrink in size. It should be re-emphasised that the outputs of such a strategic model should only be considered indicative of what might be possible. How real agents will choose to co-ordinate is highly speculative; sometimes less so than models of imperfect competition would suggest, sometimes more collusively. Furthermore, even in the simple setting of a symmetric duopoly without demand elasticity, where offers are for a fixed amount of capacity, the often-cited work by Fabra et al. (2006) tells us that there will be three equilibrium solution regions: one at the competitive level for low demand, one unbounded or at a cap for high demand, and an intermediate region of indeterminate or mixed strategies. The intuition is that in the intermediate conditions, the incentive to undercut when one agent is moving prices up creates the potential for cycling behaviour. In our more complex setting, the solution regions are not amenable to such simple analysis, but we expect similarly that pure equilibria may often not exist. As such, computational learning models can at best be only indicative of the potential for co-ordination.


Kopf, Julia (2013): Model-based recursive partitioning meets item response theory: new statistical methods for the detection of differential item functioning and appropriate anchor selection. Dissertation, LMU München: Fakultät für Mathematik, Informatik und Statistik

The model-based recursive partitioning algorithm maintains the fundamental steps of the partitioning method reviewed in Section 2.2, but coherently extends them in the light of the model-based paradigm. Under this paradigm, the recursive process first estimates the basic statistical model on all available observations. The result of this step is the parameter vector estimated by optimizing the objective function, typically the (log-)likelihood. The recursion then proceeds in almost the same manner as for classification trees: instead of testing for association, parameter instability is now assessed using so-called generalized M-fluctuation tests (Zeileis, 2005; Zeileis and Hornik, 2007). If the data indicate parameter instability, the parent node is split into two daughter nodes. Relying only on the data points in the new subgroups, the algorithm again searches for parameter instability until no further significant instability is found or another stopping criterion is fulfilled. The resulting tree structure can be visualized as illustrated in the examples presented below, so that the different groups can be compared. Note, however, that statistical tests conducted after model selection – such as significance tests for group differences after recursive partitioning – may be affected by the effects described by Leeb and Pötscher (2005) and Berk et al. (2010), and should thus be based on new data.
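A compact sketch of the recursion just described. The fluctuation test and model fit are abstracted behind placeholder callables, since a full generalized M-fluctuation test is beyond a few lines; this is a structural sketch, not the algorithm's reference implementation.

```python
# Hedged sketch of model-based recursive partitioning: fit the model on
# all observations in a node, test parameter instability over the
# partitioning variables, split at the most unstable one, and recurse.
# `fit_model`, `instability_test` and `split` are placeholders; a real
# implementation uses generalized M-fluctuation tests (Zeileis, 2005).

def mob(data, fit_model, instability_test, split, alpha=0.05, min_size=20):
    node = {"model": fit_model(data)}          # step 1: estimate the model
    if len(data) < 2 * min_size:               # node too small to split
        return node
    # step 2: assess parameter instability over the partitioning variables
    p_value, variable = instability_test(data, node["model"])
    if p_value >= alpha:                       # no significant instability
        return node
    # step 3: split the parent node into two daughter nodes and recurse
    left, right = split(data, variable)
    node["split_var"] = variable
    node["children"] = [
        mob(left, fit_model, instability_test, split, alpha, min_size),
        mob(right, fit_model, instability_test, split, alpha, min_size),
    ]
    return node
```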


Real-Time Implementation of Reduced Order Compensators on a Cantilever Beam

very closely resembles the full order model. However, in Figure 7 (b) we present the simulated closed-loop impulse response (the model with system matrix A − BK), and we see that in both the full and reduced order cases the dynamics are relatively unchanged. We attempted to use the control in real time and see in Figure 7 (c) that the control has very little influence on the dynamics (perhaps they are even worse). Since it does not work in the full order case, we cannot expect it to work upon reduction. Hence, with this formulation we do not have enough (pardon the pun) control over the control. For the next attempt at LQG balancing we chose the weighting matrices so that a stabilizing control is obtained with the full order model. However, we observe that in this case the reduced order model based control dynamics are not close to those of the full order system. This suggests that we must tune the parameters with both open-loop and closed-loop responses in mind. We found sufficient parameters (for n_r = 10) to be q = 5 × 10², r = 5 × 10⁻⁴, u = 0.1.
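For readers who want to reproduce this tuning loop, a minimal sketch of computing the LQR gain K and the closed-loop matrix A − BK for scalar weights q and r; the system matrices below are toy placeholders, not the beam model.

```python
# Hedged sketch: computing the LQR gain K and the closed-loop matrix
# A - BK for scalar weights q and r, as in the tuning experiment above.
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, q, r):
    Q = q * np.eye(A.shape[0])                 # state weighting q*I
    R = r * np.eye(B.shape[1])                 # control weighting r*I
    P = solve_continuous_are(A, B, Q, R)       # Riccati solution
    K = np.linalg.solve(R, B.T @ P)            # K = R^{-1} B^T P
    return K, A - B @ K                        # gain and closed loop

# Toy second-order system standing in for the reduced beam model.
A = np.array([[0.0, 1.0], [-4.0, -0.1]])
B = np.array([[0.0], [1.0]])
K, Acl = lqr_gain(A, B, q=5e2, r=5e-4)
print("closed-loop eigenvalues:", np.linalg.eigvals(Acl))
```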


Racial Disparity in Social Spatiality: Usage of National Parks and Opera Attendance

In addition, satisfaction can be considered a fluid concept as it changes when the patient’s expectations or standards of comparison change, even though the actual health care received may remain constant (Goldstein et al., 2000). Thus, what exactly are the benefits of using patient satisfaction surveys, and why would this information be beneficial to physical therapists? Goldstein and colleagues (2000) contend that patient satisfaction surveys provide several benefits for physical therapists. First, data from such surveys can help to enhance the therapist–patient relationship, as therapists use the information to improve or modify aspects of the interaction experience that are meaningful to their patients. Also, this feedback can help physical therapists develop strategies for provision of care to facilitate the retention and recruitment of patients. Second, patient satisfaction data may be used to predict patient behavior based on the assumption that differences in levels of patient satisfaction can influence clinical outcomes to a certain degree. Third, satisfaction surveys can be used as evaluative tools to assess provider services and facilities of the structure, process, and care outcomes.

