# uncertainty analysis (UA)

## Top PDF uncertainty analysis (UA)

### Uncertainty analysis for a wave energy converter: the Monte Carlo method

Abstract— Developing wave energy converter technology requires physical scale-model experiments. To use and compare such experimental data reliably, its quality must be quantified through an uncertainty analysis. To avoid uncertainty analysis problems for wave energy converter models, such as providing partial derivatives for time-varying quantities within numerous data reduction equations, we explored the use of a practical alternative: the Monte Carlo method (MCM). We first set out the principles of uncertainty analysis and the MCM, then present our application of the MCM for propagating uncertainties in a generic Oscillating Water Column wave energy converter experiment. Our results show the MCM is a straightforward and accurate method to propagate uncertainties in the experiment, thus quantifying the quality of experimental data in terms of power performance. The key conclusion of this work is that, given the demonstrated relative ease of performing uncertainty analysis using the MCM, experimental results reported in the future wave energy converter modelling literature should be accompanied by the uncertainty in those results. More broadly, this study aims to raise awareness among the wave energy community of the importance of quantifying the quality of modelling data through an uncertainty analysis. We therefore recommend that future guidelines and specifications pertinent to uncertainty analysis for wave energy converters, such as those developed by the International Towing Tank Conference (ITTC) and the International Electrotechnical Commission (IEC), incorporate the MCM with a practical example.
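The propagation step described in the abstract can be sketched in a few lines. The data reduction equation and every number below are hypothetical stand-ins (a pneumatic power P = Δp·Q with assumed means and standard uncertainties), not values from the paper:

```python
import random
import statistics

def monte_carlo_power(n_samples=100_000, seed=0):
    """Propagate measurement uncertainties through a data reduction
    equation with the Monte Carlo method (MCM).

    Illustrative stand-in for an OWC experiment: pneumatic power
    P = dp * Q, with hypothetical measured values and standard
    uncertainties (all numbers are assumptions, not from the paper).
    """
    rng = random.Random(seed)
    dp_mean, dp_u = 500.0, 10.0  # chamber pressure difference [Pa], standard uncertainty
    q_mean, q_u = 0.20, 0.005    # volumetric air flow [m^3/s], standard uncertainty

    samples = []
    for _ in range(n_samples):
        dp = rng.gauss(dp_mean, dp_u)  # draw one realisation of each input
        q = rng.gauss(q_mean, q_u)
        samples.append(dp * q)         # evaluate the data reduction equation

    mean = statistics.fmean(samples)
    u = statistics.stdev(samples)      # standard uncertainty of the result
    samples.sort()
    ci = (samples[int(0.025 * n_samples)],
          samples[int(0.975 * n_samples)])  # 95 % coverage interval
    return mean, u, ci
```

This sidesteps exactly the difficulty the abstract names: no partial derivatives of the data reduction equation are ever formed, since the equation is simply re-evaluated per sample.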

### Uncertainty analysis of dynamic thermal rating based on environmental parameter estimation

Dynamic thermal rating (DTR) of transmission lines depends on wind speed, wind direction, ambient temperature, and other environmental parameters. The obtained environmental parameters differ from their true values, so purely deterministic values for the parameters, and hence for the DTR, are not accurate enough. This paper therefore studies the uncertainty of the environmental parameters using the Monte Carlo Method (MCM). Based on the heat balance equation of transmission lines, uncertainty analysis of transmission line ampacity is carried out according to the CIGRE standard. The best estimate, standard uncertainty, and confidence interval are obtained for a given confidence level of the environmental parameters. The experimental results show that DTR can substantially improve the transmission capacity of transmission lines, and that the MCM is an effective method to assess the uncertainty of DTR.
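A minimal sketch of the idea: solve a steady-state heat balance for the ampacity, then propagate the environmental-parameter uncertainty by resampling. The cooling/heating terms and every coefficient below are deliberately simplified placeholders, not the actual CIGRE formulation or its constants:

```python
import random
import statistics

def ampacity(v, ta, s, tc=80.0, r=8.7e-5):
    """Steady-state rating from a simplified heat balance:
    convective + radiative cooling = solar gain + I^2 * R,
    solved for the current I. Coefficients are illustrative
    placeholders, not the CIGRE values.
    """
    p_conv = 20.0 * (v ** 0.5) * (tc - ta)  # simplified forced-convection term [W/m]
    p_rad = 0.05 * ((tc + 273.0) ** 4 - (ta + 273.0) ** 4) * 1e-8  # simplified radiation [W/m]
    p_sun = 10.0 * s                        # simplified solar heating [W/m]
    net = max(p_conv + p_rad - p_sun, 1.0)  # guard against unphysical draws
    return (net / r) ** 0.5

def dtr_uncertainty(n=50_000, seed=1):
    """Propagate measurement uncertainty in wind speed, ambient
    temperature and solar intensity through the rating (MCM)."""
    rng = random.Random(seed)
    ratings = []
    for _ in range(n):
        v = max(0.5, rng.gauss(2.0, 0.5))    # wind speed [m/s], assumed uncertainty
        ta = rng.gauss(25.0, 1.0)            # ambient temperature [degC]
        s = max(0.0, rng.gauss(30.0, 5.0))   # solar intensity proxy
        ratings.append(ampacity(v, ta, s))
    ratings.sort()
    best = statistics.fmean(ratings)          # best estimate
    u = statistics.stdev(ratings)             # standard uncertainty
    ci95 = (ratings[int(0.025 * n)], ratings[int(0.975 * n)])
    return best, u, ci95
```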

### Measurement Error And Uncertainty Analysis By Excel

In estimating the overall uncertainty, it may be necessary to take each source of uncertainty and treat it separately to obtain the contribution from that source. Each of the separate contributions to uncertainty is referred to as an uncertainty component. When expressed as a standard deviation, an uncertainty component is known as a standard uncertainty. If there is correlation between any components, this has to be taken into account by determining the covariance. However, it is often possible to evaluate the combined effect of several components. This may reduce the overall effort involved and, where components whose contribution is evaluated together are correlated, there may be no additional need to take account of the correlation.
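The combination rule described above is the GUM law of propagation of uncertainty; a minimal sketch, with the covariance term included for correlated component pairs:

```python
import math

def combined_standard_uncertainty(sensitivities, std_uncertainties, covariances=None):
    """GUM-style combination of uncertainty components:
    u_c^2 = sum_i (c_i * u_i)^2 + 2 * sum_{i<j} c_i * c_j * u(x_i, x_j).

    sensitivities:     partial derivatives c_i of the result w.r.t. each input
    std_uncertainties: standard uncertainty u_i of each input
    covariances:       optional {(i, j): u(x_i, x_j)} for correlated pairs (i < j)
    """
    var = sum((c * u) ** 2 for c, u in zip(sensitivities, std_uncertainties))
    for (i, j), cov in (covariances or {}).items():
        var += 2.0 * sensitivities[i] * sensitivities[j] * cov  # correlation term
    return math.sqrt(var)
```

For two components with u = 0.3 and 0.4 and unit sensitivities, the uncorrelated result is 0.5; adding the covariance 0.12 of a fully correlated pair raises it to 0.7, i.e. the simple linear sum.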

### Bayesian uncertainty assessment of flood predictions in ungauged urban basins for conceptual rainfall runoff models

In the context of urban planning and flood prediction, a reliable measure of uncertainty in predicted runoff is of vital interest. It is current practice to map prediction uncertainties entirely to parameter uncertainties and propagate them through the model (Wagener and Gupta, 2005; Ajami et al., 2007; Vrugt et al., 2008a). A popular example for this approach is Generalized Likelihood Uncertainty Estimation (GLUE) (Beven and Freer, 2001). However, as shown by Ajami et al. (2007), ignoring either input forcing error or model structural uncertainty may lead to unrealistic model simulations and associated uncertainty bounds that do not consistently capture and represent the real-world behaviour of the watershed. It has been demonstrated that Bayesian statistics is conceptually more satisfying than other approaches of uncertainty analysis (Mantovan and Todini, 2006; Yang et al., 2008). One advantage of formal Bayesian approaches is the possibility to disentangle the effect of forcing, parameter and model structural error on total predictive uncertainty, which cannot be achieved with GLUE (Vrugt et al., 2008a).

### An integrated uncertainty and ensemble based data assimilation approach for improved operational streamflow predictions

Over the past several decades, a variety of uncertainty analysis methods and data assimilation techniques have been developed and reported in the hydrologic literature. Some of the uncertainty analysis methods, among many others, include the generalized likelihood uncertainty estimation (GLUE) (Beven and Binley, 1992), the Bayesian total error analysis (BATEA) (Kavetski et al., 2002), the Integrated Bayesian Uncertainty Estimator (IBUNE) (Ajami et al., 2007), the Framework for Understanding Structural Errors (FUSE) (Clark et al., 2008b; Clark and Kavetski, 2010), and Markov Chain Monte Carlo (MCMC) methods including the Random Walk Metropolis algorithm (Kuczera and Parent, 1998), the Shuffled Complex Evolution Metropolis (SCEM-UA) algorithm (Vrugt et al., 2003), and the Differential Evolution Adaptive Metropolis (DREAM) algorithm (Vrugt et al., 2008). Most of the above methods have not been commonly applied in practice, particularly for the NWS operational models SNOW17 and SAC-SMA. This is partly due to their high computational requirements, since efficiency is extremely important in hydrologic forecasting operations (Weerts and El Serafy, 2006). Most recently, He (2010) proposed an Integrated Sensitivity and UnceRtainty analysis Framework (ISURF). The ISURF first applies the Generalized Sensitivity Analysis (GSA) (Hornberger and Spear, 1981) method as a screening tool to identify sensitive model parameters. Subsequently, ISURF applies the DREAM algorithm (Vrugt et al., 2008) to explicitly quantify the uncertainty structure of these identified parameters. This step-wise method significantly reduces the computational load and has been demonstrated to efficiently and adequately identify the uncertainty structure of the SNOW17 and SAC-SMA parameters at a set of study sites with contrasting hydroclimatic conditions (He, 2010; He et al., 2011b).

### Elimination of visceral leishmaniasis in the Indian subcontinent : a comparison of predictions from three transmission models

The fitted IRS efficacy for models E0 and E1 led to a sandfly birth rate reduction between 37% in Gopalganj and 72% in Saharsa, which aligns with the reported reduction in sandfly density of 72% after IRS with DDT (Joshi et al., 2009). For model W, the IRS efficacy corresponds to an annual reduction in sandfly density of 6.7% in Gopalganj and 10.7% in Saharsa. This is apparently at odds with estimates for reductions in sandfly densities from field studies (Joshi et al., 2009), but could reflect poor implementation of IRS and increasing sandfly resistance to DDT (Coleman et al., 2015). However, given the correlation of the IRS efficacy factor with other parameters (e.g. the average duration of asymptomatic infection), it is also possible that the IRS efficacy factor has been underestimated due to inaccuracies in the estimates for correlated parameters. The issue of parameter identifiability is not restricted to the IRS efficacy, as the parameter uncertainty analysis for model W in SF1 shows. The average sandfly-to-human ratio and the infectivity of asymptomatic individuals are strongly negatively correlated, and the average duration of asymptomatic infection and the amplitude of the seasonal forcing of the sandfly birth rate are positively correlated for most districts. This means these parameters cannot be uniquely identified from the CARE data, which may account for some of the differences between the models' predictions. Hence, data from high quality epidemiological and entomological studies are needed alongside the CARE data to tease apart the correlation between these parameters. Using more detailed data on disease progression (e.g. longitudinal serological data for humans and infection prevalence data for sandflies) would then also improve the accuracy of model predictions (as demonstrated by models E0 and E1).

### Point Counterpoint: Reflex Cultures Reduce Laboratory Workload and Improve Antimicrobial Stewardship in Patients Suspected of Having Urinary Tract Infections

A retrospective study evaluating urine testing and antibiotic prescribing practice for 676 patients who were ≥12 years old and had a positive urine culture reported that 60% of urine tests were ordered without indication. One hundred eighty-four of 676 (27%) patients had ASB, and 37/184 (20%) were treated with antibiotics. Importantly, of the patients with ASB that were treated with antibiotics, 89% were given antibiotics based on positive UA results (8). The injudicious use of antibiotics in this setting was further evident in a prospective study of 343 adult women seen in the emergency department. That study reported overtreatment of 47% of patients when UA was positive for LE, nitrite, or trace blood but urine culture was negative. Using these UA criteria, 13% of patients with true signs and symptoms of UTIs (and positive urine culture) would not have been treated due to negative UA results (9). This study also demonstrated that UA performance characteristics are highly dependent on the cutoffs that are adopted; had a more stringent UA cutoff been used (e.g., an LE of >2 and positivity for nitrite), the overtreatment rate would have decreased to 13%, but the undertreatment rate would have escalated to 48%.

### Monologism of Hofstede’s Static Model vs Dialogism of Fang’s Dynamic Model: Contradictory Value Configuration of Cultures through the Case Study of Farsi Proverbs

Among various cultural models, the dichotomy of static versus dynamic models has provided a fertile ground for research. Although a number of static models are suggested, the dominant trend in almost all static models is provided by Hofstede who focuses on cultural differences along four major dimensions (power distance, individualism, uncertainty avoidance, and masculinity) and reduces “the complex phenomenon of culture in simple and measurable terms” (Fang, 2010, p. 156). The main concern is whether static bipolar models can cope with the requirements of the globalized era when cross-cultural communication “in an increasingly borderless and wireless workplace, marketplace, and cyberspace” (Fang, 2012, p. 2) is needed. Studying Fang’s dynamic cultural model versus Hofstede’s static cultural dimensions theory, the present paper, through the case study of Iranian culture, hypothesizes that dynamic models, such as Fang’s (2005, 2012), which recognize the paradoxical essence of cultures, emphasize all-dimensional cultural nearness. In Fang’s model, cultures are dialogic and open for cross-cultural interaction rather than monologic and segregated.

### A New Information Diffusion Model in Weibo: UA SCIR

Through several experiments, we analyzed the evolution of the various node densities in the UA-SCIR model. As shown in Figure 2, the susceptible nodes showed a monotonically decreasing trend because they became contacted nodes with probability α and were then transferred into infected or refractory nodes. Infected nodes did not become refractory, so the density of infected nodes rose monotonically. Because the model includes a mechanism for refractory nodes to repost, the refractory node density decreased slightly in the final stage.
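The qualitative behaviour described above can be reproduced with a mean-field density iteration. This is only a sketch of an SCIR-style process with a repost mechanism; the transition structure and all rates below are hypothetical, not the paper's UA-SCIR specification:

```python
def simulate_scir(steps=500, dt=0.1, alpha=0.8, beta=0.5, gamma=0.3, delta=0.02):
    """Mean-field density sketch of an SCIR-style diffusion process:
    S -> C on contact with I; C -> I (repost) or C -> R (ignore);
    R -> I (refractory node reposts again). All rates are assumptions.
    Returns the trajectory of (s, c, i, r) densities.
    """
    s, c, i, r = 0.99, 0.0, 0.01, 0.0
    history = [(s, c, i, r)]
    for _ in range(steps):
        new_contacts = alpha * i * s * dt   # susceptible nodes that saw the message
        to_infected = beta * c * dt         # contacted nodes that repost
        to_refractory = gamma * c * dt      # contacted nodes that ignore
        reposts = delta * r * dt            # refractory nodes reposting again
        s -= new_contacts
        c += new_contacts - to_infected - to_refractory
        i += to_infected + reposts
        r += to_refractory - reposts
        history.append((s, c, i, r))
    return history
```

With these transitions the susceptible density decreases monotonically, the infected density only ever grows (nothing leaves I), and the repost term δ·r lets the refractory density dip late in the run, matching the trends reported for Figure 2.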

### Scalable power system communications emulation with OPC UA

OPC UA is a server–client communication protocol, so the communication strategy consists of two separate parts: the server delivers values, while the client retrieves messages. Our approach therefore combines the server part and the client part in each node. Any node's client connects to the server value register and subscribes to the desired value, thus receiving updates immediately. Any message to be sent is written into the value register and updated on change, thus delivering the message to the subscribed clients. This approach is robust to Denial-of-Service (DoS) attacks, since the submitted value is persistent at the server and can be retrieved at any later time, as opposed to one-time messages. This type of server–client architecture is commonly established in the Internet of Things (IoT) domain, where weak and/or sporadic connectivity is a common property.
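The value-register pattern described above can be sketched in plain Python. This is not an actual OPC UA stack or API, just the behaviour the paragraph relies on: the last written value is persistent, subscribers are pushed updates on change, and a late joiner can still read the current value:

```python
class ValueRegister:
    """Minimal sketch of the subscribe/write-on-change value register
    (plain Python, not an OPC UA implementation)."""

    def __init__(self):
        self._values = {}        # node id -> last written value (persistent)
        self._subscribers = {}   # node id -> list of callbacks

    def subscribe(self, node_id, callback):
        """Register a callback; a late subscriber immediately receives
        the current value instead of missing a one-time message."""
        self._subscribers.setdefault(node_id, []).append(callback)
        if node_id in self._values:
            callback(self._values[node_id])

    def write(self, node_id, value):
        """Store the value and notify subscribers, but only on change."""
        if self._values.get(node_id) != value:
            self._values[node_id] = value
            for cb in self._subscribers.get(node_id, []):
                cb(value)

    def read(self, node_id):
        """The value stays retrievable at any later time."""
        return self._values.get(node_id)
```

The persistence is what distinguishes this from one-shot messaging: a client that was disconnected during the write still obtains the value via `read` or at subscription time.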

### Studying the effects of non-uniform stress distributions on soil heave

The relatively large consolidation observed after the point of maximum swell could be caused by a number of effects. It is possible that the reduction in the dry density of the clay, coupled with saturation possibly past the liquid limit, made the sample fail in bearing capacity; the heaving observed does resemble heaving produced by general shear failure under bearing. Another possibility is that a vertical stress higher than the lateral confining stress allowed higher rates of lateral swell compared to vertical swell. After primary swell is complete, the pressure from secondary swell is not enough to maintain the load, because the lateral pressures, consisting almost entirely of swell pressure from the surrounding soil, are low. Confirming this would require a more complex analysis of the changes in stress applied by soil swell, both under and outside the loaded area, during the swelling process.

### The Role of Uncertainty in Moment-to-Moment Player Motivation : A Grounded Theory

theories vary. Thomas Malaby [50] for instance draws on sociological and anthropological thought on contingency to argue that games are engaging because their "contrived contingency" allows us to engage with the basic indeterminacy of human existence. Mark Johnson [33] meanwhile deploys Deleuze to tease apart different kinds of unpredictability in games of chance. But authors concur that some perceived lack of certain knowledge about what is the case, what to do, or what will happen at a future moment is core to the motivational pull of gameplay. Drawing on many of these sources and his own practical experience, game designer Greg Costikyan [16] developed an influential categorization of eleven sources of Uncertainty in Games, including e.g. stochastic randomness as in a Roulette game, hidden information (like the hidden cards of an opponent in Poker), or player unpredictability - not knowing how the opponent will act next. Building on this descriptive categorization of uncertainty as a game feature, Power and colleagues [59] have attempted to measure and differentiate uncertainty as a player experience. Their Player Uncertainty in Games Scale (PUGS) distinguishes five factors: uncertainty in decision-making, uncertainty in taking action, uncertainty in problem-solving, exploration behaviour to reduce uncertainty, and external uncertainty, capturing random(ized) outcomes.

### A Silver Hexacyanoferrate-Graphene Modified Glassy Carbon Electrode in Electrochemical Sensing of Uric Acid and Dopamine

The oxidation peak of UA appears at 0.48 V on the cyclic voltammogram; no cathodic peak was observed on the reverse scan within the investigated potential range, indicating totally irreversible electron-transfer kinetics for the UA oxidation. A potential of 0.5 V could therefore ensure sufficiently fast UA oxidation for the chronoamperometric measurement. As mentioned above, the working electrode potential was set at 0.3 V and 0.5 V for the detection of DA and UA, respectively, in the chronoamperometric measurements.

### Joint Propagation of Ontological and Epistemic Uncertainty across Risk Assessment and Fuzzy Time Series Models

A two-dimensional Monte Carlo Analysis (2D MCA) is a term used to describe a model that simulates both uncertainty and randomness in one or more input variables. Uncertainty in the parameter estimates can be represented in a PRA model as follows. Consider a random input variable whose parameter estimates are affected by uncertainty. Assume normal PDFs can be specified for both uncertain parameters: the mean and the standard deviation. Uncertainty in the mean is described by the normal PDF with parameters (μ mean =5, σ mean =0.5); similarly, uncertainty in the standard
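A nested-loop sketch of the 2D MCA just described: the outer loop samples the uncertain parameters of the input PDF (epistemic uncertainty), the inner loop samples the input itself (variability). The hyper-parameters for the mean follow the text (μ_mean = 5, σ_mean = 0.5); those for the standard deviation are assumed here, since the excerpt is truncated:

```python
import random
import statistics

def two_dimensional_mc(n_outer=200, n_inner=1000, seed=2):
    """Two-dimensional Monte Carlo sketch.

    Outer loop: draw the uncertain parameters (mean, std dev) of the
    input's normal PDF. Inner loop: draw the input itself from that PDF.
    Returns the average inner-loop mean and its spread across the outer
    loop, which reflects the parameter uncertainty.
    """
    rng = random.Random(seed)
    inner_means = []
    for _ in range(n_outer):
        mu = rng.gauss(5.0, 0.5)                # uncertain mean (from the text)
        sigma = max(0.01, rng.gauss(1.0, 0.1))  # uncertain std dev (assumed values)
        draws = [rng.gauss(mu, sigma) for _ in range(n_inner)]
        inner_means.append(statistics.fmean(draws))
    return statistics.fmean(inner_means), statistics.stdev(inner_means)
```

The spread of the inner-loop means is dominated by the uncertainty in μ (≈0.5 here), which is the quantity a one-dimensional simulation would fail to separate from the variability.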

### Canadian snow and sea ice: assessment of snow, sea ice, and related climate processes in Canada's Earth system model and climate-prediction system

ization with the greatest average pattern correlation across the three fields (land-surface temperature, precipitation, and SWE) and the two seasons of JFM and AMJ. The spatial pattern correlation coefficient of each field with its observational counterpart is labelled. Plotted in Figs. 5c and 6c is the worst all-round match realization, which is the single realization with the least (most negative) pattern correlation. The best match realization exhibits tradeoffs across fields, for example in the ability to represent the structured pattern of springtime precipitation change (r = 0.38 for the middle panel of Fig. 6b) versus wintertime land-surface temperature change (r = 0.01 for the left panel of Fig. 5b). The worst match case exhibits a similar range of correlations, on the negative side, and generally looks quite different from the best match case. This preliminary analysis of intra-ensemble variability suggests limits on how much regional-scale information about changes for snow cover and related climate variables can be extracted from ESMs. The key point is that caution is needed in judging a model on its ability to reproduce spatial patterns of trends in SWE and related climate parameters, even on these multidecadal timescales.
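The ranking step uses the spatial pattern correlation, i.e. the centred Pearson r between a model field and its observational counterpart over all grid points. A minimal sketch (fields flattened to 1-D lists; `best_match` is a hypothetical helper name, not from the paper):

```python
import math

def pattern_correlation(field_a, field_b):
    """Centred spatial pattern correlation (Pearson r) between two
    gridded fields flattened to equal-length 1-D sequences."""
    n = len(field_a)
    ma = sum(field_a) / n
    mb = sum(field_b) / n
    cov = sum((a - ma) * (b - mb) for a, b in zip(field_a, field_b))
    va = sum((a - ma) ** 2 for a in field_a)
    vb = sum((b - mb) ** 2 for b in field_b)
    return cov / math.sqrt(va * vb)

def best_match(realizations, observed):
    """Index of the ensemble member with the greatest pattern
    correlation against the observed field."""
    return max(range(len(realizations)),
               key=lambda k: pattern_correlation(realizations[k], observed))
```

The worst all-round match is found the same way with `min`, picking the most negative correlation.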

### Amperometric Sensor Based on Carbon Nanotubes and Polycations for the Determination of Vitamin C

This paper describes the development of a simple and efficient nanostructured platform based on polycations that was used as an analytical sensor for the determination of VC. The proposed sensor is easy to prepare and inexpensive, and has a low detection limit, a long linear range, and a short analysis time. Compared with most reported methods, the present modified electrode possesses a lower detection limit and a longer linear range, and could be employed for practical applications.

### EXTRACTION OF URSOLIC ACID FROM OCIMUM SANCTUM AND SYNTHESIS OF ITS DERIVATIVES: COMPARATIVE EVALUATION OF ANTI OXIDANT ACTIVITIES

Evaluation of the Antioxidant Activity of UA and Derivatives using Various in-vitro Assays: Inhibition of Lipid Peroxidation in Liposomes: The lecithin was dried under vacuum onto the wall of a round-bottom flask. The resulting lipid film was hydrated in 50 mM Tris-HCl buffer at pH 7.4, shaken for 15 min and sonicated (samples thermostated at 0–2 °C). Before the end of sonication a fluorescent probe, 3-[p-(6-phenyl)-1,3,5-hexatrienyl]-phenylpropionic acid, was added to the liposome suspension. The assay was conducted by the procedure described earlier [15]. Trolox Equivalent Antioxidative Capacity (TEAC) for UA, UA-4, and UA-7: The assay was performed as previously reported, with slight modifications [16]. Samples (40 µg/ml) were mixed with 200 µl of ABTS+ radical cation solution in 96-well plates and the absorbance was read (at 750 nm) in a microplate reader. Trolox equivalents (in µM) were derived from the standard curve at 5 min of incubation. Determination of the IC50 value of UA, UA-4 and

### The cost of diagnostic uncertainty: a prospective economic analysis of febrile children attending an NHS emergency department

Costs incurred during the inpatient stay were obtained from NHS reference costs 2015/16. The tariff HRG PW20C (paediatric fever of unknown origin, CC score = 0) was utilised to reflect a 3-day short-stay inpatient admission. As children could be admitted for anywhere between 1 and 72 h under the reference tariff, this figure was divided by 72 and multiplied by the number of hours of inpatient admission. Patients who exceeded the 3-day limit incurred an excess bed day charge, which was applied from the fourth day until discharge [21]. Finally, indirect costs were estimated for each patient using the ‘full absorption approach’. This included the anticipated use of facilities, such as toilets, and the time of administrative staff typing up and sending discharge notes to the patient’s general practitioners. Societal costs, including parental absence from work and children’s absence from school, were not included, as the analysis was conducted from a healthcare provider perspective. Due to the short time frame of the analysis, costs were not discounted. All unit costs were in 2017 prices and are provided within Table 2.
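The pro-rating rule described above reduces to a short calculation. The tariff amounts below are placeholders, not the actual HRG PW20C figures, and charging each started excess day in full is an assumption about how the excess bed-day rule is applied:

```python
def inpatient_cost(hours, tariff_3day=1000.0, excess_bed_day=400.0):
    """Pro-rate the 3-day (72 h) short-stay tariff by hours of admission;
    stays beyond 72 h add an excess bed-day charge from the fourth day
    until discharge. Monetary values are illustrative placeholders.
    """
    if hours <= 72:
        return tariff_3day / 72.0 * hours
    excess_days = -(-(hours - 72) // 24)  # ceiling: each started day charged (assumption)
    return tariff_3day + excess_bed_day * excess_days
```

For example, with these placeholder figures a 36 h stay costs half the tariff, and a 96 h stay costs the full tariff plus one excess bed day.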

### Integrating KNX and OPC UA Information Model

Nodes are accessible by Clients using OPC UA Services (interfaces and methods). Client and server applications use OPC UA Client and Server Application Programming Interface (API) to exchange data, respectively. OPC UA Client/Server API is an internal interface that isolates the Client/Server application code from an OPC UA Communication Stack. Implementation of the OPC UA Communication Stack is not linked to any specific technology; this allows OPC UA to be mapped to future technologies as necessary, without negating the basic design.

### Estimating the uncertainty of the liquid mass flow using the orifice plate

Abstract. The article presents the estimation of the measurement uncertainty of a liquid mass flow measured with an orifice plate. The subject is important because of the widespread use of this type of flow meter, which makes not only the quantitative estimates but also their quality significant. To this end, the authors propose to use the theory of uncertainty. The article analyses the measurement uncertainty using two methods: one based on the "Guide to the expression of uncertainty in measurement" (GUM) of the International Organization for Standardization, applying the law of propagation of uncertainty, and the second using the Monte Carlo numerical method. The paper presents a comparative analysis of the results obtained with both methods.
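The two approaches compared in the paper can be sketched side by side on a deliberately simplified orifice equation (expansibility and velocity-of-approach factors folded into a single constant C; all numbers below are illustrative assumptions, not the article's data):

```python
import math
import random
import statistics

def mass_flow(rho, dp, c=0.6, area=1.0e-3):
    """Simplified orifice-plate relation q_m = C * A * sqrt(2 * rho * dp)."""
    return c * area * math.sqrt(2.0 * rho * dp)

def gum_uncertainty(rho, u_rho, dp, u_dp, c=0.6, area=1.0e-3):
    """Law of propagation of uncertainty (GUM): for a square-root
    dependence, the relative sensitivity to rho and to dp is 1/2 each."""
    q = mass_flow(rho, dp, c, area)
    rel = math.sqrt((0.5 * u_rho / rho) ** 2 + (0.5 * u_dp / dp) ** 2)
    return q * rel

def mcm_uncertainty(rho, u_rho, dp, u_dp, n=100_000, seed=3, c=0.6, area=1.0e-3):
    """Monte Carlo method: resample the inputs and take the standard
    deviation of the resulting mass-flow samples."""
    rng = random.Random(seed)
    q = [mass_flow(rng.gauss(rho, u_rho), rng.gauss(dp, u_dp), c, area)
         for _ in range(n)]
    return statistics.stdev(q)
```

For a nearly linear model like this, the two estimates agree closely, which is the kind of comparison the article reports; the MCM becomes the safer choice when the model is strongly non-linear or the input PDFs are non-Gaussian.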