Probability Theory

Top PDF Probability Theory:

A Study on Properties of Dempster-Shafer Theory to Probability Theory transformations

The Dempster-Shafer Theory (DST) and the Probability Theory (PT) are two theories that have been used for modeling uncertain data. In each theory, combination and marginalization rules are utilized for various applications. The main difference between these two theories is that the Dempster-Shafer theory includes probability theory as well as set theory. In other words, in the Dempster-Shafer theory, the Basic Probability Assignment (BPA) assigns masses to subsets of the frame of discernment, while in probability theory, the Probability Density Function (PDF) assigns values to the singleton members. Problems arise when we want to make a decision in DST; therefore, the BPA in DST should be transformed into a probability density function in PT.
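
For concreteness, the sketch below shows one widely used transformation of this kind, the pignistic transform BetP(x) = sum over sets A containing x of m(A)/|A|. The excerpt does not name the specific transformations studied in the paper, and the frame and BPA values here are hypothetical.

```python
# Sketch of one widely used DST-to-PT transformation, the pignistic transform.
# Illustrative only: the paper's own transformations are not named in the excerpt.
def pignistic(bpa):
    """Map a BPA on subsets (frozensets) of the frame to a probability
    distribution on the singletons: BetP(x) = sum_{A: x in A} m(A) / |A|."""
    betp = {}
    for subset, mass in bpa.items():
        for element in subset:
            betp[element] = betp.get(element, 0.0) + mass / len(subset)
    return betp

# Hypothetical BPA on the frame {a, b, c}.
bpa = {
    frozenset({"a"}): 0.4,
    frozenset({"a", "b"}): 0.3,
    frozenset({"a", "b", "c"}): 0.3,
}
print(pignistic(bpa))   # -> {'a': 0.65, 'b': 0.25, 'c': 0.1}, up to rounding
```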

ON THE RESULTS OF USING INTERACTIVE EDUCATION METHODS IN TEACHING PROBABILITY THEORY

The current research has shown that use of the presented CAS-based education methods (interactive presentations and patterns for computer modeling, and an improved set of home tasks) makes it possible to achieve higher educational outcomes in the first teaching module “Probability Theory”. The basic principles and methods of education with these techniques were discussed in (Garner, 2004; Kramarski & Hirsch, 2003). The experience of using CAS in the senior mathematics classroom, and the changes it brings to teaching methods, is explored in (Garner, 2004); advantages such as handheld CAS calculators able to perform algebraic, graphic and numeric calculations are also shown. The paper (Kramarski & Hirsch, 2003) is devoted to the didactic aspects and advantages of using CAS in teaching, especially integrating Self-Regulated Learning (SRL) with CAS. It revealed that CAS+SRL students outperformed CAS students on algebraic thinking and that CAS+SRL students regulated their learning more effectively. The present paper likewise demonstrates a positive effect of interactive CAS-based education methods on the overall process of education and its outcomes.

Unification of probability theory on time scales

The theory of time scales was introduced by Stefan Hilger in his PhD thesis in 1988 in order to unify continuous and discrete analysis. Probability is a discipline in which many applications of time scales appear. The time scales approach to probability theory unifies the standard discrete and continuous random variables. We give some basic random variables on time scales. We define distribution functions on time scales and show their properties.


Probability theory applied to genetic populations

in probability theory in a basic paper by Kolmogorov (19), and since then various theoretical papers have been written about them (e.g. (14), (8), (9)), but to the author's knowledge no rigorous proof of their applicability has been given in the genetic situation. This is probably because the frequency of a gene in a population is generally not a Markov variate, and because the time scale is essentially discrete, not continuous. Therefore, in chapter 1 the justification is given for the approximation procedures under certain sufficient conditions, and these conditions are wide enough to include genetic problems as special cases.
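
As a point of reference for the kind of discrete-generation gene-frequency process discussed here, the sketch below simulates the standard textbook Wright-Fisher drift model. This is only an illustration under assumed parameters, not the thesis's own setup, which the excerpt does not give.

```python
import random

# Illustrative sketch only: a gene frequency evolving over discrete,
# non-overlapping generations (Wright-Fisher drift), the kind of process
# the excerpt says is approximated.  Population size, initial frequency
# and number of generations are hypothetical.
def wright_fisher(n_individuals, p0, generations, rng):
    """Track an allele frequency across discrete generations."""
    p = p0
    trajectory = [p]
    for _ in range(generations):
        # Each of the 2N gene copies in the next generation is drawn
        # independently from the current gene pool.
        copies = sum(rng.random() < p for _ in range(2 * n_individuals))
        p = copies / (2 * n_individuals)
        trajectory.append(p)
    return trajectory

rng = random.Random(1)
print(wright_fisher(n_individuals=100, p0=0.5, generations=10, rng=rng))
```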

Book review: Paradoxes in Probability Theory, by William Eckhardt

In this strict sense of probability theory, all seven problems treated by Eckhardt fall outside of it. Take for instance the Two-Envelopes Problem, which to the untrained eye may seem to qualify as a probability problem. But it doesn't, for the following reason. Write Y and 2Y for the two amounts put in the envelopes. No probability distribution for Y is specified in the problem, whereas in order to determine whether you increase your expected reward by changing envelopes when you observe X = $100 (say), you need to know the distribution of Y, or at least the ratio P(Y = 50)/P(Y = 100). So a bit of modelling is needed. The first thing to note (as Eckhardt does) is that the problem formulation implicitly assumes that the symmetry P(X = Y) = P(X = 2Y) = 1/2 remains valid if we condition on X (i.e., looking in the envelope gives no clue on whether we have picked the larger or the smaller amount). This assumption leads to a contradiction: if P(Y = y) = q for some y > 0 and q > 0, then the assumption implies that P(Y = 2^k y) = q for all integers k, leading to an improper probability distribution whose total mass sums to ∞ (and it is easy to see that giving Y a continuous distribution doesn't help). Hence the uninformativeness assumption must be abandoned. Some of the paradoxicality can be retained in the following way. Fix r ∈ (0, 1) and give Y the following distribution:
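
The divergence step can be written out explicitly, in the excerpt's own notation:

```latex
\[
  \sum_{k=-\infty}^{\infty} P(Y = 2^{k} y)
  \;=\; \sum_{k=-\infty}^{\infty} q
  \;=\; \infty ,
  \qquad \text{whereas a proper distribution requires} \qquad
  \sum_{y'} P(Y = y') = 1 .
\]
```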

A New Approach to Probability Theory with Reference to Statistics and Statistical Physics

A new approach to probability theory is presented with reference to statistics and statistical physics. At the outset, it is recognized that the “average man” of a population and the “average particle” of a gas are only objects of thought, and not real entities which exist in nature. The concept of average (man) is generalized as a new concept of represental (man) whose epistemological status is intermediate between those of the particular (the man) and the universal (a man). This new concept has become necessary as a result of the emergence of statistics as a new branch of human knowledge at the beginning of the nineteenth century. Probability is defined with reference to the represental. The concept of probability is the same in probability theory and in physics. But whereas in statistics the probabilities are estimated using random sequences, in statistical physics they are determined either by the laws of physics alone or by making use of the laws of probability also. Thus in physics we deal with probability at a more basic level than in statistics. This approach is free from most of the controversies we face at present in interpreting probability theory and quantum mechanics.

Towards a quantum probability theory of similarity judgments

In this section we will present an alternative model for similarity judgments based on Quantum Probability theory (QP). The use of QP for modelling these types of judgments follows on from a number of recent attempts to describe various phenomena in psychology, and the social sciences more generally, using non-classical models of probability. In brief, there is some consensus that certain types of probabilistic reasoning, in situations where there is not just uncertainty but also a form of incompatibility between the available options (see e.g. Busemeyer et al. 2011), may be better modelled using QP than by classical probability theories such as Bayesian models. For examples and a more detailed justification of the use of QP in this context see e.g. Aerts and Gabora (2005), Atmanspacher et al. (2006), Busemeyer and Bruza (2011), Khrennikov (2010).
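
A minimal sketch of what "incompatibility" means in QP, assuming a toy two-dimensional state space; the questions, angles and initial state below are hypothetical and not taken from the paper.

```python
import numpy as np

# Illustrative sketch (not the authors' model): two "incompatible" yes/no
# questions are represented by projectors onto non-orthogonal rays in R^2.
# Answering in sequence corresponds to projecting in a given order, and the
# order matters, unlike in classical probability.

def projector(angle):
    """Projector onto the ray at the given angle (radians)."""
    v = np.array([np.cos(angle), np.sin(angle)])
    return np.outer(v, v)

psi = np.array([1.0, 0.0])          # initial state (hypothetical)
A = projector(np.pi / 6)            # question A (hypothetical angle)
B = projector(np.pi / 3)            # question B (hypothetical angle)

# Probability of "yes to A, then yes to B", and the reverse order.
p_A_then_B = np.linalg.norm(B @ A @ psi) ** 2
p_B_then_A = np.linalg.norm(A @ B @ psi) ** 2

print(p_A_then_B, p_B_then_A)       # differ because A and B do not commute
```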

Notes on Probability Theory

Depending on the random experiment, S may be finite, countably infinite or uncountably infinite. For a random coin toss, S = {H, T}, so |S| = 2. For our card example, |S| = 28, and consists of all possible unordered pairs of cards, e.g. (Ace of Hearts, King of Spades) etc. But note that you have some choice here: you could decide to include the order in which two cards are dealt. Your sample space would then be twice as large, and would include both (Ace of Hearts, King of Spades) and (King of Spades, Ace of Hearts). Both of these are valid sample spaces for the experiment. So you get the first hint that there is some artistry in probability theory, namely in how to choose the ‘best’ sample space.
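
A quick sketch of the two choices, assuming an 8-card deck (which matches |S| = 28 = C(8, 2) in the excerpt; the actual deck used in the notes is not shown here):

```python
from itertools import combinations, permutations

# The deck size is an assumption: 28 = C(8, 2) is consistent with drawing
# two cards from an 8-card deck, but the notes may define it differently.
deck = [f"card{i}" for i in range(8)]

unordered = list(combinations(deck, 2))   # order of dealing ignored
ordered = list(permutations(deck, 2))     # order of dealing recorded

print(len(unordered))  # 28
print(len(ordered))    # 56 -- twice as large, as the excerpt notes
```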

Probabilities. Probability of an event. From Random Variables to Events. Probability Theory I

Probability Theory I. Great Theoretical Ideas in Computer Science. Victor Adamchik, Danny Sleator.


Probability theory and application of item response theory

Since the 1970s, item response theory has become a dominant area of study for measurement specialists in the education industry. The common models and procedures for constructing tests and interpreting test scores served measurement specialists and other test users well for a long time. In this paper, a number of assumptions of classical test theory are discussed. In addition, the contemporary test theory, item response theory (IRT), is discussed in general terms. Four strong assumptions about the probability theory of item response theory are considered: the dimensionality of the latent space, local independence, the item characteristic curves and the speededness of the test. Furthermore, this discussion aims to assist departments of education in considering IRT, which contributed to the introduction of computerized adaptive testing (CAT), as they move testing programs online in the future.
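
As an illustration of an item characteristic curve, the sketch below uses the common two-parameter logistic (2PL) model; the excerpt does not say which IRT models the paper covers, and the item parameters are hypothetical.

```python
import math

# Sketch of one common IRT model, the two-parameter logistic (2PL) item
# characteristic curve.  Illustrative only; parameters are made up.
def icc_2pl(theta, a, b):
    """Probability of a correct response given ability theta,
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical item: discrimination 1.2, difficulty 0.5.
for theta in (-2, -1, 0, 1, 2):
    print(theta, round(icc_2pl(theta, a=1.2, b=0.5), 3))
```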

Adaptive Probability Theory: Human Biases as an Adaptation

We will see that the problem of probabilistic biases can be understood as an adaptation to an environment where there is uncertainty, error, deception and a need to learn. In this sense, it is possible that our brains are actually correcting probability values to values that would represent a better prediction in natural environments, but not necessarily in laboratory experiments or math classes. Special attention will be given to explaining the weighting functions that change the stated probability of an event into a transformed decision weight. Those functions are used in many theories describing human probability decisions, for example Prospect Theory (Kahneman and Tversky, 1979), Cumulative Prospect Theory, CPT (Kahneman and Tversky, 1992), as well as in models that describe paradoxes of CPT, such as transfer of exchange theory (Birnbaum and Chavez, 1997), or gains decomposition theory, by Luce (2000) and Marley and Luce (2001). All these models can be described as a general class of configural weight models, where the weight of a possibility can also depend on the other possibilities available. A good, recent review of these models, comparing their predictions with many experiments, can be found in Birnbaum (2005).
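
One commonly cited weighting function is the one-parameter form from cumulative prospect theory, sketched below. The excerpt does not specify which weighting function the paper analyses, and the parameter value is a hypothetical, commonly reported estimate.

```python
# Sketch of a probability weighting function in the CPT one-parameter form:
# w(p) = p^g / (p^g + (1-p)^g)^(1/g).  Illustrative only; gamma = 0.61 is a
# commonly reported estimate, used here as an assumed value.
def weight(p, gamma=0.61):
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

for p in (0.01, 0.1, 0.5, 0.9, 0.99):
    print(p, round(weight(p), 3))   # small p overweighted, large p underweighted
```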

Lotteries and Probability Theory

This objection applies to any use to which the objective conception of probability might be put. There is also a strong objection, however, relating specifically to the use of the objective conception when talking about lotteries and decision-making. The goal, as indicated before, is to find a conception of probability according to which it is plausible to say that fair lotteries ought to play an important role in decision-making. But if probability is an objective property of physical processes, then objective equiprobability is neither a necessary nor a sufficient condition for a process to play such a role. Clearly, it is not sufficient. A process could generate outcomes with equal probability without this equality being perceived by an agent seeking to use the process to make a decision. If an agent falsely believes that tossing a particular coin will almost invariably result in heads, then if she decides to use the coin toss in making a decision, few would say that she is using a fair lottery, even if the lottery is “really” equiprobable. At a minimum, in order

Maintaining Information Privacy A Review on Probability Theory

Therefore, it is natural to treat the data mining activity as a game played by multiple users, and to apply probabilistic methods to analyze the iterations and interactions between different users. Probability provides a formal way to model situations where a group of agents have to choose optimal actions considering the mutual effects of other agents' decisions. Information is modeled using the concept of an information set, which represents a player's knowledge about the values of different variables in the game. The outcome of the game is a set of elements picked from the values of actions, payoffs, and other variables after the game is played out. A player is called rational if he acts in such a way as to maximize his payoff. A player's strategy is a rule that tells him which action to choose at each instant of the game, given his information set. A strategy profile is an ordered set consisting of one strategy for each of the players in the game. An equilibrium is a strategy profile consisting of a best strategy for each of the players in the game. The most important equilibrium concept for the majority of games is the Nash equilibrium. A strategy profile is a Nash equilibrium if no player has an incentive to deviate from his strategy, given that the other players do not deviate. Probability has been successfully applied to various fields, such as economics, political science, computer science, etc. Researchers have also employed probability to deal with the privacy issues related to data
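
The equilibrium definition above can be checked mechanically. The sketch below brute-forces pure-strategy Nash equilibria for a small, hypothetical two-player "share vs. withhold data" game; the payoffs are made up for illustration and are not taken from the review.

```python
from itertools import product

# Hypothetical payoff table: (row action, col action) -> (row payoff, col payoff)
payoffs = {
    ("share", "share"): (3, 3),
    ("share", "withhold"): (0, 5),
    ("withhold", "share"): (5, 0),
    ("withhold", "withhold"): (1, 1),
}
row_actions = col_actions = ["share", "withhold"]

def is_nash(r, c):
    """No player gains by unilaterally deviating from (r, c)."""
    row_ok = all(payoffs[(r2, c)][0] <= payoffs[(r, c)][0] for r2 in row_actions)
    col_ok = all(payoffs[(r, c2)][1] <= payoffs[(r, c)][1] for c2 in col_actions)
    return row_ok and col_ok

print([profile for profile in product(row_actions, col_actions) if is_nash(*profile)])
# -> [('withhold', 'withhold')] for this hypothetical game
```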

Some applications of the saddlepoint methods in probability theory

Consider a one-server queue with Poisson input and exponential service time, the queue discipline being "first come first served". Let the waiting time of the n-th arriving customer in the queue be W_n. Clearly, the time of the server consists of idle periods and busy periods. By a busy period is meant the time interval between the arrival of a customer who does not have to wait for service (empty queue) and the first subsequent instant when the queue is empty again. Let f_n = Pr{N = n} be the probability that a busy period consists of N = n customers. An idle period is the time interval between the end of a busy period and the commencement of the next. Let E be the event that a customer arrives to find the queue empty. Then
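
The sketch below estimates f_n by Monte Carlo for an M/M/1 queue with assumed rates; the thesis treats f_n analytically via saddlepoint methods, so this is only an illustration of the quantity being discussed.

```python
import random
from collections import Counter

# Monte Carlo sketch of the busy-period size distribution f_n = Pr{N = n}
# for an M/M/1 queue.  The rates lam (arrivals) and mu (service) are
# hypothetical; no numerical values appear in the excerpt.
def busy_period_size(lam, mu, rng):
    """Number of customers served in one busy period (starts with 1 in system)."""
    in_system, served = 1, 0
    while in_system > 0:
        # With competing exponential clocks, the next event is an arrival
        # with probability lam / (lam + mu), otherwise a departure.
        if rng.random() < lam / (lam + mu):
            in_system += 1
        else:
            in_system -= 1
            served += 1
    return served

rng = random.Random(0)
samples = Counter(busy_period_size(lam=0.5, mu=1.0, rng=rng) for _ in range(100_000))
for n in range(1, 6):
    print(n, samples[n] / 100_000)   # empirical estimate of f_n
```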

On Some Applications of the Vougiouklis Hyperstructures to Probability Theory

In (Corsini, 1994), it is proved that fuzzy sets are particular hypergroups. This fact leads us to examine properties of fuzzy partitions from the point of view of the theory of hypergroups. In particular, crisp and fuzzy partitions given by a clustering can be well represented by hypergroups. Some results on this topic and applications in Architecture are in the papers of Ferri and Maturo (1997, 1998, 1999a, 1999b, 2001a, 2001b). Applications of hyperstructures in Architecture are also in (Antampoufis et al., 2011; Maturo, Tofan, 2001). Moreover, the results on fuzzy regression by Fabrizio Maturo and Sarka Hoskova-Mayerova (2016) can be translated into results on hyperstructures.

Levy processes - from probability theory to finance and quantum groups

Quantum Lévy processes first arose in work by W. von Waldenfels on a model of the emission and absorption of light by atoms interacting with “noise”. The quantum stochastic process obtained appeared to be a noncommutative analogue of a Lévy process on the unitary group U(d), and this was made precise in terms of quantum Lévy processes when U(d) was replaced by a noncommutative ∗-bialgebra that generalizes the coefficient algebra of U(d). The theory of quantum Lévy processes has been extensively developed by M. Schürmann and U. Franz in Greifswald, Germany (see [13] or Chapter 7 of [9]). In particular, all quantum Lévy processes are equivalent to solutions of quantum stochastic differential equations driven by creation, conservation, and annihilation processes acting in a suitable Fock space.

Statistics & Probability theory Dr Nisha 20.01.15.ppsx

Classification of Quantitative Techniques: QT → Statistics → Theoretical Statistics, Descriptive Statistics, Inductive Statistics.


Quantum probability theory as a common framework for reasoning and similarity

proposed that when assessing the relative probability of statements about a hypothetical person, Linda, participants employ a process of similarity. Thus, the idea that similar or identical cognitive processes may underlie superficially disparate processes, like categorization and reasoning, is not new. What has been perhaps lacking is the development of specific models which can be applied across different areas. Our purpose is to outline our ideas regarding such a model for probabilistic reasoning and similarity. We do so in the context of recent work with cognitive models based on quantum probability (QP) theory.

Assessing schematic knowledge of introductory probability theory

knowledge becomes proceduralized, and that expertise increases with the acquisition of schemas (Chi et al., 1982). Quite involved and precise use of schematic knowledge is required to perform well in the classification task (Cooper & Sweller, 1987; Sweller, 1989). It is likely that students in this sample had not yet acquired a mature understanding across all aspects of the problem domain and that the relative level of knowledge acquisition in this sample was not as well developed as samples from other studies. The probability domain itself is known to be particularly difficult for students to master (Konold, 1995). Discovering the conditions that determine the association between classification of problem relatedness and problem solution is an area for further research.

Probability theory and application of item response theory

One of the solutions for constructing a good test that can measure ability precisely is to use item response theory (IRT) in building up the test. IRT models are mathematical functions

