A new approach to probability theory is presented with reference to statistics and statistical physics. At the outset, it is recognized that the "average man" of a population and the "average particle" of a gas are only objects of thought, and not real entities which exist in nature. The concept of the average (man) is generalized as a new concept of the represental (man), whose epistemological status is intermediate between those of the particular (the man) and the universal (a man). This new concept has become necessary as a result of the emergence of statistics as a new branch of human knowledge at the beginning of the nineteenth century. Probability is defined with reference to the represental. The concept of probability is the same in probability theory and in physics; but whereas in statistics the probabilities are estimated using random sequences, in statistical physics they are determined either by the laws of physics alone or by making use of the laws of probability as well. Thus in physics we deal with probability at a more basic level than in statistics. This approach is free from most of the controversies we face at present in interpreting probability theory and quantum mechanics.


Complex physical, biological, and sociotechnical systems often display phenomena that cannot be understood using the traditional tools of single disciplines. We describe work on developing and applying theoretical methods to understand phenomena of this type, using statistical physics, networks, spectral graph theory, information theory, and geometry. Financial systems, being highly stochastic, with agents in a complex environment, offer a unique arena to develop and test new ways of thinking about complexity. We develop a framework for analyzing market dynamics motivated by linear response theory, and propose a model based on agent behavior that naturally incorporates external influences. We investigate central issues such as price dynamics, the processing and incorporation of information, and how agent behavior influences stability. We find that the mean-field behavior of our model captures important aspects of return dynamics, and identify a stable-unstable regime transition depending on easily measurable model parameters. Our methods naturally connect external factors to internal market features and behaviors, and therefore address the crucial question of how system stability relates to agent behavior and external forces.


There is, however, one area of application of statistical physics methods in which researchers have rather successfully connected themselves to the mainstream of research in empirical finance: the small literature on multifractal models of asset returns. The introduction of so-called multifractal models as a new class of stochastic processes for asset returns was mainly motivated by the discovery of multi-scaling properties. Multi-scaling (often also denoted as multi-fractality itself) refers to processes or data which are characterized by different scaling laws for different moments. Generalizing the equation, these defining features can be captured by the dependence of the temporal scaling parameter on the moment under consideration.

Social dynamics concerns the transition from an initial disordered state to a configuration that displays order, at least partially. Such transitions abound in traditional statistical physics. It is worth summarizing some important concepts and tools used in that context, as they are relevant also for the study of social dynamics. We present them using a paradigmatic example of order-disorder transitions in physics, the one displayed by the Ising model for ferromagnetism. Beyond its importance as a physics model, Ising ferromagnetism can be viewed as a very simple model for opinion dynamics, with agents influenced by the state of the majority of their interacting partners.
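The Ising-style opinion dynamics sketched above can be illustrated in a few lines. This is our own minimal sketch, not taken from the reviewed work: the lattice size, the zero-temperature update rule, and the choice to leave ties unchanged are all illustrative assumptions.

```python
import random

# Agents on an L x L grid hold opinions +1/-1 and adopt the majority opinion
# of their four nearest neighbours (zero-temperature Ising-like dynamics).
def majority_step(grid, L):
    i, j = random.randrange(L), random.randrange(L)
    field = (grid[(i - 1) % L][j] + grid[(i + 1) % L][j]
             + grid[i][(j - 1) % L] + grid[i][(j + 1) % L])
    if field != 0:                      # clear majority among neighbours
        grid[i][j] = 1 if field > 0 else -1
    return grid

random.seed(0)
L = 20
grid = [[random.choice([1, -1]) for _ in range(L)] for _ in range(L)]
for _ in range(50_000):
    majority_step(grid, L)
m = abs(sum(sum(row) for row in grid)) / L**2   # order parameter |magnetization|
```

Repeated updates drive the initially disordered grid toward local consensus; |m| plays the role of the order parameter measuring how far the "opinions" have ordered.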


T. Lux has given a review [35] of stochastic models borrowed from statistical physics that use microscopic interactions between a large number of traders to deduce macroscopic regularities of the market, independent of microscopic details. We will show in this paper that the macroscopic behavior after time averaging does not depend on the particularities of each agent but on a set of general parameters, such as the interaction strength between agents and the economic environment. Along the same lines, R. Cont [36] has shown various statistical properties of asset returns, with emphasis on properties common to a wide variety of markets and instruments. This analysis invalidates many of the common statistical approaches used to study financial data sets. We therefore need a general model which does not use empirical rules in the course of the calculation. This motivates our present work.


analysis using the tools of statistical physics. It is divided into two subsections, relating to two types of swarms: monolayer and multilayer swarms. Although both types have some common elements, mono- and multi-layer swarms are prepared differently in the lab and therefore constitute different biological manifestations of swarms. Comparing the two types sheds light on the impact the biological setup has on the swarm dynamics. Section "Theoretical aspects" shifts to the physicist's point of view, in which bacteria are viewed as a statistical ensemble of particles with appropriate properties. The section briefly reviews some of the relevant theoretical results from the rapidly growing fields of collective motion and active matter. The implications of these theories for bacterial swarms are discussed. In section "An individual within the crowd", we address the interesting problem of studying what an individual cell actually does within a swarm. Additional swarming-related phenomena, including swarming bacterial species which were not discussed in previous sections, are surveyed in section "Further swarming related phenomena". We conclude with our personal view on interesting open directions for future research.


should take over some of the methods typical of the new types of thinking which have specifically developed out of the application of statistics. These manners of thinking, brought together, should appear in a statistics book of a modern type, which would naturally include, in addition to statistical testing and decision-making, the thinking of statistical physics and the thinking of quantum mechanics. The first argument for the evaluative superiority of the thinking of physics over the thinking of economics or of statistics is found in statistical physics itself. This argument is the contribution of Josiah Willard Gibbs (1839-1903), also called "the father of statistical physics", author of the book Elementary Principles in Statistical Mechanics, published at Yale University in 1902. It was Gibbs who founded statistical mechanics, or statistical physics, by radically simplifying the physicist's own manner of thinking and working, at a time when there were fewer than 1,000 physics graduates in the whole world. By introducing a geometrical representation capable of substituting for the experimental referential, subsequently called the Gibbs space, which reduced the macroscopic world to the microscopic one, the father of statistical mechanics transformed the finite world of a very large number of particles (n ≅ 10²³


Abstract: This paper explores several types of income which have not been explored so far by authors who have tackled income and wealth distribution using statistical physics. The main types of income we plan to analyze are income before redistribution (or gross income), the income of retired people (pensions), and the income of active people (mostly wages). The distributions used to analyze these income distributions are the Fermi-Dirac distribution and a polynomial distribution (the latter being present in the description of the behavior of dynamical systems in certain respects). The data we use for our analysis are from France and the UK. We find that both distributions are robust in describing these varieties of income. We consider the main finding to be the applicability of these distributions to pensions, which are not regulated entirely by market mechanisms.

Many real-world problems in machine learning, signal processing, and communications assume that an unknown vector x is measured by a matrix A, resulting in a vector y = Ax + z, where z denotes the noise; we call this a single measurement vector (SMV) problem. Sometimes, multiple dependent vectors x^(j), j ∈ {1, …, J}, are measured at the same time, forming the so-called multi-measurement vector (MMV) problem. Both SMV and MMV are linear models (LM's), and the process of estimating the underlying vector(s) x from an LM, given the matrices, noisy measurements, and knowledge of the noise statistics, is called a linear inverse problem. In some scenarios, the matrix A is stored in a single processor and this processor also records its measurements y; this is called a centralized LM. In other scenarios, multiple sites measure the same underlying unknown vector x, where each site only possesses part of the matrix A; we call this a multi-processor LM. Recently, due to an ever-increasing amount of data and ever-growing dimensions in LM's, it has become more important to study large-scale linear inverse problems. In this dissertation, we take advantage of tools from statistical physics and information theory to advance the understanding of large-scale linear inverse problems. The intuition behind the application of statistical physics to our problem is that statistical physics deals with large-scale problems, and we can make an analogy between an LM and a thermodynamic system [Tan02; GV05; Krz12a; Krz12b; MM09; BK15]. Therefore, we can apply statistical physics analysis tools as well as algorithmic tools to understanding large-scale LM's and their corresponding linear inverse problems.
As for information theory [CT06], although it was originally developed to characterize the theoretical limits of digital communication systems, it was later found to be rather useful in analyzing and understanding other inference problems. We use some of the concepts and ideas of information theory to understand the theoretical performance limits in various aspects of linear inverse problems.
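As a concrete instance of the SMV linear model y = Ax + z described above, the following sketch builds a small noisy measurement and applies one standard linear inverse estimator. The dimensions, sparsity level, and the ridge regularizer are our illustrative choices, not the dissertation's algorithms.

```python
import numpy as np

# Build a single-measurement-vector (SMV) problem: y = A x + z.
rng = np.random.default_rng(0)
n, m = 50, 200                      # n measurements of m unknowns (n < m)
A = rng.standard_normal((n, m)) / np.sqrt(n)
x = np.zeros(m)
x[rng.choice(m, size=10, replace=False)] = rng.standard_normal(10)  # sparse signal
z = 0.01 * rng.standard_normal(n)   # measurement noise
y = A @ x + z

# One possible linear inverse solver: a simple ridge-regularized estimate.
lam = 0.1
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(m), A.T @ y)
err = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
```

With fewer measurements than unknowns the ridge estimate is crude; the point is only the structure of the problem: given A, y, and the noise statistics, recover x.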


For predicting crystal structures, especially before they are synthesized, equations are needed to determine their discrete particle positions and their period vectors (cell edge vectors h = a, b, or c, forming a right-handed system). Since the particles (atoms, ions, electrons) inside crystals always obey Newton's second law or the Schrödinger equation, the only unknown is the equation for the period vectors, especially when crystals are under general external stress. It has been derived in the framework of Newtonian dynamics in recent years [1], and can be combined with quantum physics by further modeling. Here we employ a new and concise approach based on the principles of statistical physics to rigorously derive it in a new form, which is then applicable to both classical physics and quantum physics by itself. It also turns out to be the equation of state and the mechanical equilibrium condition for crystals under external stress and temperature. Later, the new form and the previously derived one will be shown to verify each other.


Learning has always been a central topic in several disciplines, such as philosophy and psychology. However, thanks to the development of new tools for the analysis of complex systems, statistical physics has begun to provide its own contributions to the theory of learning from examples [20]. Despite the fact that it has so far focused only on quite simple models, it has nonetheless proposed innovative points of view. Indeed, statistical physics has played a great role in the quantitative characterization of learning scenarios in simple learning systems, especially by pointing out the conditions under which good learning performance can be achieved. In this regard, great progress has been made by exploiting the strong analogy between learning and spin glass systems [21]. In this section we would thus like to briefly describe the approach of statistical physics to learning.


Many problems in computing and communication theory can be mapped to spin systems. For instance, error-correcting codes, in particular LDPC codes [16], and hard computational problems such as K-SAT [17] and graph coloring [18, 19], can be mapped to diluted spin systems with random p-spin interactions and local fields. In the coding example, interactions are defined by the parity-check constraints, while the local fields are induced by the codeword and received message. In the statistical physics treatment, for mathematical convenience, the message bits {0, 1} and the '⊕' operation are mapped onto spin values {+1, −1} and multiplication, using the mapping x → (−1)^x.
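The bit-to-spin mapping named above is easy to state in code; this tiny check (our own illustration) confirms that the '⊕' (XOR) operation on bits becomes ordinary multiplication on spins:

```python
# Bits {0, 1} map to spins {+1, -1} via x -> (-1)**x, turning XOR into multiplication.
def bit_to_spin(x: int) -> int:
    return (-1) ** x

def xor(a: int, b: int) -> int:
    return a ^ b

# Check on all bit pairs: spin of (a XOR b) equals product of the two spins.
for a in (0, 1):
    for b in (0, 1):
        assert bit_to_spin(xor(a, b)) == bit_to_spin(a) * bit_to_spin(b)
```

This is why parity-check constraints on bits translate into products of spin variables in the statistical physics formulation.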


all 1600 spins nearly equally and thus corresponds to the magnetization order parameter. Thus, even without any prior physical knowledge, one can extract relevant order parameters using a simple PCA-based projection. PCA is widely employed in biological physics when working with high-dimensional data. Recently, a correspondence between PCA and Renormalization Group flows across the phase transition, in the 2D Ising model [31] or in a general setting [32], has been proposed. In statistical physics, PCA has also found application in detecting phase transitions [33], e.g. in the XY model on frustrated triangular and union jack lattices [34]. It was also used to classify dislocation patterns in crystals [35, 36]. Physics has also inspired PCA-based algorithms to infer relevant features in unlabelled data [37].
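The PCA-based extraction of the magnetization order parameter described above can be mimicked on synthetic data. In this sketch (our construction, not from the cited works), each sample's spins share a common mean m, and the leading principal component comes out nearly uniform across sites:

```python
import numpy as np

# Generate spin configurations whose only systematic variation is a
# per-sample magnetization m; PCA should then "discover" magnetization.
rng = np.random.default_rng(1)
n_samples, n_spins = 500, 100
m = rng.uniform(-1, 1, size=n_samples)            # per-sample magnetization
# Each spin is +1 with probability (1 + m) / 2, so its mean is m at every site.
spins = np.where(rng.random((n_samples, n_spins)) < (1 + m[:, None]) / 2, 1, -1)

X = spins - spins.mean(axis=0)                    # center the data
_, _, Vt = np.linalg.svd(X, full_matrices=False)  # rows of Vt = principal axes
pc1 = Vt[0]
uniformity = np.abs(pc1).min() / np.abs(pc1).max()  # near 1 if weights are equal
proj = X @ pc1                                    # projection onto first PC
corr = abs(np.corrcoef(proj, m)[0, 1])            # tracks magnetization (up to sign)
```

The projection onto the first principal component then tracks the per-sample magnetization (up to sign and scale), which is the sense in which PCA extracts the order parameter without prior physical knowledge.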


Most of the modeling efforts developed in the statistical physics of complex systems [9] are relatively new to the more humanities-oriented communities. One of the key methodological aspects is that of identifying and defining the simplest (minimal) models (i.e., algorithmic procedures) which could lead to efficient communication systems. It is important to stress the need in this field for shared and general models, in order to create a common framework where different disciplines can compare their approaches and discuss the results. Moreover, the simplicity of the modeling schemes may allow for the discovery of underlying universalities, i.e., realizing that, behind the details of each single model, there could be a level where the mathematical structure is similar. This implies, in turn, the possibility of performing mappings to other known models and exploiting the background of knowledge already acquired for those models. In this respect, statistical physics brings an important added value.


, which is all that is known and considered in thermodynamics, then you would not know what microscopic state the system is in. In order for you to know the microscopic state of the system, I would have to tell you the position, momentum, etc., of every molecule in the vapour. This is illustrated in Fig. 1. Given only the information that thermodynamics provides, N, V and U, you do not know what the microscopic state is, so you have to make a guess, and this inherently involves probabilities. We have to use probabilities because we do not know the exact microscopic state. Indeed, the number of microscopic states is enormous, and the system goes from one to another very rapidly as the particles move around, as the momenta are changed by collisions, and so on. This lack of knowledge of the exact microscopic state is why this course is called statistical physics.
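The point about guessing among many compatible microstates can be made concrete with a toy system. Here (our illustration, not from the course notes) the "macroscopic" information is just how many of N two-state molecules are excited, and every compatible microstate is assigned equal probability:

```python
from itertools import product

# Macroscopic knowledge: N two-state "molecules", exactly E of them excited.
# Microscopic knowledge would be the full tuple of 0s and 1s.
N = 10
E = 4
microstates = [s for s in product((0, 1), repeat=N) if sum(s) == E]
omega = len(microstates)   # number of compatible microstates: C(10, 4) = 210
p = 1 / omega              # equal a priori probability of each microstate
```

Even for this tiny system, one macroscopic constraint leaves 210 microstates to guess among; for a real vapour with ~10²³ molecules the count is astronomically larger, which is exactly why probabilities are unavoidable.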


More recently, new trends have emerged in the analysis of macroeconomic variables using statistical physics distributions. Thus, [7] and [8] used new types of distributions, such as the Fermi-Dirac distribution and a polynomial distribution. New methods for calculating different values for the deciles of the data were also taken into account. Thus, apart from mean income and mean wealth, the upper limit on income/wealth was used as a methodology to characterize income and wealth, taking the highest value from those ranked in increasing order within a decile. To the best of our knowledge, the term was first used by the national statistical body of Finland [9].
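A sketch of the Fermi-Dirac functional form referred to above, as applied to income: the interpretation used here (fraction of the population with income above x) and the parameter values are our illustrative assumptions, not the fitted values of [7] or [8].

```python
import math

# Fermi-Dirac form: f(x) = 1 / (exp((x - mu) / T) + 1), with an income
# "chemical potential" mu and "temperature" T (both hypothetical values here).
def fermi_dirac(x: float, mu: float, T: float) -> float:
    return 1.0 / (math.exp((x - mu) / T) + 1.0)

mu, T = 30_000.0, 8_000.0
at_mu = fermi_dirac(mu, mu, T)        # exactly 0.5 at x = mu
low = fermi_dirac(0.0, mu, T)         # close to 1 for incomes far below mu
high = fermi_dirac(100_000.0, mu, T)  # close to 0 for incomes far above mu
```

The appeal of the form for income data is visible in the three probes: a plateau near 1 at low incomes, a smooth crossover around mu, and a rapid decay in the upper tail.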

Although thermal fluctuations are absent, granular systems nonetheless exhibit large fluctuations (6). This is exemplified by the exponential distribution of contact forces in jammed (solid-like) granular assemblies (7, 8) and by large, intermittent stress fluctuations in quasistatic flows (9). The ubiquitous presence of fluctuations with well-defined distributions that seem to depend on a handful of macroscopic parameters (6) suggests that statistical ensembles could prove to be an important tool for predicting the emergent properties of granular materials. There are two aspects that need to be elucidated: (a) the establishment of the ensemble and (b) how to use the ensemble to calculate and predict the emergent behavior of granular materials. After introducing the history of athermal ensembles in Section 1, we discuss the special role of constraints in the enforcement of mechanical equilibrium and how this differs from thermal ensembles. In Section 2, we review the existing experimental and numerical tests of the ensembles, and in Section 3, we describe how to use the distributions of microstates in an ensemble to calculate collective properties. These same techniques can be extended to systems with slow dynamics, and examples of this are provided in Section 4. Finally, we close with a summary of open questions in the field (Section 5).


Random Matrix Theory was previously used in Nuclear Physics to study the statistical behaviour of the energy levels of nuclear reactions [37]. According to quantum mechanics, the energy levels are given by the eigenvalues of a Hermitian operator, the Hamiltonian, which was postulated to have independent random elements. However, analysis of the eigenvalues of real data showed deviations from the spectra of fully random matrices, thus indicating non-random properties, useful for an understanding of the interactions between nuclei. This approach is nowadays applied to the study of correlations of time series of returns in the stock market, where physicists try to find the non-random properties of the matrix of correlations [38, 39, 40]. By comparing the predicted eigensystem of a random matrix with the eigensystem of the matrix of empirical stock data, we can see eigenvalues far from the predicted spectrum that carry a lot of information about the market [41, 42], the index of the market, or the clustering into industrial sectors in markets. The index of the market can be calculated as the simple mean of the prices of all the stocks that belong to the market, or as a weighted mean, where some stocks contribute more to the index, in relation to the size of the company. The industrial sectors can differ between classifications, but normally the industrial sector indicates which kind of business the company is engaged in.
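The comparison described above, between the eigenvalue spectrum of an empirical correlation matrix and the random-matrix prediction, can be sketched on synthetic data. For purely random "returns" (our construction, with illustrative dimensions) essentially all eigenvalues fall inside the Marchenko-Pastur band, so empirical eigenvalues outside it signal genuine market structure:

```python
import numpy as np

# Correlation matrix of T observations of N independent random "returns".
rng = np.random.default_rng(2)
N, T = 100, 500                      # stocks, time points
returns = rng.standard_normal((T, N))
C = np.corrcoef(returns, rowvar=False)
eigvals = np.linalg.eigvalsh(C)

# Marchenko-Pastur band for a random correlation matrix with ratio q = N/T.
q = N / T
lam_min, lam_max = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2
inside = np.mean((eigvals > lam_min - 0.1) & (eigvals < lam_max + 0.1))
```

For real stock returns, the same computation typically shows a handful of large eigenvalues well above lam_max, the biggest corresponding to the market mode and others to industrial sectors.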


In this chapter we have demonstrated that wave functions obtained from quantum simulation studies can be used to calculate very accurate vibrational spectra of model molecular systems. When realistic potentials are used, the normal-mode variational approach has significant problems due to the large anharmonicities, and local-mode methods are more useful. The approximations concerning correlations between the inter- and intramolecular degrees of freedom, which must be made to implement local-mode methods, can be significant, particularly for systems like the hydrogen-bonded molecular cluster. Quantum simulation calculations include the effects of these couplings exactly, and when the projection technique described in this chapter is used, very accurate intramolecular vibrational frequencies of complicated molecular systems may be obtained. The general approach may also be applied to study intermolecular motions, but statistical fluctuations and the specification of coordinates present significant problems in more complicated systems.


The solution of the MSM for an arbitrary multipole-multipole interaction potential has been exhibited by Blum [22]. For the hard sphere plus imbedded dipole system, the MSM has [23] been com[r]
