Instrumental value theory in application, as proposed by Tool, has been criticized as ambiguous (Samuels 1995, 99), and as little more than value judgment based on two desirable conditions, when other equally desirable conditions might have been chosen instead (Gordon 1990). In turn, these criticisms have been refuted by Tool (1990) with the argument that the criteria of instrumental value theory are neither absolute nor eternal; they are normative objects of inquiry. Tool regarded the instrumental value principle as “a product of inquiry and subject to revision or abandonment by further inquiry. It has no standing except as a construct for inquiry and as a tool for analysis and judgment. Its relevance is repeatedly tested by its incorporation in, and guidance of, inquiry and its use as a judgmental standard in problem solving . . . It is derived exclusively from the experience continuum of people and . . . it articulates what often has historically been meant by progress, reform, or betterment” (1110). Supporting this view, Bush has noted that a particular activity or behavior may have either instrumental or ceremonial significance, or it may sometimes possess both (Bush 1987).
Existing approaches to VaR estimation may be classified into three groups. First, the non-parametric historical simulation (HS) approach computes simple empirical quantiles from the available past data. Second, fully parametric models, based on an econometric model of volatility dynamics and the assumption of conditional normality (J.P. Morgan’s RiskMetrics and most models from the GARCH family), describe the entire distribution of returns, including possible volatility dynamics. Third, the extreme value theory (EVT) approach parametrically models only the tails of the return distribution. Since VaR estimates relate only to the tails of a probability distribution, techniques from EVT may be particularly effective, and its appealing aspects have made a convincing case for its use in calculating VaR and in risk management in general.
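The non-parametric HS approach in particular reduces to taking an empirical quantile of past losses. A minimal sketch, using a synthetic return series that is purely illustrative (ten points are far too few for a real estimate):

```python
import math

# Historical-simulation VaR: the q-level VaR is an empirical quantile
# of the loss distribution built from past returns.
def historical_var(returns, q=0.99):
    """q-level VaR as the empirical quantile of losses (negated returns)."""
    losses = sorted(-r for r in returns)   # ascending losses
    idx = math.ceil(q * len(losses)) - 1   # order statistic for level q
    return losses[idx]

# Synthetic daily returns, illustrative only
rets = [0.01, -0.02, 0.003, -0.05, 0.015, -0.01, 0.02, -0.03, 0.005, -0.004]
var_90 = historical_var(rets, q=0.90)      # 90% one-day VaR
```

Because the method uses only the empirical distribution, it makes no assumption about volatility dynamics, which is both its simplicity and its weakness in volatile regimes.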
In the literature, EVT is compared with GARCH-based parametric value-at-risk estimation models. Yamai and Yoshiba (2005) find empirically that value-at-risk models do not give proper risk estimates in volatile market conditions, while EVT has better predictive performance. Kuester et al. (2005), Acerbi (2002), Inui and Kijima (2005) and Martins and Yao (2006) also show empirically that EVT is superior for risk estimation with financial time series. Using more than 30 years of daily return data on the NASDAQ Composite Index, Kuester et al. (2005) compare the out-of-sample performance of value-at-risk models and extreme value theory. They state that a hybrid method, combining a heavy-tailed GARCH filter with an extreme value theory-based approach, performs best overall. Extremes in returns are observed in time series data from hedge funds and emerging markets, where high volatility and unstable money flows occur. We note that the empirical evidence on EVT in the literature is generally based on data from hedge funds and emerging financial markets. Amin and Kat (2003) show empirically that while hedge funds combine well with stocks and bonds in the mean-variance framework, this is no longer the case when skewness is considered. Using hedge fund data, Liang and Park (2007) show empirically that EVT is able to anticipate the fat tails in returns, especially during high volatility in the negative direction. Blum
We must first indicate that Marx's surplus value theory does not explain M but only that the commodities sold, C′, have a greater value than the commodities bought by capitalists, C. According to the principles of value theory, the difference between C′ and C has to be traced to the labour expended to produce them. For Marx, at this stage at least, the capitalist character of the production process does not change the law of value. The point deserves, however, a special explanation, because the cost of production C now encompasses an element absent from the theory of value (the payment of wages), and because the difference is appropriated by non-labourers. Moreover, this point presents a problem only because the appropriation of surplus value is realized on the market (and not in a despotic way).
seventies under the banner of the ‘workers’ inquiry’. It will be argued here that such approaches are deficient where the study of the theory of value is concerned. Whilst providing valuable insights into the quotidian conditions of work in contemporary capitalism, and compelling evidence as to the veracity of the Marxist concept of exploitation, the examples presented by the Workers’ Inquiry tradition bear only a loose relation to the conceptual framework of the theory of value, with its explanation of how individual labours are rendered social by the system of commodity exchange. These examples suggest that instances of class conflict and domination provide a far more observable set of phenomena for research than do the categories of Marx’s theory of value. The theory of value and its attendant categories (such as abstract labour) are only ever at best implicit in such research, and ‘rarely is it explicitly incorporated into the conceptualisation of the problem’ (Wright 1981a: 65, emphasis added). In light of this, this paper attempts to explore how the theory of value can be conceptualised as a problem for social research to investigate. This is principally a question of what might be the appropriate object of research for an empirical study of value, one which demands what might be called a ‘social’ inquiry rather than a ‘worker’s’ one per se. We will first briefly outline the conception of value theory henceforth utilised.
This insight provides a tremendous expansion of the range of phenomena that can be explained within Marx’s value framework. Marxist value theory, sitting on its Procrustean simultaneist bed, has been obliged to ignore all real economic phenomena, where prices differ systematically and regularly from values: where the market reigns, these theorists of the market actually have nothing to say, since they discuss only the fictitious “underlying values” of Sweezy, the “long run” prices of Kurz and Salvadori or the “93 per cent accurate” vertically integrated labor coefficients of Anwar Shaikh. Marx—the supreme analyst of money, it is often forgotten—provides a means to lay bare the underlying social significance of all sums of money by expressing them as quantities of labor time. The correspondence is exact, not fuzzy. His system is a guide to a real, not a fictional capitalism.
Earned value theory, or earned value management (EVM), is a standardized method used to monitor and control projects. It introduces an objective criterion for the status of projects through measures of deviations in costs and schedule. It also permits a quick evaluation of the status of projects with regard to execution timelines, costs, and tasks. EVM can be seen as a simple, unified system that integrates multiple evaluations into a single reporting system. Through this method, it is possible to monitor and control, in a homogeneous way and using the same methodology, different projects with different timescales, different volumes, or different resource needs.
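The deviation measures mentioned above can be sketched as follows; the function name and the project figures are hypothetical, chosen only to illustrate the standard EVM quantities (cost and schedule variance, the CPI and SPI indices, and a CPI-based estimate at completion):

```python
# Minimal sketch of core EVM indicators; all figures below are illustrative.
def evm_indicators(pv: float, ev: float, ac: float, bac: float) -> dict:
    """Compute standard earned-value deviation measures.

    pv  -- planned value (budgeted cost of work scheduled)
    ev  -- earned value (budgeted cost of work performed)
    ac  -- actual cost of work performed
    bac -- budget at completion
    """
    cpi = ev / ac  # cost performance index (>1 means under budget)
    spi = ev / pv  # schedule performance index (>1 means ahead of schedule)
    return {
        "cost_variance": ev - ac,      # CV: negative => over budget
        "schedule_variance": ev - pv,  # SV: negative => behind schedule
        "CPI": cpi,
        "SPI": spi,
        "EAC": bac / cpi,              # estimate at completion (CPI-based)
    }

# Hypothetical project: 100 units of work scheduled, 80 earned, 90 spent
m = evm_indicators(pv=100.0, ev=80.0, ac=90.0, bac=500.0)
```

Because every quantity is expressed in the same monetary units, the same report format can compare projects of very different scales, which is the homogeneity the text refers to.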
Danielsson and De Vries (2000) compare the J.P. Morgan RiskMetrics VaR technique with HS and their own semi-parametric method, which uses the empirical distribution for smaller risks and extreme value theory for the largest risks. A window of 1500 days of return data is used, and they find that at low probabilities RiskMetrics underpredicts the VaR while HS overpredicts it. They conclude that their semi-parametric method is more accurate than the other two. McNeil and Frey (2000) combine the fitting of a GARCH-type model, to estimate the current volatility, with EVT, to estimate the tail of the innovation distribution of the GARCH model. They develop a two-step method (discussed in Section 2.5.3) for calculating a conditional EVT-VaR measure, which they test on the Standard and Poor’s and DAX indices. An AR(1)-GARCH(1,1) model with normal innovations is used to model the volatility, and a GPD is then fitted to the tails of the extracted standardized residuals. A moving window of 1000 days is used, and they test at the 0.95, 0.99 and 0.995 quantiles. A simulation study is conducted to determine the threshold choice for use in their two-step method; a threshold of 100 exceedances (or 10% of the window size) is found to be optimal. They find that their procedure gives better results than methods which ignore the heavy tails of the innovations or the stochastic nature of the volatility.
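The two-step logic can be sketched minimally as follows, assuming the GARCH forecasts (mu, sigma) and the GPD tail parameters (xi, beta) have already been estimated; every numeric value below is a hypothetical placeholder, not a fitted result:

```python
# Sketch of a two-step conditional EVT-VaR in the spirit of McNeil and Frey:
# step 1 fits a GPD over a threshold u to the standardized residuals of a
# GARCH model (taken as given here); step 2 rescales that residual quantile
# by the volatility forecast.

def gpd_quantile(q, u, xi, beta, n, n_u):
    """Peaks-over-threshold quantile of the standardized residuals.

    q    -- confidence level, e.g. 0.99
    u    -- threshold; n_u exceedances out of n observations
    xi   -- GPD shape (tail index); beta -- GPD scale
    """
    return u + (beta / xi) * ((n / n_u * (1.0 - q)) ** (-xi) - 1.0)

def conditional_evt_var(q, mu_next, sigma_next, u, xi, beta, n, n_u):
    """Step 2: scale the residual quantile by the GARCH volatility forecast."""
    z_q = gpd_quantile(q, u, xi, beta, n, n_u)
    return mu_next + sigma_next * z_q

# Hypothetical fitted values: window of 1000 days, threshold at the 100
# largest standardized residuals (10% of the window, as in the text above).
var_99 = conditional_evt_var(q=0.99, mu_next=0.0005, sigma_next=0.012,
                             u=1.3, xi=0.2, beta=0.6, n=1000, n_u=100)
```

The split explains the empirical finding quoted above: the GARCH filter tracks the stochastic volatility while the GPD captures the heavy tail of the innovations, so neither feature is ignored.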
The growth of the risk management industry has a long history, reaching back to the early 1970s and the increased instability of financial markets. For example, the fixed exchange rate system broke down in 1971, which led to flexible and volatile exchange rates. The Russian default in August 1998 sparked a global financial crisis that culminated in the near failure of a big hedge fund, Long-Term Capital Management. The September 11, 2001, terrorist attack in the United States destroyed the World Trade Center in New York, disrupting the financial markets for six days. In addition to the unspeakable human cost, the U.S. stock market lost $1.7 trillion in value (Jorion, 2007). These kinds of events are very difficult to predict and plan for, yet extremely destructive when they occur (Malz, 2011).
The responses to Question 4 indicate that time is an important factor in establishing the value of a gaming experience. One half of respondents expressed time value in terms of the level of difficulty or ease in basic and complex gaming duties/sequences/tasks. These respondents consistently complained about “boring tasks” or tasks that “took too long”. This indicates that time value is tied both to gamer skill and to game design. The second half of respondents commented on a very different phenomenon, what I call perceived time, where time value was high if the time spent playing seemed to pass more quickly than in the real world. Respondents expressed this situation in terms that usually degraded the slow passage of time in the real world, as well as how certain real world pursuits (particularly work) seemed to pass more slowly than others (like “standing in line at the bank”). Many likened gaming to vacation or travel time, where time seemed to pleasurably stand still. The relational quality of time value in gaming thus seems to be based both on time within the game itself and on time in the real world. This may account for the popularity of certain games (Everquest and World of Warcraft were mentioned specifically 34 times) in the respondents’ eyes; each of these games provided a high level of ‘disappearance’ of time and featured very little repetition and quotidian tasking.
In so far as the value-form of the commodity, and in particular its most developed expression, money, necessitated a form of co-operative activity which the commodity itself lacks inten…
EVT: extremes from a very large domain of stochastic processes follow one of three types: Gumbel, Fréchet/Pareto, or Weibull. Only those three types characterize the behavior…
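The three types can be written as a single family, the generalized extreme value (GEV) distribution, a standard result stated here for reference:

```latex
H_\xi(x) = \exp\!\left\{ -\left(1 + \xi x\right)^{-1/\xi} \right\},
\qquad 1 + \xi x > 0,
```

where the shape parameter $\xi$ selects the type: $\xi > 0$ gives the heavy-tailed Fréchet/Pareto case, the limit $\xi \to 0$ gives the Gumbel case $\exp(-e^{-x})$, and $\xi < 0$ gives the short-tailed Weibull case. For financial returns the Fréchet case ($\xi > 0$) is the one usually relevant.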
Marshall Sahlins’ essay in this issue also makes a strong claim about anthropology’s contribution to value theory and, rather than using “guerrilla tactics,” he engages in a full-blown attack on the way value has been theorized in modern, neoclassical economics. His strategic approach is organized along several lines of assault. First, he demonstrates how the general claim by economists that the economy should be treated as a separate system is, in fact, impossible to sustain, as the impact of so-called “exogenous factors” is a necessary part of any economic explanation. Far from being a self-regulating system, the market is only one way of objectifying the cultural-historical order. Thus, by banishing culture as “exogenous” to the market, economists also banish a conceptual apparatus which would actually be able to explain economic practices. Furthermore, Sahlins effectively demolishes the much-beloved figure of economic thinking—Homo economicus—the rational, calculating, and maximizing individual, stripped of all actual human characteristics. Sahlins ventures to claim that Homo economicus, as a matter of historical fact, is a fiction, poignantly asserting that the “self is not the sole end of an individual’s existence any more than it is the exclusive means” (2013: 168). Sahlins then gives ethnographic substance to his claim by giving examples of parents’ love for their children “as other selves of themselves” and a wealth of other ethnographic examples concerning kinship relations across the globe. His point is to show that material interest and agency, quite evidently, are inherent in relationships rather than in individuals.
After destroying economics’ blind beliefs in the market, rational choice, and the cult of the individual, Sahlins proceeds to build up an all-inclusive anthropological approach to value, which he calls the “political economy of alterity.” Central to his approach is the ethnographic observation that, cross-culturally, material goods of the highest value originate from a culturally defined outside, and in particular from transcendent cosmic realms. As the cause of this alterity of the supreme values, Sahlins points to the universal human condition of finitude: key factors of human life—health, prosperity, and death—are beyond the powers of human agency; therefore, human societies need to be linked to suprahuman powers outside themselves. The value that originates from beyond is subsequently appropriated in material goods—thus linking the creation of so-called “material value” to the cosmological imagination.
We would consider Marx of little relevance today if events had indeed passed him and his ideas by. But they have not. Rather, Marx’s account of capitalism has been, until recently, subject to an entirely undeserved obsolescence, one which the crisis appears to be rapidly reversing if the soaring sales of Capital are any evidence. Moreover, the traditional grounds on which Marxists choose to ignore this account have been theoretically disproven by scholars working with the Temporal Single System Interpretation (TSSI) of Marx’s value theory, as explained in Freeman (2010b) in the pages of this journal. Other interrelated causes of crisis to which Marx drew attention, such as the constraints which capitalist accumulation places on demand, and his refutation of Say’s law, are more often than not discounted with no
formalization of the GT. Great minds, even Hicks, tried to provide concrete boundaries and mathematical expositions, algebraic and geometric, but faced immense difficulties. Keynes’s letter on Hicks’s article ‘Mr. Keynes and the Classics’ held a mild criticism, though it had a friendly tone: “at one time I tried the equations, as you have done, with I (Income) in all of them. The objection to this is that it over-emphasizes current income. In the case of inducement to invest, expected income for the period of investment is the relevant variable.” Keynes’s criticism clearly pointed towards the IS-LM model that Hicks had developed and claimed was a true exposition of the GT. The result has been that the elementary teaching of Keynesian economics has been a victim of IS-LM and related diagrams and algebra. It is tragic that Keynes made no public protest when they began to appear. Also, as Joan Robinson put it, modern teaching has been confused by Hicks’s attempt to reduce the GT to a version of static equilibrium with the formula IS/LM: “Hicks has now repented and changed his name from J.R. to John, but it will take a long time for the effects of his teaching to wear off.” Of late, in 1973, Hicks pointed out, however, that “the General Theory […] provides a model on which the academic economists can comfortably perform their accustomed tricks. Haven’t they just? With ISLM I myself fell into the trap.” All said, the GT still awaits a fuller formalization of the conjectures pointed out by Keynes. To date, the General Theory stands as a badly written book. In his extreme hurry to bring his propositions to the public, Keynes lost sight of the fact that what was going to come out was a strong integration of monetary and value theory. But many economists of his time believed that Keynes had very little understanding of microeconomic tools, though he made significant contributions to these through his index number
There are two primary constructs defined by Victor Turner in his early publications regarding pilgrimage. The first construct is liminality, which Turner defines as an entity that is “neither here nor there; they are betwixt and between the positions assigned and arrayed by law, custom, and convention” (1969, p. 95). Individuals in this category are viewed as being between two sectors. The pilgrimage represents (literally and figuratively) the process by which the two sectors are linked. The second key construct in Turner’s theory is communitas, which is the Latin form of community. Turner selected the Latin term rather than the English term community to “distinguish this modality of social relationship from an area of common living” (p. 96). Moreover, Turner further elucidated his conceptualization of communitas by stating, “Certain fixed offices in tribal societies have many sacred attributes; indeed, every social position has some sacred characteristics. But this ‘sacred’ component is acquired by the incumbents of positions during the rites of passage, through which they changed
Different argumentative modalities are incommensurable: a term’s meaning within one modality may not be fully translatable into the terms used in another modality. As a consequence, there is simply no way to fully justify one type of constitutional assertion—say a “doctrinal” assertion—in terms of another modality; and thus a conflict between two or more modalities cannot be resolved in a practically justifiable way. This lack of a “trans-modal” algorithm has bothered many theoreticians (and perhaps limited Bobbitt’s own influence) because scholars and advocates tend to want to speak normatively and generally try to provide “right” answers. I agree with Bobbitt that no such algorithm exists, but I hope that—in drawing attention to Thomas Kuhn’s account of value judgments and theory choices—this Article has shed some new light on the processes by which we choose between incommensurable interpretive approaches in close and difficult cases. Kuhn, like Wittgenstein in language and Bobbitt in law, understood that competing scientific paradigms are ultimately incommensurable. The very terms used in one paradigm refer to a different network of concepts in another, and so the decision to adopt a new paradigm cannot be justified in terms that exist in an older one. Kuhn’s critics responded with fears that his account undermined the basic “rationality” of science: if a scientist must choose to adopt a new paradigm before she has the conceptual apparatus necessary to justify that choice, it appears the choice itself cannot arise from the application of neutral or “objective” principles. There is, in other words, no algorithmic way to claim that a paradigm choice is ultimately “right” or “wrong”—and science appears ultimately to be a subjective kind of pursuit. Kuhn pushed back, however, and argued that objective choice “criteria” do inform—if not determine—scientific theory choices.
These criteria are objective in that they are broadly shared and form something like a “shared canon” that scientists must refer to in justifying their theory choices. The lack of a universal choice algorithm means only that some element of the final decision remains personal and subjective. Thus, for Kuhn, the emphasis given a particular criterion in a particular context is open to individual value judgment, and importantly, it is incumbent on the individual scientist to explain this judgment when justifying her final theory choice.
initial chapters of Capital, the value of living labour was the money wage, a non-allocated amount of purchasing power, and the endowment of variable capital was given in money terms. In NI there is no duality between values and prices. The system is a single one. Everything is reckoned in terms of money, and attention is focused on net output rather than on gross output. Commodity values are interpreted as employment contents. The monetary productivity of labour is used as numéraire. The value of variable capital is the money value of labour-power, expressed by the share of wages in national income, and there is no need to transform values into prices. The labour theory of value is retained as an explanation of the origin of value, though not of the relative prices of commodities. Wages are valued at the production prices of the wage goods bought by workers, and the values of commodities are interpreted as shares of employment per unit of output, that is, as ‘normalized’ production prices. This makes it possible to pass from
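The NI accounting just described can be shown with a tiny numerical sketch; the figures are invented solely to illustrate how values become employment contents and the value of labour-power becomes the wage share:

```python
# New Interpretation (NI) bookkeeping on invented numbers: everything is
# reckoned in money against net output, and commodity values are read as
# employment contents per unit of money net output.

net_output = 1000.0    # money value of net output, hypothetical
living_labour = 100.0  # total hours of living labour, hypothetical
wages = 600.0          # total money wages, hypothetical

# Monetary productivity of labour as numeraire ($ of net output per hour)
melt = net_output / living_labour

# Value as employment content: hours of labour per dollar of net output
labour_per_dollar = living_labour / net_output

# Value of labour-power = share of wages in national income (no transformation)
value_of_labour_power = wages / net_output

# Re-expressing the wage bill in labour time recovers the same share
wage_in_hours = wages / melt
```

On these numbers the wage bill commands 60 of the 100 hours of living labour, i.e. the wage share 0.6, which is exactly why no separate transformation of values into prices is needed in this framework.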
The business block holds the cells that operate behind the sales environment, including supply chain, peripheral suppliers, operational and information aspects, economic contribution me…