Given the budget constraint of intertemporal portfolio decision problems, there must be a positive relationship between future market returns and the proportion of wealth consumed at a given moment in time. In fact, the empirical success of the proxy for the consumption/wealth ratio proposed by Lettau and Ludvigson (2001a) is impressive. At the same time, the empirical literature has shown the power of financial ratios such as the book/market ratio and the dividend yield in explaining future changes in returns. This article argues that these empirical findings about the predictability found for the log consumption/wealth and log book/market ratios should not be understood as independent phenomena. In particular, we develop an expression similar to the equation based on the intertemporal budget constraint, built on a well-known accounting principle, that shows the theoretical ability of the book/market ratio to predict returns. Moreover, we provide a theoretical expression suggesting that the book/market and consumption/wealth ratios must be positively and contemporaneously related. Along this line, we report a high contemporaneous correlation between these two ratios in both the United States and Spain.
In this paper we investigate the price effects of trading intensity. Extending the Madhavan et al. (1997) model, we split the intensity effect into liquidity and information effects. We provide a measure of market quality that is the ratio of the covariance bias to the variance bias. Analyzing about six years of tick-by-tick data, we find that the bid-ask spread in a pure limit order book market contains a risk component associated with managing the time to trade, and this component accounts for roughly 19.6% of the implied bid-ask spread. Extending our model to investigate intraday patterns, we find that the adverse selection cost exhibits a U-shaped pattern reflecting uncertainty at market openings in the Helsinki Stock Exchange (HEX) and in the New York Stock Exchange (NYSE). The results emphasize the importance of managing time in limit order book markets.
The major objective of this study is to analyze in a limit order book market the determinants of the buy intensity, the sell intensity, and the resulting buy-sell pressure. The main contribution and novel feature of this paper is to study these issues in a dynamic continuous-time framework. Modelling the buy and sell arrival process with a bivariate dynamic intensity model provides an estimate of the simultaneous buy and sell intensity at each instant of time. The major advantage of this approach is that no aggregation over time is required and the method can account for all limit order arrivals between consecutive transactions. Hence, the proposed framework enables continuous-time modelling of a trader’s decision of when to trade and on which side of the market to trade, while taking into account any changes in the limit order book and incorporating the (multivariate) dynamics of the buy and sell arrival processes. This approach is a very natural and powerful way to study order book dynamics on a completely disaggregated level and is superior to the (non-dynamic) methods for qualitative dependent variables recently used by Al-Suhaibani and Kryzanowski (2000), Griffiths, Smith, Turnbull, and White (2000), Hollifield, Miller, Sandås, and Slive (2002) or Ranaldo (2004). Moreover, the intensity framework allows the measurement of imbalances between both sides of the market at the lowest aggregation level. In this context, the difference between the estimated simultaneous buy and sell intensity yields a readily interpretable measure of the permanent buy-sell pressure in the market. Such a measure allows the characterization of market periods in which traders have a strong preference for one-sided trading.
Received 10.01.2017, received in revised form 25.02.2018, accepted 12.03.2018
The purpose of this article is to review trends concerning the presence of contemporary Russian writers on the book market in Germany. German readers’ interest in Russian literature dates back to the 19th century. However, over the years the list of books has changed significantly. In addition to the attention paid to classical literature, interest emerged in the works of immigrants, dissidents and writers banned in the Soviet Union. At present, Russian classics are still being published. Moreover, German readers are very keen on reading both writers who are successful in Russia and authors living permanently abroad, including in German-speaking countries.
Two incentives exist for both hardware and software pricing. For hardware, as in the classic razor-and-blades strategy, firms can set low hardware prices to “invest” and penetrate the market so that they can earn from subsequent software sales. Firms can also exploit consumer heterogeneity and “harvest” on the hardware by conducting intertemporal price discrimination (IPD); they can open with high prices to skim high-valuation consumers and then cut prices later to appeal to low-valuation consumers. For software, firms have incentives to “invest” in new consumers and “harvest” on existing consumers. As the consumer mix evolves over time, it is potentially beneficial for firms to dynamically price software as well. Furthermore, hardware and software pricing are linked, as the software price affects the attractiveness of hardware and the hardware price affects the number of software users. In practice, firms either conduct IPD separately for hardware and software without fully exploiting the link between them (e.g., consoles and video games) or conduct IPD only for hardware, keeping the software price stable (e.g., Amazon Kindle and e-books). The possibility of joint IPD on both hardware and software remains understudied by both researchers and practitioners.
Rosenberg et al. (1985) test the relationship between stock returns and BE/ME in the US market. For this test they used 1,400 of the largest U.S. companies from the New York Stock Exchange (NYSE), and a few from other exchanges such as the AMEX and NASDAQ, in the COMPUSTAT database over the period from January 1973 to September 1984. They found a positive relation between average stock returns and BE/ME (book value of common equity (BE) / market value of common equity (ME)): stocks with higher BE/ME earned higher returns and stocks with lower BE/ME earned lower returns, after controlling for betas in the US market. Similarly, Chan et al. (1991) examined cross-sectional differences in returns on Japanese stocks with respect to four explanatory variables: size, the book/market ratio, the earnings/price ratio, and the cash flow/price ratio. They applied alternative statistical specifications and various estimation methods to stock data from the Tokyo Stock Exchange (TSE) over the period from January 1971 to December 1988. They found a significant positive relationship between expected returns and both the book/market and cash flow/price ratios. However, after controlling for the other variables, the impact of the earnings/price ratio was insignificant.
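As a rough illustration of the sort-based methodology in these studies, the following sketch ranks stocks by BE/ME and compares the average returns of the high- and low-ratio groups. All figures are hypothetical, invented for illustration; the actual studies use large panels and control for beta.

```python
# Hypothetical illustration of a BE/ME portfolio sort: rank stocks by
# book-to-market and compare average returns of the top and bottom halves.
# All numbers below are made up for illustration only.

def be_me(book_equity, market_equity):
    """Book-to-market ratio: book value of common equity over market value."""
    return book_equity / market_equity

# (book equity, market equity, annual return) for six hypothetical stocks
stocks = [
    (50.0, 200.0, 0.08),   # low BE/ME (growth-like)
    (30.0, 150.0, 0.07),
    (40.0, 100.0, 0.10),
    (60.0, 120.0, 0.11),
    (80.0,  90.0, 0.15),   # high BE/ME (value-like)
    (70.0,  70.0, 0.16),
]

# Sort by BE/ME ascending and split into low and high groups
ranked = sorted(stocks, key=lambda s: be_me(s[0], s[1]))
half = len(ranked) // 2
low_group, high_group = ranked[:half], ranked[half:]

avg = lambda group: sum(s[2] for s in group) / len(group)
value_premium = avg(high_group) - avg(low_group)
print(round(value_premium, 4))
```

In this toy sample the high-BE/ME half outperforms the low half, which is the qualitative pattern the cited papers document.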
Holding all parameters fixed for a given firm, an increase in the cost of capital should lower its market value. Since book values are generally not affected by the cost of capital, one might conjecture that a higher cost of capital translates ceteris paribus into a lower M-to-B ratio. Yet such reasoning is likely to be misleading. For instance, for firms operating in a competitive industry, a higher cost of capital would translate into higher revenue in the future in order for this firm and others (whose cost of capital presumably also increased) to earn zero economic profits. In such settings, the question then becomes how the Conservatism Correction factor, which coincides with the M-to-B ratio for competitive firms, is affected by changes in the cost of capital. Intuitively, it makes sense that this correction factor is increasing in the cost of capital, because incumbent assets that were recorded at their effective replacement value now become more valuable. We demonstrate this analytically and obtain empirical support for our hypothesis by showing that the estimated Conservatism Correction factor has a positive association with the cost of capital.
environment, where consumers buy both ebooks and paperbacks on Amazon.com, is a good setting for studying substitution patterns. Amazon.com lists a print book and its ebook version side by side on product pages, so cannibalization is most likely to occur, as paperback buyers can easily become aware of the competing ebook offers. I combine three unique data sets. The first is an individual-level panel of consumer book purchase histories from January 2011 to December 2011 gathered by comScore. It is based on a random sample of more than 2 million Internet users in the United States. There are 2,922 households buying 9,570 book titles on 4,978 shopping trips. The data have information on the time they buy a book, the format, the price, the quantity, and household demographics such as family income and zip code. The second data set is the publicly available book characteristics I collect from the Amazon website. For each title ever purchased in any format, I collect data on the price, rating, number of comments, ranking, genre, publishing date, and other book characteristics (e.g. ISBN, publisher, author) of both paperback and ebook formats. In total, I have 15,810 pieces of title-format information. The third data set is the device adoption record gathered by comScore. It is an individual-level panel of Kindle purchases in the years 2007–2011.
Keywords: Anomalies; Size Effect; Book-to-Market Effect; Fundamentalist Approach
Abstract: This paper aims to analyze the size and book-to-market (B/M) effects in the Brazilian capital market from the fundamentalist approach perspective. Specifically, we sought to verify whether alternative measures of company size play the same role as market value, and whether alternative measures of future cash flows play the same role as B/M in the explanation of stock returns. The population included all the firms listed at B3 (Brasil, Bolsa, Balcão) from December 1995 to December 2016. The data were analyzed based on the methodology proposed by Ohlson and Bilinski (2015) and estimated by quantile regression. The results indicate the existence of the size effect in the Brazilian capital market, since the four measures used were negatively and statistically significantly related to stock returns. This fact contrasts with the perspective of the fundamentalist approach, since the size effect, in the analyzed period, behaves as a market anomaly rather than as an intrinsic relationship between the proxy for market value and stock returns. As for the book-to-market effect, the results also do not support the arguments of the fundamentalist approach, since book value did not reinforce market value and thus did not represent the expected variation in future cash flows of Brazilian shares. That is, it has not been possible to identify alternative proxies for future cash flows that form quotients with market value that explain stock returns as well as B/M.
These data levels give various depths of information about the market, but they can be complicated to monitor because of the enormous volume of U.S. financial trades per day, as seen in Figure 5. Trade data itself represents only about 1 percent of total order flow data, which contains new orders, cancel orders, modify orders, and trade data. The message traffic of order flow data is about 100 times larger than trade data alone (CME Group, 2013). Because of the large quantities of data, users — market participants, exchanges, and regulators — typically use only a subset of the total data available to them. For example, regulators generally use a combination of Level 2 and Level 3 data, with the addition of some private account information, when investigating market events. In addition, much of this type of data is not stored, because higher-level data is extensive and complex, requiring tremendous amounts of computational processing capacity, and such data collection and storage is costly. Consequently, large quantities of useful information that might prove helpful to regulators’ study of unexpected or destabilizing market events, such as flash crashes, are not available.
ii) using algorithms (“black boxes”) to submit Orders, quotes or trade reports to the trading system. The Exchange recognises that to mitigate risk, Member Firms using algorithms may wish to check those trading strategies by submitting trial Orders or quotes. In these circumstances the Exchange will not generally consider the Orders or quotes submitted to the trading system as prohibited testing under this rule. Member Firms using algorithms are however reminded of their obligations under rules 1.1.9 (adequate systems and controls) and 1.3.1 (misleading acts, conduct and prohibited practices) to maintain the integrity of the market.
The liquidity resilience notion introduced in [Panayi et al., 2014] is a member of the latter category, and was the first to define resilience explicitly in terms of any of the possible liquidity measures and of a liquidity threshold against which resilience is measured. Hence, the concept of resilience of the liquidity measure was converted into a notion of relative resilience for a given operating liquidity threshold that a user may specify. This was important since, as discussed in Lipton et al., there are several different market participants in modern electronic exchanges, and their liquidity demands and requirements differ depending on their mode of operation. In particular, this means that they would likely care about relative liquidity resilience characteristics at different liquidity thresholds, which may also depend on the type of liquidity measure being considered. All such characteristics are easily accommodated by the framework developed in Panayi et al., where the central concept is that liquidity is considered ‘replenished’ by the market or market maker when a (user-specified) liquidity measure returns to a (again, user-specified) threshold. In a financial market where liquidity is resilient, one would expect the time required for this liquidity replenishment to be low. This replenishment time was captured by Panayi et al. through the idea of the threshold exceedance duration (TED):
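A minimal sketch of the TED idea follows, with illustrative names and data that are not from the cited papers: given a time-stamped series of a liquidity measure, say the quoted spread, each exceedance duration runs from the moment the measure crosses above a user-chosen threshold until it first returns to or below that threshold.

```python
# Illustrative sketch of threshold exceedance durations (TEDs) for a
# liquidity measure such as the quoted spread. A TED is the time from
# the instant the measure exceeds a user-specified threshold until it
# first returns to (or below) that threshold, i.e. until liquidity is
# considered replenished.

def threshold_exceedance_durations(times, spreads, threshold):
    durations = []
    start = None  # time at which the current exceedance began
    for t, s in zip(times, spreads):
        if s > threshold and start is None:
            start = t                    # exceedance begins
        elif s <= threshold and start is not None:
            durations.append(t - start)  # liquidity replenished
            start = None
    return durations

# Hypothetical spread path sampled once per second
times   = [0, 1, 2, 3, 4, 5, 6, 7, 8]
spreads = [1.0, 1.5, 2.5, 2.2, 1.4, 1.1, 2.1, 2.6, 1.3]
print(threshold_exceedance_durations(times, spreads, threshold=2.0))  # → [2, 2]
```

A resilient market in this sense is one where these durations are short; changing the threshold or the liquidity measure yields the different relative-resilience views described above.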
study notwithstanding, empirical evidence presented by KSS continues to pose a challenge to the consistency of the B/M effect. We now review that evidence. First, consider those firms that are on the CRSP tapes but are not on COMPUSTAT. If one looks at the lowest decile of such firms, in terms of market capitalization, one finds that their deviation (alpha) from the Fama and French (1993) three-factor (market, size, and B/M factors) model is quite large, about -7%. These are extremely small and volatile firms, however, and the t-statistics for the significance of this deviation are only -1.65 and -2.18 when the lowest decile is split into two subportfolios. This finding may be related to Loughran’s (1997) observations concerning the very low returns on small growth firms.
This paper provides both an ex ante and ex post quantitative assessment of the impact of vertical integration on market outcomes in the Australian National Electricity Market (NEM). Until April 2004, there was very little vertical integration in the NEM, although a few retailers owned peaking generation plants and some generators had retail arms aimed at large industrial users. In 2003, the largest energy retailer in the state of Victoria, the Australian Gas Light Company, known as AGL, proposed to acquire a stake (as part of a consortium) in the largest base-load generator in Victoria, Loy Yang A (LYA). Concerned about potential anti-competitive harm as well as the idea that this might be a first step in a wave of vertical acquisitions, the competition authority, the Australian Competition and Consumer Commission (ACCC), challenged the acquisition. The merging parties successfully overcame this challenge and, effective April 1, 2004, control of LYA was transferred to the consortium in which AGL held a 35 percent stake. A pre-condition to the acquisition was that AGL would give court-enforceable undertakings that it would not be involved in the day-to-day bidding and contract trading of LYA, with representation only at the Board of Directors level. That is, the acquisition would be a passive one.
Conservatism can be defined as a tendency of accounting to require a higher degree of verifiability for recognizing good news than for recognizing bad news, within the flexibility of generally accepted accounting principles. Conservatism depends on many factors, including contracts between the business and other stakeholders, the possible formation of lawsuits, efforts to reduce or postpone tax, public interest, enhanced quality of financial information, reduced political costs, reduced information asymmetry, and the level of competition. Competitive pressures will lead to increased conservatism. Companies use conservatism in their financial reports to prevent the entry of new competitors into the industry and to prevent the disclosure of confidential information to existing competitors. In fact, conservatism results from ambiguity: accountants use conservatism when they encounter ambiguity. The purpose of conservatism is to prevent incorrect decisions by investors and other users of financial statements. Demand for conservatism arises from various sources. According to Basu (1997), debtors and other creditors demand more timely information about bad news than about good news. The tendency to accelerate the recognition of losses and postpone the recognition of profits represents the conservative approach to profits and losses. Accordingly, Basu (1997) proposed the asymmetric timeliness of earnings. Four measures of conservatism are known: asymmetric timeliness in the recognition of profits and losses; accrual-based conservatism; negative skewness of profit distributions and cash flows; and conservatism based on market value. Any network in which goods and services are bought and sold is called a market. A competitive market refers to a market with many buyers and many informed sellers, none of whom individually influences the price level.
In a competitive market, companies have to use the production methods with the lowest cost and the highest efficiency to provide the consumer with better quality at a lower price. Under normal conditions, a competitive market involves both competition to prevent the entry of new competitors into the industry and competition between firms within an industry. Usually, the most important measure of performance is the rate of return, which contains information content for investors and is used to evaluate performance. A reduced rate of return is a warning for the firm, indicating poor performance. The rate of return contains considerable information content, because performance evaluation based on market value reflects investors’ information. Return is a driving force which motivates and rewards investors.
Partitioning firms according to their bankruptcy risks

The tradeoff theory argues that firms with higher market-to-book ratios are more likely to issue equity because they realize new growth opportunities and thus issue equity to downwardly adjust the target leverage ratios. Such an argument only applies to firms with leverage ratios reasonably close to their targets. Specifically, imagine a high bankruptcy risk and over-levered firm issuing equity. Two alternative interpretations may be offered: (1) the firm wants to raise funds and reduce the leverage ratio (toward the target), and (2) the firm downwardly adjusts the target leverage ratio due to new opportunities. Given that the firm has a high bankruptcy risk and is over-levered, the former explanation is probably more reasonable than the latter. Similarly, imagine a low bankruptcy risk and under-levered firm issuing equity. Since the firm is already under-levered, if the target leverage ratio falls due to the new investment opportunity, the firm should do nothing (instead of issuing equity) and let the target ratio fall naturally to the current leverage ratio. Hence, if the tradeoff theory is pertinent, we should find that the market-to-book ratio is critical only for firms around the target, but not vital for firms away from the target.
In this subsection we consider the stock price momentum defined as the dividend-adjusted six-month past stock return. Previous studies have documented that stock prices show short-term persistence. To our knowledge, the existence of this phenomenon has not yet been tested for validity in the Swedish stock market. Tables 3 and 4 show that, when used as the only regressor (specification 4), the slope coefficient for momentum does indeed have a positive sign, which corresponds with expectations. Nevertheless, it is only significant for the Winsorized sample (t-statistic 2.642). Thus, there is only limited evidence to support Hypothesis 4, contingent on the treatment of outliers. This seems to indicate that past momentum does indeed predict future stock returns, but it is unable to capture extreme stock return performances, which seems to be consistent with the theoretical understanding of the concept. Nevertheless, specification 6 shows that momentum loses its significance when used in combination with the other risk factors (with a t-statistic of 1.002 for the full sample and 1.357 for the Winsorized sample). This casts doubt on the generality of the superior performance of the four-factor model.
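The momentum variable as defined here can be sketched as follows; the function name and the price series are illustrative, and the sketch assumes dividend-adjusted (total-return) month-end prices with no skip month, as stated in the text.

```python
# Hypothetical sketch of the momentum regressor used above: the simple
# return over the past six months, computed from dividend-adjusted
# (total-return) month-end prices.

def six_month_momentum(adj_prices):
    """adj_prices: dividend-adjusted month-end prices, oldest first.
    Returns the simple return over the most recent six months."""
    if len(adj_prices) < 7:
        raise ValueError("need at least seven monthly observations")
    return adj_prices[-1] / adj_prices[-7] - 1.0

# Seven month-end observations: the six-month return spans the first
# and last price in this window.
prices = [100.0, 102.0, 101.0, 105.0, 108.0, 107.0, 112.0]
print(round(six_month_momentum(prices), 4))  # 112/100 - 1 = 0.12
```

In the regressions above, this value (computed per stock at each portfolio formation date) would serve as the momentum regressor.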
We have made the case that tangible and intangible assets should be treated symmetrically for accounting purposes, but the case of R&D investment is actually more complicated because of the long gestation lags noted by DiMasi (2008) and because of the greater uncertainty about the expected income stream than for tangible investment. Berndt et al. (2006) show that only 40 percent of drugs that start the pre-clinical process make it to the clinical stage (the Phase I, II, and III trials of the FDA regulatory process), and only 8 percent of the drugs make it into the marketplace. Given these odds, much of PHARMA’s $7.2 million R&D investment will disappear before it earns a return. The very high degree of uncertainty about outcomes, particularly in the early stages of a research project, might be better characterized by a Schumpeterian or “animal spirits” model of the investment decision, with only a tenuous link between ex ante cost and ex post returns.
product falls. The reason could be the inability of customers’ purchasing power to meet the prices charged for goods and services. The lending rate is the price paid to borrow debt capital, or the amount charged, expressed as a percentage of principal, paid by a borrower to a lender for the use of such funds (Brigham & Houston, 2004). It constitutes the base from which various financial institutions lend to the final customer and as such varies. A firm’s performance is the general measure of its overall financial health over a given period of time. High performance reflects management’s effectiveness and efficiency in making use of the company’s resources, and this also contributes to the country’s economy at large. The financial performance of a firm can be measured using many performance measures, such as Return on Investment, Earnings per Share, Return on Assets, Return on Equity and Tobin’s Q. This research employed earnings per share as a proxy for book value financial performance and Tobin’s Q as the market value performance measure.
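The two proxies named here can be sketched as follows. All figures are hypothetical, and the Tobin's Q formula used is a common textbook approximation, (market capitalization + book value of debt) / total assets, which may differ from the exact specification used in this research.

```python
# Hypothetical sketch of the two performance proxies: earnings per
# share (the book-value side) and a common approximation of Tobin's Q
# (the market-value side): (market cap + book debt) / total assets.
# All input figures below are invented for illustration.

def earnings_per_share(net_income, shares_outstanding):
    """Net income attributable to common shareholders per share."""
    return net_income / shares_outstanding

def tobins_q(market_cap, book_debt, total_assets):
    """Approximate Tobin's Q; values above 1 suggest the market prices
    the firm above the replacement cost of its assets."""
    return (market_cap + book_debt) / total_assets

eps = earnings_per_share(net_income=2_000_000, shares_outstanding=500_000)
q = tobins_q(market_cap=12_000_000, book_debt=4_000_000, total_assets=10_000_000)
print(eps, q)  # 4.0 1.6
```

In a study like this one, EPS and Q would be computed per firm-year and used as the dependent variables in the performance regressions.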
Fama and French (1996b) and Fama and French (1996a) give four arguments. First, the financial distress premium is not specific to a particular sample, since it has been verified over different periods. Second, it has also been the subject of many studies on international databases. Third, the size, book-to-market equity, earnings-to-price and cash-flow ratios, as indicators of expected income (Ball 1978), are highly useful for testing asset pricing models such as the CAPM. Fourth, the limited number of anomalies excludes the data-mining assumption.