An Ordinal Approach to Risk Measurement

In order to unify these settings, we take completely distributive lattices as underlying universes and consider an important class of aggregation functionals on these structures, namely, the class of Sugeno integrals. This setting has several appealing aspects, for it provides sufficiently rich structures, well studied in the literature, which allow models and measures of risk from an ordinal point of view and which do not depend on the usual arithmetical structure of the reals. In the next section, we survey the general background on lattice theory as well as representation and characterization results concerning Sugeno integrals on completely distributive lattices. In Section 3, we propose notions of risk measure and of quantile-based risk measure within this ordinal setting, and present their axiomatizations and representations. In Section 4 we briefly discuss possible directions for future work.
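As a concrete illustration of the aggregation functional involved, here is a minimal sketch of a Sugeno integral on the chain [0,1] with min/max as the lattice operations (not the paper's general completely distributive setting); the criteria values and the cardinality-based capacity are invented for illustration:

```python
from itertools import combinations

def sugeno_integral(f, mu):
    """Sugeno integral of f: X -> [0,1] with respect to a capacity mu: 2^X -> [0,1].

    S(f) = max over nonempty A of min( mu(A), min_{x in A} f(x) ),
    built from the lattice operations min and max only -- no addition or
    multiplication of reals is involved, which is the ordinal point.
    """
    X = list(f)
    best = 0.0
    for r in range(1, len(X) + 1):
        for A in combinations(X, r):
            best = max(best, min(mu(frozenset(A)), min(f[x] for x in A)))
    return best

# Hypothetical example: three criteria and a capacity depending only on cardinality.
f = {"a": 0.3, "b": 0.8, "c": 0.5}
mu = lambda A: {0: 0.0, 1: 0.2, 2: 0.7, 3: 1.0}[len(A)]
print(sugeno_integral(f, mu))  # 0.5: a median-like ordinal aggregate
```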

Scenario Analysis in the Measurement of Operational Risk Capital: A Change of Measure Approach

In the study of market risk of financial products, the Change of Measure is used as an important evaluation criterion. Using COM, one can properly evaluate the possibility of a future cash flow of the instrument. This principle was the motivation behind our work. We asked, "Why can't a scenario be evaluated using the Change of Measure approach?" After all, in a sense future cash flows are nothing but a scenario for some economic process. For us, it is the loss generation process driving the severity distribution calibrated by historical loss experience. If operational risk scenario data were of the quality of market risk scenario data (such as option, future, and forward contracts with high liquidity), then we could have used scenario data to calibrate our loss generating process, as is done in many market risk modeling approaches such as Jackwerth and Rubinstein (1996). In the case of operational risk, we are proposing to use COM in validating and updating the loss generating process. Cerny (2009, Chapter 9) discusses COM in the context of market risk and shows how it is used for dynamic allocation of wealth. An extension of this work will be to study the dynamic allocation of risk management resources using COM. As problematic as operational risk scenario analysis data are, the use of scenario data for purposes of operational risk measurement should be given the same priority as internal loss data. MacMinn (2005) differentiated corporate risk exposure between pure and speculative; in MacMinn's framework, operational risk would be classified as a pure risk. Following a similar analysis and using COM, we can differentiate among various tail risks and their impacts on the value of the firm based on their insurability. An interesting extension of this work will be to analyze various tail risks in MacMinn's framework using COM.

Contributions to solvency risk measurement

It would be interesting to relate more closely the research done on dy- namic risk measurement and on model uncertainty. The focus of this work would then be to construct risk measurement procedures like the ones pro- posed in Chapters 3 and 4, which allowing for model uncertainty in the probability space, are still able to produce assessments that are somehow consistent over time. The key point here is to define a new class of risk measures where it is possible to identify and separate the two sources of ran- domness: randomness due to model uncertainty and due to the stochastic nature of the process. In particular the component due to model uncertainty should decrease with time as the estimation procedure will be based on more data points becoming available and thus will be more accurate. A possible strategy would be to consider a worst case approach such as the one proposed in Chapter 4, where we calculate the risk measure according to different can- didate models in a certain set M and then take the worst outcome. Allowing for a dynamic component in the selection of the set of models M used, would be a first step to include model uncertainty in dynamic risk measurements.
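A minimal sketch of the worst-case strategy just described (the function names, the use of expected shortfall as the base risk measure, and the two candidate severity models are illustrative assumptions, not the construction of Chapter 4):

```python
import numpy as np

def expected_shortfall(losses, alpha=0.99):
    """Average loss beyond the alpha-quantile (one common base risk measure)."""
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()

def worst_case_risk(candidate_models, n_scenarios=100_000, alpha=0.99):
    """Evaluate the risk measure under each model in the set M and keep the worst.

    `candidate_models` maps a model name to a sampler producing loss scenarios;
    shrinking or reweighting this set as more data become available would be
    the dynamic component discussed in the text.
    """
    rng = np.random.default_rng(0)
    risks = {name: expected_shortfall(sample(rng, n_scenarios), alpha)
             for name, sample in candidate_models.items()}
    return max(risks.values()), risks

# Hypothetical candidate set M: two competing loss models.
M = {
    "lognormal": lambda rng, n: rng.lognormal(0.0, 1.0, n),
    "heavy_tailed": lambda rng, n: rng.pareto(2.5, n) + 1.0,
}
worst, per_model = worst_case_risk(M)
print(per_model, "worst-case:", worst)
```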

Bayesian joint ordinal and survival modeling for breast cancer risk assessment

We chose a latent variable formulation for the longitudinal process which translated the ordinal scale to the framework of linear mixed models, with a logistic distribution for the measurement error. The latent variable approach facilitates the computational implementation of the model but introduces complexity in the interpretation of results. We assume a logistic distribution for the latent variable, which implies the logistic link for cumulative probabilities. Other models might also be appropriate. The most usual alternatives are the normal and the extreme value distributions, which result in the probit and complementary log-log regression links, respectively. It is widely accepted that the probit and logit links produce similar results. This also occurs in our study (results not shown), where we have implemented the probit link to assess the robustness of the model. This is not the case for the extreme value distribution which, unlike the logistic and normal ones, is not symmetrical.
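To make the link choice concrete, here is a small sketch comparing the three cumulative links mentioned (logit, probit, complementary log-log); the cutpoints and linear predictor below are invented for illustration:

```python
import numpy as np
from scipy.stats import logistic, norm

# Cumulative link model: P(Y <= k | x) = G(theta_k - eta(x)), where G is the
# CDF implied by the measurement-error distribution of the latent variable.
links = {
    "logit":   logistic.cdf,                      # logistic latent error
    "probit":  norm.cdf,                          # normal latent error
    "cloglog": lambda z: 1 - np.exp(-np.exp(z)),  # extreme-value latent error (asymmetric)
}

theta = np.array([-1.0, 0.5, 2.0])  # hypothetical cutpoints for a 4-level ordinal scale
eta = 0.3                           # hypothetical linear predictor for one subject

for name, G in links.items():
    cum = G(theta - eta)                                  # cumulative probabilities
    probs = np.diff(np.concatenate([[0.0], cum, [1.0]]))  # per-category probabilities
    print(f"{name:8s}", np.round(probs, 3))
```

Running this shows the logit and probit columns nearly coincide while the cloglog column is visibly skewed, which mirrors the robustness discussion above.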

Explaining Regional Demand for Federal Farm Credit Programs: An Ordinal Probit Approach

There was no indication that the greater importance of farming within the county (LANDFARM) increased the availability of credit from private lenders. In fact, those counties with a larger share of land devoted to farming were more likely to be high-use counties. This reflects the importance of the FSA loan programs in the Great Plains, Mississippi Delta, and Piedmont regions, where farming is the primary use of land. The reliance of these local economies on agriculture may expose commercial lenders to additional risk, resulting in more conservative lending practices which, in turn, would result in more farmers turning to FSA for their credit needs. Conversely, it may merely reflect the fact that as farming activity rises in a county, demand for all types of credit, including that supplied by FSA, rises.

Detection of Genes for Ordinal Traits in Nuclear Families and a Unified Approach for Association Studies

It has been observed that traditional linkage studies are not as powerful as association studies for the identification of genes contributing to the risk for common, complex diseases (Risch and Merikangas 1996). There has been growing interest in genomewide association analyses using SNPs. For example, Klein et al. (2005) successfully employed this approach to identify the complement factor H gene on chromosome 1 for age-related macular degeneration, the major cause of blindness in the elderly. Even though many human conditions and diseases are measured on an ordinal scale, the existing analytic approaches were developed for binary or quantitative traits. The main objective of this report is to present a score statistic to detect genes associated with an ordinal trait and demonstrate its power through simulation and real data. We observed that the statistic belongs to a general class of test statistics that have been useful for association analyses of binary and quantitative traits. Through simulation studies, we discovered that our proposed score statistic (i.e., O-TDT) can serve as a unified test for all types of traits (binary, ordinal, or quantitative). The new test is more powerful when the trait is ordinal and comparable to existing tests when the trait is binary or quantitative. The analysis of SNP rs714697 supports the notion that it is highly…

Scenario Analysis in the Measurement of Operational Risk Capital: A Change of Measure Approach

Subjectivity and possible biases will always be inherent characteristics of scenario data. Methods for interpreting scenarios must take these features into account. As Kahneman, Slovic, and Tversky (1982) put it: "A scenario is especially satisfying when the path that leads from the initial to terminal state is not immediately apparent, so that the introduction of intermediate stages actually raises the subjective probability of the target event." We have undertaken research that seeks to explain how one could control and reduce the biases and subjectivity in scenario data in the context of operational risk. Very preliminary results show that scenario data generated in the format discussed above are less subjective and therefore more suitable than data produced in other formats. We also think that this is the most natural format in which workshop participants can interact and express their beliefs. The discussion of how judgment happens in Chapter 8 of Kahneman (2011) further corroborates some of our findings in the context of scenario data generation for operational risk.

Credit risk measurement methodologies

High bank failures and the significant credit problems faced by banks during the Global Financial Crisis (GFC) are a stark reminder of the importance of accurately measuring and providing for credit risk. There is a variety of available credit modelling techniques, leaving banks faced with the dilemma of deciding which model to choose. Historically, prominent methods include external ratings services like Moody's, Standard & Poor's (S&P) or Fitch, and financial statement analysis models (which provide a rating based on the analysis of financial statements of individual borrowers, such as the Altman z-score and Moody's RiskCalc). Credit risk models which measure default probability (such as structural models) or Value at Risk (VaR) attained a great deal more prominence with the advent of Basel II. This article examines four widely used modelling techniques: external ratings, financial statement analysis models, the Merton/KMV structural model, and the transition models of CreditMetrics and CreditPortfolioView, including an overview of the models and a comparison of their relative strengths and weaknesses. Structural models are based on option pricing methodologies and obtain information from market data. A default event is triggered by the capital structure when the value of the obligor falls below its financial obligation (as in the Merton and KMV models). VaR-based models provide a measurement of expected losses over a given time period at a given tolerance level. These include the JP Morgan CreditMetrics model, which uses a transition matrix, and the CreditPortfolioView model, which incorporates macroeconomic factors into a transition approach.
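As an illustration of the structural-model mechanics described above, a minimal Merton-style sketch (the asset value, drift, volatility, and horizon are invented; KMV-style implementations additionally back out the unobserved asset value and volatility from equity data):

```python
from math import log, sqrt
from statistics import NormalDist

def merton_default_probability(V, D, mu, sigma, T=1.0):
    """P(asset value < debt at horizon T) when assets follow a lognormal process.

    Default is triggered by the capital structure: the obligor defaults if the
    asset value V_T falls below the face value of its obligation D.
    """
    # Distance to default under geometric Brownian motion for the assets.
    d = (log(V / D) + (mu - 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    return NormalDist().cdf(-d)

# Hypothetical obligor: assets 120, debt 100, 5% asset drift, 25% asset volatility.
print(merton_default_probability(V=120, D=100, mu=0.05, sigma=0.25))
```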

Measurement and assessment of systematic risk of selected industries in stock exchange using wavelet approach

According to the studies conducted, CAPM is considered an important model by financial and economic researchers, mainly because it is able to specify the systematic risk in different stock exchanges around the world. However, CAPM has some shortcomings. For instance, the model can only present a fixed long-term risk coefficient; the present study has tried to resolve this problem by using a State-Space model to present a time series of Beta instead of a single figure. Another shortcoming of CAPM is that it cannot distinguish between short-term and long-term Betas (short-term and long-term fluctuations). This paper divides fluctuations into large and small fluctuations with different Betas using the wavelet approach.
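A sketch of the scale-by-scale Beta idea (using the PyWavelets package and an ordinary discrete wavelet decomposition; the paper's exact wavelet filter and estimation details are not given in this excerpt, so the choices below are illustrative):

```python
import numpy as np
import pywt

def scale_betas(asset_returns, market_returns, wavelet="db4", level=4):
    """Estimate one Beta per wavelet scale.

    Beta at scale j = Cov(asset detail_j, market detail_j) / Var(market detail_j),
    so short-term (fine-scale) and long-term (coarse-scale) Betas can differ.
    """
    da = pywt.wavedec(asset_returns, wavelet, level=level)
    dm = pywt.wavedec(market_returns, wavelet, level=level)
    betas = {}
    for j in range(1, level + 1):
        a, m = da[-j], dm[-j]        # detail coefficients at scale j (j=1 is finest)
        cov = np.cov(a, m)
        betas[f"scale_{j}"] = cov[0, 1] / cov[1, 1]
    return betas

# Hypothetical daily returns: a market factor plus idiosyncratic noise.
rng = np.random.default_rng(1)
mkt = rng.normal(0, 0.01, 1024)
asset = 1.2 * mkt + rng.normal(0, 0.005, 1024)
print(scale_betas(asset, mkt))  # each scale's Beta should be near 1.2 here
```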

Financial Risk Measurement for Financial Risk Management

In summary, while the literature on modeling the covariance matrix dynamics is progressing rapidly along many different directions, there is still no consensus on the relative merits of the approaches. It is clear, however, that the use of high-frequency intraday data and realized covariance measures holds the promise of substantially improving the accuracy of covariance matrix forecasting. Going one step further, in direct parallel to the approach taken in the univariate setting of Section 2.2.3, the realized covariance forecasts discussed above may also be embedded within a multivariate GARCH setting to provide a vehicle for combining the realized covariance matrices with a multivariate distribution for the return innovations. We briefly discuss some recent ideas for implementing this next.
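A minimal sketch of the realized covariance measure underlying these forecasts (the five-minute sampling and three-asset dimension are invented for illustration):

```python
import numpy as np

def realized_covariance(intraday_returns):
    """Realized covariance for one day: the sum of outer products of intraday returns.

    RCov_t = sum_i r_{t,i} r_{t,i}', a (nearly) model-free high-frequency estimate
    of that day's return covariance matrix; sequences of these matrices are what
    a forecasting model (e.g., a multivariate GARCH setting) would consume.
    """
    R = np.asarray(intraday_returns)  # shape: (n_intervals, n_assets)
    return R.T @ R

# Hypothetical day: 78 five-minute returns on 3 assets.
rng = np.random.default_rng(2)
true_cov = np.array([[1.0, 0.5, 0.2], [0.5, 1.2, 0.3], [0.2, 0.3, 0.8]]) * 1e-6
r = rng.multivariate_normal(np.zeros(3), true_cov, size=78)
print(realized_covariance(r))  # should be close to 78 * true_cov
```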

Performance Measurement Analysis of XYZ Company Based on Risk with AHP and OMAX Concept Approach

Performance measurement with OMAX assists companies in deciding which KPI should be followed up first. Knowing the KPIs' priority for the company, based on their risks to the achievement of the company's goals, provides a guideline for arranging the right strategy for the organization. In this case, there are two main KPIs whose performance needs to be improved because they are still below standard: meeting the needs of raw materials, parts, and supporting materials, with a score of 3, and receivable turnover, with a score of 3. A performance increase for both KPIs will help improve performance for another KPI which also has a score of 3, namely account debt turnover.

A constrained regression model for an ordinal response with ordinal predictors

The intermediate approach proposed here is defined to achieve a set of linear estimates described by multiple magnitudes, as in the nominal scale approach, but allowing one direction only, as in the interval scale approach. The latter is attained by restricting the effects of model (1) to be monotonic in either direction. The monotonicity assumption should not necessarily be taken for granted in regression with an ordinal predictor and response. But it has a special status, similar to that of linearity between interval-scaled variables. According to Stevens (1946), the interval scale is defined by the equality in the meaning of differences between values regardless of the location of these differences on the measurement range. A linear relationship between interval-scaled variables means that the impact of a change in the predictor on the response is proportional to the meaning of the change of measurement at all locations of the measurement scale. For the ordinal measurement scale, only the order of measured values is meaningful. In this case, monotonic relationships are those that imply that a change in the predictor of the same meaning (i.e., changing to a value that is higher, or lower, respectively) at all locations of the measurement scale has an effect of the same meaning on the response.
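One illustrative way to impose such a monotonicity restriction (a generic construction, not necessarily the estimator of the paper): code the ordinal predictor with cumulative threshold dummies and constrain the step coefficients to share one sign, so each level may have its own magnitude but the effect can only move in one direction.

```python
import numpy as np
from scipy.optimize import lsq_linear

def monotone_ordinal_fit(x_ordinal, y, n_levels):
    """Least squares with a monotone (non-decreasing) effect of an ordinal predictor.

    Columns Z[:, j] = 1{x > j} are threshold dummies; restricting their
    coefficients to be >= 0 makes the fitted level effects non-decreasing
    while still allowing a different magnitude at every step (the
    'intermediate' position between nominal and interval treatment).
    """
    Z = np.column_stack([(x_ordinal > j).astype(float) for j in range(n_levels - 1)])
    X = np.column_stack([np.ones_like(y, dtype=float), Z])
    lb = np.r_[-np.inf, np.zeros(n_levels - 1)]  # intercept free, steps >= 0
    res = lsq_linear(X, y, bounds=(lb, np.full(n_levels, np.inf)))
    return res.x  # [intercept, step_1, ..., step_{K-1}]

# Hypothetical 5-level predictor with a monotone but non-linear true effect.
rng = np.random.default_rng(3)
x = rng.integers(0, 5, 300)
y = np.array([0.0, 0.1, 1.0, 1.1, 2.5])[x] + rng.normal(0, 0.2, 300)
print(monotone_ordinal_fit(x, y, n_levels=5))
```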

A Latent Mixture Approach to Modeling Zero-Inflated Bivariate Ordinal Data

To examine the relationship between ordinal responses and a group of covariates, operations were performed in two stages. First, individuals were split into four latent risk-groups, as described in the last paragraph of the previous section. Baseline category multinomial logit regression was used to model group membership. In the second stage, given the latent risk-group membership, three different ordered probit models were fit (two univariate and one bivariate), belonging to the second, third, and fourth components of the MZIBOP model equation (3.8). Although all these equations in the model are estimated simultaneously, the covariates belonging to each equation can be distinct. To this end, one goal was to identify the specific covariates that are strong predictors of which of the four risk-groups the adults belong to. This can be achieved through the mixture components, i.e., the multinomial logit part of the proposed model. Another aim was to identify the covariates that are strong predictors of the level of consumption of either marijuana or cocaine through the univariate and bivariate ordered probit parts of the proposed model, given that individuals are users of one or both of these two drugs.
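One plausible reading of this structure, sketched in symbols (the excerpt does not reproduce equation (3.8), and identifying the first component with the double-zero non-user group is our assumption based on the zero-inflated setup): with $\pi_g(x)$ denoting the multinomial-logit membership probabilities of the four latent risk-groups and $(y_1, y_2)$ the two ordinal consumption levels (marijuana and cocaine),

$$P(y_1, y_2 \mid x) = \pi_1(x)\,\mathbf{1}\{y_1 = y_2 = 0\} + \pi_2(x)\,\mathbf{1}\{y_2 = 0\}\,P_1(y_1 \mid x) + \pi_3(x)\,\mathbf{1}\{y_1 = 0\}\,P_2(y_2 \mid x) + \pi_4(x)\,P_{12}(y_1, y_2 \mid x),$$

where $P_1$ and $P_2$ are the two univariate ordered probits and $P_{12}$ is the bivariate ordered probit, each possibly with its own covariate set.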

Sensitivity analysis of scenario models for operational risk Advanced Measurement Approach

Quantile matching is a logical method for fitting continuous distributions to severity data collected using the quantile approach. The method involves parameter estimation by minimizing the squared difference between empirical quantiles (as elicited from experts) and theoretical quantiles (defined by the inverse cumulative distribution function of the selected distribution). The objective is to ensure that the fitted distribution has the same quantiles as the expert opinion, by minimizing the following objective function for two or more quantiles:
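In symbols (a minimal reconstruction from the description above; $p_i$ denotes an elicited probability level, $\hat{q}_{p_i}$ the corresponding expert quantile, and $F^{-1}(\cdot\,;\theta)$ the inverse cumulative distribution function of the selected distribution with parameters $\theta$):

$$\hat{\theta} = \arg\min_{\theta} \sum_{i=1}^{n} \left( F^{-1}(p_i;\theta) - \hat{q}_{p_i} \right)^{2}, \qquad n \ge 2.$$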


Standardized Measurement Approach for Operational Risk: Pros and Cons

We propose one class of models that can act in this manner and allow one to incorporate the key features offered by AMA LDA-type models, which involve internal data, external data, BEICFs and Scenarios, with other important information on factors that the SMA method and OpCar approaches have tried to achieve but failed. As noted in this response, one issue with the SMA and OpCar approaches is that they try to model all Operational risk processes at the institution or group level with a single LDA model and a simplistic regression structure; this is bound to be problematic due to the very nature and heterogeneity of Operational risk loss processes. In addition, it fails to allow for the incorporation of many important Operational risk loss process explanatory information sources, such as BEICFs, which are often no longer informative or appropriate to incorporate at institution level, compared to the individual Business Line/Event Type (BL/ET) level.

A sparse grid approach to balance sheet risk measurement

We describe precisely in Section 2 the pricing and hedging model, the dynamics of the risk factor X, and the value of the asset and liability sides of the balance sheet. Let us stress that the risk factor model is given under the so-called real-world probability measure P, which might be objectively calibrated using time series of financial markets or represent the management view. This real-world model may be (and most of the time is) completely different from the pricing and hedging model, which might be simplified for runtime/tractability purposes, prudent (pricing and hedging include a margin), or constrained by regulation.

The measurement of farmers' risk attitudes using a non-structural approach

Most previous research on measuring farmers' risk attitudes has focused on a specific region and time, a certain commodity, and a specific type of risk. It is hard to find research on risk attitude measurement over time and at a more aggregated level, such as a country or a state. The results of this approach might have limited implications and usefulness, since they can differ for another time, region, or commodity. The non-structural approach developed by Antle (1989) can solve this problem and provide risk attitude estimates over time. Farmers' risk attitudes over time are useful for analyzing policy effects. For example, the effects of decoupled payments can be measured by the change in farmers' risk attitudes over time.

The Advanced Measurement Approach to Operational Risk

Ellen Davis is the editor of Operational Risk magazine and BaselAlert.com, based in London. Previously, she was an editor on Risk magazine, and the editor of Asia Risk magazine, based in Hong Kong. She originally hails from New York City, where she had a long freelance career during which her work appeared in Treasury & Risk Management, Global Finance, BusinessWeek, CFO magazine, and a number of other financial publications. She holds a BA from Wellesley College and an MBA from New York University.


Measurement of Farm Credit Risk: SUR Model and Simulation Approach

Empirical testing applies bootstrap techniques (Hollander and Wolfe 1999). A total of 5,000 random samples of residuals for each equation were drawn from the least squares regression results for the SUR model. Each sample is randomly selected across the 7 risk classes or equations, and each class has 9 observations randomly picked out, one for each year in the sample. In each sampling run, the random sample is then used to compute a value of λLR and Jenn, respectively, in which the corresponding sample correlation matrix R is obtained by using expression (6). Both statistics follow a χ² distribution with 21 degrees of freedom. Overall, the mean value of λLR across the 5,000 calculated values is 39.02, greater than the 1 percent critical value of 38.93. So the null hypothesis of a diagonal correlation matrix is rejected, and thus asset correlation among farm risk classes is confirmed.
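A sketch of one run of this test (assuming λLR is a Bartlett-type likelihood-ratio statistic for a diagonal correlation matrix, which is consistent with the cited χ² distribution with 21 = 7·6/2 degrees of freedom and the 38.93 critical value; the residual pool and resampling scheme below are stand-ins for the paper's SUR residuals):

```python
import numpy as np
from scipy.stats import chi2

def lambda_lr(residuals):
    """Bartlett-type LR statistic for H0: the correlation matrix is diagonal.

    lambda_LR = -(n - 1 - (2p + 5) / 6) * ln|R|, approximately chi-squared
    with p(p - 1)/2 degrees of freedom (21 when p = 7 risk classes).
    """
    n, p = residuals.shape
    R = np.corrcoef(residuals, rowvar=False)
    return -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))

# Hypothetical stand-in for the SUR residual pool: 7 correlated risk classes.
rng = np.random.default_rng(4)
cov = 0.4 * np.ones((7, 7)) + 0.6 * np.eye(7)
pool = rng.multivariate_normal(np.zeros(7), cov, size=200)

stats = []
for _ in range(5000):
    sample = pool[rng.integers(0, len(pool), size=9)]  # 9 draws, one per year
    stats.append(lambda_lr(sample))

crit = chi2.ppf(0.99, 21)  # 38.93, the 1 percent critical value cited above
print(np.mean(stats), "vs", crit)  # reject diagonality when the statistic exceeds crit
```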

Understanding the Internal Measurement Approach to Assessing Operational Risk Capital

The Basel Committee working paper CP2.5 on the Regulatory Treatment of Operational Risk has proposed various rules for the pillar 1 capital charge. Banks are currently in the process of making constructive suggestions for the modification of these rules in time for the next consultative paper, due early next year. I do not wish to pass judgment on whether the rules should be accepted. As a mathematician I take the rules as given. Without rules there can be no assumptions and no logical deduction leading to any conclusion.
