The operation of a multi-purpose reservoir serving both flood control and water supply is one of the complex and dynamic problems in reservoir operation. Floods are natural disasters that can damage infrastructure and cause loss of life. Reservoirs that function as flood mitigation mechanisms require fast and accurate decisions in determining the water release. During heavy rain, a reservoir needs to impound water and release it gradually to maintain safe discharges at downstream areas, minimize downstream damage, and ensure dam safety. Conversely, during less intense rainfall, the reservoir has to impound adequate water to maintain its water level without affecting its release for water supply. Through these major functions, a reservoir can be regarded as one of the emergency environments that require human expert decision making and monitoring.
Abstract—During emergencies such as flood and drought seasons, a reservoir acts as a defence mechanism to reduce the risk of flooding and to maintain water supply. During these periods, decisions regarding water release are crucial. During the flood season, an early water release decision should be established to prepare the reservoir for incoming inflow, while during the drought season the reservoir water level should be maintained in order to sustain the supply and other uses. Reservoir operation during these two seasons leads to conflicting decisions, as the incoming inflow is hard to predict. Modelling the reservoir water release decision can be one solution to this problem. The modelling is based on the reservoir operator's previous experience in dealing with such situations; this experience provides valuable information on when the reservoir water should be released. A temporal data mining technique was applied to extract temporal patterns from the reservoir operational record, and a neural network was applied as the modelling tool. The neural network model was developed to classify the data, which in turn can be used to aid the reservoir water release decision. In this study, an 8-23-2 neural network model produced acceptable performance during training, validation and testing.
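To make the 8-23-2 architecture mentioned above concrete, the following is a minimal sketch of a forward pass through such a network (8 inputs, 23 hidden units, 2 output classes). The weights, activation choice and input features are illustrative placeholders, not the trained model from the study.

```python
import numpy as np

# Hedged sketch: forward pass of an 8-23-2 multilayer perceptron.
# Weights are random placeholders; in the study they would be learned
# from the reservoir operational record.
rng = np.random.default_rng(0)

W1 = rng.normal(size=(8, 23))   # input -> hidden weights
b1 = np.zeros(23)
W2 = rng.normal(size=(23, 2))   # hidden -> output weights
b2 = np.zeros(2)

def forward(x):
    """Classify an 8-feature hydrological input into 2 release classes."""
    h = np.tanh(x @ W1 + b1)        # hidden layer activation
    z = h @ W2 + b2
    p = np.exp(z - z.max())         # softmax over the 2 classes
    return p / p.sum()

x = rng.normal(size=8)              # e.g. rainfall, level and inflow features
probs = forward(x)
decision = "release" if probs[1] > 0.5 else "hold"
```

The two outputs can be read as class probabilities for "release" vs. "hold", matching the binary release decision the abstract describes.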
in Eupen and La Gileppe reservoirs leads to a residual flood risk as high as EUR 11 850 000 yr−1 for the entire catchment. Since the corresponding risk obtained with the present operation rules equals EUR 12 600 000 yr−1 (see reference value in Table 7, and increase by 200 % according to Table 8), an enhancement of the reservoir operation rules to mitigate flood risk is limited to a potential reduction of at most 6 % for this extreme scenario. Nonetheless, two perspectives for improving the reservoir operation may contribute to mitigating the impacts of climate change on flood risk. The first consists of a reduction of the mean target level, inducing a significant reduction of the minimum reservoir level. To compensate for this reduction of the minimum reservoir level, the amplitude of the time evolution of the target level may be decreased without inducing extra flood risk. The second perspective is a reduction of the discharge threshold for detecting a flood downstream in the “flood management” mode, which has no influence on the other performance indicators.
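The quoted ~6 % bound follows directly from the two risk figures in the text; a quick worked check:

```python
# Worked check of the ~6 % bound: the gap between the residual risk
# (EUR 11 850 000/yr) and the risk under the present operation rules
# (EUR 12 600 000/yr), expressed relative to the latter.
residual_risk = 11_850_000
present_risk = 12_600_000
max_reduction = (present_risk - residual_risk) / present_risk
print(round(max_reduction * 100, 1))  # -> 6.0 (per cent)
```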
Dimensionality problems associated with traditional optimization approaches have motivated the development of evolutionary algorithms. Notable among these are the genetic algorithm (Goldberg 1989), ant colony optimization, and particle swarm optimization. Many works (DeJong 1975; Michalewicz 1992) have established the validity of the GA technique in function optimisation. A brief review of GA applications to civil engineering problems in general, and water resources problems in particular, is given by Wardlaw and Sharif (1999) and Sharif and Wardlaw (2000). Meraji et al. (2011) presented particle swarm optimization for application to the operation of the Dez reservoir in southern Iran; the results showed that the algorithm works better than a non-linear programming (NLP) model. Afshar et al. (2011) presented a honey bees mating optimization algorithm for developing operating rules for multireservoir systems. A detailed review of evolutionary algorithms as applied to reservoir operation models with single as well as multiple objectives has been presented by Ayeyemo (2011). Rani and Moreira (2010) present an elaborate survey of simulation and optimization modelling approaches used in reservoir system operation problems. Simulation models present an alternative approach for determining solutions to problems that are not amenable to solution by available techniques. Often, the time required to obtain a solution to a complex problem is too long, and the cost of obtaining an optimal solution exceeds the savings gained by applying it. In such cases simulation models provide an effective tool for obtaining solutions that can be practically implemented.
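As a minimal illustration of the GA technique discussed above, the sketch below evolves a single scalar decision variable (a hypothetical release fraction) with tournament selection, arithmetic crossover and Gaussian mutation. The objective function is a toy stand-in, not any of the cited reservoir models.

```python
import random

# Minimal genetic-algorithm sketch. Toy objective: tune a release
# fraction x in [0, 1] to maximise a hypothetical operating benefit.
def benefit(x):
    return -(x - 0.7) ** 2  # illustrative objective; optimum at x = 0.7

def ga(pop_size=20, generations=50, mut_sigma=0.05, seed=1):
    rng = random.Random(seed)
    pop = [rng.random() for _ in range(pop_size)]
    for _ in range(generations):
        # tournament selection: fitter of two random individuals survives
        parents = [max(rng.sample(pop, 2), key=benefit) for _ in range(pop_size)]
        pop = []
        for i in range(0, pop_size, 2):
            a, b = parents[i], parents[i + 1]
            for w in (rng.random(), rng.random()):
                # arithmetic crossover + Gaussian mutation, clipped to [0, 1]
                child = w * a + (1 - w) * b + rng.gauss(0, mut_sigma)
                pop.append(min(1.0, max(0.0, child)))
    return max(pop, key=benefit)

best = ga()  # converges near the optimum at 0.7
```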
The aim of the remainder of this chapter is to present an overview of the important concepts in type-2 fuzzy logic systems. A type-2 FLS is constructed from the same structure of IF-THEN rules as a type-1 FLS, and is still dependent on the knowledge of experts. Expert knowledge is typically represented by linguistic terms that carry implied uncertainty, which leads to the rules of type-2 FLSs having uncertain antecedent and/or consequent parts; these are then translated into uncertain antecedent or consequent MFs. The structure of the rules in a type-2 FLS and its inference engine is similar to that in type-1 FLSs. The inference engine combines rules and provides a mapping from input type-2 fuzzy sets to output type-2 fuzzy sets. To achieve this, we must find unions and intersections of type-2 sets, as well as compositions of type-2 relations. The output of the type-2 inference engine is a type-2 fuzzy set. Using Zadeh’s extension principle, type-1 defuzzification can derive a crisp output from a type-1 fuzzy set; similarly, for a higher-type set such as type-2, an analogous operation reduces the type-2 fuzzy sets to type-1 fuzzy sets. This process is usually called ‘type reduction’. The complete type-2 fuzzy logic theory, including the handling of uncertainties such as operations on type-2 fuzzy sets, the centroid of a type-2 fuzzy set, and type reduction, can be found in [9, 35–42].
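As a small illustration of type reduction for an interval type-2 fuzzy set, the sketch below uses the Nie–Tan closed-form method: the crisp output is the centroid of the average of the lower and upper membership functions. This is one of several type-reduction approaches (the iterative Karnik–Mendel procedure is the classical alternative); the membership functions here are illustrative.

```python
import numpy as np

# Hedged sketch: Nie-Tan closed-form type reduction for an interval
# type-2 fuzzy set over a discretised domain.
x = np.linspace(0, 10, 101)                        # discretised domain
upper = np.exp(-0.5 * ((x - 5) / 2.0) ** 2)        # upper MF (wider)
lower = 0.6 * np.exp(-0.5 * ((x - 5) / 1.5) ** 2)  # lower MF (narrower, scaled)

avg_mf = (upper + lower) / 2.0                     # average of the two MFs
crisp = np.sum(x * avg_mf) / np.sum(avg_mf)        # centroid -> crisp output
```

Because both membership functions are symmetric about 5, the crisp output lands at the centre of the footprint of uncertainty.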
Abstract—Emergency situations require fast and accurate decisions, as every decision is critical to saving human lives. Naturally, in such situations humans make decisions based on their past experience: the nervous system and brain perceive the situation and map it onto experience to produce an action. This naturalistic decision-making approach has been one of the focal points of emergency management research. In this paper, a conceptual model of an Intelligent Decision Support System for reservoir operation during emergency situations is proposed. This model simulates human decisions using three modules: situation assessment, expectancy forecasting, and decision modelling. Situation assessment extracts temporal data from the hydrological and operational databases. These data are used in the forecasting module to forecast future events. The decision module utilizes the temporal and forecasted data to produce the final decision. Artificial intelligence techniques are utilized in every module. The model is expected to assist the reservoir operator in making decisions during emergency situations, typically during heavy rainfall, when an early and fast decision is required to release reservoir water in order to leave enough space for incoming water while keeping the release within the safe carrying capacity of the downstream channel, thus avoiding floods in downstream areas.
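The three-module pipeline described in the abstract can be sketched as a chain of placeholder functions. Everything here is an assumption for illustration: in the proposed system, these stubs would be backed by temporal data mining, a forecasting model and a trained decision model.

```python
# Hedged sketch of the pipeline: situation assessment -> expectancy
# forecasting -> decision. All thresholds and data are hypothetical.

def assess_situation(level_history):
    """Extract a simple temporal feature: the recent rate of level change."""
    return level_history[-1] - level_history[-2]

def forecast(level, trend, steps=3):
    """Naive linear expectancy forecast of future reservoir levels."""
    return [level + trend * k for k in range(1, steps + 1)]

def decide(forecast_levels, flood_level=100.0):
    """Release early if any forecast level threatens the flood level."""
    return "release" if max(forecast_levels) >= flood_level else "hold"

history = [96.0, 97.5, 99.2]          # hypothetical water levels (m)
trend = assess_situation(history)     # rising at 1.7 m per step
future = forecast(history[-1], trend)
action = decide(future)               # early release for this rising trend
```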
As serious concern is raised over the safety of ships the world over, the International Maritime Organization (IMO) has continuously dealt with safety problems in the context of operations, management, survey, ship registration and the role of the administration. The improvement of safety at sea has been strongly emphasized. International safety-related marine regulations have been driven by serious marine accidents: lessons were first learnt from serious accidents, and regulations and rules were then produced to prevent similar occurrences. For example, the capsize of the “Herald of Free Enterprise” in 1987 raised serious questions with regard to operational requirements and the role of management, and so stimulated discussions at the IMO. This finally resulted in the adoption of the International Safety Management (ISM) Code for the Safe Operation of Ships and for Pollution Prevention. The “Exxon Valdez” accident in 1989 seriously damaged the environment through a large-scale oil spill; it facilitated the implementation of the International Convention on Oil Pollution Preparedness, Response and Co-operation (OPRC) in 1990. Double-hull or mid-deck structural requirements for new and existing oil tankers were subsequently applied. Following the “Scandinavian Star” disaster in 1990, which resulted in the loss of 158 lives, and the catastrophic disaster of the “Estonia” in the Baltic Sea in September 1994, the role of human error in marine casualties was highlighted. As a result of such incidents, the new Standards of Training, Certification and Watchkeeping (STCW 95) for seafarers were subsequently introduced.
Decision making is a very important aspect of business operation; there can be no effective management of a business without accurate decisions. Many businesses lack the ability to produce fast, error-free decisions that are easy to implement and that enhance efficiency. The problem is especially serious for trading businesses. Financial decision making demands a great deal of accuracy in order to make maximum use of a business’s limited resources. The human aspect of decision making makes it difficult for a business to achieve greater efficiency and effectiveness, because of the bureaucracy in decision making coupled with the inaccuracy and inconsistency of human decisions.
The PPFS was tailored to a specific purpose, built within a challenging time frame with limited resources and constrained by the data and information available. Being a tool for participatory integrated assessment, model evaluation and validation methods were focused on the model purpose rather than selecting commonly used quantitative measures (Bennett et al., 2013). Its credibility underpins its capability to serve the industry and stakeholders well into the future. To meet future needs, the following improvements in particular are suggested, namely (1) more detailed treatment of grazing land management practices using emerging understanding and data (O’Reagain and Scanlan, 2011; O’Reagain and Scanlan, 2013; Scanlan et al., 2013; Walsh and Cowley, 2011), (2) greater differentiation between different types of cattle and beef product markets, (3) income opportunities from ecosystem services other than carbon, (4) closing the loop between rainfall, irrigation water availability and crop yields, (5) a more sophisticated way of dealing with environmental dimensions and feedback relationships, such as relating to land, water and biodiversity, and (6) modelling a larger range of market-based policy instruments. Formal output validation of the model after Bennett et al. (2013) will also be necessary to achieve not just industry but also scientific credibility, particularly if (a) the geographical scale of model applications is to be expanded or (b) the scope of application is to include policy analysis. Ex-post evaluation of events, such as the temporary ban of cattle live export to Indonesia in 2011, could be used for formal output validation of the PPFS, which would also give the PPFS more credence as a lobbying tool in conversations between the pastoral industry and government.
The output of the MPC block produces the future controlling signal over the prediction horizon and predicts the MF, which drives the brain response to make a correct decision. The cost function of this optimization problem is represented as a weighted squared sum of the predicted errors (Eq. 1). Moreover, MPC ensures that the MF and correct decision making stay within pre-determined limits (Eq. 5); these are referred to as constraints, such as the range of amplitude and frequency of brain responses during the decision process. Generally, the MPC controller solves the optimization problem over the prediction horizon while satisfying the constraints. The predicted item with the smallest cost function gives the optimal solution and therefore determines the optimal MF (the outputs of MPC) that will bring the predicted item as close as possible to the desired set points. Figure 4 shows the algorithmic modelling of the decision-making process; this flow chart represents our proposed model systematically.
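The MPC loop described above (minimise a weighted sum of squared predicted errors over a horizon, subject to amplitude constraints, then apply the first optimal input) can be sketched for a toy scalar system. The model, weights and constraint values are illustrative assumptions, not the quantities from Eqs. 1 and 5 of the paper.

```python
import itertools

# Hedged MPC sketch: x[k+1] = a*x[k] + b*u[k], track a setpoint by
# minimising a weighted squared-error cost over a short horizon while
# the input amplitude stays within pre-determined limits.
a, b = 0.9, 0.5
setpoint, weight = 1.0, 1.0
horizon = 3
candidates = [-0.6, -0.3, 0.0, 0.3, 0.6]   # admissible inputs (|u| <= 0.6)

def cost(x0, u_seq):
    """Weighted squared tracking error of one candidate input sequence."""
    x, total = x0, 0.0
    for u in u_seq:
        x = a * x + b * u
        total += weight * (x - setpoint) ** 2
    return total

def mpc_step(x0):
    """Enumerate all constrained sequences; apply the first optimal input."""
    best = min(itertools.product(candidates, repeat=horizon),
               key=lambda seq: cost(x0, seq))
    return best[0]

x = 0.0
for _ in range(10):            # receding-horizon closed-loop simulation
    x = a * x + b * mpc_step(x)
```

Brute-force enumeration is only feasible for this tiny candidate grid; real MPC solvers use quadratic programming, but the receding-horizon structure is the same.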
Humans are surprisingly efficient in making decisions (Griffiths & Tenenbaum, 2006; Green, Benson, Kersten, & Schrater, 2010). This efficiency is remarkable because it occurs in the face of uncertainty, relies on imperfect knowledge (it must utilize ambiguous cues), and takes place in an environment of variable risk and non-deterministic outcomes (Trimmer et al., 2011; Fellows, 2004; Bland & Schaefer, 2012; Bach & Dolan, 2012; Payzan-LeNestour & Bossaerts, 2011). The underlying mechanism of decision making, however, is not yet clearly understood. Our lab is particularly interested in understanding how people improve their decisions based on their experiences. Our approach is to investigate this mechanism in patients who have brain injuries and difficulties in adaptive sequential decision-making tasks. The contribution of this thesis is to employ computational methods to compare the results of brain-damaged patients and healthy people. This comparison can shed light on the decision-making mechanism in the brain; it can also help us better understand the possible reasons behind brain-damaged patients’ failure in this process.
DOI: 10.4236/as.2018.93024 — Agricultural Sciences
To understand agricultural adaptations and transitions, modelling has become an essential methodology. Until recently, most agronomic models remained mainly driven by an economic optimization approach, focusing on the farm and its technical and economic characteristics. They generally present farm decision-making in terms of factors, barriers and motivations. To better understand changing dynamics, a need has been identified for integrated and dynamic modelling frameworks based on explicit and well-motivated cognitive and social behavioral theories. Complementarily, it has been shown that adaptation is better understood by focusing on the “how” instead of the “why”, and by considering the time dynamics of the farmer’s relationships to others. The modelling of decision-making has grown from reviews of more than 10 years ago to more recent ones. Insights from psychology have been increasingly integrated into the models in order to understand how farmers decide, innovate and change. Among the various models of farmer decision-making, agent-based models have particularly developed in the last few years. Agent-based models integrate cognitive and social behavioral theories, as well as farmers’ relations with other people and the environment. Moreover, studying the evolution of a population of interdependent farmers is particularly appropriate for designing and testing different policy options favoring innovation or adaptation.
to provide a means of representing a decision-making process. The approach is demonstrated by reference to an example model. The aim is to illustrate how the two packages can be linked successfully. Albeit that the example presented is fairly simple, the approach provides the basis for further work on more complex problems. The paper starts with a brief review of previous work in which simulations and expert systems have been linked. After this, the example simulation model is described and the representation of the decision-making process in XpertRule is explained. An overview of the methodology for linking Witness and XpertRule is given, as well as a detailed description of the Visual Basic code required to link the two software packages. A full listing of this code is provided in the appendix.
acteristics of CEO candidates in LBO and VC transactions. Ang, de Jong, and Van der Poel (2008) as well as Huang (2010) show that CEOs are more likely to divest divisions they are less familiar with. Xuan (2009), analyzing the career paths across companies’ divisions, and Malmendier, Tate, and Yan (2011), looking at Depression experience or military experience, show that past experience affects corporate decisions. Our findings are complementary. We show that industry-specific experience in general, and bargaining power in particular, are important determinants of corporate performance and corporate decision making. Our findings also speak to the current debate on general vs. specialist CEOs. Lazear (2002, 2004), Murphy and Zabojnik (2007), and Frydman (2007) document an increased importance of general skills. Cremers and Grinstein (2011), however, show that managerial talent pools are industry-specific, suggesting that industry-specific experience is important to firms. Industry-specific bargaining ability can be interpreted as one dimension of industry-specific skills. Our results are consistent with both views. We show that industry experience affects corporate performance. However, we do not find evidence that industry experience creates higher value in acquisitions; it directly affects how the surplus is split between the bidder and the target.
Abstract. The main purpose of this paper is the presentation of a new concept for modelling the human decision-making process using an analogy with Automatic Control Theory. From the authors’ point of view, this concept allows one to develop and improve the theory of decision-making through the study and classification of the specificity of human intellectual processes under different conditions. The literature shows that the main distinguishing feature between the Heuristic/Intuitive and Rational Decision-Making Models is the presence of the so-called phenomenon of “enrichment” of the input information with human propensities, hobbies, tendencies, expectations, axioms and judgments, presumptions or bias and their justification. In order to obtain additional knowledge about the basic intellectual processes, as well as the possibility of modelling the decision results under various parameters characterizing the decision-maker, a complex of simulation models was developed. These models are based on the assumptions that: the basic intellectual processes of the Rational Decision-Making Model can be adequately simulated and identified by the transient processes of the proportional-integral-derivative controller; and the basic intellectual processes of the Bounded Rationality and Intuitive Models can be adequately simulated and identified by the transient processes of nonlinear elements. A taxonomy of the most typical automatic control theory elements and their correspondence to certain decision-making models, from the point of view of decision-making process specificity and decision-maker behavior during a certain period of professional activity, was obtained.
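The paper’s central analogy, that the Rational Decision-Making Model can be identified with the transient response of a PID controller, can be illustrated with a minimal simulation. The first-order plant, gains and time constants below are illustrative assumptions chosen only to show a typical transient settling onto a target “decision” level.

```python
# Hedged sketch: PID controller transient on a first-order plant,
# standing in for the "rational" decision-making transient. All gains
# and plant constants are illustrative, not from the paper.
kp, ki, kd = 2.0, 1.0, 0.1    # proportional, integral, derivative gains
dt, setpoint = 0.05, 1.0      # time step (s) and target level

y, integral, prev_err = 0.0, 0.0, setpoint
for _ in range(400):          # 20 s of simulated deliberation
    err = setpoint - y
    integral += err * dt
    derivative = (err - prev_err) / dt
    u = kp * err + ki * integral + kd * derivative  # PID control signal
    y += dt * (-y + u)        # first-order plant: dy/dt = -y + u
    prev_err = err
```

The transient (initial error, overshoot or damped approach, settling) is what the paper maps onto the phases of a rational decision process; changing the gains changes the “decision style” of the simulated agent.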
To generate ERP waveforms, matched stimulus-locked (-200 to 2000 ms relative to the onset of coherent motion) and response-locked (-1000 to 100 ms relative to button press) epochs were extracted from the continuous data. All epochs were baselined to the average over a 200 ms period preceding the onset of coherent motion. Epochs were then separated into ‘easy’ and ‘hard’ conditions. Since we assume equivalent decision processes for trials with upward and downward motion, we collapsed trials over motion direction. However, since trials with upward and downward motion were associated with right and left-hand responses, leading to stronger changes in the left and right hemisphere respectively, simply averaging over both motion directions would distort the lateralisation of motor processes. Therefore, the topography of all trials with a right-hand response (correct ‘up’ trials and incorrect ‘down’ trials) was mirrored along the midline, so that all contralateral activity was projected onto the right hemisphere (i.e. activity recorded in electrodes on the left hemisphere was now associated with electrodes on the right hemisphere). Finally, in line with previously suggested procedures (Kelly & O’Connell, 2013; O’Connell et al., 2012), the data were converted to current source density (CSD) estimates to increase spatial selectivity. The CSD transformation was applied using the CSD toolbox, which uses a spherical spline algorithm, with the spline interpolation constant m set to its default value (m = 4; Kayser & Tenke, 2006).
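The epoching and baselining steps described above can be sketched with plain array slicing: cut stimulus-locked windows (-200 to 2000 ms) from a continuous recording and subtract the mean of the 200 ms pre-stimulus baseline. The sampling rate, onset times and synthetic signal are placeholders; the CSD step would require a dedicated toolbox and is omitted.

```python
import numpy as np

# Hedged sketch of stimulus-locked epoch extraction and baselining.
fs = 500                                    # samples per second (assumed)
continuous = np.random.default_rng(0).normal(size=60 * fs)  # 60 s of "EEG"
onsets = [10 * fs, 25 * fs, 40 * fs]        # coherent-motion onset samples

pre, post = int(0.2 * fs), int(2.0 * fs)    # 200 ms before, 2000 ms after
epochs = []
for onset in onsets:
    epoch = continuous[onset - pre:onset + post].copy()
    epoch -= epoch[:pre].mean()             # baseline to pre-stimulus mean
    epochs.append(epoch)
epochs = np.stack(epochs)                   # trials x samples
```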
The objective of this study is to present a novel tool for predictive modelling of urban growth. The proposed tool, named iCity (Irregular City), extends the traditional formalization of cellular automata (CA) to include an irregular spatial structure, asynchronous urban growth, and a high spatio-temporal resolution to aid spatial decision making for urban planning. The iCity software tool was developed as an embedded model within a common desktop geographic information system (GIS) with a user-friendly interface to control modelling operations for urban land-use change. This approach allows the model developer to focus on implementing model logic rather than developing an entire stand-alone modelling application. It also provides the model user with a familiar environment in which to run the model to simulate urban growth. © 2006 Elsevier Ltd. All rights reserved.
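The key departure from a classical CA, an irregular spatial structure, can be sketched by giving each cell an explicit neighbour list instead of a regular grid. The transition rule and parcel data below are hypothetical illustrations, not iCity’s actual model logic.

```python
# Hedged sketch: a CA over irregular land parcels, where neighbourhoods
# are arbitrary adjacency lists rather than a fixed grid stencil.
neighbours = {                 # parcel id -> adjacent parcel ids (assumed)
    0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4],
    3: [1, 4], 4: [2, 3],
}
state = {0: 1, 1: 0, 2: 1, 3: 0, 4: 0}   # 1 = developed, 0 = undeveloped

def step(state, threshold=2):
    """Develop a parcel when enough of its neighbours are developed."""
    new = dict(state)
    for cell, nbrs in neighbours.items():
        if state[cell] == 0 and sum(state[n] for n in nbrs) >= threshold:
            new[cell] = 1
    return new

state = step(state)   # parcel 1 develops (neighbours 0 and 2 are developed)
```

In a grid CA the neighbour lists would be implicit in the array indexing; making them explicit is what lets cells be arbitrary cadastral polygons.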
Modelling prices of well-known crude oils (e.g., Brent, WTI and Tapis) has been the focal point of most analyses in this area. In fact, out of 16 different papers that we have reviewed, only three covered crude oils other than the benchmark types. Bacon and Tordo (2005) modelled price differentials of 56 different crude oils using a pooled cross-section time series. They examined the relationship between crude oil prices and quality features, such as API, sulfur and total acid number (TAN). They concluded that each quality feature impacts the price differentials of different crude oils. For example, a one-unit increase in API raises the price of a crude oil by $0.007/barrel when compared to the Brent crude (Bacon and Tordo, 2005). Yousefi and Wirjanto (2004) examined the empirical role of the exchange rate in crude oil price formation by OPEC members and covered crude oils produced by all members except Iraq, Kuwait and Venezuela. They concluded that there is a degree of rivalry among OPEC members in order to obtain more market power. Their results confirmed the idea that OPEC has no unified price and suggested a partial market-sharing model (Yousefi and Wirjanto, 2004).
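The kind of quality-differential regression attributed to Bacon and Tordo (2005) can be sketched with ordinary least squares. The data below are synthetic, generated so that the true API slope equals the quoted $0.007/barrel; only the estimation method is illustrated, not the paper’s actual pooled cross-section time-series model.

```python
import numpy as np

# Hedged sketch: regress the price differential to Brent on API gravity.
rng = np.random.default_rng(0)
api = rng.uniform(20, 45, size=200)                   # API gravity of each crude
diff = 0.007 * api - 0.5 + rng.normal(0, 0.01, 200)   # synthetic differential ($/bbl)

X = np.column_stack([api, np.ones_like(api)])         # design matrix with intercept
slope, intercept = np.linalg.lstsq(X, diff, rcond=None)[0]
# slope recovers the ~$0.007/barrel-per-API-unit effect quoted in the text
```

The full study also includes sulfur and TAN as regressors; adding them is just extra columns in `X`.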
Rouse and Morris define the mental model as a psychological mechanism used to describe system objectives and forms, to interpret system functions, to observe system status, and to predict the future state of the system. Johnson-Laird argues that the mental model allows people to speculate about and understand current real events and to act accordingly. Endsley argues that the mental model can be used to guide the formation of situational awareness and decision making, completing the process from information acquisition to action. Current mental model theory includes Simon’s proposed Physical Symbol System Hypothesis (PSSH), the Norman model, the SOAR model, and the social model of the mind.
followed, the volume of the reservoir increased more than five times and the surface area almost doubled (Table 1). Unfortunately, there is no complete hypsographic curve for the reservoir. The main interest so far has been in the volume curve for power regulation, which covers only the interval between minimum and maximum pool elevation, i.e. the upper 30 m of the reservoir. However, an approximate hypsographic curve (Fig. 3) based on the existing volume curve has been constructed using GPS-based echo sounding carried out in the summers of 1999 and 2000. The maximum depth is 92 m (SMHI, 1996), and the mean depth, calculated from the hypsographic curve, is approximately 30 m.
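Deriving a mean depth from a hypsographic (depth-area) curve, as described above, amounts to integrating area over depth to get volume and dividing by the surface area. The depth-area pairs below are illustrative placeholders, loosely shaped to give a mean depth near the reported ~30 m for a 92 m deep reservoir; they are not the actual sounding data.

```python
import numpy as np

# Hedged sketch: mean depth from an approximate hypsographic curve.
depth = np.array([0, 10, 20, 30, 45, 60, 75, 92])          # m below surface
area = np.array([100, 75, 55, 40, 25, 13, 4, 0.0]) * 1e6   # m^2 at each depth

# Trapezoidal integration of area over depth gives the volume (m^3).
volume = np.sum((area[:-1] + area[1:]) / 2 * np.diff(depth))
mean_depth = volume / area[0]    # volume divided by surface area (m)
```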