the forecast errors [10]. In fact, since the latter cannot be avoided, it is important to increase the understanding of how they may occur [17]. This understanding can then facilitate their proper modelling, which is still an open challenge for the hydrological community. In fact, many studies are devoted to this problem in various hydrometeorological and hydroclimatic contexts (e.g. [12, 13, 17]). Herein, we examine the error **evolution** in multi-step ahead **forecasting** with an emphasis on monthly streamflow processes. Our aim is to create a representative image of the underlying phenomena; thus, we compare an adequate number of **forecasting** methods on a large number of simulated time series, while we also present a comparative case study using monthly streamflow data to illustrate important points. The novelty of our study is that we examine the errors at each time step of the forecast horizon themselves, and not their summary as provided by commonly used metrics for the assessment of multi-step ahead forecasts (e.g. Nash-Sutcliffe efficiency, RMSE, MAPE).
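The core idea above, inspecting the error at each step of the horizon instead of one summary score, can be sketched with a toy simulation. This is a hedged illustration only: a synthetic AR(1) series stands in for monthly streamflow, and `phi`, the horizon `H` and the sample size are illustrative choices, not the study's settings.

```python
import math
import random

random.seed(0)

# Synthetic AR(1) series as a stand-in for a streamflow process
# (phi, n and H are illustrative assumptions, not the study's values).
phi, n, H = 0.6, 600, 6
x = [0.0]
for _ in range(n - 1):
    x.append(phi * x[-1] + random.gauss(0.0, 1.0))

errors = {h: [] for h in range(1, H + 1)}
for t in range(100, n - H):                  # many forecast origins
    for h in range(1, H + 1):
        fc = (phi ** h) * x[t]               # AR(1) h-step-ahead forecast
        errors[h].append(x[t + h] - fc)

# RMSE per lead time: it typically grows with h toward the series' own spread,
# which is the per-step picture a single aggregate score would hide.
rmse_by_step = {h: math.sqrt(sum(e * e for e in errs) / len(errs))
                for h, errs in errors.items()}
```

Printing `rmse_by_step` shows the error growth profile across the horizon, which is exactly the object a single Nash-Sutcliffe or RMSE value summarizes away.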


Time-**evolution** and, hence, **forecasting** of the growth profiles of business-centric technoeconomics are ascertained. As an example, the vast telecommunication (telco)-specific business is considered as a complex enterprise depicting a cyberspace of digital ecology (DE) with a backbone network that supports a host of information sources and destinations facilitating a variety of triple (voice, data and video) services. To specify the temporal trend of **evolution** of telco economics in a series format, the approach pursued here (which differs from traditional series analyses) takes into account only a selective (and justifiable) set of autoregressive integrated moving average (ARIMA) parameters consistent with the test data. However, this simplified approach yields sufficiently accurate time series (depicting the business growth) extendable to **forecasting** regimes. The efficacy of the proposed method is determined via goodness-of-fit evaluations in both the time and frequency domains. The data adopted in the computations conform to the typical telco service industry.
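The "selective ARIMA parameters" idea can be illustrated with a minimal sketch: difference the growth series once (the integration step) and estimate a single AR coefficient on the increments by least squares. The series below is synthetic growth data, not real telco figures, and the simplification to one AR term is an assumption for illustration.

```python
# Synthetic growth series (illustrative, not real telco data).
series = [10.0, 10.8, 11.9, 12.7, 13.9, 14.6, 15.8, 16.9, 17.7, 18.9]
diffs = [b - a for a, b in zip(series, series[1:])]   # first difference ("I" in ARIMA)
mean_d = sum(diffs) / len(diffs)                      # average growth per step
centered = [d - mean_d for d in diffs]

# Least-squares estimate of a single lag-1 AR coefficient on the increments.
num = sum(a * b for a, b in zip(centered, centered[1:]))
den = sum(a * a for a in centered[:-1])
phi = num / den if den else 0.0

# One-step-ahead forecast: last level + mean growth + AR term on the increment.
forecast = series[-1] + mean_d + phi * centered[-1]
```

A fuller ARIMA fit would select orders and estimate by maximum likelihood; this sketch only shows how a deliberately small parameter set can still extend the series into a forecasting regime.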


The localization of sensor nodes is an essential problem for many economic **forecasting** applications in wireless sensor networks. Considering that mobile sensors change their locations frequently over time, the Monte Carlo localization algorithm utilizes the moving characteristics of nodes and employs the probability distribution function (PDF) from the previous time slot to estimate the current location using a weighted particle filter. However, it also suffers from an insufficient number of valid samples, which further affects the node's localization accuracy. In this paper, a differential **evolution** method is introduced into the Monte Carlo localization algorithm. The sample weight is taken as the objective function, and the differential **evolution** algorithm is applied in the sampling stage. Finally, the node position is estimated by moving samples close to the actual location of the node instead of filtering them out. The simulation results
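The DE-in-sampling step described above can be sketched as follows. Everything here is an illustrative assumption rather than the paper's actual model: a Gaussian weight around a known "true" position stands in for the particle weight, and the DE parameters are common defaults.

```python
import math
import random

random.seed(1)

# Illustrative assumption: a Gaussian weight around a known true position
# stands in for the particle-filter sample weight.
TRUE_POS = (5.0, 5.0)

def weight(p):
    dx, dy = p[0] - TRUE_POS[0], p[1] - TRUE_POS[1]
    return math.exp(-(dx * dx + dy * dy))

# Location samples scattered over a 10 x 10 deployment area.
pop = [(random.uniform(0.0, 10.0), random.uniform(0.0, 10.0)) for _ in range(20)]
F, CR = 0.5, 0.9  # DE mutation factor and crossover rate (common defaults)
for _ in range(50):
    next_pop = []
    for i, target in enumerate(pop):
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = tuple(a[k] + F * (b[k] - c[k]) for k in range(2))
        trial = tuple(mutant[k] if random.random() < CR else target[k]
                      for k in range(2))
        # Greedy selection: keep whichever sample has the larger weight,
        # pulling low-weight samples toward high-weight regions
        # instead of filtering them out.
        next_pop.append(trial if weight(trial) >= weight(target) else target)
    pop = next_pop

best = max(pop, key=weight)
```

After a few dozen generations the population concentrates near the high-weight region, which is the effect the paper exploits to keep samples valid.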

Abstract: Multi-step ahead streamflow **forecasting** is of practical interest for the operation of hydropower reservoirs. We provide generalized results on the error **evolution** in multi-step ahead **forecasting** by conducting several large-scale experiments based on simulations. We also present a multiple-case study using monthly time series of streamflow. Our findings suggest that some **forecasting** methods are more useful than others. However, the errors computed at each time step of a forecast horizon within a specific case study strongly depend on the case examined and can be either small or large, regardless of the **forecasting** method used and the time step of interest.


Abstract. We present an experiment on fifty multilayer perceptrons trained for streamflow **forecasting** on three watersheds using bootstrapped input series. This type of neural network is common in hydrology, and using multiple training repetitions (ensembling) is a popular practice: the information issued by the ensemble is then aggregated and considered to be the final output. Some authors have proposed that the ensemble could serve for the calculation of confidence intervals around the ensemble mean. In the following, we are interested in the reliability of confidence intervals obtained in such a fashion and in tracking the **evolution** of the ensemble of neural networks during the training process. For each iteration of this process, the mean of the ensemble is computed along with various confidence intervals. The performance of the ensemble mean is evaluated based on the mean absolute error. Since the ensemble of neural networks resembles an ensemble streamflow forecast, we also use ensemble-specific quality assessment tools such as the Continuous Ranked Probability Score to quantify the **forecasting** performance of the ensemble formed by the neural network repetitions. We show that while the performance of the single predictor formed by the ensemble mean improves throughout the training process, the reliability of the associated confidence intervals starts to decrease shortly after the initiation of this process. While there is no moment during the training where the reliability of the confidence intervals is perfect, we show that it is best after approximately 5 to 10 iterations, depending on the basin. We also show that the Continuous Ranked Probability Score and the logarithmic score do not evolve in the same fashion during the training, due to a particularity of the logarithmic score.
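The Continuous Ranked Probability Score mentioned above has a standard empirical form for an m-member ensemble that is easy to sketch. The numbers below are toy values, not outputs of the trained networks.

```python
# Empirical CRPS for an ensemble forecast:
#   CRPS = mean_i |x_i - y|  -  0.5 * mean_{i,j} |x_i - x_j|,
# where x_i are ensemble members and y the observation (lower is better).
def crps_ensemble(members, obs):
    m = len(members)
    spread_to_obs = sum(abs(x - obs) for x in members) / m
    spread_within = sum(abs(a - b) for a in members for b in members) / (m * m)
    return spread_to_obs - 0.5 * spread_within

# Toy values standing in for streamflow forecasts from 5 network repetitions.
ensemble = [9.5, 10.2, 10.8, 11.1, 9.9]
score = crps_ensemble(ensemble, 10.0)
```

For a degenerate ensemble whose members are all identical, the second term vanishes and the CRPS reduces to the absolute error, which is why it generalizes deterministic scores to probabilistic forecasts.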


Forecasts for the McIntosh static (red filled squares) and **evolution**-dependent (blue open circles) methods can be directly compared here, as both are applied to the same testing time period and so have the same climatology. For the static case, the majority of points lie within the shaded area, which can contribute positively to the BSS. However, while three points lie on the line of perfect reliability (i.e., y = x), most are found below this line, indicating the method is over-**forecasting** (i.e., the values of forecast probabilities are too high relative to the observed frequency of events for that forecast bin). It is interesting to note that the **evolution**-dependent case also appears to be over-**forecasting**, but in a more consistent manner (i.e., linearly biased from perfect reliability) than the static case. Notably, the static method achieves a worse (and negative) BSS compared to the **evolution**-dependent method, which is reflected in the reliability diagrams by a more significant deviation of data points from the y = x line and their relatively larger occurrence frequencies (e.g., for the static case, p = 0.6–0.7 is the greatest outlier while being the third-most populated bin).
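The Brier Skill Score underlying this comparison can be sketched directly. The forecasts and outcomes below are illustrative, not the McIntosh-method data.

```python
# Brier score and Brier Skill Score:
#   BS  = mean_i (p_i - o_i)^2   over probability forecasts p_i, outcomes o_i,
#   BSS = 1 - BS / BS_climatology,
# where climatology forecasts the observed base rate every time.
def brier(probs, outcomes):
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

outcomes = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]            # illustrative event record
base_rate = sum(outcomes) / len(outcomes)            # climatological frequency
bs_clim = brier([base_rate] * len(outcomes), outcomes)
bs_fcst = brier([0.8, 0.2, 0.1, 0.7, 0.3, 0.1, 0.2, 0.9, 0.1, 0.2], outcomes)
bss = 1.0 - bs_fcst / bs_clim                        # > 0: skill over climatology
```

A method that systematically over-forecasts (probabilities too high for the observed frequencies, as described above) inflates `bs_fcst` and can push the BSS negative, i.e. below the climatological reference.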


ABSTRACT: Power price **forecasting** is a significant part of the smart grid because it makes the smart grid cost-efficient. The existing methods for price **forecasting** may be difficult to apply to the huge volumes of price data in the grid, since the redundancy arising from feature selection cannot be averted. To solve this problem, a novel electricity price **forecasting** model is proposed, in which Hybrid feature Selection, Feature Extraction and Classification (HSEC) are integrated into a single framework design. In this novel model, first, a Grey Correlation Analysis (GCA) based Hybrid Feature Selector (HFS), combining the Relief-F algorithm and Random Forest (RF), is designed to calculate feature importance and control the feature selection. For feature extraction, Kernel Principal Component Analysis is used to further reduce the redundancy among the selected features. Finally, a differential **evolution** (DE) based Support Vector Machine (SVM) classifier is designed to forecast the price accurately.
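One ingredient of the HFS stage above, a Relief-style feature scorer, can be sketched in a few lines: features that separate the classes earn higher weights. The data are synthetic (feature 0 informative, feature 1 pure noise), and this minimal variant omits the sampling and scaling details of the full Relief-F algorithm.

```python
import math
import random

random.seed(2)

# Synthetic two-feature data: feature 0 has class means 0 and 2 (informative),
# feature 1 is pure noise (uninformative).
data = [([random.gauss(2.0 * c, 0.5), random.gauss(0.0, 1.0)], c)
        for c in (0, 1) for _ in range(30)]

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

weights = [0.0, 0.0]
for x, y in data:
    others = [(dist(x, x2), x2, y2) for x2, y2 in data if x2 is not x]
    nearest_hit = min((o for o in others if o[2] == y), key=lambda o: o[0])[1]
    nearest_miss = min((o for o in others if o[2] != y), key=lambda o: o[0])[1]
    for f in range(2):
        # Reward features that differ across classes,
        # penalize within-class spread.
        weights[f] += abs(x[f] - nearest_miss[f]) - abs(x[f] - nearest_hit[f])
```

In the HSEC pipeline such scores would be combined with Random Forest importances under the GCA scheme before KPCA extraction; this sketch only shows why the informative feature rises to the top.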

As wind power is a mature and important renewable energy, wind power capacity **forecasting** plays an important role in renewable energy generation planning, investment and operation. The combined model is an effective load **forecasting** method; however, how to determine the weights is a hot issue. This paper proposes a combined model with differential **evolution** optimizing the weights. The proposed model can improve on the performance of each single **forecasting** model: regression, BPNN and SVM. In order to prove the effectiveness of the proposed model, an application to China's wind power capacity from 2000 to 2010 was evaluated. The experimental results show that the proposed model achieves a maximum mean absolute percentage error (MAPE) of 1.791%, which is better than the results of regression, BPNN and SVM.
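The combination idea above can be sketched as differential evolution searching for non-negative weights (normalized to sum to 1) that minimize the MAPE of the combined forecast. The three "model" series are illustrative stand-ins for the paper's regression, BPNN and SVM outputs, not its data.

```python
import random

random.seed(3)

# Illustrative actual values and three stand-in model forecasts.
actual = [12.0, 13.5, 15.1, 16.0, 18.2, 19.9]
models = [
    [11.0, 13.0, 14.0, 15.0, 17.0, 19.0],   # stand-in for regression
    [13.0, 14.0, 16.0, 17.0, 19.0, 21.0],   # stand-in for BPNN
    [12.5, 13.0, 15.5, 16.5, 18.0, 20.5],   # stand-in for SVM
]

def mape(w):
    s = sum(w)
    w = [wi / s for wi in w]                 # normalize weights to sum to 1
    combo = [sum(wi * m[t] for wi, m in zip(w, models))
             for t in range(len(actual))]
    return 100.0 * sum(abs((a - c) / a)
                       for a, c in zip(actual, combo)) / len(actual)

pop = [[random.random() for _ in range(3)] for _ in range(15)]
F, CR = 0.6, 0.9  # DE mutation factor and crossover rate
for _ in range(80):
    for i in range(len(pop)):
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = [max(1e-6, a[k] + F * (b[k] - c[k])) for k in range(3)]
        trial = [mutant[k] if random.random() < CR else pop[i][k]
                 for k in range(3)]
        if mape(trial) <= mape(pop[i]):      # greedy selection on MAPE
            pop[i] = trial

best_weights = min(pop, key=mape)
```

Because one stand-in model under-forecasts and another over-forecasts, a weighted combination can cancel their biases and reach a far lower MAPE than any single model, which is the motivation for optimizing the weights at all.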

This model is applied at an aggregate level and variables included are chosen to capture the inter-itinerary competition dynamic along three dimensions: time of the day, carrier and level of service. Results obtained suggest that itineraries sharing the same time show a moderate level of competition while those sharing time and carrier or level of service show a high level of competition (Loris Belcastro et al., 2014). The neural networks enhanced **forecasting** accuracy and went beyond the capabilities of the more conventional statistical analysis used at the time. A hybrid model of neural network and statistical analysis was developed in order to forecast air traffic flow at fixes on a 30-min aggregation level within China’s air traffic network. For this study, the decision variables were a combination of information provided by radar data and historical airline flight schedule data (Fucheng Qiu and Yi Li, 2014). Support Vector Machine (SVM) techniques are used to develop a model that improves on the simple time series approach to air traffic **forecasting**. This technique has several advantages over traditional econometric models proving that potential benefits can be obtained when applying data mining techniques in air traffic **forecasting** (Yi Cao et al., 2013). The modern approach uses complex network theory quantitative parameters as explanatory variables in the input dataset, and trains logistic regressions and neural networks to predict the likelihood of previously un-connected airport-pairs being connected in the future, and the likelihood of connected airport-pairs becoming un-connected. The main objective of this study was to improve on the FAA TAF assumption of a static routing network, which was done by adding an initial step that models US network **evolution**. The accuracy of the results was between 20% and 40%, leaving room for improvement. 
In addition, the work done did not improve on the current FAA methodology for **forecasting** air traffic levels on existing routes (Jiang Chunshui and Yu Haiyang, 2012).

**Forecasting**: A problem arising from time series analysis is to forecast (medium/long term) or to nowcast (short term: 1 or 3 hours) the system's **evolution**. Prediction of photochemical smog is an example of complex data modelling, because the processes involved are detected by measuring, at only a few ground sites, chemical indexes which depend on partially known chemical mechanisms, on poorly understood emission fields and on uncertain turbulent mixing and transport phenomena. The data sets used in this work (Liguori, 1996) consist of hourly mean concentrations of air pollutants and meteorological parameters recorded at different urban sites during 1995 in Mestre (Venice, Italy - Figure 1). The monitoring network is described in Table 1 and included meteorological parameters from a private monitoring network (Ente Zona Industriale di Porto Marghera), data from the air-quality network of the Venice Municipality, and data on vehicle flow rates (Liguori, 1996). The large database of hourly time series (the shortest one with 7000 values) allowed a preliminary broad statistical analysis. The ANNs implemented were selected trying to achieve both modelling efficiency and architectural simplicity. The Pearson correlation index together with other simple statistical tests was used (Devore, 1990) as a quick screening criterion of network performance and, only for the best results, more accurate statistical analyses were performed: systematic and unsystematic mean square error (Devore, 1990); Willmott indexes of agreement (Willmott, 1982); probability of detection, missing rate, false alarm.
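Among the screening statistics listed above, the Willmott (1982) index of agreement has a compact closed form that is easy to sketch. The concentration values below are toy numbers, not the Mestre records.

```python
# Willmott index of agreement:
#   d = 1 - sum_i (P_i - O_i)^2 / sum_i (|P_i - Obar| + |O_i - Obar|)^2,
# where P are predictions, O observations, Obar the observed mean;
# d = 1 means perfect agreement, values near 0 mean poor agreement.
def willmott_d(pred, obs):
    obar = sum(obs) / len(obs)
    num = sum((p - o) ** 2 for p, o in zip(pred, obs))
    den = sum((abs(p - obar) + abs(o - obar)) ** 2 for p, o in zip(pred, obs))
    return 1.0 - num / den

obs = [30.0, 45.0, 60.0, 52.0, 38.0]     # e.g. hourly pollutant concentrations (toy)
pred = [32.0, 43.0, 57.0, 55.0, 40.0]
d = willmott_d(pred, obs)
```

Unlike the Pearson correlation, the index is sensitive to additive and proportional bias, which is why it complements correlation-based screening of network performance.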

Abstract: The rapidly increasing use of renewable energy resources in power generation systems in recent years has accentuated the need to find an optimum and efficient scheme for **forecasting** meteorological parameters, such as solar radiation, temperature, wind speed, and sun exposure. Integrating wind power prediction systems into electrical grids has had a powerful economic impact, along with the supply and demand balance of the power generation scheme. Academic interest in formulating accurate **forecasting** models of the energy yields of solar energy systems has significantly increased around the world. This significant rise has contributed to the increase in the share of solar power, which is evident from the power grids set up in Germany (5 GW) and Bavaria. The Spanish government has also taken initiative measures to develop the use of renewable energy, by providing incentives for accurate day-ahead **forecasting**. **Forecasting** solar power outputs aids the critical components of the energy market, such as the management, scheduling, and decision making related to the distribution of the generated power. In the current study, a mathematical **forecasting** model, optimized using the differential **evolution** and particle swarm optimization (DEPSO) technique, is proposed for the short-term photovoltaic (PV) power output **forecasting** of the PV system located at Deakin University (Victoria, Australia). A hybrid self-energized datalogging system is utilized in this setup to monitor the PV data along with the local environmental parameters used in the proposed **forecasting** model. A comparison study is carried out evaluating the standard particle swarm optimization (PSO) and differential **evolution** (DE) against the proposed DEPSO under three different time horizons (1-h, 2-h, and 4-h).
Results of the 1-h time horizon show that the root mean square error (RMSE), mean relative error (MRE), mean absolute error (MAE), mean bias error (MBE), weekly mean error (WME), and variance of the prediction errors (VAR) of the DEPSO-based **forecasting** are 4.4%, 3.1%, 0.03, −1.63, 0.16, and 0.01, respectively. The results demonstrate that the proposed DEPSO approach is more efficient and accurate compared with the PSO and DE. Keywords: differential **evolution** and the particle swarm optimization; hybrid meta-heuristic approach; mean absolute error; mean bias error; mean relative error; root mean square error; variance of the prediction errors; weekly mean error
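The deterministic error metrics reported above (RMSE, MAE, MBE and mean relative error) can be sketched directly; the PV output values below are illustrative, not the study's measurements.

```python
import math

# Illustrative observed and forecast PV output (toy values, kW).
obs = [1.20, 1.35, 1.10, 0.95, 1.50]
pred = [1.15, 1.40, 1.05, 1.00, 1.45]

n = len(obs)
errors = [p - o for p, o in zip(pred, obs)]
rmse = math.sqrt(sum(e * e for e in errors) / n)   # penalizes large misses
mae = sum(abs(e) for e in errors) / n              # average miss magnitude
mbe = sum(errors) / n                              # sign shows over/under-prediction
mre = sum(abs(e) / o for e, o in zip(errors, obs)) / n   # relative (scale-free) error
```

Note the different roles: RMSE and MAE measure accuracy, while MBE isolates systematic bias and can be near zero even when individual errors are large.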


improved if the technology applied becomes more sophisticated. The introduction of rotary, powered cutting equipment was one of the truly major advances in dentistry [1]. The term rotary instruments in dentistry refers to a group of instruments that turn on an axis to perform work such as cutting, abrading, burnishing, finishing or polishing tooth structure. A handpiece is a device for holding rotating instruments, transmitting power to them and for positioning them intraorally. The dental handpiece of today is a sophisticated combination of precision parts moving in perfect synchronization at extremely high speed. In this review article we discuss the **evolution**, classification and mechanics,

Load demand prediction is important for electric power planning and must be assessed with a proper model. The power utility needs forecasts in order to supply energy to consumers without interruption. Neural network methods have been among the most popular research topics over the last decade. However, the use of the Kohonen Self-Organizing Map (SOM) has not been fully explored. This paper presents a **forecasting** method based on this type of unsupervised-learning neural network. The main purpose of this project is to investigate whether self-organizing map (SOM) neural networks can be used to forecast load demand. In this project, the SOM network is explored to understand the SOM technique. In addition, this project targets improving the accuracy of short-term load **forecasting** through the SOM neural network technique. This study focuses on testing the first eight hours of the day to be forecast in order to identify its common patterns in the historical database previously trained by the neural network. Weekday data were used as their patterns are similar. After training, the data were tested and then forecasted. Finally, the errors were compared. The MAPE errors are 0.322%, 0.128% and 0.64%, all below 3%. This shows that SOM can be used in load **forecasting**.
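The matching step described above can be sketched without a full SOM: compare the first eight hours of the day to historical daily load profiles and take the remaining hours of the closest profile as the forecast. A SOM would first cluster the profiles onto a trained map; this nearest-pattern lookup is a simplified stand-in, and all load values are illustrative.

```python
# Historical 12-hour daily load profiles (illustrative values, e.g. MW).
history = [
    [50, 48, 47, 49, 55, 65, 80, 95, 100, 98, 96, 97],   # weekday profile A
    [52, 50, 49, 50, 57, 68, 83, 97, 103, 100, 99, 98],  # weekday profile B
    [40, 39, 38, 39, 42, 48, 55, 60, 70, 72, 71, 70],    # low-demand profile
]
today_first8 = [51, 49, 48, 49, 56, 66, 81, 96]          # hours 0-7 of today

def sse(a, b):
    # Sum of squared differences between two partial profiles.
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Match today's first eight hours against the historical patterns...
best_profile = min(history, key=lambda prof: sse(prof[:8], today_first8))
# ...and use the rest of the closest pattern as the forecast.
forecast_rest = best_profile[8:]
```

The SOM's contribution on top of this is compression: instead of scanning every historical day, the lookup runs against a small trained map of prototype patterns.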

GATA-factors are a family of zinc finger proteins with two highly conserved zinc fingers and adjacent basic domains. This DNA-binding domain binds to (A/T)GATA(A/G) sequences, which gave the name to these transcription factors (reviewed in Molkentin, 2000). So far, no GATA factors have been reported from non-bilaterians. While other transcription factors often form large families with many subfamilies, only 6 GATA factors have been isolated from vertebrates and only 3 were identified in Drosophila. The six vertebrate GATA factors are subdivided into two classes, GATA 1-3 and 4-6. Especially the GATA factors 4-6 play a central role in specifying the mesendoderm and in the differentiation of endodermal and mesodermal tissues in invertebrates and vertebrates (Shoichet et al., 2000; reviewed in Patient and McGhee, 2002). In vertebrates they may act as master regulators of endoderm formation, under the control of Nodal signalling during early development (Fujikura et al., 2002; Aoki et al., 2002), while in later organogenesis they are regulated by BMP and FGF signalling (e.g. Rossi et al., 2001). Thus, in vertebrates, GATA factors 4-6 are under the control of signalling pathways that act in the early specification of endoderm and mesoderm. The best studied invertebrate model for the role of GATA factors is C. elegans, where different GATA factors act in several hierarchical cascades during the specification of the EMS cell, which gives rise to the endoderm and most of the mesoderm (Maduro et al., 2001; reviewed in Maduro and Rothman, 2002). This common role of homologues of the GATA 4, 5 and 6 factors in specification of the mesendoderm led Rodaway and Patient (2001) to propose that the mesendoderm is an ancient germ layer that was specified in the Urbilateria by the same set of genes, with GATA 5 homologs high up in the hierarchy.
This attractive hypothesis will certainly be tested by the analysis of more species, both among the Bilateria and also among the diploblastic non-bilaterians. The specificity of the different GATA factors is most certainly assured by the specific interaction with other transcription factors with overlapping but not identical expression domains (reviewed in Molkentin, 2000; Patient and McGhee, 2002). It will be most interesting to investigate to what extent this network of protein interactions has been conserved or modified throughout **evolution**.

Lee-Carter Method: The Lee and Carter model (named LC hereafter) is a demographic and statistical model that is used to project mortality rates (Lee and Carter, 1992). The method can be seen as a special case of a principal components method (Bozik and Bell, 1987; Bell and Monsell, 1991) with a single component. Extensions of this model have therefore been used by actuaries for multiple purposes. In addition, the LC model uses an autoregressive integrated moving average (ARIMA) process, in the special case of a random walk with drift (RWD), as its **forecasting** model.
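The random-walk-with-drift step used within the LC framework is simple enough to sketch: the time index is projected as k(t+h) = k(t) + h * drift, with the drift estimated as the mean of the first differences. The index values below are illustrative, not fitted Lee-Carter mortality indices.

```python
# Illustrative declining mortality index k_t over seven periods (toy values).
k = [10.0, 9.4, 8.9, 8.1, 7.6, 7.0, 6.3]

# Drift of the random walk: the mean first difference, which telescopes to
# (last - first) / (number of steps).
drift = (k[-1] - k[0]) / (len(k) - 1)

# RWD h-step-ahead projection: k_{t+h} = k_t + h * drift.
forecast = [k[-1] + h * drift for h in (1, 2, 3)]
```

Because the drift here is negative, the projected index keeps declining, which in the LC framework translates into continued mortality improvement.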

Several books have been written on time series analysis. They were based on theoretical aspects of time series analysis and are mainly concerned with mathematical theory. Other authors who made an immeasurable contribution to the time series analysis literature are Box and Jenkins (1970). Their book describes the approach of time series analysis, **forecasting** and control, and is based on a particular class of linear stochastic models.

Evaluation of predictions is an important step in any **forecasting** process. For point estimates this is a straightforward process that typically involves determining the Euclidean distance between the predicted and observed points. There is a vast literature on evaluation metrics for point **forecasting** models; for a review of the most popular methods see [1]. However, there are conspicuously fewer papers available that describe methods for evaluating density **forecasting** models. In fact, one must turn to the meteorological and financial literature to find any papers that focus on the evaluation of density forecasts with any degree of rigour. This is in spite of density forecast evaluation being a considerably more complex problem than point estimation. Diebold et al. [2] suggest that there might be three reasons for this neglect.
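One standard density-forecast diagnostic associated with Diebold et al. is the probability integral transform (PIT): if the forecast densities are correct, the values z = F(y) are uniform on [0, 1]. In the sketch below the forecast CDF is a standard normal and the outcomes are drawn from that same distribution, so the PIT values come out roughly uniform; all data are synthetic.

```python
import math
import random

random.seed(4)

# Standard normal CDF via the error function.
def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Synthetic outcomes drawn from the same distribution the forecast assumes,
# i.e. a deliberately well-calibrated density forecast.
outcomes = [random.gauss(0.0, 1.0) for _ in range(2000)]
pit = [norm_cdf(y) for y in outcomes]
mean_pit = sum(pit) / len(pit)   # close to 0.5 for a well-calibrated density
```

In practice one inspects a histogram of the PIT values: a U-shape signals under-dispersed forecast densities, a hump signals over-dispersed ones, in contrast to a single distance number for point forecasts.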


Abstract Achieving high accuracy in load **forecasting** requires the selection of ap- propriate **forecasting** models, able to capture the special characteristics of energy consumption time series. When hierarchies of load from different sources are con- sidered together, the complexity increases further; for example, when **forecasting** both at system and region level. Not only the model selection problem is expanded to multiple time series, but we also require aggregation consistency of the forecasts across levels. Although hierarchical forecast can address the aggregation consis- tency concerns, it does not resolve the model selection uncertainty. To address this we rely on Multiple Temporal Aggregation, which has been shown to mitigate the model selection problem for low frequency time series. We propose a modification for high frequency time series and combine conventional cross-sectional hierar- chical **forecasting** with multiple temporal aggregation. The effect of incorporating temporal aggregation in hierarchical **forecasting** is empirically assessed using a real data set from five bank branches, demonstrating superior accuracy, aggregation consistency and reliable automatic **forecasting**.
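The aggregation-consistency requirement discussed above can be sketched with the simplest reconciliation scheme, bottom-up: bottom-level (branch) forecasts are summed to produce the top-level (system) forecast, so the hierarchy is consistent by construction. The forecasts below are illustrative stand-ins for the bank-branch data.

```python
# Illustrative bottom-level forecasts for five branches (toy values).
branch_forecasts = {
    "branch_1": 120.0, "branch_2": 95.0, "branch_3": 150.0,
    "branch_4": 80.0, "branch_5": 105.0,
}

# Bottom-up reconciliation: the top level is defined as the sum of the
# bottom level, so aggregation consistency holds by construction.
total_forecast = sum(branch_forecasts.values())

# An independently produced top-level forecast would generally disagree,
# which is exactly the inconsistency that reconciliation removes.
independent_top = 540.0
inconsistency = independent_top - total_forecast
```

More sophisticated reconciliation schemes (and the temporal-aggregation combination proposed in the abstract) distribute this discrepancy across levels instead of discarding the top-level information outright.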


Abstract: With increasing dependence of agriculture, industry and day-to-day household comfort upon the continuity of electric supply from PHCN systems in Nigeria, the forecast of electrical demand has assumed a great importance. For electricity supply not to be a cog in the wheel of progress of the Nigerian economy, and perhaps a snag in the attainment of the Millennium Development Goals (MDGs), load **forecasting** must be performed to coordinate electricity demand and supply. This research work focuses on Nigerian electricity demand forecasts from 2013-2030 towards vision 2025, using time series analysis on past load demand.

kind of sensational enlightenment and explanatory power that the natural sciences had already provided? (Giddens, 1997:15). There is no getting around the fact that no unique truth, or unique way of approaching a problem, exists; furthermore, change is the only constant, thus we have to seek new opportunities that allow us to incorporate new methodologies and epistemological perspectives. We must scrutinize the most advanced scientific proposals of today, which are not necessarily grounded in the Social Sciences, but in the so-called hard sciences. While it is true that I consider debate and criticism positive for the generation of knowledge, I think that the debates that arise in the field of organizational studies should not be exclusionary. That is to say, given that each organization is unique and the phenomena that are studied within it have different characteristics, it is not possible to establish an exceptional pattern. For this reason, all of the debate between the positivist theory of the organization and its critics depends on the ontological situation of the organization and its relationship to society. At present, we are living in an age of transformations as great as that in which the interest in studying enterprises as a science arose, and in which the great classics such as Taylor and others developed their work; they wondered about the object of study, indicating different answers in which their theoretical efforts defined the very rationality of that fundamental component called the enterprise, which bore fruit temporarily. Organizations are characterized by their **evolution** and thus are dynamic. The theoretical architectures that are built to understand, comprehend or explain them must evolve, but it is also necessary to define new properties or characteristics in the object of study.
It is evident that the object of study is not the same, because it too is evolutionary and dynamic; without a doubt, new properties and new characteristics must have emerged that need to be defined. New questions arise. What efforts within current Organizational Studies are geared to the definition of new properties in the object of study? Is it possible to think that there only exists a crisis in the Kuhnian paradigms and not in their objects of study? The existing literature review on studies of the organization provides an overall picture that allows us to situate ourselves in the object of study.