This paper describes the method behind a prototype forecast component of the energy resource management control algorithm for STATCOMs with battery energy storage. The component must be computationally efficient and of minimal complexity because it is intended to forecast each load in a LV network. The forecast model comprises a basis structure selected from observed electricity demand data and an electricity demand difference forecasting component estimated by the autoregressive method. The resulting forecasting model had an R² of 0.65 and a standard error of 368.55 W. During validation of the model, discrepancies between the forecasted and observed electricity demand profiles were observed. To overcome these limitations, future work will involve more precise clustering of demand profiles according to additional temporal and environmental variables, enabling forecasts under a more diverse range of electricity demand profiles. The final forecasting model will be a core component of the firmware controlling STATCOMs with energy storage systems.
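The demand-difference autoregressive step described above can be sketched as follows. This is a minimal illustration only, assuming an AR(p) model fitted to first differences of the demand series by ordinary least squares; the function names and the lag order are illustrative, not from the paper.

```python
import numpy as np

def fit_ar_on_differences(demand, p=4):
    """Fit an AR(p) model to first differences of a demand series
    by ordinary least squares. All names here are illustrative."""
    d = np.diff(demand)                      # demand differences
    X = np.column_stack([d[i:len(d) - p + i] for i in range(p)])
    y = d[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def forecast_next(demand, coef):
    """One-step-ahead forecast: last observed level plus the
    predicted next difference."""
    p = len(coef)
    d = np.diff(demand)
    return demand[-1] + d[-p:] @ coef
```

Forecasting differences rather than levels keeps the regression stationary, which is the usual motivation for this decomposition.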

Clustering of numerical data can be considered the most important unsupervised learning problem. It forms the basis of many classification and modeling algorithms. The purpose of clustering is to determine the intrinsic structure in a large set of unlabeled data, producing groups whose members are similar in some way. To improve the forecasting accuracy of the SVM model, the FCM clustering technique is introduced to preprocess the thousands of training samples by clustering them into different groups according to their natural memberships, defined by the electricity price values rather than the time scale. The advantages are twofold: noise in each training subset is reduced after the data aggregation, while the SVM model retains its characteristics of empirical risk minimization and generalization ability. Therefore, in this thesis, a hybrid forecasting model has been developed through the conjunctive use of the FCM clustering algorithm and the SVM algorithm, in order to overcome the limitations of the individual models and achieve a high degree of prediction accuracy.
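The FCM preprocessing step can be illustrated with a minimal fuzzy c-means implementation on a 1-D price series. This is a generic sketch of the standard FCM updates, not the thesis's code; the fuzzifier value and iteration count are illustrative.

```python
import numpy as np

def fuzzy_c_means(x, n_clusters=2, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means on a 1-D array (e.g. electricity prices).
    m is the fuzzifier; returns cluster centres and the membership matrix."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)        # memberships sum to 1
    for _ in range(n_iter):
        w = u ** m
        # centres are membership-weighted means
        centres = (w * x[:, None]).sum(axis=0) / w.sum(axis=0)
        dist = np.abs(x[:, None] - centres[None, :]) + 1e-12
        # standard FCM membership update: u ~ dist^(-2/(m-1)), normalized
        u = 1.0 / (dist ** (2 / (m - 1)))
        u /= u.sum(axis=1, keepdims=True)
    return centres, u
```

Each resulting group would then be used to train its own SVM regressor, which is the hybrid structure the excerpt describes.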


Neural networks are a recently developed class of nonlinear models, used for time series and other types of data, based on principles derived from what is known about the structure of the brain (for a brief discussion of neural network modeling, please see Appendix III). Certain transformed inputs have been found to be fairly stable with respect to such types of variation, were found to have significant predictive power for short-term demand forecasts, and were used to form the basis of the neural network model. For the 5-minute forecasting outcome, which is of primary interest here, these inputs are the logarithmic changes in demand over the four 5-minute periods immediately prior to the period being predicted, and the five such changes leading up to and including the period occurring exactly one week before the time of the desired prediction.
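The input vector described above can be constructed as follows. This is a sketch under the stated definition (four recent 5-minute log changes plus five week-ago changes); the function name and indexing convention are illustrative.

```python
import numpy as np

STEPS_PER_WEEK = 7 * 24 * 12   # number of 5-minute periods in a week

def nn_inputs(demand, t):
    """Feature vector for predicting period t: the four most recent
    5-minute log changes, plus the five changes leading up to and
    including the same period one week earlier."""
    logs = np.log(demand)
    recent = logs[t - 4:t] - logs[t - 5:t - 1]    # 4 changes before t
    w = t - STEPS_PER_WEEK
    weekly = logs[w - 4:w + 1] - logs[w - 5:w]    # 5 changes incl. period w
    return np.concatenate([recent, weekly])
```

The nine resulting features would form the input layer of the neural network model.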


For short-term load forecasting, the Back-Propagation (BP) network is the most widely used. Due to its ability to approximate any continuous nonlinear function, the BP network has extraordinary mapping (forecasting) abilities. The BP network is a kind of multilayer feed-forward network whose transfer function is usually nonlinear, such as the sigmoid function. The typical BP network structure for short-term load forecasting is a three-layer network with the nonlinear sigmoid function as the transfer function [21]. Fully connected BP networks need more training time and are not adaptive enough to temperature changes, so some researchers have moved to non-fully connected BP models [22]. Although a fully connected ANN can capture the load characteristics, a non-fully connected ANN is more responsive to temperature changes, and results show that forecasting accuracy is significantly improved on days with abrupt temperature changes. There is also merit in combining several sub-ANNs to obtain better forecasting results, for example using recurrent high-order neural networks (RHONN) [23]. Due to its dynamic nature, the RHONN forecasting model can adapt quickly to changing conditions such as large load variations or changes of the daily load pattern [22]. A back-propagation network realizes a nonlinear mapping from its inputs to its outputs, so the selection of input variables for a load forecasting network is very important. In general, there are two selection methods: one based on experience and the other on statistical analysis, such as ARIMA modeling and correlation analysis.
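The three-layer BP structure with a sigmoid hidden layer can be sketched in a few dozen lines. This is a generic minimal implementation, not any cited paper's network; the layer sizes, initialization and learning rate are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ThreeLayerBP:
    """Minimal three-layer back-propagation network: sigmoid hidden
    layer, linear output, trained by gradient descent on MSE."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.5, (n_hidden, 1))
        self.b2 = np.zeros(1)

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)
        return self.h @ self.W2 + self.b2

    def train_step(self, X, y, lr=0.1):
        pred = self.forward(X)
        err = pred - y[:, None]                       # dLoss/dpred
        gW2 = self.h.T @ err / len(X)
        gb2 = err.mean(axis=0)
        dh = err @ self.W2.T * self.h * (1 - self.h)  # sigmoid derivative
        gW1 = X.T @ dh / len(X)
        gb1 = dh.mean(axis=0)
        self.W2 -= lr * gW2; self.b2 -= lr * gb2
        self.W1 -= lr * gW1; self.b1 -= lr * gb1
        return float((err ** 2).mean())
```

For load forecasting, the input columns would be the selected load/temperature variables discussed in the excerpt.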


Concerning the support vector machines, due to limitations of the theory and the software, we built 24 models in the experiment, one for each hour of the day. It is well known that the generalization performance of support vector machines depends on a good setting of the global parameters: C, ε and the kernel function. The problem of optimal parameter selection is further complicated by the fact that the SVM model complexity depends on all of these parameters. We therefore chose values of these parameters somewhat arbitrarily and tried several different configurations. The final setting was as follows. The parameter ε, which controls the width of the insensitive zone, was set to 0.1. The capacity coefficient C, which determines the trade-off between model complexity and the degree to which deviations larger than ε are tolerated in the optimization formulation, was set to 10. As the kernel we used radial basis functions with the parameter γ equal to 0.2. This function is by far the most popular choice of kernel type because of its localized and finite response across the entire range of the real x-axis.
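The two ingredients quoted above, the RBF kernel with γ = 0.2 and the ε-insensitive zone of width 0.1, can be written out directly (a sketch of the definitions only, not a full SVR solver):

```python
import numpy as np

def rbf_kernel(x1, x2, gamma=0.2):
    """RBF kernel K(x, x') = exp(-gamma * ||x - x'||^2), with the
    gamma value quoted in the text."""
    d = x1[:, None, :] - x2[None, :, :]
    return np.exp(-gamma * (d ** 2).sum(axis=-1))

def eps_insensitive_loss(residuals, eps=0.1):
    """Epsilon-insensitive loss: deviations inside the tube of
    half-width eps (0.1 in the text) cost nothing."""
    return np.maximum(np.abs(residuals) - eps, 0.0)
```

The capacity coefficient C = 10 would then weight this loss against the model-complexity (regularization) term in the SVR objective.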

In another study, a QR model was also used to forecast electricity demand [41]. The data used in that study were collected from 3639 households in Ireland at both aggregated and disaggregated levels. The proposed QR model was compared with three other benchmark methods. Other authors developed additive quantile regression models for forecasting both probabilistic load and electricity prices as part of the Global Energy Forecasting Competition of 2014 (GEFCom2014) [13]. A summary of the methods used in GEFCom2014 is given in [20]. The new methodology proposed in [13] ranked first in both tracks of the competition. The work of [13] is extended by Fasiolo et al. [11], who developed fast calibrated additive quantile regression models and, to implement them, a new R statistical package, "qgam". The same covariates used in [13] were also used in [11]; in both papers, variable selection techniques are not discussed. In another study, [18] used kernel support vector quantile regression and copula theory for short-term load probability density forecasting. Two criteria for evaluating the accuracy of the prediction intervals are proposed: the prediction interval normalized average width (PINAW) and the prediction interval coverage probability (PICP). Results from this study show that the Gaussian kernel gives more accurate forecasts than the linear and polynomial kernels. In a more recent study, [48] developed a Gaussian process quantile regression model for short-term load probability density forecasting. The authors argue that this modelling framework provides accurate point forecasts as well as probabilistic descriptions of the prediction intervals.
The present study discusses an application of partially linear additive quantile regression (PLAQR) models in forecasting hourly electricity demand during the peak period (i.e., from 18:00 to 20:00) in South Africa (SA). PLAQR models combine generalized additive models (GAMs), developed by [17], with quantile regression (QR) models [29,30]: the conditional quantile function comprises a linear parametric component and a nonparametric additive component [23]. Among the first to introduce partially linear models were Engle et al. [8], who analyzed the relationship between electricity usage and temperature. A two-step approach for estimating a PLAQR model is discussed in Hoshino [23] and applied to a real data set. In another study, [28] discussed an application of double-penalized quantile regression partially linear additive models; a simulation study and an application to a real data set were used to evaluate the developed models. In a related study, Bayesian partially linear additive quantile regression models are used in simulation studies by [24]. To summarize, the state of the art in modelling is to use hybrid models.
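The two interval-evaluation criteria mentioned in this literature, PICP and PINAW, follow directly from their definitions; a minimal sketch (the normalization by the observed range of y follows the usual convention):

```python
import numpy as np

def picp(y, lower, upper):
    """Prediction interval coverage probability: share of observations
    falling inside their prediction intervals."""
    return float(np.mean((y >= lower) & (y <= upper)))

def pinaw(y, lower, upper):
    """Prediction interval normalized average width: mean interval
    width scaled by the observed range of y."""
    return float(np.mean(upper - lower) / (y.max() - y.min()))
```

A good probabilistic forecast keeps PICP close to the nominal coverage while making PINAW as small as possible.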


In order to see if the differences observed between the two models are statistically significant (that is, if the errors come from different distributions), we did a Wilcoxon signed-rank test for paired samples on the samples of daily MAPEs of the univariate model and the best multivariate model. We chose this test because a previously done Shapiro-Wilk test pointed to no evidence that the samples are normally distributed. For a two-tailed test where the null hypothesis H0 is that the performance of the
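The testing procedure described above can be sketched as follows, assuming SciPy is available; the daily MAPE values here are made-up illustrative numbers, not from the study.

```python
import numpy as np
from scipy.stats import shapiro, wilcoxon

# Illustrative daily MAPE samples for the two models (made-up numbers).
mape_univariate   = np.array([2.1, 3.4, 2.8, 4.0, 3.1, 2.6, 3.7])
mape_multivariate = np.array([1.9, 3.0, 2.9, 3.5, 2.7, 2.4, 3.2])

# Shapiro-Wilk on the paired differences: a low p-value argues against
# normality, which motivates using the nonparametric test.
_, p_normal = shapiro(mape_univariate - mape_multivariate)

# Two-tailed Wilcoxon signed-rank test for paired samples; H0 is that
# the two models' daily MAPEs come from the same distribution.
stat, p_value = wilcoxon(mape_univariate, mape_multivariate,
                         alternative="two-sided")
```

Rejecting H0 at the chosen significance level would indicate that the multivariate model's improvement is not due to chance.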

It is a fact that every system is pervasively imprecise, uncertain and hard to model precisely. A flexible approach called soft computing has emerged to deal with such models effectively and efficiently, and it has been very widely used over the last few decades. Soft computing is an emerging approach that parallels the remarkable ability of the human mind to reason and learn in an environment of uncertainty and imprecision. It is fast emerging as a tool to help computer-based intelligent systems mimic the ability of the human mind to employ modes of reasoning that are approximate rather than exact. The basic theme of soft computing is that precision and certainty carry a cost, and that intelligent systems should exploit, wherever possible, the tolerance for imprecision and uncertainty. Soft computing constitutes a collection of disciplines which include fuzzy logic (FL), neural networks (NNs) and evolutionary algorithms (EAs) such as genetic algorithms (GAs). Natural intelligence is the product of millions of years of biological evolution, and simulating complex biological evolutionary processes may help us discover how evolution propels living systems toward higher levels of intelligence. One of the newer and relatively simple optimization approaches is the GA, which is based on the evolutionary principle of natural selection; perhaps one of its most attractive qualities is that it is a derivative-free optimization tool. Demand/load forecasting techniques have also been developed based on these soft computing/intelligent techniques, and knowledge-based expert systems have been utilized for this purpose as well.


A few problems arise in using purely theoretical models to judge the decisions of the Federal Reserve. First, there is a problem with the dependent variables of these models. In practice, the federal funds rate is a discrete variable that is very limited in sudden movement (recall the 1 percent threshold discussed earlier). Both the Taylor Rule and the Expectations Model assume the Federal Reserve will behave in a manner that allows it to make extreme changes to the interest rate until alignment with equilibrium is accomplished. However, we know this is not the case, because history has taught us that large changes are made over time in a smooth manner. As an example, consider the recession of 2008. The federal funds target rate was just below 5 percent at the beginning of the recession (December 2007); it was almost an entire year later (November 2008) that the target rate was down to 1 percent, later followed by the zero bound.


It is quite obvious that gas load forecasting can play an important role in the planning and operation of power systems. Since gas resources are usually located in areas remote from end-users, the time taken to transport gas to distribution zones and populated cities can sometimes exceed 48 hours. However, any possible variations in weather parameters, such as sudden temperature changes, can affect the gas consumption rate, which therefore needs to be known in advance. Gas demand load forecasting is not easy because the relationship between the weather parameters and the gas load is not linear, so it is hard to use statistical methods for this purpose. Analytical techniques could not deal with the kind of information that the experts had to deal with, which were

The important contribution of the seasonal component to the dynamics of electricity demand suggests that a proper description of (1) may improve forecast performance with respect to existing methods. In this paper we attempt to do so. Our method exploits principal components and regression to solve the dimensionality problem implied by N → ∞ and (1). We propose to extract from the data the latent factors that drive most of the dynamics, and we use them in a diffusion index (DI) forecast (see Stock and Watson, 2002). The DI model has been relatively successful at forecasting macroeconomic time series, and in this paper we demonstrate that the method also produces quite good short-run forecasts of high-frequency electricity demand.
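The two-step diffusion-index idea (after Stock and Watson, 2002) can be sketched as: extract principal-component factors from the predictor panel, then regress the target on those factors. The function name and factor count below are illustrative.

```python
import numpy as np

def diffusion_index_forecast(X, y, n_factors=3):
    """Two-step diffusion-index sketch: PCA factor extraction from the
    predictor panel X, then least-squares regression of y on the factors."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    factors = Xc @ vt[:n_factors].T          # latent factors
    A = np.column_stack([np.ones(len(y)), factors])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return factors, beta
```

With N large, the handful of factors replaces the full panel in the forecasting regression, which is exactly how the dimensionality problem is sidestepped.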


The data seem to have a linear positive trend; see Figure 1. This is corroborated by the traditional Phillips-Perron unit-root test [20], where the null hypothesis of a stochastic trend (unit root) is strongly rejected for all 24 individual series. Furthermore, the positive trend in the load is correlated with economic and demographic factors. Hence, the trend is expected to have a high positive correlation with potential Gross Domestic Product (GDP), which in the case of Brazil is known to be almost linear; see [21] for a discussion. All that said, we model the trend as a deterministic linear function of time. Most papers in the load forecasting literature take first-order differences of the load series without previously testing for unit roots; see [22] for example. This has a major drawback: when the trend is deterministic, taking first differences introduces a non-invertible moving average component in the data generating process, which causes serious estimation problems. Furthermore, there is then no linear autoregressive model able to correctly describe the dynamics of the data; see the discussion in Chapter 4 of [23]. Finally, there seems to be a break in the trend after 1999. As this break belongs to the out-of-sample period, we ignore it during the specification and estimation of the proposed model; this is important in order to test the model's robustness.
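Treating the trend as a deterministic linear function of time, rather than differencing, amounts to the following simple detrending step (a generic sketch, not the paper's code):

```python
import numpy as np

def remove_linear_trend(load):
    """Fit a deterministic linear trend a + b*t by OLS and return the
    detrended series, instead of taking first differences."""
    t = np.arange(len(load), dtype=float)
    b, a = np.polyfit(t, load, 1)            # slope, intercept
    return load - (a + b * t), (a, b)
```

The stationary residual series is then modelled directly, avoiding the non-invertible moving average component that first-differencing a trend-stationary series would introduce.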


This paper proposes an Artificial Neural Network based hourly day-ahead demand forecasting model for the PJM electricity markets. The demand forecasting is done using both a classic demand forecasting model and a correlation demand forecasting model. In the correlation model, the day to be forecasted is divided into 7 groups of hours depending on the correlation coefficient; in the classic model, it is divided into peak load hours and off-peak load hours. The performances of the ANNs are evaluated by calculating the MAPE in both cases. The training data for the ANN are the previous day's data and the data for the same day of the previous week. The results show that correlation-based demand forecasting is better suited to the PJM electricity markets than the classic demand forecasting model, and that good-quality training data improve the forecasting of electricity demand. The MAPEs of the proposed models are very low compared to other conventional models. The proposed approach represents a useful method to forecast next-day electricity demand in the bidding process of energy market operators.
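The MAPE criterion used to compare the two models is standard; a minimal sketch:

```python
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error, in percent. Assumes the actual
    demand values are nonzero."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))
```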

By combining an ARMA-type model with a machine learning technique, namely support vector regression (SVR), the flexibility of the hybrid model should increase compared to a single-method model. In addition, one can utilize advantages from each of the two worlds: attaining precise forecasts by capturing nonlinear behavior while maintaining some interpretability. This research aims to compare two specifications of hybrid models based on ARIMA combined with SVR, since SVR models have been shown to yield high performance in this combination (Weron, 2014). Furthermore, by comparing multiple areas of the Nord Pool market, the effect of transmission congestion is assessed, since insufficient network capacity induces price differences between areas (Kristiansen, 2014; Loland et al., 2012). The effect of transmission congestion on forecasting precision is part of balancing costs, i.e., it enters the final electricity price.
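The hybrid structure, a linear time-series model plus a nonlinear model for its residuals, can be sketched as below. To keep the example dependency-free, a simple least-squares AR(p) stands in for ARIMA and a small RBF kernel ridge regression stands in for SVR; everything here is illustrative.

```python
import numpy as np

def hybrid_forecast(y, p=2, gamma=1.0, alpha=1e-3):
    """Two-stage hybrid sketch: an AR(p) model captures the linear
    structure; its residuals are then modelled nonlinearly (a kernel
    ridge regression on lag-1 residuals standing in for SVR).
    Returns the combined in-sample fitted values."""
    # Stage 1: linear AR(p) by least squares
    X = np.column_stack([y[i:len(y) - p + i] for i in range(p)])
    t = y[p:]
    beta, *_ = np.linalg.lstsq(X, t, rcond=None)
    linear = X @ beta
    r = t - linear                                  # residual series
    # Stage 2: nonlinear model of residuals from their own lag
    R = r[:-1, None]
    d2 = (R[:, None, :] - R[None, :, :]) ** 2
    K = np.exp(-gamma * d2.sum(-1))                 # RBF kernel matrix
    w = np.linalg.solve(K + alpha * np.eye(len(R)), r[1:])
    nonlinear = K @ w
    return linear[1:] + nonlinear
```

The final forecast is the sum of the linear and nonlinear components, which is the decomposition the excerpt's ARIMA+SVR hybrids rely on.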


We present a refined parametric model for forecasting electricity demand that performed particularly well in the recent Global Energy Forecasting Competition (GEFCom 2012). We begin by motivating and presenting a simple parametric model, treating electricity demand as a function of temperature and day of the year. We then set out a series of refinements to the model, explaining the rationale for each, and using competition scores to demonstrate that each successive refinement step increases the accuracy of the model's predictions. These refinements include combining models from multiple weather stations, removing outliers from the historical data and special treatment of public holidays.
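A simple parametric model of demand as a function of temperature and day of year could look like the following. The basis chosen here (quadratic in temperature plus annual Fourier terms) is illustrative, not the competition model itself.

```python
import numpy as np

def design_matrix(temp, doy):
    """Illustrative parametric design: quadratic in temperature plus
    annual Fourier terms in day-of-year."""
    w = 2 * np.pi * doy / 365.25
    return np.column_stack([
        np.ones(len(temp)), temp, temp ** 2,
        np.sin(w), np.cos(w), np.sin(2 * w), np.cos(2 * w),
    ])

def fit_demand(temp, doy, demand):
    """Least-squares fit of demand on the parametric basis."""
    X = design_matrix(temp, doy)
    beta, *_ = np.linalg.lstsq(X, demand, rcond=None)
    return beta
```

Refinements of the kind the paper describes (multiple weather stations, outlier removal, holiday treatment) would modify the inputs and the basis rather than this fitting step.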


Model specifications that include fare variables indicate a reasonable range of fare elasticities (percent change in ridership due to a one percent change in fare). • Metrorail fare el


ANN-based models have been applied to electricity price forecasting in several ways. In one study, the records required for the neural network database were obtained by solving optimal power flow (OPF) problems for the system load variations for every hour of the day in a month [19]. ANN-based day-ahead price forecasting has been carried out for the Turkish electricity market, with the results compared against an ARIMA model and MAPE used to assess performance across different topologies, input sets and training algorithms [20]. Linear ANNs have been applied to forecast the electricity prices of hot and cold days with low, average and peak loads [21]. Using very large amounts of real-world input data to train artificial neural networks commonly causes problems during the learning process and ultimately degrades their predictive ability. Consequently, the data dimensionality has been reduced using principal component analysis (PCA) before applying data envelopment analysis (DEA) to find the efficiencies of electricity distribution companies in a deregulated environment [22]. In addition, the discrete cosine transform (DCT) has been applied as a preprocessing tool for neural networks in price forecasting: the time series of prices is transformed into the frequency domain using the DCT, and the transformed data are then used to train the NN [23]. These works highlight the value of reducing the dimensionality of the training data used by the FFNN for forecasting electricity prices. Since next-day price forecasting is a vital requirement for producers, consumers and energy supply companies, this paper proposes an efficient yet highly accurate next-day price forecasting system.


In this work, we investigated the short-term electricity load forecasting problem while considering feature engineering and classifier parameter adjustment. We proposed a two-stage electricity load forecasting framework based on feature processing and enhanced CNN classifiers to address forecasting accuracy. Specifically, to select the critical features, a new combined two-stage model is employed to process the n-dimensional time sequence data used as input. Additionally, to enhance CNN classifier efficiency in terms of accuracy and speed, we apply t-SNE for feature extraction with less redundancy. Moreover, the GSA automatically and efficiently obtains the appropriate hyperparameters for the ECNN to boost classification performance. The numerical results confirm that the proposed framework gives better accuracy than the standard CNN. Furthermore, the work suggests that the GSA offers a flexible and powerful tool for certain types of optimization problem; in a different context, for example, the GSA has strong potential for research into robot control systems for nuclear decommissioning and mobile robot path planning, which the present authors are also investigating. Acknowledgements: The authors acknowledge funding support from COMSATS University Islamabad and Lancaster University UK to support this project. The work was in part supported by the UK EPSRC grant EP/R02572X/1.

In this paper, we discuss a new hybrid model obtained by fusing a SARIMA model and a generalized single neuron (GSMN) model. The proposed model has several advantages: first, it is able to capture nonlinear behavior in the data; second, the SARIMA approach provides the modeler with a well-known and accepted methodology for model specification; and third, it is not necessary to use heuristics and expert knowledge to select the configuration of the artificial neural network, because the GSMN model uses the same inputs as the SARIMA model and it is not necessary to specify processing layers as in other neural network architectures. To assess the effectiveness of our model, we forecast the monthly demand for electricity in the Colombian energy market using several competitive models and compare the accuracy of the forecasts. The results obtained show that our approach performs better than the SARIMA and GSMN models in isolation. However, further research is needed to gain more confidence in, and better understand, the proposed model.

A rule-based demand analysis based on 10 years of historic data that takes into account the daily, weekly, seasonal, trend and local variations. It is an on-line system
