International Journal of Scientific and Engineering Research
ISSN Online 2229-5518
ISSN Print: 2229-5518
Website: http://www.ijser.org
IJSER >> Volume 2, Issue 5, May 2011 Edition
Load Forecasting Using New Error Measures In Neural Networks
Full Text (PDF, 3000)
Author(s)
Devesh Pratap Singh, Mohd. Shiblee, Deepak Kumar Singh
KEYWORDS
Load forecasting, Neural networks, New neuron models, New error metrics for neural networks, England ISO
ABSTRACT
Load forecasting plays a key role in helping an electric utility make important decisions on power generation, load switching, voltage control, network reconfiguration, and infrastructure development, and it enhances the energy-efficient and reliable operation of a power system. This paper presents a study of short-term load forecasting using new error metrics for Artificial Neural Networks (ANNs), applied to England ISO data.
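The paper's specific error metrics are not reproduced on this page, so the following is only an illustrative sketch of the general idea: training a small feedforward ANN for short-term load forecasting with a non-standard error criterion. Log-cosh error stands in for the paper's metrics, and the synthetic daily-cycle series stands in for England ISO data; both are assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic hourly "load": daily sinusoidal cycle plus noise
# (a stand-in for real ISO load data, which is not available here)
t = np.arange(500)
load = 10 + 3 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.3, t.size)

# Lagged inputs: predict hour k from the preceding 24 hours
LAGS = 24
X = np.stack([load[i:i + LAGS] for i in range(len(load) - LAGS)])
y = load[LAGS:]

# Standardize for stable gradient descent
mu, sd = load.mean(), load.std()
Xn, yn = (X - mu) / sd, (y - mu) / sd

# One-hidden-layer MLP trained on log-cosh error instead of plain MSE
H, lr = 8, 0.05
W1 = rng.normal(0, 0.1, (LAGS, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.1, H);         b2 = 0.0

loss0 = None
for epoch in range(300):
    h = np.tanh(Xn @ W1 + b1)     # hidden activations
    pred = h @ W2 + b2            # linear output unit
    err = pred - yn
    if loss0 is None:
        loss0 = np.mean(np.log(np.cosh(err)))
    # d/de log(cosh e) = tanh(e): bounded, so outliers have limited pull
    g = np.tanh(err) / len(yn)
    W2 -= lr * (h.T @ g)
    b2 -= lr * g.sum()
    gh = np.outer(g, W2) * (1 - h ** 2)   # backprop through tanh layer
    W1 -= lr * (Xn.T @ gh)
    b1 -= lr * gh.sum(axis=0)

loss = np.mean(np.log(np.cosh((h @ W2 + b2) - yn)))
print(f"log-cosh training error: {loss0:.3f} -> {loss:.3f}")
```

The point of the sketch is the gradient: because d/de log(cosh e) = tanh(e) saturates, large forecast errors (e.g. from anomalous load spikes) influence the weight updates less than under squared error, which is one motivation for studying alternative error metrics in forecasting ANNs.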
References
[1] S. Haykin, Neural Networks: A Comprehensive Foundation, Macmillan College Publishing Company, New York, 1994.

[2] R. F. Engle, C. Mustafa, and J. Rice, "Modelling Peak Electricity Demand", Journal of Forecasting, vol. 11, pp. 241-251, 1992; and A. F. Eugene and G. Dora, "Load Forecasting", pp. 269-285, State University of New York, New York, 2004.

[3] D. E. Rumelhart, "Parallel Distributed Processing", Plenary Lecture, Proc. IEEE International Conference on Neural Networks, San Diego, California, 1988.

[4] D. F. Specht, "Probabilistic Neural Networks", Neural Networks, vol. 3, pp. 109-118, 1990.

[5] D. F. Specht, "A General Regression Neural Network", IEEE Transactions on Neural Networks, vol. 2, 1991.

[6] S. Chen, C. F. N. Cowan, and P. M. Grant, "Orthogonal Least Squares Learning Algorithm for Radial Basis Function Networks", IEEE Transactions on Neural Networks, vol. 2, pp. 302-309, 1991.

[7] S. Kumar, Neural Networks: A Classroom Approach, Tata McGraw Hill Publishing Company, New Delhi, 3rd ed., 2007.

[8] W. Schiffmann, M. Joost, and R. Werner, "Optimization of the Backpropagation Algorithm for Multilayer Perceptrons", Univ. Koblenz, Inst. Physics, Rheinau 3-4, Germany.

[9] R. Battiti, "First- and Second-Order Methods for Learning: Between Steepest Descent and Newton's Method", Neural Computation, vol. 2, pp. 141-166, 1992.

[10] J. Yu, J. Amores, N. Sebe, and Q. Tian, "Toward an Improved Error Metric", International Conference on Image Processing (ICIP), 2004.

[11] J. Yu, J. Amores, N. Sebe, and Q. Tian, "Toward Robust Distance Metric Analysis for Similarity Estimation", IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2006.

[12] W. J. J. Rey, Introduction to Robust and Quasi-Robust Statistical Methods, Springer-Verlag, Berlin, pp. 110-116, 1983.

[13] K. Hornik, M. Stinchcombe, and H. White, "Multilayer Feedforward Networks Are Universal Approximators", Neural Networks, vol. 2, pp. 359-366, 1989.

[14] J. M. Zurada, Introduction to Artificial Neural Systems, Jaico Publishing House, India, 2002.

[15] G. Cybenko, "Approximation by Superpositions of a Sigmoidal Function", Mathematics of Control, Signals, and Systems, vol. 2, pp. 303-314, 1989.
