Author Topic: Weather Forecasting using ANN and PSO  (Read 8562 times)


IJSER Content Writer

Weather Forecasting using ANN and PSO
« on: August 21, 2011, 05:39:35 am »
Author : Asis Kumar Tripathy, Suvendu Mohapatra, Shradhananda Beura, Gunanidhi Pradhan
International Journal of Scientific & Engineering Research Volume 2, Issue 6, June-2011
ISSN 2229-5518
Download Full Paper : PDF

Abstract— Weather prediction is fundamentally based on historical time-series data. The basic parameters of weather prediction are maximum temperature, minimum temperature, rainfall, and humidity. In this paper we predict future weather conditions from these parameters using an Artificial Neural Network and the Particle Swarm Optimization technique. Basic data mining operations are employed to extract useful patterns from a large volume of data. Different training and testing scenarios are performed to obtain accurate results. Experimental results indicate that the proposed approach is useful for weather forecasting.

Index Terms— Weather forecasting, Artificial Neural Network, Particle Swarm Optimization Technique, Data mining, Numerical Weather Prediction, Network Forecasting

Weather forecasting is the application of science and technology to predict the state of the atmosphere at a future time for a given location. Human beings have attempted to predict the weather informally for millennia, and formally since at least the nineteenth century. Weather forecasts are made by collecting quantitative data about the current state of the atmosphere and using scientific understanding of atmospheric processes to predict how the atmosphere will evolve. Weather forecasting is an ever-challenging area of investigation for scientists. In this research a weather forecasting model using a neural network is proposed. Weather parameters such as maximum temperature, minimum temperature, relative humidity and rainfall are predicted using features extracted over different periods as well as from the weather-parameter time series itself.

The approach applied here uses feed-forward artificial neural networks (ANNs) with back propagation for supervised learning, using the data recorded at a particular station. The trained ANN was used to predict future weather conditions. The model can be suitably adapted for making forecasts over larger geographical areas. Weather changes also affect the demand for electricity, so weather variables are typically used in the prediction of electricity demand. These predictions are based on neural network and statistical modeling. The results indicate that, in some cases, the effect of weather is indirectly taken into account by other variables, and explicit use of weather variables may not be necessary. However, the decision to include or exclude weather variables should be analyzed for each individual situation.

1.1 Weather Forecasting System
1.1.1 Weather Data collection
Observations of atmospheric pressure, temperature, wind speed, wind direction, humidity, and precipitation are made near the earth's surface by trained observers and automatic weather stations. The World Meteorological Organization acts to standardize the instrumentation, observing practices and timing of these observations worldwide.
1.1.2 Weather Data Assimilation   
During the data assimilation process, information gained from the observations is used in conjunction with a numerical model's most recent forecast for the time that observations were made to produce the meteorological analysis. This is the best estimate of the current state of the atmosphere. It is a three dimensional representation of the distribution of temperature, moisture and wind.
1.1.3 Numerical Weather Prediction
Numerical Weather Prediction (NWP) uses the power of computers to make a forecast. Complex computer programs, also known as forecast models, run on supercomputers and provide predictions of many atmospheric variables such as temperature, pressure, wind, and rainfall. A forecaster examines how the features predicted by the computer will interact to produce the day's weather.

Back propagation was introduced by Rumelhart and Hinton in 1986 and remains the most widely used network training method today. The basic idea in a back propagation network is that the neurons of each lower layer send their outputs up to the next layer. It is a fully connected, feed-forward, hierarchical multi-layer network, with hidden layers and no intra-layer connections.
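A minimal sketch of such a network in Python may make the idea concrete. This is not the paper's implementation: the layer sizes, learning rate, and toy data standing in for normalized weather parameters are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: 50 samples of 4 normalized "weather" features
# (e.g. max temp, min temp, humidity, rainfall); the target is
# an arbitrary smooth function of them, purely for illustration.
X = rng.random((50, 4))
y = X.mean(axis=1, keepdims=True)

W1 = rng.normal(0, 0.5, (4, 6))   # input -> hidden weights
W2 = rng.normal(0, 0.5, (6, 1))   # hidden -> output weights
lr = 0.5

for epoch in range(3000):
    h = sigmoid(X @ W1)           # forward pass: hidden layer
    out = sigmoid(h @ W2)         # forward pass: output layer
    err = y - out
    # back propagation: error gradients flow from output to input
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 += lr * (h.T @ d_out) / len(X)
    W1 += lr * (X.T @ d_h) / len(X)

mse = float(np.mean((y - sigmoid(sigmoid(X @ W1) @ W2)) ** 2))
print(f"final MSE: {mse:.4f}")
```

The hidden-to-output gradient is computed first and then propagated backward through the hidden layer, which is exactly the layer-by-layer scheme described above.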
PSO was developed by Dr. Russell Eberhart and Dr. James Kennedy in 1995. PSO is a population-based method for numerical optimization that requires no explicit knowledge of the problem. This stochastic optimization technique was inspired by the social behavior and intelligence of swarms.
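The canonical algorithm can be sketched in a few lines. The objective function (the sphere function), bounds, and swarm settings below are illustrative choices, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    # Sphere function: a standard benchmark, minimum 0 at the origin
    return np.sum(x ** 2, axis=-1)

n_particles, dim = 30, 5
w, c1, c2 = 0.7, 1.5, 1.5          # inertia, cognitive, social weights

pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros((n_particles, dim))
pbest = pos.copy()                  # each particle's best-known position
pbest_val = f(pos)
gbest = pbest[np.argmin(pbest_val)]  # swarm's best-known position

for it in range(200):
    r1 = rng.random((n_particles, dim))
    r2 = rng.random((n_particles, dim))
    # velocity update: inertia + pull toward personal and global bests
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = f(pos)
    improved = vals < pbest_val
    pbest[improved] = pos[improved]
    pbest_val[improved] = vals[improved]
    gbest = pbest[np.argmin(pbest_val)]

print(f"best value found: {f(gbest):.6f}")
```

No gradient or other problem knowledge is used: the swarm improves purely by sharing the best positions found so far, which is the "social behavior" referred to above.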
Kennedy and Eberhart (1997) described a very simple alteration of the canonical algorithm that operates on bit-strings rather than real numbers. In their version, the velocity is used as a probability threshold to determine whether x_id, the dth component of x_i, should be evaluated as a zero or a one.
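A sketch of this binary variant, assuming the usual sigmoid squashing of the velocity and a toy OneMax objective (maximize the number of 1-bits); none of these specifics come from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def fitness(bits):
    return bits.sum(axis=-1)       # OneMax: count of 1-bits

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

n, dim = 20, 16
vel = np.zeros((n, dim))
pos = (rng.random((n, dim)) < 0.5).astype(int)
pbest = pos.copy()
pbest_val = fitness(pos)
gbest = pbest[np.argmax(pbest_val)]

for it in range(100):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = vel + r1 * (pbest - pos) + r2 * (gbest - pos)
    vel = np.clip(vel, -4, 4)      # keep bit probabilities off 0 and 1
    # velocity as probability threshold: bit d of particle i becomes 1
    # with probability sigmoid(vel[i, d])
    pos = (rng.random((n, dim)) < sigmoid(vel)).astype(int)
    vals = fitness(pos)
    improved = vals > pbest_val
    pbest[improved] = pos[improved]
    pbest_val[improved] = vals[improved]
    gbest = pbest[np.argmax(pbest_val)]

print("best fitness:", fitness(gbest), "of", dim)
```

The only change from the real-valued algorithm is the position update: instead of adding the velocity to the position, the squashed velocity is compared against a uniform random draw to set each bit.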
Kennedy and Spears (1998) compared this binary particle swarm to several kinds of GAs, using Spears’ multimodal random problem generator. This paradigm allows the creation of random binary problems with some specified characteristics, e.g., number of local optima, dimension, etc.
Mohan and Al-Kazemi (2001) suggested several ways that the particle swarm could be implemented on binary spaces. One version, which they call the "regulated discrete particle swarm," performed very well on a suite of test problems.
Agrafiotis and Cedeño (2002) used the locations of the particles as probabilities to select features in a pattern-matching task. Each feature was assigned a slice of a roulette wheel based on its floating-point value, which was then discretized to {0, 1}, indicating whether the feature was selected or not.
In Pampara et al. (2005), instead of directly encoding bit strings in the particles, each particle stored the small number of coefficients of a trigonometric model (angle modulation) which was then run to generate bit strings. Extending PSO to more complex combinatorial search spaces is also of great interest.
Parrot and Li (2006) adjust the size and number of swarms dynamically by ordering the particles into 'species' in a technique called clearing. Blackwell and Branke (2006), borrowing from the atomic analogy referred to above, invoke an exclusion principle to prevent swarms from competing on the same peak and an anti-convergence measure to maintain diversity of the multi-swarm as a whole.
Recently, a self-adapting multi-swarm has been derived (Blackwell 2007) [3]. The multi-swarm with exclusion has been favorably compared, on the moving peaks problem, to the hierarchical swarm, PSO re-initialization, and a state-of-the-art dynamic-optimization evolutionary algorithm known as self-organizing scouts.
