International Journal of Scientific & Engineering Research Volume 2, Issue 8, August-2011 1
ISSN 2229-5518
One Half Global Best Position Particle
Swarm Optimization Algorithm
Narinder Singh, S.B. Singh
Keywords: Best Position, Global Optimization, Velocity Update Equation.
—————————— ——————————
Particle swarm optimization (PSO) [1] is a stochastic, population-based search method, modeled after the behavior of bird flocks. A PSO algorithm maintains a swarm of individuals (called particles), where each individual (particle) represents a candidate solution. Particles follow a very simple behavior: emulate the success of neighboring particles and their own successes achieved. The position of a particle is therefore influenced by the best particle in its neighborhood, as well as the best solution found by the particle itself. Particle positions x_i are adjusted using

    x_i(t+1) = x_i(t) + v_i(t+1) Δt    ...(1)
    (updated position = previous position + newly updated velocity)

where the velocity component v_i(t) represents the step size. For the basic PSO,

    v_ij(t+1) = w v_ij(t) + c1 r1j (y_ij − x_ij)/Δt + c2 r2j (ŷ_j − x_ij)/Δt    ...(2)
    (updated velocity = current motion + cognitive component + social component)

where w is the inertia weight [12], c1 and c2 are the acceleration coefficients (the first acceleration coefficient represents how much confidence a particle has in itself, and the second how much confidence it has in its neighborhood), r1j, r2j ~ U(0,1), y_ij is the personal best position of particle i in dimension j, ŷ_j is the neighborhood best position in dimension j, and Δt represents the rate of change in time.

Eberhart and Shi [15] suggested a more generalized PSO, where a constriction coefficient is applied to both terms of the velocity formula. Clerc and Kennedy [14] showed that the constriction PSO can converge without using Vmax:

    v_ij(t+1) = χ (v_ij(t) + c1 r1j (y_ij − x_ij) + c2 r2j (ŷ_j − x_ij))

where the constriction factor χ was set to 0.7289. By using the constriction coefficient, the amplitude of the particle's oscillation decreases, resulting in its convergence over time. Kennedy [10] carried out some experiments using a PSO variant which drops the velocity term from the PSO equation. If p_i and p_g were kept constant, a canonical PSO samples the search space following a bell-shaped distribution centered exactly between p_i and p_g. This bare bones PSO produces normally distributed random numbers around the mean (p_id + p_gd)/2 (for each dimension d), with the standard deviation of the Gaussian distribution being |p_id − p_gd|.

Dr. S.B. Singh is Professor and Head, Department of Mathematics, Punjabi University, Patiala, Punjab, INDIA, Pin 147002, Email: sbsingh69@yahoo.com. His research interests are Mathematical Modeling and Optimization Techniques. He has published 50 research papers in journals/conference proceedings. He was awarded the Khosla Gold Medal by I.I.T. Roorkee for his outstanding research contributions. Mr. Narinder Singh is a Ph.D. student in the Department of Mathematics, Punjabi University, Patiala, Punjab, INDIA, Pin 147002, Email: narindersinghgoria@ymail.com.

Mendes and Kennedy [4] found that the von Neumann topology (north, south, east and west of each particle
IJSER © 2011 http://www.ijser.org
placed on a 2-dimensional lattice) seems to be an overall winner among many different communication topologies.

Kennedy [10] also proposed an alternative version of the barebones PSO, where

    v_ij(t+1) = y_ij                        if U(0,1) < 0.5
                N((y_ij + ŷ_j)/2, σ)        otherwise        ...(3)

with σ = |y_ij − ŷ_j|. Based on equation (3), there is a 50% chance that the jth dimension of the particle changes to the corresponding personal best position. This version of the barebones PSO biases towards exploiting personal best positions.

PSO variants are continually being devised in an attempt to overcome this deficiency; see e.g. [16], [17], [18], [19], [20], [21], [22], [23], [24] for a few recent additions. These PSO variants greatly increase the complexity of the original method, and Pedersen and co-workers [25, 26] have demonstrated that satisfactory performance can be achieved with the basic PSO if only its parameters are properly tuned.

The motivation behind introducing OHGBPPSO is that, in the velocity update equation, the personal best and global best positions are modified instead of being used directly. We introduce the new velocity update equation as follows:

    v_ij(t+1) = w v_ij(t) + c1 r1j ((y_ij + ŷ_j/2) − x_ij)/Δt + c2 r2j ((y_ij − ŷ_j/2) − x_ij)/Δt

or, equivalently,

    v_ij(t+1) = w v_ij(t) + c1 r1j ((2y_ij + ŷ_j)/2 − x_ij)/Δt + c2 r2j ((2y_ij − ŷ_j)/2 − x_ij)/Δt    ...(4)

where w is the inertia weight, c1 and c2 are the acceleration coefficients (the first acceleration coefficient represents how much confidence a particle has in itself, and the second how much confidence it has in its neighborhood), r1j, r2j ~ U(0,1), y_ij is the personal best position of particle i in dimension j, ŷ_j is the neighborhood best position of particle i in dimension j, and Δt represents the rate of change in time.

In the velocity update equation of this new PSO, the first term represents the current velocity of the particle and can be thought of as a momentum term. The second term, proportional to the vector c1 r1j ((2y_ij + ŷ_j)/2 − x_ij)/Δt, is responsible for attracting the particle's current position in the positive direction of its own best position (pbest). The third term, proportional to the vector c2 r2j ((2y_ij − ŷ_j)/2 − x_ij)/Δt, is responsible for attracting the particle's current position.

The OHGBPPSO algorithm proceeds as follows:
- Randomly initialize particle positions and velocities.
- While the termination criterion is not met:
  - Evaluate the objective function value at the current position x_i.
  - If the objective function value is better than the personal best position (y_ij), then update y_ij.
  - If the objective function value is better than the global best position (ŷ_j), then update ŷ_j.
  - For each particle, update the velocity v_ij(t+1) using equation (4) and the position using
        x_ij(t+1) = x_ij(t) + v_ij(t+1)
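To make the difference between updates (2) and (4) concrete, the two velocity rules can be sketched in Python. This is a minimal illustration, not the authors' implementation; Δt = 1 and the default parameter values are assumptions taken from the experimental setting later in the paper.

```python
import numpy as np

def spso_velocity(v, x, y, y_hat, w=0.7, c1=1.5, c2=1.5, rng=np.random):
    """Standard PSO velocity update, equation (2), with dt = 1.

    v, x, y : arrays of shape (n_particles, dim)
    y_hat   : array of shape (dim,), the neighborhood (global) best
    """
    r1 = rng.uniform(0, 1, size=x.shape)
    r2 = rng.uniform(0, 1, size=x.shape)
    return w * v + c1 * r1 * (y - x) + c2 * r2 * (y_hat - x)

def ohgbppso_velocity(v, x, y, y_hat, w=0.7, c1=1.5, c2=1.5, rng=np.random):
    """OHGBPPSO velocity update, equation (4), with dt = 1.

    The single pbest attractor y of SPSO is replaced by the pair
    (2y + y_hat)/2 = y + 0.5*y_hat  and  (2y - y_hat)/2 = y - 0.5*y_hat.
    """
    r1 = rng.uniform(0, 1, size=x.shape)
    r2 = rng.uniform(0, 1, size=x.shape)
    return (w * v
            + c1 * r1 * ((2 * y + y_hat) / 2 - x)
            + c2 * r2 * ((2 * y - y_hat) / 2 - x))

def position_update(x, v):
    """Position update, equation (1), with dt = 1."""
    return x + v
```

Note that OHGBPPSO keeps the structure of equation (2) and only shifts the two attractor points by half the global best, which is where the algorithm's name comes from.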
[Figure: Geometric comparison of SPSO and OHGBPPSO attractors, showing the current position of the particle, Pbest, 0.5 Gbest, Pbest + 0.5 Gbest, and −0.5 Gbest]
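The attractor points sketched in the figure can be checked numerically. The pbest and gbest values below are made up purely for illustration:

```python
import numpy as np

# Hypothetical personal best and global best positions in 2-D.
pbest = np.array([1.0, 2.0])
gbest = np.array([4.0, -2.0])

# OHGBPPSO replaces the single pbest attractor of SPSO by the pair
# (2*pbest + gbest)/2 and (2*pbest - gbest)/2, i.e. pbest shifted
# by plus and minus one half of the global best.
upper = (2 * pbest + gbest) / 2   # equals pbest + 0.5 * gbest
lower = (2 * pbest - gbest) / 2   # equals pbest - 0.5 * gbest
```

The midpoint of the two attractors is pbest itself, so on average the pull is still towards the personal best, with a spread proportional to the global best.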
The relative performance of SPSO and OHGBPPSO is evaluated on two problem sets. Problem Set-I consists of 15 scalable problems and Problem Set-II consists of 13 non-scalable problems.
Problem Set-I (scalable problems):

1. Ackley:
   Min f(x) = −20 exp(−0.02 √((1/n) Σ_{i=1}^{n} x_i^2)) − exp((1/n) Σ_{i=1}^{n} cos(2π x_i)) + 20 + e;
   search space −30 ≤ x_i ≤ 30; minimum objective function value 0.

2. Cosine Mixture:
   Min f(x) = Σ_{i=1}^{n} x_i^2 − 0.1 Σ_{i=1}^{n} cos(5π x_i);
   search space −1 ≤ x_i ≤ 1; minimum objective function value −0.1n.

3. Exponential:
   Min f(x) = −exp(−0.5 Σ_{i=1}^{n} x_i^2);
   search space −1 ≤ x_i ≤ 1; minimum objective function value −1.

4. Griewank:
   Min f(x) = 1 + (1/4000) Σ_{i=1}^{n} x_i^2 − Π_{i=1}^{n} cos(x_i/√i);
   search space −600 ≤ x_i ≤ 600; minimum objective function value 0.

5. Rastrigin:
   Min f(x) = 10n + Σ_{i=1}^{n} [x_i^2 − 10 cos(2π x_i)];
   search space −5.12 ≤ x_i ≤ 5.12; minimum objective function value 0.

6. Function '6':
   Min f(x) = Σ_{i=1}^{n−1} [100(x_{i+1} − x_i^2)^2 + (x_i − 1)^2];
   search space −30 ≤ x_i ≤ 30; minimum objective function value 0.

7. Zakharov's:
   Min f(x) = Σ_{i=1}^{n} x_i^2 + [Σ_{i=1}^{n} (i/2) x_i]^2 + [Σ_{i=1}^{n} (i/2) x_i]^4;
   search space −5.12 ≤ x_i ≤ 5.12; minimum objective function value 0.

8. Sphere:
   Min f(x) = Σ_{i=1}^{n} x_i^2;
   search space −5.12 ≤ x_i ≤ 5.12; minimum objective function value 0.

9. Axis parallel hyper-ellipsoid:
   Min f(x) = Σ_{i=1}^{n} i x_i^2;
   search space −5.12 ≤ x_i ≤ 5.12; minimum objective function value 0.

10. Schwefel '3':
    Min f(x) = Σ_{i=1}^{n} |x_i| + Π_{i=1}^{n} |x_i|;
    search space −10 ≤ x_i ≤ 10; minimum objective function value 0.

11. Dejong:
    Min f(x) = Σ_{i=1}^{n} (x_i^4 + rand(0,1));
    search space −10 ≤ x_i ≤ 10; minimum objective function value 0.

12. Schwefel '4':
    Min f(x) = max{|x_i|, 1 ≤ i ≤ n};
    search space −100 ≤ x_i ≤ 100; minimum objective function value 0.

13. Cigar:
    Min f(x) = x_1^2 + 100000 Σ_{i=2}^{n} x_i^2;
    search space −10 ≤ x_i ≤ 10; minimum objective function value 0.

14. Brown '3':
    Min f(x) = Σ_{i=1}^{n−1} [(x_i^2)^(x_{i+1}^2 + 1) + (x_{i+1}^2)^(x_i^2 + 1)];
    search space −1 ≤ x_i ≤ 4; minimum objective function value 0.

15. Function '15':
    Min f(x) = Σ_{i=1}^{n} i x_i^2;
    search space −10 ≤ x_i ≤ 10; minimum objective function value 0.
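A few of the scalable test functions above translate directly into Python. The sketch below follows the definitions listed above; each function attains its stated minimum at the origin:

```python
import numpy as np

def sphere(x):
    """Problem 8: f(x) = sum_i x_i^2, minimum 0 at x = 0."""
    return np.sum(x ** 2)

def rastrigin(x):
    """Problem 5: f(x) = 10n + sum_i (x_i^2 - 10 cos(2 pi x_i)), minimum 0 at x = 0."""
    return 10 * len(x) + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

def griewank(x):
    """Problem 4: f(x) = 1 + sum_i x_i^2 / 4000 - prod_i cos(x_i / sqrt(i)), minimum 0 at x = 0."""
    i = np.arange(1, len(x) + 1)
    return 1 + np.sum(x ** 2) / 4000 - np.prod(np.cos(x / np.sqrt(i)))

def ackley(x):
    """Problem 1: Ackley's function, minimum 0 at x = 0."""
    n = len(x)
    return (-20 * np.exp(-0.02 * np.sqrt(np.sum(x ** 2) / n))
            - np.exp(np.sum(np.cos(2 * np.pi * x)) / n) + 20 + np.e)
```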
Problem Set-II (non-scalable problems):

1. Becker and Lago:
   Min f(x) = (|x_1| − 5)^2 + (|x_2| − 5)^2;
   search space −10 ≤ x_i ≤ 10; minimum objective function value 0.

2. Bohachevsky '1':
   Min f(x) = x_1^2 + 2 x_2^2 − 0.3 cos(3π x_1) − 0.4 cos(4π x_2) + 0.7;
   search space −50 ≤ x_i ≤ 50; minimum objective function value 0.

3. Bohachevsky '2':
   Min f(x) = x_1^2 + 2 x_2^2 − 0.3 cos(3π x_1) cos(4π x_2) + 0.3;
   search space −50 ≤ x_i ≤ 50; minimum objective function value 0.

4. Branin:
   Min f(x) = a (x_2 − b x_1^2 + c x_1 − d)^2 + g (1 − h) cos(x_1) + g,
   where a = 1, b = 5.1/(4π^2), c = 5/π, d = 6, g = 10, h = 1/(8π);
   search space −5 ≤ x_1 ≤ 10, 0 ≤ x_2 ≤ 15; minimum objective function value 0.398.

5. Eggcrate:
   Min f(x) = x_1^2 + x_2^2 + 25 (sin^2 x_1 + sin^2 x_2);
   search space −2π ≤ x_i ≤ 2π; minimum objective function value 0.

6. Miele and Cantrell:
   Min f(x) = (exp(x_1) − x_2)^4 + 100 (x_2 − x_3)^6 + (tan(x_3 − x_4))^4 + x_1^8;
   search space −1 ≤ x_i ≤ 1; minimum objective function value 0.

7. Modified Rosenbrock:
   Min f(x) = 100 (x_2 − x_1^2)^2 + [6.4 (x_2 − 0.5)^2 − x_1 − 0.6]^2;
   search space −5 ≤ x_1, x_2 ≤ 5; minimum objective function value 0.

8. Easom:
   Min f(x) = −cos(x_1) cos(x_2) exp(−((x_1 − π)^2 + (x_2 − π)^2));
   search space −10 ≤ x_i ≤ 10; minimum objective function value −1.

9. Periodic:
   Min f(x) = 1 + sin^2 x_1 + sin^2 x_2 − 0.1 exp(−x_1^2 − x_2^2);
   search space −10 ≤ x_i ≤ 10; minimum objective function value 0.9.

10. Powell's:
    Min f(x) = (x_1 + 10 x_2)^2 + 5 (x_3 − x_4)^2 + (x_2 − 2 x_3)^4 + 10 (x_1 − x_4)^4;
    search space −10 ≤ x_i ≤ 10; minimum objective function value 0.

11. Camel back-3:
    Min f(x) = 2 x_1^2 − 1.05 x_1^4 + (1/6) x_1^6 + x_1 x_2 + x_2^2;
    search space −5 ≤ x_1, x_2 ≤ 5; minimum objective function value 0.

12. Camel back-6:
    Min f(x) = 4 x_1^2 − 2.1 x_1^4 + (1/3) x_1^6 + x_1 x_2 − 4 x_2^2 + 4 x_2^4;
    search space −5 ≤ x_1, x_2 ≤ 5; minimum objective function value −1.0316.

13. Aluffi-Pentini's:
    Min f(x) = 0.25 x_1^4 − 0.5 x_1^2 + 0.1 x_1 + 0.5 x_2^2;
    search space −10 ≤ x_i ≤ 10; minimum objective function value −0.352.
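Two of the non-scalable functions above, written out in Python as a quick reference (a sketch following the definitions listed above):

```python
import numpy as np

def easom(x1, x2):
    """Problem 8 of Set-II: minimum -1 at x1 = x2 = pi."""
    return -np.cos(x1) * np.cos(x2) * np.exp(-((x1 - np.pi) ** 2 + (x2 - np.pi) ** 2))

def camel_back_6(x1, x2):
    """Problem 12 of Set-II (six-hump camel back): minimum about -1.0316."""
    return (4 * x1 ** 2 - 2.1 * x1 ** 4 + x1 ** 6 / 3
            + x1 * x2 - 4 * x2 ** 2 + 4 * x2 ** 4)
```

Easom is a useful sanity check for any PSO implementation because its basin of attraction around (π, π) is very narrow inside an almost flat landscape.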
5.1 Parameter Setting: The maximum number of function evaluations is fixed at 30,000. The swarm size is fixed at 20 and the dimension is 30. The inertia weight is 0.7 and the acceleration coefficients for SPSO and OHGBPPSO are set to c1 = c2 = 1.5.

It is observed that SPSO could not solve two problems with 100% success, whereas OHGBPPSO solved all the problems with 100% success. Observing Table 3, it can be seen that OHGBPPSO gives a better quality of solutions as compared to SPSO. Thus, for the scalable problems OHGBPPSO outperforms SPSO with respect to efficiency, reliability, cost and robustness (Table 3). Observing Table 4, it can be seen that OHGBPPSO gives a better quality of solutions as compared to SPSO. Thus, for the non-scalable problems OHGBPPSO also outperforms SPSO with respect to efficiency, reliability, cost and robustness.
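Under the parameter setting above (w = 0.7, c1 = c2 = 1.5, 20 particles, 30 dimensions, 30,000 function evaluations), a complete OHGBPPSO run on the Sphere function might look as follows. This is a sketch, not the authors' original implementation; the synchronous global-best update, the clipping of positions to the search bounds, and Δt = 1 are assumptions:

```python
import numpy as np

def sphere(x):
    """Problem 8 of Set-I: f(x) = sum_i x_i^2, minimum 0 at x = 0."""
    return np.sum(x ** 2)

def ohgbppso(f, dim=30, swarm=20, max_evals=30000, lo=-5.12, hi=5.12,
             w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal OHGBPPSO sketch using the velocity update of equation (4)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, size=(swarm, dim))
    v = np.zeros((swarm, dim))
    pbest = x.copy()
    pbest_val = np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_val)].copy()   # global best position
    evals = swarm
    while evals < max_evals:
        r1 = rng.uniform(size=(swarm, dim))
        r2 = rng.uniform(size=(swarm, dim))
        # Equation (4): attractors (2*pbest + gbest)/2 and (2*pbest - gbest)/2.
        v = (w * v + c1 * r1 * ((2 * pbest + g) / 2 - x)
                   + c2 * r2 * ((2 * pbest - g) / 2 - x))
        x = np.clip(x + v, lo, hi)           # equation (1), dt = 1
        vals = np.array([f(p) for p in x])
        evals += swarm
        better = vals < pbest_val
        pbest[better] = x[better]
        pbest_val[better] = vals[better]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, pbest_val.min()

best_x, best_val = ohgbppso(sphere)
```

The personal best values are monotonically non-increasing by construction, so the returned value is never worse than the best point in the initial swarm.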
Table 3. Results for Problem Set-I (scalable problems)

Problem No. | Min Function Value (SPSO) | Min Function Value (OHGBPPSO) | Mean Function Value (SPSO) | Mean Function Value (OHGBPPSO) | Standard Deviation (SPSO) | Standard Deviation (OHGBPPSO) | Rate of Success (SPSO) | Rate of Success (OHGBPPSO)
1 | 0.674207 | 0.524363 | 14699.6000 | 3835.60000 | 0.323300 | 0.089916 | 68.00% | 100% | ||
2 | 0.683359 | 0.415349 | 902.400000 | 812.400000 | 0.051324 | 0.109088 | 100% | 100% | ||
3 | 0.000000 | 0.000000 | 40.000000 | 40.000000 | 0.000569 | 0.000483 | 100% | 100% | ||
4 | 0.770386 | 0.680129 | 7038.80000 | 4505.20000 | 0.024076 | 0.053167 | 100% | 100% |
5 | 20.89413 | 0.222967 | 13000.0000 | 5724.00000 | 16.219312 | 0.393235 | 0.00% | 100% | ||
6 | 0.007444 | 0.002482 | 140.400000 | 130.000000 | 0.263601 | 0.280204 | 100% | 100% | ||
7 | 0.001967 | 0.000010 | 52.800000 | 64.400000 | 0.251745 | 0.218062 | 100% | 100% | ||
8 | 0.000000 | 0.000000 | 40.000000 | 40.000000 | 0.029778 | 0.025274 | 100% | 100% | ||
9 | 0.000005 | 0.000004 | 40.400000 | 44.400000 | 0.135186 | 0.253899 | 100% | 100% | ||
10 | 0.001648 | 0.001330 | 40.400000 | 42.000000 | 0.141846 | 0.225886 | 100% | 100% | ||
11 | 0.635139 | 0.172111 | 5653.20000 | 4446.40000 | 0.065082 | 0.189287 | 100% | 100% | ||
12 | 0.012020 | 0.011124 | 65.200000 | 74.000000 | 0.256712 | 0.235492 | 100% | 100% | ||
13 | 0.057514 | 0.047514 | 1196.00000 | 1587.00000 | 0.216409 | 0.246410 | 100% | 100% | ||
14 | 0.002002 | 0.002000 | 40.000000 | 40.000000 | 0.129936 | 0.147956 | 100% | 100% | ||
15 | 0.000000 | 0.000000 | 40.000000 | 40.000000 | 0.011661 | 0.013672 | 100% | 100% |
Table 4. Results for Problem Set-II (non-scalable problems)

Problem No. | Min Function Value (SPSO) | Min Function Value (OHGBPPSO) | Mean Function Value (SPSO) | Mean Function Value (OHGBPPSO) | Standard Deviation (SPSO) | Standard Deviation (OHGBPPSO) | Rate of Success (SPSO) | Rate of Success (OHGBPPSO)
1 | 0.500000 | 0.500104 | 41.600000 | 49.600000 | 0.088031 | 0.089541 | 100% | 100% | ||
2 | 0.004665 | 0.015410 | 48.800000 | 50.000000 | 0.279521 | 0.251598 | 100% | 100% | ||
3 | 0.002060 | 0.009141 | 54.400000 | 61.200000 | 0.240645 | 0.239715 | 100% | 100% | ||
4 | 0.003335 | 0.006881 | 81.600000 | 86.400000 | 0.257595 | 0.258021 | 100% | 100% | ||
5 | 0.002562 | 0.034907 | 60.400000 | 59.600000 | 0.249756 | 0.251556 | 100% | 100% | ||
6 | 73046.5964 | 73046.59648 | 30000.000 | 30000.000 | 0.000000 | 0.000000 | 0.00% | 0.00% | ||
7 | 14.541432 | 26.900800 | 30000.000 | 30000.000 | 0.000000 | 0.000000 | 0.00% | 0.00% | ||
8 | 0.009239 | 0.091784 | 59.200000 | 96.800000 | 0.286622 | 0.230102 | 100% | 100% | ||
9 | 0.480964 | 0.480470 | 40.000000 | 40.000000 | 0.037841 | 0.033635 | 100% | 100% | ||
10 | 0.075842 | 0.034330 | 578.80000 | 546.80000 | 0.233322 | 0.245201 | 100% | 100% | ||
11 | 0.006245 | 0.006541 | 46.800000 | 49.600000 | 0.206116 | 0.236984 | 100% | 100% | ||
12 | 0.017029 | 0.003487 | 51.200000 | 56.800000 | 0.262820 | 0.229284 | 100% | 100% | ||
13 | 0.010104 | 0.012869 | 45.600000 | 54.400000 | 0.217098 | 0.210105 | 100% | 100% |
[Figure: Function values of SPSO and OHGBPPSO]

In this paper, a new algorithm, One Half Global Best Position Particle Swarm Optimization (OHGBPPSO), is
presented. The algorithm is tested on scalable problems (where the number of particles in the swarm increases or decreases) and non-scalable problems (where the swarm size is fixed). The results show that when the number of particles in the swarm increases or decreases, the proposed algorithm outperforms the Standard Particle Swarm Optimization. However, when the number of particles is fixed and no particle enters or leaves the swarm, the Standard Particle Swarm Algorithm performs better than the proposed one.
References

[1] R.C. Eberhart and J. Kennedy. A New Optimizer using Particle Swarm Theory. In Proceedings of the Sixth International Symposium on Micromachine and Human Science, 1995, pp 39–43.
[2] J. Kennedy and R.C. Eberhart. Particle Swarm Optimization. In Proceedings of the IEEE International Joint Conference on Neural Networks, 1995, pp 1942–1948. IEEE Press.
[3] J. Kennedy. Small Worlds and Mega-Minds: Effects of Neighborhood Topology on Particle Swarm Performance. In Proceedings of the IEEE Congress on Evolutionary Computation, volume 3, July 1999, pages 1931–1938.
[4] J. Kennedy and R. Mendes. Population Structure and Particle Performance. In Proceedings of the IEEE Congress on Evolutionary Computation, 2002, pp 1671–1676. IEEE Press.
[5] E.S. Peer, F. van den Bergh, and A.P.
Engelbrecht. Using Neighborhoods with the Guaranteed Convergence PSO. In Proceedings of the IEEE Swarm Intelligence Symposium, 2003, pp 235–242. IEEE Press.
[9] F. van den Bergh and A.P. Engelbrecht. A Study of Particle Swarm Optimization Particle Trajectories. Information Sciences,
176(8) , 2006, pp 937–971.
[10] J. Kennedy. Bare Bones Particle Swarms. In Proceedings of the IEEE Swarm Intelligence Symposium, April 2003, pp 80–87.
[11] Y. Shi and R.C. Eberhart. A Modified Particle Swarm Optimizer. In Proceedings of the IEEE Congress on Evolutionary Computation, May
1998, pp 69–73.
[12] Angeline, P.J. 'Evolutionary optimization versus particle swarm optimization: philosophy and performance differences', Lecture Notes in Computer Science, Vol. 1447, 1998, pp. 601–610, Springer, Berlin.
[13] Angeline, P.J. 'Using selection to improve particle swarm optimization', Proceedings of the IEEE Conference on Evolutionary Computation, 1998, pp. 84–89.
[14] Clerc, M. and Kennedy, J. 'The Particle Swarm: Explosion, Stability, and Convergence in a Multi-dimensional Complex Space', IEEE Transactions on Evolutionary Computation, Vol. 6, 2002, pp 58–73.
[15] Eberhart, R.C. and Shi, Y. Comparing inertia weights and constriction factors in particle swarm optimization. Proceedings of IEEE Congress on Evolutionary Computation, 2000, pp. 84–88. San Diego, CA.
[16] Z-H. Zhan, J. Zhang, Y. Li, and H.S-H.
Chung. Adaptive particle swarm
optimization. IEEE Transactions on Systems, Man, and Cybernetics, 2009, pp 1362-1381.
[6] A.P. Engelbrecht. Fundamentals of Computational Swarm Intelligence. Wiley & Sons, 2005.

[17] Z. Xinchao. A perturbed particle swarm algorithm for numerical optimization. Applied Soft Computing, 2010, pp 119–124.
[7] J. Kennedy, R.C. Eberhart, and Y. Shi. Swarm
Intelligence. Morgan Kaufmann, 2001.
[8] F. van den Bergh An Analysis of Particle Swarm Optimizers. PhD thesis, Department of Computer Science, University of Pretoria, Pretoria, South Africa, 2002.
[18] T. Niknam and B. Amiri. An efficient hybrid approach based on PSO, ACO and k-means for cluster analysis. Applied Soft Computing, 2010, pp 183–197.
[19] M. El-Abda, H. Hassan, M. Anisa, M.S. Kamela, and M. Elmasry. Discrete cooperative particle swarm optimization for FPGA placement. Applied Soft Computing, 2010, pp 284–295.
[20] M-R. Chena, X. Lia, X. Zhanga, and Y-Z. Lu.
A novel particle swarm optimizer hybridized with extremal optimization. Applied Soft Computing, 2010, pp 367-373.
[21] P.W.M. Tsang, T.Y.F. Yuena, and W.C. Situ. Enhanced affine invariant matching of broken boundaries based on particle swarm optimization and the dynamic migrant principle. Applied Soft Computing, 2010, pp 432–438.
[22] C-C. Hsua, W-Y. Shiehb, and C-H. Gao.
Digital redesign of uncertain interval systems
based on extremal gain/phase margins via a hybrid particle swarm optimizer. Applied Soft Computing, 2010, pp 606-612.
[23] H. Liua, Z. Caia, and Y. Wang. Hybridizing particle swarm optimization with differential evolution for constrained numerical and engineering optimization. Applied Soft Computing, 2010, pp 629–640.
[24] K. Mahadevana and P.S. Kannan.
Comprehensive learning particle swarm
optimization for reactive power dispatch. Applied Soft Computing, 2010, pp 641-652.
[25] M.E.H. Pedersen. Tuning & Simplifying Heuristical Optimization. Ph.D. thesis, School of Engineering Sciences, University of Southampton, England, 2010.
[26] M.E.H. Pedersen and A.J. Chipperfield. Simplifying particle swarm optimization. Applied Soft Computing, 2010, pp 618–628.