A Simulated Annealing Based Multi-objective Optimization Algorithm ...

archive to check if it dominates any member of the archive. If yes, the child is accepted as the new parent and all the dominated solutions are eliminated from the archive. If the child does not dominate any member of the archive, both the parent and the child are checked for their nearness to the solutions of the archive. If the child resides in a less crowded region of the parameter space, it is accepted as a parent and a copy is added to the archive. Generally this crowding concept is implemented by dividing the whole solution space into d^M subspaces, where d is the depth parameter and M is the number of objective functions. The subspaces are updated dynamically.

The other popular algorithm for multi-objective optimization is NSGA-II, proposed by Deb et al. [19]. Here, initially a random parent population P_0 of size N is created. The population is then sorted based on the non-domination relation. Each solution of the population is assigned a fitness equal to its non-domination level. A child population Q_0 is created from the parent population P_0 by using binary tournament selection, recombination, and mutation operators. According to this algorithm, a combined population R_t = P_t ∪ Q_t of size 2N is first formed. All the solutions of R_t are then sorted based on their non-domination status. If the total number of solutions belonging to the best non-dominated set F_1 is smaller than N, F_1 is completely included in P_{t+1}. The remaining members of the population P_{t+1} are chosen from subsequent non-dominated fronts in the order of their ranking.
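The dominance relation and the front-by-front filling of P_{t+1} described above can be sketched in Python. This is an illustrative simplification, not the actual NSGA-II implementation: fast non-dominated sorting is replaced by a naive repeated scan, and the crowding-distance truncation of the last front is reduced to a plain slice.

```python
# Illustrative sketch only; minimization of all objectives is assumed.

def dominates(a, b):
    """True if objective vector a dominates b (no worse in every
    objective, strictly better in at least one)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(pop):
    """Partition objective vectors into fronts F_1, F_2, ... by
    repeatedly peeling off the non-dominated subset (naive O(n^2)
    per front, unlike NSGA-II's fast non-dominated sorting)."""
    fronts, remaining = [], list(pop)
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q is not p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

def select_next_population(fronts, n):
    """Fill P_{t+1} front by front; the last, partially fitting front
    is truncated (by crowding distance in the real algorithm; a plain
    slice here for brevity)."""
    next_pop = []
    for front in fronts:
        take = min(len(front), n - len(next_pop))
        next_pop.extend(front[:take])
        if len(next_pop) == n:
            break
    return next_pop
```

With four solutions and N = 3, the two mutually non-dominating points form F_1 and are taken whole, and the third slot is filled from F_2.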
To choose exactly N solutions, the solutions of the last included front are sorted using the crowded comparison operator, and the best among them (i.e., those with larger values of the crowding distance) are selected to fill the available slots in P_{t+1}. The new population P_{t+1} is then used for selection, crossover, and mutation to create a new population Q_{t+1} of size N, and the process continues. The crowding distance operator is also used in the parent selection phase to break ties in the binary tournament selection. This operator is essentially responsible for maintaining diversity in the Pareto front.

B. Recent MOSA algorithm [17]

One of the recently developed MOSA algorithms is that of Smith et al. [17]. Here a dominance-based energy function is used. If the true Pareto front is available, then the energy of a particular solution x is calculated as the total number of solutions that dominate x. However, as the true Pareto front is not always available, the authors propose to estimate the energy based on the current estimate of the Pareto front, F', which is the set of mutually non-dominating solutions found thus far in the process. The energy of the current solution x is then the total number of solutions in the estimated front which dominate x. If ||F'_{x'}|| is the energy of the new solution x' and ||F'_x|| is the energy of the current solution x, then the energy difference between the current and the proposed solution is calculated as δE(x', x) = (||F'_{x'}|| − ||F'_x||)/||F'||. Division by ||F'|| ensures that δE is always less than unity and provides some robustness against fluctuations in the number of solutions in F'. If the size of F' is less than some threshold, an attainment surface sampling method is adopted to increase the number of solutions in the final Pareto front. The authors perturb a decision variable with a random number generated from the Laplacian distribution.
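The dominance-based energy and the energy difference δE above translate directly into code. This is a minimal sketch, assuming minimization and representing the estimated front F' as a list of objective vectors; the function names are ours, not from [17].

```python
# Minimal sketch of the energy computation of Smith et al. [17];
# minimization is assumed, and `front` stands for the current
# estimate F' of the Pareto front.

def dominates(a, b):
    """True if objective vector a dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def energy(x, front):
    """||F'_x||: the number of members of the estimated front
    that dominate the solution x."""
    return sum(1 for f in front if dominates(f, x))

def delta_e(x_new, x_cur, front):
    """deltaE(x', x) = (||F'_{x'}|| - ||F'_x||) / ||F'||.
    Dividing by ||F'|| keeps the difference below unity."""
    return (energy(x_new, front) - energy(x_cur, front)) / len(front)
```

For instance, against a three-member front, a proposal dominated by all three members and a current solution dominated by one member give δE = (3 − 1)/3 = 2/3.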
Two different sets of scaling factors are kept: traversal scaling, which generates moves to a non-dominated proposal within a front, and location scaling, which locates a front closer to the original front. These scaling factors are updated with the iterations.

III. ARCHIVED MULTI-OBJECTIVE SIMULATED ANNEALING (AMOSA)

As mentioned earlier, the AMOSA algorithm is based on the principle of SA [3]. In this article, at a given temperature T, a new state s is selected with a probability

p_qs = 1 / (1 + e^(−(E(q,T) − E(s,T))/T)),   (1)

where q is the current state and E(s, T) and E(q, T) are the corresponding energy values of s and q, respectively. Note that the above equation automatically ensures that the probability value lies between 0 and 1. AMOSA incorporates the concept of an Archive where the non-dominated solutions seen so far are stored. In [28] the use of an unconstrained Archive size to reduce the loss of diversity is discussed in detail. In our approach we have kept the archive size limited, since finally only a limited number of well-distributed Pareto-optimal solutions are needed. Two limits are kept on the size of the Archive: a hard or strict limit denoted by HL, and a soft limit denoted by SL. During the process, the non-dominated solutions are stored in the Archive as and when they are generated, until the size of the Archive increases to SL. Thereafter, if more non-dominated solutions are generated, these are added to the Archive, the size of which is then reduced to HL by applying clustering. The structure of the proposed simulated annealing based AMOSA is shown in Figure 1.
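The acceptance probability of Eq. (1) can be sketched as follows, assuming the scalar energy values E(q, T) and E(s, T) have already been computed; the function name is ours, for illustration only.

```python
import math

# Sketch of the acceptance probability in Eq. (1); e_current and
# e_new stand for E(q, T) and E(s, T), assumed already computed.

def acceptance_probability(e_current, e_new, temperature):
    """p_qs = 1 / (1 + exp(-(E(q,T) - E(s,T)) / T)).
    The logistic form keeps the value strictly between 0 and 1;
    the lower the new state's energy relative to the current one,
    the closer p_qs is to 1."""
    return 1.0 / (1.0 + math.exp(-(e_current - e_new) / temperature))
```

When the two energies are equal the probability is exactly 0.5, and as T grows the probability of accepting a worse state approaches 0.5 from below, which is the usual annealing behavior.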
The parameters that need to be set a priori are mentioned below.

• HL: The maximum size of the Archive on termination. This is set equal to the maximum number of non-dominated solutions required by the user.
• SL: The maximum size to which the Archive may be filled before clustering is used to reduce its size to HL.
• Tmax: Maximum (initial) temperature.
• Tmin: Minimal (final) temperature.
• iter: Number of iterations at each temperature.
• α: The cooling rate in SA.

The different steps of the algorithm are now explained in detail.

A. Archive Initialization

The algorithm begins with the initialization of a number γ × SL (γ > 1) of solutions. Each of these solutions is refined by using a simple hill-climbing technique, accepting a new solution only if it dominates the previous one. This is continued for a number of iterations. Thereafter the non-dominated solutions (ND) that are obtained are stored in the Archive, up to a maximum of HL. In case the number of non-dominated solutions exceeds HL, clustering is applied to
