TABLE V
Convergence, Purity and Minimal Spacing measures on the 3-objective test functions while the archive is bounded to 100

Test       Convergence                       Purity                                                       Minimal Spacing
Problem    AMOSA     MOSA      NSGA-II      AMOSA               MOSA                NSGA-II               AMOSA    MOSA     NSGA-II
DTLZ1      0.01235   0.159     13.695       0.857 (85.7/100)    0.56 (55.7/100)     0.378 (28.35/75)      0.0107   0.1529   0.2119
DTLZ2      0.014     0.01215   0.165        0.937 (93.37/100)   0.9637 (96.37/100)  0.23 (23.25/100)      0.0969   0.1069   0.1236
DTLZ3      0.0167    0.71      20.19        0.98 (93/95)        0.84 (84.99/100)    0.232 (23.2/70.6)     0.1015   0.152    0.14084
DTLZ4      0.28      0.21      0.45         0.833 (60/72)       0.97 (97/100)       0.7 (70/100)          0.20     0.242    0.318
DTLZ5      0.00044   0.0044    0.1036       1 (97/97)           0.638 (53.37/83.6)  0.05 (5/100)          0.0159   0.0579   0.128
DTLZ6      0.043     0.3722    0.329        0.9212 (92.12/100)  0.7175 (71.75/100)  0.505 (50.5/100)      0.1148   0.127    0.1266

TABLE VI
Convergence, Purity and Minimal Spacing measures on the 3-objective test functions by AMOSA and MOSA while the archive is unbounded

Test       Convergence           Purity                                            Minimal Spacing
Problem    AMOSA     MOSA        AMOSA                    MOSA                     AMOSA     MOSA
DTLZ1      0.010     0.1275      0.99 (1253.87/1262.62)   0.189 (54.87/289.62)     0.064     0.08384
DTLZ2      0.0073    0.0089      0.96 (1074.8/1116.3)     0.94 (225/239.2)         0.07598   0.09595
DTLZ3      0.013     0.025       0.858 (1212/1412.8)      0.81 (1719/2003.9)       0.0399    0.05
DTLZ4      0.032     0.024       0.8845 (88.5/100)        0.4050 (38.5/94.9)       0.1536    0.089
DTLZ5      0.0025    0.0047      0.92 (298/323.66)        0.684 (58.5/85.5)        0.018     0.05826
DTLZ6      0.0403    0.208       0.9979 (738.25/739.75)   0.287 (55.75/194.25)     0.0465    0.0111

TABLE VII
Convergence, Purity and Minimal Spacing measures on the DTLZ2_4, DTLZ1_5, DTLZ1_10 and DTLZ1_15 test functions by AMOSA and NSGA-II

Test       Convergence            Purity                                Minimal Spacing
Problem    AMOSA     NSGA-II      AMOSA               NSGA-II           AMOSA    NSGA-II
DTLZ2_4    0.2982    0.4563       0.9875 (98.75/100)  0.903 (90.3/100)  0.1876   0.22
DTLZ1_5    0.0234    306.917      1                   0                 0.1078   0.165
DTLZ1_10   0.0779    355.957      1                   0                 0.1056   0.2616
DTLZ1_15   0.193     357.77       1                   0                 0.1      0.271

The Purity measure also indicates this. The results on many-objective optimization problems show that AMOSA performs much better than NSGA-II.
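Each Purity value in the tables above is the fraction of a method's obtained solutions that remain nondominated when the fronts of all compared methods are pooled; the parenthesized counts are averages over runs, which is why they are not integers. A minimal sketch of that computation, using a `dominates` helper of our own and assuming minimization throughout:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def purity(front, other_fronts):
    """Fraction of `front` that stays nondominated in the pooled front.

    Returns (ratio, (surviving_count, total_count)), matching the
    value (x/y) form used in the tables.
    """
    pooled = [p for f in other_fronts for p in f] + list(front)
    surviving = [p for p in front
                 if not any(dominates(q, p) for q in pooled if q is not p)]
    return len(surviving) / len(front), (len(surviving), len(front))
```

For example, a solution set that is entirely nondominated against every competitor yields a Purity of 1, while a set whose members are all covered by some competitor yields 0, as for NSGA-II on DTLZ1_5 through DTLZ1_15 in Table VII.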
These results support the fact that Pareto ranking-based MOEAs such as NSGA-II do not work well on many-objective optimization problems, as pointed out in some recent studies [26], [27].

D. Discussion on Annealing Schedule

The annealing schedule of an SA algorithm consists of (i) the initial value of the temperature (Tmax), (ii) the cooling schedule, (iii) the number of iterations to be performed at each temperature, and (iv) the stopping criterion used to terminate the algorithm. The initial value of the temperature should be chosen so that it allows the SA to perform a random walk over the landscape. Some methods for selecting the initial temperature are given in detail in [18]. In this article, as in [17], we have set the initial temperature to achieve an initial acceptance rate of approximately 50% on derogatory proposals. This is described in Section VI.C.

The cooling schedule determines the functional form of the change in temperature required in SA. The most frequently used decrement rule, also used in this article, is the geometric schedule given by T_(k+1) = α × T_k, where α (0 < α < 1) is the cooling rate.
was found to be poorer. However, an exhaustive sensitivity study needs to be performed for AMOSA.

The third component of an annealing schedule is the number of iterations performed at each temperature. It should be chosen so that the system is sufficiently close to the stationary distribution at that temperature. As suggested in [18], the number of iterations should be chosen depending on the nature of the problem. Several criteria for terminating an SA process have been developed. In some of them, the total number of iterations that the SA procedure must execute is given, whereas in others, the minimum value of the temperature is specified. A detailed discussion of this issue can be found in [18].

V. DISCUSSION AND CONCLUSIONS

In this article a simulated annealing based multi-objective optimization algorithm has been proposed. The concept of the amount of domination is used in solving multi-objective optimization problems. In contrast to most other MOO algorithms, AMOSA selects dominated solutions with a probability that depends on the amount of domination, measured in terms of the hypervolume between the two solutions in the objective space. The results of binary-coded AMOSA are compared with those of two existing well-known multi-objective optimization algorithms, NSGA-II (binary-coded) [19] and PAES [20], for a suite of seven 2-objective test problems having different complexity levels. In a part of the investigation, the real-coded version of the proposed algorithm is compared with a very recent multi-objective simulated annealing algorithm, MOSA [17], and with real-coded NSGA-II for six 3-objective test problems. Real-coded AMOSA is also compared with real-coded NSGA-II for some 4-, 5-, 10- and 15-objective test problems.
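The domination-amount-based acceptance and the geometric cooling schedule discussed above can be sketched together as follows. The per-objective range normalization `ranges`, the sigmoid form of `acceptance_probability`, and all names here are illustrative assumptions of ours, not the paper's exact expressions:

```python
import math

def amount_of_domination(a, b, ranges):
    """Normalized hypervolume between objective vectors a and b:
    the product, over objectives where a and b differ, of the
    absolute difference divided by that objective's range."""
    prod = 1.0
    for fa, fb, r in zip(a, b, ranges):
        if fa != fb:
            prod *= abs(fa - fb) / r
    return prod

def acceptance_probability(delta_dom, temperature):
    """Probability of accepting a dominated solution as the current
    one; shrinks as the amount of domination grows relative to the
    current temperature (one sensible sigmoid choice, assumed here)."""
    return 1.0 / (1.0 + math.exp(delta_dom / temperature))

def geometric_cooling(t_max, alpha, t_min):
    """Yield temperatures T_(k+1) = alpha * T_k until t_min is passed."""
    t = t_max
    while t >= t_min:
        yield t
        t *= alpha
```

At a given temperature, a candidate dominated by the current solution is still accepted with probability `acceptance_probability(delta_dom, t)`, which is what makes the search less greedy than pure Pareto-ranking selection.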
Several different comparison measures, such as Convergence, Purity, Minimal Spacing, Spacing, and the time taken, are used for the purpose of comparison. In this regard, a measure called displacement has also been used that is able to reflect whether a front is close to the PO front as well as its extent of coverage. A complexity analysis of AMOSA is performed. It is found that its complexity is greater than that of PAES but smaller than that of NSGA-II.

It is seen from the given results that the performance of the proposed AMOSA is better than that of MOSA and NSGA-II in a majority of the cases, while PAES performs poorly in general. AMOSA is found to provide more distinct solutions than NSGA-II in each run for all the problems; this is a desirable feature in MOO. AMOSA is less time consuming than NSGA-II for complex problems like ZDT1, ZDT2 and ZDT6. Moreover, for problems with many objectives, the performance of AMOSA is found to be much better than that of NSGA-II. This is an interesting and appealing feature of AMOSA, since Pareto ranking-based MOEAs such as NSGA-II [19] do not work well on many-objective optimization problems, as pointed out in some recent studies [26], [27]. An interesting feature of AMOSA, as in other versions of multi-objective SA algorithms, is that it has a non-zero probability of allowing a dominated solution to be chosen as the current solution in favour of a dominating solution. This makes the algorithm less greedy in nature, thereby leading to better performance for complex and/or deceptive problems. Note that it may be possible to incorporate this feature, as well as the concept of the amount of domination, in other MOO algorithms in order to improve their performance.

There are several ways in which the proposed AMOSA algorithm may be extended in the future. The main time-consuming procedure in AMOSA is the clustering part.
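To illustrate why this step dominates the running time, an archive-limiting reduction based on single-linkage clustering can be sketched as follows. This is a minimal formulation of ours, assuming Euclidean distance in objective space and a nearest-to-centroid representative per cluster; it is not the paper's implementation:

```python
import math

def single_linkage_reduce(points, k):
    """Merge the two closest clusters (single linkage) until k clusters
    remain, then keep the member of each cluster nearest its centroid."""
    clusters = [[p] for p in points]
    while len(clusters) > k:
        # Find the pair of clusters with the smallest single-linkage
        # distance (minimum over all cross-cluster point pairs).
        best, best_d = (0, 1), float("inf")
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(math.dist(p, q)
                        for p in clusters[i] for q in clusters[j])
                if d < best_d:
                    best_d, best = d, (i, j)
        i, j = best
        clusters[i].extend(clusters.pop(j))
    reps = []
    for c in clusters:
        centroid = tuple(sum(xs) / len(c) for xs in zip(*c))
        reps.append(min(c, key=lambda p: math.dist(p, centroid)))
    return reps
```

Each merge scans all pairs of clusters, so reducing an archive of n points costs on the order of n^2 distance evaluations per merge, consistent with clustering being the bottleneck.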
Other, more efficient clustering techniques, or even the PAES-like grid-based strategy, could be incorporated to improve its performance. Implementation of AMOSA with an unconstrained archive is another interesting area to pursue in the future. An algorithm, unless analyzed theoretically, is good only for the experiments conducted. Thus a theoretical analysis of AMOSA needs to be performed in the future in order to study its convergence properties. The authors are currently trying to develop a proof of the convergence of AMOSA along the lines of the proof for single-objective SA given by Geman and Geman [4]. As has been mentioned in [18], there are no firm guidelines for choosing the parameters of an SA-based algorithm. Thus, an extensive sensitivity study of AMOSA with respect to its different parameters, notably the annealing schedule, needs to be performed. Finally, the application of AMOSA to several real-life domains, e.g., VLSI system design [34], remote sensing imagery [35], data mining and bioinformatics [36], needs to be demonstrated. The authors are currently working in this direction.

REFERENCES

[1] K. Deb, Multi-objective Optimization Using Evolutionary Algorithms. England: John Wiley and Sons, Ltd, 2001.
[2] C. Coello, D. V. Veldhuizen, and G. Lamont, Evolutionary Algorithms for Solving Multi-Objective Problems. Boston: Kluwer Academic Publishers, 2002.
[3] S. Kirkpatrick, C. Gelatt, and M. Vecchi, "Optimization by simulated annealing," Science, vol. 220, pp. 671-680, 1983.
[4] S. Geman and D. Geman, "Stochastic relaxation, Gibbs distributions and the Bayesian restoration of images," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 6, no. 6, pp. 721-741, 1984.
[5] S. Bandyopadhyay, U. Maulik, and M. K. Pakhira, "Clustering using simulated annealing with probabilistic redistribution," International Journal of Pattern Recognition and Artificial Intelligence, vol. 15, no. 2, pp. 269-285, 2001.
[6] R. Caves, S. Quegan, and R. White, "Quantitative comparison of the performance of SAR segmentation algorithms," IEEE Transactions on Image Processing, vol. 7, no. 11, pp. 1534-1546, 1998.
[7] U. Maulik, S. Bandyopadhyay, and J. Trinder, "SAFE: An efficient feature extraction technique," Journal of Knowledge and Information Systems, vol. 3, pp. 374-387, 2001.
[8] P. Czyzak and A. Jaszkiewicz, "Pareto simulated annealing - a metaheuristic technique for multiple-objective combinatorial optimization," Journal of Multi-Criteria Decision Analysis, vol. 7, pp. 34-47, 1998.
[9] M. Hapke, A. Jaszkiewicz, and R. Slowinski, "Pareto simulated annealing for fuzzy multi-objective combinatorial optimization," Journal of Heuristics, vol. 6, no. 3, pp. 329-345, 2000.
[10] A. Suppapitnarm, K. A. Seffen, G. T. Parks, and P. Clarkson, "A simulated annealing algorithm for multiobjective optimization," Engineering Optimization, vol. 33, pp. 59-85, 2000.
[11] D. K. Nam and C. Park, "Multiobjective simulated annealing: a comparative study to evolutionary algorithms," Int. J. Fuzzy Systems, vol. 2, no. 2, pp. 87-97, 2000.
[12] E. L. Ulungu, J. Teghem, P. Fortemps, and D. Tuyttens, "MOSA method: a tool for solving multiobjective combinatorial decision problems," Journal of Multi-Criteria Decision Analysis, vol. 8, pp. 221-236, 1999.