Tackling TSP with Self-Organized Maps and Simulated Annealing
The TSP is certainly one of the best-known combinatorial optimization problems. It was apparently first studied around 1930, yet it still appears in many applications, from logistics, navigation, and electronic circuit design to DNA sequencing. Out of the many methods to tackle it, I compare here Kohonen’s SOM and Simulated Annealing.
Kohonen’s SOM is a neural network whose neurons are positioned on a 2D map. The number of neurons is twice the number of nodes/cities (2N), and the main idea is for them to form a connected ring, placed initially at the center of mass of the cities. This position can be found by averaging the x and y coordinates of all cities. The ring acts as a kind of rubber band that is stretched as the neurons move through 2D space.

The network works like this. First, a city is chosen at random, with position vector w_city = (x_city, y_city), and the closest neuron wcl = w_n(i) = (x_n(i), y_n(i)) is found. Next, we evaluate the neighborhood function NF for each neuron. This function tells us how close each neuron is to the winning neuron wcl; its value is not the distance itself. NF equals 1 for the neuron closest to the city and is < 1 for all other neurons. This value is multiplied by the learning rate Pheta, which is typically 0.1 – 0.5. We then apply the delta learning rule and move each neuron toward the randomly chosen city, with the size of the move governed by NF. The rule for each neuron (i = 0, …, 2N) with position (vector) w_n(i) = (x, y) is: w_n(i)’ = w_n(i) + NF(wcl, w_n(i)) * Pheta * (w_city – w_n(i)), where NF(wcl, w_n(i)) = exp(−distance(wcl, w_n(i))^2 / (2 * Theta)) and Theta controls the neighborhood width. The procedure is repeated until the picture of the ring becomes static, i.e. it no longer moves. The ring ends up stretched, forming a good approximation of a TSP solution.
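To make the update step concrete, here is a minimal sketch in Python. It assumes the distance inside NF is measured along the ring (by neuron index, wrapping around); the function and variable names (som_step, eta, theta) and the parameter values are my own, not from the article.

```python
import math
import random

def som_step(cities, neurons, eta=0.3, theta=20.0):
    """One SOM update: move every neuron toward a randomly chosen city.
    eta plays the role of the article's Pheta (learning rate, 0.1-0.5),
    theta the role of its Theta (neighborhood width). In practice both
    are usually decayed over time, which is omitted here."""
    cx, cy = random.choice(cities)                       # pick a random city
    # the winner wcl: the neuron closest to the chosen city
    winner = min(range(len(neurons)),
                 key=lambda i: (neurons[i][0] - cx) ** 2 + (neurons[i][1] - cy) ** 2)
    n = len(neurons)
    for i, (x, y) in enumerate(neurons):
        # ring distance between neuron i and the winner (indices wrap around)
        d = abs(i - winner)
        d = min(d, n - d)
        nf = math.exp(-d * d / (2 * theta))              # NF = 1 at the winner, < 1 elsewhere
        # delta rule: w' = w + NF * eta * (w_city - w)
        neurons[i] = (x + eta * nf * (cx - x), y + eta * nf * (cy - y))

random.seed(1)
cities = [(random.random(), random.random()) for _ in range(20)]
# start the ring of 2N neurons at the cities' center of mass
mx = sum(c[0] for c in cities) / len(cities)
my = sum(c[1] for c in cities) / len(cities)
neurons = [(mx, my)] * (2 * len(cities))
for _ in range(10000):
    som_step(cities, neurons)
```

Since every update moves a neuron a fraction of the way toward a city, the neurons never leave the region spanned by the cities; repeated steps stretch the ring over them.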
Simulated Annealing is a well-known optimization method based on stochastic search. In my tests, Kohonen’s SOM gives the better result, even though the cities and neurons do not overlap perfectly. This is because the SOM tends to cluster cities, so some neurons represent two or more cities; sometimes the relation is one-to-one, which is the perfect result. To obtain a comparable result with Simulated Annealing, the test has to be repeated many more times than with Kohonen’s SOM. Genetic Algorithms and Simulated Annealing produce precise solutions, while the SOM only approximates and is therefore a bit faster. An implementation of an app with 52 cities (the number is adjustable), together with installation instructions, can be found here, and a presentation on the topic here.
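For comparison, here is a minimal sketch of Simulated Annealing for the TSP. The article does not fix a particular move or cooling schedule, so this assumes a common choice: 2-opt moves (reversing a segment of the tour) with geometric cooling; all names and parameter values are my own.

```python
import math
import random

def tour_length(tour, cities):
    """Total length of the closed tour through the cities."""
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def anneal(cities, t0=1.0, cooling=0.999, steps=20000, seed=0):
    """Simulated annealing with 2-opt moves and geometric cooling."""
    rnd = random.Random(seed)
    tour = list(range(len(cities)))
    rnd.shuffle(tour)
    cur_len = tour_length(tour, cities)
    best, best_len = tour[:], cur_len
    t = t0
    for _ in range(steps):
        i, j = sorted(rnd.sample(range(len(cities)), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # reverse a segment
        cand_len = tour_length(cand, cities)
        # accept better tours always, worse ones with Boltzmann probability
        if cand_len < cur_len or rnd.random() < math.exp((cur_len - cand_len) / t):
            tour, cur_len = cand, cand_len
            if cur_len < best_len:
                best, best_len = tour[:], cur_len
        t *= cooling                                           # cool down
    return best, best_len

rnd = random.Random(2)
cities = [(rnd.random(), rnd.random()) for _ in range(30)]
tour, length = anneal(cities)
```

Unlike the SOM, this converges to an exact tour (a permutation of the cities), but the quality depends heavily on the number of steps and the cooling schedule, which matches the observation above that many repetitions are needed.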