CREATIVE MATH. & INF.
16 (2007), 42 – 53
Dedicated to Professor Ioan A. RUS on the occasion of his 70th anniversary
Solving the generalized minimum spanning tree
problem with simulated annealing
PETRICĂ POP, COSMIN SABO, CORINA POP SITAR and MARIAN V. CRĂCIUN
ABSTRACT. We consider a generalization of the minimum spanning tree problem, called the generalized minimum spanning tree problem and denoted GMST. The GMST problem is known to be NP-hard. We present an effective algorithm for this problem that combines a simulated annealing (SA) algorithm with a local greedy algorithm. The proposed heuristic found optimal solutions for graphs with up to 280 nodes and solutions within at most 24% of optimality for larger problems, on which the existing algorithms from the literature become computationally intractable.
1. INTRODUCTION
Many combinatorial optimization problems are NP-hard, and the theory of NP-completeness has reduced hopes that NP-hard problems can be solved within polynomially bounded computation times. Nevertheless, sub-optimal solutions are sometimes easy to find. Consequently, there is much interest in approximation and heuristic algorithms that can find near-optimal solutions within reasonable running time. Heuristic algorithms are typically among the best strategies in terms of efficiency and solution quality for problems of realistic size and complexity.
In contrast to individual heuristic algorithms that are designed to solve a specific problem, metaheuristics are strategic problem-solving frameworks that can be adapted to solve a wide variety of problems. Metaheuristic algorithms are widely recognized as among the most practical approaches to combinatorial optimization problems. Representative metaheuristics include genetic algorithms, simulated annealing and tabu search. Useful references regarding metaheuristic methods can be found in [7], [12] and [14].
The GMST problem was introduced in [11] and is defined on an undirected graph G = (V, E) whose nodes are partitioned into a number of subsets (clusters) and whose edges have nonnegative costs. The GMST problem asks for a minimum-cost tree in G that includes exactly one node from each cluster. Note that the well-known minimum spanning tree problem is the special case of the GMST problem in which each cluster consists of a single node. The GMST problem has several applications to location and telecommunication problems.
The GMST problem was solved to optimality for instances with up to 200 nodes by [3], using a branch-and-cut algorithm, and by [13], using a rooting procedure based on a new mixed integer formulation of the GMST problem, but the problem becomes computationally intractable for larger instances.
Received: 20.09.2006. In revised form: 10.01.2007
2000 Mathematics Subject Classification. 90B10, 90C10, 90C27, 90C39.
Key words and phrases. Generalized minimum spanning tree problem, dynamic programming, simulated annealing.
More information on the problem and its applications can be found in [4], [11, 13].
The aim of this paper is to provide:
(i) an exact exponential-time algorithm for the GMST problem, which has polynomial complexity when the number of clusters is fixed;
(ii) an effective heuristic based on simulated annealing.
The paper is organized as follows. In Section 2 we give the formal definition of the GMST problem and discuss its complexity. The exact algorithm is presented in Section 3. Two integer programming formulations of the GMST problem are presented in Section 4. In Section 5, we present an overview of simulated annealing. In Section 6, we solve the GMST problem with simulated annealing combined with a local greedy algorithm. In Section 7, we present the details of the heuristic implementation and the computational results obtained on various instances of the problem and compare them with previous results. Finally, concluding remarks are presented in Section 8.
2. DEFINITION AND COMPLEXITY OF THE GMST PROBLEM
Let G = (V, E) be an n-node undirected graph. Let V1, ..., Vm be a partition of V into m subsets called clusters (i.e., V = V1 ∪ V2 ∪ ... ∪ Vm and Vl ∩ Vk = ∅ for all l, k ∈ {1, ..., m} with l ≠ k). We assume that edges are defined between all nodes which belong to different clusters. We denote the cost of an edge e = (i, j) ∈ E by cij or by c(i, j); the edge costs are assumed to be integers.
The generalized minimum spanning tree (GMST) problem asks for a minimum-cost tree T spanning a subset of nodes which includes exactly one node from each cluster Vi, i ∈ {1, ..., m}. We will call such a tree a generalized spanning tree.
In [11] it is proved that the GMST problem is NP-hard. [13] proved a stronger result:
Theorem 2.1. The Generalized Minimum Spanning Tree problem on trees is NP-hard.
The proof of this result is based on a polynomial reduction of the set covering problem, which is known to be NP-hard (see for example [5]), to the GMST problem defined on trees.
3. AN EXACT ALGORITHM FOR THE GMST PROBLEM
In this section, we present an algorithm that finds an exact solution to the GMST problem based on dynamic programming (see also [13]).
Let G′ be the graph obtained from G by replacing all nodes of a cluster Vi with a supernode representing Vi. For convenience, we identify Vi with the supernode representing it. Edges of the graph G′ are defined between each pair of the vertices {V1, ..., Vm}.
Given a spanning tree of G′, which we shall refer to as a global spanning tree, we use dynamic programming in order to find the corresponding best (w.r.t. cost minimization) generalized spanning tree.
Fix an arbitrary cluster Vroot as the root of the global spanning tree and orient all the edges away from the vertices of Vroot according to the global spanning tree. A directed edge ⟨Vk, Vl⟩ of G′, resulting from the orientation of the edges of the global spanning tree, naturally defines an orientation ⟨i, j⟩ of an edge (i, j) ∈ E, where i ∈ Vk and j ∈ Vl. Let v be a vertex of cluster Vk for some 1 ≤ k ≤ m. All such nodes v are potential candidates to be incident to an edge of the global spanning tree. On the graph G, we denote by T(v) the subtree rooted at such a vertex v; T(v) includes all vertices reachable from v under the above orientation of the edges of G, based on the orientation of the edges of the global spanning tree. The children of v ∈ Vk, denoted by C(v), are those vertices u ∈ Vl which are heads of the directed edges ⟨v, u⟩ in this orientation. The leaves of the tree are those vertices that have no children.
Let W(T(v)) denote the minimum weight of a generalized subtree rooted at v. We want to compute
min_{r ∈ Vroot} W(T(r)).
We are now ready to give the dynamic programming recursion to solve the subproblem W(T(v)). The initialization is:
W(T(v)) = 0, if v ∈ Vk and Vk is a leaf of the global spanning tree.
To compute W(T(v)) for a vertex v ∈ V interior to a cluster, i.e., to find the optimal solution of the subproblem W(T(v)), we have to look at all vertices from the clusters Vl such that C(v) ∩ Vl ≠ ∅. If u denotes a child of the interior vertex v, then the recursion for v is as follows:
W(T(v)) = Σ_{l : C(v) ∩ Vl ≠ ∅} min_{u ∈ Vl} [c(v, u) + W(T(u))].
Hence, for fixed v we have to check at most n vertices. Consequently, for a given global spanning tree, the overall complexity of this dynamic programming algorithm is O(n^2). Since by Cayley's formula (see [1]) the number of all distinct global spanning trees is m^(m−2), we have established the following.
Theorem 3.2. There exists a dynamic programming algorithm which provides an exact solution to the GMST problem in O(m^(m−2) n^2) time, where n is the number of nodes and m is the number of clusters in the input graph.
Clearly, the above is an exponential-time algorithm unless the number of clusters m is fixed.
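The recursion above, for one fixed global spanning tree, can be sketched as follows. The dictionary-based data layout (clusters, cost, children) is our illustrative assumption; the paper's own implementation was written in Java.

```python
def best_generalized_tree(clusters, cost, children, root_cluster):
    """clusters: dict cluster_id -> list of node ids.
    cost: dict (i, j) -> cost of edge (i, j), with i in the parent cluster.
    children: dict cluster_id -> child cluster ids, i.e. the global
              spanning tree oriented away from root_cluster.
    Returns min_{r in V_root} W(T(r)) for this global spanning tree."""
    memo = {}

    def W(v, k):
        # minimum weight of a generalized subtree rooted at node v of cluster k
        if (v, k) in memo:
            return memo[(v, k)]
        total = 0  # a leaf cluster has no children, so W = 0 (initialization)
        for l in children[k]:                      # every child cluster V_l
            total += min(cost[(v, u)] + W(u, l)    # cheapest node u in V_l
                         for u in clusters[l])
        memo[(v, k)] = total
        return total

    return min(W(r, root_cluster) for r in clusters[root_cluster])
```

To solve the problem exactly one would run this over all m^(m−2) global spanning trees, which is exactly the O(m^(m−2) n^2) bound of Theorem 3.2.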
4. INTEGER PROGRAMMING FORMULATIONS
The GMST problem can be formulated as an integer program in many different ways, cf. [3], [4], [11] and [13].
For example, introducing the variables xe ∈ {0,1}, e ∈ E, and zi ∈ {0,1}, i ∈ V, to indicate whether an edge e, respectively a node i, is contained in the spanning tree, a feasible solution to the GMST problem can be seen as a cycle-free subgraph with m − 1 edges, one node selected from every cluster, connecting all the clusters. Therefore the GMST problem can be formulated as the following 0-1 integer programming problem:

    min  Σ_{e ∈ E} ce xe
    s.t. z(Vk) = 1,            ∀ k ∈ K = {1, ..., m}            (4.1)
         x(E(S)) ≤ z(S − i),   ∀ i ∈ S ⊂ V, 2 ≤ |S| ≤ n − 1    (4.2)
         x(E) = m − 1                                           (4.3)
         xe ∈ {0,1},           ∀ e ∈ E                          (4.4)
         zi ∈ {0,1},           ∀ i ∈ V.                         (4.5)

Here E(S) = {e = (i, j) : i, j ∈ S}, S ⊆ V, and we use the standard shorthand notations:
    x(F) = Σ_{e ∈ F} xe, F ⊆ E,   and   z(S) = Σ_{i ∈ S} zi, S ⊆ V.
For simplicity we write S − i instead of S \ {i}. In the above formulation, constraints (4.1) guarantee that from every cluster we select exactly one node, constraints (4.2) eliminate all the subtours and, finally, constraint (4.3) guarantees that the selected subgraph has m − 1 edges.
This formulation, introduced by [11], is called the generalized subtour elimination formulation, since constraints (4.2) eliminate all the cycles.
We may replace the subtour elimination constraints (4.2) by connectivity constraints, resulting in the so-called generalized cutset formulation, also introduced by [11]:

    min  Σ_{e ∈ E} ce xe
    s.t. (4.1), (4.3), (4.4), (4.5) and
         x(δ(S)) ≥ zi + zj − 1,   ∀ i ∈ S ⊂ V, j ∉ S            (4.6)

where for S ⊆ V the cutset, denoted by δ(S), is defined as usual:
    δ(S) = {e = (i, j) ∈ E | i ∈ S, j ∉ S}.
We observe that both formulations have an exponential number of constraints, since they range over subsets of V (constraints (4.2) and (4.6)).
5. SIMULATED ANNEALING
Simulated annealing (SA) is a global optimization method that distinguishes between different local optima and is inspired by Monte Carlo methods in statistical mechanics. The ideas that form the basis of simulated annealing were first published by [10] as an algorithm to simulate the cooling of material in a heat bath, a process known as annealing.
[9] and [2] suggested that this type of simulation could be used to search the feasible solutions of an optimization problem. Their approach can be regarded as a variant of the local search technique.
A central problem for local search is the occurrence of local optima, i.e. nodes in the search space where no neighbor strictly improves on the current node in terms of the cost function, but which are not global optima. In order to overcome local optima, the simulated annealing heuristic allows non-improving local moves, but their frequency is governed by a probability function which changes as the algorithm progresses. It has been shown (see [6]) that with a "large enough" initial temperature and a proper cooling schedule, SA guarantees a globally optimal solution. However, this theoretical result is of little practical help, since the annealing time needed to ensure a significant probability of success will usually exceed the time required for a complete search of the solution space.
The annealing algorithm for a minimization problem with solution space S, objective function c and neighborhood structure N can be stated as follows:

    Select a starting solution s0 ∈ S, an initial temperature T0 > 0, a
    temperature reduction function α and the number of iterations per
    temperature L;
    Set T = T0;
    Repeat
        Repeat
            Randomly select s ∈ N(s0) and compute δ = c(s) − c(s0);
            If δ < 0
                then s0 = s
                else generate random p uniformly distributed in (0, 1);
                     if p < exp(−δ/T) then s0 = s
        Until iteration count = L;
        Set T = α(T);
    Until stopping condition is true.

    s0 approximates the optimal solution.
Here L represents the number of repetitions at each temperature, i.e. the number of randomly chosen candidate solutions in the neighborhood of the current solution that are evaluated at each stage. If a candidate solution improves on the current solution, it is accepted. Otherwise, it is accepted with probability p(T, δ) = exp(−δ/T), which depends on the control parameter T (the temperature in the physical equivalent) and the amount δ by which a move worsens the current solution. This relation ensures that the probability of accepting a move to a very poor solution is very small. At the completion of each stage, the temperature is reduced. The way the temperature is reduced is known as the cooling schedule. Given a relatively high temperature T at the beginning of the process, the probability of accepting non-improving moves is fairly high. As the process continues, the temperature decreases and non-improving moves become less likely.
The search is continued until there is some evidence to suggest that there is a very low probability of improving on the best solution found so far. At this stage the system is said to be frozen.
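The pseudocode above can be transcribed directly into a generic, problem-independent routine. This is a minimal sketch, not the paper's Java implementation; the problem-specific pieces (cost, neighbour, initial solution) are supplied as parameters.

```python
import math
import random

def simulated_annealing(s0, cost, neighbour, T0, alpha, L, n_stages, seed=None):
    """Generic minimization by simulated annealing.
    neighbour(s, rng) returns a random element of N(s);
    alpha is the temperature reduction function, e.g. lambda t: 0.95 * t."""
    rng = random.Random(seed)
    current, current_cost = s0, cost(s0)
    best, best_cost = current, current_cost
    T = T0
    for _ in range(n_stages):            # outer loop: one pass per temperature
        for _ in range(L):               # L candidate moves at this temperature
            s = neighbour(current, rng)
            delta = cost(s) - current_cost
            # accept improving moves always; worsening ones with prob exp(-delta/T)
            if delta < 0 or rng.random() < math.exp(-delta / T):
                current, current_cost = s, current_cost + delta
                if current_cost < best_cost:
                    best, best_cost = current, current_cost
        T = alpha(T)                     # cool according to the schedule
    return best, best_cost
```

For example, minimizing f(x) = x² over the integers with the neighbourhood {x − 1, x + 1} drives the search toward 0 as the temperature falls.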
6. SOLVING THE GMST PROBLEM WITH SIMULATED ANNEALING
In order to implement the simulated annealing algorithm for the GMST problem a number of decisions must be made. These can be divided into two categories: generic decisions, which are concerned with parameters of the annealing algorithm itself, including the initial temperature, the cooling schedule (governed by the parameter L, i.e. the number of repetitions, and the choice of the temperature reduction function α) and the stopping condition; and problem-specific decisions, which involve the choice of the solution space, the form of the cost function, an initial solution (initialization) and the neighborhood structure employed.
In our implementation of the simulated annealing algorithm for the GMST problem we chose the following cooling schedule:
1. Initial value of the control parameter, T0.
Following [8], we determine T0 by calculating the average increase in cost, δ+, over a number of random transitions:
    T0 = δ+ / ln(χ0^(−1)),
where χ0 = 0.8 is a given acceptance rate.
2. Final value of the control parameter.
The stopping criterion, determining the final value of the control parameter, was to fix the number of values Tk for which the algorithm is to be executed, ending at a suitably low temperature.
3. Number of repetitions at each temperature.
The number of iterations at each temperature, which is related to the size of the neighborhoods, may vary from temperature to temperature. It is important to spend a long time at lower temperatures to ensure that a local optimum has been fully explored. Therefore we increased the value of L arithmetically (by adding a constant factor).
4. Decrement of the control parameter.
We used the following decrement rule:
    α(T) = rT,
where r is a constant smaller than, but close to, 1, called the cooling rate. As in [8], we used r = 0.95, which corresponds to a fairly slow cooling.
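The rule in item 1 can be checked numerically: choosing T0 = δ+ / ln(χ0^(−1)) makes a transition with the average cost increase δ+ accepted with probability exactly χ0, since exp(−δ+/T0) = χ0. A sketch, where the sampled cost increases are hypothetical:

```python
import math

def initial_temperature(sample_deltas, chi0=0.8):
    """T0 such that the average observed cost increase is accepted
    with probability chi0 (Johnson et al.-style rule, item 1 above)."""
    delta_plus = sum(sample_deltas) / len(sample_deltas)  # average increase
    return delta_plus / math.log(1.0 / chi0)              # = delta+ / ln(chi0^-1)
```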
In what follows we present the problem-specific decisions.
6.1. Solution space. We define the solution space as the set of all feasible solutions. Suppose that the graph G = (V, E) consists of n nodes which are partitioned into m clusters V1, ..., Vm having the same number of nodes, i.e.
    |V1| = ... = |Vm| = n/m.
By Cayley's formula we know that the total number of global spanning trees (i.e. trees connecting the clusters) is equal to m^(m−2). Given a global spanning tree, there are (n/m)^m possible generalized spanning trees (feasible solutions of the GMST problem).
Therefore the solution space of the GMST problem contains
    m^(m−2) · (n/m)^m = n^m · m^(−2)
elements. The number of elements in the solution space is exponentially large, so they cannot be explored exhaustively.
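The counting argument above is easy to verify on a small case: m^(m−2) global spanning trees (Cayley), each admitting (n/m)^m generalized spanning trees, give n^m · m^(−2) feasible solutions in total.

```python
def solution_space_size(n, m):
    """Number of feasible GMST solutions for n nodes in m equal clusters:
    m^(m-2) global trees times (n/m)^m node selections per global tree."""
    assert n % m == 0, "equal-sized clusters assumed"
    return m ** (m - 2) * (n // m) ** m   # equals n**m // m**2 when m divides n
```

For instance, m = 4 clusters of 3 nodes each (n = 12) give 4² · 3⁴ = 1296 = 12⁴ / 4² feasible solutions.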
6.2. Objective function. We consider the objective function described in the two integer programming formulations of the GMST problem; see Section 4.
6.3. Initialization. Simulated annealing is an improvement heuristic which requires an initial solution. It is desirable to start with a relatively good solution (as compared, for example, with a randomly generated one); otherwise a great deal of effort will be spent simply reaching the first local minimum.
In our case, an initial feasible solution for the GMST problem was found using the following greedy algorithm:
• Input: A connected graph G = (V, E) with the nodes partitioned into m clusters and positive-cost edges defined between nodes from different clusters.
• Output: A generalized spanning tree T = (W, F) of G.

    F := ∅                           (F is the edge set of the current tree T)
    W := {vi}, vi ∈ Vk, 1 ≤ k ≤ m    (W is the node set of T)
    while |W| < m do
        Among all the edges with exactly one end in W, find an edge (vi, vj)
        of minimum cost; whenever we choose a node from a cluster, delete
        all the other nodes from that cluster and all the edges adjacent
        to the deleted nodes;
        F := F ∪ {(vi, vj)}
        W := W ∪ ({vi, vj} \ W)
    end (while)
An algorithm similar to the greedy algorithm can be applied in order to find a feasible solution for the GMST problem (i.e. a generalized spanning tree of G) if we start from a generalized tree T = (W, F) of G (with |W| < m) instead of a randomly selected node in a random cluster. We will call this algorithm the local greedy algorithm.
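The greedy initialization above can be sketched as a Prim-like growth in which choosing a node from a cluster discards the rest of that cluster. The dictionary-based data layout is our illustrative choice, not the paper's Java implementation.

```python
import random

def greedy_initial_tree(clusters, cost, seed=None):
    """clusters: dict cluster_id -> list of node ids.
    cost: dict (i, j) -> edge cost, defined (in both orders) for every
    pair of nodes in different clusters.
    Returns (W, F): one chosen node per cluster and the tree edges."""
    rng = random.Random(seed)
    start_cluster = rng.choice(list(clusters))
    start = rng.choice(clusters[start_cluster])
    W = {start}                      # chosen nodes, one per visited cluster
    F = []                           # edges of the generalized spanning tree
    visited = {start_cluster}
    while len(visited) < len(clusters):
        best = None
        for v in W:                  # cheapest edge leaving W into a new cluster
            for k, nodes in clusters.items():
                if k in visited:
                    continue
                for u in nodes:
                    if best is None or cost[(v, u)] < best[0]:
                        best = (cost[(v, u)], v, u, k)
        _, v, u, k = best
        F.append((v, u))
        W.add(u)
        visited.add(k)               # other nodes of cluster k are discarded
    return W, F
```

The local greedy algorithm differs only in its starting point: it begins from a partial generalized tree (W, F) with |W| < m rather than from a single random node.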
6.4. Neighborhood structure. The neighborhood is one of the most important factors in designing a simulated annealing algorithm. It should be designed such that the structures common to good solutions are preserved after the neighborhood operation is applied.
FIGURE 1. A current solution and a modified solution of the GMST problem
In the case of the GMST problem, we defined a neighborhood structure as follows: the neighborhood of any given generalized spanning tree T is the set of all generalized spanning trees obtained using the local greedy algorithm for any subtree of T. The size of the neighborhood of a given generalized spanning tree T is given by its number of subtrees. The maximum number of subtrees of a tree with m nodes is attained by trees with all nodes on exactly one level except the root, which has degree m − 1. Therefore, in this case, the maximum size of the neighborhood is 2^(m−1) + m − 1, see [15]. Given a current solution of the GMST problem, i.e. a generalized spanning tree T, for any subtree U of T, using the local greedy algorithm we obtain a candidate solution for the simulated annealing in the neighborhood of the current solution. A current solution and a modified solution are shown in Figure 1.
In Figure 1, we consider a graph G whose nodes are partitioned into 6 clusters and a current solution T of the problem. Selecting randomly a subtree U of T (for example, discarding the edges connecting the nodes from clusters V4 with V5 and V2 with V5), we apply the local greedy algorithm in order to obtain the best modified solution in the neighborhood of T given the subtree U.
7. IMPLEMENTATION DETAILS AND COMPUTATIONAL RESULTS
Our computational experiments were performed on a Dual Intel Pentium III 1000 MHz computer with 512 MB RAM. The combined simulated annealing and local greedy procedures were written in Java.
After some preliminary tests, we chose the following cooling schedule: we start with a temperature T0 = 100, the cooling rate r = 0.95 is used to reduce the temperature at each stage, and L = 30 moves are evaluated at each stage.
We considered an undirected graph G = (V, E) having n nodes which are partitioned into m clusters such that each cluster contains the same number of nodes. Edges are defined between all the nodes from different clusters and the costs of the edges were generated randomly in the interval [1, 100]. For each type of instance (n, m) we generated randomly five different problems, i.e. with different edge costs. The instances that we used are similar to those considered by Myung [11] and Feremans [3].
In the next table we compare the computational results obtained for solving the GMST problem using simulated annealing combined with a local greedy algorithm with the computational results given by [3] and [13].
Table 1: Computational results for solving the GMST problem
Pb. size SA algorithm Rooting procedure Branch and cut
m n UB/OPT CPU LB/OPT CPU LB/UB CPU
8 24 100 2.6 100 0.0 100 0.0
32 100 3.7 100 0.0 100 0.2
48 100 5.6 100 0.3 100 1.4
80 100 10.2 100 2.2 100 4.2
10 30 100 4.8 100 0.2 100 1.0
40 100 6.7 100 0.3 100 1.0
60 100 10.8 100 0.7 100 3.2
100 100 20.6 100 1.9 100 8.8
12 36 100 8.3 100 0.7 100 1.8
48 100 11.6 100 0.9 99.2 3.2
72 100 19.1 100 3.7 100 6.8
120 100 39.0 100 9.3 – –
15 45 100 16.6 100 1.3 100 3.6
60 100 21.6 100 1.7 97.1 6.2
90 100 39.7 100 9.6 100 21.4
150 100 80.2 100 34.1 98.8 42.4
18 54 100 27.2 100 2.6 99.5 7.6
108 100 68.1 100 13.8 – –
180 100 143.8 100 150.4 – –
20 60 100 37.5 100 7.2 – –
120 100 181.9 100 33.2 96.3 39.8
200 100 219.8 100 203.8 94.6 191.4
25 75 100 80.8 100 17.5 – –
150 100 221.7 100 266.4 88.3 178.8
200 100 331.1 100 284.5 97.8 140.6
30 90 100 150.4 100 60.2 – –
180 100 412.0 100 721.8 96.6 114.6
240 100 627.4 100 1948.9 – –
40 120 100 382.1 100 142.6 100 92.6
160 100 561.0 100 572.3 94.2 288.6
280 100 691.2 100 25672.3 – –
The first two columns in the table give the size of the problem: the number of clusters (m) and the number of nodes (n). The next two columns describe the simulated annealing procedure and contain: the upper bounds obtained as a percentage of the optimal value of the GMST problem (UB/OPT) and the computational times (CPU), in seconds, for solving the GMST problem. The remaining columns contain the lower bounds as a percentage of the optimal value of the GMST problem (LB/OPT) and the computational times (CPU), in seconds, for the rooting procedure (see [13]), followed by the lower bounds as a percentage of the upper bound of the GMST problem (LB/UB) and the computational times (CPU), in seconds, for the branch-and-cut algorithm (see [3]). In the last two columns, '-' means that [3] did not provide computational results for those instances.
It is important to notice that for all the instances considered in Table 1 we were able to obtain the optimal solution of the GMST problem using our simulated annealing combined with the local greedy algorithm.
From Table 1, we observe that the computational times of simulated annealing were larger than those of the other methods for instances with up to 140 nodes; for larger instances, however, while the computational times of the rooting procedure increase exponentially, those of simulated annealing remain reasonable.
For graphs containing more than 50 clusters and 200 nodes, we obtained good solutions which are within at most 24% of optimality. The computational results obtained in this case are presented in Table 2. We again considered five trials for each type of instance.
Table 2: Computational results for solving large instances of the GMST problem
(average of five trials per type of instance)
Pb. size Simulated Annealing algorithm
m n LB UB Average gap(%) initial sol. CPU
50 200 49 61 24 71 765.7
300 49 55 12 67 1620.5
75 300 74 97 10 104 3545.0
450 74 75 1 87 7440.3
100 300 99 114 15 128 6734.5
400 99 103 4 118 10940.3
600 99 99 0 107 20154.7
The first two columns in the table give the size of the problem: the number of clusters (m) and the number of nodes (n). The next four columns describe the simulated annealing procedure and contain: the lower bounds (the optimum for a graph whose nodes are partitioned into m clusters is at least m − 1), the upper bounds obtained (UB), the average gap as a percentage, the initial solutions and the computational times (CPU), in seconds, for calculating the upper bound of the GMST problem.
As Table 2 shows, for large instances of the GMST problem our simulated annealing algorithm provided good sub-optimal solutions within reasonable running time, while the other algorithms become computationally intractable.
The improvement in solution quality as the number of nodes increases for a fixed number of clusters is explained as follows: when the number of nodes per cluster increases, the number of edges increases, and with it the number of low-cost edges.
Analyzing the results presented in Tables 1 and 2, we observe that the number of clusters is a more important factor than the number of nodes for the behavior of the algorithms.
The quality of the solutions obtained using the combined simulated annealing with local greedy algorithm depended on the initial solutions. In about 40% of the instances considered we were able to get the optimal solution using the first initial solution. We also observed that good initial solutions were not necessarily improved to good solutions by simulated annealing.
8. CONCLUSIONS
In this paper we have provided an exact exponential-time algorithm for the generalized minimum spanning tree problem and shown an effective way of solving the GMST problem using simulated annealing combined with a local greedy algorithm. Our method has been tested and compared with previous methods from the literature on various small to large instances of the problem that are commonly used for comparative studies. For instances with up to 280 nodes our algorithm provides the optimal solution of the GMST problem, and for larger instances it provides good sub-optimal solutions within reasonable running time.
REFERENCES
[1] Bondy A., Thomassé S. and Thomassen C., Spanning trees of multipartite graphs, Preprint, 2001
[2] Cerny V., A thermodynamical approach to the traveling salesman problem: an efficient simulated annealing algorithm, Journal of Optimization Theory and Applications 45 (1985), 41-55
[3] Feremans C., Generalized Spanning Trees and Extensions, PhD Thesis, Université Libre de Bruxelles, Belgium, 2001
[4] Feremans C., Labbé M. and Laporte G., A Comparative Analysis of Several Formulations for the Generalized Minimum Spanning Tree Problem, Technical Report IS-MG 2000/22, Université Libre de Bruxelles, Belgium; to appear in Networks
[5] Garey M.R. and Johnson D.S., Computers and Intractability: A Guide to the Theory of NP-Completeness, Freeman, San Francisco, California, 1979
[6] Geman S. and Geman D., Stochastic Relaxation, Gibbs Distribution and the Bayesian Restoration in Images, IEEE Trans. Patt. Anal. Mach. Int. 6, 721-741
[7] Glover F.W. and Kochenberger G.A., Handbook of Metaheuristics, Kluwer Academic Publishers, 2002
[8] Johnson D., Aragon C., McGeoch L. and Schevon C., Optimization by Simulated Annealing: An Experimental Evaluation, Part I, Graph Partitioning, Operations Research 37 (1989), 865-892
[9] Kirkpatrick S., Gelatt C.D. and Vecchi M.D., Optimization by Simulated Annealing, Science, Vol. 220, No. 4598 (1983), 671-680
[10] Metropolis N., Rosenbluth A.W., Rosenbluth M.N., Teller A.H. and Teller E., Equation of State Calculations by Fast Computing Machines, Journal of Chem. Phys., Vol. 21, No. 6 (1953), 1087-1092
[11] Myung Y.S., Lee C.H. and Tcha D.W., On the Generalized Minimum Spanning Tree Problem, Networks 26 (1995), 231-241
[12] Osman I.H. and Laporte G., Metaheuristics: A Bibliography, Annals of Operations Research 63 (1996), 231-241
[13] Pop P.C., The Generalized Minimum Spanning Tree Problem, PhD thesis, University of Twente, The Netherlands, 2002
[14] Reeves C.R., Modern Metaheuristic Techniques for Combinatorial Problems, Blackwell, Oxford, 1993
[15] Szekely L.A. and Wang H., On subtrees of trees, Research report, University of South Carolina, 2004
PETRICĂ POP
NORTH UNIVERSITY OF BAIA MARE
DEPARTMENT OF MATHEMATICS AND COMPUTER SCIENCE
VICTORIEI 76, 430122 BAIA MARE, ROMANIA
E-mail address: poppetrica@yahoo.com

CORINA POP SITAR
NORTH UNIVERSITY OF BAIA MARE
DEPARTMENT OF ECONOMICS
VICTORIEI 76, 430122 BAIA MARE, ROMANIA
E-mail address: sitarcorina@yahoo.com

COSMIN SABO
NORTH UNIVERSITY OF BAIA MARE
DEPARTMENT OF MATHEMATICS AND COMPUTER SCIENCE
VICTORIEI 76, 430122 BAIA MARE, ROMANIA
E-mail address: cosmin sabo@scream.ro

MARIAN V. CRĂCIUN
UNIVERSITY "DUNĂREA DE JOS" GALAȚI
DEPARTMENT OF ELECTRICAL ENGINEERING AND COMPUTER SCIENCE
DOMNEASCĂ 47, 800008 GALAȚI, ROMANIA
E-mail address: mcraciun@ugal.ro