Evolutionary Tabu Search for Geometric Primitive Extraction

Jinxiang Chai
National Laboratory of Pattern Recognition, Institute of Automation, Beijing 100080, P.R. China
Email: chaij@prlsun3.ia.ac.cn

Tianzi Jiang, Member, IEEE
School of Mathematics, The University of New South Wales, Sydney 2052, Australia
and
National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing 100080, P.R. China
Email: jiangtz@maths.unsw.EDU.AU and jiangtz@prlsun6.ia.ac.cn

Song De Ma
National Laboratory of Pattern Recognition, Institute of Automation, Beijing 100080, P.R. China
Email: masd@prlsun2.ia.ac.cn

May 6, 1998
Abstract

Many problems in computer vision can be formulated as optimization problems, so developing efficient global optimization techniques adapted to vision problems is increasingly important. In this paper, we present a geometric primitive extraction method, which plays a crucial role in content-based image retrieval and other vision problems. We formulate the problem as a cost function minimization problem and present a new optimization technique called Evolutionary Tabu Search (ETS), which combines the Genetic Algorithm and Tabu Search. Specifically, we incorporate the "survival of the strongest" idea of evolutionary algorithms into tabu search. In experiments, we use our method for shape extraction in images and compare it with three other global optimization methods: the Genetic Algorithm, Simulated Annealing and Tabu Search. The results show that the new algorithm is a practical and effective global optimization method, which yields good near-optimal solutions and has better convergence speed.

Keywords: Geometric primitive, Global Optimization, Evolutionary Tabu Search, Tabu Search, Genetic Algorithm, Simulated Annealing.

This work was partially supported by the Chinese National Science Foundation and the Chinese National High Technology Program (863).
1 Introduction
Extracting predefined geometric primitives from geometric data is an important problem in the field of model-based vision because it is a prerequisite to solving other problems in model-based vision, such as pose determination, model building, object recognition, and so on. The most commonly used method of primitive extraction is the Hough transform (HT). The HT divides the parameter space of the geometric primitive into cells (usually rectangular) by quantizing each dimension into a fixed number of intervals. Each datum point adds a vote to every cell whose parameters are such that the primitive associated with that cell passes through the point. After all the points have voted, the cells which have a number of votes greater than a threshold are marked. For each such cell the associated geometric primitive is taken as a description of the points that voted for the cell, and this primitive is said to be extracted from the data[1]. The HT has been shown to be equivalent to template matching, where the templates are defined by each of the cells in parameter space[2]. Performing template matching in this way is time efficient, but space inefficient. The space requirements are proportional to the number of cells, and this number is an exponential function of the dimension of the parameter space. This means that unless the quantization of the parameter space is coarse, the HT can be practically used only for primitives with at most two degrees of freedom. In fact, the majority of applications of the HT are for line extraction[3].
Recently, Roth et al.[4] proved that extracting the best geometric primitive from a given set of geometric data is equivalent to finding the optimum value of a cost function. Once it is understood that primitive extraction is such an optimization problem, the use of any technique for tackling optimization problems suggests itself. The objective function of the global optimization problem for geometric primitive extraction has potentially many local minima. Conventional local search minimization techniques are time consuming and tend to converge to whichever local minimum they first encounter. These methods are unable to continue the search after a local minimum is reached. The key requirement for any global optimization method is that it must be able to avoid entrapment in local minima and continue the search to give a near-optimal final solution whatever the initial conditions. It is well known that Simulated Annealing[5,6] (SA) and the Genetic Algorithm[7,8] (GA) meet this requirement. Some researchers have suggested solving this problem using Genetic Algorithms[9,10].
In this paper we develop an efficient algorithm based on the combination of tabu search and evolution theory. This new shape detection method has the ability to find the global optimum; it not only keeps the advantages of Tabu Search and Genetic Algorithms, but also overcomes some of their shortcomings. Specifically, by comparing our algorithm with other existing global optimization methods (the Genetic Algorithm, Simulated Annealing and Tabu Search), we find that ETS is more practical and effective, yields good near-optimal solutions and has better convergence speed. The rest of this paper is organized as follows. Section 2 is devoted to the background of our problem. Section 3 presents the Evolutionary Tabu Search. Section 4 gives our new algorithm for geometric primitive extraction. Section 5 is devoted to the experimental comparison results. We give our conclusion in Section 6.
2 Primitive Extraction and Minimal Subsets
In this section, we briefly review facts about geometric primitive extraction and the definition of minimal subsets. We refer the reader to Roth et al.[4] for the details.
The input consists of a set of data points in two- or three-dimensional Cartesian space, labeled p_1, p_2, ..., p_N, along with the equation defining the type of geometric primitive to be extracted. We assume that this defining equation is an implicit form f(p, a) = 0, where p is a datum point and a is the parameter vector for this particular primitive. This assumption is not restrictive because it has been shown that the parametric curves and surfaces used to define the parts in CAD databases can be converted to implicit form[11]. The output consists of the parameter vector a of the best primitive, along with the subset of the geometric data that belongs to this primitive. We define the residual r_i as the closest distance of the i-th point of the geometric data to the curve or surface. Given that the residuals r_1, r_2, ..., r_N have been calculated, extracting a single instance of a geometric primitive is equivalent to finding the parameter vector a which minimizes the value of a cost function h(r_1, r_2, ..., r_N). Denote f(a) = h(r_1, r_2, ..., r_N). Therefore, extracting the best geometric primitive from a given set of geometric data is equivalent to solving the following global optimization problem:

    min_{a in A} f(a),    (1)

where A is the set of feasible solutions. A minimal subset is the smallest number of points necessary to define a unique instance of a geometric primitive. It is possible to convert from a minimal subset of points to the parameter vector a for a wide variety of geometric primitives[4]. Since only values of the parameter vector defined by these minimal subsets are potential solutions to the extraction problem, we can search for the best solution in a much smaller space.
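As an illustration of the minimal-subset-to-parameter conversion, three non-collinear points uniquely determine a circle; the sketch below recovers the parameter vector a = (xc, yc, r) from such a subset. The function name and use of NumPy are ours, not the paper's:

```python
import numpy as np

def circle_from_minimal_subset(p1, p2, p3):
    """Map a minimal subset of three non-collinear points to the
    parameter vector a = (xc, yc, r) of the unique circle through them.
    Each point gives one linear equation in (D, E, F) for the circle
    x^2 + y^2 + D*x + E*y + F = 0, so a 3x3 linear solve suffices."""
    pts = np.array([p1, p2, p3], dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(3)])
    b = -(pts[:, 0] ** 2 + pts[:, 1] ** 2)
    D, E, F = np.linalg.solve(A, b)  # LinAlgError if the points are collinear
    xc, yc = -D / 2.0, -E / 2.0
    return xc, yc, np.sqrt(xc ** 2 + yc ** 2 - F)
```

The same idea extends to other primitives (e.g., five points for a general ellipse), with a correspondingly larger linear or nonlinear solve.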
3 Evolutionary Tabu Search
In this section, we present the idea of the Evolutionary Tabu Search. For the sake of completeness, we also briefly review two other global optimization algorithms, the Genetic Algorithm and Simulated Annealing. All of them are methods for finding a global optimum in the presence of local optima.
3.1 Simulated Annealing
Simulated Annealing is a stochastic optimization algorithm based on the physical analogy of annealing a system of molecules to its ground state. Originally developed by Kirkpatrick et al.[5,22], the method has been successfully applied to a variety of hard optimization problems in different fields. Starting from an initial configuration X_0, which can be generated randomly, the simulated annealing procedure generates a stationary Markov chain of configurations X_k. An objective function f(X_k), which is to be minimized, is used to evaluate the configuration X_k. During the simulated annealing process, for a given present configuration X_k, the algorithm generates a candidate configuration Y_k. The decision of selecting either configuration X_k or configuration Y_k as the next state X_{k+1} for the following iteration is not made deterministically via a straightforward comparison of the objective function values of the configurations X_k and Y_k, but stochastically, based on a sampling of the Boltzmann distribution via the Metropolis function. If configuration Y_k has an objective function value smaller than that of configuration X_k, configuration Y_k is selected as configuration X_{k+1}. If configuration Y_k has an objective function value larger than that of configuration X_k, configuration Y_k is selected as X_{k+1} with probability p determined by the Metropolis function p = exp(-max(dF(Y_k, X_k), 0)/T_k), where dF(Y_k, X_k) = F(Y_k) - F(X_k). The set {T_k} is called an annealing schedule: a sequence of strictly monotonically decreasing nonnegative numbers such that T_1 > T_2 > ... > T_n and lim_{k->infinity} T_k = 0. Convergence (i.e., in the limit k -> infinity) is guaranteed for a logarithmic annealing schedule of the form T_k = T_1/(1 + ln k), k >= 1.

3.2 Genetic Algorithm
Genetic Algorithms (GA), first developed by Holland[12], are stochastic optimization techniques that mimic the principles of natural evolution. According to the theory of genetic evolution, stronger and fitter individuals have better chances of survival than weaker ones. Offspring are produced by parents by inheriting genetic information, or strings of chromosomes, from both parents. Possible solutions of the optimization problem resemble chromosomes in the natural process, and producing a new solution resembles producing offspring with certain chromosomes. The objective function in the optimization problem is evaluated in accordance with the "survival of the strongest" principle of the natural evolution process. The chromosomes are manipulated by genetic operators that try to simulate the effects of natural evolution. The four basic operators are Reproduction, Crossover, Mutation and Inversion; the first three resemble the natural operators, while the last is included merely to improve the simulation results. These genetic operators are discussed in detail in standard references on genetic algorithms[7,8].
3.3 Tabu Search
Here we briefly review some notions of Tabu Search[13,14,15] and outline the basic steps of the Tabu Search procedure for solving optimization problems.

Tabu Search is a metaheuristic that guides a local heuristic search procedure to explore the solution space beyond local optima. It differs from the well-known hill-climbing local search techniques because Tabu Search allows moves out of a current solution that make the objective function worse, in the hope that it will eventually reach a better solution. It also differs from Simulated Annealing and the Genetic Algorithm because Tabu Search includes a memory mechanism. According to Glover's idea, in order to solve an optimization problem using Tabu Search, the following components must be defined.
Configuration: A configuration is a solution or an assignment of values to variables.

Move: A move characterizes the process of generating a feasible solution to the problem that is related to the current solution (i.e., a move is a procedure by which a new solution is generated from the current one).

Neighborhood: A neighborhood of a solution is the collection of all possible moves out of the current configuration. Note that the actual definition of the neighborhood depends on the particular implementation and the nature of the problem.

Tabu conditions: In order to avoid a blind search, the tabu search technique uses a prescribed problem-specific set of constraints, known as tabu conditions. They are certain conditions imposed on moves which make some of them forbidden. These forbidden moves are known as tabu; they are recorded in a list of a certain size, known as the tabu list.

With the above basic components, the Tabu Search algorithm can be described as follows.
(i) Start with a certain (current) configuration and evaluate the criterion function for that configuration.

(ii) Examine the neighborhood of the current configuration, that is, a set of candidate moves. If the best of these moves is not tabu, or if it is tabu but satisfies the aspiration criterion, then pick that move and consider it to be the new current configuration; otherwise pick the best move that is not tabu and consider it to be the new current configuration.

(iii) Repeat (i) and (ii) until some termination criteria are satisfied.

The best solution found in the final loop is the solution obtained by the algorithm. Note that a move picked at a certain iteration is put in the tabu list so that it cannot be reversed in the next iterations. The tabu list has a certain size; when the length of the list reaches that size and a new move enters it, the first move on the tabu list is freed from being tabu and the process continues (i.e., the tabu list is circular).
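The loop above can be sketched in a few lines. This is a generic minimization sketch under our own assumptions (an illustrative objective and a +/-1 integer-move neighborhood), not the paper's exact procedure:

```python
import random
from collections import deque

def tabu_search(f, start, neighbors, max_iter=200, tabu_size=10, seed=0):
    """Minimal tabu search: f is the criterion function to minimize,
    `start` an initial configuration, `neighbors(x)` the candidate moves.
    The tabu list is a fixed-size circular list of recently visited
    configurations; a tabu move is still accepted if it beats the best
    value found so far (aspiration criterion)."""
    rng = random.Random(seed)
    current, best = start, start
    tabu = deque([start], maxlen=tabu_size)  # circular: oldest move is freed
    for _ in range(max_iter):
        cands = neighbors(current)
        rng.shuffle(cands)                   # break ties randomly
        cands.sort(key=f)                    # best candidate move first
        move = None
        for c in cands:
            if c not in tabu or f(c) < f(best):  # aspiration criterion
                move = c
                break
        if move is None:                     # every candidate is tabu
            continue
        current = move
        tabu.append(move)
        if f(current) < f(best):
            best = current
    return best

# Usage: minimize f(x) = (x - 7)^2 over the integers with +/-1 moves.
best = tabu_search(lambda x: (x - 7) ** 2, 0, lambda x: [x - 1, x + 1])
```

Note how the tabu list lets the search walk past a just-visited configuration instead of cycling back to it, which is exactly the memory mechanism that distinguishes Tabu Search from hill climbing.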
3.4 Evolutionary Tabu Search
The Evolutionary Tabu Search takes advantage of both the GA and Tabu Search, and it has two levels of selection, called the first-level selection and the second-level selection. We propose the use of Tabu Search in the first level and of evolutionary ideas[8,16] in the second level.

The ETS technique starts with a guess of N likely candidates, chosen at random in the search space. These candidates are the so-called parents. Initially each parent generates a number of children, say NTS, which constitute a family. The children in the same family (i.e., generated from the same parent) take part in the first-level selection: we use Tabu Search to select the child that becomes the parent for the next generation. This selection creates the parents of the next generation. The second-level selection is the competition between the families. The number of children that should be generated in the next generation depends on the results of the second-level selection, which actually provides a measure of the fitness of each family. Instead of using the objective value of a single point, which might be considerably biased, we define a fitness value based on the objective values of all the children in the same family. The number of children allocated to each family for the next generation is proportional to its fitness value, but the total number of children in the next generation stays constant. In this way, fitter individuals have a better chance of survival. It has been shown that eventually only one family survives, which is usually the best one. The first-level and second-level selection procedures continue until a certain number of iterations has been reached or an acceptable solution has been found. It is the effect of the second-level competition that provides a measure of regional information: the fitness value indicates how good a region is. If a region is found to have a higher fitness value, we allocate more search attention to that region.
4 New Algorithm for Geometric Primitive Extraction
Let m denote the size of the minimal subset of the particular primitive we want to extract, and let I = (I_1, ..., I_m) denote a minimal subset of size m, where I_i corresponds to the indices of its member points. For the sake of convenience, we assume that I_i < I_j for i < j, i, j = 1, ..., m. For any minimal subset I, the objective function f(I) is defined as follows:

    f(I) = sum_{i=1}^{N} s(r_i),    (2)

where s is a step function: s(r_i) = 0 if the residual r_i of the i-th data point to the primitive defined by I is greater than or equal to the template width, and s(r_i) = 1 otherwise. This objective function counts the number of points within a fixed distance of the geometric primitive; it effectively matches a small template around this primitive to the geometric data. Therefore, our task becomes to find the maximum of the objective function.
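For a circle, equation (2) can be sketched as follows; the function name and the default template width are illustrative choices of ours:

```python
import numpy as np

def objective(points, circle, template_width=2.0):
    """Objective of equation (2) for a circle primitive: count the data
    points whose residual (closest distance to the circle) is below the
    template width, i.e. f(I) = sum_i s(r_i) with s a step function."""
    xc, yc, r = circle
    pts = np.asarray(points, dtype=float)
    # Residual of each point: |distance-to-center - radius|.
    residuals = np.abs(np.hypot(pts[:, 0] - xc, pts[:, 1] - yc) - r)
    return int(np.sum(residuals < template_width))
```

The candidate circle itself would come from a minimal subset of three points, so the search maximizes this count over minimal subsets rather than over the raw parameter space.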
Let I_c, I_t and I_b denote the current, trial and best configurations (minimal subsets), and let f_c, f_t and f_b denote the corresponding current, trial and best objective function values, respectively. As described in the previous section, we operate with a configuration known as the current solution I_c, and then, through moves explained in what follows, we generate trial solutions I_t. As the algorithm proceeds, we also save the best solution found so far, denoted by I_b. Corresponding to these configurations, we also operate with the objective function values f_c, f_t and f_b, respectively. Before stating the two-level selection algorithm, we define the basic components for geometric primitive extraction.
Configuration: A configuration is denoted by I = (I_1, ..., I_m), where I_i corresponds to the indices of its member points.

Neighborhood: We first give a definition concerning points. Two points are called neighbor-points (or are within the Move Distance (MD)) if their Euclidean distance is less than the given value of MD. For configurations defined by I = (I_1, ..., I_m), one configuration moves to a neighbor configuration only if each point in it moves to one of its neighbor-points.

Tabu element: We assume that the number of parameters of a specific geometric primitive is m. For example, m is equal to three if the geometric primitive is a circle, whose parameters are the radius R and the coordinates of the center (X_c, Y_c). Therefore, we can use a vector A = (a_1, ..., a_m) to denote a specific geometric primitive, where a_i (i = 1, ..., m) is a parametric variable. Our tabu element can then be defined as a vector B = (b_1, ..., b_m, e), where e is an error variable and b_i (i = 1, ..., m) is the i-th parametric variable of the geometric primitive.

Tabu condition: Assume that the parametric vector of the current configuration is A_t = (a_1, ..., a_m) and the tabu element is B = (b_1, ..., b_m, e). We define the tabu condition as follows: if ||a_i - b_i|| <= e for i = 1, ..., m, then the current configuration is tabu.

Aspiration condition: If the current configuration satisfies the aspiration condition, then it is no longer tabu even if it satisfies the tabu condition. We define the aspiration condition as f_t > f_b, where f_t denotes the trial objective function value and f_b denotes the best objective function value, both computed according to equation (2).

The main steps of our ETS algorithm are as follows:
Step 1: Randomly select N_0 parents (i.e., initial families). For each family, set NTS = NTS_0 (initial number of children), e = e_0 (initial error value) and MD = MD_0 (initial move distance).

Step 2: For each family, use Tabu Search (details are given in Section 4.1) to find the parent of the next generation (the first-level selection).

Step 3: According to the second-level selection, find the number of children that will be generated by the parents of the next generation. Details are given in Section 4.2.

Step 4: According to some specific strategy, change the Move Distance (MD) and the error value (e).

Step 5: Repeat Steps 2 to 4 until an acceptable solution has been found or until a certain number of iterations (IMAX) has been reached.

4.1 Tabu Search in the first-level selection
For the geometric primitive extraction problem, our first-level selection algorithm can be described as follows.
Step 1: Let I_c be the parent of one family, and let f_c be the corresponding objective function value computed using equation (2). For the first generation, let I_b = I_c and f_b = f_c.

Step 2 (Generating the neighborhood): Use I_c and MD to generate NTS children I_t^1, I_t^2, ..., I_t^NTS (see Remark 1), evaluate their corresponding objective function values f_t^1, f_t^2, ..., f_t^NTS, and go to Step 3.

Step 3: Arrange f_t^1, f_t^2, ..., f_t^NTS in descending order and denote them by f_t^[1], f_t^[2], ..., f_t^[NTS]. If f_t^[1] is not tabu, or if it is tabu but f_t^[1] > f_b, then let I_c = I_t^[1] and f_c = f_t^[1], and go to Step 4; otherwise, let I_c = I_t^[L] and f_c = f_t^[L], where f_t^[L] is the best objective function value among f_t^[2], ..., f_t^[NTS] that is not tabu, and go to Step 4. If all of f_t^[1], f_t^[2], ..., f_t^[NTS] are tabu, change MD according to the specific strategy and go to Step 2.

Step 4: Insert the new tabu element at the bottom of the tabu list and let TLL = TLL + 1 (if TLL = MTLS + 1, delete the first element in the tabu list and let TLL = TLL - 1). If f_b < f_c, let I_b = I_c and f_b = f_c.
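The child generation used in Step 2 follows the per-index strategy of Remark 1. A sketch under our own assumptions (a precomputed neighbor-point table and an explicit random generator) is:

```python
import random

def generate_trial(I_c, neighbor_points, P, rng):
    """Generate one child from the parent minimal subset I_c: each index
    is kept with probability P, otherwise replaced by a point drawn at
    random from its neighbor-points (the points within Move Distance MD).
    `neighbor_points[i]` is assumed to list the point indices within MD
    of point i."""
    trial = []
    for idx in I_c:
        if rng.random() < P or not neighbor_points[idx]:
            trial.append(idx)                    # keep the parent's point
        else:
            trial.append(rng.choice(neighbor_points[idx]))
    return tuple(sorted(trial))                  # preserve I_1 < ... < I_m
```

Calling this NTS times with the same parent yields the family of children whose objective values are then ranked in Step 3.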
Remark 1: Given a current solution I_c, one can generate a trial solution using several strategies. We use the following one: given I_c and a probability threshold P, for i = 1, 2, ..., m, draw a random number R_i ~ u(0, 1), where u(0, 1) is the uniform distribution on the interval [0, 1]. If R_i < P, then I_t(i) = I_c(i); otherwise, draw a point l at random from the set of all neighbor-points of I_c(i) and let I_t(i) = l.

4.2 The Second Level Selection
The second-level selection algorithm is presented as follows:

Step 1: Repeat Step 2 for each family; then go to Step 3.

Step 2: According to the definition of the fitness function (see Remark 2), compute the fitness of the family.

Step 3: Sum up the fitness values of all the families and set

    NTS = N_0 * NTS_0 * F / S,    (3)

where NTS is the number of children that will be generated for that family, NTS_0 is the initial number of children generated for each family, N_0 is the initial number of families, F is the fitness of that family, and S is the sum of the fitness values of all the families.
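The allocation of equation (3) can be sketched as follows; the rounding scheme is our assumption, since the paper specifies only the proportion:

```python
def allocate_children(fitness, n0, nts0):
    """Second-level selection, equation (3): family i receives
    NTS_i = N0 * NTS0 * F_i / S children, where S is the sum of all
    fitness values, so the total number of children stays (up to
    rounding) constant at N0 * NTS0."""
    S = float(sum(fitness))
    return [round(n0 * nts0 * F / S) for F in fitness]
```

Fitter families thus receive proportionally more children in the next generation, while the overall population size, and hence the cost per generation, stays roughly fixed.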
Remark 2: Increase the fitness value by 1 whenever the objective value of a child is larger than that of its parent, or the following condition is satisfied:

    e^{-(f_t - f_b)/T} > P_0,    (4)

where f_t is the objective value of the child, f_b is the best objective value up to the current generation, T is a constant coefficient, and P_0 is a random number uniformly distributed between 0 and 1.

4.3 Simulated Annealing and Genetic Algorithm approaches to geometric primitive extraction
Now, for the sake of completeness, we briefly describe the SA and GA approaches to geometric primitive extraction.
4.3.1 Simulated Annealing-based approach to geometric primitive extraction
The main steps of our Simulated Annealing algorithm are as follows:
Step 1 (Initialization): Let I_c be an arbitrary solution, and let f_c be the corresponding objective function value computed using equation (2). Let I_b = I_c and f_b = f_c. Select values for the following parameters: T_i (initial temperature), TM (temperature multiplier), T_f (final temperature), P (probability threshold), NI (number of iterations after which the temperature is reduced if f_b has not improved) and ITmax (maximum number of iterations allowed). Let T = T_i, L = 0, k = 1.

Step 2: Obtain a neighbor I_t of configuration I_c, and compute f_t using equation (2).

Step 3: If f_t < f_c, go to Step 4; otherwise, let I_c = I_t and f_c = f_t (accept the trial solution as the new current one). If f_t < f_b, let L = L + 1 and go to Step 5; otherwise, let I_b = I_t, f_b = f_t and L = 0, and go to Step 5.

Step 4: Draw a random number p ~ u(0, 1). If p > exp((f_t - f_c)/T), go to Step 5 (reject the trial solution); otherwise, let I_c = I_t and f_c = f_t (accept the trial solution as the new current solution although it has a lower objective function value).

Step 5: If L = NI and T > T_f, let T = T * TM (see Remark 3). If k < ITmax, let k = k + 1 and go to Step 2; otherwise, stop.

Remark 3: A logarithmic annealing schedule makes simulated annealing convergent. However, in practice the logarithmic schedule is far too slow, and hence we have used a geometric annealing schedule of the form T_k = T_1 (1 - a)^k, where a is a positive real number close to zero. We have empirically found the values a = 0.03 and T_1 in [2000, 4000] to be well suited for our purpose.
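The acceptance test of Steps 3-4 and the schedule of Remark 3 can be sketched as follows; the function names are ours:

```python
import math
import random

def sa_accept(f_t, f_c, T, rng):
    """Metropolis acceptance for a maximized objective, as in Steps 3-4:
    a better (or equal) trial is always accepted, a worse one only with
    probability exp((f_t - f_c)/T), which shrinks as T cools."""
    if f_t >= f_c:
        return True
    return rng.random() <= math.exp((f_t - f_c) / T)

def geometric_schedule(T1, a, k):
    """Geometric annealing schedule of Remark 3: T_k = T1 * (1 - a)^k."""
    return T1 * (1.0 - a) ** k
```

With a = 0.03 this corresponds to a temperature multiplier TM = 0.97 applied each time the temperature is reduced in Step 5.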
4.3.2 Genetic Algorithm approach to geometric primitive extraction
Recently, G. Roth et al.[9] and E. Lutton et al.[10] have used genetic algorithms to extract geometric primitives. A basic Genetic Algorithm can be described as follows[7,8]:
Step 1: An initial population of chromosomes (i.e., configurations) is created randomly, and each individual is evaluated by the objective function.

Step 2: Two mates are selected for reproduction with probabilities in proportion to their fitness values using roulette-wheel selection.

Step 3: The Crossover, Mutation and Inversion operators are applied to the selected mates, and offspring are generated.

Step 4: Each individual offspring is evaluated by the objective function.

Step 5: Steps 2-4 are repeated until an entirely new population of chromosomes is generated.

Step 6: The previous population is replaced by the new population.

Step 7: If the stopping criterion is not reached, go to Step 2; otherwise, return the best chromosome (i.e., configuration) in the new population as the solution to the problem.

In addition to the basic genetic algorithm operators, we have incorporated an elitism strategy and intelligent mutation in our genetic algorithm-based geometric primitive extraction.
5 Experimental Results
In this section, we present our experimental results on extracting various kinds of geometric primitives. We also compare our method with three other global optimization methods: the Genetic Algorithm (GA), Simulated Annealing (SA) and Tabu Search (TS).
As mentioned in the previous section, ETS has four important parameters: MTLS, N_0, NTS_0 and P. The tabu list gives the algorithm short-term memory. A large tabu list size allows more diversification, while a small list size makes the algorithm more forgetful, i.e., allows intensification to happen. Determining the list size is a non-trivial problem. Glover[17] suggests using a tabu list size in the range [n/3, 3n], where n is the size of the problem. Recently, some researchers[18,19] have proposed a variable tabu list size (tabu tenure). The next important parameter is the size of the initial family (N_0). If it is too small, the algorithm will resemble Tabu Search; in particular, when N_0 equals 1, ETS becomes Tabu Search. If it is too large, the convergence speed will suffer. Thus selecting this value is a tradeoff between the final extraction result and convergence speed. The third parameter is the initial number of children (NTS_0), which also affects the convergence speed. In our experiments, NTS_0 in [2, 8]. The last parameter is P.
Figure 1: Extracting multiple ellipses: (a) real image; (b) edge pixels; (c) extracted ellipses; (d) extracted ellipses superimposed on edge pixels.
The larger the value of P, the more points of the current solution are kept in a move, and consequently the closer the neighbor (the solution obtained after the move) is to the current solution, and vice versa. In our case, we have found P in [0.3, 0.5] to be well suited to our purpose.

To extract primitives from real images, the first step is to obtain the geometric data, which can be acquired by active sensors or by edge detection. In our experiments, the geometric data are produced by a zero-crossing edge detector[20].
To extract multiple primitives, we simply repeat the algorithm presented in the section above. Following the first application of the algorithm, the data points belonging to the first geometric primitive are removed, and the next application of the algorithm takes the remaining geometric data points as input. A more complex situation is the extraction not only of multiple primitives, but also of different types of primitives. In this case, we apply the algorithm for circles and ellipses simultaneously on the same geometric data. The best primitive is extracted and the algorithm is repeated.
Extracting circles and ellipses using the HT is still an active area of research. When the number of parameters of these primitives is higher than two (3 for a circle and 5 for an ellipse), the HT is space inefficient and therefore has difficulty extracting circles, and especially ellipses. Our approach has no such difficulty, as can be seen from Fig. 2 and Fig. 3.
Fig. 2 shows the extraction of four ellipses (here a circle is considered a special kind of ellipse). This example is significant in that: (1) the background is complex and the ellipses are occluded; (2) the small ellipse in the lower-right corner is successfully extracted, which would be very difficult for the HT. The last step of the HT is peak detection, and detecting a very small real peak is difficult. The situation becomes even more serious when the original images are contaminated by noise.
Figure 3: Cost function value of the best primitive in Figure 2 versus number of iterations.
close together makes this a particularly difficult example for the HT method[21] (because of the coarse quantization necessary for the HT to deal with higher-dimensional spaces), while our global optimization algorithms can fulfill the extraction task quickly and correctly, as can be seen from this figure. Note: in Figure 3, the solid line is ETS, the dashed line is Tabu Search, the dotted line is the Genetic Algorithm, and the dash-dot line is Simulated Annealing.
Now we compare ETS with the three other global optimization algorithms: GA, SA and TS. Fig. 3 gives the comparison results of the four algorithms in the extraction of the optimal ellipse in Fig. 2. Table 1 gives the final best objective function value and the number of iterations needed to reach it (IMAX = 400). For a fair comparison, each iteration consists of 40 objective function evaluations in all four algorithms in our experiment. From the experimental results, we can see that ETS has better convergence speed than the other three algorithms. At the same time, the final best objective function values indicate that ETS and GA yield better results than SA and TS. Furthermore, the ETS algorithm can obviously be implemented in parallel.
6 Conclusion
In future work, we plan to apply the Evolutionary Tabu Search to other problems in computer vision.
References
[1] M. D. Levine, Vision in Man and Machine, McGraw-Hill, New York, 1985.

[2] G. Stockman and A. Agrawala, Equivalence of the Hough transform to template matching, Comm. ACM 20 (1977), 820-822.

[3] T. Risse, Hough transform for line recognition: complexity of evidence accumulation and cluster detection, Comput. Vision Graphics Image Process. 46, No. 3 (1989), 327-345.

[4] G. Roth and M. D. Levine, Extracting geometric primitives, CVGIP: Image Understanding 58 (1993), 1-22.

[5] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, Optimization by simulated annealing, Science 220 (1983), 671-680.

[6] E. Aarts and J. Korst, Simulated Annealing and Boltzmann Machines, Wiley, New York, 1989.

[7] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, 1989.

[8] Z. Michalewicz, Genetic Algorithms + Data Structures = Evolution Programs, Springer-Verlag, 1992.

[9] G. Roth and M. D. Levine, Geometric primitive extraction using a genetic algorithm, IEEE Trans. on Pattern Analysis and Machine Intelligence 16, No. 9 (1994), 901-905.

[10] E. Lutton and P. Martinez, A genetic algorithm for the detection of 2D geometric primitives in images, Proc. ICPR'94, Vol. 1, 526-528.

[11] T. W. Sederberg and D. C. Anderson, Implicit representation of parametric curves and surfaces, Computer Vision, Graphics, and Image Processing 28 (1984), 72-84.

[12] J. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, 1975.

[13] F. Glover, Tabu search, in Modern Heuristic Techniques for Combinatorial Problems, C. R. Reeves, ed., John Wiley & Sons, 1993.

[14] K. S. Al-Sultan, A tabu search approach to the clustering problem, Pattern Recognition 28, No. 9 (1995), 1443-1451.

[15] D. Cvijovic and J. Klinowski, Taboo search: an approach to the multiple minima problem, Science 267 (1995), 664-666.

[16] P. Yip and Y. H. Pao, Combinatorial optimization with use of evolutionary simulated annealing, IEEE Trans. on Neural Networks 6, No. 2 (1995), 290-295.

[18] R. Battiti and G. Tecchiolli, The reactive tabu search, ORSA Journal on Computing 6 (1994), 126-140.

[19] J. Xu, S. Chiu and F. Glover, Fine-tuning a tabu search algorithm with statistical tests, Technical Report, University of Colorado at Boulder, 1996.

[20] S. D. Ma and B. Li, Multiscale derivative computation, to appear in Image and Vision Computing, 1996.