
Evolutionary Tabu Search for Geometric Primitive Extraction

Jinxiang Chai
National Laboratory of Pattern Recognition, Institute of Automation, Beijing 100080, P.R. China
Email: chaij@prlsun3.ia.ac.cn

Tianzi Jiang, Member, IEEE
School of Mathematics, The University of New South Wales, Sydney 2052, Australia
and National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing 100080, P.R. China
Email: jiangtz@maths.unsw.EDU.AU and jiangtz@prlsun6.ia.ac.cn

Song De Ma
National Laboratory of Pattern Recognition, Institute of Automation, Beijing 100080, P.R. China
Email: masd@prlsun2.ia.ac.cn

May 6, 1998

Abstract

Many problems in computer vision can be formulated as optimization problems, so developing efficient global optimization techniques adapted to vision problems is increasingly important. In this paper, we present a geometric primitive extraction method, which plays a crucial role in content-based image retrieval and other vision problems. We formulate the problem as a cost function minimization problem and present a new optimization technique called Evolutionary Tabu Search (ETS), which combines the Genetic Algorithm and Tabu Search. Specifically, we incorporate the "survival of the strongest" idea of evolutionary algorithms into tabu search. In experiments, we use our method for shape extraction in images and compare it with three other global optimization methods: the Genetic Algorithm, Simulated Annealing and Tabu Search. The results show that the new algorithm is a practical and effective global optimization method, which yields good near-optimal solutions and has better convergence speed.

Keywords: Geometric primitive, Global optimization, Evolutionary Tabu Search, Tabu Search, Genetic Algorithm, Simulated Annealing.

This work was partially supported by the Chinese National Science Foundation and the Chinese National High Technology Program (863).


1 Introduction

Extracting predefined geometric primitives from geometric data is an important problem in the field of model-based vision because it is a prerequisite to solving other problems in model-based vision, such as pose determination, model building, and object recognition. The most commonly used method of primitive extraction is the Hough transform (HT). The HT divides the parameter space of the geometric primitive into cells (usually rectangular) by quantizing each dimension into a fixed number of intervals. Each datum point adds a vote to every cell whose parameters are such that the primitive associated with that cell passes through the point. After all the points have voted, the cells whose number of votes exceeds a threshold are marked. For each such cell the associated geometric primitive is taken as a description of the points that voted for the cell, and this primitive is said to be extracted from the data[1]. The HT has been shown to be equivalent to template matching, where the templates are defined by each of the cells in parameter space[2]. Performing template matching in this way is time efficient, but space inefficient. The space requirements are proportional to the number of cells, and this number is an exponential function of the dimension of the parameter space. This means that unless the quantization of the parameter space is coarse, the HT can be practically used only for primitives with at most two degrees of freedom. In fact, the majority of applications of the HT are for line extraction[3].

Recently, Roth et al.[4] proved that extracting the best geometric primitive from a given set of geometric data is equivalent to finding the optimum value of a cost function. Once it is understood that primitive extraction is such an optimization problem, the use of any technique for tackling optimization problems suggests itself. The objective function of the global optimization problem for geometric primitive extraction has potentially many local minima. Conventional local search minimization techniques are time consuming and tend to converge to whichever local minimum they first encounter; they are unable to continue the search after a local minimum is reached. The key requirement for any global optimization method is that it must be able to avoid entrapment in local minima and continue the search to give a near-optimal final solution whatever the initial conditions. It is well known that Simulated Annealing[5;6] (SA) and the Genetic Algorithm[7;8] (GA) meet this requirement, and some researchers have suggested solving this problem using Genetic Algorithms[9;10].

In this paper we develop an efficient algorithm based on the combination of tabu search and evolution theory. This new shape detection method has the ability to find the global optimum; it not only keeps the advantages of Tabu Search and Genetic Algorithms, but also overcomes some of their shortcomings. Specifically, by comparing our algorithm with other existing global optimization methods (the Genetic Algorithm, Simulated Annealing and Tabu Search), we find that ETS is more practical and effective, yields good near-optimal solutions, and has better convergence speed. The rest of this paper is organized as follows. Section 2 is devoted to the background of our problem. Section 3 states the idea of the Evolutionary Tabu Search. Section 4 describes our new algorithm for geometric primitive extraction. Section 5 is devoted to the experimental comparison results. We give our conclusions in Section 6.

2 Primitive Extraction and Minimal Subsets

In this section, we briefly review facts about geometric primitive extraction and the definition of minimal subsets. We refer the reader to Roth et al.[4] for the details.

The input consists of N data points in two- or three-dimensional Cartesian space, labeled p_1, p_2, …, p_N, along with the equation defining the type of geometric primitive to be extracted. We assume that this defining equation is an implicit form f(p, a) = 0, where p is a datum point and a is the parameter vector of this particular primitive. This assumption is not restrictive, because it has been shown that the parametric curves and surfaces used to define the parts in CAD databases can be converted to implicit form[11]. The output consists of the parameter vector a of the best primitive, along with the subset of the geometric data that belongs to this primitive. We define the residual r_i as the closest distance of the i-th point of the geometric data to the curve or surface.

Given that the residuals r_1, r_2, …, r_N have been calculated, extracting a single instance of a geometric primitive is equivalent to finding the parameter vector a which minimizes the value of a cost function h(r_1, r_2, …, r_N). Denote f(a) = h(r_1, r_2, …, r_N). Therefore, extracting the best geometric primitive from a given set of geometric data is equivalent to solving the following global optimization problem:

    min_{a ∈ A} f(a),    (1)

where A is the set of feasible solutions. A minimal subset is the smallest number of points necessary to define a unique instance of a geometric primitive. It is possible to convert from a minimal subset of points to the parameter vector a for a wide variety of geometric primitives[4]. Since only values of the parameter vector defined by these minimal subsets are potential solutions to the extraction problem, we can search for the best solution in a much smaller space.
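To make the minimal-subset idea concrete, the following sketch (our illustration, not from the paper; all function names are our own) handles the circle case: three non-collinear points form a minimal subset, which is converted to the parameter vector a = (x_c, y_c, R), and the residuals r_i are the point-to-circle distances.

```python
import math

def circle_from_minimal_subset(p1, p2, p3):
    """Convert a minimal subset of three points to circle parameters (xc, yc, R)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Circumcircle determinant; d = 0 means the points are collinear.
    d = 2.0 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    if abs(d) < 1e-12:
        raise ValueError("collinear minimal subset defines no unique circle")
    ux = ((x1**2 + y1**2) * (y2 - y3) + (x2**2 + y2**2) * (y3 - y1)
          + (x3**2 + y3**2) * (y1 - y2)) / d
    uy = ((x1**2 + y1**2) * (x3 - x2) + (x2**2 + y2**2) * (x1 - x3)
          + (x3**2 + y3**2) * (x2 - x1)) / d
    r = math.hypot(x1 - ux, y1 - uy)
    return ux, uy, r

def residuals(points, xc, yc, r):
    """r_i: closest distance of each data point to the circle."""
    return [abs(math.hypot(x - xc, y - yc) - r) for x, y in points]
```

Searching over minimal subsets rather than over a quantized parameter space is what lets the methods below avoid the HT's exponential storage cost.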

3 Evolutionary Tabu Search

In this section, we present the idea of the Evolutionary Tabu Search. For the sake of completeness, we first briefly review two other global optimization algorithms, the Genetic Algorithm and Simulated Annealing. All of them can be viewed as methods for finding a global optimum in the presence of local optima.

3.1 Simulated Annealing

Simulated Annealing is a stochastic optimization algorithm based on the physical analogy of annealing a system of molecules to its ground state. Originally developed by Kirkpatrick et al.[5;22], the method has been successfully applied to a variety of hard optimization problems in different fields. Starting from an initial configuration X_0, which can be generated randomly, the simulated annealing procedure generates a stationary Markov chain of configurations X_k. An objective function f(X_k), which is to be minimized, is used to evaluate the configuration X_k. During the annealing process, for a given current configuration X_k, the algorithm generates a candidate configuration Y_k. The decision to select either X_k or Y_k as the next state X_{k+1} is not made deterministically via a straightforward comparison of the objective function values of X_k and Y_k, but stochastically, based on a sampling of the Boltzmann distribution via the Metropolis function. If Y_k has an objective function value smaller than that of X_k, then Y_k is selected as X_{k+1}. If Y_k has an objective function value larger than that of X_k, then Y_k is selected as X_{k+1} with probability p determined by the Metropolis function p = exp(−max(ΔF(Y_k, X_k), 0)/T_k), where ΔF(Y_k, X_k) = F(Y_k) − F(X_k).

The sequence T_k is called an annealing schedule; it is a strictly monotonically decreasing sequence of nonnegative numbers such that T_1 > T_2 > … > T_n and lim_{k→∞} T_k = 0. Asymptotic convergence (i.e., in the limit k → ∞) is guaranteed for a logarithmic annealing schedule of the form T_k = T_1/(1 + ln k), where k ≥ 1.
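The Metropolis acceptance rule described above can be sketched as follows (our illustrative fragment for a minimization problem; the function name is our own):

```python
import math
import random

def metropolis_accept(f_current, f_candidate, temperature, rng=random.random):
    """Metropolis rule: always accept an improvement; accept a worse
    candidate with probability exp(-dF / T), dF = F(Y_k) - F(X_k)."""
    delta = f_candidate - f_current
    if delta <= 0:            # candidate is at least as good: accept
        return True
    return rng() < math.exp(-delta / temperature)
```

At high temperature almost any move is accepted; as T_k falls, uphill moves become exponentially rare, which is what drives the chain toward low-cost configurations.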

3.2 Genetic Algorithm

Genetic Algorithms (GA), first developed by Holland[12], are stochastic optimization techniques that mimic the principles of natural evolution. According to genetic evolution theory, stronger and fitter individuals have better chances of survival than weaker ones. Offspring are produced by parents by combining genetic information from both of them, i.e., strings of chromosomes are inherited from both parents. Possible solutions of the optimization problem resemble chromosomes in the natural process, and producing a new solution resembles producing offspring with certain chromosomes. The objective function of the optimization problem is evaluated in accordance with the "survival of the strongest" principle of the natural evolution process. The chromosomes are manipulated by genetic operators that simulate the effects of natural evolution. The four basic operators are reproduction, crossover, mutation and inversion; the first three resemble natural operators, while the last is included merely to improve the simulation results. These genetic operators are discussed in detail in standard references on genetic algorithms[7;8].

3.3 Tabu Search

Here we briefly review some notions of Tabu Search[13;14;15] and outline the basic steps of the Tabu Search procedure for solving optimization problems.

Tabu Search is a metaheuristic that guides a local heuristic search procedure to explore the solution space beyond local optima. It differs from the well-known hill-climbing local search techniques because Tabu Search allows moves out of the current solution that make the objective function worse, in the hope that the search will eventually reach a better solution. It also differs from Simulated Annealing and the Genetic Algorithm because Tabu Search includes a memory mechanism. According to Glover's idea, in order to solve an optimization problem using Tabu Search, the following components must be defined.

Configuration: A configuration is a solution, i.e., an assignment of values to variables.

Move: A move characterizes the process of generating a feasible solution that is related to the current one (i.e., a move is a procedure by which a new solution is generated from the current one).

Neighborhood: The neighborhood of a solution is the set of all possible moves out of the current configuration. Note that the actual definition of the neighborhood depends on the particular implementation and the nature of the problem.

Tabu conditions: In order to avoid a blind search, the tabu search technique uses a prescribed problem-specific set of constraints, known as tabu conditions. They are conditions imposed on moves which make some of them forbidden; these forbidden moves are known as tabu. This is implemented by maintaining a list of a certain size that records the forbidden moves, known as the tabu list.


With the above basic components, the Tabu Search algorithm can be described as follows.

(i) Start with a certain (current) configuration and evaluate the criterion function for that configuration.

(ii) Form a neighborhood of the current configuration, that is, a set of candidate moves. If the best of these moves is not tabu, or if it is tabu but satisfies the aspiration criterion, then pick that move and consider it to be the new current configuration; otherwise pick the best move that is not tabu and consider it to be the new current configuration.

(iii) Repeat (i) and (ii) until some termination criteria are satisfied.

The best solution found in the final loop is the solution obtained by the algorithm. Note that a move picked at a certain iteration is put in the tabu list so that it cannot be reversed in the following iterations. The tabu list has a certain size; when the length of the tabu list reaches that size and a new move enters the list, the first move on the tabu list is freed from being tabu and the process continues (i.e., the tabu list is circular).
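Steps (i)-(iii) can be sketched in a few lines (a minimal sketch of ours, not the paper's implementation; it minimizes a toy integer function with unit moves, a circular tabu list of visited solutions, and the aspiration criterion "beats the best value so far"):

```python
from collections import deque

def tabu_search(f, start, neighbors, iterations=100, tabu_size=5):
    """Generic tabu search: greedy moves over a neighborhood, a circular
    tabu list, and an aspiration override for moves beating the best."""
    current, best = start, start
    f_best = f(best)
    tabu = deque([start], maxlen=tabu_size)  # oldest entry freed automatically
    for _ in range(iterations):
        candidates = sorted(neighbors(current), key=f)  # best move first
        for cand in candidates:
            if cand not in tabu or f(cand) < f_best:    # not tabu, or aspirated
                current = cand
                break
        else:
            break  # every candidate tabu and none aspirated: stop
        tabu.append(current)
        if f(current) < f_best:
            best, f_best = current, f(current)
    return best, f_best
```

Because worsening moves are allowed whenever they are the best admissible choice, the search can leave a local optimum, while the tabu list prevents it from immediately cycling back.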

3.4 Evolutionary Tabu Search

The Evolutionary Tabu Search combines the advantages of the GA and Tabu Search, and it has two levels of selection, called the first-level selection and the second-level selection. We propose using tabu search at the first level, and evolutionary ideas[8;16] at the second level.

The ETS technique starts with a guess of N likely candidates, chosen at random in the search space. These candidates are the so-called parents. Initially each parent generates a number of children, say NTS, which together constitute a family. The children in the same family (i.e., generated from the same parent) take part in the first-level selection: we use tabu search to select one child as the parent for the next generation. This selection creates the parents of the next generation.

The second-level selection is the competition between the families. The number of children that each family generates in the next generation depends on the result of the second-level selection, which provides a measure of the fitness of each family. Instead of using the objective value of a single point, which might be considerably biased, we define a fitness value based on the objective values of all the children in the same family. The number of children allocated to each family for the next generation is proportional to its fitness value, but the total number of children in the next generation stays constant. In this way, fitter individuals have better chances of survival. It has been shown that eventually only one family survives, which is usually the best one. The first-level and second-level selection procedures continue until a certain number of iterations has been reached or an acceptable solution has been found. It is the effect of the second-level competition that provides a measure of regional information: the fitness value indicates how good a region is. If a region is found to have a higher fitness value, we allocate more search effort to that region.

4 New Algorithm for Geometric Primitive Extraction

Assume that we are given the geometric data and the particular primitive we want to extract. Let I = (I_1, …, I_m) denote a minimal subset of size m, where I_i corresponds to the index of its member point. For the sake of convenience, we assume that I_i < I_j for i < j, i, j = 1, …, m. For any minimal subset I, the objective function f(I) is defined as follows:

    f(I) = Σ_{j=1}^{N} s(r_j²),    (2)

where s is a step function of the squared residual: s = 0 if r_j is greater than or equal to the template width, and s = 1 otherwise, with the residuals r_j computed with respect to the primitive defined by I. This objective function counts the number of points within a fixed distance of the geometric primitive; it effectively matches a small template around this primitive to the geometric data. Therefore, our task is to find the maximum of the objective function.
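Equation (2) can be sketched directly (our illustrative fragment, with names of our own; the strict inequality at the template width follows the definition of s above):

```python
def step_fn(r, template_width):
    """s from equation (2): 1 if the residual is within the template width."""
    return 1 if r < template_width else 0

def objective(resids, template_width):
    """f(I): number of data points within a fixed distance of the primitive
    defined by the minimal subset I (resids are the r_j for that primitive)."""
    return sum(step_fn(r, template_width) for r in resids)
```

Note that f(I) is integer-valued and flat over large regions of the search space, which is precisely why a global method that tolerates plateaus and local optima is needed here.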

Let I_c, I_t and I_b denote the current, trial and best configurations (minimal subsets), and let f_c, f_t and f_b denote the corresponding current, trial and best objective function values, respectively. As described in the previous section, we operate on a configuration known as the current solution I_c, and through moves, which are explained in what follows, we generate trial solutions I_t. As the algorithm proceeds, we also save the best solution found so far, denoted by I_b. Corresponding to these configurations, we also maintain the objective function values f_c, f_t and f_b, respectively.

Before stating the two-level selection algorithm, we define the basic components for geometric primitive extraction.

Configuration: A configuration is denoted by I = (I_1, …, I_m), where I_i corresponds to the index of its member point.

Neighborhood: We first give a definition concerning points. Two points are called neighbor-points, or are within the Move Distance (MD), if their Euclidean distance is less than the given value of MD. For configurations defined by I = (I_1, …, I_m), one configuration moves to a neighbor configuration only if each point in it moves to one of its neighbor-points.

Tabu element: We assume that the number of parameters of a specific geometric primitive is m. For example, m is equal to three if the geometric primitive is a circle, whose parameters are the radius R and the coordinates of the center (X_c, Y_c). Therefore, we can use a vector A = (a_1, …, a_m) to denote a specific geometric primitive, where a_i (i = 1, …, m) is a parametric variable. A tabu element is then defined as a vector B = (b_1, …, b_m, ε), where ε is an error variable and b_i (i = 1, …, m) is the i-th parametric variable of a geometric primitive.

Tabu condition: Assume that the parametric vector of the current configuration is A_t = (a_1, …, a_m) and the tabu element is B = (b_1, …, b_m, ε). We define the tabu condition as follows: if ‖a_i − b_i‖ ≤ ε for all i = 1, …, m, then the current configuration is tabu.

Aspiration condition: If the current configuration satisfies the aspiration condition, then it is no longer tabu even if it satisfies the tabu condition. We define the aspiration condition as f_t > f_b, where f_t denotes the trial objective function value and f_b denotes the best objective function value, both according to equation (2).

The main steps of our ETS algorithm are as follows:

Step 1: Randomly select N_0 parents (i.e., initial families). For each family, set NTS = NTS_0 (initial number of children), ε = ε_0 (initial error value) and MD = MD_0 (initial move distance).

Step 2: For each family, use tabu search (details are given in Section 4.1) to find the parent of the next generation (the first-level selection).

Step 3: According to the second-level selection, find the number of children that will be generated by the parents of the next generation. Details are given in Section 4.2.

Step 4: According to a specific strategy, change the move distance (MD) and the error value (ε).

Step 5: Repeat Step 2 to Step 4 until an acceptable solution has been found or a certain number of iterations (IMAX) has been reached.
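The tabu and aspiration conditions defined above can be sketched as follows (our illustration; tabu elements are stored here as pairs of a parameter tuple and its error value ε, and the function names are our own):

```python
def is_tabu(a, tabu_list):
    """Tabu condition: a parameter vector a = (a_1..a_m) is tabu if some tabu
    element B = (b_1..b_m, eps) satisfies |a_i - b_i| <= eps for every i."""
    return any(all(abs(ai - bi) <= eps for ai, bi in zip(a, b))
               for (b, eps) in tabu_list)

def admissible(a, f_t, f_b, tabu_list):
    """A trial configuration is admissible if it is not tabu, or if it meets
    the aspiration condition f_t > f_b (objective values from equation (2))."""
    return not is_tabu(a, tabu_list) or f_t > f_b
```

Declaring a whole ε-ball of parameter space tabu, rather than a single visited point, is what keeps the search from re-fitting essentially the same primitive over and over.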

4.1 Tabu Search in the first-level selection

For the geometric primitive extraction problem, our first-level selection algorithm can be described as follows.

Step 1: Let I_c be the parent of one family, and f_c the corresponding objective function value computed using equation (2). For the first generation, let I_b = I_c and f_b = f_c.

Step 2 (Generating the neighborhood): Using I_c and MD, generate NTS children I_t^1, I_t^2, …, I_t^NTS (see Remark 1), evaluate their corresponding objective function values f_t^1, f_t^2, …, f_t^NTS, and go to Step 3.

Step 3: Arrange f_t^1, f_t^2, …, f_t^NTS in descending order, and denote them by f_t^[1], f_t^[2], …, f_t^[NTS]. If f_t^[1] is not tabu, or if it is tabu but f_t^[1] > f_b, then let I_c = I_t^[1] and f_c = f_t^[1], and go to Step 4. Otherwise, let I_c = I_t^[L] and f_c = f_t^[L], where f_t^[L] is the best objective function value among f_t^[2], …, f_t^[NTS] that is not tabu, and go to Step 4. If all of f_t^[1], f_t^[2], …, f_t^[NTS] are tabu, change MD according to the specific strategy and go to Step 2.

Step 4: Insert the new tabu element at the bottom of the tabu list and let TLL = TLL + 1 (if TLL = MTLS + 1, delete the first element in the tabu list and let TLL = TLL − 1). If f_b < f_c, let I_b = I_c and f_b = f_c.

Remark 1: Given a current solution I_c, one can generate a trial solution using several strategies. We use the following one. Given I_c and a probability threshold P, for i = 1, 2, …, m, draw a random number R_i ~ u(0, 1), where u(0, 1) is the uniform distribution on the interval [0, 1]. If R_i < P, then I_t(i) = I_c(i); otherwise draw a point l̂ at random from the set of all neighbor-points of I_c(i) and let I_t(i) = l̂.
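The trial-generation strategy of Remark 1 can be sketched as follows (our illustration; the neighbor-point sets, which in the paper are determined by MD, are assumed here to be precomputed, and the function name is our own):

```python
import random

def generate_trial(I_c, neighbor_points, P, rng=random):
    """Remark 1: for each component i draw R_i ~ u(0, 1); keep I_c(i) if
    R_i < P, otherwise replace it by a random neighbor-point of I_c(i)."""
    trial = []
    for idx in I_c:
        if rng.random() < P:
            trial.append(idx)                               # keep this member point
        else:
            trial.append(rng.choice(neighbor_points[idx]))  # move within MD
    return tuple(trial)
```

Usage: with P close to 1 most member points are kept and the trial stays close to the current minimal subset; with P close to 0 almost every member point is perturbed.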

4.2 The Second-Level Selection

The second-level selection algorithm is as follows:

Step 1: Repeat Step 2 for each family; then go to Step 3.

Step 2: Compute the fitness of the family according to the definition of the fitness function (see Remark 2).

Step 3: Sum the fitness values of all the families.

Step 4: Allocate to each family the number of children it will generate in the next generation:

    NTS = N_0 × NTS_0 × F / S,    (3)

where NTS is the number of children that will be generated for that family, NTS_0 is the initial number of children generated for each family, N_0 is the initial number of families, F is the fitness of that family, and S is the sum of the fitness values of all the families.

Remark 2: Increase the fitness value by 1 whenever the objective value of a child is larger than that of its parent, or the following condition is satisfied:

    e^{−(f_t − f_b)/T} > P_0,    (4)

where f_t is the objective value of the child, f_b is the best objective value up to the current generation, T is a constant coefficient, and P_0 is a random number uniformly distributed between 0 and 1.
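The allocation rule of equation (3) can be sketched as follows (our illustration; rounding to whole children is our assumption, since the paper does not state how fractional allocations are handled):

```python
def allocate_children(family_fitness, n0, nts0):
    """Equation (3): NTS = N_0 * NTS_0 * F / S for each family, so the
    total number of children per generation stays about N_0 * NTS_0."""
    s = float(sum(family_fitness))
    return [round(n0 * nts0 * f / s) for f in family_fitness]
```

A family whose fitness falls to zero receives no children, which is how, over the generations, the weaker families die out and usually only the best family survives.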

4.3 Simulated Annealing and Genetic Algorithm approaches to geometric primitive extraction

Now, for the sake of completeness, we briefly describe the SA and GA approaches to geometric primitive extraction.

4.3.1 Simulated Annealing-based approach to geometric primitive extraction

The main steps of our Simulated Annealing algorithm are as follows:

Step 1 (Initialization): Let I_c be an arbitrary solution, and f_c the corresponding objective function value computed using equation (2). Let I_b = I_c and f_b = f_c. Select values for the following parameters: T_i (initial temperature), TM (temperature multiplier), T_f (final temperature), P (probability threshold), NI (number of iterations after which the temperature is reduced if f_b has not improved) and ITmax (maximum number of iterations allowed). Let T = T_i, L = 0, k = 1.

Step 2: Obtain a neighbor I_t of configuration I_c, and compute f_t using equation (2).

Step 3: If f_t < f_c, go to Step 4. Otherwise, let I_c = I_t and f_c = f_t (accept the trial solution as the new current one). If f_t < f_b, let L = L + 1 and go to Step 5; otherwise, let I_b = I_t, f_b = f_t, L = 0, and go to Step 5.

Step 4: Draw a random number p ~ u(0, 1). If p > exp((f_t − f_c)/T), go to Step 5 (reject the trial solution); otherwise, let I_c = I_t and f_c = f_t (accept the trial solution as the new current solution although it has a lower objective function value).

Step 5: If L = NI and T > T_f, let T = T × TM (see Remark 3). If k < ITmax, let k = k + 1 and go to Step 2; otherwise stop.

Remark 3: The logarithmic annealing schedule makes simulated annealing convergent. However, in practice, the logarithmic schedule is far too slow, and hence we have used a geometric annealing schedule of the form T_k = T_1(1 − α)^k, where α is a positive real number close to zero. We have empirically found the values α = 0.03 and T_1 ∈ [2000, 4000] to be well suited for our purpose.
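The two schedules can be compared directly (our illustrative fragment; function names are our own):

```python
import math

def geometric_schedule(t1, alpha, k):
    """Practical schedule from Remark 3: T_k = T_1 * (1 - alpha)^k."""
    return t1 * (1.0 - alpha) ** k

def logarithmic_schedule(t1, k):
    """Schedule with a convergence guarantee, T_k = T_1 / (1 + ln k),
    but far too slow in practice (defined for k >= 1)."""
    return t1 / (1.0 + math.log(k))
```

After a few hundred iterations the geometric schedule is already near zero, while the logarithmic one is still a sizeable fraction of T_1; that is the speed/guarantee tradeoff the remark describes.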

4.3.2 Genetic Algorithm approach to geometric primitive extraction

Recently, G. Roth et al.[9] and E. Lutton et al.[10] have used genetic algorithms to extract geometric primitives. A basic Genetic Algorithm can be described as follows[7;8]:

Step 1: An initial population of chromosomes (i.e., configurations) is created randomly, and each individual is evaluated by the objective function.

Step 2: Two mates are selected for reproduction with probabilities proportional to their fitness values, using roulette wheel selection.

Step 3: The crossover, mutation and inversion operators are applied to the selected mates, and offspring are generated.

Step 4: Each individual offspring is evaluated by the objective function.

Step 5: Steps 2-4 are repeated until an entirely new population of chromosomes is generated.

Step 6: The previous population is replaced by the new population.

Step 7: If the stopping criterion is not reached, go to Step 2; otherwise return the best chromosome (i.e., configuration) in the new population as the solution to the problem.
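The roulette wheel selection of Step 2 can be sketched as follows (our illustration; the function name is our own):

```python
import random

def roulette_select(population, fitness, rng=random.random):
    """Spin a roulette wheel whose slot sizes are proportional to fitness
    and return the individual whose slot the spin lands in."""
    total = sum(fitness)
    spin = rng() * total
    acc = 0.0
    for individual, fit in zip(population, fitness):
        acc += fit
        if spin <= acc:
            return individual
    return population[-1]  # guard against floating-point round-off
```

Calling it twice yields the two mates of Step 2; fitter individuals are chosen more often, but weaker ones still reproduce occasionally, which maintains diversity in the population.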

In addition to the basic genetic algorithm operators, we have incorporated an elitism strategy and intelligent mutation in our genetic algorithm-based geometric primitive extraction.

5 Experimental Results

In this section, we present our experimental results on extracting various kinds of geometric primitives. We also compare our method with three other global optimization methods: the Genetic Algorithm (GA), Simulated Annealing (SA) and Tabu Search (TS).

As mentioned in the previous sections, ETS has four important parameters: MTLS, N_0, NTS_0 and P. The tabu list gives the algorithm short-term memory. A large tabu list size allows more diversification, while a small list size makes the algorithm more forgetful, i.e., allows intensification to happen. Determining the list size is a non-trivial problem. Glover[17] suggests using a tabu list size in the range [n/3, 3n], where n is the size of the problem. Recently, some researchers[18;19] have proposed a variable tabu list size (tabu tenure). The next important parameter is the initial number of families (N_0). If it is too small, the algorithm resembles Tabu Search; in particular, when N_0 equals 1, ETS reduces to Tabu Search. If it is too large, the convergence speed deteriorates. Thus selecting this value is a tradeoff between the final extraction result and the convergence speed. The third parameter is the initial number of children (NTS_0), which also affects the convergence speed; in our experiments, NTS_0 ∈ [2, 8]. The last parameter is P, the probability threshold used in generating trial solutions. The larger the value of P, the more member points of the current solution are kept, and consequently the closer the neighbor (the solution obtained after a move) is to the current solution, and vice versa. In our case, we have found P ∈ [0.3, 0.5] to be well suited to our purpose.

[Figure 1: Extracting multiple ellipses: (a) real image; (b) edge pixels; (c) extracted ellipses; (d) extracted ellipses superimposed on the edge pixels.]

To extract primitives from real images, the first step is to obtain the geometric data, which can be acquired by active sensors or by edge detection. In our experiments, the geometric data are produced by a zero-crossing edge detector[20].

To extract multiple primitives, we simply repeat the algorithm presented in the previous section. After the first application of the algorithm, the data points belonging to the first geometric primitive are removed, and the next application of the algorithm takes the remaining geometric data points as input. A more complex situation is the extraction not only of multiple primitives, but also of different types of primitives. In this case, we run the algorithm for circles and ellipses simultaneously on the same geometric data; the best primitive is extracted and the algorithm is repeated.

Extracting circles and ellipses using the HT is still an active area of research. When the number of parameters of these primitives is higher than two (3 for a circle and 5 for an ellipse), the HT is space inefficient and therefore has difficulty extracting circles, and especially ellipses. Our approach has no such difficulty, as can be seen from Fig. 2 and Fig. 3.

Fig. 2 shows the extraction of four ellipses (here a circle is considered a special kind of ellipse). This example is significant in that: (1) the background is complex and the ellipses are occluded; (2) the small ellipse in the lower-right corner is successfully extracted, which is very difficult for the HT. The last step of the HT is peak detection, and it is difficult to detect a very small real peak. The situation becomes even more serious when the original images are contaminated by noise.

[Figure 2: extraction of four ellipses, panels (a)-(d).]

[Figure 3: Cost function value of the best primitive in Figure 2 versus number of iterations.]

The fact that the primitives lie close together makes this a particularly difficult example for the HT method[21] (because of the coarse quantization necessary for the HT to deal with higher-dimensional spaces), while our global optimization algorithms fulfill the extraction task quickly and correctly, as can be seen from this figure. Note: in Figure 3, the solid line is ETS, the dashed line is Tabu Search, the dotted line is the Genetic Algorithm, and the dash-dot line is Simulated Annealing.

Now we compare ETS with the three other global optimization algorithms: GA, SA and TS. Fig. 3 gives the comparison results of the four algorithms in the extraction of the optimal ellipse in Fig. 2. Table 1 gives the final best objective function value and the number of iterations needed to reach it (IMAX = 400). For comparison, each iteration consists of 40 objective function evaluations in all four algorithms in our experiments. From the experimental results, we can see that ETS has better convergence speed than the other three algorithms. At the same time, the final best objective function values indicate that ETS and GA yield better results than SA and TS. Furthermore, it is obvious that the ETS algorithm can be implemented in parallel.

6 Conclusion

We have presented the Evolutionary Tabu Search and applied it to geometric primitive extraction. In future work, we plan to apply the Evolutionary Tabu Search to other problems in computer vision.

References

[1] M. D. Levine, Vision in Man and Machine, McGraw-Hill, New York, 1985.

[2] G. Stockman and A. Agrawala, Equivalence of Hough transform to template matching, Comm. ACM 20, 1977, 820-822.

[3] T. Risse, Hough transform for line recognition: complexity of evidence accumulation and cluster detection, Comput. Vision Graphics Image Process. 46, No. 3, 1989, 327-345.

[4] G. Roth and M. D. Levine, Extracting geometric primitives, CVGIP: Image Understanding 58 (1993), 1-22.

[5] S. Kirkpatrick, C. D. Gelatt Jr., and M. P. Vecchi, Optimization by simulated annealing, Science 220 (1983), 671-680.

[6] E. Aarts and J. Korst, Simulated Annealing and Boltzmann Machines, Wiley, New York, 1989.

[7] D. E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, 1989.

[8] Z. Michalewicz, Genetic Algorithms + Data Structures = Evolution Programs, Springer-Verlag, 1992.

[9] G. Roth and M. D. Levine, Geometric primitive extraction using a genetic algorithm, IEEE Trans. on Pattern Analysis and Machine Intelligence 16:9 (1994), 901-905.

[10] E. Lutton and P. Martinez, A genetic algorithm for the detection of 2D geometric primitives in images, Proc. ICPR'94, Vol. 1, 526-528.

[11] T. W. Sederberg and D. C. Anderson, Implicit representation of parametric curves and surfaces, Computer Vision, Graphics, and Image Processing 28 (1984), 72-84.

[12] J. Holland, Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, 1975.

[13] F. Glover, Tabu search, in Modern Heuristic Techniques for Combinatorial Problems, C. R. Reeves, ed., John Wiley & Sons, 1993.

[14] K. S. Al-Sultan, A tabu search approach to the clustering problem, Pattern Recognition 28:9 (1995), 1443-1451.

[15] D. Cvijovic and J. Klinowski, Taboo search: an approach to the multiple minima problem, Science 267 (1995), 664-666.

[16] P. Yip and Y. H. Pao, Combinatorial optimization with use of evolutionary simulated annealing, IEEE Trans. on Neural Networks 6:2 (1995), 290-295.

[18] R. Battiti and G. Tecchiolli, The reactive tabu search, ORSA Journal on Computing 6 (1994), 126-140.

[19] J. Xu, S. Chiu and F. Glover, Fine-tuning a tabu search algorithm with statistical tests, Technical Report, University of Colorado at Boulder, 1996.

[20] S. D. Ma and B. Li, Multiscale derivative computation, Image Vision Comput., 1996 (to appear).
