
4. REVIEW OF TOPOLOGY OPTIMIZATION

4.2. Heuristic Optimization Methods

Optimization techniques can be broadly classified into two categories: continuous and heuristic optimization algorithms. Continuous optimization methods include zeroth-order, first-order, and second-order approaches for finding optimum solutions. Algorithms in this category include Powell's method of conjugate directions, steepest descent, Newton's method, the Broyden-Fletcher-Goldfarb-Shanno (BFGS) approach, and the Augmented Lagrangian Method (ALM). These algorithms all follow a common progression: the algorithm determines the direction in which the objective function decreases most rapidly (or uses a prescribed direction in the case of zeroth-order methods), and the design variables are moved in that direction until a relative minimum is found. The process repeats until no search direction yields further improvement, indicating that an optimal point has been found. Region elimination techniques, including interval halving and the Golden Section method, are frequently employed alongside other continuous algorithms to reduce their computation times. These techniques sequentially narrow the range of values considered for the design variables by dividing the range into intervals and evaluating the objective function to judge which interval contains the local minimum. All of these continuous algorithms are also capable of handling problems with multiple constraints.
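
As an illustration of the region elimination idea, the following is a minimal sketch of the Golden Section method for a single design variable; the bracket, tolerance, and quadratic test objective are illustrative assumptions rather than values from the text. The same routine is often used as the line search inside the descent loop described above.

    # Minimal sketch of Golden Section search (region elimination) in one variable.
    import math

    def golden_section(f, a, b, tol=1e-6):
        """Shrink the bracket [a, b] until it is narrower than tol and
        return the midpoint as the estimated minimizer of f."""
        inv_phi = (math.sqrt(5) - 1) / 2          # ~0.618, the golden ratio factor
        x1 = b - inv_phi * (b - a)                # interior test points
        x2 = a + inv_phi * (b - a)
        f1, f2 = f(x1), f(x2)
        while (b - a) > tol:
            if f1 < f2:                           # minimum lies in [a, x2]
                b, x2, f2 = x2, x1, f1
                x1 = b - inv_phi * (b - a)
                f1 = f(x1)
            else:                                 # minimum lies in [x1, b]
                a, x1, f1 = x1, x2, f2
                x2 = a + inv_phi * (b - a)
                f2 = f(x2)
        return (a + b) / 2

    # Example: minimize a simple one-dimensional objective on [0, 5].
    x_star = golden_section(lambda x: (x - 2.0) ** 2 + 1.0, 0.0, 5.0)
    print(x_star)  # approximately 2.0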

The other category of optimization techniques, heuristic algorithms, is better suited to problems in which the nature of the objective function is not well known, frequently referred to as "black-box" problems. Objective functions in these problems are often not continuous, making it difficult to distinguish clear patterns between different design variable combinations. One drawback of heuristic algorithms is that the user will most likely not have any feedback to determine whether the optimum reached is the true optimum. This requires the parameters of the algorithm to be "tuned", particularly with respect to the chosen convergence criterion, until the best possible optimum is reached. It is also common for different trials of the same heuristic algorithm to produce vastly different optima, even when using the same starting design population. The typical convergence criterion is met when the improvement in the objective function fails to exceed a certain value over a prescribed number of optimization iterations. For certain problems, the optimization may be halted once all possible solutions have been tested, though this is rare (52). Some commonly used heuristic optimization algorithms include Genetic Algorithms, Particle Swarm Optimization, and Simulated Annealing.
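
The stall-based stopping rule described above can be expressed compactly. The sketch below assumes minimization; the window length and tolerance are illustrative values, not taken from the text.

    # Minimal sketch of a stall-based convergence check for a heuristic algorithm.
    def has_converged(history, window=20, tol=1e-8):
        """Stop when the best objective value has improved by less than tol
        over the last `window` iterations. `history` holds the best objective
        value recorded at each iteration."""
        if len(history) <= window:
            return False
        improvement = history[-window - 1] - history[-1]
        return improvement < tol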

Simulated Annealing is an optimization algorithm that mimics the crystallization process in the heating and annealing of metals. The analogy comes from the idea that, in annealing, the metal particles move essentially at random while the material is in a high-energy state, but, as the temperature is lowered, the particles begin to seek out positions that minimize the overall potential energy of the material. Similarly, Simulated Annealing allows design points to move to positions that do not improve the objective function. The user prescribes a "cooling schedule", which determines the rate at which the probability of keeping an inferior design point decreases. Occasionally, users implement a simpler acceptance rule for Simulated Annealing known as Threshold Accepting, in which all random movements are accepted, but a limit is set on how much worse a design can get within a single iteration (52).
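
The following is a minimal sketch of Simulated Annealing with a geometric cooling schedule and the classical probabilistic acceptance rule; the step size, initial temperature, cooling factor, and two-variable test objective are illustrative assumptions.

    # Minimal sketch of Simulated Annealing with a geometric cooling schedule.
    import math
    import random

    def simulated_annealing(f, x0, t0=1.0, cooling=0.95, steps_per_temp=50,
                            t_min=1e-4, step=0.5):
        x, fx = list(x0), f(x0)
        best, f_best = list(x), fx
        t = t0
        while t > t_min:
            for _ in range(steps_per_temp):
                # Propose a random move in the design space.
                cand = [xi + random.uniform(-step, step) for xi in x]
                f_cand = f(cand)
                delta = f_cand - fx
                # Accept improvements outright; accept inferior points with a
                # probability that shrinks as the temperature drops.
                if delta < 0 or random.random() < math.exp(-delta / t):
                    x, fx = cand, f_cand
                    if fx < f_best:
                        best, f_best = list(x), fx
            t *= cooling   # the "cooling schedule"
        return best, f_best

    # Example: minimize a simple two-variable objective.
    sol, val = simulated_annealing(lambda v: (v[0] - 1) ** 2 + (v[1] + 2) ** 2,
                                   [5.0, 5.0])
    print(sol, val)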

Genetic Algorithms are heuristic search algorithms that adapt the concepts of evolution and natural selection to finding optimal solutions. Genetic Algorithms utilize a randomly generated initial population of designs and then select the fittest designs to serve as "parents" in the creation of a new "child" population. The goal of this process is to guide the random search by carrying forward the most beneficial elements of the previous population iteration. Genetic Algorithms can employ a variety of approaches for selecting the fittest parent designs for breeding; for crossover, the process of creating child designs from parent designs; and for mutation, through which typically random changes are made to children outside of crossover in order to increase the variability of the population between iterations. Genetic Algorithms are extremely flexible because of their ability to be customized by the user and are widely used in many different fields of engineering for finding optimum solutions (53) (54).
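
The following is a minimal sketch of a Genetic Algorithm using tournament selection, single-point crossover, and bit-flip mutation on a simple bit-string encoding; the population size, rates, and the "one-max" test fitness are illustrative assumptions, and many other selection, crossover, and mutation schemes are possible.

    # Minimal sketch of a Genetic Algorithm with tournament selection,
    # single-point crossover, and bit-flip mutation.
    import random

    def genetic_algorithm(fitness, n_bits=16, pop_size=40, generations=100,
                          crossover_rate=0.9, mutation_rate=0.02):
        # Randomly generated initial population of bit-string designs.
        pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

        def tournament(pop, k=3):
            # Pick the fittest of k randomly chosen designs as a parent.
            return max(random.sample(pop, k), key=fitness)

        best = max(pop, key=fitness)
        for _ in range(generations):
            children = []
            while len(children) < pop_size:
                p1, p2 = tournament(pop), tournament(pop)
                c1, c2 = p1[:], p2[:]
                if random.random() < crossover_rate:
                    cut = random.randint(1, n_bits - 1)      # single-point crossover
                    c1 = p1[:cut] + p2[cut:]
                    c2 = p2[:cut] + p1[cut:]
                for child in (c1, c2):
                    for i in range(n_bits):                  # random mutation
                        if random.random() < mutation_rate:
                            child[i] = 1 - child[i]
                children.extend([c1, c2])
            pop = children[:pop_size]
            best = max(pop + [best], key=fitness)
        return best

    # Example: maximize the number of ones in the bit string ("one-max" problem).
    print(genetic_algorithm(lambda bits: sum(bits)))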

Particle Swarm Optimization is a population-based optimization algorithm that draws its inspiration from the flocking behavior of birds and the schooling of fish. It was first introduced by Kennedy and Eberhart in 1995 (54) (55). The algorithm keeps track of a population of design solutions, referred to as particles, and how they move relative to each other in the design space. Each particle keeps a record of the design variable coordinates of the best solution it has encountered in the design space, as well as the coordinates of the best solution that the entire population of particles has encountered. These two coordinates are used to generate a velocity for each particle individually. An optimum solution is found when all particles converge on a single coordinate point in the design space.
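
The following is a minimal sketch of Particle Swarm Optimization in which each particle's velocity is built from its own best-known position and the swarm's best-known position; the inertia weight, the cognitive and social coefficients, and the test objective are illustrative assumptions.

    # Minimal sketch of Particle Swarm Optimization (minimization).
    import random

    def pso(f, bounds, n_particles=30, iterations=200, w=0.7, c1=1.5, c2=1.5):
        dim = len(bounds)
        # Initialize particle positions and velocities within the given bounds.
        x = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
        v = [[0.0] * dim for _ in range(n_particles)]
        p_best = [xi[:] for xi in x]                 # each particle's best position
        p_best_val = [f(xi) for xi in x]
        g_idx = min(range(n_particles), key=lambda i: p_best_val[i])
        g_best, g_best_val = p_best[g_idx][:], p_best_val[g_idx]  # swarm's best

        for _ in range(iterations):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    # Velocity update: pull toward the particle's own best point
                    # and toward the best point found by the whole swarm.
                    v[i][d] = (w * v[i][d]
                               + c1 * r1 * (p_best[i][d] - x[i][d])
                               + c2 * r2 * (g_best[d] - x[i][d]))
                    x[i][d] += v[i][d]
                val = f(x[i])
                if val < p_best_val[i]:
                    p_best[i], p_best_val[i] = x[i][:], val
                    if val < g_best_val:
                        g_best, g_best_val = x[i][:], val
        return g_best, g_best_val

    # Example: minimize a simple two-variable objective.
    print(pso(lambda p: p[0] ** 2 + p[1] ** 2, [(-5, 5), (-5, 5)]))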
