optimization-based framework

Top PDF results for "optimization-based framework":

Improving an optimization-based framework for sensitivity analysis in multi-criteria decision-making

The framework is largely based on mathematical programming and involves solving a potentially large number of mathematical programmes of various types, some of which are nonlinear and nonconvex. Consequently, the computational load may inhibit use of the framework, particularly when decision analyses are performed in the context of decision conferences, where it is desirable for sensitivity analyses to be conducted in near real time, preferably on a PC. In principle, variations in the objective data could also be handled within the framework, but our principal concern is with judgemental data, which is inherently more uncertain and whose examination aids problem understanding.
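To make the kind of judgemental-data variation concrete, here is a toy sketch (a hypothetical weighted-sum model, not the paper's mathematical-programming formulation): perturb one criterion weight, renormalize the rest, and watch for rank reversals.

# Toy sensitivity analysis for a weighted-sum MCDM model (hypothetical
# example, not the paper's formulation).
# Rows: alternatives; columns: criteria scores in [0, 1].
scores = [
    [0.8, 0.4, 0.6],   # alternative A
    [0.5, 0.9, 0.5],   # alternative B
    [0.6, 0.6, 0.7],   # alternative C
]
weights = [0.5, 0.3, 0.2]  # judgemental weights, sum to 1

def best(w):
    """Index of the alternative with the highest weighted score."""
    totals = [sum(wi * si for wi, si in zip(w, row)) for row in scores]
    return max(range(len(scores)), key=totals.__getitem__)

# Sweep the first weight, renormalizing the others, and report reversals.
base = best(weights)
for w0 in [i / 20 for i in range(21)]:
    scale = (1.0 - w0) / (weights[1] + weights[2])
    w = [w0, weights[1] * scale, weights[2] * scale]
    if best(w) != base:
        print(f"w0 = {w0:.2f}: preferred alternative changes to {best(w)}")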

An optimization based framework for modeling counterterrorism strategies

The following paragraphs present a somewhat generic model that chooses emergency exits among a number of possible locations (determined by an engineering feasibility study), and then routes people from their present location to an exit or any other safe area. In order to implement the results, it would be most beneficial if the system were automated in the following sense. Sensors count the number of people walking into rooms, so that a system knows at any point in time how many people are present in different locations of the building. Whenever there is a change (i.e., somebody walks from one room into another), the system recomputes the optimal evacuation routes in real time. The results of that optimization process are then displayed outside each room, where arrows, not unlike the emergency floor lighting in airplanes, direct individuals from their present location to a safe place.
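As a sketch of the routing core of such a system, the toy code below assumes the building is modeled as a graph of rooms and corridors with travel times; a multi-source shortest-path pass from all exits yields, for each room, the neighbor to walk toward. It ignores corridor capacities and congestion, which a full evacuation model would include.

import heapq

def next_hops(graph, exits):
    """Multi-source Dijkstra from all exits over an undirected graph
    {node: {neighbor: travel_time}}. Returns, for each node, the
    neighbor to walk toward to reach the nearest exit."""
    dist = {v: float("inf") for v in graph}
    hop = {}
    for e in exits:
        dist[e] = 0.0
    pq = [(0.0, e) for e in exits]
    heapq.heapify(pq)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue
        for v, t in graph[u].items():
            if d + t < dist[v]:
                dist[v] = d + t
                hop[v] = u          # from v, step toward u
                heapq.heappush(pq, (dist[v], v))
    return hop

building = {"room1": {"hall": 2}, "room2": {"hall": 1},
            "hall": {"room1": 2, "room2": 1, "exit": 3},
            "exit": {"hall": 3}}
print(next_hops(building, ["exit"]))   # hall -> exit, rooms -> hall

Whenever a sensor reports a change in occupancy, recomputing this table and refreshing the displayed arrows is exactly the real-time loop the paragraph describes.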

An Optimization-Based Framework for Automated Market-Making

We take an axiomatic approach. Given a relatively small space of securities with bounded payoff, we define a set of intuitive conditions that a reasonable market maker should satisfy. We prove that a market maker satisfying these conditions must price securities via a convex potential function (the cost function), and that the space of reachable security prices must be precisely the convex hull of the payoff vectors for each outcome (that is, the set of vectors, one per outcome, denoting the payoff for each security if that outcome occurs). We then incorporate ideas from online convex optimization [22, 31] to define a convex cost function in terms of an optimization over this convex hull; the vector of prices is chosen as the optimizer of this convex objective. With this framework, instead of dealing with the exponentially large or infinite outcome space, we only need to deal with the lower-dimensional convex hull. The problem of automated market making is reduced to the problem of convex optimization, for which we have many efficient techniques to leverage.
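A classical concrete instance of such a convex potential is the logarithmic market scoring rule (LMSR); the sketch below (a standard construction given for illustration, not code from the paper) prices n outcomes via C(q) = b log sum_i exp(q_i/b), whose gradient is a price vector in the probability simplex, i.e. the convex hull of the unit payoff vectors.

import math

def lmsr_cost(q, b=10.0):
    """LMSR cost function C(q) = b * log(sum_i exp(q_i / b)):
    a convex potential of the kind the paper characterizes."""
    m = max(qi / b for qi in q)                 # stabilize the log-sum-exp
    return b * (m + math.log(sum(math.exp(qi / b - m) for qi in q)))

def lmsr_prices(q, b=10.0):
    """Gradient of C: one price per outcome, summing to 1
    (a point in the convex hull of the payoff vectors)."""
    m = max(qi / b for qi in q)
    w = [math.exp(qi / b - m) for qi in q]
    z = sum(w)
    return [wi / z for wi in w]

q = [0.0, 0.0, 0.0]                 # outstanding shares per outcome
q_new = [5.0, 0.0, 0.0]             # after buying 5 shares of outcome 0
print(lmsr_cost(q_new) - lmsr_cost(q))   # amount the trader is charged
print(lmsr_prices(q_new))                # updated prices

A trader pays C(q + delta) - C(q) to move the outstanding-share vector by delta, so the cost function alone determines every price the market maker quotes.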

A Genetic Algorithm-based Framework for Soft Handoff Optimization in Wireless Networks

Modern wireless network design is challenging due to the dynamic environment in which users' services are provided, making parameter optimization a complex task. These dynamic, and often unknown, operating conditions increase the reliance of wireless networking standards on machine learning and artificial intelligence algorithms (Mehboob et al., 2014). Genetic algorithms (GAs) provide a well-established framework for implementing artificial intelligence tasks such as classification, learning, and optimization. Their versatility makes them remarkably useful in a wide range of application domains, including wireless networks. As a meta-heuristic computational method (Hillier & Lieberman, 2001), the GA has been successfully applied in the aircraft and telecommunications industries, chip design, computer animation, software creation, and financial markets (Mitchell, 1998). According to Ridley (2004), these algorithms are inspired by biological evolution, imitating and adapting robust procedures used by various biological organisms. In the early days of evolutionary computing, developments took place rather independently of each other (Fogel, 1998), and this led to the emergence of different subareas such as evolutionary programming, evolution strategies, genetic algorithms, and genetic programming (Eiben & Smith, 2003). The common underlying idea behind all these techniques is the same: given a population of individuals, environmental pressure invokes natural selection (survival of the fittest), raising the fitness of the population. This is a clear optimization process.
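A minimal sketch of that generic loop (tournament selection, one-point crossover, bit-flip mutation; all names and parameter values are illustrative, not from the paper):

import random

def genetic_algorithm(fitness, n_bits=16, pop_size=30, generations=50,
                      p_mut=0.02):
    """Minimal GA: tournament selection, one-point crossover, bit-flip
    mutation. Maximizes `fitness` over bit strings."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        def pick():                       # tournament of size 2
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = random.randrange(1, n_bits)
            child = p1[:cut] + p2[cut:]               # one-point crossover
            child = [1 - g if random.random() < p_mut else g
                     for g in child]                  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy "one-max" fitness (count of 1 bits), standing in for a handoff
# quality metric in the wireless setting.
print(genetic_algorithm(fitness=sum))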

Software Defect Prediction Framework Based On Hybrid Metaheuristic Optimization Methods

k-NN (k-Nearest Neighbor). The k-Nearest Neighbor algorithm is a classification method in which a new object is labeled based on its k closest neighboring objects. In principle, given a training dataset and a new object to be classified, the distance (some measure of similarity) between the new object and the training objects is first computed, and the nearest (most similar) k objects are then chosen (Gorunescu 2011).
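A minimal sketch of the procedure (Euclidean distance and majority vote; the data are illustrative):

import math
from collections import Counter

def knn_classify(train, query, k=3):
    """train: list of (feature_vector, label); query: a feature vector.
    Computes distances, keeps the k nearest, returns the majority label."""
    neighbors = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    return Counter(label for _, label in neighbors).most_common(1)[0][0]

train = [((1.0, 1.0), "A"), ((1.2, 0.8), "A"),
         ((4.0, 4.2), "B"), ((3.8, 4.0), "B")]
print(knn_classify(train, (1.1, 0.9)))   # -> "A"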


Efficient Data Dependence Profiling.

Figure 7.17 shows the benefits from using the set-based profiler information instead of traditional profilers. Here, we show the normalized IPC, and the percentage improvement in cycles and instructions, when using set-based profiler information instead of traditional profiler information. We see that the benefits in equake and bzip2 are realized due to the improved accuracy afforded by the set-based profiler. This figure clearly shows the impact of accuracy on optimization potential. The decentralization of storage into various sets allows the set profiler to isolate potentially inaccurate regions of code into a finite number of sets. As we mentioned before, the Root Sets and membership checks allow the set profiler to create highly local sets. An advantage of these localized sets, besides better cache locality, is their isolation from the rest of the storage. In other words, even if a few sets are completely full, and hence lose accuracy, the other sets are not impacted by this degradation. Our set-allocation schemes improve the accuracy of the traditional profiler by using the storage to record set IDs of the previous instructions, rather than simply setting a bit (Streamlined Design) or storing a tag (Improved Original Design). However, this improvement cannot get around the fundamental roadblock in traditional profiler design: that of centralized storage.
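For readers unfamiliar with dependence profiling, the toy sketch below shows the centralized shadow-memory design the chapter argues against: one global map from address to last writer. It illustrates the concept only and is not the dissertation's implementation.

# Toy data-dependence profiler (illustration only): a centralized
# shadow memory maps each address to its last-writing instruction --
# the design whose storage pressure the set-based profiler avoids.
shadow = {}          # address -> last writer (instruction id)
deps = set()         # (producer, consumer) read-after-write dependences

def profile(trace):
    for instr, op, addr in trace:        # op is "load" or "store"
        if op == "load" and addr in shadow:
            deps.add((shadow[addr], instr))
        elif op == "store":
            shadow[addr] = instr

profile([(1, "store", 0x10), (2, "load", 0x10), (3, "store", 0x20),
         (4, "load", 0x20), (5, "load", 0x10)])
print(sorted(deps))   # [(1, 2), (1, 5), (3, 4)]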

Engineering Design Optimization by Dynamic Cluster based Framework using Mixture Surrogates

The remaining parts of this paper are organized as follows: Section 2 details the algorithm used for dynamic-partitioning-based constrained optimization using mixture surrogates. Section 3 illustrates the algorithm with an example benchmark problem, along with the concept of dynamic partitions of DS. Section 4 provides a clear explanation of the numerical experiments performed to validate the developed algorithm, along with its computer implementation. In Section 5, the introduced techniques are applied to a benchmark test problem and to engineering problems, including a frontal crash simulation on a car model and the design of MTS. In Section 6, conclusions and the scope of future studies are provided.

PE2 A Service Oriented Meta Task Scheduling Framework in Cloud Environment

Scheduling tasks efficiently is a notable research issue in cloud computing. For different types of users, it is a vital element in organizing and sharing cloud resources. Several task scheduling algorithms have been described in the literature. Every algorithm has limitations, for example high scheduling time, overload, and time and computation complexity. Moreover, nearly all of these works use a small number of tasks and a single objective function, which limits their results. To overcome the difficulties of existing scheduling algorithms, novel approaches are needed. This work presents a framework for task scheduling in a cloud environment that reduces time and cost and maximizes resource utilization.
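As a baseline illustration of what such a scheduler optimizes, here is a toy greedy heuristic (assign each task to the machine with the earliest completion time); it is a common single-objective baseline, not the framework proposed in the paper.

def schedule(tasks, machines):
    """Greedy meta-task scheduling: assign each task to the machine
    with the earliest completion time. tasks: list of lengths (MI);
    machines: list of speeds (MIPS). Returns the plan and the makespan."""
    finish = [0.0] * len(machines)
    plan = []
    for t in sorted(tasks, reverse=True):       # longest tasks first
        m = min(range(len(machines)),
                key=lambda i: finish[i] + t / machines[i])
        finish[m] += t / machines[m]
        plan.append((t, m))
    return plan, max(finish)

plan, makespan = schedule(tasks=[400, 250, 300, 120, 500],
                          machines=[10.0, 20.0, 15.0])
print(plan, makespan)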

Intelligent Optimization Framework based on Benchmarking

According to the principle of optimization and their general characteristics, the existing intelligent optimization algorithms (IOAs) can be divided into four categories. 1) Evolutionary Computation: Although the specific search techniques differ, the shared optimization idea is to imitate the evolutionary process of biology and generate candidate solutions through selection, crossover, and mutation. The Genetic Algorithm is the typical representative. 2) Swarm Intelligence: These are mainly inspired by the behavior of social insects or animals. The most well-known are Particle Swarm Optimization and Ant Colony Optimization. 3) Phenomenon Algorithms: These are mainly inspired by the internal rules behind various physical phenomena. Since physical phenomena are so varied, this kind of IOA can be expected to flourish. 4) Other Metaheuristic Approaches: In addition to the above three categories, there is a fourth group. These should not be categorized as optimization algorithms in essence; rather, they are better considered unique search strategies and search ideas, like the Simulated Annealing Algorithm, Tabu Search, Variable Neighborhood Search, and Predatory Search.
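As a minimal example from the fourth category, here is a simulated annealing sketch (toy objective and cooling schedule chosen purely for illustration):

import math, random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.95, iters=2000):
    """Minimize f: accept worse moves with probability exp(-delta/T),
    cooling T geometrically -- a search strategy, not a population method."""
    x, fx, t = x0, f(x0), t0
    for _ in range(iters):
        y = x + random.uniform(-step, step)       # random neighbor
        fy = f(y)
        if fy < fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy                         # accept the move
        t *= cooling                              # cool the temperature
    return x, fx

# Toy multimodal objective.
print(simulated_annealing(lambda x: x * x + 3 * math.sin(5 * x), x0=4.0))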

Multi-Objective Stochastic Optimization Programs for a non-Life Insurance Company under Solvency Constraints

We preliminarily evaluate the effectiveness of the proposed multi-objective optimization procedure in the case of two criteria by solving Problem (14). The quality of the dotted representation of the efficient frontier obtained by using the proposed version of the NNC method with 20 solutions is compared to that provided by the approximation with 20 optimal portfolios detected by the ε-constraint method, in terms of both performance metrics and qualitative charts. The latter set of solutions has been obtained by minimizing the invested capital c for different targets of the expected return on capital ROC. The chosen values for ROC correspond to 20 equally spaced points on the interval whose end-points are the minimum and the maximum admissible values for ROC. We analyze the optimality of these solution sets according to the following two criteria.
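The ε-constraint mechanics can be sketched on a toy bi-objective problem (not the paper's insurance model), assuming scipy is available: minimize one objective subject to the other being below a swept target.

import numpy as np
from scipy.optimize import minimize

# Toy bi-objective problem standing in for (invested capital, -ROC):
# minimize f1 while requiring f2 <= eps, sweeping eps over a grid.
f1 = lambda x: (x[0] - 1) ** 2 + (x[1] - 1) ** 2
f2 = lambda x: (x[0] + 1) ** 2 + (x[1] + 1) ** 2

frontier = []
for eps in np.linspace(0.5, 8.0, 20):        # 20 equally spaced targets
    res = minimize(f1, x0=[0.0, 0.0],
                   constraints=[{"type": "ineq",
                                 "fun": lambda x, e=eps: e - f2(x)}])
    if res.success:
        frontier.append((f1(res.x), f2(res.x)))   # one efficient point
print(frontier[:3])

Each solve contributes one point, and the swept targets trace out the dotted approximation of the efficient frontier that the paper compares against the NNC representation.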

Failure analysis of a re-design knuckle using topology optimization

This study has proposed a systematic framework for the design and analysis of a lightweight knuckle for a commercial electric vehicle (EV). In the proposed framework, the design space is identified from an inspection of the steering and suspension systems of the target EV and a hardpoints sensitivity analysis. A finite element model of the knuckle is constructed in ABAQUS and two-stage topology optimization is then performed to minimize the weight of the knuckle. Finally, finite element simulations are conducted to analyze the strength of the knuckle and evaluate its fatigue life under four ISO 8608 road classes (A–D). The simulation results support the following main conclusions:

A General Framework for Constrained Bayesian Optimization using Information-based Search

To implement Algorithm 1, we have proposed a new information-based approach called Predictive Entropy Search with Constraints (PESC). At each iteration, PESC collects data at the location that is expected to provide the highest amount of information about the solution to the optimization problem. By introducing a factorization assumption, we obtain an acquisition function that is additive over the subset of functions to be evaluated. That is, the amount of information that we approximately gain by jointly evaluating a set of functions is equal to the sum of the gains of information that we approximately obtain by the individual evaluation of each of the functions. This property means that the acquisition function of PESC is separable. Therefore, PESC can be used to solve general constrained BO problems with decoupled evaluation, something that has not been previously addressed. We evaluated the performance of PESC in coupled problems, where all the functions (objective and constraints) are always jointly evaluated at the same input location. This is the standard setting considered by most prior approaches to constrained BO. The results of our experiments show that PESC achieves state-of-the-art results in this scenario. We also evaluated the performance of PESC in the decoupled setting, where the different tasks can be evaluated independently at arbitrary input locations. We considered scenarios with competition (CD) and with non-competition (NCD) and compared the performances of two versions of PESC: one with decoupling (decoupled PESC) and another one that always performs coupled evaluations (coupled PESC). Decoupled PESC is significantly better than coupled PESC when there is competition, that is, in the CD setting. The reason for this is that some functions can be more informative than others and decoupled PESC exploits this to make optimal decisions when deciding which function to evaluate next with limited resources. In particular, decoupled PESC avoids wasting time in function evaluations that are unlikely to improve the current estimate of the solution to the optimization problem. However, when there is no competition, that is, in the NCD setting, coupled and decoupled PESC perform similarly. Therefore, in our experiments, the main advantages of considering decoupling seem to come from choosing an unequal distribution of tasks to evaluate, rather than from the additional freedom of evaluating the different tasks at potentially arbitrary locations. In our experiments we have assumed that the evaluation of all the functions takes the same amount of time. However, NCD is expected to perform better than the coupled approach in other settings in which some functions are much faster to evaluate than others. Evaluating the performance of NCD in these settings is left as future work.
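The separability property itself can be illustrated schematically (the gain numbers below are made up; PESC computes them from approximate entropy reductions over Gaussian-process posteriors):

# Schematic illustration of a separable acquisition function: the joint
# information gain of evaluating a subset of tasks (objective f,
# constraints c1, c2) decomposes as a sum of per-task gains.
gain = {"f": 0.9, "c1": 0.4, "c2": 0.1}   # per-task gains at some input x
cost = {"f": 1.0, "c1": 1.0, "c2": 1.0}   # evaluation costs

def acquisition(tasks):
    return sum(gain[t] for t in tasks)     # additivity = separability

# Coupled evaluation scores the full set at once:
print(acquisition(["f", "c1", "c2"]))      # -> 1.4
# Competitive decoupling picks the single most informative task per cost,
# avoiding evaluations unlikely to improve the solution estimate:
print(max(gain, key=lambda t: gain[t] / cost[t]))   # -> "f"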

A FRAMEWORK FOR ONTOLOGY-BASED DIABETES DIAGNOSIS USING BAYESIAN OPTIMIZATION TECHNIQUE

Mass Index), which we assume to be a hidden variable in the network. Given that skin fold thickness is not good evidence of diabetes, BMI is taken as the obesity value. Both GTT and insulin measurements are used for testing diabetes and act as causes of diabetes. Whether blood pressure is a cause of diabetes was an open question; following the experiments, it was found that blood pressure is not a cause of diabetes. Pregnancy, age and obesity are all causes of blood pressure. Based on the presented analysis, Figure 4 indicates the diagnosis of 15 patients and their diabetes status based on the stated features.
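The causal structure described in this excerpt can be written down as a plain edge list (node names paraphrased; conditional probability tables omitted):

# Structure of the diagnosis network as described in the excerpt
# (names paraphrased; CPTs omitted). BMI is treated as hidden.
edges = [
    ("Pregnancy", "BloodPressure"),
    ("Age", "BloodPressure"),
    ("BMI", "BloodPressure"),      # obesity (BMI) raises blood pressure
    ("BMI", "Diabetes"),
    ("GTT", "Diabetes"),           # GTT and insulin both test for and
    ("Insulin", "Diabetes"),       # act as causes of diabetes here
]
# Note: no ("BloodPressure", "Diabetes") edge -- the experiments found
# that blood pressure is not a cause of diabetes.
parents = {}
for u, v in edges:
    parents.setdefault(v, []).append(u)
print(parents["Diabetes"])   # ['BMI', 'GTT', 'Insulin']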

A competency based framework for social work education in India

The problem and research gap identified above indicate the need for an alternative model of social work education. According to Siddiqui (1997), the demand for a new model of social work education has existed since the 1940s. Also, as early as 1987, Siddiqui had suggested that social work education in India move towards a competency based approach. The current article, which is based on the doctoral study of the author, envisages an educational framework which adopts a Competency Based Education (CBE) approach and draws directly from social work practice in India. This competency-based educational framework purports to address the shortcomings of the present model by being more relevant to Indian social realities, creating a strong link between theory and practice, and fostering greater commitment to the profession. It also hopes to be useful, to both social work schools and social welfare institutions, in setting standards of training for social work students and practitioners.

STUDIES ON IMPROVING TEXTURE SEGMENTATION PERFORMANCE USING GENERALIZED GAUSSIAN MIXTURE MODEL INTEGRATING DCT AND LBP

In this paper we propose a new technique for image segmentation called the MRF-ABC algorithm. It is based on Artificial Bee Colony optimization of the label configuration of images to perform the segmentation process in a Markovian framework. This model is developed to overcome the fact that ICM may converge to a local minimum during the segmentation process. The evaluation of this algorithm on both real-world images and X-ray Non-Destructive Testing images shows that the results are satisfactory in terms of energy optimization and visual quality assessment. This confirms that the proposed image segmentation method is more robust than ICM and consequently equivalent to other state-of-the-art techniques such as particle swarm and ant colony optimization in a Markovian framework. Looking ahead, future work will focus on overcoming the high processing-time limitation, to allow large-image segmentation, by eliminating inefficient loops in the software and eventually implementing it in a programmable circuit. In addition, the application of this method will be tested on high-resolution specialized images such as remote sensing and medical images.
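For contrast with the proposed method, here is a minimal sketch of the ICM baseline on a toy Potts-model energy (assuming numpy; greedy per-pixel updates, hence the local-minimum weakness that MRF-ABC targets):

import numpy as np

def icm(image, labels, means, beta=1.0, sweeps=5):
    """Iterated Conditional Modes for a Potts MRF: at each pixel, pick
    the label minimizing data term + beta * (# disagreeing neighbors).
    Greedy, so it can converge to a local minimum of the energy."""
    h, w = image.shape
    for _ in range(sweeps):
        for i in range(h):
            for j in range(w):
                best, best_e = labels[i, j], np.inf
                for k in range(len(means)):
                    e = (image[i, j] - means[k]) ** 2        # data term
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < h and 0 <= nj < w:
                            e += beta * (labels[ni, nj] != k)  # smoothness
                    if e < best_e:
                        best, best_e = k, e
                labels[i, j] = best
    return labels

# Noisy two-region test image.
img = np.block([[np.zeros((8, 8)), np.ones((8, 8))]]) \
      + 0.1 * np.random.randn(8, 16)
init = np.random.randint(0, 2, img.shape)
print(icm(img, init, means=[0.0, 1.0]))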

Evolutionary Algorithms-based Parallel Simulation-Optimization Framework for Solving Inverse Problems

D. High-level Parallel Hybrid (HPH): This procedure involves several self-contained algorithms performing a search in parallel and cooperating to find the optimum solution. Intuitively, HPH will perform at least as well as one algorithm alone, and often performs better. An example of HPH is the GA-based island model, where the population is divided into small subpopulations by geographic isolation. A GA evolves each subpopulation, and individuals can migrate between subpopulations. The island model is controlled by several parameters, such as the topology that defines the connections between subpopulations, the migration rate that controls the number of migrant individuals, the replacement strategy used, and the migration interval that determines how often migration occurs. A number of studies have implemented this procedure with varying degrees of success using different global optimization methods; for example, in Bachelet et al. (1996) a parallel GA is proposed, while in Badeau et al. (1997) a parallel Tabu Search is utilized.
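A compact sketch of the island-model skeleton (ring topology, truncation selection; the GA step is abbreviated and every parameter value is illustrative):

import random

def evolve(pop, fitness, p_mut=0.1):
    """One abbreviated GA generation on real vectors (illustrative)."""
    pop = sorted(pop, key=fitness)[: len(pop) // 2]   # truncation selection
    children = [[(a + b) / 2
                 + (random.gauss(0, 0.3) if random.random() < p_mut else 0)
                 for a, b in zip(*random.sample(pop, 2))]
                for _ in range(len(pop))]
    return pop + children

def island_model(fitness, n_islands=4, size=20, dim=2,
                 generations=50, interval=10, migrants=2):
    """Ring-topology island model: each island evolves independently and,
    every `interval` generations, sends its best individuals to the next
    island, replacing that island's worst (the replacement strategy)."""
    islands = [[[random.uniform(-5, 5) for _ in range(dim)]
                for _ in range(size)] for _ in range(n_islands)]
    for g in range(1, generations + 1):
        islands = [evolve(p, fitness) for p in islands]
        if g % interval == 0:                      # migration interval
            for i, p in enumerate(islands):
                dest = islands[(i + 1) % n_islands]    # ring topology
                best = sorted(p, key=fitness)[:migrants]
                dest.sort(key=fitness)
                dest[-migrants:] = best            # replace the worst
    return min((min(p, key=fitness) for p in islands), key=fitness)

print(island_model(lambda x: sum(v * v for v in x)))   # minimize sphere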

Productive Public Expenditure and Debt Dynamics: a Theoretical Framework based on Intertemporal Optimization

Bose, Haque and Osborne (2003) examined the growth effects of government expenditure for a panel of 30 developing economies, with a focus on sectoral expenditures during the 1970s and 80s. Their main empirical result is that the ratio of government capital expenditure to GDP is positively and significantly correlated with economic growth, while the growth effect of current expenditure is not significant for a large group of countries. Gupta, Clements, Baldacci and Granados (2005) test the effects of fiscal consolidation and expenditure composition on economic growth in a sample of 39 low-income countries during the 1990s. The results show a strong link between public expenditure and growth, as fiscal consolidations achieved through current expenditure cuts are, in general, more conducive to growth. Higher current expenditures and domestic financing of deficits are associated with less favourable economic performance. Empirical literature with similar results includes Landau (1983) and Summers, Kravis and Heston (1984). Hence, for the empirical analysis in this paper, based on the findings in the above literature, the hypothesis that productive public expenditure is capital expenditure is tested on Indian data. In fact, the cointegration exercise presented in Section 4 on empirical results reconfirms the hypothesis for India, where capital expenditures emerge as the productive type.

Vibration Suppression of a Cantilever Beam Using Multi-Mode Dynamic Vibration Absorbers

In the design of DVAs, several optimization methods have been employed. [9] presented a general procedure based on the genetic algorithm (GA) method for simultaneously determining the localization and the optimal parameters of a neutralizer system manufactured using viscoelastic materials. In [1], the direct updating method was used in an optimization procedure, showing that the optimum values of the absorber parameters depend upon various factors such as the position of the applied force, the location where the absorbers are attached, the position at which the beam response should be minimized, and the beam characteristics. [10] developed a hybrid optimization methodology, which combines a global optimization method based on the GA with a local optimization method based on Sequential Quadratic Programming, to find the optimum values of the design parameters, namely the spring stiffness, the damping factor and the position of the attached tuned mass damper, in order to suppress the vibration amplitude either at a particular mode or at several modes simultaneously.
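The global-then-local hybrid pattern can be sketched with scipy, using differential evolution as a stand-in for the GA stage and SLSQP for the SQP refinement (toy objective; not the model or parameters of [10]):

import numpy as np
from scipy.optimize import differential_evolution, minimize

# Toy stand-in for an absorber-tuning objective: a multimodal peak
# response minimized over (stiffness, damping) within bounds.
def response_peak(x):
    k, c = x
    return (k - 2.3) ** 2 + (c - 0.7) ** 2 + 0.3 * np.sin(8 * k)

bounds = [(0.0, 5.0), (0.0, 2.0)]

# Stage 1: global search (differential evolution standing in for a GA).
coarse = differential_evolution(response_peak, bounds, seed=0)

# Stage 2: local SQP refinement starting from the global stage's result.
fine = minimize(response_peak, coarse.x, method="SLSQP", bounds=bounds)
print(coarse.x, "->", fine.x, fine.fun)

The global stage escapes local minima of the multimodal response; the SQP stage then polishes the solution to high precision, which is the division of labor the hybrid methodology exploits.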

Cultural Algorithm based on Decomposition to solve Optimization Problems

It mainly focuses on maintaining population diversity. Generally, the diversity of the population can be handled by increasing diversity through mutation of selected old solutions or random generation of new solutions upon detection of an environmental change, or by deploying multi-population methods [13], [16]. Good diversity helps obtain promising search regions. In [53], Deb presented an extended version of the Nondominated Sorting Genetic Algorithm (NSGA-II) [54] that introduces diversity at each environmental change detection. Two approaches were discussed in the paper: the first version introduced diversity by replacing the population with new randomly created solutions; in the second version, diversity is introduced by replacing the population with mutated solutions. In 2015, Azzouz [2] proposed a different version of the above algorithm to deal with dynamic constraints, replacing the constraint-handling mechanism with a more elaborate, self-adaptive penalty function. The major drawback of diversity-based methods is the difficulty of determining the useful amount of diversity needed: when diversity is too high, the search resembles restarting the optimization process, whereas too little diversity leads to slow convergence.
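The two diversity-injection variants contrasted above can be sketched as follows (fractions and operators are illustrative):

import random

def inject_diversity(pop, frac=0.2, mode="random", sigma=0.5):
    """On detecting an environmental change, replace a fraction of the
    population either with random solutions (first variant) or with
    mutated copies of existing ones (second variant). `frac` trades off
    exploration against convergence speed."""
    n = max(1, int(frac * len(pop)))
    dim = len(pop[0])
    keep = pop[: len(pop) - n]
    if mode == "random":
        new = [[random.uniform(-5, 5) for _ in range(dim)]
               for _ in range(n)]
    else:                                  # "mutate"
        new = [[g + random.gauss(0, sigma) for g in random.choice(pop)]
               for _ in range(n)]
    return keep + new

pop = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(10)]
print(len(inject_diversity(pop, mode="mutate")))   # population size kept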

Cognitive Ant colony optimization: A new framework in swarm intelligence

There are a considerable number of researchers, mainly biologists, who study the behaviour of ants in detail. Biologists have shown experimentally that certain ant species can find the shortest paths when foraging for food from a nest by exploiting communication based only on pheromones, an odorous chemical substance that ants can deposit and smell. This behavioural pattern has inspired computer scientists to develop algorithms for the solution of optimization problems. The first attempts in this direction appeared in the early 1990s, indicating the general validity of the approach. Ant Colony Optimization (ACO) algorithms are the most successful and widely recognized algorithmic techniques based on ant behaviours. These algorithms have been applied to numerous problems; moreover, for many problems ACO algorithms are among the current highest-performing algorithms.
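The pheromone mechanism can be sketched on a small graph (a toy, with illustrative parameters; practical ACO variants add heuristic visibility terms and elitism):

import random

def aco_shortest_path(graph, src, dst, ants=30, iters=40,
                      evaporation=0.3, q=1.0):
    """Toy ACO: ants walk from src to dst choosing edges with probability
    proportional to pheromone; shorter tours then deposit more pheromone."""
    tau = {(u, v): 1.0 for u in graph for v in graph[u]}
    best = None
    for _ in range(iters):
        tours = []
        for _ in range(ants):
            path, node, visited = [src], src, {src}
            while node != dst:
                options = [v for v in graph[node] if v not in visited]
                if not options:
                    break                          # ant is stuck
                weights = [tau[(node, v)] for v in options]
                node = random.choices(options, weights)[0]
                path.append(node)
                visited.add(node)
            if node == dst:
                length = sum(graph[a][b] for a, b in zip(path, path[1:]))
                tours.append((length, path))
        tau = {e: (1 - evaporation) * t for e, t in tau.items()}  # evaporate
        for length, path in tours:
            for e in zip(path, path[1:]):
                tau[e] += q / length                              # deposit
        if tours:
            best = min([min(tours)] + ([best] if best else []))
    return best

g = {"A": {"B": 1, "C": 4}, "B": {"C": 1, "D": 5}, "C": {"D": 1}, "D": {}}
print(aco_shortest_path(g, "A", "D"))   # -> (3, ['A', 'B', 'C', 'D'])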
