Over the past 30–40 years, many different approaches have been used to solve nurse rostering problems of varying forms and complexity. Methods used include mathematical programming (Bard and Purnomo 2007, Jaumard et al. 1998, Mason and Smith 1998, Miller et al. 1976, Thornton and Sattar 1997, Warner 1976), constraint programming (Bourdais et al. 2003, Darmoni et al. 1995, Meisels et al. 1995, Meyer auf'm Hofe 2000), and goal programming and multi-objective approaches (Arthur and Ravindran 1981, Azaiez and Al Sharif 2005, Berrada et al. 1996, Jaszkiewicz 1997). Recent, novel approaches include case-based reasoning (Beddoe and Petrovic 2006, Beddoe and Petrovic 2007) and Bayesian optimisation algorithms (Aickelin et al. 2007, Aickelin and Li 2007). A great variety of local search and metaheuristic approaches have also been applied to the problem; a few recent examples are (Bellanti et al. 2004, Burke et al. 2001, Dias et al. 2003, Meisels and Schaerf 2003, Valouxis and Housos 2000), and many more can be found in the literature reviews of Burke et al. (2004b) and Ernst et al. (2004). There are, however, very few large-scale neighbourhood searches applied to nurse rostering problems.


The Open Periodic Vehicle Routing Problem with Time Windows (OPVRPTW) is a practical transportation routing and scheduling problem arising from real-world scenarios. It shares common features with several classic VRP variants. The problem has a tightly constrained, large-scale solution space and requires well-balanced diversification and intensification in search. In Variable Depth Neighbourhood Search, a large neighbourhood depth prevents the search from being trapped in local optima prematurely, while a small depth provides thorough exploitation of local areas. Considering the multi-dimensional solution structure and tight constraints of OPVRPTW, a Variable-Depth Adaptive Large Neighbourhood Search (VD-ALNS) algorithm is proposed in this paper. The contributions of four tailored destroy operators and three repair operators at variable depths are investigated. Compared with existing methods, VD-ALNS makes a good trade-off between exploration and exploitation, and produces promising results on both small and large benchmark instances.

Keywords: adaptive large neighbourhood search, variable depth neighbourhood search, open periodic vehicle routing problem with time windows, metaheuristic


Many optimization problems of practical interest are computationally intractable. Therefore, a practical approach for solving such problems is to employ heuristic (approximation) algorithms that can find nearly optimal solutions within a reasonable amount of computation time. An improvement algorithm is a heuristic algorithm that generally starts with a feasible solution and iteratively tries to obtain a better solution. Neighborhood search algorithms (alternatively called local search algorithms) are a wide class of improvement algorithms where at each iteration an improving solution is found by searching the “neighborhood” of the current solution. A critical issue in the design of a neighborhood search algorithm is the choice of the neighborhood structure, that is, the manner in which the neighborhood is defined. As a rule of thumb, the larger the neighborhood, the better is the quality of the locally optimal solutions, and the greater is the accuracy of the final solution that is obtained. At the same time, the larger the neighborhood, the longer it takes to search the neighborhood at each iteration. For this reason, a larger neighborhood does not necessarily produce a more effective heuristic unless one can search the larger neighborhood in a very efficient manner. This paper concentrates on neighborhood search algorithms where the size of the neighborhood is “very large” with respect to the size of the input data and in which the neighborhood is searched in an efficient manner. We survey three broad classes of very large-scale neighborhood search (VLSN) algorithms: (1) variable-depth methods in which large neighborhoods are searched heuristically, (2) large neighborhoods in which the neighborhoods are searched using network flow techniques or dynamic programming, and (3) large neighborhoods induced by restrictions of the original problem that are solvable in polynomial time.
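The basic improvement loop described above can be sketched as follows. This is a minimal illustration, not code from the survey: the toy assignment cost and the 2-swap neighborhood of a permutation are assumed examples chosen for brevity.

```python
# Minimal neighborhood (local) search sketch: repeatedly scan a 2-swap
# neighborhood of the current permutation and move to an improving
# neighbor, stopping when no neighbor improves (a local optimum).

def cost(perm, weights):
    """Toy assignment cost: sum of weights[i][perm[i]]."""
    return sum(weights[i][p] for i, p in enumerate(perm))

def neighbourhood_search(perm, weights):
    improved = True
    while improved:
        improved = False
        for i in range(len(perm)):
            for j in range(i + 1, len(perm)):
                cand = list(perm)
                cand[i], cand[j] = cand[j], cand[i]   # 2-swap neighbor
                if cost(cand, weights) < cost(perm, weights):
                    perm = cand                        # accept improvement
                    improved = True
    return perm

weights = [[4, 2, 9], [3, 8, 1], [7, 5, 6]]
best = neighbourhood_search([0, 1, 2], weights)
```

A larger neighborhood (e.g. 3-swaps) would yield better local optima for this toy instance but costs more per iteration, which is exactly the trade-off the survey discusses.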


The above example demonstrates how to find a goal state, with an optimal path, using the DFS algorithm. The search starts from the initial node 'A' and visits the deepest path from that vertex (B, E, K, L), pushing each visited node onto the stack; it then backtracks to the previous level and examines the next nearest vertices in the graph, and so on. In this way the DFS search reaches the goal node. Table II shows the open and closed lists for the DFS algorithm.
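The traversal just described can be sketched with an explicit stack and open/closed lists. The adjacency list below is a hypothetical stand-in for the graph in the example (the original figure is not reproduced here), with 'G' playing the role of the goal state.

```python
# DFS with an explicit stack (open list) and a closed list of expanded
# nodes. The graph is an assumed example, not the one from the paper.

graph = {
    'A': ['B', 'C', 'D'],
    'B': ['E', 'F'],
    'E': ['K', 'L'],
    'C': ['G'],                      # 'G' is the goal state here
    'D': [], 'F': [], 'K': [], 'L': [], 'G': [],
}

def dfs(start, goal):
    open_list = [start]              # stack: last in, first out
    closed = []                      # expanded nodes, in visit order
    while open_list:
        node = open_list.pop()       # take the deepest unexpanded node
        closed.append(node)
        if node == goal:
            return closed
        # push children in reverse so the leftmost child is expanded first
        open_list.extend(reversed(graph.get(node, [])))
    return None

print(dfs('A', 'G'))   # visits A, B, E, K, L, then backtracks to F, C, G
```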

In contrast, although the ACO required the repair function for the more complex functions, adding information via the distance heuristic and reducing redundancy were both detrimental to the ACO-based variants. We hypothesize that both of these may have the effect of focusing search, causing premature convergence. To test this, we examined ACO runtime logs, which revealed that the lowest-cost solutions are typically discovered an order of magnitude sooner than in the equivalent EA-based experiments. This suggests that the basins of attraction of fewer local optima are sampled during search. Given the relationship between the quality of local optima and their distance from the global optima (and hence from other low-cost solutions) shown in Fig. 4, this explains the performance curves seen in Fig. 2. For some mid-scale problems (SC5-7), the negative quality-distance correlation means that algorithms which move from local optimum to local optimum will actually move away from the global optima, and because of the redundancy of the permutation encoding they will be moving into more sparsely populated regions, where they are more likely to become trapped. There remains the possibility that our results would change dramatically if we could find some “magic” set of settings that reduced this problem of ACO premature convergence, since we did not exhaustively tune the parameter values. However, our preliminary investigations did include all combinations of several different values for each parameter, in addition to the recommended settings.


product/company/information, he or she goes straight to the search engine and types a query; for this purpose, every company has now started recruiting SEO analysts. SEO has become very important and plays a key role in generating leads and business for all organizations.

DE was chosen as the second mainstream heuristic to apply to this problem because, being designed for continuous domains, it represents a stark contrast to the previous ACO approach [8]. As the goal of minimizing resonant frequency is strongly related to antenna length, it was decided that solutions should represent circuit-free meander lines (as in the prior ACO), which precludes the use of solvers such as Binary DE (BDE) [28] and Binary Particle Swarm Optimization (BPSO) [29]. Related work using EO [7] confirmed this decision, since the expanded search space of a segment-based representation comes at the cost of search efficiency. Unless a binary solution representation is suitable, adapting continuous solvers to discrete problem domains is generally a non-trivial task (see Onwubolu and Davendra [30] for several examples). In a Cartesian grid of nodes there are two primary ways to describe a self-avoiding walk (SAW) from a given starting point. The first is in terms of the absolute direction of travel, often described using the cardinal directions (N)orth, (E)ast, (S)outh and (W)est. The second is in terms of the relative change in direction of travel given a common initial direction, which can be labeled (L)eft, (F)orward and (R)ight. A path of k steps with a fixed starting node may then be described by k symbols from either the NESW or LFR alphabet. Inserting an additional solution component to select the start node 1–n on one edge of the grid, a solution for a bounded grid of n × n nodes consists of n² components: n² − 1 directional instructions plus the starting node. Mapping these solution encodings to a continuous space can be done by dividing each dimension (of arbitrary size) by the number of alternative directions in the encoding. The DE algorithm presented here defines each dimension to be in the range [0, 3].
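The continuous-to-discrete decoding described above (each dimension in [0, 3], divided into equal bins) might be implemented along the following lines. The function name, the exact bin boundaries, and the short example vector are illustrative assumptions; a full solution vector would have n² components.

```python
# Hypothetical decoding of a continuous DE vector into a start node and
# a sequence of relative directions (L)eft, (F)orward, (R)ight.
# Each dimension lies in [0, 3]; directional dimensions are split into
# three equal bins, and the first dimension picks the start node 1..n.

def decode(vector, n):
    # first component: start node 1..n on one edge of the grid
    start = min(int(vector[0] / 3.0 * n) + 1, n)
    # remaining components: relative direction instructions
    bins = 'LFR'
    moves = [bins[min(int(x), 2)] for x in vector[1:]]
    return start, moves

start, moves = decode([1.6, 0.4, 1.2, 2.9], n=3)
```

The same idea generalizes to the NESW alphabet by using four bins per dimension instead of three.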


The Cuckoo Search algorithm works as follows. At the start, the algorithm randomly generates the initial nests; each egg in a nest represents a solution. In each generation of the algorithm, two operations are performed. First, a new nest is generated by performing a Levy flight from the current nest, and the new nest is then evaluated; it replaces the current nest if it is better. Second, the Cuckoo Search algorithm discovers and removes the worst nests with probability pa. For simplicity, Cuckoo Search relies upon three idealized rules:
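A minimal sketch of the procedure described above follows. The simplified Mantegna-style Levy step, the sphere objective, and all parameter values are illustrative assumptions, not taken from the source.

```python
import random
import math

# Cuckoo Search sketch: random initial nests, a Levy flight from each
# current nest with greedy replacement, and abandonment of a fraction
# pa of the worst nests each generation.

def levy_step(beta=1.5):
    # Mantegna's algorithm for a Levy-distributed step length
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta *
              2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(f, dim=2, n_nests=15, pa=0.25, iters=200):
    nests = [[random.uniform(-5, 5) for _ in range(dim)]
             for _ in range(n_nests)]
    for _ in range(iters):
        for i in range(n_nests):
            # generate a new nest via a Levy flight from the current one
            new = [x + 0.01 * levy_step() for x in nests[i]]
            if f(new) < f(nests[i]):          # keep the better nest
                nests[i] = new
        # abandon the worst fraction pa of nests (replace with fresh ones)
        nests.sort(key=f)
        for i in range(n_nests):
            if i >= n_nests * (1 - pa):
                nests[i] = [random.uniform(-5, 5) for _ in range(dim)]
    return min(nests, key=f)

sphere = lambda x: sum(v * v for v in x)
best = cuckoo_search(sphere)
```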

The effect of the Earth's rotation on the dynamics of nonlinear waves in the oceans was extensively studied in the last decades (see, for example, References [1–8] and references therein). As is well known, wave propagation in big lakes can also be affected by the Earth's rotation [9–13]. In particular, the dynamics of solitary waves has been investigated within the framework of the Ostrovsky equation, and it was established that they cannot propagate steadily because of the permanent radiation of small-amplitude long waves [14,15] (however, they can propagate steadily when supported by a long background wave [16,17]). As a result, an initial solitary wave experiences a terminal decay which formally leads to its vanishing in a finite time [3,18]. However, the process of solitary wave decay is more complicated in reality and eventually leads to the formation of envelope solitons described by the nonlinear Schrödinger (NLS) equation or its modifications [19–24]. In an inhomogeneous medium, the dynamics of solitary waves is determined by the synergetic effects of inhomogeneity and fluid rotation. In particular, for a certain relationship between these two factors, a Korteweg–de Vries (KdV) soliton propagating towards a coast with a gradually decreasing depth can preserve its shape and amplitude, whereas its width and velocity change adiabatically [8].


(d) the outer limits of: (i) construction for a new road to be built by a local authority; (ii) an approved alteration or improvement to an existing road involving construction of a subway, underpass, flyover, footbridge, elevated road or dual carriageway; or (iii) construction of a roundabout (other than a mini-roundabout) or widening by the construction of one or more additional traffic lanes;


(2) As from 1 April 2002, the installation of a replacement window, roof light, roof window or specified type of glazed door must either have building regulation approval or be carried out and certified by a person who is registered under the Fenestration Self-Assessment Scheme by the Glass and Glazing Federation. (3) Question 'h'. Competent Persons Schemes. These records are not routinely held by the Local Authority; information is available directly from the appropriate Scheme Managers. This includes: heat-producing gas appliances; oil-fired combustion devices, oil storage tanks and the heating and hot water service systems connected to them; certain solid fuel burning appliances and the heating and hot water service systems connected to them; air conditioning or ventilation systems; lighting or electric heating systems; certain electrical installations; sanitary ware or washing facilities; and cavity wall insulation. The client is advised to apply to the vendor for details of any works or completions issued under Competent Persons Schemes.


Abstract. Experimental investigations of pool boiling heat transfer on microchannels of variable depth were conducted. The experiments were carried out for water and ethanol at atmospheric pressure. Microchannels of variable depth, from 0.2 to 2.8 mm, and width 0.5 mm were uniformly spaced on the base surface with a pitch of 1 mm. Heat transfer coefficients were compared for surfaces with variable and constant microchannel depth. At low and medium heat fluxes, structures with constant microchannel depth showed the best boiling heat transfer performance. An EX-FH20 (Casio) camera was used to record images of the entire surface of the specimen. The bubble growth mechanism on the enhanced surface was different from that of the plain surface. Visualization investigations were aimed at identifying nucleation sites and determining the bubble growth cycle. Vapor bubbles are generated in the microchannel spaces, from where they move towards the fin tips, then grow and depart.

• E.g., sort moves by the remembered move values found last time.
• E.g., expand captures first, then threats, then forward moves, etc.
• E.g., run Iterative Deepening search, sorting by value from the last iteration.
• Alpha/beta best case is O(b^(d/2)) rather than O(b^d)
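The move-ordering idea in the bullets above can be sketched as follows. The two-ply game tree, leaf values, and the `order_hint` dictionary (standing in for values remembered from a previous iterative-deepening pass) are made-up examples.

```python
# Alpha-beta search with best-first move ordering. Trying strong moves
# first produces earlier cutoffs, which in the best case reduces the
# work from O(b^d) to roughly O(b^(d/2)).

def alphabeta(node, depth, alpha, beta, maximizing, tree, leaf_vals,
              order_hint=None):
    order_hint = order_hint or {}
    children = tree.get(node)
    if depth == 0 or not children:
        return leaf_vals[node]
    # order moves: best remembered value first (e.g. from a prior iteration)
    children = sorted(children, key=lambda c: order_hint.get(c, 0),
                      reverse=maximizing)
    if maximizing:
        value = float('-inf')
        for c in children:
            value = max(value, alphabeta(c, depth - 1, alpha, beta, False,
                                         tree, leaf_vals, order_hint))
            alpha = max(alpha, value)
            if alpha >= beta:
                break                 # beta cutoff: prune remaining moves
        return value
    else:
        value = float('inf')
        for c in children:
            value = min(value, alphabeta(c, depth - 1, alpha, beta, True,
                                         tree, leaf_vals, order_hint))
            beta = min(beta, value)
            if beta <= alpha:
                break                 # alpha cutoff
        return value

tree = {'root': ['a', 'b'], 'a': ['a1', 'a2'], 'b': ['b1', 'b2']}
leaves = {'a1': 3, 'a2': 5, 'b1': 2, 'b2': 9}
best = alphabeta('root', 2, float('-inf'), float('inf'), True, tree, leaves)
```

In this toy tree the minimizing node `b` is cut off after its first child, since its value can no longer beat the score already secured at `a`.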


Next, we suggest two refinements to DAC that reduce running time. We propose a divide-and-conquer local search heuristic with quadrant restrictions (DACQ) and a divide-and-conquer local search heuristic with neighbor restrictions (DACN). In these local search algorithms, there are neighborhood restrictions on the lattice points to which points can be assigned. Both algorithms follow the same steps used in DAC. However, the points in M are not assigned to all of the lattice points. The neighborhood is restricted as follows.

The language L⁰_GRAPHS(5), representing the subset approximation of the search space for all noncrossing graphs with nesting depth 5, can be applied beyond the Yli-Jyrä and Gómez-Rodríguez (2017) framework to guide and restrict syntactic parsing and generation. This is expected to lead to linear-time syntactic and semantic dependency parsing with noncrossing output structures. If combined with multiplanar/book embedding methods (Yli-Jyrä, 2003; Kuhlmann and Johnsson, 2015), the techniques might extend to non-projective parsing and crossing graphs.


The purpose of this study was to investigate the relationship between vocabulary knowledge and reading comprehension among Iranian EFL learners. The participants were 50 EFL learners selected randomly from among intermediate learners at the Adib Language Institute in Ardabil, Iran. To collect data, learners were given two tests: one measured the depth of vocabulary knowledge (WKT), and the other was a reading comprehension test which required them to read different passages and answer multiple-choice questions. The results showed a strong positive relationship between depth of vocabulary knowledge and reading comprehension skill. An implication of this study is that teachers and learners should take into account the role of vocabulary knowledge depth in their teaching and learning, respectively.

The independent paths ideally should be generated from the pseudocode. This step was intended to ascertain whether all of the pseudocode has been applied in the code: if a given test case did not produce the expected result, something was missing or faulty in the implementation (code). For that reason, the pseudocode should be used as the source of the CFG/DD-Graph. Since the CFG/DD-Graph uses a graph representation, graph theory is an appropriate tool for solving the independent-path generation problem. Traversal from one node to another specific node can be done using algorithms such as Dijkstra's or Depth-First Search (DFS) [24]. These algorithms are widely used in network problems such as routing and the travelling salesman problem. DFS has been used to determine test cases within a system testing graph which combines an activity diagram and a sequence diagram [25].
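The path-generation step described above can be illustrated with a DFS that enumerates all paths from entry to exit through a small control-flow graph. The four-node if-else CFG below is a hypothetical example, not taken from the paper.

```python
# Enumerate all entry-to-exit paths in a control-flow graph via DFS.
# Each path corresponds to a candidate test case.

cfg = {
    1: [2, 3],   # branch node: "if" successor 2, "else" successor 3
    2: [4],
    3: [4],
    4: [],       # exit node
}

def all_paths(cfg, node, exit_node, path=None):
    path = (path or []) + [node]
    if node == exit_node:
        return [path]
    paths = []
    for succ in cfg[node]:
        paths.extend(all_paths(cfg, succ, exit_node, path))
    return paths

print(all_paths(cfg, 1, 4))   # [[1, 2, 4], [1, 3, 4]]
```

For a CFG with loops, the recursion would need a visited check (or a bound on path length) to keep the walk from revisiting nodes indefinitely.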

In this section, a taxonomy is given for classifying local search metaheuristics based on the characteristics of their move acceptance methods. Local search metaheuristics are composed of two fundamental components: a predefined solution neighbourhood, and a move acceptance method. Here, an emphasis is put on predefined, since the strategy for applying move operator(s) is determined in advance and there is no discrimination between multiple operators, if there are any. This contrasts with methods such as hyper-heuristics, where (machine) learning techniques could be used for selecting the most appropriate move operator given the current search state. Similarly, Tabu Search is not considered in this study as a local search metaheuristic embedding a move acceptance strategy; Glover and Laguna (2013) have previously contrasted Tabu Search with single-point based search methods utilising a (move) acceptance criterion. A classification for local search metaheuristics is shown in Figure 1 and distinguishes them based on two features of their move acceptance methods, as indicated by the dashed arrows. Firstly (left), the nature of the accept/reject decision, as indicated by the acceptRejectDecision() procedure in Line 6 of Algorithm 1, is considered; this takes into account what the objective value of the candidate solution is compared to, and the resulting probability of acceptance that is returned. The second part of the classification (right) considers how the algorithmic parameters are set within the move acceptance method. The mechanism(s), if any, to update the settings of the parameters are usually employed during the procedures process1() and/or process2(), as shown
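Two contrasting move acceptance methods in the spirit of the acceptRejectDecision() procedure discussed above can be sketched as follows. The function names and the temperature parameter are illustrative assumptions, not definitions from this taxonomy.

```python
import math
import random

# Each method compares the candidate's objective value with the current
# one (minimisation assumed) and returns whether the move is accepted.

def accept_improving_only(current_cost, candidate_cost):
    """Hill-climbing style: accept only strict improvements
    (deterministic accept/reject decision)."""
    return candidate_cost < current_cost

def accept_metropolis(current_cost, candidate_cost, temperature):
    """Simulated-annealing style: worsening moves are accepted with
    probability exp(-delta / T), so the acceptance probability depends
    on an algorithmic parameter (the temperature)."""
    delta = candidate_cost - current_cost
    if delta <= 0:
        return True
    return random.random() < math.exp(-delta / temperature)
```

The first method fixes the acceptance probability at 0 for worsening moves; the second exposes a parameter whose update schedule (cooling) is exactly the kind of mechanism the right branch of the classification considers.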


moves to be organized into a decision tree, which can be traversed with various types of algorithms. Depth-first search (DFS) and breadth-first search (BFS) are two rudimentary approaches to tree traversal that are straightforward to implement and can solve the game if possible. However, there is still room for improving their performance using auxiliary algorithms. As part of the challenge proposed by Neller, teams were asked to explore potential heuristics to guide the performance of these graph algorithms. This compelled our team to delve into more intelligent solutions such as heuristic-based traversal algorithms. There are several features that can be extracted from any state of a BoF game to be used directly as a heuristic to guide the traversal. This abundance of applicable features facilitated a machine learning approach: suitable features can be used as input, and the output can be applied as a heuristic, which allows the traversal to be directed by multiple features rather than just one. The team compared the provided depth-first search to heuristic algorithms such as Monte Carlo tree search (MCTS), as well as a novel heuristic search algorithm guided by machine learning.

Hyper-heuristics are hence categorized as selection constructive, selection perturbative, generation constructive and generation perturbative. Selection hyper-heuristics select the heuristic to apply next when constructing or improving a solution. Generation hyper-heuristics create new low-level heuristics by combining low-level heuristics or components of these heuristics. Genetic programming [3, 4] has been primarily used by generative hyper-heuristics to create new heuristics. Methods employed by selection constructive hyper-heuristics include tabu search, variable neighbourhood search, simulated annealing and evolutionary algorithms. Selection perturbative hyper-heuristics perform single-point or multi-point search. Single-point search hyper-heuristics comprise a heuristic selection and a move acceptance component, while multi-point hyper-heuristics employ a population-based method such as evolutionary algorithms or particle swarm optimization to select low-level heuristics. This study investigates the use of a multi-point search selection perturbative hyper-heuristic to identify the appropriate search, or combination of searches, to use to solve the problem at hand. An evolutionary algorithm is used to explore the heuristic space. The low-level perturbative heuristics are the searches, i.e.,
