Criteria for optimal web design (designing for usability)

The fourth difficulty may be the structure itself. That is, research has generally found that people make fewer mistakes when the hierarchical structure of a site is broader rather than deeper, and that ideally all information should be reachable within three hierarchical levels of the site's homepage. The more levels users have to traverse to reach the information they want, the lower the chance they will find it. For instance, in studying the placement of hyperlinks on a web page, Larson and Czerwinski (1998) point out that a moderate level of breadth is optimal when it is accompanied by a well-organized layout. In their study, a two-level site with 16 links on the first level and 32 links on the second produced reliably faster searches and less confusion than a three-level site with eight links at each of its three levels. The reasoning is that the deeper the hierarchy, the more a user has to rely on short-term memory. Deeper sites also have more general (and consequently more vague) link descriptions at the top level, which makes it harder for users to work out and remember the correct path to a target (for a good discussion of the breadth versus depth issue see Larson & Czerwinski, 1998).
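As a hedged illustration of the arithmetic behind this trade-off (a sketch, not code or data from the study): both hierarchies in the experiment expose the same number of target pages, so the comparison isolates the number of navigation decisions a user must hold in short-term memory.

```python
# Hedged illustration (not from Larson & Czerwinski's paper): both hierarchies
# in their study expose the same number of target pages but differ in the
# number of navigation decisions a user must make to reach one.

def hierarchy_stats(links_per_level):
    """Return (total reachable target pages, number of clicks to reach a target)."""
    pages = 1
    for breadth in links_per_level:
        pages *= breadth
    return pages, len(links_per_level)

broad = hierarchy_stats([16, 32])    # two-level, broad structure
deep = hierarchy_stats([8, 8, 8])    # three-level, deep structure

print(broad)  # (512, 2) -> 512 reachable pages, 2 decisions
print(deep)   # (512, 3) -> 512 reachable pages, 3 decisions
```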

Optimal tuned mass-damper-inerter (TMDI) design for seismically excited MDOF structures with model uncertainties based on reliability criteria

the passive vibration control of earthquake-excited multi-storey building structures, aiming to achieve enhanced performance compared to a same-mass TMD by exploiting the mass-amplification effect of the (ideal) inerter. The latter is a linear two-terminal mechanical device of negligible mass/weight that develops a resisting force proportional to the relative acceleration of its terminals [20]. The constant of proportionality is termed "inertance" and is measured in mass units (kg). Notably, several different inerter prototypes have been devised and experimentally tested over the past decade, achieving inertance values orders of magnitude larger than the devices' physical mass while exhibiting linear behavior within relatively wide frequency bands of practical interest [21-24]. These devices are relatively small-scale, tailored for and used in vehicle suspension systems. Nevertheless, inerter-like devices and mechanical arrangements, termed inertial or rotational dampers, have also been considered for the seismic protection of building structures, either in place of viscous dampers [25] or in conjunction with viscous dampers to enhance their energy dissipation capabilities [26-29]. In the TMDI, however, the inerter is utilized as a mass amplifier to increase the inertial property of a TMD without increasing its weight [18, 19, 30]. This is achieved by connecting the TMD mass via the inerter to a floor other than the one the TMD is attached to in a multi-storey primary building structure, as depicted in Figure 1(a). Specifically, in [18] a particular TMDI topological configuration was examined in which the TMD was located at the top floor and the inerter linked the attached mass to the penultimate floor. For this TMDI topology, it was shown in [19] that, for a given primary structure, attached mass, and inertance coefficient, an optimally designed TMDI (for spring and dashpot coefficients) can provide significant performance improvement over a same-mass optimal TMD in terms of primary-structure displacement response variance under stochastic ground excitation.
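In equation form (a standard statement of the ideal inerter law summarized above; the displacement symbols are chosen here for illustration and are not quoted from [20]):

F = b\,(\ddot{u}_1 - \ddot{u}_2)

where F is the force developed by the device, b is the inertance (in kg), and \ddot{u}_1, \ddot{u}_2 are the accelerations of the device's two terminals.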

The Open Research Web: A Preview of the Optimal and the Inevitable

It is important to stress that at this point all of this would be only an unvalidated regression equation, to be used experimentally. Even after being validated against an external criterion or criteria, it would still need to be used in conjunction with human evaluation and judgment; the regression weights would no doubt have to be set differently for different purposes, and would always remain open to tweaking and updating. But it would begin ushering in the era of online, interactive scientometrics based on an Open Access corpus and in the hands of all users.

eMedOffice: A web-based collaborative serious game for teaching optimal design of a medical practice

The game eMedOffice is an interactive, web-based, 2D, bird's-eye-view simulation of a medical practice prototype. Its purpose is to teach optimization of interior design, including the furnishing as well as the equipment of a general practitioner's medical practice. It is playable in every standard web browser and hence available anytime, at any location with an internet connection. Constructed as a rich internet application that uses a live-connect technique to communicate with the web server, it offers continuous game play that is not interrupted by intervening web page loads while still relying on standard web techniques. First, all participants have to register with a freely chosen public player name. After registration the active game starts. Three successive game phases characterize the active game. During the Introduction Phase the learning objectives, principles, and procedures of the serious game are presented. The introduction uses many in-game pictures to illustrate possible situations. Additionally, this phase is supported by an educational adviser who explains the learning objectives orally and answers general questions. The game is played in the following Execution Phase. In this phase players start to assign appropriate room functionalities, placing interior furniture and equipment onto the ground plan. A player can virtually open his/her medical practice at any time. Opening the medical practice starts a simulation.

Experimental design for vector output systems

The similarity in results, however, does not continue through the selected time point distributions. Under constraint C2 (Figure 5), the D-optimal distribution is loosely clustered about the center of the time interval, the E-optimal time points are clustered about t = 250 seconds, and the SE-optimal design chooses a small cluster of time points around the initial bump and allows a few samples after the function reaches a steady state. Using constraint C3 (Figure 6), all three optimal design criteria choose a majority of their sampling times before t = 600 and allow only a few sampling times after the system reaches a steady state. The optimization of time point distributions yields improved asymptotic standard errors relative to those of the uniform distribution, sometimes by an order of magnitude or more. For all three time point constraints, D-optimal yields the smallest ASEs for the largest number of parameters, and SE-optimal yields the smallest ASEs for the second-largest number. Both optimal design criteria perform better with the C3 optimal times than with the C2 optimal times. Therefore, in this simple case, using either the D- or SE-optimal design criterion with time point distribution constraint C3 would yield the best results.
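For concreteness, the sketch below shows how the three criteria named above are commonly scored from a Fisher information matrix; the specific SE-criterion normalization (ASEs divided by nominal parameter values) is an assumption for illustration, not a definition taken from this paper.

```python
# Hedged sketch: typical scoring of D-, E-, and SE-type design criteria from a
# Fisher information matrix F (assumed invertible). The exact SE normalization
# used here is an assumption, not quoted from the paper.
import numpy as np

def design_scores(F, theta_nominal, sigma2=1.0):
    cov = sigma2 * np.linalg.inv(F)              # asymptotic covariance of the estimator
    d_opt = np.linalg.det(F)                     # D-optimal: maximize det(F)
    e_opt = np.linalg.eigvalsh(F).min()          # E-optimal: maximize smallest eigenvalue
    ase = np.sqrt(np.diag(cov))                  # asymptotic standard errors (ASEs)
    se_opt = np.sum((ase / theta_nominal) ** 2)  # SE-type: minimize normalized ASEs
    return d_opt, e_opt, se_opt, ase
```

Larger D- and E-scores and a smaller SE-score indicate a better design; comparing the returned ASEs across candidate time-point sets is the comparison the paragraph above reports.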

Structural Analysis and Topology Optimization of Continuous Linear Elastic Orthotropic Structures Using Optimality Criterion Approach In ANSYS

In structural design, topology optimization can be regarded as an extension of methods for size optimization and shape optimization. Size optimization considers a structure that can be decomposed into a finite number of members and seeks the optimal values of the parameters defining those members. Shape optimization extends size optimization by allowing extra freedom in the configuration of the structure, such as the location of connections between members. The designs allowed are restricted to a fixed topology and can therefore be described using a limited number of optimization variables. The topology optimization here is performed using the optimality criteria method in ANSYS.
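As a rough illustration of what an optimality-criteria step does, here is a minimal sketch of the classic OC density update used in SIMP-type topology optimization (after Sigmund's well-known 99-line code); the implementation inside ANSYS is not documented in the excerpt and may differ.

```python
# Minimal sketch of the classic optimality-criteria (OC) density update for
# SIMP-type topology optimization; illustrative only, not ANSYS's internals.
import numpy as np

def oc_update(x, dc, dv, volfrac, move=0.2, eta=0.5):
    """x: element densities, dc: compliance sensitivities (<= 0),
    dv: volume sensitivities (> 0), volfrac: target volume fraction."""
    l1, l2 = 0.0, 1e9
    while (l2 - l1) / (l1 + l2 + 1e-12) > 1e-4:
        lmid = 0.5 * (l1 + l2)                       # bisection on the Lagrange multiplier
        x_new = x * (np.maximum(-dc, 0.0) / (dv * lmid)) ** eta
        x_new = np.clip(x_new, x - move, x + move)   # move limits
        x_new = np.clip(x_new, 1e-3, 1.0)            # density bounds
        if x_new.mean() > volfrac:                   # too much material: raise multiplier
            l1 = lmid
        else:
            l2 = lmid
    return x_new
```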

Optimal Selection and Composition of Web Services A Survey

Searching web services by tags is an effective way to find an appropriate web service. In [14], Uddam Chukmol et al. describe a collaborative tagging-based environment for web service discovery that allows users to tag or annotate a service using keywords or free text; an algorithm is developed for both types of tagging but has yet to be implemented. In [6], the responses of two different approaches were evaluated in an experimental setup. The design approach implements a database of tables containing web services that have already been tagged by the web service providers. Based on the tags entered by the user, the search engine applies algorithms to find the semantic correlation between the service tags. By applying the SEBT (Search Expansion Based on Tags) algorithm, semantically correlated web services are obtained. However, it can be observed that the semantic relationships of some of these web services are weak. By applying
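As a generic illustration of tag-based matching (a hedged sketch only; this is not the SEBT algorithm of [6], whose internals are not given in the excerpt, and the service names and tags are invented):

```python
# Generic tag-overlap ranking of web services; NOT the SEBT algorithm itself.

def jaccard(tags_a, tags_b):
    """Simple lexical overlap between two tag sets."""
    a, b = set(tags_a), set(tags_b)
    return len(a & b) / len(a | b) if a | b else 0.0

registry = {  # hypothetical tagged services
    "WeatherService": {"weather", "forecast", "temperature"},
    "MapService": {"map", "route", "geocoding"},
}
query = {"weather", "temperature"}
ranked = sorted(registry.items(), key=lambda kv: jaccard(query, kv[1]), reverse=True)
print(ranked[0][0])  # WeatherService
```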

Risk-informed optimization of the tuned mass-damper-inerter (TMDI) for the seismic protection of multi-storey building structures

This corresponds to TMDI cost taken as a function of both the secondary mass and the inerter force demand. Since costing information for the latter is not available, a parametric investigation is undertaken. For all design problems, stochastic simulation is used to estimate the necessary risk measures, and a Kriging metamodel is developed to support an efficient optimization process. The main novel contributions of the present work are a) TMDI optimal design and performance evaluation using life-cycle cost criteria for a real-life case-study building exposed to site-specific seismic hazard represented by a non-stationary stochastic ground motion model, and b) the incorporation of the inerter force within the adopted performance evaluation framework, which has previously been employed solely for TMD applications [12]. Emphasis is placed on discussing numerical results presented in the form of Pareto fronts, while most aspects of the considered probabilistic design and performance evaluation framework are only briefly reviewed, except for those details related to the novel elements introduced to accommodate intricacies of the TMDI design problem, such as the inerter force. Readers interested in further details of the adopted performance evaluation framework are directed to references [12, 35].
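As a minimal illustration of how a Pareto front is extracted from candidate designs under two competing objectives (a sketch; the objective names "cost" and "force" stand in for the paper's life-cycle cost and inerter force demand measures and are not its exact metrics):

```python
# Hedged sketch of bi-objective Pareto-front extraction for candidate designs;
# both objectives are minimized. Labels and values are invented.

def pareto_front(designs):
    """designs: list of (label, cost, force); keep non-dominated designs."""
    front = []
    for label, c, f in designs:
        dominated = any(c2 <= c and f2 <= f and (c2 < c or f2 < f)
                        for _, c2, f2 in designs)
        if not dominated:
            front.append((label, c, f))
    return front

candidates = [("A", 1.0, 0.9), ("B", 0.8, 1.2), ("C", 1.1, 1.1), ("D", 0.9, 0.8)]
print(pareto_front(candidates))  # [('B', 0.8, 1.2), ('D', 0.9, 0.8)]
```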

Design considerations for early-phase clinical trials of immune-oncology agents

A traditional approach to this combination dose-finding is to pre-select drug combinations with a known toxicity order and apply a single-agent design by escalating and de-escalating doses along a chosen path [33]. This can be done by pre-specifying, a priori, a subset of combinations for which the toxicity ordering is known. This approach transforms the two-dimensional dose-finding space into a one-dimensional space, and it has been used in much of the early work on dose combinations [34, 35]. The disadvantage of this approach is that it limits the number of dose combinations that can be considered and can miss promising dose combinations that lie outside the chosen path. More recent methods have moved away from reducing the two-dimensional dose-finding space to a single dimension; a thorough review has been written by Harrington et al. [36]. A number of designs have been proposed for finding the MTD of cytotoxic agents [37-39]. These methods determine the combinations to which patients are allocated based solely on toxicity considerations, without accounting for efficacy. As in the single-agent setting, these model-based methods outperform rule-based methods in terms of accuracy of MTD identification and safety in allocating patients [32]. A web application for the Bayesian Optimal Interval (BOIN) method [39] for combinations is available at www.trialdesign.org, and R packages exist for the partial order continual reassessment method (package pocrm) [37] and the product of independent beta probabilities escalation (PIPE) design (package pipe.design) [38]. The POCRM was successfully implemented in a recently completed, but not yet published, Phase I trial designed to determine the MTD of a combination of toll-like receptor (TLR) agonists with or without a form of incomplete Freund's adjuvant (IFA) for the treatment of melanoma (NCT01585350). To our knowledge, the PIPE design has been implemented in two dose-finding studies (NCT02760797, NCT02308072). There are a few existing early-phase designs for drug combination trials that account for both toxicity and efficacy. For example, the method of Wages and Conaway [40] has been adapted and implemented in recently completed and ongoing early-phase studies of combination immune-oncology agents (NCT02126579, NCT02425306) [41, 42] using immunologic response as a binary activity endpoint to drive the design. The R code used to implement these designs is available at http://faculty.virginia.edu/model-based_dose-finding/.

REQUEST FOR PROPOSAL FOR EMPANELMENT OF VENDORS FOR DESIGNING, DEVELOPMENT, MAINTENANCE & HOSTING OF WEB-SITE/WEB-PORTAL AS PER BANK'S NEED

The vendors will be empanelled based on the pre-qualification criteria (enclosed in Annexure-A) and the technical evaluation criteria (Clause No. 15) specified in the RFP. The Bank will invite commercial quotations from empanelled vendors for developing, maintaining, and upgrading web-sites/web-portals, logo design, and flash presentation design as per the Bank's need. The terms and conditions for the same are mentioned below. It may be noted that the actual terms and conditions will be specified at the time of inviting commercial proposals for the respective job.

Major MCDM Techniques and their application-A Review

The problem used as a reference in this paper to describe the various techniques is as follows: the alternatives are cars from the same or different companies, and the criteria include both qualitative and quantitative criteria. Qualitative criteria include reliability and style, whereas quantitative criteria include fuel economy and cost. These are the criteria against which the alternatives have to be compared, and the alternative that best satisfies all of them is chosen as the solution. The parameters for the optimal choice of car are as follows. A. Reliability: the probability of failure-free operation of a product for a specified time in a specified environment, i.e. how long the product will work effectively without any failure while in use; the greater the reliability, the higher the probability of a failure-free life for the product. B. Style: the basic appearance, design, and comfort level of the product. Style does not directly affect the quality of the product and is, of course, a voluntary criterion, but as buyers' priorities shift from an economic perspective to a qualitative one, style is being given greater importance.
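A minimal weighted-sum sketch of the car-selection setup described above; the weights and ratings are invented purely for illustration and are not taken from the review.

```python
# Hedged weighted-sum MCDM sketch: normalize each criterion, invert cost-type
# criteria, and rank alternatives. All numbers below are illustrative.

criteria = {                # weight, and whether "higher is better"
    "reliability":  (0.35, True),
    "style":        (0.15, True),
    "fuel_economy": (0.30, True),
    "cost":         (0.20, False),   # cost criterion: lower is better
}

cars = {                    # hypothetical raw ratings
    "Car X": {"reliability": 0.90, "style": 0.70, "fuel_economy": 18.0, "cost": 25000},
    "Car Y": {"reliability": 0.80, "style": 0.85, "fuel_economy": 22.0, "cost": 28000},
}

def score(car):
    total = 0.0
    for name, (w, benefit) in criteria.items():
        values = [c[name] for c in cars.values()]
        lo, hi = min(values), max(values)
        norm = (car[name] - lo) / (hi - lo) if hi > lo else 1.0
        total += w * (norm if benefit else 1.0 - norm)   # invert cost criteria
    return total

best = max(cars, key=lambda k: score(cars[k]))
print(best, round(score(cars[best]), 3))
```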

A Monte Carlo based analysis of optimal design criteria

In recent years, modeling in the life sciences has involved increasingly sophisticated models with a plethora of parameters to be estimated. This is due in part to increasingly stringent requirements concerning accuracy and the desire to adapt models to individual subjects. In contrast to this complex situation, one is usually confronted with serious restrictions on data, which often must be obtained by noninvasive or modestly invasive procedures. The validation of such models is, as a result, a very challenging problem. It is therefore necessary to design data collection procedures very carefully in order to optimize the information content carried by the data obtained. Methods of optimal experimental design are of central importance in this context and have justifiably received significant attention in the research literature (see for example [11, 14, 15, 22]). Since one has a number of different design criteria from which to choose, it is important to have procedures to compare typical results obtained using different criteria. A natural criterion for comparing different approaches is the standard errors of the parameters estimated from the data provided by the optimally designed experiments. One possibility that has been frequently employed is the asymptotic approximation of standard errors [23], which is based on only a single realization and on limiting approximations. However, such standard errors may not be sufficiently precise in specific cases, because the underlying assumptions of the asymptotic approximations may not be satisfied. It is therefore of interest to use a different approach that is based not on a single realization of the observation process but on a large number of realizations, at, of course, increased computational cost. To illustrate such an alternative approach, we present in this paper a detailed Monte Carlo analysis of the performance of several optimal design methods for the logistic and harmonic oscillator examples that were previously investigated [8] using asymptotic standard errors as the measure of comparison.
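A minimal sketch of the Monte Carlo idea: simulate many realizations of the observation process, re-estimate the parameters for each, and take the empirical spread as the standard error. The logistic model, noise level, and sampling grid below are illustrative assumptions, not the paper's settings.

```python
# Hedged Monte Carlo standard-error sketch for a logistic growth model;
# all model settings are illustrative, not taken from the paper.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, x0):
    return K / (1.0 + ((K - x0) / x0) * np.exp(-r * t))

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 15)              # candidate sampling times
true = (17.5, 0.7, 0.1)                     # "true" K, r, x0 (illustrative)
estimates = []
for _ in range(500):                        # Monte Carlo realizations
    y = logistic(t, *true) + rng.normal(0.0, 0.5, size=t.size)
    p_hat, _ = curve_fit(logistic, t, y, p0=true)
    estimates.append(p_hat)

mc_se = np.std(estimates, axis=0, ddof=1)   # Monte Carlo standard errors
print(mc_se)                                # to be compared against asymptotic SEs
```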

Statistical design of pools using optimal coverage and minimal collision

The drug discovery process starts with screening chemical libraries (collections of chemical compounds) to identify new lead compounds. Compounds are tested for potency with respect to one or several biological targets in an automated process known as high throughput screening (HTS). This process involves screening thousands to tens of thousands of chemical compounds per week. However, today's chemical libraries are extremely large, on the order of hundreds of thousands or millions of compounds, so that even the high throughput screening process can take months to test the entire library. This, and the fact that only a small fraction of the compounds in a library are actually active, leads to the conclusion that this approach is not very cost or time efficient. New methods for improving screening efficiency are needed. Xie, Tatsuoka, Sacks, and Young (2001) state that, in situations where the rate of active individuals is very small but the number of individuals to be tested is very large, testing individuals in groups is an effective alternative to testing them one by one. This scenario describes exactly the situation we face when screening large collections of chemical compounds. Therefore, one alternative to assaying compounds individually is to test them in pools, for example putting 10 compounds together for testing. The pools can be created either in a random fashion or by applying certain design criteria.
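For illustration, the random pooling baseline mentioned above can be sketched as follows (a hedged example; the optimal-coverage/minimal-collision construction of the paper is not reproduced here):

```python
# Hedged sketch of random pooling: partition a compound library into pools of
# a fixed size. Design-based pooling (the paper's topic) is not shown.
import random

def random_pools(compound_ids, pool_size=10, seed=0):
    ids = list(compound_ids)
    random.Random(seed).shuffle(ids)
    return [ids[i:i + pool_size] for i in range(0, len(ids), pool_size)]

library = range(1, 101)        # a toy library of 100 compounds
pools = random_pools(library, pool_size=10)
print(len(pools), pools[0])    # 10 pools; only positive pools are deconvoluted further
```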

Stable Gene Silencing in Zebrafish with Spatiotemporally Targetable RNA Interference

We designed shRNAs employing the primary miR-30 backbones (hence called miR-shRNAs) (Figure 1A). The fluorescent reporter TdTomato conveniently marked the cells expressing the miR-shRNAs. Three genes were used as the test set: (1) the atypical protein kinase C lambda (aPKCl), for which a distinct loss-of-function phenotype has been documented in zebrafish; we chose this gene also because its disruption produces a highly specific cellular phenotype, that is, an alteration of spindle rotation in dividing radial glia progenitor cells (Horne-Badovinac et al. 2001); and (2) the no tail a (ntla) and (3) one-eyed pinhead (oep) genes, for which distinct loss-of-function phenotypes can be readily assessed (Schulte-Merker et al. 1994; Zhang et al. 1998). Using the web-based shRNA design tool (www.genescript.com) together with filtering criteria based on thermodynamic properties (Ui-Tei et al. 2008), six shRNAs targeting the apkcl gene (Supporting Information, Figure S1) and four shRNAs each targeting the ntla (Figure S2) and oep (Figure S3) genes were selected.

Optimal design techniques for distributed parameter systems

Parameter estimation problems consist of approximating the parameter values of a given mathematical model based on measured data. They are usually formulated as optimization problems, and the accuracy of their solutions depends not only on the chosen optimization scheme but also on the given data. The problem of collecting data in the "best way" in order to ensure a statistically efficient estimate of the parameters is known as Optimal Design. In this work we consider the problem of finding optimal measurement locations for source identification in the 3D unit sphere from data on its boundary. We apply three different optimal design criteria to this 3D problem: the Incremental Generalized Sensitivity Function (IGSF), the classical D-optimal criterion, and the SE-criterion recently introduced in [3]. The parameters are then estimated by means of the Ordinary Least Squares procedure. In order to analyze the performance of each strategy, the data are numerically simulated and the estimated values are compared with the values used for simulation.
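A rough sketch of one piece of the workflow described above, under invented assumptions: pick boundary measurement locations with a greedy D-type criterion computed from model sensitivities (the subsequent Ordinary Least Squares estimation step is not shown). The sensitivity matrix is a random stand-in, not the paper's 3D source model.

```python
# Hedged sketch: greedy D-type selection of measurement locations from a
# sensitivity matrix S (rows: candidate boundary points, columns: parameters).
# S is random stand-in data for illustration only.
import numpy as np

rng = np.random.default_rng(1)
S = rng.normal(size=(200, 3))   # d(output)/d(parameter) at 200 candidate boundary points

def greedy_d_optimal(S, n_locations):
    """Greedily add locations that maximize det(S_sel^T S_sel).
    The determinant only becomes informative once at least as many
    locations as parameters have been selected."""
    chosen = []
    for _ in range(n_locations):
        best, best_det = None, -np.inf
        for i in range(S.shape[0]):
            if i in chosen:
                continue
            trial = S[chosen + [i], :]
            det = np.linalg.det(trial.T @ trial)
            if det > best_det:
                best, best_det = i, det
        chosen.append(best)
    return chosen

locations = greedy_d_optimal(S, n_locations=6)
print(locations)                # indices of selected boundary points
```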

Selection of catchment descriptors for the physical similarity approach Part I: Theory

The second disadvantage relates to the initial phase of the algorithm, when a decision about the hierarchy of CD categories must be made. This is a highly subjective decision and is also affected by the availability or unavailability of particular CD categories. If a training set were available with all CDs and all model parameters already known, it might in principle be possible to identify the most relevant CDs by a kind of principal direction analysis. Until this is done, the subjectivity may affect the selection of optimal CDs and thus also the final estimates of model parameters in ungauged catchments. A suitable categorisation of CDs for large and very heterogeneous catchment sets could be based, for example, on the effect of human activities. The superior categories of CDs could be those least influenced by human activities (e.g. climatic and geological descriptors), while categories of CDs more influenced by human activities (e.g. land-cover descriptors) could be regarded as inferior.

Optimization of the aerodynamic configuration of a tubular projectile based on blind Kriging

tunnel tests [5,6]. With the rapid improvement of computer capacity and the development of Computational Fluid Dynamics (CFD) in recent years, many studies have focused on numerical simulation of the flow field around the tubular projectile [7,8], studying its flow structure and aerodynamic characteristics. Our research group has also done a great deal of work, from numerical simulation to numerical optimization of the tubular projectile [1,2]. Li and Chen [2] studied in detail the aerodynamic characteristics of the tubular projectile under real conditions using FLUENT. Based on numerical simulations of the two-dimensional flow fields around different configurations at a Mach number of 3.0, Huang et al. [1] optimized the aerodynamic configuration of a simplified tubular projectile using the exhaustive method and obtained the optimal configuration with minimum drag coefficient. However, the exhaustive method appears appropriate only because of the extremely limited number of design variables.
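As an illustration of the exhaustive method referred to above (a sketch under invented assumptions: the design variables and the drag surrogate below are made up and merely stand in for the CFD evaluations of the cited work):

```python
# Hedged sketch of exhaustive (grid) search over a few design variables,
# keeping the minimum-drag configuration. The drag function is an invented
# smooth surrogate standing in for CFD runs.
import itertools

def drag_coefficient(inner_diameter, wall_thickness, nose_length):
    # Illustrative surrogate only; real values come from CFD simulations.
    return (0.30 - 0.05 * inner_diameter + 0.08 * wall_thickness
            + 0.02 * (nose_length - 2.5) ** 2)

grid = itertools.product([0.4, 0.5, 0.6],      # inner diameter ratio (hypothetical)
                         [0.05, 0.08, 0.10],   # wall thickness ratio (hypothetical)
                         [2.0, 2.5, 3.0])      # nose length ratio (hypothetical)
best = min(grid, key=lambda x: drag_coefficient(*x))
print(best, drag_coefficient(*best))
```

With only three design variables the grid has 27 points, which is why exhaustive enumeration is feasible here and, as the excerpt notes, only because the number of design variables is so limited.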

1. Shortest-path: dynamic and extensible indicator for geographical search on road networks

Calculating an optimal path between two locations is a familiar problem in transportation systems. Optimal path algorithms have been the subject of extensive research, resulting in multiple approaches for different conditions and restrictions [5, 3, 2]. In a transportation system, finding the route between a source and a destination with minimum distance, time, or cost is one of the key problems; it also arises in a wide range of engineering and scientific settings, both as a stand-alone model and as a subproblem [4]. In this paper, a Fuzzy Logic guided Genetic Algorithm is used to solve the optimal path problem. Usually, a transportation network is represented by a graph in which each node represents a location and each edge represents a path between two locations, under the assumption that path distances are the same at all times. Drivers usually select the shortest path to reach their destination, assuming that the optimal route takes the least time in a real-world scenario. However, if events such as accidents or traffic congestion occur on that route, the overall travel time can become much greater than on a longer route, so users cannot easily determine which of the possible paths is actually best.
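For reference, the static-weight baseline that the excerpt questions can be sketched with a standard Dijkstra shortest-path search on a toy road graph (the paper's Fuzzy Logic guided Genetic Algorithm is not reproduced here; the graph and weights are invented):

```python
# Standard Dijkstra shortest path on a toy road graph with static edge weights,
# i.e. the "distances are the same at all times" assumption discussed above.
import heapq

def dijkstra(graph, source, target):
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            break
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [], target
    while node != source:
        path.append(node)
        node = prev[node]
    return [source] + path[::-1], dist[target]

roads = {"A": [("B", 4), ("C", 2)], "B": [("D", 5)], "C": [("B", 1), ("D", 8)], "D": []}
print(dijkstra(roads, "A", "D"))   # (['A', 'C', 'B', 'D'], 8.0)
```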

Online Full Text

The cognitive walkthrough was developed as an additional tool in usability engineering, to give design teams a chance to evaluate early mockups of designs quickly (Rieman, Franzke & Redmiles, 1995). It does not require a fully functioning prototype, or the involvement of users. Instead, it helps designers to take on a potential user perspective, and therefore to identify some of the problems that might arise in interactions with the system.

Optimal management of adults with pharyngitis – a multi-criteria decision analysis

Methods: We defined optimal patient management using four criteria: 1) reduce symptom duration; 2) prevent infectious complications, local and systemic; 3) minimize antibiotic side effects, minor and anaphylaxis; and 4) achieve prudent use of antibiotics, avoiding both over-use and under-use. In our baseline analysis we assumed that all criteria and sub-criteria were equally important except minimizing anaphylactic side effects, which was judged very strongly more important than minimizing minor side effects. Management strategies included: a) No test, no treatment; b) Perform a rapid strep test and treat if positive; c) Perform a throat culture and treat if positive; d) Perform a rapid strep test and treat if positive; if negative obtain a throat culture and treat if positive; and e) Treat without further tests. We defined four scenarios based on the likelihood of group A streptococcal infection using the Centor score, a well-validated clinical index. Published data were used to estimate the likelihoods of clinical outcomes and the test operating characteristics of the rapid strep test and throat culture for identifying group A streptococcal infections.