this methodology. While the functional decompositions in the comparison varied in terminology and structure, the translation to use parameters was consistent overall. This supports the conclusion that this object-oriented approach to defining use parameters helps remove practitioner variability. The exercise of comparing functional decomposition iterations to determine their effect on the use parameter definition was important because the use and scaling parameters and use scenario inputs are the main inputs and driving factors of the Cumulative Damage Function. The methodology developed in this study to systematically define all use parameters integrates seamlessly into the object-oriented framework, which is important for ease of use by practitioners. If a methodology were developed that was overly complex or outside the realm of knowledge of a typical LCA practitioner or design-for-the-environment team, it would not be practically feasible for widespread acceptance. While there will always be some subjectivity in LCA goal and scope definition due to varying knowledge, limited data availability, and human error, the methodology developed in this study significantly improves upon the lack of guidance that existed previously. The case study helps to demonstrate that the proposed method for defining use and scaling parameters is effective and exhaustive in capturing the complete use phase of the product system. The only concern is that this approach could potentially overestimate impacts through double counting if use parameters overlap or are dependent on one another. This concern is mitigated if all practitioners follow the same methodology steps, so that results remain comparable even if slightly overestimated. Once again, there is a clear advantage to having detailed methodology steps for goal and scope phase definition.
Life Cycle Assessment is defined by ISO 14040 (ISO 2006) as follows: “LCA studies the environmental aspects and potential impacts throughout a product’s lifecycle (i.e. cradle to grave) from raw material acquisition through production, use and disposal. The general categories of environmental impact needing consideration include resource use, human health, and ecological consequences.” One of the important features of LCA is the capability to study the product throughout its entire lifecycle. The ‘cradle to grave’ approach ensures that all the stages of a product’s lifecycle are considered when assessing its environmental impacts. A product’s lifecycle is composed of different unit processes, namely raw material extraction, production of intermediate products, production of the end product, usage of the product by a consumer, and its disposal and/or recycling. Even the transportation across these phases is taken into account in an LCA study (Walter Klöpffer 2014).
As can be seen in Figure 11, the CDF of both of the functions ‘Puncture Can’ and ‘Rotate Can’ is 2, so the CDF of ‘Open Can’ is also 2. This information can then be fed into SimaPro to compute the lifecycle impacts to be integrated into the object-oriented lifecycle assessment framework (Gadre 2016). Consider the functional decomposition of the can opener shown in Figure 10. An FMECA on the lower-level functions of ‘Grip Can Edge’ and ‘Penetrate Lid’ could have been conducted, and the CDFs for these sub-functions could have been calculated. These CDFs would then have been rolled up into their parent functions in the manner described above and used to assess the lifecycle impacts. Instead, however, the FMECA was performed on the higher-level function of ‘Puncture Can’. This was done to illustrate that it is not necessary to have all of the product's detailed design decisions completed in order to execute this methodology, which nicely illustrates the benefits of the object-oriented lifecycle assessment framework. By computing the CDF at a higher-level function, some accuracy is lost, but the approach still helps guide design decisions and allows the LCA model to evolve as the product design evolves.
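As a minimal sketch (not from the source), the roll-up of sub-function CDFs through the functional decomposition could look like the following; the aggregation rule used here (a parent's CDF is the maximum of its children's) is an assumption consistent with the ‘Open Can’ example above, not a rule stated in the text.

```python
# Sketch: rolling sub-function CDFs up a functional decomposition tree.
# Assumption: parent CDF = max of child CDFs, consistent with the
# can-opener example ("both ... is 2, so ... is also 2").

def rollup_cdf(tree, leaf_cdfs):
    """Return the CDF of the tree's root, computing parents from children."""
    def visit(fn):
        children = tree.get(fn, [])
        if not children:               # leaf: CDF comes from the FMECA
            return leaf_cdfs[fn]
        return max(visit(c) for c in children)
    root = next(iter(tree))
    return visit(root)

decomposition = {"Open Can": ["Puncture Can", "Rotate Can"]}
leaf_cdfs = {"Puncture Can": 2, "Rotate Can": 2}
print(rollup_cdf(decomposition, leaf_cdfs))   # -> 2, matching the example
```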
The relationship between BI and UB is also reliable, while the relationship between FC and UB is found to be weaker than desired. Furthermore, all five mean effects are statistically positive at α = 0.01, attesting to their statistical significance. The fail-safe test further asserts the significance of the relationships. We found that between 47 and 167 null-effect reports would need to be hidden in the file drawer for the mean correlations between PE, EE, SI and BI to become non-significant, which seems unlikely. However, the mean effect size of BI-UB has a weak fail-safe value: six reports with null effects would be enough to make the effect non-significant. The mean effect size of FC-UB failed the fail-safe test outright, as the addition of just one null-effect report would make the effect non-significant. In conclusion, we found that the majority of researchers cited UTAUT in their articles to support an argument rather than actually using it. Others that reported the use of UTAUT used it only partially, while only a few reported using the actual theory. This paper contributes to the area of IS/IT adoption and diffusion research by showing the inadequacy and inconsistency in the use and output of a theory.
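The file-drawer test referenced above is commonly Rosenthal's fail-safe N; assuming that is the variant used (the source does not say), a minimal sketch of the computation is below. The z-values are invented placeholders, not the paper's data.

```python
# Hedged sketch of Rosenthal's fail-safe N ("file drawer" test): how many
# unpublished null-effect studies would push the combined one-tailed
# p-value above alpha.

def fail_safe_n(z_values, z_alpha=1.645):
    """z_values: one z per included study; z_alpha: critical z (p = .05)."""
    k = len(z_values)
    z_sum = sum(z_values)
    return (z_sum ** 2) / (z_alpha ** 2) - k

# e.g. five hypothetical studies on one correlation
print(round(fail_safe_n([2.1, 2.8, 1.9, 3.0, 2.4])))   # -> 50
```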
Only for some chemicals is an additional characteristic required, for instance, the isotope (for radioactive releases), the stereo-isomer (for a chemical like cyclohexane) or the valence (for an ion such as chromium). For nanoparticles that are released during any lifecycle stage, additional parameters will be of importance in the impact assessment (whether for fate, exposure or effect modeling). Parameters that most likely influence the toxicity of nanomaterials include the chemical composition, particle size, shape, aspect ratio, crystal structure, surface area, surface chemistry and charge, solubility, as well as adhesive properties. As nanoparticles may also be coated, it is important to determine whether to report the pure material or the composite. In this context, it is also important to know whether nanoparticles change their form (shape, coating, etc.) during their lifecycle, for instance, due to aging and other influences such as weather, mechanical stress/pressure, electromechanical fields or catalysis. As a result, the elementary flows characterizing nanomaterials in the inventory may require that these additional characteristics be described.
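As a hedged illustration of what such an extended elementary flow might record, the sketch below packages the attributes listed above into one data structure; the field names and the example values are invented for this sketch and do not follow any standard LCI schema.

```python
# Sketch: an inventory elementary flow extended with the nanoparticle
# attributes named in the text. Illustrative only, not a standard format.
from dataclasses import dataclass
from typing import Optional

@dataclass
class NanoElementaryFlow:
    substance: str                                  # chemical composition
    amount_kg: float
    compartment: str                                # e.g. "air", "water"
    particle_size_nm: Optional[float] = None
    shape: Optional[str] = None                     # e.g. "sphere", "tube"
    aspect_ratio: Optional[float] = None
    crystal_structure: Optional[str] = None
    surface_area_m2_per_g: Optional[float] = None
    surface_charge_mv: Optional[float] = None
    solubility_mg_per_l: Optional[float] = None
    coating: Optional[str] = None                   # pure vs. composite

flow = NanoElementaryFlow("TiO2", 0.004, "air",
                          particle_size_nm=25, shape="sphere", coating="SiO2")
print(flow)
```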
Our approach is to design a hybrid method that combines known-packer detection and removal with a heuristic virus scanning engine to accelerate virus scanning. As mentioned earlier, most obfuscation techniques used by malware authors come from known packers. The dynamic heuristic scanner is capable of unpacking obfuscated executables in memory by executing the instance code in virtual memory. Known-packer removal accelerates the scanning process by detecting and removing any known packer starting from the common entry point, revealing the real intention of the malicious code instead of consuming computing time and performance to emulate and decrypt garbage instructions. In cases where no known packer is detected, the emulator component is executed in virtual memory. This approach is based on the belief that no matter how complex the obfuscation algorithm is, the binary will eventually be decrypted in memory. The static heuristic scanner is devised based on an analysis that compares the file format and an instance code fragment to a virus “pattern,” where “pattern” refers to the hexadecimal string in a virus signature. Our malicious behavior database is designed as a sequence of one or more segments separated by gaps. Each time the scanning engine scans a malware instance file, the overall program structure, computer instructions, programming logic and some other attributes are scrutinized.
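A minimal sketch of the gap-separated segment matching described above is shown below; the hex segments are invented placeholders, not real malware signatures, and the matching rule (segments must occur in order with arbitrary gaps) is our reading of the text.

```python
# Sketch: signature matching where a signature is an ordered sequence of
# hex byte-string segments separated by arbitrary gaps.

def matches(signature, data):
    """signature: list of bytes segments; data: raw bytes of scanned file."""
    pos = 0
    for segment in signature:
        idx = data.find(segment, pos)
        if idx < 0:
            return False                  # segment missing or out of order
        pos = idx + len(segment)          # next segment must start after it
    return True

sig = [bytes.fromhex("deadbeef"), bytes.fromhex("cafebabe")]  # placeholders
print(matches(sig, b"\x00\xde\xad\xbe\xef\x00\x00\xca\xfe\xba\xbe"))  # True
```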
Object-Oriented Analysis and Design (OOAD) techniques rose to prominence in the 1990s. OOAD techniques utilize the same basic principles as OOP languages such as Java, C++, and other programming languages. The object is the central concept of all OO approaches: “An abstraction of the real world based on objects and their interactions with other objects”. An OO approach represents, in a stand-alone package, the processes and data related to a real-world object. An object can communicate with other objects via messages. There are three processes, which are:
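(The three processes themselves are elided in the source.) As a hedged illustration of the message-passing idea just described, the sketch below shows one object sending a message to another; the classes are invented for this example.

```python
# Sketch: objects bundle data with behavior and communicate via messages
# (method calls). Illustrative classes only.

class Account:
    def __init__(self, balance):
        self.balance = balance          # data packaged with behavior

    def deposit(self, amount):          # behavior on the object's own data
        self.balance += amount

class Customer:
    def __init__(self, account):
        self.account = account

    def pay_in(self, amount):
        self.account.deposit(amount)    # one object messages another

customer = Customer(Account(100))
customer.pay_in(50)
print(customer.account.balance)         # 150
```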
Different automated approaches have been proposed to restructure object systems. We cite three: clustering algorithms, algorithms based on meta-heuristics, and those based on Formal Concept Analysis (FCA). The first aim to restructure a system by distributing some of its elements (e.g. classes, methods, attributes) into groups such that the elements of a group are more similar to each other than to the elements of other groups. Restructuring approaches based on meta-heuristic algorithms are generally iterative stochastic algorithms that progress towards a global optimum by evaluating a certain objective function (e.g. characteristics or quality metrics). Finally, the approaches based on FCA provide an algebraic derivation of hierarchies of abstractions from all the entities of a system. One reference presents a general approach for applying FCA in the field of object-oriented software reengineering. In previous work, we added the dimension of exploration using FCA.
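To make the FCA building block concrete, the sketch below naively enumerates the formal concepts (maximal object/attribute rectangles) of a tiny binary context; the toy context is invented, whereas real reengineering contexts would relate, for example, classes to the members they access.

```python
# Sketch: deriving formal concepts from a binary context (naive closure).
from itertools import combinations

context = {                      # object -> set of attributes (invented)
    "ClassA": {"draw", "move"},
    "ClassB": {"draw", "resize"},
    "ClassC": {"draw", "move", "resize"},
}

def common_attrs(objs):
    sets = [context[o] for o in objs]
    return set.intersection(*sets) if sets else set()

def objects_having(attrs):
    return {o for o, a in context.items() if attrs <= a}

# Close every non-empty subset of objects to obtain the concepts.
concepts = set()
objs = list(context)
for r in range(1, len(objs) + 1):
    for subset in combinations(objs, r):
        intent = common_attrs(set(subset))
        extent = objects_having(intent)
        concepts.add((frozenset(extent), frozenset(intent)))

for extent, intent in sorted(concepts, key=lambda c: len(c[0])):
    print(sorted(extent), "share", sorted(intent))
```

The printed concepts form the hierarchy of abstractions the text mentions: smaller extents with richer intents sit below larger, more general ones.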
“It was a great step in science when men became convinced that, in order to understand the nature of things, they must begin by asking, not whether a thing is good or bad, noxious or beneficial, but of what kind it is? And how much is there of it? Quality and Quantity were then first recognized as the primary features to be observed in scientific inquiry”. This quote from the Scottish physicist and mathematician James Clerk Maxwell (1831-1879) highlights the importance of identifying the nature of the entity under consideration when it comes to quality, and also the major role of measurement in any scientific field. As software engineering differentiates itself from other hard sciences such as mathematics and physics, especially in its subjective aspects, several studies and experiments have shown that software metrics, when applied early in the software lifecycle (i.e. the design phase), can considerably help improve and control software quality with respect to specific software properties such as efficiency, complexity, understandability and reuse. Many software quality indicators have been identified and successfully verified as helping to reduce risk, detect faults, and thus manage both time and cost estimation. Some of the most relevant
development (optimisation) purposes. Diagnostic indicators need to be relatively simple to carry out and to report, but should still give a precise assessment of the relation between production practices and the (potential) environmental impact. Indicators such as N and P surplus per hectare, the pesticide ‘Treatment Frequency Index’ or ‘kg active ingredients per ha’, and energy use in ‘MJ per kg product’ are well established in a number of countries. The LCA-type indicators for GWP, eutrophication and acidification per kilogram product are well established, but methodological differences are still too large to allow detailed comparisons between results from different researchers. Land use is part of LCA tools and was interpreted to reflect three competing objectives: (I) biodiversity protection through sustaining and increasing uncultivated areas, e.g. rainforests and bogs; (II) the need for land to secure global food sufficiency (Runge et al., 2003); and (III) bioenergy production (EEA, 2003). The objective of minimising land use per produced unit does not always go hand in hand with the objective of maintaining semi-natural grasslands by grazing, which is a goal in many European countries (EEA, 2004a).
This research will use a logical approach to formalize vertical semantic consistency rules for class diagram refinement. In mathematics, the study of logic deals with statements or propositions. A statement is a sentence that is either true or false, but not both; for example, “It rained yesterday”. Logical investigation conveys clearly the required relationships between facts about the real world and shows where possibly unwarranted assumptions enter into them. Mathematically, logic is a tool for working with complex compound statements: it provides a formal language for expressing them, concise notation for writing them, and a methodology for objectively reasoning about their truth or falsity. Logic is also the basis for stating formal proofs in all branches of mathematics (Shoenfield, 1967).
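As an illustration of the kind of rule to be formalized (the predicate names here are invented for this sketch, not the study's actual notation), a vertical consistency rule might require that every class of the abstract diagram be preserved, with at least its attributes, by some class of the refined diagram:

$$\forall c \in \mathrm{Classes}(D_{\mathrm{abs}})\ \exists c' \in \mathrm{Classes}(D_{\mathrm{ref}}):\ \mathrm{name}(c') = \mathrm{name}(c)\ \wedge\ \mathrm{attrs}(c) \subseteq \mathrm{attrs}(c')$$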
In this paper, we discuss the use of various convex reformulations to solve a days-off scheduling problem with day-task constraints. This problem has been formulated as a 0-1 quadratic program subject to linear constraints. To solve it, we used quadratic convex reformulation based on two techniques: the first is based on the smallest-eigenvalue method, and the second uses semidefinite programming. Numerical examples that assess the effectiveness of the theoretical results, as well as the advantages of this model, are presented in this paper. Several directions can be investigated to try to improve these results.
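A minimal sketch of the smallest-eigenvalue technique is given below: for binary x, x_i² = x_i, so adding α(x_i² − x_i) leaves the objective unchanged on feasible points while making the quadratic form convex for α ≥ −λ_min(Q). The matrix below is a toy instance, not the scheduling data.

```python
# Sketch: smallest-eigenvalue convexification step of quadratic convex
# reformulation (QCR) for min x'Qx + c'x over binary x.
import numpy as np

def convexify(Q, c):
    """Return (Q', c'): Q' positive semidefinite, same objective values
    as (Q, c) on every binary vector (since x_i**2 == x_i)."""
    lam_min = np.linalg.eigvalsh(Q).min()
    alpha = max(0.0, -lam_min)
    return Q + alpha * np.eye(len(c)), c - alpha * np.ones(len(c))

Q = np.array([[0.0, 2.0], [2.0, 0.0]])   # indefinite toy quadratic
c = np.array([1.0, 1.0])
Qc, cc = convexify(Q, c)

x = np.array([1.0, 1.0])                  # a binary point
print(x @ Q @ x + c @ x, x @ Qc @ x + cc @ x)   # identical: 6.0 6.0
```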
A colony of honey bees can extend itself over long distances and in multiple directions simultaneously to exploit a large number of food sources. A colony prospers by deploying its foragers to good fields. As a rule, flower patches with plentiful amounts of nectar or pollen that can be collected with less effort should be visited by more bees, whereas patches with less nectar or pollen should receive fewer bees. In the proposed method, the Bees Algorithm is used to identify the magnitude of the voltage and the angle to be injected by the UPFC at the location identified using the neural network in the preceding section. Normally, the Bees Algorithm consists of three stages, namely:
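(The stage list itself is elided in the source.) Assuming the standard scout-bee / neighbourhood-search / random-search structure of the Bees Algorithm, a minimal sketch of the search over candidate (voltage, angle) injections could look like the following; the fitness function is a placeholder, where a real study would score a power-flow solution for the candidate UPFC injection.

```python
# Sketch of a generic Bees Algorithm loop (standard structure assumed).
import random

def fitness(v, ang):                 # placeholder objective (to maximise)
    return -((v - 0.05) ** 2 + (ang - 30.0) ** 2)

def neighbour(site, radius=(0.01, 2.0)):
    v, ang = site
    return (v + random.uniform(-radius[0], radius[0]),
            ang + random.uniform(-radius[1], radius[1]))

def random_site():
    return (random.uniform(0.0, 0.1), random.uniform(0.0, 60.0))

n_scouts, n_best, n_recruits, n_iters = 20, 5, 10, 50
sites = [random_site() for _ in range(n_scouts)]
for _ in range(n_iters):
    sites.sort(key=lambda s: fitness(*s), reverse=True)
    new_sites = []
    for site in sites[:n_best]:      # neighbourhood search on best sites
        recruits = [neighbour(site) for _ in range(n_recruits)]
        new_sites.append(max(recruits + [site], key=lambda s: fitness(*s)))
    while len(new_sites) < n_scouts: # remaining bees scout at random
        new_sites.append(random_site())
    sites = new_sites

best = max(sites, key=lambda s: fitness(*s))
print("best (voltage p.u., angle deg):", best)
```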
In this paper, an NN model of a PEM fuel cell built in Matlab/Simulink is presented. The model is used to predict the stack voltage from the vehicle speed. The vehicle speed in this study follows the ECE15 driving cycle. The PEM fuel cell model based on an artificial neural network shows good agreement with the reference model. The resulting configuration of the NN model is an input layer with 2 units, a single hidden layer with 24 neurons, and an output layer with 1 unit.
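For readers who want the topology in code, a hedged sketch of the reported 2-24-1 network follows (here in Python/scikit-learn rather than Matlab); the training data are random placeholders, not the ECE15 cycle data.

```python
# Sketch: a 2-input, 24-hidden-neuron, 1-output regressor, mirroring the
# reported topology. Placeholder data only.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.random((200, 2))                      # 2 input features
y = 60.0 - 20.0 * X[:, 0] + 5.0 * X[:, 1]     # placeholder "stack voltage"

model = MLPRegressor(hidden_layer_sizes=(24,), max_iter=2000, random_state=0)
model.fit(X, y)
print(model.predict(X[:3]))                   # predictions for 3 samples
```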
While receiving the CMESS, two-hop neighbors compute their LET with the adjacent node as per Section 3.2.2 and send a reply to the initiator node. The reply message includes the node id and its LET value. The node that initiates the connectivity maintenance phase updates its routing table from the incoming reply messages. From the replies to the CMESS, the node obtains additional paths to the destination and records them in the routing table. Among the available paths, it chooses the path with the longest link expiration time and transmits the real-time data stream along that path.
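A minimal sketch of this path-selection step is below; the routing-table layout is invented for illustration, and taking a path's LET as the minimum of its per-link LETs is an assumption consistent with choosing the longest-lived path.

```python
# Sketch: pick the candidate route whose bottleneck link expiration
# time (LET) is largest.

routes = {                       # path (node ids) -> per-link LETs (seconds)
    ("A", "B", "D"): [12.0, 7.5],
    ("A", "C", "D"): [20.0, 15.0],
}

def path_let(link_lets):
    return min(link_lets)        # a path lives only as long as its weakest link

best_path = max(routes, key=lambda p: path_let(routes[p]))
print(best_path, path_let(routes[best_path]))   # ('A', 'C', 'D') 15.0
```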
the rights of speech. In the case of groupware, the right of speech is exchanged between the participants by means of a «token» policy. At issue here are the various protocols that allow participants to communicate control information by designating a software «token». By and large, a single «token» is used; this method assures exclusive access to the shared information. Yet in certain applications several «tokens» can coexist, as is often the case in virtual games where m users among n can modify the shared context. The strategies for passing the «token» are of two kinds, respectively based on: • Explicit indication. An actor is charged to
The work environment is potentially as important as the external environment in the assessment of impacts on health and well-being, and the health of the workforce is a relevant indicator of sustainability. Efforts to eliminate hazardous materials from the environment through emissions controls can lead to increased exposures for the workers inside the facility generating the emissions [4-6]. For example, regulatory limits on particulate matter, metals, and polyaromatic hydrocarbons emitted from waste-to-energy facilities (also known as incinerators) reduce the overall risk of community health impacts and environmental health impacts. However, the maintenance required to ensure proper operation of the controls increases the likelihood that workers contact these materials during routine maintenance tasks [7,8]. Several workplace exposure assessments have indicated how control technologies designed to reduce environmental emissions worsen the health and safety of workers [5,9]. In addition, efforts to eliminate or reduce hazardous chemicals through substitution of an alternative chemical may introduce new, unknown
The lifecycle model used to evaluate infrastructure sustainability indicators consists of two integrated elements: 1) a lifecycle inventory analysis/impact assessment model covering the material production, construction, use, repair, and demolition stages; and 2) a lifecycle cost model of agency and social costs. This integration is shown in Fig. 2, along with other model components that characterize the infrastructure system, vehicle emissions, and traffic flows. The environmental impact categories evaluated include energy and material resource consumption, air and water pollutant emissions, and solid waste generation. Agency costs consist of material, construction, and end-of-life costs, while social costs comprise pollution damage costs from agency activities together with vehicle congestion, user delay, vehicle crash, and vehicle operating costs. These indicators are evaluated over the total 60-year service life of a bridge with a traffic flow of 35,000 cars per day in each direction.
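As a hedged sketch of how the two cost elements might be aggregated, the snippet below sums the cost categories named above; the dollar amounts are placeholders, and any discounting over the 60-year service life is omitted for brevity.

```python
# Sketch: aggregating the agency and social cost categories from the text.
# All values are illustrative placeholders, not study results.

agency = {"material": 2.1e6, "construction": 3.4e6, "end_of_life": 0.6e6}
social = {"pollution_damage": 0.9e6, "congestion": 1.8e6,
          "user_delay": 1.2e6, "crashes": 0.5e6, "vehicle_operating": 0.7e6}

total_agency = sum(agency.values())
total_social = sum(social.values())
print(f"agency: ${total_agency:,.0f}  social: ${total_social:,.0f}  "
      f"total: ${total_agency + total_social:,.0f}")
```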