Validation and Verification

A model for validation and verification of disk imaging in computer forensic investigation

Forensic computing has typically developed out of a demand for services from the law enforcement community (Noblett, Pollitt and Presley, 2000) and in an ad hoc manner (Etter, 2001) rather than a scientific one. According to Jason and Jill (2007), many forensic computing practitioners working in high-workload, low-resource environments find it difficult to validate and verify their tools while still meeting the demands of the accreditation framework.

Verification and Validation of a Fingerprint Image Registration Software

Based on the investigation of the specification, literature, and code inspection, we concluded that the image registration module is designed consistently with respect to the claimed references. The transformations it offers are linear and they preserve the essential image features for accurate comparison. We realized that the software package provides implementation of the standard ML algorithm as well as the optimized ML∗ algorithm. The ML∗ implementation conformed to the algorithm described in [4]. The construction of the B-spline model as well as the pyramidal approach have a sophisticated theoretical basis presented in [10–13]. Through code inspections we did not find any faults in the implementation. While the absence of software faults may surprise some readers, one needs to have in mind that our team served as an independent verification agent. Our activities were intended to go beyond the verification and validation activities performed earlier by the software development organization.

Rapid, automated test, verification and validation for CubeSats

Drawing on the experience of the ZA-CUBE-1 mission, the bring-up of a small-scale mission assurance facility is reported. The first of a series of actions envisaged for the facility is achieving functional test and verification, and the same under modest temperature cycling. The former is initiated by setting up a test and measurement system comprising the legacy equipment from the previous space project, while a thermal chamber is procured as a first pass at environmental validation. The test system is driven autonomously by a highly pliant software controller, which is the executable tool for the conceived methodology of systems engineering life cycle and mission assurance. Besides automated electrical and temperature measurements, the software has been crafted to accommodate Phase B/C deliverables by way of simulations, virtual prototypes, and emulation of operation scenarios with hardware tools and the mission software, all unified in a single platform. The system is exploited in validating an S-band transmitter while economizing time, and in obtaining valuable insight into transmission performance over thermal loads, which may result in revising the mission requirements and impacting satellite system parameters. The goal of the work is to shorten the iterative mission engineering in top-down and bottom-up cycles through automation and, ultimately, to ensure consistency and traceability in the design flow.
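
As a rough illustration of the kind of automated test sequencing such a controller performs, the following Python sketch steps a thermal chamber through setpoints and checks an S-band transmitter's output power at each point. The ThermalChamber and SBandTransmitter classes and their methods are invented stand-ins for real instrument drivers, not the tool described in the abstract.

```python
# Illustrative sketch only: automated thermal test loop in the spirit of the
# facility described above. The driver classes are hypothetical stubs.
import time

class ThermalChamber:
    """Stand-in for a thermal-chamber driver (hypothetical interface)."""
    def set_temperature(self, celsius: float) -> None:
        print(f"chamber -> {celsius:.1f} C")
    def wait_until_stable(self, timeout_s: float = 600) -> None:
        time.sleep(0.01)  # placeholder for a real soak/stabilisation wait

class SBandTransmitter:
    """Stand-in for the device under test (hypothetical interface)."""
    def measure_output_power_dbm(self) -> float:
        return 30.0  # placeholder reading; a real driver would query the DUT

def run_thermal_profile(setpoints_c, min_power_dbm=29.0):
    """Step through temperature setpoints and record pass/fail per point."""
    chamber, dut = ThermalChamber(), SBandTransmitter()
    results = []
    for t in setpoints_c:
        chamber.set_temperature(t)
        chamber.wait_until_stable()
        power = dut.measure_output_power_dbm()
        results.append({"temp_c": t, "power_dbm": power,
                        "passed": power >= min_power_dbm})
    return results

if __name__ == "__main__":
    for row in run_thermal_profile([-20, 0, 25, 50]):
        print(row)
```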

Expanding the focus of ID check for age verification: factors influencing attendants' behavior in ID validation in compliance with sales legislation of age restricted products

How much time an attendant has to check an ID has an impact on the depth and breadth of the validation process. The results for self-reported behavior confirm that time constraints to some extent affect whether attendants validate IDs or not. This problem is realistic because customers do not expect to spend much time in transactions with attendants to obtain the products they want, and attendants are under pressure, especially during busy hours, to serve customers as quickly as possible. An ID check might not just add to the pressure of time but might be affected by the pressure of limited time as well. However, attendants' ability to display the expected behavior strongly reflects their ability to validate efficiently within the restricted time. The results show that while time constraint is a negative predictor of self-reported behavior, it actually positively predicts vendors' ability to validate IDs. Which features vendors consider when validating IDs within a short time, compared to when they have more time, could not be determined in this study, and this is a limitation of the work. However, the results show that attendants might still be able to validate IDs regardless of the time frame. Most respondents reported spending at most 15 seconds on an ID check. How much of this is dedicated to ID validation could not be determined in this study. It is possible that such distinctions cannot be measured as a split between time allotted to age verification and time given to ID validation. A possible explanation could be that time itself is not considered a problem in the process, or perhaps that attendants only develop the consciousness to validate quickly when confronted with time constraints. It could also be an indication of their perception of personal efficiency in managing time while validating IDs during ID checks.

Pedestrian Flow Simulation Validation and Verification Techniques

For the verification and validation of microscopic simulation models of pedestrian flow, we have performed experiments for different kinds of facilities and sites where most conflicts and congestion happen, e.g. corridors, narrow passages, and crosswalks. To assess the validity of the model, the experimental conditions and simulation results are compared with video recordings carried out under the same conditions as in real life, e.g. pedestrian flux and density distributions. The strategy of this technique is to achieve the level of accuracy required of the simulation model. This method is good at detecting the critical points in the areas where pedestrians walk. For the calibration of suitable models we use the results obtained from analysing the video recordings of Hajj 2009, and these results can be used to check the design of sections of pedestrian facilities and exits. As a practical example, we present the simulation of pilgrim streams on the Jamarat bridge (see Figure 5). The objectives of this study are twofold: first, to show through verification and validation that simulation tools can be used to reproduce realistic scenarios, and second, to gather data for accurate predictions for designers and decision makers.
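
A minimal sketch of the kind of comparison this validation strategy implies: simulated flux per section is checked against flux extracted from video, and a project-specific accuracy threshold decides acceptance. The error metric, the 10 % tolerance and the numbers are assumptions for illustration, not the authors' values.

```python
# Illustrative sketch only: comparing simulated pedestrian flux against values
# extracted from video recordings. All sample numbers are made up.
def relative_error(simulated, observed):
    """Mean relative error over paired measurements (e.g. per corridor section)."""
    return sum(abs(s - o) / o for s, o in zip(simulated, observed)) / len(observed)

# Hypothetical flux measurements (pedestrians per metre per second) per section.
observed_flux  = [1.10, 0.85, 1.32, 0.95]
simulated_flux = [1.05, 0.90, 1.25, 1.02]

err = relative_error(simulated_flux, observed_flux)
print(f"mean relative flux error: {err:.1%}")
# A project-specific accuracy threshold (here 10 %) would decide acceptance.
print("within tolerance" if err <= 0.10 else "model needs recalibration")
```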

Allowing and Storing of Authorized and Unauthorized Database User According to the Policy Verification and Validation of Distributed Firewall Under the Specialized Database

The main objective of this research is to implement the allowing and storing of authorized and unauthorized database users according to the policy verification and validation of a distributed firewall under the specialized database (SDB). In the distributed firewall environment, the first stage keeps track of certain actions (Create, Read, Update, Delete) performed on the policy rule set. The distributed firewall concept is then explained, and a comparison of two firewall designs is presented in terms of their performance in network security. The next stage gives the details of the distributed firewall environment for which the proposed specialized database is designed. Such an application will be very helpful in network security management by preserving the consistency of the overall security policy. The data provided by the application can be used to implement more advanced tools like distributed firewall policy advisor tools (DFPA).
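
As a rough sketch of the first stage described above, the following Python example keeps an audit trail of Create/Read/Update/Delete actions performed on a policy rule set. The class, field names and rule syntax are invented for illustration and are not the paper's actual schema.

```python
# Illustrative sketch only: tracking CRUD actions on a firewall policy rule set.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class PolicyStore:
    rules: dict = field(default_factory=dict)     # rule_id -> rule text
    audit_log: list = field(default_factory=list)

    def _log(self, action, rule_id):
        self.audit_log.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "action": action,          # one of CREATE/READ/UPDATE/DELETE
            "rule_id": rule_id,
        })

    def create(self, rule_id, rule):
        self.rules[rule_id] = rule
        self._log("CREATE", rule_id)

    def read(self, rule_id):
        self._log("READ", rule_id)
        return self.rules.get(rule_id)

    def update(self, rule_id, rule):
        self.rules[rule_id] = rule
        self._log("UPDATE", rule_id)

    def delete(self, rule_id):
        self.rules.pop(rule_id, None)
        self._log("DELETE", rule_id)

store = PolicyStore()
store.create("r1", "allow tcp 10.0.0.0/24 -> any:443")
store.update("r1", "allow tcp 10.0.0.0/24 -> any:8443")
store.delete("r1")
print(store.audit_log)  # the trail a policy-advisor tool could later analyse
```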

Diagnosing verification and validation problems in public civil engineering projects: How "building the right system right" can go wrong

One of the important characteristics of the civil engineering industry is the contractual relation between the client and the contractor (Dubois & Gadde, 2002). This is considered one of the most difficult aspects of civil engineering projects (Farnham & Aslaksen, 2009). The introduction of integrated contracts changed the contractual relationships between client and contractor. In these contracts the client is responsible for developing the requirements and making a reference design in preparation for the tendering process. The contractor is not merely responsible for realization and execution, but also for further developing the requirements and completing the design. Maintenance could also be included as the contractor's responsibility. The design for every project is unique and is part of the project itself (Chang & Ive, 2007). Public clients are obliged to procure the realization of civil projects in a tendering process according to European and national procurement laws. In the tendering process, the contractor is selected and contracted. Because of the increased responsibility of the contractor, the client has to rely more on verification and validation to gain trust in the quality of the contractor's work.

Guide for Verification and Validation in Computational Solid Mechanics

The committee maintains a roster of slightly less than the maximum permitted 30 members, with a few alternate and corresponding members. The membership is diverse with three major groups being industry, Government, and academia. The industry members include representatives from auto and aerospace industries and the Government members are primarily from the Departments of Defense and Energy. Particularly well represented are members from the three national laboratories under the National Nuclear Security Administration. This latter membership group is key to the committee as much of the recent progress in verification & validation has come from these laboratories and their efforts under the Advanced Simulation and Computing (ASC) Program, started in 1995.

The Complexity of Verification and Validation Testing in Component Based Software Engineering

Correctness of the complete system is verified and the system is validated with respect to the requirements. Validation is the process of tracing requirements against the final complete system, i.e. after the components have been coupled together. The designer uses various methods and tools that are able to support this activity. Verification and validation in component-based software development differ from traditional software development in terms of proofing, certifying, functionality and quality of the software. Techniques such as manual and tool-based inspections, tests and live tests can make verification and validation easier and help develop a robust software product [10].

ANALYTICAL METHOD VALIDATION AND CLEANING VERIFICATION OF FELODIPINE BY HPLC METHOD

Cross contamination is a major problem in the manufacture of pharmaceutical formulations in a multi-drug formulation plant. To avoid cross contamination, major importance has to be given to the cleaning of equipment used for manufacturing. The present study was undertaken to validate an analytical procedure for felodipine, to perform cleaning verification studies using a worst-case approach, and to determine the efficiency of the cleaning process. High performance liquid chromatography (HPLC) was chosen as the method for analysis of felodipine and has been validated according to ICH guidelines. In cleaning verification, felodipine is identified as the worst-case drug among the tablets. Swab samples are taken from all equipment and analyzed. The results of cleaning verification are found to be well within the acceptable limits (based on the 10 ppm criterion and the maximum allowable carry-over (MACO) approach). Thus the present study is found suitable for validation of the analytical method and cleaning verification of the felodipine tablet formulation.
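
For context on how such acceptance limits are typically derived, the sketch below computes the 10 ppm criterion and a dose-based MACO in their commonly used cleaning-validation forms. The doses, batch size and safety factor are hypothetical and are not taken from the study.

```python
# Illustrative sketch only: acceptance-limit calculations of the kind referred
# to above. All numeric inputs are hypothetical.

def maco_10ppm(min_batch_size_kg):
    """10 ppm criterion: at most 10 mg of previous product per kg of the next batch."""
    return 10.0 * min_batch_size_kg  # result in mg

def maco_dose_based(tdd_prev_mg, tdd_next_mg, min_batch_size_kg, safety_factor=1000):
    """Dose-based MACO in mg: (TDD_prev x MBS) / (SF x TDD_next)."""
    mbs_mg = min_batch_size_kg * 1e6  # convert batch size from kg to mg
    return tdd_prev_mg * mbs_mg / (safety_factor * tdd_next_mg)

# Hypothetical example: 10 mg daily dose of the previous product, 500 mg daily
# dose of the next product, 100 kg minimum batch size, safety factor 1000.
limit = min(maco_10ppm(100), maco_dose_based(10, 500, 100, 1000))
print(f"acceptance limit (more stringent of both criteria): {limit:.0f} mg")
```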

Formal Verification of Receipt Validation in Chaum’s Scheme

With this in mind, we can apply the processes described in the Verified Software Toolchain to portions of Chaum's scheme to reassure voters that, despite the changes, their votes are processed just the same as before. This is the thrust of this paper: to provide a formal verification of the vote validation process within Chaum's Scheme. Chapter two provides a detailed background on both Chaum's Scheme and the Verified Software Toolchain, including a demonstrative proof of a simpler program, with chapter three providing the formally stated thesis description along with a description of the C language source code of our program. Chapter four describes the difficulties encountered while proving our goal, and chapter five explains some design decisions along with future work that would further the value of the proofs within this thesis.

Independent verification and validation of an industrial simulation model

Discussions on independent verification and validation (IV&V) centre on the assessment of the large scale models that are typically found in the military and public policy domains. There is, however, a significant use of simulation in industry. In this context there does not appear to be any reference to the idea of IV&V. A key issue is that models in the industrial context are generally smaller than their counterparts for the military and public policy domains. But this does not necessarily mean that the decisions being taken with these models are of an insufficient scale to warrant an independent review of the confidence that should be placed in the results. Many industrial simulation models involve decisions that run into the millions or even billions of dollars.

Validation and Verification of Software Design using Finite State Process

Lakos and Malhotra (2002) have argued that it is possible to refine textual descriptions into programs through a sequence of developments led by a process of specification validation. The motivation for performing this action during the development process is primarily to ensure that the eventual outcome of the implementation satisfies the end users' needs and expectations. Thus, validation becomes an early focus of the software effort (Lakos and Malhotra, 2002: p57). They emphasise that object-oriented methodologies help to bridge the gap between specification and implementation. Object-oriented design and programming methods assist the bridging process due to a clean mapping between problem space and solution space.
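
The paper works in Finite State Process (FSP) notation; as a language-neutral illustration of the underlying idea, namely checking that an implementation trace is allowed by a finite-state specification, here is a small Python sketch. The states, events and example traces are invented for demonstration and are not the paper's model.

```python
# Illustrative sketch only: validating event traces against a small
# finite-state specification, as a stand-in for FSP-style checking.
SPEC = {                      # state -> {event: next_state}
    "idle":    {"request": "pending"},
    "pending": {"approve": "done", "reject": "idle"},
    "done":    {},
}

def trace_conforms(trace, start="idle"):
    """Return True if every event in the trace is allowed by the specification."""
    state = start
    for event in trace:
        if event not in SPEC[state]:
            return False      # the implementation performed a forbidden event
        state = SPEC[state][event]
    return True

print(trace_conforms(["request", "approve"]))             # True
print(trace_conforms(["request", "approve", "approve"]))  # False: 'done' allows no moves
```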

The engineering of generic requirements for failure management

Abstract. We consider the failure detection and management function for engine control systems as an application domain where product line engineering is indicated. The need to develop a generic requirement set - for subsequent system instantiation - is complicated by the addition of the high levels of verification demanded by this safety-critical domain, subject to avionics industry standards. We present our case study experience in this area as a candidate methodology for the engineering, validation and verification of generic requirements using domain engineering and Formal Methods techniques and tools. For a defined class of systems, the case study produces a generic requirement set in UML and an example instantiation in tabular form. Domain analysis and engineering produce a model which is integrated with the formal specification/verification method B by the use of our UML-B profile. The formal verification both of the generic requirement set, and of a simple system instance, is demonstrated using our U2B and ProB tools.

Validation and verification of the OPI 2.0 System

In addition to the eight images created to bracket the range of values of the three parameters (designated HHL for high dispersion, high density, and low brightness, etc.), a middle image was created at the mean parameter values to create a total of nine images. To measure the effectiveness of the software, an image was output with the areas of detected simulated tear film breakup shown in red. For the purposes of this verification procedure, the artificially constructed images created to mimic the visual properties of images captured during an actual clinical session using fluorescein staining videography will be referred to as the “artificial” images. The software analysis output of the image with the areas of detected simulated tear film breakup will be referred to as the “detected” images. There are two types of incorrect detections of breakup area with regard to discrepancies seen between the number of pixels detected in real images and the detected images: false negatives and false positives. A false negative detection occurs when breakup in the real image is not observed by the software analysis in the detected image. A false positive detection occurs when the software analysis detects breakup in the detected image that is not considered breakup in the real image.
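
A minimal sketch of how the per-pixel false-negative and false-positive counts defined above could be computed for one image pair. The tiny 4x4 masks are invented for demonstration and do not come from the OPI 2.0 data.

```python
# Illustrative sketch only: counting false negatives (breakup present in the
# ground-truth mask but missed in the detected mask) and false positives
# (breakup detected where none exists), per the definitions above.
artificial = [  # 1 = simulated tear-film breakup, 0 = no breakup
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
]
detected = [
    [0, 0, 1, 0],   # misses one breakup pixel (false negative)
    [0, 1, 1, 1],
    [0, 1, 1, 0],   # flags one extra pixel (false positive)
    [0, 0, 0, 0],
]

false_neg = sum(a and not d for ra, rd in zip(artificial, detected)
                            for a, d in zip(ra, rd))
false_pos = sum(d and not a for ra, rd in zip(artificial, detected)
                            for a, d in zip(ra, rd))
print(f"false negatives: {false_neg}, false positives: {false_pos}")
```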

Seismic Analysis of the Armenian Nuclear Power Plant (K242)

The Armenian Nuclear Regulatory Authority (ANRA) plays a role in the ANPP upgrade program by independently evaluating and verifying key aspects of the seismic upgrade effort. These include the verification of the procedures and evaluation of floor response spectra (FRS) at the superstructures of the NPP, the independent validation of the geo-technical data at the NPP site and their influence on the interaction of structures and soil during an earthquake, the acceptance of procedures and criteria for the probabilistic seismic hazard analysis for the site, and finally, the overseeing of the plant seismic walk-downs. Along these lines the Nuclear and Radiation Safety Center of ANRA has performed a host of independent studies. Specifically, by incorporating the latest data on (a) soil condition and layering beneath the plant, (b) particularities of the reactor foundation design and its interface with the surrounding soil, and (c) structural details of the reactor and generator buildings, a comprehensive effort has been undertaken to independently establish floor response spectra that, in turn, better reflect the seismic reality at the plant. As mentioned previously, the seismic capacity, and need for upgrade, of the structures and the systems that ensure the safe shutdown of the plant during a seismic event (Review Level Earthquake) will be re-assessed based on the levels of acceleration defined by the generated FRS.

An Ingenious Model To Implement Cmmi Rskm With Small Software Industries: A Survey-Based Analysis

at their inception. In support of the proposed model, a detailed study is conducted to investigate the status of any process improvement model being used by small software industries. For this purpose, a detailed questionnaire is prepared based on the pattern suggested by the experts [12] of CMMI. Some special questions related to the adoption of software testing strategies are also designed. After deliberate analysis of applicability and relevancy, the designed questionnaire is distributed to 6 small-scale software industries (start-ups). The response options are qualitative in nature. For analysis purposes, the questionnaire is divided into six categories. In order to convert qualitative responses into quantitative values, a fuzzy-logic-based mapping is applied. This conversion enabled the authors to perform comparative studies statistically and to validate the proposed model. The study primarily focused on evaluating the testing strategies used by the concerned industries and on developing an efficacious model to assist them during the software testing phase. Based on the study, the authors propose a novel model, RAVV (Risk Analysis with Validation and Verification), to be implemented as an intermediary between ML-1 and ML-3, as illustrated in Fig. 1, to guide these industries in achieving targets with less expense and effort.
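
As an illustration of the kind of qualitative-to-quantitative conversion described, the sketch below maps Likert-style responses to numeric scores and averages them per questionnaire category. The membership values and responses are invented and are not the paper's actual fuzzy mapping.

```python
# Illustrative sketch only: converting qualitative questionnaire responses into
# numeric scores so they can be compared statistically.
FUZZY_SCORE = {
    "never":     0.0,
    "rarely":    0.25,
    "sometimes": 0.5,
    "often":     0.75,
    "always":    1.0,
}

def category_score(responses):
    """Average fuzzy score of the responses in one questionnaire category."""
    return sum(FUZZY_SCORE[r] for r in responses) / len(responses)

# Hypothetical responses of one start-up for a "testing strategy" category.
print(category_score(["sometimes", "often", "rarely", "often"]))  # 0.5625
```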

Research Areas in Cloud Computing

Undoubtedly cloud computing possesses many benefits, but there are still certain problems which need to be faced and handled properly. In particular, the issues of security and the economics of enterprises must be researched before utilizing cloud computing. Issues such as the Green Cloud, Denial of Service, cloud Verification, Validation and Testing, Cloud Security, Data migration, Harvesting Unused Resources and Ad hoc Cloud, Scalability in the Cloud, and Caching and Session State Management in the Cloud must be taken into account. The following sections of this paper will focus on such issues of cloud computing, which must be explored and handled.

Performance Analysis of a Southern Mediterranean Seaport via Discrete-Event Simulation

Modeling & Simulation (M&S) has proved to be a day-to-day highly indispensable tool for complex systems design, management and monitoring. Therefore, the proposed research study aims to develop a simulation model to recreate the complexity of a medium-sized Mediterranean seaport and analyse the performance evolution of such system with particular reference to the ship turnaround time. After the input data analysis, simulation model development, verification and validation, a design of experiments (according to a 24 factorial experimental design) was carried out in order to evaluate how some critical factors (i.e. inter-arrival times, loading/unloading times, number of cars and trucks to be unloaded/loaded) may affect the seaport’s performance. To this end an analysis of variance is performed and an analytical input-output meta-model was created to evaluate the system’s performance.

Project data management

A project may be long-term or short-term and involves other factors such as labour, cost and procedure. A project has a life cycle and is divided into several stages, each with its own characteristics. Based on these characteristics, a project requires systematic management. The "Project Data Management" (PDM) system is based on a unification of the principles of "Project Management" and "Software Engineering Practice", enabling the integration of information monitoring across projects so as to reduce manual work and the time it takes. PDM is an application that assists the Project Manager in managing projects and comprises six Project Management modules, namely Time Management, Risk Management, Issue, Change Request, Verification and Validation, and QMS.
