According to domestic studies on levee stability, construction methods are being developed through patents, and most studies focus on evaluating the stability of levees against piping and slope failure caused by seepage. Kang et al. analyzed and evaluated stability using sensors installed on a levee after simulating seepage through the levee with SEEP/W, a two-dimensional unsteady unsaturated seepage model. Lee et al. assumed that the collapse mechanism of a river levee and the associated hydraulic phenomena are the same in all collapses, established a physical numerical model of levee collapse, and suggested a formula for levee collapse discharge. Park et al. carried out a two-dimensional hydraulic analysis reflecting accurate topography before and after project execution and, based on the result, suggested a hazard evaluation method for levees in order to confirm the effects of a river dredging project, and Lee et al. reviewed domestic and overseas safety evaluation techniques for levees and presented improvement measures for domestic design standards. Kim et al. carried out a real-scale verification experiment of the SPF (Scouring Protection Form) method, a river-bed protection method, calculated the critical velocity of the method, and presented design standards according to the placing method.
constructed by creating fake nodes. The number of fake nodes is adjusted dynamically according to the number of users in the surrounding environment. Although the fake-node approach has various advantages in terms of implementation and has a low computational cost, it suffers from temporal and spatial correlation problems when users submit continuous requests. To solve this problem, Nosouhi proposed a practical hybrid location privacy protection scheme. The proposed method filters out correlated fake location data before submission, so the attacker cannot identify the user's real location. Since privacy protection in the era of big data is more difficult than traditional information protection, Zhang Sun et al. proposed an improved model that combines k-anonymity with l-diversity. The k-member clustering algorithm can be used to transform the anonymity problem into a clustering problem to achieve the improved anonymity model. The improved anonymous model reduces algorithm execution time and information loss, which is especially important for big data. Most existing anonymization methods directly delete trajectories or locations that violate specific constraints, causing a large amount of information to be lost. In response to this problem, Chen et al. proposed a trajectory privacy protection method based on 3D mesh partitioning. This method first divides the trajectory area into a number of spatiotemporal units (represented as 3D units) and then performs location swapping or suppression in each spatiotemporal unit. Compared with other methods, this algorithm effectively preserves trajectory data privacy and improves the availability of the data.
2006) and the Two Fixed Reference Points method (Chang et al., 2007). Most recently, Lin et al. (2010) proposed a density-based microaggregation method that forms groups of records in descending order of their densities and then fine-tunes these groups in reverse order. All the works stated above proposed different microaggregation methods to form the groups, where records within a group are homogeneous and records between groups are heterogeneous, and the sum of squared errors (SSE) is used to measure the information loss. Since the median is used as the measure of location to represent each group, in this paper we propose the sum of absolute deviations from the median (ADM) to measure the information loss, which is always less than the SSE. That is, using ADM as the measure of information loss always yields less information loss than the SSE. Thus the proposed median-based microaggregation method has the following features:
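As a rough illustration of the two loss measures being contrasted (a sketch, not the paper's exact formulation), SSE and ADM for a single group of univariate records can be computed as follows; the sample group values are invented for the example:

```python
from statistics import mean, median

def sse(group):
    # Sum of squared deviations from the group mean (the classical measure).
    m = mean(group)
    return sum((x - m) ** 2 for x in group)

def adm(group):
    # Sum of absolute deviations from the group median (the proposed measure).
    med = median(group)
    return sum(abs(x - med) for x in group)

group = [12.0, 14.0, 15.0, 19.0]
print(sse(group))  # squared deviations from the mean 15.0
print(adm(group))  # absolute deviations from the median 14.5
```

For this group, SSE is 26.0 while ADM is 8.0, matching the paper's observation that the absolute-deviation measure reports a smaller loss than the squared-error measure.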
security and consistency. This paper offers the integration of watermarking and encryption, applied independently, for grey-scale images; hence authentication can be verified even without decryption. For the integrated encryption and watermarking technique, Arnold-transform-based encryption and a discrete cosine transform-based watermarking technique are used. The experimental results show that the designed method does not decrease the quality of the grey-scale images, as measured by the peak signal-to-noise ratio, while integrating these two technologies; moreover, any type of watermarking technique and encryption can be used. This work also compares various algorithms with the proposed method.
The sketches of various slope-surface protection composite structures identified in Japan are given in Figs. 8-10. The shape of the edge of the composite structures used for earth slope protection is shown in Fig. 8. Both the upper and lower edges have vertical portions (flanges) of 30 cm or more, depending on the field conditions. Owing to these two vertical flanges, the edge of the composite structure is fixed on the slope and prevents water from entering just underneath the layer. According to need, however, the structure can be made without any flanges, or the flanges can be made in variable sizes. The cross-sectional view of the composite structure showing the mortar, the mesh and the anchored pin is given in Fig. 9. The length of the anchored pin ranges from 200 to 400 mm, with a diameter of 13 to 16 mm, depending on the slope conditions and the position of the pin. The anchored pin is used to hold the mesh on the surface of the earth slope during construction and to help the composite structure remain on the slope during service. The average thickness of the composite structure used for slope protection is usually 6 to 7 mm; however, it may be a little thicker due to the unevenness of the slope surface. A part of the completed composite structure for slope surface protection is shown in Fig. 10. This figure also shows a portion of the mesh and the technique for applying the mortar on the mesh.
Abstract— Microdata protection in statistical databases has recently become a major societal concern and has been intensively studied in recent years. Statistical Disclosure Control (SDC) is often applied to statistical databases before they are released for public use. Microaggregation for SDC is a family of methods to protect microdata from individual identification. SDC seeks to protect microdata in such a way that they can be published and mined without providing any private information that can be linked to specific individuals. Microaggregation works by partitioning the microdata into groups of at least k records and then replacing the records in each group with the centroid of the group. This paper presents a clustering-based microaggregation method to minimize the information loss. The proposed technique groups similar records together in a systematic way and then anonymizes each group individually with its centroid. The structure of the systematic clustering problem is defined and investigated, and an algorithm for the proposed problem is developed. Experimental results show that our method achieves a reasonable improvement in both information loss and execution time over the most popular heuristic algorithm, Maximum Distance to Average Vector (MDAV).
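The "partition into groups of at least k, then replace each record with its group centroid" step described above can be sketched minimally for univariate data; this is a generic illustration of microaggregation, not the paper's systematic clustering algorithm or MDAV:

```python
def microaggregate(values, k):
    """Univariate microaggregation sketch: sort records, split them into
    groups of at least k, and replace every record with its group centroid."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    out = [0.0] * len(values)
    n = len(values)
    i = 0
    while i < n:
        # The last group absorbs the remainder so every group has >= k records.
        j = n if n - i < 2 * k else i + k
        group = order[i:j]
        centroid = sum(values[g] for g in group) / len(group)
        for g in group:
            out[g] = centroid  # each record is masked by the group centroid
        i = j
    return out

print(microaggregate([1.0, 2.0, 9.0, 10.0, 11.0], 2))
```

With k = 2 the five records collapse into two groups, so any published record is shared by at least two individuals, which is the k-anonymity property microaggregation aims for.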
The proposed method is particularly favorable in comparison with existing RPV-based AIP methods when the applicable DG interconnection standard specifies generally wide OUF thresholds. In addition, islanding can be detected with a smaller injection of reactive power than in most existing RPV-based AIP schemes. Furthermore, the performance of the proposed method does not degrade when multiple inverter-based DG units are equipped with the same method.
In this paper we develop an improved method for protecting three-phase induction motors in industry using an Arduino. Our main goal is to protect the three-phase induction motor from various faults with a single circuit, in minimum time and at low cost, considering factors such as ease of operation, operating time and equipment cost. The designed circuit protects the motor from faults such as over voltage, under voltage, overcurrent, single phasing and thermal overload. The protection system requires minimal operating time because it acts almost instantly: it detects the fault and stops the motor. As the induction motor is a main part of industry, it is important to provide a better protection system, and one is needed because conventional systems require more time to operate. The system we designed is better than a conventional system because it operates in less time; since it detects faults quickly and acts immediately, a fault in the motor does not spread. When a fault occurs, the system detects it, stops the motor and displays the fault on a screen. This feature makes the fault easy to rectify and saves time, and because of the fast operation the fault cannot damage the motor.
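The threshold-based fault detection described above can be sketched as follows. This is an illustrative simulation in Python, not the authors' Arduino firmware, and the limit values are invented placeholders; real settings depend on the motor rating:

```python
# Hypothetical protection limits for illustration only.
LIMITS = {"v_min": 200.0, "v_max": 250.0, "i_max": 10.0, "t_max": 90.0}

def detect_faults(phase_voltages, phase_currents, temperature):
    """Return the list of detected faults; an empty list means no trip."""
    faults = []
    if any(v < 1.0 for v in phase_voltages):
        faults.append("single phasing")       # one phase is effectively dead
    elif any(v < LIMITS["v_min"] for v in phase_voltages):
        faults.append("under voltage")
    if any(v > LIMITS["v_max"] for v in phase_voltages):
        faults.append("over voltage")
    if any(i > LIMITS["i_max"] for i in phase_currents):
        faults.append("overcurrent")
    if temperature > LIMITS["t_max"]:
        faults.append("thermal")
    return faults

# A lost phase with overload and overheating on the remaining windings:
print(detect_faults([230.0, 0.0, 231.0], [5.0, 0.0, 12.0], 95.0))
```

In the described system, a non-empty fault list would cut power to the motor and show the fault name on the display.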
Abstract:- In today’s world, passwords have been the most important method of securing ourselves from fraud and other scams. Even so, some people do not use a password to secure their devices or files, and some use default passwords like “abc$1234” or “admin” because they are easy to remember. The use of a strong password with a minimum of 16 characters is advised. We will see how passwords were used in the past, how they are used currently, what the threats of weak password usage are, how to avoid those risks, and how to implement new password-policy methods.
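A password policy along the lines suggested above (16-character minimum, character variety, rejection of common defaults) can be sketched as a simple checker; the specific rules and the small blocklist here are illustrative assumptions, not a standard:

```python
import string

def check_password(pw, min_len=16):
    """Return a list of policy violations; an empty list means the password passes."""
    problems = []
    if len(pw) < min_len:
        problems.append(f"shorter than {min_len} characters")
    if not any(c.islower() for c in pw):
        problems.append("no lowercase letter")
    if not any(c.isupper() for c in pw):
        problems.append("no uppercase letter")
    if not any(c.isdigit() for c in pw):
        problems.append("no digit")
    if not any(c in string.punctuation for c in pw):
        problems.append("no symbol")
    if pw.lower() in {"admin", "password", "abc$1234"}:  # tiny illustrative blocklist
        problems.append("common default password")
    return problems

print(check_password("abc$1234"))
```

Running the checker on the default password from the text flags it for being too short, lacking an uppercase letter, and appearing on the blocklist.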
) from real values of GDP. Selecting the best parameters and assuming a given value of , time series of physical capital and PR protection expenditure are stored using the LAD and LS methods. Then another stochastic value is selected for , repeated m times for different ’s. Among the m time series calculated with different initial values, the time series with the least errors under LS and LAD are selected. Since there are 5 parameters, for each of which n stochastic values are selected with m random initial values, the estimated time series among 5
In the present scenario, a large amount of information is stored and distributed in digital form. Growth in Internet services and the availability of materials such as videos, images, e-books, e-journals and other content increases the need for copyright protection of all digital data. Data stored in digital form is always at risk in a number of ways; for example, digital data can be copied and republished under another name. Due to advances in computer technology, digital data can also be easily edited and manipulated.
Research methodology: discrete choice experiment and selection of attributes and levels. In order to test the above hypotheses, we need to know consumers’ preferences and WTP for each attribute of Fair Trade information (organic farming, and poverty and child labor) and brand. We therefore used the Discrete Choice Experiment (DCE) method, which involves asking individuals to state their preference over hypothetical alternative scenarios, goods or services (Mangham et al. 2009). DCE is a type of conjoint analysis originally developed for marketing research (Louviere and Woodworth 1983; Louviere et al. 2010). DCE is the preferred method for evaluating added-value products with multiple attributes simultaneously, such as the ones we consider in this study (Aizaki et al. 2015).
Our results and experimental observations present convincing evidence of the positive impact of postconditioning on renal complications after ischemic rhabdomyolysis and hold reasonable promise for possible future application in various clinical situations. Newer studies have drawn attention to concerns about the actual effectiveness of the method in the presence of comorbidities such as diabetes, hypertension and hypercholesterolemia (conditions that often affect patients with arterial occlusive disease), which could mean some limitations of the current model [33,34]. However, our findings on the effectiveness of postconditioning for renal protection, which seemed (at least partially) independent of its effect (or inefficacy) on the muscle IR injuries, suggest the possibility of a successful translation into clinical practice. The easy and practical attainability of the method and its positive impact on systemic hemodynamics during the operations will hopefully lead us to take measures to prevent postoperative kidney dysfunction.
In the year 2000, Agrawal R. et al. presented an additive data perturbation method for building decision-tree classifiers. Every data element is randomized by adding some noise, chosen independently from a known distribution such as a Gaussian distribution. The data miner rebuilds the distribution of the original data from its distorted version. They consider the concrete case of building a decision-tree classifier from training data in which the values of individual records have been perturbed. These perturbed records look very different from the original records, and the distribution of the data values is also very different from the original distribution. Agrawal R. et al. presented a reconstruction method to closely approximate the distribution of the data values and showed that the accuracy of classifiers built using these reconstructed distributions is comparable to the accuracy of classifiers built with the original data values.
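The perturbation step itself is simple to sketch: each value is masked with independent Gaussian noise before release. This snippet shows only that masking step (the distribution-reconstruction algorithm is beyond a few lines); the data and noise scale are invented for the example:

```python
import random

def perturb(values, sigma=5.0, seed=42):
    """Additive data perturbation: mask each record with independent
    Gaussian noise. Only the aggregate distribution is meant to survive;
    individual records become unreliable to an observer."""
    rng = random.Random(seed)  # fixed seed here just to make the demo reproducible
    return [v + rng.gauss(0.0, sigma) for v in values]

ages = [23, 35, 41, 29, 52]
print(perturb(ages))
```

The miner then estimates the original distribution of `ages` from many such noisy values, using knowledge of the noise distribution, and trains the decision tree on that reconstructed distribution.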
According to Yang and Lin (2011), as faster Internet access came into wide use, digital image watermarking became a significant intellectual-property topic in the digital age. They propose a robust image watermarking method that combines Singular Value Decomposition (SVD) and Distributed Discrete Wavelet Transformation (DDWT) in cloud environments.
In the year 2013, Sara Hajian et al. addressed discrimination protection in data mining and presented new techniques applicable to direct or indirect discrimination protection, individually or both at the same time. The system cleans training data sets and outsourced data sets in such a way that direct and indirect discriminatory decision rules are converted into legitimate (non-discriminatory) classification rules. They also present new metrics to evaluate the utility of the approaches and to compare them. The experimental evaluations demonstrate that the techniques are effective at removing direct and indirect discrimination biases in the original data set while preserving data quality.
This work proposed a new method for securing cloud data in a real environment. 128-bit AES encryption is used to provide confidentiality, authenticity and access control. The performance of the proposed approach was then analyzed based on delay; the analysis shows a drastic increase in delay as file size increases. In the proposed system, an SMS alert informs the actual owner of any unauthorized access to a particular file. Each file uploaded to the cloud has a unique file ID, and authorized users use this ID to download and edit their uploaded data. If somebody tries to access another person’s file, an alert SMS is sent to the owner’s mobile number provided at registration time.
This paper presents a new digital protection system to solve the protection challenges in future smart grids, i.e., fast protection and fault isolation in a loop-structured system with limited fault-current magnitude. The new system combines two protection algorithms: a differential protection as the primary algorithm and an overcurrent protection as the backup. It uses real-time Ethernet and digital data-acquisition techniques to overcome the restriction on data transmission over large grids. The current measurements at different locations are time-synchronized by GPS clocks and then transmitted to a central computer via the Ethernet. As opposed to digital relays, which nowadays often contain PMU functionality, this approach time-stamps the instantaneous current values. We built a prototype of the new system on a test bed. The results from simulations and experiments demonstrate that the protection system achieves fast and accurate protection.
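The primary/backup decision logic can be sketched as follows. Differential protection relies on the fact that the time-synchronized currents entering a healthy protected zone should sum to approximately zero; overcurrent serves as the backup criterion. The thresholds are illustrative per-unit values, not from the paper:

```python
def protect(terminal_currents, i_diff_threshold=0.2, i_oc_threshold=2.0):
    """Combined protection sketch: trip on differential current first
    (primary), then on overcurrent magnitude (backup)."""
    # Sum of all time-synchronized currents entering the protected zone;
    # near zero in normal operation, large for an internal fault.
    i_diff = abs(sum(terminal_currents))
    if i_diff > i_diff_threshold:
        return "trip: differential"
    if any(abs(i) > i_oc_threshold for i in terminal_currents):
        return "trip: overcurrent backup"
    return "no trip"

print(protect([1.0, -0.98]))  # through-load: currents balance, no trip
print(protect([1.0, 0.9]))    # internal fault fed from both ends: primary trips
```

The GPS time-stamping in the described system is what makes the summation meaningful: only samples taken at the same instant at different locations may be added.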
Digital watermarking is a new information-hiding technology with an effective role in copyright protection. Digital watermarking technology uses digital embedding methods to hide the watermark information in digital products such as images, audio and video. Seen from the field of signal processing, the watermark signal embedded into the carrier is a feeble signal added to a strong background. As long as the intensity of the watermark stays below the contrast threshold of the human visual system (HVS) or the perception threshold of the human auditory system (HAS), the watermark signal will not be perceived by the HVS or HAS.
Sunscreens in the UVA range do not have good photostability, whereas sunscreens in the UVB range offer good photostability. The results for our product show that it lies in the UVB range, thus confirming its UV protection factor. The ingredients used in the gel are easily available, and the evaluation parameters tested showed good results. The present study reveals that UV spectrometry is an acceptable, economical, reproducible and rapid method for the evaluation of herbal sunscreens.