ABSTRACT: In order to solve the problem of product and service configuration in a product service system, this paper proposes a product service configuration method based on TRIZ (Theory of Inventive Problem Solving). First, the product service system problem is analyzed and its root causes are identified. Then, the 40 inventive principles of TRIZ theory, together with the 39 engineering parameters, are applied to solve the configuration problems of products and services, yielding configuration principles for the product modules and service modules.
Abstract: In order to increase the metabolic activity of human hepatocytes and liver cancer cell lines, many approaches have been reported in recent years. The metabolic activity could be increased mainly by cultivating the cells in 3D systems or co-cultures (with other cell lines). However, if the system becomes more complex, it gets more difficult to quantify the number of cells (e.g., on a 3D matrix). Until now, it has been impossible to quantify different cell types individually in 3D co-culture systems. Therefore, we developed a PCR-based method that allows the quantification of HepG2 cells and 3T3-J2 cells separately in a 3D scaffold culture. Moreover, our results show that this method allows better comparability between 2D and 3D cultures in comparison to the often-used approaches based on metabolic activity measurements, such as the conversion of resazurin.
Abstract— In order to assess software reliability, many software reliability growth models (SRGMs) have been proposed over the past 40 years. In principle, the two widely used methods for the parameter estimation of SRGMs are maximum likelihood estimation (MLE) and least squares estimation (LSE). However, these two estimation approaches may impose some restrictions on SRGMs, such as requiring the existence of derivatives of the formulated models or complex calculations. In this paper, we propose a modified genetic algorithm (MGA) to assess software reliability using time-domain software failure data and SPC, based on the inflection S-shaped model, which is a non-homogeneous Poisson process (NHPP) model. Experiments based on real software failure data are performed, and the results show that the proposed genetic algorithm is more effective and faster than traditional algorithms.
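The mean value function of the inflection S-shaped NHPP model, m(t) = a(1 − e^(−bt)) / (1 + βe^(−bt)), can be fitted by a genetic algorithm as sketched below. The model form is standard; the GA operators used here (elitist selection, blend crossover, Gaussian mutation) and all numeric settings are illustrative assumptions, not the paper's modified GA.

```python
import math
import random

def inflection_s_shaped(t, a, b, beta):
    """Mean value function of the inflection S-shaped NHPP model:
    m(t) = a * (1 - exp(-b*t)) / (1 + beta * exp(-b*t))."""
    e = math.exp(-b * t)
    return a * (1 - e) / (1 + beta * e)

def sse(params, data):
    """Sum of squared errors between the model and (time, count) data."""
    a, b, beta = params
    return sum((inflection_s_shaped(t, a, b, beta) - n) ** 2 for t, n in data)

def fit_ga(data, pop_size=40, generations=200, seed=0):
    """Toy genetic algorithm: keep the best quarter as elites,
    blend-crossover random elite pairs, occasionally mutate one gene.
    A sketch only, not the modified GA proposed in the paper."""
    rng = random.Random(seed)
    def rand_ind():
        return [rng.uniform(1, 200), rng.uniform(0.01, 1.0), rng.uniform(0.01, 10)]
    pop = [rand_ind() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: sse(p, data))
        elite = pop[: pop_size // 4]
        children = list(elite)  # elitism: best solutions survive unchanged
        while len(children) < pop_size:
            p1, p2 = rng.sample(elite, 2)
            w = rng.random()
            child = [w * x + (1 - w) * y for x, y in zip(p1, p2)]
            if rng.random() < 0.3:  # Gaussian mutation of one parameter
                i = rng.randrange(3)
                child[i] = max(1e-3, child[i] * rng.gauss(1.0, 0.2))
            children.append(child)
        pop = children
    return min(pop, key=lambda p: sse(p, data))
```

Because derivatives of m(t) are never taken, this avoids the restriction that MLE/LSE place on the model form.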
Abstract: In order to alter and adjust the shape of the membrane, cells harness various mechanisms of curvature generation. Many of these curvature generation mechanisms rely on the interactions between peripheral membrane proteins, integral membrane proteins, and lipids in the bilayer membrane. One of the challenges in modeling these processes is identifying the suitable constitutive relationships that describe the membrane free energy that includes protein distribution and curvature generation capability. Here, we review some of the commonly used continuum elastic membrane models that have been developed for this purpose and discuss their applications. Finally, we address some fundamental challenges that future theoretical methods need to overcome in order to push the boundaries of current model applications.
Abstract: In order to improve picture quality, the region of interest within a video sequence should be handled differently from regions of less interest. Generally, the region of interest is coded with a smaller quantization step-size to give higher picture quality than the remainder of the image. However, an abrupt change in picture quality at the boundary of the region of interest can degrade the overall subjective picture quality. This paper presents a method in which the local picture quality is varied smoothly between the region of interest and the remainder of the image. We first classify the subjective region of interest, which can be determined using motion vectors estimated in the coding process. The region of interest is then coded by decreasing the quantization step-size with a gradual linear change from the other regions within the video sequence.
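The gradual linear change of the quantization step-size can be sketched as a simple ramp over the distance from the ROI boundary. The specific values (QP 24 inside the ROI, QP 36 in the background, an 8-pixel ramp) are illustrative assumptions, not taken from the paper.

```python
def smooth_qp(dist, qp_roi=24, qp_bg=36, ramp=8):
    """Linearly interpolate the quantization parameter across the ROI
    boundary. dist is 0 inside the ROI and grows with distance outside.
    qp_roi, qp_bg and ramp are illustrative, not from the paper."""
    if dist <= 0:
        return qp_roi          # inside the region of interest
    if dist >= ramp:
        return qp_bg           # far background: coarsest quantization
    # linear transition zone between the two step-sizes
    return round(qp_roi + (qp_bg - qp_roi) * dist / ramp)
```

Blocks just outside the ROI thus receive intermediate step-sizes instead of jumping directly to the background value, which avoids a visible quality edge.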
ABSTRACT: In order to eliminate high-density salt and pepper noise in images effectively, this paper proposes a new filtering algorithm. Other similar algorithms must constantly adjust the filtering window for images polluted by different concentrations of noise, whereas the proposed algorithm uses only a fixed, small filtering window and, at the same time, preserves the detail of the image features well. The proposed algorithm first extracts candidate noise points from the contaminated image, and then determines which are real noise points according to the relationship between the gray values of signal points and noise points. The experimental results show that the proposed algorithm achieves satisfactory noise removal, especially for images with high levels of noise pollution, and outperforms other algorithms.
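The two-stage idea (detect noise points first, then filter only those with a fixed small window) can be sketched as follows. This is a minimal sketch assuming 8-bit images where salt and pepper pixels take the extreme values 0 and 255; the paper's actual detection rule is more refined.

```python
def median_filter_sp(img, lo=0, hi=255):
    """Sketch of two-stage salt-and-pepper removal: treat only extreme
    gray values (lo/hi) as candidate noise, and replace each with the
    median of the non-noise neighbors in a fixed 3x3 window.
    img is a list of lists of gray values."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            if img[y][x] not in (lo, hi):
                continue  # signal point: keep untouched, preserving detail
            # collect non-noise neighbors inside the fixed 3x3 window
            neigh = [img[j][i]
                     for j in range(max(0, y - 1), min(h, y + 2))
                     for i in range(max(0, x - 1), min(w, x + 2))
                     if (i, j) != (x, y) and img[j][i] not in (lo, hi)]
            if neigh:
                neigh.sort()
                out[y][x] = neigh[len(neigh) // 2]  # median of clean neighbors
    return out
```

Because signal pixels are never rewritten, edges and fine detail survive even at high noise densities, which a plain median filter over every pixel would blur.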
Abstract: In order to analyze the temperature distribution of a low-temperature radiant floor heating system that uses a condensing wall-hung boiler as the heat source, a heating system is designed for a typical south-facing house in Shanghai. Experiments are carried out to study the effects of the supply water temperature on the thermal comfort of the system, and the supply water temperature that makes occupants feel most comfortable is obtained. The results show that, when the outside temperature is 8–15 °C and the relative humidity is 30–70% RH, the room temperature decreases with height from floor to ceiling. The floor surface temperature is the highest, but its uniformity is very poor. When the heating system reaches the steady state, the air temperature of the room is uniform. When the supply water temperature is 63 °C, the room is relatively comfortable under the above experimental conditions.
ABSTRACT: In order to reduce the adverse effects of the cutting temperature on the cutting tool and the workpiece during the cutting process, it is better to introduce cooling into the cutting to achieve a better cutting effect. In this paper, several common cooling methods (cold air, high-pressure water jet, cold and warm air atomizing jet, and cutting fluid) are used. A series of comparative analyses is performed by simulation, covering chip shape, chip formation, cutting force, temperature, and surface stress under the different cooling cutting conditions. The comparison shows that the cooling effect of cold and warm air is the best.
The participants worked in group sessions in a separate room in the ZBW buildings. The duration of the complete test sessions varied between 30 and 90 minutes. As an incentive, each participant received a €20 voucher for a popular online shop. After a short welcome, the instructor informed the participants that their data would be treated anonymously and strictly confidentially. In order to avoid politeness effects, the instructor explicitly stated that the participants should not give polite answers; instead, honest and open answers were needed as a basis for improvements. The participants were instructed (orally and in written form) to work on the questions in the given order. The instructor was present during the whole study. The questionnaire was handed out and answered in a written paper-and-pencil format. During the test sessions, some sweets were offered. After the completion of the questionnaire, the instructor collected the filled-out materials and conducted a short post-interview with each participant. Subsequently, the participants received the voucher as a reward for participation and again had the opportunity to ask questions or make comments. The instructor requested that they not talk about the study for the next few weeks.
collaborative process. Any violations of capacity limits imply that the collaborative plan breaches the agreed capacity limits set within the supplier network. The automated process seeks to redress the capacity violation. The virtual order bank identifies the plants that contribute to the violation and analyses the situation within the context of the violation. Excess capacity may be re-directed to a plant with spare capacity, or a ‘capacity agreement add on’ may be initiated, where additional capacity may be added [or lowered] within a pre-agreed restricted scope and timescale. This process ensures that collaborators within the networked enterprise remain viable, and negates the contractual arguments over time, order levels and cost that are usual when ‘rush jobs’ are encountered. This process operates autonomously and hence more quickly than is currently possible, as the current approach would require individuals at OEMs to contact many suppliers and negotiate separate contract amendments. This process integration has been shown to be achievable using current IT systems, linked through innovative supporting systems (Fischer et al., 2008).
Remark 3.1: The conclusion of Theorem 3.1 also remains true if we replace the compatibility of X with respect to the order relation and the norm ‖·‖ by the weaker condition of compatibility of every compact chain C in X with respect to the order relation and the norm ‖·‖. The latter condition holds in particular if every partially compact subset of X possesses the compatibility property.
In order to improve a basketball player’s field-goal percentage, this work combines relevant principles of mathematics and mechanics to establish a mathematical model of the basketball’s motion trajectory when a player shoots, using the variational method, the projection method, and other techniques. It studies how the shooting angle influences the field-goal percentage from three aspects, namely the minimum angle with angular deviation, the optimal incident angle, and the minimum angle with speed variation, and for each it obtains the interval in which the field-goal percentage is highest.
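The underlying trajectory model can be sketched with standard projectile mechanics. The code below uses the textbook relation y = x·tanθ − gx²/(2v²cos²θ) and the classic result that the minimum-speed launch angle bisects the vertical and the line of sight to the target; the distance and height values in the usage comment are illustrative, not the paper's data.

```python
import math

def required_speed(angle, d, h, g=9.81):
    """Release speed needed for the ball to pass through a point at
    horizontal distance d and height h above the release point when
    launched at `angle` radians, from y = x*tan(a) - g*x^2/(2*v^2*cos^2(a))."""
    denom = 2 * math.cos(angle) ** 2 * (d * math.tan(angle) - h)
    if denom <= 0:
        return float('inf')  # target unreachable at this launch angle
    return math.sqrt(g * d * d / denom)

def min_speed_angle(d, h):
    """Classic mechanics result: the launch angle requiring the least
    release speed bisects the angle between the vertical and the line
    of sight from the release point to the target."""
    return 0.5 * (math.pi / 2 + math.atan2(h, d))
```

For example, with an assumed shot of d = 4.0 m and h = 0.8 m above the release point, `min_speed_angle` gives roughly 0.88 rad (about 51°), and speeds required at nearby angles are strictly larger.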
What will it take to break the impasse? In order to begin addressing this question, we must first acknowledge that the current state of affairs does not adequately support science, let alone open science. As such, an effective resolution would need to support the advance of science in more efficient and effective ways, while also satisfying the different concerns and priorities of individual stakeholders in an open ecosystem. This challenge seems so great that many have turned to key organizing bodies in the community, such as publishers, academic institutions, funding agencies and professional associations, to provide guidance or to set or enforce standards.
The cell counts per unit time obtained in each treatment with the yeast Candida utilis are presented in Figures 2, 3 and 4. The positive effect of greater quantities of inoculum used in the media for treatments 14, 21, 26, 8 and 30 (in decreasing order of growth) is shown in the growth curves (Figure 2). The cultivation periods of 4 to 8 hours were significant because the numbers of cells increased and reached about 6 × 10⁷ cells mL⁻¹ in treatments 21 and 30. After this rapid growth, the media were stable throughout most of the period between 8 and 20 hours of cultivation, with the exception of media 14 and 21, which had small variations in the rate of increase in cell numbers. This interval is related to the phase in which yeasts produce enzymes essential to cell metabolism and the adaptation of the microorganism to the culture medium. Therefore, productivity at this stage was not significant (Hiss, 2001). During the period of 20 to 36 hours of cultivation, the growth curves of media 14 and 26 were exponential, and the curve for medium 8 had the same behavior as that of medium 20 after 32 hours of growth. The exponential phase for media 30 and 21 was less pronounced. Therefore, the more pronounced growth during the first hours and the longer exponential phase were crucial for the increase in the number of yeast cells. At this stage, the growth velocity was maximal, and the production of the enzymes required for oxidative metabolism was greater, so that there was a greater consumption of carbohydrates and a greater energy input to the system (Trivedi et al., 1986; Lima et al., 2001). The highest average number of cells was observed with treatment 14, which differed at 5% significance by the Tukey test (Tables 4 and 5). This result is explained by the short steady phase and longer exponential phase.
During the latter phase, the growth rate was proportional to the cell concentration, and the specific rate of cell proliferation was constant (Stroppa et al., 2009). Medium 26 contained higher concentrations of peptone and yeast extract and a lower concentration of ammonium nitrate than medium 14. The fact that the rate of cell growth observed in medium 26 was lower than that in medium 14 indicates that the ready availability of nitrogen is more important than the source of carbon or a nitrogen source that requires metabolization to release the nitrogen (peptone and yeast extract).
Modelling and predicting online sales based on publicly available data from the online user community are important. In order to improve the accuracy of existing prediction models, we propose a weighted related-networks approach that accounts for the contributions from related products in the same brand (supporting) and the same category (competing), plus other factors including the claimed past purchases by user members, the number of reviewers, and the relative prices. Three scenarios of this approach are implemented, compared, and tested against a baseline model developed using a traditional approach from the literature. The results show that the categorical relationship generally impacts online sales more significantly.
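The idea of weighting supporting and competing products can be sketched as a linear combination of network signals. The function, its weights, and the sign convention (same-brand products reinforce, same-category products compete) are illustrative assumptions for exposition, not the paper's fitted model.

```python
def predict_sales(own, supporting, competing, w=(0.6, 0.3, -0.2)):
    """Toy weighted related-networks predictor: combine a product's own
    sales signal with the average signals of same-brand (supporting)
    and same-category (competing) products. The weights w are
    illustrative placeholders, not estimated coefficients."""
    avg = lambda xs: sum(xs) / len(xs) if xs else 0.0
    w_own, w_sup, w_comp = w
    # supporting products add to the prediction, competing ones subtract
    return w_own * own + w_sup * avg(supporting) + w_comp * avg(competing)
```

In practice the weights would be estimated from historical sales, and further factors (claimed past purchases, reviewer counts, relative prices) would enter as additional weighted terms.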
In recent times, remotely sensed data has become one of the most important sources of images for producing land use maps. Land use maps can be obtained by classifying remotely sensed images with different methods. The aim of this paper is to use a supervised classification method to produce a land use map and evaluate the classified image against ground truth. Remotely sensed images and ArcGIS were used to achieve the goal of the research. Firstly, Landsat images were collected for the study area. Secondly, preprocessing operations were performed on the collected data, such as compositing the Landsat image bands, pan-sharpening, mosaicking, and clipping the study area from the resulting mosaic. All preprocessing operations were done using ArcGIS software. Thirdly, supervised classification was performed to produce a land use map of the study area. The image was classified into five classes: agricultural lands, built-up areas, mountainous areas, sandy lands, and wetlands. Finally, the classified image was evaluated using random points distributed over the study area. The overall accuracy of the image classification was 86% and the Kappa coefficient was 0.82, which indicates 82% chance-corrected agreement for the classified image.
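The two evaluation metrics reported above can be computed from a confusion matrix as follows. This is a generic sketch of the standard formulas; the example matrix in the test is invented for illustration and is not the paper's accuracy-assessment data.

```python
def accuracy_and_kappa(matrix):
    """Overall accuracy and Cohen's kappa from a square confusion
    matrix (rows = reference classes, columns = classified classes)."""
    n = sum(sum(row) for row in matrix)
    diag = sum(matrix[i][i] for i in range(len(matrix)))
    po = diag / n                               # observed (overall) accuracy
    row_tot = [sum(row) for row in matrix]
    col_tot = [sum(col) for col in zip(*matrix)]
    # expected agreement by chance from the row/column marginals
    pe = sum(r * c for r, c in zip(row_tot, col_tot)) / (n * n)
    return po, (po - pe) / (1 - pe)
```

Kappa discounts the agreement that random labeling would achieve, which is why it is reported alongside the raw overall accuracy.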