The ECAR faculty technology study is conducted in the same manner as the annual ECAR student technology study. These studies rely on respondents recruited from institutions that volunteer to partner with ECAR to conduct technology research in the academic community. ECAR works with an institutional stakeholder (the survey administrator) to secure local approval to participate in the research. Once the Institutional Review Board process is successfully navigated and a sampling plan is submitted, ECAR provides each survey administrator with the survey link for the current year's research project. The survey administrator then uses the survey link to invite participants from that institution to respond to the survey. Data were collected between January 31 and March 14, 2014, and 17,451 faculty from 151 institutional sites responded to the survey (see demographic breakdown in table A). As an incentive to participate, ECAR issued $100 or $200 Amazon.com gift cards to 19 randomly selected faculty respondents who opted into a drawing. In exchange for distributing the ECAR-deployed survey to their faculty, participating colleges and universities received files containing anonymous, unit-level (raw) data of their faculty responses, along with summary tables that compared their faculty's aggregate responses with those of faculty at similar types of institutions. Participating in this survey is free, and any higher education institution can sign up to contribute data to this project by e-mailing email@example.com.
The term ‘co-author’ is used to denote the appearance of multiple writers in one paper, and also reflects collaboration between different institutes, regions, or countries [33,34]. The higher the strength of these co-authorships, the closer the relationship among them. Collaboration between countries was determined from the author descriptions, where the term ‘independent’ was assigned if no collaboration was present. ‘Co-words’ refers to the phenomenon in which two or more keywords occur simultaneously in one article or one field; the number of co-occurrences is called the frequency or strength of the co-words. ‘Cluster analysis’ is a collective term covering a wide variety of techniques for delineating natural groups, or clusters, in data sets. It aims to group a set of objects in such a way that objects in the same group (called a cluster) are more similar, in some sense, to each other than to those in other groups (other clusters). During the process, many algorithms and software packages were used. This study examines the relationships between countries, institutions, authors, and keywords, applying a clustering algorithm in the VOSviewer software to find the core groups among them. Cluster analysis has recently been used in many fields, including machine learning, pattern recognition, and image analysis.
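As an illustration of co-word counting and cluster analysis, the sketch below builds a keyword co-occurrence matrix from per-article keyword lists and delineates groups with agglomerative clustering. The keyword data are hypothetical, and scikit-learn's clustering stands in for VOSviewer's own mapping and clustering method, which this sketch does not reproduce.

```python
# Minimal sketch of co-word counting and cluster analysis.
# The keyword lists are hypothetical, and agglomerative clustering is an
# illustrative stand-in for VOSviewer's method (scikit-learn >= 1.2).
from itertools import combinations
from collections import Counter

import numpy as np
from sklearn.cluster import AgglomerativeClustering

# Each inner list holds the keywords of one article (hypothetical data).
articles = [
    ["bibliometrics", "co-word", "clustering"],
    ["bibliometrics", "VOSviewer", "co-word"],
    ["clustering", "machine learning", "pattern recognition"],
    ["machine learning", "image analysis", "pattern recognition"],
]

# Co-word strength: number of articles in which a keyword pair co-occurs.
pair_counts = Counter()
for kws in articles:
    for a, b in combinations(sorted(set(kws)), 2):
        pair_counts[(a, b)] += 1

terms = sorted({kw for kws in articles for kw in kws})
index = {t: i for i, t in enumerate(terms)}

# Symmetric co-occurrence matrix.
co = np.zeros((len(terms), len(terms)))
for (a, b), n in pair_counts.items():
    co[index[a], index[b]] = co[index[b], index[a]] = n

# Convert co-occurrence strength to a distance and delineate core groups.
distance = co.max() - co
np.fill_diagonal(distance, 0.0)
labels = AgglomerativeClustering(
    n_clusters=2, metric="precomputed", linkage="average"
).fit_predict(distance)

for t in terms:
    print(t, "-> cluster", labels[index[t]])
```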
For both isotopes of the water molecule, abundances of the stable isotopes of water are reported in standard delta notation: the relative difference between the molar ratio of heavy to light isotope in a sample and that in Vienna Standard Mean Ocean Water (VSMOW), expressed in permil (‰). The contributions from evaporated surface water bodies (swb) and groundwater (gw), which are considered to be the only sources contributing to streamflow, are assessed using end-member mixing analysis of the proportional fraction of each end-member (f_swb or f_gw).
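Written out, the delta notation and the two end-member mixing model take the following standard form; the symbols δ_stream, δ_swb, and δ_gw are introduced here for illustration and denote the measured isotopic compositions of streamflow and of the two end-members.

```latex
% Delta notation relative to VSMOW, with R the heavy-to-light molar ratio:
\delta = \left(\frac{R_{\mathrm{sample}}}{R_{\mathrm{VSMOW}}} - 1\right) \times 10^{3}\ \text{\textperthousand}

% Two end-member mixing for streamflow (fractions sum to one):
f_{\mathrm{swb}} + f_{\mathrm{gw}} = 1, \qquad
\delta_{\mathrm{stream}} = f_{\mathrm{swb}}\,\delta_{\mathrm{swb}} + f_{\mathrm{gw}}\,\delta_{\mathrm{gw}}
\quad\Rightarrow\quad
f_{\mathrm{gw}} = \frac{\delta_{\mathrm{swb}} - \delta_{\mathrm{stream}}}{\delta_{\mathrm{swb}} - \delta_{\mathrm{gw}}}
```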
The septic study was primarily based at the Squam Lakes, where we had already conducted an extensive water and nutrient budget study and compiled a complete GIS analysis system that includes septic system locations and specifications. The Squam Lakes (Squam and Little Squam Lake) benefit from the absence of point sources of pollution, such as wastewater treatment facilities, industrial facilities, or large agricultural operations, in close proximity to the shore; however, the role of non-point sources of pollution continues to be an issue. In 2001, Schloss et al. conducted a preliminary survey of septic systems around the Squam Lakes. They found that certain basins were at elevated risk of pollution by septic systems where system age and design, soil characteristics, and slopes were unfavorable to septic waste treatment. This study collected environmental samples from five of the basins found to be at high risk of pollution by shoreside property septic systems, one basin at moderate risk, and one basin at low risk. Shallow water samples were collected using a Van Dorn sampler at 0.5 meters or less. Samples were also collected bracketing two sewage treatment plants and at their direct outflow pipes. Samples were analyzed at the NH DES Water Quality Laboratory for selected anions/cations, boron, and boron isotopes and, using a modified HPLC/MS procedure, for acetaminophen (a common analgesic), caffeine, carbamazepine (an antiepileptic and mood stabilizer), and trimethoprim (an antibiotic). As part of a companion study funded through the NH WRRC, samples were also analyzed for total phosphorus in the UNH Center for Freshwater Biology Analytical Laboratory. The emerging contaminants caffeine and triclosan were analyzed through ELISA procedures using very high sensitivity test kits from Abraxis and will be reported upon elsewhere.
Lake and stream monitoring through the LLMP generally involved a minimum of monthly sampling from spring runoff through lake stratification, and weekly to biweekly sampling from stratification until fall overturn. The default suite of parameters measured for lakes comprised water clarity, chlorophyll a, acid-neutralizing capacity, color, dissolved oxygen, and nutrients (total N, total P, and nitrate), while nutrients, turbidity, color, and flow were the parameters of choice for the lake tributary work. On occasion, student field teams joined the volunteer monitors to perform quality assurance checks and conduct more in-depth analysis and lake profiling.
We facilitate water resources research by providing technical assistance and sample analysis through the Water Quality Analysis Laboratory (WQAL). The WQAL was established by the Department of Natural Resources in 1996 to meet the needs of various research and teaching projects both on and off the UNH campus. It is currently administered by the NH WRRC and housed in James Hall. The mission of the Water Quality Analysis Laboratory is to provide high-quality, reasonably priced analyses in support of research projects conducted by scientists and students from throughout the University, the state, and the nation. Past clients have included numerous research groups on the UNH campus, federal agencies, scientists from other universities, and private firms. Many thousands of analyses are conducted each year. To further encourage and support water resources research near the University campus, we have made a concentrated effort to establish an appropriate infrastructure and background dataset for the Lamprey River Basin. The entire basin is referred to as the Lamprey River Hydrologic Observatory, whose goal is to serve as a platform for studying the biogeochemistry of a developing suburban basin. As part of a cooperative project between the Department of Natural Resources...
According to the suggestions laid out by the International Stem Cell Banking Initiative, there are specific criteria that should be met before banking an iPSC line. Most biobanks have common characterization methods for establishing iPSC lines, which include: (1) observation of embryonic-like morphology; (2) transgene silencing after reprogramming; (3) pluripotency assessment, including an alkaline phosphatase assay or detection of pluripotency and self-renewal markers such as TRA-1-60, TRA-1-81, Nanog, and Oct4; (4) differentiation potential both in vitro (embryoid body formation) and in vivo (teratoma formation); (5) karyotype analysis to detect chromosomal abnormalities; (6) identity confirmation by DNA fingerprinting and short tandem repeat PCR; and (7) microbiological assay to ensure the culture is free of any possible biological contaminants (Table 1). It is important for cell banks to provide useful characterization data and information for either research-grade or clinical-grade iPSCs.
The present study reports the analysis of data gathered through a questionnaire designed for library professionals of the research libraries of Dr. BARC and MARC. The research libraries surveyed ranged widely in terms of the nature of library users served, funding agency, budget, collections, services, infrastructure, databases, automation software, Internet access, security, and training for library professionals and users. Abbreviated forms of the research institutes' names are used in this study to represent their respective research libraries. The questionnaire responses show that the older of the two research libraries is Dr. BARC (established 1991), followed by MARC (2007).
Abstract: The Nashe earth-fill dam was constructed for multipurpose use (hydroelectric power and irrigation). During the construction of the dam, many people living around it were relocated from their residences to other places, including the nearby town. This research focused on dam failure analysis considering the overtopping and piping failure modes. The input data were collected from the Ministry of Water and Energy, Federal Democratic Republic of Ethiopia (FDRE). The dam breach parameters were determined using the Von Thun and Gillette method. In dam break analysis, the first step is model setup using the three dimensions (x, y, and z) of the downstream area. The overtopping model yielded a peak discharge of 8,761.23 m³/s, more than 7.33 times the probable maximum flood, and the piping model yielded a peak discharge of 8,620.85 m³/s, more than 7.21 times the probable maximum flood at the location near the dam. This indicates that the peak outflow developed during the rainy season was greater than the inflow design flood (IDF) used as the upper boundary condition for the breach parameters. It was therefore concluded that higher peak outflow and greater downstream risk develop under the overtopping mode than under the piping mode during a dam breach. The sensitivity analysis showed that the effect of breach time on discharge is more pronounced than that of the water level increase. The dam break modeling results obtained in this study could be used for flood mapping to assist communities in planning future developments in flood-prone areas in advance.
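For reference, a minimal sketch of how Von Thun and Gillette breach parameters are commonly computed is given below. The coefficient values are those commonly tabulated in the dam-breach literature and the inputs are hypothetical; both should be verified against the original 1990 reference and the values actually adopted for the Nashe dam.

```python
# Sketch of Von Thun & Gillette (1990) breach-parameter estimates, using
# coefficient values as commonly tabulated in the dam-breach literature.
# Treat these numbers as assumptions to verify against the original
# reference and the values actually adopted for the Nashe dam study.

def breach_width(h_w: float, storage_m3: float) -> float:
    """Average breach width B = 2.5*h_w + C_b (metres).

    h_w: depth of water above the breach invert at failure (m).
    C_b: offset that grows with reservoir storage volume (m).
    """
    if storage_m3 < 1.23e6:
        c_b = 6.1
    elif storage_m3 < 6.17e6:
        c_b = 18.3
    elif storage_m3 < 1.23e7:
        c_b = 42.7
    else:
        c_b = 54.9
    return 2.5 * h_w + c_b

def failure_time(h_w: float, b: float, erosion_resistant: bool) -> float:
    """Breach formation time t_f in hours."""
    if erosion_resistant:
        return b / (4.0 * h_w)          # erosion-resistant embankments
    return b / (4.0 * h_w + 61.0)       # highly erodible embankments

# Illustrative (hypothetical) inputs, not the actual Nashe dam data:
h_w, storage = 25.0, 4.0e7
b = breach_width(h_w, storage)
print(f"breach width ~{b:.1f} m, "
      f"failure time ~{failure_time(h_w, b, False):.2f} h")
```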
The paper is organized as follows. After preliminaries and notation in Section 2, in Section 3 we develop an analysis of Renegar’s condition number in terms of purely geometric quantities that bound C( A ) from above and below. In Section 4 we present a lower bound on the width of the feasible solution set of (1) in terms of C( A ). In Section 5 we introduce the concept of conic curvature, and we present upper bounds on the width of the feasible solution set using C( A ) and the conic curvature K . Section 6 contains comments about three themes underlying conic linear systems (1): conditioning, geometry, and complexity, and relations between these themes.
The goal of this research is to enable performance improvements in IT portfolio management. Through investigation of software practices at a Fortune 500 company, we were able to demonstrate how simulation analysis can improve the valuation of software applications and overcome difficulties in reducing software maintenance costs. Our analysis used the System Dynamics Modeling (SDM) methodology to develop and simulate quantitative models that linked software dynamics, key actors, and management systems to estimate the costs and benefits of various management approaches. We also utilized complementary research techniques, including onsite interviews with multiple constituencies, to develop a holistic view of the software maintenance/development cycle.
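A minimal stock-and-flow sketch of the kind of quantitative model SDM produces is shown below, with a defect backlog stock driven by a defect-introduction inflow and a maintenance-repair outflow. The structure, parameter names, and values are illustrative assumptions, not the models developed in the study.

```python
# Minimal system-dynamics (stock-and-flow) sketch: a defect backlog stock
# driven by a defect-introduction inflow and a maintenance-repair outflow.
# Structure, parameters, and values are illustrative assumptions, not the
# actual models developed in the study.
import numpy as np

dt = 0.25                 # time step (months)
months = np.arange(0, 36, dt)

backlog = 200.0           # stock: open defects
introduction_rate = 30.0  # inflow: new defects per month
maintainers = 10.0        # staffing level (people)
repairs_per_person = 2.5  # defects closed per person per month

history = []
for _ in months:
    repair_rate = maintainers * repairs_per_person
    # Euler integration of the stock: d(backlog)/dt = inflow - outflow
    backlog += (introduction_rate - repair_rate) * dt
    backlog = max(backlog, 0.0)
    history.append(backlog)

print(f"backlog after 3 years: {history[-1]:.0f} defects")
```

Varying the staffing level or repair productivity in such a model is what lets alternative management approaches be costed against each other.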
I read the transcripts of all focus groups several times in their entirety, as well as the field notes taken during and after each focus group meeting. Based on the transcripts and notes, I used purposeful sampling to select four of the six Traditional Group focus group meetings for detailed analysis, because these groups yielded information-rich discussions important to answering the research questions (Patton, 2002). The participants represented diverse demographics such as sex, race, level of physical functioning, financial means, and country of origin. While such heterogeneity within a small group of participants could yield cases that are quite different and extreme, I selected the groups that appeared to have detailed descriptions that addressed the interview questions and revealed shared patterns (Patton, 2002). In addition, if one participant appeared to dominate a focus group discussion to the exclusion of the opinions of the other participants, or if the participants provided only yes/no responses or answers lacking specifics, purposeful sampling supported the exclusion of that focus group from the analysis. Because of the small number of caregivers able to participate in this study, I chose not to sample the caregiver groups and instead included all caregiver participants. Therefore, the analysis reflected 14 Traditional Group members and 10 caregivers of Memory Loss group members.
In the MCTFR, ancestry was based initially on the ethnicity specified on a birth certificate, on adoption records, or on self-report. Of the 8,405 GWAS samples, 7,599 (90.4%) self-reported as having primarily European ancestry (i.e., ‘White’), 382 (4.5%) as Asian, 83 (1%) as African American (i.e., ‘Black’), and 127 (1.5%) reported mixed ancestry. All other ethnicities had a self-reported frequency of <1%, and approximately 1% were missing a self-report. Because the genetics of complex phenotypes can vary across different ancestral groups (Bamshad, 2005), the primary sample used in our GWAS analysis will comprise individuals of European ancestry. However, to improve on the accuracy of the self-report data and to deal with missing self-report data, we ran an EIGENSTRAT analysis, http://genepath.med.harvard.edu/~reich/Software.htm (Price et al., 2006), extracting the first 10 principal components (PCs) to aid in identifying a cluster of individuals with European ancestry.
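The sketch below illustrates the same idea as the EIGENSTRAT step, principal component analysis of a standardized genotype matrix, using hypothetical array shapes and scikit-learn in place of the EIGENSTRAT software itself.

```python
# Sketch of PCA-based ancestry inference in the spirit of EIGENSTRAT:
# extract the top 10 principal components from a standardized genotype
# matrix and inspect sample clusters. Array shapes and random genotypes
# are illustrative; EIGENSTRAT itself is a separate tool.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_samples, n_snps = 500, 2000
# Genotypes coded as 0/1/2 minor-allele counts (hypothetical data).
genotypes = rng.integers(0, 3, size=(n_samples, n_snps)).astype(float)

# Standardize each SNP: center on its mean and scale by the binomial SD
# sqrt(2p(1-p)), where p is the sample allele frequency.
p = genotypes.mean(axis=0) / 2.0
valid = (p > 0) & (p < 1)                 # drop monomorphic SNPs
X = (genotypes[:, valid] - 2.0 * p[valid]) / np.sqrt(
    2.0 * p[valid] * (1.0 - p[valid])
)

pcs = PCA(n_components=10).fit_transform(X)
print(pcs.shape)  # (500, 10): ten PCs per sample, used to flag ancestry clusters
```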
Anthropometric and Body Composition Measures: Anthropometric measures should be performed by trained personnel in order to minimize intra- and inter-rater errors. In this project, the anthropometric data were recorded on a dedicated form, subdivided into two parts. The first comprises the data obtained from the bioimpedance scale: body weight (kg), percentage of body fat (%), basal metabolism (kcal), visceral fat level (%), percentage of skeletal muscle (%), and body age (years). Bioimpedance is based on the fact that the human body is made up of water and electrically conducting ions; adipose tissue resists electrical conduction, whereas muscle mass, rich in water, is a good conductor of electricity. In the bioimpedance test, a low-intensity alternating electric current is conducted through the body, and the result is calculated from two vectors: resistance and reactance (ABESO 2016). In tetrapolar models, the results are obtained from predictive equations using sex, age, race, weight, and height, estimating fat mass, fat-free mass, and intracellular and extracellular body water (ABESO 2016). It is a practical method that does not depend on the examiner's skill but can be influenced by ambient temperature, physical activity, consumption of food and drink, menopause, and the menstrual cycle. Before performing bioimpedance, the following should be observed: fasting for at least 4 hours, no physical activity for 12 hours, abstention from alcohol for 24 hours, preferably no diuretic use for 7 days, and, for women, performing the analysis between the 7th and 21st day of the menstrual cycle (ABESO 2016).
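The sketch below illustrates the quantities behind a bioimpedance reading: impedance and phase angle follow directly from resistance and reactance, while the fat-free mass line uses the generic height²/resistance form found in many predictive equations, with purely hypothetical coefficients; real devices rely on proprietary, population-specific equations.

```python
# Sketch of the quantities behind a bioimpedance reading. Impedance and
# phase angle follow from resistance (R) and reactance (Xc); the fat-free
# mass (FFM) line uses the generic height^2/R form found in many predictive
# equations, with purely hypothetical coefficients -- real devices use
# proprietary, population-specific equations (cf. ABESO 2016).
import math

def impedance(r_ohm: float, xc_ohm: float) -> float:
    """Whole-body impedance Z = sqrt(R^2 + Xc^2)."""
    return math.hypot(r_ohm, xc_ohm)

def phase_angle_deg(r_ohm: float, xc_ohm: float) -> float:
    """Phase angle = arctan(Xc/R), in degrees."""
    return math.degrees(math.atan2(xc_ohm, r_ohm))

def fat_free_mass_kg(height_cm: float, r_ohm: float, weight_kg: float) -> float:
    """Illustrative FFM estimate: a*H^2/R + b*W + c (hypothetical a, b, c)."""
    a, b, c = 0.5, 0.2, 2.0   # placeholder coefficients, NOT validated
    return a * height_cm**2 / r_ohm + b * weight_kg + c

# Example reading (hypothetical): R = 480 ohm, Xc = 55 ohm at 50 kHz.
print(f"Z = {impedance(480, 55):.0f} ohm, "
      f"phase angle = {phase_angle_deg(480, 55):.1f} deg, "
      f"FFM ~ {fat_free_mass_kg(170, 480, 70):.1f} kg")
```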
Description: This Frost & Sullivan research service, titled Green Data Centers--Emerging Trends and Developments, identifies the key technologies and industry trends that are shaping the move toward green, energy-efficient data centers. The research service provides an analysis of the impact of the economic downturn on data center energy initiatives, an overview of global data center energy consumption trends, and an analysis of the metrics and standards used to measure data center energy efficiency.
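Power usage effectiveness (PUE), the ratio of total facility energy to IT equipment energy, is among the most widely used of these metrics; the sketch below computes it, along with its reciprocal, data center infrastructure efficiency (DCiE), using hypothetical sample figures.

```python
# Power usage effectiveness (PUE), a widely used data-center efficiency
# metric: total facility energy divided by IT equipment energy. Its
# reciprocal (as a percentage) is data center infrastructure efficiency
# (DCiE). The sample figures below are hypothetical.
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    return total_facility_kwh / it_equipment_kwh

total_kwh, it_kwh = 1_800_000, 1_000_000   # one month of use, hypothetical
print(f"PUE  = {pue(total_kwh, it_kwh):.2f}")         # 1.80
print(f"DCiE = {100 / pue(total_kwh, it_kwh):.0f}%")  # ~56%
```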
"Supported by the National Center for Advancing Translational Sciences of the National Institutes of Health under Award Number UL1TR000454. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health." KL2 Scholars should also list KL2 TR000455 and TL1 Trainees should also list TL1 TR000456.
Dr. Munnell was co-founder and first President of the National Academy of Social Insurance and is currently a member of the American Academy of Arts and Sciences, the Institute of Medicine, and the Pension Research Council at Wharton. She is a member of the Board of The Century Foundation, the National Bureau of Economic Research, and the Pension Rights Center, and is also a TIAA-CREF Institute Fellow. In 2007, she was awarded the International INA Prize for Insurance Sciences by the Italian Accademia Nazionale dei Lincei in Rome.
The measure of variation used is the sum of the variances of the variables, perhaps after scaling them so that each has variance one. An analysis that works with the unscaled variables, and hence with the variance-covariance matrix, gives greater weight to variables that have a large variance. The common alternative, scaling the variables so that each has variance equal to one, is equivalent to working with the correlation matrix. With biological measurement data, it is usually desirable to begin by taking logarithms. The standard deviations then measure the logarithm of relative change. Because all variables then measure much the same quantity (i.e., relative variability), and because the standard deviations are typically fairly comparable, scaling to give equal variances is unnecessary.
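The sketch below illustrates the choice just described, comparing the leading principal component computed from the covariance matrix of log-transformed variables with that from the correlation matrix, on synthetic data whose log standard deviations are comparable (so the two analyses nearly agree).

```python
# Sketch of the scaling choices discussed above: the leading principal
# component from the covariance matrix (unscaled variables) versus the
# correlation matrix (each variable scaled to unit variance), after a log
# transform of positive measurements. The data are synthetic illustrations.
import numpy as np

rng = np.random.default_rng(1)
# Synthetic positive "biological" measurements with very different scales
# but comparable relative variability (log SDs).
data = np.column_stack([
    rng.lognormal(mean=4.0, sigma=0.3, size=100),   # e.g. body mass
    rng.lognormal(mean=0.5, sigma=0.3, size=100),   # e.g. a limb ratio
])

logged = np.log(data)            # SDs now measure relative change
centered = logged - logged.mean(axis=0)

cov = np.cov(centered, rowvar=False)          # unscaled: covariance matrix
corr = np.corrcoef(centered, rowvar=False)    # scaled: correlation matrix

for name, mat in [("covariance", cov), ("correlation", corr)]:
    eigvals = np.linalg.eigvalsh(mat)[::-1]   # descending order
    share = eigvals / eigvals.sum()
    print(f"{name}: PC1 explains {share[0]:.1%} of total variation")
```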