Glaciers other than the ice sheets cover approximately 0.5% of the terrestrial land surface (Pfeffer et al., 2014), but their extent is shrinking due to ongoing climate change (Vaughan et al., 2013). Melting of land ice has the potential to significantly contribute to sea level rise and to affect the availability of fresh water for drinking, irrigation, and hydropower (e.g., Huss & Hock, 2018). Assessing the consequences of glacier shrinkage from local to global scales is thus an important task. Global-scale predictions of glacier evolution combine glacier mass balance models with assessments of present-day glacier geometry (e.g., Huss & Hock, 2015; Marzeion et al., 2012). The surfaces of mountain glaciers are frequently covered by supraglacial debris that can amplify or dampen ice melt rates, depending on its thickness (Østrem, 1959). The extent and thickness of supraglacial debris cover is likely a complex function of debris supply rates from ice-surrounding rock walls, its transport by the ice, and ice ablation that exposes englacial debris at the glacier surface (e.g., Kirkbride & Deline, 2013). Because all of these processes can vary with time, the extent and thickness of supraglacial debris cover are not constant but change with time, too. Historical observations from the Mont Blanc massif, for example, indicate significant expansion of supraglacial debris cover over the last 150 years (Deline, 2005). However, similar observations are rare for most ice-covered regions on Earth (e.g., Herreid et al., 2015), and we currently lack the observations needed to develop and test models of how debris-cover extents and thicknesses change with time. Here we propose an approach to automatically map supraglacial debris cover from optical satellite images at a global scale. Our approach makes use of the cloud-computing platform Google Earth Engine (GEE; https://earthengine.google.com/) and exploits the large number of optical satellite images that are currently available.
In this contribution, we present mapping results from Landsat 8 and Sentinel-2 images, with 30 and 10 m spatial resolution, respectively. In principle, our approach allows rapid mapping of changes in the distribution and extent of debris cover for any time period for which suitable satellite imagery is available in GEE, such as the Landsat data sets (Gorelick et al., 2017). The goal of this contribution is twofold. First, we present our new automatic mapping approach and evaluate it by comparison with a recently published data set of
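The classification step behind such an approach can be illustrated with a simple band-ratio rule. This is a minimal sketch under stated assumptions, not the authors' actual algorithm: clean ice is bright in the red band but dark in the shortwave infrared, so a high red/SWIR ratio separates clean ice from debris within a glacier outline. The threshold value and the toy reflectances are assumptions for illustration.

```python
import numpy as np

def classify_debris(red, swir, glacier_mask, ratio_threshold=2.0):
    """Classify glacier pixels as clean ice or debris-covered.

    Clean ice is bright in red and dark in SWIR, giving a high
    red/SWIR ratio; debris resembles the surrounding rock (ratio
    near 1). The threshold of 2.0 is an illustrative assumption.
    """
    ratio = red / np.maximum(swir, 1e-6)   # avoid division by zero
    clean_ice = (ratio > ratio_threshold) & glacier_mask
    debris = glacier_mask & ~clean_ice     # glacier pixels that are not clean ice
    return clean_ice, debris

# Toy 2x2 scene: left column bright ice, right column debris
red  = np.array([[0.80, 0.30], [0.70, 0.25]])
swir = np.array([[0.10, 0.28], [0.12, 0.30]])
mask = np.ones((2, 2), dtype=bool)
ice, deb = classify_debris(red, swir, mask)
```

In a cloud-computing setting such as GEE, the same ratio test would be expressed as per-pixel band math over an image collection, but the decision rule is unchanged.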
Supraglacial debris is initially entrained into lateral and medial moraines in the upper reaches of the glacier. As moraines coalesce with increasing distance from their source, the debris layer becomes more spatially extensive (Anderson, 2000; Kirkbride and Deline, 2013). The thickness of the supraglacial debris layer increases down-glacier and reaches its maximum near the glacier terminus (Anderson, 2000). In areas where supraglacial debris cover extends across the entire glacier surface, spatially variable debris distribution results in differential melting and forms an undulating glacier surface topography (Hambrey et al., 2008; Kirkbride and Deline, 2013). Supraglacial debris thickness varies in space and time as a result of differing spatial extents and temporal rates of debris input, transport and exhumation (Rowan et al., 2015). Ablation rates of debris-covered glaciers are therefore also spatially and temporally variable (Benn et al., 2012; Rounce and McKinney, 2014). Studies that consider the response of debris-covered glaciers to climatic change currently do not account for this variability (e.g. Bolch et al., 2012; Scherler et al., 2011; Shea et al., 2015), which increases the uncertainty in estimates of glacier ablation rates, and thus in subsequent predictions of the response of debris-covered glaciers to climatic change.
To calibrate a method that automatically maps ice cliff area, a sufficiently accurate “truth” dataset is needed. For this study, ice cliff outlines generated from the high-resolution visible and thermal data described in Sect. 2.2.1 were considered to be true. Elevation data described in Sect. 2.2.1 were not explicitly used to digitize from, but were used in a 3-D viewer with draped visible and thermal layers to assess the quality of the generated ice cliff outlines. Area that was clearly ice cliff in the visible and thermal data but not apparent in the elevation data (possibly due to errors in the DEM) was still mapped as ice cliff area. Given the ambiguities described in Sect. 1 regarding “thin” debris cover, ice cliffs were liberally outlined, including, for example, cliffs that were nearly 100 % covered by debris yet had a unique thermal signature relative to the surrounding debris cover, indicating thinner debris. No minimum size was considered; thus, ice cliffs below the resolution of the method input data are penalized in quality assessment metrics if missed. While we made an effort to manually map ice cliffs based on consistent criteria, there is subjective interpretation within this “truth” dataset. We did not quantify the uncertainty associated with this subjectivity.
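Quality-assessment metrics of the kind mentioned above can be computed by comparing an automatically generated cliff mask against the manually digitized "truth" mask pixel by pixel. The sketch below is an illustration of such metrics (precision, recall, F1), not the study's actual evaluation code; the toy masks are invented.

```python
import numpy as np

def cliff_quality(predicted, truth):
    """Pixelwise quality metrics for a binary ice cliff map.

    predicted, truth: boolean arrays of the same shape.
    Returns (precision, recall, f1). Truth cliffs below the input
    resolution count as false negatives, i.e. they penalize recall.
    """
    tp = np.sum(predicted & truth)
    fp = np.sum(predicted & ~truth)
    fn = np.sum(~predicted & truth)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Toy example: one missed cliff pixel, one false alarm
truth = np.array([[1, 1, 0, 0], [0, 1, 0, 0]], dtype=bool)
pred  = np.array([[1, 0, 0, 1], [0, 1, 0, 0]], dtype=bool)
p, r, f = cliff_quality(pred, truth)
```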
The QA contractor, Maximus, raised numerous concerns about the Oracle software and about the Oracle consulting services. They noted that Oregon was the first state to use the framework for both eligibility automation and HIX and raised concerns about how integrated the various components of the Oracle solution were. In November 2011, Maximus conducted an initial risk assessment of the project. The report dated November 3, 2011 stated the following about the use of the Oracle framework: “The Oracle framework is not currently used in other states on similar projects. Oregon is the first State to use the framework for both EA and HIX. The commercial framework presented from Oracle is a number of products that Oracle has purchased over the years. It is unclear as to how integrated these products are currently.” First Data heard consistent comments in interviews: the core system environment took longer to stand up than expected, and the Oracle team sometimes seemed to operate in silos.
climate history because of differences in geometry and hence response times (Jóhannesson et al., 1989). The retreat of Zmuttgletscher is relatively modest compared to that of other large Swiss glaciers (Fig. 4) and it has shown little terminus fluctuation, even during the climatically favourable period in the 1970s and 1980s. Other glaciers with similarly subdued fluctuations are Unteraargletscher and Glacier de Zinal, which are also debris covered in their lower reaches. Unlike smaller and debris-free glaciers, neither Unteraargletscher nor Glacier de Zinal advanced in the 1980s and 1990s (Fig. 4). Aletschgletscher (>80 km²) is too large and thick to react to short periods of positive mass balance and shows a smooth and accelerating long-term retreat trend since the end of the LIA (i.e. −24 m yr⁻¹; GLAMOS, 2018). Conversely, Findelgletscher, a nearby debris-free glacier of similar size, thickness, and elevation range as Zmuttgletscher, showed much stronger fluctuations in length and mass balance. Findelgletscher experienced periods of balanced and even positive mass balances, accompanied by a sharp terminus advance in the 1980s (+41 m yr⁻¹), followed by a strong retreat since 1985 (−44 m yr⁻¹). During these recent periods of strongly negative mass balance, the retreat rate at Zmuttgletscher (−7.2 ± 0.01 m yr⁻¹) was lower than that of all other glaciers in Fig. 4.
High-resolution land cover maps are in high demand for many environmental applications. Yet, the information they provide is uncertain unless the accuracy of these maps is known. Accuracy assessment should therefore be an integral part of land cover map production, as a way of ensuring reliable products. Traditional accuracy metrics based on the confusion matrix, such as Overall Accuracy and Producer’s and User’s accuracies, are useful for understanding the global accuracy of a map, but they provide no insight into the possible nature or source of the errors. The idea behind this work is to complement traditional accuracy metrics with an analysis of the spatial patterns of errors, with the aim of discovering features underlying the errors that can later be employed to improve the traditional accuracy assessment. The designed procedure is applied to the accuracy assessment of the GlobeLand30 global land cover map for the Lombardy Region (Northern Italy) by means of comparison with the DUSAF regional land cover map. Traditional accuracy assessment quantified the classification accuracies of the map; critical errors were identified, and further analyses of their spatial patterns were performed by means of the Moran’s I indicator, complemented by visual exploration of the patterns. This allowed possible sources of errors to be described. Both the software and the analysis strategies are described in detail to facilitate future improvement and replication of the procedure. The results of the exploratory experiments are critically discussed in relation to the benefits that they potentially introduce into the traditional accuracy assessment procedure.
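The confusion-matrix metrics mentioned above are straightforward to compute. The sketch below derives overall, user's, and producer's accuracies from a confusion matrix whose rows are mapped classes and columns are reference classes; the class labels and counts are invented for illustration.

```python
import numpy as np

# Rows: mapped (classified) class; columns: reference class.
# Counts are invented for illustration (forest, cropland, water).
cm = np.array([
    [50,  5,  2],
    [ 3, 40,  4],
    [ 1,  6, 30],
])

total = cm.sum()
overall_accuracy = np.trace(cm) / total            # fraction mapped correctly
users_accuracy = np.diag(cm) / cm.sum(axis=1)      # per mapped class (commission errors)
producers_accuracy = np.diag(cm) / cm.sum(axis=0)  # per reference class (omission errors)
```

User's accuracy answers "if the map says forest, how often is it really forest?", while producer's accuracy answers "how much of the real forest did the map capture?"; neither reveals where on the map the errors cluster, which is exactly the gap the spatial-pattern analysis addresses.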
Abstract. A detailed comparison between the performances of two different approaches to debris flow modelling was carried out. In particular, the results of a mono-phase Bingham model (FLO-2D) and those of a two-phase model (TRENT-2D) obtained from a blind test were compared. As a benchmark test, the catastrophic event of 1 October 2009 that struck Sicily, causing several fatalities and damage, was chosen. The predicted temporal evolution of several parameters of the debris flow (such as flow depth and propagation velocity) was analysed in order to investigate the advantages and disadvantages of the two models in reproducing the global dynamics of the event. A comparison of the models’ results with survey data was carried out, not only for the determination of statistical indicators of prediction accuracy, but also for the application of the Receiver Operator Characteristic (ROC) approach. Provided that the proper rheological parameters and boundary conditions are assigned, both models seem capable of reproducing the inundation areas in a reasonably accurate way. However, the main differences in the application rely on the choice of such rheological parameters. Indeed, within the more user-friendly FLO-2D model the tuning of the parameters must be done empirically, with no evidence of the physics of the phenomena. On the other hand, for TRENT-2D the parameters are physically based and can be estimated from the properties of the solid material, thus producing more reliable results. A second important difference between the two models is that in the first method the debris flow is treated as a homogeneous flow, in which the total mass is kept constant from its initiation in the upper part of the basin to the deposition in a debris fan. In contrast, the second approach is suited to reproduce the erosion and deposition processes, and the displaced mass can be directly related to the rainfall event. Application of
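The ROC approach mentioned above compares predicted and surveyed inundation maps cell by cell. A minimal sketch of one ROC point, assuming a per-cell modelled flow depth and a boolean observed mask (toy values invented for illustration, not the study's data or code):

```python
import numpy as np

def roc_point(predicted_depth, observed, threshold):
    """One ROC point for a predicted inundation map.

    predicted_depth: modelled flow depth per cell (m).
    observed: boolean mask of surveyed inundation.
    A cell counts as predicted-inundated when its depth exceeds the
    threshold. Returns (true_positive_rate, false_positive_rate);
    sweeping the threshold traces the ROC curve.
    """
    pred = predicted_depth > threshold
    tpr = np.sum(pred & observed) / np.sum(observed)
    fpr = np.sum(pred & ~observed) / np.sum(~observed)
    return tpr, fpr

depth = np.array([0.9, 0.4, 0.0, 1.2, 0.1, 0.6])
obs   = np.array([True, True, False, True, False, False])
points = [roc_point(depth, obs, t) for t in (0.05, 0.3, 0.8)]
```

A low threshold predicts inundation nearly everywhere (high hit rate, high false-alarm rate); a high threshold does the opposite, which is how the trade-off between the two models' inundation predictions can be visualised.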
In 1982, R. Wille proposed a new model to represent the formal concepts associated with a context (G, M, I). This model, so-called formal concept analysis, has found several real-world applications in data analysis, such as object-oriented databases, inheritance lattices, mining for association rules, and generating frequent sets. One of the important challenges in data handling is generating or navigating the concept lattice of a binary relation. The theory of rough sets, proposed by Z. Pawlak [6,7], is an extension of set theory for the study of intelligent systems characterized by insufficient and incomplete information. The concepts of lower and upper approximations in rough set theory are an effective way of studying imprecision, vagueness, and uncertainty. Many authors have researched these two fields (see [8-14]). In this paper, we discuss the rough-set properties of the extent sets of a formal context: we define the upper and lower approximations of the extents of a formal context and discuss their properties, study the classification and properties of concept extents by means of the lower (upper) approximation, and investigate the dependence relationships among the extents of formal concepts.
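The lower and upper approximations mentioned above can be made concrete with a small sketch (toy universe and partition chosen for illustration): the lower approximation of a target set unions the equivalence classes fully contained in it, and the upper approximation unions those that merely intersect it.

```python
def approximations(partition, target):
    """Lower and upper rough-set approximations of a target set.

    partition: equivalence classes (blocks) of an indiscernibility
    relation on the universe. The lower approximation is the union of
    blocks fully contained in the target; the upper approximation is
    the union of blocks that intersect it.
    """
    lower, upper = set(), set()
    for block in partition:
        if block <= target:   # block certainly inside the target
            lower |= block
        if block & target:    # block possibly inside the target
            upper |= block
    return lower, upper

blocks = [{1, 2}, {3, 4}, {5, 6}]   # partition of U = {1, ..., 6}
X = {1, 2, 3}
lo, up = approximations(blocks, X)
```

Here X is rough: its lower approximation is {1, 2} and its upper approximation {1, 2, 3, 4}, and the difference (the boundary region) measures the uncertainty in describing X by the partition.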
2 Department of Geography, University of Zürich, Zürich, Switzerland 3 Institute for Cartography, Technische Universität Dresden, Dresden, Germany
ABSTRACT. We investigated area changes in glaciers covering an area of ∼200 km² in the Tista basin, Sikkim, Eastern Indian Himalaya, between ∼1990 and 2010 using Landsat Thematic Mapper (TM) and Indian Remote-sensing Satellite (IRS) images and related the changes to debris cover, supraglacial lakes and moraine-dam lakes. The glaciers lost an area of 3.3 ± 0.8% between 1989/90 and 2010. More detailed analysis revealed an area loss of 2.00 ± 0.82, 2.56 ± 0.61 and 2.28 ± 2.01 km² for the periods 1989–97, 1997–2004/05 and 2004–2009/10, respectively. This indicates an accelerated retreat of glaciers after 1997. On further analysis, we observed (1) the formation and expansion of supraglacial lakes on many debris-covered glaciers and (2) the merging of these lakes over time, leading to the development of large moraine-dam lakes. We also observed that debris-covered glaciers with lakes lose a greater area than debris-covered glaciers without lakes and debris-free glaciers. The climatic data for 24 years (1987–2011), measured at the Gangtok meteorological station (1812 m a.s.l.), showed that the region experienced a 1.0°C rise in the summer minimum temperature and a 2.0°C rise in the winter minimum temperature, indicating hotter summers and warmer winters. There was no significant trend in the total annual precipitation. We find that glacier retreat is caused mainly by a temperature increase and that debris-covered glaciers can retreat at a faster rate than debris-free glaciers, if associated with lakes.
However, this usually requires considerable time and effort, both methodological and conceptual. In the last decade, this task has mainly been carried out by several academic institutions and scientific teams. An accessible and easy-to-use land cover product is, conversely, difficult to produce at the global level: describing (modelling) reality in all its facets and complexity is mostly impossible. Furthermore, large gaps still exist, for example, in our knowledge of the current geographic distribution and spatial pattern of crop performance (Liangzhi et al., 2009). Besides, many regional and national high-resolution databases have been produced, and many of them are available to the scientific community. They are prepared with different methodological approaches; thus, the data are hard to compare directly. On the other hand, the global layers produced suffer either from excessive simplification, or they do not assure the same level of accuracy everywhere and for all classes of land cover types. An assessment of cropland extent using several global databases, for example, reports a difference of more than 20% in the total extent (Fritz et al., 2014).
GLC2000 legend (Global Aggregation):
1. Tree Cover, broadleaved, evergreen (LCCS: >15% tree cover, tree height >3 m)
2. Tree Cover, broadleaved, deciduous, closed
3. Tree Cover, broadleaved, deciduous, open
4. Tree Cover, needle-leaved, evergreen
18. Mosaic: Cropland / Tree
19. Mosaic: Cropland / Other
The 15 arc-second global land cover product GLCNMO2008 (GLCNMO version 2) has been produced with an overall accuracy of 77.9%, assessed using 904 validation points sampled by a stratified random method. The product is available from the ISCGM website [w4]. The MODIS 500 m data of 2008 used for classification, the 2080 globally distributed training polygons in shapefile format, and the reference maps are also published through the data-sharing system developed by the author, CEReS Gaia [w9]. All published data produced by this study are listed in Appendix C. From the experience of this mapping project, the following recommendations were derived for future global land cover mapping.
Support for this effort was provided by the following National Aeronautics and Space Administration (NASA) programs: Making Earth Science Data Records for Use in Research Environment (NNH06ZDA001N-MEaSUREs), Land Cover and Land Use Change (NNH07ZDA001N-LCLUC), NASA ACCESS (NH11ZDA001N-ACCESS) and NASA Indicators (NNH12ZDA001N-INCA). We thank Linda Jonescheit Owen of LPDAAC U.S. Geological Survey (USGS) for supporting our large Landsat data requests, and thank our colleagues Katie Collins, Dr. Fu-Jiang Liu, and Guang-Xiao Zhang for their efforts on interpreting the points. We would also like to thank the four anonymous reviewers whose constructive comments led to a better presentation of our research methods and results. Appendix A
When the GCAM projections indicate that the area of a given FLT is increasing, the additional area can be downscaled on grid cells where the FLT already exists, which is referred to as intensification, or on grid cells where it does not yet exist, referred to as expansion. In the real world, the ratio of intensification vs. expansion varies in space and time. In North America, for example, land giveaways, infrastructure development, and a number of other factors led to a large-scale westward expansion of agricultural activities from 1800 to 1950, then to their intensification until today, with most of the Corn Belt now featuring more than 80 % crop cover (Ramankutty and Foley, 1999b). In the default configuration presented here, the intensification ratio (intens_ratio) is set to 0.8 globally, and is part of the sensitivity analysis (Sect. 3). The code can be modified to define specific ratios for different regions or time periods, however. Note that the ratio is a target, which sometimes cannot be met. In the extreme case where croplands exist in all grid cells of a region/AEZ, for example, expansion is impossible. The code then applies the desired expansion target as intensification instead.
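The allocation rule described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the model's actual code: a region's area increase is split into intensification and expansion shares according to intens_ratio, and when no expansion cells exist the whole increase falls back to intensification. The grid representation and function name are invented for the sketch.

```python
def downscale_increase(cells, increase, intens_ratio=0.8):
    """Split an FLT area increase between intensification and expansion.

    cells: dict mapping cell id -> current FLT area in that cell
    (0 means the FLT is absent, so the cell is an expansion candidate).
    Returns (intensified_area, expanded_area). The ratio is a target:
    if no expansion candidates exist, the whole increase is applied
    as intensification instead, mirroring the fallback in the text.
    """
    expansion_cells = [c for c, area in cells.items() if area == 0]
    if not expansion_cells:
        return increase, 0.0          # expansion impossible: all intensification
    intensified = intens_ratio * increase
    expanded = increase - intensified
    return intensified, expanded

# Toy region: two cells already cropped, one empty
region = {"c1": 10.0, "c2": 5.0, "c3": 0.0}
full   = {"c1": 10.0, "c2": 5.0}   # croplands everywhere: no expansion possible
```

With the default ratio, a 100 km² increase over `region` yields 80 km² of intensification and 20 km² of expansion, while over `full` it is applied entirely as intensification.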
DOI: 10.4236/blr.2019.103025 Beijing Law Review
Given the prospect of similar benefits globally, the growth in space activities makes collisions in outer space increasingly likely; the theory of inevitability might therefore gain significance in the future and play a part in establishing legal rules for space debris removal (SDR). Fear of a catastrophic collision with a functional object in outer space may, however, play out on an entirely different time frame than a tanker catastrophe on the high seas. States may also engage in self-help where another State is in breach of key duties. Under the rules of state responsibility, such an international wrong may give rise to reduced responsibilities for the State of Registry and opens avenues for action in the interest of safety and protection (Hafner, 2002). By analogy, a right to remove space debris in good faith may be acknowledged as a sensible compromise, and a break with long-standing dogma, in order to effectively address space debris as an issue of global concern. Yet if something goes wrong while removing an object bona fide, who will compensate the damage? Key components may be an unequivocal and rational selection matrix identifying which objects are suitable candidates for removal, as well as objective criteria for determining whether objects are non-functional, for instance whether they are without “justifiable authentication of capability to presuppose or resuming the intentional roles or any other roles on that behalf they either may be approved” (Mey, 2012). A negative list may ensure that the legal rights of the State of Registry are not prejudiced, banning inter alia removal procedures that could divulge sensitive information, instead of letting objects disintegrate upon atmospheric re-entry.
ABSTRACT: Entanglement in anthropogenic debris poses a threat to marine wildlife. Although this is recognised as a cause of marine turtle mortality, there remain quantitative knowledge gaps on entanglement rates and population implications. We provide a global summary of this issue in this taxon using a mixed methods approach, including a literature review and expert opinions from conservation scientists and practitioners worldwide. The literature review yielded 23 reports of marine turtle entanglement in anthropogenic debris, which included records for 6 species, in all ocean basins. Our experts reported the occurrence of marine turtles found entangled across all species, life stages and ocean basins, with suggestions of particular vulnerability in pelagic juvenile life stages. Numbers of stranded turtles encountered by our 106 respondents were in the thousands per year, with 5.5% of turtles encountered entangled; 90.6% of these dead. Of our experts questioned, 84% consider that this issue could be causing population level effects in some areas. Lost or discarded fishing materials, known as ‘ghost gear’, contributed to the majority of reported entanglements, with debris from land-based sources in the distinct minority. Surveyed experts rated entanglement a greater threat to marine turtles than oil pollution, climate change and direct exploitation but less of a threat than plastic ingestion and fisheries bycatch. The challenges, research needs and priority actions facing marine turtle entanglement are discussed as pathways to begin to resolve and further understand the issue. Collaboration among stakeholder groups such as strandings networks, the fisheries sector and the scientific community will facilitate the development of mitigating actions.
The scheme of the through-type debris-flow protection structure developed by us comprises the following elements: (1) metal pipe; (2) tyres; (3) metal axis; (4) angled bar connecting the metal pipe and the metal axis; (5) ropes fixing the cylindrical elements of the structure on the river-bed slope; (6) rope clamps; (7) anchors attaching the rope clips to the river-bed slopes; (8) reinforcement attaching the anchors to the river-bed slopes; (9) metal angled bar connecting the cylindrical elements of the structure to each other; (10) inert mass placed in the pipe; (11) slopes of the river bed; (12) reinforced concrete foundation.
Almost all respondents believe that Ghana’s beaches are not clean, and yet they all admit that this gives them cause for concern. This development can be attributed to the fact that people have become desensitized to the litter campaigns that have run in the media for many years, and may believe that littering is not their problem but rather that regulators need to control and respond to it. Again, there is strong evidence that people are more likely to litter in places where litter is already present. People litter more in an unclean environment because their social norms indicate that, as the environment around them is unclean, it is acceptable to litter. This supports my results, as relatively high amounts of debris were collected during the beach survey.
ing of the turbulent fluxes, in particular the sensible heat flux (QS) over debris-free glacier surfaces at high elevations. The stability corrections are based on the bulk Richardson number (specifically, those provided in Braithwaite, 1995) and have been used previously in glacier CMB modeling (e.g., Mölg et al., 2008, 2009; Reid et al., 2012). In the most stable conditions, the turbulent fluxes are fully damped, which results in decoupling of the surface and the atmosphere and excessive radiative cooling. Even in less stable conditions, the damping of modeled turbulent fluxes has been found to be excessive compared with eddy covariance measurements over glaciers (Conway and Cullen, 2013). Congruent with previous modeling studies of glacier surface-energy fluxes, we therefore limit the maximum amount of damping in stable conditions to 30 % (Martin and Lejeune, 1998; Giesen et al., 2009). In addition, we adopt a minimum wind speed of 1 m s⁻¹ to be consistent with neighboring non-glacierized grid cells simulated by the Noah-MP LSM (land surface model; Niu et al., 2011). However, test simulations in early April indicate that the second correction has a minimal impact on wind speeds and turbulent fluxes in glacier grid cells and, thus, may be unnecessary.
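A capped stability correction of the kind described above can be sketched as follows. The (1 − 5 Ri_b)² damping is a common functional form for Richardson-number bulk schemes; reading the 30 % cap as a maximum flux reduction (so the correction factor never drops below 0.7), and leaving unstable conditions uncorrected, are simplifying assumptions of this sketch, not the study's exact formulation.

```python
def stability_factor(rib, max_damping=0.3):
    """Bulk-Richardson stability correction factor for turbulent fluxes.

    Stable case (rib > 0): fluxes are multiplied by (1 - 5*rib)**2,
    which reaches zero (full damping) at rib = 0.2. The damping is
    capped so fluxes are never reduced by more than max_damping,
    preventing the surface-atmosphere decoupling described in the
    text. Unstable/neutral cases (rib <= 0) are left uncorrected
    here for simplicity.
    """
    if rib <= 0:
        return 1.0
    phi = (1.0 - 5.0 * rib) ** 2 if rib < 0.2 else 0.0
    return max(phi, 1.0 - max_damping)

# The text's companion fix: impose a wind-speed floor of 1 m/s
wind_speed = max(0.4, 1.0)

mild = stability_factor(0.01)    # weak stability: slight damping
strong = stability_factor(0.5)   # very stable: damping hits the 30 % cap
```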
It is stated by several authors that debris-covered glaciers respond differently to climate change than bare-ice glaciers (Bolch et al., 2008; Sorg et al., 2012; Scherler et al., 2011). During years of negative mass balance, the position of the terminus region remains stable while the debris-covered parts react by surface lowering. This downwasting behaviour is also reported for Koxkar Glacier (Pieczonka et al., 2013). The significant difference in terminus evolution is related to the presence of debris cover and to the decreasing ice flow velocity caused by the reduction of ice thickness and surface gradient (Benn et al., 2012). The ablation model allows us to compare melt rates of the debris-covered Koxkar Glacier with the ablation that would occur if no debris were present on the glacier surface. Figure 9 shows the direct comparison of melt rates, including a zoomed section of the glacier terminus. It becomes quite clear why debris-covered glaciers respond differently in a particular climate setting. While the melt on the bare-ice glacier reaches values of approximately 8 m in one ablation season, the ablation on the debris-covered glacier almost comes to a standstill. The ice cliffs are the exception and can easily be spotted as melt hotspots, with values of up to 8.5 m of melt on the debris-covered tongue. For the supraglacial lake grounds, slightly lower ablation can be observed compared with the surrounding sub-debris melt. Another important point is the melt gradient: the modelled ablation on the bare-ice tongue exhibits the temperature and elevation dependence of the melt rates. In the case of the debris-covered glacier this effect is not present. Because the curve in Fig. 2 levels out beyond 0.1 m, no significant changes in ablation are observable along the lower tongue profile (Fig. 10). At lower elevations, the increasing debris thickness compensates for the higher temperature.
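The thickness dependence behind this behaviour (the Østrem curve) can be sketched with a simple hyperbolic melt-reduction function. The functional form and the characteristic thickness below are illustrative assumptions, not the curve of the study's Fig. 2; the 8 m bare-ice seasonal melt is taken from the text.

```python
def sub_debris_melt(bare_ice_melt, debris_thickness, h_star=0.05):
    """Østrem-type reduction of ice melt under a debris layer.

    Beyond the thin melt-enhancing regime, melt decays roughly
    hyperbolically with debris thickness:
        melt = bare_ice_melt * h* / (h* + h).
    h_star (here 0.05 m) is an illustrative characteristic thickness.
    The curve flattens for thick debris, which is why ablation along
    a thickly covered tongue shows little further variation.
    """
    return bare_ice_melt * h_star / (h_star + debris_thickness)

bare = 8.0   # m of melt per ablation season on bare ice (from the text)
melts = {h: sub_debris_melt(bare, h) for h in (0.0, 0.1, 0.5, 1.0)}
```

Doubling the debris thickness from 0.5 m to 1.0 m changes the melt far less than the first 0.1 m does, so along the lower tongue the thickening debris can offset the warmer, lower-elevation setting.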