In the past, Cloud Computing Technology (CCT) has been used to integrate the segregated segments of a particular industry using minimum resources. It has given excellent results and has a wide range of applications in industries such as banking, manufacturing, and IT. It makes information visible to all segments of an industry by deploying its service delivery models: Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). Keeping these attributes in mind, CCT is deployed here to minimize the carbon footprint of the entire beef supply chain. The retailer, being a key stakeholder, will maintain a private cloud that maps the entire beef supply chain. The information on the carbon footprint associated with every stakeholder will be available on the cloud and accessible to all of them using basic computing and Internet equipment.
Cloud computing can carry out large-scale data calculation and can tightly integrate logistics industry resources; it therefore plays a strong role in raising the level of informatization and is conducive to the transformation and development of the logistics industry. Because of these advantages, cloud computing technology is widely applied in the informatization of cold chain logistics. From a practical point of view, however, the penetration of cloud computing technology is not yet widespread, and a long period of practice will be needed before its penetration and application are fully realized. Therefore, in order to promote the development of the logistics industry (including cold chain logistics) and further improve the level of industry informatization, it is also necessary to strengthen the penetration, application, and practical study of cloud computing technology in the informatization of the logistics industry.
Side-channel analysis has long been known to be a serious threat to information confidentiality. In a sensor grid system, both the communication between the sensors and the grid and that between the grid and its clients are subject to this threat. Our preliminary research shows that the design of web and cloud applications, which are built upon interactions between a client component and a server component, makes them vulnerable to traffic analysis. Techniques can be developed to infer the internal states of these applications, and the data associated with them, from attributes such as the packet sizes, directions, and counts of the traffic between the two components. In our work we developed a model to analyze side-channel weaknesses in Web applications. For experimental purposes, seven high-profile, top-of-the-line Web applications were tested, and side-channel vulnerabilities were found in all of them. We evaluated the effectiveness and the overhead of applying common mitigation techniques; our research shows that effective mitigation of side-channel leaks has to be application-specific. We proposed the first technique for automatic detection of side-channel leaks in Web applications and offered novel solutions to the challenges associated with the extensive use of AJAX and GUI widgets. We also developed a novel technique for quantifying side-channel leaks in Web applications, which can measure not only the information disclosed from a single tainted source but also that aggregated from multiple sources. Finally, we designed and implemented a preliminary detection and mitigation framework for analyzing Web applications, together with a platform on which Web application developers can specify privacy policies, upon which Web browsers and servers collaborate to enforce such policies.
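To make the attack concrete, the following is a minimal, hypothetical sketch of traffic-analysis inference: an eavesdropper who sees only packet directions and sizes (never payloads) matches an observed trace against fingerprints learned from known application states. All trace data, state names, and sizes here are invented for illustration; they are not from any real application.

```python
# Hypothetical sketch: inferring application state from encrypted-traffic
# side channels. Payloads are opaque; only the attributes named above
# (packet size, direction, count) are observed.

from collections import defaultdict

# Training traces: for each known application state, the observed
# sequence of (direction, packet_size) pairs. 'c2s' = client-to-server,
# 's2c' = server-to-client. Sizes are made up for illustration.
TRAINING = {
    "search_a": [("c2s", 310), ("s2c", 1450), ("s2c", 1450), ("s2c", 620)],
    "search_b": [("c2s", 312), ("s2c", 1450), ("s2c", 980)],
    "login":    [("c2s", 540), ("s2c", 720)],
}

def fingerprint(trace):
    """Reduce a trace to a hashable side-channel fingerprint."""
    return tuple(trace)

def build_classifier(training):
    """Map each fingerprint to the set of states that produce it."""
    table = defaultdict(set)
    for state, trace in training.items():
        table[fingerprint(trace)].add(state)
    return table

def infer_state(table, observed):
    """Return the set of states consistent with the observed trace."""
    return table.get(fingerprint(observed), set())

table = build_classifier(TRAINING)
print(infer_state(table, [("c2s", 540), ("s2c", 720)]))  # {'login'}
```

The sketch also suggests why mitigation is application-specific: padding every packet to a common size collapses distinct fingerprints into one, but the padding scheme that suffices depends on which states the particular application exposes.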
The emergence of cloud computing has brought a new revolution to higher vocational education. Network virtualization based on cloud technology leads to tremendous innovation in the traditional teaching model, and diversified web-based virtual teaching can be achieved through virtualization technology. Moreover, a private cloud in a school can effectively support open teaching and research work, make up for shortages of facilities, and reduce the operational workload of the school. Existing learning systems have been improved by virtualization technology. As an application of cloud computing, a school private cloud needs to deal with huge amounts of data and meet high demands for resource scheduling. In this paper, resource scheduling algorithms for cloud computing are studied; the feasibility of applying the ant colony algorithm to cloud task scheduling is analyzed; a basic strategy for resource scheduling in the cloud environment is proposed; and the implementation of a training cloud system with high availability and load balancing is pursued. This work may provide some useful ideas for further research.
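As a rough illustration of ant colony optimisation applied to cloud task scheduling, the following sketch assigns tasks to virtual machines so as to reduce the makespan. It is a minimal sketch under stated assumptions, not the paper's algorithm: the task lengths, VM speeds, and ACO parameters are invented, and the heuristic (prefer fast, lightly loaded VMs) is one common choice among many.

```python
import random

# Hypothetical sketch of ant colony optimisation (ACO) for cloud task
# scheduling: assign each task to a VM so that the makespan (finish
# time of the busiest VM) is small. All numbers are illustrative.

TASKS = [40, 25, 60, 18, 50, 33]   # task lengths (e.g. million instructions)
VM_SPEED = [10, 8, 12]             # VM processing speeds (e.g. MIPS)

ALPHA, RHO, Q = 1.0, 0.5, 100.0    # pheromone weight, evaporation, deposit

def makespan(assignment):
    load = [0.0] * len(VM_SPEED)
    for task, vm in zip(TASKS, assignment):
        load[vm] += task / VM_SPEED[vm]
    return max(load)

def construct(pheromone, rng):
    """One ant builds an assignment, biased by pheromone and current load."""
    assignment, load = [], [0.0] * len(VM_SPEED)
    for t, length in enumerate(TASKS):
        # Desirability heuristic: prefer fast, lightly loaded VMs.
        weights = [
            (pheromone[t][vm] ** ALPHA) / (1.0 + load[vm] + length / VM_SPEED[vm])
            for vm in range(len(VM_SPEED))
        ]
        vm = rng.choices(range(len(VM_SPEED)), weights=weights)[0]
        assignment.append(vm)
        load[vm] += length / VM_SPEED[vm]
    return assignment

def aco_schedule(n_ants=20, n_iters=50, seed=1):
    rng = random.Random(seed)
    pheromone = [[1.0] * len(VM_SPEED) for _ in TASKS]
    best, best_cost = None, float("inf")
    for _ in range(n_iters):
        for _ in range(n_ants):
            a = construct(pheromone, rng)
            c = makespan(a)
            if c < best_cost:
                best, best_cost = a, c
        # Evaporate, then let the best-so-far assignment deposit pheromone.
        for t in range(len(TASKS)):
            for vm in range(len(VM_SPEED)):
                pheromone[t][vm] *= (1.0 - RHO)
            pheromone[t][best[t]] += Q / best_cost
    return best, best_cost

best, cost = aco_schedule()
print("assignment:", best, "makespan:", round(cost, 2))
```

The theoretical lower bound for this instance is total work over total speed (226/30 ≈ 7.53), so the quality of a found schedule can be judged against that figure.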
Big Data analytics technology is used to store and process very large amounts of data (terabytes and beyond) that are difficult to handle with a traditional Relational Database Management System (RDBMS). Big Data technology is well suited to data having high volume, high velocity, and high variety, which is the need of recent trends. Such data is not a static "stock" in a data warehouse but a continuous flow. "Big data is defined as large amount of data which requires new technologies and architectures so that it becomes possible to extract value from it by capturing and analysis process". Big Data technology helps to collect large data sets in the cloud, and cloud computing technology helps to serve this collected data; virtualization technology, by creating virtual machines on hosts, is used to maximize the utilization of computing resources and to manage memory.
With the development of the Internet of Things (IoT) in recent years, IoT technology and applications are used in more and more industries, including manufacturing, medical, energy and utilities, transportation and distribution, agricultural technology, and smart cities. With the rapid development of the global Internet and e-commerce, IoT technology plays a crucial role in the whole industry chain, from product production to product delivery. At the same time, consumers put forward higher requirements for product timeliness, integrity, and security, which is especially true for special goods and the construction industry. Military supplies, special prescription drugs, fresh food, and chemical products are common goods in demand through IoT technology, and effective transportation is the most important factor in delivering them. Doing so requires recognition technology, vehicle tracking and positioning technology, information transmission technology, and network technology, combined with cloud computing, big data, and artificial intelligence (AI) applications on top of current Internet technology, to meet these requirements and achieve intelligent transportation in the true sense.
Hardware – the provisioning and running of the computing, storage, and network equipment necessary to operate one or more business intelligence (BI) components. In the web-based context of cloud computing, handing this layer out corresponds to an Infrastructure as a Service approach. Here, virtualization brings flexibility regarding both the physical location and the assigned resources, such as CPU power or storage – highly relevant arguments given the volatility of resource consumption in BI. Hardware abstraction is especially interesting for facilitating scalability and portability, and it might give medium-sized enterprises access to hardware power that would otherwise be out of reach for them (e.g. because they cannot afford "DWH appliances" (McKnight 2005)). However, high-end requirements on the data warehouse (DWH) side (latencies, data volume) are often at odds with an Internet-based delivery model. It can therefore be doubted that virtualization relieves users of the cumbersome installation, alteration, and operation tasks for truly demanding ODS/DWH installations.
One of the security benefits provided by cloud computing is centralized data. The benefits of centralization are reduced data leakage and better monitoring. Data leakage is the most talked-about and well-founded concern that enterprises raise about cloud providers. Most enterprises keep their data on tapes and laptops, but these are never truly secure: it is less risky to transfer data through short-lived caches or managed channels than to carry it around on laptops, and not all enterprises use encryption techniques. Accordingly, data can arguably be better secured with cloud computing technology, and it is also easier to control and audit data held in central storage. On the other hand, keeping data in one place is somewhat riskier, since if theft happens, all of the data is exposed. Even so, Balding prefers centralized data: it is better to spend effort securing the central store than to try to track down all the places where companies keep their data (Balding, 2008).
(4) In this paper, cloud-based data mining services are provided at four levels: the bottom layer comprises the basic data mining algorithms; the second layer exposes them as separate data mining services, such as classification and clustering; the third layer provides distributed data mining models, such as parallel classification, aggregation, and machine learning; and the fourth layer builds complete data mining applications on top of the three layers below. On the basis of this design, the authors built a cloud computing-based open service framework for data mining and developed a series of data mining services, such as Weka4WS, Knowledge Grid, Mobile Data Mining Services, and Mining@home. Users can use a graphical interface to define their own data mining workflows and then execute them on the platform.
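The layering described above can be sketched in a few lines of code. This is a hypothetical illustration, not the cited framework: a base algorithm step (layer 1) is wrapped as a standalone clustering service (layer 2), and an application-level workflow (layer 4) composes the service into a user-facing result. The distributed layer 3 is out of scope for such a small sketch.

```python
# Layer 1: a base algorithm step (1-D k-means-style centroid assignment).
def assign_to_centroids(points, centroids):
    return [min(range(len(centroids)), key=lambda i: abs(p - centroids[i]))
            for p in points]

# Layer 2: an independent "clustering service" built on the layer-1 step.
def clustering_service(points, k=2, iters=10):
    centroids = points[:k]
    for _ in range(iters):
        labels = assign_to_centroids(points, centroids)
        for i in range(k):
            members = [p for p, l in zip(points, labels) if l == i]
            if members:
                centroids[i] = sum(members) / len(members)
    return assign_to_centroids(points, centroids)

# Layer 4: an application workflow chaining services into a report.
def workflow(points):
    labels = clustering_service(points)
    return {label: labels.count(label) for label in set(labels)}

print(workflow([1.0, 1.2, 0.8, 9.0, 9.5, 10.1]))  # cluster sizes
```

The point of the layering is that each level only calls the interface of the level below, so a layer-2 service could be swapped for a distributed layer-3 implementation without changing the workflow code.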
From the literature, there is an increasing number of research works relating to HPC and the Cloud, but many of these are skewed in certain directions and do not cover the concepts in their entirety. In writing an article, or in embarking on research in general, a researcher must settle on a technical area of interest. This involves a great deal of study in an attempt to understand the topic, usually entailing searches through conference proceedings, journals, and books. Determining an area of interest may also require extensive searching of digital libraries and attendance at workshops, seminars, and conferences. Through the process of conducting research, as well as the long hours spent reviewing other people's work, researchers often stumble onto new and unanticipated research ideas. Many researchers also become interested in a specific observed phenomenon, which then serves as the impetus for a great amount of research across fields of study. From the foregoing, it is clear that the process of determining a research topic can be cumbersome, which necessitates a summary and overview of research in this area. A systematic mapping study allows the categorization of reports using a unique structure and scheme to provide insight into these research works (Petersen, Feldt, Mujtaba, & Mattsson, 2008). Such insights, relating to the frequency of publications, are then reported visually using a bubble map. Three facets are employed in this study: the topic, contribution, and research facets. The topic facet is used to extract the core issues relating to HPC and Cloud computing; the research facet focuses on the type of research carried out; and the contribution facet is concerned with the method, model, or metric used. The purpose of this paper, therefore, is to conduct a systematic mapping study of HPC and the Cloud. The rest of this paper is organized as follows: Section 2 examines related work.
Section 3 describes materials and methods. Section 4 presents the obtained results and discussion. Finally, Section 5 concludes the paper and suggests further studies.
Cloud computing is by nature a complex infrastructure that lets user organizations avail themselves of resources in a convenient way. In this paper, the general characteristics of cloud computing, data mining techniques, and audit logs are discussed. Audit logs from the Google Cloud platform are primarily considered in reviewing the data mining techniques applicable to mining such logs. The outcome of this review is that audit logs are essential for monitoring administrative and user activities, and that data mining techniques can be used to prevent losses and to help establish security and access policies.
Sessions are used in RBAC models to define ways to use multiple roles. The use of sessions in cloud computing is almost the same as in traditional application environments. The process is shown briefly in Figure 4. In the figure, the attribute ActiveRoleList provides links to the currently active roles for a user, while the attribute RoleList provides links to all roles of which the current user is a member; ActiveRoleList is a subset of RoleList. We can check whether a user is a member of a role by invoking checkUserRole(). To activate or deactivate roles of which a user is a member, we can invoke the roleActivation() and roleDeactivation() routines. This mechanism can be used to enforce role exclusion or inclusion at execution time. A session can be established using SessionEstablishment() and disabled using SessionRevocation(). By using sessions, one can manage the dynamic role inclusion/exclusion owned by each user, and permissions can be enabled and disabled. It should be noted that, according to the definition by Ferraiolo et al., "each session is a mapping between a user and an activated subset of roles that are assigned to the user". A session is not necessarily limited to one user or one role. In cluster-computing environments, parallel assignments of multiple sessions to one role can easily be achieved. Parallel implementations can also be done for user-to-role, object-to-role, permission-to-role, permission-to-operation, and permission-to-object assignments.
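The session mechanism above can be sketched as follows. The routine names (checkUserRole, roleActivation, roleDeactivation, SessionEstablishment, SessionRevocation) and the RoleList/ActiveRoleList attributes come from the text; the data structures and the exclusion check are assumptions made for illustration, not the actual implementation being described.

```python
# Minimal sketch of RBAC sessions, following the routine names in the text.

class User:
    def __init__(self, name, roles):
        self.name = name
        self.role_list = set(roles)    # RoleList: all roles assigned to the user
        self.active_role_list = set()  # ActiveRoleList: always a subset of RoleList

class Session:
    """A mapping between a user and an activated subset of their roles."""
    def __init__(self, user):
        self.user = user
        self.active = False

    def checkUserRole(self, role):
        """Check whether the user is a member of the role."""
        return role in self.user.role_list

    def roleActivation(self, role, excluded_with=()):
        """Activate a role, enforcing membership and role exclusion."""
        if not self.checkUserRole(role):
            raise PermissionError(f"{self.user.name} is not a member of {role}")
        if any(r in self.user.active_role_list for r in excluded_with):
            raise PermissionError(f"{role} conflicts with an active role")
        self.user.active_role_list.add(role)

    def roleDeactivation(self, role):
        self.user.active_role_list.discard(role)

def SessionEstablishment(user):
    session = Session(user)
    session.active = True
    return session

def SessionRevocation(session):
    # Disabling the session drops the permissions it enabled.
    session.user.active_role_list.clear()
    session.active = False

alice = User("alice", roles={"auditor", "operator"})
s = SessionEstablishment(alice)
s.roleActivation("auditor", excluded_with={"operator"})
print(sorted(alice.active_role_list))  # ['auditor']
SessionRevocation(s)
```

In a cluster setting, the parallel assignments mentioned above would correspond to many Session objects being established for the same role concurrently; nothing in this model ties a role to a single session.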
Another “green” benefit of the cloud is that it can be delivered on demand in the form of “utility computing”, where data centers deliver computational resources as needed. For this reason, these hosting sites do not have to be up and running 24/7, since data can be accessed when required, which significantly lowers energy consumption. Utility computing can thereby directly support grid computing architectures and web sites that require very large computational resources or that have sudden peaks in demand (see Figure 1 below).
This layer is the outermost layer of the private cloud service, namely the specific software services. In this paper, the development of the "Interdisciplinary Practical Teaching Platform under Cloud Environment" deploys educational cloud resources and integrates them with the practical teaching of finance students. This layer mainly deploys the 37 online teaching systems of the five training centers. According to department and professional distribution, it is divided into five clouds serving the whole of learning: the "business management cloud", "humanities and travel cloud", "information technology cloud", "international trade cloud", and "financial accounting cloud". They are shown in Figure 6.
• Clouds are a major industry thrust, with a growing fraction of IT expenditure that IDC estimates will reach $44.2 billion in direct investment in 2013, while 15% of IT investment in 2011 will be related to cloud systems, with 30% growth in the public sector.
• Gartner also rates cloud computing high on its list of critical
The evolution of the Internet in the 1990s significantly changed computing technologies. In 1999, grid computing came into existence, building on the Internet and cluster computing. Grid computing combines computers from multiple administrative domains to reach a common goal or to solve a single task, after which the grid may disappear; it is analogous to the power grid. Grid computing involves computation in a distributed fashion, which may also involve the aggregation of large-scale cluster computing systems. The size of a grid may vary from a small network of computer workstations within a corporation to large collaborations across many companies and networks.
Cloud computing is a promising technology specially focused on providing services such as dynamic storage, sharing of information, and storage of huge amounts of data without any special requirements at the user end. Forrester defines cloud computing as "a pool of abstracted, highly scalable, and managed compute infrastructure capable of hosting end-customer applications and billed by consumption." The main advantage of cloud
technology enables a vendor’s cloud software to automatically move data from a piece of hardware that goes bad or is pulled offline to a section of the system or hardware that is functioning or operational. Therefore, the client gets seamless access to the data. Separate backup systems, with cloud disaster recovery strategies, provide another layer of dependability and reliability. Finally, cloud computing also promotes a green alternative to paper-intensive office functions, because it needs less computing hardware on premises: all computing-related tasks take place remotely, with minimal hardware requirements, thanks to technological innovations such as virtualization and multitenancy. Another viewpoint on the green aspect is that cloud computing can reduce the environmental impact of building, shipping, housing, and ultimately destroying (or recycling) computer equipment, since no one organization owns many such systems on its premises, and offices are managed with fewer computers that consume comparatively less energy. A consolidated set of points briefing the benefits of cloud computing is as follows: 1. Achieve economies of scale: We can increase the volume output or pro-
Cloud computing is an evolution in the field of computer science and technology. In the twenty-first century, computer users increasingly access Internet services via lightweight portable devices rather than powerful desktop machines, and cloud computing emerged as a solution to this shift. The cloud is a distributed computing paradigm: a collection of interconnected and virtualized computers that are provisioned and presented dynamically as unified computing resources offered on a pay-per-use basis. Cloud computing is defined by the applications delivered as Internet services and by the hardware and system software in the data centers that provide those services. It is an advanced technology that focuses on how computing systems are designed, how applications are developed, and how existing services are leveraged to build software. It is based on dynamic provisioning: resources are offered on demand, on a pay-per-use basis, by cloud computing vendors.
Rapid Elasticity: Users can rapidly increase and decrease their computing resources as needed. This is often achieved automatically, giving the consumer the impression that resources are infinite and that the application can always cope with demand. When resources are no longer needed, they are relinquished back into the resource pool. Pay-Per-Use: Any resources that are used are
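The two characteristics above can be sketched together: a scaler that automatically resizes to match demand (rapid elasticity) and a meter that bills only for the instance-time actually consumed (pay-per-use). The pool size, per-instance capacity, and demand figures below are invented for illustration.

```python
# Hypothetical sketch of rapid elasticity with pay-per-use metering.

POOL_SIZE = 100                 # shared resource pool (appears "infinite")
CAPACITY_PER_INSTANCE = 50      # e.g. requests/sec one instance can handle

class ElasticService:
    def __init__(self):
        self.instances = 1
        self.billed_instance_steps = 0   # pay-per-use meter

    def scale(self, demand):
        """Automatically resize to match demand (rapid elasticity)."""
        needed = max(1, -(-demand // CAPACITY_PER_INSTANCE))  # ceiling division
        # Unused instances are relinquished back into the pool implicitly:
        # the service only ever holds what 'needed' requires.
        self.instances = min(needed, POOL_SIZE)

    def tick(self, demand):
        self.scale(demand)
        self.billed_instance_steps += self.instances  # billed only for use

svc = ElasticService()
for demand in [40, 180, 400, 90, 10]:   # a traffic spike, then a lull
    svc.tick(demand)
    print(demand, "->", svc.instances, "instance(s)")
print("billed instance-steps:", svc.billed_instance_steps)
```

The meter totals 16 instance-steps for this trace, versus 40 (8 instances for 5 steps) had the service been provisioned statically for the peak, which is the economic point of elasticity.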