This article takes a theoretical and practical approach to cost-benefit analysis to help the management of a company facing the decision of whether or not to adopt cloud computing for its business. The cost-benefit analysis is divided into six distinct sections to determine which costs and benefits are economically significant for the case study analyzed. Based on this allocation of sections, TEI (Total Economic Impact), a methodology developed by Forrester, is applied. This methodology was chosen because it gives a general view of the impact and benefits of cloud computing adoption in information technology, not only through the analysis of costs and benefits but also by giving appropriate weight to the information technology resources the company possesses. The article deals with the analysis, forecasting, and implementation of cloud computing, showing the financial benefits acquired from "green IT" as a consequence of this technology. Through a detailed analysis of the cloud computing phenomenon and its connection with green IT, this article conducts a cost-benefit analysis as a case study of an internet and communications services company, Primo Communications sh.p.k. The aim of the project is to show the potential benefits of this company's transition to cloud computing, benefits expressed in the increased efficiency of technological resource use and in environmental gains. Together, the treatment of "cloud computing" and "green IT" in this article represents two of the most challenging opportunities in business development today.
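The core arithmetic of such a cost-benefit comparison can be sketched as follows. This is an illustrative Python sketch, not the article's TEI model: the discount rate and the yearly cost and benefit figures in the usage example are hypothetical placeholders.

```python
def npv(cash_flows, rate):
    """Net present value of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def benefit_cost_ratio(benefits, costs, rate):
    """Discounted benefits divided by discounted costs."""
    return npv(benefits, rate) / npv(costs, rate)

# Hypothetical migration: heavy cost up front, benefits in later years.
ratio = benefit_cost_ratio(
    benefits=[0, 40000, 45000],       # placeholder yearly benefits
    costs=[50000, 10000, 10000],      # placeholder yearly costs
    rate=0.10,                        # assumed discount rate
)
# A ratio above 1.0 favours adoption under these assumptions.
```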
Cloud computing is the outgrowth of parallel computing, distributed computing, grid computing, and virtualization technologies, and it defines the shape of a new era. Cloud computing is an emerging model of business computing: it means storing and accessing data over the internet instead of on your computer's hard drive, so all the data you access is "on air", i.e., on the internet. In this paper we examine different computing models, from which we conclude that cloud computing is a combination of the cluster and grid computing models. People hold different opinions about the term cloud computing; many even say it is a new technology and a new establishment on the internet, but cloud computing has been available on the internet since the day the internet evolved. In this paper we focus on the technical aspects of cloud computing and explain it via different computing models. The paper provides a better understanding of cloud computing and also explains its evolution by tracing the emergence of the internet and its benefits.
Despite the many benefits mentioned above, cloud computing also faces many challenges. When moving from an owned site to cloud space, companies must be aware of both the benefits and the challenges of cloud computing. Among these challenges, data security is the most difficult problem in cloud computing. According to a survey carried out by Gartner, more than 70% of Chief Technical Officers believed that the primary reason for not using cloud computing services is data security and privacy concerns. Convincing organizations, especially small ones, about security is hard work; they are not ready to throw away their infrastructure and move immediately to the cloud. Most organizations are watching this issue closely and are not ready to shift to cloud space, and this is the main reason for the lack of maturity.
Cloud computing offers many benefits, among them: no up-front investment, on-demand self-service, low operating cost, high scalability, easy access, reduced business challenges, and reduced maintenance cost. Keeping these benefits in view, we use cloud computing for managing requirements change in a GSD environment, and we evaluate how a framework using cloud computing resolves the RCM challenges mentioned above:
technology enables a vendor's cloud software to automatically move data from a piece of hardware that goes bad or is pulled offline to a section of the system or hardware that is functioning. The client therefore gets seamless access to the data. Separate backup systems, combined with cloud disaster recovery strategies, provide another layer of dependability and reliability. Finally, cloud computing also promotes a green alternative to paper-intensive office functions, because it needs less computing hardware on premises: all computing-related tasks take place remotely with minimal hardware requirements, thanks to technological innovations such as virtualization and multitenancy. Another viewpoint on the green aspect is that cloud computing can reduce the environmental impact of building, shipping, housing, and ultimately destroying (or recycling) computer equipment, as no one will own many such systems on their premises, and offices can be managed with fewer computers that consume comparatively less energy. A consolidated set of points summarizing the benefits of cloud computing follows: 1. Achieve economies of scale: We can increase the volume output or pro-
Education and learning continue to expand step by step, and each section of education has improved gradually. The popularity of web-based education and the development of polished online learning environments have become hot topics in research on remote education. Cloud computing is developing rapidly and finds application in nearly every discipline. Given its benefits to education, it is important for enhancing the quality of education and meeting essential performance goals such as low cost, flexibility, diversity, cooperation, and convenience. Cloud services and programs enable clients to store and access their local data on a remote server farm using their PCs or smartphones via the web. Cloud-based e-learning is a technique for cutting down the expense and intricacy of data handling, which is controlled by third-party services. Conventional e-learning strategies combined with cloud computing advances offer great advantages to academic users, but they involve compromises in security. This study explains the advantages of e-learning in education and describes the different types of attacks on the service delivery models of e-learning proposed by different researchers.
customers while maintaining the same level of quality and service [10, 11]. Both features aim to improve agility, collaboration, rapid adaptation to demand-triggered fluctuations in resource utilization, and the availability of services without spatio-temporal restrictions, in a cost-effective manner. Because cloud computing systems can provide a wide range of services, from computationally intensive services to lightweight applications, they have attracted tremendous interest from business operations at different levels that depend on IT resources [4, 12]. Adopting cloud computing enables organizations to reduce the upfront IT investment in purchasing IT infrastructure, developing software, and licensing various applications. Governments have also expressed interest in adopting cloud computing applications in order to reduce the operational costs of public projects and to enhance the reachability, availability, and capabilities of the services they deliver to the public domain [13, 14]. Although there is a plethora of benefits in the services and applications delivered to customers through the cloud computing model, there are significant issues and challenges in the way of successful adoption; the most important is security, which is reported to hamper the adoption of cloud computing in different organizations [5, 14]. Because the cloud computing model is still in its infancy, it involves uncertainty at different levels, such as applications, data storage, data access points, networking, and hosting. Service level agreements (SLAs) do not include specific guarantees about the security and privacy of customer data hosted on the cloud providers' servers [15, 16]. Data from multiple tenants exists on the same server without solid security controls over each individual user's data.
In addition, hosting valuable customer data on publicly accessible servers increases the probability of attack by malicious agents.
Cloud computing can be defined as a collection of concepts, technologies, and methodologies that enable hardware and software resources to be dynamically provisioned as services over the internet on a pay-per-use model, with the objective of achieving high resource utilization in a scalable, cost-effective manner. The cloud deployment models are defined on the basis of the location, ownership, access, and management of cloud services. IaaS, PaaS, and SaaS are the major services delivered by the cloud. This paper outlines storage, another major service offered by the cloud, known as Storage as a Service. Cloud computing provides rich benefits to cloud clients, such as low-cost services, elasticity of resources, and easy access through the internet. Storage as a Service is a model in which a large company rents space in its storage infrastructure to a small company or an individual. Keywords: cloud computing; Platform as a Service; Software as a Service; Infrastructure as a Service; Representational State Transfer.
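As the keywords suggest, Storage as a Service is typically exposed through a REST interface. The following is a minimal, hypothetical sketch of such an interface: the method names mirror the HTTP verbs (PUT/GET/DELETE) a real provider would expose, but here they operate on an in-memory dictionary so the example is self-contained.

```python
class ObjectStore:
    """Minimal in-memory sketch of a Storage-as-a-Service API.

    The methods mirror the REST verbs a real provider would expose
    over HTTP; here they act on a local dict so the sketch is runnable.
    """
    def __init__(self):
        self._objects = {}

    def put(self, key, data):          # REST: PUT /bucket/<key>
        self._objects[key] = bytes(data)
        return {"status": 201, "key": key}

    def get(self, key):                # REST: GET /bucket/<key>
        if key not in self._objects:
            return {"status": 404}
        return {"status": 200, "data": self._objects[key]}

    def delete(self, key):             # REST: DELETE /bucket/<key>
        removed = self._objects.pop(key, None)
        return {"status": 204 if removed is not None else 404}
```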
Mobile cloud computing aims to augment resource-constrained mobile devices, but it is currently like a baby that requires attention. ABI Research estimates that more than 240 million businesses will use services provided by cloud service providers through mobile devices by 2015. Mobile cloud computing is a growing technology that combines the benefits of both cloud computing and mobile computing, and it is highly applicable to mobile devices. This paper gives an extensive survey of mobile cloud computing technology, including its definitions, architecture, motivation, advantages, challenges, and future research directions. For a better understanding, cloud computing is described before mobile cloud computing.
This section defines the basic concepts, recent trends, and challenges of e-commerce and cloud computing and the benefits of hybridizing them. E-commerce involves digitally enabled commercial transactions between and among organizations and individuals that occur over the internet and the web. E-commerce activities in cloud computing are carried out by customers, banks, e-commerce companies, and cloud service providers.
a key motive is that things must be more and more streamlined so that human convenience increases. Cloud computing shares the same motive, with more elastic features than traditional ways of working such as cluster and grid computing. Cloud computing has proved to be one of the most productive innovations in information and communication technology. In this paper, various application areas where cloud computing is being used to its full extent are reviewed, and, depending on the type, other areas where cloud computing could be implemented are explored. Cloud-based systems provide benefits in terms of reliability, effectiveness, fault tolerance, scalability, and cost, among others. Certain challenges are identified as areas for further improvement.
Choosing an appropriate cloud computing provider yields great benefits for customers, such as improved security and performance. Cloud computing customers consider many factors when evaluating providers, such as accountability, agility, assurance, management, performance, and security [14, 15]. Because security is one of the most important factors for any customer or organization, organizations must rely on an appropriately secured provider to secure their data, improve their privacy, and address other security issues. In this chapter we use an improved neutrosophic multi-criteria decision-making algorithm to choose an appropriate cloud computing provider based on the top ten risks stated by OWASP.
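The neutrosophic method itself is beyond a short example, but the basic idea of multi-criteria provider evaluation can be illustrated with a plain weighted-sum ranking over factors such as security, performance, and agility. All provider names, weights, and scores below are hypothetical.

```python
def rank_providers(scores, weights):
    """Rank providers by the weighted sum of their criterion scores.

    scores:  {provider: {criterion: value, higher is better}}
    weights: {criterion: weight}, weights sum to 1.
    """
    totals = {
        provider: sum(weights[c] * v for c, v in criteria.items())
        for provider, criteria in scores.items()
    }
    return sorted(totals, key=totals.get, reverse=True)

# Hypothetical evaluation with security weighted highest.
weights = {"security": 0.5, "performance": 0.3, "agility": 0.2}
scores = {
    "ProviderA": {"security": 9, "performance": 6, "agility": 5},
    "ProviderB": {"security": 6, "performance": 9, "agility": 9},
}
ranking = rank_providers(scores, weights)  # best overall score first
```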
Abstract- Cloud computing offers a flexible and convenient way to share data, which provides diverse benefits to society and individuals. But users naturally resist outsourcing shared data directly to the cloud server, because the data often contains valuable information. Therefore, it is necessary to place cryptographically enforced access control on the shared data. Identity-based encryption is a promising cryptographic primitive for building a convenient data exchange system. However, access control is not static: when a user's authorization expires, there must be a mechanism that can remove that user from the system.
Desktops are another computing resource that can be virtualized. Desktop virtualization is enabled by several architectures that allow remote desktop use, including the X Window System and Microsoft Remote Desktop Services. The X Window System, also known as X Windows, X, and X11, is an architecture commonly used on Linux, UNIX, and Mac OS X that abstracts graphical devices to allow device independence and remote use of a graphical user interface, including display, keyboard, and mouse. X does not include a windowing system; that is delegated to a window manager, such as KDE or Gnome. X is based on an MIT project and is now managed by the X.Org Foundation. It is available as open source software under the MIT license. X client applications exist on Linux, UNIX, Mac OS X, and Windows. The X server is a native part of most Linux and UNIX systems and Mac OS X, and can be added to Windows with the Cygwin platform. Because the X system was designed to separate server and client using the X protocol, it lends itself well to cloud computing. X Windows is complex and can involve some troubleshooting, but because it supports many varied usage scenarios, it has enjoyed a long life since it was first developed in 1984.
Case in point? Six in 10 channel firms say that cloud has generally strengthened their customer relationships, with just 15% claiming it has weakened them and roughly a quarter saying that their client bonds have remained the same. This is encouraging news given that many in the channel have publicly feared that cloud would drive a wedge between them and their customers. There's been rampant apprehension about such ill effects as a resurgence in vendor direct sales and end-user customers choosing a self-service model for their IT solutions, i.e. procuring SaaS applications over the Internet. And while both of these trends are happening to a certain extent, CompTIA data suggest not at such dire expense to most of the channel, especially those that have reached a high level of cloud maturity today and intend to remain committed. That said, not all channel firms that adopt cloud will engender more goodwill with customers; some may simply have a customer set that is not cloud-friendly, others may not gain sufficient expertise to provide value, etc.
However, sometimes it can be hard to classify a characteristic in one of the two categories, since even for some quantitative attributes it makes sense for users to express their preferences in a qualitative manner. A number of metrics can be seen as qualitative yet, under some reasonable assumptions, can be precisely quantified (e.g. interoperability, usability), or can be resolved into a number of lower-level metrics involving both quantitative and qualitative attributes (e.g. serviceability), or including both precise and imprecise values. For instance, the usability metric has been defined as a quantifiable attribute in the sense of the average time experienced by users of the cloud service to install, learn, understand, and operate it. But often this average time is not enough to define how usable a cloud service is, since this information is often vague and imprecise. It might be the case that the average installation or learning time for a specific service is relatively short because of a customer's extensive experience in the specific domain, and not because the service is really usable for an average user. It would be an oversight to ignore the degree of difficulty previous users experienced, given their degree of expertise, when they tried to install, learn, understand, and operate the specific cloud service. This value is highly subjective and uncertain, and is often available only through the linguistic terms previous users employ when expressing their opinions.
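One hedged way to quantify this concern is to correct each reported time by the reporter's expertise before averaging, so an expert's short install time counts as if it came from an average user. The expertise labels and correction factors below are illustrative assumptions, not values from the literature.

```python
# Hypothetical mapping from self-reported expertise to a correction
# factor: an expert's short install time understates the difficulty
# an average user would face, so it is scaled up.
EXPERTISE_FACTOR = {"novice": 1.0, "intermediate": 1.3, "expert": 1.8}

def adjusted_usability_time(observations):
    """Average install/learn time, corrected for each reporter's expertise.

    observations: list of (minutes, expertise) pairs.
    """
    corrected = [minutes * EXPERTISE_FACTOR[level]
                 for minutes, level in observations]
    return sum(corrected) / len(corrected)
```

The adjusted average rises when the short times come mostly from experts, reflecting the intuition in the text that raw averages flatter a service tested by experienced users.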
There's growing sentiment among many cloud experts that ultimately hybrid adoption will be most advantageous for many organizations. Warrilow says "for some time Gartner has advised that hybrid is the most likely scenario for most organizations." Staten agrees with the notion for two reasons. First, "some applications and data sets simply aren't a good fit with the cloud," he says. This might be due to application architecture, degree of business risk (real or perceived), and cost, he says. Second, rather than making a cloud-or-no-cloud decision, "it's more practical and effective to leverage the cloud for what makes the most sense and other deployment options where they make the most sense," he says. In terms of strategy, Staten recommends regularly analyzing deployment decisions. "As cloud services mature, their applicability increases," he says.
Hadoop MapReduce and the LexisNexis HPCC platform are both scalable architectures directed towards data-intensive computing solutions. Each of these system platforms has strengths and weaknesses, and their overall effectiveness for any application problem or domain is subjective in nature and can only be determined through careful evaluation of application requirements against the capabilities of the solution. Hadoop is an open source platform, which increases its flexibility and adaptability to many problem domains, since new capabilities can be readily added by users adopting this technology. However, as with other open source platforms, reliability and support can become issues when many different users are contributing new code and changes to the system. Hadoop has found favor with many large Web-oriented companies, including Yahoo!, Facebook, and others, where data-intensive computing capabilities are critical to the success of their business. Amazon has implemented new cloud computing services using Hadoop as part of its EC2 offering, called Amazon Elastic MapReduce. A company called Cloudera was recently formed to provide training, support, and consulting services to the Hadoop user community, and to provide packaged and tested releases which can be used in the Amazon environment. Although many different application tools have been built on top of the Hadoop platform, like Pig, HBase, and Hive, these tools tend not to be well integrated, offering different command shells, languages, and operating characteristics that make it more difficult to combine capabilities in an effective manner.
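The programming model that Hadoop implements can be illustrated with the classic word-count example. The sketch below is plain Python with no Hadoop dependency; map_phase and reduce_phase are our own names, standing in for the mapper and reducer a Hadoop job would define (and for the shuffle that Hadoop performs between them).

```python
from collections import defaultdict

def map_phase(documents):
    """Emit (word, 1) pairs, as a Hadoop mapper would for each record."""
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    """Sum the counts per word, as a reducer would after the shuffle
    groups all pairs sharing a key onto the same reducer."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Local run of the two phases over a tiny corpus.
word_counts = reduce_phase(map_phase(["the cloud", "the grid"]))
```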
The Heartbeat Service periodically collects dynamic performance information about the node and publishes this information to the membership service in the Aneka Cloud. These data are collected by the index node of the Cloud, which makes them available to services such as reservations and scheduling in order to optimize the use of a heterogeneous infrastructure. As already discussed, basic information about memory, disk space, CPU, and operating system is collected. Moreover, additional data are pulled into the "alive" message, such as information about the software installed on the system and any other useful information. More precisely, the infrastructure has been designed to carry any type of data that can be expressed by means of text-valued properties. As previously noted, the information published by the Heartbeat Service is mostly concerned with the properties of the node. A specific component, called the Node Resolver, is in charge of collecting these data and making them available to the Heartbeat Service. Aneka provides different implementations of this component in order to cover a wide variety of hosting environments. A variety of operating systems are supported with different implementations of the PAL, and different node resolvers allow Aneka to capture other types of data that do not strictly depend on the hosting operating system. For example, the retrieval of the node's public IP differs between physical machines and virtual instances hosted in the infrastructure of an IaaS provider such as EC2 or GoGrid. In virtual deployments, a different node resolver is used so that all other components of the system can work transparently.
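The design described above can be sketched as follows. This is an illustrative approximation, not Aneka's actual API: StaticNodeResolver and heartbeat_message are hypothetical names showing how a pluggable node resolver supplies the text-valued properties that the "alive" message carries.

```python
class StaticNodeResolver:
    """Hypothetical resolver returning fixed properties; a real
    deployment would instead query the OS, or an IaaS provider's
    metadata service for virtual instances."""
    def __init__(self, properties):
        self._properties = properties

    def resolve(self):
        # Everything is coerced to a text-valued property, matching the
        # constraint that the infrastructure carries text-valued data.
        return {key: str(value) for key, value in self._properties.items()}

def heartbeat_message(node_id, resolver):
    """Build an 'alive' message publishing the node's current properties."""
    return {"node": node_id, "properties": resolver.resolve()}

# Swapping in a different resolver changes how properties are gathered
# without changing the components that consume the message.
msg = heartbeat_message("node-1", StaticNodeResolver({"cpu_cores": 4, "os": "linux"}))
```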
SDN has two main advantages over traditional networks with regard to the detection of and response to attacks: (1) the (logically) centralized management model of SDN allows administrators to quickly isolate or block attack traffic patterns without needing to access and reconfigure several pieces of heterogeneous hardware (switches, routers, firewalls, and intrusion detection systems); (2) attack detection can be made a task distributed among switches (SDN controllers can define rules on switches to generate events when flows considered malicious are detected), rather than depending on expensive intrusion detection systems. SDN can also be used to control how traffic is directed to network monitoring devices (e.g., intrusion detection systems) as proposed in . Quick response is particularly important in highly dynamic cloud environments. Traditional intrusion detection systems (IDS) mainly focus on detecting suspicious activities and are limited to simple actions such as disabling a switch port or notifying (sending email to) a system administrator. SDN opens the possibility of taking complex actions, such as changing the path of suspicious traffic in order to isolate it from known trusted communication. Research will focus on how to recast existing IDS mechanisms and algorithms in SDN contexts, and on the development of new algorithms that take full advantage of multiple points of action. For example, since each switch can be used to detect and act on attacks,  has shown the improvement of different traffic anomaly detection algorithms (Threshold Random Walk with Credit-Based rate limiting, Maximum Entropy, network traffic anomaly detection based on packet bytes, and rate limiting) using OpenFlow and NOX by placing detectors closer to the edge of the network (home or small business networks instead of the ISP) while maintaining line-rate performance.
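A minimal sketch of the switch-level detection idea: a per-source counter that flags a host once it opens more new flows than a threshold, a crude stand-in for the rate-limiting style detectors mentioned above. The class name and threshold are illustrative only; a real deployment would install such logic as controller-defined rules on the switches.

```python
from collections import Counter

class FlowRateDetector:
    """Toy per-source new-flow counter: a crude stand-in for the
    rate-limiting detectors the text mentions. An SDN controller could
    run this logic per switch and raise an event when it fires."""
    def __init__(self, threshold):
        self.threshold = threshold
        self.new_flows = Counter()

    def observe(self, src_ip):
        """Record a new flow from src_ip; return True when the source
        has exceeded the allowed number of new flows."""
        self.new_flows[src_ip] += 1
        return self.new_flows[src_ip] > self.threshold

# A source scanning many destinations trips the detector quickly,
# while ordinary hosts stay below the threshold.
detector = FlowRateDetector(threshold=2)
```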