
Data Center Knowledge Guide to Top Considerations in Today's IT Infrastructure

A look at the Private Cloud, Big Data, and Deploying a Cloud-Ready Data Center

Bill Kleyman, Cloud and Virtualization Architect

April 2013

Brought to you by


Introduction

There is a revolution happening in the IT world. Organizations are looking to new technologies to help them better align their business goals and compete in an increasingly competitive marketplace. Growing demands by C-level executives are pushing IT environments to adopt new technologies which can offer real-world solutions to sometimes very large problem sets. This is directly leading to the development of new cost-effective, energy-efficient and expandable infrastructures. Every day there is more information to analyze, more data to process and more users to satisfy. Organizations are looking to place their environments into the cloud with the hope of greater efficiency and scalability. There continues to be great debate over public versus private cloud environments. Some organizations look to the public or hybrid cloud models, while many others are still fully invested in supporting and expanding their own private cloud solutions. Due to the ever-increasing data stockpile, these organizations now face new challenges. IT departments must ask themselves: "How do we control the cloud, big data and our ever-growing need to maintain an efficient data center infrastructure?" This white paper will analyze three top infrastructure concerns for the modern IT environment.

Section 1 – The Power of a Private Cloud

Deploying a powerful private cloud

Understanding the difference: Private vs. Public

Management and security

Sizing your private cloud environment

Section 2 – Controlling Big Data

Big data trends

Management engines

Big data analytics

A look at the distributed file system

Section 3 – Creating a Cloud-Ready Data Center

Utilizing efficient technologies

Considerations around data center design

Infrastructure management & maintenance

“Data center to cloud” in one day

As the data center continues to evolve into an integral part of any organization — more IT administrators will strive to design an environment around efficiency and scale. Whether your organization is working with cloud computing or big data — having the right infrastructure in place is crucial for success. This means deploying the right servers, within the right platform, and creating a system that is capable of growth. There is a direct need to create data centers which can truly call themselves home to both private cloud technologies and newer platforms built around big data. There are many vendors available to help design the infrastructure of tomorrow — it's vital to understand which specific features and solutions deliver the biggest benefit. This is where new, innovative technologies come in, from companies developing products specifically designed for private cloud and big data IT departments. One such company, PSSC Labs® (www.pssclabs.com), offers cost-effective, energy-efficient, high-performance server solutions.

Summary

The modern data center has changed. There are new demands around cloud computing, big data and infrastructure distribution. As with any major technology, the delivery process starts at the data center; and more specifically, the server infrastructure. Furthermore, this change in the data center is being driven by more users, more data and more reliance on the data center itself. With private cloud technologies and big data leading the way, working with the right data center server platform as well as the right cloud technologies has become more important than ever. In this white paper, we look at the key considerations around today's IT environment and where the data center and server platform play a role.

Section 1 – The Power of a Private Cloud

The ability to use the Internet to help distribute data over vast distances has been around for some time. However, the idea of cloud computing has only become a reality over the past few years. The two biggest cloud models in the industry also offer very specific benefits and drawbacks. Public clouds may initially appear attractive; however, there are certain elements IT administrators need to be aware of:

• Unknown cost structures.
• Relinquishing control.
• Public data center lock down.
• Regional site resiliency issues.
• Poor performance and limited resource allocation.

Every day more companies are learning that renting computing time on a public cloud can be significantly more expensive than deploying their own private cloud infrastructure. In working with PSSC Labs®, several organizations conducted their own audits of public versus private cloud expenses. The findings are not surprising: renting time on a public cloud is nearly 500% more expensive when considering the lifespan of a computer system.

A private cloud allows companies to retain control of business-critical data. It seems every week there are new stories of unauthorized break-ins by hackers into large public cloud environments. Working with a private cloud model has direct benefits in that your organization can continue to expand while controlling the data that flows through the data center as well as keeping that data protected. Organizations are able to develop an infrastructure capable of great performance and scale. By deploying a private cloud infrastructure, companies are supporting more users, more functions and adding more business value with significantly lower expenses. Private clouds are able to accomplish this by "right-sizing" their computing platform. In addition, a private cloud allows for greater dedicated computing performance.

To create a robust and agile private cloud infrastructure — IT administrators must surround themselves with technologies specifically engineered to support their user and computing environment needs.

Always make sure you understand what you are trying to deliver and accomplish. Then you can plan around growing your environment and ensuring that your platform is capable of scaling with your organizational needs. In a public cloud environment, such scale can be very costly. However, when sized properly, private cloud computing models can have a lot of room for growth.


Below are a few examples of practical uses for private cloud technologies:

• Virtual desktops and applications.
• Files and data services.
• Private cloud portals and collaboration spaces.
• Disaster recovery functions.
• Branch office extensions.
• Compliance or regulatory-based data delivery.
• High performance computing resources for design and engineering.


Sizing your cloud

In creating a robust cloud environment, you must first understand what computing resources you are trying to deliver. Simple application virtualization may not require much horsepower. However, virtual desktop infrastructure (VDI) does. Depending on your workload, where your users are located and the types of servers you use — your cloud model can be capable of great scale and agility. Remember, each type of workload has resource requirements that will need to be met. For example, VDI will require a lot of dedicated RAM and a solid shared storage infrastructure. Furthermore, for the highest density it's highly recommended that you work with a scalable blade infrastructure. These blades must be able to handle user loads and applications and maintain high amounts of uptime while at the same time offering energy efficiency. These requirements show just how much need there is for highly resilient and power-aware blade systems and server platforms. Exciting new technologies from companies such as PSSC Labs® take component functionality, energy efficiency and system reliability to a whole new level. The company's PowerServe servers are considered some of the industry's most reliable hardware, with a field component failure rate of less than .001%. For the administrator this means less management overhead and a lot more uptime. In addition, recent head-to-head tests have shown PowerServe servers to consume about 40% less energy and produce 1/3 less heat than comparable offerings from other manufacturers.
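As a rough illustration of this kind of sizing exercise, the sketch below estimates how many hosts a VDI workload might need. The desktop count, RAM per desktop, hypervisor overhead and RAM per server are hypothetical figures for the example, not vendor specifications:

```python
import math

# Hypothetical sizing inputs -- adjust to your own environment.
desktops = 500               # concurrent virtual desktops
ram_per_desktop_gb = 4       # dedicated RAM per desktop
hypervisor_overhead_gb = 16  # RAM reserved per host for the hypervisor
ram_per_host_gb = 256        # RAM installed in each blade/server

usable_ram_per_host = ram_per_host_gb - hypervisor_overhead_gb
desktops_per_host = usable_ram_per_host // ram_per_desktop_gb
hosts_needed = math.ceil(desktops / desktops_per_host)

print(f"{desktops_per_host} desktops per host, {hosts_needed} hosts needed")
# With these assumed figures: 60 desktops per host, 9 hosts needed.
```

A real sizing exercise would also account for CPU, storage IOPS and peak concurrency, but the same reasoning applies.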

High-density computing

As mentioned earlier, a good private cloud design will be able to fit numerous users — efficiently — over a set of high-density servers. For almost any cloud deployment looking to house a large number of users and workloads — working with a blade environment is usually the right way to go. Now, the important part here is working with the right server technology that can best align with those workloads. Each component will have to be built around the latest multi-core processor platform and support a large amount of RAM. Furthermore, these machines must be easy to manage and administer. Why? With such a large number of users, blades must have the capability to be swapped out quickly or repaired on the spot. This is why using a blade environment with a "tool-free" design is a critical consideration. For example, the PSSC Labs® PowerServe Duo T2000 is, literally, designed without a backplane. This means that each server is totally independent and maintenance is simplified since administrators can access the physical blade much faster. Furthermore, within a single 1U form factor — administrators are able to leverage two completely independent servers. This new type of server platform not only offers you twice the density but also gives you the freedom to design the computing platform which directly suits your IT environment. You can mix and match configurations to best suit application needs and budget. In a direct comparison with the leading competitor, the PowerServe Duo T2000 server is the only hardware platform which can allow for tool-free access to system components, deliver power reduction software to limit power utilization, and provide remote log-in support directly from the manufacturer.
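To see what a dual-node 1U form factor means for density, here is a quick back-of-the-envelope calculation; the rack height and the number of units reserved for switching are assumptions for illustration, not a vendor specification:

```python
# Assumed rack layout -- figures are illustrative only.
rack_units = 42            # standard full-height rack
reserved_units = 2         # space kept for top-of-rack switching
nodes_per_1u_chassis = 2   # two independent servers per 1U chassis

usable_units = rack_units - reserved_units
independent_servers = usable_units * nodes_per_1u_chassis
print(f"{independent_servers} independent servers per rack")  # 80 with these assumptions
```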

The cloud revolution will only continue to expand. As more organizations jump on the cloud computing bandwagon, they'll be able to leverage even more benefits of a widely distributed, highly connected environment. The key point to understand is this: build around intelligent cloud control and scalability. By planning around needs for both today and the future — your organization can continue to leverage the full power of the cloud. As mentioned earlier, working with efficient and highly scalable computing systems can only help.

Section 2 – Controlling Big Data

There is a direct correlation between cloud computing and the growth in the big data market. There are more users, many more connections to the cloud, and many more new types of devices. In fact, the average user may now utilize 3-5 devices to access a corporate data center. As more devices come online and connect to the cloud, there will, unquestionably, be more data that has to be analyzed and managed.

Let’s analyze some numbers:

• According to IBM, the end-user community has, so far, created over 2.7 zettabytes of data. In fact, so much data has been created so quickly that 90% of the data in the world today has been created in the last two years alone.

• As of 2012, over 240 terabytes of information have been collected by the US Library of Congress.

• Facebook processes over 220 billion photos from its entire user base. Furthermore, they store, analyze, and access over 32 petabytes of user-generated data.

• In 2012, the Obama administration officially announced the Big Data Research and Development Initiative. Now, there is over $200 million invested into big data research projects.

• In a recent market study, research firm IDC released a forecast showing the big data market growing from $3.2 billion in 2010 to $16.9 billion in 2015.

• Finally, in the 2011 IDC Digital Universe report, the numbers really hit home. The report indicates that over 1.8 zettabytes of data were created that year alone.

That's a lot of data! To help control big data initiatives, organizations must deploy solutions which are capable of storing these massive data sets as well as providing the necessary analysis to support company initiatives.
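For context, the IDC forecast quoted above implies a compound annual growth rate of roughly 40%; the short calculation below shows how that figure falls out of the reported numbers:

```python
# Implied compound annual growth rate (CAGR) from the IDC figures cited above.
market_2010 = 3.2    # $ billions
market_2015 = 16.9   # $ billions
years = 5

cagr = (market_2015 / market_2010) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 39.5% per year
```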

A look at open-source technology and the distributed file system

Executives, managers, and infrastructure administrators must gain control of their data early and quickly. There are key benefits in being able to quantify and control your data points. Not only will you be able to correlate large amounts of information — you will be able to learn much more about your customer base and how to better serve them. Taking the conversation even further, new types of big data file systems make the big data management process even easier. For example, the Hadoop Distributed File System (HDFS) has taken distributed big data management to a whole new level. Already, HDFS is a scalable, portable file system ready for a distributed data infrastructure. Capable of handling numerous different jobs and tasks, HDFS has also recently added high-availability capabilities, allowing the main metadata server to be failed over manually to a backup in the event of failure. There are many other open-source big data products out there as well. Solutions from MongoDB, HBase and Hadoop are all striving to help organizations better correlate and understand large data sets. Remember, these engines require a lot of storage horsepower to house and crunch vast amounts of information. It's important to provide the right type of analytics solution from a hardware perspective to allow these big data engines to truly perform.
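As a small illustration of working with HDFS programmatically, the sketch below uses the third-party Python `hdfs` package (a WebHDFS client). The NameNode address, user name and paths are placeholders, and this is only one of several client options rather than the definitive way to reach the file system:

```python
from hdfs import InsecureClient  # pip install hdfs (WebHDFS client)

# Placeholder NameNode address and user -- substitute your own cluster details.
client = InsecureClient("http://namenode.example.com:50070", user="analyst")

# List a directory on the distributed file system.
for name in client.list("/data/events"):
    print(name)

# Read one file back as text.
with client.read("/data/events/2013-04-01.log", encoding="utf-8") as reader:
    contents = reader.read()
print(contents[:200])
```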

Hardware needs for quantifying big data and deploying analytics

Whether it's storing vast amounts of big data or running advanced analytics against those big data sets — it is imperative to have the right hardware solution in place to handle some serious computing requirements. There are two hardware platforms, specifically, which are engineered for big data environments with massive storage capacity and enough processor performance available to crunch this data. The PSSC Labs® CLOUDSEEK 1000xS has a hardware platform which provides some of the most powerful capabilities around analyzing both structured and unstructured data. Compatible with HBase, MongoDB, Cassandra and Hadoop, the CLOUDSEEK server provides a lot of power in a 1U form factor. By holding up to 10 SSD/SATA III/SAS disks, administrators are able to utilize large amounts of dense power to help quantify massive amounts of data.

From a big data storage and efficiency perspective, one of the top performing server platforms is the CLOUDOOP 12000 system. Currently, it is the only platform supporting two Xeon® E5 series processors, 128 GB ECC memory and over 48 TB of storage space in just 1U of rack space. Furthermore, PSSC Labs® certifies the system compatible with Hadoop, Cloudera, MapR & HortonWorks. With space for twelve 3.5" enterprise hard drives, this system not only controls and processes large amounts of data — it does so in a highly efficient manner. Currently, this platform is capable of working with less than 6 watts of power per terabyte of storage space used. This type of efficiency not only helps control big data — it also helps control data center server utilization costs.
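Taken at face value, the watts-per-terabyte figure quoted above puts a rough upper bound on the storage-related power draw of such a node; a quick check using only the numbers in this section looks like this:

```python
# Quick check using the figures quoted above for a fully populated node.
storage_tb = 48
watts_per_tb = 6   # "less than 6 watts of power per terabyte" per the text

max_storage_power_w = storage_tb * watts_per_tb
print(f"Upper bound on storage-related power draw: {max_storage_power_w} W")  # 288 W
```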

Big data challenges

In working with big data, it's important to understand that these platforms are new and can have their limitations. For example, HDFS cannot be mounted directly by an existing operating system. In this scenario, administrators would have to use a virtual file system (FUSE, for example) to get information out of HDFS. Remember, there are plenty of options out there.
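Once HDFS is exposed through a FUSE mount, ordinary file tools and scripts can treat it like any local path. The snippet below assumes a hypothetical mount point of /mnt/hdfs; the actual mount location and directory layout will vary by deployment:

```python
from pathlib import Path

# Hypothetical FUSE mount point for HDFS -- adjust to wherever it is mounted.
hdfs_mount = Path("/mnt/hdfs/data/events")

# Standard POSIX-style access works once the FUSE layer is in place.
for log_file in sorted(hdfs_mount.glob("*.log")):
    size_mb = log_file.stat().st_size / (1024 * 1024)
    print(f"{log_file.name}: {size_mb:.1f} MB")
```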

Even now, there are numerous other big data file systems being developed to help organizations control their large data sets. For example:

• Google File System (GFS)
• Amazon S3 File System
• CloudStore (formerly known as Kosmos Distributed FS)
• MapR M3 and M5

There are also some challenges around big data standardization and open-source licensing. Since so many models are built on open-source platforms — creating a standard has been difficult. Furthermore, many IT administrators are still only learning the language of big data. Nevertheless, by deploying a solid underlying hardware infrastructure, your organization will take the first — very much required — steps in designing a scalable and robust big data and analytics platform. There are new projects in the business world as many database and IT managers work diligently to control their large data sets. Fortunately, there are options out there for many organizations. Big data platforms are truly making a difference for those organizations that have to face the big data reality. Hadoop, as mentioned above, has become the unofficial standard behind big data management engines. Still, these platforms are very new and require new types of intelligent hardware systems to deploy and manage. There needs to be that underlying horsepower which is capable of analyzing and quantifying data at high rates. Data processing aside — more organizations are also working on data center efficiency in parallel with their big data initiatives. In working with technologies like the PSSC Labs® CLOUDOOP 12000 and CLOUDSEEK 1000xS — administrators are able to not only deploy super-fast big data hardware; they can do so very efficiently. This, in turn, not only helps your organization meet big data analysis requirements — it also helps design a data center built around efficiency.

Section 3 – Creating a Cloud-Ready Data Center

The modern IT landscape has shifted towards the data center infrastructure. At the heart of the cloud, big data, and — in reality — any organization is the data center environment. More businesses are reliant on the functions of the core data center platform to help them function on a day-to-day basis. With more workloads, a lot more virtualization and the addition of cloud computing, more demands are being placed on the data center infrastructure. These new needs don't only revolve around greater amounts of computing power. There is a direct need to deploy high-density computing systems in a very efficient manner.

Designing a data center ready for cloud and virtualization

Cloud computing and virtualization both play a key role in the modern data center. Now, organizations are using both cloud and virtualization technologies to help improve user density and deploy scalable environments. This is why designing a cloud-ready data center is so important. The idea is to meet the needs of your organization both today and in the future. Very few vendors can deliver a true all-in-one solution where cloud computing and virtualization are made easy.

For example, the Status Cloud from PSSC Labs® strives to accomplish exactly that. Built as a custom-configured, high performance computing solution — the platform is designed specifically to meet the computing needs of the data center as well as next generation web applications. This type of powerful private cloud data center model contains:

• 300 Intel Xeon 2.5 GHz E5 Cloud Series CPU Cores
• 800 GB Total High Performance ECC System Memory
  • 2.67 GB Memory Per Processor Core
• CUDA Enabled GPU Options Available
• 20+ TB of Usable Storage Space
  • 20 TB Configured for RAID Data Protection
  • 250 GB Scratch / Data Space Per Node
• Gigabit High Performance Network Backplane
  • Management & Monitoring Network Included
  • 10GigE & InfiniBand Options Available
• All Necessary Rack, Power & Network Infrastructure
• CBeST® Cluster Management Toolkit
  • Easy Management, Monitoring & Upgrades
  • CPR Recovery Feature Included
• Rack & Roll Integration for True Plug & Play Operation
• Three Year Complete Warranty Coverage

These private cloud data center designs are built with high scalability and growth in mind. By deploying an intelligent and highly agile private cloud environment — organizations are able to go from ground to cloud very quickly.



“Data center to cloud” in one day

Building a private cloud isn't always an easy task. In fact, there is a lot of research that must take place, both from a hardware resources perspective and in understanding the types of workloads to be deployed. Whether it's staff-related or just a lack of experience, your organization does not have to watch the private cloud industry pass you by. In fact, there are partners and vendors who are capable of not only delivering a fully functional private cloud solution — they can help you control, maintain and manage it as well. In working with PSSC Labs® and their data center to cloud initiative, organizations can leverage a true turnkey private cloud solution.

Custom server design, manufacturing and deployment.
  • Ready for private cloud and big data processing.
  • Designs with processor, memory, storage, networking and operating systems included.
  • Creating an environment with TCO in mind.

Server integration and testing.
  • BIOS setting standardization.
  • MAC address reporting for easy PXE boot and image deployment.
  • Network configuration prior to shipping.
  • Custom operating system and software image installation.
  • Custom scripting.
  • Power measurement reporting at idle and full load.

Rack integration and installation.
  • Servers.
  • Network switches.
  • Power connectors (UPS and PDU).
  • Server, network cable, and power cable labeling.

Delivery logistics.
  • Planned global delivery.
  • Onsite worldwide installation.

Deployment, support and monitoring.
  • Global support capabilities.
  • Daily monitoring via built-in IPMI (see the monitoring sketch below).
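As one illustration of what day-to-day IPMI-based monitoring can look like, the sketch below shells out to the common `ipmitool` utility to pull sensor readings from each node. The host list, credentials and the choice of `ipmitool` itself are assumptions for the example, not part of any vendor's tooling:

```python
import subprocess

# Hypothetical node list and credentials -- replace with your own BMC details.
NODES = ["10.0.0.11", "10.0.0.12", "10.0.0.13"]
IPMI_USER = "admin"
IPMI_PASSWORD = "changeme"

def read_sensors(host: str) -> str:
    """Return the raw sensor table from a node's BMC over IPMI-over-LAN."""
    result = subprocess.run(
        ["ipmitool", "-I", "lanplus", "-H", host,
         "-U", IPMI_USER, "-P", IPMI_PASSWORD, "sensor"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    for node in NODES:
        print(f"=== {node} ===")
        print(read_sensors(node))
```

Run daily from a management host, a script like this can feed temperature, fan and power readings into whatever alerting system the environment already uses.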

The power of this cloud model is that it can be custom built, provisioned, delivered and installed with minimal effort. Furthermore, this type of private cloud can grow with the needs of the organization since the technology is directly designed around scale. Remember, you don't have to approach the private cloud infrastructure model alone — there are options which can take you from bare metal to private cloud very quickly, and very easily.

The focus around the data center will only continue to grow. Many IT environments see that a public cloud model just isn't right for them. In those cases, they look to deploy private cloud environments which are capable of supporting the needs of the organization. The power of the private cloud will directly revolve around the efficiencies and good practices built around a solid data center deployment. The reality is simple — almost all organizations have, in some way, been looking at or deploying a cloud model. More often, however, these are private cloud platforms controlling remote users, cloud workloads and branch offices.


In creating the perfect private cloud, administrators must work with technologies that can scale with the demands of their organization. This is where working with PSSC Labs® and their "data center to cloud" model can push your organization from bare metal to cloud computing. The simplification of the actual manufacturing, testing, integration, and management solution allows organizations to design their cloud, have it delivered, and watch it go live.

Remember, these designs are built around direct scalability and efficiency. Furthermore, your data center can be built to support a robust private cloud while still lowering the total cost of ownership. In creating a private cloud model built around smart technologies, PSSC Labs® factored power efficiency, high reliability and an easy-to-service design into the TCO measurement. As a result, each server goes through rigorous testing and produces the industry's lowest failure rate. Recent comparisons have proven that PSSC Labs® servers consume nearly 40% less power than other manufacturers' offerings. This means data center efficiency, less overhead costs, and greater power usage effectiveness (PUE).
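For reference, power usage effectiveness (PUE) is the ratio of total facility power to the power delivered to IT equipment, with values closer to 1.0 indicating a more efficient facility. The short example below uses made-up facility numbers purely to show the calculation:

```python
# Power usage effectiveness: total facility power divided by IT equipment power.
# The figures below are illustrative, not measurements from any real facility.
it_equipment_kw = 400      # servers, storage, network gear
facility_total_kw = 620    # IT load plus cooling, power distribution, lighting

pue = facility_total_kw / it_equipment_kw
print(f"PUE = {pue:.2f}")  # 1.55; values closer to 1.0 are better
```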

Conclusion

The future holds even more advancements around cloud computing and data center design. More users are going to flood the Internet and more devices will be requesting connectivity. This means that even more data is going to be created in the near future for data center environments to house and analyze. There is a new type of infrastructure being built which can facilitate rapid data delivery and high levels of scalability. The only way to design these types of platforms is to use technologies which are capable of such agility. High-density computing is perfect for the private cloud, while similar technologies can go on to further help by analyzing big data components. Already, there is a lot of information available within the cloud. This data is very valuable and can help your organization better understand the user and your business environment. To better quantify this valuable data — there needs to be an efficient platform in place that can quickly crunch these numbers. In working with highly scalable computing systems, your data center can continue to grow while meeting the direct needs of your business. With more users, more devices, and much more cloud — your organization will need to design an IT environment which is easier to manage. Furthermore, in working with highly efficient technologies, your data center can be ready for the private cloud, big data and, of course, cloud computing.
