
WHITE PAPER

The Efficiencies of Private Cloud for Heterogeneous

Database Enterprise Workloads

Sponsored by: EMC

Henry D. Morris

Carl W. Olofson

Robert P. Mahowald

February 2015

IDC OPINION

Most large enterprises today have datacenters that contain large numbers of database applications, some transactional and some analytical, each with its own database, and each database with its own dedicated storage. Typically, the databases are managed by a variety of brands of vendor software, which may include Oracle, IBM DB2, and Microsoft SQL Server. In many cases, the server and storage hardware is dedicated and optimized to the database in question. As the number of databases grows, so do the number and variety of database servers and the array of dedicated storage assets they control. The result is an environment that is complex and increasingly costly to administer in today's world of rapid business expansion and change.

One response to this problem is to move to a converged/integrated IT infrastructure of servers, storage, and networks delivered as a private cloud architecture. The goal is to manage physical resources across many databases and applications while boosting asset utilization and administrative agility. Doing this well, however, requires a strategic approach to managing database infrastructure, one that unravels some of the complexity involved in managing many databases. Hence, enterprises should consider the following:

 Seek to manage the servers and storage across multiple workloads, applying operational policies for maintenance and upgrades across the DBMS brands, thereby simplifying operational tasks and procedures.

 Adopt an approach to storage management that uses facilities that can manage stored data and address required storage system maintenance, backup, and disaster recovery across DBMS brands rather than preserving the complexity by using the tools and utilities of each DBMS vendor for each DBMS brand.

 Identify a key vendor that can provide support for such a strategic approach, one that can offer reliable management of each of the DBMS brands in question.


Database, system, and storage administrators are dealing with increasingly complex environments involving a variety of DBMSs and other technologies that require care and feeding. As enterprises move to the cloud, they are faced with a choice of segregating workloads into specialized systems (vertical integration) or building a converged/integrated private cloud environment that enables resources to be shared by a variety of RDBMSs and other software while managing these resources together using common management facilities (horizontal integration). In this white paper, we consider the purpose of adopting a cloud architecture from a database management perspective and the benefits of a horizontal approach to the integration of infrastructure with respect to delivering the resources required by the database for its operation, backup, and recovery.

TOWARD A PRIVATE CLOUD ARCHITECTURE SUPPORTING

HETEROGENEOUS DATABASES

According to IDC's 2014 Cloud Maturity Survey (sponsored by EMC) of enterprise IT organizations using or considering using cloud computing, 45% of production workloads are expected to be deployed in private clouds by 2015 (see Figure 1). The key objective of these cloud deployments is to enable a realignment of IT capabilities with business priorities, "crossing the chasm" in the parlance of maturity modeling. Making the right decision for your organization requires an assessment of its needs in terms of the underlying infrastructure and the range of applications and databases that support the business.

FIGURE 1

Plans to Deploy Production Workload into Public and Private Clouds

Q. What percentage of your organization's production workloads do you expect will be deployed in the following environments by 2015?

Base = enterprise IT organizations using or considering using cloud computing

[Pie chart: private cloud 45.3%; public cloud 23.0%; noncloud 31.7%]

Source: IDC and EMC's Cloud Maturity Survey, September 2014


Fundamentals of Private Clouds

Private clouds (also called on-premise private clouds or enterprise private clouds) represent an evolution in IT infrastructure and in application and database deployment. A self-run private cloud is a cloud service that an enterprise owns and operates itself, within its datacenters. Customers build highly virtualized services with elastic pools of storage, networking capacity, and analytical horsepower, but they must also take a further step by detaching application stacks from any specific infrastructure environment to support improved workload mobility and liquidity. Private cloud represents a desire for enterprise IT teams to operate more like a megascale public cloud provider, with a focus on achieving the following benefits:

 Lower operational risk through the faster building of services that users need, ensuring higher utilization

 Better scalability than conventional deployment — by quickly creating customized server instances with exact memory, storage, and CPU resources

 Faster access to new application and database resources

Building and operating self-run private clouds are not trivial tasks, however. Simply virtualizing and automating the provisioning of a predefined workload does not make a datacenter a private cloud. At a minimum, there must be a self-service environment for automated provisioning across infrastructure (IaaS), platforms (PaaS), and applications (SaaS). Private clouds require granular consumption metering, the ability to provide a chargeback by business unit and often by user, and a predefined service catalog that ties into the end-user directory to provide policy-based access to services. Building a sustainable cloud that can address current and future needs without major time-consuming alterations or upgrades also requires adoption of standardized infrastructure (compute, storage, and network). Private clouds linked through infrastructure software that delivers automation, orchestration, and monitoring provide enterprises with a foundation that can deliver elastic scaling and pooling, functioning like a utility for database and application workloads.
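To make the metering and chargeback requirement concrete, the following is a minimal sketch, in Python, of how a per-business-unit chargeback might be computed from metered usage records. The record fields, unit rates, and business-unit names are illustrative assumptions, not a reference to any particular cloud management product.

from collections import defaultdict

# Illustrative unit rates; a real private cloud would pull these from its
# service catalog rather than hard-coding them.
RATES = {
    "vcpu_hours": 0.03,        # per vCPU-hour
    "ram_gb_hours": 0.004,     # per GB-hour of memory
    "storage_gb_days": 0.002,  # per GB-day of provisioned storage
}

# Hypothetical usage records as a metering service might emit them.
usage_records = [
    {"business_unit": "finance", "vcpu_hours": 720, "ram_gb_hours": 2880, "storage_gb_days": 1500},
    {"business_unit": "sales", "vcpu_hours": 240, "ram_gb_hours": 960, "storage_gb_days": 400},
    {"business_unit": "finance", "vcpu_hours": 96, "ram_gb_hours": 384, "storage_gb_days": 250},
]

def chargeback(records):
    """Aggregate metered usage into a per-business-unit charge."""
    totals = defaultdict(float)
    for rec in records:
        for metric, rate in RATES.items():
            totals[rec["business_unit"]] += rec.get(metric, 0) * rate
    return dict(totals)

for unit, charge in sorted(chargeback(usage_records).items()):
    print(f"{unit}: {charge:,.2f}")

The same aggregation, fed by real metering data and catalog rates, is what lets a private cloud show each business unit what its database and application workloads actually consume.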

Private Cloud Is the Foundation for Boosting Strategic Workloads

In the 2014 Cloud Maturity Survey, IDC probed further to determine the relative priority of moving various types of workloads to either private or public cloud. Private cloud was the most popular option across a wide range of strategic production workloads (see Figure 2).


FIGURE 2

Workload Deployment Plans for 2015

Q. For the following types of workloads, indicate your organization's plans for where the majority of these workloads will be deployed by 2015.

n = 564

Base = enterprise IT organizations using or considering using cloud computing

[Chart: percentage of respondents planning private cloud versus public cloud deployment for each workload type: mobile; logistics, inventory, and supply chain; revenue-generating cloud services; eCommerce; collaboration/end-user productivity; app dev/test; disaster recovery; ERP (finance, HR, and CRM/SFA); on-demand compute/storage; VDI; database/big data/analytics; and content serving/archiving]

Source: IDC and EMC's Cloud Maturity Survey, September 2014

The number one factor that accelerates an organization's use of cloud is a rapid return on investment, minimizing the complexity and cost of selecting the underlying infrastructure and essentially reducing technology decisions to the financial benefits they deliver to the enterprise. But financial benefits include more than just cost, because companies consider cloud an increasingly business-critical technology that is quickly changing the way companies and institutions evaluate, procure, and deploy IT assets.


IT departments are looking to operate in an environment that is focused on service delivery and more predictable expenditures. In the process, enterprise IT will transition to a shared, dynamic, automated, elastic IT environment that is delivered through a private, public, or hybrid cloud. The mature cloud-centric IT organization will achieve improved application performance, greater staff productivity, more operational cost efficiency, and increased opportunities for business innovation. An area where this is especially important is database management.

Storage and Database Management in the Cloud

Today's typical enterprise application topology is one of multiple silos of databases and production applications — Microsoft Exchange and Microsoft SharePoint have distinct data architectures, as do Oracle 12c, SQL Server, SAP HANA, and file stores. For each of these workloads, there are multiple versions of the software spanning the past 8–10 years. Based on past architectural choices and deployment practices, IT organizations are typically managing multiple silos of storage as part of each application stack. Each storage silo is designed to serve IT operational needs (performance, provisioning, backup, data availability, and protection), but because of the variety of applications, and the storage choices made over the years, it's very difficult for customers to get economies of scale across these vertical applications and data stores.

Horizontal Architectures

One solution gaining traction is a standardized, horizontally built infrastructure with a heterogeneous data store supporting multiple applications. This approach is based on the convergence (integration) of network, server, storage, and management that enables automated infrastructure provisioning and management across all applications. This approach is predicated upon virtualized applications — virtualization allows the data infrastructure to share resources across applications, a significant plus, but it also affords more cost-effective use of infrastructure because hardware deployed using a converged/integrated architecture can deliver better resource density.

This approach also yields greater agility because IT organizations are taking advantage of the ability to offload some resource-intensive functions from the applications onto the underlying data and storage infrastructure. Finally, it may provide for better relationships and faster communications between IT teams because the applications group can better retain control over scheduling and management of resources and doesn't slow down the infrastructure team with repetitive requests and so forth. The road to cloud will be different for all IT organizations, but a common approach we see is for business users to begin with a variety of non-mission-critical SaaS applications to plug a hole in their capabilities, or to simply move faster, without making a new "project" for IT. This period is quickly followed by IT testing projects — application development and testing, short-term storage, Web infrastructure, and so forth, sometimes just for "short, bursty" projects. At this point, a "chasm" of sorts is crossed, and most IT organizations begin to build a plan for rationalizing their portfolios, processes, and architectures — and this starts with private cloud.


For upper-midsize and larger organizations that begin with private cloud and have the required skills and physical assets, cloud principles — which may include an application virtualization layer using any hypervisor, metering and chargeback, monitoring and provisioning, policy-based service catalogs, scripting and automation, and user self-service — are applied.

These private clouds will also eventually be hybrid clouds — so IT architects want to design them with built-in extensibility to the public cloud. With an eye to breaking the siloed architecture that dominated what IDC calls the "2nd Platform" of client/server applications, these hybrid clouds start with a foundation of storage, which increasingly is flash based or a tiered array including flash. Building the infrastructure in the private cloud to support key run-the-business applications — from ISVs such as Oracle, SAP, and Microsoft — provides a path for organizations toward eventual expansion into public cloud that can be done in an effective, policy-based way.

Database Administrator Experience

Today, as enterprises think about how to build out their private cloud architecture, they will look at performance, reduced asset cost and maintenance, and manageability, but they will also focus on improving the database administrator (DBA) experience with self-service tools. One of the most time-consuming DBA tasks is managing the storage tier, and there are many storage-centric options for monitoring, provisioning, and replicating databases. Today, application owners running Microsoft or SAP workloads can manage protection policies for backup, set up recovery policies in case of failure, and tune and adjust performance levels at the hypervisor layer, managing those applications contained in that virtual machine (VM) instance. This is a good but limited approach, because DBAs tied to one application may not be fluent in the storage management and protection policies of other key applications and also because it becomes very complex to automate even routine policy-driven protection in a standardized way across all important virtualized application stacks.

Managing applications at the hypervisor level requires storage and virtualization vendors like EMC to create fluid, API-driven interconnects with the applications' corresponding management console. Managing complex storage configuration choices from within the familiar UI management console gives DBAs choices that correspond directly to their other management tasks and allows them to manage alerts, build reports, and monitor the health of the storage ecosystem, alongside the other application management tasks they do every day. This integration also gives DBAs more independence by allowing them to request, provision, and manage storage for their applications as needed, without having to schedule these resources with the storage administration team. This level of management integration is a big step forward for DBAs and is just starting to become mainstream.

Tomorrow's vision goes even further, exposing access and control of the storage tier through popular system management tools, applications, and databases from market leaders such as Oracle, Microsoft, and SAP and allowing DBAs to address these tasks with a consolidated view of storage, abstracting most of the complexity of this environment. This consistent way of viewing, monitoring, and changing key storage and backup choices in a uniform way, across all the key run-the-business applications in large organizations, is the next frontier in hybrid cloud.
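As one illustration of what such an API-driven interconnect might look like from the DBA's side, the sketch below (Python) requests a new volume with a named protection policy through a hypothetical REST endpoint. The URL, payload fields, and policy names are assumptions for illustration only and do not represent EMC's actual interfaces.

import requests

# Hypothetical management endpoint exposed by the storage/cloud layer.
STORAGE_API = "https://cloud-mgmt.example.internal/api/v1/volumes"

def request_volume(db_name, size_gb, tier="all-flash",
                   protection_policy="daily-snapshot-30d"):
    """Ask the (hypothetical) storage management service for a new volume.

    The DBA states capacity, performance tier, and a named protection policy;
    placement, replication, and scheduling stay with the infrastructure layer.
    """
    payload = {
        "name": f"{db_name}-data",
        "size_gb": size_gb,
        "tier": tier,
        "protection_policy": protection_policy,
    }
    response = requests.post(STORAGE_API, json=payload, timeout=30)
    response.raise_for_status()
    return response.json()  # e.g., volume ID and mount/export details

# Example (illustrative): provision 500 GB of all-flash storage for a SQL Server database.
# volume = request_volume("sqlsrv-erp", 500)

The point of the sketch is the division of labor: the DBA expresses intent in application terms, and the shared infrastructure applies the corresponding storage, protection, and performance policies.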


Stage 1 for most organizations is to build a test and development regimen, and as organizations move to put production applications in the cloud, all-flash arrays like EMC's XtremIO address the resulting need for low latency and high IOPS, with a focus on serving the DBA. DBAs across multiple environments can get predictable performance with automated tuning, high availability of up to five nines (99.999%), active-active continuous operations, and the simplicity of requiring no capacity planning because all-flash storage supports many different databases and file systems. Also, to simplify a DBA's task, organizations building out private clouds will increasingly opt for all-flash arrays and look at storage as an integrated part of the application stack, not as a separate tier to manage independently.

ADMINISTRATOR CHALLENGES AND PRIORITIES

Administrators at large enterprises are interested in simplifying day-to-day database, system, and storage administration to free up resources for mission-critical activities such as application optimization or assisting in the development of new applications. But the fact is they spend most of their time managing a wide variety of tasks to ensure that applications operate at required performance and availability levels and also providing the needed integrations between enterprise databases, other data sources, and applications within the enterprise or running in the public cloud.

Because the databases are driven by software from different vendors, and each vendor's package has its own requirements, all database management, including operations, backup, and recovery, is done differently for each DBMS vendor product. On top of this, users have, based on DBMS vendor recommendations, tended to segregate storage resources for these vendor DBMS products and manage them separately as well.

While it is the case that some database operations require vendor-supplied utilities, a number of them, especially those related to database storage, can be generalized. Standardizing on a single storage system, and then using the operational facilities for that system that support heterogeneous database management operations, can greatly simplify datacenter operations where databases are concerned. One result is that users enjoy more flexibility in assigning resources to databases, because those resources are managed in common, and can add databases as needed without greatly increasing the operational complexity of the datacenter.
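As a sketch of what managing storage operations in common can look like, the snippet below (Python) applies a single snapshot schedule to databases from several vendors instead of invoking each vendor's own backup utility. The database-to-volume mapping and the snapshot_volume() helper are hypothetical stand-ins for whatever facility the chosen storage platform actually provides.

import datetime

# Hypothetical mapping of databases (of any DBMS brand) to the storage
# volumes they occupy; only the storage layer needs this knowledge.
DATABASE_VOLUMES = {
    "oracle_finance": ["vol-ora-data-01", "vol-ora-redo-01"],
    "db2_logistics": ["vol-db2-data-01"],
    "sqlserver_crm": ["vol-mssql-data-01", "vol-mssql-log-01"],
}

def snapshot_volume(volume_id, label):
    """Placeholder for the storage platform's snapshot call."""
    print(f"snapshot {volume_id} as {label}")

def run_common_backup_policy():
    """Apply one snapshot policy to every database, regardless of vendor."""
    stamp = datetime.datetime.now(datetime.timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    for db, volumes in DATABASE_VOLUMES.items():
        for volume in volumes:
            snapshot_volume(volume, f"{db}-{stamp}")

run_common_backup_policy()

Adding a new database, whatever its brand, then amounts to adding an entry to the mapping rather than learning and scripting another vendor's backup tooling.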

PRIVATE CLOUD AND DBA BENEFITS

The horizontal integration approach to system resource management in a private cloud involves assigning physical resources for databases from common pools of servers and storage rather than using bespoke technology that serves just one vendor's DBMS. This approach enables an IT organization to still optimize for specific database applications but manage in common the underlying resources across different database vendors or even database versions. This delivers IT agility, high resource utilization, and simplified management through the ability to scale performance, protection, and availability policies across the private cloud to any and all supported databases. To an enterprise …


At the same time, IT architects recognize that standardized infrastructure components provide a way for all database assets to be integrated within a common and standard infrastructure design, which makes ongoing operations and maintenance/update of all these configurations more efficient. Even where IT organizations keep infrastructure vertically integrated for particular workloads while remaining open to multiple database types, they should also seek solutions that can be standardized at the infrastructure level across networking, servers, and storage.

ADDRESSING HETEROGENEITY

For enterprises that use multiple vendors' DBMSs in their datacenter, moving to the cloud raises the question: Should we standardize on one DBMS or remain heterogeneous? Each DBMS vendor would like to see its customers standardize on its product and argues that this is the most effective way to set up a private cloud. There are three important reasons this may not be a good idea for many enterprises:

 Database conversion and data migration are costly and error prone and involve considerable risk. Data must be converted, applications must be changed, and, in some cases, stored procedures must be rewritten.

 Some packaged application vendors will not support such a conversion in any event, forcing the enterprise to either stay with the present configuration or change applications, which invites another set of risks and problems.

 Moving to one vendor for database management, including control of all supporting operations, is seen as creating an unhealthy dependency on a single vendor and can limit future technical choices as new data management problems emerge.

The choice to remain heterogeneous presents its own questions, the most important of which is, "Should we segregate our resources (including servers and storage) by DBMS brand or establish a cross-brand resource management policy based on a common set of infrastructure?" The former means that resource management tasks will be specialized for each DBMS brand, whereas resource management in the latter case can be standardized to a single set of management facilities and tasks. Also, the latter course offers the opportunity to coordinate operations so that, for instance, one can ensure that in a system recovery scenario, the databases from all ISVs are synchronized.
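The coordination benefit can be sketched as follows: briefly quiesce every database, take one storage-level consistency-group snapshot that spans all of them, then resume, so that a later recovery restores every brand to the same point in time. The quiesce/resume hooks and the consistency-group call below (Python) are hypothetical placeholders for the vendor-specific and storage-platform facilities a real implementation would use.

from contextlib import contextmanager

DATABASES = ["oracle_finance", "db2_logistics", "sqlserver_crm"]

def quiesce(db):
    """Placeholder: put one database into a brief, backup-consistent state."""
    print(f"quiesce {db}")

def resume(db):
    """Placeholder: release the database back to normal operation."""
    print(f"resume {db}")

def snapshot_consistency_group(group_name):
    """Placeholder: one storage-level snapshot spanning every volume in the group."""
    print(f"snapshot consistency group {group_name}")

@contextmanager
def all_quiesced(databases):
    """Quiesce every database and guarantee resume even if the snapshot fails."""
    quiesced = []
    try:
        for db in databases:
            quiesce(db)
            quiesced.append(db)
        yield
    finally:
        for db in reversed(quiesced):
            resume(db)

def coordinated_recovery_point():
    with all_quiesced(DATABASES):
        snapshot_consistency_group("all-dbms-brands")

coordinated_recovery_point()

Because the snapshot covers all of the databases' volumes at once, a system recovery from it brings every ISV's database back to the same moment, which is difficult to achieve when each brand is backed up with its own tools on its own schedule.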

EMC CONVERGED AND CLOUD INFRASTRUCTURE

For the purposes of this study, IDC focuses on converged/integrated systems and supporting cloud infrastructure software delivered by EMC. EMC's solutions include the following technology components:

 Servers: Physical servers provide "host" machines for multiple virtual machines or "guests."

 Virtualization: Virtualization technologies for servers, storage, and networks abstract physical …

 Storage systems: SAN, network-attached storage (NAS), and unified systems provide storage for primary block and file data, data archiving, backup, and business continuity. Advanced storage software components are utilized for data replication, data movement, and high availability.

 Network: Switches interconnect physical servers and storage. Routers provide LAN and WAN connectivity. Additional network components provide firewall protection and traffic load balancing.

 Cloud management/orchestration: This includes cloud infrastructure management orchestration, configuration management, performance monitoring, resource management, and usage metering for compute, storage, and network resources.

 Security: Components such as encryption, authentication, and retention ensure information security and data integrity, fulfill compliance and confidentiality needs, manage risk, and provide governance.

For EMC, Vblock Systems, which integrate server, network, storage, and virtualization technologies from Cisco, EMC, and VMware, are the foundation upon which many of these private cloud systems are based.

IT organizations that choose to standardize on a solution like Vblock can more quickly deploy shared pools of cloud infrastructure resources, which can be intelligently provisioned and managed in support of a wide range of critical databases and applications (see Figure 3).

FIGURE 3

Vblock Systems for Private Cloud

[Diagram: converged network, server, and storage infrastructure with a single point of infrastructure management supporting heterogeneous databases and applications (Exchange, SharePoint, SAP, Oracle 12c, SQL Server, SAP HANA); key points: standardize IT for mixed environments, flexibility to support mixed databases and applications, and support for physical, virtual, and mixed environments]
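A minimal sketch, in Python, of the kind of policy-based provisioning from a shared pool that this standardization enables; the catalog entries, pool sizes, and capacity rule are illustrative assumptions rather than a description of Vblock's actual tooling.

# Hypothetical service catalog of standard database instance sizes.
CATALOG = {
    "db-small": {"vcpus": 4, "ram_gb": 32, "storage_gb": 500},
    "db-medium": {"vcpus": 8, "ram_gb": 64, "storage_gb": 2000},
    "db-large": {"vcpus": 16, "ram_gb": 128, "storage_gb": 8000},
}

# Shared pool of converged infrastructure capacity.
POOL = {"vcpus": 256, "ram_gb": 2048, "storage_gb": 100_000}

def provision(item, requested_for):
    """Carve a catalog item out of the shared pool if capacity allows."""
    spec = CATALOG[item]
    if any(POOL[resource] < amount for resource, amount in spec.items()):
        raise RuntimeError(f"insufficient pool capacity for {item}")
    for resource, amount in spec.items():
        POOL[resource] -= amount
    print(f"provisioned {item} for {requested_for}; remaining pool: {POOL}")

provision("db-medium", requested_for="sqlserver_crm")
provision("db-large", requested_for="oracle_finance")

Whether the workload is SQL Server, Oracle, or SAP HANA, each request draws from the same pools and is checked against the same policy, which is what keeps utilization high and management uniform.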


OPPORTUNITIES AND CHALLENGES

There is a broad opportunity to support heterogeneous database environments — providing the management and operations constructs and methods to meet the new challenges facing database administrators and datacenter managers. This is a widely held strategy in enterprises today and has the potential to be adopted more broadly in the future. New acquisitions by an enterprise, new line-of-business application requirements, or the emergence of new data management technologies may lead to a heterogeneous environment down the road. The ability to leverage a shared infrastructure across multiple database systems, in response to usage requirements, is beneficial to enterprises that are looking to keep their options open.

But heterogeneity brings challenges as well to managers of database environments. Take database compression as an example. Database suppliers are providing support for data compression and decompression within the database. At the same time, there are infrastructure solutions in the marketplace that support data compression and decompression in the storage layer. Is more compression (at multiple layers) beneficial, and how are these processes to be coordinated? The providers of infrastructure in support of heterogeneous database environments must work through such integration issues with each of the supported databases to maintain the value proposition of flexibility and integration.
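One way to see why the layers need to coordinate: data that has already been compressed inside the database usually gains little or nothing from a second compression pass in the storage layer, so duplicated effort mostly burns CPU. A quick, self-contained illustration in Python, with zlib standing in for whichever algorithms the database and storage layers actually use:

import zlib

# Representative row data: repetitive text that compresses well.
raw = (b"order_id,customer_id,status,amount\n"
       + b"1001,42,SHIPPED,199.99\n" * 50_000)

db_compressed = zlib.compress(raw)       # "database-layer" compression
twice = zlib.compress(db_compressed)     # "storage-layer" pass on top

print(f"raw: {len(raw):,} bytes")
print(f"compressed once: {len(db_compressed):,} bytes")
print(f"compressed twice: {len(twice):,} bytes (little or no further gain)")

This is the kind of integration issue the preceding paragraph refers to: the two layers need to agree on which one owns compression for a given workload.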

CONCLUSION/SUMMARY

The world of IT is in the midst of a massive structural shift from the PC and client/server application-based 2nd Platform that dominated the past 25 years to what IDC calls the 3rd Platform that extends the use of information based on a foundation of mobile computing, social media, big data and analytics, and cloud technologies. Collecting, manipulating, sharing, and exploiting data via 3rd Platform–based applications will dominate IT investments in the next decade.

The IT organization that can take control of what is, in many cases, a confused and chaotic situation involving multiple types and classes of servers and storage dedicated to various transactional and analytic applications will be in the best position to capitalize on this shift. For many organizations, the path forward involves deploying a converged/integrated private cloud architecture that delivers greater agility and higher rates of utilization through techniques such as virtualization and elastic scalability of IT resources.

Some organizations will adopt specialized converged/integrated systems that are optimized for a specific database middleware or analytic software solution. For any organization that wants to deliver agility and efficiency across multiple software environments while also establishing a coherent management strategy across platforms, the best choice will be a converged/integrated system like EMC's Vblock that can address the requirements of a specific software package while also enabling …


About IDC

International Data Corporation (IDC) is the premier global provider of market intelligence, advisory services, and events for the information technology, telecommunications and consumer technology markets. IDC helps IT professionals, business executives, and the investment community make fact-based decisions on technology purchases and business strategy. More than 1,100 IDC analysts provide global, regional, and local expertise on technology and industry opportunities and trends in over 110 countries worldwide. For 50 years, IDC has provided strategic insights to help our clients achieve their key business objectives. IDC is a subsidiary of IDG, the world's leading technology media, research, and events company.

Global Headquarters

5 Speen Street

Framingham, MA 01701 USA

508.872.8200 Twitter: @IDC

idc-insights-community.com www.idc.com

Copyright Notice

External Publication of IDC Information and Data — Any IDC information that is to be used in advertising, press releases, or promotional materials requires prior written approval from the appropriate IDC Vice President or Country Manager. A draft of the proposed document should accompany any such request. IDC reserves the right to deny approval of external usage for any reason.
