

Understanding holistic database security


SQL injection attacks are increasing: their number jumped by more than two thirds, from 277,770 in Q1 2012 to 469,983 in Q2 2012.¹ Ponemon reports that SQL injection accounts for 28% of all breaches (see Figure 1).

SQL injection attacks have been around for a long time but, according to the 2012 IBM Security Systems X-Force Report, they are still the most common type of attack on the Internet. News headlines about the increasing frequency of stolen information and identity theft have focused awareness on data security breaches and their consequences. In response, regulations have been enacted around the world. Although their specifics differ, failure to ensure compliance can result in significant financial penalties, loss of customer loyalty and even criminal prosecution.

In addition to the growing number of compliance mandates, organizations are under pressure to embrace the new era of computing, which brings with it new security challenges and a complex security landscape. Hackers are becoming more skilled; they are building sophisticated networks and are, in some cases, state sponsored. The rise of social media, cloud computing, mobility and big data is making threats harder to identify, and unscrupulous insiders are finding more ways to pass protected information to outsiders with less chance of detection.

Organizations need to adopt a more proactive and systematic approach to securing sensitive data and addressing compliance requirements amid the digital information explosion, an approach that spans complex, geographically dispersed systems. A paradox exists: organizations are able to process more information than at any other point in history, yet they are unable to understand what data exists and how to protect it from both internal and external attacks.
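As an illustration of the attack class (not an example from the paper), the sketch below contrasts a vulnerable, string-concatenated query with a parameterized one, using Python's built-in sqlite3 module as a stand-in for a production DBMS; the table and credentials are invented:

```python
import sqlite3

# In-memory database standing in for a production DBMS (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_unsafe(name, password):
    # Vulnerable: attacker-controlled input is concatenated into the SQL text.
    query = "SELECT * FROM users WHERE name = '%s' AND password = '%s'" % (name, password)
    return conn.execute(query).fetchall()

def login_safe(name, password):
    # Parameterized: input is bound as data and never parsed as SQL.
    query = "SELECT * FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchall()

payload = "' OR '1'='1"  # classic tautology injection
print(len(login_unsafe("alice", payload)))  # 1 row: authentication bypassed
print(len(login_safe("alice", payload)))    # 0 rows: injection neutralized
```

The tautology turns the WHERE clause into a condition that is true for every row, which is why monitoring tools treat such patterns as a telltale sign of injection.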

Sensitive data is found in commercial database systems, such as Oracle, Microsoft SQL Server, IBM DB2 and Sybase, in warehouses like Teradata and Netezza, and also in Hadoop-based systems. This paper discusses the eight essential best practices that provide a holistic approach to safeguarding data sources and achieving compliance with key regulations, such as SOX, PCI DSS, GLBA and data protection laws.

Safeguard databases and achieve compliance


Most of the world’s sensitive data is stored in commercial database systems, such as Oracle, Microsoft SQL Server, IBM DB2 and Sybase, increasingly making databases a favorite target for criminals. This may explain why SQL injection attacks are on the rise.

“It’s really not surprising that servers seem to have a lock on first place when it comes to the types of assets impacted by data breaches. They store and process data, and that fact isn’t lost on data thieves.”

—Verizon Data Breach Investigations Report, 2012

Figure 1: Analysis of malicious or criminal attacks experienced, according to the 2011 Cost of Data Breach Study (Ponemon Institute, March 2012)


Discovery works by examining data values across multiple sources to determine the complex rules and transformations that may hide sensitive content. Automating the discovery process ensures greater data accuracy and reliability than manual analysis.

Mature discovery solutions also find malware placed in databases as a result of SQL injection attacks. In addition to exposing confidential information, SQL injection vulnerabilities allow attackers to embed other attacks inside the database that can then be used against visitors to the website.
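A pattern-based discovery scan of the kind described can be sketched as follows. The regular expressions, table and column names are invented for illustration; real tools add checksums (such as the Luhn test for card numbers) and dictionary matching:

```python
import re
import sqlite3

# Illustrative classifiers only; mature discovery tools ship far richer rules.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "card_number": re.compile(r"\b\d{4}[ -]?\d{4}[ -]?\d{4}[ -]?\d{4}\b"),
}

def classify_column(conn, table, column, sample_size=100):
    """Sample a column's values and report which sensitive patterns they match."""
    # Table/column names come from the database catalog here, never user input.
    rows = conn.execute(
        f"SELECT {column} FROM {table} LIMIT {sample_size}"
    ).fetchall()
    hits = set()
    for (value,) in rows:
        for label, pattern in PATTERNS.items():
            if value is not None and pattern.search(str(value)):
                hits.add(label)
    return hits

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (note TEXT)")
conn.execute("INSERT INTO customers VALUES ('contact: jane@example.com')")
conn.execute("INSERT INTO customers VALUES ('SSN 123-45-6789 on file')")
print(sorted(classify_column(conn, "customers", "note")))  # ['email', 'us_ssn']
```

Running such a classifier over every column of every discovered instance is what turns raw discovery into a usable map of sensitive assets.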

Until recently, most attention was focused on securing network perimeters and client systems (firewalls, IDS/IPS, antivirus and so on). In the new era of computing, however, information security professionals are being tasked with ensuring that corporate databases, data warehouses, file repositories and Hadoop-based environments are secure from breaches and unauthorized changes.

This paper discusses the eight essential best practices that provide a holistic approach to safeguarding databases as well as the array of enterprise data sources across the organization while achieving compliance with key regulations such as SOX, PCI DSS, GLBA, HIPAA and data protection laws.

Step 1. Discovery

Ignorance is not an excuse. Organizations are held accountable for protecting all data, even data that isn’t obvious or that is revealed only by understanding the relationships between databases. Start with a good map of sensitive assets: both the database instances themselves and the sensitive data inside them (see Figure 2).

“Start with discovery, classification, and building policies and implementing data security controls.”

— Why Enterprise Database Security Strategy Has Become Critical, Forrester Research, Inc.


Step 2. Vulnerability and configuration assessment

Organizations need to assess the configuration of databases, warehouses and Hadoop-based systems to ensure they don’t have security holes (see Figure 3). This includes verifying both the way the database is installed on the operating system (for example, checking file privileges for database configuration files and executables) and configuration options within the database itself (such as how many failed logins result in a locked account, or which privileges have been assigned to critical tables). Organizations also need to verify that they are not running database versions with known vulnerabilities.

Don’t build your own checklists. Benchmarks such as the DISA STIGs, the CIS benchmarks and the CVE list provide tests for common vulnerabilities, including missing patches, weak passwords, misconfigured privileges and default accounts, as well as vulnerabilities unique to each DBMS platform. Traditional network vulnerability scanners were not designed for this task: they have no embedded knowledge of database structures and expected behavior, nor can they issue SQL queries (via credentialed access to the database) to reveal configuration information. Automating the scanning of your database infrastructure for vulnerabilities provides an ongoing evaluation of your security posture, combining historical and real-time analysis.

Step 3. Hardening

The result of a vulnerability assessment is often a set of specific recommendations to eliminate as many security risks as possible. Implementing these recommendations, such as setting a baseline for system configuration settings and locking down user access to data, is the first step in hardening the database, warehouse or Hadoop-based system. Other elements of hardening involve removing functions and options that are not in use. IT security professionals must be vigilant in establishing security policies, access controls and data usage policies to ensure that both the security and the business requirements of the organization are met.
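The assessment-against-a-baseline idea can be sketched as a set of checks over stored configuration values. The setting names, thresholds and the toy config table below are invented for illustration; real checklists such as the CIS benchmarks define hundreds of checks per DBMS platform:

```python
import sqlite3

# Hypothetical hardening baseline (names and limits are invented).
BASELINE = {
    "failed_login_lockout": lambda v: int(v) <= 5,
    "password_min_length": lambda v: int(v) >= 12,
    "remote_admin_enabled": lambda v: v == "off",
}

def assess(conn):
    """Compare stored configuration settings against the baseline; return failures."""
    failures = []
    for name, check in BASELINE.items():
        row = conn.execute(
            "SELECT value FROM config WHERE name = ?", (name,)
        ).fetchone()
        if row is None or not check(row[0]):
            failures.append(name)
    return failures

# Toy 'config' table standing in for the DBMS's real settings catalog.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE config (name TEXT, value TEXT)")
conn.executemany("INSERT INTO config VALUES (?, ?)", [
    ("failed_login_lockout", "3"),
    ("password_min_length", "8"),    # too short: should fail the check
    ("remote_admin_enabled", "on"),  # should be off
])
print(assess(conn))  # ['password_min_length', 'remote_admin_enabled']
```

The list of failures is exactly the kind of specific recommendation set the assessment step hands to the hardening step.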


Step 4. Change auditing

After discovering data sources, classifying the sensitive data types and hardening configurations, organizations must continually track data sources to ensure they don’t deviate from the “gold” (secure) configuration. Change auditing tools that compare snapshots of the configurations, at both the operating system level and the database level, and immediately send an alert whenever a change is made improve the security of enterprise data sources and help maintain continuous compliance. Change auditing is required by many legal mandates, such as the Sarbanes–Oxley Act (SOX).

You must audit changes to both data and database executables. A database installation has numerous executables, and each one can be used by an attacker to compromise an environment. An attacker can replace an executable with a version that, in addition to doing its regular work, also stores user names and passwords in a readable file. Change auditing would detect changes to the executable file and to any other file created by the attacker.
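A minimal sketch of snapshot-and-compare change auditing, assuming file-level monitoring with cryptographic digests (the file names and contents here are placeholders, not real database binaries):

```python
import hashlib
import os
import tempfile
from pathlib import Path

def snapshot(paths):
    """Record a SHA-256 digest per file: the 'gold' baseline."""
    return {str(p): hashlib.sha256(Path(p).read_bytes()).hexdigest() for p in paths}

def diff(baseline, paths):
    """Return the files whose content no longer matches the baseline."""
    current = snapshot(paths)
    return [p for p, digest in current.items() if baseline.get(p) != digest]

# Temp file standing in for a database executable or configuration file.
workdir = tempfile.mkdtemp()
exe = os.path.join(workdir, "dbserver")
Path(exe).write_bytes(b"original binary")

gold = snapshot([exe])                     # take the baseline snapshot
Path(exe).write_bytes(b"trojaned binary")  # simulated tampering
print(diff(gold, [exe]))                   # the tampered file is flagged
```

A production tool would also cover files the attacker creates (by walking directories rather than a fixed list) and alert in real time instead of on demand.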

Step 5. Activity monitoring

Real-time monitoring of database, data warehouse or Hadoop-based system activity is key to limiting security risks. Activity monitoring across systems collects information from different sources for advanced analytics. Organizations can then create policies based on this security intelligence, such as alerting on, masking or even halting malicious activity. Purpose-built activity monitoring solutions offer a level of granular inspection of databases and repositories not found in any other tool.

Use activity monitoring for immediate detection of intrusions, misuse and unusual access patterns (characteristic of SQL injection attacks), unauthorized changes to financial data, elevation of account privileges, configuration changes executed via SQL commands, and other malicious events. In addition to sending alerts, mature activity monitoring solutions can help prevent a disaster by taking corrective actions in real time, such as transaction blocking, user quarantine or data masking on the fly (see Figure 4).
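A toy, rule-based slice of such monitoring might look like the following. The detection patterns are illustrative only; real solutions combine statement fingerprints, behavioral baselines and client context rather than relying on regular expressions alone:

```python
import re

# Illustrative detection rules (names and patterns are invented).
RULES = [
    ("sql_injection_pattern", re.compile(r"(?i)\bor\b\s+'?1'?\s*=\s*'?1'?")),
    ("privilege_escalation",  re.compile(r"(?i)\bgrant\b.*\b(dba|all privileges)\b")),
    ("schema_change",         re.compile(r"(?i)\b(alter|drop)\s+table\b")),
]

def monitor(statement, user):
    """Return (rule_name, user, statement) alerts for one observed statement."""
    return [(name, user, statement)
            for name, pattern in RULES if pattern.search(statement)]

alerts = []
alerts += monitor("SELECT * FROM accounts WHERE id = '' OR '1'='1'", "webapp")
alerts += monitor("GRANT ALL PRIVILEGES ON payroll TO intern", "dba7")
alerts += monitor("SELECT name FROM products WHERE id = 42", "webapp")
print([a[0] for a in alerts])  # ['sql_injection_pattern', 'privilege_escalation']
```

In a real deployment the second alert might trigger user quarantine rather than a log entry, which is the "corrective action" distinction the paper draws.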


Monitoring privileged users or implementing controls for separation of duties is a requirement for data governance regulations such as SOX and data privacy regulations such as PCI DSS. It is important to detect intrusions because attacks are frequently the result of gaining privileged user access (such as via credentials owned by your business applications). Activity monitoring also helps with vulnerability assessment because it goes beyond traditional static assessments to include dynamic assessments of “behavioral vulnerabilities,” such as multiple users sharing privileged credentials or an excessive number of failed database logins.

Comprehensive activity monitoring technologies also offer application-layer monitoring, allowing organizations to detect fraud conducted via multi-tier applications such as PeopleSoft, SAP and Oracle e-Business Suite, rather than via direct connections to the database.

Step 6. Auditing and compliance


Secure, non-repudiable audit trails must be generated and maintained for any data source activity that affects security posture or data integrity, or that involves viewing sensitive data. In addition to being a key compliance requirement, granular audit trails are important for forensic investigations.

Figure 5: Manage the entire compliance lifecycle

“Not all data and not all users are created equally. You must authenticate users, ensure full accountability per user, and manage privileges to limit access to data.”

Most organizations currently employ some form of manual auditing that relies on traditional native database logging capabilities. These approaches are often found lacking because of their complexity and the high operational cost of manual effort. Other disadvantages include high performance overhead, lack of separation of duties (DBAs can easily tamper with the contents of database logs, undermining non-repudiation) and the need to purchase and manage large amounts of storage capacity for massive volumes of unfiltered transaction information.

Fortunately, next-generation activity monitoring solutions provide granular, DBMS-independent auditing with minimal impact on performance, while reducing operational costs through automation, centralized cross-DBMS policies and audit repositories, filtering, and compression. Without the ability to quickly produce independent reports on access security, organizations face significant costs and possible audit failure even with the best security controls.

A data security solution should centralize reporting across databases, data warehouses, file shares and Hadoop-based systems, with customizable workflow automation to generate compliance reports on a scheduled basis. The ability to distribute compliance reports to oversight teams for electronic sign-off and escalation, and to store the results of remediation activities, promotes automation and reduces the cost of compliance (see Figure 5).
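One way to make an audit trail tamper-evident, sketched here as a general technique rather than a description of any particular product, is to chain entries with cryptographic hashes so that altering any record invalidates every subsequent link:

```python
import hashlib
import json

class AuditLog:
    """Append-only log in which each entry's hash covers the previous entry's
    hash, so altering any record breaks every later link in the chain."""

    def __init__(self):
        self.entries = []

    def append(self, record):
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = json.dumps(record, sort_keys=True)
        digest = hashlib.sha256((prev + payload).encode()).hexdigest()
        self.entries.append({"record": record, "prev": prev, "hash": digest})

    def verify(self):
        prev = "0" * 64
        for entry in self.entries:
            payload = json.dumps(entry["record"], sort_keys=True)
            expected = hashlib.sha256((prev + payload).encode()).hexdigest()
            if entry["prev"] != prev or entry["hash"] != expected:
                return False
            prev = entry["hash"]
        return True

log = AuditLog()
log.append({"user": "dba1", "action": "SELECT", "object": "payroll"})
log.append({"user": "dba1", "action": "UPDATE", "object": "payroll"})
print(log.verify())                       # True: chain intact
log.entries[0]["record"]["user"] = "app"  # retroactive tampering
print(log.verify())                       # False: tampering detected
```

This is why an audit repository kept outside the DBA's control supports non-repudiation: even a privileged user cannot rewrite history without breaking the chain.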

Step 7. Authentication, access control and entitlement management

Not all data and not all users are created equally. Regulatory mandates and security requirements oblige organizations to adopt strong, multifactor authentication methods to protect against unauthorized and unidentified access.

Organizations must authenticate users, ensure full accountability per user and manage privileges to limit access to data. These privileges should be enforced even for the most privileged database users. Periodic review of entitlement reports (also called user rights attestation reports) as part of a formal audit process leads to better enterprise data security. A holistic database security solution should include reporting capabilities that automatically aggregate user entitlement information across the entire heterogeneous database infrastructure and identify which users hold special privileges, what new rights have been granted and by whom, and what entitlements particular users have.
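A simplified entitlement-report aggregation might look like the following. The grants table is a stand-in for per-DBMS system catalogs (for example, Oracle's DBA_TAB_PRIVS or SQL Server's sys.database_permissions), and the user and privilege names are hypothetical:

```python
import sqlite3

# Toy 'grants' catalog standing in for real per-DBMS system views.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE grants (grantee TEXT, privilege TEXT, granted_by TEXT)")
conn.executemany("INSERT INTO grants VALUES (?, ?, ?)", [
    ("alice", "SELECT", "dba1"),
    ("alice", "DBA",    "dba1"),
    ("bob",   "SELECT", "dba2"),
])

def entitlement_report(conn, special=frozenset({"DBA", "SYSADMIN"})):
    """Aggregate privileges per user and flag holders of special privileges."""
    report = {}
    for grantee, privilege, granted_by in conn.execute("SELECT * FROM grants"):
        entry = report.setdefault(grantee, {"privileges": [], "special": False})
        entry["privileges"].append((privilege, granted_by))
        if privilege in special:
            entry["special"] = True
    return report

report = entitlement_report(conn)
print([u for u, e in report.items() if e["special"]])  # ['alice']
```

Generating this report on a schedule, across every instance, is what turns an ad hoc query into the formal attestation process the paper recommends.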

Step 8. Data transformation

A key component of an information governance strategy, data transformation helps organizations address the challenges of information volume and variety with solutions for data protection and privacy, regardless of the data’s type, location or usage. Use encryption, masking or redaction to render sensitive data unusable, so that an attacker cannot exploit data obtained from outside the data repository and an insider cannot inadvertently reveal sensitive data.

Data transformation techniques should protect data in transit, so that an attacker cannot eavesdrop at the networking layer (and gain access to the data when it is sent to the database client), as well as protect data at rest, so that an attacker cannot extract the data (even with access to the media files).
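Two of the transformations mentioned above, redaction and one-way tokenization, can be sketched as follows. The functions are illustrative assumptions; production deployments should use vetted cryptographic or format-preserving-encryption libraries rather than this sketch:

```python
import hashlib

def mask_card(number):
    """Redaction: keep only the last four digits, as on a printed receipt."""
    digits = [c for c in number if c.isdigit()]
    return "**** **** **** " + "".join(digits[-4:])

def pseudonymize(value, salt):
    """One-way tokenization: a salted hash that stays stable for joins and
    analytics but cannot be reversed to recover the original value."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:16]

print(mask_card("4111 1111 1111 1234"))  # **** **** **** 1234
token = pseudonymize("jane@example.com", "site-salt")
print(len(token))                        # 16: fixed-width, irreversible token
```

Masking suits display and test data, while tokenization suits analytics that must join on a value without ever seeing it; encryption (not shown) is the right tool when the original must be recoverable.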


InfoSphere Guardium is the most widely used solution for preventing information leaks from the data center and ensuring the integrity of enterprise data. It is deployed by more than 400 customers worldwide, including five of the top five global banks; four of the top six insurers; top government agencies; two of the top three retailers; 20 of the world’s top telcos; two of the world’s favorite beverage brands; the most recognized name in PCs; a top three auto maker; a top three aerospace company; and a leading supplier of business intelligence software. InfoSphere Guardium was the first solution to address the core data security gap by providing a scalable, cross-DBMS enterprise platform that both protects databases in real time and automates the entire compliance auditing process.

IBM InfoSphere provides an integrated platform for defining, integrating, protecting and managing trusted information across your systems. The InfoSphere Platform provides all the foundational building blocks of trusted information, including data integration, data warehousing, master data management and information governance, all integrated around a core of shared metadata and models. The portfolio is modular, allowing you to start anywhere and mix and match InfoSphere software building blocks with components from other vendors, or to deploy multiple building blocks together for increased acceleration and value. The InfoSphere Platform provides an enterprise-class foundation for information-intensive projects, delivering the performance, scalability, reliability and acceleration needed to simplify difficult challenges and deliver trusted information to your business faster.

This document is current as of the initial date of publication and may be changed by IBM at any time. Not all offerings are available in every country in which IBM operates.

