Magic Quadrant for Data Center Backup and Recovery Solutions

Published: 31 July 2017 ID: G00311191

Analyst(s): Dave Russell, Pushan Rinnen, Robert Rhame


Enterprise backup is among the most critical tasks for infrastructure and operations professionals. Gartner provides analysis and evaluation of the leading data center backup solution vendors that offer a range of traditional to innovative availability capabilities.


Strategic Planning Assumptions

By 2021, 50% of organizations will augment or replace their current backup application with another solution, compared to what they deployed at the beginning of 2017.

By 2022, 20% of storage systems will be self-protecting, obviating the need for backup applications, up from less than 5% today.

By 2020, 30% of large enterprises will leverage snapshots and backup for more than just operational recovery (e.g., disaster recovery, test/development, DevOps, etc.), up from less than 15% at the beginning of 2017.

By 2020, 30% of organizations will have replaced traditional backup applications with storage- or HCIS-native functions for the majority of backup workloads, up from 15% today.

By 2020, the percentage of enterprises using the cloud as a backup target will double, up from 10% at the beginning of 2017.

By 2021, over 50% of organizations will supplant backup with archiving for long-term data retention, up from 30% in 2017.

By 2019, despite increasing effectiveness of countermeasures, successful ransomware attacks will double in frequency year over year, up from 2 to 3 million in 2016.

Market Definition/Description

Gartner defines data center backup and recovery solutions as those solutions focused on providing backup capabilities for the upper-end midmarket and large enterprise environments. Gartner defines the upper-end midmarket as being 500 to 999 employees, and the large enterprise as being 1,000 employees or more. Protected data comprises data center workloads, such as file share, file system, operating system, hypervisor, database, email, content management, CRM, ERP and collaboration application data. Today, these workloads are largely on-premises; however, protecting SaaS applications (such as Salesforce and Microsoft Office 365 [O365]) and infrastructure as a service (IaaS) is becoming increasingly important, as are other, newer "born in the public, private or hybrid cloud" applications.

These backup and recovery solutions write data to tape, to conventional random-access media (such as hard-disk or solid-state drives) or to devices that emulate these backup targets (such as a virtual tape library [VTL]). Data services, such as data reduction (compression, deduplication or single instancing), array- and/or server-based snapshots, heterogeneous replication (from/to dissimilar devices) and near-continuous data protection (near-CDP), can also be offered. Exploiting converged data management (also referred to as "copy data management") has become important; here, backup data is leveraged for additional use cases, such as analytics, disaster recovery, test/dev, reporting and so on (see "Predicts 2017: Business Continuity Management and IT Service Continuity Management"). In particular, the concept of performing a live mount of the backup data prior to actually restoring it — making it usable nearly instantly, then using tools such as VMware's Storage vMotion to move the data from the backup store to primary storage — has become table stakes in the market.
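
To make the data reduction concept above concrete, here is a minimal, illustrative Python sketch of single instancing via fixed-size chunk deduplication. The chunk size, hash choice and in-memory store are simplifying assumptions for illustration only; production backup products typically use variable-size (content-defined) chunking, persistent indexes and collision-safe verification.

```python
import hashlib

def dedupe_store(stream: bytes, chunk_size: int = 4096):
    """Split a byte stream into fixed-size chunks and keep only one
    copy of each unique chunk (single instancing)."""
    store = {}   # chunk digest -> chunk bytes (each unique chunk stored once)
    recipe = []  # ordered digests needed to rebuild the original stream
    for i in range(0, len(stream), chunk_size):
        chunk = stream[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # duplicates are not stored again
        recipe.append(digest)
    return store, recipe

def restore(store, recipe) -> bytes:
    """Rebuild the original stream from the chunk store and recipe."""
    return b"".join(store[d] for d in recipe)

data = b"ABCD" * 4096  # highly redundant sample data (16KB)
store, recipe = dedupe_store(data)
assert restore(store, recipe) == data
ratio = len(data) / sum(len(c) for c in store.values())
print(f"deduplication ratio: {ratio:.0f}:1")  # prints "deduplication ratio: 4:1"
```

Backup workloads tend to deduplicate especially well because repeated full backups of the same servers contain mostly identical chunks, which is why deduplication is table stakes for backup targets.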

Additionally, integration with and exploitation of the cloud (particularly the public cloud), on-premises object storage or a colocation facility as a backup target is becoming more important for backup workloads, despite modest deployment to date.

As the backup and recovery market has hundreds of vendors, this report narrows the focus down to those that have a very strong presence worldwide in the upper-end midmarket and large enterprise environments. Solutions that are predominantly sold as a service (backup as a service [BaaS]) do not meet the market definition for inclusion. Software for a homogeneous environment, such as native tools from Microsoft or VMware that primarily cover their own platforms, is also excluded, as many midsize and large customers prefer a more heterogeneous, scalable backup product for their environments.

Provider solutions that primarily address backup and recovery of remote-office, small-enterprise, individual-system and/or endpoint-device data are outside the scope of this data-center-oriented research. Some providers also address these workloads in addition to the larger data center workloads described above; however, those are not the primary use cases for deploying these data center solutions.

This 2017 "Magic Quadrant for Data Center Backup and Recovery Solutions" is an update to the "Magic Quadrant for Data Center Backup and Recovery Software" that was last published in June 2016. The slightly renamed Magic Quadrant (change of wording from "Software" to "Solutions") and updated market criteria are driven by expanding packaging options (for example, physical and virtual appliances).

Note that Gartner analysts may need to update the market definition as they progress through the research process. You will be explicitly notified of the change(s), should they need to happen.

Magic Quadrant 

Figure 1. Magic Quadrant for Data Center Backup and Recovery Solutions


Source: Gartner (July 2017)

Vendor Strengths and Cautions


Actifio

Actifio first came to market as a data virtualization appliance with its pioneering copy data management capability. The concept of copy data collapsed the number of secondary copies of data (archive, backup, disaster recovery, test/dev and so on) by keeping a single golden image of the data. Today, the solution can still be deployed as an appliance, but also as software-only or as a virtual appliance, and often in the cloud. Actifio protects a range of workloads, but excels at large-scale virtual environments and large applications, especially Oracle instances. Actifio is praised for its ease of management via an intuitive administrative console and SLA-driven policy management that facilitates end-to-end application protection. In 1Q17, Actifio expanded on its copy data message, and now positions itself as an enterprise-data-as-a-service (EDaaS) solution that provides data availability and archive capabilities, as well as hybrid cloud mobility and faster application development (DevOps). In late 2017, Actifio will add a catalog for faster search of virtual machine data. Pricing is via a simplified front-end capacity-based model. Once deployed, Actifio receives positive feedback for its ability to easily protect large environments.

Strengths

  • Actifio successfully pioneered copy data management and delivers a single platform that integrates backup, disaster recovery and long-term data retention, but that can be used for broader data management, such as application development and facilitating multidirectional hybrid cloud deployments.
  • Actifio offers scalable, writeable instant recovery for applications, files and virtual machines (VMs), including physical, as well as virtual, workloads.
  • Customers and references praise Actifio's ease of use and customer support.

Cautions

  • Some customers have stated that initial deployments can require extra care and effort in order to properly configure the solution.
  • Actifio is mostly deployed in North America and specific countries in Europe, but has been gaining traction in more of Europe and especially in Japan.
  • Actifio is most often deployed in larger enterprises with large amounts of data, and is only recently beginning to examine smaller enterprise routes to market with its streamlined Actifio One product.



Arcserve

Arcserve offers two different backup products: Arcserve Unified Data Protection (UDP), which is available as software-only and as an integrated solution via the Arcserve UDP Appliance, and the legacy Arcserve Backup offering. The majority of the company's efforts are focused on the UDP solution, with capabilities from Arcserve Backup embedded into UDP via a common console and a single installer. This research primarily focuses on UDP. Now independent from CA Technologies for three years, Arcserve as a stand-alone company has been investing in research and development, marketing and sales expansion. These efforts have made Arcserve one of the few backup vendors in the industry that is experiencing market share growth. In 2Q17, Arcserve acquired two companies: Zetta for cloud backup and disaster recovery, and FastArchiver for email archiving. Zetta will extend Arcserve's backup-to-the-cloud, remote location and failover-to-the-cloud capabilities. Arcserve protects virtual and physical environments, offering robust cloud integration, virtual standby and recovery assurance capabilities. Arcserve offers a broad range of packaging and pricing options. Arcserve is focused on the medium-to-large midsize enterprise and the decentralized large enterprise, typically with 250 to 2,500 employees.

Strengths

  • Arcserve UDP offers comprehensive backup, continuous data protection, high availability and disaster recovery for physical, virtual and cloud environments.
  • References rate UDP high on ease of deployment.
  • Arcserve offers a variety of packaging, deployment and pricing options, allowing customers to choose just the function that they need at a favorable price.

Cautions

  • Gartner end-user inquiry and reference checks give low marks to Arcserve's technical support.
  • Arcserve is typically not deployed in large data centers where Oracle RAC is often required.
  • While Arcserve is well-deployed in Japan and parts of Europe, its North American deployments continue to lag.



Commvault

Commvault has the most expansive list of supported public cloud providers, hypervisors, big data platforms and databases, providing technological flexibility for future procurement decisions in other areas of the data center and beyond. Commvault is primarily offered as a software-only solution, but an appliance is available as an option. Beyond protecting newer and more exotic technologies, coverage for traditional physical and on-premises applications is end-to-end and complete. Building on its early lead, Commvault is adding orchestration to its machine conversion capabilities to enable migration of whole applications to the cloud. Commvault is primarily licensed by capacity for physical servers, by socket or VM bundles for hypervisors, and by VM bundles for portability from virtual environments to the cloud. Despite the 2016 pricing revision and popular solution set bundles, Commvault has not been able to completely shed its now-undeserved reputation as an expensive vendor. Customer support and satisfaction feedback remain very favorable. While Commvault does have a unified interface, its complexity is commensurate with its capability. Commvault has introduced simplified, administrator-persona-oriented UIs to improve ease of use.

Strengths

  • Commvault has comprehensive and leading public cloud support as well as protection of O365 and Salesforce.
  • IntelliSnap features the industry's broadest support for integrating with and exploiting storage hardware platform snapshots, directly supporting over two dozen of the top-selling storage arrays.
  • Customers and references report a very high likelihood to repurchase the solution.

Cautions

  • Implementing Commvault is relatively complex, and often requires professional services.
  • Administrators report that the product has an initial steep learning curve, which makes training a prerequisite.
  • Field sales will generally lead with VM bundles, which may not be the most cost-effective licensing model for very dense VM-per-socket environments, where socket-based licensing would be less expensive.


Dell EMC

Dell EMC offers data center backup software in multiple editions of Data Protection Suite (DP Suite). The suites are sales bundles of relevant backup products, including Avamar and/or NetWorker as the key backup engines, augmented with alternative backup methods and additional capabilities through other components such as DD Boost for Enterprise Applications, ProtectPoint, CloudBoost, Search and Data Protection Advisor (DPA). (Note that the Data Domain appliances are not evaluated in this research.) The newly launched all-in-one Integrated Data Protection Appliance (IDPA) combines Data Domain with Avamar and DD Boost for Enterprise Applications. Dell EMC now supports emerging big data workloads such as Hadoop and MongoDB. DP Suite has significantly improved support of public cloud storage for both long-term retention and cost-effective disaster recovery of backup data, as well as protecting cloud-native data. Avamar is well-known for its storage and WAN efficiency and strong file backup, while NetWorker has stronger support for heterogeneous environments with different physical servers and database applications. Key roadmap items before year-end 2017 include a centralized dashboard for all DP Suite components and a backup-as-a-service solution to manage Amazon cloud-native backups. However, heavy dependence on Data Domain and management complexity with multiple management consoles remain top customer concerns.

Strengths

  • Dell EMC offers aggressive discounts and often high attach rates of its backup solution to its primary storage sales.
  • The Integrated Data Protection Appliances are partially addressing the long-existing complexity of managing various components of DP Suite plus Data Domain.
  • Dell EMC is one of the few vendors offering the use of low-cost cloud object storage for backup-based DR without persistent usage of cloud compute.

Cautions

  • Dell EMC's data protection portfolio, given its high attach rates, often results in lock-in to Data Domain and other Dell EMC hardware platforms.
  • With the centralized DP Suite dashboard still in the works, DP Suite management and support continue to be fragmented and complex.
  • Some large database backup solutions such as ProtectPoint and DD Boost for Enterprise Applications may require evaluation by database administrators and application owners.



HPE

Hewlett Packard Enterprise (HPE) Data Protector is a software-only traditional data center backup product, typically deployed in conjunction with tape or deduplication backup target appliances such as HPE StoreOnce and Dell EMC Data Domain. Data Protector is a single, scalable solution supporting a range of host environments, with some gaps, and offering native integration with many enterprise core applications, especially SAP. In the fall of 2016, HPE announced the pending spinoff/merge of its software assets, including Data Protector, to Micro Focus, which will be completed in the fall of 2017. In the last year, HPE completed a much-overdue Data Protector graphical user interface (GUI) overhaul, improved back-end product communication, and added backup analytics enhancements to the Backup Navigator product for problem remediation and service-level assurance. Roadmap items focus on enhancements including self-service restore and the integration of VM Explorer into the core product, along with further improvements to Backup Navigator. HPE offers Data Protector as part of the Adaptive Backup and Recovery (ABR) Suite, licensed by suite level, or as a stand-alone product licensed by feature or capacity. Data Protector is rated lower than other products in this research due to its lack of innovation and execution compared to major competitors.

Strengths

  • HPE Data Protector has good heterogeneous OS, database and file system support.
  • Backup Navigator and Storage Optimizer provide analytics and visibility to the backup and storage environment.
  • HPE is a good value from a cost perspective if the use case is traditional on-premises enterprise backup requirements.

Cautions

  • The upcoming Micro Focus spin-merge may further weaken Data Protector's (and thus the ABR Suite's) strategic cross-divisional focus and relationship with HPE Storage, which now also resells Veeam.
  • HPE does not offer socket or VM bundle-based pricing, and references cite dissatisfaction with packaging and pricing.
  • Support for Microsoft Hyper-V requires agents in the guest virtual machines.



IBM

IBM's Spectrum Protect is offered as software from IBM, or as an appliance from third-party partners, and can be deployed to protect small to extremely large enterprises. Spectrum Protect supports physical and virtual environments, as well as a broad mix of applications, operating systems and backup target devices. Of note, support for DB2 and Domino; for AIX, iSeries and zLinux; and for physical tape, with robust media-tracking capability, stands out. IBM remains unique with its highly scalable two-tier architecture, along with its incremental-forever methodology, which results in potentially lower total cost of ownership due to reduced infrastructure requirements. In early 2017, IBM united its sales force into one holistic storage seller, resulting in a larger, vertically aligned direct sales group with a greater focus on new-customer acquisition and customer retention. In late 2017, Spectrum Protect's Operations Center will add a new security dashboard and broader historical reporting for trend analysis. Spectrum Protect is a component of IBM's software-defined storage (SDS) strategy, with IBM offering a Spectrum Storage Suite that allows for deployment of any of IBM's SDS products under a single license. While the Operations Center was released four years ago and has been continually enhanced, Spectrum Protect's ease of use and administrative console receive very mixed feedback.
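
The incremental-forever idea can be illustrated with a toy Python sketch (the function and the mtime-based change detection are hypothetical simplifications; real implementations such as Spectrum Protect track far richer metadata in a server-side catalog). After a one-time initial full, each run sends only new or changed files, so periodic full backups, and the infrastructure they require, are avoided.

```python
def incremental_forever(previous_catalog, current_files):
    """Return the files to send on this run (new or changed) and the
    files to mark inactive (deleted on the client).

    previous_catalog: {path: mtime} recorded by the prior backup run.
    current_files:    {path: mtime} as scanned on the client now.
    """
    changed = {p: m for p, m in current_files.items()
               if previous_catalog.get(p) != m}
    deleted = [p for p in previous_catalog if p not in current_files]
    return changed, deleted

catalog = {}  # empty catalog: the first run is effectively a full backup
run1, _ = incremental_forever(catalog, {"a.txt": 1, "b.txt": 1})
catalog.update(run1)

# Later run: a.txt was modified, b.txt was deleted, c.txt was created.
run2, gone = incremental_forever(catalog, {"a.txt": 2, "c.txt": 1})
print(sorted(run2), gone)  # only the delta is sent; b.txt is marked inactive
```

Because the server-side catalog is authoritative, restores assemble the latest version of every file from the accumulated increments, which is the source of the reduced infrastructure requirements noted above.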

Strengths

  • A single instance of Spectrum Protect can protect 4PB to 5PB of data.
  • Spectrum Protect's new cloud accelerator capability boosts performance up to many TB/hour in a single Spectrum Protect server instance when writing to cloud storage, enabling customers to leverage the cloud for large environments.
  • Customer support is rated highly by references.

Cautions

  • Gartner end-user inquiry and customer references give Spectrum Protect the lowest marks for ease of management of all of the products in this Magic Quadrant, even though IBM states that it has invested in improving the user experience in several areas in recent years.
  • Perceptions of being high-priced remain, mainly due to processor value unit (PVU) pricing, which was historically predominant in North America and is often used in the rest of the world, despite capacity-based alternatives.
  • Spectrum Protect's deduplication is not global across all Spectrum Protect server instances; rather, it is constrained to a single storage pool.



Rubrik

Rubrik offers an integrated backup appliance with a scale-out architecture that offers global deduplication and ease of expansion. Despite being a new entrant to the market, Rubrik has generated heightened market awareness and rapid adoption by upper-midsize to large enterprises, augmenting or completely replacing mainstream data center backup solutions. By using modern backup and recovery techniques and a distributed metadata and task scheduler, Rubrik's backup appliances remove dependency on external databases and storage controllers to manage backup and storage, drastically simplifying the backup software and hardware infrastructure. Its software instance can run in remote offices or in Amazon Web Services (AWS) and Microsoft Azure to back up cloud-native data, or serve as a replication target. Rubrik has also simplified its pricing practices by offering two main options: all-in-one appliance licensing (with separate support) or subscription-based licensing (including support) for its software instances. However, Rubrik's support matrix is quite limited today: it can back up VMware virtual machines, Windows and Linux physical servers, Microsoft SQL Server and network-attached storage (NAS). In August 2017, it is launching support for Hyper-V, Nutanix AHV and Oracle running on physical servers. Its scalability needs to be proven as customers gradually grow their backup footprint. Roadmap items for the remainder of 2017 include tape support via integration with QStar, and AWS-native snapshot management.

Strengths

  • Modern backup and recovery techniques, coupled with deployment, management and pricing simplicity, drive market adoption.
  • Rubrik is aggressively expanding the support matrix to cover key production environments in the data center and the public cloud.
  • Customers praise Rubrik's product reliability, pay-as-you-grow architecture and ease of use.

Cautions

  • Rubrik has a short track record and limited support for physical servers and enterprise applications.
  • Although the scale-out architecture allows for near-linear scalability, it takes time to validate scalability because most of Rubrik's customers start from a midsize installation and expand gradually.
  • Despite the "Cloud Data Management" product name, Rubrik's deployment is primarily on-premises, and its cloud data backup functions are either missing or too new to be proven.



Unitrends

Unitrends Enterprise Backup (UEB) offers a package of on-premises backup, backup to the cloud, cloud-to-cloud IaaS backup and disaster recovery as a service (DRaaS) via integrated appliances, virtual appliances or software-only offerings. Unitrends focuses on delivering a backup appliance for the generalist IT administrator in the small to midsize enterprise. Cloud options include the differentiated, purpose-built Unitrends Cloud or a broad choice of public cloud providers; public cloud is required if SaaS or IaaS backup will be used. UEB provides hypervisor host-level protection for VMware, Hyper-V and XenServer, and more traditional protection for physical servers, databases and NAS. The Unitrends roadmap includes global deduplication across appliances and support for KVM and Nutanix hypervisors; Salesforce and Google Suite support will be added in the next year. Standard licensing is perpetual, with term-based licensing available, and is calculated by server or socket, with cloud options on a subscription-based model. Despite Unitrends' delivery as an appliance, references placed UEB's ease-of-use and deployment scores in the middle of the pack, but commented favorably on backup speed. However, according to references, the scale of its solution rated lowest relative to other vendors in this Magic Quadrant.

Strengths

  • Unitrends has a mature portfolio of cloud backup and disaster recovery (DR) capabilities, and supports a broad range of cloud providers, such as AWS S3 (and Glacier), Microsoft Azure, Google Cloud Platform (and Google Nearline) and any OpenStack Swift-compatible provider.
  • To ensure recoverability, Recovery Assurance available with the Enterprise Plus version provides fully automated, application-level consistency sandbox testing of physical servers, Microsoft Hyper-V or VMware vSphere VMs. This capability is available for on-site, Unitrends Cloud and service providers.
  • Unitrends has a predictive analytics algorithm that can detect a ransomware infection.

Cautions

  • Due diligence must be used to properly scope for growth, performance and bandwidth requirements, as Gartner has received reports of customers needing additional hardware after the initial purchase to meet service-level objectives.
  • Some modules are not fully integrated into the main interface, and SaaS O365 backup requires purchase of an OEM product.
  • Unitrends lacks broad storage vendor integration for NAS and storage area network (SAN) snapshots, limiting its support only to a few EMC and NetApp models for select functions.



Veeam

Veeam is known for its agentless VM backup software for VMware and Hyper-V. It recently added agent-based physical server backup software for Linux and Windows, including Windows Server applications such as Active Directory, Exchange, Oracle, SQL Server and SharePoint. Veeam has become the fourth-largest backup vendor (based on 2016 revenue), and frequently shows up on vendor shortlists for backup software. The company has effective sales strategies, such as free-of-charge promotions for backup of physical servers and O365 Exchange Online to jump-start adoption. Over the years, it has added data center functions, such as tape support and storage snapshot integration, as well as integration with key deduplication backup target appliances to improve storage efficiency. Customers and references have favorable comments on ease of management. Roadmap items before the end of 2017 include converged management of virtual and physical server backup, and NAS backup. However, Veeam's pricing has become less competitive and less flexible than some key competitors'.

Strengths

  • Veeam offers rich functionality and many simple recovery options for the VMware and Hyper-V environment.
  • Veeam continues to stand out with numerous agentless VM recovery options and unique functions to overcome hypervisor-native limitations.
  • For the last several years, Veeam has been — and continues to be — one of the fastest-growing companies in the backup industry.

Cautions

  • Several customer references noted that pricing options and license management need to be improved, as Veeam's street pricing is often no longer competitive.
  • While daily management and recovery are straightforward, proper sizing and configuration at the deployment phase may need extra attention, as the change rate for VMs is often higher than many account for.
  • Veeam has only just begun officially supporting physical server environments and has yet to fully integrate or prove these capabilities.


Veritas Technologies

Veritas Technologies offers NetBackup for larger enterprises as software or as an integrated appliance, and offers Backup Exec for the midmarket. This research focuses on NetBackup. NetBackup is a proven, very scalable solution that can protect a diverse mix of physical, virtual and cloud workloads, with optional multitenant support. In the last year, NetBackup has added further Oracle integration, including rapid recovery, and Veritas has been integrating the product with its broader 360 Data Management solutions. The NetBackup integrated appliances provide a cost-effective, easy-to-deploy solution that is effective for organizations needing an additional or replacement media server. Before the end of 2017, Veritas plans to release broader support for next-generation storage, database and big data solutions. According to Gartner reference surveys and conference polling, Veritas' NetBackup is tied for the second-most-evaluated enterprise backup solution, and remains the market-share-leading offering. The transition back into a stand-alone company has again seen turnover in high-ranking development, sales and marketing leadership. New socket-based pricing for virtual machines has been well-received by the market. Daily administrative requirements for operating NetBackup have been rated favorably, which is not common for a solution with such broad protection and scalability capabilities.

Strengths

  • NetBackup and the NetBackup Appliances offer a highly scalable solution with a broad support matrix.
  • References rate NetBackup highest for database protection of all of the products in this Magic Quadrant.
  • Product quality and strength of technical support have been rated highly for the last two years.

Cautions

  • Compliance audits, first initiated by Symantec in 2015, remained an ongoing practice by the new Veritas in 2016 and 2017, causing some customers to seek replacement solutions.
  • Recently, new functionality has largely centered on integration with other Veritas data management solutions; however, NetBackup 8.1 (in beta at the time of publication) offers significant new capabilities.
  • Veritas field support, in terms of sales people and support engineers, has seen turnover since the company re-established its independence.


Vendors Added and Dropped

We review and adjust our inclusion criteria for Magic Quadrants as markets change. As a result of these adjustments, the mix of vendors in any Magic Quadrant may change over time. A vendor's appearance in a Magic Quadrant one year and not the next does not necessarily indicate that we have changed our opinion of that vendor. It may be a reflection of a change in the market and, therefore, changed evaluation criteria, or of a change of focus by that vendor.

Added

  • Rubrik was added to this Magic Quadrant. Over the last year, Rubrik has been focused on selling into the target market of this research, and it has met the inclusion criteria.
  • Note that the current vendor name of HPE is expected to change in late 3Q17 to early 4Q17 to Micro Focus as a result of HPE's completion of its spinoff/merge of many of its software assets (see "HPE's Spinoff/Merge of Its Software Businesses to Micro Focus May Create Significant Challenges for Users" ).

Dropped

  • No vendors were dropped from this Magic Quadrant this year.


Inclusion and Exclusion Criteria

To qualify for inclusion, vendors need to meet the following nine criteria at the time that initial research and survey work commences (January 2017), unless otherwise noted. The criteria for inclusion in the 2017 "Magic Quadrant for Data Center Backup and Recovery Solutions" are:


Vendor's qualifying backup and recovery solution(s) must focus on the upper-midsize enterprise to the large data center, possess the capability to capture data directly, and not rely solely on third-party and/or partner means of data capture/ingestion. In short, the vendor must own heterogeneous backup capabilities that meet all of the criteria below.

  • The qualifying solution must be focused on protecting data center workloads, such as file system, operating system, hypervisor, database, email, content management and customer relationship management (CRM) data.
  • The solution(s) must support files on Windows and Linux and multiple applications on Windows, in a physical and/or a virtual deployment supporting both VMware and Microsoft Hyper-V hypervisors prior to publication of this research.
  • The qualifying solution must be deployed as an on-premises solution at least one-third of the time, and must not predominantly require a cloud service.
  • The vendor must have a backup and recovery solution commercially available for at least one calendar year prior to publication of this research.
  • The vendor must actively market its branded backup and recovery products in at least two major geographic regions of North America, Europe or Asia/Pacific.
  • The vendor must have generated more than $45 million in 2016 (calendar or fiscal year) in total revenue for its data center backup and recovery solution(s), exclusive of specific endpoint or lower-midmarket backup and recovery offerings, with signed certification by the vendor to confirm this.
  • The vendor must be the originator of the required capabilities and meet all of the above requirements via intellectual property that they own, and not rely exclusively on third-party and/or resold solutions to meet these criteria.

Gartner will continue to cover emerging vendors, as well as vendors and products that do not yet meet the above inclusion criteria.

Based upon the criteria above, the following vendors and products are believed to have qualified for inclusion in this Magic Quadrant:

  • Actifio — Actifio Enterprise, Actifio Sky
  • Arcserve — Arcserve Unified Data Protection (UDP), Arcserve Unified Data Protection (UDP) Appliance Series
  • Commvault — Commvault software (formerly Simpana), Commvault A-Series appliances
  • Dell EMC — Avamar, Data Protection Suite (which includes Avamar, NetWorker and Data Protection Advisor), Integrated Data Protection Appliance, NetWorker
  • HPE — Data Protector, Adaptive Backup and Recovery Suite
  • IBM — Spectrum Protect (formerly Tivoli Storage Manager [TSM])
  • Rubrik — Cloud Data Management
  • Unitrends — Unitrends Enterprise Backup, Recovery Series appliances
  • Veeam — Veeam Availability Suite, Veeam Backup & Replication
  • Veritas Technologies — NetBackup, NetBackup Appliances, CloudPoint


Evaluation Criteria

Ability to Execute

Gartner analysts evaluate technology providers on the quality and efficacy of the processes, systems, methods or procedures that enable IT provider performance to be competitive, efficient and effective, and to positively impact revenue, retention and reputation. Ultimately, technology providers are judged on their ability and success in capitalizing on their vision:


  • Product or Service: This is the evaluation of how well a vendor does in building and effectively delivering the solution that the market wants and perceives as being worthy of new investments — ideally resulting in a three- to five-year strategy based on the vendor's portfolio (versus tactical or point product usage). The solution must be easily configured and managed so that the capability of the product is easily exploited. The product's completeness of overall capability, as well as the breadth and depth of the specific key features, will be considered. The overall scalability of a single instance of the solution will be taken into account. Also tracked is the level of customer interest and positive feedback.


  • Overall Viability assesses the organizational health of a vendor, taking into account its ability to execute on a strategy and significantly grow its business. This evaluation criterion was added to this update of the DLP Magic Quadrant as the market matures toward mainstream adoption.


  • Sales Execution/Pricing compares the strength of a vendor's sales, partnerships, sales channels, deployment plans, pricing models and industry support.


  • Market Responsiveness/Record reflects how vendors respond to customer feedback by assessing performance against previous product roadmaps, the content of future product roadmaps and the cultivation of strategic advantages.


  • Marketing Execution is new to this Magic Quadrant, and measures how vendors are marketing their solutions in order to grow their customer base in specific demographics.


  • Customer Experience is a combined rating of the materials provided to customers when they purchase the technology and, more significantly, what customers tell us about their experiences — good or bad — with each vendor.


  • Operations assesses the ability of the vendor to provide support across all aspects of the customer engagement domain, including support across data silos, different operating systems and content types.


Table 1.   Ability to Execute Evaluation Criteria

Source: Gartner (July 2017)


Completeness of Vision

The Gartner scoring model favors providers that demonstrate Completeness of Vision — in terms of strategy for the future — and the Ability to Execute on that vision. We continue to place stronger emphasis on technologies than on marketing and sales strategies.

Completeness of Vision is ranked according to a vendor's ability to show a commitment to enterprise DLP technology developments in anticipation of user wants and needs that turn out to be on target with the market. A clear understanding of the business needs of DLP customers — even those that do not fully recognize the needs themselves — is an essential component of that vision.

This means that vendors should focus on organizations' business- and regulation-driven needs to identify, locate and control the sensitive data stored on their networks and crossing their boundaries.

Our Completeness of Vision weightings are most influenced by four basic categories of capability: network performance, endpoint performance, data discovery performance and management consoles.

Weightings are subjective and contextual. Readers who conduct their own RFIs may choose to change the weightings to suit the needs of their businesses and industries:


  • Market Understanding is ranked through observation of the degree to which a vendor's products, roadmaps and missions anticipate leading-edge thinking about buyers' wants and needs. Included in this criterion is how buyers' wants and needs are assessed and brought to market in a production-ready offering.


  • Marketing Strategy assesses whether a vendor understands its differentiation from its competitors, and how well this fits in with how it thinks the market will evolve.


  • Sales Strategy examines the vendor's strategy for selling products, including its pricing structure and its partnerships in the DLP marketplace.


  • Offering (Product) Strategy assesses the differentiation of a vendor's products from its competitors, and how it plans to develop these products in the future.


  • Business Model assesses the overall go-to-market strategy of a vendor, its current product portfolio, past performance and future plans for expansion, and its overall business conditions. This evaluation criterion was newly added to this update of the enterprise DLP Magic Quadrant.


  • Vertical/Industry Strategy examines specific features, functionality and go-to-market strategy that focus on specific segments of the market or industry vertical, in order to gain competitive advantage and gain customers. This evaluation criterion was newly added to this update of the enterprise DLP Magic Quadrant.


  • Innovation looks at the innovative features that vendors have developed, to assess whether the vendors are thought leaders or simply following the pack, and the extent to which their products are able to combine with other relevant disruptive technologies.


  • Geographic Strategy is an assessment of the vendor's understanding of the needs and nuances of each region, and how the product is positioned to support those nuances.


Table 2.   Completeness of Vision Evaluation Criteria

Source: Gartner (July 2017)

Quadrant Descriptions


Leaders

Leaders have products that work well for Gartner clients in midsize and large deployments. They have demonstrated a good understanding of client needs and generally offer comprehensive capabilities in all three functional areas — network, discovery and endpoint. They have strong management interfaces, and tight integration with other products within their brands or through well-established partnerships and meaningful integrations. They offer aggressive roadmaps and usually deliver on them. Their DLP products are well-known to clients and are frequently found on RFP shortlists.



Challengers

Challengers have more competitive visibility and execution success in specific mature industry sectors than Niche Players. Challengers offer all the core features of enterprise DLP, but typically their vision, roadmaps and/or product delivery are narrower than those of Leaders. Challengers may have difficulty communicating or delivering on their vision in a competitive way outside their core industry sectors.



Visionaries

Visionaries make investments in broad functionality and platform support, but their competitive clout, visibility and market share don't reach the level of Leaders. Visionaries make planning choices that will meet future buyer demands, and they assume some risk in the bargain, because ROI timing may not be certain. Companies that pursue Visionary activities will not be fully credited if their actions are not generating noticeable competitive clout, and are not influencing other vendors.


Niche Players

A vendor is considered a Niche Player when its product is not widely visible in competition, and when it is judged to be relatively narrow or specialized in breadth of geographic reach, functions and platforms — or when the vendor's ability to communicate vision and features does not meet Gartner's prevailing view of competitive trends. Niche Players may, nevertheless, be stable, reliable and long-term vendors. Some Niche Players form close, long-term relationships with their buyers, in which customer feedback sets the primary agenda for new features and enhancements. This approach can generate a high degree of customer satisfaction, but also results in a narrower focus in the market (than would be expected of a Visionary).



This Magic Quadrant is a market snapshot that ranks vendors according to competitive buying criteria. Vendors in any sector of the Magic Quadrant, as well as those not ranked on the Magic Quadrant, may be appropriate for your organization's data security needs and budget. Every organization should consider DLP as part of its information security management program. DLP capabilities come from a variety of different types of products — both cloud-hosted and on-premises. The main theme remains that DLP is ultimately a well-defined data security process, bolstered by well-managed supporting technology.


Market Overview

Data loss prevention and the enterprise DLP market are currently experiencing a renaissance through a "second wave" of adoption. As noted in the "Hype Cycle for Data Security, 2015," data loss prevention has clearly moved to the right of the Trough of Disillusionment and is climbing toward the Plateau of Productivity. Several factors have driven this shift over the last two years.

First, look no further than the breach activity that has engulfed organizations in nearly every sector of the global economy. While DLP is not designed to stop data theft in every conceivable scenario (and was never intended to do so), DLP technology can provide a key element of data visibility when used in concert with other detect-and-respond technologies. Few data security controls delineate between motivated insiders and users who unknowingly exfiltrate sensitive data. This has created an environment where organizations are left scrambling for security tools that can provide any additional visibility and context to aid in the detection of and response to data security incidents.

The first wave of DLP adoption that drove the market into the Trough of Disillusionment focused on DLP as a data security "silver bullet." It was often billed as a way to identify and stop every case of accidental data loss and purposeful data theft. The market has since matured and evolved. Vendors and customers have become aware that enterprise DLP is a key piece of a broader and larger data life cycle process supported by technology, as opposed to DLP simply being another technology buying decision.


EMC's RSA Data Loss Prevention Suite End-of-Life Announcement


EMC's RSA Data Loss Prevention Suite has been a mainstay product in the DLP market since the acquisition of Tablus by RSA in 2007. In the beginning of 2015, EMC began notifying customers that it considered its DLP 9.6 release to be "feature complete," and would not be continuing development with any new or updated product releases. This covers the entire RSA DLP suite of products — DLP Datacenter, DLP Network and DLP Endpoint. EMC has discontinued forward investment in DLP, and those resources have been reallocated to products such as RSA Security Analytics, which EMC believes better addresses organizations' ability to detect and respond to data breaches.


The exit of RSA DLP has been a significant source of Gartner client inquiries related to DLP in 2015. We have been tracking these calls, and approximately 10% of all calls related to data loss prevention from April 2015 to December 2015 have been about the RSA DLP end-of-life announcement and plans by Gartner clients to migrate to a different enterprise DLP vendor.


According to the end of product support (EOPS) information from EMC, clients currently on RSA DLP 9.5 can extend support for one year (through November 2016). Gartner recommends that all current RSA DLP customers upgrade to version 9.6 as soon as possible; that release will reach EOPS status in December 2017, with the option to extend support for one additional year through December 2018. This should provide ample time for clients to carefully evaluate whether they should replace RSA DLP with a comparable enterprise DLP vendor, or look at an integrated DLP approach with multiple vendors and products, which might provide greater data visibility and use-case coverage.


RSA DLP also impacts other businesses. Cisco has integrated the RSA DLP engine into its Email Security Appliance (ESA) platform. This caution was noted in the most recent "Magic Quadrant for Secure Email Gateways" and warrants mention again.


At this point, there are multiple possibilities for the RSA DLP technology. It could simply reach end-of-life status and not be developed further. EMC might also choose to sell the technology to another vendor, although the chance of this has decreased now that end-of-life plans have been announced and customers are actively planning migrations away from RSA DLP. Nevertheless, the possibility remains that the technology will live on or be delivered to market in another way.


Enterprise DLP Versus Integrated DLP

Data loss prevention capabilities are integrated into a wide variety of security point products. IT leaders struggle to understand the depth and breadth of integrated DLP capabilities, their appropriate intended use cases, and when to implement these technologies and/or "best-of-breed" enterprise DLP products.

As noted above in the Market Definition, enterprise DLP and integrated DLP can each play a pivotal role in an overall data life cycle protection strategy for your organization. They are also not mutually exclusive. As an example, organizations may choose to enable DLP capabilities at the secure email gateway, secure Web gateway and cloud access security broker, and choose not to deploy network DLP. Some organizations may not have the ability to strictly control endpoint systems; therefore, other technologies must be employed to provide visibility into data movement and data usage. Organizations should not limit their view of valid DLP solutions to only enterprise DLP products. Integrated DLP will result in many distinct policies across separate security controls, and without a proper policy management strategy, it is doomed to failure.

Both enterprise DLP and integrated DLP must provide content-aware and context-aware capabilities to be reasonably effective. Please refer to "How to Choose Between Enterprise DLP and Integrated DLP Approaches" for further information.


Microsoft and Its Impact on the DLP Market

Microsoft made a significant push into multiple information security markets in 2015. It added native DLP capabilities throughout its Exchange, SharePoint and OneDrive for Business platforms, both on-premises and online. Natively, Microsoft has included some key security capabilities (see "How to Enhance the Security of Office 365") — and specifically, DLP capabilities (see "Data Loss Prevention in Microsoft Office 365").

Microsoft has also made two noteworthy acquisitions in 2015:

  • Adallom — A CASB that provides visibility, compliance, data security and threat prevention capabilities between end users and cloud applications/storage. Adallom was acquired by Microsoft in September 2015.


  • Secure Islands — A data classification and tagging solution that allows users to identify and tag sensitive content, and also allows for integrated DLP actions to be taken based upon the tag value (for example, add rights management to an Excel file tagged as "Financial Data"). Secure Islands was acquired by Microsoft in November 2015.


Many of the enterprise DLP and integrated DLP vendors have also stepped up with product enhancements (or strategic partnerships) that add DLP support for both Microsoft Office 365 and OneDrive for Business, as these are key applications for organizations moving from on-premises infrastructure and applications to infrastructure as a service (IaaS) and SaaS.


Data Classification and Tagging Complementary to DLP

Data classification and tagging has been identified as a capability for file analysis software, as noted in "Market Guide for File Analysis Software." Data classification and tagging, in a security context, typically include the capabilities to both apply a metadata tag or value to unstructured or semistructured content, and take some form of data action based on the tag value, including encrypt, apply digital rights management or block transmission.

Gartner clients note three main drivers for data classification and tagging projects:

  1. Data classification and tagging used in conjunction with an enterprise DLP product — Many of the enterprise DLP products covered in this Magic Quadrant have the ability to identify, create and set DLP policy rules based on the existence of metadata values or tags attached to a file. This is commonly used in enterprise DLP policy as an additional screening layer to reduce false positives on content inspection rules and to add further content controls. It is most commonly used to apply encryption or electronic digital rights management (EDRM) based on a metadata value or tag.
  2. Data classification employed as a stand-alone system for classifying content scanned in a data-at-rest project — These efforts center on data classification and data life cycle management, with goals to identify where sensitive data is stored on-premises in file shares or content management systems, as well as online in cloud storage platforms. These efforts are particularly useful to tag data that has not been accessed for a certain time period, or has a certain type of sensitive data or a number of sensitive records. This can be valuable in identifying data suitable for archiving or deletion, which is further along the data life cycle.
  3. Data classification used as a means to address the mobile data use case — An example is sensitive data moved to an unmanaged asset where a DLP endpoint is either not deployed (or cannot be deployed), such as a tablet or phone, yet there is still a need to retain security controls or tag values of a specific file or data type. Frequently, this is tied to deployment of EDRM, such as Microsoft Azure Rights Management Service (RMS).
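The tag-driven enforcement described in the first driver can be sketched as a simple lookup from metadata tags to protective actions. This is an illustrative sketch only — the tag names, action names and severity ordering are hypothetical and do not correspond to any vendor's product or API:

```python
# Illustrative sketch: mapping file metadata tags to DLP actions.
# All tag and action names here are hypothetical, not any vendor's API.

TAG_POLICY = {
    "Financial Data": "apply_edrm",   # e.g., add rights management to the file
    "PII": "encrypt",
    "Public": "allow",
}

# Ordering used to pick the most restrictive action when a file has several tags.
SEVERITY = {"allow": 0, "encrypt": 1, "apply_edrm": 2, "block_transmission": 3}

def dlp_action_for(tags, default="block_transmission"):
    """Return the most restrictive configured action for a file's tags.

    Unknown tags fall back to the default action, mirroring a
    fail-closed policy stance.
    """
    actions = [TAG_POLICY.get(tag, default) for tag in tags]
    return max(actions, key=lambda action: SEVERITY[action])

print(dlp_action_for(["Financial Data"]))   # apply_edrm
print(dlp_action_for(["Public", "PII"]))    # encrypt
print(dlp_action_for(["Unclassified"]))     # block_transmission (unknown tag)
```

The fail-closed default for unrecognized tags reflects the screening-layer role described above: the tag check narrows what content inspection must handle, rather than replacing it.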

Data classification products typically have two main classification capabilities:

  1. Automated classification based upon content libraries, rules, heuristics, Bayesian classifiers and other content string identification systems.
  2. User-driven classification that empowers users to apply initial content tags to unstructured content at either the time of creation or editing; that content tag can either persist through the life of that data, or can be edited or appended by other users.
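The two classification approaches above can be combined, as a minimal sketch. The regex patterns here stand in for the content libraries and Bayesian classifiers that real products use, and all names are hypothetical:

```python
import re

# Illustrative automated content rules; real products use far richer
# content libraries and statistical classifiers than these two patterns.
CONTENT_RULES = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "PII"),                 # SSN-like pattern
    (re.compile(r"\b\d{4}(?:[ -]\d{4}){3}\b"), "Financial Data"),  # card-like pattern
]

def auto_classify(text):
    """Return tags suggested by automated content inspection."""
    return {label for pattern, label in CONTENT_RULES if pattern.search(text)}

def classify(text, user_tags=frozenset()):
    """Union automated suggestions with user-applied tags.

    User-driven tags persist alongside whatever the engine detects,
    modeling the user-driven approach described above.
    """
    return auto_classify(text) | set(user_tags)

doc = "Customer SSN 123-45-6789 on file."
print(sorted(classify(doc)))                               # ['PII']
print(sorted(classify(doc, user_tags={"Confidential"})))   # ['Confidential', 'PII']
```

The sketch makes the trade-off concrete: the automated tags are only as good as `CONTENT_RULES`, while the user-supplied tag depends entirely on the data owner's judgment.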


There are pros and cons to either data classification approach. With automated classification, you start with a programmatically driven decision about data, and your tags will only be as intelligent or sophisticated as the classification policy and engine allow. User-driven classification allows data owners and content creators who are closest to the data to self-identify and establish data tags. The major drawbacks are human error and reliance on data owners to be savvy about the types of data they access, and to make judgment calls such as "confidential" versus "highly confidential." For further guidelines on data classification in the context of data security, please see "How to Overcome Pitfalls in Data Classification Initiatives."


DLP Managed Services

DLP managed services have dramatically increased in both popularity and adoption in the last two years. Thirteen percent of all Gartner client inquiries for data loss prevention in 2015 mentioned DLP as a managed services offering.

DLP managed services are different from MSSP offerings for other security technologies, such as firewalls or intrusion prevention systems, where the primary focus is log management. DLP managed services depend on a proper implementation, and on a thorough understanding of the policy and rules, to optimize event workflows and to distinguish serious data security issues from noisy events.

The vendors that are traditionally strong in the MSSP market do not always carry that strength over to managing DLP. DLP requires a considerably different approach to managing the event workflow — it is not a volume game, or even one aided considerably by event correlation. Understanding both asset criticality and data criticality, as informed by data owners, is a key element of a successful DLP managed services offering. DLP systems are not always something that you want to allow third-party access to — or hosting of, in some cases — especially for clients with strong data residency concerns.

Symantec and Digital Guardian are two vendors most frequently mentioned by clients looking for a managed services option. Symantec has a strong network of partners that manage all of the components of the Symantec DLP product suite. InteliSecure, infoLock Technologies, Novacoast and Wipro are commonly mentioned as implementation partners that can both implement Symantec DLP and provide DLP managed services. Digital Guardian has its own MSP approach that allows customers to choose flexible deployment options, and choose for either Digital Guardian to host the DLP management infrastructure, or for the customer to retain hosting on-premises in the organization. Forcepoint, Intel Security, Fidelis Cybersecurity and GTB Technologies also have partnerships for DLP managed services; however, those vendors appear less frequently among Gartner clients. Raytheon's acquisition of Foreground Security in October 2015 will likely improve Forcepoint DLP managed services, particularly for U.S. federal government clients.

Strongly consider DLP vendors that have management platform flexibility (cloud, hybrid or on-premises), as well as flexible deployment options for endpoint and network DLP. Also look for those DLP managed service providers with vendor-certified security analysts on staff, current client references and a proven track record of helping organizations improve the operational workflow of DLP implementations into actionable and usable information.


Intersections Between DCAP and DLP

Data-centric audit and protection (DCAP) is a category of products characterized by the ability to centrally manage data security policies and controls across unstructured, semistructured and structured repositories or silos. Based upon data security governance (DSG) principles, these products encompass the ability to classify and discover sensitive datasets and control access to the sensitive data by centrally managing and monitoring privileges and activity of users and administrators.

There is a critical need to establish organizationwide data security policies and controls based upon DSG. DSG allows an organization to achieve a balance between appropriate security and competitive advantage by classifying and prioritizing security and expenditure for particular sensitive datasets. Each dataset has its own protection, storage and controls that will vary as a function of time (for example, sales, intellectual property and personally identifiable information [PII] datasets have different lifetimes). By using the DSG process to engage key stakeholders (such as business, IT, governance, compliance/legal and risk), organizations can then approve whether each dataset merits investment in security controls that mitigate particular risks associated with compliance or data threats, or to protect intellectual property. There are many data security products available, each with different security control capabilities, but most focus on particular data silos (see "Market Guide for Data-Centric Audit and Protection"). The challenge facing organizations today is that data is pervasive and does not stay in a single on-premises silo — a problem compounded by the use of cloud SaaS or IaaS. Organizations must apply DSG before implementing any data security product or process.

DSG drives data classification and discovery to be core requirements for many data security products, including DCAP and DLP. DLP provides visibility into sensitive data in use on the endpoint, in motion over the network and at rest on file shares. DLP policies provide real-time protection of unstructured data being extracted from endpoints or via email. This is complementary to DCAP, which uses data classification and discovery to help with the real-time user activity monitoring of data at rest and in use within structured or unstructured silos, and the application of protective products. In cloud SaaS, data protection offerings are becoming a "melting pot" of separate DLP and DCAP policies, driving the need for data security policy orchestration between all of the silos of data security controls in an organization.


Data Loss Prevention for Cloud and Mobile Users

Data loss prevention is a necessary component of cloud and mobile computing. The way that DLP is currently implemented for cloud and mobile use cases is through the use of native data security controls within a CASB, enterprise DLP policy integration with a CASB, or application isolation or containers on managed mobile devices, such as tablets and phones.

Historically, DLP for mobile devices was accomplished through backhauling all managed device traffic through a VPN and network DLP appliance (usually limited to Web and email traffic). Cloud applications and cloud storage platforms break this model, as highlighted in "Overcome the Limitations of DLP for Mobile Devices."

DLP functionality for unmanaged devices is today best-suited to implementation via a CASB or cloud security service. Due to privacy and mobile device constraints, having a useful mobile agent on a device you do not own is simply not a reality for many users or organizations. In particular, there are not full-featured DLP agents for iPads, iPhones or the near infinite variations of Android devices that perform DLP capabilities. Most of these solutions are little more than trusted viewing agents or ways to access files shared through mobile content management (MCM). There likely never will be a DLP agent for mobile form factors, due to device limitations and resource-heavy requirements that full content-aware inspection would bring to a mobile device. For further details on mobile data protection, please refer to the "Magic Quadrant for Enterprise Mobility Management Suites."




Evidence

  • Vendor surveys and recorded product demos from all vendors represented in this Magic Quadrant
  • 300+ Gartner client inquiry calls centered on data loss prevention from March 2015 to January 2016
  • Customer reference surveys — delivered in an online survey to 48 customers, and live interviews with 10 customers of vendors represented in the Magic Quadrant
  • Gartner Secondary Research used to research company financials and market-size metrics


Evaluation Criteria Definitions

Ability to Execute


Product/Service: Core goods and services offered by the vendor for the defined market. This includes current product/service capabilities, quality, feature sets, skills and so on, whether offered natively or through OEM agreements/partnerships as defined in the market definition and detailed in the subcriteria.


Overall Viability: Viability includes an assessment of the overall organization's financial health, the financial and practical success of the business unit, and the likelihood that the individual business unit will continue investing in the product, will continue offering the product and will advance the state of the art within the organization's portfolio of products.


Sales Execution/Pricing: The vendor's capabilities in all presales activities and the structure that supports them. This includes deal management, pricing and negotiation, presales support, and the overall effectiveness of the sales channel.


Market Responsiveness/Record: Ability to respond, change direction, be flexible and achieve competitive success as opportunities develop, competitors act, customer needs evolve and market dynamics change. This criterion also considers the vendor's history of responsiveness.


Marketing Execution: The clarity, quality, creativity and efficacy of programs designed to deliver the organization's message to influence the market, promote the brand and business, increase awareness of the products, and establish a positive identification with the product/brand and organization in the minds of buyers. This "mind share" can be driven by a combination of publicity, promotional initiatives, thought leadership, word of mouth and sales activities.


Customer Experience: Relationships, products and services/programs that enable clients to be successful with the products evaluated. Specifically, this includes the ways customers receive technical support or account support. This can also include ancillary tools, customer support programs (and the quality thereof), availability of user groups, service-level agreements and so on.


Operations: The ability of the organization to meet its goals and commitments. Factors include the quality of the organizational structure, including skills, experiences, programs, systems and other vehicles that enable the organization to operate effectively and efficiently on an ongoing basis.


Completeness of Vision


Market Understanding: Ability of the vendor to understand buyers' wants and needs and to translate those into products and services. Vendors that show the highest degree of vision listen to and understand buyers' wants and needs, and can shape or enhance those with their added vision.


Marketing Strategy: A clear, differentiated set of messages consistently communicated throughout the organization and externalized through the website, advertising, customer programs and positioning statements.


Sales Strategy: The strategy for selling products that uses the appropriate network of direct and indirect sales, marketing, service, and communication affiliates that extend the scope and depth of market reach, skills, expertise, technologies, services and the customer base.


Offering (Product) Strategy: The vendor's approach to product development and delivery that emphasizes differentiation, functionality, methodology and feature sets as they map to current and future requirements.


Business Model: The soundness and logic of the vendor's underlying business proposition.


Vertical/Industry Strategy: The vendor's strategy to direct resources, skills and offerings to meet the specific needs of individual market segments, including vertical markets.


Innovation: Direct, related, complementary and synergistic layouts of resources, expertise or capital for investment, consolidation, defensive or pre-emptive purposes.


Geographic Strategy: The vendor's strategy to direct resources, skills and offerings to meet the specific needs of geographies outside the "home" or native geography, either directly or through partners, channels and subsidiaries as appropriate for that geography and market.