mountainss Cloud and Datacenter Management Blog

Microsoft System Center blog site about virtualization on-premises and in the cloud



Use Microsoft #Azure Backup for your Datacenter protection #HybridCloud #Sysctr


With Microsoft Azure Backup, you can protect application workloads such as Hyper-V VMs, Microsoft SQL Server, SharePoint Server, Microsoft Exchange, and Windows clients to:
– Disk (D2D), giving fast recovery times (low RTOs) for tier 1 workloads
– Azure (D2D2C), for long-term retention.
And you can manage the protection of the various protected entities (servers and clients) from a single on-premises user interface; a minimal configuration sketch for the cloud leg follows.
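
As a minimal sketch of how the Azure (D2D2C) leg can be configured, assuming the Azure Backup agent is already installed and registered with a backup vault (the folder path, schedule, and retention below are illustrative placeholders, not Microsoft's recommended values), a backup policy can be built with the agent's MSOnlineBackup PowerShell module:

  # Assumes the Azure Backup agent is installed and registered with a vault.
  Import-Module MSOnlineBackup

  # Build a new backup policy; the folder path is an illustrative placeholder.
  $policy = New-OBPolicy
  $files = New-OBFileSpec -FileSpec "D:\AppData"
  Add-OBFileSpec -Policy $policy -FileSpec $files

  # Back up on weekdays at 21:00 and keep recovery points for 30 days.
  $schedule = New-OBSchedule -DaysOfWeek Monday,Tuesday,Wednesday,Thursday,Friday -TimesOfDay 21:00
  Set-OBSchedule -Policy $policy -Schedule $schedule
  $retention = New-OBRetentionPolicy -RetentionDays 30
  Set-OBRetentionPolicy -Policy $policy -RetentionPolicy $retention

  # Activate the policy for this server.
  Set-OBPolicy -Policy $policy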

You can deploy Microsoft Azure Backup Server as:
– A physical standalone server.
– A Hyper-V virtual machine – You can run DPM as a virtual machine hosted on an on-premises Hyper-V host server, to back up on-premises data.
– A Windows virtual machine in VMware – You can deploy DPM to protect Microsoft workloads running on Windows virtual machines in VMware.
– An Azure virtual machine – You can run DPM as a virtual machine in Azure to back up cloud workloads running as Azure virtual machines.

Try it yourself and download the Microsoft Azure Backup software here

Here you can find more Microsoft information about backup and disaster recovery


Microsoft System Center 2012 R2 Data Protection Manager RU9 to Azure Backup Vault in the Cloud

What’s new in DPM in System Center 2016 Technical Preview

Here you can find the Microsoft Azure Storage Backup & Recovery Blog to keep you up to date




#Microsoft System Center 2016 TP4 What's New and Download #Sysctr #SCDPM #SCVMM

 


What's New in System Center 2016 Technical Preview 4 today:




New Free Ebook: Data Protection for the Hybrid Cloud #sysctr #SCDPM #Azure


Microsoft is happy to announce the release of its newest free ebook, Microsoft System Center Data Protection for the Hybrid Cloud (ISBN 9780735695832), by Shreesh Dubey, Vijay Tandra Sistla, Shivam Garg, and Aashish Ramdas; Mitch Tulloch, Series Editor.

If you are responsible for architecting and designing your organization's backup strategy, especially if you're looking for ways to incorporate cloud backup into your business continuity scenarios, this book is for you. With the increasing adoption of virtualization and the move to the public cloud, IT organizations are headed toward a world where data and applications run in on-premises private clouds as well as in the public cloud. This has key implications for data protection strategy, and it is important to choose a solution that provides the same level of data protection you have provided so far while allowing you to harness the power of the public cloud.

This book covers the improvements added in DPM 2012 R2 as well as the integration with the Microsoft Azure Backup service.

Here you can download your free copy of the ebook



#sysctr Management Packs for Data Protection Manager 2012 R2: Reporting, DedupReporter, Discovery and Monitoring


This download contains the management pack files (*.mp) required to centrally monitor and generate reports on Data Protection Manager (DPM) servers using System Center Operations Manager (SCOM). The management guide document (*.docx) provided with this download contains detailed instructions on how to set up, configure, and deploy reporting centrally on the Operations Manager server, and how to use the new enhanced, extensible DPM Reporting Framework to generate custom aggregable reports. System Center Operations Manager 2012 R2 must be installed and running.
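
As a hedged sketch of the import step (the file names and the C:\MPs path below are placeholders; use the actual .mp files from the download), the packs can be imported from the Operations Manager Shell:

  # Run in the Operations Manager Shell on the SCOM 2012 R2 management server.
  # File names and the C:\MPs folder are placeholders for the downloaded files.
  Import-SCOMManagementPack -Fullname @(
      "C:\MPs\Microsoft.SystemCenter.DataProtectionManager.2012.Discovery.mp",
      "C:\MPs\Microsoft.SystemCenter.DataProtectionManager.2012.Reporting.mp"
  )

  # Verify that the packs were imported.
  Get-SCOMManagementPack -Name "*DataProtectionManager*" | Select-Object Name, Version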

Here you can download the System Center Management Packs for Data Protection Manager 2012 R2 Reporting, DedupReporter, Discovery and Monitoring



UPDATE: #SCDPM Team Blog Post: Maintenance on System Center Data Protection Manager #sysctr


Microsoft is writing a series of three blog posts covering three areas, and will point to an existing post for the final piece. The approach starts by checking database consistency, followed by a look at identifying (and eliminating) fragmentation to optimize performance. Third, Microsoft makes sure there is no extra growth and that the database is sized optimally, and lastly, the series covers backing up the DPMDB so that a good copy is available should it ever be needed.
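
As a minimal sketch of that first step, a consistency check can be run from PowerShell with Invoke-Sqlcmd; the SQL instance and database names below are assumptions, since both vary per DPM installation:

  # Consistency check of the DPM database; instance and database names vary.
  Import-Module SQLPS -DisableNameChecking

  Invoke-Sqlcmd -ServerInstance "DPMSERVER\MSDPM2012" `
      -Query "DBCC CHECKDB('DPMDB') WITH NO_INFOMSGS" `
      -QueryTimeout 3600

Run a check like this in a maintenance window, since DBCC CHECKDB is I/O intensive.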

Here you can read more on the System Center Data Protection Manager Team Blog (Part 1)

DPMDB Maintenance Part 2: Identifying and dealing with fragmentation in your DPMDB

DPMDB Maintenance Part 3: Dealing with a large DPMDB




Update Rollup 5 for System Center 2012 R2 Data Protection Manager is now available #sysctr #SCDPM


Microsoft is excited to announce the release of Update Rollup 5 for System Center 2012 R2 Data Protection Manager. This is a feature-rich release and an important milestone in Microsoft's cloud-integrated backup vision. In the coming weeks, Microsoft will publish detailed blogs and videos about the new features; keep watching this space for more!

Azure is an integral and important part of this update rollup; every feature of this update has an element of Azure plugged into it. Customers will experience similar functionality and a more seamless experience regardless of whether their data is protected locally or in the cloud. Microsoft has enabled more features for customers who are already using Azure for their long-term backup and retention needs, and if Azure is not an integral part of your DPM-led backup strategy, this release still provides a compelling value proposition.

The high-level enhancements in this update are divided into three categories:

  1. Support for new workloads
  2. Better data transfer and retention options to Azure
  3. Enhanced monitoring and alerting

Read more on the Microsoft System Center Data Protection Manager Team Blog




#Microsoft Data Deduplication Overview and DPM Storage #Winserv #sysctr #SCDPM

Data deduplication involves finding and removing duplication within data without compromising its fidelity or integrity. The goal is to store more data in less space by segmenting files into small variable-sized chunks (32–128 KB), identifying duplicate chunks, and maintaining a single copy of each chunk. Redundant copies of a chunk are replaced by a reference to the single copy. The chunks are compressed and then organized into special container files in the System Volume Information folder.

The result is an on-disk transformation of each file, as shown in Figure 1. After deduplication, files are no longer stored as independent streams of data; they are replaced with stubs that point to data blocks stored within a common chunk store. Because these files share blocks, those blocks are stored only once, which reduces the disk space needed to store all the files.

During file access, the correct blocks are transparently assembled to serve the data, without the application or the user having any knowledge of the on-disk transformation to the file. This lets administrators apply deduplication to files without having to worry about any change in behavior for the applications, or impact on the users who access those files.

Figure 1: On-disk transformation of files during data deduplication

After a volume is enabled for deduplication and the data is optimized, the volume contains the following (a short PowerShell sketch for inspecting a deduplicated volume follows this list):

  •  Unoptimized files. For example, unoptimized files could include files that do not meet the selected file-age policy setting, system state files, alternate data streams, encrypted files, files with extended attributes, files smaller than 32 KB, other reparse point files, or files in use by other applications (the “in use” limit is removed in Windows Server 2012 R2).
  • Optimized files. Files that are stored as reparse points that contain pointers to a map of the respective chunks in the chunk store that are needed to restore the file when it is requested.
  • Chunk store. Location for the optimized file data.
  • Additional free space. The optimized files and chunk store occupy much less space than they did prior to optimization.
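
As a short sketch of how these artifacts can be inspected, assuming the Data Deduplication feature is installed and E: is an illustrative deduplicated volume:

  # Savings, free space, and optimized/in-policy file counts for the volume.
  Get-DedupStatus -Volume "E:" |
      Format-List Volume, FreeSpace, SavedSpace, SavingsRate, OptimizedFilesCount, InPolicyFilesCount

  # Details of the chunk store (containers, data size, corruption counters).
  Get-DedupMetadata -Volume "E:"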

To cope with data storage growth in the enterprise, administrators are consolidating servers and making capacity scaling and data optimization key goals. Data deduplication provides practical ways to achieve these goals, including:

  •  Capacity optimization. Data deduplication stores more data in less physical space. It achieves greater storage efficiency than was possible by using features such as Single Instance Storage (SIS) or NTFS compression. Data deduplication uses subfile variable-size chunking and compression, which deliver optimization ratios of 2:1 for general file servers and up to 20:1 for virtualization data.
  • Scale and performance. Data deduplication is highly scalable, resource efficient, and nonintrusive. It can process up to 50 MB per second in Windows Server 2012 R2, and about 20 MB of data per second in Windows Server 2012. It can run on multiple volumes simultaneously without affecting other workloads on the server. Low impact on the server workloads is maintained by throttling the CPU and memory resources that are consumed. If the server gets very busy, deduplication can stop completely. In addition, administrators have the flexibility to run data deduplication jobs at any time, set schedules for when data deduplication should run, and establish file selection policies.
  •  Reliability and data integrity. When data deduplication is applied, the integrity of the data is maintained. Data Deduplication uses checksum, consistency, and identity validation to ensure data integrity. For all metadata and the most frequently referenced data, data deduplication maintains redundancy to ensure that the data is recoverable in the event of data corruption.
  • Bandwidth efficiency with BranchCache. Through integration with BranchCache, the same optimization techniques are applied to data transferred over the WAN to a branch office. The result is faster file download times and reduced bandwidth consumption.
  • Optimization management with familiar tools. Data deduplication has optimization functionality built into Server Manager and Windows PowerShell. Default settings can provide savings immediately, or administrators can fine-tune the settings to see more gains. Windows PowerShell cmdlets can be used to start an optimization job or schedule one to run in the future (see the sketch after this list). Installing the Data Deduplication feature and enabling deduplication on selected volumes can also be accomplished by using an Unattend.xml file that calls a Windows PowerShell script, and can be used with Sysprep to deploy deduplication when a system first boots.
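
As a minimal end-to-end sketch (E: is an illustrative data volume, and the three-day minimum file age is an example policy, not a recommendation):

  # One-time install of the Data Deduplication feature (Windows Server 2012 R2).
  Install-WindowsFeature -Name FS-Data-Deduplication

  # Enable deduplication on the volume and only optimize files older than 3 days.
  Enable-DedupVolume -Volume "E:"
  Set-DedupVolume -Volume "E:" -MinimumFileAgeDays 3

  # Start an optimization job now rather than waiting for the background schedule.
  Start-DedupJob -Volume "E:" -Type Optimization

  # Watch job progress.
  Get-DedupJob -Volume "E:"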

Plan to Deploy Data Deduplication

Install and Configure Data Deduplication

Monitor and Report for Data Deduplication

Deduplicating Microsoft System Center 2012 R2 DPM storage:

Business benefits
Using deduplication with DPM can result in large savings. The amount of space saved by deduplication when optimizing DPM backup data varies depending on the type of data being backed up. For example, a backup of an encrypted database server may result in minimal savings, since any duplicate data is hidden by the encryption process. However, a backup of a large Virtual Desktop Infrastructure (VDI) deployment can result in very large savings, in the 70–90+% range, since there is typically a large amount of duplicate data between virtual desktop environments. In the configuration described in this topic, Microsoft ran a variety of test workloads and saw savings ranging between 50% and 90%.

Recommended deployment
To deploy DPM as a virtual machine backing up data to a deduplicated volume, Microsoft recommends the following deployment topology (a configuration sketch follows the list):

  • DPM running in a virtual machine in a Hyper-V host cluster.
  • DPM storage using VHD/VHDX files stored on an SMB 3.0 share on a file server.
  • For this example deployment, Microsoft configured the file server as a Scale-Out File Server (SOFS) deployed using storage volumes configured from Storage Spaces pools built with directly connected SAS drives. Note that this deployment ensures performance at scale.
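
As a hedged configuration sketch for the file server nodes (the volume letter is illustrative, and the exact tuning values should come from the deduplication guidance linked below, not from this sketch):

  # On each file server node, enable deduplication on the volume that hosts the
  # DPM VHDX files. The HyperV usage type (added in Windows Server 2012 R2)
  # allows open, in-use virtual disk files to be optimized.
  Enable-DedupVolume -Volume "E:" -UsageType HyperV

  # Run optimization at high priority, scheduled outside the DPM backup window.
  Start-DedupJob -Volume "E:" -Type Optimization -Priority High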


Note the following:

  • This scenario is supported for DPM 2012 R2.
  • The scenario is supported for all workloads for which data can be backed up by DPM 2012 R2.
  • All the Windows File Server nodes on which the DPM virtual hard disks reside, and on which deduplication will be enabled, must be running Windows Server 2012 R2 with the November 2014 update rollup.

Sizing Volumes for Data Deduplication in Windows Server

You can find more information on deduplicating DPM storage here

Microsoft System Center DPM Blog
