• Is your Exchange database too big?
  • Are your servers bloated with duplicates?
  • Do your backups take a long time?
  • Are you afraid your recovery time will exceed the expected time?
  • Are you ready to eliminate PST files?
  • Do your migrations take forever?
  • Do you have too much scattered, unstructured, and unmanaged data?
  • Can you never find that ONE document within the heaps of files?
  • Do you manage without single-instance storage?

It has been said that the world’s information is doubling every two years.  In 2012, Gartner and IBM told us that 90% of the data in the world had been created in the previous two years alone.  This is not just due to mobile cameras, pictures and video, and the ever-available cloud services and online social sites; to a large degree, it can also be internal to your own company.

The fear of deletion

Sure, that sounds like a computer game, but no sane person I know would dare to delete anything at all, at least not before making a couple of copies of it and storing them in at least two different places.
That’s just human nature. You will of course delete a mail, but first you’ll copy it to the common area on the file server, the project area, your home area, and locally on your laptop. Maybe you’ll store it in SharePoint under ‘my project’, send it to your private account (storing it in Sent Items), and then drag and drop it into your personal PST file that resides on the file server. Everyone knows a PST file will have its archive bit set every time you open Outlook, so for a backup system the whole PST file will be included in every single incremental or full backup. Not to mention that if you virtualize your servers and data, you have virtualized duplicates running around. COOL! ….or not! The consequence is that people tend to use the e-mail system as a repository for both mail and documents.

The search

Try this: find a unique document, a Word 2013 file (*.docx), from the period 11–23 December 2012, containing the words “Sales” and “company”. You need to search SharePoint, Exchange, your local laptop, and your file server. Can you do that a) within a day? b) in less than 30 seconds? or c) not at all?
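To make the challenge concrete, here is a small sketch of what just the file-server leg of that search looks like if you script it by hand. It is a hypothetical helper, not part of any product mentioned here; it relies only on the fact that a .docx file is a ZIP archive whose main body lives in word/document.xml.

```python
import os
import re
import zipfile
from datetime import date


def find_docs(root, start, end, words):
    """Find .docx files under root, modified in [start, end],
    whose body contains every word in `words` (case-insensitive)."""
    hits = []
    for dirpath, _, files in os.walk(root):
        for name in files:
            if not name.lower().endswith(".docx"):
                continue
            path = os.path.join(dirpath, name)
            mtime = date.fromtimestamp(os.path.getmtime(path))
            if not (start <= mtime <= end):
                continue
            try:
                with zipfile.ZipFile(path) as zf:
                    # The document body of a .docx lives in word/document.xml
                    xml = zf.read("word/document.xml").decode("utf-8", "ignore")
            except (zipfile.BadZipFile, KeyError):
                continue  # skip corrupt or non-Word files
            text = re.sub(r"<[^>]+>", " ", xml)  # crude XML tag stripping
            if all(re.search(rf"\b{re.escape(w)}\b", text, re.I) for w in words):
                hits.append(path)
    return hits


# Example: scan one share for the December 2012 window described above.
# find_docs(r"\\fileserver\share", date(2012, 12, 11), date(2012, 12, 23),
#           ["Sales", "company"])
```

Even this toy version has to crawl every file on every share, and it still covers only one of the four systems; that is exactly why a central, pre-built index changes the answer from a) to b).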

It is easy to search for and manage one single document among 10 other documents on your desktop, but imagine if you had to find one single document in a football stadium full of archived documents?

Buy more storage

Should administrators just ignore the data growth and hope that we, the people, only store our data once, …and…uhm also store it in the right location?   As administrators we have the power to block cloud storage (Dropbox, Google Drive, Syncplicity, Jottacloud), implement quotas on anything, block PST file usage, and block the ability to store files locally on laptops. Or we can just buy more storage to accommodate the growth. The latter will of course bring about a small war with the financial department within the company. Well, you can’t have too much storage, right?

The backup hell

You store 20 TB of data, with 10% growth in a year; i.e., this is a very small business. Every week you take a full backup during the weekend and incremental backups during the weekdays. You retain the incremental backups for 4 weeks, and the weekly backups for 3 months. You also take monthly backups retained for a year, and yearly backups retained for 10 years. See where this is going?  Do the math, or at least put together a quick estimate: how much will you have stored on disk or tape in the end?   Not only that, think of all the time and resources you’ll spend backing it up, and last but not least, the time you’ll spend restoring it. One of the fine things about backup systems today is of course the ability to deduplicate data, whether with hardware devices or with software compression/deduplication.
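If you’d rather not do the math on paper, the steady-state estimate can be sketched in a few lines. The change rate and incrementals-per-week below are my own assumptions (the article doesn’t state them), and the sketch ignores growth, compression, and deduplication, so treat the number as an order-of-magnitude figure, not a sizing exercise.

```python
# Back-of-the-envelope estimate of raw (non-deduplicated) backup storage
# for the retention scheme described above.
FULL_TB = 20.0        # current data set
INCR_RATE = 0.02      # ASSUMED daily change rate (2% of the full set)
INCRS_KEPT = 4 * 5    # 4 weeks of daily incrementals (5 per week, assumed)
WEEKLIES_KEPT = 13    # weekly fulls retained ~3 months
MONTHLIES_KEPT = 12   # monthly fulls retained a year
YEARLIES_KEPT = 10    # yearly fulls retained 10 years


def steady_state_tb(full_tb):
    """Total backup footprint once every retention window is full."""
    fulls = (WEEKLIES_KEPT + MONTHLIES_KEPT + YEARLIES_KEPT) * full_tb
    incrs = INCRS_KEPT * INCR_RATE * full_tb
    return fulls + incrs


total = steady_state_tb(FULL_TB)
print(f"Steady-state backup footprint: {total:.0f} TB "
      f"({total / FULL_TB:.1f}x the primary data)")
```

Under these assumptions, 20 TB of primary data turns into roughly 700 TB of retained backups, and that is before the 10% yearly growth compounds on top of it.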

What not to do..

Yes, you could of course buy more storage, if the goal was to impress your neighbour. You can block or put quotas on anything in the e-mail system, on file servers, and on SharePoint, if you love angry, pissed-off users. We can let all users manage their mailboxes, catalogue structure, and files for at least an hour a week, if we love paying for unproductive work. We can continue to implement restrictions as much as we like, but new “information taming” technologies such as deduplication, compression, and analysis tools should instead drive down the cost of creating, capturing, managing, and storing information.

What to do..

One suggestion for taming the amount of data and the complexity of searches, and for simplifying backup, is to archive before you back up.

Archive, then index (for search), deduplicate (SIS), compress (e.g. by 50%), and store the data on cheap storage, making just one single replica for redundancy, so that you never have to back up your archive again. One such system that provides all of this, and one I’ve worked a lot with, is the excellent product Symantec Enterprise Vault. It does the automatic archiving from Microsoft Exchange (2003 through 2013), SharePoint, and file servers (Celerra, VNX, NetApp, or Windows) for you; it deduplicates, compresses, and also indexes everything archived, recognising more than 500 different document (attachment) types.

Exchange DAG: Archiving reduces your Exchange databases to a minimum, there is less data to replicate between DAG members, and your Exchange system will overall be happier since it’s less bloated. Remember: three database copies in a DAG give you three times the data to carry.

Duplicates: You’ll get rid of duplicates, since Enterprise Vault will deduplicate across all file servers, Exchange servers, and SharePoint servers.
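The idea behind single-instance storage is simple enough to show in a toy model: identical content is detected by a content hash and stored once, with every logical copy just holding a reference. This is purely illustrative; Enterprise Vault’s actual storage format is of course nothing this simple, and the class and method names are my own.

```python
import hashlib


class SingleInstanceStore:
    """Toy single-instance store: identical content is kept once,
    keyed by its SHA-256 digest; names are just references."""

    def __init__(self):
        self._blobs = {}  # digest -> content bytes (stored once)
        self._refs = {}   # logical name -> digest

    def put(self, name, content):
        """Store content under name; return True only if a new blob
        was actually written (i.e. the content was not a duplicate)."""
        digest = hashlib.sha256(content).hexdigest()
        is_new = digest not in self._blobs
        if is_new:
            self._blobs[digest] = content
        self._refs[name] = digest
        return is_new

    def get(self, name):
        """Retrieve the content a logical name refers to."""
        return self._blobs[self._refs[name]]

    def unique_bytes(self):
        """Physical bytes actually stored, after deduplication."""
        return sum(len(b) for b in self._blobs.values())
```

The same attachment mailed to a whole department, saved to the project share, and uploaded to SharePoint then costs you one copy of physical storage instead of dozens.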

Backups\Restores: Your backups will be reduced by 70–90%, because data will be offloaded from the front-end file, Exchange, and SharePoint servers. You will have much less data to recover when disaster strikes, and your RPO and RTO will become much smaller.

Search: What used to be a heap of unstructured data is now searchable and retrievable, and you can even restore the found data from the archive onto your servers again if you’d like. This way your scattered, hidden data becomes visible and usable.

PST: You will eliminate (corrupt) PST files across your entire organisation. The PST files can be migrated directly into the archive.

Migrations: We all know that Microsoft will always release a newer version of the software you’ve got installed, so you’ll spend less time migrating less data from one storage system to another, or from one software version to another, making for a smoother migration.

Delete with confidence: When data gets too old, the archive system will delete it based on age, with confidence. This completes the lifecycle of your data, and even saves future storage space in your archive.
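An age-based expiry pass is conceptually just a walk over the archive comparing each item’s age against the retention period. The sketch below is hypothetical (the function names and the 10-year period are illustrative, not Enterprise Vault’s API), and it deliberately only *collects* expired items rather than deleting them, so a human can review the list first.

```python
import os
import time

RETENTION_DAYS = 10 * 365  # illustrative 10-year retention period


def expired(path, now=None):
    """True if the file's last modification is older than the
    retention period."""
    now = time.time() if now is None else now
    age_days = (now - os.path.getmtime(path)) / 86400
    return age_days > RETENTION_DAYS


def expire_archive(root):
    """Collect (not delete) every expired file under root, so the
    list can be reviewed before anything is removed."""
    hits = []
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if expired(path):
                hits.append(path)
    return hits
```

A real archiving product drives this from its index and per-policy retention categories rather than file timestamps, but the lifecycle logic is the same: once an item ages out of policy, it can be removed without anyone agonising over the delete key.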

Quota: You’ll end your users’ quota frustrations, since you will have virtually unlimited space. You’ll solve the “dumping ground” syndrome.

My Experience

I’ve done archiving with Enterprise Vault for many different types of customers, with many different kinds of front-end storage systems, including filers and Windows systems. I’ve archived Exchange servers in mixed environments.

I’ve stored the archived data on a lot of different storage systems: CAS, NAS, SAN, and NTFS. I’ve reduced the Exchange storage footprint down to 10% in many cases, and kept it there!

I’ve reduced file servers by up to 80%, leaving a lot less data to back up or restore.

Symantec has proven its archiving technology to be the very best in its genre; according to Gartner’s Magic Quadrant, it is a Leader and Visionary in its category.


Next week I’ll be giving you more reflections on backup, archiving, storage, and DR. Maybe I’ll show you specific tips and tricks you may, or may not, have tried out. I might even explain why RPO and RTO matter, what they are, and how to relate to them in your SLAs.
