Ideal Netbackup solution for backing up millions of files


Posted by Anonymous 
Ideal Netbackup solution for backing up millions of files
December 13, 2014 05:17AM
FlashBackup is the ideal option for backing up this many files. We use it, and no issues have been reported so far.

On Sun, 28 Sep 2014 04:02:11 +0530, Shaheensn wrote:

> Hi,
>
> It would be extremely helpful if the expert minds on this forum could help me with the concerns below about backing up data using Symantec NetBackup.
>
> Background: Our system has a folder containing millions of small files spread across thousands of subfolders, totalling more than 500 GB, that needs to be backed up daily. The backup usually starts at the end of the day and is supposed to finish before users start accessing the system the following morning. However, the current NetBackup job takes a long time and this impacts system performance. I assume this is because the current approach is a file-level backup. The current policy has a monthly full backup, a weekly differential backup, and a daily incremental backup, yet the daily backup alone takes more than double the desired time. The client version is 7.6.02. The server is a physical machine running Windows Server 2003, with a drive dedicated to storing the files. I am looking for the fastest backup solution so that backups complete before users arrive in the morning.
>
> 1. What is the best approach for backing up millions of small files spread across thousands of folders using NetBackup?
> 2. I understand that the FlashBackup feature uses block-level backup; would this be the ideal approach given the number of files involved?
> 3. What are the disadvantages of the FlashBackup feature?
> 4. Does installing the SAN client without Fibre Channel offer any improvement in backup speed? I have read on some forums that the SAN client should only be used where there is Fibre Channel support.
> 5. Would moving the applications to virtual machines hosted in a VMware environment provide any improvement?
>
> PS: Backups are not my domain of expertise, so I apologize in advance if I have said something absurd. Also, changing the application's file structure is not an option. :(
>
> Cheers,
> Shaheen
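Before splitting a tree like this across multiple backup streams or policies, it helps to measure where the files actually are. Below is a small Python sketch (a hypothetical planning helper, not part of NetBackup) that reports the file count and total size of each top-level subfolder so streams can be balanced roughly evenly:

```python
#!/usr/bin/env python3
"""Sketch: survey a file tree to plan balanced backup streams."""
import os

def survey(root):
    """Return {subfolder: (file_count, total_bytes)} for root's children."""
    stats = {}
    for entry in os.scandir(root):
        if not entry.is_dir():
            continue
        count = size = 0
        for dirpath, _dirs, files in os.walk(entry.path):
            for name in files:
                try:
                    size += os.path.getsize(os.path.join(dirpath, name))
                    count += 1
                except OSError:
                    pass  # file vanished or unreadable; skip it
        stats[entry.name] = (count, size)
    return stats

if __name__ == "__main__":
    # Largest subfolders first; these are candidates for their own stream.
    for name, (count, size) in sorted(survey(".").items(),
                                      key=lambda kv: -kv[1][0]):
        print(f"{name}: {count} files, {size / 2**20:.1f} MiB")
```

With millions of files this walk itself takes a while, so it would be run once off-hours to inform the policy layout, not as part of the backup.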
Ideal Netbackup solution for backing up millions of files
February 24, 2015 02:50PM
This probably isn't ideal for you or your situation, but this is how I handled the problem in my environment. I never tried FlashBackup on these servers.

We have two major file servers: one is 1.6 TB with several million files on it (emails, log files, spreadsheets; it is the accounting file server), and a separate NAS that is 15 TB with probably 5 million files on it. I tried backing up the accounting file server as a whole machine, but the full backup took 70+ hours and rarely finished within the backup window. For the NAS device we used NDMP, which completed its full backup within the 48-hour window, but restores were very, very slow.

For the accounting server, I created four separate policies. Within those policies I put the major shares or directories in the backup selections, e.g. C:\Windows, E:\Accounting\Spreadsheets, E:\Accounting\LogFiles, and so on. I then enabled "Allow multiple data streams" in each policy and staggered the policies' start times. Full backups went from 70+ hours down to 40 hours, and incrementals took about a quarter of the time as well, as long as the schedules stayed staggered. The only issue is having to check once a week that no one has created a new share.
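An alternative to separate policies is NetBackup's NEW_STREAM directive, which splits a single policy's backup selections into concurrent streams when "Allow multiple data streams" is enabled on the policy. A sketch of such a selections list (the paths here are just illustrative examples, echoing the ones above):

```
NEW_STREAM
C:\Windows
NEW_STREAM
E:\Accounting\Spreadsheets
NEW_STREAM
E:\Accounting\LogFiles
```

Each NEW_STREAM marker starts a new stream containing the paths that follow it, so the streams run in parallel under one policy and one schedule instead of several staggered ones.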

For the NAS server, I abandoned NDMP because of how long it took to restore data. I now use a media server, and the backup selection is the UNC path, e.g. \\NASDEVICE\SHARENAME. I have seven different policies for this server (similar to the above), with staggered start times. I map the drives on the media server and check "Allow multiple data streams" and "Backup network drives". The backup for this machine takes slightly longer than NDMP did, but restores take a quarter of the time.
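The weekly "did anyone create a new share?" check mentioned above can be scripted. Below is a hedged Python sketch of a hypothetical helper: given the shares the server currently exports (e.g. parsed from the output of `net view \\NASDEVICE` on Windows) and the UNC paths collected from the policies' backup selections, it reports shares that no policy covers. The server and share names are made-up examples:

```python
"""Sketch: flag file-server shares missing from backup selections."""

def uncovered_shares(server, shares, selections):
    """Return shares on `server` whose UNC path is in no backup selection.

    server     -- host name, e.g. "NASDEVICE"
    shares     -- share names currently exported by the server
    selections -- UNC paths gathered from the policies' backup selections
    """
    norm = {sel.lower().rstrip("\\") for sel in selections}
    return sorted(s for s in shares
                  if f"\\\\{server}\\{s}".lower() not in norm)
```

Run weekly (by hand or from a scheduled task), an empty result means every exported share appears in some policy; anything listed is a new share that needs a backup selection added.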

I hope this helps your issue.