
Best Practice using Staging Groups

Posted by Guest
June 16, 2010 08:47AM
Hi,

I have approximately 2 TB of data, spread across roughly 10 servers, that I need to back up to tape nightly. One of the servers is very slow due to a huge number of small files (10M+).

I have an 8 TB NAS which I want to use as an incremental staging group, to which changes are backed up each night.

Then I have 2x LTO4 drives in a group, to which I want to write a full backup every night for off-site storage.

My plan was to run an incremental backup to the staging group (for speed and minimal load on the servers), then a full backup to tape from the staging group; however, the job only lets me choose Full or Incremental, not both.

Can anyone advise me on the best practice for performing a daily incremental backup to my NAS, followed by a full backup to my tape group? I have a 12-hour backup window and am running Arcserve 12.5.

Thanks for your help.
Set up a deduplication device for the backup to the NAS.

Set up one backup job with all targets selected and both multistreaming and multiplexing enabled. With LTO4, try 5 streams for multiplexing as a starting point. You will need to try different settings to find the optimum.
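To see why multiplexing helps here: it interleaves blocks from several source streams onto one tape drive, so the drive keeps streaming even when any single source (like the slow small-file server) cannot feed it fast enough on its own. A toy sketch of the idea, not Arcserve's actual implementation (the stream names and chunk size are made up for illustration):

```python
def multiplex(streams, chunk_size=4):
    """Round-robin interleave chunks from several source streams into a
    single output sequence, the way a tape multiplexer would.

    streams: dict mapping stream name -> bytes still to be written.
    Returns a list of (stream_name, chunk) tuples in tape order.
    """
    pending = dict(streams)  # mutable copy we can consume
    tape = []
    while pending:
        # Take one chunk from each stream that still has data
        for name in list(pending):
            chunk = pending[name][:chunk_size]
            rest = pending[name][chunk_size:]
            tape.append((name, chunk))
            if rest:
                pending[name] = rest
            else:
                del pending[name]  # stream exhausted, drop it
    return tape

# Two uneven sources feeding one drive: chunks alternate, so the fast
# source fills the gaps left by the slow one.
tape = multiplex({"fileserver": b"123456", "slowserver": b"AB"}, chunk_size=4)
# tape == [("fileserver", b"1234"), ("slowserver", b"AB"), ("fileserver", b"56")]
```

The restore-side cost of multiplexing is that one server's data is scattered between other streams on tape, so single-server restores read more tape. That is part of why the stream count needs tuning rather than just being set as high as possible.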

Job Engine buffers can be adjusted as well (under Server Admin Config), but there, too, trial and error is the only way to find what works best.

If a buffer is too large, the job has to wait for it to fill before anything is sent; if it is too small, data moves in many small transfers and per-transfer overhead climbs.

Keep that large volume defragmented; fragmentation can really degrade throughput on large volumes with many small files.

Also, if that large volume is running under Windows 2008 R2, install this storport.sys update:
http://support.microsoft.com/kb/981208

Adjusting the Windows Client Agent backup priority might help as well.

I think the CA support site has a tech doc on Client Agent registry settings that can help with performance as well.