
Want to make a script to back up user PSTs to our NAS on a schedule. But one problem...

At our corporate office we have about 25 computers. We have to archive emails to our machines because we have no better solution for managing space on Exchange at this time. The .pst files don't really get backed up, and I want to put a reliable automated policy in place so they do. I can write a robocopy script to back them up on a schedule, no problem, but that would mean a full backup of their PSTs every time, and we have A LOT of .pst data at this office. It could be a strain on the network, even though I would make sure this happens after hours. Something like this is what I had in mind (paths, share names, and the task name are just placeholders for my environment). Note that robocopy only copies files whose size or timestamp changed, but Outlook touches an attached PST whenever it runs, so in practice most of the data would still move each night, and an open PST is locked anyway, which is another reason this has to run after hours:
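@echo off
rem Sketch of a scheduled PST backup. Source path, NAS share, and log
rem location are placeholders - adjust for your environment.
set SRC=C:\Users
set DST=\\NAS01\PSTBackup\%COMPUTERNAME%

rem /S         recurse into subfolders (skips empty ones)
rem /Z         restartable mode, so an interrupted multi-GB copy can resume
rem /R:2 /W:5  limit retries; a PST locked by a running Outlook would
rem            otherwise stall the job (the default is 1 million retries)
rem /NP /LOG+  no per-file progress, append results to a log file
robocopy "%SRC%" "%DST%" *.pst /S /Z /R:2 /W:5 /NP /LOG+:C:\Logs\pst_backup.log

rem One-time registration as a nightly task at 22:00 (task name is arbitrary):
rem schtasks /Create /TN "PST Backup" /TR "C:\Scripts\pst_backup.cmd" /SC DAILY /ST 22:00 /RU SYSTEM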

How do you guys prefer to manage backing up large files to your local file server? Just curious whether you've had a similar problem to contemplate and what you would do.

Thanks. 


Answers (1)

Posted by: rileyz 8 years ago
You're stuck between a rock and a hard place. PSTs can be troublesome, and when one goes wrong, users normally give you grief.

Copy the data to the server every night. Apparently Remote Differential Compression doesn't work with PST files, which sucks, so it's a full file copy. Let your backup solution handle the backups; hopefully you will be doing a keep-forever monthly, a full weekly, and a differential daily.
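If those nightly full copies do strain the network, robocopy can throttle itself: the /IPG flag inserts a delay of n milliseconds between copied blocks to free up bandwidth. A rough example, with the same kind of placeholder paths as above:

robocopy C:\Users \\NAS01\PSTBackup\%COMPUTERNAME% *.pst /S /Z /IPG:50 /NP /LOG+:C:\Logs\pst_backup.log

The trade-off is a slower copy, which is usually fine for an after-hours job.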

With the above you should be able to rebuild a PST if something goes wrong.

The other solution is to buy some software that will look after the PSTs for you.

err, good luck.
 