Hello Friends –

Back to you all with my second blog of 2017, this time on Log Shipping automation. Like last year, I picked a slightly off-the-beaten-path topic. Coming from a DBA background, monitoring, configuring, and managing high availability and disaster recovery solutions were day-to-day responsibilities for me, as they have been for most of us at some point in our careers. Recently I was presented with a challenge for a customer managing highly confidential data. Here is the problem statement.

“While we’re expanding our business, some highly confidential data needs to be procured in a secured way from a highly skilled and specialized organization that has been in this business for years. The challenge is that data procured from the vendor needs to be kept up to date on a daily basis in the customer environment, of course without compromising security in the slightest.”
The first things that come to an experienced DBA’s mind are: what kind of security model and concerns are involved here?

  • Secured Bridge between vendor & customer
  • Method to push data
  • Access to data
  • Method of accessing data
  • And so on

The list can seem unfathomable at times. If a proper, secured infrastructure is in place, most of the DBA’s work comes down to pulling data from the vendor environment and pushing it to the customer in a timely fashion, so that further analysis and the respective tasks can take place seamlessly.

Here is a quick flow of the proposed solution from my side.


It has been working perfectly for a while now. Let me list the required set of tools, excluding the SQL Server engine, database, and SQL Agent. Pretty much everything used in this solution is freeware and doesn’t carry any known security issues.


  • The vendor provides a daily zip file with a YYYYMMDD suffix to distinguish file names.
  • The SFTP download runs with a PPK (PuTTY private key) shared by the vendor with the customer.
  • The SQL Database Engine and SQL Agent run under domain accounts, and authentication is properly configured.
  • The listed tools are installed correctly and run seamlessly. It is better to do a test run in wizard mode first.
  • A DBA database exists on the SQL instance to hold historical sync information for backtracking. In my case the database name is SQLMaint.
  • Database Mail is configured to send success and failure notifications.
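As a sketch of that tracking prerequisite, the SQLMaint database might hold a history table along these lines; the table and column names here are my assumptions, not taken from the original scripts.

```sql
-- Minimal sketch of a sync-history table in SQLMaint.
-- Names (LogShipSyncHistory and its columns) are illustrative assumptions.
USE SQLMaint;
GO
IF OBJECT_ID(N'dbo.LogShipSyncHistory', N'U') IS NULL
BEGIN
    CREATE TABLE dbo.LogShipSyncHistory
    (
        SyncId        INT IDENTITY(1,1) PRIMARY KEY,
        ZipFileName   NVARCHAR(260) NOT NULL,   -- e.g. VendorBackup_20170115.zip
        DownloadedAt  DATETIME2(0)  NOT NULL DEFAULT SYSDATETIME(),
        FilesUnzipped INT           NULL,       -- number of .trn files extracted
        RestoreStatus NVARCHAR(20)  NOT NULL DEFAULT N'Pending'  -- Pending / Success / Failed
    );
END
GO
```

A table like this gives the PowerShell and T-SQL steps a common place to record what was downloaded and restored, which is what makes backtracking possible.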

The complete PowerShell and T-SQL scripts are provided below.

The PowerShell script performs the following tasks:

  • Download the consolidated zip file of T-log backups securely from the vendor SFTP location. The script terminates in the following cases:
    • The SFTP folder doesn’t exist
    • The local folder doesn’t exist
    • The zip file has already been downloaded
    • The zip file is corrupted
  • Unzip the downloaded file at the given location.
  • Kick off the T-SQL agent job if the expected number of log files is unzipped from the daily zip file.
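The download-and-trigger flow above could look roughly like this. I am assuming the WinSCP .NET assembly as the freeware SFTP tool; the host name, folder paths, host key fingerprint, and job name are all placeholders, not values from the actual solution.

```powershell
# Sketch only: WinSCP .NET assembly assumed; all names/paths are placeholders.
Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"

$today   = (Get-Date).ToString("yyyyMMdd")
$zipName = "VendorBackup_$today.zip"            # daily zip with YYYYMMDD suffix
$local   = "C:\LogShip\Incoming"

$options = New-Object WinSCP.SessionOptions -Property @{
    Protocol              = [WinSCP.Protocol]::Sftp
    HostName              = "sftp.vendor.example"
    UserName              = "customer"
    SshPrivateKeyPath     = "C:\Keys\vendor.ppk"    # PPK shared by the vendor
    SshHostKeyFingerprint = "ssh-rsa 2048 xx:xx:..."
}

$session = New-Object WinSCP.Session
try {
    $session.Open($options)

    # Terminate in the cases listed above
    if (-not $session.FileExists("/outbound/$zipName")) { throw "SFTP file not found" }
    if (-not (Test-Path $local))                        { throw "Local folder missing" }
    if (Test-Path (Join-Path $local $zipName))          { throw "Zip already downloaded" }

    $session.GetFiles("/outbound/$zipName", (Join-Path $local $zipName)).Check()
}
finally {
    $session.Dispose()
}

# Unzip; Expand-Archive throws if the archive is corrupted
Expand-Archive -Path (Join-Path $local $zipName) -DestinationPath "$local\Unzipped"

# Kick off the restore job only if the expected number of .trn files arrived
$expected = 24   # placeholder: however many T-log backups the vendor ships per day
$trnCount = (Get-ChildItem "$local\Unzipped" -Filter *.trn).Count
if ($trnCount -eq $expected) {
    Invoke-Sqlcmd -ServerInstance "." -Database "msdb" `
        -Query "EXEC dbo.sp_start_job @job_name = N'LogShip_Restore';"
}
```

Using `sp_start_job` from PowerShell keeps the restore logic inside SQL Agent, where the T-SQL step and its notifications already live.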

The T-SQL script performs the following tasks:

  • Read all unzipped TRN backup files sequentially to avoid termination due to log sequence number mismatches.
  • Store header information in a table.
  • Restore the T-log backups to the log-shipped database based on sequence number.
  • Send an alert on success or failure respectively.
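The restore loop could be sketched as follows. The database name, folder path, and Database Mail profile are placeholders; `xp_dirtree` (undocumented but widely used) stands in for however the actual script enumerates the unzipped files, and the `RESTORE HEADERONLY` capture step is omitted for brevity.

```sql
-- Sketch only: placeholders throughout; not the original production script.
DECLARE @file NVARCHAR(260), @sql NVARCHAR(MAX);

-- Enumerate the unzipped .trn files
DECLARE @files TABLE (subdirectory NVARCHAR(260), depth INT, [file] INT);
INSERT INTO @files (subdirectory, depth, [file])
EXEC master.sys.xp_dirtree N'C:\LogShip\Incoming\Unzipped', 1, 1;

DECLARE c CURSOR LOCAL FAST_FORWARD FOR
    SELECT subdirectory FROM @files
    WHERE [file] = 1 AND subdirectory LIKE N'%.trn'
    ORDER BY subdirectory;          -- file names encode the log sequence

OPEN c;
FETCH NEXT FROM c INTO @file;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'RESTORE LOG [CustomerDB] FROM DISK = N''C:\LogShip\Incoming\Unzipped\'
              + @file + N''' WITH NORECOVERY;';
    EXEC sys.sp_executesql @sql;    -- fails fast on an out-of-sequence LSN
    FETCH NEXT FROM c INTO @file;
END
CLOSE c; DEALLOCATE c;

-- Notify via Database Mail
EXEC msdb.dbo.sp_send_dbmail
     @profile_name = N'DBA_Mail',
     @recipients   = N'dba@customer.example',
     @subject      = N'Log shipping restore completed',
     @body         = N'All T-log backups restored WITH NORECOVERY.';
```

Restoring `WITH NORECOVERY` keeps the secondary database ready to accept the next day’s log chain, which is the essence of log shipping.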

I strongly believe there is always room for improvement, so suggestions are most welcome.

Happy learning.

Thank you!
Avanish Panchal
Regional Head – DataPlatformGeeks & DPS2017 Core Team Member
Like us on Facebook | Join the fastest growing SQL Server group on Facebook

Follow Avanish Panchal on Twitter | Follow Avanish Panchal on Facebook