Designing a Backup Plan (SQL Server)

Recently, we had a brand new Production server hosted on a Windows 2003 Cluster running SQL Server 2005 with Service Pack 3. This was an OLTP system hosted in a data centre in New Jersey, United States of America. The server had 16 CPUs and 32 GB of memory, and it initially had just one database, named ABC, whose size was around 93 GB. As per the backup policy agreed with the customer, we always transfer the Production database backup onto the Staging server over the network using Robocopy. The staging server was located at the New York data centre. Transferring a 93 GB file over the network was a tedious task: sometimes even after 24 hours the file had not been transferred successfully, and on a couple of occasions I have seen the transfer complete but the resulting file turn out to be corrupted. In this article, Satnam Singh, an SQL Server DBA from Larsen & Toubro Infotech, Mumbai, India, discusses one such approach which he used on his Production server to streamline the backup process.

Just FYI, we didn't have any third-party utility such as Idera, Red Gate Software or LiteSpeed to compress the database backup files.

The first step was to develop a stored procedure which takes a full backup of the database named ABC and compresses it on the server itself. Here is the T-SQL script which I used for this purpose.

USE [master]
GO
/****** Object:  StoredProcedure [dbo].[USP_BAKZIP]    Script Date: 06/20/2012 04:19:43 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER OFF
GO
CREATE PROCEDURE [dbo].[USP_BAKZIP]
    @DBNAME  VARCHAR(256),
    @BAKPATH VARCHAR(1000),
    @ZIPPATH VARCHAR(1000),
    @TYPE    VARCHAR(1) -- F (FULL BACKUP) or T (TRANSACTION LOG BACKUP)
AS
DECLARE @SQLSTATEMENT VARCHAR(2000)
DECLARE @BTYPE VARCHAR(25)
DECLARE @BTYPEEXT VARCHAR(4)
DECLARE @TIMESTAM VARCHAR(20)

SET @SQLSTATEMENT = ''
-- Build a file-name-friendly timestamp, e.g. 2012_06_20_04_19_43
SET @TIMESTAM = REPLACE(REPLACE(REPLACE(CONVERT(VARCHAR(25), GETDATE(), 120), '-', '_'), ':', '_'), ' ', '_')

IF @TYPE = 'F'
BEGIN
    SET @BTYPE = ' DATABASE '
    SET @BTYPEEXT = '.BAK'
END
IF @TYPE = 'T'
BEGIN
    SET @BTYPE = ' LOG '
    SET @BTYPEEXT = '.TRN'
END
IF @TYPE NOT IN ('T', 'F')
BEGIN
    RAISERROR('@TYPE must be F (Full) or T (Transaction Log).', 16, 1)
    RETURN
END

-- Take the backup to disk; the double quotes around the file name act as string
-- delimiters because the procedure is created with QUOTED_IDENTIFIER OFF
SET @SQLSTATEMENT = 'BACKUP ' + @BTYPE + @DBNAME + ' TO DISK ="' + @BAKPATH + @DBNAME + '_' + @TIMESTAM + @BTYPEEXT + '" '
EXEC (@SQLSTATEMENT)

-- Compress the backup file on the server using the WinRAR command-line client
SET @SQLSTATEMENT = 'B:\DB-Backups\RAR.exe A "' + @ZIPPATH + @DBNAME + '_' + @TIMESTAM + '.RAR" "' + @BAKPATH + @DBNAME + '_' + @TIMESTAM + @BTYPEEXT + '"'
EXEC MASTER..XP_CMDSHELL @SQLSTATEMENT

In order to execute the above Stored Procedure, I use the below T-SQL,

EXEC USP_BAKZIP 'ABC','B:\DB-BACKUPS\ABC\Full\','B:\LogShippingLogs\','F'

Where ABC is the name of the database whose backup needs to be performed, B:\DB-BACKUPS\ABC\Full\ is the path where the full backup of the database resides, B:\LogShippingLogs\ is the path where the compressed copy of the backup will reside, and F means that the backup performed will be a full backup.
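Although the transaction log backups in this article are handled by log shipping (described later), the same procedure also supports them by passing 'T' as the last parameter; for example (the log backup folder shown here is only an illustration, not a path from the original setup):

EXEC USP_BAKZIP 'ABC','B:\DB-BACKUPS\ABC\Log\','B:\LogShippingLogs\','T'

This would produce a .TRN file in the backup folder and an .RAR archive of it in the compressed-copy folder.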

Once the full backup of the database has been archived successfully, the next step is to transfer it onto the staging server. I accomplish this task by executing the below T-SQL against the master database on the server.

USE master;
GO
EXEC xp_cmdshell 'B:\DB_Backup.bat'

Where DB_Backup.bat is a Windows batch file which contains the following code for transferring the RAR file onto the staging server PQR.
REM
REM Uses ROBOCOPY, which is from the Windows 2003 Server Resource Kit
REM Flags are:
REM /MOV   - Moves the files, i.e. deletes them from the source once they have been copied.
REM /LOG+: - Appends to a LOG file listing the files copied.
REM
REM
b:\robocopy.exe B:\LogShippingLogs \\PQR\DBBACKUPS *.RAR /MOV /LOG+:b:\copylog.txt
 
Where B:\LogShippingLogs is the path on the Production server where the archived copy of the full backup resides, PQR is the name of the Staging server, DBBACKUPS is the folder on the staging server which will contain the archived copy, and *.RAR means that only the RAR files will be copied. One important point to mention here is that this is a cut-copy-paste operation: because of the /MOV switch, the files are moved rather than copied.
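As a sanity check before the success notification goes out, the output of the batch file can be captured in T-SQL and inspected for Robocopy errors. This is only a minimal sketch; the temporary table and the simple text search below are assumptions, not part of the original batch job:

-- Capture the console output of the batch file so the Robocopy result can be inspected
CREATE TABLE #RobocopyOutput (OutputLine NVARCHAR(4000) NULL);

INSERT INTO #RobocopyOutput (OutputLine)
EXEC master..xp_cmdshell 'B:\DB_Backup.bat';

-- Robocopy prints a summary; flag any lines that mention failures or errors
SELECT OutputLine
FROM #RobocopyOutput
WHERE OutputLine LIKE '%Failed%' OR OutputLine LIKE '%ERROR%';

DROP TABLE #RobocopyOutput;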

Once the archived copy of the database gets transferred onto the staging server, the next step is to send a notification email to the team confirming the same. In order to achieve this, I first create a Database Mail profile named DBMail on the server (a sketch of creating such a profile appears after the script below) and then use the below T-SQL script against the msdb database to send the email alert.

EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'DBMail',
    @recipients = 'abc@abc.com',
    @subject = 'ABC Production Database Full Backup Has Been Successfully Copied Onto The Staging Server (PQR)',
    @body = '
Hi Team,

The Full Backup of the ABC Production Database Has Been Successfully Copied Onto The Staging Server (PQR).

With Thanks,
SQL DBA Support Team
';
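For completeness, the DBMail profile referenced above can be created with the Database Mail configuration procedures in msdb. The account name, SMTP server and addresses below are placeholders, not the values from the original environment:

-- (Database Mail also requires the 'Database Mail XPs' option to be enabled via sp_configure.)
-- Create a Database Mail account (placeholder SMTP server and addresses)
EXEC msdb.dbo.sysmail_add_account_sp
    @account_name    = 'DBMailAccount',
    @email_address   = 'dba@abc.com',
    @display_name    = 'SQL DBA Support Team',
    @mailserver_name = 'smtp.abc.com';

-- Create the profile used by sp_send_dbmail above
EXEC msdb.dbo.sysmail_add_profile_sp
    @profile_name = 'DBMail',
    @description  = 'Profile used for backup notification emails';

-- Attach the account to the profile
EXEC msdb.dbo.sysmail_add_profileaccount_sp
    @profile_name    = 'DBMail',
    @account_name    = 'DBMailAccount',
    @sequence_number = 1;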

Once the email alert is sent, I then delete the original Full Backup from the Production Server itself. This is accomplished by executing the
below T-SQL against the master database on the server.

DECLARE @DeleteOldFull varchar(500);
SET @DeleteOldFull = 'xp_cmdshell ''del "B:\DB-BACKUPS\ABC\Full\*.BAK"''';
EXEC (@DeleteOldFull)
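Since this removes the last uncompressed copy of the backup from the Production server, a quick RESTORE VERIFYONLY against the file before the delete is a cheap safeguard worth considering. It is not part of the original procedure, and the file name below is only an example (the stored procedure stamps the real name with a timestamp):

-- Check that the backup file is complete and readable before deleting it
RESTORE VERIFYONLY
FROM DISK = 'B:\DB-BACKUPS\ABC\Full\ABC_2012_06_20_04_19_43.BAK';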

This is the approach which we use on our Production database server to manage the full backups that happen on a daily basis. With a backup size of 93 GB, it takes around 13.5 hours for the entire operation to complete. Also, with 16 CPUs and 32 GB of memory, to date we haven't faced any performance issue on the Production server. One important piece of advice: if the hardware on your server is poor, I would advise you not to implement the above procedure, because it will badly hurt the performance of the application.

After the full backup, the next step is to have a proper differential backup happening on the server. We normally have a policy of maintaining a copy of the differential backup on the server. Differential backups are important because, in case of emergencies, they allow faster recovery of data, as illustrated by the restore sequence sketched below.
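A sketch of such a recovery sequence, with example file names: restore the last full backup, apply the most recent differential, and then replay only the transaction log backups taken after that differential, rather than a whole day's worth of log backups.

-- 1. Restore the last full backup, leaving the database able to accept further restores
RESTORE DATABASE ABC
FROM DISK = 'B:\DB-BACKUPS\ABC\Full\ABC_Full.BAK'
WITH NORECOVERY;

-- 2. Apply the most recent differential backup on top of it
RESTORE DATABASE ABC
FROM DISK = 'B:\DB-BACKUPS\ABC\Differential\ABC_Diff.BAK'
WITH NORECOVERY;

-- 3. Replay only the log backups taken after that differential, then recover
RESTORE LOG ABC
FROM DISK = 'B:\LogShippingLogs\ABC_Log.TRN'
WITH RECOVERY;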

Also, one cannot completely rely on the transactional log shipping procedure. In order to take the differential backup, I have a job scheduled every 3 hours, which internally executes the below T-SQL.

DECLARE @name VARCHAR(50)      -- database name
DECLARE @path VARCHAR(256)     -- path for backup files
DECLARE @fileName VARCHAR(256) -- filename for backup
DECLARE @fileDate VARCHAR(20)  -- used for file name
DECLARE @SQL VARCHAR(8000)

SET @path = 'B:\DB-BACKUPS\ABC\Differential\'

SELECT @fileDate = CONVERT(VARCHAR(20), GETDATE(), 112)
    + '_'
    + REPLACE(CONVERT(VARCHAR(20), GETDATE(), 108), ':', '')

DECLARE db_cursor CURSOR FOR
SELECT name
FROM master.dbo.sysdatabases
WHERE name IN ('ABC')

OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @name

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @fileName = @path + @name + '_' + @fileDate + '.BAK'
    SET @SQL = 'BACKUP DATABASE ' + CONVERT(VARCHAR(256), @name) + '
TO DISK = ''' + CONVERT(VARCHAR(256), @fileName) + ''' WITH DIFFERENTIAL'
    EXEC (@SQL)

    FETCH NEXT FROM db_cursor INTO @name
END

CLOSE db_cursor
DEALLOCATE db_cursor
 
Where B:\DB-BACKUPS\ABC\Differential\ is the path where the differential backup of the database will reside. Just before a new copy of the differential backup is taken, we delete the old copy using the below T-SQL against the master database,

DECLARE @DeleteOldDiff varchar(500);
SET @DeleteOldDiff = 'xp_cmdshell ''del "B:\DB-BACKUPS\ABC\Differential\*.BAK"''';
EXEC (@DeleteOldDiff)

The above approach is the one which I use to manage the differential backups on my servers.

After the differential backups come the transaction log backups. For the transaction log backups, we have a transactional log shipping procedure in place on the Production server which takes a transaction log backup every 15 minutes. The retention period of the transaction log backups has been specified as 16 hours.
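Each of those 15-minute backups is, in effect, an ordinary transaction log backup along the lines of the statement below (the file name is only an example; log shipping generates its own names). At a 15-minute interval with a 16-hour retention, roughly 64 log backup files are kept on disk at any point in time.

-- What the log shipping backup job effectively runs every 15 minutes
BACKUP LOG ABC
TO DISK = 'B:\LogShippingLogs\ABC_20120620_0415.trn';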

This was an approach which we used on our Production server to effectively manage the database backups. Please do share your suggestions on how this backup strategy could be made more effective.

 

Regards

Satnam Singh


3 Comments on “Designing a Backup Plan (SQL Server)”

  1. Satnam, since you delete the backup from the production server, what if you need to restore a backup onto the production server in case of a database corruption (God forbid)? You said the entire operation takes ~13 hours; you can expect the same amount of time to restore to the production server in such a scenario! Why don't you retain one day's backup on the production server to be on the safer side?

     Although such scenarios are rare and you have a DR to take care of your availability, you never know….

  2. I agree with Naginder's comment. As per our backup planning, we must keep the backup on both sides, source and destination.

     Whenever a mishap happens at the database level or server level, we can then recover the database on an urgent basis.

  3. Satnam,

     Good description. I would like to point out a few things which will help any DBA while designing backup plans; most of them you would have read in experts' blogs already.
     1. A good backup is always the one which you test by restoring it. So have an environment where you can restore your backups and test well. At the least this should be a fire drill on DR to test your backup strategy at regular intervals.
     2. Recovery time of your production system should always be minimal. While you achieve your recovery point through the frequency of backups, recovery time is more important for the business. So plan efficiently so that you have minimum downtime in case you have to build a server from scratch in the worst case.
