Intelligently Backup SQL Server Analysis Service Databases

May 5, 2012 at 12:50 AMSteven Wang

There are normally two main approaches to backing up SQL Server Analysis Services (SSAS) databases:

1. Back up SSAS databases at a scheduled time

2. Back up SSAS databases after every processing operation.

Although there is nothing wrong with these two approaches, neither is ideal. Both are hard to manage, especially when there are many databases under the same SSAS instance. A better way is to back up SSAS databases intelligently, based on data change. A job can be set up to check the SSAS databases for changes at a scheduled interval; if a database has changed, in either data or schema, it will be backed up, and if nothing has changed, a new backup is not needed. In this post, I will use an SSIS package to manage the SSAS database backups intelligently. My solution is based on the logic below:

1. Only modified SSAS databases (data or schema) will be backed up.

2. A newly added SSAS database will always be backed up.

3. SSAS backups can be traced; the backup history will be kept.

4. Backup file retention can be configured (how many backup files need to be kept?).

To start with, I need two SSAS maintenance tables to manage the backup solution. I use the script below for my tables. (Some columns in this script are not specific to the backup purpose, but as they are related to the SSIS package, they are kept.)

CREATE TABLE [dbo].[Database_Backup_Admin]
(
	[Database_Name] [nvarchar](128) NOT NULL,
	[Data_Modified] [datetime] NULL,
	[Backup_Date] [datetime] NULL,
	[DNS_Instance_Name] [nvarchar](128) NOT NULL,
	[Instance_Type] [nvarchar](50) NOT NULL,
	[Backup_file_to_Keep] [tinyint] NULL,
	[Is_Current] [bit] NULL,
	[Is_Backuped_Latest] [bit] NULL,
	[CacheWarm_Date] [datetime] NULL,
	[Is_CacheWarm_Current] [bit] NULL,
	[Is_CacheWarm_On] [bit] NULL,
	CONSTRAINT [PK_Database_Backup_Admin] PRIMARY KEY CLUSTERED
	(
		[Database_Name] ASC,
		[DNS_Instance_Name] ASC,
		[Instance_Type] ASC
	)
)

CREATE TABLE [dbo].[Database_Backup_History]
(
	[Database_Name] [nvarchar](128) NOT NULL,
	[Backup_Date] [datetime] NOT NULL,
	[DNS_Instance_Name] [nvarchar](128) NOT NULL,
	[Instance_Type] [nvarchar](50) NOT NULL,
	[Backup_file_Name] [nvarchar](128) NULL,
	[Backup_file_Size] [int] NULL,
	[Backup_start_Time] [datetime] NULL,
	[Backup_End_Time] [datetime] NULL,
	[Backup_Duration_Second] [int] NULL,
	CONSTRAINT [PK_Database_Backup_History] PRIMARY KEY CLUSTERED
	(
		[Database_Name] ASC,
		[Backup_Date] ASC,
		[DNS_Instance_Name] ASC,
		[Instance_Type] ASC
	)
)


Once the tables were created, I built an SSIS package with the layout as below:

From the package task flow, the first two steps check whether all the databases in the Database_Backup_Admin table are up to date. If a database was on the server but has since been removed, the Is_Current column for that database will be set to 0. If a new database has been added to the server, it will be loaded into the table. A slowly changing dimension data transformation is used for this check, and the DBSCHEMA_CATALOGS DMV is used to list the SSAS databases on the server instance. The data flow task for the database check is as below:
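As a minimal sketch, the DMV can be queried like this against the SSAS instance (for example in an SSMS MDX window, or via OPENQUERY over a linked server); the column names come from the DBSCHEMA_CATALOGS rowset:

```sql
-- Lists the databases (catalogs) on the SSAS instance,
-- with the last time each was modified.
SELECT [CATALOG_NAME], [DATE_MODIFIED]
FROM $SYSTEM.DBSCHEMA_CATALOGS
```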

SSAS database data change can be detected with the MDSCHEMA_CUBES DMV. The LAST_DATA_UPDATE column of MDSCHEMA_CUBES carries a timestamp for the cube data change. The package updates the Data_Modified column of the Database_Backup_Admin table based on the LAST_DATA_UPDATE column of MDSCHEMA_CUBES.
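A sketch of that check, run against the SSAS instance (the CUBE_SOURCE filter is an assumption about how you may want to restrict the rowset, not part of the package itself):

```sql
-- Latest data update per cube; CUBE_SOURCE = 1 restricts the rowset to cubes
-- (as opposed to dimensions exposed through the same rowset).
SELECT [CATALOG_NAME], [CUBE_NAME], [LAST_DATA_UPDATE]
FROM $SYSTEM.MDSCHEMA_CUBES
WHERE [CUBE_SOURCE] = 1
```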

After the Data_Modified column is updated, Data_Modified is compared with Backup_Date. If Backup_Date is older than Data_Modified, Is_Backuped_Latest will be marked as false, which indicates that a new backup is needed.
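In T-SQL, that comparison boils down to something like this (a sketch of the logic; the actual update is performed inside the package):

```sql
-- Flag databases whose data changed after the last backup,
-- or that have never been backed up.
UPDATE dbo.Database_Backup_Admin
SET Is_Backuped_Latest = 0
WHERE Is_Current = 1
  AND (Backup_Date IS NULL OR Backup_Date < Data_Modified);
```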

The last Foreach Loop container backs up each SSAS database using the XMLA Backup command. Logic is also included to remove older backup files, based on the configured number of backup files to keep. The script used to remove the old backup files:

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports System.IO
Imports System.Windows.Forms

<System.AddIn.AddIn("ScriptMain", Version:="1.0", Publisher:="", Description:="")> _
<System.CLSCompliantAttribute(False)> _
Partial Public Class ScriptMain
	Inherits Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase

	Enum ScriptResults
		Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success
		Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
	End Enum
    Public Class CompareFileLastWriteTime
        Implements IComparer
        Public Function Compare(ByVal x As Object, ByVal y As Object) As Integer Implements IComparer.Compare
            'this "Compare" code is borrowed from:
            Dim File1 As FileInfo
            Dim File2 As FileInfo

            File1 = DirectCast(x, FileInfo)
            File2 = DirectCast(y, FileInfo)

            Compare = DateTime.Compare(File2.LastWriteTime, File1.LastWriteTime)
        End Function
    End Class

	Public Sub Main()

        Dim BackupDir As String = Dts.Variables("BackupDir").Value.ToString
        Dim DatabaseName As String = Dts.Variables("DatabaseName").Value.ToString
        Dim FilesToKeep As Integer = CInt(Dts.Variables("FilesToKeep").Value)
        Dim NewBackupFile = New FileInfo(BackupDir & Dts.Variables("BackupFileName").Value.ToString)
        Dim directory = New DirectoryInfo(BackupDir)
        Dim AllBackups = directory.GetFiles(DatabaseName & "_*")
        Dim FilesCount As Integer = 0

        'Sort the backup files newest first.
        Array.Sort(AllBackups, New CompareFileLastWriteTime)
        Dts.Variables("BackupEndTime").Value = Now

        'Delete any backup file beyond the configured retention count.
        For Each entry In AllBackups
            FilesCount += 1
            If FilesCount > FilesToKeep Then
                entry.Delete()
            End If
        Next

        'Record the size of the new backup file in KB.
        Dts.Variables("BackupFileSize").Value = CInt(NewBackupFile.Length / 1024)

		Dts.TaskResult = ScriptResults.Success
	End Sub

End Class


The package has the below variables:

Of these, the following variables need to be configured:

1. InstanceName

2. DatabaseName

3. BackupDir (the OLAP backup directory; please note that you need to replace each single \ with \\ if you set the value through a SQL Agent job)

4. TimeZone (the SSAS DMVs record times in UTC, so you need to set TimeZone to your local offset; for instance, New Zealand is 12)

5. BackupFilesToKeep (how many backup files need to be kept)
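The TimeZone adjustment is just an hour offset added to the DMV's UTC timestamp; a standalone sketch (the variable names and sample value are illustrative):

```sql
-- Convert a UTC timestamp from the SSAS DMVs to local time.
DECLARE @TimeZone int = 12;                              -- e.g. New Zealand (UTC+12)
DECLARE @Last_Data_Update datetime = '20120501 03:00';   -- sample UTC value
SELECT DATEADD(HOUR, @TimeZone, @Last_Data_Update) AS Data_Modified_Local;
```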

The BackupXMLA variable is an expression, which changes dynamically on each loop iteration:

"<Backup xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\">
  <Object>
    <DatabaseID>" + @[User::DatabaseName]  + "</DatabaseID>
  </Object>
  <File>" + @[User::BackupFileName]  + "</File>
</Backup>"

This is the command that performs the SSAS database backup.

 I have attached the package here.

Intelligent_OLAPDB_Backup_BICOE.dtsx (289.11 kb)


A SQL Agent job can be set up to perform the backup; remember to set the variables below to your own values:


Posted in: Backup | SSAS | SSIS


A simple Way to Migrate SQL Logins from One Instance to Another

March 19, 2012 at 6:39 AMSteven Wang

Subject: Transfer SQL logins with their original passwords and SIDs

Scope: this technique applies to SQL Server 2008 and later.

When we upgrade or migrate a SQL Server instance to a new instance, if we have lots of SQL logins the migration process can get bogged down in two thorny issues:

  • SQL Login SIDs
  • SQL Login passwords


The SSIS Transfer Logins Task can easily transfer Windows logins and groups, but not SQL logins. When the Transfer Logins Task is used to transfer SQL logins, it generates new passwords and SIDs for the transferred logins, and the logins are disabled on the destination SQL Server instance. This is not very useful in practice.


When a SQL login is created on the new SQL Server instance, a new SID is created and bound to the login. The database users in the migrated databases are then no longer mapped to the newly created SQL login, because they now use different SIDs.

Generally speaking, a SQL user can be remapped by using ALTER USER ... WITH LOGIN = [login_Name], or sp_change_users_login (a deprecated procedure; better to use ALTER USER). However, when there are lots of SQL logins this technique becomes cumbersome. In particular, in some environments a production database might frequently be restored to a test environment, and then you need to fix the SQL user mapping every time.
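For a single orphaned user, the remapping looks like this (the database and user names here are hypothetical):

```sql
USE MyAppDB;   -- hypothetical database name
-- Remap the orphaned database user to the server login of the same name.
ALTER USER [AppUser] WITH LOGIN = [AppUser];
```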

As for the SQL login password, it is also very awkward to retype the password when creating the SQL login on the new server instance.

Although the SQL login SIDs and passwords are stored in the old instance's master database, before SQL Server 2008 there was no easy way to script out this binary information and transfer it to a new instance. There are a few very clever techniques available in the SQL community, which mainly use a complicated function to convert the binary data into a string and then convert it back to binary on the destination instance. In SQL Server 2008 and later, however, these techniques became overkill.

In SQL Server 2008, the CONVERT function gained a new feature for converting binary data types to other data types (check Books Online for more details). The syntax is as below:

CONVERT ( data_type [ ( length ) ] , expression [ , style ] )


CONVERT(varchar(max), 0xE4305FD31D353546B4EC6E56C906E912, 1)


When the expression is binary(n) or varbinary(n), three style options are available: 0 (the default), 1, and 2.

With style 0 (the default), binary bytes are translated to ASCII characters; each byte is converted 1:1.

With style 1, binary bytes are translated to a hexadecimal character string, and the characters 0x are added to the left of the converted result.

With style 2, binary bytes are translated to a hexadecimal character string without the 0x prefix.
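A quick side-by-side of the three styles (0x4142 is the ASCII encoding of 'AB'):

```sql
SELECT CONVERT(varchar(max), 0x4142, 0) AS Style0,  -- 'AB'     (raw ASCII)
       CONVERT(varchar(max), 0x4142, 1) AS Style1,  -- '0x4142' (hex with prefix)
       CONVERT(varchar(max), 0x4142, 2) AS Style2;  -- '4142'   (hex, no prefix)
```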

SQL login SIDs and password hashes both use the 0x prefix, so we can use binary style 1 to script out the SIDs and hashed passwords from the old server instance and apply them to the new instance.

The code to script out the SQL logins with the CONVERT function becomes very simple; an example script is as below:

Select 'Create Login ' + QUOTENAME(A.name)
+ ' With Password=' + CONVERT(varchar(max), A.password_hash, 1) + ' hashed'		--script out the password
+ ', DEFAULT_DATABASE=' + quotename(A.default_database_Name) --if you apply the script to a mirrored or log-shipped server, use master as the default database, as the user database is not in a usable state
+ ', DEFAULT_LANGUAGE=' + quotename(A.default_language_Name)
+ ', CHECK_POLICY=' + Case A.is_policy_checked when 0 then 'OFF' When 1 Then 'On' End 
+ ', CHECK_EXPIRATION=' + Case A.is_expiration_checked when 0 then 'OFF' When 1 Then 'On' End
+ ', SID=' + CONVERT(varchar(max), A.SID, 1)		--script out the SID
 As SQLLogin
From sys.sql_logins A
Where A.name Not like '##%##'  --exclude system-generated SQL logins
And A.sid != 0x01 --the SA SID is always the same

(Note: if you have problems copying and pasting the script into SSMS, copy it into Office Word first and then paste it into SSMS.)

You can use this script to store the SQL login creation statements in a table on the target server, and then use a cursor or loop to execute them to transfer the SQL logins.
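A sketch of that execution loop, assuming the generated statements were stored in a hypothetical staging table dbo.Login_Scripts(SQLLogin):

```sql
DECLARE @sql nvarchar(max);
DECLARE login_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT SQLLogin FROM dbo.Login_Scripts;   -- hypothetical staging table
OPEN login_cur;
FETCH NEXT FROM login_cur INTO @sql;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sp_executesql @sql;                  -- runs one CREATE LOGIN statement
    FETCH NEXT FROM login_cur INTO @sql;
END
CLOSE login_cur;
DEALLOCATE login_cur;
```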

In the next blog, I will go into more depth on how to use an SSIS package to transfer logins and SQL Server permissions from one server to another.

Posted in: Backup | Database Administration | T-SQL


Using Trace Flag 3042 for Backup Compression

October 20, 2011 at 7:56 AMSteven Wang

While we were replacing Redgate SQL Backup with the SQL Server 2008 R2 native backup compression, we noticed a huge backup file size increase at the beginning of the backup, which nearly filled up our backup disk volume.

The database we were backing up was close to 2 terabytes, and using Redgate SQL Backup with compression, the final backup size was roughly 376 GB, as below:

When we used the SQL server native backup with compression


Backup database MyDB to disk = N'I:\Backups\CompressionBackup_Without_TF.bak' With init, compression


the initial file size went up to 638 GB, roughly one third of the original database size, as below:

This is a huge 262 GB increase in backup file size, which completely threw off our backup disk capacity plan, based as it was on the Redgate compression ratio.

Surprisingly, once the backup completed, the file size shrank back to around 380 GB, a similar size to the Redgate SQL Backup compressed file.

This is in fact the default behaviour of SQL Server native backup compression. For a compressed backup, it is hard for SQL Server to determine the final size of the backup in advance, so it simply creates an initial backup file sized at 1/3 of the database size (the pre-allocation algorithm). If more disk space is needed during the course of the backup, it dynamically extends the file as required. If the final backup size is smaller than the initial 1/3 of the database size, it truncates the file to the actual size.

This is not good enough for us: as I mentioned above, we had already planned our capacity based on Redgate SQL Backup, and at some point we might not have enough disk space to hold the initial backup files. We really want only the actual space required for the compressed backup.

This issue was solved by introducing trace flag 3042. Trace flag 3042 bypasses the pre-allocation algorithm and grows the file as needed, with a small performance penalty.

This trace flag was not officially documented, but a few months ago it was announced by Microsoft Support; see the article Space requirements for backup devices in SQL Server.

With the trace flag on, you can see that the initial backup file size is 0, and the file grows up to its final size.

Backup compression with trace flag 3042 on:


dbcc traceon(3042)

Backup database BiW to disk = N'I:\Backups\CompressionBackup_With_TF.bak' With init, compression

dbcc traceoff(3042)


The initial file size was 0, as below:

You can turn this trace flag on globally by adding -T3042 to the startup parameters.
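If a restart is not convenient, the flag can also be enabled globally for the running instance with DBCC TRACEON (the -1 argument makes it global rather than session-scoped):

```sql
DBCC TRACEON(3042, -1);   -- enable globally until the instance restarts
DBCC TRACESTATUS(3042);   -- verify the flag is on
```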

Posted in: Backup | Trace Flag
