Add a table to an Existing Transactional Replication using T-SQL and Generate Snapshot using T-SQL

Replace PublishedDB with your published database, PublicationName with your publication name, and TableName with the table you want to replicate:

-- Adding TableName to Publication
use [PublishedDB]

exec sp_addarticle
@publication = N'PublicationName',
@article = N'TableName',
@source_owner = N'dbo',
@source_object = N'TableName',
@type = N'logbased',
@description = N'',
@creation_script = N'',
@pre_creation_cmd = N'truncate',
@schema_option = 0x000000000803509F,
@identityrangemanagementoption = N'manual',
@destination_table = N'TableName',
@destination_owner = N'dbo',
@status = 24,
@vertical_partition = N'false',
@ins_cmd = N'CALL [sp_MSins_dboTableName]',
@del_cmd = N'CALL [sp_MSdel_dboTableName]',
@upd_cmd = N'SCALL [sp_MSupd_dboTableName]'

The next step is to start the Snapshot Agent so it picks up the new article added to the publication and replicates it across to the Subscriber.

Replace PublicationName with your existing publication name.

-- Start the Snapshot Agent job.
USE [PublishedDB]
EXEC sp_startpublication_snapshot @publication = N'PublicationName';
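
To confirm the article is now part of the publication, sp_helparticle can be run in the published database. A minimal sketch, assuming the same placeholder names as above:

-- Verify the new article exists in the publication
USE [PublishedDB]
EXEC sp_helparticle
@publication = N'PublicationName',
@article = N'TableName';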


Script to check Job history of a particular job.

use msdb
select * from sysjobhistory sjh
join sysjobs sj
on sjh.job_id = sj.job_id
where sj.name = '<Your Job Name here>'
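
If you want the most recent runs first, the same query can be ordered by run date and time; msdb also has an undocumented agent_datetime helper that converts the integer run_date/run_time columns into a datetime. A sketch, assuming that helper exists on your build:

-- Most recent history first for a given job
use msdb
select sj.name,
       msdb.dbo.agent_datetime(sjh.run_date, sjh.run_time) as run_datetime,
       sjh.step_id,
       sjh.run_status,   -- 1 = succeeded, 0 = failed
       sjh.message
from sysjobhistory sjh
join sysjobs sj
on sjh.job_id = sj.job_id
where sj.name = '<Your Job Name here>'
order by sjh.run_date desc, sjh.run_time desc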

Script to get All database file sizes, used and unused space

-- Author: Suleman Mohammed
-- Blog:
-- Create date: 29-05-2012
-- Description:

-- Sproc to check data and log file sizes, used space and
-- unused space

-- Usage: To check the size of a particular database
-- EXECUTE uspCheckDBSize
-- @dbname = <DatabaseName>
-- GO
-- Usage: To check the size of all databases
-- EXECUTE uspCheckDBSize
-- GO
-- Version Control 1.0 - 29-05-2012 Created sproc - SM
-- Version Control 2.0 - 12-11-2013 Added logic to get sizes of all databases - SM

IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[uspCheckDBSize]') AND type in (N'P', N'PC'))
DROP PROCEDURE [dbo].[uspCheckDBSize]
GO

CREATE PROCEDURE [dbo].[uspCheckDBSize]
 @dbname SYSNAME = NULL
AS
BEGIN
 SET NOCOUNT ON;

 DECLARE @cmd NVARCHAR(MAX);

 CREATE TABLE #dbsizes
 (
 DatabaseName SYSNAME,
 DBFileName SYSNAME,
 FileSizeMB NUMERIC(10,2),
 UsedSpaceMB NUMERIC(10,2),
 UnusedSpaceMB NUMERIC(10,2),
 [Type] VARCHAR(10)
 );

 -- No database name supplied: collect sizes for every database via sp_msforeachdb
 IF @dbname IS NULL
 SET @dbname = '?'

 SET @cmd = N'USE ' + QUOTENAME(@dbname) + N';
 SELECT DB_NAME() AS [DatabaseName],
 [DBFileName] = ISNULL(a.name, ''*** Total size of the database ***''),
 [FileSizeMB] = CONVERT(NUMERIC(10, 2), SUM(ROUND(a.size / 128., 2))),
 [UsedSpaceMB] = CONVERT(NUMERIC(10, 2), SUM(ROUND(FILEPROPERTY(a.name, ''SpaceUsed'') / 128., 2))),
 [UnusedSpaceMB] = CONVERT(NUMERIC(10, 2), SUM(ROUND((a.size - FILEPROPERTY(a.name, ''SpaceUsed'')) / 128., 2))),
 [Type] = CASE WHEN a.groupid IS NULL THEN '' ''
 WHEN a.groupid = 0 THEN ''Log''
 ELSE ''Data''
 END
 FROM sysfiles a
 GROUP BY a.groupid, a.name WITH ROLLUP
 HAVING a.groupid IS NULL OR a.name IS NOT NULL
 ORDER BY CASE WHEN a.groupid IS NULL THEN 1 ELSE 0 END, a.groupid;'

 IF @dbname = '?'
 INSERT INTO #dbsizes
 EXECUTE sp_msforeachdb @cmd
 ELSE
 INSERT INTO #dbsizes
 EXECUTE sp_executesql @statement = @cmd

 SELECT * FROM #dbsizes

 DROP TABLE #dbsizes
END
GO


How to realign your skills as a DBA

One of the things a lot of data professionals ask me in this age of data overload is how to realign your skills as a DBA. Those who are starting their careers, are junior-level DBAs or developers, or have had a career break and want to get back on track often have concerns about not being able to keep up with technology.

My advice if you are in this situation is to take a step back and get a bird's-eye view of your skill set and identify what you are good at. Then take another step back and look at where the current market is going. For example, with the SQL Server 2012 release last year there is a lot of scope for database migrations and upgrades, so do you have the skills to do this? Obviously you can't live your entire life on SQL 2000, SQL 2005 or SQL 2008 DBA skills :). So upgrade your skills. The SQL Server 2014 CTP is out and the release is expected soon.

There is a lot of help and advice freely available from the experts in the industry. This is one of the biggest things I like about SQL Server: we have a worldwide community who share the experience, knowledge and wisdom gained from their practical exposure to new technology, and it only gets bigger each day. While the technology is changing every day, there are pros and cons with every change; you might not necessarily know all of this from day one, but if you know where to look for information you can learn from the experiences of the experts. Another example I can give here: for somebody who is wondering how good the new features of SQL Server 2012 are, refer to the blog post.

Set aside 30 minutes every day to quickly browse through blogs from the experts, and learn at least one new thing every day.

If you have any questions or need help with SQL Server, drop me an email, and I might be able to point you in a direction to answer your question.


Run multiple batch files simultaneously in background every 2 seconds

I have been working on a custom monitoring tool to track version store information for row versioning and snapshot isolation performance on tempdb.
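
As a rough idea of the kind of queries those scripts run, the following is a sketch (not the actual monitoring scripts) using the sys.dm_tran_version_store and sys.dm_tran_active_snapshot_database_transactions DMVs:

-- Approximate version store usage per database
SELECT DB_NAME(database_id) AS DatabaseName,
       COUNT(*) AS VersionRecords,
       SUM(record_length_first_part_in_bytes + record_length_second_part_in_bytes) / 1024. AS ApproxSizeKB
FROM sys.dm_tran_version_store
GROUP BY database_id;

-- Longest-running transactions currently using row versioning
SELECT TOP 10 transaction_id, session_id, elapsed_time_seconds
FROM sys.dm_tran_active_snapshot_database_transactions
ORDER BY elapsed_time_seconds DESC;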

I had to run multiple SQL scripts I developed simultaneously and in the background every 2 seconds. So there were basically three requirements:

  • Run multiple batch files simultaneously
  • Run all the batches every 2 sec
  • Run them all in the background so you don't see the cmd prompt every time the scripts are called.

Run batch files Simultaneously:

Create a master batch file (Monitor.bat) that calls the individual batch scripts simultaneously. Use start cmd.exe /C for each batch file you want to run.

start cmd.exe /C  D:\batch1.bat
start cmd.exe /C  D:\batch2.bat
start cmd.exe /C D:\batch3.bat

Run all the batches every 2 seconds

To run all three batches every 2 seconds, create a loop with a 2-second delay using timeout. The only way to stop this loop is to close the Monitor.bat cmd prompt.


:Loop
start cmd.exe /C D:\batch1.bat
start cmd.exe /C D:\batch2.bat
start cmd.exe /C D:\batch3.bat

timeout /T 2
GOTO :Loop

Run them all in the background so you don't see the cmd prompt every time the scripts are called.

Use the /min option on start to run each cmd window minimized, effectively in the background.

The complete Monitor.bat master batch file will look like this:


:Loop
start /min cmd.exe /C D:\batch1.bat
start /min cmd.exe /C D:\batch2.bat
start /min cmd.exe /C D:\batch3.bat

timeout /T 2
GOTO :Loop


File *.ps1 cannot be loaded because the execution of scripts is disabled on this system. Please see “get-help about_signing” for more details.


PS D:\> .\ScriptKeys.ps1
File D:\ScriptKeys.ps1 cannot be loaded because the execution of scripts is disabled on this system. Please see “get-help about_signing” for more details.
At line:1 char:17
+ .\ScriptKeys.ps1 <<<<
+ CategoryInfo          : NotSpecified: (:) [], PSSecurityException
+ FullyQualifiedErrorId : RuntimeException


PS D:\> Set-ExecutionPolicy Unrestricted

Execution Policy Change
The execution policy helps protect you from scripts that you do not trust. Changing the execution policy might expose
you to the security risks described in the about_Execution_Policies help topic. Do you want to change the execution
policy?
[Y] Yes  [N] No  [S] Suspend  [?] Help (default is “Y”): y
Set-ExecutionPolicy : Access to the registry key ‘HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\PowerShell\1\ShellIds\Microsoft
.PowerShell’ is denied.
At line:1 char:20
+ Set-ExecutionPolicy <<<<  Unrestricted
+ CategoryInfo          : NotSpecified: (:) [Set-ExecutionPolicy], UnauthorizedAccessException
+ FullyQualifiedErrorId : System.UnauthorizedAccessException,Microsoft.PowerShell.Commands.SetExecutionPolicyCommand


Run PowerShell as Administrator and re-run either of the commands below to fix both of the errors above (RemoteSigned is the more restrictive, safer option):

Set-ExecutionPolicy Unrestricted
Set-ExecutionPolicy RemoteSigned


Merge Statement for a SQL Server 2000 compatible Database

I was recently asked whether we can write a MERGE T-SQL statement to handle inserts and updates in the same statement for a database on SQL Server 2008 at SQL 2000 (80) compatibility level.

The answer is both YES and NO. I have a SQL Server 2008 instance on SP3 and the illustration below works on it; tried on a SQL Server 2008 instance on SP1, it fails with the error below:

Msg 325, Level 15, State 1, Line 214

Incorrect syntax near ‘merge’. You may need to set the compatibility level of the current database to a higher value to enable this feature. See help for the SET COMPATIBILITY_LEVEL option of ALTER DATABASE.

So, depending on the build you are on, you can use the MERGE statement to handle inserts and updates for a SQL 2000 compatible database.

Test it for yourself using the below script.

create database testmerge
go

use testmerge
go

alter database testmerge
set compatibility_level = 80
go

create table merge_test
(     id int identity(1,1) primary key,
      name sysname,
      email varchar(250)
)
go

create procedure uspTestMerge
      @name sysname,
      @email varchar(250)
as
merge merge_test as target
using (select @name, @email) as source (name, email)
on (target.name = source.name)
when matched then
      update set target.email = source.email
when not matched then
      insert (name, email)
      values (source.name, source.email)
output deleted.*, $action, inserted.*;
go

exec uspTestMerge 'Ivan',''

exec uspTestMerge 'david',''

exec uspTestMerge 'suleman',''

select * from merge_test

exec uspTestMerge 'david',''

select * from merge_test
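
For an actual SQL Server 2000 instance (rather than a 2000-compatibility database on a 2008 instance), MERGE is not available at all. Here is a sketch of the classic pre-MERGE upsert pattern, as a hypothetical rewrite of the procedure above (not from the original post):

-- uspTestMerge_2000 is a hypothetical name, for illustration only
create procedure uspTestMerge_2000
      @name sysname,
      @email varchar(250)
as
begin
      if exists (select 1 from merge_test where name = @name)
            update merge_test set email = @email where name = @name
      else
            insert merge_test (name, email) values (@name, @email)
end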

Database Migration Automated

It's been a while since I posted anything here…

Phew… finishing a critical database migration project with a very tight deadline today.

I have done myself a favour by automating the database migration work, which would otherwise have required me to perform the same tasks for 120 databases over around 2-3 weeks…

Here is an overview of what my automated script does to perform the pre-cutover tasks.

My proposal was to migrate across everything that could be migrated prior to cutover, minimizing the amount of work involved in the cutover itself and reducing downtime.

Part 1:-

  • Configure the Distributor and Publisher, and enable the database to synchronize from backup (to make sure I would not need to generate a full snapshot to initialize, as the business requirement was to do this task with the least overhead on the live systems)
  • Create an empty publication with no articles. (Again, I could add all the articles, but I chose not to, so I could control which articles I would like to replicate across.)
  • Back up the database to be migrated across
  • Verify the backup taken above is valid
  • Copy the backup across the network to the destination server (scripted Robocopy with multi-threading (/MT:100); flows like a breeze on a dedicated network)
  • Restore the database on the destination server.
  • Create a push subscription (a rough sketch of the "initialize with backup" options follows this list)
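
A minimal sketch of the "initialize with backup" replication options referenced above. The server, publication and file names are placeholders, not the actual migration script:

-- On the publisher, in the published database: allow initialization from a backup
USE [PublishedDB]
EXEC sp_addpublication
@publication = N'PublicationName',
@allow_initialize_from_backup = N'true';

-- Add articles with sp_addarticle as required, then create the push subscription
-- pointing at the backup that was restored on the subscriber
EXEC sp_addsubscription
@publication = N'PublicationName',
@subscriber = N'SubscriberServer',
@destination_db = N'PublishedDB',
@subscription_type = N'push',
@sync_type = N'initialize with backup',
@backupdevicetype = N'disk',
@backupdevicename = N'\\SubscriberServer\Backups\PublishedDB.bak';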

The script logs each task, organizing the output in a folder structure with the ServerName and a subfolder with the DatabaseName, and creates a txt log file with the results of each step.

The script then prompts the user to check the log files to make sure there are no errors, and to press any key to continue the migration.

Part 2:-

  • Copy across logins with their SIDs to the destination server (I did this using an SSIS package; sp_help_revlogin could also be used)
  • Copy across Agent jobs and SSIS packages (again, did this using an SSIS package)
  • Copy across Linked Servers
  • Automated LogShipping setup.

Issues faced:

I faced no unusual issues other than the things you would normally face when setting up transactional replication using a backup to initialize. We could have avoided these issues if we had had enough time to analyze the databases prior to actually working on the migration; I was put on the task with no lead time to investigate.

A subset of the issues I faced:

  • Errors with the TimeStamp column when initializing using a backup. The business requirement was not to change any database schema, so converting the TimeStamp column to varbinary was ruled out. I got lucky, as the databases with this issue were all small (up to 8 GB) and I could easily initialize them by delivering a full snapshot without any concerns.
  • Tables without primary keys; replication only works for tables with primary keys defined on them. I prepared a script that gave me a row count for each table in every user database on each of the source and destination servers I was migrating, pulled this data into Excel and prepared a pivot table, and it was very easy to compare which tables were out of sync. Thanks to Redgate Data Compare and an SSIS package I could sync these tables without any issues. The trick is to select all the columns for comparison.
  • Ever-changing business requirements.
  • Application teams were busy working towards their release and had BLOB data imports into the databases I was replicating across in preparation for the migration cutover. I had to increase max text repl size to let the application import the BLOB data (see the sketch after this list).
  • A couple of log-shipped databases on the source were included in the list for this pre-migration replication activity; it was later decided to set up log shipping separately for these two databases. Replication would not let me remove the subscription and publication because the subscriber was a log-shipped read-only database, so I had to manually remove replication on the publisher and subscriber individually.
  • SQL Server Evaluation Edition was installed on the destination servers, so I had to upgrade it to Enterprise Edition, and had to manage a SQL Server Active-Passive Windows cluster.
  • I had to move the distribution database to another drive at some point in the pre-migration process due to lack of disk space on the servers.
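
A minimal sketch of raising the replication text limit mentioned above (the value shown is an example, not the one used on the project):

-- max text repl size is specified in bytes; on SQL Server 2008 and later, -1 means no limit
EXEC sp_configure 'max text repl size (B)', 2147483647;
RECONFIGURE;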

Passed Exam 70-465 : Designing Database Solutions for Microsoft SQL Server 2012

Passed 70-465 : Designing Database Solutions for Microsoft SQL Server 2012 on Saturday the 11th Aug 2012

Watch this space for my next stop i.e., Exam 70-463: Implementing a Data Warehouse with Microsoft SQL Server 2012


Passed Exam 70-464 : Developing Microsoft SQL Server 2012 Databases

Passed Exam 70-464 : Developing Microsoft SQL Server 2012 Databases

Watch this space for my next stop i.e., 70-465 : Designing Database Solutions for Microsoft SQL Server 2012
