Archive for the ‘SQL Server’ Category

exec sp_executesql nvarchar(max) truncating at 4000 characters

November 28th, 2013 1 comment

The Problem

Sometimes you may need to write dynamic SQL.

One example is writing query for running over linked server connections. You might write something like this

Create Procedure prcTest
    @start datetime
As
    Declare @sql nvarchar(max)

    -- (#tblTemp is created here; column list omitted for brevity)

    set @sql = 'Insert Into #tblTemp
    select * from OpenQuery(LINKSERVER, ''Select * from dbo.tblRemote r
        inner join tblRemote2 r2 on r.Id = r2.fkId
    .... lots more sql x 4000+ characters
    Where r.DateCreated > ''''' + Convert(varchar, @start, 106) + ''''''')'

    exec sp_executesql @sql

    select * from #tblTemp inner join tblOther on id = otherId

Assuming the statement exceeds 4000 characters, @sql will be truncated at 4000 characters; the generated SQL will be malformed (it is missing its closing quote) and will fail at runtime.

The Resolution

So what is going on?

I’ve read everywhere on the internet about nvarchar(max) having a 2GB limit etc, but it is clearly truncating at 4000 characters.

Add in a

select Len(@sql)

to confirm that it is definitely a 4000 character limit.

So what is going on?

Implicit Type Conversion!!!

The innocent looking code adding a date to the query,

 ''''' + Convert(varchar, @start, 106) + '''''

is causing SQL Server to perform an implicit type conversion: unless at least one operand in a string concatenation is a MAX type, the result is capped at 4000 characters (nvarchar) before it is assigned, so our nvarchar(max) is effectively becoming an nvarchar(4000).

Change the addition of the date to the following

''''' + Convert(nvarchar(max), @start, 106) + '''''

The problem is averted and the select Len(@sql) will happily count beyond 4000.

(BTW, convert(varchar, @date, 106) creates a '01 Jan 2000' formatted date, which avoids the problems that different regional settings on your SQL Server can cause.)
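A minimal repro sketch of the behaviour (not the original procedure): REPLICATE, like concatenation, only produces a MAX type when at least one of its inputs is a MAX type, so it is a handy way to demonstrate the cap.

```sql
Declare @sql nvarchar(max)

-- Neither operand is nvarchar(max), so the concatenation is capped at 4000 characters
set @sql = Replicate(N'x', 3000) + Replicate(N'y', 3000)
select Len(@sql)   -- 4000

-- Casting one operand to nvarchar(max) makes the whole expression a MAX type
set @sql = Cast(Replicate(N'x', 3000) as nvarchar(max)) + Replicate(N'y', 3000)
select Len(@sql)   -- 6000
```

The same rule explains the fix above: once Convert(nvarchar(max), @start, 106) enters the concatenation, the result stays nvarchar(max).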


Categories: SQL Server, T-SQL Tags:

SSIS 2012 Project Deployment – Set CheckpointFile with Parameters

July 23rd, 2013 No comments

The Problem

SSIS in SQL Server 2012 now supports a “Project Deployment” model. Adding things like connection strings is relatively intuitive.
I want to use a parameter to set my CheckpointFileName property.
How can you do this so it can be configured per environment?

The Solution

  1. Create a parameter in your package and call it “CheckpointFileName”
  2. Click On the ellipses “…” on the Expressions property.
  3. In the Property column, choose CheckpointFileName from the drop-down
  4. Set the expression by opening the expression builder and choosing the parameter you just created (sometimes you need to drag and drop the parameter into the expression box)


  5. Click OK

Now you can deploy your package, set this parameter as part of your environment setup, and change the checkpoint file name as required.
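For reference, the final expression on the package's CheckpointFileName property simply references the parameter (assuming the package-scoped parameter created in step 1):

```
@[$Package::CheckpointFileName]
```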


Categories: SQL Server, SSIS Tags:

Post Deployment Script, 72006: Fatal scripting error: Incorrect syntax was encountered while parsing

July 12th, 2013 No comments

The Problem

You create a post deployment script, with a statement like;

:r .\Static Data\Insert Seed Values.sql

The system throws an error and will not build:

72006: Fatal scripting error: Incorrect syntax was encountered while parsing ‘Data\Insert’

The solution

There are two problems:

  1. The post deployment script does not like spaces in file paths, so wrap the path in "double quotes":
    :r ".\Static Data\Insert Seed Values.sql"
  2. The post deployment script runs in SQLCMD mode, so while the script file is open in the editor it will show syntax errors. Either:
    1. Close the post deployment script; or
    2. Turn on SQLCMD mode for the file (in 2012 with the latest SSDT this is the last icon in the toolbar).
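Putting both points together, a post deployment script that includes multiple seed files would look like this (the second file name is purely illustrative):

```sql
:r ".\Static Data\Insert Seed Values.sql"
:r ".\Static Data\Insert Lookup Values.sql"
```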


Categories: SQL Server, SSDT, Visual Studio Tags:

SSDT generate script throws an “object not set to an instance of an object” error

June 3rd, 2013 No comments

The Issue

I created a Visual Studio Database project by scripting the code directly off the server in question.

When I then try to publish the project by clicking Deploy –> Generate Script, I get an “Object reference not set to an instance of an object” error.

I get no other information about the problem

The Investigation

So to work out what was going on I excluded all the stored procedure and function logic from the project and tried to generate the script, and it worked.

The next step was to include scripts back into the project, trying to “Generate Script” every time I added scripts back until it failed.

The problem

CREATE FUNCTION [dbo].[GetSomething]()
RETURNS TABLE
AS RETURN
    SELECT 1 as column1, 2 as column2

A script like the one above causes it to fail to generate successfully.

The solution

The problem is that it does not like the function returning a select statement directly.

Change the code to use a table variable, like so:

CREATE FUNCTION [dbo].[GetSomething]()
RETURNS @returntable TABLE
(
    column1 int,
    column2 int
)
AS
BEGIN
    Insert Into @returntable
    SELECT 1 as column1, 2 as column2
    RETURN
END


The project will now build.


Categories: SQL Server Tags:

How to use SQLCMD variables for a Linked Server Connection with Intellisense

March 13th, 2013 10 comments

The Problem

For a long time I have struggled with how to correctly set up a SQL Server database project. The main sticking point is scripting views that use linked server connections.

Generally when you create a view that references a linked server connection you will get the following error:

SQL71561: View: [dbo].[vwTest] has an unresolved reference to object [LinkedServer].[RemoteDB].[dbo].[tblTest]. 

The End Game

I have been able to achieve the holy grail of Visual Studio database projects:

  1. A solution that compiles even with views that run across linked server connections.
  2. The solution has Intellisense working for the linked server database.
  3. The ability to adjust the details of the linked servers when publishing the database via the SQLCMD Variables.

The Solution

The following instructions are performed on Visual Studio 2012 / SQL Server 2012 and SQL Server Data Tools (SSDT)

Step 1 – Create a database project for your linked server, (RemoteDB)

You need a .dacpac file of your target database.
The easiest way to do this is:

  1. Create a new SQL Server Project
  2. Right Click on the database project and choose Import –> Database
  3. Create a connection to your remote/linkedserver target database.
  4. Click finish and visual studio will create a scripted copy of your database.
  5. Build the project and fix any compile issues

In my case I was trying to reference an Enterprise level database with a few thousand tables, but rather than generate the entire database I just created the 20 – 30 tables that are referenced most often and used that as my DB project. Over time as I need more and more tables I will just script and import a few tables at a time, via the Import –> Script option.

Once successfully built the .dacpac file will be found in the “bin\Debug” directory

Note: I also generally write a post build event and copy the .dacpac file to a central higher level folder to allow any project to reference the .dacpac file from a central location.

Step 2 – Add a database reference

In the main database project add a database reference as follows:

Some points about these settings:

  1. Choose your .dacpac file that you generated in step 1.
  2. Database Location – even though the database is on a “Different Database, Different server”, you need to set it as “Different Database, same server”. This setting is what allows intellisense to work.
  3. Suppress errors – this is just a safeguard in case there are errors in your .dacpac

Step 3 – Create a view

Create a view and you will be able to do:

from [$(RemoteDB)].dbo.tblTest

You should get full intellisense on your schema as compiled into your .dacpac file from step 1.
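For example, a complete view might look like this (using the vwTest and tblTest names from the error message earlier; the column usage is illustrative):

```sql
CREATE VIEW [dbo].[vwTest]
AS
SELECT t.*
FROM [$(RemoteDB)].dbo.tblTest t
```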


Step 4 – Set up your SQLCMD Variable

Our replacement variable value is the two-part name LinkedServer].[RemoteDB
NOTE: Leave the leading and trailing braces off, as the outer braces are supplied by your SQL script where you type [$(RemoteDB)].
This way, when the project is deployed, your solution will generate the correct four-part name for your linked server objects.
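To illustrate the substitution with the names used in this example:

```sql
-- In the project, with the SQLCMD variable RemoteDB set to: LinkedServer].[RemoteDB
SELECT * FROM [$(RemoteDB)].dbo.tblTest

-- After variable replacement at deploy time this becomes:
-- SELECT * FROM [LinkedServer].[RemoteDB].dbo.tblTest
```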

SQLCMD Variables

Step 5 – Publish

Publish the solution:

  1. Choose Publish
  2. Click “Load Variables” to load in the values set in step 4.
  3. Choose the database to deploy to
  4. Click “Create Profile”, to save your publishing profile
  5. Click “Generate Script”
  6. Review the Generated Script and you will note that for some reason the single ] has been changed to ]], remove the extra ]
  7. Right-Click on the window and choose “Execute” to run the script

If it were not for the double brace ]] issue (which I have raised with Microsoft on Microsoft Connect), you would be able to just click Publish.

The value in the SQLCMD Variable will be replaced into the script when deployed, and your table references will become 4-Part table names with [LinkedServer].[RemoteDB].[Schema].[Table] as the output.

Please Download a working sample of the solution.


Let me know if this works for you.


Categories: SQL Server Tags:

SQL Server 2008 R2 – Intellisense not working – Visual Studio 2010 SP 1

March 18th, 2011 11 comments


Recently intellisense in SQL Server Management Studio stopped working.

It appears that it was caused by the installation of Visual Studio 2010 SP1.

The issue was raised here:

It was originally suggested that this could be fixed by SQL Server 2008 R2 Cumulative Update #6, but this in turn led to a crash in Visual Studio 2010 whenever you opened a .sql file in your database projects.

It is suggested that this issue has been fixed and will be released in SQL Server 2008 R2 Cumulative Update #7, which was due out in mid-April.

You can install Cumulative Update #6 and in most cases this will get intellisense working in SSMS but it will break Intellisense (on database projects) in VS2010.

So if you are like me, you are between a rock and a hard place, your choice of which environment you prefer to have working.

Lets hope that CU #7 fixes it.

I will update this post with my results after I have applied CU #7.


Update as at 19th April

Today Microsoft has released Cumulative Update package 7 for SQL Server 2008 R2

After downloading and installing, preliminary results are that the problem is solved.

Intellisense in SSMS seems to be working and I no longer get the crashes in Visual Studio 2010 and Intellisense in Visual Studio 2010 database projects also seem to be working correctly. So, so far so good.

Update as of Nov 2011

As someone has pointed out the release of SQL Server 2008 R2 SP1 is supposed to fix this issue, (as it contains all the rolled up cumulative updates).

I personally have had mixed results, with some machines working again and others having intellisense working in SQL Server Management Studio but remaining broken in Visual Studio 2010. My main work development environment is still broken in this way, and if I ever get it working I will update you on what I had to do, but for now I am living with only having intellisense in SQL Server Management Studio.


Categories: SQL Server Tags:

Reconnect all SQL logins after DB restore

March 2nd, 2011 No comments


Below is a script that I use to reconnect SQL logins after a database restore.

It reconnects all users in all databases.


Use Master

Set NoCount On

if exists(select * from tempdb..sysobjects where id = object_id('tempdb..#tblDatabases'))
	Drop Table #tblDatabases

if exists(select * from tempdb..sysobjects where id = object_id('tempdb..#tblUsers'))
	Drop Table #tblUsers

Declare @db nvarchar(max)
Declare @sql nvarchar(max)
Declare @msg nvarchar(200)

Create Table #tblDatabases
(
	DBName nvarchar(4000)
)

Create Table #tblUsers
(
	UserName nvarchar(4000)
)

Insert Into #tblDatabases
select name from master.dbo.sysdatabases where dbid > 4
Order By name

Select Top 1 @db = DBName from #tblDatabases

While @db is not null
Begin
	set @msg = Char(13) + 'Processing ' + @db
	print @msg

	-- Note: sp_change_users_login is deprecated in later versions;
	-- ALTER USER ... WITH LOGIN is the modern replacement
	set @sql = N'
	Use [' + @db + ']
	Declare @usr nvarchar(max)
	set @usr = null

	Truncate Table #tblUsers;
	Insert Into #tblUsers SELECT name from sysusers where altuid is null
		and name not in (''dbo'', ''guest'', ''sys'', ''INFORMATION_SCHEMA'');
	Select Top 1 @usr = UserName from #tblUsers;
	While @usr is not null
	Begin
		print ''fixing user '' + @usr
		exec sp_change_users_login ''auto_fix'', @usr
		Delete from #tblUsers where UserName = @usr
		Set @usr = null
		Select Top 1 @usr = UserName from #tblUsers
	End'

	--print @sql
	exec sp_executesql @sql

	-- main loop
	Delete from #tblDatabases where DBName = @db;
	Set @db = null
	Select Top 1 @db = DBName from #tblDatabases
End

Drop Table #tblDatabases
Drop Table #tblUsers

Set NoCount Off


Categories: SQL Server Tags:

Restore SQL Server Database – disconnect all users

January 18th, 2011 No comments


Often when you try to restore a database there will be connections to the DB, which will stop the restore from succeeding.

What I do is use the wizard to configure the restore, and then click the script button, to script the restore to the clipboard, so that I can run it as a TSQL statement.

The trick is that you want to disconnect all sessions, so you put the database into single-user mode, which kills all other connections. After the restore you set the database back to multi-user.

The code would look like

Alter Database YOURDB Set SINGLE_USER with Rollback Immediate

-- (paste your scripted Restore Database statement here)

Alter Database YOURDB Set MULTI_USER

The important bits are the Alter Database statements either side of the restore database script.
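A complete sketch, assuming a database called YOURDB and an illustrative backup path:

```sql
Alter Database YOURDB Set SINGLE_USER with Rollback Immediate

Restore Database YOURDB
From Disk = N'C:\Backups\YOURDB.bak'
With Replace

Alter Database YOURDB Set MULTI_USER
```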


Categories: SQL Server Tags:

SQL Server SHRINKFILE not working

January 5th, 2011 No comments

As Microsoft puts it in SQL Server Books Online, “A shrink operation is most effective after an operation that creates lots of unused space, such as a truncate table or a drop table operation”. (Note: all quotes are from SQL Server BOL.)

The other day I blew away over 100GB of tables and issued the shrinkfile on both the Data and Log files and nothing happened.

Here is the reason…

If you use SQL Server Management Studio and choose to shrink the data file, releasing unused space, the following command is executed:
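The generated command is along these lines (the logical file name is purely illustrative; yours will differ):

```sql
DBCC SHRINKFILE (N'YourDatabase_Data', TRUNCATEONLY)
```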


When you read about “Truncate Only” it says

Releases all free space at the end of the file to the operating system but does not perform any page movement inside the file. The data file is shrunk only to the last allocated extent.

So depending on how the tables are allocated on disk inside the .mdf file, it cannot release anything. It is rather like trying to repartition a hard disk: you have to defrag the disk to move all the data to the front so that there is one big contiguous block at the end to release.

So really we want to essentially defrag our database file first and then truncate; enter the “NOTRUNCATE” option.
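That is, something like (file name again illustrative):

```sql
DBCC SHRINKFILE (N'YourDatabase_Data', NOTRUNCATE)
```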


Again SQL Server BOL says:

Moves allocated pages from the end of a data file to unallocated pages in the front of the file with or without specifying target_percent. The free space at the end of the file is not returned to the operating system, and the physical size of the file does not change.

Note: this is only applicable to the data file; it does nothing to the log file.

So what you really want to do is:
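Run the two operations back to back (file name illustrative):

```sql
-- First move allocated pages to the front of the file...
DBCC SHRINKFILE (N'YourDatabase_Data', NOTRUNCATE)
-- ...then release the now-contiguous free space at the end to the operating system
DBCC SHRINKFILE (N'YourDatabase_Data', TRUNCATEONLY)
```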


This will essentially defrag your .mdf file, and then TRUNCATEONLY will shrink it down to the last allocated extent.

I may be wrong, but I think that is what ShrinkDatabase attempts to do, and hence why it appears to take so long.


My recommendation:

  1. Try to shrink the file first and see what gets released.
  2. If it's not good enough then try ShrinkDatabase.

Note: shrinking my 100GB database file took 41 minutes on a 16 CPU box with 32GB of memory.

Recently I had the same problem trying to clean up a 130GB database, and found another quick but possibly dodgy approach.
I could not get the log file to shrink: I backed up the log, shrank, and tried everything, and it would not budge.
I then changed the database from full to simple recovery, and shrank the log file to 1MB in about 2 seconds.
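In script form, the (dodgy) approach is roughly this; note that switching to simple recovery breaks the log backup chain, so take a full backup afterwards (names illustrative):

```sql
Alter Database YOURDB Set Recovery Simple
DBCC SHRINKFILE (N'YOURDB_Log', 1)
Alter Database YOURDB Set Recovery Full
```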


Categories: SQL Server Tags:

SSIS Date conversion error going from Oracle to SQL Server

December 19th, 2010 1 comment


The other day I had the task of moving a large amount of data from Oracle to SQL Server.
I had chosen to import the data via SSIS, doing a direct copy from one table in Oracle to one table in SQL Server.

Everything was fine until one of my tables crashed with a data conversion error.

The issue is that the lowest date SQL Server will accept is “1-Jan-1753”. There was some bad data in Oracle dated year 1049, so SQL Server did not know how to handle it. How was I going to handle it?

The solution is to use a Script Component as a transformation. In your data flow task, add a Script Component between your source and destination. When you drop it on the design surface a dialogue pops up; leave it at the default, Transformation.

On the input columns page, choose the date column in question (for this example let's assume it's called CreatedDate).
Make sure you change its “UsageType” in the grid from “ReadOnly” to “ReadWrite”.

Go to the script tab, click “Design Script” and enter something like the following code.
Note: this is in VB.NET. I think 2008 R2 allows C#, but you should get the idea:

        Try
            Dim dt As DateTime = Row.CreatedDate
            Dim dtMin As DateTime = System.DateTime.Parse("1753-01-01 00:00:00")
            Dim dtMax As DateTime = System.DateTime.Parse("9999-12-31 23:59:59")
            If (Row.CreatedDate_IsNull = False) Then
                If (dt < dtMin Or dt > dtMax) Then
                    ' Clamp out-of-range dates to the SQL Server datetime limits
                    If (dt < dtMin) Then
                        Row.CreatedDate = dtMin
                    Else
                        Row.CreatedDate = dtMax
                    End If
                End If
            End If
        Catch ex As Exception
            Row.CreatedDate_IsNull = True
        End Try

This code will find any row that has a date that falls outside the bounds of what SQL server will accept and set it to the Max or Min date values.

Finally some general tips that I learned in attempting to move large amounts of data from Oracle to SQL.

1. Don’t use a linked server connection; it is really slow.
2. The Microsoft OLE DB connector for Oracle is pretty fast.
3. When setting up the OLE DB Destination, make sure you choose the “Table or View – fast load” option! Why you would want it to run slow is beyond me, but fast load is about 10 to 15 times faster.



Categories: SQL Server Tags: