The postings on this site are my own and do not represent my Employer's positions, advice or strategies.

LifeAsBob - Blog



No Ads ever, except search!

Remove SMO - Parse for GO (11/29/2022 7:25:14 AM)

SMO = SQL Management Objects.

GO is not a T-SQL statement; it is a batch separator that is often used in scripts. The client must parse for GO and submit each batch separately.

SSMS = SQL Server Management Studio.

SSMS does this automatically.

Take the same script and execute it by submitting the command to the DBMS from code (Java, C#, PowerShell, etc.) and it will fail.

Generally, the solution for this is to implement SMO in the code.

Long term this has always been an issue, as the SMO libraries are a pain to install, upgrade, and patch.  As the years go by, even upgrading projects in Visual Studio becomes difficult, fighting the GAC and other .NET fun.

Could not load file or assembly 'Microsoft.SqlServer.BatchParser, Version=, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified.

Finally, I just removed SMO from my project(s) and life is so much easier.

Below is the regex function I use to split a command into its individual statements at the GO separator.

using System.Collections;
using System.Text.RegularExpressions;

    public static class SQLFunctions
    {
        public static ArrayList ParseForGo(string query)
        {
            // If the 'GO' separator is present, split the script into its
            // individual batches, so they can be run separately.
            // Use the Regex class, as we need a case-insensitive match.
            string separator = "GO";
            Regex r = new Regex(string.Format(@"^\s*{0}\s*$", separator),
                RegexOptions.IgnoreCase | RegexOptions.Multiline);
            MatchCollection mc = r.Matches(query);
            ArrayList queries = new ArrayList();
            int pos = 0;
            foreach (Match m in mc)
            {
                string sub = query.Substring(pos, m.Index - pos).Trim();
                if (sub.Length > 0) queries.Add(sub);
                pos = m.Index + m.Length + 1;
            }

            if (pos < query.Length)
            {
                string finalQuery = query.Substring(pos).Trim();
                if (finalQuery.Length > 0) queries.Add(finalQuery);
            }

            return queries;
        }
    }

To call this:

// If the user has selected text within the query window, just execute the
// selected text.  Otherwise, execute the contents of the whole textbox.
string allquerytext = txtQuery.SelectedText.Length == 0 ? txtQuery.Text : txtQuery.SelectedText;

// Now parse for the GO separator.
ArrayList queries = new ArrayList();
if (chkAdhocParseForGo.Checked)
{
    queries = SQLFunctions.ParseForGo(allquerytext);
}
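Each parsed batch can then be submitted to the server on its own. Below is a minimal sketch of that execution loop, assuming an already-open SqlConnection named conn (System.Data.SqlClient or Microsoft.Data.SqlClient) and the queries list produced above; error handling is omitted:

```
foreach (string batch in queries)
{
    using (var cmd = new SqlCommand(batch, conn))
    {
        cmd.CommandTimeout = 0;   // no client-side timeout for long scripts
        cmd.ExecuteNonQuery();    // submit one batch, just as SSMS would
    }
}
```

Because GO never reaches the server, this runs the same scripts that would otherwise throw "Incorrect syntax near 'GO'".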

Honda CTX 700 2014 (7/25/2022 10:52:38 AM)
New to me !
Honda CTX 700 2014

SQL Server Management Studio MFA hanging not prompting (7/12/2022 5:58:15 AM)

We have certain "zones" / "restrictions" that require us to use a common "jump box" (server).

Sometimes SQL Server Management Studio (SSMS) hangs when using Multi-Factor Authentication (MFA) to access Azure SQL resources; the prompts for logging in never appear.  (Make sure the Microsoft URLs are whitelisted.)

This appeared to only affect some users.

Internet Explorer is "retired," but it appears that SSMS still makes some calls to IE under the hood for MFA.

The team was able to find a workaround that users can complete to clean up a bad cookie that seems to be the ultimate culprit. Details on this fix are below:


Execute the commands below in a PowerShell window. If you have never run Internet Explorer on the server before, do that first: open IE and close it, then run the commands below.


RunDll32.exe InetCpl.cpl,ClearMyTracksByProcess 1

start-sleep 2

RunDll32.exe InetCpl.cpl,ClearMyTracksByProcess 2

start-sleep 2

RunDll32.exe InetCpl.cpl,ClearMyTracksByProcess 8

start-sleep 5


[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12



Postgres to SQL Quick Tips (3/4/2022 7:23:55 AM)
Very first: make sure you're working with a great Postgres DBA.

1.  Convert the schema from Postgres to SQL Server
2.  Install ODBC drivers
3.  Use SSIS or its watered-down cousin, DTSWizard
4.  If really necessary, drop down to custom code to transfer data
5.  Edit ProviderDescriptors.xml
6.  Out of memory reading tuples?  (add UseDeclareFetch=1 to the connection string)

For data-type compatibility, you may hit errors like the ones below. The fix is to edit ProviderDescriptors.xml and change the MaximumLengthColumnName, NumericPrecisionColumnName, and NumericScaleColumnName attribute values to "LENGTH", "PRECISION", and "SCALE", respectively.

The column attribute "COLUMN_SIZE" is not valid.

The column attribute "DECIMAL_DIGITS" is not valid.

    NameColumnName = "COLUMN_NAME"
    DataTypeColumnName = "TYPE_NAME"
    MaximumLengthColumnName = "COLUMN_SIZE"
    NumericPrecisionColumnName = "COLUMN_SIZE"
    NumericScaleColumnName = "DECIMAL_DIGITS"

... to ...

    NameColumnName = "COLUMN_NAME"
    DataTypeColumnName = "TYPE_NAME"
    MaximumLengthColumnName = "LENGTH"
    NumericPrecisionColumnName = "PRECISION"
    NumericScaleColumnName = "SCALE"
For "out of memory while reading tuples," add UseDeclareFetch=1 to the connection string:
"Driver={PostgreSQL Unicode};Server=ip;Port=port;Database=db_name;Uid=user;Pwd=pass;UseDeclareFetch=1"

Stored procedure cannot return BIGINT (11/2/2021 2:02:07 PM)
SQL Server stored procedures cannot return a big integer via the RETURN statement.

The table has an identity primary key column of type BIGINT.

The stored procedure runs RETURN SCOPE_IDENTITY(), which works until the identity value exceeds the maximum of the implicit integer conversion, 2,147,483,647.

This is documented, but rarely run into.

The correct best practice is to use an OUTPUT parameter on the stored procedure.
This requires changing the procedure and the calling code, which can be difficult if this happens in a production environment during a busy period.

The temporary solution is to reset the identity value to a negative value, since an integer ranges from -2,147,483,648 to +2,147,483,647.

Watch out when you do this: DBCC CHECKIDENT ('tablename') can itself reset the identity value, so be sure to use NORESEED when you only want to check it.

To reseed, and then monitor progress:

dbcc checkident ([table name], reseed, -2147483646)

select max([column identity]), min([column identity]) from [table name] with (nolock)
select count(*) from [table name] with (nolock) where [column identity] < 0


And, as discussed above, the longer-term work-around is to return the value as an OUTPUT parameter.
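Here is a minimal sketch of the OUTPUT-parameter pattern; the table and procedure names are made up for illustration:

```
create procedure dbo.usp_InsertRow
    @new_id bigint output
as
begin
    insert into dbo.[some table] default values;
    -- assign to a BIGINT output parameter instead of using RETURN,
    -- which avoids the implicit int conversion entirely
    set @new_id = scope_identity();
end
go

declare @id bigint;
exec dbo.usp_InsertRow @new_id = @id output;
select @id;
```

SCOPE_IDENTITY() returns numeric(38,0), so assigning it to a BIGINT parameter is safe for any BIGINT identity value.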

Grant Create Table Permissions to a Schema, but not DBO (11/2/2021 1:48:31 PM)

Granting CREATE TABLE permissions scoped to a specific schema requires granting CREATE TABLE at the database level and ALTER at the schema level (with no grants on the dbo schema).

CREATE USER [some user] FROM EXTERNAL PROVIDER

GRANT ALTER, SELECT, INSERT, UPDATE, DELETE ON SCHEMA::[schema name] TO [some user]

-- if CONTROL or REFERENCES is needed
-- GRANT CONTROL, REFERENCES ON SCHEMA::[schema name] TO [some user]

GRANT CREATE TABLE, CREATE VIEW, CREATE FUNCTION, CREATE TYPE TO [some user]

It always confuses me to remember that both pieces are needed: the schema-level ALTER grant and the database-level CREATE TABLE grant.
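One way to sanity-check the grants is to impersonate the user and attempt a create. This is a sketch only: [schema name] and [some user] are placeholders, and impersonating an external-provider user is not supported in every environment (it does work for ordinary SQL users):

```
execute as user = 'some user';
create table [schema name].[perm_test] (id int);   -- should succeed
-- create table dbo.[perm_test] (id int);          -- should fail
drop table [schema name].[perm_test];
revert;
```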

SQL Server Walk the Dog (8/18/2021 6:07:33 AM)

Recently I was given several text files with 500k to 1 million update statements in each that needed to be run.

Try loading and running this in SQL Server Management Studio and you'll find that, if it loads and runs at all, it sometimes fails partway through with a memory error or other issues (all related to the client, not the engine).

It would have been better for the teams to work together and write this as a table join update as opposed to generating so many individual statements, but that ship had sailed and now these needed to be run.

I ended up "walking the dog," or RBAR (Row By Agonizing Row).

Load the text files into a table, then use a cursor to read each row and execute it dynamically.  I threw in a global counter I could query to figure out where the process was at.  In total it took about 2-3 hours to run.

Here is how to walk the dog.



DECLARE @sqlstmt nvarchar(4000)

create table ##Global_Count (current_rownum int)

insert into ##Global_Count values (0)

PRINT '-------- starting --------';

DECLARE cur_statements CURSOR FOR
SELECT sqlstmt FROM DocumentId_3

OPEN cur_statements

FETCH NEXT FROM cur_statements INTO @sqlstmt

WHILE @@FETCH_STATUS = 0
BEGIN

       Exec sp_executesql @sqlstmt

       update ##Global_Count set current_rownum = current_rownum + 1

       FETCH NEXT FROM cur_statements INTO @sqlstmt

END

Close cur_statements

Deallocate cur_statements

select * from ##Global_Count

drop table ##Global_Count