SQL Query: How to retrieve the Table columns from a stored procedure

SELECT
    name AS ColumnName
    ,system_type_name AS DataType
    ,CASE
        WHEN is_nullable = 0 THEN 'NOT NULL'
        WHEN is_nullable = 1 THEN 'NULL'
     END AS IS_NULLABLE
    ,column_ordinal AS OrdinalPosition
    ,('Col_' + CAST(column_ordinal AS varchar(4))) AS Alias
FROM sys.dm_exec_describe_first_result_set
    (N'[dbo].[Registration]', NULL, 0);

 

SQL SERVER – How to retrieve the metadata of a stored procedure

In my earlier articles, I wrote about sys.dm_exec_describe_first_result_set
(a new dynamic management view shipped with SQL Server 2012); it is very handy when you need to know the metadata of the first possible result set of any Transact-SQL batch. Today, I came across an issue when I tried to get the metadata of a stored procedure in an earlier version of SQL Server and wanted to use the list of columns for further manipulation. However, I could not find a proper equivalent of sys.dm_exec_describe_first_result_set in the earlier versions of SQL Server, so I started developing a solution.

Given below are the solutions.

SOLUTION 1 : For SQL Server 2012 and above using sys.dm_exec_describe_first_result_set

USE AdventureWorks2012;
GO
SELECT * FROM sys.dm_exec_describe_first_result_set
(N'[dbo].[uspGetManagerEmployees] 16', null, 0) ;
--OUTPUT


SOLUTION 2 : For SQL Server 2005 and above using OPENROWSET

USE AdventureWorks2012
GO
--DROP TABLE #temp1
GO
SELECT * INTO #temp1
FROM OPENROWSET('SQLNCLI', 'Server=(local);uid=sa;pwd=test',
'EXEC AdventureWorks2012.[dbo].[uspGetManagerEmployees] 16');
GO
SELECT * FROM tempdb.sys.columns
WHERE object_id = OBJECT_ID('tempdb..#temp1')
GO
--View the column names / metadata of the
--temporary table created above.
--OUTPUT

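For completeness, the same kind of result-set metadata can also be read from client code rather than T-SQL. A minimal C# sketch (not part of the original post; the connection string, procedure name and parameter value are placeholders) using SqlDataReader.GetSchemaTable with CommandBehavior.SchemaOnly:

using System;
using System.Data;
using System.Data.SqlClient;

class ResultSetMetadata
{
    static void Main()
    {
        // Placeholder connection string and procedure call, for illustration only.
        using (var conn = new SqlConnection("Server=(local);Database=AdventureWorks2012;Integrated Security=SSPI;"))
        using (var cmd = new SqlCommand("[dbo].[uspGetManagerEmployees]", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@BusinessEntityID", 16);

            conn.Open();
            // SchemaOnly asks the provider for column metadata without returning any rows.
            using (SqlDataReader reader = cmd.ExecuteReader(CommandBehavior.SchemaOnly))
            {
                DataTable schema = reader.GetSchemaTable();
                foreach (DataRow col in schema.Rows)
                {
                    Console.WriteLine("{0} ({1}), nullable: {2}",
                        col["ColumnName"], col["DataTypeName"], col["AllowDBNull"]);
                }
            }
        }
    }
}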


Regex: query examples and website links

Source links:

  1. debuggex.com
  2. regex101.com

Sample file names used when developing the expressions:

OS_PostR_RECT_620d0f5d10-b28e_COMPLETE_NP_FFP.CSV

OS_Rec_A_RECT_620d0f5d10-b28e_COMPLETE_NP_FFP.CSV

OS_PostRec_June 2017_EMMain_07L_1708201_7163015_T2O_f29c00f7f2_FFP.ctrl
OS_PostRec_June 2017_EMMain_07L_1708201_7163015_T2O_f29c00f7f2_FFP.txt
OS_PostRec_June 2017_EMMain_07L_170820163023_0ab98771e1_T2O_20170904140654_20170904_141835_FFP.txt

OS_Rec_April 2017_EMMain_EMError_06Q_26052017082756_a8abaa7c46_20170530_174428_FFP.txt
OS_PostRec_April 2017_EMMain_EMError_06Q_26052017082756_a8abaa7c46_20170530_174428_FFP.txt
OS_Rec_April 2017_EMMain_EMError_06Q_26052017082756_a8abaa7c46_20170530_174428_FFP.txt
OS_PostRec_April 2017_EMMain_EMError_06Q_26052017082756_a8abaa7c46_20170530_174428_FFP.txt

OS_Rec_April 2017_EMMain_EMError_06Q_26052017082756_a8abaa7c46_20170530_174428_FFP Feb 2018
OS_PostRec_April 2017_EMMain_EMError_06Q_26052017082756_a8abaa7c46_20170530_174428_FFP Feb 2018
OS_Rec_April 2017_EMMain_EMError_06Q_26052017082756_a8abaa7c46_20170530_174428_T2O Feb 2018
OS_PostRec_April 2017_EMMain_EMError_06Q_26052017082756_a8abaa7c46_20170530_174428_T2O Feb 2018

 

OS_Rec_April 2017_EMMain_06Q_26052017082756_a8abaa7c46_20170530_174428_FFP Feb 2018

OS_PostRec_April 2017_EMMain_06Q_26052017082756_a8abaa7c46_20170530_174428_FFP Feb 2018

OS_Rec_April 2017_EMMain_06Q_26052017082756_a8abaa7c46_20170530_174428_T2O Feb 2018

OS_PostRec_April 2017_EMMain_06Q_26052017082756_a8abaa7c46_20170530_174428_T2O Feb 2018

 

OS_Rec_April 2017_EPMain_06Q_26052017082756_a8abaa7c46_20170530_174428_FFP Feb 2018

OS_PostRec_April 2017_EPMain_06Q_26052017082756_a8abaa7c46_20170530_174428_FFP Feb 2018

OS_Rec_April 2017_EPMain_06Q_26052017082756_a8abaa7c46_20170530_174428_T2O Feb 2018

OS_PostRec_April 2017_EPMain_06Q_26052017082756_a8abaa7c46_20170530_174428_T2O Feb 2018

 

OS_Rec_April 2017_OPMain_06Q_26052017082756_a8abaa7c46_20170530_174428_FFP Feb 2018

OS_PostRec_April 2017_OPMain_06Q_26052017082756_a8abaa7c46_20170530_174428_FFP Feb 2018

OS_Rec_April 2017_OPMain_06Q_26052017082756_a8abaa7c46_20170530_174428_T2O Feb 2018

OS_PostRec_April 2017_OPMain_06Q_26052017082756_a8abaa7c46_20170530_174428_T2O Feb 2018

 

OS_Rec_April 2017_SPMain_06Q_26052017082756_a8abaa7c46_20170530_174428_FFP Feb 2018

OS_PostRec_April 2017_SPMain_06Q_26052017082756_a8abaa7c46_20170530_174428_FFP Feb 2018

OS_Rec_April 2017_SPMain_06Q_26052017082756_a8abaa7c46_20170530_174428_T2O Feb 2018

OS_PostRec_April 2017_SPMain_06Q_26052017082756_a8abaa7c46_20170530_174428_T2O Feb 2018

 

 

2017_M04_Postrec_July_2017_Basildon & Brentwood_99E_allprov_20171003_AiC.zip
2017_M05_rec_August_2017_Basildon & Brentwood_99E_ALLprov_20171003_AiC.zip
2017_M05_rec_August_2017_Basildon & Brentwood_99E_nt2048_20171003_AiC.zip
2017_M05_rec_August_2017_Basildon & Brentwood_99E_rddRF4_20171003_AiC.zip

 

 

Regular expression to find month names:

(?<Month>(?:Jan(?:uary))|(?:Feb(?:ruary))|(?:Mar(?:ch))|(?:Apr(?:il))|(?:May)|(?:Jun(?:e))|(?:Jul(?:y))|(?:Aug(?:ust))|(?:Sep(?:tember))|(?:Oct(?:ober))|(?:Nov(?:ember))|(?:Dec(?:ember)))

 

(\b\d{1,2}\D{0,3})?\b(?:Jan(?:uary)?|Feb(?:ruary)?|Mar(?:ch)?|Apr(?:il)?|May|Jun(?:e)?|Jul(?:y)?|Aug(?:ust)?|Sep(?:tember)?|Oct(?:ober)?|(Nov|Dec)(?:ember)?)\D?(\d{1,2}\D?)?\D?((19[7-9]\d|20\d{2})|\d{2})

 

Regular expression to find Month Year:

((?<Month>(?:Jan(?:uary))|(?:Feb(?:ruary))|(?:Mar(?:ch))|(?:Apr(?:il))|(?:May)|(?:Jun(?:e))|(?:Jul(?:y))|(?:Aug(?:ust))|(?:Sep(?:tember))|(?:Oct(?:ober))|(?:Nov(?:ember))|(?:Dec(?:ember))) (?<Year>[0-9]{4}))
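These patterns use .NET named groups, so the captures can be read directly from C# (for example inside an SSIS script task). A minimal sketch, using a hypothetical file name and a simplified full-name Month/Year pattern equivalent to the expression above:

using System;
using System.Text.RegularExpressions;

class MonthYearDemo
{
    static void Main()
    {
        // Hypothetical file name, for illustration only.
        string fileName = "OS_Rec_April 2017_EMMain_06Q_26052017082756_a8abaa7c46_FFP.txt";

        // Simplified Month/Year fragment using full month names.
        string pattern = @"(?<Month>January|February|March|April|May|June|July|August|September|October|November|December) (?<Year>[0-9]{4})";

        Match m = Regex.Match(fileName, pattern);
        if (m.Success)
        {
            Console.WriteLine("Month: " + m.Groups["Month"].Value); // April
            Console.WriteLine("Year:  " + m.Groups["Year"].Value);  // 2017
        }
    }
}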

 

 

 

RecOrPostRec:

(?:[_])(?<RecOrPostRec>(?:(Post)?(?:Rec)))(?:[_])

 

CCGCode:

(?:[_])(?<CCGCode>[0-9][0-9a-zA-Z]{2})(?:[_])

 

FileType (EMMain/EPMain):

(?:[_])(?<FileType>[a-zA-Z]{6,15})(?:[_])(?:(?<CCGCode>[0-9][0-9a-zA-Z]{2})(?:[_]))

Save:

(?<RecOrPostRec>_Rec_)|(?<Month>(?:Jan(?:uary))|(?:Feb(?:ruary))|(?:Mar(?:ch))|(?:Apr(?:il))|(?:Apr(?:il)))|(?:May(?:))|(?:Jun(?:e))|(?:Jul(?:y))|(?:Aug(?:ust))|(?:Sep(?:tember))|(?:Oct(?:ober))|(?:Nov(?:ember))|(?:Dec(?:ember))|(?<Year>[0-9]{4}_)|(?<FileType>_EMMain_)|(?<CCGCode>_[0-9][0-9a-zA-Z]{2}_)

Full regex – never change or delete:

(?:[_])(?<RecOrPostRec>(?:(Post)?(?:Rec)))(?:[_])((?:(?<Month>(?:Jan(?:uary))|(?:Feb(?:ruary))|(?:Mar(?:ch))|(?:Apr(?:il))|(?:Apr(?:il)))|(?:May(?:))|(?:Jun(?:e))|(?:Jul(?:y))|(?:Aug(?:ust))|(?:Sep(?:tember))|(?:Oct(?:ober))|(?:Nov(?:ember))|(?:Dec(?:ember))) (?<Year>([0-9]{4})))(?:[_])(?<FileType>[a-zA-Z]{6,15})(?:[_])(?<CCGCode>[0-9][0-9a-zA-Z]{2})(?:[_])(?:(?<UniquePart>[a-zA-Z0-9_]*))(?:[_])(?<Processor>(?:FFP)|(?:T2O))

Full regex – v2:

(?<ExtractSetType>[0-9a-z]+)_(?<ConcilationType>(?:Post)?Rec)_(?:(?<Month>Jan(?:uary)?|Feb(?:ruary)?|Mar(?:ch)?|Apr(?:il)?|May|Jun(?:e)?|Jul(?:y)?|Aug(?:ust)?|Sep(?:tember)?|Oct(?:ober)?|Nov(?:ember)?|Dec(?:ember)?))[ ](?<Year>[0-9]{4})_(?<FileType>(?:[a-z]{2,15}_)?[a-z]{2,15})_(?<CCGCode>[0-9][0-9a-z]{2})_(?:(?<UniquePart>[a-zA-Z0-9_]*))_(?<ProcessorType>FFP|T2O)

AiC Rec EMMain:

(?:[_])(?<Rec>Rec)(?:[_])((?:(?<Month>(?:Jan(?:uary))|(?:Feb(?:ruary))|(?:Mar(?:ch))|(?:Apr(?:il))|(?:Apr(?:il)))|(?:May(?:))|(?:Jun(?:e))|(?:Jul(?:y))|(?:Aug(?:ust))|(?:Sep(?:tember))|(?:Oct(?:ober))|(?:Nov(?:ember))|(?:Dec(?:ember))) (?<Year>([0-9]{4})))(?:[_])(?<FileType>EMMain)(?:[_])(?<CCGCode>[0-9][0-9a-zA-Z]{2})(?:[_])(?:(?<UniquePart>[a-zA-Z0-9_]*))(?:[_])(?<Processor>FFP)

AiC PostRec EMMain:

(?:[_])(?<PostRec>PostRec)(?:[_])((?:(?<Month>(?:Jan(?:uary))|(?:Feb(?:ruary))|(?:Mar(?:ch))|(?:Apr(?:il))|(?:Apr(?:il)))|(?:May(?:))|(?:Jun(?:e))|(?:Jul(?:y))|(?:Aug(?:ust))|(?:Sep(?:tember))|(?:Oct(?:ober))|(?:Nov(?:ember))|(?:Dec(?:ember))) (?<Year>([0-9]{4})))(?:[_])(?<FileType>EMMain)(?:[_])(?<CCGCode>[0-9][0-9a-zA-Z]{2})(?:[_])(?:(?<UniquePart>[a-zA-Z0-9_]*))(?:[_])(?<Processor>FFP)

T2O Rec EMMain:

(?:[_])(?<Rec>Rec)(?:[_])((?:(?<Month>(?:Jan(?:uary))|(?:Feb(?:ruary))|(?:Mar(?:ch))|(?:Apr(?:il))|(?:Apr(?:il)))|(?:May(?:))|(?:Jun(?:e))|(?:Jul(?:y))|(?:Aug(?:ust))|(?:Sep(?:tember))|(?:Oct(?:ober))|(?:Nov(?:ember))|(?:Dec(?:ember))) (?<Year>([0-9]{4})))(?:[_])(?<FileType>EMMain)(?:[_])(?<CCGCode>[0-9][0-9a-zA-Z]{2})(?:[_])(?:(?<UniquePart>[a-zA-Z0-9_]*))(?:[_])(?<Processor>T2O)

T2O PostRec EMMain:

(?:[_])(?<PostRec>PostRec)(?:[_])((?:(?<Month>(?:Jan(?:uary))|(?:Feb(?:ruary))|(?:Mar(?:ch))|(?:Apr(?:il))|(?:Apr(?:il)))|(?:May(?:))|(?:Jun(?:e))|(?:Jul(?:y))|(?:Aug(?:ust))|(?:Sep(?:tember))|(?:Oct(?:ober))|(?:Nov(?:ember))|(?:Dec(?:ember))) (?<Year>([0-9]{4})))(?:[_])(?<FileType>EMMain)(?:[_])(?<CCGCode>[0-9][0-9a-zA-Z]{2})(?:[_])(?:(?<UniquePart>[a-zA-Z0-9_]*))(?:[_])(?<Processor>T2O)

 

 

 

AiC Rec EMError:

(?:[_])(?<Rec>Rec)(?:[_])((?:(?<Month>(?:Jan(?:uary))|(?:Feb(?:ruary))|(?:Mar(?:ch))|(?:Apr(?:il))|(?:Apr(?:il)))|(?:May(?:))|(?:Jun(?:e))|(?:Jul(?:y))|(?:Aug(?:ust))|(?:Sep(?:tember))|(?:Oct(?:ober))|(?:Nov(?:ember))|(?:Dec(?:ember))) (?<Year>([0-9]{4})))(?:[_])?(?:[a-zA-Z]{6,15})?(?:[_])(?<FileType>EMError)(?:[_])(?<CCGCode>[0-9][0-9a-zA-Z]{2})(?:[_])(?:(?<UniquePart>[a-zA-Z0-9_]*))(?:[_])(?<Processor>FFP)

AiC PostRec EMError:

(?:[_])(?<PostRec>PostRec)(?:[_])((?:(?<Month>(?:Jan(?:uary))|(?:Feb(?:ruary))|(?:Mar(?:ch))|(?:Apr(?:il))|(?:Apr(?:il)))|(?:May(?:))|(?:Jun(?:e))|(?:Jul(?:y))|(?:Aug(?:ust))|(?:Sep(?:tember))|(?:Oct(?:ober))|(?:Nov(?:ember))|(?:Dec(?:ember))) (?<Year>([0-9]{4})))(?:[_])?(?:[a-zA-Z]{6,15})?(?:[_])(?<FileType>EMError)(?:[_])(?<CCGCode>[0-9][0-9a-zA-Z]{2})(?:[_])(?:(?<UniquePart>[a-zA-Z0-9_]*))(?:[_])(?<Processor>FFP)

T2O Rec EMError:

(?:[_])(?<Rec>Rec)(?:[_])((?:(?<Month>(?:Jan(?:uary))|(?:Feb(?:ruary))|(?:Mar(?:ch))|(?:Apr(?:il))|(?:Apr(?:il)))|(?:May(?:))|(?:Jun(?:e))|(?:Jul(?:y))|(?:Aug(?:ust))|(?:Sep(?:tember))|(?:Oct(?:ober))|(?:Nov(?:ember))|(?:Dec(?:ember))) (?<Year>([0-9]{4})))(?:[_])?(?:[a-zA-Z]{6,15})?(?:[_])(?<FileType>EMError)(?:[_])(?<CCGCode>[0-9][0-9a-zA-Z]{2})(?:[_])(?:(?<UniquePart>[a-zA-Z0-9_]*))(?:[_])(?<Processor>T2O)

T2O PostRec EMError:

(?:[_])(?<PostRec>PostRec)(?:[_])((?:(?<Month>(?:Jan(?:uary))|(?:Feb(?:ruary))|(?:Mar(?:ch))|(?:Apr(?:il))|(?:Apr(?:il)))|(?:May(?:))|(?:Jun(?:e))|(?:Jul(?:y))|(?:Aug(?:ust))|(?:Sep(?:tember))|(?:Oct(?:ober))|(?:Nov(?:ember))|(?:Dec(?:ember))) (?<Year>([0-9]{4})))(?:[_])?(?:[a-zA-Z]{6,15})?(?:[_])(?<FileType>EMError)(?:[_])(?<CCGCode>[0-9][0-9a-zA-Z]{2})(?:[_])(?:(?<UniquePart>[a-zA-Z0-9_]*))(?:[_])(?<Processor>T2O)

 

 

Current version:

 

(?<ExtractSetType>[0-9a-z]+)_(?<ConcilationType>(?:Post)?Rec)_(?:(?<Month>Jan(?:uary)?|Feb(?:ruary)?|Mar(?:ch)?|Apr(?:il)?|May|Jun(?:e)?|Jul(?:y)?|Aug(?:ust)?|Sep(?:tember)?|Oct(?:ober)?|Nov(?:ember)?|Dec(?:ember)?))[ ](?<Year>[0-9]{4})_(?<FileType>(?:[a-z0-9]{2,15}_)?[a-z0-9]{2,15})_(?<CCGCode>[0-9][0-9a-z]{2})_(?:(?<UniquePart>[a-zA-Z0-9_]*))_(?<ProcessorType>FFP|T2O)(?<Extra>\S*)(?<FileExtension>[.]+[a-z]+)

 

(?<ExtractSetType>[0-9a-z]+)_(?<ConcilationType>(?:Post)?Rec)_(?:(?<Month>Jan(?:uary)?|Feb(?:ruary)?|Mar(?:ch)?|Apr(?:il)?|May|Jun(?:e)?|Jul(?:y)?|Aug(?:ust)?|Sep(?:tember)?|Oct(?:ober)?|Nov(?:ember)?|Dec(?:ember)?))[ ](?<Year>[0-9]{4})_(?<FileType>(?:[a-z0-9]{2,15}_)?[a-z0-9]{2,15})_(?<CCGCode>[0-9][0-9a-z]{2})_(?:(?<UniquePart1>[a-zA-Z0-9_]*))_(?<T2O>T2O)?_(?:(?<UniquePart2>[a-zA-Z0-9_]*))_(?<FFP>FFP)(?<FileExtension>[.]+[a-z]+)

 

Regular expression to select archive name:

(?<FinancialYear>[0-9]{4})_(?<FinancialMonthName>[a-z]+[0-9]{2})_(?<ConcilationType>(?:Post)?Rec)_(?:(?<Month>Jan(?:uary)?|Feb(?:ruary)?|Mar(?:ch)?|Apr(?:il)?|May|Jun(?:e)?|Jul(?:y)?|Aug(?:ust)?|Sep(?:tember)?|Oct(?:ober)?|Nov(?:ember)?|Dec(?:ember)?))_(?<CalendarYear>[0-9]{4})_(?<CCGName>(?:[a-z& _]+))_(?<CCGCode>[0-9][0-9a-z]{2})_(?<ExtractType>[a-z0-9]+)_(?<Date>[0-9]{8})_(?<ProcessorType>[a-z]{3})(?<FileExtension>[.]+zip)

 

Regular expression to select RECT files:

(OS)_((Post)([a-zA-Z])|(Rec)(_[A-Z]))_RECT_([a-zA-Z0-9_])*FFP.csv

(?<ExtractSetType>OS)_((?<ConcilationPostRec>Post)([a-z])*|(?<ConcilationRec>Rec)(_[a-z])*)_(?<FileType>RECT)_([a-z0-9-_])*(?<ProcessorTypeRECT>COMPLETE_NP_FFP).(?<FileExtension>csv)

SSIS C#: Read two variables, join them in a two-dimensional array, assign to an object variable

Goal: read two delimited values from SSIS project parameters, join them into a DataTable, and write the DataTable to an object variable.

Part 1 plan:

  1. Receive the values in project parameters. For example: SSIS project parameter CCGCode=0A;1D;2J and SSIS project parameter PCTCode=0A;1D;2J
  2. Read the multiple values from each project parameter, separated by semicolons
  3. Split the values on the semicolon, then insert them into a DataTable
  4. Assign the DataTable to an object variable

Part 2 plan:

  1. Foreach Loop with an ADO enumerator
    1. Collection: read from the object variable
    2. Variable Mappings: map the values to string variables
  2. Use the values in the task

public void Main()
{
    // Requires "using System.Data;" for DataTable/DataRow in the script task.
    DataTable dt = new DataTable();
    //OleDbDataAdapter da = new OleDbDataAdapter();  // not used below
    dt.Columns.Add("CCGCode", typeof(string));
    dt.Columns.Add("PCTCode", typeof(string));

    string[] ccgCodes, pctCodes;
    ccgCodes = Dts.Variables["$Project::CCGCode"].Value.ToString().Split(';');
    pctCodes = Dts.Variables["$Project::PCTCode"].Value.ToString().Split(';');
    for (int i = 0; i < ccgCodes.Length; i++)
    {
        DataRow dr = dt.NewRow();

        // Without the string.IsNullOrEmpty() check this fails, even when the input
        // does not contain any null values.
        dr["CCGCode"] = string.IsNullOrEmpty(ccgCodes[i]) ? "NUL" : ccgCodes[i];
        dr["PCTCode"] = string.IsNullOrEmpty(pctCodes[i]) ? "NUL" : pctCodes[i];
        dt.Rows.Add(dr);
        //MessageBox.Show(dr["CCGCode"].ToString());
        //MessageBox.Show(dr["PCTCode"].ToString());
        //codes.Add(new CCGPCTCodes(ccgCode[i], pctCode[i]));
    }

    try
    {
        Dts.Variables["User::objCodes"].Value = dt;
    }
    catch (Exception e)
    {
        Dts.Events.FireError(0, null, e.Message, null, 0);
    }
}

C#: Generate scripts for a database in SQL Server

Generate scripts for database objects with SMO for SQL Server

https://www.mssqltips.com/sqlservertip/1826/getting-started-with-sql-server-management-objects-smo/
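The SMO code below uses types from several namespaces that are not shown in the snippets. A minimal sketch of the directives a script task (or console project) would need; the assembly names are typical for a SQL Server 2008/2012-era install and may differ on your machine:

// Assembly references to add: Microsoft.SqlServer.Smo, Microsoft.SqlServer.ConnectionInfo,
// Microsoft.SqlServer.Management.Sdk.Sfc (and System.Windows.Forms for MessageBox).
using System;
using System.Collections.Generic;             // List<T>
using System.Collections.Specialized;         // StringCollection
using System.Data;                            // DataTable, DataRow
using System.Text;                            // StringBuilder
using System.Windows.Forms;                   // MessageBox
using Microsoft.SqlServer.Management.Smo;     // Server, Database, Table, Scripter, ScriptingOptions
using Microsoft.SqlServer.Management.Sdk.Sfc; // Urn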

 

/*working code*/

public void Main()

{

ETLControlDetails etl = GetETLControlDetails();

GenerateDropTableScript(etl);

GenerateCreateTableScript(etl);

MessageBox.Show(sb.ToString());

}

/*working code*/

StringBuilder sb = new StringBuilder();

#region public struct ETLControlDetails
public struct ETLControlDetails
{
public string Server { get; set; }
public string SourceServer { get; set; }
public string SourceInstance { get; set; }
public string SourceDatabase { get; set; }
public string SourceSchema { get; set; }
public string SourceTable { get; set; }
}
#endregion

 

#region public ETLControlDetails GetETLControlDetails()
public ETLControlDetails GetETLControlDetails()
{

ETLControlDetails etl = new ETLControlDetails();
etl.Server = Dts.Variables["User::SourceServer"].Value.ToString() + "\\" + Dts.Variables["User::SourceInstance"].Value.ToString();
etl.SourceServer = Dts.Variables["User::SourceServer"].Value.ToString();
etl.SourceInstance = Dts.Variables["User::SourceInstance"].Value.ToString();
etl.SourceDatabase = Dts.Variables["User::SourceDatabase"].Value.ToString();
etl.SourceSchema = Dts.Variables["User::SourceSchema"].Value.ToString();
etl.SourceTable = Dts.Variables["User::SourceTable"].Value.ToString();
return etl;
}
#endregion

#region public void ConnectToServer(string server)
public Server ConnectToServer(string server)
{
Server myServer = new Server(server);
try
{
/*Using windows authentication*/
myServer.ConnectionContext.LoginSecure = true;
myServer.ConnectionContext.Connect();
////
//Do your work
////
if (myServer.ConnectionContext.IsOpen)
myServer.ConnectionContext.Disconnect();
///*Using SQL Server authentication*/
//myServer.ConnectionContext.LoginSecure = false;
//myServer.ConnectionContext.Login = "SQLLogin";
//myServer.ConnectionContext.Password = "entry@2008";
// LogInformation / LogError are logging helpers of this script task, not shown here.
LogInformation("ConnectToServer", String.Format("Connected to the server {0}", server));
}
catch (Exception e)
{
LogError("ConnectToServer - Failed connecting to the server", e);
}
return myServer;
}
#endregion

 

#region public void GenerateCreateTableScript(ETLControlDetails etl)
public void GenerateCreateTableScript(ETLControlDetails etl)
{
/*generate a CREATE TABLE script for an individual table*/
Server server = new Server(etl.Server);
Database db = server.Databases[etl.SourceDatabase.ToString()];
List<Urn> list = new List<Urn>();
DataTable dataTable = db.EnumObjects(DatabaseObjectTypes.Table);

foreach (DataRow row in dataTable.Rows)
{
list.Add(new Urn((string)row["Urn"]));
}
Scripter scripter = new Scripter();
scripter.Server = server;
scripter.Options.IncludeHeaders = true;
scripter.Options.IncludeIfNotExists = true;
scripter.Options.SchemaQualify = true;
scripter.Options.SchemaQualifyForeignKeysReferences = true;
scripter.Options.NoCollation = true;
scripter.Options.DriAllConstraints = true;
scripter.Options.DriAll = true;
scripter.Options.DriAllKeys = true;
scripter.Options.DriIndexes = true;
scripter.Options.ClusteredIndexes = true;
scripter.Options.NonClusteredIndexes = true;
scripter.Options.ToFileOnly = true;
//scripter.Options.FileName = @"C:\Download\script_custom.sql";
/*get all the tables in the database*/
//scripter.Script(list.ToArray());

/*get a particular table script only*/
Table myTable = db.Tables[etl.SourceTable, etl.SourceSchema];
//scripter.Script(new Urn[] { myTable.Urn});

StringCollection sc = scripter.Script(new Urn[] { myTable.Urn });
foreach (string script in sc)
{
sb.AppendLine();
sb.AppendLine("--create table");
sb.Append(script + ";");
}
}
#endregion

 

#region public void GenerateCreateTableScript_V2(ETLControlDetails etl)
public void GenerateCreateTableScript_V2(ETLControlDetails etl)
{
//=================================================
Server myServer = ConnectToServer(etl.Server);

Scripter scripter = new Scripter(myServer);
Database db = myServer.Databases[etl.SourceDatabase.ToString()];

Table myTable = db.Tables[etl.SourceTable, etl.SourceSchema];
/* With ScriptingOptions you can specify different scripting
* options, for example to include IF NOT EXISTS, DROP
* statements, output location etc*/
ScriptingOptions scriptOptions = new ScriptingOptions();
scriptOptions.IncludeHeaders = true;
scriptOptions.ScriptDrops = true;
scriptOptions.IncludeIfNotExists = true;
scriptOptions.WithDependencies = true;
scriptOptions.DriAll = true;
/* You can optionally choose each DRI object separately as given below */
scriptOptions.DriAllConstraints = true;
scriptOptions.DriAllKeys = true;
scriptOptions.DriChecks = true;
scriptOptions.DriClustered = true;
scriptOptions.DriDefaults = true;
scriptOptions.DriForeignKeys = true;
scriptOptions.DriIndexes = true;
scriptOptions.DriNonClustered = true;
scriptOptions.DriPrimaryKey = true;
scriptOptions.DriUniqueKeys = true;
scriptOptions.FileName = @"C:\Download\script_custom.sql";

/* Generating IF EXISTS and DROP command for tables */
StringCollection tableScripts = myTable.Script(scriptOptions);
foreach (string script in tableScripts)
{
sb.AppendLine();
sb.AppendLine("--drop table");
sb.Append(script + ";");
//MessageBox.Show(script);
}
/* Generating CREATE TABLE command */
tableScripts = myTable.Script();
foreach (string script in tableScripts)
{
sb.AppendLine();
sb.AppendLine("--create table");
sb.Append(script + ";");
//MessageBox.Show(script);
}
}
#endregion

 

#region public void GenerateDropTableScript(ETLControlDetails etl)
public void GenerateDropTableScript(ETLControlDetails etl)
{
/*generate a DROP TABLE script for an individual table*/
Server server = new Server(etl.Server);
Database db = server.Databases[etl.SourceDatabase.ToString()];
List<Urn> list = new List<Urn>();
DataTable dataTable = db.EnumObjects(DatabaseObjectTypes.Table);

foreach (DataRow row in dataTable.Rows)
{
list.Add(new Urn((string)row["Urn"]));
}
Scripter scripter = new Scripter();
scripter.Server = server;
scripter.Options.IncludeHeaders = true;
scripter.Options.IncludeIfNotExists = true;
scripter.Options.ScriptDrops = true;
scripter.Options.SchemaQualify = true;
//scripter.Options.FileName = @"C:\Download\script_custom.sql";
/*get all the tables in the database*/
//scripter.Script(list.ToArray());

/*get a particular table script only*/
Table myTable = db.Tables[etl.SourceTable, etl.SourceSchema];
//scripter.Script(new Urn[] { myTable.Urn });

StringCollection sc = scripter.Script(new Urn[] { myTable.Urn });
foreach (string script in sc)
{
sb.AppendLine();
sb.AppendLine("--drop table");
sb.Append(script + ";");
}

}
#endregion

 

Source (copied from the MSSQLTips article linked above):

Problem
In my last article, Getting started with SQL Server Management Objects (SMO), I discussed what SMO is, how you can start working with SMO, how to connect to SQL server, how to enumerate through the different SQL objects, create a database, create a table etc, all programmatically.

In this tip I would like to take you on an SMO ride to generate SQL object scripts programmatically. Though you can do this through SQL Server Management Studio (SSMS) there might be times (more details on usage scenarios given below) when you would need to create SQL scripts automatically.

Solution
As I discussed in my last tip, SQL Server objects are represented as object hierarchies inside SMO; for example, a Server object is a collection of Database objects. A Database object is a collection of Table objects (though there are a couple of other collections inside the Database object as well, such as Stored Procedures, Views, User-defined Functions, etc.). A Table is a collection of Column objects, and so on.

Every object in this hierarchy has a method called Script, which returns a string collection of scripts. Apart from that, SMO provides a utility class, Scripter, which generates the script in a more efficient way. For example, the Scripter class can discover the relationships between objects and can provide scripts for dependencies as well, and it can respond to Progress and Error events.

Usage Scenario

As I said, SQL Server Management Studio (SSMS) provides a wizard-type interface to script out all or selected objects, but there are some scenarios, some of which are discussed below, where you would consider using SMO instead of SSMS.

  • You want to create a tool which will automatically set up (or sync) an environment for dev or test which resembles the production environment.
  • You have a data warehouse database in which the data itself is not that important (or the data size is so big that backing up the entire database would not be feasible) because the data can be pulled again from the source systems, but you want to make sure the schema objects are scripted and backed up automatically so that you can re-create the database in case of a disaster.
  • Programmatically you want to control the backup and restore process of database administration (I will cover this in more detail in another tip “Backup and Restore Programmatically with SMO”).
  • Programmatically you want to transfer a database schema and data to another instance of SQL Server. (I will cover this in more detail in another tip “Transferring schema objects and data programmatically with SMO”).

Example

Before you start writing your code using SMO, you need to reference several assemblies that contain the different namespaces needed to work with SMO. For more details on what these assemblies are and how to reference them in your code, refer to my tip Getting started with SQL Server Management Objects (SMO).

C# Code Block 1 – Here I am using the Scripter utility class to generate the script for two selected databases. Two Database objects are created first; one of them refers to the AdventureWorks database and the other refers to AdventureWorksDW. The Script method of the Scripter object is called, which takes the database objects' URNs (Unique Resource Name) as input and returns a string collection of scripts. URN is a new concept in SMO (it was not available in SQL-DMO) which provides an XPath-like notation to denote the object hierarchy.

C# Code Block 1 – Generating Database Script

Scripter scripter = new Scripter(myServer);
Database myAdventureWorks = myServer.Databases["AdventureWorks"];
Database myAdventureWorksDW = myServer.Databases["AdventureWorksDW"];
Urn[] DatabaseURNs = new Urn[] { myAdventureWorks.Urn, myAdventureWorksDW.Urn };
StringCollection scriptCollection = scripter.Script(DatabaseURNs);
foreach (string script in scriptCollection)
Console.WriteLine(script);

C# Code Block 2 – In this code block I am generating CREATE TABLE scripts for all the tables in the AdventureWorks database. As said before, a database is a collection of tables, so I am enumerating through the table collection of the database to generate a script for each table. Along with that, I am also using the ScriptingOptions class to specify the different scripting options; for example, in this code I am scripting IF NOT EXISTS and DROP TABLE statements as well.

C# Code Block 2 – Generating scripts for table collection

Scripter scripter = new Scripter(myServer);

Database myAdventureWorks = myServer.Databases["AdventureWorks"];
/* With ScriptingOptions you can specify different scripting
* options, for example to include IF NOT EXISTS, DROP
* statements, output location etc*/
ScriptingOptions scriptOptions = new ScriptingOptions();
scriptOptions.ScriptDrops = true;
scriptOptions.IncludeIfNotExists = true;
foreach (Table myTable in myAdventureWorks.Tables)
{
/* Generating IF EXISTS and DROP command for tables */
StringCollection tableScripts = myTable.Script(scriptOptions);
foreach (string script in tableScripts)
Console.WriteLine(script);

/* Generating CREATE TABLE command */
tableScripts = myTable.Script();
foreach (string script in tableScripts)
Console.WriteLine(script);
}

C# Code Block 3 – This code block further extends the use of the Scripter and ScriptingOptions classes to generate a script for a table (HumanResources.EmployeeAddress) along with all the other objects on which this table depends. The ScriptingOptions class also provides several properties for DRI (Declarative Referential Integrity) objects. You can either select individual DRI objects or all of them to script out along with the main object script.

C# Code Block 3
Generating table script with all dependencies and DRI objects

Scripter scripter = new Scripter(myServer);
Database myAdventureWorks = myServer.Databases["AdventureWorks"];
Table myTable = myAdventureWorks.Tables["EmployeeAddress", "HumanResources"];
/* Generate scripts for the table along with all
* objects on which this table depends */
ScriptingOptions scriptOptionsForDependendencies = new ScriptingOptions();
scriptOptionsForDependendencies.WithDependencies = true;
/* DriAll will include all DRI objects in the generated script. */
scriptOptionsForDependendencies.DriAll = true;
/* You can optionally choose each DRI object separately as given below */
//scriptOptionsForDependendencies.DriAllConstraints = true;
//scriptOptionsForDependendencies.DriAllKeys = true;
//scriptOptionsForDependendencies.DriChecks = true;
//scriptOptionsForDependendencies.DriClustered = true;
//scriptOptionsForDependendencies.DriDefaults = true;
//scriptOptionsForDependendencies.DriForeignKeys = true;
//scriptOptionsForDependendencies.DriIndexes = true;
//scriptOptionsForDependendencies.DriNonClustered = true;
//scriptOptionsForDependendencies.DriPrimaryKey = true;
//scriptOptionsForDependendencies.DriUniqueKeys = true;
/* You can use FileName to output the generated script to a file.
* Note: you need write access to the specified location. */
scriptOptionsForDependendencies.FileName = @"D:\TableScriptWithDependencies.sql";
StringCollection tableScripts = myTable.Script(scriptOptionsForDependendencies);
foreach (string script in tableScripts)
Console.WriteLine(script);

C# Code Block 4 – This code block provides CREATE TABLE scripts for all of the AdventureWorks tables along with all the indexes of each table. Here I am using the IndexCollection class to enumerate through the index collection of each table to generate CREATE INDEX scripts for all the indexes on the given table.

C# Code Block 4
Generating scripts for table collection along with all the indexes on the table

Scripter scripter = new Scripter(myServer);
Database myAdventureWorks = myServer.Databases["AdventureWorks"];
/* With ScriptingOptions you can specify different scripting
* options, for example to include IF NOT EXISTS, DROP
* statements, output location etc*/
ScriptingOptions scriptOptions = new ScriptingOptions();
scriptOptions.ScriptDrops = true;
scriptOptions.IncludeIfNotExists = true;
foreach (Table myTable in myAdventureWorks.Tables)
{
/* Generating IF EXISTS and DROP command for tables */
StringCollection tableScripts = myTable.Script(scriptOptions);
foreach (string script in tableScripts)
Console.WriteLine(script);
/* Generating CREATE TABLE command */
tableScripts = myTable.Script();
foreach (string script in tableScripts)
Console.WriteLine(script);
IndexCollection indexCol = myTable.Indexes;
foreach (Index myIndex in myTable.Indexes)
{
/* Generating IF EXISTS and DROP command for table indexes */
StringCollection indexScripts = myIndex.Script(scriptOptions);
foreach (string script in indexScripts)
Console.WriteLine(script);

/* Generating CREATE INDEX command for table indexes */
indexScripts = myIndex.Script();
foreach (string script in indexScripts)
Console.WriteLine(script);
}
}

C# Code Block 5 – In my last tip on SMO, I showed you how you can use SMO to create a database and a table on the server programmatically. In this code block, I am using the same code, but this time, before creating the objects on the server, I am generating the database and table scripts. This means it is not required to create objects on the server in order to generate scripts; if you have objects in memory, you can generate scripts for them even without creating them on the server.

 

C# Code Block 5
Generating Script for objects in memory even before creating on the server

/* Create a database called "MyNewDatabase" */
Database myDatabase = new Database(myServer, "MyNewDatabase");
/* Output the database script on the console */
StringCollection DBScripts = myDatabase.Script();
foreach (string script in DBScripts)
Console.WriteLine(script);
/* Create a table instance */
Table myEmpTable = new Table(myDatabase, "MyEmpTable");
/* Add [EmpID] column to the created table instance */
Column empID = new Column(myEmpTable, "EmpID", DataType.Int);
empID.Identity = true;
myEmpTable.Columns.Add(empID);
/* Add another column [EmpName] to the created table instance */
Column empName = new Column(myEmpTable, "EmpName", DataType.VarChar(200));
empName.Nullable = true;
myEmpTable.Columns.Add(empName);
/* Add a third column [DOJ] to the created table instance with a default constraint */
Column DOJ = new Column(myEmpTable, "DOJ", DataType.DateTime);
DOJ.AddDefaultConstraint(); // you can specify a constraint name here as well
DOJ.DefaultConstraint.Text = "GETDATE()";
myEmpTable.Columns.Add(DOJ);
/* Add a primary key index to the table */
Index primaryKeyIndex = new Index(myEmpTable, "PK_MyEmpTable");
primaryKeyIndex.IndexKeyType = IndexKeyType.DriPrimaryKey;
primaryKeyIndex.IndexedColumns.Add(new IndexedColumn(primaryKeyIndex, "EmpID"));
myEmpTable.Indexes.Add(primaryKeyIndex);
/* Output the table script on the console */
StringCollection TableScripts = myEmpTable.Script();
foreach (string script in TableScripts)
Console.WriteLine(script);

/* If you want to create the objects on the server you need to call the
* Create method, or else the objects will not be created on the server */
myDatabase.Create();
myEmpTable.Create();

The complete code listing (created on SQL Server 2008 and Visual Studio 2008, though there is not much difference if you are using SQL Server 2005 and Visual Studio 2005) can be found in the original tip.

Visual Studio: C# Debugging is not working

 

 

Understanding symbol files and Visual Studio’s symbol settings

Symbols are a fundamental requirement for debugging and other diagnostic tools. Fortunately, in most cases, when you are building and launching your application in Visual Studio you don't have to think about symbols for your code. However, the odds are that at some point you'll need to change how symbols load, where the debugger looks for them, or you will need to load symbols for a 3rd party component (e.g. Windows or .NET libraries). Additionally, because symbols are so fundamental to debugging, we continue to make tweaks to the experience, so understanding the ins and outs of how Visual Studio behaves can save you hours of frustration.

In this blog post I'll walk you through what symbols are, how to configure Visual Studio's symbol settings (which are used by other diagnostic tools beyond the debugger, such as the performance tools and IntelliTrace), the various knobs available when debugging, and how to troubleshoot issues when Visual Studio isn't able to find the symbol files that you need.

Symbol basics

Before we delve into the details of symbol files it’s important to briefly review what symbols are and why they are important:

  • What is a symbol file? For the Microsoft compilers, these are the .pdb files that are produced as part of your build.
  • What is in a symbol (.pdb) file? The exact contents of symbol files will vary from language to language and based on your compiler settings, but at a very high level they are the record of how the compiler turned your source code into machine code that the processor executes.
  • Why do I need symbols? Without symbols, tools are unable to correlate the instructions executing in the application to the original source code.
    • When debugging, without a symbol file you are unable to set breakpoints on a specific line of code. If symbols are not loaded you will see a hollow circle with a warning symbol while in debug mode, and if you hover the mouse over it a tooltip will tell you that the breakpoint will not be hit because no symbols have been loaded.
    • Depending on what you are debugging, symbols may be required to show you a complete call stack and to inspect objects using the Watch windows, or DataTips (e.g. this is true for C++).
    • Note: If you are debugging a dump file that does not contain the heap, the debugger will need access to the original binary file so it can determine the correct symbol file to load. Said another way, if you are debugging a dump with no heap information, you need both the corresponding binary and symbol file on the symbol path.

[Screenshot: hollow breakpoint with a warning icon, shown when symbols are not loaded]

Visual Studio’s default behavior

Before we look at any of Visual Studio’s advanced settings it’s important that I stop and review the default behavior (meaning if you never touch a setting how will it behave):

  • Visual Studio will try to load symbols for all binaries (referred to as “modules”) in the process when the module is loaded (and for all modules already loaded when attaching to a process).
    • The exception to this is when you are debugging managed (.NET) applications: the debugger will not load symbols for any binaries considered "not your code" when "Just My Code" is enabled.
  • No symbol locations are set, so it will not find symbols for any Microsoft runtime binaries
    • If you right click on a module in the Call Stack or Modules windows and choose to load symbols, it will automatically try to get them from the Microsoft public symbol servers if they can't be found on your local machine.
  • Visual Studio will always find symbols when:
    • The symbol file is located in the same folder as its corresponding module. The default build output settings for Visual Studio projects will output the symbols next to the binaries. This means that Visual Studio will always be able to find the symbols for your projects.
    • The symbol file is located in the same directory it was placed in during compilation. The full path of the .pdb is placed into the binary at build time.

How can I tell if a symbol is loaded and if not why?

The screenshot above (with the hollow breakpoint) shows a situation where symbols didn’t load for a source file you are trying to set a breakpoint in. The other ways to determine if symbols are loaded:

  • A message will appear in the call stack window saying that symbols are not loaded

[Screenshot: Call Stack window message indicating symbols are not loaded]

  • The Modules window will tell you (Debug -> Windows -> Modules):
    • The status of the symbol file (loaded, skipped, or couldn’t be opened or found)
    • Path the binary is loaded from
    • [if loaded] where the symbol file was loaded from
    • The module version
    • The module’s time stamp

[Screenshot: Modules window showing symbol status for each module]

Additionally the debugger can tell you why it didn’t load symbols and where it searched for them. To see this information, open the Modules window, right click on a module and choose “Symbol Load Information…”

[Screenshot: "Symbol Load Information…" option in the Modules window context menu]

This will display a box that shows you all the paths the debugger searched for the symbol file.

[Screenshot: Symbol Load Information dialog listing the searched paths]

Some common reasons symbols aren’t loaded include:

  • Symbol paths don’t point to the correct location
  • The symbol file is from a different version of the module than the one loaded in the process
    • Visual Studio requires that the symbol file come from the exact same build as the module. It cannot load symbols that come from a different build even if the source code was identical
  • [Managed only] Just My Code settings prevent the debugger from loading the symbol file

Configuring Visual Studio’s settings

Now that you understand what symbols are, and how to determine if they are loaded let’s look at how you configure Visual Studio’s symbol settings. To access symbols settings, go to the “Debug” menu and choose “Options…” (“Options and Settings…” in previous versions of Visual Studio), and then select the “Symbols” sub page

[Screenshot: the Symbols options page]

You’ll notice the following settings on the page:

    1. Symbol file (.pdb) locations
    2. Symbol cache settings
    3. "Load all symbols" button
    4. Automatic symbol loading settings

Symbol file locations

If you are building and debugging your application from Visual Studio this option likely won’t apply to the symbols for your modules, but remote symbol locations (or symbol servers) are used to load symbols in situations where you need a 3rd party symbol file (e.g. one from Microsoft), or you are working in an environment where you may not have the symbols on your local machine (e.g. your application is built using a build server.  If you are using TFS read about how to add symbol and source archiving support).

The symbol file location box tells the debugger where to look for symbol files; these can be HTTP symbol servers (e.g. the prepopulated "Microsoft Symbol Servers" entry), network shares, or folders on your local machine.

  • You can add as many paths as you need.
  • There is a pre-populated entry for Microsoft’s public symbol servers. If you want to load symbols for modules from Microsoft (e.g. for the runtime or operating system) check this box.
  • Visual Studio will search local paths before querying network paths regardless of the order provided.
  • For performance reasons, beginning in Visual Studio 2012 Update 1, Visual Studio will only search each symbol server once for a symbol file in a given Visual Studio session (until you restart Visual Studio) when automatic symbol loading is enabled. This means you don’t pay the cost of a network call every time you start debugging when the server doesn’t contain the file.
  • Environment Variable: _NT_SYMBOL_PATH: If you see this in your symbol file locations it means that the environment variable _NT_SYMBOL_PATH is set. Visual Studio uses a library from Windows to load symbols, and the library will always search any locations in this environment variable for symbols, which is why you cannot uncheck the option in Visual Studio. You will need to unset the environment variable if you want Visual Studio to ignore it.
    If you need the environment variable for other purposes, the easy way to unset the variable locally is to open a command prompt, enter "set _NT_SYMBOL_PATH=" and then launch Visual Studio from that command prompt. Your system's environment settings will remain unaffected.

Symbol cache

The symbol cache is the location on your local machine that Visual Studio places a copy of the symbols it finds on remote locations for future use. Assuming you provide a path for the symbol cache, Visual Studio will search the cache before trying to find symbols in any symbol file locations you specified above. For performance reasons we recommend specifying a symbol cache if you need symbols stored in a remote location.

Load All Symbols

This button is only enabled while Visual Studio is in debug mode, and clicking it will tell the debugger to try to load symbols for all modules in the process.

Automatic Symbol Loading

Visual Studio offers two modes of automatic symbol loading:

  • Automatically load symbols for all modules unless excluded: As the title indicates, unless you add a module to the exclude list by clicking “Specify excluded modules”, Visual Studio will try to load symbols for all modules in the process. You will typically want this setting if you want symbols loaded for almost everything in the process, or if there are only a handful of very large ones you don’t want loaded for memory or debug startup performance reasons.
  • Only specified modules: This setting by default will load symbols that are next to the binary on the disk, but will not try to load symbols for any other modules unless you add them to the include list by clicking “Specify modules”.
    • Beginning with Visual Studio 2013 Update 2, the "Specify modules" dialogs accept * for module names. So if, for example, you wanted to use manual loading but always load symbols for anything with "Microsoft" in the name, you could enter "*Microsoft*".

[Screenshot: automatic symbol loading settings with the "Specify modules" dialog]

  • Symbols can be manually loaded from the Call Stack window as needed. To do this, you can select an individual frame (or select all with Ctrl+A), right click and choose "Load symbols". This will load symbols for all of the modules that were in the Call Stack window at that time. If loading symbols improves the call stack and additional modules appear, you will need to repeat this, as it won't automatically try to load symbols for modules that appear as a result of the previous load.

[Screenshot: "Load symbols" from the Call Stack window context menu]

  • The other option to load symbols manually when you are debugging is to locate the module in the Modules window (Debug -> Windows -> Modules), right click and choose “Load Symbols”.

[Screenshot: "Load Symbols" from the Modules window context menu]

Deep dive on manual symbol loading

It is worth calling out that “Only specified modules” is my and many of the Visual Studio team’s preferred setting when debugging. The reason for this is:

  • When debugging very large applications you can load symbols on demand as you need them. This helps with:
    • The performance of your debug session—you don’t have to wait for symbols to load for everything in the process you are debugging.
    • Visual Studio's memory—if you are debugging a very large application you may need to selectively load symbols for only the modules you are interested in. Since Visual Studio is a 32-bit process, it can grow to at most 4 GB of virtual memory. For very large applications, the symbol files alone can add up to more than this.
  • You can leave your symbol servers enabled without encountering unexpected performance hits when debugging new applications or during your first debug session in a new Visual Studio instance.

If you have a very large solution that you build entirely on your machine, you will need to uncheck the "Always load symbols located next to modules" checkbox to see the benefits I mentioned above. Then you will either need to load the symbols on demand while debugging, or set the ones you need to load automatically.

  • If you need to hit breakpoints in those modules you will want to set them to load automatically.
  • If you aren’t sure you will need the symbols ahead of time you will want to wait and load them only if you need them to inspect an object or complete the call stack.

Conclusion

Keeping track of symbols and configuring your settings correctly and for optimal performance can be quite complicated. Hopefully the above content helps you understand the settings available to you. However if you run into issues I did not cover, or have any other feedback you’d like to share, please let me know below, through Visual Studio’s Send a Smile feature, or in our MSDN forum.