Case
I am filling an Object variable with an Execute SQL Task and I want to use it in a Foreach Loop Container (Foreach ADO Enumerator), but the Foreach Loop stays empty. So I want to check the value of my Object variable. However debugging the package does not show me the value of Object variables. How can I see the content of my Object variable?

[Screenshot: No (readable) value for Object variables]

Solution
A solution could be to use a Script Task after the Execute SQL Task to show the content of the Object variable. The script below shows the top (x) records in a MessageBox. The code doesn't need any changes. The only change you could consider making is the number of records to show in the MessageBox (see the C# variable maxRows).
[Screenshot: Getting content of Object variable]

1) Add a Script Task
Add a new Script Task to the surface of your Control Flow and connect it to your Execute SQL Task. Then edit the Script Task to provide one Object variable in the property ReadOnlyVariables or ReadWriteVariables. This should of course be the same Object variable as in your Execute SQL Task.
[Screenshot: Provide one Object variable]

2) Edit Script
Make sure to select Microsoft Visual C# as the Script Language and then hit the Edit Script button to open the VSTA environment. First locate the Namespaces region and add a using directive for System.Data.OleDb.
#region Namespaces
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;
using System.Data.OleDb; // Added
#endregion

Then scroll down, locate the Main method "public void Main()" and replace it with the code below.
public void Main()
{
    // Maximum number of data rows to show in a single MessageBox
    int maxRows = 3;

    /////////////////////////////////////////////////////////////////////
    // No need to change the lines below
    /////////////////////////////////////////////////////////////////////

    // Create a table object to store the content of the object variable
    OleDbDataAdapter dataAdapter = new OleDbDataAdapter();
    DataTable myTable = new DataTable();

    // Create message string to show the content of the object variable
    string message = "";
    string header = "Error";

    // Five checks before looping through the records in the object variable
    ////////////////////////////////////
    // 1) Is a variable provided?
    ////////////////////////////////////
    if (Dts.Variables.Count.Equals(0))
    {
        message = "No read-only or read-write variables found";
    }
    ////////////////////////////////////
    // 2) Multiple variables provided
    ////////////////////////////////////
    else if (Dts.Variables.Count > 1)
    {
        message = "Please provide only 1 read-only or read-write variable";
    }
    ////////////////////////////////////
    // 3) Is it an Object variable?
    ////////////////////////////////////
    else if (!Dts.Variables[0].DataType.ToString().Equals("Object"))
    {
        message = Dts.Variables[0].Name + " is not an Object variable";
    }
    ////////////////////////////////////
    // 4) Is it null or not a table?
    ////////////////////////////////////
    else
    {
        try
        {
            // Try to fill the datatable with the content of the object variable.
            // It will fail when it is null or doesn't contain a table object.
            dataAdapter.Fill(myTable, Dts.Variables[0].Value);
        }
        catch
        {
            // Failing the fourth check
            message = Dts.Variables[0].Name + " doesn't contain a usable value";
        }
    }

    ////////////////////////////////////
    // 5) Does it contain records?
    ////////////////////////////////////
    if (myTable.Rows.Count > 0)
    {
        int j = 0;
        // Loop through all rows in the dataset but don't exceed maxRows
        for (j = 0; j < myTable.Rows.Count && j < maxRows; j++)
        {
            // Get all values from a single row into an array
            object[] valuesArray = myTable.Rows[j].ItemArray;

            // Loop through the value array and the column names collection
            for (int i = 0; i < valuesArray.Length; i++)
            {
                message += myTable.Rows[j].Table.Columns[i].ColumnName + " : " + valuesArray[i].ToString() + Environment.NewLine;
            }
            // Add an empty row between each data row
            message += Environment.NewLine;
        }

        // Create header
        header = "Showing " + j.ToString() + " rows out of " + myTable.Rows.Count.ToString();
    }
    else if (!message.Equals(""))
    {
        // Don't do anything.
        // Record count is 0, but another validation already failed.
    }
    else
    {
        // Record count is 0
        message = Dts.Variables[0].Name + " doesn't contain any rows";
    }

    // Show message with custom header
    MessageBox.Show(message, header);

    Dts.TaskResult = (int)ScriptResults.Success;
}
Now close the VSTA environment and click OK in the Script Task editor to finish it.


3) The result
Now run the package to see the result. I tried to make it a bit monkey proof by adding some checks in the code. If you provide a good and filled variable then it will show the data. Otherwise it will show an error telling you what's wrong.
[Screenshot: The result]

Case
I want to import and export Environments to/from my SSIS Catalog. Doing it manually in SSMS takes ages. How can you do that more quickly?
[Screenshot: I want to export this environment]

Solution
A while ago I created a couple of PowerShell scripts to deploy environments to your SSIS Catalog with a CSV file, database table or an array as source. The script below is a follow-up that allows you to export one or more environments as JSON files, but it also has an import method to deploy those exported environments to a Catalog: Get-CatalogEnvironment and Set-CatalogEnvironment.

This is an example of how you execute the two methods. It starts by importing a separate script file with the various methods, after which you can execute either the Get or the Set method:
# PowerShell code
# If you have trouble executing a PowerShell script due to the Execution Policy then run
# the following script. Note that you need to run PowerShell as administrator
# More information: https://technet.microsoft.com/nl-nl/library/ee176961.aspx
# Set-ExecutionPolicy Unrestricted

# Include functions from a secondary file
. "$PSScriptRoot\Ssisfunctions.ps1"

# Download example
Get-CatalogEnvironment -SsisServer "mySqlServer\myInstance" -ExportPath "c:\backup\" -FolderName MyEnvFolder -EnvironmentName MyEnvName -Verbose

# Upload example
Set-CatalogEnvironment -SsisServer "mySqlServer\myInstance" -FolderName MyEnvFolder -EnvironmentName MyEnvName -ImportFilePath "C:\temp\employees.json" -DeleteExistingEnvironment $true -Verbose
[Screenshot: Example of execution]

The environment json files look like this:
[
{
"Name":"FolderStageFiles",
"Description":"Location of stage files",
"Type":"String",
"Sensitive":false,
"Value":"d:\\sources\\"
},
{
"Name":"FtpPassword",
"Description":"Secret FTP password",
"Type":"String",
"Sensitive":true,
"Value":$3cr3t
},
{
"Name":"MIS_STG_Connectionstring",
"Description":"Connectionstring to stage database",
"Type":"String",
"Sensitive":false,
"Value":"Data Source=.\\sql2016;Initial Catalog=MIS_STG;Provider=SQLNCLI11.1;Integrated Security=SSPI;Auto Translate=False;"
},
{
"Name":"NumberOfRetries",
"Description":"Number of retries for Webservice Task",
"Type":"Int16",
"Sensitive":false,
"Value":3
}
]

You can also get detailed help information and instructions with the standard PowerShell cmdlet Get-Help. It allows you to see examples or to see which parameters are mandatory or optional.
# PowerShell code
# Getting help about the commands
Get-Help Set-CatalogEnvironment -detailed
Get-Help Get-CatalogEnvironment -example

And this is the content of the Ssisfunctions.ps1 file containing the various methods. Take a look and let me know if you have any improvements.
# PowerShell code: Ssisfunctions.ps1 (v0.1)
<#
.Synopsis
Download one or more environments from an SSIS Catalog as JSON files

.DESCRIPTION
This function allows you to download an Environment from the SSIS Catalog. By leaving out the folder name or environment name you can also download
multiple files. All files are downloaded as JSON files in the format [FolderName].[EnvironmentName].json.
Example file of export:

[
{
"Name":"FolderStageFiles",
"Description":"Location of stage files",
"Type":"String",
"Sensitive":false,
"Value":"d:\\sources\\"
},
{
"Name":"FtpPassword",
"Description":"Secret FTP password",
"Type":"String",
"Sensitive":true,
"Value":null
},
{
"Name":"MIS_STG_Connectionstring",
"Description":"Connectionstring to stage database",
"Type":"String",
"Sensitive":false,
"Value":"Data Source=.\\sql2016;Initial Catalog=MIS_STG;Provider=SQLNCLI11.1;Integrated Security=SSPI;Auto Translate=False;"
},
{
"Name":"NumberOfRetries",
"Description":"Number of retries for Webservice Task",
"Type":"Int16",
"Sensitive":false,
"Value":3
}
]

.PARAMETER SsisServer
Mandatory: The name of the SQL Server instance that runs the SSIS Catalog

.PARAMETER FolderName
Optional: The name of the Catalog folder that contains the Environment

.PARAMETER EnvironmentName
Optional: The name of the Environment

.PARAMETER ExportPath
Optional: The fully qualified path where the json files will be saved. Default value: c:\temp\

.PARAMETER Verbose
Optional: Get more logging information on the screen

.EXAMPLE
Get-CatalogEnvironment -SsisServer "myServer\myInstance" -ExportPath "c:\backup\" -FolderName myCatalogFolder -EnvironmentName myEnvironmentName

.EXAMPLE
Get-CatalogEnvironment -SsisServer "myServer\myInstance" -ExportPath "c:\backup\" -Verbose

.NOTES
You cannot get the value of sensitive variables. The value will be NULL in the export file.
The current script works for SSIS 2016. Change the version number in the code to use another version of SSIS.

.LINK
https://server.hoit.asia/
#>
Function Get-CatalogEnvironment
{
[CmdletBinding()]
Param
(
[Parameter(Mandatory=$true, Position=0)]
[ValidateLength(1,50)] # String must be between 1 and 50 chars long
[string]$SsisServer,

[Parameter(Mandatory=$false, Position=1)]
[ValidateLength(1,128)] # String must be between 1 and 128 chars long
[string]$FolderName,

[Parameter(Mandatory=$false, Position=2)]
[ValidateLength(1,128)] # String must be between 1 and 128 chars long
[string]$EnvironmentName,

[Parameter(Mandatory=$false, Position=3)]
[string]$ExportPath = "C:\temp\"
)

# Don't continue after error
$ErrorActionPreference = "Stop"

#################################################
############## SHOW ALL PARAMETERS ##############
#################################################
Write-Verbose "========================================================="
Write-Verbose "==     Used parameters - Get-CatalogEnvironment       =="
Write-Verbose "========================================================="
Write-Verbose "SSISServer              : $($SsisServer)"
Write-Verbose "FolderName         : $($FolderName)"
Write-Verbose "EnvironmentName : $($EnvironmentName)"
Write-Verbose "ExportPath      : $($ExportPath)"
Write-Verbose "========================================================="

 
#################################################
############### ADD SSIS ASSEMBLY ###############
#################################################
# Change the assembly version number to use another SSIS version
# 13.0.0.0 = SSIS 2016
# 12.0.0.0 = SSIS 2014
# 11.0.0.0 = SSIS 2012
$SsisNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
Add-Type -AssemblyName "$($SsisNamespace), Version=13.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91"


#################################################
############ CONNECT TO SSIS SERVER #############
#################################################
# First create a connection to SQL Server
$SqlConnectionstring = "Data Source=$($SsisServer);Initial Catalog=master;Integrated Security=SSPI;"
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection $SqlConnectionstring

# Then use that SQL connection to create an
# Integration Services object.
$IntegrationServices = New-Object $SsisNamespace".IntegrationServices" $SqlConnection

# Check if connection succeeded
If (!$IntegrationServices)
{
Throw [System.Exception] "Failed to connect to server $($SsisServer)"
}
Else
{
Write-Verbose "Connected to server $($SsisServer)"
}


#################################################
########### CONNECT TO SSIS CATALOG #############
#################################################
# Create object for SSISDB Catalog
$Catalog = $IntegrationServices.Catalogs["SSISDB"]

# Check if the SSISDB Catalog exists
If (!$Catalog)
{
# Catalog does not exist.
Throw [System.Exception] "SSISDB catalog does not exist"
}
Else
{
Write-Verbose "SSISDB catalog found"
}


#################################################
############## CHECK EXPORT FOLDER ##############
#################################################
# Check if folder exists
If (-Not (Test-Path $ExportPath))
{
# Create new folder
New-Item -ItemType directory -Path $ExportPath | Out-Null
Write-Host "Folder created: " $ExportPath
}
Else
{
Write-Verbose "Folder $($ExportPath) found"
}


#################################################
############# LOOP THROUGH FOLDERS ##############
#################################################
# Loop through all folders or filter on a folder name
Foreach ($Folder in $Catalog.Folders | WHERE {$_.Name -eq $FolderName -or (!$FolderName)})
{
# Loop through all environments or filter on an environment name
Foreach ($Environment in $Folder.Environments | WHERE {$_.Name -eq $EnvironmentName -or (!$EnvironmentName)})
{
Write-Host "Exporting $($ExportPath)$($Folder.Name).$($Environment.Name).json"
$Environment.Variables | Select-Object -Property Name,Description,@{Name='Type';Expression={"$($_.Type)"}},Sensitive,Value | ConvertTo-Json -Compress | Out-File "$($ExportPath)$($Environment.Parent.Name).$($Environment.Name).json"

# Show warnings if the environment contains sensitive variables
$Environment.Variables | Select-Object -Property Name,Sensitive | Where {$_.Sensitive -eq $True} | ForEach-Object {
Write-Warning "Variable $($_.Name) is sensitive. Cannot retrieve its value"
}
}
}

}

<#
.Synopsis
Upload a json environment file to an SSIS Catalog

.DESCRIPTION
This function allows you to upload an Environment to the SSIS Catalog. It can update (no deletes) or replace an existing environment.
Example file which can be imported:

[
{
"Name":"FolderStageFiles",
"Description":"Location of stage files",
"Type":"String",
"Sensitive":false,
"Value":"d:\\sources\\"
},
{
"Name":"FtpPassword",
"Description":"Secret FTP password",
"Type":"String",
"Sensitive":true,
"Value":$3cr3t
},
{
"Name":"MIS_STG_Connectionstring",
"Description":"Connectionstring to stage database",
"Type":"String",
"Sensitive":false,
"Value":"Data Source=.\\sql2016;Initial Catalog=MIS_STG;Provider=SQLNCLI11.1;Integrated Security=SSPI;Auto Translate=False;"
},
{
"Name":"NumberOfRetries",
"Description":"Number of retries for Webservice Task",
"Type":"Int16",
"Sensitive":false,
"Value":3
}
]

.PARAMETER SsisServer
Mandatory: The name of the SQL Server instance that runs the SSIS Catalog

.PARAMETER FolderName
Mandatory: The name of the Catalog folder where the Environment will be stored

.PARAMETER EnvironmentName
Mandatory: The name of the Environment

.PARAMETER ImportFilePath
Mandatory: The fully qualified path of the json file that needs to be imported

.PARAMETER DeleteExistingEnvironment
Optional: Setting to $true first deletes an existing environment. Default value: $false

.PARAMETER Verbose
Optional: Get more logging information on the screen

.EXAMPLE
Set-CatalogEnvironment -SsisServer "MYSERVER\myInstance" -FolderName MyEnvFolder -EnvironmentName MyEnvName -ImportFilePath "C:\backup\Environments.Generic.json" -DeleteExistingEnvironment $true

.EXAMPLE
Set-CatalogEnvironment -SsisServer "MYSERVER\myInstance" -FolderName MyEnvFolder -EnvironmentName MyEnvName -ImportFilePath "C:\backup\Environments.Generic.json" -Verbose

.NOTES
You cannot insert null values. They will be skipped with a warning.
The current script works for SSIS 2016. Change the version number in the code
to use another version of SSIS.

.LINK
https://server.hoit.asia/
#>
Function Set-CatalogEnvironment
{
[CmdletBinding()]
Param
(
[Parameter(Mandatory=$true, Position=0)]
[ValidateLength(1,50)] # String must be between 1 and 50 chars long
[string]$SsisServer,

[Parameter(Mandatory=$true, Position=1)]
[ValidateLength(1,128)] # String must be between 1 and 128 chars long
[string]$FolderName,

[Parameter(Mandatory=$true, Position=2)]
[ValidateLength(1,128)] # String must be between 1 and 128 chars long
[string]$EnvironmentName,

[Parameter(Mandatory=$true, Position=3)]
[ValidateScript({Test-Path -Path $_ -PathType Leaf})] # File must exist
[ValidatePattern('\.json$')] # Extension must be .json
[string]$ImportFilePath,

[Parameter(Mandatory=$false, Position=4)]
[bool]$DeleteExistingEnvironment = $false
)

# Don't continue after error
$ErrorActionPreference = "Stop"

#################################################
############## SHOW ALL PARAMETERS ##############
#################################################
Write-Verbose "========================================================="
Write-Verbose "==     Used parameters - Set-CatalogEnvironment       =="
Write-Verbose "========================================================="
Write-Verbose "SSISServer              : $($SsisServer)"
Write-Verbose "FolderName         : $($FolderName)"
Write-Verbose "EnvironmentName : $($EnvironmentName)"
Write-Verbose "ImportFilePath      : $($ImportFilePath)"
Write-Verbose "DeleteExistingEnvironment : $($DeleteExistingEnvironment)"
Write-Verbose "========================================================="


#################################################
############### ADD SSIS ASSEMBLY ###############
#################################################
# Change the assembly version number to use another SSIS version
# 13.0.0.0 = SSIS 2016
# 12.0.0.0 = SSIS 2014
# 11.0.0.0 = SSIS 2012
$SsisNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
Add-Type -AssemblyName "$($SsisNamespace), Version=13.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91"


#################################################
############ CONNECT TO SSIS SERVER #############
#################################################
# First create a connection to SQL Server
$SqlConnectionstring = "Data Source=$($SsisServer);Initial Catalog=master;Integrated Security=SSPI;"
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection $SqlConnectionstring

# Then use that SQL connection to create an
# Integration Services object.
$IntegrationServices = New-Object $SsisNamespace".IntegrationServices" $SqlConnection

# Check if connection succeeded
If (!$IntegrationServices)
{
Throw [System.Exception] "Failed to connect to server $($SsisServer)"
}
Else
{
Write-Verbose "Connected to server $($SsisServer)"
}


#################################################
########### CONNECT TO SSIS CATALOG #############
#################################################
# Create object for SSISDB Catalog
$Catalog = $IntegrationServices.Catalogs["SSISDB"]

# Check if the SSISDB Catalog exists
If (!$Catalog)
{
# Catalog does not exist. Different name used?
Throw [System.Exception] "SSISDB catalog does not exist"
}
Else
{
Write-Verbose "SSISDB catalog found"
}


#################################################
################## CHECK FOLDER #################
#################################################
# Create object to the (new) folder
$Folder = $Catalog.Folders[$FolderName]
 
# Check if folder exists
If (!$Folder)
{
# Folder doesn't exist, so create the new folder.
Write-Host "Creating new folder $($FolderName)"
$Folder = New-Object $SsisNamespace".CatalogFolder" ($Catalog, $FolderName, $FolderName)
$Folder.Create()
}
Else
{
     Write-Verbose "Folder $($FolderName) found"
}


#################################################
################## ENVIRONMENT ##################
#################################################
# Create object for the (new) environment
$Environment = $Folder.Environments[$EnvironmentName]

# Check if the environment already exists
If (-not $Environment)
{
Write-Host "Creating new environment $($EnvironmentName) in $($FolderName)"

$Environment = New-Object $SsisNamespace".EnvironmentInfo" ($Folder, $EnvironmentName, $EnvironmentName)
$Environment.Create()
}
ElseIf($DeleteExistingEnvironment -and $Environment)
{
Write-Verbose "Environment $($EnvironmentName) found with $($Environment.Variables.Count) existing variables"
Write-Host "Dropping and recreating environment $($EnvironmentName) in $($FolderName)"
$Environment.Drop()
$Environment = New-Object $SsisNamespace".EnvironmentInfo" ($Folder, $EnvironmentName, $EnvironmentName)
$Environment.Create()
}
Else
{
Write-Verbose "Environment $($EnvironmentName) found with $($Environment.Variables.Count) existing variables"
}


#################################################
############### GET FILE CONTENT ################
#################################################
Write-Verbose "Reading $($ImportFilePath)"
$EnvironmentInput = Get-Content -Raw -Path $ImportFilePath | ConvertFrom-Json


#################################################
################### VARIABLES ###################
#################################################
# Keep track of number of updates and inserts
$InsertCount = 0
$UpdateCount = 0

# Loop through file content
$EnvironmentInput | Select-Object -Property Name,Description,Type,Sensitive,Value | ForEach-Object {

# Get the variable name from the JSON and try to find it in the environment
$Variable = $Environment.Variables[$_.Name]

# Make sure each variable has a value (null values are skipped)
If ([string]::IsNullOrEmpty("$($_.Value)"))
{
Write-Warning "Variable $($_.Name) skipped because it has no value"
}
else
{
# Check if the variable exists
If (-not $Variable)
{
# Insert new variable
Write-Verbose "Variable $($_.Name) added"
$Environment.Variables.Add($_.Name, $_.Type, $_.Value, $_.Sensitive, $_.Description)

$InsertCount = $InsertCount + 1
}
else
{
# Update existing variable
Write-Verbose "Variable $($_.Name) updated"
$Variable.Type = $_.Type
$Variable.Value = $_.Value
$Variable.Description = $_.Description
$Variable.Sensitive = $_.Sensitive

$UpdateCount = $UpdateCount + 1
}
}
}
$Environment.Alter()

Write-Host "Finished, total inserts $($InsertCount) and total updates $($UpdateCount)"
}

Later on I will add various extra methods, for example to test the existence of an environment, or to delete, move, copy or rename an environment, or to connect an environment to a project. Please let me know if you have any suggestions for extra functionality or improvements!
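As a preview, below is a minimal sketch of what such a test method could look like. It is an untested assumption on my part (the function name Test-CatalogEnvironment is hypothetical), reusing the same assembly loading and connection logic as the two functions above:
# PowerShell code
Function Test-CatalogEnvironment
{
    [CmdletBinding()]
    Param
    (
        [Parameter(Mandatory=$true, Position=0)]
        [string]$SsisServer,

        [Parameter(Mandatory=$true, Position=1)]
        [string]$FolderName,

        [Parameter(Mandatory=$true, Position=2)]
        [string]$EnvironmentName
    )

    # Same assembly and connection logic as in Get-CatalogEnvironment
    $SsisNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
    Add-Type -AssemblyName "$($SsisNamespace), Version=13.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91"
    $SqlConnection = New-Object System.Data.SqlClient.SqlConnection "Data Source=$($SsisServer);Initial Catalog=master;Integrated Security=SSPI;"
    $IntegrationServices = New-Object $SsisNamespace".IntegrationServices" $SqlConnection

    # Return $true when both the folder and the environment exist, otherwise $false
    $Folder = $IntegrationServices.Catalogs["SSISDB"].Folders[$FolderName]
    return [bool]($Folder -and $Folder.Environments[$EnvironmentName])
}
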
Case
For a client I need to read hundreds of bus route matrices and they all vary in size. This makes it hard to read them dynamically with a Foreach Loop Container because the number of columns differs per file. And I don't want to create hundreds of Data Flow Tasks by hand. Even BIML won't help this time, because the routes change regularly and I don't want to generate and deploy packages every day.
I need to dynamically unpivot data within the Data Flow Task. How do I solve this within SSIS?
[Screenshot: Dynamically unpivot data]

Solution
The trick for this case is to read everything as one big column and then dynamically split and unpivot the column in a Script Component Transformation. The unpivot output will always have three columns: Start Station, End Station and Distance. And the good news is that it has only a few lines of relatively easy code.
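To illustrate the idea with a made-up, semicolon-separated matrix (assuming the top-left cell is empty so that the station columns line up with the distances):

;Amsterdam;Rotterdam;Utrecht
Amsterdam;0;57;37
Rotterdam;57;0;49
Utrecht;37;49;0

The first row contains the station names and every following row starts with a start station followed by its distances. Unpivoted, this becomes three columns with one row per combination (Amsterdam-Rotterdam-57, Amsterdam-Utrecht-37, and so on): 3 * 3 = 9 rows for this small example.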
[Screenshot: The solution]

1) Source with one big column
Change your Flat File Connection Manager so that it will read everything as one big column. Make sure the column is big enough to fit all data. For this example I called the column 'ColumnOne'.
[Screenshot: Flat File with one column only]

2) Script Component Transformation Input
Drag a Script Component on the surface and choose Transformation. Connect it to your source. Then edit the Script Component and go to the 'Input Columns' page. On that page select the column with all the matrix data as ReadOnly.
[Screenshot: Input Columns]

3) Script Component Transformation Output
On the 'Inputs and Outputs' page we need to add the new output columns. For this example I need a StartStation (string), an EndStation (string) and a Distance (int).
Another important step is setting the SynchronousInputID property (of Output 0) to 'None'. This makes the transformation asynchronous, which means the number of rows in could be unequal to the number of rows out. It also means the input buffer with records isn't reused in this component; a new output buffer is created instead.
[Screenshot: Inputs and Outputs]

4) The script
Go to the Script page, choose C# as scripting language and hit the Edit Script button. Now copy the contents of my Input0_ProcessInputRow method into your Input0_ProcessInputRow method. There are also two variables, Stations and Distances, declared above this method; copy those to your code and put them in the same place.
I also removed the unused methods PreExecute, PostExecute and CreateNewOutputRows to keep the code clean and mean.
// C# code
#region Namespaces
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
#endregion

/// <summary>
/// Split and unpivot data
/// </summary>
[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
    // Define two arrays for distances and stations.
    // The Stations array will be filled only once.
    // The Distances array will change for each row.
    string[] Stations;
    string[] Distances;

    /// <summary>
    /// This method is called once for every row that passes through the component from Input0.
    /// </summary>
    /// <param name="Row">The row that is currently passing through the component</param>
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        // The first time this method executes the Stations array
        // is still empty (null). In the true clause of the if-
        // statement we will fill the Stations array.
        // Therefore the second, third, etc. time this method
        // executes we will go to the false clause of the if-
        // statement.
        if (Stations == null)
        {
            // We know that the first row contains the stations.
            // We will add those to the Stations array and use
            // it to determine the end station later on.

            // Split the string from ColumnOne on ; (or your own
            // column separator). Split returns an array.
            Stations = Row.ColumnOne.Split(';');
        }
        else
        {
            // Now the rows will contain distances (and the StartStation).
            // Split the distances on ; (or your own column separator).
            Distances = Row.ColumnOne.Split(';');

            // Loop through the Distances array, but start at 1 (not 0)
            // because item 0 contains the StartStation.
            for (int counter = 1; counter < Distances.Length; counter++)
            {
                // Add a new row and then fill the columns
                Output0Buffer.AddRow();
                // Get the distance from the Distances array and convert it to int
                Output0Buffer.Distance = Convert.ToInt32(Distances[counter]);
                // Get the start station from the Distances array (the first item)
                Output0Buffer.StartStation = Distances[0];
                // Get the end station from the Stations array
                Output0Buffer.EndStation = Stations[counter];
            }
        }
    }
}

5) The result
Now close the Script Component, add more transformations or a destination and see what the Script Component does with your data. I added a dummy Derived Column and a Data Viewer to see the data before and after the Script Component. For this file I had 27 rows and columns as input and 676 rows as output: 1 header row plus 26 data rows, where each data row yields 26 unpivoted rows (26 * 26 = 676).



Case
I created dozens of packages in my project but I forgot to change the default Protection Level in the project properties from "EncryptSensitiveWithUserKey" to "DontSaveSensitive". Now I have to change all packages one by one. Is there an alternative? I tried search and replace in the XML, but I can't find the Protection Level property.

Solution
Of course the best option is to prevent this from happening by setting the default before you start. You can do this in the properties of the project. All new packages will then inherit the Protection Level from the project.
[Screenshot: Setting Protection Level on project]

First, when trying to search and replace in the XML code of the packages, you will notice that you cannot find the default 'EncryptSensitiveWithUserKey', which makes it hard to replace.
[Screenshot: Default Protection Level is not in package]

Secondly, the Protection Level is also stored in the Visual Studio project file (*.dtproj). When you open a package in design mode and press the save button it also updates metadata in the project file.
[Screenshot: Protection Level in project file as well]

Solution A
Good old Command Prompt to the rescue! The dtutil utility can do the package conversion for you. If you are afraid of the Command Prompt, or have never even heard of it, then don't use this solution.

1) Command Prompt
Open a Command Prompt and use CD (Change Directory) command to navigate to your folder with packages.
[Screenshot: Navigate to your project folder with packages]

2) Foreach Loop Container in DOS
Now you can call the dtutil utility for each package in that folder with something similar to a Foreach Loop Container:
FOR %p IN (*.dtsx) DO dtutil.exe /file "%p" /encrypt file;"%p";0 /quiet
The command explained: /file "%p" loads each package from disk, /encrypt file;"%p";0 saves it back to the same path with ProtectionLevel 0 (DontSaveSensitive), and /quiet suppresses the confirmation prompts.

3) Execute
When you execute the command, dtutil Utility will quickly change the Protection Level of all your packages.
[Screenshot: 101 packages changed within 5 seconds. Try that in Visual Studio!]

4) Project Protection Level
If you haven't already done it, change the Protection Level in the Project Properties. See the second screenshot of this blog post.

5) dtproj file
Now the project and all its packages have the same Protection Level, but the project doesn't know that yet. If you try to execute a package it will complain about Protection Level inconsistencies.
[Screenshot: Failed to execute the package or element. Build errors were encountered.]

Error : Project consistency check failed. The following inconsistencies were detected:
 MyPackage000.dtsx has a different ProtectionLevel than the project.
 MyPackage001.dtsx has a different ProtectionLevel than the project.

To update the dtproj file you have to open all packages and then Rebuild the project. This will update the project file. Now you can execute the packages without the consistency error.
[Screenshot: Open all packages and rebuild the project]

Solution B
Good old PowerShell to the rescue! This PowerShell script does the same as above, but also changes the project file. So no manual labour at all. Because the dtutil utility was so fast, I didn't edit the packages with .net libraries. It just executes dtutil in a hidden window.

The script is thoroughly tested for SSIS 2012-2016 from 'EncryptSensitiveWithUserKey' to 'DontSaveSensitive'. Other situations require more testing. Make sure to keep a copy of your project before using this script and let me know which situations require some more attention.
[Screenshot: Change the protection level of the entire project in seconds]

#PowerShell script
################################
########## PARAMETERS ##########
################################
$projectFolder = "C:\SSIS\myProject\myProject"
$dtutilPath = "C:\Program Files\Microsoft SQL Server\130\DTS\Binn\dtutil.exe"
# The number changes per SQL Server version
# 130=2016, 120=2014, 110=2012
# Also check the drive where SQL Server is
# installed


#################################################
########## DO NOT EDIT BELOW THIS LINE ##########
#################################################
clear
Write-Host "========================================================================================="
Write-Host "== Used parameters =="
Write-Host "========================================================================================="
Write-Host "Project Folder :" $projectFolder
Write-Host "dtutil Path :" $dtutilPath
Write-Host "========================================================================================="


######################################
########## Check parameters ##########
######################################
# Test whether the paths are filled
# and exists.
if ($projectFolder -eq "")
{
Throw [System.Exception] "Project path parameter is mandatory"
}
elseif (-Not (Test-Path $projectFolder))
{
Throw [System.IO.FileNotFoundException] "Project path $($projectFolder) doesn't exists!"
}
elseif (-Not $projectFolder.EndsWith("\"))
{
# Make sure path ends with \ for command
$projectFolder = $projectFolder + "\"
}
if ($dtutilPath -eq "")
{
Throw [System.Exception] "dtutil parameter is mandatory"
}
elseif (-Not (Test-Path $dtutilPath))
{
Throw [System.IO.FileNotFoundException] "dtutil not found at $($dtutilPath)"
}


#############################################
########## dtutil for loop command ##########
#############################################
# In this script we are executing dtutil.exe
# Perhaps a bit quick & dirty, but more quick
# than dirty. It changes 100 packages within
# seconds.
$command = "/C FOR %p IN (""$($projectFolder)*.dtsx"") DO dtutil.exe /file ""%p"" /encrypt file;""%p"";0 /quiet"
Write-Host "Editing packages in $($projectFolder)... " -NoNewline

# Open the command prompt (hidden) and execute
# dtutil.exe with the parameters from above.
Start-Process "C:\Windows\System32\cmd.exe" -ArgumentList $command -WindowStyle Hidden -Wait
Write-Host "Done."


##########################################
########## Editing project file ##########
##########################################
# Find the project file. There should be
# only one dtproj file.
$projectFile = get-childitem $projectFolder -name -filter *.dtproj
Write-Host "Editing project file $($projectFile)... " -NoNewline

# Edit the project file and replace the
# protection level. The first replace is for
# all the packages and the second replace
# is for the project itself. It uses a
# regular expression for the replace.
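# Illustrative fragments matching the two replace patterns above
# (an assumption on my part; the exact XML differs per Visual Studio
# version, so check your own dtproj file):
#   ...ProtectionLevel">1</...  becomes  ...ProtectionLevel">0</...
#   ProtectionLevel="EncryptSensitiveWithUserKey"
#   becomes ProtectionLevel="DontSaveSensitive"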
$projectFilePath = Join-Path -Path $projectFolder -ChildPath $projectFile
(Get-Content $projectFilePath) -replace 'ProtectionLevel">[0-9]', 'ProtectionLevel">0' -replace 'ProtectionLevel="[A-Za-z]*"', 'ProtectionLevel="DontSaveSensitive"' | Set-Content $projectFilePath
Write-Host "Done."

##############################
########## Finished ##########
##############################
# Finished editing packages and project file
Write-Host "Finished editing $($projectFile) and $((get-childitem $projectFolder -name -filter *.dtsx).Count) packages" -ForegroundColor Magenta



Case
I have a CSV file with numeric values that use a dot "." as decimal separator instead of the comma "," we use locally. When I try to import it in SSIS with a Flat File Source it gives me an error. I don't want to/can't change the regional settings on the server. How do I import this flat file without errors?
[Screenshot: The value could not be converted because of a potential loss of data - 10.5 should be 10,5 (or vice versa)]

Error: 0xC02020A1 at DFT - Process Data, FF_SRC - myCsvFile [2]: Data conversion failed. The data conversion for column "myColumn" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
Error: 0xC0209029 at DFT - Process Data, FF_SRC - myCsvFile [2]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "FF_SRC - myCsvFile.Outputs[Flat File Source Output].Columns[myColumn]" failed because error code 0xC0209084 occurred, and the error row disposition on "FF_SRC - myCsvFile.Outputs[Flat File Source Output].Columns[myColumn]" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.
Error: 0xC0202092 at DFT - Process Data, FF_SRC - myCsvFile [2]: An error occurred while processing file "D:\myFolder\2016-12-27.csv" on data row 2.
Error: 0xC0047038 at DFT - Process Data, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on FF_SRC - myCsvFile returned error code 0xC0202092.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.


Solution
You can change the LocaleID of the connection manager to import this file. Right click the connection manager and choose Properties...
[Screenshot: Go to properties of flat file connection manager]

Then locate the LocaleID property and change it to English (United States), English (United Kingdom) or another country that uses a dot "." as decimal separator. Or change it to, for example, Dutch (Netherlands) if you have the opposite problem.
[Screenshot: Change LocaleID]

Now run the package again to see the result.
[Screenshot: Success]

In 2006 Jamie Thomson came up with naming conventions for SSIS tasks and data flow components. These naming conventions make your packages and logs more readable. Five SQL Server versions and a decade later, a couple of tasks and components have been deprecated, but Microsoft also introduced a lot of new tasks and components.

Together with Koen Verbeeck (B|T) and André Kamman (B|T) we extended the existing list with almost 40 tasks/components and created a PowerShell script that should make it easier to check/force the naming conventions. This PowerShell script will soon be published on GitHub as a PowerShell module, but for now you can download and test the fully working proof of concept script. Download both ps1 files and the CSV file, then open "naming conventions v4.ps1" and change the parameters before executing it. The script works with local packages, because you can't read individual packages from the catalog, but you can use a PowerShell script to download your packages from the catalog.
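A minimal sketch of that download idea (untested here; it assumes the same Microsoft.SqlServer.Management.IntegrationServices assembly used elsewhere on this blog and saves each project as an .ispac file, which is a zip file containing the .dtsx packages):
# PowerShell code
$SsisNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
Add-Type -AssemblyName "$($SsisNamespace), Version=13.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91"

# Connect to the server and the SSISDB catalog
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection "Data Source=mySqlServer\myInstance;Initial Catalog=master;Integrated Security=SSPI;"
$Catalog = (New-Object $SsisNamespace".IntegrationServices" $SqlConnection).Catalogs["SSISDB"]

# Save each project as an ispac file; unzip it to get the packages
Foreach ($Folder in $Catalog.Folders)
{
    Foreach ($Project in $Folder.Projects)
    {
        [System.IO.File]::WriteAllBytes("C:\temp\$($Folder.Name).$($Project.Name).ispac", $Project.GetProjectBytes())
    }
}
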
[Screenshot: PowerShell Naming Conventions Checker]

Task name | Prefix | Type | New
For Loop Container | FLC | Container
Foreach Loop Container | FELC | Container
Sequence Container | SEQC | Container
ActiveX Script | AXS | Task
Analysis Services Execute DDL Task | ASE | Task
Analysis Services Processing Task | ASP | Task
Azure Blob Download Task | ADT | Task | *
Azure Blob Upload Task | AUT | Task | *
Azure HDInsight Create Cluster Task | ACCT | Task | *
Azure HDInsight Delete Cluster Task | ACDT | Task | *
Azure HDInsight Hive Task | AHT | Task | *
Azure HDInsight Pig Task | APT | Task | *
Back Up Database Task | BACKUP | Task | *
Bulk Insert Task | BLK | Task
CDC Control Task | CDC | Task | *
Check Database Integrity Task | CHECKDB | Task | *
Data Flow Task | DFT | Task
Data Mining Query Task | DMQ | Task
Data Profiling Task | DPT | Task | *
Execute Package Task | EPT | Task
Execute Process Task | EPR | Task
Execute SQL Server Agent Job Task | AGENT | Task | *
Execute SQL Task | SQL | Task
Execute T-SQL Statement Task | TSQL | Task | *
Expression Task | EXPR | Task
File System Task | FSYS | Task
FTP Task | FTP | Task
Hadoop File System Task | HFSYS | Task | *
Hadoop Hive Task | HIVE | Task | *
Hadoop Pig Task | PIG | Task | *
History Cleanup Task | HISTCT | Task | *
Maintenance Cleanup Task | MAINCT | Task | *
Message Queue Task | MSMQ | Task
Notify Operator Task | NOT | Task | *
Rebuild Index Task | REBIT | Task | *
Reorganize Index Task | REOIT | Task | *
Script Task | SCR | Task
Send Mail Task | SMT | Task
Shrink Database Task | SHRINKDB | Task | *
Transfer Database Task | TDB | Task
Transfer Error Messages Task | TEM | Task
Transfer Jobs Task | TJT | Task
Transfer Logins Task | TLT | Task
Transfer Master Stored Procedures Task | TSP | Task
Transfer SQL Server Objects Task | TSO | Task
Update Statistics Task | STAT | Task | *
Web Service Task | WST | Task
WMI Data Reader Task | WMID | Task
WMI Event Watcher Task | WMIE | Task
XML Task | XML | Task

Transformation name | Prefix | Type | New
ADO NET Source | ADO_SRC | Source | *
Azure Blob Source | AB_SRC | Source | *
CDC Source | CDC_SRC | Source | *
DataReader Source | DR_SRC | Source
Excel Source | EX_SRC | Source
Flat File Source | FF_SRC | Source
HDFS File Source | HDFS_SRC | Source | *
OData Source | ODATA_SRC | Source | *
ODBC Source | ODBC_SRC | Source | *
OLE DB Source | OLE_SRC | Source
Raw File Source | RF_SRC | Source
SharePoint List Source | SPL_SRC | Source
XML Source | XML_SRC | Source
Aggregate | AGG | Transformation
Audit | AUD | Transformation
Balanced Data Distributor | BDD | Transformation | *
Cache Transform | CCH | Transformation | *
CDC Splitter | CDCS | Transformation | *
Character Map | CHM | Transformation
Conditional Split | CSPL | Transformation
Copy Column | CPYC | Transformation
Data Conversion | DCNV | Transformation
Data Mining Query | DMQ | Transformation
Derived Column | DER | Transformation
DQS Cleansing | DQSC | Transformation | *
Export Column | EXPC | Transformation
Fuzzy Grouping | FZG | Transformation
Fuzzy Lookup | FZL | Transformation
Import Column | IMPC | Transformation
Lookup | LKP | Transformation
Merge | MRG | Transformation
Merge Join | MRGJ | Transformation
Multicast | MLT | Transformation
OLE DB Command | CMD | Transformation
Percentage Sampling | PSMP | Transformation
Pivot | PVT | Transformation
Row Count | CNT | Transformation
Row Sampling | RSMP | Transformation
Script Component | SCR | Transformation
Slowly Changing Dimension | SCD | Transformation
Sort | SRT | Transformation
Term Extraction | TEX | Transformation
Term Lookup | TEL | Transformation
Union All | ALL | Transformation
Unpivot | UPVT | Transformation
ADO NET Destination | ADO_DST | Destination | *
Azure Blob Destination | AB_DST | Destination | *
Data Mining Model Training | DMMT_DST | Destination
Data Streaming Destination | DS_DST | Destination | *
DataReaderDest | DR_DST | Destination
Dimension Processing | DP_DST | Destination
Excel Destination | EX_DST | Destination
Flat File Destination | FF_DST | Destination
HDFS File Destination | HDFS_DST | Destination | *
ODBC Destination | ODBC_DST | Destination | *
OLE DB Destination | OLE_DST | Destination
Partition Processing | PP_DST | Destination
Raw File Destination | RF_DST | Destination
Recordset Destination | RS_DST | Destination
SharePoint List Destination | SPL_DST | Destination
SQL Server Compact Destination | SSC_DST | Destination | *
SQL Server Destination | SS_DST | Destination


[Screenshot: Example of the prefixes]

Case
I used PowerShell to deploy my SSIS project to the Catalog. Can I also automatically create a SQL Server Agent job with an SSIS jobstep?
[Screenshot: SQL Agent Job for SSIS package]

Solution
Yes, almost every Microsoft product supports PowerShell and SQL Server Agent is no exception. Only the SSIS specific part of the jobstep seems to be a little more difficult to handle. So for this example I first created a SQL Server Agent job(step) for an SSIS package manually in SSMS and then scripted it to see the jobstep command. This command is a long string with all the SSIS specific information like the package path, environment and parameters. Below you see parts of the generated T-SQL script. We are interested in the part behind @command=:
/****** Object:  Step [PowerShell and SSIS - Master.dtsx]    Script Date: 25-10-2016 22:30:35 ******/
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'PowerShell and SSIS - Master.dtsx',
@step_id=1,
@cmdexec_success_code=0,
@on_success_action=1,
@on_success_step_id=0,
@on_fail_action=2,
@on_fail_step_id=0,
@retry_attempts=0,
@retry_interval=0,
@os_run_priority=0, @subsystem=N'SSIS',
@command=N'/ISSERVER "\"\SSISDB\Finance\PowerShell and SSIS\Master.dtsx\"" /SERVER "\"MyServer\MSSQLSERVER2016\"" /Par "\"$ServerOption::LOGGING_LEVEL(Int16)\"";1 /Par "\"$ServerOption::SYNCHRONIZED(Boolean)\"";True /CALLERINFO SQLAGENT /REPORTING E',
@database_name=N'MyServer\MSSQLSERVER2016',
@flags=0


This command string is used in the PowerShell script below, but the hardcoded parts are replaced with values from the PowerShell parameters (see the $Command block below). The rest of the script is more straightforward and easy to adjust or extend. If you're not sure how to adjust the script, first take a look at the T-SQL script, which has similar steps and the same properties to set.


#PowerShell SSIS JobStep
################################
########## PARAMETERS ##########
################################
# Destination
$SsisServer = "MyServer\MSSQLSERVER2016"
$FolderName = "Finance"
$ProjectName = "PowerShell and SSIS"

# Job
$JobName = "Load DWH"
$MasterPackage = "Master.dtsx"
$JobStartTime = New-TimeSpan -hours 6 -minutes 30

clear
Write-Host "========================================================================================="
Write-Host "== Used parameters =="
Write-Host "========================================================================================="
Write-Host "SSIS Server : " $SsisServer
Write-Host "FolderName : " $FolderName
Write-Host "ProjectName : " $ProjectName
Write-Host "Job name : " $JobName
Write-Host "MasterPackage : " $MasterPackage
Write-Host "ScheduleTime : " $JobStartTime
Write-Host "========================================================================================="
Write-Host ""


# Reference the SMO assembly and connect to the SQL Server instance
# Check the number in the path which is different for each version
Add-Type -Path 'C:\Program Files\Microsoft SQL Server\130\SDK\Assemblies\Microsoft.SqlServer.Smo.dll'
$SQLSvr = New-Object -TypeName Microsoft.SQLServer.Management.Smo.Server($SsisServer)

# Check if job already exists. Then fail, rename or drop
$SQLJob = $SQLSvr.JobServer.Jobs[$JobName]
if ($SQLJob)
{
# Use one of these 3 options to handle existing jobs

# Fail:
#Throw [System.Exception] "Job with name '$JobName' already exists."

# Rename:
Write-Host "Job with name '$JobName' found, renaming and disabling it"
$SQLJob.Rename($SQLJob.Name +"_OLD_" + (Get-Date -f MM-dd-yyyy_HH_mm_ss))
$SQLJob.IsEnabled = $false
$SQLJob.Alter()

# Drop:
#Write-Host "Job with name $JobName found, removing it"
#$SQLJob.Drop()
}


#Create new (empty) job
$SQLJob = New-Object -TypeName Microsoft.SqlServer.Management.SMO.Agent.Job -argumentlist $SQLSvr.JobServer, $JobName
$SQLJob.OwnerLoginName = "SA"
$SQLJob.Create()
Write-Host "Job '$JobName' created"


# Command of jobstep
# This string is copied from T-SQL, by scripting a job(step) in SSMS
# Then replace the hardcode strings with [NAME] to replace them with variables
$Command = @'
/ISSERVER "\"\SSISDB\[FOLDER]\[PROJECT]\[PACKAGE]\"" /SERVER "\"[INSTANCE]\"" /Par "\"$ServerOption::LOGGING_LEVEL(Int16)\"";1 /Par "\"$ServerOption::SYNCHRONIZED(Boolean)\"";True /CALLERINFO SQLAGENT /REPORTING E
'@
$Command = $Command.Replace("[FOLDER]", $FolderName)
$Command = $Command.Replace("[PROJECT]", $ProjectName)
$Command = $Command.Replace("[PACKAGE]", $MasterPackage)
$Command = $Command.Replace("[INSTANCE]", $SsisServer)


# Create new SSIS job step with command from previous block
$SQLJobStep = New-Object -TypeName Microsoft.SqlServer.Management.SMO.Agent.JobStep -argumentlist $SQLJob, "$ProjectName - $MasterPackage"
$SQLJobStep.OnSuccessAction = [Microsoft.SqlServer.Management.Smo.Agent.StepCompletionAction]::QuitWithSuccess
$SQLJobStep.OnFailAction = [Microsoft.SqlServer.Management.Smo.Agent.StepCompletionAction]::QuitWithFailure
$SQLJobStep.SubSystem = "SSIS"
$SQLJobStep.DatabaseName = $SsisServer
$SQLJobStep.Command = $Command
$SQLJobStep.Create()
Write-Host "Jobstep $SQLJobStep created"


# Create a daily schedule
$SQLJobSchedule = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Agent.JobSchedule -ArgumentList $SQLJob, "Daily $JobStartTime"
$SQLJobSchedule.IsEnabled = $true
$SQLJobSchedule.FrequencyTypes = [Microsoft.SqlServer.Management.SMO.Agent.FrequencyTypes]::Daily
$SQLJobSchedule.FrequencyInterval = 1 # Recurs Every Day
$SQLJobSchedule.ActiveStartDate = Get-Date
$SQLJobSchedule.ActiveStartTimeofDay = $JobStartTime
$SQLJobSchedule.Create()
Write-Host "Jobschedule $SQLJobSchedule created"


# Apply to target server which can only be done after the job is created
$SQLJob.ApplyToTargetServer("(local)")
$SQLJob.Alter()
Write-Host "Job '$JobName' saved"


You could combine this with the deploy script to handle the complete SSIS deployment in one script.
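For example (hypothetical file names; this assumes you saved the deployment script and the script above as separate files in the same folder):
# PowerShell code
# First deploy the project, then create the SQL Server Agent job for it
. "$PSScriptRoot\DeploySsisProject.ps1"
. "$PSScriptRoot\CreateSsisAgentJob.ps1"
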
SSIS Appetizer
I'm not sure I have a purpose for this, but did you know that you can use the cache file of the Cache Transformation (introduced in SSIS 2008) as a source file in the Raw File Source?

Demo
For this demo I use two Data Flow Tasks. The first creates the cache file and the second one uses it as a source.
[Screenshot: Two Data Flow Tasks]

1) Create Cache
The first Data Flow has a random source (a flat file in this case) and a Cache Transformation named "CTR - Create Cache" as a destination. When you create the Cache Connection Manager, make sure to check "Use file cache" and provide a file path for the cache file. Copy that path for the next step.
[Screenshot: The Cache Transformation and Connection Manager]

2) Read Cache
The second Data Flow Task uses a Raw File Source. In the editor you can specify the location of the raw file. Paste the path from the Cache Connection Manager (a .caw file). For demonstration purposes I added a dummy Derived Column behind it with a Data Viewer on the path between them. Now run the package and see the result. You will get some hash columns 'for free'.
[Screenshot: Raw File Source]

Please let me know in the comments if you found a good purpose for this.

Note: you can't use a raw file as a cache file unless you're able to add the extra hash columns as well.