If you're looking for SSIS interview questions for experienced professionals or freshers, you are at the right place. There are plenty of opportunities at reputed companies around the world, and research suggests SSIS holds a market share of about 24.5%, so you still have the opportunity to move ahead in your SSIS career. Mindmajix offers advanced SQL Server technical interview questions for 2023 that help you crack your interview and acquire your dream career as an SSIS developer.
We have categorized the SSIS Interview Questions - 2023 (Updated); the most frequently asked ones are:
1. What are the Differences between SSIS and Informatica?
2. What is the SSIS package and what does it do?
3. What is the Difference between Execute TSQL Task and Execute SQL Task?
4. What is the use of a config file in SSIS?
5. What kind of containers can you use with SSIS packages?
6. What is the Manifest file in SSIS?
7. What is the difference between Merge and Union All?
8. How to back up or retrieve the SSIS packages?
9. What is the Difference between Control Flow and Data Flow?
10. What is the Data Profiling task?
Ans: SSIS was first introduced with SQL Server 2005, the next generation of SQL Server software after SQL Server 2000. SSIS is an ETL tool that handles data extraction, transformation, and loading. In addition, it can handle tasks beyond ETL, such as data profiling and file system manipulation. If you know C#.NET or VB.NET, you can use the Script Task to perform much more efficient batch operations.
If you want to enrich your career and become a certified professional in SSIS, then enroll in "SSIS Online Training". This course will help you to achieve excellence in this domain.
Ans: The SSIS control flow allows you to program graphically how the tasks will run by using logical connectors between tasks. There are three basic logical connectors that you can use: success, failure, or completion. Furthermore, you can use the fx (expression) option to handle more complex conditions in your control flow.
Related Article: SSIS Control Flow
Ans: The data transformation stage applies a series of rules or functions to the extracted data from the source to derive the data for loading into the end target.
Ans: You can create global variables and task-level variables in SSIS. For programmers, these variables are the same as global and function-level variables. A global variable is available to all tasks across the entire job. Variables created in tasks are only available within that task.
Ans: There are three types of containers: the Sequence container, the For Loop container, and the Foreach Loop container.
A sequence container is a simple way to group similar tasks together. Think of a sequence container as an organization container for more complex SSIS packages.
A For Loop container is what you can use to execute your tasks a certain number of times. For example, if you need to update records ten times, you can place the task that updates the records inside the For Loop container and specify 10 as the end of the loop. By using the For Loop container, you don't have to create ten different packages to do the same task or run the whole package ten times when you schedule your job.
A Foreach Loop container is useful when you don't know ahead of time how many times a task should run. For instance, say you want to delete all the files inside a folder but don't know how many files are there at any particular time. Using a Foreach Loop, the package can go through the collection of files and delete them; once the collection has been emptied out, it knows to stop.
Ans: Deployment is the process of moving packages from development into an executable form. To deploy an SSIS package, you can right-click the Integration Services project and build it.
This saves the package .dtsx file in the project's bin folder. Alternatively, you can create a deployment utility, with which the package can be deployed either to SQL Server or as a file in any location.
To create a deployment utility, follow these steps: right-click the project and choose Properties; on the Deployment Utility page, set CreateDeploymentUtility to True; then build the project. The bin\Deployment folder will then contain an .SSISDeploymentManifest file that launches the Package Installation Wizard.
Ans:
Feature | SSIS | Informatica
---|---|---
Security | Secure to use | Secure to use
Administration & Maintenance | Easy to use | Difficult to maintain
Ease of Use | Easy to implement SSIS | Moderate
Performance | Performs well for medium SQL Server environments | Performs well for medium SQL Server environments
Productivity | Moderate | High while implementing large applications
Product Maturity | Younger than Informatica | Older than SSIS
Cost | No cost | Expensive
Ans: The manifest file is the utility that can be used to deploy the package, via the Package Installation Wizard, to either the file system or a SQL Server database.
Ans: File system deployment means saving package files on a local or network drive. Then you can use the SQL Agent job to schedule when the packages will run.
Ans: If your packages are deployed to SQL Server, you can back up the msdb database, because packages deployed to SQL Server are stored in msdb.
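For example, a minimal sketch of backing up msdb (the file path is a placeholder); individual packages can also be exported from msdb back to .dtsx files with the dtutil utility:

BACKUP DATABASE msdb TO DISK = 'C:\Backups\msdb.bak';
-- hypothetical example: copy a single package out of msdb to a file
-- dtutil /SQL MyPackage /COPY FILE;C:\Backups\MyPackage.dtsx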
Ans: Data flow task is the pumping mechanism that moves data from source to destination. However, in the case of SSIS, you have much more control over what happens from start to finish. In fact, you have a set of out-of-the-box transformation components that you snap together to clean and manipulate the data while it is in the data pipeline.
Just as the control flow handles the main workflow of the package, the Data Flow handles the transformation of data. Almost anything that manipulates data falls into the Data Flow Category.
Ans: Data profiling is the process of analyzing the source data to better understand what condition it is in, in terms of cleanliness, patterns, number of nulls, and so on. The Data Profiling task is usually used at the beginning of the development cycle to support the design of the destination database schema. Note that this task is not typically used in the normal, recurring ETL packages.
Ans: The Multicast transform, as the name implies, can send a single data input to multiple output paths easily. You may want to use this transformation to send a path to multiple destinations sliced in different ways. The Multicast transformation is similar to the Conditional Split transformation in that both send data to multiple outputs; however, with Multicast you cannot specify conditions for which part of the data goes to which output.
Ans: The Merge transformation can merge data from two paths into a single output. This transform is useful when you wish to break your Data Flow into a path that handles certain errors and then merge it back into the main Data Flow downstream after the errors have been handled. It's also useful if you wish to merge data from two data sources.
Note that the data must be sorted before using the Merge transformation. You can do this by using the Sort transformation prior to the merge or by specifying an ORDER BY clause in the source connection. Also, the metadata must be the same for both paths. For example, the CustomerID column cannot be a numeric column in one path and a character column in the other path.
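As a minimal sketch (hypothetical table and column names), a source query that satisfies this sort requirement could look like the one below; note that you must also mark the source output as sorted (IsSorted = True and SortKeyPosition on the key column in the Advanced Editor) before the Merge transformation will accept it:

SELECT CustomerID, CustomerName
FROM dbo.Customers
ORDER BY CustomerID;  -- sorted on the column used as the merge key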
The Union All Transformation works much the same way as the Merge Transformation, but it does not require the data to be sorted. It takes the outputs from multiple sources or transforms and combines them into a single result set.
Ans: The OLE DB Command Transform is a component designed to execute a SQL statement for each row in an input stream. This task is analogous to an ADO Command object being created, prepared, and executed for each row of a result set. The input stream provides the data for parameters that can be set into the SQL statement, which is either an inline statement or a stored procedure call.
Ans: The Execute Package Task enables you to build SSIS solutions called parent packages that execute other packages called “Child Packages”. You’ll find this capability an indispensable part of your SSIS development as your packages begin to grow.
Separating packages into discrete functional workflows makes for shorter development and testing cycles and facilitates best development practices. In SSIS, the child package is aware of the parent package's variables and can reach out and access those parent-level configuration values. The majority of the configurable properties are in the Package tab of the Execute Package Task Editor.
The first option is to provide the location of the child package. The choices here are either the file system or SQL Server. You can deploy the SSIS package on the file system as a .dtsx file or within the msdb database of a SQL Server instance. If you select the file system, you must first create a new File Connection Manager pointing to the child package.
If the child package is located in a SQL Server, you’ll need to provide the OLE DB Connection Manager for the SQL Server that holds your packages. In either case, browse to and then select the child package within the connection to set the package to execute in the task.
Ans: Packages use transactions to bind the database actions that tasks perform into atomic units, and by doing this they maintain data integrity. All SSIS container types – packages, the For Loop, Foreach Loop, and Sequence containers, and the task hosts that encapsulate each task – can be configured to use transactions. SSIS provides three options for configuring transactions: Not Supported, Supported, and Required.
Required indicates that the container starts a transaction unless one is already started by its parent container; if a transaction already exists, the container joins it. For example, if a package that is not configured to support transactions includes a Sequence container that uses the Required option, the Sequence container starts its own transaction. If the package were configured to use the Required option, the Sequence container would join the package transaction.
Supported indicates that the container does not start a transaction but joins any transaction started by its parent container. For example, if a package with four Execute SQL Tasks starts a transaction and all four tasks use the Supported option, the database updates performed by the Execute SQL Tasks are rolled back if any task fails. If the package does not start a transaction, the four Execute SQL Tasks are not bound by a transaction, and no database updates except the ones performed by the failed task are rolled back.
Not Supported indicates that the container does not start a transaction or join an existing transaction. A transaction started by a parent container does not affect child containers that have been configured not to support transactions. For instance, if a package is configured to start a transaction and a For Loop container in the package uses the Not Supported option, none of the tasks in the For Loop can roll back if they fail.
Ans: In SSIS there is an Execute T-SQL Statement Task that is similar to the Execute SQL Task. The main differences between the two are:
Execute T-SQL Statement Task: takes less memory, parse time, and CPU time, but supports only the T-SQL dialect and only an ADO.NET connection manager; it does not support parameterized queries, saving result sets to variables, or property expressions.
Execute SQL Task: supports many connection types (OLE DB, ODBC, ADO, ADO.NET, Excel), parameter mapping, capturing result sets into variables, and SQL statements written in dialects other than T-SQL.
Ans: A task will only execute if the condition that is set by the precedence constraint preceding the task is met. By using these constraints, it will choose different execution paths depending on the success or failure of other tasks.
Ans: A configuration file in SSIS is used to supply values, such as connection manager properties, that the package assigns dynamically at runtime.
With a configuration file, users change only the config file, and the package picks up the new values automatically at runtime; you don't have to edit the package itself every time, which is especially useful when deploying the package to multiple servers or locations. There are multiple ways in which configuration values can be stored.
XML configuration file: stores the configuration values in an XML file. The other supported configuration types are an environment variable, a registry entry, a parent package variable, and a SQL Server table.
Ans:
All process-based tasks are part of the control flow, while ETL-related tasks are part of the data flow, which is itself a subset of the control flow.
Ans: A checkpoint is a property in SSIS that enables a package to restart from the point of failure. When the property is enabled, the package creates a checkpoint file that stores information about package execution and is used to restart the package from the point of failure. If the package runs successfully, the checkpoint file is deleted and then re-created the next time the package runs.
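As a sketch, the properties typically involved in enabling checkpoints are:

CheckpointFileName = path to the checkpoint file (package property)
SaveCheckpoints = True (package property)
CheckpointUsage = IfExists (package property)
FailPackageOnFailure = True (on each task or container that should act as a restart point)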
Ans: You need dtexec to run SSIS packages from the command line.
In SQL Server 2005 Express, you get dtexec and the necessary tools with SQL Server 2005 Express Edition with Advanced Services or with the Microsoft SQL Server 2005 Express Edition Toolkit.
However, without SQL Agent you have to use other scheduling methods.
Caveat: I haven’t actually tried this and there are reports that dtexec just fails with a license error. In my case, I just needed to run a dtsx once and did it with Developer edition with SQLExpress as a data flow destination.
Ans: I am running an Execute SQL Task statement in my SSIS package. The Execute SQL Task is running SQL and checking that the tables have more than 1000 rows. If they have less than 1000 rows, I want to fail the package.
Ans: AFAIK, tasks in SSIS fail on error. So if your Execute SQL Task has a statement like so in it:
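For instance, a minimal sketch of such a statement (dbo.MyTable is a placeholder for your own table); RAISERROR with severity 16 makes the Execute SQL Task fail, which in turn fails the package under the default error-handling settings:

IF (SELECT COUNT(*) FROM dbo.MyTable) < 1000
    RAISERROR('Fewer than 1000 rows found - failing the package.', 16, 1);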
You should get the results you want.
Ans: FYI, I’ve just had the same error.
I switched to Windows authentication, disconnected, then tried to log in again with SQL authentication. This time I was told my password had expired. I changed the password and it all worked again.
Ans: SSIS (SQL Server Integration Services) is an upgrade of DTS (Data Transformation Services), which is a feature of the previous version of SQL Server. SSIS packages can be created in BIDS (Business Intelligence Development Studio). These can be used to merge data from heterogeneous data sources into SQL Server. They can also be used to populate data warehouses, to clean and standardize data, and to automate administrative tasks.
SQL Server Integration Services (SSIS) is a component of Microsoft SQL Server 2005. It replaces Data Transformation Services, which has been a feature of SQL Server since Version 7.0. Unlike DTS, which was included in all versions, SSIS is only available in the “Standard” and “Enterprise” editions.
Integration Services provides a platform to build data integration and workflow applications. The primary use of SSIS is data warehousing, as the product features a fast and flexible tool for data extraction, transformation, and loading (ETL). The tool may also be used to automate the maintenance of SQL Server databases, update multidimensional cube data, and perform other functions.
Ans: I have an SSIS package that has a child package that is failing. The error message isn’t very helpful.
The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020
The problem seems to be I am running out of Virtual Memory to complete the job.
I found a forum thread that may help solve the problem.
From the solutions offered, I am unsure though how to:
Increase the default buffer size
Allocate a child package its own memory space.
I am running the package daily in SQL Server 2005. It was running fine daily up until the 12th. I am assuming the data file that we are using to import data into the database grew to a size that was too large for the database to handle. It's only an 8.90 MB CSV file, though. The import is a straight column-to-column import.
The problem child package is in step 1; it fails, and the job then continues and successfully completes the next 8 steps.
Ans: The reason I ask is that SSIS memory is allocated from the MemToLeave area, the region of memory that sits outside of the SQL Server buffer pool.
See HERE for details on configuring the amount of memory available to MemToLeave.
For generic performance tuning of SSIS consult the following article.
Integration Services: Performance Tuning Techniques
I hope this makes sense but feel free to drop me a line once you have digested the material.
Ans: If you have access to SQL Server Agent through SQL Server Management Studio, here are the steps to create a job using the graphical user interface. The steps show how to create a SQL job to run SSIS under the SQL Agent service account and also how to create a proxy to run it under different credentials.
I hope that helps.
Ans: VARCHAR(MAX) column values will be stored in the table row, space permitting. So if you have a single VARCHAR(MAX) field and its value is 200 or 300 bytes, chances are it will be stored inline with the rest of your data, with no problem or additional overhead.
Only when the entire data of a single row cannot fit on a single SQL Server page (8K) anymore, only then will SQL Server move VARCHAR(MAX) data into overflow pages.
So all in all, I think you get the best of both worlds – inline storage when possible, overflow storage when necessary.
Ans: That answer points to this article: SSIS package problem
Here are the proposed solutions – have you evaluated them?
Method 1: Use a SQL Server Agent proxy account. Create a SQL Server Agent proxy account. This proxy account must use a credential that lets SQL Server Agent run the job as the account that created the package or as an account that has the required permissions.
This method works to decrypt secrets and satisfies the key requirements by the user. However, this method may have limited success because the SSIS package user keys involve the current user and the current computer. Therefore, if you move the package to another computer, this method may still fail, even if the job step uses the correct proxy account.
Method 2: Set the SSIS package ProtectionLevel property to ServerStorage. This setting stores the package in a SQL Server database and allows access control through SQL Server database roles.
Method 3: Set the SSIS package ProtectionLevel property to EncryptSensitiveWithPassword. This setting uses a password for encryption. You can then modify the SQL Server Agent job step command line to include this password.
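For example, a sketch of such a command line (the path, package name, and password are placeholders; /F points dtexec at the package file and /De supplies the decryption password):

dtexec /F "D:\Packages\LoadSales.dtsx" /De "MyPackagePassword"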
Method 4: Use SSIS Package configuration files to store sensitive information, and then store these configuration files in a secured folder. You can then change the ProtectionLevel property to DontSaveSensitive so that the package is not encrypted and does not try to save secrets to the package.
When you run the SSIS package, the required information is loaded from the configuration file. Make sure that the configuration files are adequately protected if they contain sensitive information.
Method 5: Create a package template. For a long-term resolution, create a package template that uses a protection level that differs from the default setting. This problem will then not occur in future packages.
Ans: If your ETL is mostly E and L, with very little T, and if you can write your SPs so they don’t rely on cursors, then going the SP-only route is probably fine.
For more complex processes, particularly those that involve heavy transforms, slowly changing dimensions, data mining lookups, etc, SSIS has three advantages.
First, it manages memory very efficiently, which can result in big performance improvements compared to T-SQL alone.
Second, the graphical interface lets you build large, complex, and reliable transformations much more easily than hand-crafted T-SQL.
And third, SSIS lets you more easily interact with additional external sources, which can be very handy for things like data cleansing.
Ans: I am copying some user data from one SqlServer to another. Call them Alpha and Beta. The SSIS package runs on Beta and it gets the rows on Alpha that meet a certain condition. The package then adds the rows to Beta’s table. Pretty simple and that works great.
The problem is that I only want to add new rows into Beta. Normally I would just do something simple like….
INSERT INTO BetaPeople
SELECT * From AlphaPeople
WHERE ID NOT IN (SELECT ID FROM BetaPeople)
But this doesn't work in an SSIS package. At least I don't know how, and that is the point of this question.
Ans: Your example seems simple; it looks like you are adding only new people, not looking for changed data in existing records. In this case, store the last transferred ID in the database.
CREATE TABLE dbo.LAST (RW int, LastID Int)
go
INSERT INTO dbo.LAST (RW, LastID) VALUES (1,0)
Now you can use this to record the ID of the last row transferred.
UPDATE dbo.LAST SET LastID = @myLastID WHERE RW = 1
When configuring the OLE DB source, set the data access mode to SQL command and use:
DECLARE @Last int
SET @Last = (SELECT LastID FROM dbo.LAST WHERE RW = 1)
SELECT * FROM AlphaPeople WHERE ID > @Last;
Note, I do assume that you are using ID int IDENTITY for your PK.
If you have to monitor for data changes of existing records, then have the “last changed” column in every table, and store the time of the last transfer.
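A minimal sketch of that approach (hypothetical table and column names; @LastTransfer holds the stored time of the last transfer):

SELECT *
FROM AlphaPeople
WHERE LastChanged > @LastTransfer;  -- rows added or modified since the last run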
A different technique would involve setting up a linked server on Beta to Alpha and running your example without using SSIS. I would expect this to be way slower and more resource-intensive than the SSIS solution.
INSERT INTO dbo.BetaPeople
SELECT * FROM [Alpha].[myDB].[dbo].[AlphaPeople]
WHERE ID NOT IN (SELECT ID FROM dbo.BetaPeople)
Ans: There is a 4000-character limit to the DT_WSTR (Unicode string) data type used by SSIS. Since expressions are converted to DT_WSTR, they also cannot exceed 4000 characters.
You’ll get an error if you exceed the limit at design time. However, you will not receive any obvious warnings or errors if you do so at run time. The operation will fail silently, and the expression may be evaluated incorrectly.
Note: this does not mean that strings or SSIS variables cannot exceed 4000 characters. The limitation only applies to expressions and other places that DT_WSTR is used.
Here’s an example that reproduces the issue:
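(A sketch with a hypothetical variable name: suppose a String variable User::LongText is filled, for example by a Script Task, with a value longer than 4,000 characters, and a precedence constraint uses the expression below.)

@[User::LongText] != ""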
Even though the variable is not empty, the constraint will incorrectly evaluate to False because the expression exceeds 4000 characters. However, this will not cause any errors and the package will not fail.
If you notice strange behavior in your SSIS logic, take a look at your expressions and ensure that they don’t exceed 4000 characters when evaluated.
Ans: I have a SQL Task that needs to run a simple update to update a single row.
I have set the SQLStatement to:
update agency set AgencyLastBatchSeqNo = ? where agencyID = ?
On the Parameter Mapping page, I have set Parameter 0 and Parameter 1 to variables that I know contain the right values. I have also set the Parameter Name values correctly.
In the database, the column AgencyLastBatchSeqNo is an int, AgencyID is a big int. Does anyone have a reference to find what the data types map to in SSIS? I have guessed at SHORT for the int and LONG for the big int.
When I run the task I get the following error:
[Execute SQL Task] Error: Executing the query “update agency set AgencyLastBatchSeqNo = ? where AgencyID = ?” failed with the following error: “Parameter name is unrecognized.”. Possible failure reasons: Problems with the query, “ResultSet” property not set correctly, parameters not set correctly, or connection not established correctly.
Ans: The answer to this is to change the Parameter Name value in the Parameter Mapping screen.
Given the following query
SELECT Id, AnimalName FROM dbo.Farm WHERE Farm_id =?
Assuming my parameter is an integer variable named User::Farm_id
Choose the following values on the Parameter Mapping Screen
Variable Name: User::Farm_id
Direction: Input
Data Type: LONG
Parameter Name: 0
Parameter Size: -1
Originally the Parameter Name will be “NewParameterName”. Simply change this to the ordinal position of your variable marker (“?”)
Ans: Persist Security Info = true means that the password used for SQL authentication is not removed from the ConnectionString property of the connection after it has been opened.
When Integrated Security = true is used, Persist Security Info is irrelevant, since it only applies to SQL authentication, not to Windows/Integrated/SSPI authentication.
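For illustration, a hypothetical connection string (the server, database, and credentials are placeholders):

Data Source=MyServer;Initial Catalog=MyDb;User ID=etl_user;Password=S3cret!;Persist Security Info=False;

With Persist Security Info=False, the password is stripped from the ConnectionString property once the connection has been opened; with True, it remains readable there.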
Ans: From MS:
In Script component code, you use typed accessor properties to access certain package features such as variables and connection managers.
The PreExecute method can access only read-only variables. The PostExecute method can access both read-only and read/write variables.
For more information about these methods, see Coding and Debugging the Script Component and Comparing the Script Task and the Script Component.
It looks like Dts is only available in the Script Task.
Here is what the code looks like:
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
Public Class ScriptMain
    Inherits UserComponent

    Dim updateSQL As String

    Public Overrides Sub PostExecute()
        Me.ReadWriteVariables("SQL_ATTR_Update").Value = "Test"
    End Sub

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        'updateSQL = Row.ITMID + Row.PRCCAT
    End Sub
End Class
Ans: On the Script tab, make sure you put the variable in either the ReadOnlyVariables or ReadWriteVariables text boxes.
Here is a simple script that I use to format the errors in a data flow (saved in a RecordSet Variable) into the body of an email. Basically, I read the record set variable into a data table and process it row by row with the for loops. After this task completes I examine the value of uvErrorEmailNeeded to determine if there is anything to email using a conditional process flow connector. You will also need to add a reference to system.xml in your vb script. This is in SQL 2005.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports System.Xml
Imports System.Data.OleDb
Public Class ScriptMain
    Public Sub Main()
        Dim oleDA As New OleDbDataAdapter
        Dim dt As New DataTable
        Dim col As DataColumn
        Dim row As DataRow
        Dim sMsg As String
        Dim sHeader As String

        ' Load the RecordSet variable into a DataTable
        oleDA.Fill(dt, Dts.Variables("uvErrorTable").Value)

        If dt.Rows.Count > 0 Then
            Dts.Variables("uvErrorEmailNeeded").Value = True

            ' Build the header line from the column names
            For Each col In dt.Columns
                sHeader = sHeader & col.ColumnName & vbTab
            Next
            sHeader = sHeader & vbCrLf

            ' Build the message body row by row
            For Each row In dt.Rows
                For Each col In dt.Columns
                    sMsg = sMsg & row(col.Ordinal).ToString & vbTab
                Next
                sMsg = sMsg & vbCrLf
            Next

            Dts.Variables("uvMessageBody").Value = "Error task. Error list follows:" & vbCrLf & sHeader & sMsg & vbCrLf & vbCrLf
        End If

        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class
Ans: There is no single best value; it depends greatly on the design of the database, the number of users, the kind of hardware you are operating on, and so on. That is why you need to test for yourself on your own system.
Ans: If you connect to the Integration Services instance on the server (a different choice in the dropdown from “Database Engine” when you connect in SQL Server Management Studio), they’ll be under the MSDB folder under Stored Packages.
When you start management studio and connect to a database, make sure you have the server type set to Integration Services instead of Database Engine.
Ans: Between your control flow tasks, click on the arrow and choose Edit. When you do this, you get a dialog that allows you to check the “constraint” (success, completion or failure) of the task, an “expression” (i.e. you can have your execute SQL task return a value, store that value in a variable, and check the value of that variable in an expression to determine whether to continue down the path you are editing), an “expression and a constraint”, and an “expression or a constraint”.
These last two are the same except for the logic: "expression and constraint" requires a true condition on both the expression and the constraint, while "expression or constraint" requires only one of the two to be true.
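For example (hypothetical variable name): have an Execute SQL Task return a single-row result, map it to an Int32 variable such as User::RowCount, and put the following expression on the precedence constraint so the downstream task runs only when rows were found:

@[User::RowCount] > 0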
Ans: We have an SSIS job that has been running for over a year with no issues. The job takes a data set from a select statement in an Oracle DB and transfers the result to a table on a SQL Server 2005 instance.
As of this morning, we receive the following error message:
Error: 2010-05-26 05:06:47.71 Code: 0xC02090F5 Source: [job_name] DataReader Source [793] Description: The component “DataReader Source” (793) was unable to process the data. End Error Error: 2010-05-26 05:06:47.71 Code: 0xC0047038 Source: job_name Description: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.
The PrimeOutput method on component “DataReader Source” (793) returned error code 0xC02090F5. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure. End Error Error: 2010-05-26 05:06:47.71 Code: 0xC0047021 Source: P… The package execution fa… The step failed.
Ans: There may be error messages posted before this with more information about the failure.
Did you look for other more specific error messages?
Are you logging errors or steps as they run? If so did you look in the logging table? If you aren’t logging in your SSIS package, I’d set that up as part of this fix, it will make it immeasurably easier to find the problem.
Things I would consider: schema changes, permissions changes, any recent software, operating systems updates to the servers involved, data mismatches (the first time perhaps that the Oracle table held data that couldn’t be inserted into the SQL table – check columns that don't directly match first, string data that might get truncated, dates stored as strings that need to convert to DateTime, etc.).
Ans: Do you mean what the pros and cons of using SQL Server Agent jobs are for scheduling SSIS packages and command-shell executions? I don't really know the pros of Windows Scheduler, so I'll stick to listing the pros of SQL Server Agent jobs.
On one occasion we found a situation where Windows Scheduler was able to do something we couldn’t do with a SQL Server Agent Job. During the early days after a SAN migration, we had some scripts for snapshotting and cloning drives that didn’t work in a SQL Server Agent Job.
So we used a Windows Scheduler task to run the code for a while. After about a month, we figured out what we were missing and were able to move the step back to the SQL Server Agent Job.
Regarding SSIS versus executing stored procedure calls:
Let me know if you need me to elaborate on any points. Good luck!
Ans: Every column has a Boolean property called ColumnName_IsNull (in your case it would be Row.ID_IsNull). Setting it to True (for example, Row.ID_IsNull = True) sets the column's value to NULL.
Ans: Set the property RetainSameConnection on the Connection Manager to True so that a temporary table created in one Control Flow task can be retained in another task.
Here is a sample SSIS package, written in SSIS 2008 R2, that illustrates using temporary tables.
Walkthrough:
Create a stored procedure that will create a temporary table named ##tmpStateProvince and populate it with a few records. The sample SSIS package will first call the stored procedure and then fetch the temporary table data to populate the records into another database table.
The sample package uses a database named Sora. Use the stored procedure script below.
USE Sora;
GO
CREATE PROCEDURE dbo.PopulateTempTable
AS
BEGIN
SET NOCOUNT ON;
IF OBJECT_ID('TempDB..##tmpStateProvince') IS NOT NULL
DROP TABLE ##tmpStateProvince;
CREATE TABLE ##tmpStateProvince
(
CountryCode nvarchar(3) NOT NULL
, StateCode nvarchar(3) NOT NULL
, Name nvarchar(30) NOT NULL
);
INSERT INTO ##tmpStateProvince
(CountryCode, StateCode, Name)
VALUES
('CA', 'AB', 'Alberta'),
('US', 'CA', 'California'),
('DE', 'HH', 'Hamburg'),
('FR', '86', 'Vienne'),
('AU', 'SA', 'South Australia'),
('VI', 'VI', 'Virgin Islands');
END
GO
Create a table named dbo.StateProvince that will be used as the destination table to populate the records from the temporary table. Use the below create table script to create the destination table.
USE Sora;
GO
CREATE TABLE dbo.StateProvince
(
StateProvinceID int IDENTITY(1,1) NOT NULL
, CountryCode nvarchar(3) NOT NULL
, StateCode nvarchar(3) NOT NULL
, Name nvarchar(30) NOT NULL
CONSTRAINT [PK_StateProvinceID] PRIMARY KEY CLUSTERED
([StateProvinceID] ASC)
) ON [PRIMARY];
GO
Create an SSIS package using Business Intelligence Development Studio (BIDS). Right-click on the Connection Managers tab at the bottom of the package and click New OLE DB Connection… to create a new connection to access SQL Server 2008 R2 database.
Click New… on Configure OLE DB Connection Manager.
Perform the following actions on the Connection Manager dialog.
Select Native OLE DB\SQL Server Native Client 10.0 from Provider, since the package will connect to a SQL Server 2008 R2 database.
Enter the Server name, like MACHINENAME\INSTANCE.
Select Use Windows Authentication from the Log on to the server section, or whichever option you prefer.
Select the database from Select or enter a database name; the sample uses the database name Sora.
Click Test Connection.
Click OK on the Test connection succeeded message.
Click OK on Connection Manager.
The newly created data connection will appear on Configure OLE DB Connection Manager. Click OK.
The OLE DB connection manager KIWI\SQLSERVER2008R2.Sora will appear under the Connection Managers tab at the bottom of the package. Right-click the connection manager and click Properties.
Set the property RetainSameConnection on the connection KIWI\SQLSERVER2008R2.Sora to the value True.
Right-click anywhere inside the package and then click Variables to view the variables pane. Create the following variables.
A new variable named PopulateTempTable of data type String in the package scope SO_5631010, set to the value EXEC dbo.PopulateTempTable.
A new variable named FetchTempData of data type String in the package scope SO_5631010, set to the value SELECT CountryCode, StateCode, Name FROM ##tmpStateProvince.
Drag and drop an Execute SQL Task onto the Control Flow tab. Double-click the Execute SQL Task to view the Execute SQL Task Editor.
On the General page of the Execute SQL Task Editor, perform the following actions.
Set the name to Create and populate a temp table
Set the Connection Type to OLE DB.
Set the Connection to KIWI\SQLSERVER2008R2.Sora.
Select Variable from SQLSourceType.
Select User::PopulateTempTable from SourceVariable.
Click OK.
Drag and drop a Data Flow Task onto the Control Flow tab. Rename the Data Flow Task as Transfer temp data to a database table. Connect the green arrow from the Execute SQL Task to the Data Flow Task.
Double-click the Data Flow Task to switch to the Data Flow tab. Drag and drop an OLE DB Source onto the Data Flow tab. Double-click OLE DB Source to view the OLE DB Source Editor.
On the Connection Manager page of the OLE DB Source Editor, perform the following actions.
Select KIWI\SQLSERVER2008R2.Sora from OLE DB connection manager.
Select SQL command from variable from Data access mode.
Select User::FetchTempData from Variable name.
Click the Columns page.
Clicking the Columns page on the OLE DB Source Editor will display the following error, because the table ##tmpStateProvince specified in the source command variable does not exist and SSIS is unable to read the column definition.
To fix the error, execute the statement EXEC dbo.PopulateTempTable using SQL Server Management Studio (SSMS) on the database Sora so that the stored procedure will create the temporary table. After executing the stored procedure, click the Columns page on OLE DB Source Editor, you will see the column information. Click OK.
Drag and drop OLE DB Destination onto the Data Flow tab. Connect the green arrow from OLE DB Source to OLE DB Destination. Double-click OLE DB Destination to open OLE DB Destination Editor.
On the Connection Manager page of the OLE DB Destination Editor, perform the following actions.
Select KIWI\SQLSERVER2008R2.Sora from OLE DB connection manager.
Select Table or view - fast load from Data access mode.
Select [dbo].[StateProvince] from Name of the table or the view.
Click the Mappings page.
Clicking the Mappings page on the OLE DB Destination Editor automatically maps the columns if the input and output column names are the same. Click OK. The column StateProvinceID does not have a matching input column, and it is defined as an IDENTITY column in the database, so no mapping is required.
Data Flow tab should look something like this after configuring all the components.
Click the OLE DB Source on the Data Flow tab and press F4 to view Properties. Set the property ValidateExternalMetadata to False so that SSIS does not try to check for the existence of the temporary table during the validation phase of package execution.
Execute the query select * from dbo.StateProvince in the SQL Server Management Studio (SSMS) to find the number of rows in the table. It should be empty before executing the package.
Execute the package. Control Flow shows successful execution.
In the Data Flow tab, you will notice that the package successfully processed 6 rows. The stored procedure created earlier in this post inserted 6 rows into the temporary table.
Execute the query select * from dbo.StateProvince in SQL Server Management Studio (SSMS) to find the 6 rows successfully inserted into the table. The data should match the rows found in the stored procedure.
The above example illustrated how to create and use a temporary table within a package.
Ans: If you are just interested in knowing the columns being processed and not interested in the info for further use, one possible option is making use of the SSIS logging feature. Here is how it works for data flow tasks.
I hope that helps.