Passwords are not saved in a package unless in an encrypted format. Check the ProtectionLevel property of your package to set this up.
-Jamie
|||Thanks, that was it.
I have a parent package that uses the Execute Package Task against three child packages. They are all set up to use the same configuration file, and the packages are all located in the same folder. The purpose of the configuration is to use the same connection strings for each child package.
I removed the path from the config file specification to facilitate deployment. During debugging, the child packages cannot find the config file when they are run via the parent unless the full path is specified. If run standalone, a child package can find the config file without the path. As I said, all of the dtsx files and the config file are in the same folder.
Also, the password-removal issue is killing me in development. Every time I touch the config file, it removes the password from the connection string. I have "protection level" set to "don't save sensitive" - I'd appreciate help with this too.
thanks
Mark Challen wrote:
I have a parent package that uses the Execute Package Task against three child packages. They are all set up to use the same configuration file, and the packages are all located in the same folder. The purpose of the configuration is to use the same connection strings for each child package.
I removed the path from the config file specification to facilitate deployment. During debugging, the child packages cannot find the config file when they are run via the parent unless the full path is specified. If run standalone, a child package can find the config file without the path. As I said, all of the dtsx files and the config file are in the same folder.
Also, the password-removal issue is killing me in development. Every time I touch the config file, it removes the password from the connection string. I have "protection level" set to "don't save sensitive" - I'd appreciate help with this too.
thanks
When you have it set to "don't save sensitive" that's what's supposed to happen. It's not saving sensitive information.|||As far as your paths go, you have to use absolute paths. So, you might be better off using an environment variable or something to store the path to the config file.
Relative paths have been requested: https://connect.microsoft.com/SQLServer/feedback/ViewFeedback.aspx?FeedbackID=253232|||
Phil Brammer wrote:
When you have it set to "don't save sensitive" that's what's supposed to happen. It's not saving sensitive information.
Do you happen to know which setting will allow the saving of a password so I can work on the packages in development without having to reset the password every time I touch the configuration? I can understand removing the password when building the solution for deployment, but having it auto-magically disappear while I'm still in the designer makes no sense.
|||http://msdn2.microsoft.com/en-us/library/ms141747.aspx|||Mark Challen wrote:
I removed the path from the config. file specification to facilitate deployment. During debugging, the child packages cannot find the config file when they are run via the parent unless the full path is specified. If run standalone, a child package can find the config. file without the path. As I said, all dtsx filaes and the config file is in the same folder.
In my current project I am using that technique; removing the path of the config file from the package configuration has worked fine both ways, whether called from the parent package or standalone....
The config file has to be in the same folder as the packages, though.
|||Thanks for the link Phil. Doesn't answer the question, and I'm getting increasingly frustrated by the lack of control with SSIS. Not to get off on a rant, but I personally would rather be responsible for setting security, not having MS dictate that I encrypt everything.
Anyhow, I have found a workaround - once the config file is established on the parent I set its "Read-only" property to true in the file system. A low-tech approach, but it seems to work.
The issue of the requirement for an absolute path for the config. file is going to be a showstopper for me (no way am I setting an environment variable; I don't see the thinking behind that - how do I establish the variable in the first place?). I will probably execute my child packages independently from the command line from my own custom wrapper code, passing in my connection strings on the command line. Too bad, I wanted to use the Configuration feature and I spent a lot of time trying to make it work. This seems to be endemic to SSIS (like checkpointing) - a cool feature at first glance, but try to use it and you hit a brick wall. </Rant>
Rafael Salas wrote:
In my current project I am using that technique; removing the path of the config file from the package configuration has worked fine both ways, whether called from the parent package or standalone....
The config file has to be in the same folder as the packages, though.
Are you sure the package is successfully locating the config file and overriding the connection strings, or is it defaulting back to your original connection manager settings in the child?
|||Mark Challen wrote:
The issue of the requirement for an absolute path for the config. file is going to be a showstopper for me (no way am I setting an environment variable; I don't see the thinking behind that - how do I establish the variable in the first place?).
You set up the environment variable by way of the system properties in Windows. Once it's set up, you simply reference it, thereby making your configurations portable. This is a good way, in my opinion, to handle changes across environments that have different configurations (file paths, etc...)
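To make the mechanism concrete, here is a minimal sketch in Python of the same indirect-configuration idea: look the path up in an environment variable, and fall back to a file sitting next to the packages. The variable name and file name here are made up for illustration; SSIS performs this resolution internally.

```python
import os

def resolve_config_path(package_dir, env_var="SSIS_CONFIG_PATH"):
    """Return the config file path named by an environment variable if it
    is set; otherwise fall back to a file sitting next to the packages."""
    path = os.environ.get(env_var)
    if path:
        return path
    return os.path.join(package_dir, "packages.dtsConfig")
```

Because only the environment variable changes between machines, the packages themselves stay untouched when promoted from development to QA to production.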
For me, I use SQL Server based configurations and environment variables. I've pretty much got everything set up so that I never have to make changes to a package when promoting it from development into QA and finally into production. Even though file paths may change (commonly C: drive to D: drive, perhaps), the packages continue to work with little interaction. This helps given that we can't promote to production -- another group does that. (Thank you Sarbanes-Oxley!)|||
Mark Challen wrote:
Are you sure the package is successfully locating the config file and overriding the connection strings, or is it defaulting back to your original connection manager settings in the child?
I am 100% sure that it works; actually it is very easy to see when it doesn't: each time a configuration does not take place, a warning is generated and can be seen in the progress tab of BIDS, or in the logging table/file when logging is enabled. I would rather use indirect configurations, but I have a client where environment variables are not an option....
I have a DTS package in SQL 2000 where I need to use VBScript to loop through the files. I have the package detecting the directory - and the file, BUT the file itself has an interval number from 1-48 set within it. So in one day I am getting 48 different files.
I cannot delete previous files or move them - so I need to, each day, loop through the day for each interval.
Any thoughts on how I can do this?
Interval 1 = 12:00 AM
Interval 2 = 12:30 AM
Interval 3 = 1:00 AM
etc. etc.
Thanks!
M
I am not clear exactly what you're after. If you want to translate the 48 intervals into their respective times, you can do it like so:
Code Snippet
--create dummy #digits table
select top 48 i=identity(int,1,1)
into #digits
from syscolumns
--the query
select i, right(convert(varchar,dateadd(minute,(i-1)*30,0),100),7) [time]
from #digits
--drop #digits
drop table #digits
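The same interval-to-time mapping, sketched in Python for anyone working outside T-SQL (the 1900-01-01 base date mirrors the dateadd(minute, (i-1)*30, 0) trick above):

```python
from datetime import datetime, timedelta

def interval_to_time(i):
    """Map interval 1..48 to the start time of its 30-minute slot,
    e.g. 1 -> '12:00 AM', 2 -> '12:30 AM', 3 -> '1:00 AM'."""
    t = datetime(1900, 1, 1) + timedelta(minutes=(i - 1) * 30)
    return t.strftime("%I:%M %p").lstrip("0")
```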
Well, it's not exactly taking the intervals and transposing them into a time. Each interval represents a certain time frame (a 30-minute time frame within the 24 hours - giving me a total of 48 intervals in a day).
The file that I get - I get 48 times a day. So I have to loop through the 48 files in ActiveX for the DTS package. And I am a bit stuck on that part.
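The looping part is just generating the day's 48 candidate names and taking whichever are present. A Python sketch of that selection logic (the filename pattern here is hypothetical - substitute the real one):

```python
def daily_interval_files(filenames, day_tag, pattern="data_{day}_{interval}.txt"):
    """Return, in interval order 1..48, whichever of the day's interval
    files appear in `filenames`. The pattern is a made-up placeholder."""
    present = set(filenames)
    candidates = (pattern.format(day=day_tag, interval=i) for i in range(1, 49))
    return [name for name in candidates if name in present]
```

In the DTS package, the same idea would live in the ActiveX script, using the FileSystemObject to test each candidate name in turn.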
|||Moved to the SSIS forum, since you are looking for a solution with no T-SQL code. This is more likely to have someone to answer it here. Thanks
|||Are you trying to do this in DTS or SSIS? If it's DTS, try this newsgroup - microsoft.public.sqlserver.dts.|||Forums site is screwy - finally was able to get in and edit/reply... this is for DTS in SQL 2000...
I am trying to build a package that loops through different SQL Server connections and puts the result sets into a single SQL Server connection. I have looked at the Flat File example but this does not help. How do I loop through a list of SQL Servers (it can be in a table or a file), make the connection to that SQL Server, run a script against master to gather DB names (I have this part), and write the results to a reporting server (I have this part)? I have tried the For Loop and For Each Loop, but can't get either to work with SQL connections.
I have seen this question a lot, with no real answers. (Everyone who answers just points back to the Flat File example)
Thanks,
Pete
Let me make sure I understand what you are trying to do.
You want to loop through a table that basically gives you a server name, user name, and password. And for each server name found, you want to connect to that server, execute a SQL statement or stored proc, get the results back, and store the results on your reporting server.
Is that correct?
You could use property expressions to dynamically change the connections as you loop through your table.
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=75254&SiteID=1
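What the property expression does on each pass is rebuild the connection manager's ConnectionString from the current row. A rough Python sketch of the string being produced (the column layout and provider below are assumptions for illustration, not the poster's actual schema):

```python
def build_connection_strings(server_rows):
    """Build one OLE DB connection string per (server, user, password) row,
    mirroring what a property expression would emit on each loop pass."""
    template = ("Provider=SQLOLEDB.1;Data Source={server};"
                "Initial Catalog=master;User ID={user};Password={pwd};")
    return [template.format(server=s, user=u, pwd=p)
            for s, u, p in server_rows]
```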
Peter Cwik wrote:
Yes you understand it right. The idea you expressed sounds right, but I don't know how to do it.
Peter,
The reason everyone points to the flat file example is because the steps that you need to go through are exactly the same. In the Flat File example you are talking about, I am assuming (because you did not link to it) that you are setting the ConnectionString property using a property expression. It is exactly the same in your scenario.
Can you explain why the Flat File example is not useful to you?
-Jamie
|||
Jamie,
The Flat File example is not helpful because the syntax on the screens is different. When starting out on a new program like Integration Services, specific instructions are needed. For example, there are many places to right-click and get properties, but if you don't have focus on the correct object, the instructions on what to change in a property sheet won't make sense. Oftentimes those who give advice forget the nuances of the programs. While we all laugh at the "idiot" who complains that "I can't click on your computer", we forget that newbies don't know that there is an icon on the main desktop called My Computer. I'm looking for specifics because I haven't done it before and it's not intuitive.
I understand the concept of what I need to do: grab a list and parse through it while passing the information to the connection. But how? While the concept is clear, there are obviously steps that need to be followed to get it to work. It's easy to get to the moon too: just build a vehicle that can move fast enough to escape the earth's gravity. While the concept is understandable, without the specifics you're going nowhere.
Sorry for my little rant, I do need and want help on the specific steps.
Hope this helps,
Peter Cwik
|||I try to expand on the answer Darren gave in the post Jamie linked to above in this one of my own (GD at the number of prepositions in this sentence!): http://forums.microsoft.com/MSDN/ShowPost.aspx?postid=843095&siteid=1 I'll admit there are things missing and not specific to your example, but if it gets you started, you may be able to figure the rest out.|||
Peter Cwik wrote:
Jamie,
The Flat File example is not helpful because the syntax on the screens is different. When starting out on a new program like Integration Services, specific instructions are needed. For example, there are many places to right-click and get properties, but if you don't have focus on the correct object, the instructions on what to change in a property sheet won't make sense. Oftentimes those who give advice forget the nuances of the programs. While we all laugh at the "idiot" who complains that "I can't click on your computer", we forget that newbies don't know that there is an icon on the main desktop called My Computer. I'm looking for specifics because I haven't done it before and it's not intuitive.
I understand the concept of what I need to do: grab a list and parse through it while passing the information to the connection. But how? While the concept is clear, there are obviously steps that need to be followed to get it to work. It's easy to get to the moon too: just build a vehicle that can move fast enough to escape the earth's gravity. While the concept is understandable, without the specifics you're going nowhere.
Sorry for my little rant, I do need and want help on the specific steps.
Hope this helps,
Peter Cwik
Peter,
I don't think that response was warranted. People here are more than willing to help. I recommend you try and make your query more specific though, i.e. which particular step are you having problems with? If you're having trouble with all of them then let's take them one at a time - starting with the first one.
You are right that you need to "grab a list, parse through it while passing the information to the connection". Which part of that are you having problems with?
Assuming the list is in a database, you can get that list into an SSIS variable using an Execute SQL Task.|||
The last step is where I lose it. I have been able to get the list and pass it to a variable. I couldn't figure out how to pass it a connection string (does that mean a connection manager, T-SQL, VB...?). I tried passing it to a T-SQL string, which works as long as I have linked server connections, which is not what I want.
I think you may have misread my intent above. I never said that people here are not willing to help. If I thought that I wouldn't be here asking for help. My point is that experts sometimes get used to the higher level functions and assume others are on their level when giving instructions. In SSIS I am not advanced, I admit that, and I need a little hand holding to get going on this particular topic.
As for the flat file example, there are specific screens and prompts for inputting the name of a file and the directory. There is built-in functionality to handle this type of input for flat files, but I have not found the same built-in functionality for cycling through SQL connections.
|||That was pretty close. Thank you.
I had a developer write a VB script that does it all too, but like your example I'm trying to find the built in functionality of SSIS.
Thanks again.
The April 2006 update of SQL Server 2005 Books Online contains a
new topic titled "How to: Loop through Excel Files and Tables", at
ms-help://MS.SQLCC.v9/MS.SQLSVR.v9.en/extran9/html/a5393c1a-cc37-491a-a260-7aad84dbff68.htm
or online at
http://msdn2.microsoft.com/en-us/library/ms345182.aspx
Why don't you see if that helps, and if not, let us know exactly
what point in that article something goes wrong.
The download page for BOL, for local installation, is
http://www.microsoft.com/downloads/...&DisplayLang=en
Steve Kass
Drew University
davidz wrote:
> I am attempting to use the foreach loop structure in an SSIS package to
> loop through however many Excel files are placed in a directory and
> then perform an import operation into a SQL table on each of these
> files sequentially. The closest model for this that I was able to find
> in the MS tutorial used a flat file source rather than Excel. That
> involved adding a new expression to the Connection Manager that set the
> connection string to the current filename, as provided by the foreach
> component. That works just fine, but when I attempt to apply the same
> method to an Excel source, rather than a flat file source, I cannot get
> it to work. I see the following error associated with the Excel source
> on the Data Flow page: "Validation error. Data Flow Task: Excel Source
> [1]: The AcquireConnection method call to the connection manager "Excel
> Connection Manager 1" failed with error code 0xC020200." I think that
> it's just a matter of getting the right expression, and I thought that
> perhaps I should be constructing an expression for ExcelFilePath rather
> than the Connection String, but I have fiddled with it for hours and
> haven't come up with something that will be accepted. Has anybody out
> there been able to do this, or can perhaps refer me to some
> documentation that contains an example of what I am trying to do?
> Thanks for any help you can give.
I have an application that manages the booking of rooms in several different buildings, and am trying to build an SSIS package to build a table containing records showing the total number of hours per day that each building is being used. Because two bookings in the same building can take place at the same time or overlap with each other, it's not a straightforward sum of the hours per booking.
I've written a VBA script that loops through booking records comparing their start and end times, building a total value for each day, and then writing the building id, date and total hours to a table. My problem is that I'm not sure how to do this in SSIS. The script component operates on a row-by-row basis so doesn't seem to allow looping through records, and the aggregate component doesn't allow scripts.
I'm still pretty new to SSIS so may be missing something obvious, but if anyone has help to offer it would be greatly appreciated.
Thanks,
Tim
You will need an Execute SQL Task, a few variables, and a ForEach loop.
You need to begin with the Execute SQL Task, to read your table for the appropriate records and write them to an ADO recordset. The task will need a variable to write the recordset to, so you will need to define a variable of data type "Object" to hold the recordset. In the Execute SQL Task, set the ResultSet property to "Full Result Set", as you will be retrieving multiple rows. In the Result Set option, keep the Result Name as 0, and for the variable choose the name of the variable you created.
In the ForEach Loop you need to use the ForEach ADO Enumerator and set the SourceVariable to the variable that holds the recordset. If you are planning to use the values of the columns retrieved from the Execute SQL Task within the ForEach Loop then you would need to define variables to pass them into, and then define the variables in the Variables option for the ForEach Loop.
I hope this helps.
|||A few alternative approaches:
One, you can use a script component in the data flow to accomplish this. You'd have to capture the rows in an internal data store (a DataTable or array would work), and then process them once you have all the data for a building.
Two, you could use an Execute SQL task as desibull suggests, to get a recordset. However, rather than processing it in a For Each (which is going to introduce the same single record at a time issue), process it in a Script Task.
Three, it seems (at least theoretically) possible to build a SELECT statement which would do this for you. In particular, CTEs in SQL 2005 could help with this type of problem. More detail on the table structure could help determine if that is possible.
Considering you already have the VBA code for processing the data, I'd probably go with option 2.
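Whichever option is chosen, the heart of the calculation is the standard merge of overlapping intervals. A sketch of the per-building, per-day step, assuming each booking has been reduced to a (start, end) pair in hours:

```python
def booked_hours(bookings):
    """Total hours covered by possibly-overlapping (start, end) bookings
    for one building on one day; times are hours as numbers."""
    merged = []
    for start, end in sorted(bookings):
        if merged and start <= merged[-1][1]:
            # Overlaps (or touches) the previous interval: extend it.
            merged[-1][1] = max(merged[-1][1], end)
        else:
            merged.append([start, end])
    return sum(e - s for s, e in merged)
```

Two bookings from 9-11 and 10-12 in the same building therefore count as 3 hours, not 4, which is exactly the double-counting problem the VBA script was working around.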
I have a stored procedure that processes an individual file from a directory and archives it in a subdirectory. Now, the problem is, when I execute it, it will only process one file. What I want to do is to check to see if there are any files in the folder, and if there are, process them all, and once done, go to the next part in a DTS package; if there are no files, simply go to the next part in the DTS package. I tried an ActiveX script that would get the file count in the folder, and if there were more than 0 files in the folder, it would return DTS_SUCCESS; on "success" the workflow would run the stored procedure, which would process one file, and then "on completion" the workflow connected back to the ActiveX script (thus looping), which would count the files again. Now if there were 0 files, it would report DTS_FAILURE, and I had it set up "on failure" to go to the next step in the package, but it wouldn't run.
Someone mind showing me a ray of light?
|||What you can do is create another SP (a parent SP) which is a wrapper around the current SP. Read the files one by one in the parent SP and call your current SP for each. In the DTS flow, replace the current SP with the parent SP.
Srinivas
Alex wrote:
Quote:
Originally Posted by
Hello,
>
I have a stored procedure that processes an individual file from a directory and archives it in a subdirectory. Now, the problem is, when I execute it, it will only process one file. What I want to do is to check to see if there are any files in the folder, and if there are, process them all, and once done, go to the next part in a DTS package; if there are no files, simply go to the next part in the DTS package. I tried an ActiveX script that would get the file count in the folder, and if there were more than 0 files in the folder, it would return DTS_SUCCESS; on "success" the workflow would run the stored procedure, which would process one file, and then "on completion" the workflow connected back to the ActiveX script (thus looping), which would count the files again. Now if there were 0 files, it would report DTS_FAILURE, and I had it set up "on failure" to go to the next step in the package, but it wouldn't run.
>
Someone mind showing me a ray of light?
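The wrapper being suggested amounts to "process every file present, then fall through to the next step either way". In outline (Python standing in for the T-SQL parent procedure; process_file stands in for the existing single-file stored procedure):

```python
def drain_folder(list_files, process_file):
    """Process files until the folder is empty, re-listing after each one
    (processing archives the file, as the stored procedure does).
    Returns how many files were handled; zero files is not an error."""
    handled = 0
    files = list_files()
    while files:
        process_file(files[0])
        handled += 1
        files = list_files()
    return handled
```

The key difference from the ActiveX approach above is that an empty folder is a normal exit from the loop, not a failure result, so the package can always continue to its next step.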
I do recall other people having similar problems with Excel but I don't know if there was a resolution.
Sorry...not much help I'm afraid.
-Jamie
|||Thanks for the response - at least I have had one now.|||I was able to complete your scenario successfully following the instructions at: http://msdn2.microsoft.com/en-us/library/ms345182.aspx
Note that I paid particular attention to this paragraph:
To avoid validation errors later as you configure tasks and data flow components to use this connection manager, assign a default Excel workbook in the Excel Connection Manager Editor. After you create and configure the package, you can delete this value in the Properties window. However, after you delete this value, a validation error may occur because the connection string property of the Excel connection manager is no longer valid until the Foreach Loop runs. In this case, set the DelayValidation property to True on the connection manager, on the tasks in which the connection manager is used, or on the package.
If I did not have DelayValidation set to True on the connection manager, and a valid default value in the connection manager, I received the same errors as you. I also set DelayValidation = True on the Data Flow Task for extra caution.
Donald
|||Hi Donald,
Thanks for the link and the response ... I appreciate it.
I'll have a look - the article looks promising. I did already have all my 'delay validations' set to true, and I initially created the Excel connection with a valid connection - I had it pointing at one of the files I want to import. When it is hard-coded to the Excel file, my import runs fine... as soon as I delete the connection name and replace it with the variable name @[User::RankingFileFullPath], my problems occur. I'll read through the article and create a new package, just to ensure I don't inherit some issues I created in my attempts to appease the SSIS connection gods... :-)
thanks again
|||I followed the article's guidelines, but am having an issue.
The sequence of actions:
Create New Project.
Create ForEach container.
Create Variables ExcelFileFullPath and ExcelConnectionProperties. Set the Container to map the ExcelFileFullPath.
Create Excel Connection, pointing to valid, existing excel file.
Alter Excel connection properties: Delay Validation = true. Expression: ConnectionString:
"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + @.[User::ExcelFilefullPath] + ";Extended Properties=\"" + @.[User::ExcelConnectionProperties] + "\""
All good, so far.
Create a data flow control in the container. Add an excel source. Try connecting excel source to excel connection - error: Initial error was an Hresult type error (not captured). Trying to go further, I now get 'Invlaid Arguement' when trying to choose name of excel sheet - this happens on new excel source as well.
I'm going to start again, this time creating an excel import that works correctly. Once that's done, I'll try and move it into the container, and apply the properties as described in article, and see if that makes a difference (surely it shouldn't?)....
|||Status:
I have created a Data Flow Task that opens a HardCoded (or direct) Excel Connection, does a data conversion, then a conditional split, and inserts rows into a OLE DB table connection. I have a ForEach container that correctly populates the excel file path into the ExcelFullFilePath variable. The Data flow task is OUTSIDE the container. All runs through successfully.
Next:
I intend to move the data flow task into the Container, but not change the excel connection, and confirm that it runs successfully. Assumming success, I will then apply the process from the article, specifically:
Set Delay Validation = True on the Package, Containter, Data Flow and Excel Connection.
Set the ConnectionString (via expressions) =
"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + @.[User::ExcelFilefullPath] + ";Extended Properties=\"" + @.[User::ExcelConnectionProperties] + "\""
I'll respond on progress/results
-- so far:
Run with Data Flow in ForEach container - still runs through, USING Hard-coded connection
Change Delay Validation properties on all objects - still runs through, USING Hard-coded connection
Change ConnectionString property to use dynamic ("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + @.[User::ExcelFilefullPath] + ";Extended Properties=\"" + @.[User::ExcelConnectionProperties] + "\"") - SUCCESS.
Add additional .xls file to folder for processing - failure. BUT that was (due to a GOOD, DESCRIPTIVE ERROR MESSAGE) because the connection requires the same sheet name for the 2 excel files. I can live with that.
Update 2nd test file sheet name, re-run... SUCCESS. 2 Excel file successfully imported.
Many thanks for the response, especially yours, Donald.
I still don't understand why I had to go through the sequence I had to, i.e. create a fully working excel import, and then alter the properties, but it worked. Also, a useful note for those doing this in the future - make sure the sheets for the multiple files have the same sheet names.
HTH and thanks
|||Thanks for these detailed notes. They will help us to improve this experience in the future.
Donald
|||...more excel files, process the data in them via the conditional splitter, pushing the good data into a table, and all other rows into an error table.
I'm having some issues using the ForEach container to process multiple Excel spreadsheets into tables. The Excel import into the tables is more or less working (it imports data for the good cases, but the Excel Source substitutes a null when it gets an unexpected value - that's a separate problem).
I found something related to this when searching, but it related to the CTPs (June and September), and trying to reuse the connection strings they built up there (using my own variable names, naturally) causes a 'Property Value failure':
--
The connection string format is not valid. It must consist of one or more components of the form X=Y, separated by semicolons. This error occurs when a connection string with zero components is set on database connection manager.
--
I attempted to use this:
"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + @[User::RankingFileFullPath] + ";Extended Properties=\"Excel 8.0;HDR=YES\";"
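As an aside, since the 'zero components' error above is about the literal shape of the evaluated string, it can help to evaluate the concatenation outside SSIS and eyeball the result. A rough sketch in Python (standing in for the SSIS expression evaluator; the file path is an invented example):

```python
# Mimic the SSIS expression that builds the Jet connection string.
# Illustration only - the variable value below is invented.
ranking_file_full_path = r"D:\Testing\TestRanking.xls"  # stands in for @[User::RankingFileFullPath]

conn_str = (
    "Provider=Microsoft.Jet.OLEDB.4.0;Data Source="
    + ranking_file_full_path
    + ';Extended Properties="Excel 8.0;HDR=YES";'
)
print(conn_str)

# The error quoted above demands "one or more components of the form X=Y,
# separated by semicolons" - a naive split shows each piece carries an '='.
parts = [p for p in conn_str.split(";") if p]
```

Note that even with an empty path variable the string still contains `X=Y` components, so when SSIS raises the "zero components" error the whole ConnectionString property has usually evaluated to empty, not just the path.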
The Excel importer works fine as a stand-alone component. Trying to use the process defined in 'Professional SQL Server Integration Services' (p. 140), I tried to use an expression to assign the variable value to the connection string. I get a validation error:
--
Error at Import TPNB Ranking Excel spreadsheets [Excel Source [1]]: The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0202009.
Error at Import TPNB Ranking Excel spreadsheets [DTS.Pipeline]: component "Excel Source" (1) failed validation and returned error code 0xC020801C.
Error at Import TPNB Ranking Excel spreadsheets [DTS.Pipeline]: One or more component failed validation.
Error at Import TPNB Ranking Excel spreadsheets: There were errors during task validation.
Error at Excel Importer [Connection manager "Excel Connection Manager"]: An OLE DB error has occurred. Error code: 0x80040E4D.
--
Any advice?
....
.... in addition ....
I attempted to change the package - I set the Data Flow validation to Delay Validation, and changed the expression from:
ConnectionString = @[User::RankingFileFullPath]
to
ExcelFilePath = @[User::RankingFileFullPath]
This allowed the package to start debugging, and gave more information in the failure:
--
SSIS package "Excel Importer.dtsx" starting.
SSIS breakpoint 'Break when the container receives the OnPreExecute event' at executable 'Excel Importer' has been hit
SSIS breakpoint 'Break when the container receives the OnPreExecute event' at executable 'Foreach Loop Container' has been hit
SSIS breakpoint 'Break when the container receives the OnQueryCancel event' at executable 'Excel Importer' has been hit
Information: 0x4004300A at Import TPNB Ranking Excel spreadsheets, DTS.Pipeline: Validation phase is beginning.
Warning: 0x802092A7 at Import TPNB Ranking Excel spreadsheets, ProductSalesRank Table [278]: Truncation may occur due to inserting data from data flow column "Rank" with a length of 1000 to database column "SalesRank" with a length of 50.
Error: 0xC0202009 at Excel Importer, Connection manager "Excel Connection Manager": An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft JET Database Engine" Hresult: 0x80004005 Description: "Unrecognized database format 'D:\Testing\TestRanking.xls'.".
Error: 0xC020801C at Import TPNB Ranking Excel spreadsheets, Excel Source [1]: The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0202009.
Error: 0xC0047017 at Import TPNB Ranking Excel spreadsheets, DTS.Pipeline: component "Excel Source" (1) failed validation and returned error code 0xC020801C.
Error: 0xC004700C at Import TPNB Ranking Excel spreadsheets, DTS.Pipeline: One or more component failed validation.
Error: 0xC0024107 at Import TPNB Ranking Excel spreadsheets: There were errors during task validation.
SSIS breakpoint 'Break when the container receives the OnQueryCancel event' at executable 'Excel Importer' has been hit
Warning: 0x80019002 at Foreach Loop Container: The Execution method succeeded, but the number of errors raised (5) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS breakpoint 'Break when the container receives the OnQueryCancel event' at executable 'Excel Importer' has been hit
SSIS breakpoint 'Break when the container receives the OnQueryCancel event' at executable 'Excel Importer' has been hit
SSIS breakpoint 'Break when the container receives the OnWarning event' at executable 'Excel Importer' has been hit
Warning: 0x80019002 at Excel Importer: The Execution method succeeded, but the number of errors raised (5) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS breakpoint 'Break when the container receives the OnPostExecute event' at executable 'Excel Importer' has been hit
SSIS package "Excel Importer.dtsx" finished: Failure.
The program '[2460] Excel Importer.dtsx: DTS' has exited with code 0 (0x0).
--
Apparently, I added too much detail, and people are avoiding the question :-)
I've tested doing exactly the same thing, but using CSVs as the source (i.e. text files and not Excel files). And it works fine. So there's nothing wrong with the environment, DB, etc.
...aaarg
|||
I do recall other people having similar problems with Excel but I don't know if there was a resolution.
Sorry...not much help I'm afraid.
-Jamie
|||Thanks for the response - at least I have had one, now.
The only references I've found pointed me to (a) re-installing Service Pack 1 to ensure SQL and SSIS etc. are all good. I did that, but no go. The other reference seemed to say I need to write an ActiveX (or .NET script) component to handle the files. That's just got to be wrong - what's the point of having to continuously kludge different sections of SSIS together with .NET scripting... In that case it seems we should rather do the whole thing in .NET ... *sighs* ...
I guess I'm just a bit frustrated that something as simple as importing a few Excel files into a SQL table doesn't work simply ... after all, the ForEach and loop controls were one of the NEW BIG THINGS that SSIS was bringing us.... oh well, it's getting late on a Friday, and it looks like it's going to be a long, frustrating weekend (... it might be better if I knew how to write the .NET code).
Well, I hope you (and anyone else reading my ranting) have a good weekend.
|||
I was able to complete your scenario successfully following the instructions at: http://msdn2.microsoft.com/en-us/library/ms345182.aspx
Note that I paid particular attention to this paragraph:
To avoid validation errors later as you configure tasks and data flow components to use this connection manager, assign a default Excel workbook in the Excel Connection Manager Editor. After you create and configure the package, you can delete this value in the Properties window. However, after you delete this value, a validation error may occur because the connection string property of the Excel connection manager is no longer valid until the Foreach Loop runs. In this case, set the DelayValidation property to True on the connection manager, on the tasks in which the connection manager is used, or on the package.
If I did not have DelayValidation set to True on the connection manager and a valid default value in the connection manager, I received the same errors as you. I also set DelayValidation = True on the Data Flow Task for extra caution.
Donald
|||
Hi Donald,
Thanks for the link and the response ... I appreciate it.
I'll have a look - the article looks promising. I did already have all my 'delay validations' set to true, and I initially created the Excel connection with a valid connection - I had it pointing at one of the files I want to import. When it is hard-coded to the Excel file, my import runs fine... as soon as I delete the connection name and replace it with the variable name @[User::RankingFileFullPath], my problems occur. I'll read through the article and create a new package, just to ensure I don't inherit some issues I created in my attempts to appease the SSIS connection gods... :-)
thanks again
|||I followed the article's guidelines, but am having an issue.
The sequence of actions:
Create New Project.
Create ForEach container.
Create variables ExcelFileFullPath and ExcelConnectionProperties. Set the container to map ExcelFileFullPath.
Create Excel connection, pointing to a valid, existing Excel file.
Alter Excel connection properties: Delay Validation = true. Expression: ConnectionString =
"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + @[User::ExcelFilefullPath] + ";Extended Properties=\"" + @[User::ExcelConnectionProperties] + "\""
All good, so far.
Create a Data Flow task in the container. Add an Excel source. Try connecting the Excel source to the Excel connection - error: the initial error was an HRESULT-type error (not captured). Trying to go further, I now get 'Invalid Argument' when trying to choose the name of the Excel sheet - this happens on a new Excel source as well.
I'm going to start again, this time creating an Excel import that works correctly. Once that's done, I'll try to move it into the container and apply the properties as described in the article, and see if that makes a difference (surely it shouldn't?)....
|||Status:
I have created a Data Flow Task that opens a hard-coded (or direct) Excel connection, does a data conversion, then a conditional split, and inserts rows into an OLE DB table connection. I have a ForEach container that correctly populates the Excel file path into the ExcelFileFullPath variable. The Data Flow task is OUTSIDE the container. All runs through successfully.
Next:
I intend to move the Data Flow task into the container, but not change the Excel connection, and confirm that it runs successfully. Assuming success, I will then apply the process from the article, specifically:
Set Delay Validation = True on the Package, Container, Data Flow and Excel Connection.
Set the ConnectionString (via expressions) =
"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + @[User::ExcelFilefullPath] + ";Extended Properties=\"" + @[User::ExcelConnectionProperties] + "\""
I'll respond with progress/results.
-- so far:
Run with Data Flow in ForEach container - still runs through, USING hard-coded connection.
Change Delay Validation properties on all objects - still runs through, USING hard-coded connection.
Change ConnectionString property to use the dynamic expression ("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + @[User::ExcelFilefullPath] + ";Extended Properties=\"" + @[User::ExcelConnectionProperties] + "\"") - SUCCESS.
Add an additional .xls file to the folder for processing - failure. BUT that was (thanks to a GOOD, DESCRIPTIVE ERROR MESSAGE) because the connection requires the same sheet name for the 2 Excel files. I can live with that.
Update the 2nd test file's sheet name, re-run... SUCCESS. 2 Excel files successfully imported.
Many thanks for the responses, especially yours, Donald.
I still don't understand why I had to go through the sequence I did, i.e. create a fully working Excel import and then alter the properties, but it worked. Also, a useful note for those doing this in the future - make sure the sheets in the multiple files have the same sheet names.
HTH and thanks
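For readers following along later: with example values plugged in for the two variables (invented here, not taken from the thread), the working expression above evaluates like this:

```python
# Evaluate the ConnectionString expression with made-up sample values.
excel_file_full_path = r"C:\Imports\Ranking1.xls"    # @[User::ExcelFilefullPath]
excel_connection_properties = "Excel 8.0;HDR=YES"    # @[User::ExcelConnectionProperties]

conn_str = (
    "Provider=Microsoft.Jet.OLEDB.4.0;Data Source="
    + excel_file_full_path
    + ';Extended Properties="'
    + excel_connection_properties
    + '"'
)
print(conn_str)
```

Each file the Foreach Loop assigns to the path variable yields a new string of this shape: only the Data Source part changes, which fits the observation above that every workbook must expose the same sheet name.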
|||Thanks for these detailed notes. They will help us to improve this experience in the future.
Donald
|||Wow, thanks GethWho. I've been looking for this answer all day!
TITLE: Package Validation Error
ADDITIONAL INFORMATION:
Error at Data Flow ACH Validate File and Header Info [Lookup ACH Batch Number [506]]: input column "ID" (571) and reference column named "BANKBATCHNBR" have incompatible data types.
"Lookup ACH Batch Number" is the name of the lookup transformation. Input column "ID" has a string data type. Reference column "BANKBATCHNBR" has a varchar(50) data type. Can someone please tell me how I should go about debugging this validation error? Thanks.
What is the length of the input column? I think (but may be wrong) that the lengths have to be the same.
There is a property of the LOOKUP transform that contains an XML snippet. I *think* that snippet contains the metadata of the lookup cache. I'm afraid I can't be absolutely certain about these things (and I can't remember the name of the property) because I don't have a BIDS instance to hand.
-Jamie
Hi,
I am trying to use a lookup in a package to check some conditions. On the Advanced tab, I am trying to modify the condition from = to <=. This doesn't work when the target is Oracle, but works fine on SQL Server and DB2.
Any idea regarding the same?
Thanks,
Manish Singh
Can you execute the SQL query against the Oracle database using SQL*Plus or another similar query tool? It may be possible that Oracle does not support the specific syntax you're trying to use; since it's not included in your post it is difficult to say.
Can you post the modified SQL statement here?
|||Yes, the query gets executed against the Oracle database after passing some default values. Below is the modified SQL which I am changing on the Advanced tab. The same works for SQL Server and DB2:
select * from
(SELECT A_KEY, KEY_ID, SOURCE_ID, EFFECTIVE_FROM_DT, EFFECTIVE_TO_DT, EFF_START_DT, EFF_END_DT
FROM TABLEA) as refTable
where [refTable].[SOURCE_ID] = ? and [refTable].[KEY_ID] = ? and [refTable].[EFFECTIVE_FROM_DT] <= ? and [refTable].[EFFECTIVE_TO_DT] >= ? and [refTable].[EFF_START_DT] <= ? and [refTable].[EFF_END_DT] >= ?
The default condition generated is as follows:
select * from
(SELECT A_KEY, KEY_ID, SOURCE_ID, EFFECTIVE_FROM_DT, EFFECTIVE_TO_DT, EFF_START_DT, EFF_END_DT
FROM TABLEA) as refTable
where [refTable].[SOURCE_ID] = ? and [refTable].[KEY_ID] = ? and [refTable].[EFFECTIVE_FROM_DT] = ? and [refTable].[EFFECTIVE_TO_DT] = ? and [refTable].[EFF_START_DT] = ? and [refTable].[EFF_END_DT] = ?
|||Thanks for the additional information, although I'm not sure what to tell you. It looks fine to me, so I do not know why Oracle doesn't like it.
I have a package that works fine in development. I move the package over to test and it fails validation in the lookup transform.
Error 46 Validation error. Data Flow Task - PO Lines Interface: Lookup - LIST PRICE [29621]: output column "LIST_PRICE_PER_UNIT" (29667) and reference column named "LIST_PRICE_PER_UNIT" have incompatible data types. SPO_TO_ORACLE_PO.dtsx 0 0
What strikes me as odd is the fact that I don't have a way of specifying the data types. I just specify the column I wish to return as a new column with the same name. Anyway, why would this work in one instance but not another?
thanks
John
Perhaps the data type of the lookup column is different in the new environment.
|||Check the regional settings of the development machine and the production machine!
Regards!
|||The tables are identical on both machines, but I have noticed some subtle differences in the actual data values contained within the tables. I'm going to take a closer look at that. In one table the column has "0" values but in the other they are defined as "0.0000000". I'm actually surprised that package validation would catch these kinds of differences.
|||Yeah!! :-)
Let we know if you resolved your problem!
Regards!!
I have an SSIS package that unpivots data - each row from the source becomes 20-24 rows in the destination db. In the dataflow it looks up some type-data IDs before inserting into the destination table. The whole process flies through the data at an incredible rate, however I find that the final commit on data insertion takes a very long time, and two of the lookup tasks remain yellow throughout. It appears that there may be a contention issue because the final table has FKs to those lookup tables.
In T-SQL it is possible to do SELECT..... WITH (NOLOCK), but when I modified the SQL statement of the lookup task by adding this clause, it caused an error: "Incorrect syntax near the keyword 'with'."
Is there any way around this problem? THNX!
PhilSky
You could split the data-flow into 2, using raw files to pass data between them. This would alleviate blocking problems (if indeed that is the issue).
Incidentally, I have a request for enhancements to the LOOKUP component that means this would be even easier. Feel free to vote for these enhancements here: http://lab.msdn.microsoft.com/productfeedback/viewfeedback.aspx?feedbackid=58d0100b-aa34-4191-ae03-7655543614a1 and leave a comment as well if you could.
-Jamie
|||I've tried that and it works fine. What is the SQL statement you are trying to use? The other option is to change the transaction isolation level for the data flow.
I have a problem related to execution of a lookup in a transformation script in a SQL Server 2000 DTS package.
I have a lookup that launches a stored procedure:
EXECUTE TableID ?, ? output
This lookup is executed from a script like this:
Dim newID as integer
DTSLookups("GetNewID").Execute "string_data", newID
The problem is that the second parameter (it is an output parameter) is unchanged although it is changed in the stored procedure.
Any suggestions ?
Thank you, Gabyx
The Execute method won't assign a value to newID. It returns a value, or array of values.
Try this:
Dim newID
newID = DTSLookups("GetNewID").Execute("string_data")   ' VBScript: parentheses are needed when assigning the return value, and Dim takes no "As Integer"
Let me know if this works for you.
Hi,
Maybe someone can assist me in solving the following error.
I created a fact table with surrogate keys. In the package that processes the fact data, a lookup task is supposed to find the appropriate dimension key.
If a certain dimension row is not found, this error is redirected.
In this error flow a stored procedure adds the missing row to the dimension table, after which another lookup takes place. Then the results of both lookups are brought together in a Union All task.
When I test this error flow I notice that the missing dimension row is indeed added to the dimension table. The only task that turns red is the first lookup. Please refer below for the error messages.
I understand the first one; that is why I created the redirection. But what does the rest mean? And even if I redirect the error for further processing, is it still counted as a raised error? If the maximum allowed is 1 (as indicated), will the task still fail?
Any help will be greatly appreciated. Thanks in advance and regards,
Albert.
The error messages are as follows:
Error: 0xC020901E at Xforms, lookups en wegschrijven, Lookup LosplaatsKey [5071]: Row yielded no match during lookup.
Error: 0xC0047072 at Xforms, lookups en wegschrijven, DTS.Pipeline: No object exists with the ID 5317.
Error: 0xC0047022 at Xforms, lookups en wegschrijven, DTS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Lookup nwe losplaats" (5315) failed with error code 0xC0047072. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
Error: 0xC0047021 at Xforms, lookups en wegschrijven, DTS.Pipeline: SSIS Error Code DTS_E_THREADFAILED. Thread "WorkThread1" has exited with error code 0xC0047072. There may be error messages posted before this with more information on why the thread has exited.
Error: 0xC0047039 at Xforms, lookups en wegschrijven, DTS.Pipeline: SSIS Error Code DTS_E_THREADCANCELLED. Thread "WorkThread2" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown. There may be error messages posted before this with more information on why the thread was cancelled.
Error: 0xC0047039 at Xforms, lookups en wegschrijven, DTS.Pipeline: SSIS Error Code DTS_E_THREADCANCELLED. Thread "WorkThread3" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown. There may be error messages posted before this with more information on why the thread was cancelled.
Error: 0xC0047021 at Xforms, lookups en wegschrijven, DTS.Pipeline: SSIS Error Code DTS_E_THREADFAILED. Thread "WorkThread2" has exited with error code 0xC0047039. There may be error messages posted before this with more information on why the thread has exited.
Error: 0xC0047021 at Xforms, lookups en wegschrijven, DTS.Pipeline: SSIS Error Code DTS_E_THREADFAILED. Thread "WorkThread3" has exited with error code 0xC0047039. There may be error messages posted before this with more information on why the thread has exited.
...
Task failed: Xforms, lookups en wegschrijven
Warning: 0x80019002 at FactRittenInit: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (8) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "FactRittenInit.dtsx" finished: Failure.
Try deleting the lookup and re-adding it?
|||
Hi Crispin,
Thanks for your reply. What I described is not yet a production situation, but a test in development.
The testcase is such that indeed one lookup should not yield a match. This row is then redirected and added by means of a stored procedure. After this a new lookup is performed and then the result is unioned with the first flow.
Lookup Key --(error redirect)--> OLE DB Command (proc adds row)
     |                                   |
     v                                   v
 Union all <--------------------- Lookup new key
     |
     v
The behaviour is that 'Lookup Key' notices that one dimension row is missing (which is true) and turns red. The row is redirected as expected, the 'OLE DB Command' adds the missing row to the dimension table, and the 'Lookup new key' and 'Union all' succeed.
Nevertheless the process then stops because of the error messages mentioned before.
Albert
|||I'm more concerned with this error:
Error: 0xC0047072 at Xforms, lookups en wegschrijven, DTS.Pipeline: No object exists with the ID 5317.
|||It looks like your second failure is on the second lookup - after the new row is added. Lookups, by default, cache all their data at the beginning of the data flow. So even if you are adding the new rows, the cached values in the second lookup won't have it. To work around this, go to the Advanced tab of the second lookup and check "Enable memory restriction", which turns off caching.
|||
Hi John,
You are right! Never thought of that. Thanks (you saved me a headache).
Albert.
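The caching behaviour John describes above can be made concrete with a toy sketch (plain Python dictionaries standing in for the lookup cache and the dimension table - not SSIS internals):

```python
# Toy model: full-cache lookup vs uncached (memory-restricted) lookup.
dimension = {"A": 1}            # dimension table when the data flow starts

cache = dict(dimension)         # full cache: snapshot taken once, up front

def lookup_cached(key):
    return cache.get(key)       # never sees rows added after the snapshot

def lookup_uncached(key):
    return dimension.get(key)   # queries the "table" on every row

dimension["B"] = 2              # error flow adds the missing row mid-stream

print(lookup_cached("B"), lookup_uncached("B"))
```

Checking "Enable memory restriction" on the second lookup makes it behave like `lookup_uncached`, which is why the dimension row inserted by the error flow becomes visible to it.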
Has anyone seen this strange behaviour?
I have a package which loads the fact data from Stage into the Warehouse database. This package normally handles early arriving facts. In that package I use a lookup to check which dims exist, and where they don't I populate the dimension and use the surrogate key to load the facts. This works fine.
I had a request to load 7 years' worth of historical data. Instead of re-writing the package, I took the package which handles early arriving facts and deleted the section which handles them. I knew all the dimensions already existed and I didn't want to hinder performance when loading millions of rows. During testing I found something very interesting.
If you have configured the error path in the lookup component and removed the error path later, the package will NOT fail (won't produce an error) even if the lookup can't find matching values.
Correct Behaviour Example 1:
[1] Stage fact table has 2 records, with product code 1 and 2.
[2] Warehouse Product table has only product code 1.
[3] Source - Lookup - Destination in the data flow task. Error port on lookup is not configured.
[4] From source we read 2 records, and the package will fail at lookup as it can't find Product Code 2.
Correct Behaviour Example 2:
[1] Stage fact table has 2 records, with product code 1 and 2.
[2] Warehouse Product table has only product code 1.
[3] Source - Lookup - Destination in the data flow task. Error port on lookup is configured to go to RowCount.
[4] From source we read 2 records, and the package will run successfully. It will put one record into the warehouse table and send the invalid record to RowCount.
Incorrect Behaviour Example 3:
[1] Stage fact table has 2 records, with product code 1 and 2.
[2] Warehouse Product table has only product code 1.
[3] Source - Lookup - Destination in the data flow task. Delete the configured error port from lookup.
[4] From source we read 2 records, and the package will run successfully. It will put one record into the warehouse table and discard the other.
My understanding is that if the error port is NOT configured (as in example 3, after deleting it), the package should fail as shown in example 1.
Am I missing a point, or is this supposed to be correct behaviour, or is it a bug?
Thanks
Sutha
If you want the Lookup to fail, set the error disposition to Fail Component instead of redirecting the row to the error output - the error disposition controls the component's fail/not-fail behavior.
If nothing is connected to the error path, the redirected rows are lost, but this is not signalled as a failure - there are legitimate reasons to have packages where you don't care about such rows and want to continue the package without signalling an error.
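The three behaviours in Sutha's examples line up with the lookup's error disposition, which can be sketched as a toy model (plain Python, not SSIS):

```python
# Toy model of the lookup error dispositions discussed above
# (Fail Component / Redirect Row / Ignore Failure). Illustrative only.
def run_lookup(rows, reference, disposition):
    matched, redirected = [], []
    for key in rows:
        if key in reference:
            matched.append(reference[key])
        elif disposition == "FailComponent":
            # Example 1: no error output configured, the component fails the task.
            raise RuntimeError("Row yielded no match during lookup.")
        elif disposition == "RedirectRow":
            # Examples 2/3: the row goes to the error output; if nothing is
            # attached there (example 3), it is silently lost.
            redirected.append(key)
        else:
            # "IgnoreFailure": the row passes through with a NULL lookup value.
            matched.append(None)
    return matched, redirected

reference = {1: "widget"}   # warehouse dimension holds product code 1 only
matched, redirected = run_lookup([1, 2], reference, "RedirectRow")
print(matched, redirected)
```

With "RedirectRow" the package completes and the unmatched row is set aside; with "FailComponent" the same input raises, matching the fail-fast behaviour of example 1.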