Wednesday, March 28, 2012
Looping through an XML document with OPENXML
I am trying to compare records from an XML document to records in a table.
If the records in the XML document are not in the table then I wish to
insert those records.
How do I loop through the XML document within a stored proc?
Has anyone done this who can provide a sample?
Thanks
Rather than looping, it'd probably be more efficient to use a NOT IN clause.
Have a look at the following example and see if you can adapt it for your
data:
USE Northwind
GO
DECLARE @productsDoc nvarchar(2000)
SET @productsDoc = '
<Products>
  <Product ProductID="1">
    <ProductName>Chai</ProductName>
    <UnitPrice>10</UnitPrice>
  </Product>
  <Product ProductID="101">
    <ProductName>Porridge</ProductName>
    <UnitPrice>16</UnitPrice>
  </Product>
  <Product ProductID="102">
    <ProductName>Haggis</ProductName>
    <UnitPrice>19</UnitPrice>
  </Product>
</Products>'
DECLARE @h integer
EXEC sp_xml_preparedocument @h OUTPUT, @productsDoc
SET IDENTITY_INSERT Products ON
INSERT Products (ProductID, ProductName, UnitPrice)
SELECT * FROM
OPENXML(@h, '/Products/Product', 2)
WITH
(
  ProductID integer '@ProductID',
  ProductName nvarchar(40),
  UnitPrice money
)
WHERE ProductID NOT IN
(SELECT ProductID FROM Products)
SET IDENTITY_INSERT Products OFF
EXEC sp_xml_removedocument @h -- release the parsed document's memory
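On SQL Server 2005 and later, the native xml type's nodes() and value() methods are an alternative to OPENXML that avoids the sp_xml_preparedocument / sp_xml_removedocument handle bookkeeping. A rough sketch of the same conditional insert, assuming the same Products table and document shape:

```sql
DECLARE @productsDoc xml
SET @productsDoc = N'<Products>
  <Product ProductID="1"><ProductName>Chai</ProductName><UnitPrice>10</UnitPrice></Product>
  <Product ProductID="101"><ProductName>Porridge</ProductName><UnitPrice>16</UnitPrice></Product>
</Products>'

SET IDENTITY_INSERT Products ON
INSERT Products (ProductID, ProductName, UnitPrice)
SELECT p.value('@ProductID', 'int'),
       p.value('(ProductName)[1]', 'nvarchar(40)'),
       p.value('(UnitPrice)[1]', 'money')
FROM @productsDoc.nodes('/Products/Product') AS t(p)
-- only rows whose key is not already in the table
WHERE p.value('@ProductID', 'int') NOT IN
      (SELECT ProductID FROM Products)
SET IDENTITY_INSERT Products OFF
```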
Hope that helps,
Graeme
--
Graeme Malcolm
Principal Technologist
Content Master Ltd.
www.contentmaster.com
|||Thanks Graeme.
Looks like your code should work and I will give it a try.
Monday, March 26, 2012
Looping through non-system database objects using SMO and VB.NET
Hi,
We are using SMO to compare objects in our SQL Server database with another instance of SQL Server. I'm able to loop through the stored procedures with no problem and retrieve their names; however, it loops through all of the SPs, even the system ones. This makes the loop take a while, since it has to cycle through all of the system stored procedures. Is there a way to loop through only the dbo SPs? I'm using VB.NET:
For Each sp In theserver.Databases.Item(DBName).StoredProcedures
    x = sp.ToString()
    If sp.IsSystemObject = False Then
        'MsgBox(x)
    End If
Next
Thanks
Hi,
see this here:
http://www.sqlteam.com/item.asp?ItemID=23185
The following can be used to test your code against your databases. For one of my databases, with 902 procedures, this made a huge difference.
using System;
using Microsoft.SqlServer.Management.Smo;

namespace SMOProject
{
    class Program
    {
        static void Main(string[] args)
        {
            EvaluateSMOTime(".", "SQLSErver2005", true);
            EvaluateSMOTime(".", "SQLSErver2005", false);
            Console.ReadLine();
        }

        internal static void EvaluateSMOTime(string ServerName, string DatabaseBaseName, bool SetDefaultField)
        {
            DateTime Before = DateTime.Now;
            Server theServer = new Server(ServerName);
            // Pre-fetch IsSystemObject so SMO doesn't make a round trip per procedure
            if (SetDefaultField)
                theServer.SetDefaultInitFields(typeof(StoredProcedure), "IsSystemObject");
            Database myDB = theServer.Databases[DatabaseBaseName];
            foreach (StoredProcedure sp in myDB.StoredProcedures)
            {
                if (!sp.IsSystemObject)
                {
                    Console.Write(".");
                }
            }
            DateTime After = DateTime.Now;
            TimeSpan Diff = After.Subtract(Before);
            Console.WriteLine(string.Format("With{0} tweaking the DefaultInitFields: {1} ms",
                SetDefaultField ? string.Empty : "out", Diff.TotalMilliseconds));
        }
    }
}
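If the goal is only to enumerate the non-system procedure names, a plain T-SQL query against the SQL Server 2005 catalog views avoids instantiating SMO objects at all; the is_ms_shipped flag marks Microsoft-shipped objects. A minimal sketch:

```sql
-- Non-system stored procedures, schema-qualified
SELECT s.name AS SchemaName, p.name AS ProcedureName
FROM sys.procedures AS p
JOIN sys.schemas AS s ON p.schema_id = s.schema_id
WHERE p.is_ms_shipped = 0
ORDER BY s.name, p.name
```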
HTH, Jens K. Suessmeyer.
http://www.sqlserver20005.de
Monday, March 12, 2012
Lookup Component Stalls
I am using a Lookup component to do a typical SCD: compare the natural keys, and if they are the same, redirect the rows and do whatever; if not present (meaning the error rows), redirect and do whatever.
When I use the component to do a historical load (which means there are no rows in the destination table) and set the memory to partial cache, the data flow stalls after about 46,000 rows; it just doesn't complete after that. But the moment I switch it to full cache, it flows. But partial is what I am supposed to be using, keeping in mind the incremental loads. Why does the component stall?
I had used partial cache in an earlier project with an 18 million row table (albeit for an incremental load) and it worked fine (though it was slow, at least it worked), but now I am trying to load just 300,000 rows and it stalls.
I am using a machine with 2 GB of RAM and set the memory to 750 MB/500 MB; nothing worked.
I tried two different machines; the same thing happened.
Any insight will be appreciated.
I am just wondering why you said that partial cache is required in the historical load. I always use full cache (after making sure the server has enough memory available).
How is your Lookup set up? You should be using only the required columns. You said this is for an SCD, so you should be using a query with the business/natural key columns only (please provide the number of columns and the data types of the required columns). Also, how is the source component set up? Are you using 'fast load'?
|||Hmmm, so you use full cache always? If I leave it at full cache, then from next time onwards, during incremental loads, it will load all 290,000 rows into memory when possibly fewer than 10 rows have changed. I don't want to do that.
Yes, I bring ONLY the required columns, actually 4 columns: NaturalKey, InferredMember, Hash1 and Hash2, all integers.
Yes, I use fast load.
I use OLE DB for my source.
|||
An INT32 field (call it "natural key"), though, will only take up about 1.1 MB of memory to store 290,000 rows. Not too bad, eh?|||
That is exactly my point. Even with 4 columns, SSIS should be able to cache the whole result set in no time. I would recommend you watch the progress tab in BIDS to check how long the lookup caching takes. Check also the task manager to see if there is any other process taking too much resource from the box. You can try to replace the OLE DB destination with a Row Count transformation just to test the transformation speed. You can remove the lookup and destination and use the same Row Count to measure the 'reading' speed from the source component.
Make sure you are using fast load in your OLE DB destination.