
Friday, March 30, 2012

losing data from tables

Hi!
Is there any known problem with MSSQL 7.0 losing data from tables?
I have an application that is used by 150 separate customers, and each
customer has about 10 users. The customers have their databases on a
server administered by me. The application accesses its database through
ODBC. In the maintenance plan I have a short-term backup of the databases
three times a day (8:30, 11:30, 14:30) and a 7-day backup every day at
19:00.
Sometimes a strange malfunction occurs: two tables are empty. The last
7-day backup from yesterday is OK, but when they started using the
application the next morning the two tables were empty. It is always the
same two tables. This malfunction happens one to two times per year.
Any proposals?
Best regards,
Bertil Morefält

Bertil,
To know what is happening you'd need to be able to view the activity for the
database. This can be done with a trace (Profiler) and/or by viewing the
activities recorded in the transaction log (with a third-party log tool).
For the future, you might check out:
http://www.lumigent.com/products/le_sql.html
This type of tool may also allow you to "undo" the deletes.
HTH
Jerry

Hi Bertil,
It's really strange that two tables become empty; some operation must be
deleting the data. You may follow Jerry's suggestion and use Profiler to
trace the SQL activity.
Sincerely yours,
Michael Cheng
Microsoft Online Partner Support
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
=====================================================
This posting is provided "AS IS" with no warranties, and confers no rights.

Hi,
I've recently had some data issues with my MS SQL Server 2000. From
time to time we lose some data from tables here and there. I don't
know what's causing it, but it's a pretty decent-sized database.
We won't lose a whole table's worth, but some data.
There are about 50 tables; some of them, such as our vendors, leads,
contacts, and manufacturers tables, can have up to 12,000 records, but
that's about it.
About Jerry's suggestion: does Microsoft have any profilers for their
own products, or do you always have to go third party? Thanks.
Marc F.

You can use SQL Profiler, which comes with SQL Server.
-Sue
On Fri, 20 Jan 2006 15:31:59 -0500, Marc Ferguson
<marc@.digitalalias.net> wrote:

Hey Sue,
Thanks. I didn't know it had that. I'm playing with it now. Is there
any way for this to run automatically in the background, or do I always
have to start it up manually?
Marc F.
Sue Hoegemeier wrote:

Hi Marc,
Sure... You can run the same thing that Profiler runs by using
server-side traces. These produce a .trc file that you can open in
Profiler, or view and import into a table using fn_trace_gettable.
The quickest way to get started is to use Profiler to create the trace
you want. Then, in Profiler, go to File and select Script Trace. This
will create the server-side trace for you; all you would really need to
change in the script is the destination file for the trace. You can
execute traces in jobs if you want.
If you execute the entire script you generated with Profiler, it will
start up the trace. You can see in the script how the tracing is done
with the trace stored procedures; the stored procedures for traces are
named sp_trace_xxx.
There are also a few fn_trace_xxx trace functions you can use. To get
information on all traces that are running, use:
SELECT * FROM ::fn_trace_getinfo(default)
You get the trace id when you execute this, which can help you with the
other functions and stored procedures that take the trace id as a
parameter (such as sp_trace_setstatus).
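As a rough illustration, a scripted server-side trace follows this shape. The file path, event ID, and column IDs below are only examples (verify them against the sp_trace_setevent documentation for your version); a trace scripted from Profiler will look similar but longer:

```sql
-- Sketch only: a minimal server-side trace capturing completed T-SQL batches.
DECLARE @traceid int, @maxsize bigint
SET @maxsize = 50 -- maximum file size in MB

-- Create the trace; @options = 2 rolls over to a new file at @maxsize.
-- The server appends .trc to the file name, which must not already exist.
EXEC sp_trace_create @traceid OUTPUT, 2, N'C:\traces\audit', @maxsize

-- Capture SQL:BatchCompleted (event 12) with TextData (column 1),
-- LoginName (column 11), and StartTime (column 14).
DECLARE @on bit
SET @on = 1
EXEC sp_trace_setevent @traceid, 12, 1, @on
EXEC sp_trace_setevent @traceid, 12, 11, @on
EXEC sp_trace_setevent @traceid, 12, 14, @on

-- Start the trace (status 1 = start, 0 = stop, 2 = close and delete).
EXEC sp_trace_setstatus @traceid, 1

-- Later, load the captured file into a table for querying:
-- SELECT * INTO trace_results
-- FROM ::fn_trace_gettable(N'C:\traces\audit.trc', default)
```

To stop and remove the trace afterwards, call sp_trace_setstatus with status 0 and then status 2, using the trace id reported by fn_trace_getinfo.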
Hope that helps get you going with it -
-Sue
On Fri, 20 Jan 2006 16:57:31 -0500, Marc Ferguson
<marc@.digitalalias.net> wrote:

Hi Sue,
Thanks for that info. I was wondering if there are any tutorials or
docs on Profiler? I have "Sams Teach Yourself Microsoft SQL Server 2000
in 21 Days", but it only mentions Profiler. I was playing with it, and a
lot of the events and their displayed data don't make sense to me.
I'm trying them one by one to see what kind of results they show after I
do something related to our database. As you probably know, it's
becoming a long process. :D
Marc F.
Sue Hoegemeier wrote:

Marc,
SQL Magazine had some good articles on Profiler; you can try searching
for those at www.sqlmag.com.
This webcast may help:
Support WebCast: SQL Server 2000 Profiler: What's New and How to
Effectively Use It
http://support.microsoft.com/default.aspx?id=325263
And you can find Profiler tips at:
http://www.sql-server-performance.com/sql_server_profiler_tips.asp

losing data from tables

Hi!
Is there any known problem with MSSQL 7.0 losing data from tables?
I have an application that is used by 150 seperate customer and each
customer have about 10 users. The customers have theirs database at a
server administrated by me. The application access its database through
ODBC. In maintenanse plan I have a short-time backup of the databases
three times a day (8:30, 11:30, 14:30) and a 7 days backup every day at
19:00.
Sometime a stange malfunction occur. Two tables are empty. The last 7
days backup from yeasterday is ok but when they started using the
application next morning the two tables are empty. I is always the same
two tables. This malfunction happens one to two times per year.
any proposal?
Best regards,
Bertil Moreflt
Bertil,
To know what is happening you'd need to be able to view the activity for the
database. This can be done by a trace (Profiler) and or by viewing the
activities contained in the transaction log (third-party log tool). Might
check out:
http://www.lumigent.com/products/le_sql.html
for the future. This type of tool may allow you to "undo" the deletes as
well.
HTH
Jerry
"Bertil Morefalt" <bertil@.community.nospam> wrote in message
news:OxDh9qbyFHA.1028@.TK2MSFTNGP12.phx.gbl...
> Hi!
> Is there any known problem with MSSQL 7.0 losing data from tables?
> I have an application that is used by 150 seperate customer and each
> customer have about 10 users. The customers have theirs database at a
> server administrated by me. The application access its database through
> ODBC. In maintenanse plan I have a short-time backup of the databases
> three times a day (8:30, 11:30, 14:30) and a 7 days backup every day at
> 19:00.
> Sometime a stange malfunction occur. Two tables are empty. The last 7 days
> backup from yeasterday is ok but when they started using the application
> next morning the two tables are empty. I is always the same two tables.
> This malfunction happens one to two times per year.
> any proposal?
> Best regards,
> Bertil Moreflt
|||Hi Bertil,
It's really strange that two tables will become empty and I believe some
operations must be done. You may follow the Jerry's suggestion using
Profiler to trace the SQL actions.
Sincerely yours,
Michael Cheng
Microsoft Online Partner Support
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
================================================== ===
This posting is provided "AS IS" with no warranties, and confers no rights.
|||Hi,
I've recently had some data issues with my MS SQL Server 2000. From
time to time, we will lose some data from tables here and there. I
don't know what's causing it, but it's a pretty decent size database.
We won't loose a whole tables' worth, but some data.
About 50 tables, some of the tables, such as our: vendors, leads,
contacts, manufacturers can have up to 12,000 records, but that's about it.
About Jerry's suggestion.. Does Microsoft have any profilers for their
own products or do you always have to go third party? Thanks.
Marc F.
Michael Cheng [MSFT] wrote:
> Hi Bertil,
> It's really strange that two tables will become empty and I believe some
> operations must be done. You may follow the Jerry's suggestion using
> Profiler to trace the SQL actions.
>
> Sincerely yours,
> Michael Cheng
> Microsoft Online Partner Support
> When responding to posts, please "Reply to Group" via your newsreader so
> that others may learn and benefit from your issue.
> ================================================== ===
> This posting is provided "AS IS" with no warranties, and confers no rights.
>
|||You can use SQL Profiler which comes with SQL Server.
-Sue
On Fri, 20 Jan 2006 15:31:59 -0500, Marc Ferguson
<marc@.digitalalias.net> wrote:
[vbcol=seagreen]
>Hi,
>I've recently had some data issues with my MS SQL Server 2000. From
>time to time, we will lose some data from tables here and there. I
>don't know what's causing it, but it's a pretty decent size database.
>We won't loose a whole tables' worth, but some data.
>About 50 tables, some of the tables, such as our: vendors, leads,
>contacts, manufacturers can have up to 12,000 records, but that's about it.
>About Jerry's suggestion.. Does Microsoft have any profilers for their
>own products or do you always have to go third party? Thanks.
>Marc F.
>
>Michael Cheng [MSFT] wrote:
|||Hey Sue,
Thanks. I didn't know it had that. I'm playing with it now.. is there
any way for this to automatically run in the background or do I always
have to start it up?
Marc F.
Sue Hoegemeier wrote:
> You can use SQL Profiler which comes with SQL Server.
> -Sue
> On Fri, 20 Jan 2006 15:31:59 -0500, Marc Ferguson
> <marc@.digitalalias.net> wrote:
>
>
|||Hi Marc,
Sure...You can run the same thing that Profiler runs using
server side traces - produces a trc file that you can open
in profiler or you can view as well as import into a table
using fn_trace_gettable.
The quickest way to get started with it is to just use
Profiler to create the trace you want. Then in Profiler, go
to File and select Script Trace. This will create the server
side trace for you - all you would really need to change on
the script is the destination file for the trace. You can
execute traces in jobs if you want.
If you execute the entire script you generated with
Profiler, it will start up the trace. You can see in the
script how the tracing is done with the trace stored
procedures. The stored procedures for traces are named
sp_trace_xxx.
There are also a few fn_trace_xxx trace functions you can
use as well. To get information on all traces that are
running, use:
SELECT * FROM :: fn_trace_getinfo(default)
You get the trace id when you execute this which can help
you in using the other functions, stored procedures that use
the trace id for a parameter (such as sp_trace_setstatus).
Hope that helps get you going with it -
-Sue
On Fri, 20 Jan 2006 16:57:31 -0500, Marc Ferguson
<marc@.digitalalias.net> wrote:
[vbcol=seagreen]
>Hey Sue,
>Thanks. I didn't know it had that. I'm playing with it now.. is there
>any way for this to automatically run in the background or do I always
>have to start it up?
>Marc F.
>Sue Hoegemeier wrote:
|||Hi Sue,
Thanks for that info. I was wondering if there are any tutorials or
docs on Profiler? I have "Sams Teach Yourself Microsoft SQL Server 2000
in 21 Days", but it only mentions profiler. I was playing with it and a
lot of the events and their displayed data doesn't make sense to me.
I'm trying them one-by-one to see what kind of result they show after I
do something related to our database. As you probably know.. it's
becoming a long process. :D
Marc F.
Sue Hoegemeier wrote:
> Hi Marc,
> Sure...You can run the same thing that Profiler runs using
> server side traces - produces a trc file that you can open
> in profiler or you can view as well as import into a table
> using fn_trace_gettable.
> The quickest way to get started with it is to just use
> Profiler to create the trace you want. Then in Profiler, go
> to File and select Script Trace. This will create the server
> side trace for you - all you would really need to change on
> the script is the destination file for the trace. You can
> execute traces in jobs if you want.
> If you execute the entire script you generated with
> Profiler, it will start up the trace. You can see in the
> script how the tracing is done with the trace stored
> procedures. The stored procedures for traces are named
> sp_trace_xxx.
> There are also a few fn_trace_xxx trace functions you can
> use as well. To get information on all traces that are
> running, use:
> SELECT * FROM :: fn_trace_getinfo(default)
> You get the trace id when you execute this which can help
> you in using the other functions, stored procedures that use
> the trace id for a parameter (such as sp_trace_setstatus).
> Hope that helps get you going with it -
> -Sue
> On Fri, 20 Jan 2006 16:57:31 -0500, Marc Ferguson
> <marc@.digitalalias.net> wrote:
>
>
|||Marc,
SQL Magazine had some good articles on Profiler - you can
try searching for those at: www.sqlmag.com
This web cast may help:
Support WebCast: SQL Server 2000 Profiler: What's New and
How to Effectively Use It
http://support.microsoft.com/default.aspx?id=325263
And you can find Profiler tips at:
http://www.sql-server-performance.co...filer_tips.asp
-Sue
On Mon, 23 Jan 2006 09:49:02 -0500, Marc Ferguson
<marc@.digitalalias.net> wrote:
[vbcol=seagreen]
>Hi Sue,
>Thanks for that info. I was wondering if there are any tutorials or
>docs on Profiler? I have "Sams Teach Yourself Microsoft SQL Server 2000
>in 21 Days", but it only mentions profiler. I was playing with it and a
>lot of the events and their displayed data doesn't make sense to me.
>I'm trying them one-by-one to see what kind of result they show after I
>do something related to our database. As you probably know.. it's
>becoming a long process. :D
>Marc F.
>Sue Hoegemeier wrote:

losing data from tables

Hi!
Is there any known problem with MSSQL 7.0 losing data from tables?
I have an application that is used by 150 seperate customer and each
customer have about 10 users. The customers have theirs database at a
server administrated by me. The application access its database through
ODBC. In maintenanse plan I have a short-time backup of the databases
three times a day (8:30, 11:30, 14:30) and a 7 days backup every day at
19:00.
Sometime a stange malfunction occur. Two tables are empty. The last 7
days backup from yeasterday is ok but when they started using the
application next morning the two tables are empty. I is always the same
two tables. This malfunction happens one to two times per year.
any proposal?
Best regards,
Bertil MorefltBertil,
To know what is happening you'd need to be able to view the activity for the
database. This can be done by a trace (Profiler) and or by viewing the
activities contained in the transaction log (third-party log tool). Might
check out:
http://www.lumigent.com/products/le_sql.html
for the future. This type of tool may allow you to "undo" the deletes as
well.
HTH
Jerry
"Bertil Morefalt" <bertil@.community.nospam> wrote in message
news:OxDh9qbyFHA.1028@.TK2MSFTNGP12.phx.gbl...
> Hi!
> Is there any known problem with MSSQL 7.0 losing data from tables?
> I have an application that is used by 150 seperate customer and each
> customer have about 10 users. The customers have theirs database at a
> server administrated by me. The application access its database through
> ODBC. In maintenanse plan I have a short-time backup of the databases
> three times a day (8:30, 11:30, 14:30) and a 7 days backup every day at
> 19:00.
> Sometime a stange malfunction occur. Two tables are empty. The last 7 days
> backup from yeasterday is ok but when they started using the application
> next morning the two tables are empty. I is always the same two tables.
> This malfunction happens one to two times per year.
> any proposal?
> Best regards,
> Bertil Moreflt|||Hi Bertil,
It's really strange that two tables will become empty and I believe some
operations must be done. You may follow the Jerry's suggestion using
Profiler to trace the SQL actions.
Sincerely yours,
Michael Cheng
Microsoft Online Partner Support
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
========================================
=============
This posting is provided "AS IS" with no warranties, and confers no rights.|||Hi,
I've recently had some data issues with my MS SQL Server 2000. From
time to time, we will lose some data from tables here and there. I
don't know what's causing it, but it's a pretty decent size database.
We won't loose a whole tables' worth, but some data.
About 50 tables, some of the tables, such as our: vendors, leads,
contacts, manufacturers can have up to 12,000 records, but that's about it.
About Jerry's suggestion.. Does Microsoft have any profilers for their
own products or do you always have to go third party? Thanks.
Marc F.
Michael Cheng [MSFT] wrote:
> Hi Bertil,
> It's really strange that two tables will become empty and I believe some
> operations must be done. You may follow the Jerry's suggestion using
> Profiler to trace the SQL actions.
>
> Sincerely yours,
> Michael Cheng
> Microsoft Online Partner Support
> When responding to posts, please "Reply to Group" via your newsreader so
> that others may learn and benefit from your issue.
> ========================================
=============
> This posting is provided "AS IS" with no warranties, and confers no rights
.
>|||You can use SQL Profiler which comes with SQL Server.
-Sue
On Fri, 20 Jan 2006 15:31:59 -0500, Marc Ferguson
<marc@.digitalalias.net> wrote:
[vbcol=seagreen]
>Hi,
>I've recently had some data issues with my MS SQL Server 2000. From
>time to time, we will lose some data from tables here and there. I
>don't know what's causing it, but it's a pretty decent size database.
>We won't loose a whole tables' worth, but some data.
>About 50 tables, some of the tables, such as our: vendors, leads,
>contacts, manufacturers can have up to 12,000 records, but that's about it.
>About Jerry's suggestion.. Does Microsoft have any profilers for their
>own products or do you always have to go third party? Thanks.
>Marc F.
>
>Michael Cheng [MSFT] wrote:|||Hey Sue,
Thanks. I didn't know it had that. I'm playing with it now.. is there
any way for this to automatically run in the background or do I always
have to start it up?
Marc F.
Sue Hoegemeier wrote:
> You can use SQL Profiler which comes with SQL Server.
> -Sue
> On Fri, 20 Jan 2006 15:31:59 -0500, Marc Ferguson
> <marc@.digitalalias.net> wrote:
>
>|||Hi Marc,
Sure...You can run the same thing that Profiler runs using
server side traces - produces a trc file that you can open
in profiler or you can view as well as import into a table
using fn_trace_gettable.
The quickest way to get started with it is to just use
Profiler to create the trace you want. Then in Profiler, go
to File and select Script Trace. This will create the server
side trace for you - all you would really need to change on
the script is the destination file for the trace. You can
execute traces in jobs if you want.
If you execute the entire script you generated with
Profiler, it will start up the trace. You can see in the
script how the tracing is done with the trace stored
procedures. The stored procedures for traces are named
sp_trace_xxx.
There are also a few fn_trace_xxx trace functions you can
use as well. To get information on all traces that are
running, use:
SELECT * FROM :: fn_trace_getinfo(default)
You get the trace id when you execute this which can help
you in using the other functions, stored procedures that use
the trace id for a parameter (such as sp_trace_setstatus).
Hope that helps get you going with it -
-Sue
On Fri, 20 Jan 2006 16:57:31 -0500, Marc Ferguson
<marc@.digitalalias.net> wrote:
[vbcol=seagreen]
>Hey Sue,
>Thanks. I didn't know it had that. I'm playing with it now.. is there
>any way for this to automatically run in the background or do I always
>have to start it up?
>Marc F.
>Sue Hoegemeier wrote:|||Hi Sue,
Thanks for that info. I was wondering if there are any tutorials or
docs on Profiler? I have "Sams Teach Yourself Microsoft SQL Server 2000
in 21 Days", but it only mentions profiler. I was playing with it and a
lot of the events and their displayed data doesn't make sense to me.
I'm trying them one-by-one to see what kind of result they show after I
do something related to our database. As you probably know.. it's
becoming a long process. :D
Marc F.
Sue Hoegemeier wrote:
> Hi Marc,
> Sure...You can run the same thing that Profiler runs using
> server side traces - produces a trc file that you can open
> in profiler or you can view as well as import into a table
> using fn_trace_gettable.
> The quickest way to get started with it is to just use
> Profiler to create the trace you want. Then in Profiler, go
> to File and select Script Trace. This will create the server
> side trace for you - all you would really need to change on
> the script is the destination file for the trace. You can
> execute traces in jobs if you want.
> If you execute the entire script you generated with
> Profiler, it will start up the trace. You can see in the
> script how the tracing is done with the trace stored
> procedures. The stored procedures for traces are named
> sp_trace_xxx.
> There are also a few fn_trace_xxx trace functions you can
> use as well. To get information on all traces that are
> running, use:
> SELECT * FROM :: fn_trace_getinfo(default)
> You get the trace id when you execute this which can help
> you in using the other functions, stored procedures that use
> the trace id for a parameter (such as sp_trace_setstatus).
> Hope that helps get you going with it -
> -Sue
> On Fri, 20 Jan 2006 16:57:31 -0500, Marc Ferguson
> <marc@.digitalalias.net> wrote:
>
>|||Marc,
SQL Magazine had some good articles on Profiler - you can
try searching for those at: www.sqlmag.com
This web cast may help:
Support WebCast: SQL Server 2000 Profiler: What's New and
How to Effectively Use It
http://support.microsoft.com/default.aspx?id=325263
And you can find Profiler tips at:
http://www.sql-server-performance.c...ofiler_tips.asp
-Sue
On Mon, 23 Jan 2006 09:49:02 -0500, Marc Ferguson
<marc@.digitalalias.net> wrote:
>Hi Sue,
>Thanks for that info. I was wondering if there are any tutorials or
>docs on Profiler? I have "Sams Teach Yourself Microsoft SQL Server 2000
>in 21 Days", but it only mentions profiler. I was playing with it and a
>lot of the events and their displayed data doesn't make sense to me.
>I'm trying them one-by-one to see what kind of result they show after I
>do something related to our database. As you probably know.. it's
>becoming a long process. :D
>Marc F.
>Sue Hoegemeier wrote:

Losing connections with SQL 2005

We have a VB6 application that I recently converted over from SQL Server 2000
to SQL Server 2005. At numerous customer locations I am now experiencing
occasional "lost connection" problems. This never happened with SQL 2000.
Our client app does not use pooled connections and I have not been able to
find a clean way to recover from the broken connection problem. If I check
the status of the Connection it claims that it is open, it is only if I try
to make a database request that I get the error back. Re-opening the
connection solves the problem.
Is anyone aware of SQL Server 2005 doing anything different with respect to
keeping database connections open?
Thanks,
Intermittent connectivity issues are generally difficult to
troubleshoot. That being said... what do you define as a
"lost connection"? What are the errors? What errors do the
clients receive? What errors are in the logs? Have you run
profiler traces to try to get more information on what may
be going on?
Does the application keep connections open for some reason
after processing a database request?
-Sue
On Fri, 9 Feb 2007 13:36:01 -0800, Darrel Miller
<DarrelMiller@.discussions.microsoft.com> wrote:

>We have a VB6 application that I recently converted over from SQL Server 2000
>to SQL Server 2005. At numerous customer locations I am now experiencing
>occasional "lost connection" problems. This never happened with SQL 2000.
>Our client app does not use pooled connections and I have not been able to
>find a clean way to recover from the broken connection problem. If I check
>the status of the Connection it claims that it is open, it is only if I try
>to make a database request that I get the error back. Re-opening the
>connection solves the problem.
>Is anyone aware of SQL Server 2005 doing anything different with respect to
>keeping database connections open?
>Thanks,
|||Hi Sue,
The error number reported is either 3604 or 16389 from Microsoft OLE DB
provider for SQL Server.
The best way I can describe "lost connection" is that you can no longer do
Connection.Execute or pass the connection object to Recordset.Open.
The behaviour is exactly as if you unplugged a network cable and then
replugged it. In fact I have been doing this to try and develop error
handling code to recover from the situation.

> Does the application keep connections open for some reason
> after processing a database request?
As I mentioned our application does not use connection pooling and therefore
we obviously keep the connection open after a database request. Each client
application maintains two connections for the lifetime of the application.
One connection is a read-only connection, the other is a read write. Our
target customer has fewer than 50 users, so keeping 100 connection objects
open is not a problem for the server.
My problem is that SQL 2005 seems to have much more fragile database
connections than SQL 2000 did. I was just wondering if there were some SQL
settings that could increase the robustness.
Thanks,
Darrel
"Sue Hoegemeier" wrote:
> Intermittent connectivity issues are generally difficult to
> troubleshoot. That being said...what do you define as a
> "lost connection". What are the errors? What errors do the
> clients receive? What errors are in the logs? Have you run
> profiler traces to try to get more information on what may
> be going on?
> Does the application keep connections open for some reason
> after processing a database request?
> -Sue
> On Fri, 9 Feb 2007 13:36:01 -0800, Darrel Miller
> <DarrelMiller@.discussions.microsoft.com> wrote:
>
>
|||On Tue, 13 Feb 2007 09:04:01 -0800, Darrel Miller
<DarrelMiller@.discussions.microsoft.com> wrote:

>Hi Sue,
>The error number reported is either 3604 or 16389 from Microsoft OLE DB
>provider for SQL Server.
>The best way I can describe "lost connection" is that you can no longer do
>Connection.Execute or pass the connection object to Recordset.Open.
Then that's an application issue, or at least something to
address from the application end. Those are things you see in
the app, not in SQL Server.

>As I mentioned our application does not use connection pooling and therefore
>we obviously keep the connection open after a database request.
That doesn't necessarily mean you would or should keep two
connections open for the lifetime of the application but
that's a whole different thing.

>My problem is that SQL 2005 seems to have much more fragile database
>connections than SQL 2000 did. I was just wondering if there were some SQL
>settings that could increase the robustness.
Part of the problem in trying to address it from the SQL Server
end is that you can't really quantify it. It's also not
clear what data access changes may have happened with the
application - did you also switch over to ADO.Net, or
change MDAC versions or drivers/providers? You need to look at
potential network and OS issues as well.
-Sue
|||Darrel, we have exactly this issue, but with a different front end language.
We converted in Nov 06 and immediately our remote clients began having
serious issues just as you describe. We reduced that somewhat by
implementing an ADO connection object that "pings" periodically to keep the
connection alive, but in recent weeks the problem has gotten worse again.
I ran into Bill Vaughn at a user group meeting late last year and picked his
brain (or should I say "opened the tap"). The bottom line seemed to be: the
most effective solution is to upgrade the app to dotnet; changing to sql
native client can help somewhat; "pinging" can also help. Upgrading our app
is a really big task, and we haven't done it. We did the ping thing and the
native client thing, but as I said, those things alone are not the solution.
We opened a ticket with MS and spent hours in emails and on the phone with
overseas MS support people on this issue. They really didn't have a clue.
By the way, look at Bill's posting on 3/2/07 in which he says unequivocally
that for a win app, connecting and staying connected is a preferred
practice. Obviously Sue Hoegemeier does not agree.
My view is that MS is either unaware of, in denial about, or unwilling to
acknowledge and correct this issue.
Jeremy
"Darrel Miller" <DarrelMiller@.discussions.microsoft.com> wrote in message
news:46643664-0C8E-4C94-8DE4-4704518EAE8A@.microsoft.com...
> We have a VB6 application that I recently converted over from SQL Server
> 2000
> to SQL Server 2005. At numerous customer locations I am now experiencing
> occasional "lost connection" problems. This never happened with SQL 2000.
> Our client app does not use pooled connections and I have not been able to
> find a clean way to recover from the broken connection problem. If I
> check
> the status of the Connection it claims that it is open, it is only if I
> try
> to make a database request that I get the error back. Re-opening the
> connection solves the problem.
> Is anyone aware of SQL Server 2005 doing anything different with respect
> to
> keeping database connections open?
> Thanks,
>
|||Bill, I do appreciate your detailed and thoughtful response. However, our
application did not experience dropped connections under sql 2k (or they
were so insignificant no one bothered me about it), whereas despite our best
efforts, I see many error reports every day with sql 2k5. Seems quite clear
that the problem is dramatically worse with the new version.
Jeremy
"William (Bill) Vaughn" <billvaRemoveThis@.betav.com> wrote in message
news:OaM5uelbHHA.4808@.TK2MSFTNGP04.phx.gbl...
> Ahem... the ability to stay connected once connected has been a perennial
> issue. I can remember back to the earliest versions of SQL Server when
> connections "aged" out for some unknown reason. No, I don't know the
> actual reason, but symptoms lead me to believe that there is an ack/nak
> protocol in the TDS that basically pings the application at the driver
> level to see if the server or client is still there. This makes sense as
> SQL Server automatically detects if the client falls into the sea or gets
> disconnected for some reason. This polling has to be done on the server as
> it scans the current connections to see if there is activity (work to be
> done) or if there is still a client (albeit inactive) attached. I suspect
> that if the client is busy (running Age of Empires or Windows OneCare) and
> can't respond in time or the event is simply lost (as can happen in
> Windows message-loop-based applications) the server thinks the client has
> dropped off. My solution to this is to keep polling the server (once every
> 30 seconds or so seems to be more than enough) to keep the connection
> alive. Does the Connection pooling mechanism do this on its own? BHOM, but
> I would not be surprised. I have not seen evidence of this in the Profiler
> logs though. I suspect it's done at the TDS provider level. Since we don't
> have access to those APIs, one can constantly close and reopen the
> connection (use the Connection pool) or poll (which is pretty easy and
> faster as it does not require reauthentication, resetting the connection
> and loss of the server state). To poll you can execute any SQL like "USE
> <initial catalog>" or "DECLARE i as INT".
> I'm not sure I said that keeping a connection open is a "preferred"
> method, just a perfectly viable method--assuming you know the costs.
> hth
> --
> ____________________________________
> William (Bill) Vaughn
> Author, Mentor, Consultant
> Microsoft MVP
> INETA Speaker
> www.betav.com/blog/billva
> www.betav.com
> Please reply only to the newsgroup so that others can benefit.
> This posting is provided "AS IS" with no warranties, and confers no
> rights.
> __________________________________
> Visit www.hitchhikerguides.net to get more information on my latest book:
> Hitchhiker's Guide to Visual Studio and SQL Server (7th Edition)
> and Hitchhiker's Guide to SQL Server 2005 Compact Edition (EBook)
> ------
> "Jeremy" <jeremy-nospam@.ninprodata.com> wrote in message
> news:uAzf37ibHHA.2088@.TK2MSFTNGP04.phx.gbl...
>
|||Ok, there are a lot of "fixes" in SS2K5 that might have had this unintended
side-effect. Have you installed the latest SP2? I think there is a SP2a (or
b).
I would log a bug with MS. They (more or less) monitor this group (but pay
more attention to the MSDN forums).
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
INETA Speaker
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________
Visit www.hitchhikerguides.net to get more information on my latest book:
Hitchhiker's Guide to Visual Studio and SQL Server (7th Edition)
and Hitchhiker's Guide to SQL Server 2005 Compact Edition (EBook)
------
"Jeremy" <jeremy-nospam@.ninprodata.com> wrote in message
news:Oa3%2321nbHHA.2300@.TK2MSFTNGP06.phx.gbl...
> Bill, I do appreciate your detailed and thoughtful response. However, our
> application did not experience dropped connections under sql 2k (or they
> were so insignificant no one bothered me about it), whereas despite our
> best efforts, I see many error reports every day with sql 2k5. Seems
> quite clear that the problem is dramatically worse with the new version.
> Jeremy
>
> "William (Bill) Vaughn" <billvaRemoveThis@.betav.com> wrote in message
> news:OaM5uelbHHA.4808@.TK2MSFTNGP04.phx.gbl...
>

Friday, March 23, 2012

Looping each record

Hi,

My report requirement is a customer activity report.

For each customer, my report should show the summary by purchase category and then individual transaction details.

Customer Name: Tom Tim

Books: 100

Computers : 2

Others: 23

(then all 125 transactions should be listed).

Same summary and detail should be repeated for each customer in the same single report. (there are around 200 customers)

How can I achieve this? (other than Sub report). Sub-report seems to be very slow in rendering?

Thanks,

Vasanth
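
One way to feed a layout like this from a single dataset pass is a grouped summary query plus a detail query; a minimal sketch, with hypothetical table and column names:

```sql
-- Hypothetical schema: Transactions(CustomerName, Category, Amount, TranDate)

-- Summary: one row per customer and purchase category
SELECT CustomerName, Category, COUNT(*) AS Purchases
FROM Transactions
GROUP BY CustomerName, Category
ORDER BY CustomerName, Category

-- Detail: every transaction, grouped in the report by CustomerName
SELECT CustomerName, Category, Amount, TranDate
FROM Transactions
ORDER BY CustomerName, TranDate
```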

Wouldn't a list work? The list would repeat for each customer.|||Within the list, I am not able to use another table to list the transactions because the list allows only group (aggregate) functionality.|||any ideas pls?

Looping Each Record in dataset

Hi,
My report requirement is a customer activity report.
For each customer, my report should show the summary by purchase
category and then individual transaction details.
Customer Name: Tom Tim
Books: 100
Computers : 2
Others: 23
(then all 125 transactions should be listed).
Same summary and detail should be repeated for each customer in the
same single report. (there are around 200 customers)
How can I achieve this? (other than Sub report). Sub-report seems to be
very slow in rendering? Also, if I use the List control, it does not allow
a table inside the list without an aggregate.
So I couldn't use the list control.
Thanks,
Vasanth|||Subreport is how you have to solve this. Make sure that you have appropriate
indexing so the subreport query runs quickly.
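
Bruce's indexing advice can be sketched like this, assuming a hypothetical Transactions table where the subreport filters on a CustomerID parameter:

```sql
-- Hypothetical names: the subreport filters on CustomerID, so give it
-- a seekable index instead of forcing a table scan per customer.
CREATE INDEX IX_Transactions_CustomerID
ON Transactions (CustomerID)
```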
Bruce Loehle-Conger
MVP SQL Server Reporting Services
<bvasanth123@.rediffmail.com> wrote in message
news:1137791468.342737.278100@.g47g2000cwa.googlegroups.com...
> Hi,
> My report requirement of customer activity report.
> For each customer, my report should show the summary by purchase
> catgory and then individual transaction details.
> Customer Name: Tom Tim
> Books: 100
> Computers : 2
> Others: 23
> (then all 125 transactions should be listed).
> Same summary and detail should be repeated for each customer in the
> same single report. (there are aound 200 customers)
> How can I achieve this? (other than Sub report). Sub-report seems to be
> very slow in rendering? Also If I use List control, it does not allow
> table inside list with out aggregate.
> So I couldn;t use list control
> Thanks,
> Vasanth
>|||The performance is very poor when we try to render in PDF. We checked
the SP (in the sub-report), which takes 1-2 seconds for each customer. But
rendering takes more than 30 minutes depending on the data.|||Your question was initially about subreports. Based on what you want to do,
subreports are the way to do it. There are differences with rendering
different formats. HTML and CSV are very fast, PDF and Excel are much slower
(an order of magnitude slower). Your hypothesis that the subreports
are the problem is most likely wrong. The issue with PDF is either
complexity or the size of the data (number of rows and size of rows). Also,
do you have any images? If so, what format is the image in. Try the report
without images and see if that helps. If you want to know how much the
optimal time is then render it to html and see how long that takes. That is
the best case.
Is this for a user viewing on the screen, or is it destined for printing?
If it is for a user, then try using drill-through. Drill-through is quite fast.
Also, does it have to be PDF? As I said, HTML is much faster.
RS does all rendering in memory. So you can also try adding additional RAM
to see if that would help.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
<bvasanth123@.rediffmail.com> wrote in message
news:1137851207.701517.281770@.g14g2000cwa.googlegroups.com...
> The performance is very poor when we try to render in PDF. We checked
> SP (in sub-report) which takes 1-2 seconds for each customer. But
> rendering that takes more than 30 mins depending on data.
>|||Thanks Bruce
1) Total records: 20,000+ (100 customers * 200 records average)
2) This is purely for audit record-keeping purposes, not an online report
3) We do not have any images
4) The HTML report comes quickly, in around 10 mins. PDF seems to be the
killer; it takes forever. (The HTML size is ~50 MB, which we do not prefer.)
5) We tried another sample report that just displays the 20,000+ records. The PDF comes
in 5-10 mins. But if the same 20K+ records are run with
sub-reports, performance goes down.
6) RAM is 4 GB|||RAM is fine. If it takes 10 mins for HTML and 30 for PDF then you are doing
pretty good. Usually the difference is greater than that. You might be able
to do something to get the data down quicker (playing with indexes), but that
would only affect at most 10 mins of the total time. The extra 20 minutes is
PDF rendering overhead. My suggestion is to just schedule the report to run at night.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
<bvasanth123@.rediffmail.com> wrote in message
news:1138029125.861955.154170@.o13g2000cwo.googlegroups.com...
> Thanks Bruces
> 1) Total records: 20,000 + ( 100 customer * 200 recods average)
> 2) This is purely for audit record keeping purpose. Not online report
> 3) We do not have any images
> 4) HTML report is coming quickly around 10 mins. PDF seems to be the
> killer.. Takes for ever. (HTML size is ? 50 MB which we do not prefer)
> 5) We tried another sample report that just display 20,000+. PDF comes
> in 5-10 mins. But if the same 20K+ records have been run with
> sub-reports, it goes down in performance.
> 5) RAM is 4 GB
>

Wednesday, March 21, 2012

Loop insert

Hi,

I will do my best to explain my question:

I have a field called CUSTOMER which contains a number such as C00001,
C00002, etc.

However i now have a new field called XCUSTOMER which is the new number that they relate to.

CUSTOMER C00001 now relates to XCUSTOMER 493845.
CUSTOMER C00002 now relates to XCUSTOMER 494343.

Basically there are hundreds of these and I don't have the time to manually enter the data. I just want to say "where CUSTOMER = 'C00001' then insert 493845 into XCUSTOMER" and then loop through and insert them all.

My table is called CUSTOMERINFO.

If anyone could help it would be much appreciated as it would save me so much time.

Thanks :)

update customerinfo
set xcustomer = case customer
when 'c00001' then 493845
when 'c00002' then 494343
...
end
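When there are hundreds of pairs, a long CASE expression is tedious to write out. An alternative sketch (the `CustomerMap` staging table is an assumption, not part of the original schema): load the old/new pairs into a mapping table once, then do a single set-based UPDATE with a join. No loop is needed.

```sql
-- Hypothetical staging table holding the old/new customer number pairs;
-- load it once (e.g. via BULK INSERT or a spreadsheet import).
CREATE TABLE CustomerMap (
    OldCustomer varchar(10) NOT NULL PRIMARY KEY,
    NewCustomer int NOT NULL
);

INSERT INTO CustomerMap (OldCustomer, NewCustomer) VALUES ('C00001', 493845);
INSERT INTO CustomerMap (OldCustomer, NewCustomer) VALUES ('C00002', 494343);
-- ...remaining pairs...

-- Set-based update: every matching row is filled in one statement.
UPDATE ci
SET ci.XCUSTOMER = m.NewCustomer
FROM CUSTOMERINFO ci
JOIN CustomerMap m ON m.OldCustomer = ci.CUSTOMER;
```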

LOOKUPCUBE Problem!

Hi,

Does the LOOKUPCUBE function support a Customer Member?
I use a Customer Member in the LOOKUPCUBE syntax and the system responds with #Err!
So, doesn't LOOKUPCUBE support Customer Members?
Thanks for any advice!

Angi

Hi Angi,

The LookupCube function returns either a numeric expression or a string expression. Assuming you have a Customer dimension with a hierarchy called Customers (with levels Country-State-City-Name), and you are issuing the query in cube Budget, but want to evaluate these expressions in another cube called Sales, here are two working examples:

with member x as 'lookupcube("Sales", "[Customers].[Country].&[Canada].name")'
select x on 0 from [Budget] -- returns the string Canada

with member x as 'lookupcube("Sales", "[Customers].[City].count")'
select x on 0 from [Budget] -- returns the number of cities in the City level

Hope this helps,

Artur

|||

Artur,

Thanks for help!
My expression as follow...

WITH
MEMBER [X].[XX02].[THISPERIOD] AS '[X].[XX02].[200612]'
MEMBER [IV].[IV02].[Execute] AS 'LOOKUPCUBE ("CUBEEF" , " (
[X].[XX08].&[XX0830010] , " + [X].[XX04].CURRENTMEMBER.UNIQUENAME + "," +
[X].[XX01].CURRENTMEMBER.UNIQUENAME + " , [X].[XX02].[THISPERIOD] )" ) '
....
....

So, the Customer Member is [X].[XX02].[THISPERIOD], and
[IV].[IV02].[Execute] responds with #Err.
Any idea?

Angi

|||

The second parameter in the LookupCube function call is invalid. To see the detailed error message, please double-click the #Err cell in SQL Server Management Studio and it will display the reason. It looks like you are trying to pass a set to the function, not a string. What are you trying to achieve with this query?

--Artur
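To illustrate Artur's point with Angi's own names (a sketch of the likely fix, not a verified query): the second argument to LookupCube must evaluate to a single string that the target cube can resolve on its own. A calculated member such as [X].[XX02].[THISPERIOD] exists only in the querying session, so the string should reference the real member it points to instead:

```mdx
// Definition only; attach to the original query's WITH clause and SELECT.
// Assumes [X].[XX02].[200612] is a real member that exists in cube CUBEEF.
MEMBER [IV].[IV02].[Execute] AS
 'LOOKUPCUBE("CUBEEF",
   "( [X].[XX08].&[XX0830010], "
   + [X].[XX04].CURRENTMEMBER.UNIQUENAME + ", "
   + [X].[XX01].CURRENTMEMBER.UNIQUENAME
   + ", [X].[XX02].[200612] )")'
```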

Monday, March 19, 2012

Lookup Transform

Hi!
I am a newbie, grateful for some help. I have an OLE DB Source with a SQL command selecting customer.salary and customer.occupation, which I want to match with demo_id in the OLE DB destination.
salary and occupation are also in dim_demographic.
But in the Lookup editor I find no column demo_id... how do I do this?

The lookup must select against your table with demo_id in it... We need more information here. If you can't find demo_id in your lookup table, and you need it to be there, then you probably have the wrong table defined in your lookup.|||

Not sure what your problem is. In general, the Lookup transform can be based on a table or on a query; if you don't see a specific column in the Columns tab, it is because that column is not in that table/query.

Could you provide more details on how you have set up your package.

BTW, have you considered just including the lookup table as part of the OLE DB Source query and not using the Lookup transform at all?

|||I have a SELECT of customer.salary, customer.occupation in the source DB. These should be the lookup columns, matched with the id from dim_demography. How?|||The source has nothing to do with the lookup.

In the lookup, you need to specify a lookup table or query. When that's done, then you map columns from the dataflow (your source) to columns in the lookup table. You'll have to have an ID column coming from the source query though.

I'm sorry, but your response didn't say anything clearer than your original post.|||How would you do it if you had a demo_id in the destination DB, which should be matched with the values salary and occupation in Dim_demographic (containing columns id, salary, occupation), or the source DB SQL customer.salary, customer.occupation?|||Please provide your schema:

Table1:
Column1
Column2
....

Table2:
Column1
Column2
...

We'll go from there.|||OLE db Source column:
SELECT Sum, k.sum-p.price AS Profit, convert (char(8),date,112) as Date_id, salary, occupation, k.shop_id, k.customer_no, salesperson_id, p.articlenr,campain_id
FROM Purchase k join product p on k.articlenr=p.articlenr join customer ku on k.customernr=ku.customernr

Destination column:
demo_id, Profit, customer_id, product_id, shop_id, date_id, sum, salesperson_id, campain_id

Demo_id = id from Dim_demographic.
Dim_demographic columns are id, salary, occupation

How do I get the Dim_demographic_id to match demo_id? by comparing salary and occupation from Source column with Dim_demographic salary, occupation.. may be in an sql in Lookup?
What should be used in Lookup as connection table? or SQL Select demographic.id where demographic.salary= fact_purchase.salary...
|||

curiousss wrote:

OLE db Source column:
SELECT Sum, k.sum-p.price AS Profit, convert (char(8),date,112) as Date_id, salary, occupation, k.shop_id, k.customer_no, salesperson_id, p.articlenr,campain_id
FROM Purchase k join product p on k.articlenr=p.articlenr join customer ku on k.customernr=ku.customernr
Destination column:
demo_id, Profit, customer_id, product_id, shop_id, date_id, sum, salesperson_id, campain_id
Demo_id = id from Dim_demographic.
Dim_demographic columns are id, salary, occupation

How do I get the Dim_demographic_id to match demo_id? by comparing salary and occupation from Source column with Dim_demographic salary, occupation.. may be in an sql in Lookup?
What should be used in Lookup as connection table? or SQL Select demographic.id where demographic.salary= fact_purchase.salary...

Take the output of the OLE DB Source component to a Lookup transform. Inside the Lookup, write a query like:

Select dim_demographic_id, salary, occupation
From Dim_demographic

Then in the Columns tab draw a line to join salary and occupation; in the bottom part, choose id from the dropdown list to bring the dim_demographic_id (from Dim_demographic) into the pipeline. Now when you connect the pipeline to the destination component you should have that extra column; all you have to do is create the mapping between the destination column demo_id and the column in the pipeline.

BTW, it is better to provide a query inside the Lookup transform than to select the table name from the list... it saves resources (memory), improving performance.

|||Thank you!
Now there are lines between the Available Lookup columns (salary, occupation) and the matching Available Input columns. The Available Lookup id is marked, and on the first line the Lookup column is id, Lookup operation: <add new column>

But the OLE DB destination Preview is empty, although the destination column demo_id is matched with input column id.

Execution error:
[Lookup [3882]] Warning: The Lookup transformation encountered duplicate reference key values when caching reference data. The Lookup transformation found duplicate key values when caching metadata in PreExecute. This error occurs in Full Cache mode only. Either remove the duplicate key values, or change the cache mode to PARTIAL or NO_CACHE.

[OLE DB Destination [3924]] Error: An OLE DB error has occurred. Error code: 0x80040E21. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated
errors. Check each OLE DB status value, if available. No work was done.".

[OLE DB Destination [3924]] Error: There was an error with input column "salesperson_id" (3972) on input "OLE DB Destination Input" (3937). The column status returned was: "Conversion failed because the data value overflowed the specified type.".|||The OLE DB destination preview shows you what's in the destination table, not what's in the data flow. You have nothing in your table, hence nothing shows up in the preview.|||

curiousss wrote:

Thank you!
Now lines bw Available lookup (salary, occupation) and matching Available Input columns. Available Lookup id marked, and on first line in Lookup column: id, Lookup operation: <add new column>

But OLE db destination Preview is empty. although Destination column: demo_id is matched with Input column: id

Execution error:
[Lookup [3882]] Warning: The Lookup transformation encountered duplicate reference key values when caching reference data. The Lookup transformation found duplicate key values when caching metadata in PreExecute. This error occurs in Full Cache mode only. Either remove the duplicate key values, or change the cache mode to PARTIAL or NO_CACHE.

[OLE DB Destination [3924]] Error: An OLE DB error has occurred. Error code: 0x80040E21. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated
errors. Check each OLE DB status value, if available. No work was done.".

[OLE DB Destination [3924]] Error: There was an error with input column "salesperson_id" (3972) on input "OLE DB Destination Input" (3937). The column status returned was: "Conversion failed because the data value overflowed the specified type.".

Ok Few things here:

The preview in the OLE DB components shows data that is already in that table; if this is your initial load and the table is empty, the preview will show nothing.

The first message in the execution error is warning you about duplicates in Dim_demographic; meaning, there is more than one row for a combination of salary and occupation. This is treated as a warning, as SSIS will use any one of those values, so be careful with that. Provide a query and joins that retrieve unique values.

The other 2 messages are actual errors; check the data types of the source and destination for the salesperson_id column; they have to be the same.

|||Hi!
Thank you... it's getting late... wouldn't this eliminate duplicates?
SELECT DISTINCT id, salary, occupation
From Dim_demographic|||By the way, is there any logic/rule/order when changing data types? (It just shows an error now.)

Purchase Source: both salesperson_id columns are DT_I8

Purchase Destination salesperson_id input: DT_I8
Purchase Destination salesperson_id external: DT_I4

...Dim_salesperson salesperson_id is DT_I4|||

I don't think the Distinct will eliminate the duplicates as each row (I guess) has a unique Id value.

To convert data types there is a Data Conversion transform in the toolbox of the data flow.
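As a sketch of what the duplicate-key fix looks like (table and column names taken from the thread): since DISTINCT keeps rows that differ only in the unique id, collapse each (salary, occupation) pair to a single id with GROUP BY and use that as the Lookup query instead:

```sql
-- One row per (salary, occupation) pair; MIN(id) arbitrarily picks a
-- representative id when several rows share the same pair.
SELECT MIN(id) AS id, salary, occupation
FROM Dim_demographic
GROUP BY salary, occupation
```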

Saturday, February 25, 2012

looking for non similar orders

System is "MFG" Custom jobs.
Tracking the customer's last order and looking for changes they make in the job
detail table as compared to the order before.
Table schema:
OrderID, line#, Item, QtyReq
I need to compare the last order with the client's prior
order and see whether it changed.
I have 6 line items per order, and the rows 1,2,3 are the critical check
point I need to verify.
If it returned an int for the number of differing rows, that would be fine.
http://www.webhost4life.com/hosting.asp shows monthly fees for 5,10,20 USD
per month.

Hi!
Check out http://www.aspfaq.com/etiquette.asp?id=5006 on how to post DDL
and example data. It is also useful to post your current attempt and the
information you expect from the example data.
John
"Stephen Russell" wrote:

> System is "MFG" Custom jobs.
> Tracking customer's last order and looking for changes they make in the jo
b
> detail table as compared to order before.
> Table schema:
> OrderID, line#, Item, QtyReq
> I need to verify the last order and if it changed with the clients prior
> order.
> I have 6 line items per order, and the rows 1,2,3 are the critical check
> point I need to verify.
> If it returned int for # of rows diff that would be fine.
> http://www.webhost4life.com/hosting.asp shows monthly fees for 5,10,20 USD
> per month.
>
>
>
>
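A sketch of one way to return a difference count for the critical lines, under stated assumptions (the `JobDetail` table name, a `CustomerID` column, and the rule that a larger OrderID means a newer order are all guesses, since the post only gives the columns OrderID, line#, Item, QtyReq; ranking functions need SQL Server 2005+):

```sql
-- Rank the customer's orders newest-first, then count line items 1-3
-- that differ between the last order and the one before it.
WITH Ranked AS (
    SELECT OrderID,
           DENSE_RANK() OVER (ORDER BY OrderID DESC) AS rn
    FROM JobDetail
    WHERE CustomerID = @CustomerID          -- hypothetical parameter
    GROUP BY OrderID
)
SELECT COUNT(*) AS RowsDifferent            -- 0 means lines 1-3 are unchanged
FROM      (SELECT d.[line#], d.Item, d.QtyReq
           FROM JobDetail d JOIN Ranked r ON r.OrderID = d.OrderID
           WHERE r.rn = 1 AND d.[line#] <= 3) AS cur
FULL JOIN (SELECT d.[line#], d.Item, d.QtyReq
           FROM JobDetail d JOIN Ranked r ON r.OrderID = d.OrderID
           WHERE r.rn = 2 AND d.[line#] <= 3) AS prev
       ON cur.[line#] = prev.[line#]
WHERE cur.[line#] IS NULL                   -- line exists only in last order
   OR prev.[line#] IS NULL                  -- line exists only in prior order
   OR cur.Item <> prev.Item
   OR cur.QtyReq <> prev.QtyReq;
```

The FULL JOIN keeps lines that exist in only one of the two orders, so added or dropped lines count as differences too.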