Friday, March 30, 2012
Loss of Connection
We have an application running on a server that does a connection
check on its connection to the database on a SQL Server 2000.
Sometimes it loses its connection and then is unable to re-establish
the connection for over an hour. During the time that it loses
contact with the SQL Server there is some pretty heavy activity on the
SQL Server 2000 box.
Is there some setting I've overlooked, or is this some weakness on
the part of SQL Server? I don't think the application is doing a
query or anything; I think it's just some heartbeat kind of routine.
rls
Seattle, WA|||Are you attaching via name or IP address? Without a WINS, DNS or ADS server,
attaching via name may be unreliable. Try using the IP address.
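For example, a DSN-less ODBC connection string that points at the server by IP address might look like this (the address, database name and security option here are only placeholders, not values from the original post):

```
Driver={SQL Server};Server=192.168.1.25;Database=YourDb;Trusted_Connection=yes;
```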
--
J
www.urbanvoyeur.com
"brlarue" <ron.strouss@.westfarm.com> wrote in message
news:42b547894434e770528406949d17c5b5@.news.teranews.com...
> We have an application running on a server that does a connection
> check on its connection to the database on a SQL Server 2000.
> Sometimes it loses its connection and then is unable to re-establish
> the connection for over an hour. During the time that it loses
> contact with the SQL Server there is some pretty heavy activity on the
> SQL Server 2000 box.
> Is there some setting I've overlooked ... or is this some weakness on
> the part of SQL Server ? I don't think the application is doing a
> query or anything, I think it's just some heartbeat kind of routine.
> rls
> Seattle, WA|||We have a DNS. I'll take a look at the possibility of using the IP
address. Here is the message coming from the application that loses
its connection.
GENTRAN Notification: ConvertedNotification3 Oct 05 2003 07:29:20
EventID=55867 1-1-50009:ODBC: MFC database exception in
Program/RETCODE: Edimgr/-1State:08S01,Native:0,Origin:[Microsoft][ODBC
SQL Server Driver]
Communication link failure
-
On Mon, 6 Oct 2003 06:26:59 -0400, "UrbanVoyeur" <nospam@.nospam.com>
wrote:
>are you attaching via name or IP address? Without a WINS, DNS or ADS server,
>attaching via name may be unreliable. Try using the IP address.
Losing some data
We had problems with our ERP application since the log file of the SQL server
was full. We only have 1 database in the server. We use Drive D:\ for the log
file. I moved one file to C:\ drive to free some space. This file has nothing
to do with the database. Then, we restarted the server. After the server was
up again, I noticed weird things such as:
1. log file (ldf) went down from 9 GB to 2 MB
2. we lost some data in some tables.
My question is:
1. What caused those weird things?
2. After the server was up, users have been entering/processing new data. Is
it possible to restore data from the last backup before the server had
problem without deleting the new data?
We don't have a DBA here so we don't have any clue on what to do.
Thanks for replying.|||Hi
"lwidjaya" wrote:
> We had problems with our ERP application since the log file of the SQL server
> was full. We only have 1 database in the server. We use Drive D:\ for the log
> file. I moved one file to C:\ drive to free some space. This file has nothing
> to do with the database. Then, we restarted the server. After the server was
> up again, I noticed weird things such as:
> 1. log file (ldf) went down from 9 GB to 2 MB
> 2. we lost some data in some tables.
> My question is:
> 1. What caused those weird things?
> 2. After the server was up, users have been entering/processing new data. Is
> it possible to restore data from the last backup before the server had
> problem without deleting the new data?
> We don't have a DBA here so we don't have any clue on what to do.
> Thanks for replying.
When you restarted the server did you do anything else? You should not have
lost any committed data, but uncommitted data may be rolled back.
If you have entered new data it may be better to go back to the old backup
and re-enter it. If you have space then the last backup can be restored as a
database with a different name, and you could use a tool such as SQL Data
Compare or dbghost to see what has changed and then make your decision. You
could do this first on a different server if there is not enough room on the
live environment. These tools do have options to insert the differences from
one database to the other, but if you have triggers and other actions
performed you may need to disable them and be very careful how you add the
information.
The longer you let users add/change data then the harder it will be to go
back.
HTH
John|||Hi John,
thanks for your reply.
I checked the database and since we just had this database since November
2006, we've never done any transaction log backup on it. Is doing a backup
using Veritas the same as doing a backup from EM? We do a file backup every night
using Veritas.
I'm planning to run a transaction log backup from EM after the scheduled
backup from Veritas; will it do any harm to the database? Should we set up
a transaction log backup every hour? Is it true that a transaction log backup
will shrink the log file?
Thanks in advance,
Lisa
"John Bell" wrote:
> When you restarted the server did you do anything else? You should not have
> lost any committed data, but uncommitted data may be rolled back.
> If you have entered new data it may be better to go back to the old backup
> and re-enter it. If you have space then the last backup can be restored as a
> database with a different name, and you could use a tool such as SQL Data
> Compare or dbghost to see what has changed and then make your decision. You
> could do this first on a different server if there is not enough room on the
> live environment. These tools do have options to insert the differences from
> one database to the other, but if you have triggers and other actions
> performed you may need to disable them and be very careful how you add the
> information.
> The longer you let users add/change data then the harder it will be to go
> back.
> HTH
> John|||Hi Lisa
Veritas is probably using an agent to back up the database; it will
effectively be doing a full backup. You could set up your own schedule to do
transaction log backups to disc. My preference is to backup to disc and then
to tape, that way you can always have the most recent backups at hand if you
wish to quickly restore the database. There is a period between doing the
disc backup and putting it onto tape, but if you use a raid disc array this
is reduced.
Transaction log backups themselves will not shrink the file. It does enable
the transaction log file to be re-used so it will limit the size of the file
under normal workloads, and the transaction log should only grow if you have
an abnormally large number of changes. Continually shrinking the data and log
files is not a good idea as it can lead to disc fragmentation of the file
which will impact on performance. Your database should be in full recovery
mode.
John
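As a rough T-SQL sketch of the schedule described above (the database name and backup paths are illustrative, not from the original posts):

```sql
-- Nightly full backup to disc (the .bak/.trn files can then be copied to tape)
BACKUP DATABASE YourDb
TO DISK = 'D:\Backup\YourDb_full.bak' WITH INIT;

-- Hourly transaction log backup, e.g. run from a SQL Server Agent job;
-- this lets the log space be re-used without shrinking the file
BACKUP LOG YourDb
TO DISK = 'D:\Backup\YourDb_log.trn';
```

Log backups of this kind only work when the database is in full (or bulk-logged) recovery mode, as noted above.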
"lwidjaya" wrote:
[vbcol=seagreen]
> Hi John,
> thanks for your reply.
> I checked the database and since we just had this database since November
> 2006, we've never done any transaction log backup on it. Is doing back up
> using veritas the same as doing backup from EM? We do file backup every night
> using veritas.
> I'm planning to run transaction log backup from EM after the scheduled
> backup from veritas, will it do any harm to the database? Should we set up
> transaction log back up every hour? Is it true that transaction log backup
> will shrink log file?
> Thanks in advance,
> Lisa
> "John Bell" wrote:|||Hi John,
We're running transaction log backup every hour now. As you said, the
transaction log backup didn't shrink the ldf file. Is it ok if we shrink the
database once in a while?
Thanks,
Lisa
"John Bell" wrote:
[vbcol=seagreen]
> Hi Lisa
> Veritas is probably using an agent to backup the database, it will
> effectively doing a full backup. You could set up your own schedule to do
> transaction log backups to disc. My preference is to backup to disc and then
> to tape, that way you can always have the most recent backups at hand if you
> wish to quickly restore the database. There is a period between doing the
> disc backup and putting it onto tape, but if you use a raid disc array this
> is reduced.
> Transaction log backups themselves will not shrink the file. It does enable
> the transaction log file to be re-used so it will limit the size of the file
> under normal workloads, and the transaction log should only grow if you have
> an abnormally large number of changes. Continually shrinking the data and log
> files is not a good idea as it can lead to disc fragmentation of the file
> which will impact on performance. Your database should be in full recovery
> mode.
> John
> "lwidjaya" wrote:
>|||Hi Lisa
It is not a good idea to shrink any of the database files as this can lead
to disc fragmentation and potential degradation of performance. See
http://www.karaszi.com/SQLServer/info_dont_shrink.asp, the only exception
may be if you have done something very abnormal, such as mass data
migration/upgrade.
If you have auto expansion on, make sure the value is not a percentage.
Alternatively you could turn expansion off and expand
manually during a quiet period.
John
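A sketch of setting a fixed (non-percentage) growth increment, assuming an illustrative database and logical log file name:

```sql
-- Grow the log in fixed 256 MB steps instead of a percentage
ALTER DATABASE YourDb
MODIFY FILE (NAME = YourDb_log, FILEGROWTH = 256MB);
```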
"lwidjaya" <lwidjaya@.discussions.microsoft.com> wrote in message
news:3DAF644A-9416-4F05-9986-8A1D929FA0DD@.microsoft.com...[vbcol=seagreen]
> Hi John,
> We're running transaction log backup every hour now. As you said, the
> transaction log backup didn't shrink the ldf file. Is it ok if we shrink
> the
> database once a while?
> Thanks,
> Lisa
> "John Bell" wrote:
>|||> We're running transaction log backup every hour now. As you said, the
> transaction log backup didn't shrink the ldf file. Is it ok if we shrink
> the
> database once a while?
Why? Do you once in a while lease that disk space to some other process?
http://www.karaszi.com/SQLServer/info_dont_shrink.asp|||No, we don't lease the space to another process. We had an ldf file of less than 1 GB
before, then our consultant did 'tables reorganizing' and the ldf suddenly
grew to 13.5 GB. We only have 2 GB free space now for the log file. So, I'm
wondering if we can shrink it to get more space, just in case we need the
space in the future.
I have a question, in database property/Taskpad, it shows that the
transaction log space is 13.5 GB, used: 76 MB, and free: 13.4 GB. But how
come the ldf file size shows 13.5 GB? Does it mean the actual size of the ldf
file is only 76 MB if we shrink it?
Thanks for the replies!
"Aaron Bertrand [SQL Server MVP]" wrote:
> Why? Do you once in a while lease that disk space to some other process?
> http://www.karaszi.com/SQLServer/info_dont_shrink.asp
>
>|||> No, we don't lease the space to other process. We had ldf file less than 1
> GB
> before, then our consultant did 'tables reorganizing' and the ldf suddenly
> grew to 13.5 GB.
Well, that's a special case, and it fits under John Bell's comment
(something very abnormal). I assume your consultant does not reorganize
tables daily?
> I have a question, in database property/Taskpad, it shows that the
> transaction log space is 13.5 GB, used: 76 MB, and free: 13.4 GB. But how
> come the ldf file size shows 13.5 GB? Does it mean the actual size of the
> ldf
> file is only 76 MB if we shrink it?
No, that is not necessarily true. There are multiple virtual log files, and
how the shrink will physically change the physical log file depends on where
any active transactions are stored in the log file.
A|||No, our consultant only did it one time because we made changes to our ERP
data.
So, I guess we can do the database shrink since it's an 'abnormal case'?
Thanks.
"Aaron Bertrand [SQL Server MVP]" wrote:
> Well, that's a special case, and it fits under John Bell's comment
> (something very abnormal). I assume your consultant does not reorganize
> tables daily?
>
> No, that is not necessarily true. There are multiple virtual log files, and
> how the shrink will physically change the physical log file depends on where
> any active transactions are stored in the log file.
> A
>
>
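For a one-off shrink after an abnormal event like the table reorganization discussed above, a T-SQL sketch (the logical file name and target size are illustrative) would be:

```sql
-- See how much of the log is actually in use
DBCC SQLPERF(LOGSPACE);

-- One-off shrink of just the log file, e.g. back down to 1024 MB
DBCC SHRINKFILE (YourDb_log, 1024);
```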
Losing odbc connection on install
We have a system which uses an ODBC connection to connect to SQL Server 2000
for Ceridian Prism--an application for HR departments. Now we are installing
a VB.Net application which uses MSDE (SQL 7). The problem is that for some
reason we are losing the original ODBC connection to 2000 when we install
MSDE. Now I realize that installing MSDE 2000 may help this issue, however
we really need to use SQL 7 for now. Does anyone have any ideas of what
might be causing the loss in the connection? Is the SQL 7 install
overwriting something that the ODBC needs for the 2000 connection? Is it
something with named instances? We are using the standard MSDE installation
from Microsoft.
Thanks.
My guess is that the MSDE 7 installation is installing an older version of
MDAC which is not ADO.Net compatible. It could also be an issue with named
instances, as earlier versions of MDAC (pre-2.5, I think) did not support named
instances.
Jim
"LisaConsult" <lisaconsult@.online.nospam> wrote in message
news:81EA76DC-A07B-4982-B9F2-CD31ACE1F0B0@.microsoft.com...
> We have a system which uses an ODBC connection to connect to SQL Server
> 2000
> for Ceridian Prism--an application for HR departments. Now we are
> installing
> a VB.Net application which uses MSDE (SQL 7). The problem is that for
> some
> reason we are losing the original ODBC connection to 2000 when we install
> MSDE. Now I realize that installing MSDE 2000 may help this issue,
> however
> we really need to use SQL 7 for now. Does anyone have any ideas of what
> might be causing the loss in the connection? Is the SQL 7 install
> overwriting something that the ODBC needs for the 2000 connection? Is it
> something with named instances? We are using the standard MSDE
> installation
> from Microsoft.
> Thanks.
|||Thanks for your response. Actually, we know that it is somehow SQL Server
related and not MDAC because once we uninstalled Server Manager and MSDE, the
connection worked fine again. As an aside, if they needed MDAC, we installed
2.6, but as I said, I don't believe this was the issue. Any other thoughts?
Thanks
"Jim Young" wrote:
> My guess is that the MSDE 7 installation is installing an older version of
> MDAC which is not ADO.Net compatible. It could also be an issue with named
> instances as earlier version of MDAC (pre 2.5 I think) did not support named
> instances.
> Jim
> "LisaConsult" <lisaconsult@.online.nospam> wrote in message
> news:81EA76DC-A07B-4982-B9F2-CD31ACE1F0B0@.microsoft.com...
>
>
|||Oops, my mistake, this app is actually still a VB6 app.
"LisaConsult" wrote:
> We have a system which uses an ODBC connection to connect to SQL Server 2000
> for Ceridian Prism--an application for HR departments. Now we are installing
> a VB.Net application which uses MSDE (SQL 7). The problem is that for some
> reason we are losing the original ODBC connection to 2000 when we install
> MSDE. Now I realize that installing MSDE 2000 may help this issue, however
> we really need to use SQL 7 for now. Does anyone have any ideas of what
> might be causing the loss in the connection? Is the SQL 7 install
> overwriting something that the ODBC needs for the 2000 connection? Is it
> something with named instances? We are using the standard MSDE installation
> from Microsoft.
> Thanks.
|||I still think that it is a problem with the data connection layer and not
SQL Server. Have you tried installing MDAC 2.8 after MSDE 7 is installed.
Jim
"LisaConsult" <lisaconsult@.online.nospam> wrote in message
news:2271E92F-AFE5-4C14-A8FA-2A05326227B6@.microsoft.com...[vbcol=seagreen]
> Thanks for your response. Actually, we know that it is somehow SQL Server
> related and not MDAC because once we uninstalled Server Manager and MSDE,
> the
> connection worked fine again. As an aside, if they needed MDAC, we
> installed
> 2.6, but as I said, I don't believe this was the issue. Any other
> thoughts?
> Thanks
> "Jim Young" wrote:
losing data from tables
Hi!
Is there any known problem with MSSQL 7.0 losing data from tables?
I have an application that is used by 150 separate customers, and each
customer has about 10 users. The customers have their databases on a
server administrated by me. The application accesses its database through
ODBC. In the maintenance plan I have a short-time backup of the databases
three times a day (8:30, 11:30, 14:30) and a 7 days backup every day at
19:00.
Sometimes a strange malfunction occurs. Two tables are empty. The last 7
days backup from yesterday is ok but when they started using the
application next morning the two tables are empty. It is always the same
two tables. This malfunction happens one to two times per year.
Any proposals?
Best regards,
Bertil Morefält|||Bertil,
To know what is happening you'd need to be able to view the activity for the
database. This can be done by a trace (Profiler) and or by viewing the
activities contained in the transaction log (third-party log tool). Might
check out:
http://www.lumigent.com/products/le_sql.html
for the future. This type of tool may allow you to "undo" the deletes as
well.
HTH
Jerry
"Bertil Morefalt" <bertil@.community.nospam> wrote in message
news:OxDh9qbyFHA.1028@.TK2MSFTNGP12.phx.gbl...
> Hi!
> Is there any known problem with MSSQL 7.0 losing data from tables?
> I have an application that is used by 150 seperate customer and each
> customer have about 10 users. The customers have theirs database at a
> server administrated by me. The application access its database through
> ODBC. In maintenanse plan I have a short-time backup of the databases
> three times a day (8:30, 11:30, 14:30) and a 7 days backup every day at
> 19:00.
> Sometimes a strange malfunction occurs. Two tables are empty. The last 7 days
> backup from yesterday is ok but when they started using the application
> next morning the two tables are empty. It is always the same two tables.
> This malfunction happens one to two times per year.
> any proposal?
> Best regards,
> Bertil Morefält|||Hi Bertil,
It's really strange that two tables would become empty; some operations must
have been done. You may follow Jerry's suggestion of using
Profiler to trace the SQL actions.
Sincerely yours,
Michael Cheng
Microsoft Online Partner Support
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
=====================================================
This posting is provided "AS IS" with no warranties, and confers no rights.|||Hi,
I've recently had some data issues with my MS SQL Server 2000. From
time to time, we will lose some data from tables here and there. I
don't know what's causing it, but it's a pretty decent size database.
We won't lose a whole table's worth, but some data.
About 50 tables, some of the tables, such as our: vendors, leads,
contacts, manufacturers can have up to 12,000 records, but that's about it.
About Jerry's suggestion.. Does Microsoft have any profilers for their
own products or do you always have to go third party? Thanks.
Marc F.
Michael Cheng [MSFT] wrote:
> Hi Bertil,
> It's really strange that two tables will become empty and I believe some
> operations must be done. You may follow the Jerry's suggestion using
> Profiler to trace the SQL actions.
>
> Sincerely yours,
> Michael Cheng
> Microsoft Online Partner Support
> When responding to posts, please "Reply to Group" via your newsreader so
> that others may learn and benefit from your issue.
> =====================================================
> This posting is provided "AS IS" with no warranties, and confers no rights.
>|||You can use SQL Profiler which comes with SQL Server.
-Sue
On Fri, 20 Jan 2006 15:31:59 -0500, Marc Ferguson
<marc@.digitalalias.net> wrote:
>Hi,
>I've recently had some data issues with my MS SQL Server 2000. From
>time to time, we will lose some data from tables here and there. I
>don't know what's causing it, but it's a pretty decent size database.
>We won't lose a whole table's worth, but some data.
>About 50 tables, some of the tables, such as our: vendors, leads,
>contacts, manufacturers can have up to 12,000 records, but that's about it.
>About Jerry's suggestion.. Does Microsoft have any profilers for their
>own products or do you always have to go third party? Thanks.
>Marc F.
>
>Michael Cheng [MSFT] wrote:
>> Hi Bertil,
>> It's really strange that two tables will become empty and I believe some
>> operations must be done. You may follow the Jerry's suggestion using
>> Profiler to trace the SQL actions.
>>
>> Sincerely yours,
>> Michael Cheng
>> Microsoft Online Partner Support
>> When responding to posts, please "Reply to Group" via your newsreader so
>> that others may learn and benefit from your issue.
>> =====================================================
>> This posting is provided "AS IS" with no warranties, and confers no rights.|||Hey Sue,
Thanks. I didn't know it had that. I'm playing with it now.. is there
any way for this to automatically run in the background or do I always
have to start it up?
Marc F.
Sue Hoegemeier wrote:
> You can use SQL Profiler which comes with SQL Server.
> -Sue
> On Fri, 20 Jan 2006 15:31:59 -0500, Marc Ferguson
> <marc@.digitalalias.net> wrote:
>
>>Hi,
>>I've recently had some data issues with my MS SQL Server 2000. From
>>time to time, we will lose some data from tables here and there. I
>>don't know what's causing it, but it's a pretty decent size database.
>>We won't loose a whole tables' worth, but some data.
>>About 50 tables, some of the tables, such as our: vendors, leads,
>>contacts, manufacturers can have up to 12,000 records, but that's about it.
>>About Jerry's suggestion.. Does Microsoft have any profilers for their
>>own products or do you always have to go third party? Thanks.
>>Marc F.
>>
>>Michael Cheng [MSFT] wrote:
>>Hi Bertil,
>>It's really strange that two tables will become empty and I believe some
>>operations must be done. You may follow the Jerry's suggestion using
>>Profiler to trace the SQL actions.
>>
>>Sincerely yours,
>>Michael Cheng
>>Microsoft Online Partner Support
>>When responding to posts, please "Reply to Group" via your newsreader so
>>that others may learn and benefit from your issue.
>>=====================================================
>>This posting is provided "AS IS" with no warranties, and confers no rights.
>|||Hi Marc,
Sure...You can run the same thing that Profiler runs using
server side traces - produces a trc file that you can open
in profiler or you can view as well as import into a table
using fn_trace_gettable.
The quickest way to get started with it is to just use
Profiler to create the trace you want. Then in Profiler, go
to File and select Script Trace. This will create the server
side trace for you - all you would really need to change on
the script is the destination file for the trace. You can
execute traces in jobs if you want.
If you execute the entire script you generated with
Profiler, it will start up the trace. You can see in the
script how the tracing is done with the trace stored
procedures. The stored procedures for traces are named
sp_trace_xxx.
There are also a few fn_trace_xxx trace functions you can
use as well. To get information on all traces that are
running, use:
SELECT * FROM :: fn_trace_getinfo(default)
You get the trace id when you execute this which can help
you in using the other functions, stored procedures that use
the trace id for a parameter (such as sp_trace_setstatus).
Hope that helps get you going with it -
-Sue
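As a hedged illustration of the steps Sue describes (the trace file path, event ID, and column IDs below are example choices; a script generated by Profiler's Script Trace option will look different and should be preferred), a minimal server-side trace on SQL Server 2000 might look like:

```sql
-- Sketch only: minimal server-side trace for SQL Server 2000.
-- Assumes the SQL Server service account can write to C:\Traces\
-- and that we only want SQL:BatchCompleted (event 12).
DECLARE @TraceID int, @maxfilesize bigint
SET @maxfilesize = 50   -- max MB per trace file

-- Create the trace; option 2 = TRACE_FILE_ROLLOVER.
-- Note: do not include the .trc extension; the server appends it.
EXEC sp_trace_create @TraceID OUTPUT, 2, N'C:\Traces\MyTrace', @maxfilesize, NULL

-- Capture TextData (column 1), LoginName (11), StartTime (14)
-- for event 12. sp_trace_setevent requires a bit variable for @on.
DECLARE @on bit
SET @on = 1
EXEC sp_trace_setevent @TraceID, 12, 1, @on
EXEC sp_trace_setevent @TraceID, 12, 11, @on
EXEC sp_trace_setevent @TraceID, 12, 14, @on

-- Start the trace (status 1 = start, 0 = stop, 2 = close and delete)
EXEC sp_trace_setstatus @TraceID, 1

-- Later, read the trace file back as a result set:
-- SELECT * FROM ::fn_trace_gettable(N'C:\Traces\MyTrace.trc', default)
```

To stop and remove the trace later, call sp_trace_setstatus with 0 and then 2 using the trace ID reported by fn_trace_getinfo.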
On Fri, 20 Jan 2006 16:57:31 -0500, Marc Ferguson
<marc@.digitalalias.net> wrote:
>Hey Sue,
>Thanks. I didn't know it had that. I'm playing with it now.. is there
>any way for this to automatically run in the background or do I always
>have to start it up?
>Marc F.
>Sue Hoegemeier wrote:
>> You can use SQL Profiler which comes with SQL Server.
>> -Sue
>> On Fri, 20 Jan 2006 15:31:59 -0500, Marc Ferguson
>> <marc@.digitalalias.net> wrote:
>>
>>Hi,
>>I've recently had some data issues with my MS SQL Server 2000. From
>>time to time, we will lose some data from tables here and there. I
>>don't know what's causing it, but it's a pretty decent size database.
>>We won't loose a whole tables' worth, but some data.
>>About 50 tables, some of the tables, such as our: vendors, leads,
>>contacts, manufacturers can have up to 12,000 records, but that's about it.
>>About Jerry's suggestion.. Does Microsoft have any profilers for their
>>own products or do you always have to go third party? Thanks.
>>Marc F.
>>
>>Michael Cheng [MSFT] wrote:
>>Hi Bertil,
>>It's really strange that two tables will become empty and I believe some
>>operations must be done. You may follow the Jerry's suggestion using
>>Profiler to trace the SQL actions.
>>
>>Sincerely yours,
>>Michael Cheng
>>Microsoft Online Partner Support
>>When responding to posts, please "Reply to Group" via your newsreader so
>>that others may learn and benefit from your issue.
>>=====================================================
>>This posting is provided "AS IS" with no warranties, and confers no rights.|||Hi Sue,
>>|||Hi Sue,
Thanks for that info. I was wondering if there are any tutorials or
docs on Profiler? I have "Sams Teach Yourself Microsoft SQL Server 2000
in 21 Days", but it only mentions Profiler. I was playing with it and a
lot of the events and their displayed data don't make sense to me.
I'm trying them one-by-one to see what kind of result they show after I
do something related to our database. As you probably know.. it's
becoming a long process. :D
Marc F.
Sue Hoegemeier wrote:
> Hi Marc,
> Sure...You can run the same thing that Profiler runs using
> server side traces - produces a trc file that you can open
> in profiler or you can view as well as import into a table
> using fn_trace_gettable.
> The quickest way to get started with it is to just use
> Profiler to create the trace you want. Then in Profiler, go
> to File and select Script Trace. This will create the server
> side trace for you - all you would really need to change on
> the script is the destination file for the trace. You can
> execute traces in jobs if you want.
> If you execute the entire script you generated with
> Profiler, it will start up the trace. You can see in the
> script how the tracing is done with the trace stored
> procedures. The stored procedures for traces are named
> sp_trace_xxx.
> There are also a few fn_trace_xxx trace functions you can
> use as well. To get information on all traces that are
> running, use:
> SELECT * FROM :: fn_trace_getinfo(default)
> You get the trace id when you execute this which can help
> you in using the other functions, stored procedures that use
> the trace id for a parameter (such as sp_trace_setstatus).
> Hope that helps get you going with it -
> -Sue
> On Fri, 20 Jan 2006 16:57:31 -0500, Marc Ferguson
> <marc@.digitalalias.net> wrote:
>
>>Hey Sue,
>>Thanks. I didn't know it had that. I'm playing with it now.. is there
>>any way for this to automatically run in the background or do I always
>>have to start it up?
>>Marc F.
>>Sue Hoegemeier wrote:
>>You can use SQL Profiler which comes with SQL Server.
>>-Sue
>>On Fri, 20 Jan 2006 15:31:59 -0500, Marc Ferguson
>><marc@.digitalalias.net> wrote:
>>
>>Hi,
>>I've recently had some data issues with my MS SQL Server 2000. From
>>time to time, we will lose some data from tables here and there. I
>>don't know what's causing it, but it's a pretty decent size database.
>>We won't loose a whole tables' worth, but some data.
>>About 50 tables, some of the tables, such as our: vendors, leads,
>>contacts, manufacturers can have up to 12,000 records, but that's about it.
>>About Jerry's suggestion.. Does Microsoft have any profilers for their
>>own products or do you always have to go third party? Thanks.
>>Marc F.
>>
>>Michael Cheng [MSFT] wrote:
>>
>>Hi Bertil,
>>It's really strange that two tables will become empty and I believe some
>>operations must be done. You may follow the Jerry's suggestion using
>>Profiler to trace the SQL actions.
>>
>>Sincerely yours,
>>Michael Cheng
>>Microsoft Online Partner Support
>>When responding to posts, please "Reply to Group" via your newsreader so
>>that others may learn and benefit from your issue.
>>=====================================================
>>This posting is provided "AS IS" with no warranties, and confers no rights.
>>
>|||Marc,
SQL Magazine had some good articles on Profiler - you can
try searching for those at: www.sqlmag.com
This web cast may help:
Support WebCast: SQL Server 2000 Profiler: What's New and
How to Effectively Use It
http://support.microsoft.com/default.aspx?id=325263
And you can find Profiler tips at:
http://www.sql-server-performance.com/sql_server_profiler_tips.asp
-Sue
On Mon, 23 Jan 2006 09:49:02 -0500, Marc Ferguson
<marc@.digitalalias.net> wrote:
>Hi Sue,
>Thanks for that info. I was wondering if there are any tutorials or
>docs on Profiler? I have "Sams Teach Yourself Microsoft SQL Server 2000
>in 21 Days", but it only mentions profiler. I was playing with it and a
>lot of the events and their displayed data doesn't make sense to me.
>I'm trying them one-by-one to see what kind of result they show after I
>do something related to our database. As you probably know.. it's
>becoming a long process. :D
>Marc F.
>Sue Hoegemeier wrote:
>> Hi Marc,
>> Sure...You can run the same thing that Profiler runs using
>> server side traces - produces a trc file that you can open
>> in profiler or you can view as well as import into a table
>> using fn_trace_gettable.
>> The quickest way to get started with it is to just use
>> Profiler to create the trace you want. Then in Profiler, go
>> to File and select Script Trace. This will create the server
>> side trace for you - all you would really need to change on
>> the script is the destination file for the trace. You can
>> execute traces in jobs if you want.
>> If you execute the entire script you generated with
>> Profiler, it will start up the trace. You can see in the
>> script how the tracing is done with the trace stored
>> procedures. The stored procedures for traces are named
>> sp_trace_xxx.
>> There are also a few fn_trace_xxx trace functions you can
>> use as well. To get information on all traces that are
>> running, use:
>> SELECT * FROM :: fn_trace_getinfo(default)
>> You get the trace id when you execute this which can help
>> you in using the other functions, stored procedures that use
>> the trace id for a parameter (such as sp_trace_setstatus).
>> Hope that helps get you going with it -
>> -Sue
>> On Fri, 20 Jan 2006 16:57:31 -0500, Marc Ferguson
>> <marc@.digitalalias.net> wrote:
>>
>>Hey Sue,
>>Thanks. I didn't know it had that. I'm playing with it now.. is there
>>any way for this to automatically run in the background or do I always
>>have to start it up?
>>Marc F.
>>Sue Hoegemeier wrote:
>>You can use SQL Profiler which comes with SQL Server.
>>-Sue
>>On Fri, 20 Jan 2006 15:31:59 -0500, Marc Ferguson
>><marc@.digitalalias.net> wrote:
>>
>>Hi,
>>I've recently had some data issues with my MS SQL Server 2000. From
>>time to time, we will lose some data from tables here and there. I
>>don't know what's causing it, but it's a pretty decent size database.
>>We won't loose a whole tables' worth, but some data.
>>About 50 tables, some of the tables, such as our: vendors, leads,
>>contacts, manufacturers can have up to 12,000 records, but that's about it.
>>About Jerry's suggestion.. Does Microsoft have any profilers for their
>>own products or do you always have to go third party? Thanks.
>>Marc F.
>>
>>Michael Cheng [MSFT] wrote:
>>
>>Hi Bertil,
>>It's really strange that two tables will become empty and I believe some
>>operations must be done. You may follow the Jerry's suggestion using
>>Profiler to trace the SQL actions.
>>
>>Sincerely yours,
>>Michael Cheng
>>Microsoft Online Partner Support
>>When responding to posts, please "Reply to Group" via your newsreader so
>>that others may learn and benefit from your issue.
>>=====================================================
>>This posting is provided "AS IS" with no warranties, and confers no rights.
>>
>>
losing data from tables
Is there any known problem with MSSQL 7.0 losing data from tables?
I have an application that is used by 150 separate customers, and each
customer has about 10 users. The customers have their databases on a
server administered by me. The application accesses its database through
ODBC. In the maintenance plan I have a short-term backup of the databases
three times a day (8:30, 11:30, 14:30) and a 7-day backup every day at
19:00.
Sometimes a strange malfunction occurs: two tables are empty. The last
7-day backup from yesterday is OK, but when they started using the
application the next morning the two tables were empty. It is always the
same two tables. This malfunction happens one to two times per year.
Any proposals?
Best regards,
Bertil Moreflt
Bertil,
To know what is happening you'd need to be able to view the activity for the
database. This can be done by a trace (Profiler) and or by viewing the
activities contained in the transaction log (third-party log tool). Might
check out:
http://www.lumigent.com/products/le_sql.html
for the future. This type of tool may allow you to "undo" the deletes as
well.
HTH
Jerry
"Bertil Morefalt" <bertil@.community.nospam> wrote in message
news:OxDh9qbyFHA.1028@.TK2MSFTNGP12.phx.gbl...
|||Hi Bertil,
It's really strange that two tables would become empty; some operation
must have been performed on them. You may follow Jerry's suggestion and
use Profiler to trace the SQL activity.
Sincerely yours,
Michael Cheng
Microsoft Online Partner Support
When responding to posts, please "Reply to Group" via your newsreader so
that others may learn and benefit from your issue.
=====================================================
This posting is provided "AS IS" with no warranties, and confers no rights.
|||Hi,
I've recently had some data issues with my MS SQL Server 2000. From
time to time, we will lose some data from tables here and there. I
don't know what's causing it, but it's a pretty decent size database.
We won't lose a whole table's worth, but some data.
About 50 tables, some of the tables, such as our: vendors, leads,
contacts, manufacturers can have up to 12,000 records, but that's about it.
About Jerry's suggestion.. Does Microsoft have any profilers for their
own products or do you always have to go third party? Thanks.
Marc F.
Michael Cheng [MSFT] wrote:
> Hi Bertil,
> It's really strange that two tables will become empty and I believe some
> operations must be done. You may follow the Jerry's suggestion using
> Profiler to trace the SQL actions.
>
> Sincerely yours,
> Michael Cheng
> Microsoft Online Partner Support
> When responding to posts, please "Reply to Group" via your newsreader so
> that others may learn and benefit from your issue.
> =====================================================
> This posting is provided "AS IS" with no warranties, and confers no rights.
>
|||You can use SQL Profiler which comes with SQL Server.
-Sue
On Fri, 20 Jan 2006 15:31:59 -0500, Marc Ferguson
<marc@.digitalalias.net> wrote:
>Hi,
>I've recently had some data issues with my MS SQL Server 2000. From
>time to time, we will lose some data from tables here and there. I
>don't know what's causing it, but it's a pretty decent size database.
>We won't loose a whole tables' worth, but some data.
>About 50 tables, some of the tables, such as our: vendors, leads,
>contacts, manufacturers can have up to 12,000 records, but that's about it.
>About Jerry's suggestion.. Does Microsoft have any profilers for their
>own products or do you always have to go third party? Thanks.
>Marc F.
>
>Michael Cheng [MSFT] wrote:
|||Hey Sue,
Thanks. I didn't know it had that. I'm playing with it now.. is there
any way for this to automatically run in the background or do I always
have to start it up?
Marc F.
Sue Hoegemeier wrote:
> You can use SQL Profiler which comes with SQL Server.
> -Sue
> On Fri, 20 Jan 2006 15:31:59 -0500, Marc Ferguson
> <marc@.digitalalias.net> wrote:
>
>
|||Hi Marc,
Sure...You can run the same thing that Profiler runs using
server side traces - produces a trc file that you can open
in profiler or you can view as well as import into a table
using fn_trace_gettable.
The quickest way to get started with it is to just use
Profiler to create the trace you want. Then in Profiler, go
to File and select Script Trace. This will create the server
side trace for you - all you would really need to change on
the script is the destination file for the trace. You can
execute traces in jobs if you want.
If you execute the entire script you generated with
Profiler, it will start up the trace. You can see in the
script how the tracing is done with the trace stored
procedures. The stored procedures for traces are named
sp_trace_xxx.
There are also a few fn_trace_xxx trace functions you can
use as well. To get information on all traces that are
running, use:
SELECT * FROM :: fn_trace_getinfo(default)
You get the trace id when you execute this which can help
you in using the other functions, stored procedures that use
the trace id for a parameter (such as sp_trace_setstatus).
Hope that helps get you going with it -
-Sue
On Fri, 20 Jan 2006 16:57:31 -0500, Marc Ferguson
<marc@.digitalalias.net> wrote:
>Hey Sue,
>Thanks. I didn't know it had that. I'm playing with it now.. is there
>any way for this to automatically run in the background or do I always
>have to start it up?
>Marc F.
>Sue Hoegemeier wrote:
|||Hi Sue,
Thanks for that info. I was wondering if there are any tutorials or
docs on Profiler? I have "Sams Teach Yourself Microsoft SQL Server 2000
in 21 Days", but it only mentions Profiler. I was playing with it and a
lot of the events and their displayed data don't make sense to me.
I'm trying them one-by-one to see what kind of result they show after I
do something related to our database. As you probably know.. it's
becoming a long process. :D
Marc F.
Sue Hoegemeier wrote:
> Hi Marc,
> Sure...You can run the same thing that Profiler runs using
> server side traces - produces a trc file that you can open
> in profiler or you can view as well as import into a table
> using fn_trace_gettable.
> The quickest way to get started with it is to just use
> Profiler to create the trace you want. Then in Profiler, go
> to File and select Script Trace. This will create the server
> side trace for you - all you would really need to change on
> the script is the destination file for the trace. You can
> execute traces in jobs if you want.
> If you execute the entire script you generated with
> Profiler, it will start up the trace. You can see in the
> script how the tracing is done with the trace stored
> procedures. The stored procedures for traces are named
> sp_trace_xxx.
> There are also a few fn_trace_xxx trace functions you can
> use as well. To get information on all traces that are
> running, use:
> SELECT * FROM :: fn_trace_getinfo(default)
> You get the trace id when you execute this which can help
> you in using the other functions, stored procedures that use
> the trace id for a parameter (such as sp_trace_setstatus).
> Hope that helps get you going with it -
> -Sue
> On Fri, 20 Jan 2006 16:57:31 -0500, Marc Ferguson
> <marc@.digitalalias.net> wrote:
>
>
|||Marc,
SQL Magazine had some good articles on Profiler - you can
try searching for those at: www.sqlmag.com
This web cast may help:
Support WebCast: SQL Server 2000 Profiler: What's New and
How to Effectively Use It
http://support.microsoft.com/default.aspx?id=325263
And you can find Profiler tips at:
http://www.sql-server-performance.co...filer_tips.asp
-Sue
On Mon, 23 Jan 2006 09:49:02 -0500, Marc Ferguson
<marc@.digitalalias.net> wrote:
>Hi Sue,
>Thanks for that info. I was wondering if there are any tutorials or
>docs on Profiler? I have "Sams Teach Yourself Microsoft SQL Server 2000
>in 21 Days", but it only mentions profiler. I was playing with it and a
>lot of the events and their displayed data doesn't make sense to me.
>I'm trying them one-by-one to see what kind of result they show after I
>do something related to our database. As you probably know.. it's
>becoming a long process. :D
>Marc F.
>Sue Hoegemeier wrote:
Losing connections with SQL 2005
We have a VB6 application that I recently converted over from SQL Server 2000
to SQL Server 2005. At numerous customer locations I am now experiencing
occasional "lost connection" problems. This never happened with SQL 2000.
Our client app does not use pooled connections and I have not been able to
find a clean way to recover from the broken connection problem. If I check
the status of the Connection it claims that it is open, it is only if I try
to make a database request that I get the error back. Re-opening the
connection solves the problem.
Is anyone aware of SQL Server 2005 doing anything different with respect to
keeping database connections open?
Thanks,
Intermittent connectivity issues are generally difficult to
troubleshoot. That being said... what do you define as a
"lost connection"? What are the errors? What errors do the
clients receive? What errors are in the logs? Have you run
profiler traces to try to get more information on what may
be going on?
Does the application keep connections open for some reason
after processing a database request?
-Sue
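When chasing intermittent disconnects from the server side, one hedged starting point is to watch which sessions a given host or application actually holds open. This sketch uses the real SQL Server 2000 system table sysprocesses; the program_name filter value is a hypothetical example:

```sql
-- Sketch: inspect current client connections on SQL Server 2000.
-- sysprocesses is a standard system table; the LIKE filter is an example.
SELECT spid,
       status,
       hostname,
       program_name,
       login_time,
       last_batch        -- last time this client issued a request
FROM master..sysprocesses
WHERE program_name LIKE '%MyApp%'   -- hypothetical application name
ORDER BY last_batch
```

Comparing last_batch timestamps against when the client claims the connection is still "open" can show whether the server has already torn the session down.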
On Fri, 9 Feb 2007 13:36:01 -0800, Darrel Miller
<DarrelMiller@.discussions.microsoft.com> wrote:
>We have a VB6 application that I recently converted over from SQL Server 2000
>to SQL Server 2005. At numerous customer locations I am now experiencing
>occasional "lost connection" problems. This never happened with SQL 2000.
>Our client app does not use pooled connections and I have not been able to
>find a clean way to recover from the broken connection problem. If I check
>the status of the Connection it claims that it is open, it is only if I try
>to make a database request that I get the error back. Re-opening the
>connection solves the problem.
>Is anyone aware of SQL Server 2005 doing anything different with respect to
>keeping database connections open?
>Thanks,
|||Hi Sue,
The error number reported is either 3604 or 16389 from Microsoft OLE DB
provider for SQL Server.
The best way I can describe "lost connection" is that you can no longer do
Connection.Execute or pass the connection object to Recordset.Open.
The behaviour is exactly as if you unplugged a network cable and then
replugged it. In fact I have been doing this to try and develop error
handling code to recover from the situation.
> Does the application keep connections open for some reason
> after processing a database request?
As I mentioned our application does not use connection pooling and therefore
we obviously keep the connection open after a database request. Each client
application maintains two connections for the lifetime of the application.
One connection is a read-only connection, the other is a read write. Our
target customer is less than 50 users, so keeping a 100 connection objects
open is not a problem for the server.
My problem is that SQL 2005 seems to have much more fragile database
connections than SQL 2000 did. I was just wondering if there were some SQL
settings that could increase the robustness.
Thanks,
Darrel
"Sue Hoegemeier" wrote:
> Intermittent connectivity issues are generally difficult to
> troubleshoot. That being said...what do you define as a
> "lost connection". What are the errors? What errors do the
> clients receive? What errors are in the logs? Have you run
> profiler traces to try to get more information on what may
> be going on?
> Does the application keep connections open for some reason
> after processing a database request?
> -Sue
> On Fri, 9 Feb 2007 13:36:01 -0800, Darrel Miller
> <DarrelMiller@.discussions.microsoft.com> wrote:
>
>
|||On Tue, 13 Feb 2007 09:04:01 -0800, Darrel Miller
<DarrelMiller@.discussions.microsoft.com> wrote:
>Hi Sue,
>The error number reported is either 3604 or 16389 from Microsoft OLE DB
>provider for SQL Server.
>The best way I can describe "lost connection" is that you can no longer do
>Connection.Execute or pass the connection object to Recordset.Open.
Then that's an application issue, or something to address from the
application end. Those are things you see in the app, not in
SQL Server.
>As I mentioned our application does not use connection pooling and therefore
>we obviously keep the connection open after a database request.
That doesn't necessarily mean you would or should keep two
connections open for the lifetime of the application but
that's a whole different thing.
>My problem is that SQL 2005 seems to have much more fragile database
>connections than SQL 2000 did. I was just wondering if there were some SQL
>settings that could increase the robustness.
Part of the problem in trying to address it from the SQL Server
end is that you can't really quantify it. It's also not
clear what data access changes may have happened with the
application - did you also switch over to ADO.NET,
change MDAC versions, or change drivers/providers? You need to look at
potential network and OS issues as well.
-Sue
|||Darrel, we have exactly this issue, but with a different front end language.
We converted in Nov 06 and immediately our remote clients began having
serious issues just as you describe. We reduced that somewhat by
implementing an ado connection object that "pings" periodically to keep the
connection alive, but in recent weeks the problem has gotten worse again.
I ran into Bill Vaughn at a user group meeting late last year and picked his
brain (or should I say "opened the tap"). The bottom line seemed to be: the
most effective solution is to upgrade the app to dotnet; changing to sql
native client can help somewhat; "pinging" can also help. Upgrading our app
is a really big task, and we haven't done it. We did the ping thing and the
native client thing, but as I said, those things alone are not the solution.
We opened a ticket with MS and spent hours in emails and on the phone with
overseas MS support people on this issue. They really didn't have a clue.
By the way, look at Bill's posting on 3/2/07 in which he says unequivocally
that for a win app, connecting and staying connected is a preferred
practice. Obviously Sue Hoegemeier does not agree.
My view is that MS is either unaware of, in denial about, or unwilling to
acknowledge and correct this issue.
Jeremy
"Darrel Miller" <DarrelMiller@.discussions.microsoft.com> wrote in message
news:46643664-0C8E-4C94-8DE4-4704518EAE8A@.microsoft.com...
> We have a VB6 application that I recently converted over from SQL Server
> 2000
> to SQL Server 2005. At numerous customer locations I am now experiencing
> occasional "lost connection" problems. This never happened with SQL 2000.
> Our client app does not use pooled connections and I have not been able to
> find a clean way to recover from the broken connection problem. If I
> check
> the status of the Connection it claims that it is open, it is only if I
> try
> to make a database request that I get the error back. Re-opening the
> connection solves the problem.
> Is anyone aware of SQL Server 2005 doing anything different with respect
> to
> keeping database connections open?
> Thanks,
>
|||Bill, I do appreciate your detailed and thoughtful response. However, our
application did not experience dropped connections under sql 2k (or they
were so insignificant no one bothered me about it), whereas despite our best
efforts, I see many error reports every day with sql 2k5. Seems quite clear
that the problem is dramatically worse with the new version.
Jeremy
"William (Bill) Vaughn" <billvaRemoveThis@.betav.com> wrote in message
news:OaM5uelbHHA.4808@.TK2MSFTNGP04.phx.gbl...
> Ahem... the ability to stay connected once connected has been a perennial
> issue. I can remember back to the earliest versions of SQL Server when
> connections "aged" out for some unknown reason. No, I don't know the
> actual reason, but symptoms lead me to believe that there is an ack/nak
> protocol in the TDS that basically pings the application at the driver
> level to see if the server or client is still there. This makes sense as
> SQL Server automatically detects if the client falls into the sea or gets
> disconnected for some reason. This polling has to be done on the server as
> it scans the current connections to see if there is activity (work to be
> done) or if there is still a client (albeit inactive) attached. I suspect
> that if the client is busy (running Age of Empires or Windows OneCare) and
> can't respond in time or the event is simply lost (as can happen in
> Windows message-loop-based applications) the server thinks the client has
> dropped off. My solution to this is to keep polling the server (once every
> 30 seconds or so seems to be more than enough) to keep the connection
> alive. Does the Connection pooling mechanism do this on its own? BHOM, but
> I would not be surprised. I have not seen evidence of this in the Profiler
> logs though. I suspect it's done at the TDS provider level. Since we don't
> have access to those APIs, one can constantly close and reopen the
> connection (use the Connection pool) or poll (which is pretty easy and
> faster as it does not require reauthentication, resetting the connection
> and loss of the server state). To poll you can execute any SQL like "USE
> <initial catalog>" or "DECLARE @i INT".
> I'm not sure I said that keeping a connection open is a "preferred"
> method, just a perfectly viable method--assuming you know the costs.
> hth
> --
> ____________________________________
> William (Bill) Vaughn
> Author, Mentor, Consultant
> Microsoft MVP
> INETA Speaker
> www.betav.com/blog/billva
> www.betav.com
> Please reply only to the newsgroup so that others can benefit.
> This posting is provided "AS IS" with no warranties, and confers no
> rights.
> __________________________________
> Visit www.hitchhikerguides.net to get more information on my latest book:
> Hitchhiker's Guide to Visual Studio and SQL Server (7th Edition)
> and Hitchhiker's Guide to SQL Server 2005 Compact Edition (EBook)
> ------
> "Jeremy" <jeremy-nospam@.ninprodata.com> wrote in message
> news:uAzf37ibHHA.2088@.TK2MSFTNGP04.phx.gbl...
>
Ok, there are a lot of "fixes" in SS2K5 that might have had this unintended
side-effect. Have you installed the latest SP2? I think there is an SP2a (or
b).
I would log a bug with MS. They (more or less) monitor this group (but pay
more attention to the MSDN forums).
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
INETA Speaker
www.betav.com/blog/billva
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________
Visit www.hitchhikerguides.net to get more information on my latest book:
Hitchhiker's Guide to Visual Studio and SQL Server (7th Edition)
and Hitchhiker's Guide to SQL Server 2005 Compact Edition (EBook)
------
"Jeremy" <jeremy-nospam@.ninprodata.com> wrote in message
news:Oa3%2321nbHHA.2300@.TK2MSFTNGP06.phx.gbl...
> Bill, I do appreciate your detailed and thoughtful response. However, our
> application did not experience dropped connections under sql 2k (or they
> were so insignificant no one bothered me about it), whereas despite our
> best efforts, I see many error reports every day with sql 2k5. Seems
> quite clear that the problem is dramatically worse with the new version.
> Jeremy
>
> "William (Bill) Vaughn" <billvaRemoveThis@.betav.com> wrote in message
> news:OaM5uelbHHA.4808@.TK2MSFTNGP04.phx.gbl...
>
Losing connections with SQL 2005
Wednesday, March 28, 2012
Loosing transactions
My application is losing some transactions fired from a trigger. The loss happens when CPU usage is very high, but I can't find any log that indicates the lost transactions.
Regards.
This probably is not the best place for this post. Try:
microsoft.public.sqlserver.server
Rand
This posting is provided "as is" with no warranties and confers no rights.
Loosing SQL connection while running ASP
We are running an ASP application that is fairly simple.
It selects from one table (15,000 records) and updates into another table.
But when we execute the ASP script, it reports the following error after
completing between 70% and 80%. This varies with each run:
"Microsoft OLE DB Provider for SQL Server error '80004005'
[DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access
denied. "
The application is running in ths following server environment
Server 1
Fujitsu-Siemens RX300 Server (Dual 2,8 Xeon, 1 Gb Ram, 72 Gb SCSI Raid 1)
Windows 2003 Web Server (IIS)
Server 2
Fujitsu-Siemens RX300 Server (Dual 2,8 Xeon, 1 Gb Ram, 72 Gb SCSI Raid 1)
Windows 2003 Standard Server
Microsoft SQL Server 2000|||Jesper Carstensen (jesper@.swush.com) writes:
> We are running an ASP application. That is fairly simple.
> It selects from one table (15.000 records), and updates into another
> table. But when we execute the ASP script, it reports the following
> error, after completing between 70%-80%. This varies each time it runs
>
> "Microsoft OLE DB Provider for SQL Server error '80004005'
> [DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access
> denied. "
So how exactly do you retrieve the rows and submit the updates? For
retrieval I would recommend client-side cursors.
Is the table you update on the same server as the one you read from?
In such case, the best may be to have all the updating logic in a stored
procedure, unless the logic is too complex for SQL.
--
Erland Sommarskog, SQL Server MVP, sommar@.algonet.se
Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp|||We are seeing that EXACT same error popup randomly on our web server
running Windows 2003 Server. Here's our setup:
Web Server 1 - Windows 2000 Server - Latest SP, MDAC, All Updates, etc.
Web Server 2 - Windows 2003 Server - All updates
SQL Server - Windows 2003 Server, SQL Server 2000 SP3a, All Updates
Now, we used to have both our web servers be 2000 server. We've
recently updated the 2nd one to be 2003 server. Ever since we have
been getting that same error (below) at random times. We seem to think
that it may fail under heavy load.
"Microsoft OLE DB Provider for SQL Server error '80004005'
[DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or
access denied. "
We've done just about everything we've seen suggested here in the
groups and on Microsoft's site without any success. We can't nail down
what is causing this problem. If ANYBODY can help point us in the
right direction we'd greatly appreciate it. What's odd is that I've
seen many posts reporting this same problem without any solutions
given. It seems that maybe there's some kind of issue with 2003 that
hasn't been documented or exposed yet.|||This would be my checklist.. (1 and 2 are probably ruled out given that the
2000 server works with no problem)
one clue you may have to help you is the amount of time the web server takes
to throw that error when you access the page. Is it trying to connect and
then giving up after a few seconds, or is it failing immediately without
attempting to handshake with the server.
1) Check the DB server loading, increase connection timeout if it's heavy for
extended periods.
2) continuously ping the DB server from the web server(s) to see if there
are connection outages/packet loss.
3) Try a different(older) version of MDAC on the web server, fiddle with the
client settings.
4) Try varying the connection type (SQL logon / Windows logon, Named Pipes /
TCP/IP)
5) Run IIS under a different isolation level, fiddle with the application
settings
6) trap the error in the connection code and attempt a re-connect (3 strikes
and you're out).
7) try something else ...
8) get the 2000 server back out
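The "trap the error in the connection code and attempt a re-connect (3 strikes and you're out)" item above can be sketched as a small wrapper. This is a hypothetical Python illustration (the original is classic ASP/ADO); `open_connection` stands in for the real connect call, and `OSError` stands in for the provider's connect failure.

```python
import time

# Hypothetical "3 strikes and you're out" connect wrapper: retry opening
# a connection a fixed number of times with a short pause between
# attempts, then give up and re-raise the final failure.

def connect_with_retry(open_connection, attempts=3, delay=1.0, sleep=time.sleep):
    last_error = None
    for strike in range(attempts):
        try:
            return open_connection()
        except OSError as exc:       # stand-in for the provider's connect error
            last_error = exc
            if strike < attempts - 1:
                sleep(delay)         # brief pause before the next attempt
    raise last_error                 # three strikes: surface the failure
```

The `sleep` parameter is injectable only so the pause can be skipped in tests; callers would normally leave it at the default.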
hope you find a solution to the problem, let us know if you do. :)
Mr Tea
http://mr-tea.blogspot.com
"Don Kitchen" <donkitchen@.gmail.com> wrote in message
news:1106327170.237818.86870@.c13g2000cwb.googlegroups.com...
> We are seeing that EXACT same error popup randomly on our web server
> running Windows 2003 Server. Here's our setup:
> Web Server 1 - Windows 2000 Server - Latest SP, MDAC, All Updates, etc.
> Web Server 2 - Windows 2003 Server - All updates
> SQL Server - Windows 2003 Server, SQL Server 2000 SP3a, All Updates
> Now, we used to have both our web servers be 2000 server. We've
> recently updated the 2nd one to be 2003 server. Ever since we have
> been getting that same error (below) at random times. We seem to think
> that it may fail under heavy load.
> "Microsoft OLE DB Provider for SQL Server error '80004005'
> [DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or
> access denied. "
> We've done just about everything we've seen suggested here in the
> groups and on Microsoft's site without any success. We can't nail down
> what is causing this problem. If ANYBODY can help point us in the
> right direction we'd greatly appreciate it. What's odd is that I've
> seen many posts reporting this same problem without any solutions
> given. It seems that maybe there's some kind of issue with 2003 that
> hasn't been documented or exposed yet.|||"Don Kitchen" <donkitchen@.gmail.com> wrote in message
news:1106327170.237818.86870@.c13g2000cwb.googlegroups.com...
> We are seeing that EXACT same error popup randomly on our web server
> running Windows 2003 Server. Here's our setup:
> Web Server 1 - Windows 2000 Server - Latest SP, MDAC, All Updates,
etc.
> Web Server 2 - Windows 2003 Server - All updates
> SQL Server - Windows 2003 Server, SQL Server 2000 SP3a, All Updates
> Now, we used to have both our web servers be 2000 server. We've
> recently updated the 2nd one to be 2003 server. Ever since we have
> been getting that same error (below) at random times. We seem to
think
> that it may fail under heavy load.
> "Microsoft OLE DB Provider for SQL Server error '80004005'
> [DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or
> access denied. "
> We've done just about everything we've seen suggested here in the
> groups and on Microsoft's site without any success. We can't nail
down
> what is causing this problem. If ANYBODY can help point us in the
> right direction we'd greatly appreciate it. What's odd is that I've
> seen many posts reporting this same problem without any solutions
> given. It seems that maybe there's some kind of issue with 2003
that
> hasn't been documented or exposed yet.
I've seen the same problem and I think it's specifically a memory
problem. It may be a more general "resource" issue correlated with
memory but it is hard to pin down.|||I'm happy to report that we have found the fix to our problems.
Without going into too much detail here, I can say that 2 registry keys
that we created in Windows 2003 fixed the problem.
See these two threads for more info about it:
http://groups-beta.google.com/group...57109e74649c7c8
http://groups-beta.google.com/group...075084961f5cdb7