After Upgrading To SQL 7.0 Database Writes/saves Are Slow...
Apr 18, 2002
I upgraded from 6.5 to 7.0 SP3. Now when I save (write) an invoice it takes about 10-12 seconds; on 6.5 it was 1-3 seconds. SQL Server and my Materials app are the only things running on this box. This is the only area that has gotten slower; everything else works great. I have 3 users saving invoices and about 15 people total using the system at one time. It's a Compaq DL580 loaded with memory, and the database is 2,195 MB in size. I'm using the same 6.5 client to access the system as before. Should I rebuild/reindex the database? Is there something left over from the old 6.5 version I need to remove? Thanks in advance!
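Unless someone advises otherwise, my plan is to update statistics and rebuild the indexes on the invoicing tables now that the upgrade is done, since stale statistics are a common cause of post-upgrade slowdowns, and see whether the save time comes back down. A minimal sketch of what I mean (the table name is just an example, not my real schema):

-- Refresh optimizer statistics for every table in the database (SQL Server 7.0)
EXEC sp_updatestats

-- Rebuild all indexes on one table; 'InvoiceHeader' is a made-up example name
DBCC DBREINDEX ('InvoiceHeader')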
View 1 Replies
Feb 2, 2004
Hi,
I am experiencing SQL write performance problems on a very shiny server: data files on RAID 1+0, log files on a separate drive, all SCSI, Windows Server 2003, 6 GB RAM, 2 Xeon processors. I've created a small benchmarking program and run it on both my desktop PC and this 'big' server. Here are the results:
Desktop: SQL Server inserts: 78 seconds; direct writes to the hard disk (just writing a string to a file 10,000 times): 13 seconds
Server: SQL Server inserts: 422 seconds; direct writes to the hard disk: 16 seconds
So, for some reason, my 'shiny' machine is about 6 times slower on writes than my desktop. When I compared select performance, the shiny server is 10 times faster than my desktop.
Initially I had RAID 5 on the server and it had poorer direct write performance, but now direct writes seem fine, so I reckon this is a problem related to SQL Server.
What can I do to improve the insert performance?
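One variable I still need to rule out is how the benchmark commits: if each insert runs in its own implicit transaction, every row forces a separate log flush, which punishes the server's log drive far more than a plain file-append test does. This is the batched variant I plan to compare against, as a minimal sketch (the table and column are made up for illustration):

-- Insert 10,000 rows inside one explicit transaction so the log is flushed
-- per batch rather than per row. BenchTable/val are hypothetical names.
SET NOCOUNT ON
BEGIN TRAN
DECLARE @i int
SET @i = 0
WHILE @i < 10000
BEGIN
    INSERT INTO BenchTable (val) VALUES ('test string')
    SET @i = @i + 1
END
COMMIT TRAN

If the batched version closes most of the gap, the per-row commits (and perhaps the write-cache settings on the log drive) are the likely culprit rather than SQL Server itself.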
Thanks in advance
View 8 Replies
View Related
Aug 1, 2006
Is it possible to find the number of reads and writes against a SQL Server table?
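If this happens to be SQL Server 2005, one rough sketch is the index usage DMV, which accumulates reads and writes per index since the last service restart ('MyTable' is a placeholder):

-- Reads (seeks + scans + lookups) and writes recorded per index of one table.
-- Counters reset whenever the SQL Server service restarts.
SELECT  OBJECT_NAME(s.object_id) AS table_name,
        i.name AS index_name,
        s.user_seeks + s.user_scans + s.user_lookups AS reads,
        s.user_updates AS writes
FROM    sys.dm_db_index_usage_stats AS s
JOIN    sys.indexes AS i
        ON i.object_id = s.object_id AND i.index_id = s.index_id
WHERE   s.database_id = DB_ID()
  AND   s.object_id = OBJECT_ID('MyTable')

On SQL Server 2000 there is no equivalent DMV, so a Profiler trace filtered on the table would be the fallback.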
View 2 Replies
View Related
Jun 24, 2015
How can I pass my .NET application's user ID to a trigger? I have an audit trail trigger on myTable, and I don't know how to pass the user ID (not the SQL Server login) to the trigger when a user deletes a record from the .NET application.
The trigger saves the modifications to the table, including the user ID of whoever made the change.
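One approach I've seen suggested, and would like feedback on, is to have the .NET application stamp the connection with CONTEXT_INFO just before issuing the DELETE, and have the trigger read it back. A rough sketch with made-up table and column names:

-- 1) The application runs this on its connection right before the DELETE:
DECLARE @ctx varbinary(128)
SET @ctx = CAST('appuser42' AS varbinary(128))   -- 'appuser42' is a placeholder app user id
SET CONTEXT_INFO @ctx

-- 2) The audit trigger reads the value back (SQL Server 2005 or later):
CREATE TRIGGER trg_myTable_delete ON myTable
AFTER DELETE
AS
BEGIN
    INSERT INTO myTable_Audit (KeyValue, DeletedBy, DeletedAt)
    SELECT d.KeyValue,
           CAST(CONTEXT_INFO() AS varchar(128)),  -- may need trimming of trailing 0x00 padding
           GETDATE()
    FROM deleted AS d
END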
View 4 Replies
View Related
Oct 23, 2007
Can we upgrade a SQL CE database whose password has been forgotten to a SQL Server Mobile database on our desktop?
Thanks & Regards
Mukesh Gupta
View 1 Replies
View Related
Jan 6, 2005
Hey,
Does anyone know how I would go about upgrading a database version to 'C.0.6.43'? I am getting an error and it is telling me to upgrade my database to this version.
Thanks
View 2 Replies
View Related
Jan 9, 2008
This program gets the values of A and B passed in; they are for the table columns DXID and CODE. The textbox GET1 is initialized to B when the page is loaded. When I type another value into GET1 and try to save it, the original initialized value gets saved instead of the new value I just typed. A literal value, like "222", saves fine, but the new GET1.TEXT doesn't.
View 1 Replies
View Related
May 7, 2008
If I am upgrading SQL Server 2005 from 32-bit to 64-bit (only the DB server), do the applications that communicate with the database need to be upgraded to 64-bit as well?
Thanks,
View 1 Replies
View Related
Mar 5, 2008
Hi experts,
I have a postcode database that I need to update. The database consists of 6 tables, and the file I have contains all the information at once, so I have to organize it and insert records into the appropriate tables.
This is the first time I'm doing this, so I would like to know the best way to do it. Do I need to create a stored procedure or a script, or maybe something special and efficient that I don't know about yet?
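Here is roughly the approach I had in mind, in case it helps frame the question: load the raw file into one staging table first, then split it into the six real tables with INSERT ... SELECT. All object names and the path below are invented:

-- 1) Load the raw file into a staging table (assumes a comma-delimited file).
CREATE TABLE PostcodeStaging (
    Postcode varchar(10),
    Street   varchar(100),
    Town     varchar(100),
    County   varchar(100)
)

BULK INSERT PostcodeStaging
FROM 'C:\import\postcodes.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

-- 2) Populate each real table from the staging data, e.g. a distinct list of towns.
INSERT INTO Town (TownName)
SELECT DISTINCT Town
FROM PostcodeStaging
WHERE Town NOT IN (SELECT TownName FROM Town)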
Any advice will be very much appreciated.
Thanks in advance
View 1 Replies
View Related
Apr 27, 2007
I have a WinXP Pro SP2 system on which I have installed SQL Express in one case and SQL Express SP1 in another. When trying to upgrade either to the next service pack, I get a "Setup Failed" on "SQL Server Database Services". When I look at the log file, the following is the error:
...
MSI (s) (F8!28) [14:43:23:980]: PROPERTY CHANGE: Adding EditionTypechecksum.CC1A8C58_27D1_4D38_BF1B_C0A5CBB90616 property. Its value is 'Express Edition with Advanced Services'.
<Func Name='GetSkuIt'>
GetServiceUserGroup failed for SQLEXPRESS, 5
Error Code: 0x80070534 (1332)
Windows Error Text: No mapping between account names and security IDs was done.
Source File Name: sqlcasqlcax.cpp
Compiler Timestamp: Fri Feb 9 22:35:05 2007
Function Name: SetInstanceProperty
Source Line Number: 1223
Error Code: 1332
MSI (s) (F8!28) [14:43:27:235]: Product: Microsoft SQL Server 2005 -- Error 29528. The setup has encountered an unexpected error while Setting Internal Properties. The error is: Fatal error during installation.
Error 29528. The setup has encountered an unexpected error while Setting Internal Properties. The error is: Fatal error during installation.
<EndFunc Name='LaunchFunction' Return='1332' GetLastError='203'>...
I was hoping someone has run into this before and can direct me to the solution.
Thank you.
KSLarson
View 3 Replies
View Related
Mar 8, 2007
hi,
I've modified some of the tables in the existing version (ver1.3) of my database and named the new version ver1.5.
Now that I've detached the ver1.3 database and am attaching ver1.5 to the server, it is picking up the .ldf file from the ver1.3 folder. (I'm maintaining each version in a separate folder.)
I want to create an independent .ldf file for ver1.5 as well; I don't want to depend on the old version's .ldf file.
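If this is SQL Server 2005, one option I'm considering is attaching only the .mdf and letting the server build a brand-new log for ver1.5 instead of reusing the old one. A rough sketch (database name and path are made up):

-- Attach just the data file; FOR ATTACH_REBUILD_LOG creates a fresh .ldf next to it.
-- The database must have been cleanly detached for this to work.
CREATE DATABASE MyDb_v15
ON (FILENAME = 'D:\Databases\ver1.5\MyDb_v15.mdf')
FOR ATTACH_REBUILD_LOG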
Can anyone help me with this issue?
dpindra@yahoo.com
View 1 Replies
View Related
Jan 21, 2004
Okay, here is the problem. I hope somebody can help me with it.
After a lot of discussion, I finally got the people who control the MS SQL Server to allow me to upgrade my Access database to the MS SQL Server.
After this I created an Access project. While developing in Access I concluded that I had made a mistake in the RowSource property of one of the table fields. The people who control the MS SQL Server don't allow me to make changes to the database structure from within Access, so I have to write a SQL statement to change this.
example:
fields table1: id_document, id_document_type, document_name
fields table2: id_document_type, document_type_description
The field id_document_type from table1 is defined as a combo/list box that contains the values from the field document_type_description in table2.
I hope somebody can tell me whether it is possible to write an SQL statement for this, and what it should look like.
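For the lookup itself, I was assuming the RowSource could simply be a query against table2, something along these lines (using the field names above), but I'd like confirmation that this is the right direction:

-- Candidate RowSource query for the id_document_type combo box:
-- bound column id_document_type, displayed column document_type_description.
SELECT id_document_type, document_type_description
FROM table2
ORDER BY document_type_description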
thank you
View 1 Replies
View Related
Apr 17, 2007
I have one 32-bit SQL 2000 server that is our enterprise-wide reporting server (we'll call it RS) and another 32-bit SQL 2000 server that serves as a stored proc data source for certain reports on that server (we'll call it DS). I am about to go through an upgrade/migration of DS to 64-bit SQL 2005 and was wondering if:
1. Is it possible to just change the data source location on RS and point the old reports at the new DS server?
2. Are there any necessary steps to take within RS to make the data source (the DS reference on RS) compatible with the destination (the report on RS)?
3. Is it possible/easy to bulk-migrate all the RDLs for a particular data source from RS to DS?
Does anyone know any of the answers to these questions?
Thanks in advance!
View 1 Replies
View Related
May 29, 2007
My question is two fold:
We have a database that is 65 GB in size and has grown over 12 years.
1) How can I upgrade to 2005 without downtime?
2) Why do our upgrades on SQL 2000 now take upwards of 10 hours just to add a column and rebuild the table's indexes?
Is there any way we can speed this up without detaching the database and going offline?
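One detail I'd like confirmed: as far as I understand, adding the column as NULL with no default is a metadata-only change that doesn't rewrite the rows, so a statement like the sketch below should be quick even on a 65 GB table; it's the index rebuilds and any NOT NULL/DEFAULT backfills that burn the hours (table and column names are invented):

-- Metadata-only change: no data pages are touched when the new column is
-- nullable and has no default. 'BigTable' and 'NewFlag' are example names.
ALTER TABLE BigTable ADD NewFlag int NULL

-- Backfilling it afterwards is what touches every row and takes the time:
-- UPDATE BigTable SET NewFlag = 0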
thanks,
Larry Sitka
View 6 Replies
View Related
Aug 3, 2015
I just upgraded from Windows 7 to Windows 10, and now I can't connect to SQL Server 2008 R2 nor 2014 Express. The logs on both say it can't open master.mdf because it was originally formatted with a sector size 4096 but now it's 3072. But no sector sizes were changed. The only thing that changed was going from Windows 7 (64-bit) to Windows 10 (64-bit). How can this be resolved?
View 10 Replies
View Related
Feb 11, 2015
Does copying mssqlsystemresource.mdf from a recently upgraded server to an older server have the same effect as upgrading via the .exe installation?
My idea is to save time and administrative effort on upgrades (Service Packs and/or Cumulative Updates) by using this method.
According to BOL:
The Resource database makes upgrading to a new version of SQL Server an easier and faster procedure. In earlier versions of SQL Server, upgrading required dropping and creating system objects. Because the Resource database file contains all system objects, an upgrade is now accomplished simply by copying the single Resource database file to the local server.
View 3 Replies
View Related
Sep 26, 2005
I have an application that is inserting thousands of records hourly. The server's hard drives are staying maxed out. My boss says there is an index problem; I say it is a drive subsystem issue.
Any help would be appreciated to understand this performance problem.
View 2 Replies
View Related
Oct 30, 2006
How can you find the reads and writes per second of your hard drives in SQL? I am reading my SQL book and it says that the average disk should handle 125 or fewer I/Os per second. It gave the formula, but as mentioned, I don't know how to find the reads and writes.
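If you are on SQL Server 2005, one rough sketch is the file-level I/O DMV below: sample it twice and divide the change in the counters by the elapsed seconds to get reads and writes per second per database file. Outside SQL, the PhysicalDisk counters "Disk Reads/sec" and "Disk Writes/sec" in Performance Monitor give the same numbers per physical drive.

-- Cumulative reads and writes per database file since the service started.
-- Take two samples and divide the difference by the elapsed seconds.
SELECT  DB_NAME(vfs.database_id) AS database_name,
        mf.physical_name,
        vfs.num_of_reads,
        vfs.num_of_writes
FROM    sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN    sys.master_files AS mf
        ON mf.database_id = vfs.database_id AND mf.file_id = vfs.file_id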
View 4 Replies
View Related
Mar 25, 2008
Hi!
First of all, I want to say that I'm not a DBA or tuning expert, but I've run a trace on a database with performance problems and found a strange thing.
The users create orders for the service people in the organisation. I can see in the trace that the inserts complete, but they don't produce any writes right away; after 10-15 minutes, however, all the writes are done. What could delay the actual write so much? The application is developed using .NET.
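My working theory, which I'd like to confirm, is that the inserts are hardened to the transaction log immediately and the data pages are only flushed later by a checkpoint or the lazy writer, which would explain the 10-15 minute gap before data-file writes appear. A small experiment I was planning to run (assumes SQL Server 2005; the table name is made up):

-- Snapshot data-file writes, insert a row, force a checkpoint, compare again.
SELECT num_of_writes FROM sys.dm_io_virtual_file_stats(DB_ID(), 1)   -- file_id 1 = primary data file

INSERT INTO Orders_Test (OrderText) VALUES ('test row')              -- hypothetical table

CHECKPOINT   -- forces dirty data pages to disk immediately

SELECT num_of_writes FROM sys.dm_io_virtual_file_stats(DB_ID(), 1)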
/Magnus
Jesus saves. But Gretzky slaps in the rebound.
View 8 Replies
View Related
Jul 21, 2000
Is there a way to get a total count of all SELECT, UPDATE, DELETE and INSERT statements against a SQL Server 6.5 database during a 12-hour period? I'm thinking maybe someone knows of software that reads the log or monitors the server... I've been looking at Performance Monitor and, although it has good information, it doesn't capture DML statements.
FYI - it's for capacity planning.
TIA,
Mike
View 1 Replies
View Related
Mar 5, 2008
Guys,
Is there any way to track which tables have the most reads and writes in a database of 400 tables?
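Assuming SQL Server 2005, the closest I've got so far is ranking every table by the activity in the index usage DMV; a rough sketch (counters reset on service restart, so it needs to have been running a while):

-- Top 20 user tables by reads and writes recorded since the last service restart.
SELECT TOP 20
        OBJECT_NAME(s.object_id) AS table_name,
        SUM(s.user_seeks + s.user_scans + s.user_lookups) AS total_reads,
        SUM(s.user_updates) AS total_writes
FROM    sys.dm_db_index_usage_stats AS s
WHERE   s.database_id = DB_ID()
  AND   OBJECTPROPERTY(s.object_id, 'IsUserTable') = 1
GROUP BY s.object_id
ORDER BY SUM(s.user_seeks + s.user_scans + s.user_lookups + s.user_updates) DESC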
Thanks
View 9 Replies
View Related
Feb 18, 2004
Hi everyone! I'm new to this forum and I suspect I'll be using this forum frequently. Good stuff.
Although this question may appear to be Web-related, I think the problem is with what I'm doing with the database. Please read on.
I'm trying to implement a page tracking solution using ASP and SQL 2000. It basically writes a new record to a table every time a user visits a page on the site. It appeared to work fine at first, but I've increasingly been getting timeout errors on my pages -- all pointing to the include file that fires the database write.
Here's the code that's referenced on every page:
Set Conn = Server.CreateObject("ADODB.Connection")
Conn.Open "dsn=x;uid=y;pwd=z;"
Set objRecordset1 = Server.CreateObject("ADODB.Recordset")
objRecordset1.Open "SELECT * FROM table", Conn, 1, 2
objRecordset1.AddNew
objRecordset1.Fields("PAGE") = Left(Request.ServerVariables("SCRIPT_NAME"), 100)
objRecordset1.Fields("QUERY_STRING") = Left(Request.ServerVariables("QUERY_STRING"), 100)
objRecordset1.Fields("DATE") = Date()
objRecordset1.Fields("TIME") = Time()
objRecordset1.Fields("PLATFORM") = Left(Request.ServerVariables("HTTP_USER_AGENT"), 100)
objRecordset1.Fields("REFERRER") = Left(Request.ServerVariables("HTTP_REFERER"), 100)
objRecordset1.Fields("USER_IP") = Left(Request.ServerVariables("REMOTE_ADDR"), 20)
If Request.Cookies("TEST")("ID") <> "" Then
    objRecordset1.Fields("VISITOR_ID") = Request.Cookies("TEST")("ID")
End If
objRecordset1.Update
Conn.Close
Set Conn = Nothing
%>
After taking out the reference to the above code, everything speeds back up, so I know the performance hit and timeout issues have to do with the code above.
Is it the simultaneous writes to the table, the constant opening and closing of the recordset, the cursor type, the lock type, or a combination of things?
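One alternative I've been wondering about is skipping the recordset entirely and issuing a plain INSERT through the connection (e.g. with Conn.Execute), so each page view never opens a "SELECT * FROM table" cursor just to add one row. Roughly this statement, with placeholder values standing in for the server variables:

-- Same table and columns as above; the literal values are just examples of
-- what the page would substitute in at runtime.
INSERT INTO [table] (PAGE, QUERY_STRING, [DATE], [TIME], PLATFORM, REFERRER, USER_IP, VISITOR_ID)
VALUES ('/default.asp', 'id=1', GETDATE(), GETDATE(), 'Mozilla/4.0', '/home.asp', '10.0.0.1', '12345')

Is that likely to help, or is the bottleneck elsewhere?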
HELP!! Thanks!
David
View 3 Replies
View Related
Apr 17, 2008
Problem statement:
Let's say user A accesses a record and is making an update to a column. Next, user B accesses the same record, updates the same column, and saves the data. How can user A check whether an update has been made, so as not to overwrite user B's data?
Is there a query statement that user A can write to check for this?
I understand locking can be used to prevent this, but is there an alternative to locking?
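One pattern I've read about, and would like opinions on, is optimistic concurrency using a rowversion column: read the version along with the row, then make the UPDATE conditional on it, so if someone else saved in the meantime zero rows are affected and you know to refresh. A sketch with invented table and column names:

-- The table carries a rowversion column, e.g.: ALTER TABLE Widget ADD RowVer rowversion
-- User A reads the row and keeps @origVer alongside the data, then saves like this:
UPDATE Widget
SET    Price = @newPrice
WHERE  WidgetId = @id
  AND  RowVer = @origVer      -- only succeeds if nobody changed the row since the read

IF @@ROWCOUNT = 0
    PRINT 'Row was changed by someone else - reload before saving.'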
View 5 Replies
View Related
Nov 25, 2007
Ok, here is my situation.....
When someone navigates to a user's profile page on my site, I present them with a slideshow of the user's photos using the AJAX slideshow extender. I obtain the querystring value in the URL (to determine which user's page I'm on) and feed that into a webservice via a context value, where an array of photos is created for the slideshow. Now, in order to set the array's size, I do a COUNT of all of that specific user's photos. Then I run another SQL statement to obtain the paths of those photos in the file system. However, between the time the first SQL query executes (the COUNT statement) and the time of the second SQL query (getting the paths of the photos), the owner of that profile may upload or delete a photo. I understand this would be a very rare occurrence since SQL statements 1 and 2 will execute within milliseconds of each other, but it is still possible, I suppose. When this happens and I try to populate the array, the array will be either too small or too large. I'm using SqlDataReader for this as it seems to be less memory- and resource-intensive than datasets, but I could be wrong since I'm a relative beginner. This is what I have in my VB file for the webservice:

Public Function GetSlides(ByVal contextKey As String) As AjaxControlToolkit.Slide()
    Dim dbConnection As New SqlConnection("string for the data source, etc.")
    Try
        dbConnection.Open()
        Dim memberId = CInt(contextKey)
        Dim photoCountLookupCmd As New SqlCommand _
            ("SELECT COUNT(*) FROM Photo WHERE memberId = " & memberId, dbConnection)
        Dim thisReader As SqlDataReader = photoCountLookupCmd.ExecuteReader()
        Dim photoCount As Integer
        While (thisReader.Read())
            photoCount = thisReader.GetInt32(0)
        End While
        thisReader.Close()
        Dim MySlides(photoCount - 1) As AjaxControlToolkit.Slide
        Dim photoLookupCmd As New SqlCommand _
            ("SELECT fullPath FROM Photo WHERE memberId = " & memberId, dbConnection)
        thisReader = photoLookupCmd.ExecuteReader()
        Dim i As Integer
        For i = 0 To 2
            thisReader.Read()
            Dim photoUrl As String = thisReader.GetString(0)
            MySlides(i) = New AjaxControlToolkit.Slide(photoUrl, "", "")
        Next i
        thisReader.Close()
        Return MySlides
    Catch ex As SqlException
    Finally
        dbConnection.Close()
    End Try
End Function

I'm trying to use the most efficient method to interact with the database since I don't have unlimited hardware and there may be moderate traffic on the site. Is SqlDataReader the way to go, or should I use something else? If I do use SqlDataReader, can someone show me how to run those two SQL statements following best practice? Would I have to somehow lock writes to that table when I start the first SQL statement, then release the lock after I execute the second statement? What's the best practice in this kind of scenario?
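For the consistency part specifically, one idea I've been reading about is to run both statements on the same connection inside one transaction at the SERIALIZABLE isolation level, so the set of photos for that member cannot change between the COUNT and the path lookup. A minimal T-SQL sketch (the memberId value is just an example):

SET TRANSACTION ISOLATION LEVEL SERIALIZABLE
BEGIN TRAN
    -- Range locks keep the photo set for this member stable until COMMIT,
    -- so keep the transaction as short as possible.
    SELECT COUNT(*) FROM Photo WHERE memberId = 42
    SELECT fullPath FROM Photo WHERE memberId = 42
COMMIT TRAN

The simpler route might be to drop the COUNT entirely, read the paths into a list, and size the slide array from the list afterwards, but I'm not sure which is considered better practice.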
Thanks in advance.
View 3 Replies
View Related
Nov 5, 2015
How can I measure the disk reads and writes to see if I need to add additional disks to the server?
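A rough starting point I'd use is the file-level I/O DMV, which gives cumulative reads, writes, and stall (wait) time per database file; the average stall per read or write is usually the more telling number when deciding whether the disks are keeping up:

-- Cumulative I/O and average latency per database file since the service started.
SELECT  DB_NAME(vfs.database_id) AS database_name,
        mf.physical_name,
        vfs.num_of_reads,
        vfs.num_of_writes,
        vfs.io_stall_read_ms  / NULLIF(vfs.num_of_reads, 0)  AS avg_read_latency_ms,
        vfs.io_stall_write_ms / NULLIF(vfs.num_of_writes, 0) AS avg_write_latency_ms
FROM    sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN    sys.master_files AS mf
        ON mf.database_id = vfs.database_id AND mf.file_id = vfs.file_id
ORDER BY avg_write_latency_ms DESC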
View 2 Replies
View Related
Jul 23, 2005
Hello, we are investigating the use of SQL Server as a backend to our scientific imaging application. We have found that when we write a large image (60 megabytes) the performance is quite a bit slower than writing 60 single-megabyte images. The tests were performed running SQL Server 2000 on Windows 2003 Enterprise on a single machine to eliminate the network's contribution. Perhaps there is a configuration option that will allow us to tune SQL Server to better handle large writes? TIA
View 1 Replies
View Related
Oct 25, 2006
Hi everyone,
Every time my application runs a .DTSX file, something writes an entry to the Application event log (whether the run failed or succeeded), and I don't know what is writing it.
The Execute method is passed a customized class that implements IDTEVENTS, but I promise that nowhere in my code am I writing that information.
app.execute(nothing,... MYEVENTS)
Public Class MYEVENTS
Implements IDTEVENTS
..
..
..
..
All the methods are declared but left empty, except OnQueryCancel, which is customized.
How can I disable this behaviour?
I'm concerned because, once this goes live, we could be launching 300 or 400 packages on a daily basis!
Thanks in advance and regards,
View 3 Replies
View Related
Jul 6, 2015
We are in the process of moving existing clustered SQL Server databases to AWS. There is one major database with intensive read and write transactions. I'm wondering what the best design is to optimize performance for both reads and writes, since we have historically had constant issues in the current environment when massive updates are happening. Reads should have higher priority than writes.
View 2 Replies
View Related
Jul 17, 2015
I have inherited a database that is over-indexed, i.e. there are sometimes 10-20 indexes on a table. The performance is at times not great due to blocking from long running queries. I want to clean up the indexes as a starting point.
Through a query I found some time ago on the SQLCat blog I have discovered a large number of indexes in the database that have a huge disparity between reads and writes. The range of difference is sometimes almost 2 million more writes than reads. Should I just drop the indexes that have say, more than 100,000 more writes than reads and then see what the Missing Index DMVs tell me after a few days of running without those indexes?
In some cases there are a few hundred thousand reads but maybe a million writes on the index. Thus, there are a fair number of reads happening, just not in comparison to the number of writes. In some cases there are almost no reads and a million or more writes. I am obviously dropping those indexes. I just am not sure what to do about the indexes that do have a fair number of reads.
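For reference, the query I'm working from looks roughly like this; it's not the exact SQLCat version, just a sketch of the same idea against the index usage DMV:

-- Nonclustered indexes whose maintenance cost (writes) dwarfs their benefit (reads).
SELECT  OBJECT_NAME(s.object_id) AS table_name,
        i.name AS index_name,
        s.user_seeks + s.user_scans + s.user_lookups AS reads,
        s.user_updates AS writes,
        s.user_updates - (s.user_seeks + s.user_scans + s.user_lookups) AS writes_minus_reads
FROM    sys.dm_db_index_usage_stats AS s
JOIN    sys.indexes AS i
        ON i.object_id = s.object_id AND i.index_id = s.index_id
WHERE   s.database_id = DB_ID()
  AND   i.type_desc = 'NONCLUSTERED'      -- only drop candidates
ORDER BY writes_minus_reads DESC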
View 9 Replies
View Related
Oct 18, 2007
I am looking into various options to improve the latency of our application (we figured the latency mainly comes from data persistence - writes to and reads from the DB). I am also looking into in-memory databases. But before making that decision, I would like to see whether there is a way to configure SQL Server 2005 to get performance close to an in-memory database.
My question:
1. Is there a way to configure SQL Server 2005 to use a cache that gets loaded on an as-needed basis, so that future database reads/writes hit the cache instead of disk?
2. Is SQL Server 2005 recoverable in such configurations?
3. Are there any resources where I can get more details? (Such as sample configurations with benchmark numbers, previous experiences, etc.)
Thanks
Murthy
View 1 Replies
View Related
Jul 4, 2006
I need to write back to a legacy system in the form of a flat file: the first row would be a header and the remaining rows would be the actual rows of data, with a column delimiter of ',' and a row delimiter of CRLF.
The source is a SQL Server 2005 table.
I'm looking for a good example of a script task in the data flow that writes to a file.
Can anyone show me the code how to do this or point me to a link.
thanks in advance
Dave
View 10 Replies
View Related
Aug 22, 2007
I am using an error handler that was provided to me by another source. However, I notice that something in the code writes the error message twice. I tried to discover what it was but could not pinpoint it. Here's an example of what my email messages look like:
Is activity file current?
The Script returned a failure result.
The extracts in D:myFolder are not current! Data NOT loaded.
Is activity file current?
The Script returned a failure result.
The extracts in D:myFolder are not current! Data NOT loaded.
Obviously, I just want my email to read:
Is activity file current?
The Script returned a failure result.
The extracts in D:myFolder are not current! Data NOT loaded.
Somewhere, errorMessages is being written to more than once; I need help finding where.
Thanks!
Here is the code from my Event Handlers:
OnError event:
Public Sub Main()
Dim messages As Collections.ArrayList
Try
messages = CType(Dts.Variables("errorMessages").Value, Collections.ArrayList)
Catch ex As Exception
messages = New Collections.ArrayList()
End Try
messages.Add(Dts.Variables("SourceName").Value.ToString())
messages.Add(Dts.Variables("ErrorDescription").Value.ToString())
messages.Add(Dts.Variables("scriptError").Value.ToString())
Dts.Variables("errorMessages").Value = messages
Dts.TaskResult = Dts.Results.Success
End Sub
On PostExecute:
Public Sub Main()
Dim errorDesc As String
Dim messages As Collections.ArrayList
Try
messages = CType(Dts.Variables("errorMessages").Value, Collections.ArrayList)
Catch ex As Exception
Return
End Try
For Each errorDesc In messages
Dts.Variables("emailText").Value = Dts.Variables("emailText").Value.ToString + errorDesc + vbCrLf
Next
Dts.TaskResult = Dts.Results.Success
End Sub
View 4 Replies
View Related
Aug 23, 2000
I have a database that's 2.5 GB but only has about 17 MB of actual data. I've set up a standby server that I load my dumps into. The load takes about 10 minutes. The dump takes about a minute and a half (which also seems slow to me for that small amount of data). I wouldn't expect it to take that long to load 8,800 pages into a database. The standby server is the same hardware as the production server (single 500 MHz Xeon, 2 GB RAM, RAID 5). The server has only a single RAID 5 array storing the OS and all the SQL data; however, I still don't think it should take that long to load. Let me know what you think.
--Buddy
View 1 Replies
View Related