OLE DB Command Is Too Slow
Jun 15, 2007
Hello,
I'm using an OLE DB Command in a Data Flow which performs a parameterized query to update thousands of rows, but it is very slow.
Is there an alternative?
Hi,
I have an application connected to an MS Access DB using OleDb. When creating commands (Insert/Update/Select) I use the OleDbParameter class to pass data into the command. Examples:
Select:
OleDbCommand select_cmd = new OleDbCommand("SELECT * FROM " + ObjectTable.TableName + " WHERE " +
ObjectTable.idObject + "=@" + ObjectTable.idObject + " AND " +
ObjectTable.idObjectUnder + "=@" + ObjectTable.idObjectUnder);
Update:
OleDbCommand update_cmd = new OleDbCommand("Update " + ObjectTable.TableName + " SET " +
ObjectTable.idParent + "=@" + ObjectTable.idParent + " , " +
ObjectTable.idParentUnder + "=@" + ObjectTable.idParentUnder + " , " +
ObjectTable.License + "=@" + ObjectTable.License + " , " +
ObjectTable.Type + "=@" + ObjectTable.Type + " ," +
ObjectTable.Language + "=@" + ObjectTable.Language + " , " +
ObjectTable.Name + "=@" + ObjectTable.Name + " , " +
ObjectTable.Checksum + "=@" + ObjectTable.Checksum + " , " +
ObjectTable.VText + "=@" + ObjectTable.VText + " , " +
ObjectTable.VInt + "=@" + ObjectTable.VInt + " WHERE " +
ObjectTable.idObject + "=@" + ObjectTable.idObject + " AND " +
ObjectTable.idObjectUnder + "=@" + ObjectTable.idObjectUnder);
Parameters (added in a separate method: AddParameters(OleDbCommand command)):
command.Parameters.Add("@" + ObjectTable.idObject, OleDbType.BigInt).Value = this.IDUpper;
command.Parameters.Add("@" + ObjectTable.idObjectUnder, OleDbType.BigInt).Value = this.IDUnder;
command.Parameters.Add("@" + ObjectTable.Name, OleDbType.VarChar).Value = this.Name;
command.Parameters.Add("@" + ObjectTable.idParent, OleDbType.BigInt).Value = GetUpper(this.IDParent);
command.Parameters.Add("@" + ObjectTable.idParentUnder, OleDbType.BigInt).Value = GetUnder(this.IDParent);
command.Parameters.Add("@" + ObjectTable.License, OleDbType.BigInt).Value = this.License;
command.Parameters.Add("@" + ObjectTable.Language, OleDbType.BigInt).Value = this.Language;
command.Parameters.Add("@" + ObjectTable.Type, OleDbType.BigInt).Value = (int)this.Type;
command.Parameters.Add("@" + ObjectTable.VText, OleDbType.VarChar).Value = String.IsNullOrEmpty(this.VText) ? null : this.VText;
command.Parameters.Add("@" + ObjectTable.VInt, OleDbType.BigInt).Value = this.VInt;
command.Parameters.Add("@" + ObjectTable.Checksum, OleDbType.BigInt).Value = this.Checksum;
Question: Does the order of adding parameters to the command matter? Because always when the order of the parameters added is different from their order in the command text, I get weird exceptions. I thought that the name matters, not the order, but it seems that the system doesn't care about the parameter's name; it just picks the next parameter in command.Parameters when binding values. How is it?
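A note on the question above: the OLE DB .NET provider (including the Jet provider used for Access) binds parameters by position, not by name, so the "@name" tokens in the command text are only placeholders and values are matched to markers strictly in the order they were added. Below is a minimal sketch of the positional style, reusing the hypothetical ObjectTable fields from the question (not tested against the real schema):
// The Jet/ACE OLE DB provider ignores parameter names; use '?' markers and add
// the parameters in exactly the order the markers appear in the command text.
OleDbCommand select_cmd = new OleDbCommand(
    "SELECT * FROM " + ObjectTable.TableName +
    " WHERE " + ObjectTable.idObject + " = ?" +
    " AND " + ObjectTable.idObjectUnder + " = ?");
// Order must match the two '?' markers: idObject first, idObjectUnder second.
select_cmd.Parameters.Add("@" + ObjectTable.idObject, OleDbType.BigInt).Value = this.IDUpper;
select_cmd.Parameters.Add("@" + ObjectTable.idObjectUnder, OleDbType.BigInt).Value = this.IDUnder;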
Is there a way to rename the parameters Param_0, Param_1 in the OLE DB Command transformation? I am trying to create table-driven packages using BIML. I am using the OLE DB Command transformation to update rows. But since I will not be sure of how many parameters there are or their order, I was planning to rename the parameters programmatically, so that I can build the update statement and add the filter condition accordingly.
What does Invalid object name "MyTable" mean?
I checked the table and column names. They are both correct.
Could the error mean something else, e.g. a wrong data type, or no data?
TIA, Jeffrey
This is the error source file:
Dim strConn As String = ConfigurationManager.ConnectionStrings("MyConnectString").ConnectionString
Dim oConn As New OleDbConnection(strConn)
Dim oDBCommand As New OleDbCommand("MyStoredProceduret", oConn)
oDBCommand.CommandType = CommandType.StoredProcedure
oDBCommand.Connection.Open()
Dim rtnValue As String = oDBCommand.ExecuteScalar()
This is the SP:
CREATE PROCEDURE MyStoredProcedure AS
SELECT SettingsReqSchdTimeout FROM Settings
GO
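One way to narrow down an "Invalid object name" error is to check which database the connection actually opened and whether the object the procedure references is visible there, since the usual cause is a connection whose default catalog or schema is not the one that owns the table. Here is a small diagnostic sketch (in C#, reusing the hypothetical "MyConnectString" name; the object name "dbo.Settings" is taken from the procedure above):
// Prints the database the OLE DB connection landed in and whether dbo.Settings
// resolves there; a blank/DBNull object id means the name is not visible from
// this connection (wrong database, wrong schema, or missing object).
using System;
using System.Configuration;
using System.Data.OleDb;

class WhereAmI
{
    static void Main()
    {
        string connStr = ConfigurationManager.ConnectionStrings["MyConnectString"].ConnectionString;
        using (OleDbConnection conn = new OleDbConnection(connStr))
        using (OleDbCommand cmd = new OleDbCommand("SELECT DB_NAME(), OBJECT_ID('dbo.Settings')", conn))
        {
            conn.Open();
            using (OleDbDataReader rdr = cmd.ExecuteReader())
            {
                if (rdr.Read())
                {
                    Console.WriteLine("Connected to database: {0}", rdr.GetValue(0));
                    Console.WriteLine("dbo.Settings object id: {0}", rdr.GetValue(1));
                }
            }
        }
    }
}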
I am writing a data access web page, but I find that the execution speed is too slow.
My database is just a data table which has five columns: id, code, quantity, price and Date. The database has about 45000 rows. When I use OSQL or the Query function, the speed is just fine.
Here is the main code which I think causes the slowness:
string conn = ConfigurationSettings.AppSettings["connectionstring"];
SqlDataAdapter adapter_2 = new SqlDataAdapter("select * from YahooOrders", conn);
DataSet ds = new DataSet();
adapter_2.Fill(ds, "YahooOrders");
DataTable YahooOrders = ds.Tables["YahooOrders"];
DataRow[] product = new DataRow[20000];
.......
foreach (string s in split) // actually the split here has only one string in it
{
    product = YahooOrders.Select("code like '" + s + "%' and Date >= '" + minDate + "' and Date <= '" + YahooOrders.Select("Date = Max(Date)")[0][1].ToString() + "'");
    foreach (DataRow myRow in product)
    {
        int count = Convert.ToInt32(myRow[2]);
        itemQuantity = count + itemQuantity;
        revenue = Convert.ToDouble(myRow[3]) * count + revenue;
        // get product code, ignore repeated code
        int myIndex = code.BinarySearch(myRow[1]);
        if (myIndex < 0)
            code.Add(myRow[1]);
    }
    orderQuantity = product.Length + orderQuantity;
}
The first foreach actually executes just one time, so it won't cause any speed problem.
The second foreach's job is to sum each column of the specified rows, which is product here.
So, any ideas about this?
Thanks!
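One direction worth trying, sketched below (reusing the variable names from the snippet above; the column names follow the description id, code, quantity, price, Date and are not tested against the real schema): the inner YahooOrders.Select("Date = Max(Date)") is re-evaluated inside the filter string on every pass, and every Select call has to scan the 45,000 rows, so computing the max date once with DataTable.Compute and tracking distinct codes in a Hashtable removes the repeated scans and the BinarySearch over an unsorted list.
// Compute the latest date once instead of re-running Select("Date = Max(Date)").
object maxDateObj = YahooOrders.Compute("Max(Date)", "");
string maxDate = Convert.ToDateTime(maxDateObj).ToString();
System.Collections.Hashtable codes = new System.Collections.Hashtable();

foreach (string s in split)
{
    DataRow[] rows = YahooOrders.Select(
        "code like '" + s + "%' and Date >= '" + minDate + "' and Date <= '" + maxDate + "'");
    foreach (DataRow myRow in rows)
    {
        int count = Convert.ToInt32(myRow["quantity"]);
        itemQuantity += count;
        revenue += Convert.ToDouble(myRow["price"]) * count;
        if (!codes.ContainsKey(myRow["code"]))   // distinct codes without sorting
            codes.Add(myRow["code"], null);
    }
    orderQuantity += rows.Length;
}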
All,
Actually, from the application the developers are using count(column) to know the number of rows returned by a statement which joins many tables, but it's taking a lot of time.
Is there an easy way to get the count of records (the result set) of the output?
I can't use sysindexes because I need the count of the output generated by the SQL statement which joins many tables and retrieves many rows.
Thanks,
Sajai
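One option for the count question above, sketched in C# (the table names and the join are placeholders for the real multi-table statement): wrap the statement in a derived table and let the server return a single COUNT(*) value through ExecuteScalar, instead of pulling the rows back and counting them in the application. Note that COUNT(*) counts every row, whereas COUNT(column) skips rows where that column is NULL.
using System;
using System.Data.SqlClient;

class RowCount
{
    static void Main()
    {
        // Placeholder connection string and query; substitute the real joined SELECT
        // inside the derived table so the count is computed entirely on the server.
        string connStr = "Server=myServer;Database=myDb;Integrated Security=SSPI;";
        string sql =
            "SELECT COUNT(*) FROM (" +
            " SELECT o.OrderID" +
            " FROM Orders o JOIN Customers c ON c.CustomerID = o.CustomerID" +
            ") AS t";

        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            int rows = (int)cmd.ExecuteScalar();   // single scalar crosses the wire
            Console.WriteLine("Row count: {0}", rows);
        }
    }
}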
Hi fellas (and girllas),
Got a problem (duh!). My MSSQL Server lags. Now, mind, it doesn't lag all the time. And it seems to be independent of the # of users trying to access the server. And it randomly clears itself up. And the problem doesn't present itself in SQL MGR, just on the web app we're running on it.
Setup:
SQL Server 2k running on 2k3 w/ IIS & backup exec.
All SQL data files are on a raid5 SCSI U160.
App:
Intranet App developed by us for us. ASP.NET & VB.NET.
Symptoms:
When queried, the server takes a LONG time to respond. So long, in fact, that it has become counterproductive. When taking a look at the server, the CPU usage hovers between 50-75% and spikes up to 90% every now and then just for kicks. The memory usage is 2.35GB out of 4GB. To fix this we have to kill and restart all the SQL services.
Any thoughts on what to look at? There are indexes on the required FKs and the heavily queried columns. We're at a loss here.
Thanks for any helpful help!
=Me!
The database has a few HUGE tables, but I have experienced queries against a few other very small tables being incredibly slow.
Has anybody had the same or a similar problem?
A vendor's application is performing slowly. The vendor tested it in QA and it's slow. End-users run it in PRD and it's slow. The application calls SP1, and SP1 calls SP2. Inside, SP1 has a cursor. I believe that as the db gets larger, the application is going to be even slower. What can I suggest to the vendor in order to fix it? Tell them to re-write the application code? Eliminate the cursor?
Thanks
Hello,
I have 4000 records in my table employee. It takes 13 sec to get the data. Is this normal? What is wrong?
Thanks
Code Snippet
CREATE FUNCTION [dbo].[VrniStrukturo] (@id_sod int)
RETURNS TABLE
AS
RETURN
(
WITH tree(id, parent_id, naziv, nivo) AS
(
-- Base case
SELECT
id,
parent_id,
naziv,
1 as nivo
FROM employee
WHERE id = @id_sod
UNION ALL
-- Recursive step
SELECT
e.id,
e.parent_id,
e.naziv,
eh.nivo + 1 AS nivo
FROM employee e
INNER JOIN tree eh ON
e.parent_id = eh.id
)
--SELECT *
SELECT id
FROM tree
--ORDER BY nivo, priimek, ime
);
There are certain times when I execute a SQL request (a select, for example) and it takes too long before I get an answer (that happens only sometimes, exceptionally). What does that mean? Is it that somebody is using the DB heavily, or maybe using Enterprise Manager, or what exactly?
And how can I know who is responsible for taking all the SQL Server resources at that specific time? What command or tool can I use for this purpose?
Thanks for your help.
Hello,
We're using the data transformation service to copy an Ingres II 2.5 database into a SQL Server 2000 database. Small databases don't present an issue, but when pulling one across that's about 20GB it's been taking between 12 and 24 hours. Both systems are relatively quick boxes and neither of them is tapped out on processor, disk I/O or network resources.
I do have "Boost SQL Server priority on Windows" checked under its properties and all the processors are checked to be used.
Does anyone know if there's a way to tweak SQL Server or Ingres to handle this a little quicker? Or even an idea of where the bottleneck could be may be helpful.
Thanks,
John.
I've got a performance question about a CLR TVF that I have created. When I query the function it takes about 30 seconds to execute, as opposed to < 6 seconds when I execute the same code in a console app (the 6 seconds includes outputting the returned data to the console; without writing the output to the console it executes in about 1 second). Both the function and the app are iterating (>40,000) and returning (>10,000) the same number of rows. I've noticed the following when viewing the executions in PerfMon:
* the sqlclr TVF kicks the % Processor Time up to 30 for 30 seconds; the console app has % Processor Time at 9 for about 2 seconds
* .NET CLR Memory - Allocated Bytes/sec spikes anywhere from 1 to 3 times during the sqlclr query at about 44MB/sec. It barely registers if at all when the console app runs.
* In either case, % Time in GC is at zero.
I'm assuming that there are some configurations I'm ignorant of that can help me tune the execution. I can't imagine that it takes SqlServer that long just to iterate through the records.
I have recently decided to make the change from Microsoft Access to SQL Server, believing that it's a bigger, faster beast with better parameterized queries and triggers and all that. BUT. I have some client data that I imported from their original Paradox files. The invoice lineitem file contains over 1 million records. When I open this table in Access and click show last record, the record is displayed in about 1 or 2 seconds. I used the upsize to SQL Server tool in Access to shift my data into SQL Server. When I use the Express Management tool to open the same table and say show me the last record, it takes 17 minutes. I admit that most numeric data types have been translated to floats, so that's probably not good. But I can't alter them from floats to numeric or decimal using the table design tool. Do the conversion anomalies make up the whole reason why SQL Server seems so incredibly SLOW!?
I am running the following BCP to extract a table with 156641604 rows.
bcp TestDB..data out test3.bcp -T -b1000000 -a32000
When running this I notice that the Disk Read Bytes/sec counter in Performance Monitor on the drive that has the database devices is only reading 30MB/sec. I am writing the bcp file to a different drive. Both drives are far more capable of achieving much higher IO. Is this a limitation of BCP, or are there further switches available that would speed this process up? Also, the drives are both local, so the bottleneck is not the network. Any ideas?
Has anyone else noticed delays with SQL Express? I'm not really talking about delays on the queries, but delays in general response. For example: everything is running great, then for about 2 minutes I get connection timeouts, etc. I can't even open stuff in Management Studio without getting timeouts... then, as strangely as it started, everything goes back to normal and requests are served again.
The server has nothing on it except 1 website; it's Win 2003 Server, 512MB RAM on a PIV. The memory usage is low, and during the "lockups" the machine isn't showing any processor usage and SQL mem usage is around 40 megs.
I am not using User Instances either. Nothing in the event logs. What is odd is that it's happening on 3 of my machines... all with different sites; the only thing in common between them is SQL.
thanks,
-c
This sounds like a pretty easy one. I have a SQL 2000 database with 2-3.4GHZ CPUs and 1GB of RAM. I have one database on it. I go in Query Analyzer on another machine and run a simple query like 'SELECT * FROM USERS' which should return 15,000 rows.
It takes 30 (thirty) seconds to finish this query. OMG
Where do I start to decipher why on Earth this takes more than .01 seconds?
Thanks.
Hi all...
I need urgent help about something:
I've developed and deployed an ASP.NET web site (the data works with SQL Server), but after a few minutes of working with some users, the performance slows and the site stops.
Please help me, what should I do?
Dear All,
Finally I completed my project. Thanks to all of you who helped me do it. Now I have the biggest problem. In my application the data grid is filled with data from a SQL Server table which has a large number of records. When I run my queries in SS Management Studio they run very fast, but filling the data into the datagrid takes a lot of time. How can I reduce this time? How can I increase the performance of my application?
Thanks,
Janaka
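One thing that usually helps in this situation, sketched in C# below (the connection string, table and column names are placeholders): let SQL Server return only the rows the grid can usefully show, for example the first page via TOP with an ORDER BY, instead of streaming the whole table into a DataSet before binding. Management Studio feels faster partly because it starts displaying rows before the full result has been transferred.
using System.Data;
using System.Data.SqlClient;

class GridPage
{
    // Returns one small page of rows for binding to the grid.
    static DataTable LoadFirstPage()
    {
        string connStr = "Server=myServer;Database=myDb;Integrated Security=SSPI;"; // placeholder
        string sql = "SELECT TOP 100 OrderID, CustomerName, OrderDate " +
                     "FROM Orders ORDER BY OrderDate DESC";                         // placeholder query

        DataTable page = new DataTable("Orders");
        using (SqlDataAdapter adapter = new SqlDataAdapter(sql, connStr))
        {
            adapter.Fill(page);   // only the limited result set crosses the network
        }
        // e.g. myGrid.DataSource = page; myGrid.DataBind();
        return page;
    }
}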
Hello,
We have Sql Server installed on the Windows 2000 machine.
There are 2 databases that the employees access on it.
When the machine is just started, there are no problems, but after some time the connections get really, really slow and then it eventually becomes impossible to connect. We did not have this problem until we had a computer crash and SQL Server was installed on a new machine; the problems started there. We had 64MB of RAM and we put in another 64MB thinking this would solve it, but it did not.
On the Task Manager the CPU percentage stays constant at 100% and the virtual memory increases very slowly until it has reached its maximum (which is 600MB).
Any information would be greatly appreciated.
Thanks very much,
Kostas
Hi,
I have a query which has suddenly started responding slowly.
Can anyone tell me what the possibilities could be?
I tried updating stats (I am on SQL 7.0; though it's done automatically, I did it manually again).
I used UNION ALL in place of UNION but it had no big effect. Any other thoughts?
Thanks!
Hi,
Some of my queries are running too slow. It's taking as long as 30 secs; earlier the same query was taking less than 5 secs.
I understand the db has grown, BUT for this query I do not know where to start from and what to look into.
It is on the production server.
the db size is 15GB and unallocated is 9GB.
log space used is 4%.
TIA.
After I installed SP3 on a production SQL box, backups are a lot slower. Anybody have any suggestions?
Thanks in advance
Joe
I am configuring a new Server running SQL Server 7.0 sp2.
I have a job that runs a large process. From the SQL Agent job, the processing time is over 8 hours. If I run the same process from a query window, it takes an hour and a half. On my 6.5 database, the job takes 2 - 3 hours from the scheduled task. What is going on with SQL Agent? I have not been able to find any information on memory and SQL Agent.
Any help would be greatly appreciated!
Thanks
Trina Blazek
I have a dts job set up to transfer 550,000 records from a dbf file into a sql server. If I just let it run, there is a 9-10 minute delay, then it starts. If I try to schedule a job, it fails completely. I looked up ways to get it to execute quicker, mainly going to the advanced tab of the transform arrow and making the inserts 1000 at a time, the table locked, turning constraints off. Any advice on how to speed it up or why the job is failing?
I need to transfer a database from one server to another. I'm using the DTS utility because the servers have different sort orders. Our database size is about 5GB, which includes about 2500 tables. Using DTS is taking many hours to transfer all objects and data. Is there a better/faster way to do this?
Any help would be appreciated, Thank you
Hi,
Does anyone have an idea why the backup to disk takes 6 hrs and to tape it takes 1 hr? Also, what will be the difference between creating a backup device and just creating the file on the fly when I set up the backup?
Thanks Felix.
Hi,
I have a query that takes minutes to execute, even though there are only about 300,000 records being processed. I would appreciate any help with optimizing that query.
I have two tables: User and Usage. Table user has two fields: User_Id and Date_Created and a non-clustered index on User_Id. Table usage has two fields also: User_Id and Date_Used and non-clustered index on both fields. The User table is populated when the user registers. The Usage table is populated every time the user opens a document.
Here is what I need to do: for each day in the time frame (where the time frame varies), get the number of users from the Usage table who opened a document at least once in the 30 days ending on that day, after they had registered.
For example, if the time frame is 8/01/00 - 8/31/00, I need to get the following data:
date returns
---- -------
8/01/00 10 (10 users returned to the document between 7/2/00 and 8/1/00)
8/02/00 15 (15 users returned between 7/3/00 and 8/02/00)
.
.
.
8/31/00 20 (20 users returned between 8/1/00 and 8/31/00)
Here is my query:
SELECT [date],
       (SELECT count(distinct user_id)
        FROM usage u JOIN [user] ON u.[user_id] = [user].[user_id]
        WHERE u.[date] BETWEEN usage.[date]-30 AND usage.[date]
          AND u.[date] > [user].date_created
        GROUP BY usage.[date]) AS returns
FROM usage
WHERE [date] BETWEEN @date1 AND @date2
This query works fine, but it is too slow. We use MS SQL Server 7.0.
Thank you,
Yana
I am running IIS, Windows NT and SQL 7.0.
My ASP pages load very slowly whenever the database is accessed. All the database scripts are in stored procedures. The pages load very fast when they are run on Personal Web Server. I would really appreciate any suggestions to help fix the problem.
Thanks
Godwin
IMS
Howdy. I have a table in my DB that has about 2 million records. The search times are taking 15 - 30 seconds depending on the number of records I am returning. Is this normal? The machine is NT 4 sp6a Dual PIII 866's with 1 GB of RAM on RAID5 SCSI disk. This seems like a long time to me. What kind of performance should I expect? Any kind of tuning steps I can take?
Thanks
Shane
I have a query which responds immediately when run however if I add an order by clause it takes 40 seconds. Below is the query with the order by clause
SELECT distinct Licenseplate, platetypecode.platetypecode, platetypecode.platetypecodeid
FROM Ticket INNER JOIN PlateTypeCode
ON PlateTypeCode.PlateTypeCodeID = Ticket.PlateTypeCodeID
ORDER BY licenseplate
The Ticket table contains approx. 11,000 records. I have created a nonclustered index for the licenseplate field, a 7 char varchar field.
Any suggestions for speeding up the query?
UPDATE TABLE SET FIELD1=NULL WHERE FIELD2=something is running very, very slow. The table has 200,000 records and there is an index on FIELD2.
If the number of records to be updated is about 40,000, it takes about 3-4 hours.
I tried this:
- create a new MSSQL DB
- migrate the TABLE from the original database to the new test DB
- so, in the new DB there was only one table, with 200,000 records; the index was only on FIELD2
- I ran that update and it took about 3-4 hours
I tried that on Interbase and the update takes under ONE SECOND!!!!!!!
Thanks for any notes.
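One thing sometimes worth trying for a long-running UPDATE like the one above, sketched in C#/ADO.NET (connection string, table and value are placeholders; SET ROWCOUNT is the SQL 2000-era way to limit a batch, and UPDATE TOP (n) replaces it on 2005+): run the update in small batches so each statement touches a limited number of rows and the per-statement log activity stays short. The extra FIELD1 IS NOT NULL predicate keeps already-updated rows out of later batches. It is also worth checking for triggers on the table and whether FIELD1 itself is part of an index, since both make this kind of update much more expensive.
using System;
using System.Data.SqlClient;

class BatchedUpdate
{
    static void Main()
    {
        string connStr = "Server=myServer;Database=myDb;Integrated Security=SSPI;"; // placeholder
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();
            int affected;
            do
            {
                using (SqlCommand cmd = new SqlCommand(
                    "SET ROWCOUNT 5000; " +
                    "UPDATE MyTable SET FIELD1 = NULL " +
                    "WHERE FIELD2 = @val AND FIELD1 IS NOT NULL; " +
                    "SET ROWCOUNT 0;", conn))
                {
                    cmd.Parameters.AddWithValue("@val", "something");   // placeholder value
                    cmd.CommandTimeout = 0;                             // individual batches may still be slow
                    affected = cmd.ExecuteNonQuery();                   // rows touched in this batch
                }
                Console.WriteLine("Updated {0} rows", affected);
            } while (affected > 0);
        }
    }
}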
Hi,
I am using MSDE and Analysis Services (latest packs), and the same installation on the same machine has been working great for the last 18 months or so, until yesterday. Whenever I try to open a DTS package (in order to edit it) the machine just goes into a coma... I have tried to restart many times but to no avail.
Can someone kindly guide me on what I should look for in order to solve this?
I will be very grateful for your help.