I use the SQLOLEDB provider.
It is not a stored procedure but program code.
I create a new global temporary table with
CREATE TABLE ##tmp123 (...
then create and open a recordset directly against the table in order to populate it.
At the Open stage I get the following error: Invalid object name '##tmp123'
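For reference, a minimal T-SQL sketch of the intended pattern, assuming both statements run on the same open connection (the column list is made up purely for illustration):

-- Create the global temporary table; it stays visible to every session
-- until the creating connection closes and no other session references it.
CREATE TABLE ##tmp123 (
    ID      int IDENTITY(1,1) PRIMARY KEY,
    Payload varchar(100) NULL
)

-- Any session should now be able to resolve the name
SELECT COUNT(*) AS RowsSoFar FROM ##tmp123

If the CREATE TABLE and the recordset Open happen on different connections and the creating connection has already closed (or been returned to a pool and reset), the table will have been dropped by the time the recordset opens, which would produce exactly this "Invalid object name" error.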
Hi, I have written an SSIS script and I am using a Script Component for a row count. Below is the code I have written; after the code, see the error I am getting.

using System;
}

The error:

Script component has encountered an exception in user code: Object is not an ADODB.RecordSet or an ADODB.Record. Parameter name: adodb
at System.Data.OleDb.OleDbDataAdapter.FillFromADODB(Object data, Object adodb, String srcTable, Boolean multipleResults)
at System.Data.OleDb.OleDbDataAdapter.Fill(DataTable dataTable, Object ADODBRecordSet)
at ScriptMain.CreateNewOutputRows()
at UserComponent.PrimeOutput(Int32 Outputs, Int32[] OutputIDs, PipelineBuffer[] Buffers)
at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.PrimeOutput(Int32 outputs, Int32[] outputIDs, PipelineBuffer[] buffers)
I am trying to access a table that I know exists and has data. But, when I create a recordset and check for RecordCount, I get a result -1 (no records). When I access the same table (using the same program), it reports (and I can view in a dbgrid) 752580 records exist.
Here's some of the code:
The table is originally copied from another database; I use the following code to be sure the previous connection is closed before proceeding.
If Not adoRS Is Nothing Then
    If adoRS.State = adStateOpen Then adoRS.Close
    Set adoRS = Nothing
End If

If Not DbConn Is Nothing Then
    If DbConn.State = adStateOpen Then DbConn.Close
    Set DbConn = Nothing
End If
Then a new connection (it works) is opened to access the database with the copied table:
I've been trying different things to "READ" the recordset from the "Message Queue". I can read it but with some weird characters. I've tried
ActiveXMessageFormatter
BinaryMessageFormatter
XMLMessageFormatter
So far, I have no luck.
MessageQueue msgQ3 = new MessageQueue("SERVERNM\\" + msg.Label, false);
Message msg3 = new Message();
msg3.Formatter = new ActiveXMessageFormatter();
msg3 = msgQ3.Receive(new TimeSpan(0, 0, 30));
byte[] b = new byte[msg3.BodyStream.Length];
msg3.BodyStream.Read(b, 0, (int)msg3.BodyStream.Length);
System.Text.ASCIIEncoding enc = new System.Text.ASCIIEncoding();
returnVal = enc.GetString(b);
RESULTS: I try to convert it to a string and then deserialize it later, but I'm getting some junk. <?xml version="1.0" encoding="utf-8" ?>
I used an ADODB connection and recordset in a Script Task, but I get an error saying ADODB is not defined. How do I add it as a reference? Or can ADODB run in a Script Task at all, or only ADO.NET?
Firstly, I consider myself quite an experienced SQL Server user, and am now using SQL Server 2005 Express for the main backend of my software.

My problem is thus: the boss needs to run reports; I have designed these reports as SQL procedures, to be executed through an ASP application. Basic, and even medium sized (10,000+ records), reporting runs at an acceptable speed, but for anything larger, IIS timeouts and query timeouts often cause problems.

I subsequently came up with the idea that I could reduce processing times by up to two-thirds by writing information from each calculation stage to a number of tables as the reporting procedure runs, i.e. stage 1 writes to table xxx1, stage 2 reads table xxx1 and writes to table xxx2, stage 3 reads table xxx2 and writes to table xxx3, etc., and the procedure reads the final table and outputs the information.

This works wonderfully, EXCEPT that two people can't run the same report at the same time, because as one procedure creates and writes to table xxx2, the other procedure tries to drop the table, or read a table that has already been dropped...

Does anyone have any suggestions about how to get around this problem? I have thought about generating the table names dynamically using 'sp_execute', but the statement I need to run is far too long (apparently there is a maximum length you can pass to it), and even breaking it down into sub-procedures is so time-consuming and inefficient, having to format statements as strings (replacing quotes and so on).

How can I use multiple tables, or indeed process HUGE procedures, with dynamic table names, or temporary tables? All answers/suggestions/questions gratefully received. Thanks
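One commonly suggested approach, sketched below with invented table, column, and parameter names rather than the real procedure: replace the fixed xxx1/xxx2 staging tables with local temporary tables, which are private to each connection, so two users running the same report no longer collide.

-- Minimal sketch: each connection gets its own copy of a #table, so
-- concurrent report runs cannot drop or read each other's staging tables.
CREATE PROCEDURE dbo.RunBigReport
    @FromDate datetime,
    @ToDate   datetime
AS
BEGIN
    -- Stage 1: pull the raw rows for the period into a session-private table
    SELECT OrderID, CustomerID, Amount
    INTO #stage1
    FROM dbo.Orders
    WHERE OrderDate BETWEEN @FromDate AND @ToDate

    -- Stage 2: summarise stage 1 into a second session-private table
    SELECT CustomerID, SUM(Amount) AS TotalAmount
    INTO #stage2
    FROM #stage1
    GROUP BY CustomerID

    -- Final output read by the ASP page
    SELECT CustomerID, TotalAmount
    FROM #stage2
    ORDER BY TotalAmount DESC

    -- Both #tables are dropped automatically when the connection is closed,
    -- but dropping them explicitly releases tempdb space sooner.
    DROP TABLE #stage2
    DROP TABLE #stage1
END

A local temp table created inside a procedure is also visible to any procedure it calls, so the staged approach can still be split into sub-procedures without resorting to dynamic table names.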
I have an application which runs successfully on a couple of my customer's machines but fails on a third. It seems to fail when opening the database:
Unable to cast COM object of type 'ADODB.CommandClass' to interface type 'ADODB._Command'. This operation failed because the QueryInterface call on the COM component for the interface with IID '{B08400BD-F9D1-4D02-B856-71D5DBA123E9}' failed due to the following error: No such interface supported (Exception from HRESULT: 0x80004002 (E_NOINTERFACE)).

Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=false;Initial Catalog=lensdb;Data Source = SQL
Before I got this error I was getting another problem (sorry, I didn't keep a copy of that error's text) that made me think that adodb.dll simply wasn't loaded/registered. I got rid of that error by copying my adodb.dll onto the third machine and running gacutil /i. There is now an entry in winnt\assembly for adodb.
Just in case you think it could be an obvious registry problem: when I started getting the current error I thought that maybe the registry needed updating, and I merged the following lines (from my dev machine) into the registry on the target machine:
I have posted various questions on the microsoft ng trying to identify the cause of this error.
ADODB.Command error '800a0cb3' Object or provider is not capable of performing requested operation.
I have MDAC 2.8 installed locally on my machine.
Via IIS I created a virtual directory that points to my ASP files on C:.
Via the SQL Server IIS for XML configuration utility I created a virtual directory with a different name that points to the same directory as the one created via IIS.
The SQL database is on a different machine and I connect via the OLEDB DSNless connection string.
I used an ADODB.Stream to transform the XML against the XSL, but I couldn't get it to work. To simplify things and work towards a solution, I inserted the code from MS KB article Q272266 (see below) into my ASP. I amended the MS code to change the connection code and to call a stored procedure that exists on the database. The MS code gives the same error as my original code.
I tried changing the CursorLocation to server and client but the results were the same.
I put a SQL trace on the DB to determine whether the stored procedure gets run; it does not.
If I run the following in the URL it works:
If I run http://localhost/p2/?sql=SELECT+*+FROM+tblStatus+FOR+XML+AUTO&root=root&xsl=tblStatusDesc.xsl it works. If I run the xml template it works: http://localhost/p2/xml/test.xml
Both of the URLs above work. My IIS server uses a virtual directory called dev, so when I run the ASP I type http://localhost/DEV/secure/aframes.asp; the IIS virtual directory created by SQL Server is called p2 but points to the same source code directory.
Here is the MS code amended as described above that does not work.
sQuery = "<ROOT xmlns:sql='urn:schemas-microsoft-com:xml-sql'><sql:query>Select StatusDesc from tblStatus for XML Auto</sql:query></ROOT>"
'*************************************************

Dim txtResults   ' String for results
Dim CmdStream    ' as ADODB.Stream

Set adoConn = CreateObject("ADODB.Connection")
Set adoStreamQuery = CreateObject("ADODB.Stream")

adoConn.ConnectionString = sConn
adoConn.Open

Set adoCmd = CreateObject("ADODB.Command")
Set adoCmd.ActiveConnection = adoConn
adoConn.CursorLocation = adUseClient
Set adoCmd.ActiveConnection = adoConn
adoStreamQuery.Open                            ' Open the command stream so it may be written to
adoStreamQuery.WriteText sQuery, adWriteChar   ' Set the input command stream's text with the query string
adoStreamQuery.Position = 0                    ' Reset the position in the stream, otherwise it will be at EOS

Set adoCmd.CommandStream = adoStreamQuery      ' Set the command object's command to the input stream set above
adoCmd.Dialect = "{5D531CB2-E6Ed-11D2-B252-00C04F681B71}"   ' Set the dialect for the command stream to be a SQL query
Set outStrm = CreateObject("ADODB.Stream")     ' Create the output stream
outStrm.Open
adoCmd.Properties("Output Stream") = response  ' Set command's output stream to the output stream just opened
adoCmd.Execute , , adExecuteStream             ' Execute the command, thus filling up the output stream
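For what it's worth, here is a hedged T-SQL sketch of the kind of stored procedure the amended template is meant to drive (the procedure name and the StatusID column are assumptions; only tblStatus and StatusDesc appear above). Running it directly in Query Analyser, or through the p2 virtual directory, is one way to confirm the FOR XML output itself is fine before involving the ADO command stream.

CREATE PROCEDURE dbo.GetStatusXml
AS
    SET NOCOUNT ON
    SELECT StatusID, StatusDesc
    FROM dbo.tblStatus
    FOR XML AUTO
GO

The sql:query element in sQuery could then contain EXEC dbo.GetStatusXml instead of the inline SELECT.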
I've been programming with SQL 7 for about a year and my company has finally decided to go SQL 2k5.
I've come across a really irritating error when writing to the DB via ADO in ASP pages. I have a column in a table that is auto-incremental.
In SQL Server 7 you just make an ADODB.Command object and enter the SQL query: INSERT INTO table (columns) VALUES ('val...'). For SQL 7 I can completely leave out the auto-incremental column (called 'ErrorNo') and simply specify the other columns and values in my insert query, e.g. where my table is called master_error:
ErrorNo         int identity(1, 1) not null,
ReportedBy      char(10),
ExpectedFixDate datetime
With ErrorNo being an auto-incremental identity, my query would be
INSERT INTO master_error (ReportedBy, ExpectedFixDate) VALUES ('Ben','01 Sep 2007')
This works perfectly with ADODB.Command when writing to SQL Server 7 from IIS 5.0.
However, when I execute the exact same command on the exact same table using ADODB.Command, writing to SQL Server 2005 from IIS 6.0, I get an 'error 500 internal server error'.
I thought perhaps SQL 2005 might have different syntax so I typed the query directly into SQL Server 2005's version of query analyser and guess what... it worked fine.
I can't tell where the error lies. I find it hard to believe that the error is in the code of my ASP page, as it works perfectly against a SQL 7 DB.
Can you create a temporary table using an ad-hoc query?
What I am trying to do is a type of filter search (the user has this and this or this) where I will not know how many items the user is going to select until they submit... that is why I can't use a stored procedure (I think). Any help on how to do this?
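Yes, an ad-hoc batch sent from the application can create and use a temporary table. A hedged sketch (table and column names are invented): build the variable-length filter in the application, SELECT the matches INTO a #temp table, and keep reading from it within the same batch and connection.

-- Ad-hoc batch sent from the application; the IN list is built from
-- however many items the user ticked (the values here are just examples).
SELECT ItemID, ItemName, CategoryID
INTO #filtered
FROM dbo.Items
WHERE CategoryID IN (3, 7, 12)

-- Later statements on the same connection can reuse the filtered set
SELECT CategoryID, COUNT(*) AS MatchCount
FROM #filtered
GROUP BY CategoryID

SELECT ItemID, ItemName
FROM #filtered
ORDER BY ItemName

DROP TABLE #filtered

If the IN list is concatenated into the SQL text, validate each value (for example, confirm it is numeric) before building the string, since free-form concatenation invites SQL injection.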
I'm having trouble with a query and ASP. The query itself is easy. I need a temporary table to be filled with the contents of a select, and then I need to select out of the temporary table and return the results to the ASP.
INSERT INTO @results (ItemID, ItemDescription, ItemType, ItemRequestedBy, ItemStatus,
                      ItemQuantity, ItemCostPer, ItemOrderNumber, DateAdded, DateLastEdited)
SELECT * FROM cr_EquipmentData
SELECT * from @results
When I run this in Query Analyser, I get exactly the results I need... But when I try and run the ASP script, I get an error about the recordset not being open. (It's not the ASP checking 'cos when I run a simple select statement it works)
I appreciate there is no use for the query as it stands, but eventually I want to be able to perform non-destructive statements on the @results table before returning the data to the ASP. This is more 'testing' than anything at the mo'.
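A commonly cited cause of "recordset not open" with a batch like this is that the INSERT's "N rows affected" message comes back to ADO as the first (closed, empty) result set, so the recordset handed to the ASP is not the one from the final SELECT. A hedged sketch of the batch with that suppressed (the @results column types are guesses, since the declaration isn't shown above):

SET NOCOUNT ON   -- stop the row-count messages so the SELECT is the first result set ADO sees

DECLARE @results TABLE (
    ItemID          int,
    ItemDescription varchar(255),
    ItemType        varchar(50),
    ItemRequestedBy varchar(50),
    ItemStatus      varchar(50),
    ItemQuantity    int,
    ItemCostPer     money,
    ItemOrderNumber varchar(50),
    DateAdded       datetime,
    DateLastEdited  datetime
)

INSERT INTO @results (ItemID, ItemDescription, ItemType, ItemRequestedBy, ItemStatus,
                      ItemQuantity, ItemCostPer, ItemOrderNumber, DateAdded, DateLastEdited)
SELECT * FROM cr_EquipmentData

SELECT * FROM @results

The alternative, if the extra result sets are wanted, is to step past them in the ASP with adoRS.NextRecordset until the open one is reached.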
I am writing a stored procedure that outputs its information to a temporary table while it assembles the information. Afterwards, it returns the contents of the temporary table as the results of the stored procedure.
I found that if you create the table, inside the SP, as an ordinary table, the information builds to that table considerably faster than if you use a true temporary table.
I found that if I create a user function that returns a table as its return value, it is also as slow as if I used a true temporary table.
The database can amass over 2 million records in one table in just a few days. If I have the procedure query against this table, output to an ordinary table it creates, and summarize the information it is adding to that table, then it takes an average of around 4 minutes to return the results from the query. If I change the output table to a temporary table (#temp), it takes between 12 and 15 minutes. Nothing else in the procedure changed, just the kind of table. If I take the logic and move it to a function which returns those results in a RETURN table, it also takes over 14 minutes.
Why would it take so much longer outputting to a temporary table rather than a normal table? Is it because temporary tables are stored in a different database (tempdb)? Why would returning query results from a function be just as slow?
Hi, I am trying to join two tables using two temporary tables. I want the output to be table2/table1 = Output.
Select temp2.Value/temp1.Value as val
from (
    ( select count(*) as Value
      from [Master] m
      inner join [Spares] s on s.SId = m.SID
      where m.Valid_From between '2007-06-01' and '2007-06-30'
    ) temp1
    Union
    ( select isnull(sum(convert(numeric(10,0), s.Unit_Price)), '0') as Value
      from [Order] h
      inner join [Spares] s on s.Number = s.Number
      where h.Valid_Date between '2007-06-01' and '2007-06-30'
    )
) temp2 as t
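A hedged sketch of one way the intended ratio could be written, assuming temp1 is the count, temp2 is the summed price, and that s.Number was meant to join to h.Number (the original joins s.Number to itself): give each aggregate its own derived table and combine them with a CROSS JOIN instead of a UNION.

SELECT CAST(temp2.Value AS numeric(18, 4)) / NULLIF(temp1.Value, 0) AS val   -- NULLIF avoids divide-by-zero
FROM
    ( SELECT COUNT(*) AS Value
      FROM [Master] m
      INNER JOIN [Spares] s ON s.SId = m.SID
      WHERE m.Valid_From BETWEEN '2007-06-01' AND '2007-06-30'
    ) AS temp1
CROSS JOIN
    ( SELECT ISNULL(SUM(CONVERT(numeric(10, 0), s.Unit_Price)), 0) AS Value
      FROM [Order] h
      INNER JOIN [Spares] s ON s.Number = h.Number   -- assumed join column
      WHERE h.Valid_Date BETWEEN '2007-06-01' AND '2007-06-30'
    ) AS temp2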
Our Java application is running on Oracle and now we would like to port it to SQL Server 2000. In Oracle we have global temporary tables with the ON COMMIT DELETE ROWS option. Now I am looking for the same option in SQL Server, but as far as I can see SQL Server's local temporary tables are visible per connection. Since the connections are shared in the pool, it seems I cannot rely on local temporary tables, because the connection does not get closed at the end of the user's session. I need something that is visible per transaction, not per connection.
Is it a better approach just to create a regular table and basically have a trigger to delete all data on commit?
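One option often mentioned for pooled connections, sketched here with invented names rather than as a verified port of the Oracle behaviour: create the temp table inside a stored procedure. A local temporary table created in a procedure is dropped automatically when that procedure returns, which gives it per-call rather than per-connection lifetime even though the pooled connection stays open.

CREATE PROCEDURE dbo.DoUnitOfWork
AS
BEGIN
    -- Exists only for the duration of this procedure call,
    -- even though the pooled connection stays open afterwards.
    CREATE TABLE #work (
        ItemID int NOT NULL,
        Qty    int NOT NULL
    )

    INSERT INTO #work (ItemID, Qty)
    SELECT ItemID, Qty
    FROM dbo.Staging          -- invented source table

    SELECT ItemID, SUM(Qty) AS TotalQty
    FROM #work
    GROUP BY ItemID
END   -- #work is dropped automatically here

If the working data really must survive across several calls within one transaction, the regular-table approach described above (with a session or transaction key column, cleaned up on commit) is the usual fallback.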
If a stored procedure invokes another stored procedure that creates a temporary table, why can't the calling procedure see the temporary table?

CREATE PROCEDURE dbo.GetTemp
AS
CREATE TABLE #Test ([id] int not null identity, [name] char(4))
INSERT INTO #Test ([name]) VALUES ('Test')
GO

CREATE PROCEDURE dbo.Test
AS
EXEC dbo.GetTemp
SELECT * FROM #Test   -- Invalid object name '#Test'.
GO

Thanks,
TP
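A temporary table created inside dbo.GetTemp is dropped the moment that procedure returns, so the caller never sees it; temp tables are visible down the call chain, not up. A hedged sketch of the usual workaround: create #Test in the calling procedure and let the inner procedure fill it.

CREATE PROCEDURE dbo.GetTemp
AS
-- Assumes the caller has already created #Test
INSERT INTO #Test ([name]) VALUES ('Test')
GO

CREATE PROCEDURE dbo.Test
AS
CREATE TABLE #Test ([id] int not null identity, [name] char(4))
EXEC dbo.GetTemp          -- the called procedure can see the caller's #Test
SELECT * FROM #Test       -- now returns the row inserted by dbo.GetTemp
GO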
Hi, just a quick question regarding performance and speed.

Q. Run a complicated query three times, or create a TEMPORARY table and access it as needed?

I have a page that will be accessed 10,000+ times a day. In that page I have an SQL query that involves 3 different tables (approximate table sizes: T1) 200,000, T2) 900,000, T3) 20 records). I'll be running that query 3 times in that page: once to retrieve the content and display it on the page, a second time to count the number of records (using the COUNT(*) function), and a third time to retrieve the content regions (using DISTINCT).

What would be the best way of doing it: running the SQL query 3 times, or creating a temporary table and accessing it as many times as I need?

Regards,
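A hedged sketch of the temp-table option (the column names are invented, with T1/T2/T3 standing in for the real tables): run the expensive three-table join once per request into a #temp table, then take the page content, the row count, and the distinct regions from that much smaller result.

-- Run the expensive join once per page request
SELECT a.ArticleID, a.Title, a.Body, r.RegionName
INTO #pageData
FROM T1 a
INNER JOIN T2 b ON b.ArticleID = a.ArticleID
INNER JOIN T3 r ON r.RegionID = a.RegionID

-- 1) content to display on the page
SELECT ArticleID, Title, Body FROM #pageData

-- 2) number of records
SELECT COUNT(*) AS TotalRows FROM #pageData

-- 3) the content regions
SELECT DISTINCT RegionName FROM #pageData

DROP TABLE #pageData

Whether this beats running the query three times depends on how expensive the join is versus the cost of materialising the rows in tempdb, so it is worth timing both under load rather than assuming.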
I would like to know from other members whether they have been successful in using temporary tables within a stored procedure and then using that stored procedure in a report.
My scenario is like this:
I have a stored procedure A which fetches the data from different tables based on the orderno passed as input parameter.
I have built another stored procedure B that creates a temporary table and inserts rows into this temporary table for vendor-specific orders by calling procedure A above.
I have used stored procedure B in the dataset I created, and I am getting this error:
Could not generate a list of fields for the query. Check the query syntax, or click the Refresh Fields on the query toolbar.
Has anybody encountered this, and is there a resolution?
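One workaround that is often suggested when field-list generation fails for a procedure that uses temporary tables (offered here as a hedged sketch, not a confirmed fix for this exact error, and with an invented procedure name and parameter) is to set the dataset command type to Text and put SET FMTONLY OFF ahead of the procedure call, so the designer executes the procedure instead of only probing its metadata:

SET FMTONLY OFF
EXEC dbo.ProcB @OrderNo = 12345   -- sample parameter value, used only so the field list can be generated

Another route some people take is to return the final result from a table variable instead of a #temp table, which the field-list probe tends to cope with better.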
The table definition for @temp is the same as TBL_EVENT_DEFINiTION. The problem is I need to add a field to the @temp table and define the values using an update statement. If I include the additional field in the @temp table declaration, then I get an error saying something to the effect that the number of fields is different. Is there a way of altering the temporary table to add this field?
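A table variable cannot be changed with ALTER TABLE once it is declared, so the usual pattern is to declare the extra column up front and name the insert columns explicitly so the column counts still match, then fill the new column with an UPDATE. A hedged sketch with invented column names (only TBL_EVENT_DEFINiTION appears above):

DECLARE @temp TABLE (
    EventID   int,
    EventName varchar(100),
    ExtraFlag bit NULL          -- the additional field
)

-- Listing the target columns keeps the column counts in agreement
INSERT INTO @temp (EventID, EventName)
SELECT EventID, EventName
FROM TBL_EVENT_DEFINiTION

-- Define the values of the additional field afterwards
UPDATE @temp
SET ExtraFlag = CASE WHEN EventName LIKE 'SYS%' THEN 1 ELSE 0 END

The "number of fields is different" error usually comes from an INSERT ... SELECT * whose column count no longer matches the widened declaration, which the explicit column list above avoids.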
I'm creating a temporary table in a procedure to help me with some calculations. I would like the name of the temporary table to be 'dynamic', I mean, to have the hour it was created, something like this: create table #myTempTable15May11:10:00, so if someone accesses the procedure while it's running, he won't have problems creating his 'own' temporary table (create table #myTempTable15May11:10:02). Is there any way I can do this?
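For what it's worth, a local #temp table created inside a procedure is already private to the session running it, so two callers each get their own copy without any renaming. If a dynamically named table is still wanted, here is a hedged sketch of the literal request (the suffix, column list, and use of a global ## table are assumptions; a ## table is used because a # table created in dynamic SQL disappears when that dynamic batch ends):

DECLARE @name sysname
DECLARE @sql  nvarchar(4000)

-- Build a per-session name, e.g. ##myTempTable_57
SET @name = N'##myTempTable_' + CAST(@@SPID AS nvarchar(10))
SET @sql  = N'CREATE TABLE ' + QUOTENAME(@name) + N' (ID int, Amount money)'

EXEC sp_executesql @sql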
I tried to make a DTS package to transform data in a text file. I used a stored procedure that uses a temp table (#Resultados), but the DTS package gives me an error.
I read that in this case I can't use local temp tables but I can use global temp tables, so in my stored procedure I changed #Resultados to ##Resultados, but the result was the same.
SELECT LEFT(RP.Grupo, 3) + LEFT(RP.Equipo, 12) + LEFT(RP.SubGrupo, 2) + LEFT(D.Fecha, 8) + D.Valor AS Dato
FROM ReportesPlantilla RP
LEFT JOIN ##Resultados D
       ON RP.Planta = D.Planta
      AND RP.Etapa = D.Etapa
      AND RP.GrupoEquipo = D.GrupoEquipo
      AND RP.Equipo = D.Equipo
      AND RP.Concepto = D.Concepto
      AND D.Fecha BETWEEN @FechaIni AND @FechaFin
Can someone tell me how to display the fields of a temporary table in the report? In the query, a global temp table is created, and at the end I am displaying the fields of that table. How do I do that?
I need to use a temporary table within a stored procedure (which runs in a transaction). Should I use a local temporary table or a global temporary table?
If I use a temporary table in my transaction, do I destroy the temporary table at the end of the transaction, or just leave it and let the system clean it up? If I destroy the temporary table at the end of the transaction, what happens if another user session is accessing the temporary table at that very moment?
I am searching for information on paging large datasets, and have found some approaches that involve creating temporary tables in the database. Before I head off and implement something, I have a number of issues I'd like to bounce around here.
1. An example I found on MSDN involves creating a temporary table and copying the relevant columns of each row into the temp table. Why do this, rather than adding the source table's primary keys into the temp table and doing a join? Example: browsing a product catalog which is categorised into hierarchies. The MSDN version would have a temp table created with an incrementing field which is used for the paging, and then a number of fields are also copied from the products table to the temp table. My question is: why not simply copy the product primary key into the temp table and then join (see the sketch after this list)?
2. In real life, do people allow each user to create their own temporary tables? If I have 1000 concurrent users, all wishing to perform a page-based browse, I would be creating 1000 new temporary tables. Do people consider default temp tables, that is, creating a default temporary table for browsing each category in the products table, for example?
3. Do you have any advice/tips for this type of problem?
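A hedged sketch of the keys-only variant asked about in question 1 (table and column names are invented): the temp table holds just an ordering key and the product primary key, and each page is fetched by joining back to the products table.

-- Build the key list once per search/browse
CREATE TABLE #PageKeys (
    RowNum    int IDENTITY(1,1) PRIMARY KEY,
    ProductID int NOT NULL
)

INSERT INTO #PageKeys (ProductID)
SELECT ProductID
FROM dbo.Products
WHERE CategoryID = 42            -- sample category filter
ORDER BY ProductName             -- identity values are assigned in this order

-- Fetch one page (rows 21 to 40) by joining the keys back to the wide table
SELECT p.ProductID, p.ProductName, p.UnitPrice
FROM #PageKeys k
INNER JOIN dbo.Products p ON p.ProductID = k.ProductID
WHERE k.RowNum BETWEEN 21 AND 40
ORDER BY k.RowNum

DROP TABLE #PageKeys

Keeping only the keys makes the temp table small and lets each page show current column values from the products table, at the cost of one extra join per page; copying the columns (the MSDN variant) avoids the join but duplicates data per user, which matters with 1000 concurrent browsers.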