I have an Execute SQL Task that selects one column value from one row, so General > ResultSet = Single row. Under Result Set, Result Name = 0 (the first selected value) and Variable Name = User::objectTypeNbr. The task runs successfully, but after it runs the value of User::objectTypeNbr is not changed.
User::objectTypeNbr > Data Type = Int32. When I declared the variable, Value could not be empty, so I set it to 0 arbitrarily, assuming it would be overwritten when assigned a new value by the Execute SQL Task, but it remains 0 after the task runs. What am I missing here?
Hi, I am relatively new to SSIS. Please advise me on how to change the value of a variable at runtime, i.e. the user should be able to key in the value.
SELECT * FROM .dbo.tblUser Where vcLogonName = @pNewLogonName
to
[dbo].[spLogonName] @pNewLogonName varchar(60) AS
DECLARE @Local_pNewLogonName varchar(60)
SET @Local_pNewLogonName = @pNewLogonName
SELECT * FROM .dbo.tblUser Where vcLogonName = @Local_pNewLogonName
and started getting this error on the web page.
ADODB.Recordset error '800a0cb3'
Current Recordset does not support updating. This may be a limitation of the provider, or of the selected locktype.
Does anyone know why this is happening? Nothing on the site has changed. If I change the SP back, the errors go away. I'm trying to use local variables in all SPs to avoid the slowness that can happen when using the parameter variables directly.
I want to set the value of a user-defined variable in a Script Task, with the intention of using that value as a condition in two Precedence Constraints to determine which of two directions the package will go. The problem is I don't know how to reference the user-defined variable in the script of a Script Task, nor how to alter its value.
Can I design data flows with an XML Source pointing to file-system XML files, but run them against variables? I can see the properties at design time (AccessMode = 0,1,2; XMLDataVar = "User::XMLData"), but the XML Source object doesn't appear to have expressions enabled to change this at runtime, nor do these properties seem to be exposed to configurations.
The scenario:
I have a package that reads through (potentially thousands of) XML files and, having identified the embedded message "type", transforms them via XSLT to an easier XML format (the XML Source can't cope well with multiple namespaces, etc.). It then sends the new file off to a type-specific data flow task to send the contents to the database.
For performance (and other) reasons I'd rather XSLT into a single shared variable and use this variable as the XML source in the data flows (rather than writing out to the file system [via XSLT] and immediately reading it back in each time [via the data source]).
But designing the data flow task using a variable as xml source is frustrating for a bunch of reasons - not the least being:
(a) The variable needs to be populated with sample XML data at design time. But this could only ever match one data flow at a time (fine at run time, but painful at design time, leading to validation errors popping up all over the place).
Yes, I could have a separate variable for each XML source, but that just makes things more complicated up front and also leaves me with 20+ large, type-specific XML variables floating around at runtime, of which only one is ever being used per import - which just doesn't feel right to me.
(b) Populating the variable with sample XML at design time is painful because it is a string, which doesn't accept the hard returns etc. that are in text files, design-time changes, etc.
Using a file XML source at design time is infinitely easier: when I get a design-time change, I can just amend the XSLT and run the appropriate task to produce a sample XML file to design against.
I'm creating a package that needs to go to an FTP site (FTP Task), download a file, unzip it, and then process a series of table loads for the 12 text files that will be unzipped. My problem is that the zip file name is a date (yyyymmdd.zip), which is normally the previous day of execution EXCEPT on Mondays, when it is the previous Friday's date. If I could determine the day of the week in the SSIS package, then Tuesday-Friday is just a formatting exercise on GETDATE()-1 and Monday would be GETDATE()-3, but I can't seem to find a way (a function?) to determine the day of the week.
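The Monday-versus-rest-of-week rule is easy to express once the day of the week is known (SSIS's expression language exposes DATEPART("dw", GETDATE()) for this). A minimal Python sketch of the file-naming logic described above, assuming the package never runs on a weekend:

```python
from datetime import date, timedelta

def zip_name_for(run_date: date) -> str:
    """Return the yyyymmdd.zip file name for a given execution date.

    Monday (weekday() == 0) reaches back three days to Friday;
    Tuesday-Friday use the previous calendar day.
    """
    days_back = 3 if run_date.weekday() == 0 else 1
    return (run_date - timedelta(days=days_back)).strftime("%Y%m%d") + ".zip"
```

Saturday and Sunday runs are deliberately not handled, matching the schedule the post describes.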
Is there any way to change variable scope while using package templates?
I have created a package template that has several variables, a "typical" control flow, and a data flow. My goal was to use this as a starting point to create other packages within the same project and edit as required in the new package. I couldn't find any way (yet) to change the scope of the variables... they still show as belonging to the scope of the package used to create the template.
Hi, I'm trying to set the value of the variable @prvYearMonth through this SP. In Query Analyzer I execute the following code to see the results of my CabsSchedule_GetPrevYearMonth SP, but I only see "The command(s) completed successfully." in the results. What am I missing?
Thanks in advance
CREATE PROCEDURE CabsSchedule_GetPrevYearMonth ( @prvYearMonth int OUTPUT )
AS
BEGIN
    SET @prvYearMonth = (SELECT MAX(YearMonth) FROM CabsSchedule)
END
GO
SELECT @tmpCount returns nothing. The RIGHT(....) function does not render any results. I am expecting '0006'.
I read that the data type must be compatible with varchar. @cLastBarcode was declared as char(25). I have even tried casting the @cLastBarcode char string to varchar.
I did a trace on a production DB for many hours, and got more than 7 million "RPC:Completed" and "SQL:BatchCompleted" trace records. Then I grouped them and obtained only 545 distinct events (just EXECs and SELECTs), and saved them into a new workload file.
To test the workload file, I ran DTA for just 30 minutes over a restored database on a test server, and got the following:

Date: 28-12-2007  Time: 18:29:31
Server: SQL2K5
Database(s) to tune: [DBProd]
Workload file: C:\Temp\filtered.trc
Maximum tuning time: 31 Minutes
Time taken for tuning: 31 Minutes
Expected percentage improvement: 20.52
Maximum space for recommendation (MB): 12874
Space used currently (MB): 7534
Space used by recommendation (MB): 8116
Number of events in workload: 545
Number of events tuned: 80
Number of statements tuned: 145
Percent SELECT statements in the tuned set: 77
Percent INSERT statements in the tuned set: 13
Percent UPDATE statements in the tuned set: 8
Number of indexes recommended to be created: 15
Number of statistics recommended to be created: 50

Please note that only 80 of the 545 events were tuned, and a 20% improvement is expected if the 15 indexes and 50 statistics are created.
Then I ran the same analysis for an unlimited amount of time... After the whole weekend, DTA was still running and I had to stop it. The result was:

Date: 31-12-2007  Time: 10:03:09
Server: SQL2K5
Database(s) to tune: [DBProd]
Workload file: C:\Temp\filtered.trc
Maximum tuning time: Unlimited
Time taken for tuning: 2 Days 13 Hours 44 Minutes
Expected percentage improvement: 0.00
Maximum space for recommendation (MB): 12874
Space used currently (MB): 7534
Space used by recommendation (MB): 7534
Number of events in workload: 545
Number of events tuned: 545
Number of statements tuned: 1064
Percent SELECT statements in the tuned set: 71
Percent INSERT statements in the tuned set: 21
Percent DELETE statements in the tuned set: 1
Percent UPDATE statements in the tuned set: 5

This time DTA processed all the events, but no improvement is expected, and no index/statistics creation is recommended!
It does not seem that Tuning Advisor crashed... Usage reports are fine and make sense to me.
What's happening here? It looks like DTA applied the recommendations and iterated, but no new objects were found in the DB.
I guess the recommendations from the first try, with only 80 events, were invalidated by the remaining events in the long run.
My first foray into the SQL CLR world is a simple function to return the size of a specified file. I created the function in VS2005, where it works as expected. Running the function in SSMS, however, returns a value of zero, regardless of the file it is pointed at.
Here's the class member code:
Public Shared Function GetFileSize(ByVal strTargetFolder As String, ByVal strTargetFile As String) As Long
This always returns zero with no error displayed. Running Profiler was little help and there's not much in the Event Log. The function returns correct values in VS2005. The assembly is created with UNSAFE because using EXTERNAL_ACCESS resulted in a security error that prevented the assembly from being created, let alone running. Security is, I suspect, at the root of this issue as well, but I'm not sure what or where to look to verify this.
So I'm at a dead end looking for the reason behind the following behavior. Just to make sure no one misses it, the 'behavior' is the difference in the number of reads between using sp_executesql and not.
The following statements are executed against a SQL 2000 database that contains >1,000,000 records in the act_item table. They are run using Query Analyzer and the Duration and Reads come from SQL Profiler
SQL 2:

DECLARE @Priority int
DECLARE @Activity_Code char(36)
SET @Priority = 0
SET @Activity_Code = '46DF335F-68F7-493F-B55E-5F9BC6CEBC69'

update act_item
set Priority = @Priority
where activity_code = @activity_code
Reads: ~160 Duration: 0 ms
Random information:
Activity_code is an indexed field on the table, although it is not the primary key. There are a total of four indexes on the table, none of which includes Priority as one of its fields. There are two triggers on the table, neither of which is executed for this SQL statement (there is an IF UPDATE(fieldname) surrounding the code in each trigger). There are no foreign relationships. I checked (using Perfmon) to see if a compilation/recompilation was happening; it's not. Any suggestions as to avenues that could be examined would be appreciated.
Hi All, I am kindly seeking help. I have a table (MyTable) which is defined as (Date datetime, ID char(10), and R, P, M, D ... Y all float), and the layout is as follows:

Date     ID  R  P  M  D ... Y
1/1/90   A   1  2  3  4 ... 5
1/2/90   A   2  3  4  5 ... 1
...
2/11/05  A   3  4  5  6 ... 2
1/1/90   B   1  2  3  4 ... 5
1/2/90   B   2  3  4  5 ... 1
...
2/11/05  B   3  4  5  6 ... 2
...

The expected query results look like this (these results come from the Date, ID and R fields):

Date     A  B
1/1/90   1  1
1/2/90   2  2
...
2/11/05  3  3
The SQL I wrote:

select date,
       A = sum(case when ID = 'A' then R else 0 end),
       B = sum(case when ID = 'B' then R else 0 end)
from MyTable
group by date
I would also like to get another set of results with the same format, but from the Date, ID and P fields:

Date     A  B
1/1/90   2  2
1/2/90   3  3
...
2/11/05  4  4
select date,
       A = sum(case when ID = 'A' then P else 0 end),
       B = sum(case when ID = 'B' then P else 0 end)
from MyTable
group by date
The problem with that is if I have thousands of IDs in MyTable, I have to "hard code" thousands of times, and the same problem applies to the fields/columns. Is there any easier way to do this? I would also like to insert the results into a table/view which is refreshed whenever MyTable gets updated.
Any suggestion/comments are highly appreciated! shiparsons
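The reshaping asked for above - one output column per ID, without naming the IDs in advance - can be sketched as follows. In T-SQL the usual route is dynamic SQL that builds the CASE (or PIVOT) column list from SELECT DISTINCT ID; the Python sketch below (sample data abbreviated from the post) just shows the row-to-column logic with the IDs discovered at run time:

```python
from collections import defaultdict

# (Date, ID, R) rows from MyTable, reduced to the R measure.
rows = [
    ("1/1/90", "A", 1), ("1/2/90", "A", 2),
    ("1/1/90", "B", 1), ("1/2/90", "B", 2),
]

def pivot(rows):
    """Turn (date, id, value) rows into {date: {id: sum(value)}},
    discovering the ID columns instead of hard-coding them."""
    out = defaultdict(lambda: defaultdict(int))
    for d, ident, value in rows:
        out[d][ident] += value
    return {d: dict(vals) for d, vals in out.items()}
```

Running the same function over (Date, ID, P) rows gives the second result set with no extra code, which is the point of discovering the columns dynamically.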
I use the following sproc to populate a table that is used as the base recordset for a report.
For some reason, when the sproc is run from a scheduled job, it doesn't repopulate the table. It does, however, truncate the table. If I run it manually from query analyzer, it works fine.
I've checked all the permissions on all the object touched by the sproc, and everything looks right there. Is there another problem I should be looking for?
SET QUOTED_IDENTIFIER OFF GO SET ANSI_NULLS OFF GO
setuser N'mcorron' GO
CREATE PROCEDURE mcorron.CreateDiscOrders AS /* Creates table for Orders with disc items Actuate report */ SET NOCOUNT ON SET ANSI_WARNINGS OFF
TRUNCATE TABLE dbo.rptDiscOrders
INSERT INTO dbo.rptDiscOrders
SELECT *
FROM (SELECT ORD.product AS prod_XREF, ORD.ORDER_NUMB, ORD.CustName, ORD.units AS ordunits, INV.Product, INV.Units
      FROM (SELECT TOP 100 PERCENT f.PARENT_SITE, f.SITE, dbo.vwCustBillTo.CustName, o.ORDER_NUMB,
                   p.Prod_Xref, o.PRODUCT, o.ORDER_TONS * 2000 / m.part_wt AS UNITS
            FROM dbo.Lawn_Orders o
            INNER JOIN dbo.PRODUCT_XREF p ON o.PRODUCT = p.Product
            INNER JOIN dbo.FACILITY_MASTER f ON o.WHSE = f.SITE
            INNER JOIN dbo.Lawn_PartMstr m ON o.PRODUCT = m.part_code
            INNER JOIN dbo.vwCustBillTo ON o.BILLTO = dbo.vwCustBillTo.BillToNum
            WHERE (o.SHIP_DATE < DATEADD(d, 30, GETDATE())) AND prod_xref NOT LIKE 'dead%') ORD
      INNER JOIN (SELECT f.PARENT_SITE, x.Prod_Xref, i.Product, SUM(i.Qty) AS Units
                  FROM dbo.Lawn_Inventory i
                  INNER JOIN dbo.FACILITY_MASTER f ON i.Whse = f.SITE
                  INNER JOIN dbo.PRODUCT_XREF x ON i.Product = x.Product
                  WHERE (f.WHSE_TYPE = 'ship')
                  GROUP BY f.PARENT_SITE, x.Prod_Xref, i.Product) INV
        ON ORD.PARENT_SITE = INV.PARENT_SITE AND ORD.Prod_Xref = INV.Prod_Xref) ordinv
WHERE (Prod_Xref <> Product)
GO
setuser
GO
Do you see anything wrong with this? The first select works and finds rows; the second one does not. I have opened the key, since the first query does find rows.
select *
from [dbo].[dmTable]
WHERE cast(decryptByKey(field) as varchar(50)) = 'Value'
select *
from [dbo].[dmTable]
where field = EncryptByKey(Key_GUID('CLTCadminKey'),'Value')
I have a stored procedure that is averaging a difference in dates, in seconds. All of a sudden it started throwing an arithmetic overflow error. After running the query below on the same data, I can see that it is because the DATEDIFF in my procedure, which calculates the difference in seconds, is returning a value greater than 68 years. Looking at the dates in the result table, I don't see how it is coming up with the values in the 'years difference' column.
Code snippet:

SELECT createdate, completeddate, DATEDIFF(y, createdate, completeddate) AS 'years difference'
FROM tasks
WHERE (TaskStatusID = 3) AND (createdate < completeddate)
  AND (DATEDIFF(y, createdate, completeddate) >= 68)
ORDER BY completeddate
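Two details worth checking in a query like this: in T-SQL the datepart abbreviation y means dayofyear, not year (year is yy or yyyy), so DATEDIFF(y, ...) is effectively counting days; and DATEDIFF counts datepart boundary crossings, not elapsed time, so two dates a day apart can report a "year" of difference. Both behaviours can be mimicked in a short Python sketch:

```python
from datetime import date

def datediff_year(start: date, end: date) -> int:
    """Mimic T-SQL DATEDIFF(year, ...): it counts year-boundary
    crossings, not elapsed years."""
    return end.year - start.year

def datediff_day(start: date, end: date) -> int:
    """Mimic T-SQL DATEDIFF(day, ...) -- which is also what
    DATEDIFF(y, ...) computes, since y abbreviates dayofyear."""
    return (end - start).days
```

So a "years difference" of 68+ in the result set is consistent with the column actually holding a day count.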
I am trying to convert an ActiveX script to a Script Task. Below is a snippet of the code. The underlined AsOfDate has a blue squiggly line under it, and if I hover over it, it says "Declaration Expected."
Public Class ScriptMain
Dim AsOfDate As String
AsOfDate = Dts.Variables("MyDate").Value ...
Can someone please tell me what I'm missing? I thought maybe I'm missing an import statement, but I have:
Imports Microsoft.SqlServer.Dts.Runtime
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
I have used similar syntax in script components and it works fine.
In SQL 2005 SP1 - In my transactional replication RMO C# script, I want my snapshot job schedule to run daily at 2:58 AM.
Instead it runs hourly, on the 58th minute. The sample code below shows I use the value 025800, which should be interpreted as AM. The frequency type is daily. The frequency interval is 1. There is no subday frequency. Yet the job runs hourly and disregards the specified hour.
Is there something missing in this code? Is this a SQL Server bug?
// Set the required properties for the trans publication snapshot job.
TransPublication tpublication = new TransPublication();
tpublication.ConnectionContext = conn;
tpublication.Name = publicationName;
tpublication.DatabaseName = publicationDbName;
tpublication.SnapshotSchedule.FrequencyType = ScheduleFrequencyType.Daily;
tpublication.SnapshotSchedule.FrequencyInterval = Convert.ToInt32(0x0001);
tpublication.SnapshotSchedule.ActiveStartDate = 20051101;
string newString = "025800";
tpublication.SnapshotSchedule.ActiveStartTime = Convert.ToInt32(newString);
tpublication.Create();
What I have now: a directory with Access databases - around 20 databases, all dynamically created. Each database has on average 300 tables inside, all equally structured, all created by software. Each table has two DateTime fields, 4 double fields and 4 long int fields, and around 10,000 records on average.
The Directory is shared in a Windows 2003 Enterprise server. Around 20 users access the databases simultaneously, adding, retrieving and deleting data, over 100MBits LAN.
Here's the catch: as fast as possible, the program needs to retrieve a single record matching a single date from a given table in a given database. All databases work together. It needs to get literally thousands of individual records in order to work properly - per user. That means thousands of requests, but not much data in each request. That's its core job. A small percentage of requests write the record back, that is, update it - maybe 2% of requests.
If I were to reproduce this situation in SQL Server 2005, what would be the expected time for, let's say, 50,000 requests? Or should I stick with Access?
Hi - I'm using VWD and VB, and created a dataset/TableAdapter to insert a record into a SQL Express database. The database has a couple of columns, but specifically a DateTime column. Using the default Insert created, I have the following code:

Dim da As New partyDetailsTableAdapters.partyDetailsTableAdapter
Profile.partyid = da.Insert(Profile.UserName, tbName.Text, DateTime.Now)

The compiler throws an error though, saying 'Expression expected', and it squiggles an underline under the closing bracket after DateTime.Now. I have no problem if I'm trying to update a record using:

Dim da As New partyDetailsTableAdapters.partyDetailsTableAdapter
Dim pd As partyDetails.partyDetailsDataTable
pd = da.GetPartyDetailsByID(Profile.partyid)
da.Update(Profile.UserName, tbName.Text, DateTime.Now, Profile.partyid, Profile.partyid)

Have I an error in my Insert section? Thanks for any help, Mark
Hi, I am trying to import Excel spreadsheets into a SQL Server database. I have 7 spreadsheets; 6 of them work fine, but when I try to import the 7th I get the error "External table is not in the expected format" (System.Data.OleDb.OleDbException).
I was hoping that the Trim function inside the update command, cmdUpdate.Parameters.Add("@doc_num", Trim(txtDocNum.Text)) , would deal with any leading and trailing spaces, but it does not seem to be doing anything at all. The value from the textbox still arrives in the database table with leading spaces!!
I am trying to extract the non matching records and the matching ones in two tables from two servers, one a linked server in one go. For example if table A has records, Rec1, Rec2, Rec3 and Rec6 AND Table B has Rec1, Rec2, Rec3 and Rec7 I need to get in the result set Rec1, Rec2, Rec3, Rec6 and Rec7.
The real query I ran is as follows. I want to know the list of all tables in GlobalDB database in sg_qt1 and sg_q5 servers. NOTE : sg_q5 is a Linked server to sg_qt1.
Select Substring(a.name, 1, 30), Substring(user_name(a.uid), 1, 19)
from sysobjects a
full outer JOIN sg_q5.globaldb.dbo.sysobjects b ON a.name = b.name
where a.xtype = 'u' and b.xtype = 'u'
  and a.name not in ('dtproperties', 'rowcounts', 'globalDBrowcounts')
If I run it from sg_qt1, the result I get contain all tables from sg_qt1 but not the non-matching ones from sg_q5.
I am obviously doing something wrong, but what is it?
Thanks. If possible please reply to, r.wimalaratne@iaea.org
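A likely explanation for the missing rows above: the WHERE clause applies a.xtype = 'u' and b.xtype = 'u' after the join, which discards the NULL-extended rows that make a FULL OUTER JOIN "full", reducing it to an inner join. Per-side filters generally belong in the ON clause or in derived tables. The effect can be simulated in Python with the Rec1...Rec7 example from the post:

```python
def full_outer_join(left, right):
    """Full outer join two {name: xtype} dicts on name; a side with no
    matching name contributes None, like a NULL-extended row in SQL."""
    names = set(left) | set(right)
    return {n: (left.get(n), right.get(n)) for n in names}

a_tables = {"Rec1": "u", "Rec2": "u", "Rec6": "u"}
b_tables = {"Rec1": "u", "Rec2": "u", "Rec7": "u"}
joined = full_outer_join(a_tables, b_tables)

# WHERE a.xtype = 'u' AND b.xtype = 'u' keeps only rows where neither
# side is NULL -- exactly dropping the non-matching tables.
inner_like = {n for n, (ax, bx) in joined.items() if ax == "u" and bx == "u"}

# Filtering each side before joining (or in the ON clause) keeps them.
wanted = {n for n, (ax, bx) in joined.items() if ax == "u" or bx == "u"}
```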
select INVOICE.TarrifHeadNumber, SUM(INVOICEITEMS.ItemQuantity)
From invoiceitems, invoice
Where invoice.invoicenumber = invoiceitems.invoicenumber
  and month(InvoiceDate) = '11' and year(InvoiceDate) = '2012'
group by INVOICE.TarrifHeadNumber
Now if I use the query below and add the invoicetypecode field:
select INVOICE.TarrifHeadNumber, CETSH.GoodsDescription, SUM(INVOICEITEMS.ItemQuantity), INVOICE.invoicetypecode
From invoiceitems, invoice, cetsh
Where invoice.invoicenumber = invoiceitems.invoicenumber
  and month(InvoiceDate) = '11' and year(InvoiceDate) = '2012'
  and cast(CETSH.CETSHNumber as varchar) = INVOICE.TarrifHeadNumber
group by INVOICE.TarrifHeadNumber, CETSH.GoodsDescription, invoicetypecode
Hello, if you create this table:

create table hello (
    a int,
    b int,
    constraint pk_hello primary key clustered (a, b)
)

and then insert the following (a, b) records:

1,1  1,2  1,3  2,1  2,2  2,3  3,1  3,2  3,3

and then do

select a, b from hello

the output seems to be:

1,1  2,1  3,1  1,2  2,2  3,2  1,3  2,3  3,3

which is wrong and (I think) reflects the actual index order and physical order on disk. It should be:

1,1  1,2  1,3  2,1  2,2  2,3  3,1  3,2  3,3

I have tested this on a table with 500,000 records, and sure enough, if you declare the clustered primary key fields in reverse order:

constraint pk_hello primary key clustered (b, a)

two things happen:
- the select with no ORDER BY returns the records in the expected order
- queries relying on that order run MUCH FASTER

Has anyone else seen / noticed this?
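For what it's worth, the "wrong" ordering reported above is exactly what sorting on (b, a) produces, which suggests the rows are simply coming back in some storage-related order; without an ORDER BY, SQL Server makes no ordering guarantee at all, so an explicit ORDER BY a, b is the only reliable fix. A quick Python check of the two orderings:

```python
# All nine (a, b) combinations inserted in the post.
rows = [(a, b) for a in (1, 2, 3) for b in (1, 2, 3)]

# The order the post reports for "select a, b from hello"
# matches sorting on (b, a):
observed = sorted(rows, key=lambda r: (r[1], r[0]))

# The order an explicit ORDER BY a, b would guarantee:
guaranteed = sorted(rows)
```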
I am having problems with the user sort functionality. I have attached the report to show what I mean. textbox3, located in the table group, has the following user sort property value:
I have a package that has a container containing multiple DF Tasks.
The container is set to be Transacted, such that should any of the DF tasks fail the data inserted in any of the previous tasks rolls back.
This works as expected.
However, this container is part of a larger package and so I wanted to have a checkpoint on it, so that should any of the tasks within it fail, the package could be restarted from this container.
However, I would expect the functionality to be that on failure, the checkpoint would cause the whole container to be started again (because the container is transacted all DF task info would be rolled back) so we would expect it to start at task 1 again.
This is not the functionality I see. The package restarts from the failed task within the container every time.
According to the book Prof SSIS, it should start again from the first task and as explained this makes sense on a Transacted container as you would want this to happen.
It appears a previous forum message encountered the same issue:
See SSIS Checkpoints 04 Dec 2006.
This is an extract from it:
"I only experimented a little but my experience was that when I have a transacted container with multiple tasks that are checkpointed, SSIS would try to restart from the task that failed rather than from the first task in the container. The transaction was being rolled back correctly though.
In short, I felt that check points were not aware of transactions.
So, I ended up with this setting and it works for me:
Container is checkpointed and transacted. Tasks within the container are not checkpointed. 'FailParentOnFailure' property set to True on the tasks.
That way, if a task failed, it would fail the container and a checkpoint would be created at that level. Transaction would be rolled back as usual."
While this makes sense to me it is not the same properties that the SSIS book has that work.
Additionally, this didn't work for me either !!
I have tried every combination of FailPackageOnFailure and FailParentOnFailure that makes sense, but every time the package restarts from the failed task within the container.
The transaction is rolled back correctly every time, but it seems the checkpoint that is created is not used correctly when dealing with transactions within containers.
I have a ForEach Loop that has 3 script tasks in it.
I have them set up so that they execute in order, such as:
script1 ---> script2 ---> script3
script1 creates a file
script2 creates a file
script3 compares the files using a diff command
The problem is, when I execute the container, it shows that script3 finishes BEFORE script2, which of course gives an error because the file from script2 doesn't exist yet.
The error is "The system cannot find the file specified".
I'm reading a lot of data from a database using SqlDataReader. Now I'd like to report the progress to the user, but for this I need to know in advance how many items SqlDataReader will return.
Is there any way to get this 'number of expected results' from the SqlDataReader?
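A data reader is a forward-only stream, so the count isn't known until the last row has been read. The usual pattern is a separate SELECT COUNT(*) with the same WHERE clause before opening the reader (in ADO.NET, via SqlCommand.ExecuteScalar), accepting that the count can drift if rows change between the two queries unless both run under the same transaction or snapshot. A Python sketch of the pattern, using sqlite3 as a stand-in for SqlDataReader:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER, payload TEXT)")
conn.executemany("INSERT INTO items VALUES (?, ?)",
                 [(i, f"row {i}") for i in range(250)])

# 1. Pre-count with the same WHERE clause the real query will use.
total = conn.execute("SELECT COUNT(*) FROM items WHERE id % 2 = 0").fetchone()[0]

# 2. Stream the rows, reporting progress against the known total.
done = 0
for row in conn.execute("SELECT id, payload FROM items WHERE id % 2 = 0"):
    done += 1
    percent = 100 * done // total  # report this to the user
```

The alternative, when a second query is too expensive, is to report indeterminate progress (rows read so far) instead of a percentage.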
I have a simple column chart with point labels. The values are displaying at a -90 angle. I want the values to display in the center of the column, but when I choose the center position, it is displaying at the top; half of the value is in the bar and the other half above. Why?