I am reading a table one record at a time. Within this record a field can contain multiple values.
The delimiter is a ‘^’. The data comes from a Pick legacy system
The data looks like this :-
chargeable_item_cd quantity text_1
I77C1^I77C2 1^1 PLATES /SCREWS^1.3MM SET
I want to extract the multiple values from this field and insert a record for each set of values.
I can unravel the data easily enough. The problem I have is how to loop within a DTS ActiveX
Script to store each of the values extracted from the field before moving on to the next record.
Is it possible, or am I better off using a SQL task and taking a fraction of the time? (I am resisting
this as my boss doesn't like SQL code.)
I am doing this within a data driven query.
Thanks in advance.
Cheers
Chris
The code I have so far looks like this (but doesn't work). It gives a "No query specification returned by transform status" error.
'************************************************************
'  Visual Basic Transformation Script
'************************************************************
'  Copy each source column to the destination column
Function Main()
    DTSDestination("DHB_Key") = DTSLookups("DHB Lookup").Execute(DTSGlobalVariables("DHB_Code").Value)
    DTSDestination("Health_Encounter_Theatre_Key") = DTSSource("rule_violtd_cd")
    DTSDestination("Health_Encounter_Key") = DTSLookups("Hlth Encntr Lookup").Execute(DTSDestination("DHB_Key"), _
        DTSDestination("Patient_Key"), _
        DTSDestination("Patient_Care_Episode_Key"), _
        "TH")

    ' This piece of code splits the multi-valued chargeable-item fields into individual
    ' values that can be stored in the database.
    Dim string1, string2, sqty, sdescript, quantity, descript
    string1 = DTSSource("chargeable_item_cd")
    string2 = DTSSource("quantity")
    ' Note: sqty and sdescript were never assigned in the original script, so the IsNull
    ' tests below always fired; they need to be set from the source columns (or from the
    ' "^"-split pieces of them) before they are tested.
    sqty = DTSSource("quantity")
    sdescript = DTSSource("text_1")

    If IsNull(sqty) Then
        quantity = 0
    Else
        quantity = sqty
    End If

    If IsNull(sdescript) Then
        descript = "X"
    Else
        descript = sdescript
    End If

    ' A VBScript line-continuation underscore must be preceded by a space (", _" not ",_").
    If Not IsNull(DTSDestination("Chargeable_Items_Key")) Then
        If DTSLookups("Item Exists Lookup").Execute(DTSDestination("DHB_Key"), _
                DTSDestination("Health_Encounter_Theatre_Key"), _
                DTSDestination("Patient_Key"), _
                DTSDestination("Patient_Care_Episode_Key"), _
                DTSDestination("Health_Encounter_Key"), _
                DTSDestination("Chargeable_Items_Key"), _
                quantity, _
                descript) = 0 Then
            Main = DTSTransformStat_InsertQuery
        Else
            Main = DTSTransformStat_SkipRow
        End If
        ' The original script then set Main = DTSTransformStat_OK unconditionally, which
        ' overwrote the status above; in a data driven query OK is not a query status,
        ' which is what produces "No query specification returned by transform status".
    Else
        Main = DTSTransformStat_SkipRow
    End If
End Function
I'm a newbie so I'll explain what I'm trying to achieve the best I can ...
I'd like to essentially loop through a SQL table to display the correct results. The workflow is: the user queries the database and returns records (by property ID). In the return there are duplicate records - in this case, two property owners returned with the same property ID.
How would I loop through the SQL statement in the application (code) to identify when the property IDs are the same and display only one owner for that property?
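One way to do this in the query itself, rather than in the application loop, is to number the owners within each property and keep only the first. This is only a sketch; the table and column names (PropertyOwners, PropertyID, OwnerName) are assumptions since the real schema isn't shown:

SELECT PropertyID, OwnerName
FROM (
    SELECT PropertyID,
           OwnerName,
           ROW_NUMBER() OVER (PARTITION BY PropertyID ORDER BY OwnerName) AS rn
    FROM dbo.PropertyOwners
) p
WHERE rn = 1;

If you do keep the loop in code, the same idea applies: only display an owner when its property ID differs from the previous row's.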
Using a code snippet borrowed from a co-worker, I have put together a query that, among other things, pulls a list value out of an xml clob field and displays it in the query results. My query as it stands right now is below, followed by a snippet from the xml clob that I am pulling from.
select * from (Select Wtr_Service_Tag, Wtr_Tran_Origin, Wtr_Send_Date, Wtr_Receive_Date,
       to_char(substr(wtr_req_xml, instr(substr(wtr_req_xml, 1, 8000), 'SID') + 8, 12)) Asset_Tag
from ws_transactions
Where Wtr_Service_Tag In ('20458749610')
[Code] ....
This query is only able to pull the first value in the list.
I have two questions...
[1] How can I edit this query to pull all of the list items when there is more than one? I have another field, in a separate table, that I can pull from to get that number.
[2] This one may be more complex. As currently written, the query pulls a fixed number of characters from the xml clob and either returns not enough data, or too much, because the values I need to pull can be of varying lengths. I have no way to query what those lengths might be.
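If the database is Oracle (the CLOB column and the INSTR/SUBSTR calls suggest it is), one way around both questions is to let the XML functions do the extraction instead of a fixed-offset substring. This is only a sketch: the SID element name and the XPath are assumptions and would need to be adjusted to the real structure of wtr_req_xml.

-- Returns one row per SID value in the clob, whatever its length or position.
SELECT w.wtr_service_tag,
       w.wtr_tran_origin,
       w.wtr_send_date,
       w.wtr_receive_date,
       x.asset_tag
FROM   ws_transactions w,
       XMLTABLE('//SID'
                PASSING XMLTYPE(w.wtr_req_xml)
                COLUMNS asset_tag VARCHAR2(50) PATH '.') x
WHERE  w.wtr_service_tag IN ('20458749610');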
I am attempting to use the foreach loop structure in an SSIS package to loop through however many Excel files are placed in a directory and then perform an import operation into a SQL table on each of these files sequentially. The closest model for this that I was able to find in the MS tutorial used a flat file source rather than Excel. That involved adding a new expression to the Connection Manager that set the connection string to the current filename, as provided by the foreach component. That works just fine, but when I attempt to apply the same method to an Excel source, rather than a flat file source, I cannot get it to work. I see the following error associated with the Excel source on the Data Flow page: "Validation error. Data Flow Task: Excel Source [1]: The AcquireConnection method call to the connection manager "Excel Connection Manager 1" failed with error code 0xC020200." I think that it's just a matter of getting the right expression, and I thought that perhaps I should be constructing an expression for ExcelFilePath rather than the Connection String, but I have fiddled with it for hours and haven't come up with something that will be accepted. Has anybody out there been able to do this, or can perhaps refer me to some documentation that contains an example of what I am trying to do? Thanks for any help you can give.
SET @SQL = 'Select * FROM IdentipassNew.dbo.CBORD_Interface_Final'
SET @BCPBody = 'bcp "' + @SQL + '" queryout "d:smartcardcbordudfcbordbody.txt" -T -fc:cpbody.fmt'
Problem is, there are over 85,000 records in that set, and that is too big for the text file, so I was wondering if it would be possible to select, say, 30,000 records and output those to a text file, then select the next 30,000 and create another file, then finally get the remaining records and put them in another text file. Can someone point me in the right direction as to how to accomplish this?
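One possible shape for this (a sketch only: it assumes xp_cmdshell is available, that the source table has no identity column of its own, and the output path and -c format are stand-ins for the real format file) is to stage the rows with a contiguous row number and then bcp them out 30,000 at a time:

-- Stage the rows with a contiguous row number.
SELECT IDENTITY(INT, 1, 1) AS RowNum, *
INTO   IdentipassNew.dbo.CBORD_Export_Staging
FROM   IdentipassNew.dbo.CBORD_Interface_Final

DECLARE @start INT, @batch INT, @file INT, @SQL VARCHAR(1000), @BCPBody VARCHAR(2000)
SET @start = 1
SET @batch = 30000
SET @file  = 1

WHILE EXISTS (SELECT 1 FROM IdentipassNew.dbo.CBORD_Export_Staging WHERE RowNum >= @start)
BEGIN
    SET @SQL = 'SELECT * FROM IdentipassNew.dbo.CBORD_Export_Staging WHERE RowNum BETWEEN '
             + CAST(@start AS VARCHAR(10)) + ' AND ' + CAST(@start + @batch - 1 AS VARCHAR(10))
    SET @BCPBody = 'bcp "' + @SQL + '" queryout "d:\cbordbody_' + CAST(@file AS VARCHAR(10)) + '.txt" -T -c'
    EXEC master..xp_cmdshell @BCPBody

    SET @start = @start + @batch
    SET @file  = @file + 1
END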
Currently I am looping through a set of flat files like CHK0604, CHK0611, CHK0618, and CHK0625 from the source folder C:\SOURCE.
OBJECTIVE: if any records/rows within a flat file cause an error, I have to move the bad data into a separate folder, C:\ERROR.
STEPS TAKEN
1) In the FOREACH LOOP component I specified the variable User::sourceFilePath for my source files (CHK0604 etc.) in C:\SOURCE. The loop walks through each file in C:\SOURCE and, if there is no error, moves the flat file into another folder, C:\ARCHIVED. This task is working perfectly.
2) Within the data flow I am diverting the bad rows from the Conditional Split component into a "Flat File Destination" component.
3) On the "Flat File Destination" connection manager I set the expression to @[User::sourceFilePath] + "_Error.TXT".
ISSUE
Because of point (3) the error file is created in the SOURCE flat file location, C:\SOURCE.
QUESTION
1) My error files should be named CHK0604_Error, CHK0611_Error, CHK0618_Error, CHK0625_Error and created in another folder, C:\ERROR.
2) How do I move the bad data into another directory while looping through a set of flat files?
3) If I have to create another variable like @[User::ErrorFilePath], where do I create it? And how do I use the source file name as the name of the error file?
I'm trying to create a report that will be data-driven and produce multiple reports based on a parameter. For example, 5 reports go out today based on invoice numbers from yesterday. Each invoice can have multiple transaction lines and each line contains the invoice number. What I have so far is only taking the first invoice number (let's say it has 10 transactions) and sending me the same report 10 times and stopping.
I get nothing for the remaining 4 invoices/reports. Here is what I have. To me this should enter the distinct invoice numbers from yesterday into #temp; then, while an invoice number exists, begin the query with one invoice selected from #temp, delete that invoice number, select another one, and repeat till there are no more invoice numbers. But it's only going through the one invoice number.
select distinct InvoiceNo into #temp from table where Invoicedate between getdate()-1 and getdate()
declare @InvoiceNo VARCHAR(25)
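A sketch of the missing WHILE part, using the names from the snippet above (the report query itself is represented by a comment); if only one invoice is ever processed, the usual culprit is that the DELETE, or the re-select of @InvoiceNo, never happens inside the loop:

SELECT DISTINCT InvoiceNo
INTO   #temp
FROM   table            -- the invoice table from the snippet above
WHERE  InvoiceDate BETWEEN GETDATE() - 1 AND GETDATE()

DECLARE @InvoiceNo VARCHAR(25)

WHILE EXISTS (SELECT 1 FROM #temp)
BEGIN
    SELECT TOP 1 @InvoiceNo = InvoiceNo FROM #temp

    -- run the report query for @InvoiceNo here

    DELETE FROM #temp WHERE InvoiceNo = @InvoiceNo
END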
This is how I calculate the ratio of failures in an order:

31 Days (Table 1 query):
sum(CASE WHEN (datediff(dd,serDATE,'2015-01-21')) >= 31 THEN 31 WHEN (datediff(dd,serDATE,'2015-01-21')) < 0 THEN 0 ELSE (datediff(dd,serDATE,'2015-01-21')) END) as 31days1

How do I loop and pass dates dynamically into the DATEDIFF?

31 Failures (Table 2 query):
SUM(Case when sometable.FAILUREDATE BETWEEN dateadd(DAY,-31,CONVERT(DATETIME, '2015-01-21 23:59:00.0', 102)) AND CONVERT(DATETIME, '2015-01-21 23:59:00.0', 102) Then 1 Else 0 END) As Failures31

31 Day Cal (formula combining both Table 1 and Table 2):
((365*(Convert(decimal (8,1),T2.Failures31)/T1.31day))) [31dayCal]

This works fine when done for a specific order. I want a similar kind of calculation done day wise and month wise. What approach should I be using to achieve the day wise and month wise calculation?
I also have a table called Calender with the list of dates that I can use.
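One way to avoid passing dates one at a time is to join to that Calender table and let each calendar date play the role of the literal '2015-01-21'. A sketch only: CalendarDate is an assumed column name, and Table1 stands in for the real table behind the Table 1 query:

SELECT c.CalendarDate,
       SUM(CASE WHEN DATEDIFF(dd, t1.serDATE, c.CalendarDate) >= 31 THEN 31
                WHEN DATEDIFF(dd, t1.serDATE, c.CalendarDate) < 0  THEN 0
                ELSE DATEDIFF(dd, t1.serDATE, c.CalendarDate)
           END) AS [31days1]
FROM   Calender c
CROSS JOIN Table1 t1
GROUP BY c.CalendarDate

The month-wise version would group on the month of c.CalendarDate (for example DATEADD(MONTH, DATEDIFF(MONTH, 0, c.CalendarDate), 0)) instead of the date itself.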
I'm having an issue. I'm not sure how I am going to carry it out, but I have two tables in SQL Server 2005:

Category: (PK) CategoryName, CategoryID, Date
SubCategory: (PK) SubCategoryName, SubCategoryID, Date (just shows the date inserted), (FK) CategoryID

On the front page, I need it to query out the CategoryName from Category, but also query out (well, not all, but at least 5) subcategories that relate to that CategoryName. Once it's done it moves to the next category and does the same, and so on. Does anyone know the trick?
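Since this is SQL Server 2005, CROSS APPLY with TOP can do the "at least 5 subcategories per category" part in one statement. A sketch, assuming the table and column names listed above:

SELECT c.CategoryName, s.SubCategoryName
FROM   Category c
CROSS APPLY (
    SELECT TOP 5 sc.SubCategoryName
    FROM   SubCategory sc
    WHERE  sc.CategoryID = c.CategoryID
    ORDER BY sc.Date DESC        -- newest subcategories first
) s
ORDER BY c.CategoryName;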
I want to loop through a recordset and do inserts into another table based on each record.
The way I have been doing it is to copy my key data into a temp table, loop through the temp table finding the max ID, do what I need to do, delete the max, then find the new max and loop until no records exist.
I know there has to be a better way. The table I am working with has millions of records. Thanks in advance, Chris Reeder
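For millions of rows the better way is usually no loop at all: a single set-based INSERT ... SELECT that expresses the per-record logic as columns. A minimal sketch with made-up names, just to show the shape:

INSERT INTO dbo.TargetTable (CustomerID, TotalAmount)
SELECT s.CustomerID, SUM(s.Amount)
FROM   dbo.SourceTable s
GROUP BY s.CustomerID;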
I need to loop through a set of tables and move the data through a data pump from one server to another. This set of tables is dynamic, so I have created a global recordset, and the looping is working fine.
During the looping process I need to change the transformations for each table so the source, destination, and transformation of the datapump are correct for the next table in the loop. I am using a VBS to handle this right now but cannot get the transformation to change. I essentially want to auto-remap using a vbs script. Is this possible?
Hello clever people. I have a table that holds duplicates that I want to change into a table that has no duplicates. The current table (called name) is this:

compound_id  integer
name         varchar(150)
name_type    integer

This table stores chemical names. There is no primary key in the table, so there are multiple compound_ids. I think the original idea was to have four name_types:

1 = chemical name
2 = a description of the chemical
3 = a synonym of the chemical
4 = a formula of the chemical
I have created a new table called compound_name with this structure
id                   int, primary key (auto identity)
compound_id          int, used as a foreign key
compound_name        varchar(150)
compound_desc        varchar(250)
compound_synonym     varchar(150)
compound_formula     varchar(50)
compound_trade_nme   varchar(50)
I have also started to populate the new table by running this code:

insert into compound_name (compound_id, compound_name)
SELECT DISTINCT compound_id, name
FROM dbo.name
WHERE (name_type = 1)
Now I need to somehow loop through the name table getting distinct compound_ids, and handle the case when name_type = 2 (which is the synonym name_type): inside the loop, update compound_name.compound_synonym for each compound_id which matches name_type 2. Then do the same for name_type 3, which is the name_type for description, and again for name_type 4, which is the formula.
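This particular step doesn't need a loop: one UPDATE with a join per name_type fills in each column. A sketch using the structures above (note that if a compound has more than one row of a given name_type, this keeps an arbitrary one of them):

-- Synonyms (name_type = 2 as described above); repeat with name_type = 3 for
-- compound_desc and name_type = 4 for compound_formula.
UPDATE cn
SET    cn.compound_synonym = n.name
FROM   compound_name cn
INNER JOIN dbo.name n ON n.compound_id = cn.compound_id
WHERE  n.name_type = 2;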
Hi there, I am new to SQL and am having trouble looping a script. I have the following script that needs to be refreshed a large number of times, or needs to be looped indefinitely until stopped:
select df.tablespace_name "Tablespace",
       block_size "Block Size",
       (df.totalspace - fs.freespace) "Used MB",
       fs.freespace "Free MB",
       df.totalspace "Total MB",
       round(100 * (fs.freespace / df.totalspace)) "Pct. Free"
from dba_tablespaces ts,
     (select tablespace_name, round(sum(bytes) / 1048576) TotalSpace
      from dba_data_files
      group by tablespace_name) df,
     (select tablespace_name, round(sum(bytes) / 1048576) FreeSpace
      from dba_free_space
      group by tablespace_name) fs
where ts.tablespace_name = fs.tablespace_name
  and df.tablespace_name = fs.tablespace_name(+);
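This is an Oracle query (dba_data_files, dba_free_space), and a plain SELECT can't loop by itself. One option is a small PL/SQL block that re-runs it on a timer; a sketch only, assuming SQL*Plus with SET SERVEROUTPUT ON and EXECUTE rights on DBMS_LOCK, and refreshing 10 times a minute apart rather than truly indefinitely:

BEGIN
  FOR i IN 1 .. 10 LOOP
    FOR r IN (
      SELECT df.tablespace_name,
             df.totalspace - fs.freespace AS used_mb,
             fs.freespace  AS free_mb,
             df.totalspace AS total_mb,
             ROUND(100 * (fs.freespace / df.totalspace)) AS pct_free
      FROM  (SELECT tablespace_name, ROUND(SUM(bytes) / 1048576) AS totalspace
             FROM dba_data_files GROUP BY tablespace_name) df,
            (SELECT tablespace_name, ROUND(SUM(bytes) / 1048576) AS freespace
             FROM dba_free_space GROUP BY tablespace_name) fs
      WHERE df.tablespace_name = fs.tablespace_name
    ) LOOP
      DBMS_OUTPUT.PUT_LINE(RPAD(r.tablespace_name, 30) ||
                           ' used=' || r.used_mb || ' free=' || r.free_mb ||
                           ' total=' || r.total_mb || ' pct_free=' || r.pct_free);
    END LOOP;
    DBMS_LOCK.SLEEP(60);   -- wait 60 seconds before the next refresh
  END LOOP;
END;
/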
I know this question may have a very easy solution, but I have no idea how to solve it.
Bi Ar Ar
Bi Ar Ch
Bi Ar Ma
Bi Au Ar
Bi Au Ch
Bi Au Ma
As Ar Ar
As Ar Ch
As Ar Ma
As Au Ar
As Au Ch
As Au Ma
As Au Ma
I have 3 columns: S, D, C. They hold text values. I need to write a query that will check each row for distinct values. For example, all the rows above are distinct except the last one, so I need to see all the duplicate entries. Can anyone help me?
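A sketch of the usual duplicate check (the table name is an assumption): group on all three columns and keep the groups that occur more than once.

SELECT S, D, C, COUNT(*) AS Occurrences
FROM   dbo.MyTable
GROUP BY S, D, C
HAVING COUNT(*) > 1;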
Hi All, I would like to know the best way to approach the following requirement: I have an ASP.NET 2 web site which gets its data from SQL 2005. I am trying to run a series of 'rules', which are SQL WHERE statements stored in a table, against rows stored in another table.

I open the 'Rules' table, looping through all records. I copy each rule to a string and put it on the end of the SQL statement, so that a record will only come back if it passes the rule... this may be a little confusing. The rules process will fire when the details have been submitted to the database. The table containing the rules would contain something like:

ID, RuleSQL
1, (ClientAge > 18)
2, (ClientIncome > 10000)
3, Etc...

This is a very simplified version of the table but gives the general idea. I currently use ASP.NET 2 and SqlConnections/DataReaders to do this. I would like to know if there is a way of doing the same thing server side using Transact-SQL, because that would (I believe) speed up the time taken to perform all the tests, as I wouldn't need to rely on ASP to open all the recordsets and append the data. If the ASP route would be the standard way of doing it and is not likely to have a detrimental effect on performance then I am fine to stick with it, because I know it works. Any comments or suggestions would be welcomed. Thanks, Ian
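It can be done server side with a cursor over the rules table and dynamic SQL. A sketch only: dbo.Rules matches the layout described above, while dbo.Clients, ClientID and the literal id 12345 are stand-ins for the real submitted row.

DECLARE @RuleID INT, @RuleSQL NVARCHAR(500), @sql NVARCHAR(1000), @passes INT

DECLARE rule_cursor CURSOR FOR
    SELECT ID, RuleSQL FROM dbo.Rules

OPEN rule_cursor
FETCH NEXT FROM rule_cursor INTO @RuleID, @RuleSQL

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'SELECT @cnt = COUNT(*) FROM dbo.Clients WHERE ClientID = @id AND ' + @RuleSQL
    EXEC sp_executesql @sql, N'@cnt INT OUTPUT, @id INT', @cnt = @passes OUTPUT, @id = 12345

    IF @passes = 0
        PRINT 'Rule ' + CAST(@RuleID AS VARCHAR(10)) + ' failed'   -- or log to a results table

    FETCH NEXT FROM rule_cursor INTO @RuleID, @RuleSQL
END

CLOSE rule_cursor
DEALLOCATE rule_cursor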
I have an array (12,2) of values plus a profile variable that I want to pass as parameters while writing to a database. I've been told that I've set up the parameters wrong, and that they cannot be changed every time I loop using the method I'm using. But I have no idea how to use any other method. Please... I'm down to the wire in terms of deadline here. I have until midnight to get it uploaded and running online.

[CODE]Sub WriteClasses(ByVal CreditsArray)
    Dim i As Integer
    Dim EnrollDb As SqlConnection
    Dim cmdEnroll As SqlCommand
    EnrollDb = New SqlConnection("Server=LONNASQLEXPRESS;Integrated Security=True;database=LGordonTouroReg")
    cmdEnroll = New SqlCommand("INSERT INTO Enrollment (SectionID, Semester, Year, ClassID, StudentID) VALUES (@SectionID, 'Fall', '2007', @ClassID, @StudentID)", EnrollDb)
    EnrollDb.Open()
    For i = 0 To 12
        ' Clear the previous iteration's parameters; calling AddWithValue again without
        ' clearing adds duplicate parameter names and the command fails.
        cmdEnroll.Parameters.Clear()
        cmdEnroll.Parameters.AddWithValue("@SectionID", CreditsArray(i, 2))
        cmdEnroll.Parameters.AddWithValue("@ClassID", CreditsArray(i, 0))
        cmdEnroll.Parameters.AddWithValue("@StudentID", Profile.StudentID)
        If Not CreditsArray(i, 0) = "" Then
            cmdEnroll.ExecuteNonQuery()
            Response.Write(CreditsArray(i, 0) & " has been added to your schedule.<br/>")
        End If
    Next i
    EnrollDb.Close()
End Sub[/CODE]
Let's say I have a stored procedure (for an insert command) which I am calling in my code to execute. The user-provided data is being stored in an array. My class takes the stored procedure name and also takes the parameter names and types. Is there any way to loop through the parameters (the various columns in the table, which are of different data types, i.e. varchar, int, etc.)? How would I implement it?
Hoping for a little help... I'm attempting to call a stored proc, pass parameters, and display the data 1 record at a time. I need to be able to show the data in a series of labels or text boxes. So the user will see one record pushed into the labels, click a button, and go to the next record... so on and so forth.
I think I have the code to get the data correct; it's displaying the data in labels and looping through the recordset that has me clueless.
Private Sub Page_Load(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles MyBase.Load
    'Put user code to initialize the page here
    If Not Page.IsPostBack Then ' IF this is the first page load
        Dim UserID As String = Request.QueryString("UserID") ' parameter for stored procedure
        Dim RoleID As String = Request.QueryString("RoleID")

        Dim DS As DataSet
        Dim MyConnection As SqlConnection
        Dim MyCommand As SqlDataAdapter

        MyConnection = New SqlConnection(System.Configuration.ConfigurationSettings.AppSettings("connectionString"))
        MyCommand = New SqlDataAdapter("getdirective", MyConnection)
        MyCommand.SelectCommand.CommandType = CommandType.StoredProcedure
        MyCommand.SelectCommand.Parameters.Add(New SqlParameter("@roleID", SqlDbType.NVarChar)).Value = RoleID

        Try
            DS = New DataSet
            MyCommand.Fill(DS)

            'Display data in a series of labels or highly formatted datagrid

        Catch ex As Exception
            Response.Write("<font color=red>Error: " & ex.Message & "</font>")

        End Try

    Else 'IF the page is being reloaded

    End If
End Sub

Private Sub cmdAck_Click(...) Handles cmdAck.Click 'This needs to loop through the records
I'm pretty new to T-SQL and have an *easy* problem, for you experts, that I can't get seem to get solved. I'd like to loop through a list of items in TABLE "Items". I then want to use that list to loop through and SUM SALES and QTY for each item from a TABLE called "Shipments". As I loop through each item, I want to UPDATE the "Items" table with the Summary data. So, logically I'd do something like this:
SELECT item_no FROM Items
BEGIN
SELECT SUM(sales) AS Total_Sales, SUM(qty) AS Total_Qty FROM Shipments WHERE item_no=@item_no
UPDATE Items SET Sales=@Total_Sales, Qty=@Total_Qty WHERE item_no=@item_no
END
I've tried somewhat successfully to use cursors to create my loop query, but I cannot seem to get the SELECT and UPDATE correct in the loop itself. Can anyone steer me in the right direction (or better yet, provide a solution)?
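For what it's worth, this particular case may not need a cursor at all; a correlated UPDATE (using the names from the post, and assuming Shipments holds item_no, sales and qty) does the whole thing in one statement:

UPDATE Items
SET Sales = (SELECT SUM(s.sales) FROM Shipments s WHERE s.item_no = Items.item_no),
    Qty   = (SELECT SUM(s.qty)   FROM Shipments s WHERE s.item_no = Items.item_no);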
<!--- Update the DISTANCE field on STORE table --->
<cfquery name="UpdateZips" datasource="#application.data#" username="#application.username#" password="#application.password#">
    exec Stores_UpdateZipSeachInfo '#Dist#', '#zip2.zipcode#'
</cfquery>
</cfloop>
I am not sure if what I wish to do is possible, but I shall ask anyway;
My project examines a database log of all the pages of an online teaching tool. Once the user has completed all the pages they are to be issued a certificate. Users may complete the teaching tool in any order, and the pages are always stored whenever they are accessed, regardless of certification. I have created a number of views that extract the data into a list of all the possible completion dates; i.e. where all the pages have been completed within any 12 month period. I need to write a query/view that uses that view to extract the first possible user completion date, followed by every completion 12 months after that, then 12 months after that, and so on up to the present day. Can I do this? Am I making sense?
A no is acceptable in this case; I know I can do this with multiple queries from within an application. I'd just rather not.
I have a select query that returns multiple rows (within a cursor). How do I loop through the rows to process them (in a stored proc)? I do not want to use nested cursors. A code sample is requested.
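A sketch of a cursor-free loop, assuming a hypothetical table dbo.Orders with a unique OrderID key and some filter; it walks the qualifying rows one at a time by seeking the next key value:

DECLARE @OrderID INT

SELECT @OrderID = MIN(OrderID) FROM dbo.Orders WHERE Status = 'OPEN'

WHILE @OrderID IS NOT NULL
BEGIN
    -- process the single row identified by @OrderID here
    PRINT 'Processing order ' + CAST(@OrderID AS VARCHAR(10))

    SELECT @OrderID = MIN(OrderID)
    FROM   dbo.Orders
    WHERE  Status = 'OPEN' AND OrderID > @OrderID
END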
I've got one table with two columns:

Column Name   Data Type
1) Id         Integer (Identity)
2) RemDate    DateTime
I have to write one SP/job that takes an integer input parameter, @numofday.
Say the value of @numofday is 5; then in the SP/job I need to insert 31 - 5 = 26 records into the above-mentioned table, with dates starting from the 1st of the current month.
This logic can be achieved through looping, but can anyone suggest a better way to achieve this functionality without the use of looping?
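A set-based sketch that avoids the loop, using master.dbo.spt_values as a ready-made numbers table; the target table name dbo.RemTable is an assumption, and the 31 is hard-coded the same way it is in the question:

DECLARE @numofday INT
SET @numofday = 5

INSERT INTO dbo.RemTable (RemDate)
SELECT DATEADD(DAY, v.number, DATEADD(MONTH, DATEDIFF(MONTH, 0, GETDATE()), 0))
FROM   master.dbo.spt_values v
WHERE  v.type = 'P'
  AND  v.number < 31 - @numofday;   -- 26 rows when @numofday = 5, starting at the 1st of the month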
I have created a job that I have scheduled to run every 10 min. Everything is configured well, since I have tested pretty much everything there is to be tested, and found that it was my last step, which has a fetch in it, so I imagine that this fetch is making it loop over and over again. The job goes through all the steps and starts back at the first step and keeps going like that till I disable it. Here is my fetch statement; if you have any clue, any help would be widely appreciated.
PS: I suspect it to be that fetch statement causing the havoc ;)
DECLARE TransactionNb_cursor CURSOR FOR SELECT TransactionNb, EqId FROM DetCom WHERE UpdCode = 'C'
OPEN TransactionNb_cursor
FETCH NEXT FROM TransactionNb_cursor INTO @TransactionNb, @EqId
WHILE @@FETCH_STATUS <> -1
BEGIN
    -- Check whether a transaction with UpdCode = 'C' exists in EntCom.
    -- Note: the original used CONTINUE here when the code was 'C'; CONTINUE jumps back to
    -- the WHILE test without reaching the FETCH NEXT below, so @@FETCH_STATUS never changes
    -- and the same row is processed forever, which would keep the job looping.
    IF ISNULL((SELECT UpdCode FROM EntCom WHERE TransactionNb = @TransactionNb AND EqId = @EqId), '') <> 'C'
    BEGIN
        RAISERROR (50006, 10, 0, @TransactionNb, @EqId)
    END

    FETCH NEXT FROM TransactionNb_cursor INTO @TransactionNb, @EqId
END

CLOSE TransactionNb_cursor
DEALLOCATE TransactionNb_cursor
ID  Branch  Status
1   A       ACTIVE
1   B       INACTIVE
1   C       INACTIVE
2   B       INACTIVE
2   C       INACTIVE
2   D       INACTIVE
3   A       ACTIVE
3   B       ACTIVE
3   C       ACTIVE
4   B       ACTIVE
5   D       INACTIVE
----------------------------------------------------------------
Following is the desired view that I need for the above table. Any ID which has at least one ACTIVE branch will have ACTIVE status, and any company which has all of its branches INACTIVE will have INACTIVE status. Thanks for your help.
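A sketch of such a view, with hypothetical names Branches(ID, Branch, Status): an ID is ACTIVE if at least one of its branches is ACTIVE, otherwise INACTIVE.

CREATE VIEW dbo.CompanyStatus
AS
SELECT ID,
       CASE WHEN SUM(CASE WHEN Status = 'ACTIVE' THEN 1 ELSE 0 END) > 0
            THEN 'ACTIVE'
            ELSE 'INACTIVE'
       END AS Status
FROM dbo.Branches
GROUP BY ID;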
I want to apply the same query to 250 tables (see below). It would be very painful to code 250 individual queries. Is there a way to loop through all 250 tables using "sysobjects WHERE type = 'U'" and apply the below code for each table (the users table would remain constant)?
select t.* from company t where not exists (select * from users u where (u.users_id = t.rn_create_user) or (u.users_id = t.rn_edit_user))
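A sketch of the dynamic-SQL loop; it assumes every one of the 250 user tables has the rn_create_user and rn_edit_user columns, and it simply runs the query once per table (writing the results somewhere would be an extra step):

DECLARE @tbl SYSNAME, @sql NVARCHAR(2000)

DECLARE tbl_cursor CURSOR FOR
    SELECT name FROM sysobjects WHERE type = 'U'

OPEN tbl_cursor
FETCH NEXT FROM tbl_cursor INTO @tbl

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'SELECT t.* FROM ' + QUOTENAME(@tbl) + N' t '
             + N'WHERE NOT EXISTS (SELECT * FROM users u '
             + N'WHERE u.users_id = t.rn_create_user OR u.users_id = t.rn_edit_user)'
    EXEC sp_executesql @sql

    FETCH NEXT FROM tbl_cursor INTO @tbl
END

CLOSE tbl_cursor
DEALLOCATE tbl_cursor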
Wondering if anybody could help: is there a way I could loop through a number of tables and append them to a new database? At the moment I'm writing a SQL statement for each table, and when there are about 50 this becomes slightly tedious...
This is my SQL statement in an SP:
INSERT INTO DB2.dbo.datatable1 SELECT * FROM DB1.dbo.datatable1;
INSERT INTO DB2.dbo.datatable2 SELECT * FROM DB1.dbo.datatable2;
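One low-tech way to avoid writing the 50 statements by hand is to generate them from sysobjects and then run the output; a sketch, assuming every table exists with the same name and structure in DB2:

-- Run against DB1, copy the result set into a query window, and execute it.
SELECT 'INSERT INTO DB2.dbo.' + name + ' SELECT * FROM DB1.dbo.' + name + ';'
FROM   DB1.dbo.sysobjects
WHERE  type = 'U'
ORDER BY name;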
I need to create a cursor that will loop through my customer database to return matching rows of data based on my select statement criteria. I have written most of it based on what I remember from my limited SQL exposure at a previous job a few years ago, but I can't remember how to make the @cust_id variable increment by 1 and loop to the end of the customer table.
Can anyone steer me in the right direction here please?
DECLARE @cust_id INT
SET @cust_id = 371

DECLARE my_cursor CURSOR FOR
    SELECT CUSTOMER_ID, FULL_NAME, ADDRESS_LINE1, SUBURB, STATE, POSTCODE
    FROM CUSTOMER_LANGUAGE_DETAILS
    WHERE POSTCODE IN (SELECT POSTCODE FROM CUSTOMER_LANGUAGE_DETAILS
                       WHERE CUSTOMER_ID = @cust_id AND INACTIVE = 0)
      AND CUSTOMER_ID <> @cust_id

SELECT CUSTOMER_ID, FULL_NAME, ADDRESS_LINE1, SUBURB, STATE, POSTCODE
FROM CUSTOMER_LANGUAGE_DETAILS
WHERE CUSTOMER_ID = @cust_id AND INACTIVE = 0

OPEN my_cursor
--SET @cust_id = @cust_id + 1
FETCH NEXT FROM my_cursor
WHILE @@FETCH_STATUS = 0
BEGIN
    FETCH NEXT FROM my_cursor
END
CLOSE my_cursor
DEALLOCATE my_cursor
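Rather than incrementing @cust_id by hand (customer IDs are rarely contiguous), the cursor itself can supply each customer id in turn; a sketch built from the statements above:

DECLARE @cust_id INT

DECLARE cust_cursor CURSOR FOR
    SELECT CUSTOMER_ID FROM CUSTOMER_LANGUAGE_DETAILS WHERE INACTIVE = 0

OPEN cust_cursor
FETCH NEXT FROM cust_cursor INTO @cust_id

WHILE @@FETCH_STATUS = 0
BEGIN
    SELECT CUSTOMER_ID, FULL_NAME, ADDRESS_LINE1, SUBURB, STATE, POSTCODE
    FROM   CUSTOMER_LANGUAGE_DETAILS
    WHERE  POSTCODE IN (SELECT POSTCODE FROM CUSTOMER_LANGUAGE_DETAILS
                        WHERE CUSTOMER_ID = @cust_id AND INACTIVE = 0)
      AND  CUSTOMER_ID <> @cust_id

    FETCH NEXT FROM cust_cursor INTO @cust_id
END

CLOSE cust_cursor
DEALLOCATE cust_cursor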
Hopefully someone can assist me with what appears to be a simple issue - and hopefully I am just looking in the wrong location - because I am almost going nuts trying to work this out!!
I am wanting to pick up a file from a given folder, do some transformations on the data within the file, and then dump each row within the original file as a separate txt file into a new folder.
I have managed to get it working - well, all except having each row as a separate txt file. Currently all the rows are output into the same txt file. Argh.
As it stands I have a For Each Loop Container; within this I have a Data Flow task, which in itself just has the source, some derived columns, and then an output.
How can I get this to pull each row from the source and write it out as a separate txt file? If someone can just nudge me in the right direction it would be much appreciated.