Creating Stored Procs That Need To Continuously Append To A New Table (This Is To Scrub Data That Is Imported Into The DB).
Jan 9, 2005
I have 1 table with a huge amount of data that I receive from someone else in a flat file format. I want to be able to filter through that data, scrub it, and separate the good data from the bad data.
I'm scrubbing the data using different stored procs that I've created, through a web interface that lets the user pick which records they wish to create.
If I were to create a new table for clean records, what is the syntax to keep appending to that table the data that I'm obtaining via the stored procs I've created?
Any thoughts or suggestions are greatly appreciated in advance.
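In case it helps, here is a minimal sketch of the usual pattern, assuming a staging table called dbo.ImportStaging and a clean table called dbo.CleanRecords (all names and columns here are invented for illustration):

-- Create the clean table once, with the columns you want to keep.
CREATE TABLE dbo.CleanRecords
(
    RecordID   int           NOT NULL,
    CustomerID int           NOT NULL,
    Amount     decimal(10,2) NULL
);
GO

-- Each time a scrubbing proc identifies good rows, append them with INSERT ... SELECT.
INSERT INTO dbo.CleanRecords (RecordID, CustomerID, Amount)
SELECT s.RecordID, s.CustomerID, s.Amount
FROM dbo.ImportStaging AS s
WHERE s.IsValid = 1;   -- stand-in for whatever "good data" test the proc applies

INSERT INTO never replaces what is already in the target table, so each call simply keeps appending new rows.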
I have written a piece of code in a stored procedure that builds a string called "filter$" based on fields in a table. How do I use this string as my WHERE clause?
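The usual approach is dynamic SQL. A minimal sketch, assuming the built-up string is held in a variable called @filter and the target table is dbo.Orders (both names invented):

DECLARE @filter nvarchar(4000);
DECLARE @sql    nvarchar(4000);

SET @filter = N'Status = ''Open'' AND Region = ''West''';   -- example filter only

-- Concatenate the filter onto the statement and execute the whole string.
SET @sql = N'SELECT * FROM dbo.Orders WHERE ' + @filter;
EXEC sp_executesql @sql;

Keep in mind that dynamic SQL built from strings is open to SQL injection if any part of the filter comes from user input.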
We have a product that was experiencing high recompile rates due to temp tables, so we have rewritten the procedures to use table variables to reduce the recompile rate. The problem I am having is that we have some customers still on SQL 7, where table variables are not valid. I would like to have one installation script that would create the stored procedure with table variables on SQL Server 2000 and without them on SQL Server 7, but I have been unsuccessful due to the limitation that a CREATE PROCEDURE statement cannot be in a batch with any other statements. Was wondering if anyone else has run into this and found a way around it.
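One workaround to consider (a sketch only, not tested on SQL 7) is to wrap each CREATE PROCEDURE in dynamic SQL, since EXEC runs its string as its own batch, and branch on the server version. The proc name and bodies below are hypothetical:

-- @@VERSION contains "8.00" on SQL Server 2000 and "7.00" on SQL Server 7.
IF CHARINDEX('8.00', @@VERSION) > 0
    EXEC ('CREATE PROCEDURE dbo.usp_Demo AS
           DECLARE @t TABLE (ID int);
           INSERT INTO @t (ID) SELECT 1;
           SELECT ID FROM @t;')
ELSE
    EXEC ('CREATE PROCEDURE dbo.usp_Demo AS
           CREATE TABLE #t (ID int);
           INSERT INTO #t (ID) SELECT 1;
           SELECT ID FROM #t;
           DROP TABLE #t;')

Because the string in the branch that is not taken is never executed, the table-variable syntax is never parsed on the SQL 7 servers.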
I am attempting to create a stored procedure that will launch at report runtime to summarize data in a table into a table that will reflect period data using an array type field. I know how to execute one line but I am not sure how to run the script so that it not only summarizes the data below but also creates and drops the table.
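As a rough sketch of that shape (the table and column names are invented, since the actual schema isn't shown), a proc can drop, rebuild, and fill a summary table in one pass:

CREATE PROCEDURE dbo.usp_BuildPeriodSummary
AS
BEGIN
    -- Drop the previous summary table if it exists.
    IF OBJECT_ID('dbo.PeriodSummary') IS NOT NULL
        DROP TABLE dbo.PeriodSummary;

    -- SELECT ... INTO both creates the table and loads the summarized rows.
    SELECT  d.PeriodID,
            SUM(d.Amount) AS TotalAmount,
            COUNT(*)      AS RowCnt
    INTO    dbo.PeriodSummary
    FROM    dbo.DetailData AS d
    GROUP BY d.PeriodID;
END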
I'm a newbie DBA and I'm trying to create a package that would extract data from MySQL and insert it into a SQL Server 2005 instance. I'm quite new to SSIS and would like to ask for your help to get through this.
Is there a way to namespace groups of stored procs to reduce confusion when using them?
For C# you can have ProjectName.ProjectSection.Classname when naming a class. I am just wondering if there is a way to do the same with SQL stored procs. I know Oracle has packages and the name of the package provides a namespace for all of the stored procs inside it.
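SQL Server 2005 schemas can be used roughly this way. A short sketch with made-up names:

-- A schema acts as a namespace for the procs grouped under it.
CREATE SCHEMA Billing AUTHORIZATION dbo;
GO

CREATE PROCEDURE Billing.GetOpenInvoices
AS
    SELECT InvoiceID, CustomerID, Amount
    FROM dbo.Invoice
    WHERE Status = 'Open';
GO

-- Callers qualify the proc with its schema:
EXEC Billing.GetOpenInvoices;

It isn't as deep as Oracle packages (there is no package-level state), but it does group related procs and avoid name clashes.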
I have a situation where I have a table with over a billion records that needs to be scrubbed. The table does have a datetime field. I have been deleting rows from the table using the script below, which basically prints out delete statements by date for records older than 90 days.
But now the row count for each day is over 30 million rows, so it takes forever to delete by date and the transaction log becomes humongous.
So I would like to scrub it in 5 minute intervals instead of daily for records older than 90 days. Even in 5 minute intervals the record count tends to be around a million. This will keep each delete slice small enough to avoid a gigantic transaction log.
declare @startdate datetime
declare @enddate datetime
set @startdate = getdate() - 480
set @enddate = getdate() - 90

WHILE (@startdate < @enddate)
BEGIN
    print 'delete from vending where DetectedDate < ''' + CONVERT(varchar(10), @startdate, 101) + ''''
    set @startdate = @startdate + 1
END
I am hoping to modify the script above to produce a script with statements like this for a window between the last 90 and 120 days:
delete from vending where DetectedDate < '6/15/2015 8:25:00 PM'
go
delete from vending where DetectedDate < '6/15/2015 8:30:00 PM'
go
delete from vending where DetectedDate < '6/15/2015 8:35:00 PM'
go
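For what it's worth, a sketch of that modification (untested; note the 90/120-day window here differs from the 480-day start in the original script):

declare @startdate datetime
declare @enddate   datetime
set @startdate = getdate() - 120   -- older end of the window
set @enddate   = getdate() - 90    -- newer end of the window

WHILE (@startdate < @enddate)
BEGIN
    -- Style 120 keeps the time portion, so each statement advances 5 minutes.
    print 'delete from vending where DetectedDate < ''' + CONVERT(varchar(19), @startdate, 120) + ''''
    print 'go'
    set @startdate = DATEADD(minute, 5, @startdate)
END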
I am moving an app from Access 2003 to C#/SQL Server 2005. The Access app has one front end exe and two back end databases. One back end db is for UK users, one for US users. The tables and structures are identical, but the data is different. UK users link to the UK back end, US users to the US backend. (The queries are in the front end)
In SQL Server, the view and stored procs will be in one database (KCom), the US data tables in another database (KUS), and the UK data tables in another (KUK). All databases are on the same server.
My question is how to let the views and stored procs in KCom know whether to pull the data from KUK or KUS.
In the front end app, I use ADO.Net to deal with the SQL Server databases.
I have not been able to find a model for this, but it must be somewhat common.
I will have a lot of nested views and stored procs, but as a simple example, say I have a view in KCom which is just "Select * From Invoice Where InvDate > '01/01/2007'". The ADO request would come from the front end app, which would know whether it wanted the data from KUK or KUS. How would I get the view to get the data from one as opposed to the other (KUK vs KUS)?
Is there some other strategy for this situation, say use of connection strings? If anyone has an idea, or knows of an explanation somewhere on the net I'd greatly appreciate it.
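One pattern that may fit (just a sketch; the parameter name and proc name are invented) is to expose the query as a stored proc in KCom that takes a region parameter and selects from the appropriate database by its three-part name:

USE KCom;
GO
CREATE PROCEDURE dbo.usp_GetRecentInvoices
    @Region char(2)   -- 'UK' or 'US', passed in from the ADO.NET call
AS
BEGIN
    IF @Region = 'UK'
        SELECT * FROM KUK.dbo.Invoice WHERE InvDate > '01/01/2007';
    ELSE
        SELECT * FROM KUS.dbo.Invoice WHERE InvDate > '01/01/2007';
END

The other common route is the connection-string idea you mention: keep identical procs in KUK and KUS themselves and point each user's connection at the right database, which avoids the branching entirely.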
I need to run stored procs based on a list in a code table. In other words, it reads the name of a stored proc from a table and runs it and does this for all the rows in the table. I think the ForEach loop container will do what I need and there is an ADO enumerator option but the documentation does not tell you how to use this. From what I can tell, you need to get a dataset into an SSIS variable first and then you plug the variable name into the ForEach ADO enumerator. Is that correct? If so, can someone tell me how to get a dataset into a variable?
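That is roughly the idea: an Execute SQL Task with ResultSet set to "Full result set" loads the query results into an Object-typed SSIS variable, and the Foreach ADO enumerator then iterates over that variable. If a plain T-SQL fallback is acceptable for testing the proc list, a cursor over the code table does the same job; a sketch assuming a table dbo.ProcList(ProcName):

DECLARE @proc sysname;

DECLARE proc_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT ProcName FROM dbo.ProcList;   -- assumed code table

OPEN proc_cur;
FETCH NEXT FROM proc_cur INTO @proc;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Run each proc by name; add validation if the list is user-maintained.
    EXEC (@proc);
    FETCH NEXT FROM proc_cur INTO @proc;
END

CLOSE proc_cur;
DEALLOCATE proc_cur;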
I'm looking at writing some customized insert, update and delete stored procs for a replication target. For various reasons I would like to write a "one size fits all" custom stored proc for each of these tasks.
It looks like I can get the data values passed as parameters just fine.
I was wondering if there's a way to also pass the source schema and table name as parameters, or to determine these on the fly in my all purpose stored procs. Some replication products refer to these types of values as "tokens" that can be included in the replication data stream sent to the target.
I can adjust the source database replication publications, and article definitions, but I cannot modify the actual source database tables to include these as values in data columns. It is possible a view that contains these elements as strings might fly, but I was hoping to avoid cluttering the source database.
I have a sql server 2008 backend with an Access 2007 frontend database. Each time I export a query I get the following error:
Code: Microsoft Access was unable to append all the data to the table.
The contents of fields in 0 record(s) were deleted, and 1 record(s) were lost due to key violations.
*If data was deleted, the data you pasted or imported doesn't match the field data types or the FieldSize property in the destination table. *If records were lost, either the records you pasted contain primary key values that already exist in the destination table, or they violate referential integrity rules for a relationship defined between tables. Do you want to proceed anyway?
I don't know what, if anything, is actually missing, because the amount of data is more than 6000 records. It seems everything exported, but I would have to comb through the data to be sure.
Hi, I have a stored procedure that looks like this:

...
WHILE @@FETCH_STATUS = 0
BEGIN
    DECLARE @MyCount int;
    EXEC spLoanQuestionnaireCriteria @AuditSelectedLoanID, @Criteria, @MyCount OUTPUT
END
...
SELECT * FROM #Categories
...

Executing this stored procedure returns one table for each time the EXEC statement is called, each with only one column (MyCount). I really don't need this data to be returned; it is only used for some internal calculations in the stored procedure. The stored procedure should only return the results from SELECT * FROM #Categories in one table. Is there a keyword I can use to exclude the EXEC results from being returned to the dataset? Thanks in advance, Andre
If you drop / rename a table and then recreate the table with the SAME NAME, what impact does it have on stored procedures that use these tables? From a system perspective, do you have to rebuild / recompile ALL the stored procedures that use this table?
I had a discussion with someone that said that this is a good idea, since the IDs of the tables change in sysobjects and from a SQL SERVER query plan perspective, this needs to be done...
Question 2
If you Truncate a Table as part of a BEGIN TRANSACTION, what happens if an error occurs? Will it roll back? The theory is that it won't, because Truncate doesn't utilize the logs, whereas Delete From uses the SQL logs?
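A quick way to check this for yourself (a throwaway test; the table name is invented): TRUNCATE TABLE is minimally logged but still fully transactional, so it does roll back.

CREATE TABLE dbo.TruncateTest (ID int);
INSERT INTO dbo.TruncateTest (ID) VALUES (1);

BEGIN TRANSACTION;
    TRUNCATE TABLE dbo.TruncateTest;
    SELECT COUNT(*) AS DuringTran FROM dbo.TruncateTest;   -- 0
ROLLBACK TRANSACTION;

SELECT COUNT(*) AS AfterRollback FROM dbo.TruncateTest;    -- 1: the truncate was rolled back

DROP TABLE dbo.TruncateTest;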
I was wondering if there was a different approach I should take in appending data to a table...
My destination table has about 94+ million records in it, and I have been taking two approaches to getting new files into this table:
1) I do a data pump task in a DTS to import the file to a trans (temp) table, which is truncated every time, and then do an INSERT INTO statement from the temp table to my destination table.
The import to the trans table only takes a few minutes (about 1 to 2 million records per file, with short record lengths), but when I do the INSERT INTO statement, it takes upwards of 6 hrs to append.
2) I have tried doing a bulk insert task, going directly to the destination table (which defeats the purpose of my trans table for checking out the data beforehand, but I feel the data is clean at this point).
I am running the bulk insert right now, and it's been running for over 3 hours...so I'm going to assume this will take just as long as the INSERT INTO statement does like I did before.
My destination table does not have any indexes in it at all, and I don't need to do any transformations to the data when bringing it into SQL since the data is clean. Also, I have a default value constraint on one of my fields on the destination table.
Plus there are other people and applications hitting the server, which could impact the overall processing, but nothing out of the ordinary is going on with the server today. I know there are only so many ways to get a file into a table... but maybe someone knows a different way I should try this.
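One thing that may be worth trying, if the table really has no indexes and other readers can tolerate a table lock during the load: a BULK INSERT with TABLOCK and a batch size, which on SQL 2000 can be minimally logged under the Bulk-Logged or Simple recovery model. A sketch; the file path and delimiters are placeholders that must match the real file:

BULK INSERT dbo.DestinationTable
FROM 'D:\Imports\newfile.txt'     -- placeholder path
WITH
(
    FIELDTERMINATOR = ',',        -- adjust to the actual file layout
    ROWTERMINATOR   = '\n',
    TABLOCK,                      -- required for minimally logged bulk loads
    BATCHSIZE       = 100000      -- commits in chunks to keep the log manageable
);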
Hello. I was using the new sys.dm_db_index_operational_stats function, which is nice for seeing counts of insert/update/delete actions per table index, and so on. Anyway, a question: can I do the same thing with Profiler? Meaning, can I trace stored procs and somehow see each proc execution WITH each table it performs actions against? I'm not talking about filtering on table names in the text; I just want to run an application, which uses all stored procs, and see every table used by that execution of the proc, and also the number of rows inserted, updated, deleted... If so, which Profiler events/columns must I flick on to gather that? Thanks, Bruce
I have an existing table I need to add data to. The data is in a text file, and the existing table already has data in it (I don't want to delete this I want to add to it).
I used Microsoft's import utility, but this created a separate table with generic field names (column01, column02, etc.). Is there a step in this wizard I missed?
public Set DoSomthing(Set toBeProcessed, Set measuresToWorkWith)

The set measuresToWorkWith is passed as {[Measures].[Measure1], [Measures].[Measure2] ...}
with the measures being real or query-scoped calculated members.
To get the value of the measure for each tuple in the set toBeProcessed, I create an Expression for each tuple (measure) in the set measuresToWorkWith then for each tuple in toBeProcessed call expression.Calculate(tuple) which returns a MDXValue.
My problem is that in order to make the code generic I need to get the real (.NET) data type of the MDXValue. The class only has explicit conversion methods ToInt16() etc which implies that the data type is known at design time.
However, if one of the measures is a query-scoped calculation then it could return a .NET double, int, bool or string.
If the measure is real then I can look up its metadata. However, it appears that if it is a formula (scoped member) then all bets are off?
CREATE PROCEDURE PROC1
AS
BEGIN
    SELECT A.INTCUSTOMERID, A.CHREMAIL, B.INTPREFERENCEID, C.CHRPREFERENCEDESC
    FROM CUSTOMER A
    INNER JOIN CUSTOMERPREFERENCE B ON A.INTCUSTOMERID = B.INTCUSTOMERID
    INNER JOIN TMPREFERENCE C ON B.INTPREFERENCEID = C.INTPREFERENCEID
    WHERE B.INTPREFERENCEID IN (6, 7, 2, 3, 12, 10)
    ORDER BY B.INTCUSTOMERID
END

If I am using this proc as input to another proc, then it gives no problem, as I can use ?

But if I use another proc, similar to the first one, which gives slightly different results and returns two sets of results, then we have a problem. How to solve this:

CREATE PROCEDURE MY_PROC
AS
BEGIN
    SELECT A.INTCUSTOMERID, A.CHREMAIL, B.INTPREFERENCEID, C.CHRPREFERENCEDESC
    FROM CUSTOMER A
    INNER JOIN CUSTOMERPREFERENCE B ON A.INTCUSTOMERID = B.INTCUSTOMERID
    INNER JOIN TMPREFERENCE C ON B.INTPREFERENCEID = C.INTPREFERENCEID
    WHERE B.INTPREFERENCEID IN (23, 12, 10)
    ORDER BY B.INTCUSTOMERID

    SELECT A.INTCUSTOMERID,
           MAX(case when A.intpreferenceid = 23 then '1' else '0' end) +
           MAX(case when A.intpreferenceid = 12 then '1' else '0' end) +
           MAX(case when A.intpreferenceid = 10 then '1' else '0' end) AS PREFER
    FROM CUSTOMER A
    GROUP BY A.INTCUSTOMERID
    ORDER BY A.INTCUSTOMERID
END

Which now gives me two sets of data, both required. Then how do I use one set of results as input to another proc and the other set of results as input to yet another proc?
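For the single-result proc, the usual trick is INSERT ... EXEC into a temp table and then feed that table to the next proc. For the two-result proc, the simplest route is often to split it into two procs (or have it write each result into its own temp table), since INSERT ... EXEC cannot pick out just one of several result sets. A rough sketch for the single-result case; the column types are guesses and NEXT_PROC is a hypothetical downstream proc:

-- Capture PROC1's result set so it can be passed on.
CREATE TABLE #Pref
(
    INTCUSTOMERID     int,
    CHREMAIL          varchar(255),
    INTPREFERENCEID   int,
    CHRPREFERENCEDESC varchar(255)
);

INSERT INTO #Pref (INTCUSTOMERID, CHREMAIL, INTPREFERENCEID, CHRPREFERENCEDESC)
EXEC PROC1;

-- Temp tables created here are visible to procs called from this batch,
-- so the next proc can simply read from #Pref.
EXEC NEXT_PROC;

DROP TABLE #Pref;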
I imported a SQL table into a SQL database, but I cannot update this table, even with SQL Server Management Studio. When I change any data in the table mentioned above, a red exclamation sign appears to the left of the record. How can I correct this problem? Thanks.
I am developing a DB with others in my group. When I import tables created on other servers to my server, the primary key and other properties do not import with the tables. Can anyone explain why this is happening? Is there a setting I have overlooked?
I am in the process of importing data that is in a text format to the SQL server. I want to add a couple of fields and insert general data in the added fields; this data is going to be similar for all the records imported. Your help in this regard will be greatly appreciated.
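After the import, this can usually be done in two statements. A sketch with invented table names, column names, and constant values:

-- Add the new fields to the imported table.
ALTER TABLE dbo.ImportedData
    ADD BatchSource varchar(50) NULL,
        ImportedOn  datetime    NULL;
GO

-- Fill the new fields with the same value for every imported record.
UPDATE dbo.ImportedData
SET    BatchSource = 'Vendor feed',
       ImportedOn  = GETDATE();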
I am not sure how to implement the following, but I believe it entails using DTS, and hopefully it is fine that I post it here b/c ultimately I will need this backend data for my frontend .aspx pages:
On a weekly basis, I need to IMPORT some data located on a remote Oracle DB into SQL Server 2k. Since there is so much data to transfer, I would only like to transfer the data that is new to the table since the last IMPORT, i.e. a week ago, and leave behind the OLD data.
Is DTS the correct way to go or do I have more control via DTS with STORED PROCEDURES? Does anyone have any good references for me?
On a similar note, once this Oracle data is IMPORTED into a certain table, I would like to EXPORT some of these NEWLY acquired rows matching certain criteria into another table for auditing purposes. For this scenario, should I implement a TRIGGER UPDATE event here on the first table?
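For the incremental piece, whichever tool drives it, the core query is usually a watermark-based insert over a linked server. A sketch assuming a linked server named ORA_SRC, an Oracle schema/table SCHEMA1.ORDERS, and a modification-date column (all names invented):

DECLARE @lastImport datetime;

-- The watermark would normally come from a small control table; hard-coded here.
SET @lastImport = '2007-01-01';

INSERT INTO dbo.OracleStage (OrderID, CustomerID, Amount, ModifiedDate)
SELECT OrderID, CustomerID, Amount, ModifiedDate
FROM   ORA_SRC..SCHEMA1.ORDERS          -- four-part name through the linked server
WHERE  ModifiedDate > @lastImport;

For the auditing copy, either an AFTER INSERT trigger on the staging table or simply a second INSERT ... SELECT run right after the load can move the rows matching your criteria into the audit table.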
I have a few fields that contain foreign characters with diacritic marks which are not getting imported correctly.
Below is the import format:
- File type: ASCII
- Row delimiter: carriage return and line feed {CR/LF}
- Column delimiter: Tab
- Text qualifier: None
Please advise.
Here are the errors I'm getting:
- Executing (Error)
Messages
Error 0xc02020a1: Data Flow Task: Data conversion failed. The data conversion for column "Country_str_local_long_name" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.". (SQL Server Import and Export Wizard)
Error 0xc020902a: Data Flow Task: The "output column "Country_str_local_long_name" (37)" failed because truncation occurred, and the truncation row disposition on "output column "Country_str_local_long_name" (37)" specifies failure on truncation. A truncation error occurred on the specified object of the specified component. (SQL Server Import and Export Wizard)
Error 0xc0202092: Data Flow Task: An error occurred while processing file "L:Country.txt" on data row 6. (SQL Server Import and Export Wizard)
Error 0xc0047038: Data Flow Task: The PrimeOutput method on component "Source - Country_txt" (1) returned error code 0xC0202092. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. (SQL Server Import and Export Wizard)
Error 0xc0047021: Data Flow Task: Thread "SourceThread0" has exited with error code 0xC0047038. (SQL Server Import and Export Wizard)
Error 0xc0047039: Data Flow Task: Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown. (SQL Server Import and Export Wizard)
Error 0xc0047021: Data Flow Task: Thread "WorkThread0" has exited with error code 0xC0047039. (SQL Server Import and Export Wizard)
How do I search for and print all stored procedure names in a particular database? I can use the following query to search and print out all table names in a database. I just need to figure out how to modify the code below to search for stored procedure names. Can anyone help me out?

SELECT TABLE_SCHEMA + '.' + TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
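The routines view has the same shape, so the equivalent query for procs should simply be:

SELECT ROUTINE_SCHEMA + '.' + ROUTINE_NAME
FROM INFORMATION_SCHEMA.ROUTINES
WHERE ROUTINE_TYPE = 'PROCEDURE'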
I have imported some tables from MS Access, but as the datetime field from Access is different from SQL Server, I had to change all column types from datetime to varchar...
Now in SQL Server I'm trying to convert this data into datetime, and I'm using cast(field as datetime) and SQL gives me this message: "The conversion of a char data type to a datetime data type resulted in an out-of-range datetime value."
How can I select the rows that are giving me errors? Or is there a way to convert this data?
I don't know which of the data will give the error... but as the table is too big, I can't post everything here...
------------------------------- EDIT
I found the problem...
I have data like 4/11/2006 9:23:19 AM and others like 11/21/2005 6:02:13 PM.
So the first one, 4/11/2006 9:23:19 AM, converts without an error but wrongly, because the month is "4" and not "11". The other one, 11/21/2005 6:02:13 PM, tries to convert the month as 21 and not as 11...
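Two things that may help here, assuming the varchar values really are in US month-first format: SET DATEFORMAT tells the server to read the strings as month/day/year, and ISDATE() flags the rows that still won't convert. A sketch with placeholder table and column names:

SET DATEFORMAT mdy;

-- Rows that still cannot be converted even with the right date order.
SELECT *
FROM   dbo.ImportedTable
WHERE  ISDATE(DateField) = 0;

-- Convert the rest; with DATEFORMAT mdy, 4/11/2006 is read as April 11.
SELECT CAST(DateField AS datetime) AS DateValue
FROM   dbo.ImportedTable
WHERE  ISDATE(DateField) = 1;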
I'm importing floor machine data from SQL7 into SQL2005 using SSIS. I import the SQL7 data into a SQL2005 master table and then attempt to match the import data with the data in a current table for either update or insert of new machines. The SQL2005 master table was imported from a SQL2000 database. When I run the first Lookup import, it does not recognize the PK matches between the SQL7 import and SQL2005 master and imports all the SQL7 as new machines. The first Lookup branches to a second lookup that checks for changes in the SQL2005 master. When I run the package a second time, the second Lookup treats all the records as updates when it gets to the second Lookup, but should treat these as found. Any suggestions as to why this process is not working properly would be appreciated. Is there a way I can embed a picture of the process from SSIS in this post? Thanks
I am using vwde2008 and have created a db and a table. I want to create a stored procedure that will insert values, but I am a bit lost on how to write this. This is my table info: table = BusinessInfo; columns = BusinessID, BusinessName, BusinessAddress. How can I create a stored procedure?
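A minimal sketch based on the columns listed (the data types are guesses, so adjust them to match the real table, and drop the @BusinessID parameter if that column is an identity):

CREATE PROCEDURE dbo.usp_InsertBusinessInfo
    @BusinessID      int,
    @BusinessName    varchar(100),
    @BusinessAddress varchar(200)
AS
BEGIN
    INSERT INTO dbo.BusinessInfo (BusinessID, BusinessName, BusinessAddress)
    VALUES (@BusinessID, @BusinessName, @BusinessAddress);
END
GO

-- Example call:
EXEC dbo.usp_InsertBusinessInfo
     @BusinessID = 1,
     @BusinessName = 'Acme Ltd',
     @BusinessAddress = '1 Main Street';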
I am trying to create a table inside a stored procedure, using SQL that works fine in Query Analyzer. However, when I check the syntax I get the following message:
SELECT B.COMPONENT, TOT_OPTIONS_EXCHANGED = SUM(A.UNITS)
FROM TBLEXERCISEOPTIONS A, TBLCOMPONENT B
WHERE B.COMPONENTID = A.COMPONENTID
GROUP BY B.COMPONENT
GO
Any help getting this to run correctly would be appreciated.