We have a job that loads data from an Oracle DB into our SQL Server 2000 DB twice a day. The schedule has just changed, so there is now a possibility of impacting my west coast users when it runs at 5 PM PST and my east coast users when it runs at 7 AM EST. As a workaround, I have developed a DTS package that loads the data into temp tables instead of the real tables, i.e. Oracle -> XTable_temp instead of Oracle -> XTable. The load sometimes takes an hour to an hour and a half, so this solution works great, but I then want to lock the table, drop it, and rename the temp table to XTable. The pseudocode would be:
Begin Transaction
Lock Table XTable
Drop XTable
Alter Table XTable_temp rename to XTable
Release Lock XTable
End Transaction
Create XTable_temp
I see two issues with this solution. 1) Even if I can lock XTable, I think the lock would be released when the table is dropped, before XTable_temp has been renamed. 2) I can't find a command to rename a table.
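A possible sketch, assuming the sp_rename system procedure (which exists in SQL Server 2000; there is no ALTER TABLE ... RENAME in T-SQL) is acceptable here. Running the drop and rename inside one transaction should keep other sessions blocked until the swap commits, since DROP TABLE's schema lock is held to the end of the transaction:

BEGIN TRANSACTION
    DROP TABLE XTable                         -- schema-modification lock held until COMMIT
    EXEC sp_rename 'XTable_temp', 'XTable'    -- swap the freshly loaded copy into place
COMMIT TRANSACTION

-- recreate an empty staging table for the next load
-- (structure only; indexes and constraints must be re-added separately)
SELECT * INTO XTable_temp FROM XTable WHERE 1 = 0

Note that sp_rename emits a caution message about renaming objects; that is expected and harmless in this pattern.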
If the source has a new column, the script generated by SqlPackage.exe recreates the table in the background, moving the data through a temporary table. If the table is big, this approach can cause issues.
An example of the script is below; in the source project I added the columns [MyColumn_LINE_1] and [MyColumn_LINE_5].
Is there any way I can make it generate an ALTER statement instead?
BEGIN TRANSACTION;
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE;
SET XACT_ABORT ON;
CREATE TABLE [dbo].[tmp_ms_xx_MyTable] (
    [MyColumn_TYPE_CODE] CHAR (3) NOT NULL,
[Code] ....
The same script is generated regardless of whether the table has data, and regardless of whether the PK is clustered or nonclustered.
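A likely cause, based on how the DacFx deployment engine compares schemas: because the new columns sit between existing columns in the source definition, the engine preserves column order by rebuilding the table. If the new columns are added at the end of the table definition, or if the IgnoreColumnOrder publish property is set (property name as documented for DacFx; verify against your version), SqlPackage should emit a plain ALTER instead, along the lines of:

ALTER TABLE [dbo].[MyTable]
    ADD [MyColumn_LINE_1] CHAR (3) NULL,   -- data types here are assumptions
        [MyColumn_LINE_5] CHAR (3) NULL;

SqlPackage.exe /Action:Publish /p:IgnoreColumnOrder=True ...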
I want to insert the data from a temp table into another table. The only condition is that it needs to be sorted by tool number and tool date. For example, if we have ten records for tool number 1000, they should be ordered by tool number and then by tool_dt. Neither table has a primary key. Please find my code below; I removed all the unnecessary columns for simplicity. INSERT INTO tool_summary (tool_nbr, tool_dt) SELECT tool_nbr, tool_dt FROM #tool ORDER BY tool_nbr, tool_dt ... But this query is not working as expected; the data is getting shuffled.
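A minimal sketch of one common fix: a table has no inherent row order, so ORDER BY on the INSERT cannot pin the order you see when you read the table later. What ORDER BY on an INSERT ... SELECT does guarantee is the order in which IDENTITY values are assigned, so adding an IDENTITY column (a schema change, so an assumption here) captures the sequence, and the SELECT that reads the table must carry its own ORDER BY:

CREATE TABLE tool_summary (
    row_seq  int IDENTITY(1,1),   -- records the load order
    tool_nbr int,
    tool_dt  datetime
);

INSERT INTO tool_summary (tool_nbr, tool_dt)
SELECT tool_nbr, tool_dt
FROM   #tool
ORDER BY tool_nbr, tool_dt;       -- guarantees IDENTITY assignment order

SELECT tool_nbr, tool_dt
FROM   tool_summary
ORDER BY row_seq;                 -- or simply ORDER BY tool_nbr, tool_dt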
Hi everyone, I'm fairly new to SQL and right now I am struggling with a script. I am trying to extract data from a normal table into a temporary table, update it in the temporary table, and then put it back into the normal table. I'll display my code; let me know what you think. Any suggestions are appreciated. Thanks a lot.
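For reference, a minimal sketch of that round trip, with hypothetical table and column names (SourceTable, id, amount):

SELECT id, amount
INTO   #staging
FROM   dbo.SourceTable
WHERE  amount IS NOT NULL;          -- pull the rows to change into the temp table

UPDATE #staging
SET    amount = amount * 1.1;       -- make the corrections in the temp table

UPDATE t
SET    t.amount = s.amount
FROM   dbo.SourceTable AS t
JOIN   #staging        AS s ON s.id = t.id;   -- write the changes back

DROP TABLE #staging;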
I am trying to update a table in one database with data from a temporary table I created in tempdb.
I want to update field1 in the table with tempfield1 from #temp_table.
The code looks something like this:
Use master
UPDATE [dbname].dbo.table
SET [dbname].dbo.table.field1 = [tempdb].dbo.#temp_table.tempfield1
WHERE ( [dbname].dbo.table.field2 = [tempdb].dbo.#temp_table.tempfield2
    AND [dbname].dbo.table.field3 = [tempdb].dbo.#temp_table.tempfield3
    AND [dbname].dbo.table.field4 = [tempdb].dbo.#temp_table.tempfield4 )
I get the following error:
Msg 4104, Level 16, State 1, Line 2 The multi-part identifier "tempdb.dbo.#temp_table.tempfield2" could not be bound.
Msg 4104, Level 16, State 1, Line 2 The multi-part identifier "tempdb.dbo.#temp_table.tempfield3" could not be bound.
Msg 4104, Level 16, State 1, Line 2 The multi-part identifier "tempdb.dbo.#temp_table.tempfield4" could not be bound.
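The error occurs because a table cannot be referenced by multi-part name in SET or WHERE unless it also appears in a FROM clause. A sketch of the usual rewrite with aliases, using T-SQL's UPDATE ... FROM join syntax (a #temp table is visible from any database on the same connection, so no tempdb prefix is needed):

USE dbname;
UPDATE t
SET    t.field1 = s.tempfield1
FROM   dbo.[table]  AS t
JOIN   #temp_table  AS s
  ON   t.field2 = s.tempfield2
 AND   t.field3 = s.tempfield3
 AND   t.field4 = s.tempfield4;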
Hi, I created a package to load a fact table with more than 7 million records. Loading the table took nearly 15 minutes. Then indexes were created on the table at the DB level, after which the time to load the same number of records increased. How can I resolve this delay?
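One common pattern, assuming the indexes can be offline during the load: disable the nonclustered indexes before the bulk insert and rebuild them afterwards, so the engine builds each index in one sorted pass instead of maintaining it row by row. ALTER INDEX ... DISABLE/REBUILD requires SQL Server 2005 or later; the index and table names below are hypothetical:

-- before the load (do NOT disable the clustered index; that makes the table inaccessible)
ALTER INDEX IX_FactSales_Date ON dbo.FactSales DISABLE;

-- ... bulk load runs here ...

-- after the load: REBUILD re-enables the index in one sorted pass
ALTER INDEX IX_FactSales_Date ON dbo.FactSales REBUILD;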
Hi, all experts here. Do we always have to use the SCD component when loading data into a data warehouse to handle changed rows? I look forward to hearing from you, and thank you very much in advance for your help. Best regards,
I have an Excel cross-tab (multilevel column) report that needs to be loaded into a database table. Currently I am using an Excel macro that converts the columns into rows before loading into the database table. I was wondering whether there is a better way of doing this, perhaps in SSIS or using XML.
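If the workbook can be landed in a staging table first, the column-to-row step can be done in T-SQL with UNPIVOT; a sketch with hypothetical column names:

-- dbo.ExcelStaging loaded straight from Excel; Product/Jan/Feb/Mar are assumptions
SELECT Product, MonthName, Amount
FROM   dbo.ExcelStaging
UNPIVOT (
    Amount FOR MonthName IN ([Jan], [Feb], [Mar])   -- one entry per cross-tab column
) AS u;

SSIS also has a built-in Unpivot transformation that does the same thing inside the data flow, which avoids the Excel macro entirely.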
I am building a dynamic query stored procedure. I am first filling a temp table with data:
Declare @Counter int
IF OBJECT_ID('tempdb..#tempmerge') IS NOT NULL   -- guard the drop so the first run doesn't fail
    DROP TABLE #tempmerge

CREATE TABLE #tempmerge (
    IDIndex     int IDENTITY,
    CitationNum char(9),
    Exp1        int
)

-- the column list is required: IDIndex is an IDENTITY column,
-- so an INSERT without one fails with a column-count mismatch
INSERT INTO #tempmerge (CitationNum, Exp1)
SELECT E_Cit_For_Merge, COUNT(*) AS Exp1
FROM dbo.E_Citation_XML_Data
GROUP BY E_Cit_For_Merge
HAVING COUNT(*) > 1

SELECT * FROM #tempmerge
Not sure if you can help with this, but I've got a stored procedure in SQL Server that creates a temp table. I then call another stored procedure from this one. When control returns to the first stored procedure I want the temp table to keep the information entered into it, but the data is lost.
Is there a flag that can be turned on and off to do this?
How can I load CSV file data into a SQL Server table? I know there are ways like BULK INSERT to load CSV data into a table, but in my case the table doesn't exist and has to be created at run time. With a simple insert into a temp table we do something like SELECT * INTO #temp FROM tablename, and that creates the temp table. I need something like that which creates the table and loads the data into it, because the CSV files have different numbers of columns and different column names, so I cannot create the table structure in advance. I have to create the table at run time.
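One hedged possibility, assuming the ACE OLE DB provider is installed on the server and ad hoc distributed queries are enabled: SELECT * INTO creates its target from whatever the source query returns, and the text driver reads the CSV header row for column names, so the table shape comes from the file at run time (folder and file names below are hypothetical; type inference relies on the driver and schema.ini, so check the results):

SELECT *
INTO   dbo.ImportedCsv           -- created at run time from the file's columns
FROM   OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                  'Text;Database=C:\csv_folder\;HDR=YES',
                  'SELECT * FROM data.csv');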
When you load data into a new partition of a partitioned table, can it be done online without any downtime? I have a few tables that are around 250 GB and larger.
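A common pattern, assuming Enterprise-edition partitioning: bulk load into a staging table that matches the target's schema, indexes, and filegroup, with a trusted check constraint matching the target partition's boundaries, then switch it in. The SWITCH is a metadata-only operation, so the big table is unavailable only for an instant (names and partition number are hypothetical):

ALTER TABLE dbo.FactSales_Stage
    SWITCH TO dbo.FactSales PARTITION 3;   -- metadata-only, near-instant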
I have a table which is referenced in multiple stored procedures. Among them, only one procedure loads data into the table (an INSERT statement); in all the other procedures the table is referenced in the FROM clause.
My question is: how do I find the stored procedure that is actually loading data into that table?
Can anyone help me solve this problem? I have created an SSIS package to load data from an Excel file into a table, and the data arrives in different languages: French, English, and Chinese. After loading, the Chinese data displays as junk characters, while we can see the French and English data fine. How can I solve this?
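A hedged pointer: junk characters for Chinese but not French or English usually mean a non-Unicode hop somewhere in the pipeline, since French and English survive code-page conversion but Chinese does not. Every stage has to stay Unicode, from the SSIS columns (DT_WSTR) to the destination columns, e.g. (column name hypothetical):

CREATE TABLE dbo.ImportedProducts (
    ProductName nvarchar(200)   -- nvarchar keeps Chinese text; varchar would mangle it
);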
Hello. I have an SP. I am passing it two variables that are comma-separated lists of IDs so that I can use an IN clause. Since I cannot use the CSV values in an IN clause directly, I first parse the list into a temp table with a single column of IDs. Then, in a second temp table, I select the data using enrollment_id IN (SELECT enrollment_id FROM #temptable). Finally the SP selects data from the second temp table. This works exactly the way it is supposed to work in SQL, but when I call the SP from ASP.NET it doesn't return any data. I just can't seem to return data by selecting from a temp table. My ASP.NET code is correct, because when I commented everything out and just returned the parameters I was passing, that worked. So does anyone know how to select data from a temp table in an SP when it is executed from ASP.NET? I even tried table variables, and the same thing happens: it works great in SQL but returns no data from ASP.NET. I am guessing it's probably losing its scope. Any quick help would be appreciated. Thank you. Here is a snippet of the SP (even this fails):

CREATE PROCEDURE [dbo].[returnEnrollments]
(
    @organization_id int,
    @test_item_list ntext,
    @enrollment_list ntext
)
AS
--select @test_item_list as tilist, @enrollment_list as elist
Declare @report_start_date datetime;
Declare @report_end_date datetime;
Declare @test_item_id_list varchar(8000);
Declare @enrollment_id_list varchar(8000);
DECLARE @TestItemID varchar(10), @Pos int
DECLARE @EnrollmentID varchar(10)

--Set @IDList = @test_item_id_list;
--Set @EnrollmentIDList = @enrollment_id_list;
Set @test_item_id_list = Cast(@test_item_list As varchar(8000))
Set @enrollment_id_list = Cast(@enrollment_list As varchar(8000))

Select @report_start_date = report_start_date, @report_end_date = report_end_date
From organization
Where organization_id = @organization_id;

-- Create a temp table to store all the enrollment ids
IF OBJECT_ID('tempdb..#EnrollTemp') IS NOT NULL
    DROP TABLE #EnrollTemp

CREATE TABLE #EnrollTemp (enrollment_id int NOT NULL)

SET @enrollment_id_list = LTRIM(RTRIM(@enrollment_id_list)) + ','
SET @Pos = CHARINDEX(',', @enrollment_id_list, 1)

IF REPLACE(@enrollment_id_list, ',', '') <> ''
BEGIN
    WHILE @Pos > 0
    BEGIN
        SET @EnrollmentID = LTRIM(RTRIM(LEFT(@enrollment_id_list, @Pos - 1)))
        IF @EnrollmentID <> '' And @EnrollmentID > 0
        BEGIN
            INSERT INTO #EnrollTemp (enrollment_id)
            VALUES (CAST(@EnrollmentID AS int)) --Use appropriate conversion
        END
        SET @enrollment_id_list = RIGHT(@enrollment_id_list, LEN(@enrollment_id_list) - @Pos)
        SET @Pos = CHARINDEX(',', @enrollment_id_list, 1)
    END
END

Select top 20 * From #EnrollTemp

Edit: I just used a DataReader instead of a DataTable and it works. I don't know why it would fail with a DataTable.
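A hedged guess at the cause, consistent with the DataReader workaround: every INSERT into #EnrollTemp emits an "n rows affected" message, and some ADO.NET fill paths treat those as extra empty resultsets and stop before reaching the real SELECT. Suppressing the messages at the top of the procedure is the usual fix; a minimal self-contained demonstration:

-- without SET NOCOUNT ON, the INSERT's rowcount message can confuse
-- DataAdapter.Fill; with it, the caller sees only the SELECT's resultset
CREATE PROCEDURE dbo.DemoTempSelect
AS
BEGIN
    SET NOCOUNT ON;
    CREATE TABLE #t (id int);
    INSERT INTO #t (id) VALUES (1);
    SELECT id FROM #t;
END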
I have a series of data flow tasks that I want to output to a temp table. I've set RetainSameConnection on the connection manager and DelayValidation on the data flows. The OLE DB data source inside the data flow works fine, but the data destinations don't offer a # or ## table as a target. I've tried every destination that sounds logical, without success.
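One workaround I believe works, given RetainSameConnection is already set: create a global temp table in an Execute SQL Task up front (on that shared connection it survives between tasks), and then type its name into the OLE DB Destination manually instead of picking from the dropdown; the list never shows temp tables, but a typed name validates once DelayValidation is on. The columns below are placeholders:

IF OBJECT_ID('tempdb..##Staging') IS NOT NULL
    DROP TABLE ##Staging;

CREATE TABLE ##Staging (
    id      int,
    payload varchar(100)
);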
I am displaying some information in a datagrid as part of a shopping cart. I want to store the information I displayed, along with some information the user enters, in a temporary table so that I can perform manipulations based on it. How can I do that? How do I load that information into a temporary table at runtime? Can anyone explain in detail? Thanks in advance.
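A cautious suggestion: web requests typically get different pooled connections, and a #temp table dies with the connection that created it, so a permanent staging table keyed by the user's session is the usual substitute in web apps. All names here are hypothetical:

CREATE TABLE dbo.CartStaging (
    session_id  varchar(50) NOT NULL,   -- ties rows to one shopper's session
    product_id  int         NOT NULL,
    qty         int         NOT NULL,
    entered_at  datetime    NOT NULL DEFAULT GETDATE()
);

Insert the displayed and user-entered values with the session ID on each postback, run the manipulations against the rows for that session, and purge rows older than some cutoff with a scheduled job.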
I am looking for a way to convert the following format into a SQL table. The format is BibTeX.
Essentially, a new row in the table would be created for each entry, denoted by an @ sign, and each column is denoted by an =. As you can see from the example data, no one entry contains all the possible columns, and some fields can run over two lines.
To load this, I was considering loading the file with each line as a row, adding a row number, then a column counting the @ signs in order so as to group the lines of each record, and then, for each group, looking for the column keywords 'author', 'title', etc. and splitting the data into its constituent parts using SUBSTRING and CHARINDEX. A sketch of this approach follows the sample entry below.
@Book{hicks2001,
  author    = "von Hicks, III, Michael",
  title     = "Design of a Carbon Fiber Composite Grid Structure for the GLAST Spacecraft Using a Novel Manufacturing Technique",
  publisher = "Stanford Press",
  year      = 2001,
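A rough sketch of the grouping-and-splitting idea described above, assuming the file has been bulk loaded line by line into #bib(line_id, line_txt) and that each field sits on one line (multi-line fields would need a pre-pass to join continuation lines). The running SUM needs SQL Server 2012 or later:

;WITH numbered AS (
    SELECT line_id,
           line_txt,
           -- running count of '@' lines assigns every line to its entry
           SUM(CASE WHEN line_txt LIKE '@%' THEN 1 ELSE 0 END)
               OVER (ORDER BY line_id ROWS UNBOUNDED PRECEDING) AS entry_no
    FROM #bib
)
SELECT entry_no,
       LTRIM(RTRIM(LEFT(line_txt, CHARINDEX('=', line_txt) - 1)))            AS field_name,
       LTRIM(RTRIM(SUBSTRING(line_txt, CHARINDEX('=', line_txt) + 1, 8000))) AS field_value
FROM numbered
WHERE CHARINDEX('=', line_txt) > 0;   -- keep only key = value lines

The resulting field_name/field_value pairs can then be pivoted into one row per entry_no, with NULLs for the columns an entry does not have.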
I have Table A. It already has 80 columns, and we have to add 65 more.
We are populating this table from Oracle, and we need to populate those 65 new columns from the same Oracle table.
Is it a better idea to add those 65 new columns to the same table or to a new table?
If we go for the same table, the loading time will double. If I go for a new table, and I am able to run both packages, which load data from the same Oracle server into different SQL tables, at the same time, then we should be good. But if we run into temp space issues on the Oracle server, then I have to load the two tables separately, which consumes the same time as the earlier option.
I was wondering if there is a way in SSIS to pull data from the same Oracle table into two different SQL tables at the same time.
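For what it's worth, SSIS can do this with a single read: one OLE DB (or Oracle) Source feeding a Multicast transformation, whose outputs go to two OLE DB Destinations. The Oracle table is scanned once, both SQL tables are loaded in the same data flow, and the Oracle-side cost of the source query is paid only once. Whether that avoids your temp space issue depends on the source query itself, so treat this as a direction to test rather than a guarantee.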