Transact SQL :: Fast Data Loading With Partition Switching Strategy

Jul 28, 2015

I'm looking for clarity on partition switching. The idea is to run many BULK INSERT statements into staging tables dbo.X_n in parallel and, when the BULK INSERT for a table dbo.X_n completes, switch dbo.X_n into dbo.bigdaddy. I think this is the fastest way to upload a couple hundred GB of data.

In learning about partition switching (in part) from The Data Loading Performance Guide, under Partition SWITCH, the instructions seem to say to copy the main table exactly to become a target. But in that same step (#1), I read that we need to move the target (dbo.X_n) off the default filegroup. Then it says I need to match indexes, and it lists the filegroup as something we need to match with the main table.

As an overview of the partition switching strategy, I think the whole point of BULK INSERT with partitioning is to have separate files (in the same filegroup) to enable concurrent uploading, where each table has its own file. Once the upload to a table (dbo.X_n) completes, we do the partition switch into the main table (dbo.bigdaddy). The data we just uploaded doesn't actually move; only its metadata does.

When I read the instructions linked above, what I hear is contradictory: "Don't have the same filegroup on your target as the main table" and, at the same time, "You must have the same filegroup on your target as the main table."

Where am I disconnected?
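For concreteness, a minimal sketch of the pattern I mean (file path and names are made up; it assumes dbo.X_1 exactly matches dbo.bigdaddy's columns and indexes, carries a CHECK constraint matching the target partition's range, and sits on that partition's filegroup):

-- Parallel, minimally logged load into the staging table
BULK INSERT dbo.X_1
FROM 'C:\load\chunk_1.dat'   -- hypothetical data file
WITH (TABLOCK);

-- Metadata-only switch of the loaded rows into the main table
ALTER TABLE dbo.X_1 SWITCH TO dbo.bigdaddy PARTITION 1;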

View 5 Replies View Related

SQL Server 2008 :: Loading Data Into New Partition Table Online

Mar 1, 2015

When you load data into a new partitioned table, can it be done online without any downtime? I ask because I have a few tables that are around 250 GB and more.

View 5 Replies View Related

What Is My Best Strategy For Loading Data.

Apr 22, 2007

I have been developing a genealogy application using a SQL Server 2000 database and ASP .NET 2.0.  In this application a process, Ged.Parse, converts data from the GEDCOM standard format (a hierarchical file format that looks as if it was designed for 80-column cards) into my SQL Server database.
As we started to load reasonable quantities of data into the system we found that the on-line response became abysmal.  This problem was fixed by defining a number of secondary indexes (response times dropped to under a second, from previously exceeding 2 minutes and often timing out).  Unfortunately however the processing time of Ged.Parse then tripled, and it may now take up to an hour to process a GEDCOM. I believe that this is a byproduct of defining several indexes that are not needed by Ged.Parse itself, but which are of course maintained as Ged.Parse inserts new records into the database.  
I am wondering what my best strategy is, apart from putting Ged.Parse into a background task and just letting it trickle away.  (I will probably do this anyway). What I'd like to be able to do is to have Ged.Parse load records without creating the secondary indexes, and then create the indexes for the newly-added records as a penultimate step just before it makes them available for general use.  Of course there is no way that you can do this:  records in a table are either indexed or they are not.
Proposed change: recode Ged.Parse to load data into temporary tables, say NewPeople, NewFacts, etc., with these tables having only the indexes required by Ged.Parse. Then, as the last process in Ged.Parse, run a SQL procedure with code like:

Insert into People Select * From NewPeople
Delete from NewPeople
etc.
This is a reasonable amount of programming, so before I make this change could somebody tell me:  will this be significantly faster overall, or is this likely to make little or no improvement compared to the present process in which Ged.Parse loads data directly into People, Facts, etc?   Two facts that may influence the answer.  First, all record relationships are through GUIDs, so records in NewPeople, NewFacts, etc would already have their final key values.  Second: although Ged.Parse needs to form relationships between records, these relationships are only within the new records (created from the same GEDCOM), and Ged.Parse does not need to relate any of these new records to earlier records.
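To make the proposal concrete, a slightly fuller sketch of that final step (assuming the staging tables mirror the live ones exactly):

-- Final step of Ged.Parse: move staged rows into the live tables
BEGIN TRANSACTION
    INSERT INTO People SELECT * FROM NewPeople
    DELETE FROM NewPeople

    INSERT INTO Facts SELECT * FROM NewFacts
    DELETE FROM NewFacts
COMMIT TRANSACTION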
Thank you,
Robert Barnes.
 

View 2 Replies View Related

Fast Loading Relational Data

Apr 8, 2006

I am searching for a way to fast-load relational data. I know how to load data fast, but how can I store the related data fast as well?

For example :

Table1 ( table1Id int identity, name varchar(255) )

Table2 ( table2Id int identity, table1Id int, name varchar(255) )



When I insert 50 records into Table1, I can't get the 50 identity values back to insert the related data into Table2.



I think one of the solutions could be returning a selection of Table1 joined with syslockinfo, but I have no idea how to do it.



Does anyone have an idea?
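If this is SQL Server 2005 or later, one option is the OUTPUT clause, which captures all 50 generated identity values in a single set; a sketch using the tables above (#staging is a made-up source for the incoming rows):

DECLARE @newIds TABLE (table1Id int, name varchar(255));

-- Capture each generated identity value as the rows are inserted
INSERT INTO Table1 (name)
OUTPUT inserted.table1Id, inserted.name INTO @newIds
SELECT name FROM #staging;   -- hypothetical source of the 50 rows

-- Build the related rows from the captured keys
INSERT INTO Table2 (table1Id, name)
SELECT table1Id, name FROM @newIds;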

View 3 Replies View Related

Fast Loading Of Data To Table

Jul 10, 2007

Hi Every one,

How can I load or copy, say, millions of rows to a table in the database faster?

Thanks,
Mejo George
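Depending on where the rows come from, a typical fast path is BULK INSERT with a table lock so the load can be minimally logged; a minimal sketch (file and table names assumed):

-- Fastest into a heap with no nonclustered indexes, under
-- SIMPLE or BULK_LOGGED recovery
BULK INSERT dbo.TargetTable
FROM 'C:\data\rows.csv'   -- hypothetical file
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    TABLOCK,
    BATCHSIZE = 100000    -- commit in chunks
);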

View 6 Replies View Related

SQL Server 2012 :: Loading And Reformatting Text File Data Into Table

May 29, 2015

I am looking for a way to convert the following format into a SQL table. The format is BibTeX.

Essentially, a new row in the table would be created for each entry, denoted by an @ sign, and each column is denoted by an =. As you can see from the example data, no single entry contains all the possible columns, and some fields can run over two lines.

To load this I was considering loading it into a table with each line being a row, adding a row number, then a column counting the @ signs in order, essentially grouping each record; then, for each group, running through and looking for the column keywords 'author', 'title', etc., and splitting the data out into its constituent parts using SUBSTRING and CHARINDEX (a sketch of this grouping step follows the sample below).

@Book{hicks2001,
author = "von Hicks, III, Michael",
title = "Design of a Carbon Fiber Composite Grid Structure for the GLAST
Spacecraft Using a Novel Manufacturing Technique",
publisher = "Stanford Press",
year = 2001,

[Code] ....
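A sketch of the grouping step described above, assuming the file has first been bulk-loaded one line per row into a staging table (all names made up; the running SUM needs SQL Server 2012):

-- CREATE TABLE dbo.BibStage (LineNo int IDENTITY(1,1) PRIMARY KEY,
--                            LineText nvarchar(max));

SELECT LineNo,
       LineText,
       -- running count of '@' lines = BibTeX entry number
       SUM(CASE WHEN LineText LIKE '@%' THEN 1 ELSE 0 END)
           OVER (ORDER BY LineNo ROWS UNBOUNDED PRECEDING) AS EntryNo
FROM dbo.BibStage;
-- Each EntryNo group can then be split on the keyword before the '='
-- (author, title, ...) with CHARINDEX/SUBSTRING as described.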

View 9 Replies View Related

SQL Server 2012 :: Strategy To Delete / Move Millions Of Rows In A Database?

Apr 16, 2015

I am using SQL Server 2012 SE. I am trying to delete rows from a couple of tables (GetPersonValue has 250 million rows, and I am trying to delete 50 million; GetPerson has 35 million rows, and I am trying to delete 20 million). These tables are in transactional replication. The plan is to delete data older than 400 days.

I tried to move the last 400 days of data to new tables, and it took about 11 hours. If I delete data in chunks of 500,000 rows, it takes a long time to rebuild the indexes (delete plus index rebuild: 13 hours). Since I am using Standard Edition, partitioning won't work.

find ddl below:

GO
CREATE TABLE [dbo].[GetPerson](
[GetPersonId] [uniqueidentifier] NOT NULL,
[LinedActivityPersonId] [uniqueidentifier] NOT NULL,
[CTName] [nvarchar](100) NULL,
[SNum] [nvarchar](50) NULL,
[PHPrimary] [nvarchar](50) NULL,

[code]....
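Without partitioning, one workable pattern is a batched delete loop, so each chunk commits separately and replication and log backups can keep up; a minimal sketch (the date column is assumed):

DECLARE @rows int = 1;
WHILE @rows > 0
BEGIN
    DELETE TOP (500000)
    FROM dbo.GetPersonValue
    WHERE CreatedDate < DATEADD(DAY, -400, GETDATE());   -- hypothetical column

    SET @rows = @@ROWCOUNT;
    WAITFOR DELAY '00:00:05';   -- let replication and the log catch up
END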

View 1 Replies View Related

SQL Server 2012 :: Row Number Over Partition

Dec 5, 2013

I am having problems getting my last revision number out when I try to use inner joins along with ROW_NUMBER() OVER (PARTITION BY ...).

I am using 2 tables, tbl_acyear_lookup & tbl_targets

tbl_acyear_lookup columns = (pk)- academic_year_id, academic_year
looks like this:
1, 2010/2011
2, 2011/2012
3, 2012/2013

tbl_targets columns = targetID, Academic_Year_ID,Course_Mode,UK_ENROL, INT_ENROL, Notes, Revision_Number

I have one stored proc that uses the Row_number over partition that looks like this:

SELECT TargetID, Academic_Year_id, Course_Mode, UK_Enrol, Int_Enrol, Notes, Revision_Number
FROM (
    SELECT ROW_NUMBER() OVER (PARTITION BY [Academic_Year_id]
                              ORDER BY [Revision_Number] DESC) AS [RevNum],
           TargetID, Academic_Year_id, Course_Mode, Target_Year,
           UK_Enrol, Int_Enrol, Notes, Revision_Number
    FROM tbl_targets
    WHERE course_mode = @course_mode
) RV
WHERE RV.RevNum = 1

Now the next stored proc needs to use the above, but I need to add the Academic_Year from the tbl_acyear_lookup table and also filter on target_year = 'year 1'.
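A minimal sketch of one way to combine the two inside that proc (assuming Target_Year lives on tbl_targets):

SELECT TargetID, Academic_Year_id, Academic_Year, Course_Mode,
       UK_Enrol, Int_Enrol, Notes, Revision_Number
FROM (
    SELECT ROW_NUMBER() OVER (PARTITION BY t.Academic_Year_id
                              ORDER BY t.Revision_Number DESC) AS RevNum,
           t.TargetID, t.Academic_Year_id, a.Academic_Year, t.Course_Mode,
           t.UK_Enrol, t.Int_Enrol, t.Notes, t.Revision_Number
    FROM tbl_targets t
    INNER JOIN tbl_acyear_lookup a
            ON a.academic_year_id = t.Academic_Year_id
    WHERE t.course_mode = @course_mode
      AND t.Target_Year = 'year 1'
) RV
WHERE RV.RevNum = 1;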

View 2 Replies View Related

SQL 2012 :: Switching From One Database To Another Within The Same Script?

Jan 28, 2015

I would like to know how I can switch from one database to another within the same script. I have a script that reads the header information from a SQL BAK file and loads the information into a Test database. Once the information is in the temp table (Test database), I run the following script to get the database name.

This part works fine.

INSERT INTO @HeaderInfo EXEC('RESTORE HEADERONLY
FROM DISK = N''I:\TEST\database.bak''
WITH NOUNLOAD')
DECLARE @databasename varchar(128);
SET @databasename = (SELECT DatabaseName FROM @HeaderInfo);

The problem is when I try to run the following script nothing happens. The new database is never selected and the script is still on the test database.

EXEC ('USE '+ @databasename)

The goal is switch to the new database (USE NewDatabase) so that the other part of my script (DBCC CHECKDB) can run. This script checks the integrity of the database and saves the results to a temp table.

What am I doing wrong?
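Most likely nothing is wrong with the string itself: EXEC runs it as a separate batch, so the USE only applies inside that child scope and is gone when it returns. One workaround is to put the USE and the work into the same dynamic batch; a sketch:

DECLARE @sql nvarchar(max);
SET @sql = N'USE ' + QUOTENAME(@databasename) + N'; DBCC CHECKDB WITH NO_INFOMSGS;';
EXEC (@sql);   -- USE and DBCC CHECKDB share one scope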

View 2 Replies View Related

SQL 2012 :: Disabling Column Store Index And Try Loading Data

Oct 17, 2015

We have a typical issue with a columnstore index. We have a procedure which does two activities: Switch and Reverse Switch.

Switch:
1. Fetch the partitions that need to be switched
2. Switch the data from the main table to the switch table
3. Disable the columnstore index on the switch table

SSIS package:
4. Load data into the switch table (insert/update)

Reverse Switch:
5. Re-enable the columnstore index on the switch table
6. Switch the data back from the switch table to the main table

Issue: sometimes the columnstore index does not get disabled, and the package fails, telling us to disable the columnstore index and try loading the data again.

If we re-run the procedure, the columnstore index gets disabled.
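For reference, the disable/enable steps presumably map to something like this (index and table names assumed); on 2012 a nonclustered columnstore index makes its table read-only until it is disabled:

-- Before loading: make the switch table writable
ALTER INDEX CSI_SwitchTable ON dbo.SwitchTable DISABLE;

-- Before switching back: rebuilding re-enables the columnstore
ALTER INDEX CSI_SwitchTable ON dbo.SwitchTable REBUILD;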

View 1 Replies View Related

SQL Server 2012 :: Why Partition Function Works For Datetime2 But Not For Datetime1

Dec 20, 2013

DECLARE @DatePartitionFunction nvarchar(max) = N'CREATE PARTITION FUNCTION DatePartitionFunction (datetime) AS RANGE RIGHT FOR VALUES (';
DECLARE @i datetime = '2007-09-01 00:00:00.000';
WHILE @i < '2008-10-01 00:00:00.000'
BEGIN
SET @DatePartitionFunction += '''' + CAST(@i as nvarchar(10)) + '''' + N', ';

[Code] ....

Msg 7705, Level 16, State 2, Line 1
Could not implicitly convert range values type specified at ordinal 1 to partition function parameter type.

However if I change to datetime2 it works

DECLARE @DatePartitionFunction nvarchar(max) = N'CREATE PARTITION FUNCTION DatePartitionFunction (datetime2) AS RANGE RIGHT FOR VALUES (';
DECLARE @i datetime2 = '2007-09-01 00:00:00.000';
WHILE @i < '2008-10-01 00:00:00.000'
BEGIN
SET @DatePartitionFunction += '''' + CAST(@i as nvarchar(10)) + '''' + N', ';

[Code] ...

The documentation says: "All data types are valid for use as partitioning columns, except text, ntext, image, xml, timestamp, varchar(max), nvarchar(max), varbinary(max), alias data types, or CLR user-defined data types."

So in this case, why doesn't datetime work?

The version is as follows:

Microsoft SQL Server 2012 (SP1) - 11.0.3128.0 (X64)
Dec 28 2012 20:23:12
Copyright (c) Microsoft Corporation
Enterprise Evaluation Edition (64-bit) on Windows NT 6.1 <X64> (Build 7601: Service Pack 1)

from [URL] .....

Table and index partitioning is supported in this edition

so I don't know why it fails!
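One likely culprit worth checking: converting datetime to a string defaults to the 'Mon dd yyyy hh:miAM' style, so CAST(@i as nvarchar(10)) yields something like 'Sep  1 200', which is not a usable boundary literal, while datetime2 converts to an ISO style whose first ten characters ('2007-09-01') are. An explicit style makes the two behave the same:

DECLARE @d datetime = '2007-09-01';
DECLARE @d2 datetime2 = '2007-09-01';
SELECT CAST(@d AS nvarchar(10));         -- 'Sep  1 200' (style 0, truncated)
SELECT CAST(@d2 AS nvarchar(10));        -- '2007-09-01'
SELECT CONVERT(nvarchar(10), @d, 120);   -- '2007-09-01' (works for datetime too)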

View 1 Replies View Related

SQL Server 2012 :: Partition Existing Table And Archive One Of The Partitions

Jun 22, 2015

I have some tables that need to be partitioned, and I need to archive one of the partitions.

I did this in Oracle several years ago but not in SQL Server.

I'm looking for a basic example on how to do this.

I know the basic steps but the examples that I found on the Web were not quite what I'm looking for.

Partition an existing SQL Server Table [URL] ....
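A minimal sketch of the switch-out step (assumes an archive table identical in structure to the partitioned table, on the same filegroup as the partition being removed; all names made up):

-- Metadata-only move of the oldest partition into the archive table
ALTER TABLE dbo.BigTable SWITCH PARTITION 1 TO dbo.BigTable_Archive;

-- Optionally remove the now-empty boundary
ALTER PARTITION FUNCTION pfBigTable() MERGE RANGE ('2014-01-01');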

View 9 Replies View Related

What Does Strategy Exist To Deploy SSIS Package And My Own Data Flow Components Into A Enterparise Server?

Mar 29, 2007



I created an SSIS package and several data flow components for this package.

What strategy exists to deploy the SSIS package and the data flow components onto an enterprise server?

Thanks in advance.

View 2 Replies View Related

SQL Server 2012 :: Error Loading Metadata - No Cubes Found

Aug 22, 2014

When I start querying with a SQL Server MDX query, I get "Error loading metadata: no cubes found".

View 6 Replies View Related

SQL Server 2012 :: Report Never Stops Loading / Error - Out Of Memory

Oct 3, 2014

I have a report that uses three tables from a database. As long as I only use two of the tables, it runs fine. I need the data from the third table for a line chart, so of course there is a great deal of data in that third table. I have a WHERE clause for start date and end date. Is there a way I could only search the third table after I know what I need from it? Or ...

View 6 Replies View Related

SQL Server 2012 :: Loading Image From File System Into Query Using Openrowset

May 14, 2014

I have a directory with images and a table in my DB with the path of each file. The main application allows me to create reports where I can display an image, so I was thinking of using a query like:

SELECT [ID],
       [PT_CODE],
       [FILE_PATH],
       CASE WHEN [FILE_PATH] IS NOT NULL THEN
            (SELECT * FROM OPENROWSET(BULK [FILE_PATH], SINGLE_BLOB) TT)
       END AS IMAGE_LOADED
FROM [DB].[dbo].[TABLE_MR_FILES]

But I keep getting the error:

Incorrect syntax near 'FILE_PATH'.

I have tried multiple combinations without luck to make OPENROWSET read the path stored in the column [FILE_PATH]. What am I missing?

Note: I am using MSSQL 2012. I don't want to import the images into the DB, just load them on the fly as needed by the report run from the application. I have full access to the DB, so if a stored procedure is the solution, I can go with it.
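OPENROWSET(BULK ...) only accepts a literal file path, not a column or variable, which is why the parser stops at [FILE_PATH]. One workaround is building the statement per row with dynamic SQL, perhaps inside a stored procedure; a sketch (the @id filter is made up):

DECLARE @id int = 1;   -- hypothetical row to load
DECLARE @path nvarchar(260), @sql nvarchar(max);

SELECT @path = [FILE_PATH]
FROM [DB].[dbo].[TABLE_MR_FILES]
WHERE [ID] = @id;

SET @sql = N'SELECT BulkColumn AS IMAGE_LOADED FROM OPENROWSET(BULK '''
         + REPLACE(@path, '''', '''''') + N''', SINGLE_BLOB) AS TT;';
EXEC sp_executesql @sql;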

View 4 Replies View Related

SQL 2012 :: Fast Changing Dimension

May 18, 2015

I have a table that needs to be incorporated into the data warehouse. The table has the following schema.

CREATE TABLE [dbo].[Consignment](
[Id] [int] IDENTITY(1,1),
[BooingID] INT,
[BookingDate] [datetime] NULL,
[CarrierServiceName] [nvarchar](255) NULL,
[CarrierServiceCode] [nvarchar](255) NULL,

[code]...

This table has the same granularity as the fact table, as it's one row per booking. However, due to the nature of the data, I would not want to incorporate this into the fact table. The originating and destination addresses are populated for each booking and are required for reporting.

Question: should this be moved into a fast-changing dimension table, or would there be a better way to incorporate this data?

View 1 Replies View Related

SQL 2012 :: MDF File Size Is Growing Fast?

Jul 28, 2014

I have a problem like the following:

On the 24th my MDF size was 10 GB; when I checked just now, the MDF size had suddenly increased to 30 GB.

I need a solution to decrease the size, as well as where I can check the reasons behind the growth.
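A first check is whether the extra 20 GB is real data or just space pre-allocated inside the file by autogrowth; a sketch:

-- Allocated size vs. space actually used, per file (sizes are 8 KB pages)
SELECT name,
       size / 128 AS allocated_mb,
       FILEPROPERTY(name, 'SpaceUsed') / 128 AS used_mb
FROM sys.database_files;

-- Then look at which tables grew
EXEC sp_spaceused;   -- database totals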

View 2 Replies View Related

SQL 2012 :: Local Backup Strategy On Primary With Transaction Log Shipping Enabled To Secondary DR Site

Oct 2, 2014

I have a scenario where a customer is going to be using log shipping to the DR site; however, we need to maintain the normal backup strategy on the current system (i.e. nightly full, every-6-hour differential, and hourly transaction log backups). I know how to set up transaction log shipping and fail over to DR and back, but now the local backup strategy is going to be an issue. I currently use the [URL] .... maintenance solution.

Is it even possible to do regular backups locally keeping data integrity for your backup strategy with Transaction Log Shipping enabled?
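As far as I know, full and differential backups can run unchanged alongside log shipping; the conflict is the hourly log backup, because a second log backup job consumes log records the shipping jobs never see and breaks the restore chain on the secondary. Any extra log backup has to be COPY_ONLY; a sketch:

-- Normal full backup: safe with log shipping
BACKUP DATABASE MyDB TO DISK = N'D:\Backup\MyDB_full.bak';

-- Ad hoc log backup that leaves the shipping chain untouched
BACKUP LOG MyDB TO DISK = N'D:\Backup\MyDB_log_copyonly.trn'
WITH COPY_ONLY;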

View 2 Replies View Related

SQL Server 2008 :: Verify Partition Data Load Syntax

Feb 28, 2015

What is the syntax to verify that the partition data is loaded into the correct partition?
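One common check is the $PARTITION function, which maps each row's partitioning-column value to its partition number (function, table, and column names assumed):

SELECT $PARTITION.pfMyDates(LoadDate) AS partition_number,
       COUNT(*) AS row_count,
       MIN(LoadDate) AS min_date,
       MAX(LoadDate) AS max_date
FROM dbo.MyPartitionedTable
GROUP BY $PARTITION.pfMyDates(LoadDate)
ORDER BY partition_number;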

View 0 Replies View Related

SQL Server 2008 :: Verify Partition Data Load Syntax?

Mar 2, 2015

What is the syntax to verify a partition data load?

View 1 Replies View Related

SQL Server Admin 2014 :: How A New Partition Function Apply For Current Data

Apr 15, 2015

I have a heavy database, more than 100 GB for only six months of data. Every query on it takes a long time, and I don't have enough space to add more indexes, so I decided to do partitioning. I created a partition function on a date field, and all data records for each month were assigned to a separate file. Is partitioning only for future data entry?
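Partitioning is not only for future rows: re-creating the clustered index on the partition scheme redistributes the existing data as well, though it needs workspace and time; a sketch with assumed names:

-- Moves every existing row into its date-range partition
CREATE CLUSTERED INDEX CIX_MyTable_DateField
    ON dbo.MyTable (DateField)
    WITH (DROP_EXISTING = ON)   -- if a clustered index of this name exists
    ON psMyDates (DateField);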

View 9 Replies View Related

Data Missing When Loading Data Into Sql Server 2005

Jan 17, 2008



Hi, Experts

The project is a C/S data analysis system built with .NET 2.0 in a Windows environment. OS: Microsoft Windows 2003 R2 Standard Edition Service Pack 2; the database used in this project is SQL Server 2005. As a data analysis system, we need to load a large amount of data from files into the database. We do it by creating a DTS package and then loading the data by executing "m_Package.Execute(null, variables, m_PackageEvents, null, null)".

The problem is, we found that DTS misses some data randomly sometimes, and we haven't found the pattern yet. For example, we have data as follows in the data file, with all data fields split by '|':
11234|26341|2007-09-03 00:00|0|0|0.0|0|0.011470833793282509|1|0.045497223734855652|0|0|1|0|3|2929|13130|43|0|2|0|0|40|1|0|0|0|0|0|1||0|0|3|0|0|0|0|0||0|3|0|0|43|43|0|41270|0|3|0|0|10|3|0|0|0|0|0||0|1912|0|0|3|0|0|0|0|0|0|0|3|0|0|5|0|40|0|9|0|0|0|0|0|0|0|0|29|1|1|24|24.0|16|16.0|0|0|0.0|0|0|24|23.980693817138672|0|0.0|0|0.0|0|0.0|0|0.0|11|2.0764460563659668|43|2|0|0|30|11|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|3|3|0|0|0|0|0|0|0|0|0|6|0|0|0|0|0|6|0|0|45|1|0|0|0|2|42|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|2|0|0|0|2|0|0|0|0|0|0|51|47|85|0|0|||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|||||||||||||0|0|0|0|0|97.117401123046875|0|0|83|57|||0.011738888919353485|0|1|0.065955556929111481|0|4|||0.00026658669230528176|1|0.00014440112863667309|1|68|53|12|2|1|2.0562667846679688|10|94|2|0|0|30|11|47|4|13902|7024|6878|18|85|4.9072666168212891|5|0.0|0|0.0|0|0.0|0|0.0|0|358|6448|6324|0|0|0|0||0||462|967|0|41|39|2|0|0|0|1|0|0|0|0|0|0|0|0|3|0|0|3|0|0|0|0|0|0|0|0|0|3|0|3|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|46|0|1|0|1|37|0|0|46|0|1|0|1|37|0|0|0|0|0|0|0|0|0.0|0|0|6|4|2|0|0|2|1|0|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|1|0.012290795333683491|0|44|44.0|0|0.0|0|0|0|30|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|2|0|2|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|2|1|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|1|0|0|0|0|0|0|0|0|0|0|0|0|27|0|0|2112|411|411|45|437|2|0|2|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|4|0|4|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|1|0|0|0|0|0|0|0|0|0|0|0|6|6|0|3|2|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|5|5.0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|600|600|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|6|0|0|0|0|0|0|6|0|9|1|2|2|3|0|1|0|0|0|0|0|0|0|0|0|0|0|13|3|2|5|1|1|1|0|0|0|102|0|1|1|0|0|0|3|3|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||0|0|0|0|0|0|0|0|0|0|0||0|0|0|0|0|0|0|0|0||||||||||0|0|0|0|0|0|0||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|46.0|46|0.0|0|0.0|0|0.011469460092484951|1|0.0|0|0.0|0|3|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|0.0|0|0|0|0|0|0|0|0|0|0|0|0|0|||0|100.0|100.0|0|1|0|1|0|0|0.02481505274772644|1|0.014951236546039581|1|0|0|0|0|0|0|0|0|0|0|0|0|0|||||||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|||0|||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|4695.5556640625|42260|7126.66650390625|62040|||||||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||||||0|0||||||||||

11213|26341|2007-09-03 00:00|0|0|0.0|0|0.068584725260734558|2|0.14375555515289307|27|0|2|1|11|3966|13162|97|0|13|0|0|83|3|2|3|0|0|0|26||0|0|11|0|0|0|1|0||0|1|0|3|97|98|0|246795|0|11|1|0|3|14|0|0|0|0|0||0|1912|0|0|12|0|0|0|0|0|0|0|12|0|0|17|0|83|2|2|2|0|0|0|0|0|0|0|73|3|1|24|24.0|16|16.0|0|0|0.0|3|0|24|23.905364990234375|2|0.0040000001899898052|0|0.0|0|0.0|0|0.0|11|2.0772171020507812|97|7|0|0|80|10|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|12|12|0|0|0|0|0|0|0|0|0|41|0|0|0|0|0|41|0|0|99|25|0|0|0|0|74|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|2|0|0|0|0|0|0|0|0|0|0|177|158|332|0|0|||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|3|||||||||||||0|0|0|0|0|0.0|0|0|321|198|||0.041950233280658722|0|2|0.1999288946390152|0|5|||0.00088306842371821404|1|0.00051651173271238804|1|529|113|4|8|2|2.0671226978302002|10|274|7|0|0|80|10|66|17|13934|7024|6910|31|332|4.7173333168029785|5|0.000033333333703922108|1|0.000033333333703922108|1|0.000033333333703922108|1|0.0|0|358|6448|6356|0|0|0|0||0||1609|3423|0|83|78|5|0|0|26|0|0|0|0|0|0|0|0|0|2|0|1|1|0|0|0|0|3|0|0|0|0|2|0|2|0|0|0|0|0|0|0|0|0|0|2|0|0|0|0|0|0.0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|65|0|1|0|1|72|0|0|65|0|1|0|1|72|0|0|0|0|0|0|0|0|0.0|0|0|12|7|0|2|3|16|5|5|6|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|2|0.04131799191236496|0|48|48.0|5|5.0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|0|0|0|0|0|1|0|0|0|0|0|0|9|0|5|1|0|0|0|1|0|0|0|1|1|0|0|0|0|0|0|0|0|0|0|4|2|1|0|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|3|0|0|0|0|3|0|0|0|0|0|0|0|0|121|0|1410|6046|558|1400|192|2467|10|0|5|1|0|0|0|2|0|0|0|1|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|15|0|10|0|0|0|0|3|0|1|0|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|1|0|0|0|0|0|0|0|0|0|0|0|21|9|12|10|3|1|0|1|4|0|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|163|4|144|91|92|2|92|0|0|0|0|0|101|92|0|0|0|0|101|0|0|0|0|600|596|1|0|0|3|0|0|0|0|0|0|0|0|0|0|0|9|0|0|1|0|0|0|8|0|34|3|4|14|7|2|3|0|1|0|0|0|0|0|0|0|0|0|41|6|4|23|5|2|1|0|0|0|289|0|7|7|0|0|0|11|11|0|0|4|4|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||0|0|0|0|0|0|0|0|0|0|0||0|0|4|0|0|0|0|0|4||||||||||3|0|0|0|0|0|3||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|0|0|0|0|0|0|0|0|0|55.814689636230469|47.931411743164062|48|0.0|0|0.0|0|0.068584725260734558|2|0.0|0|0.0|0|14|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|0.0|0|0|0|0|1|0|0|1|0|0|0|0|0|||0|100.0|100.0|0|26|26|0|0|0|0.088263392448425293|2|0.056161552667617798|2|0|0|0|0|0|0|0|0|0|5|22|0|0|||||||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|||0|||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|16308.888671875|146780|23162.22265625|184840|||||||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||||||0|0||||||||||

11220|26341|2007-09-03 00:00|0|0|0.0|0|0.309184730052948|2|0.17757916450500488|0|0|7|4|18|3925|13682|172|0|19|0|0|164|10|5|4|0|0|0|2||0|0|20|0|0|1|4|0||0|5|0|4|172|172|0|1165085|0|20|4|0|20|8|0|0|0|0|0||0|1912|0|0|24|0|0|1|0|0|0|0|23|0|0|30|0|164|4|6|8|0|0|0|0|0|0|0|121|10|15|24|24.0|16|16.0|0|0|0.0|4|0|24|23.646148681640625|1|0.0040013887919485569|0|0.0|0|0.0|0|0.0|11|2.0849723815917969|172|5|0|0|123|44|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|26|24|0|0|0|2|0|0|0|0|0|192|1|0|0|0|0|191|0|0|190|12|0|0|0|0|178|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|3|0|0|0|0|0|1|0|0|0|0|1008|953|2758|0|5|||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|4|||||||||||||0|0|0|0|0|84.418106079101562|0|0|2626|1420|||0.29022222757339478|0|5|1.5045944452285767|0|5|||0.0058597764000296593|2|0.0046600494533777237|2|1340|1114|80|119|27|2.2584490776062012|10|1180|5|0|0|123|44|953|55|14462|7024|7438|52|2758|3.0037333965301514|5|0.021266667172312737|1|0.00036666667438112199|1|0.0|0|0.0|0|362|6440|6880|0|0|0|0||0||13711|27667|0|159|149|10|0|0|1|1|0|0|0|0|0|0|0|0|7|0|0|7|0|0|0|0|4|0|0|0|0|7|0|7|0|0|0|0|0|0|0|0|0|2|3|0|0|0|0|0|0.0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|842|0|111|0|102|1702|0|1|842|0|111|0|0|1703|0|0|0|0|0|0|0|0|0.0|0|0|47|26|11|3|7|37|1|20|16|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|4|0.24921548366546631|0|44|44.0|0|0.0|0|0|0|1003|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|10|0|8|2|0|0|0|0|0|0|0|0|0|0|81|1|60|10|10|0|0|0|0|0|0|0|0|0|1|1|0|0|0|0|0|0|0|25|16|4|2|3|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|53|27|17|4|5|0|0|0|0|0|0|0|0|0|421|0|8685|67179|2138|12104|80|26285|104|1|73|16|14|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|87|1|77|7|2|0|0|0|0|0|0|0|0|0|1|0|0|1|0|0|0|0|0|0|0|0|0|0|16|0|9|5|2|0|0|0|0|0|0|0|0|0|155|155|0|105|51|32|9|13|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|1|0|0|0|0|0|0|0|0|0|0|0|5|5.0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|102|0|0|0|0|0|600|445|4|20|32|63|16|15|4|1|0|0|0|0|0|0|0|37|0|0|5|0|0|0|32|0|230|7|10|99|68|22|21|0|3|0|0|0|0|0|0|0|0|0|286|18|10|182|53|17|6|0|0|0|2528|0|10|10|0|0|0|22|22|0|0|25|25|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||0|0|0|0|0|0|0|0|0|0|0||0|0|30|0|0|0|0|0|30||||||||||23|0|0|0|0|0|23||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|0|0|0|0|0|0|0|0|0|0.0|45.998283386230469|46|0.0|0|0.0|0|0.30917638540267944|2|0.0|0|0.0|0|8|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|0.0|0|0|0|0|1|1|0|0|0|0|0|0|0|||0|100.0|100.0|0|2|1|0|0|1|0.73375397920608521|5|0.41600942611694336|6|0|0|0|0|0|0|0|0|0|0|0|0|0|||||||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|||0|||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|98115.5546875|865520|176626.671875|1159360|||||||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||||||0|0||||||||||

We found that some of the data fields become NULL after the load action finishes. If we load the same data again, the problem disappears. We can't reproduce this issue 100% of the time, and we don't know why. Can anybody here help us solve this issue or give us some clue?


View 3 Replies View Related

SQL 2012 :: Partition On Table For A Particular Value And Ranges

Mar 23, 2015

How can I make partitions on a table for a particular value and ranges together?

For example, for customer id 12345 I need a separate partition; for 56789 I need another separate partition; and if I have a range of values like 1000 to 1020, then a separate partition for that range.

For certain ids I need a single-value partition, and for certain ids I need ranges.

Is this possible in SQL 2012?
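It is possible with a single RANGE function, by placing boundary values immediately on either side of each id that needs its own partition; a sketch for the values above:

-- RANGE RIGHT: each boundary starts a new partition, so (12345, 12346)
-- isolates id 12345, (56789, 56790) isolates id 56789, and
-- (1000, 1021) gives the 1000-1020 range its own partition.
CREATE PARTITION FUNCTION pfCustomer (int)
AS RANGE RIGHT FOR VALUES (1000, 1021, 12345, 12346, 56789, 56790);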

View 6 Replies View Related

SQL 2012 :: How To Add New Filegroup To Existing Partition Scheme

Jul 10, 2014

How do I add some more ranges to an existing partition scheme and function?

My table is already partitioned on date ranges:

6 partitions, each partition containing 6 months of data, so the total data is 3 years.

i.e. partition 1: Jan 2012 to Jun 2012
partition 2: Jul 2012 to Dec 2012
partition 3: Jan 2013 to Jun 2013
partition 4: Jul 2013 to Dec 2013
partition 5: Jan 2014 to Jun 2014
partition 6: Jul 2014 to Dec 2014
After Jan 2015, data will go to the PRIMARY filegroup (default).

Now the customer wants to add two more filegroups with these partition ranges, i.e. Jan 2015 to Jun 2015 and Jul 2015 to Dec 2015.

Adding the filegroup and NDF file is OK, but how do I alter the partition scheme and partition function with these additional ranges?

The partitioned table size is 200 GB.
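The usual sequence is one NEXT USED / SPLIT RANGE pair per new boundary: mark the filegroup the scheme should use next, then split the function; a sketch with assumed names:

ALTER PARTITION SCHEME psMyScheme NEXT USED FG_2015H1;
ALTER PARTITION FUNCTION pfMyDates() SPLIT RANGE ('2015-01-01');

ALTER PARTITION SCHEME psMyScheme NEXT USED FG_2015H2;
ALTER PARTITION FUNCTION pfMyDates() SPLIT RANGE ('2015-07-01');

-- Note: if 2015 rows have already landed in the default partition,
-- each SPLIT physically moves them, which can be slow on a 200 GB table.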

View 1 Replies View Related

Loading Data From Xml To Sql Server 2005

Aug 13, 2007

Please find the code below (which I am using).
1)
CREATE TABLE rssFeeds( feedXML XML )
SELECT * FROM rssFeeds

DECLARE @xmlDoc XML
SET @xmlDoc = ( SELECT * FROM OPENROWSET ( BULK 'C:\Test.xml', SINGLE_CLOB ) AS xmlData )

2)
INSERT INTO rssFeeds (feedXML) VALUES (@xmlDoc)

I am able to get the contents of the XML file into SQL Server in only one field of the table. Now I am asked to make a table with the same schema as the XML file's. How do I proceed? Please let me know some URL for this.

Regards,
Ashish Johri
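Without seeing Test.xml the element names below are made up, but the usual way to turn the stored XML into rows is shredding it with nodes()/value():

-- Hypothetical shape: <feed><item><title>...</title><link>...</link></item></feed>
SELECT x.item.value('(title)[1]', 'nvarchar(200)') AS title,
       x.item.value('(link)[1]', 'nvarchar(400)') AS link
FROM rssFeeds
CROSS APPLY feedXML.nodes('/feed/item') AS x(item);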

View 1 Replies View Related

Loading Data Into SQL Server From A SAS Dataset

Jul 20, 2005

Hi,

SQL Server 2000 SP3. Has anyone ever successfully loaded data into SQL Server from a SAS dataset? I have tried using DTS and the SAS OLE DB drivers but get the following error:

Error Description: A provider specific error occurred (%1:%2)
Context: Error calling GetRowSet to get DBSCHEMA_TABLES schema info. Your provider does not support all the schema rowsets required by DTS.

It does seem to me to be a problem with the OLE DB providers, but if anyone has seen this issue with DTS previously, let me know. Any help is appreciated. Thanks in advance, Reg

View 1 Replies View Related

EDI.TXT Data Loading In Sql Server Using SSIS

May 27, 2008



I have a requirement to load EDI .TXT format data into SQL Server using SSIS. The EDI file data looks like this:

@pRecType="A",@pA010="XC",@pA020="270",@pA110="CDC LOCAL"
@pRecType="C",@pC010="M",@pC015="H",@pC050="20080408",@pC060="B",@pC070="E40245",@pC080="P",@pC110="95000",@pC112="000000000",@pC120="P",@pC121="N",@pC122="",@pC124="100000002274",@pC125="166759",@pC210="Y",@pC301="000061131006",@pC320="20080211",@pC321="20080211",@pC511="",@pC512="",@pC900=" ABY"
@pRecType="E",@pE100="1",@pE101="5789",@pE110="",@pE111="",@pE120="",@pE121="",@pE130="",@pE131="",@pE140="",@pE141="",@pE150="",@pE151="",@pE160="",@pE161="",@pE170="",@pE171="",@pE180="",@pE181="",@pE190="",@pE191=""
@pRecType="H",@pH110="8",@pH120="3",@pH130="1",@pH210="",@pH220="20080211",@pH230="20080211",@pH235="01",@pH240="20080211",@pH250="20080211",@pH310="",@pH320="",@pH710="",@pH711="",@pH712="",@pH510="",@pH511="",@pH512="",@pH520="",@pH521="",@pH522="",@pH530="",@pH531="",@pH532="",@pH540="",@pH541="",@pH542="",@pH550="",@pH551="",@pH552="",@pH560="",@pH561="",@pH562="",@pH570="",@pH571="",@pH572="",@pH580="",@pH581="",@pH582="",@pH590="",@pH591="",@pH592="",@pH600="",@pH601="",@pH602="",@pH610="",@pH620="",@pH630="",@pH640="",@pH650="",@pH660="",@pH670="",@pH680="",@pH690="",@pH713=""
@pRecType="M",@pM250="170"
@pRecType="P",@pP010="SE",@pP110="E",@pP130="",@pP150="721153832",@pP210="",@pP310="",@pP340="",@pP350="",@pP360="",@pP370=""
@pRecType="S",@pS010="",@pS020="200278911",@pS110="BABIN",@pS111="JENNIFER",@pS112="L",@pS120="",@pS121="",@pS125="",@pS126="",@pS127="",@pS211="BABIN",@pS212="JENNIFER",@pS213="L",@pS221="F",@pS222="19850919",@pS231="M",@pS310="1605 SOUTH SHIRLEY",@pS311="",@pS315="GONZALES",@pS316="LA",@pS317="70737",@pS410="N"
@pRecType="V",@pV130="AA",@pV140="000000100",@pV150=""
@pRecType="X",@pX010="20080211",@pX020="20080211",@pX131="45378SG",@pX133="5789",@pX134="",@pX141="000095000",@pX142="000000000",@pX147="001",@pX210="0490"


Can you help me with how to solve this problem? I would appreciate your help with this.

Thanks

HK

View 4 Replies View Related

Strategy For Data Storage/searching

Dec 16, 2007

Hello there,

Don't know if this is the right forum to be asking this, but I'll give it a try...

I'm relativelly a beginner in SQL Server and T-SQL in general. The problem I'm trying to solve is the following:

The big picture is that I have data coming from different data sources which I need to store on a database for later reference. Each data source might have a different set of measurements. For example, data source 1 might log Pressure and Humidity while data source 2 logs Pressure and Temperature. Once the data is present on the DB, the users can go ahead and retrieve data for a given [datasource/measurement/time interval] to generate reports or charts.

My implementation so far consists of two tables: series_info and series_data. series_info holds general information for a given series of measurements for a given data source (Pressure for data source 1, Pressure for data source 2, Humidity for data source 1 and Temperature for data source 2, in our example). Each series has a bigint index as primary key.

The table series_data contains all data relative to the series from series_info. Each piece of data has a bigint as a primary key, an associated time (which is always increasing) and a foreign key to the series it represents (in series_info).

Alright, everything is cool so far. However, whenever a user wants to retrieve data for a given [data source/measurement/time interval], this takes very long, since all the series are interleaved in series_data, and for every search it's necessary to find where the desired data actually lies.

One obvious solution for this would be to dynamically create a new table to hold the data for each series, but that would just make my database disorganized, since there would be thousands and thousands of tables.

Another thing that comes to my mind is to create a table with information of where lies the data for a given [data source / measurement] for given dates. So when the user requested data for a given [data source/measurement] between, say, january and february, we would first look at this intermediate table and find out that the data lies between indexes 1000 and 2000 on the series_data table, so the next SELECT command to series_data would already contain a restriction like WHERE index>=1000 and index<=2000. This should probably improve the speed of retrieval.

What do you guys (or girls) think? Maybe there's simply a classical solution for such a case.
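For what it's worth, the classical fix is a composite index (ideally the clustered key) leading on the series and then the time, so each [data source/measurement/time interval] query becomes one range seek instead of a scan; a sketch with assumed column names:

-- All rows for a series are stored together, ordered by time
-- (if the bigint primary key is currently clustered, make it nonclustered first)
CREATE CLUSTERED INDEX CIX_series_data
    ON series_data (series_fk, sample_time);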


Thanks in advance!

View 6 Replies View Related

Data-archiving And Purging Strategy

Mar 31, 2008



Regarding SQL Server data, I am looking to implement the best data-archive and purge policy. Normally, we do SQL backups and keep the history for some period, for example 8 weeks, so we can go back and restore to any point in time up to 8 weeks in the past; we also do tape backups.

The question is: where can I get a good article or documentation on how to design such a policy, making sure that I am covered both for recent point-in-time recovery of the database (via the SQL backups) and for point-in-time recovery in the far past, say 3 years ago, using the tape backups, without duplicating the same efforts?

Any advices or suggestion on this topic.

Thanks,

View 5 Replies View Related

Integration Services :: Table Partition Using SSIS 2012

Nov 12, 2015

How can we create a table partition, and how can we use it in SSIS 2012?

View 6 Replies View Related

Loading Data From .mdb File Into Sql Server 2005

Aug 15, 2007

I am new to SQL Server 2005 and just studying. That said...
We use SQL Server 2005 Express Edition. Someone sent me a file (info.mdb) and asked me to load the data in this file into a table called Products, and also to load it into another table (ProdCat) where id = 'X05'.
Knowing nothing about data loading etc., how should I proceed? Does .mdb mean it's an Access database file? If that is the case, I don't have Access on my machine, so what should I do?

View 5 Replies View Related






