Data Warehousing :: Will Creating Partitions On Table Increase Insert Speed

Oct 8, 2015

I have a table with around 100 million rows. Every day we have an ETL process in which the table is truncated and reloaded. Will creating partitions on the table increase the insert speed?
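Partitioning by itself does not usually make a truncate-and-reload faster; what it mainly enables is loading into a separate staging table and switching it in as a metadata-only operation, or loading several partitions in parallel. A minimal sketch of the switch pattern follows (every name, column, and boundary value here is illustrative, and both tables must share the same partition scheme and structure):

CREATE PARTITION FUNCTION pfDateKey (int)
    AS RANGE RIGHT FOR VALUES (20150101, 20150201, 20150301);

CREATE PARTITION SCHEME psDateKey
    AS PARTITION pfDateKey ALL TO ([PRIMARY]);

-- Identically structured staging table on the same partition scheme
CREATE TABLE dbo.FactSales_Stage (DateKey int NOT NULL, Amount money NOT NULL)
    ON psDateKey (DateKey);

-- ... bulk load dbo.FactSales_Stage for one date range here ...

-- Metadata-only switch of the loaded partition into the main table
ALTER TABLE dbo.FactSales_Stage SWITCH PARTITION 2 TO dbo.FactSales PARTITION 2;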

View 4 Replies



Increase The Speed Of Insert Statement

Mar 2, 2008

Hi all

I'm using SQL Server 2005.

This statement takes 1:30 min to execute:

insert into temptable select [key] from table1

If I use the select statement alone it takes 4 sec, but with the insert statement it takes 1:30 min. By the way, I have indexes on table1.

How can I increase the speed of the insert statement?

thanks in advance
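A couple of patterns that are sometimes used to speed up this kind of copy (sketches only; the table names follow the post, and minimal logging also requires the SIMPLE or BULK_LOGGED recovery model):

-- Option 1 (works on SQL Server 2005): SELECT ... INTO creates the target table
-- and can be minimally logged
SELECT [key]
INTO   dbo.temptable
FROM   dbo.table1;

-- Option 2 (SQL Server 2008 and later): INSERT into an empty heap with TABLOCK
INSERT INTO dbo.temptable WITH (TABLOCK)
SELECT [key]
FROM   dbo.table1;

Dropping or disabling nonclustered indexes on the target before the load and rebuilding them afterwards also tends to help.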

View 4 Replies View Related

Data Warehousing :: Creating A Table With Column Store Index?

Sep 26, 2015

I am trying to create a sample table in Azure SQL Data Warehouse, but it's giving me a syntax error: Incorrect syntax near the keyword 'CLUSTERED'.

CREATE TABLE [dbo].[FactInternetSales]
( [ProductKey] int NOT NULL
, [OrderDateKey] int NOT NULL
, [CustomerKey] int NOT NULL
, [PromotionKey] int NOT NULL

[Code] ....

What's the correct syntax?
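For reference, a minimal sketch of the Azure SQL Data Warehouse form of this statement (the choice of distribution column is an assumption; the WITH clause replaces the on-premises CLUSTERED COLUMNSTORE INDEX DDL that triggers the error above):

CREATE TABLE [dbo].[FactInternetSales]
(   [ProductKey]   int NOT NULL
,   [OrderDateKey] int NOT NULL
,   [CustomerKey]  int NOT NULL
,   [PromotionKey] int NOT NULL
)
WITH
(   DISTRIBUTION = HASH([ProductKey])
,   CLUSTERED COLUMNSTORE INDEX
);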

View 2 Replies View Related

Data Warehousing :: How To Increase Partial Cache Size On Lookup Stuck

Apr 20, 2015

After converting from SSIS 2008 to SSIS 2012, I am facing a major performance slowdown while loading fact data. When we used 2008, one file used to take around 2 hours on average; after converting to 2012, it took 17 hours to load one file. This is the current scenario: we load data into staging, select everything from staging (28 million rows), and use a lookup for each dimension. I believe it is taking a very long time due to one dimension table which has 89 million rows.

With that lookup we are currently using partial cache, because full cache caused the system to run out of memory. In the Lookup Transformation Editor, how do I increase the 64-bit partial cache size? I am stuck at 4096 MB and cannot increase it. In 2008, I had a 200,000 MB partial cache size.

View 2 Replies View Related

Creating Indexes On Large Table To Increase Performance

Mar 5, 2008

Dear all,
I'm using SQL Server 2005 Standard Edition.
I have the following stored procedure that is executed against two tables (RecordedCalls and RecordedCallsTags).
The table RecordedCalls has more than 10,000,000 records and RecordedCallsTags has about 7,500,000 records.
The WHERE clause below is dynamic and varies every time this stored procedure is executed; it may contain 7 columns in the condition, or 10 columns, or 2 columns, etc.
I want to create non-clustered indexes on the columns used in the WHERE clause, but the DTA suggests different indexes whenever the WHERE clause changes.
So what is the right way to create indexes: one index on all the columns at once, or separate indexes on each column? Sometimes the DTA suggests 5 columns together when I'm using 5 conditions, and I can't accommodate every possible index since the WHERE clause always varies from situation to situation. The SP is below:


CREATE TABLE #tempLookups (ID int identity(0,1), Code NVARCHAR(100), NameE NVARCHAR(500), NameA NVARCHAR(500))
CREATE TABLE #tempTable (ID int identity(0,1), TypesCount INT, CallsType NVARCHAR(50))

INSERT INTO #tempLookups (Code, NameE, NameA)
SELECT Code, NameE, NameA FROM lookups WHERE [Type] = 'CALLTYPES' ORDER BY Ordering ASC

INSERT INTO #tempTable (TypesCount, CallsType)
SELECT COUNT(DISTINCT(RecordedCalls.ID)) AS TypesCount, RecordedCalls.CallType AS CallsType
FROM RecordedCalls LEFT OUTER JOIN RecordedCallsTags ON RecordedCalls.ID = RecordedCallsTags.CallID
WHERE RecordedCalls.ID <= '9369907'
AND (RecordedCalls.CallDate BETWEEN cast('01 Jan 1910 00:00:00:000' as datetime) AND cast('01 Jan 2210 00:00:00:000' as datetime))
AND (RecordedCalls.Duration BETWEEN 0 AND 1000000)
AND RecordedCalls.ChannelID NOT IN ('62061','62062','62063','62064','64110','64111','64112','64113','64114','69860','69861','69862','69863','69866','69867','69868')
AND RecordedCalls.ServerID NOT IN ('2')
AND RecordedCalls.AgentID NOT IN ('1000010000')
AND (RecordedCallsTags.TagID IS NULL OR RecordedCallsTags.TagID NOT IN ('100','200'))
AND RecordedCalls.IsDeleted = 'false'
GROUP BY RecordedCalls.CallType

SELECT IsNull(#tempTable.TypesCount, 0) AS TypesCount,
       CASE ('English') WHEN 'Arabic' THEN #tempLookups.NameA ELSE #tempLookups.NameE END AS CallsType
FROM #tempTable RIGHT OUTER JOIN #tempLookups ON #tempTable.CallsType = #tempLookups.Code

DROP TABLE #tempLookups
DROP TABLE #tempTable


Thanks all,
Tayseer

Any suggestions on how to create efficient indexes?
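One pattern that often works for this kind of variable filter (a sketch, not DTA output; the column choices below are illustrative only): a small number of nonclustered indexes keyed on the most selective, most frequently filtered columns, with the remaining filter columns carried as included columns so one index can cover many variations of the WHERE clause.

-- Illustrative: key on the columns that are almost always filtered and selective,
-- and INCLUDE the ones that vary so the index still covers the query
CREATE NONCLUSTERED INDEX IX_RecordedCalls_CallDate
ON dbo.RecordedCalls (CallDate, CallType)
INCLUDE (Duration, ChannelID, ServerID, AgentID, IsDeleted);

CREATE NONCLUSTERED INDEX IX_RecordedCallsTags_CallID
ON dbo.RecordedCallsTags (CallID, TagID);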

View 2 Replies View Related

Database Optimization (Increase Speed)

Nov 21, 2005

Well, good morning/afternoon to everyone. It's been a while since I've posted here and it seems that the site is a lot faster now. Good to see. :) Anyway, I'm working on a current problem here at work with our database being quite slow. I've done some research already and will continue to do so, but I wanted to get some of your opinions. I've run the DBCC SHOWCONTIG command, and it reports the following for the first few system tables (all index ID 1, database ID 6, TABLE level scan):

Table       Pages  Extents  Extent Switches  Avg Pages/Extent  Scan Density      Logical Scan Frag.  Extent Scan Frag.  Avg Bytes Free/Page  Avg Page Density
sysobjects  34     12       33               2.8               14.71% [5:34]     41.18%              83.33%             2303.6               71.54%
sysindexes  72     16       59               4.5               15.00% [9:60]     50.00%              81.25%             4184.9               48.30%
syscolumns  323    50       299              6.5               13.67% [41:300]   48.61%              96.00%             4527.0               44.07%
systypes    1      1        0                1.0               100.00% [1:1]     100.00%             0.00%              6712.0               17.07%

According to the DBCC SHOWCONTIG documentation, there should be no fragmentation at all. Some questions:

1. Would system performance be severely reduced with the above fragmentation (logical and extent)?
2. Can the DBCC INDEXDEFRAG(dbname, tablename, indexname) command be issued against those system tables without consequences?
3. Is there some other command that can defrag the entire database without having to specify which tables?

Also, I have used the Index Tuning Wizard after a Profiler trace, but that failed with an unknown error.
That's it for now; please let me know if you have some info I could use to help speed up my database.
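On question 3, there is no single built-in "defrag the whole database" command in this version of SQL Server, but a common workaround is to loop over the user tables. A sketch (sp_MSforeachtable is an undocumented but widely used system procedure; run during a maintenance window, since DBREINDEX is an offline rebuild):

-- Rebuild all indexes on every user table
EXEC sp_MSforeachtable 'DBCC DBREINDEX(''?'', '''', 0)'

-- Or defragment one index at a time, online, as the post already mentions:
DBCC INDEXDEFRAG (MyDatabase, 'dbo.MyTable', MyIndex)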

View 8 Replies View Related

Data Warehousing :: Cannot Insert Value NULL Into Column When Executing Stored Procedure

Sep 22, 2015

I received this stored procedure and modified it to run on my system. Specifically, I only changed the database and filter text and left the rest alone. When I execute the stored procedure, I get the error:

Msg 515, Level 16, State 2, Procedure ObjectNotesInsert, Line 18
Cannot insert the value NULL into column 'RefRowPointer', table 'pSCI_App.dbo.ObjectNotes'; column does not allow nulls. INSERT fails. The statement has been terminated.

Here is the actual stored procedure I am running. I should add that I can execute each step individually and get results, and if I hard-code the resulting values into the procedure, the execution works.

USE [pSCI_App]
GO
/****** Object: StoredProcedure [dbo].[_JAMTestSp] Script Date: 09/21/2015 11:32:09 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON

[Code] ....
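The error means the SELECT feeding the INSERT is producing a NULL for RefRowPointer. A minimal illustration of the two usual fixes (everything except the table and column named in the error message is hypothetical, since the procedure body is truncated above):

-- Fix 1: substitute a known value when the source is NULL
INSERT INTO dbo.ObjectNotes (RefRowPointer /* , other columns */)
SELECT ISNULL(s.RefRowPointer, 0) /* , other columns */
FROM   #Source AS s;

-- Fix 2: simply exclude rows that have no RefRowPointer
INSERT INTO dbo.ObjectNotes (RefRowPointer /* , other columns */)
SELECT s.RefRowPointer /* , other columns */
FROM   #Source AS s
WHERE  s.RefRowPointer IS NOT NULL;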

View 30 Replies View Related

How Do I Prevent An Insert Into Statement To Increase Tempdb Data Files So Much

May 6, 2008

I'm running a procedure which does INSERT INTO table_name (id, name, ...) SELECT id, name, ... FROM table_name. For some reason the tempdb data file grows up to 200 GB. tempdb is set to expand unrestricted by 10%. How can I prevent that from happening? Thanks.
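One approach that is sometimes used to keep tempdb (and the transaction log) from ballooning on a very large single INSERT ... SELECT is to break it into smaller batches, each committed separately. A sketch, assuming the table has a unique integer key named id (all names here are placeholders):

DECLARE @batch int, @rows int
SET @batch = 100000
SET @rows = 1

WHILE @rows > 0
BEGIN
    -- Copy the next batch of rows that have not been copied yet
    INSERT INTO dbo.target_table (id, name)
    SELECT TOP (@batch) s.id, s.name
    FROM   dbo.source_table AS s
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.target_table AS t WHERE t.id = s.id)

    SET @rows = @@ROWCOUNT
END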

View 5 Replies View Related

Data Warehousing :: How To Use Temporary Table In SSIS

Jul 23, 2015

How do I use a temporary table in SSIS?
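One common pattern (a sketch, not the only way): create the temp table from an Execute SQL Task, set RetainSameConnection = True on the connection manager so the same session is reused across tasks, and set DelayValidation = True on the data flow that reads or writes it. The T-SQL inside the Execute SQL Task might look like this (the table name and columns are placeholders; the ## global form survives across tasks even more easily than a local #table):

IF OBJECT_ID('tempdb..##StagingTemp') IS NOT NULL
    DROP TABLE ##StagingTemp;

CREATE TABLE ##StagingTemp
(
    Id   int            NOT NULL,
    Name nvarchar(100)  NULL
);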

View 2 Replies View Related

Data Warehousing :: Use New Table With Reduced Structure?

Aug 24, 2015

I have a large fact table spread across tens of partitions (approx. 1 TB each). I found that the business does not need many of the columns in the table, so as an optimization I decided to get rid of these unneeded columns. What is the efficient way to achieve this? Can I simply drop the columns from the table, or should I use a new table with the reduced structure?
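A sketch of the rebuild-into-a-new-table route (all names are placeholders). Dropping the columns in place is also possible, but note that ALTER TABLE ... DROP COLUMN is generally a metadata-only change and the space is not reclaimed until the clustered index is rebuilt.

-- Build the reduced structure on the same partition scheme, copy, then swap names
CREATE TABLE dbo.FactSales_New
(
    DateKey    int   NOT NULL,
    ProductKey int   NOT NULL,
    Amount     money NOT NULL
) ON psFactSales (DateKey);

INSERT INTO dbo.FactSales_New WITH (TABLOCK) (DateKey, ProductKey, Amount)
SELECT DateKey, ProductKey, Amount
FROM   dbo.FactSales;

EXEC sp_rename 'dbo.FactSales', 'FactSales_Old';
EXEC sp_rename 'dbo.FactSales_New', 'FactSales';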

View 2 Replies View Related

Data Warehousing :: Querying In Fact Table

May 2, 2015

I have a fact table with an ID column as the primary key, on which a clustered index is created. I also have 4 dimension FKs of data type INTEGER and, finally, one aggregate measure in the fact table.

Now my question is: how can I improve the speed of querying the fact table by creating any of the index types below? (A sketch follows the list.)

1. XML
2. Spatial
3. Clustered
4. Non-Clustered
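Of the four, a non-clustered index (or, on SQL Server 2012 and later, a columnstore index) is the usual choice for this shape of query; XML and spatial indexes only apply to XML and spatial columns. A hedged sketch with placeholder names:

-- Non-clustered index on the dimension keys, covering the measure
CREATE NONCLUSTERED INDEX IX_Fact_DimKeys
ON dbo.FactTable (DateKey, ProductKey, CustomerKey, GeographyKey)
INCLUDE (MeasureAmount);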

View 2 Replies View Related

Master Data Services :: Table Partitions In MDS?

Apr 13, 2015

Perhaps this task is not for MDS... but we don't have another tool for rapid development and quick start-up. Nevertheless...

We created a table managers_plan in MDS:

year
month
id_manager (domain attribute)
POS (domain attribute)
plan_sum_USD
plan_unit
----------------------------
Entities:
Managers ~ 800 records
POS ~ 100,000 records

Total managers_plan records for one year = 100K x 12 = 1,200,000.

Can table partitions be created for the managers_plan table?

View 3 Replies View Related

Data Warehousing :: Adding A New Column From Table To View

Nov 2, 2015

I have a SQL view with col1, col2, col3. I need to add a new column, col4, to the view, coming from a table in SQL Server.
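A minimal sketch of the usual approach (the source names and join key are placeholders): redefine the view with ALTER VIEW and join in the table that holds col4.

ALTER VIEW dbo.MyView
AS
SELECT s.col1,
       s.col2,
       s.col3,
       t.col4
FROM   dbo.ExistingSource   AS s
JOIN   dbo.NewColumnTable   AS t
       ON t.KeyCol = s.KeyCol;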

View 4 Replies View Related

Data Warehousing :: Hiding Text Box Based On Table Property

Oct 16, 2015

I am putting together an invoice for my company. I have a text box describing each section of the invoice, followed by a table to list out the charges. I am using multiple tables based on what type of charge the client is receiving. 

I would like to hide each section if there are no items purchased of that type. I can do this for the table itself using the expression "=CountRows() < 1", but I do not know how to refer to that table (call it Tablix1 for the sake of discussion) from the text box. I've tried using a ReportItems function as my basis, without success.
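One approach (a sketch; "Tablix1Dataset" is a placeholder for whatever dataset feeds Tablix1): instead of referring to the tablix itself, point the text box's Hidden property at the same dataset by name, so the text box hides under exactly the same condition as the table:

=CountRows("Tablix1Dataset") < 1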

View 2 Replies View Related

Data Warehousing :: Populating Fact Tables With Surrogate Key From Dimension Table?

Sep 11, 2015

How do I correctly populate a fact table with the surrogate key from the dimension table?
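The usual pattern is to load the fact from staging and join to the dimension on the business/natural key to pick up the surrogate key. A sketch with placeholder names:

INSERT INTO dbo.FactSales (DateKey, CustomerKey, Amount)
SELECT s.DateKey,
       d.CustomerKey,          -- surrogate key from the dimension
       s.Amount
FROM   stg.Sales        AS s
JOIN   dbo.DimCustomer  AS d
       ON d.CustomerBusinessKey = s.CustomerBusinessKey;   -- natural key match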

View 4 Replies View Related

How To Speed Up Table Data Transfer Thru Bcp In?

Jan 30, 2008

For bcp in:
1. Use a fixed-length format file or a delimited file?
2. Load into a table without indexes, including the primary key?
3. Sort the text file before bcp in (will it speed up index creation after the data is loaded)?

Which of these points will or will not improve the overall bcp-in processing?

thx...
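For reference, a hedged sketch of a bcp invocation often used for fast loads (the database, table, file, server, and key names are placeholders). Loading into a heap, batching, and the TABLOCK hint are the usual levers, and if the file is pre-sorted on the clustered key the ORDER hint lets the load skip a sort:

bcp MyDb.dbo.TargetTable in datafile.txt -c -t "|" -S myserver -T -b 100000 -h "TABLOCK, ORDER(KeyCol ASC)"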

View 4 Replies View Related

Creating A Stored Procedure That Will Summarize Data In A Table Into A Table Reflecting Period Data Using An Array Type Field

Sep 20, 2007

I am attempting to create a stored procedure that will launch at report runtime to summarize data from a table into a table that reflects period data, using an array-style layout (one column per fiscal month). I know how to execute one statement at a time, but I am not sure how to write the script so that it not only summarizes the data below but also creates and drops the table.

Any help would be greatly appreciated.

Current Table

Project | Task | Category | Fiscal Year | Fiscal Month | Total Hours
---------------------------------------------------------------------------------------------------------
Proj 1 | Task 1 | Cat 1 | 2007 | 01 | 40
Proj 1 | Task 1 | Cat 2 | 2007 | 02 | 20
Proj 1 | Task 1 | Cat 3 | 2007 | 03 | 35
Proj 1 | Task 1 | Cat 1 | 2008 | 01 | 40
Proj 1 | Task 1 | Cat 2 | 2008 | 02 | 40
Proj 1 | Task 1 | Cat 3 | 2008 | 03 | 40

Proposed Table

Project | Task | Category | Fiscal Month 01 | Fiscal Month 02 | Fiscal Month 03 | Fiscal Year
---------------------------------------------------------------------------------------------------------------------------------------------------
Proj 1 | Task 1 | Cat 1 | 40 | 0 | 0 | 2007
Proj 1 | Task 1 | Cat 2 | 0 | 20 | 0 | 2007
Proj 1 | Task 1 | Cat 3 | 0 | 0 | 35 | 2007
Proj 1 | Task 1 | Cat 1 | 40 | 0 | 0 | 2008
Proj 1 | Task 1 | Cat 2 | 0 | 40 | 0 | 2008
Proj 1 | Task 1 | Cat 3 | 0 | 0 | 40 | 2008
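A sketch of how the summarization itself might be written (column names follow the sample data; the source table name and the assumption that fiscal month is stored as a '01'-style string are mine). SELECT ... INTO creates the target table, and the IF/DROP handles the drop-and-recreate part:

IF OBJECT_ID('dbo.ProjectHoursByMonth') IS NOT NULL
    DROP TABLE dbo.ProjectHoursByMonth;

SELECT  Project,
        Task,
        Category,
        SUM(CASE WHEN [Fiscal Month] = '01' THEN [Total Hours] ELSE 0 END) AS [Fiscal Month 01],
        SUM(CASE WHEN [Fiscal Month] = '02' THEN [Total Hours] ELSE 0 END) AS [Fiscal Month 02],
        SUM(CASE WHEN [Fiscal Month] = '03' THEN [Total Hours] ELSE 0 END) AS [Fiscal Month 03],
        [Fiscal Year]
INTO    dbo.ProjectHoursByMonth
FROM    dbo.CurrentTable            -- placeholder for the source table
GROUP BY Project, Task, Category, [Fiscal Year];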

Thanks,
Mike Misera

View 6 Replies View Related

SQL Server 2008 :: Insert Data Into Table Variable But Need To Insert 1 Or 2 Rows Depending On Data

Feb 26, 2015

I am writing a query to return some production data. Basically I need to insert either 1 or 2 rows into a table variable, based on a decision as to whether the production part makes 1 or 2 items (the raw data does not allow for this; it comes from a lookup in my database).

I can retrieve all the source data I need easily, but when I come to insert it into the table variable I need to insert 1 record if it's a single part or 2 records if it's a twin part. I know I could use a cursor, but I'm sure there has to be an easier way!

Below is the code i have at the moment

declare @startdate as datetime
declare @enddate as datetime
declare @Line as Integer
DECLARE @count INT

set @startdate = '2015-01-01'
set @enddate = '2015-01-31'

[Code] .....
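One set-based alternative to a cursor (a sketch; the table names and the ItemsPerPart lookup column are placeholders): join each source row to a one-or-two-row values list and keep as many copies as the part needs.

DECLARE @Output TABLE (PartId int NOT NULL, ItemNo int NOT NULL);

INSERT INTO @Output (PartId, ItemNo)
SELECT  s.PartId,
        n.ItemNo
FROM    dbo.SourceData AS s
CROSS JOIN (VALUES (1), (2)) AS n (ItemNo)
WHERE   n.ItemNo <= s.ItemsPerPart;    -- 1 row for single parts, 2 for twin parts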

View 1 Replies View Related

Creating A Trigger To Insert Values From One Table To Another

Apr 15, 2008

hello everyone,
I have a table named "Employee" with EmpID as the PK. I want to insert EmpIDs one by one into another table named "AssignedComplaints", so that once all the EmpIDs have been inserted into "AssignedComplaints", the next insert operation starts over: the first EmpID is inserted, then the second, and so on.
For example: I have three EmpIDs in the "Employee" table, M1, M2 and M3. First M1 is inserted, then M2, and lastly M3. Now all the EmpIDs are in the "AssignedComplaints" table. If I insert again, the whole process repeats: first M1, then M2, and so on.
I need the query; I have created a trigger and will use this query in that trigger.
thanks
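A sketch of one way to pick the "next" employee in round-robin fashion inside the trigger (column names beyond EmpID are assumptions): choose the employee with the fewest existing assignments, breaking ties by EmpID order, so the cycle M1, M2, M3 repeats as each insert adds one assignment.

SELECT TOP (1) e.EmpID
FROM   dbo.Employee AS e
LEFT JOIN dbo.AssignedComplaints AS ac
       ON ac.EmpID = e.EmpID
GROUP BY e.EmpID
ORDER BY COUNT(ac.EmpID) ASC,   -- fewest assignments first
         e.EmpID ASC;           -- then cycle in EmpID order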

View 11 Replies View Related

Creating A Trigger To Automate Insert Into Another Table

Jan 14, 2008



Hi,

I am facing a problem with creating a trigger to insert into another table.

i have 4 tables namely:
PurchaseOrder
PurchaseOrderItem
DeliveryOrder
DeliveryOrderItem

I want the trigger to create a new row in DeliveryOrder when I create a PurchaseOrder.

I tried the following:


CREATE TRIGGER trgInsertPO
ON PurchaseOrder
FOR INSERT
AS
INSERT INTO DeliveryOrder (DeliveryOrderNo, DeliveryOrderDate, SupplierID, DeliveryOrderStatus)
VALUES (PurchaseOrderNo, PurchaseOrderDate, SupplierID, 'd')
GO


but it doesn't work. Help required.

Thanks.
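For reference, a sketch of the corrected trigger: inside a trigger the new PurchaseOrder rows are only available through the inserted pseudo-table, so the INSERT has to select from it rather than name the columns directly (this also handles multi-row inserts). Column meanings follow the post; whether DeliveryOrderNo should reuse PurchaseOrderNo is the poster's assumption, kept here.

CREATE TRIGGER trgInsertPO
ON PurchaseOrder
FOR INSERT
AS
    INSERT INTO DeliveryOrder (DeliveryOrderNo, DeliveryOrderDate, SupplierID, DeliveryOrderStatus)
    SELECT i.PurchaseOrderNo, i.PurchaseOrderDate, i.SupplierID, 'd'
    FROM   inserted AS i;
GO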

View 6 Replies View Related

Creating A Stored Procedure To Insert Data?

Dec 1, 2005

Hello all,
I am having a lot of trouble with stored procedures. Could anyone help me out?
I have a table which contains a number of meetings. What I want to do is search this table, get all the meetings for today, and put them in a separate table, MeetingsToday.
I can select the values, and I can insert the values.
But how do I store the values so that I can pass the results of the select to the insert?
I'm also having a lot of trouble with storing date values.
Any help would be greatly appreciated.
Regards,
Padraic Hickey
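A minimal sketch of a procedure that does this in one statement (table and column names are placeholders). An INSERT ... SELECT passes the selected rows straight to the insert, so nothing needs to be stored in between, and the half-open date range avoids time-of-day problems:

CREATE PROCEDURE dbo.CopyTodaysMeetings
AS
BEGIN
    DECLARE @today datetime, @tomorrow datetime
    SET @today    = DATEADD(day, DATEDIFF(day, 0, GETDATE()), 0)  -- midnight today
    SET @tomorrow = DATEADD(day, 1, @today)

    INSERT INTO dbo.MeetingsToday (MeetingID, MeetingDate, Subject)
    SELECT MeetingID, MeetingDate, Subject
    FROM   dbo.Meetings
    WHERE  MeetingDate >= @today
      AND  MeetingDate <  @tomorrow
END
GO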

View 2 Replies View Related

Data Warehousing :: Query That Extracts Email Data From A Column

Jun 8, 2015

I have a column in which email data is stored like:

clicuanan@aspenms.com(M)
jteply@mac.com(M)

How do I extract it in the below format?

clicuanan@aspenms.com
jteply@mac.com
tjones@jpmc.com
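A sketch of one way to strip the trailing "(M)"-style suffix (the table and column names are placeholders):

SELECT CASE WHEN CHARINDEX('(', EmailCol) > 0
            THEN LEFT(EmailCol, CHARINDEX('(', EmailCol) - 1)
            ELSE EmailCol
       END AS EmailAddress
FROM   dbo.Contacts;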

View 4 Replies View Related

Help In Creating Insert Statements For Retreiving Data From Database

Apr 14, 2008



Hi all,


Could someone tell me how to get the data from all tables of the database in the form of an INSERT script? We are moving our database from SQL Server 2000 to SQL Server 2005. The scripts for the database, tables, views, procedures and functions have been obtained, and it is only the data that is remaining. Some are small tables with 5 to 6 columns, but there are some with 50-odd columns. A friend of mine told me about a procedure that returns a result set of INSERT statements when you pass it a table name as a parameter. Such a procedure would be of great help.

Thank you

View 5 Replies View Related

Data Warehousing

Aug 15, 2000

I am preparing for the exam; does anybody know of any braindump sites that have data warehousing braindumps?

View 2 Replies View Related

DATA WAREHOUSING

Jun 17, 2000

Hello,

I am new to SQL Server. I want to know about data warehousing and OLAP. Please help me.

View 3 Replies View Related

Data Warehousing

Mar 30, 2001

Can anyone suggest a good resource that outlines data warehouse solutions using SQL 7.0 or higher?

View 1 Replies View Related

Data Warehousing Dev. Help

Apr 5, 2007

Hi,
I need to implement/set up a data warehouse/data mart in one of the departments in my company using SQL Server 2005. Does anybody know the steps I need to follow?

It would be much appreciated if anybody could give some links that will help me with the implementation/development.

I do have the basic idea; however, I may face some difficulties when I start, such as: does SQL Server Reporting Services allow the end user to customize the report based on their needs? So if any of you have experience in this field, please reply.


Thanks
Ajith Nair

View 1 Replies View Related

Data Warehousing :: How To Represent Metadata In A Data Warehouse

Sep 24, 2015

I am working on creating a data warehouse. I have made a database which will be the data warehouse and will consist of dimension and fact tables. I know that, other than dimension and fact tables, a data warehouse should also contain metadata. Now my question is: what should the structure of that metadata be, and what information should it hold?
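There is no single required structure, but one common starting point (a sketch; every name below is illustrative) is an ETL audit/metadata table that records, per load, what ran, when, from where, and how many rows moved:

CREATE TABLE dbo.EtlLoadAudit
(
    LoadAuditID   int IDENTITY(1,1) PRIMARY KEY,
    PackageName   nvarchar(200) NOT NULL,   -- which ETL process ran
    SourceSystem  nvarchar(100) NOT NULL,   -- where the data came from
    TargetTable   nvarchar(200) NOT NULL,   -- which fact/dimension was loaded
    LoadStartTime datetime      NOT NULL,
    LoadEndTime   datetime      NULL,
    RowsInserted  int           NULL,
    RowsUpdated   int           NULL,
    LoadStatus    nvarchar(20)  NOT NULL    -- e.g. Succeeded / Failed
);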

View 2 Replies View Related

Optimizing Insert Speed

Aug 9, 1999

I want to know how I can speed up inserting rows. Will stored procedures help at all? Any ideas are welcome. Thanks.

View 1 Replies View Related

How Can I Speed Up My Multiple INSERT's?

Feb 7, 2005

I have a data gathering application written in MSVC++ 6 that uses ADO to insert large amounts of data into a table. Currently I have a stored procedure that inserts a single row at a time, and I call it every time I have more data to insert. However, this can often fully load SQL Server - I often have tens or hundreds of inserts a second for short periods, and load goes up to 100%...

Does anyone know a way of making the inserts more efficient without resorting to dynamic SQL?

For example, is there a way of batching up these inserts, such as passing 10 at a time to the SP and inserting them all at once with an "insert into <table> select ..."?

Or would modifying my C++ and wrapping a block of inserts to the single-insert SP in a transaction help?

A colleague suggested writing the data to a temporary text file and then using bulk insert at regular intervals, but that would involve writing a file management system as well and seems to be a bit of a hack!

Any help much appreciated.

View 14 Replies View Related

Table Partitions & RAID 5.

Aug 28, 2007

Hi experts,

We have a huge table with around 250 million records and have implemented SQL Server 2005's new table partitioning feature. The data now seems to be evenly spread across 20 different filegroups (each approx. 5 GB) for the same table, which previously occupied 100 GB in the PRIMARY filegroup.

Still, the query response times have not come down drastically, but we can see a good improvement in the execution plans now.

We are using RAID 5 in our production environment. Any ideas / thoughts on how to place the partitioned filegroups and the log files on RAID 5? (By the way, I'm very new to RAID concepts; any detailed instruction would be helpful.)

Any help would be greatly appreciated.

Thanks,

Hariarul

View 8 Replies View Related

Better Table Management (partitions?)

Oct 31, 2006

Hi,

For my work I am now learning SQL Server 2005, and I have been given a database, set up by someone else, to work with. It is my job to get the database ready for use in reports.

My problem is that the current database has one huge table with almost 8 GB of data. The table contains data from 2004 to present (and growing) from 14 different countries. The reports we use are mostly per country, but we also want to compare the 14 countries to each other for, say, the whole of 2006. At the moment the table is stored in one single file instead of using partitions.

I believe partitions can give a good performance boost when running the queries. But how do I do this? Currently the country codes are just plain text; can they be used for partitions?

Any advice would be welcome,

Thanks!

View 5 Replies View Related

Data Warehousing Resources

Sep 20, 2002

Hi:
I am planning to take the Microsoft 70-019 data warehousing exam. I am not able to find any good resources to prepare for it. Please help!

Thanks in advance

View 1 Replies View Related






