Webinar On High-perf Extracts From Oracle, DB2, Teradata, Etc.

Jul 12, 2006

This webcast from our partner, ETI, may be of interest to readers on the forum - we often see questions about high-performance interaction with other databases.

Donald

View 4 Replies



High Value For The Pages/sec Perf Counter

Feb 26, 2008

Hello All,

I have a 64-bit SQL Server running on a 64-bit OS with 12 GB of RAM. The server only hosts our application database (a 30 GB database). There are no major processes running on it. Despite this, the average Pages/sec counter shows 1500 and
the average Page Faults/sec shows 22000. Is there any specific reason why this is happening? The % Idle Time of the disk is 87%, and there is no major load on the server. Let me know if you need any other input.

Thanks in Advance,
Mitesh
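A quick way to see whether SQL Server itself is driving the paging is to query its own memory counters, since Memory\Pages/sec counts hard faults for the whole box. A minimal sketch using sys.dm_os_performance_counters (SQL Server 2005 and later; counter names as commonly documented - verify on your build):

-- Compare SQL Server's memory counters with the OS-level paging numbers.
-- These are the same counters Perfmon exposes.
SELECT [object_name], counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN (
    'Page life expectancy',      -- low values point to buffer pool pressure
    'Total Server Memory (KB)',  -- what SQL Server currently holds
    'Target Server Memory (KB)'  -- what it wants
);

If Total stays well below Target while Pages/sec is high, something outside SQL Server is likely consuming the memory.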

View 2 Replies View Related

Data Extracts From Another SQL Database

May 22, 2008

We have an engineering application that was not designed for SQL use. Basically they have a bunch of tables whose values are comma-delimited lists, and they created views off these crappy tables with a "key" column in the view. I'm sure you've already guessed that this sucks for writing reports and things of that nature, so I decided to create my own database and build extracts.
I know how to get new data into the tables (one example is outlined below), but the part I'm not sure how to do is detect when a value has changed. They don't have timestamps tracking an update, otherwise I'd just check against that, and I'd rather just update the data that needs it than do a complete wipe of the data. Some of the data (such as sales orders) the employees requested to be extracted live, but I was able to negotiate them down to every 15 minutes.
Here's what my table looks like for customers:

CREATE TABLE Intranet.dbo.FS_Customer (
CustomerId int NOT NULL,
CustomerNumber nvarchar(16),
CustomerName nvarchar(100),
BAddress1 nvarchar(100),
BAddress2 nvarchar(100),
BCity nvarchar(50),
BState nvarchar(10),
BZip nvarchar(15),
SAddress1 nvarchar(100),
SAddress2 nvarchar(100),
SCity nvarchar(50),
SState nvarchar(10),
SZip nvarchar(15),
Phone nvarchar(30),
Fax nvarchar(30)
)

And this is how I'm inserting new data:
INSERT INTO Intranet.dbo.FS_Customer
SELECT
c.CustomerKey,
c.CustomerID,
c.CustomerName,
c.BillToAddress1,
c.BillToAddress2,
c.BillToCity,
c.BillToState,
c.BillToZip,
c.CustomerAddress1,
c.CustomerAddress2,
c.CustomerCity,
c.CustomerState,
c.CustomerZip,
c.CustomerContactPhone,
c.CustomerContactFax
FROM
FSDBKL.dbo.FS_Customer AS c
WHERE c.CustomerKey NOT IN (SELECT cu.CustomerId FROM intranet.dbo.FS_Customer AS cu)
 
FSDBKL.dbo.FS_Customer is the old table, and intranet.dbo.FS_Customer is the new one I created.
If I have to I can update each record, but right now the customers table alone has 9,000 records and I think doing an update on each record will be time consuming.
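Absent timestamps, one common pattern is to update only the rows whose values differ, using a join on the key plus a checksum comparison. A hedged sketch against the tables above - BINARY_CHECKSUM can collide, so a full column-by-column comparison is the safer variant:

-- Refresh rows whose source values have changed since the last extract.
-- Extend the BINARY_CHECKSUM column lists to cover every column you compare.
UPDATE tgt
SET CustomerName = c.CustomerName,
    BAddress1    = c.BillToAddress1,
    BCity        = c.BillToCity,
    Phone        = c.CustomerContactPhone
FROM Intranet.dbo.FS_Customer AS tgt
JOIN FSDBKL.dbo.FS_Customer AS c
  ON c.CustomerKey = tgt.CustomerId
WHERE BINARY_CHECKSUM(c.CustomerName, c.BillToAddress1, c.BillToCity, c.CustomerContactPhone)
   <> BINARY_CHECKSUM(tgt.CustomerName, tgt.BAddress1, tgt.BCity, tgt.Phone);

At 9,000 customers, a set-based update like this should take seconds, so there is no need to touch rows one at a time.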

View 4 Replies View Related

Reporting Services And Huge Data Extracts Causes IIS To Use A Lot Of Memory

Dec 30, 2007

Hi.

I am working on a serial tracking application using SQL Server 2005 and .NET. One of the requirements is an ad-hoc file export utility in which users can drag-and-drop fields from a set of tables and export the results to CSV. It all sounds fine, and SQL Server Reporting Services' Report Builder seems to be just the right tool for it, but there is one problem:
The report size is big, about 7K-8K pages and 4-5 columns wide; while rendering the report, IIS memory usage shoots up to about 2 GB and stays at about 2 GB.
Any idea if something can be done to mitigate this problem? Note that I don't need the HTML rendering at all. All I need at the end of the day is the CSV, while users are able to choose columns in an ad-hoc manner.
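If the HTML rendering is not needed at all, one workaround is to bypass the interactive renderer and request the CSV directly through Reporting Services URL access (server and report path below are placeholders):

http://yourserver/ReportServer?/YourFolder/YourReport&rs:Format=CSV&rs:Command=Render

This may not eliminate the memory spike entirely, since the report still renders server-side, but it removes the per-page HTML pagination work.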

View 7 Replies View Related

High Availability To High Protection Without Reconfiguring Mirroring

Apr 23, 2007

Hi,

Is there a way to configure mirroring to go from High Availability to High Protection without having to reconfigure Database Mirroring? Using the interface in Management Studio, I can change the configuration option to High Performance, but not to High Protection, despite both High Availability and High Protection being synchronous.

If not, what are the recommended steps to reconfigure the mirror once it has already been set up? Is it just like initially setting up the mirror, or are there any shortcuts I could take? If I stop the mirroring and remove the witness, will the High Protection option be available?

Thanks,
J.
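For reference, the safety level can also be toggled in T-SQL without tearing down the session; a sketch of the High Availability to High Protection path (database name is a placeholder):

-- High Protection = synchronous mirroring, no witness.
ALTER DATABASE YourDatabase SET WITNESS OFF;          -- drop the witness first
ALTER DATABASE YourDatabase SET PARTNER SAFETY FULL;  -- ensure synchronous mode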

View 3 Replies View Related

High Safety Changed To High Performance After Failover?

Mar 6, 2008

Hi There

I realise this is a stupid question, but I cannot really find any confirmation of this in BOL.


If you are running High Safety with automatic failover, when failover occurs does this automatically change to High Performance mode? Since for failover to occur something has happened to the primary, it will be impossible to commit transactions on the new primary and mirror synchronously, as one of them is no longer available.


So am I correct in assuming that automatic failover also automatically changes the mode to High Performance for that session?

Thanx
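Either partner will report the mode currently in effect; a small check query:

-- FULL = synchronous (High Safety), OFF = asynchronous (High Performance)
SELECT DB_NAME(database_id) AS db,
       mirroring_safety_level_desc,
       mirroring_state_desc
FROM sys.database_mirroring
WHERE mirroring_guid IS NOT NULL;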

View 4 Replies View Related

Perf Mon

Mar 15, 1999

On an installation of SQL 6.5, when I go to Performance Monitor, there are no SQL counters. What gives? How can this be fixed, and do I have to reinstall (wiping out data) to do it?

View 2 Replies View Related

SQL Perf Mon

Nov 9, 1998

The User Defined Counters object has disappeared from my Performance Monitor.
Any ideas why, or how to get it back? Thanks, Mike

View 1 Replies View Related

Perf Monitor

Feb 24, 2004

Is there any way to automate Performance Monitor from the command line, i.e. feed it some parameters like server and counters and receive a log file at the end?

Cheers
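typeperf (included with Windows XP/Server 2003 and later) does roughly this; a sketch, with the server, counter, and output file as placeholders:

typeperf "\\MYSERVER\SQLServer:Buffer Manager\Buffer cache hit ratio" -si 15 -sc 240 -o sqlperf.csv

That samples every 15 seconds, 240 times, and writes a CSV log; -cf can point to a text file of counters instead of listing them inline.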

View 5 Replies View Related

SQL Network Perf Issues

Mar 19, 2003

I have a SQL 2000 db running a vendor's custom app on a Win2000 SP3 server. The front-end is run on Citrix to the client. I can't change the app or the client's somewhat poor infrastructure. Users are starting to complain about slowness (we can shadow them and this seems to be the case). On the SQL box all the standard perf counters look good (buffer cache at 100%, queue length always less than 1, memory and disk look good). The server currently has 2 NICs doing load balancing. The network counter Bytes/sec averages over 100,000. The network counter Output Queue Length shows an average of 4,294,967,251 (that seems high, but according to another SQL perf site this counter doesn't always work). Any other ideas where I might look? Do these numbers look high?
Thanks

View 1 Replies View Related

Perf Report Query

Sep 22, 2005

CREATE TABLE [Perf] (
[TransId] INTEGER,
[FileNo] VARCHAR(80),
[TimeInSeconds] INTEGER,
[FileSizeMB] INTEGER,
[FileName] VARCHAR(255),
[StartDate] datetime
)

OK!! Here's the problem. I am working on a perf stats report. FileType is the first 9 characters of the FileName field. I need to compare a similar filename from this month to last month or before, ordered by file size. Maybe this is a very simple query, but currently my mind refuses to work. Seeking F1.
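One possible shape for the comparison, treating LEFT(FileName, 9) as the file type and bucketing runs by month; a sketch only, with the semantics assumed from the DDL above:

-- Pair each of this month's runs with earlier runs of the same file type,
-- ordered by file size.
SELECT LEFT(cur.[FileName], 9) AS FileType,
       cur.[FileSizeMB],
       cur.[TimeInSeconds]  AS CurrentSecs,
       prev.[TimeInSeconds] AS PriorSecs,
       prev.[StartDate]     AS PriorRun
FROM [Perf] AS cur
JOIN [Perf] AS prev
  ON LEFT(prev.[FileName], 9) = LEFT(cur.[FileName], 9)
 AND prev.[StartDate] < DATEADD(month, DATEDIFF(month, 0, cur.[StartDate]), 0)
WHERE cur.[StartDate] >= DATEADD(month, DATEDIFF(month, 0, GETDATE()), 0)
ORDER BY cur.[FileSizeMB] DESC;

The DATEADD/DATEDIFF pair truncates a datetime to the first of its month.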

View 3 Replies View Related

Data Warehousing :: Query That Extracts Email Data From A Column

Jun 8, 2015

I have a column in which email data is available like

clicuanan@aspenms.com(M)
jteply@mac.com(M)

How do I extract it in the below format?

clicuanan@aspenms.com
jteply@mac.com
tjones@jpmc.com
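Assuming the unwanted suffix always begins at the first '(' , as in the '(M)' samples, a minimal sketch (table and column names are placeholders):

-- Appending '(' to the value keeps CHARINDEX safe for rows with no suffix.
SELECT LEFT(EmailAddress, CHARINDEX('(', EmailAddress + '(') - 1) AS CleanEmail
FROM dbo.YourTable;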

View 4 Replies View Related

MSDE And Missing Perf Counters

May 15, 2002

The SQL performance counters do not get installed with MSDE. Does anyone know how to install them?
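The usual fix for SQL Server 2000-era counters is to reload them with the Windows lodctr utility against the counter .ini in the instance's Binn directory; a sketch (the path is a placeholder - check your install):

lodctr "C:\Program Files\Microsoft SQL Server\MSSQL\Binn\sqlctr.ini"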

View 2 Replies View Related

Sql 2000 Perf Monitoring Books

Mar 23, 2004

Any recommendations from DBAs on good books for performance monitoring for SQL 2000?

View 2 Replies View Related

Perf Tuning Question - Gurus Only

Feb 19, 2004

Howdy

I have a server that has the following average readings :

No. CPUs = 1
% CPU = 2
System\CPU Queue Length = 2 to 4
SQL Server:Buffer Manager\Buffer Cache = 99.85%
RAM in the box = 1 GB
Memory\Pages/sec = 1 to 5
SQL memory in use (using Task Manager) = 250 MB
Max worker threads = 255
Average number of connections = 60

So........... all indicators are that the CPU is idling and there is more than enough RAM, but we still (in theory) have a congested CPU, as the queue length is consistently over 2. Thing is, I need to work out whether the CPU isn't working hard even though the queue is long, or whether we can put extra databases/load on the box.

As the max worker threads are greater than the number of connections (255 vs 60), we could reduce them, as the number of users doesn't seem to alter much. BUT....... would this make much difference? If the 255-60=195 idle worker threads aren't doing anything much, they shouldn't put any load on the server, right?

Any thoughts much appreciated.

Cheers,

SG.

View 13 Replies View Related

Perf On Read Uncommitted Isolation Level

Jul 5, 2005

Are there really any benefits to using the Read Uncommitted isolation level, or to NOLOCK hints on retrieval queries, when the default isolation level is just Read Committed (not using COM+)? I'm confused about why Community Server uses this technique, perhaps for perf reasons, but I couldn't see any reason why...
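For reference, the two forms being weighed are a per-query hint versus a per-session setting; a small sketch (the table name is hypothetical):

-- Per-query dirty read:
SELECT COUNT(*) FROM dbo.Posts WITH (NOLOCK);

-- Per-session equivalent:
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
SELECT COUNT(*) FROM dbo.Posts;

Both skip shared locks on reads, trading consistency (dirty reads, occasionally double-counted or missed rows) for less blocking under concurrent writes - which is generally the perf motivation.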

View 1 Replies View Related

SSIS Perf Tuning - Tables Of 15M+ Rows

Feb 28, 2008

The challenge: I have to extract and convert data between 2 SQL Server systems - only 4 tables on the source system, 8 tables on the target system. Source tables have between 5,000 rows and 16,000,000 rows. For most of the tables (for example Customer, which goes into 4 target tables), there will be 1 row in the target tables for each row in the mapped source system table - so my 13.5M customer rows will end up as around 40M rows across the 4 target tables. So far, so good. But this is a 24x7 online retail web-site, and to get the data across as a clean process, we require the smallest possible duration.

I have progressed on the customer migration, and am testing on a test environment (2xdual core HT processors, 4 GB ram) which was 2.15 million rows. Live environment is likely to be a 4xdual core with 8-16 GB ram.

I am trying to optimize the extract data flow, and have read the SSIS perf tuning doc. I am now trying to put that into practice.
I have a row size of approx 340 bytes, so based on that and my test environment of 2.15 million rows, I work out around 700 MB of RAM required to buffer the data. That is a factor of 7 greater than the max buffer space for a data flow of 100 MB, which, it seems, means I should divide the base MaxBufferRows (10000) by 7 to go down to 1400 rows?

I see a LOT of the following messages in my progress, when running with default settings:
[DTS.Pipeline] Information: The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 30 buffers were considered and 30 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked.

The design of the data flow at the moment is:


..........................................|--target table 1
SOURCE SP ---- MULTICAST---|--target table 2
..........................................|--target table 3
..........................................|--target table 4

any thoughts on Buffer tweaking, corrections to my assumption and other hints/techniques?


*##* *##* *##* *##*
Chaos, Disorder and Panic ... my work is done here!

View 7 Replies View Related

2005 Perf Much Worse Than 2000... Suggestions Please..

Nov 20, 2006

I have an SP that takes several varchar columns and concatenates them all together, then inserts the result into a text field. I do this with a cursor, which was the quickest way to get it done when it was set up...

However, when I moved the process to a 2005 server (on the same physical server) the process drastically slowed down. On 2000 the process took about 7 min to handle all 350k+ rows, with the processors hanging around 20-40%... On 2005 it took over 30 min (not sure how long it would take because I killed the process) and the processors stay above 98%...

I have rewritten the process to use a while loop instead of the cursor (I wanted to do this anyway) and it had no effect. At this rate (about 1 row a second) it will take forever, and this process runs every day.

Any ideas??

Here is the procedure...

declare @srch_field varchar(8000)
declare @row int, @productid varchar(25)
declare @title varchar(150), @actors_keyname varchar(1200), @directors_name varchar(400)
declare @genres varchar(700), @theme varchar(1500), @type varchar(1500), @studio_desc varchar(100)
declare @media_format varchar(50), @artist_name varchar(100), @dev_name varchar(100)
declare @flags varchar(256), @starring varchar(256), @esrb varchar(100), @esrb_desc varchar(500)
declare @ptrval varbinary(16), @text varchar(max)
declare @productlist table (product_id varchar(25), IDNUM int identity)

-- build the work list of products to process
insert into @productlist (product_id)
select product_id
from music_load..globalsearch

select @row = @@rowcount

-- walk the work list backwards, one product per iteration
while @row > 0
begin
    select @productid = product_id
    from @productlist
    where idnum = @row

    -- fetch all searchable attributes for this product
    select @title = rtrim(title),
        @actors_keyname = actors_keyname,
        @directors_name = directors_name,
        @genres = genres,
        @theme = theme,
        @type = type,
        @studio_desc = studio_desc,
        @media_format = media_format,
        @artist_name = artist_name,
        @dev_name = dev_name,
        @flags = flags,
        @starring = starring,
        @esrb = esrb,
        @esrb_desc = esrb_desc
    from globalsearch
    where product_id = @productid

    -- concatenate the non-null attributes, separated by ' ~ '
    set @srch_field = isnull(@title, '')
    if @actors_keyname is not null and @actors_keyname <> 'unknown'
        set @srch_field = @srch_field + ' ~ ' + rtrim(@actors_keyname)
    if @directors_name is not null and @directors_name <> 'unknown'
        set @srch_field = @srch_field + ' ~ ' + rtrim(@directors_name)
    if @genres is not null
        set @srch_field = @srch_field + ' ~ ' + ltrim(rtrim(replace(@genres, 0, '')))
    if @theme is not null
        set @srch_field = @srch_field + ' ~ ' + ltrim(rtrim(replace(@theme, 0, '')))
    if @type is not null
        set @srch_field = @srch_field + ' ~ ' + ltrim(rtrim(replace(@type, 0, '')))
    if @studio_desc is not null
        set @srch_field = @srch_field + ' ~ ' + rtrim(@studio_desc)
    if @media_format is not null
        set @srch_field = @srch_field + ' ~ ' + rtrim(@media_format)
    if @artist_name is not null
        set @srch_field = @srch_field + ' ~ ' + rtrim(@artist_name)
    if @dev_name is not null
        set @srch_field = @srch_field + ' ~ ' + rtrim(@dev_name)
    if @flags is not null
        set @srch_field = @srch_field + ' ~ ' + rtrim(@flags)
    if @starring is not null
        set @srch_field = @srch_field + ' ~ ' + rtrim(@starring)
    if @esrb is not null
        set @srch_field = @srch_field + ' ~ ' + rtrim(@esrb)
    if @esrb_desc is not null
        set @srch_field = @srch_field + ' ~ ' + rtrim(@esrb_desc)

    update globalsearch
    set srch_field = @srch_field
    where product_id = @productid

    -- append the credits text column via UPDATETEXT
    select @ptrval = TEXTPTR(srch_field), @text = credits
    from globalsearch
    where product_id = @productid

    UPDATETEXT globalsearch.srch_field @ptrval NULL NULL @text

    -- append the track text column via UPDATETEXT
    select @ptrval = TEXTPTR(srch_field), @text = track
    from globalsearch
    where product_id = @productid

    UPDATETEXT globalsearch.srch_field @ptrval NULL NULL @text

    set @row = @row - 1
end
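A set-based rewrite would likely help far more than swapping the cursor for a while loop: on 2005 the srch_field column could be converted to varchar(max), which removes the TEXTPTR/UPDATETEXT work and lets the whole job collapse into one UPDATE. A hedged sketch of the pattern, abbreviated to a few columns (the rest follow the same CASE/ISNULL shape):

-- Assumes srch_field has been altered to varchar(max).
UPDATE g
SET srch_field =
      ISNULL(RTRIM(g.title), '')
    + CASE WHEN g.actors_keyname IS NOT NULL AND g.actors_keyname <> 'unknown'
           THEN ' ~ ' + RTRIM(g.actors_keyname) ELSE '' END
    + CASE WHEN g.genres IS NOT NULL
           THEN ' ~ ' + LTRIM(RTRIM(REPLACE(g.genres, '0', ''))) ELSE '' END
    -- ...repeat the CASE pattern for the remaining attribute columns...
    + ISNULL(' ~ ' + CAST(g.credits AS varchar(max)), '')
    + ISNULL(' ~ ' + CAST(g.track AS varchar(max)), '')
FROM globalsearch AS g;

One statement over 350k rows avoids 350k singleton updates plus 700k UPDATETEXT calls, which is where the time is going.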



View 5 Replies View Related

Loading Data Warehouse (Perf. Tuning)

Feb 25, 2008



Hi all,

I'm loading my data warehouse using several SCDs. Some of these SCDs need to occur in sequence, while others can be run at the same time. I'm wondering what the best option for me is in terms of performance. Here is what I was considering:

1) Create a single package. Create two sequence containers --- one that will contain SCD loads that occur in sequence; the other sequence container contains SCD loads that occur in parallel.

OR

2) Create a set of packages for each SCD load. Then create a "Master" package that will use "Execute Package Task" components to call these packages.

The other reason I want to bring up these different ways to design a DW load is that the second option is a "cleaner", more organizational approach to the load. The first option can get quite messy and large if you have several SCDs and several sequence containers. However, I'm looking for the fastest performance. Any thoughts?

View 7 Replies View Related

2005 Developer - Perf Monitor Counter Check Failed Message

Nov 11, 2006

Trying to install 2005 Dev edition on XP Pro, SP2.

The install fails with a "Performance Monitor Counter Check Failed" error.
I had to stop the install twice at the point where it verified which components were to be included, because the docs were not going to be installed; I resolved that issue.

Now it won't pass the system config test because the registry is not the way it wants it.
I looked up the messages and the only solution offered is to hack the registry and risk my system.

This is nuts.

Isn't there some way to restart the install cleanly without hacking registry keys?

Help.
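On XP SP2 there is a documented, less invasive option than hand-editing: lodctr can rebuild the performance counter registry entries from the system's backup store, which is usually enough to satisfy the installer's check. A sketch:

lodctr /R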

View 3 Replies View Related

OLE DB Provider For Teradata

Jun 19, 2008

Hi,

I am trying to query Teradata tables from SQL Server 2005 using SSIS, to populate a table in SS 2005. I have been able to test the connection successfully. I chose the Query option because the tables in Teradata are over 100 million records and I don't want to copy those tables locally.

I am able to enter my query and parse it. I can even preview the query and see the results. A very simple query:

select distinct acc_yr
from d4574fdp.clm_dim

But when I try and execute the query, I get the following error message:

quote:
TITLE: SQL Server Import and Export Wizard
------------------------------

Could not connect source component.

Error 0xc0202009: Source - Query [1]: An OLE DB error has occurred. Error code: 0x80040E21.
An OLE DB record is available. Source: "OLE DB Provider for Teradata" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".


------------------------------
ADDITIONAL INFORMATION:

Exception from HRESULT: 0xC0202009 (Microsoft.SqlServer.DTSPipelineWrap)

------------------------------
BUTTONS:

OK
------------------------------


I have the latest OLE DB Provider for Teradata installed. Is there some security setting that needs to be set? Any help would be greatly appreciated.

Thank You.
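A workaround others have used for this provider error is to route the statement through a linked server with OPENQUERY, so it executes on Teradata instead of inside the SSIS OLE DB source; a sketch (the linked server name is a placeholder):

SELECT *
FROM OPENQUERY(TD_LINKED, 'select distinct acc_yr from d4574fdp.clm_dim');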

View 2 Replies View Related

SSIS And Teradata

Nov 30, 2006

Does anyone know if/how SSIS works with Teradata?

View 1 Replies View Related

SSRS And Teradata

Oct 13, 2006

Is anyone out there using SSRS with a Teradata data source?

R

View 22 Replies View Related

Teradata Datapump Issue

Nov 8, 2007

Whether I'm trying to access a Data Pump task in an imported package in SSIS, or in 2000 Enterprise Manager, they crash once created. I cannot alter or redo mappings or anything else.
After Enterprise Manager churns for 5-10 min it crashes.

SSIS partially crashes with "Attempted to read or write protected memory. This is often an indication that other memory is corrupt." After this I can no longer click on anything and have to end task on Visual Studio.

I can do a disconnected edit on the package in Enterprise Manager and alter the statement, but as far as I know you can't really change much in the way of mappings from there.
This also means that my 2000 packages have been limited to one Teradata pump per package.

This happened on my laptop, on 2 coworkers' desktops, and now that I've migrated to a new high-end developer box it's happening on it also.

Any feedback/ideas?

View 1 Replies View Related

Teradata Data Source

Jan 14, 2008

I'm trying to convert old DTS packages over to SSIS and also create new SSIS packages that use Teradata as a warehouse source for many different departments.

So far I've tried what I can find here on the forums: using special syntax and wrapping the SQL command in an alias/variable name.

Methods I've tried:
Using an ODBC connection string that works just fine in VB for pulling data - the test connection succeeds, but when I select the DataReader source I get:

Error at Data Flow Task [DataReader Source[185]]: Cannot acquire a managed connection from the run-time connection manager

Using an OLE DB connection, where I get the standard error many have gotten; I also tried the special syntax and wrapping of the SQL command in an alias/variable name listed on MSDN:
Error at Data Flow Task [Source - Testing]: An OLE DB error has occurred. Error code: 0x80040E21. An OLE DB record is available. Source: "OLE DB Provider for Teradata" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
Currently I'm testing this on a very simple view that produces 2 columns and 173 rows of string data.
I have not tried the .NET provider, and am reluctant to accept it as the only solution even if it works, as it would involve permissions issues and red tape here at the bank, and it is a less user-friendly solution for my dev team.

Any ideas?

Driver version on my machine and likely the other 5 that would need this to work:
3.03.00.01 12/17/2003

View 1 Replies View Related

SSIS - Teradata Connectivity

Nov 21, 2006

Hi,

How do I connect from SSIS to a Teradata database? Are there any OLE DB providers available for evaluation?

If anyone can share their experiences with this, it would be greatly helpful.

Thanks,

S Suresh

View 1 Replies View Related

IA64 Teradata OLE-DB Or ODBC

Sep 5, 2006

I have a 32-bit development/playbox SQL 2005 SSIS box that can see and run the 32-bit OLE DB drivers from Teradata. Granted, I have the same issues as others, where a table or a view works but not a query or a variable query. The alternative is to use a linked server and OPENQUERY.

But our real development box is a 64-bit IA64 running 64-bit SQL 2005. None of the Teradata.com drivers work or show up in the dropdown box of the Connection Manager, since they are 32-bit. One option is to use the 64-bit Teradata ODBC drivers and then use the .NET ODBC connection instead. Linked server doesn't work either, because the SQL 2005 is 64-bit and it is looking for 64-bit drivers.

Does anyone have any experience with 64-bit SQL 2005 and Teradata?

I looked at other 3rd-party vendors such as ETI, but their documentation excludes IA64 (Itanium).

I'm planning to try again using the 64-bit ODBC driver and the .NET adapter, but I'm interested to hear if someone (MSFT & NCR) has a real solution for 64-bit.

Thanks

Anatole

View 4 Replies View Related

Passing Native SQL To Teradata

Aug 7, 2007

I want to add a step to an SSIS package to pass native Teradata SQL statements through so that they can be executed on Teradata. Does anyone know if I can use an "Execute SQL Task" to accomplish this? It should be very similar to passing native Oracle SQL to an Oracle server.
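If a linked server to Teradata is available, EXEC ... AT is one pass-through route from T-SQL: the quoted string is shipped to the remote server verbatim. A sketch (linked server and object names are placeholders; requires RPC Out enabled on the linked server):

EXEC ('COLLECT STATISTICS ON mydb.mytable COLUMN (acct_id);') AT TD_LINKED;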

View 6 Replies View Related

Connectivity To Teradata Through Linked Server

Feb 22, 2005

I've been informed that our data feed systems are going to be shut down and that our data will reside in Teradata V2R5 databases. I was wondering if anyone knows whether you can get connectivity to a Teradata server via a linked server in the Security folder of SQL Server Enterprise Manager? If not: is it possible through a DTS package? Thanks for the info...
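A linked server can be defined over the Teradata ODBC driver via the MSDASQL provider; a sketch (the DSN and server name are placeholders):

EXEC sp_addlinkedserver
    @server = 'TD_LINKED',
    @srvproduct = 'Teradata',
    @provider = 'MSDASQL',
    @datasrc = 'TeradataDSN';  -- a system ODBC DSN pointing at the Teradata box

Queries then go through OPENQUERY against that name, and the same object is reachable from DTS.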

View 2 Replies View Related

SQLServer To Teradata Data Load Via DTS

Feb 24, 2004

I am extracting data from a SQL Server database to load into Teradata using DTS. The performance is abysmal. The same data in a text file loads quickly via MultiLoad. I can move the data to other DBMSs via DTS quickly as well. Is there some way for me to improve the elapsed time/performance while using DTS? If not, what is the best way to move data from SQL Server into Teradata?
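Since the flat-file path is already fast, one pragmatic route is to let bcp produce the flat file and keep MultiLoad for the load; a sketch (all names are placeholders):

bcp "SELECT * FROM MyDb.dbo.MyTable" queryout C:\extracts\mytable.txt -c -t "|" -S MYSQLSERVER -T

That exports in character mode with a pipe delimiter, which MultiLoad can consume directly.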

View 1 Replies View Related

Teradata Database - WHERE Clause In EXCEL

Jul 29, 2013

Still having a problem with my SQL WHERE clause…

I have three search inputs on an Excel form that connects to a Teradata database. The first is a date range (DateworkedF and DateworkedT); the other two are text fields (StatusX and ErrorTypeX).

I want to be able to search on any or all of these fields. (If the field is blank return all values)

Query = "SEL SRN_ACCT_NUM, QUEUE_NAME, ERROR_TYPE, SUB_ERROR_TYPE, DATE_WORKED, MONTH_WORKED, DATE_APPLICATION_RECEIVED, ASSOC_WORKED, ACCT_ID, STATUS, COMMENTS, REVIEWED_IND, REVIEWED_AGENT, LOAD_DT " & _
"FROM UD402.JD_MCP_MASTER WHERE " & _
"(DATE_WORKED >= #" & DateworkedF & "# Or #" & DateworkedF & "# IS NULL)" & _
"AND (DATE_WORKED <= #" & DateworkedT & "# Or #" & DateworkedT & "# IS NULL)" & _
"AND (STATUS = '" & StatusX & "' OR '" & StatusX & "' IS NULL)" & _
"AND (ERROR_TYPE = '" & ErrorTypeX & "' or '" & ErrorTypeX & "' IS NULL);"

View 5 Replies View Related

Linked Server 2005 SQL Sent To Teradata

Sep 17, 2007

I am trying to set up a linked server from SQL 2005 to Teradata V2R5, and have got most of the way there. My problem is that both methods (OPENQUERY and SELECT with a 4-part naming string) seem to submit only the SELECT * part to Teradata, with the rest of the SQL waiting until the data is all pulled back to SQL Server. This is a problem when working with big tables! It puts a huge strain on the network, and also runs me out of temp space in Teradata. Is there a way to get ALL the SQL passed to Teradata? I am looking to create a Report Model for Report Builder or ProClarity. I have a dummy SQL Server 2005 database set up with views of the Teradata tables through the linked server. Any help would be greatly appreciated!
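One distinction worth checking: with 4-part naming, SQL Server's optimizer decides what to remote and often pulls the table first, but the text inside OPENQUERY is sent to Teradata verbatim, so a WHERE clause placed inside the quoted string is guaranteed to run remotely. A sketch (names are placeholders):

-- The entire quoted statement executes on Teradata;
-- only the filtered rows cross the wire.
SELECT *
FROM OPENQUERY(TD_LINKED,
    'SELECT acct_id, bal_amt FROM proddb.acct WHERE bal_amt > 100000');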

View 1 Replies View Related

Teradata Data Extension Question

Jun 16, 2006

I recently installed the Teradata .NET data provider and I am trying to use named parameters and multi-value parameters in my SQL. I am finding information on this topic hard to come by.

In order to get parameters to work, do I need to write my own Teradata data extension from scratch, expand an existing data extension to work with Teradata, or can I simply edit my .config files to point to an existing generic extension wrapper?
 
Thanks!

View 3 Replies View Related






