DTS/Async Stored Procedure/Import Huge Data

Jul 20, 2005

I have a table which contains approx. 300,000 records. I need to
import this data into another table by executing a stored procedure.
This stored procedure accepts the values from the table as parameters. My
current solution reads the table in a cursor and executes the
stored procedure once per row. This takes far too long: approx. 5-6 hrs. I need to
make it better.

Can anyone help?

Samir
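
One way out of the row-by-row pattern is to fold the procedure's logic into a single set-based statement. A minimal sketch, assuming the procedure ultimately just transforms and inserts each row; all table, column, and function choices here are placeholders, not Samir's actual schema:

INSERT INTO dbo.TargetTable (ColA, ColB)
SELECT s.ColA, UPPER(s.ColB)   -- stand-in for whatever per-row logic the procedure applies
FROM dbo.SourceTable AS s;

If the procedure's logic cannot be inlined, wrapping the cursor loop in explicit transactions of a few thousand rows each usually still cuts the runtime substantially.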

View 2 Replies



How To Optimize Data Import With Huge Volumes And Joins Across Data Sources Not All SQL Server Based?

Jun 7, 2006

I need to periodically import a (HUGE) table of data from an external data source (not SQL Server) into SQL Server, with the following scenarios:
1. Some of the records in the external data source may not exist in SQL Server.
2. Some of the records in the external data source may have a different value at different imports, but these records are identified uniquely by the same primary key in the external data source and in SQL Server.
3. Some of the records in the external data source may be the same in SQL Server.

Due to the massive volume of the import, I would like to import only the records which are different from what I have in SQL Server (cases 1 and 2 above). In fact case 2 is the most critical.

I thought of making a query with a left outer join between the data in the external data source table (SOURCE) and the data in the SQL Server table (DESTIN). The join is done on the respective primary keys (composite keys of up to 10 columns) and one of the WHERE conditions will be that the value in SOURCE is different from the value in DESTIN.

The result of this query would be exactly what I need to import.
How can I do this in SSIS? I haven't figured out how to join tables in different data sources yet.

In fact I cannot write a stored procedure to do that, since one of the sources is not a SQL Server data source.
I have seen the Lookup transformation in this article http://www.sqlis.com/default.aspx?311 but this is not exactly what I want to do.
Another possibility is to use a Merge Join, but due to the required sorting I believe its performance would be terrible!

Thanks in advance for your suggestions!
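
One workable pattern: land the external rows in a SQL Server staging table first (a plain SSIS data flow with no transformations), and then the delta detection becomes ordinary T-SQL against two local tables. A hedged sketch, with all table and column names as placeholders:

SELECT s.*
FROM   dbo.Staging AS s
LEFT JOIN dbo.Destin AS d
       ON  d.Key1 = s.Key1
       AND d.Key2 = s.Key2          -- repeat for all columns of the composed key
WHERE  d.Key1 IS NULL               -- case 1: record not in SQL Server yet
   OR  d.Value <> s.Value;          -- case 2: record changed since the last import

The extra write of the staging table costs something, but it avoids both a fully cached Lookup and the sort that a Merge Join would force.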

View 9 Replies

Stored Procedure Update HUGE PROBLEM

Nov 8, 2005

Hi, I need help on this one. For the past two days, whenever I make a modification to a stored procedure using Enterprise Manager (Apply), the stored procedure stops working. If I copy it under a new name, it works, but as soon as I make a modification, it stops working. I am going crazy on this one. The error: wrong column name. It doesn't recognize the column name on a very straightforward line:

SELECT @SQL1 = 'SELECT @Total1 = Count(*) FROM dbo.Tbl_Report WHERE Utilisateur = "' + @utilisateur + '"'

For example, 'sa' is not a column (it skips Utilisateur as the column name). Very strange. Never had this problem in the past. Thank you very much.
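
The symptom (the parameter value 'sa' being treated as a column name) is exactly what happens when dynamic SQL uses double quotes while QUOTED_IDENTIFIER is ON, and Enterprise Manager applies procedures with that setting on; this is a hedged diagnosis, not something confirmed in the thread. A sketch that sidesteps the quoting entirely by parameterizing the dynamic SQL:

DECLARE @SQL1 nvarchar(500), @Total1 int, @utilisateur varchar(100);

SET @utilisateur = 'sa';   -- sample value for illustration
SET @SQL1 = N'SELECT @Total1 = COUNT(*) FROM dbo.Tbl_Report WHERE Utilisateur = @u';

EXEC sp_executesql @SQL1,
     N'@Total1 int OUTPUT, @u varchar(100)',
     @Total1 = @Total1 OUTPUT,
     @u = @utilisateur;

Alternatively, keeping the string-concatenation approach but switching the embedded double quotes to doubled single quotes removes the dependency on the QUOTED_IDENTIFIER setting.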

View 4 Replies

Stored Procedure Vs SQL Huge Difference In Execution Time

Jul 23, 2005

I have a stored procedure (SP) that creates the data required for a report that I show on a web page. The SP does all the work and just returns back a result set that I dump in an ASP.NET DataGrid. The SP takes a product area and a start and end date as parameters. Here are the basics of the SP:

1. Create a temp table to store report results; all columns that will be needed are created at this point.
2. Select products and general product data into the temp table.
3. Create a cursor that loops through all the products in the temp table, running a more complex query with each individual product.
4. The results of that query are updated on the temp table based on the current product of the cursor.
5. A complex "totals" query is run and the results from that are inserted into the temp table as the last 3 rows.

In all we are talking about 120 rows in the temp table with 8 columns that are mostly numbers.

I originally wrote this report SP about a month ago and it worked fine; it ran in about 10-20 seconds based on server traffic and the amount of data in the temp table. For the example I'm running, there are the 120 products.

Just yesterday the SP started timing out, and when I ran the SP manually from Query Analyzer (QA) (exec SP_NAME ...) with the same parameters it was getting in the code, it took 6 minutes to complete. I was floored. I immediately copied the SQL out of the SP, pasted it into another QA window, changed the variables to hard-coded values, and ran it. It completed in 10 seconds.

I'm really confused now. I ran a Profiler trace on the two when I ran them again. The SQL code in QA executed again in ~10 seconds with 65,000 reads. When the SP finished some 6 minutes later, it had completed with the right results, but it needed 150,000,000 reads to do its job.

How can the exact same SQL code produce such different results (time, disk reads) based on whether it's in an SP or just run from QA, but still give me the exact same output? The reports both look correct and have the same number of rows. I asked my sysadmin if he had done anything to anything and he said no. I've been reading about recompiles and temp table indexes and all kinds of other stuff that could possibly be affecting it, but have gotten nowhere. Any ideas are appreciated.
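
The classic cause of exactly this gap (fast with hard-coded literals, slow with parameters) is parameter sniffing: the plan cached for the SP was compiled for parameter values whose data distribution no longer matches the current call. A hedged sketch of the common workaround, with the procedure and parameter names invented for illustration; copying the parameters into local variables makes the optimizer use average-density estimates instead of the sniffed values:

CREATE PROCEDURE dbo.ProductAreaReport
    @ProductArea int,
    @StartDate   datetime,
    @EndDate     datetime
AS
BEGIN
    -- local copies defeat parameter sniffing
    DECLARE @Area int, @Start datetime, @End datetime;
    SELECT @Area = @ProductArea, @Start = @StartDate, @End = @EndDate;

    -- the report queries then reference only the local variables
    SELECT COUNT(*)
    FROM   dbo.Products
    WHERE  AreaID = @Area
      AND  CreatedOn BETWEEN @Start AND @End;
END

On SQL Server 2000, CREATE PROCEDURE ... WITH RECOMPILE (or EXEC ... WITH RECOMPILE) is the blunter alternative: a fresh plan on every execution at the cost of compile time.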

View 5 Replies

Import Of Huge XML File

May 22, 2013

I have an XML file of 44 GB (not MB, it's really GB), delivered by the Danish customs authorities.

My problem is simple: how do I import such a beast? I have seen a limit of 2.1 GB everywhere.

View 9 Replies

IMPORT New Data Since Last IMPORT - DTS/Stored Procs?

Jan 7, 2004

Hello:

I am not sure how to implement the following, but I believe it entails using DTS, and hopefully it is fine that I post it here because ultimately I will need this backend data for my frontend .aspx pages:

On a weekly basis, I need to IMPORT some data located on a remote Oracle DB into SQL Server 2k. Since there is so much data to transfer, I would only like to transfer the data that is new to the table since the last IMPORT, i.e. a week ago, and leave behind the OLD data.

Is DTS the correct way to go, or do I have more control via DTS with STORED PROCEDURES? Does anyone have any good references for me?

On a similar note, once this Oracle data is IMPORTED into a certain table, I would like to EXPORT some of these NEWLY acquired rows matching certain criteria into another table for auditing purposes. For this scenario, should I implement an UPDATE trigger on the first table?

Any advice will be greatly appreciated!
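
A hedged sketch of the weekly incremental pull, assuming a linked server to Oracle and a modification-date column on the remote table (all object names are placeholders). OPENQUERY makes Oracle do the filtering, so only the new rows cross the wire; since OPENQUERY accepts only a literal query string, splicing in a stored "last import" watermark requires building this statement with dynamic SQL:

INSERT INTO dbo.OracleStage (ID, Payload, ModifiedOn)
SELECT ID, Payload, ModifiedOn
FROM OPENQUERY(ORA_LINK,
     'SELECT id, payload, modified_on
      FROM   remote_schema.remote_table
      WHERE  modified_on > TO_DATE(''2004-01-01'', ''YYYY-MM-DD'')');

For the audit copy, an UPDATE/INSERT trigger on the first table works, but since the import runs on a known weekly schedule, a second INSERT ... SELECT step in the same DTS package is usually simpler to maintain.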

View 3 Replies

Import From Excel In Stored Procedure

May 2, 2007

I'm trying to automate a data entry process. We get annual or semi-annual fee schedules from our clients. Sometimes it's an Excel file, sometimes it's CSV or other text, sometimes it's from the web. We clean up these files a bit and then import them to a new table in a SQL Server database. Then someone writes a custom insert to take specific columns from that temp table, do some transformations, and put them in the correct place in the normal database. Then the imported table is deleted or archived. We'll still have to do the clean-up phase, but I should be able to automate most everything after the import.

What I want to know is, can I pass a filename to a stored procedure, along with perhaps a few other parameters, and have the procedure import data from that file? If so, how would I go about it?
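
One route is a procedure that builds a BULK INSERT statement from the file-name parameter; BULK INSERT itself accepts only a literal path, hence the dynamic SQL. A hedged sketch (the staging table, delimiters, and header row are assumptions, and the path must be visible to the SQL Server service account):

CREATE PROCEDURE dbo.ImportFeeSchedule
    @FilePath varchar(260)
AS
BEGIN
    DECLARE @sql varchar(1000);

    SET @sql = 'BULK INSERT dbo.FeeScheduleStage FROM ''' +
               REPLACE(@FilePath, '''', '''''') +
               ''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2)';
    EXEC (@sql);
END

For the Excel variants, OPENROWSET with the Jet provider takes the same dynamic-SQL treatment.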

View 2 Replies

Import/Export CSV File Via Stored Procedure

Jun 12, 2008

Reading through the forums, I found some great information for importing/exporting an Excel spreadsheet via a stored procedure; however, the amount of data I have won't fit in Excel. Can someone help me import a CSV file via a stored procedure, and export a CSV file via a stored procedure? I don't know if XML is the answer or if there is another way. For many reasons, I don't want to use DTS or bulk operations. We are currently using SQL Server 2000.

Thank you,
Stacy
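
With DTS and bulk loading ruled out, one remaining T-SQL route on SQL Server 2000 is OPENROWSET with the Jet text driver. A hedged sketch, with the folder, file, and table names as placeholders (HDR=Yes reads column names from the first row):

INSERT INTO dbo.ImportTarget
SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Text;Database=C:\Imports\;HDR=Yes',
                'SELECT * FROM [target#csv]');   -- Jet wants the dot in the file name written as #

Export can run in the other direction (INSERT INTO OPENROWSET(...) SELECT ...), with the caveat that the Jet text driver requires the target file to already exist with its header row.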




View 1 Replies

T-SQL (SS2K8) :: One Stored Procedure Return Data (select Statement) Into Another Stored Procedure

Nov 14, 2014

I am new to working with SQL Server.

I have one stored procedure, Sp_Process1, which returns a varying number of columns.

Now the question is: I want to get the data returned by "Sp_Process1" into a temporary table in another procedure, or something similar.
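
When the output columns are known, INSERT ... EXEC is the standard capture mechanism; when they truly vary per call, a loopback OPENROWSET can infer them at run time. A hedged sketch of both; the column names, database name, and connection string are assumptions, the second form needs the 'Ad Hoc Distributed Queries' option enabled, and the loopback trick can be finicky (the procedure's first result set must be discoverable):

-- known columns: create the temp table first, then capture the result set
CREATE TABLE #Out (Col1 int, Col2 varchar(50));
INSERT INTO #Out EXEC dbo.Sp_Process1;

-- varying columns: let SELECT ... INTO build the table from the procedure's output
SELECT *
INTO   #Out2
FROM   OPENROWSET('SQLNCLI', 'Server=(local);Trusted_Connection=yes;',
                  'EXEC mydb.dbo.Sp_Process1');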

View 1 Replies

Stored Procedure - Import CSV File Into Table In Excel

Mar 3, 2014

I have the following code to import a .csv file into a table in my database. It's being inserted into a table, dbo.ImportedPromoPricing. The table and the .csv file have 3 fields: price, code and selling price.

Once the import is completed, I want to use the data in dbo.ImportedPromoPricing to update another table, dbo.MasterPricing. Records need to be compared and updated or appended as needed; in the case of an update, only the price will be updated. This is the beginning of my code:

USE [Reporting]
GO
/****** Object: StoredProcedure [dbo].[ImportPromoPricing] Script Date: 03/03/2014 14:04:01 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON

[Code] .....
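
Since MERGE is available from SQL Server 2008 onward, the compare-then-update-or-append step can be one statement. A hedged sketch, with the column names guessed from the description above:

MERGE dbo.MasterPricing AS m
USING dbo.ImportedPromoPricing AS i
   ON m.Code = i.Code
WHEN MATCHED AND m.Price <> i.Price THEN
    UPDATE SET m.Price = i.Price                  -- updates touch the price only
WHEN NOT MATCHED BY TARGET THEN
    INSERT (Code, Price, SellingPrice)            -- append codes not yet in MasterPricing
    VALUES (i.Code, i.Price, i.SellingPrice);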

View 2 Replies

Stored Procedure To Import Excel Sheet Content To

Oct 17, 2007

Hi,
I have created a stored procedure which imports the contents of an Excel sheet into SQL Server. I am using syntax like:
"declare cur_Inoperative cursor for Select * from OPENROWSET ('Microsoft.Jet.OleDB.4.0', 'EXCEL 8.0; Database=D:Nupur_GRAPV2.0Inoperative.xls',Sheet1$) order by 1".
As of now I am hard-coding the database path inside the cursor declaration, but I need to pass this path as a stored procedure input parameter. Passing the path works, but I am not able to use the passed parameter inside the cursor. Can anyone help me replace the literal path with the input path variable?
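
OPENROWSET accepts only literal strings, so the path has to be spliced in with dynamic SQL; declaring the cursor as GLOBAL inside the dynamic batch makes it visible to the rest of the procedure afterwards. A hedged sketch (the procedure name is invented; the path is passed straight through):

CREATE PROCEDURE dbo.ImportInoperative
    @Path varchar(260)
AS
BEGIN
    DECLARE @sql nvarchar(1000);

    SET @sql = N'DECLARE cur_Inoperative CURSOR GLOBAL FOR
                 SELECT * FROM OPENROWSET(''Microsoft.Jet.OleDB.4.0'',
                        ''Excel 8.0;Database=' + @Path + N''',
                        ''SELECT * FROM [Sheet1$]'') ORDER BY 1';
    EXEC sp_executesql @sql;

    OPEN cur_Inoperative;
    -- FETCH / CLOSE / DEALLOCATE exactly as in the original procedure
END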

View 2 Replies

How Can I Assign A Stored Procedure As Cursor's Data Source In A Stored Procedure?

Oct 8, 2007

How can I create a cursor in a stored procedure, with another stored procedure as its data source?

Something like this:

CREATE PROCEDURE TestHardDisk
AS
BEGIN
    DECLARE @Drive char(1), @Space int;

    DECLARE HardDisk_Cursor CURSOR FOR
        EXEC xp_FixedDrives
        -- the cursor needs a SELECT statement and does not accept a stored procedure as its data source

    OPEN HardDisk_Cursor;

    FETCH NEXT FROM HardDisk_Cursor
    INTO @Drive, @Space;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        ...
    END
END
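
A cursor cannot consume a procedure directly, but INSERT ... EXEC can land the procedure's rows in a temp table first. A hedged rework of the same procedure (xp_fixeddrives returns each drive letter with its free space in MB; the loop body here is just a sample PRINT):

CREATE PROCEDURE TestHardDisk
AS
BEGIN
    DECLARE @Drive char(1), @Space int;

    CREATE TABLE #Drives (Drive char(1), MBFree int);
    INSERT INTO #Drives EXEC master.dbo.xp_fixeddrives;

    DECLARE HardDisk_Cursor CURSOR FOR
        SELECT Drive, MBFree FROM #Drives;

    OPEN HardDisk_Cursor;
    FETCH NEXT FROM HardDisk_Cursor INTO @Drive, @Space;

    WHILE @@FETCH_STATUS = 0
    BEGIN
        PRINT @Drive + ': ' + CAST(@Space AS varchar(12)) + ' MB free';
        FETCH NEXT FROM HardDisk_Cursor INTO @Drive, @Space;
    END

    CLOSE HardDisk_Cursor;
    DEALLOCATE HardDisk_Cursor;
END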

View 6 Replies

Data Access :: Importing Huge Data From One Database To Another Daily

Jul 7, 2015

We have a daily process which copies millions of rows of data from one DB to another over a linked server. Just checking on best practice: are there more efficient ways than the linked server to copy millions of rows of data from one DB to another? I checked BULK INSERT, but that transfers the data from a file to the DB, not DB to DB.
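
If the copy must stay in T-SQL, pulling through OPENQUERY usually beats four-part naming, because the remote server runs the query and streams only the result instead of being driven row by row. A hedged sketch with placeholder names; for this volume, bcp out/in or an SSIS data flow with fast load is generally faster still:

INSERT INTO dbo.LocalCopy WITH (TABLOCK) (ID, Payload)
SELECT ID, Payload
FROM   OPENQUERY(REMOTE_SRV, 'SELECT id, payload FROM dbo.SourceTable');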

View 6 Replies

Import Data From Text File Into A Temp Table In Stored Proc

Oct 1, 2001

Hey,
Can one of you please show me how to import data from a text file into a temp table in a stored proc?
thanks
Zoey
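
A hedged sketch using BULK INSERT, which works against a temp table inside a procedure (the column layout, delimiters, and path are placeholders; the file must be readable by the SQL Server service account):

CREATE PROCEDURE dbo.LoadTextFile
AS
BEGIN
    CREATE TABLE #Import (Col1 varchar(50), Col2 varchar(50));

    BULK INSERT #Import
    FROM 'C:\data\input.txt'
    WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');

    SELECT * FROM #Import;   -- work with the imported rows here
END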

View 1 Replies

Getting Data From A Stored Procedure In A Stored Procedure

Jul 23, 2005

What I am looking to do is use a complicated stored procedure to get data for me while in another stored procedure. It's like a view, but a view you can't pass parameters to. In essence I would like a sproc that would be like this:

Create Procedure NewSproc
AS
Select * from MAIN_SPROC 'a','b',.....
WHERE .........

Or

Declare Table @TEMP
@Temp = MAIN_SPROC 'a','b',.....

Any ideas how I could return rows of data from a sproc into another sproc and then run a WHERE clause on that data?

Thanks

Chris Auer
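
INSERT ... EXEC is the standard way to capture one procedure's result set inside another; the temp table's columns must match the inner procedure's output. A hedged sketch (the column list is an assumption):

CREATE PROCEDURE NewSproc
AS
BEGIN
    CREATE TABLE #Results (ColA int, ColB varchar(50));   -- must match MAIN_SPROC's output

    INSERT INTO #Results EXEC MAIN_SPROC 'a', 'b';

    SELECT *
    FROM   #Results
    WHERE  ColA > 10;   -- any WHERE clause can now be applied
END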

View 4 Replies

Huge Deletes In A Huge Table

Apr 3, 2000

SQL 7 SP1 NT4 SP5

I have a TRANSACTION table with 150 million rows.

I have a USER table.

Each user has about 600 records in the TRANSACTION table.

The TRANSACTION table's clustered index is on USERID + RECID. The second index is on USERID + Fieldx + Fieldy.

The TRANSACTION table gets about 1.4 million inserts in a normal day and about 40,000 updates.

I want to go through the USER table and delete all users who have not visited me in a while.

I want to do this without substantially hindering performance in a production environment. I can perform this over a week period or two if needed.

The best way I thought of doing this was to grab x amount of users in a cursor and loop through deleting their corresponding TRANSACTION records.

Does anyone have any ideas on a better way. What is going to happen to my indices during this time ?

Thanks !!!
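
A common shape for this kind of purge in the SQL 7 era is a batched delete with SET ROWCOUNT: each statement removes at most a few thousand rows, so transactions stay short, the log stays small, and other sessions are not blocked for long. A hedged sketch; the "not visited in a while" column is an assumption, and since the clustered index leads on USERID, each batch deletes reasonably contiguous ranges:

SET ROWCOUNT 5000;     -- each DELETE below removes at most 5000 rows

WHILE 1 = 1
BEGIN
    DELETE FROM [TRANSACTION]
    WHERE USERID IN (SELECT USERID FROM [USER] WHERE LastVisit < '19990101');

    IF @@ROWCOUNT = 0 BREAK;
END

SET ROWCOUNT 0;        -- restore normal behavior

Adding a WAITFOR DELAY between batches spreads the I/O across your week-or-two window, and both indexes stay online and usable throughout.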

View 3 Replies

Huge Data Insertion

Jun 12, 2006

Hi,

I have 4 tables, each consisting of approx. 10,000,000 rows. They have the same columns (fTime [datetime] and bid [money]). What I want to do is collect all of the data into one of the tables, in ascending order by fTime.

PS: I want to do it as fast as possible as well.
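
Rows in a table have no inherent order; a clustered index on fTime is what makes the combined table behave as "sorted". A hedged sketch with the table names as placeholders:

INSERT INTO dbo.Table1 (fTime, bid)
SELECT fTime, bid FROM dbo.Table2
UNION ALL
SELECT fTime, bid FROM dbo.Table3
UNION ALL
SELECT fTime, bid FROM dbo.Table4;

-- building the clustered index once, after the load, is usually cheaper
-- than maintaining it during 30 million inserts
CREATE CLUSTERED INDEX IX_Table1_fTime ON dbo.Table1 (fTime);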

View 1 Replies

Problem With Huge Amount Of Data

Nov 20, 2006

Hi, I have an application, let's call it simply "A", which creates two huge tables in a Microsoft SQL database. Let's call them "table1" and "table2". It saves really a lot of data into these tables. After application "A" has finished, another application is executed which deletes these two tables. Then application "A" is started again and it creates the two tables again, but the amount of data becomes bigger. It can only proceed if the tables were deleted completely before and the database is empty. This is the procedure which I repeat very often, but every time the amount of data becomes bigger (table1 and table2 become bigger). A couple of times it works fine, but at some point the data seems to become too big and application "A" fails, most likely because the data wasn't removed correctly / completely. This is my code for deleting the two tables; maybe there is something I have to change:

try
{
    SqlConnectionStringBuilder builder =
        new SqlConnectionStringBuilder(@"Server=mycomputer\dbname;Integrated Security=SSPI;" +
                                       "Initial Catalog=testing");

    builder["Server"] = @"(local)\dbname";
    builder["Connect Timeout"] = 10;
    builder["Trusted_Connection"] = true;
    builder["Initial Catalog"] = ((ComponentConfiguration)this.componentConfig).Persistency.DatabaseName;

    SqlConnection sqlconnection = new SqlConnection();
    sqlconnection.ConnectionString = builder.ConnectionString;
    sqlconnection.Open();

    SqlCommand cmd1 = new SqlCommand("DROP TABLE table1");   // TO DO: delete all tables
    SqlCommand cmd2 = new SqlCommand("DROP TABLE table2");   // TO DO: delete all tables
    cmd1.Connection = sqlconnection;
    cmd2.Connection = sqlconnection;

    cmd1.ExecuteNonQuery();
    Thread.Sleep(7000);
    cmd2.ExecuteNonQuery();
    Thread.Sleep(7000);

    sqlconnection.Close();
    Thread.Sleep(3000);
}
catch { }

Thanks for help! mulata

View 6 Replies

Loading Huge Volume Data

May 31, 2007

Hi, good morning to all,

My day started with loading a huge volume of data, and my data flow task failed to do so.

My data flow has a flat file connected to an OLE DB target. This is a one-to-one mapping. My source file contains 50 lakh (5 million) records and is 500 MB in size.

I'm processing the data with all the default buffer settings. I have 4 CPUs in my server. The DTSDebug.exe process is using more than 2 GB of memory, and my average CPU usage is 70%, with one of the CPUs hitting 100% utilization.

I'm very new to SSIS, so please give me some info on how to set my buffers. Do we have any PDF for performance and tuning in SSIS?

Do we have any bulk load transformation in SSIS to load into DB2 UDB? If so, how do I get it installed?

Thanks in advance,

Suresh N
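
For reference, buffer sizing in a Data Flow task is governed by two task properties; the names and defaults below are the standard SSIS 2005 values, shown as defaults rather than as a recommendation:

DefaultBufferMaxRows = 10000        -- rows per buffer
DefaultBufferSize    = 10485760     -- bytes per buffer (10 MB)

Lowering DefaultBufferMaxRows (or raising DefaultBufferSize within reason) changes how much memory each buffer consumes; wide rows generally call for fewer rows per buffer.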

View 2 Replies

Transferring Huge Data To Various Tables

Mar 13, 2007

Actually, in my transformation I am transferring a huge amount of data. I have been using an OLE DB Command at the end to dump my incoming data into the respective tables.

For example, say you have two tables, table 1 and table 2. In my incoming data I have a lookup that checks two unique columns against the unique columns in table 1. If the record does not exist, I insert a record into table 2, get the unique field of that record, and store it in a particular column of table 1.

The data is very large. Is this the best way, or do you have any suggestions? Please let me know.
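
A set-based alternative, sketched under the assumption that the incoming rows can be staged in a table first: one INSERT creates the missing table 2 rows, and one UPDATE back-fills table 1 with their keys, instead of the OLE DB Command doing this row by row. All names are placeholders.

INSERT INTO dbo.Table2 (KeyCol1, KeyCol2)
SELECT DISTINCT s.KeyCol1, s.KeyCol2
FROM   dbo.Staging AS s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Table1 AS t
                   WHERE t.KeyCol1 = s.KeyCol1 AND t.KeyCol2 = s.KeyCol2);

UPDATE t
SET    t.Table2ID = t2.ID
FROM   dbo.Table1 AS t
JOIN   dbo.Table2 AS t2
  ON   t2.KeyCol1 = t.KeyCol1
 AND   t2.KeyCol2 = t.KeyCol2;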



View 5 Replies

Performance Issue With Huge Data

Mar 20, 2007

Hello All,

I am using SSIS to transfer data between two SQL Servers (2000). There is no transformation involved, as the source and destination table structures are the same. Even then, the package execution takes a lot of time.

The data in the tables is of the order of 66,000,000 rows; we were required to kill the package execution after it took more than 24 hours. The CPU usage was more than 13,000 s and disk I/O was well above 330,000,000. I am new to the tidbits of SSIS. Can anyone please tell me why the package has become so resource hungry?



Thanks in advance,

Atul

View 3 Replies

Cannot Export To Excel With Huge Data

Aug 19, 2007

Hi All!

I have an Issue.

I am calling an .rdl file through the URL and I am passing Format=Excel in the URL.

Eg. http://harinarayana/ReportServer/......&Format=Excel&...........

If the data is more than some 20,000 records, it is not able to export, and an error like "The service is not available" is displayed.

Does anyone have any solution for scenarios like this? It would be of great help to me.

Regards

Hari

View 1 Replies

How Do I Call A Stored Procedure To Insert Data In SQL Server In SSIS Data Flow Task

Jan 29, 2008



I need to call a stored procedure to insert data into a table in SQL Server from an SSIS data flow task.
I am currently trying to use an OLE DB Destination, but I am not sure how to map the inputs of the OLE DB Destination to my stored procedure insert.
Thanks
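
A hedged pointer: an OLE DB Destination can only write directly to a table or view; the component that calls a procedure per row is the OLE DB Command transformation. Its SqlCommand property takes ? placeholders, which are mapped to input columns in the editor (the procedure name below is hypothetical):

EXEC dbo.usp_InsertMyRow ?, ?, ?

Since OLE DB Command executes once per row, for large volumes it is usually faster to land the rows with an OLE DB Destination (fast load) into a staging table and run the procedure's logic set-based afterwards.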

View 6 Replies

Urgent! Please Help! Deleting Data From A Huge Table

Mar 16, 2004

I have a huge table with a 4-column primary key. I need to delete data from this table (approx. 5.6 million records to be deleted). It takes a very long time to delete with a normal query.
Can someone please suggest a better way?
Any help will be appreciated.
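
When most of a table is being removed, it is often faster to keep the survivors than to delete the victims: copy the rows to keep, truncate, and reload. A hedged sketch; the keep condition and names are assumptions, and TRUNCATE is minimally logged but fails if foreign keys reference the table:

SELECT *
INTO   dbo.HugeTable_Keep
FROM   dbo.HugeTable
WHERE  CreatedOn >= '20040101';   -- stand-in for the real keep condition

TRUNCATE TABLE dbo.HugeTable;

INSERT INTO dbo.HugeTable
SELECT * FROM dbo.HugeTable_Keep;

DROP TABLE dbo.HugeTable_Keep;

If the delete really must stay a delete, batching it (SET ROWCOUNT on SQL 2000) keeps the log and the locks manageable.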

View 14 Replies

Select Data From Huge Fact Tables

Oct 12, 2006

Hi,

I have a situation where I have 4 tables:

1. Two dimension (parent) tables: DIM1 with 50,000 rows, and DIM2 with 1,000 rows.

2. Fact 1 with 50 columns and 25 million rows, with FKs to DIM1 and DIM2.

3. Fact 2 with 40 columns and 25 million rows, with FKs to DIM1 and DIM2.

Actually fact 1 and fact 2 hold the same related data, but since our Analysis Services cube person wanted each fact table to have no more than 50 columns, we divided the table into two; they share the same compound key.

That said, I have a situation where I have to select all the columns in both fact tables and do a group by. I wrote the query and ran "Analyze Query in the Database Engine Tuning Advisor" for it. It gave a bunch of recommendations about statistics and indexes, which I created. When I executed the query, the result came up in a matter of seconds, which was good.

In the query I had the conditions MarketName = 'Bridgeview' and DateID = 344 (the FK of today-1).

When I wanted the data for the last 30 days, I changed it to DateID > (FK of today - 32) and DateID < (FK of today); the query responded and worked fine.

But when I changed the query to MarketName = 'Aurora' (a different market from the one I used when I ran the Tuning Advisor), the result returned was an empty set. When I removed the MarketName condition, it was supposed to return all markets' data, but it returned only Bridgeview data.

I know the data is in the table for all markets, since reports are rendered from these fact tables for all of these markets (I also ran queries to check the fact table data).

I am unable to pinpoint why the query behaves like this. It responds to the date change, but not to the MarketName change.

I really appreciate if anyone can help me point out the problem.

Thanks,

Venkat

View 3 Replies

Huge Volume Of Data Loading Issue

Aug 21, 2007

Hi all,

I've faced a problem with the error below when I load 1.5 million rows into an Oracle database.


The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers.

Please help. Thanks

View 19 Replies

ETL Delta Pulling Huge Data.. Right Approach ?

Dec 3, 2006

Hi all,

In building an ETL tool, we are in a situation where a table has to be loaded on an incremental basis. On the first run, all the records (approx. 100 lakh, i.e. 10 million) have to be loaded. From the next run on, only the records that were updated since the last run of the package, or newly added, are to be pulled from the source database. One idea we had was to use two OLE DB Source components: in one, get the records that were updated or newly added (since we have update columns in the source DB, getting them is fairly simple); in the other, load all the records from the destination. Then pass both into a Merge Join, put a Conditional Split down the pipeline, and handle the updates and inserts.

Now the question is: how slow is this going to be? Will there be a case where the source DB returns records quickly while the Merge Join stalls waiting for all the records from the destination?

What might be the ideal way to go about my scenario? Please advise.

Thanks in advance.

View 13 Replies

Max Async Io

Jan 25, 2001

In SQL Server 7.0 there was an option that allowed you to set 'max async IO', but it is no longer available in SQL 2000.

Does anyone know why this option is not available, or did Microsoft replace this function with something else?

Thank You,
John

View 1 Replies


Implementing Multiple Databases Due To Huge Data Size?

Aug 30, 2005

Has anyone implemented splitting an application's data between two databases because the data size is extremely large? If so, could you please point me to relevant information? In this split-data scenario, a table will automatically carry over to another database whenever the size limit for the current database is reached. The challenge here is for the DAL (data access layer) to automatically look in the appropriate database when the next row of data is in another database. Or perhaps there is another solution to this tera-size data problem. Any help on this would be greatly appreciated.

View 8 Replies

Putting Huge Chunk Of Data Into Database? Workable?

Jun 14, 2006

Hi.
I am trying to put a huge chunk of text into my database, for example information about a particular product which has more than 2000 characters. I saw the datatype "nvarchar(MAX)" in SQL Server 2005 and was wondering if I can use this to store my text.
Thanks
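
nvarchar(max) holds up to 2^31 - 1 bytes (roughly a billion characters), so a 2,000+ character description fits with a huge margin. A minimal sketch, with the table and column names invented for illustration:

CREATE TABLE dbo.Product
(
    ProductID   int IDENTITY PRIMARY KEY,
    Description nvarchar(max) NOT NULL    -- replaces the older ntext type
);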

View 1 Replies

Move Current Huge Data File To New GPT Drive?

Oct 31, 2015

I have a database data file at almost 2 TB, maxing out a Windows drive; only 16 GB are left. Should I just add another data file on another Windows drive for growth? Or move the current huge data file to a new GPT drive? Or do both: add another data file and move the existing one to its own new GPT drive?

Primary objective is to make do for now.
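
If the "move the file" option is chosen, the standard sequence is below; the database and file names are placeholders, and the physical copy happens at the OS level while the database is offline:

ALTER DATABASE BigDb SET OFFLINE;

ALTER DATABASE BigDb
MODIFY FILE (NAME = BigDb_Data, FILENAME = 'G:\Data\BigDb_Data.mdf');

-- copy the .mdf to G:\Data\ in Windows, then:
ALTER DATABASE BigDb SET ONLINE;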

View 1 Replies

Reporting Services And Huge Data Extracts Causes IIS To Use A Lot Of Memory

Dec 30, 2007

Hi.

I am working on a serial-tracking application using SQL Server 2005 and .NET. One of the requirements is an ad-hoc file export utility in which users can drag and drop fields from a set of tables and export the results to CSV. It all sounds OK, and SQL Server Reporting Services' Report Builder seems to be just the right tool for it, but there is one problem: the report is big, about 7K-8K pages and 4-5 columns wide, and while rendering the report, IIS memory usage shoots up to about 2 GB and stays there.
Any idea if something can be done to mitigate this problem? Note that I don't need the HTML rendering at all. All I need is the CSV at the end of the day, while users are able to choose columns in an ad-hoc manner.
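
A hedged suggestion: since only the CSV matters, rendering straight to CSV through Report Server URL access skips the HTML stage entirely; rs:Format is the standard URL access parameter, while the server and report paths below are placeholders:

http://myserver/ReportServer?/AdHocExport/DragDropReport&rs:Command=Render&rs:Format=CSV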

View 7 Replies






