Solution Design For High Volume Of Data

Apr 18, 2006

Hi,

I have been asked to design a solution for a client of mine who basically requires the daily analysis and reconciliation of the differences between 2 extremely large text files.

The files are not in an identical format but are both in some form of delimited format (one is CSV, the other is a little more complex). For the sake of this question, let's assume that I can effectively import each file into an MS SQL table.

Each file will have in excess of 100,000 rows each day (new data for each day).

Whilst I know that MS SQL easily has the capacity to store the data, is there a recommended way to tackle the potential problems? (I imagine that performance is important; they will be running the report every day.)

Or is building the solution as simple as importing the data into 2 tables, and then querying the differences and outputting as a report using Crystal?
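To make that concrete, this is the sort of difference query I have in mind once both files are loaded; a minimal sketch only, where ImportA, ImportB, TradeID and Amount are all hypothetical names:

SELECT COALESCE(a.TradeID, b.TradeID) AS TradeID,
       a.Amount AS AmountA,
       b.Amount AS AmountB
FROM ImportA a
FULL OUTER JOIN ImportB b
    ON b.TradeID = a.TradeID
WHERE a.TradeID IS NULL      -- row only in file B
   OR b.TradeID IS NULL      -- row only in file A
   OR a.Amount <> b.Amount;  -- row in both, values differ

With an index on the key column in each table, I assume something of this shape stays workable at 100,000+ rows per day.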

Any suggestions appreciated.

Thanks

Rael


Handling High Volume Data

Apr 18, 2007

Hi Guys,

I am facing problems with concurrent access in SQL Server 2000. The scenario is that the DB contains one huge de-normalized table containing 40 million records.

The application frequently queries this table to populate other derived tables, and those SQL queries take a long time to return results.

So while one query is executing, other users' queries go into a wait state. Please suggest how I can improve this.

Or do I need to upgrade to 2005?
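For what it's worth, this is the direction I keep coming back to; a sketch only, with MyBigDb, dbo.HugeTable and the columns as placeholders. On SQL Server 2000 the usual remedies seem to be covering indexes or, where dirty reads are acceptable, the NOLOCK hint, while SQL Server 2005 adds row versioning so readers stop blocking writers:

-- SQL Server 2000 workaround, only where stale/dirty reads are acceptable
SELECT col1, col2
FROM dbo.HugeTable WITH (NOLOCK)
WHERE col1 = 42;

-- SQL Server 2005 and later: row-versioned reads, set once per database
-- (needs a moment with no other active connections to switch on)
ALTER DATABASE MyBigDb SET READ_COMMITTED_SNAPSHOT ON;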

Regards,
Prashant


High Volume I/O!!!

Feb 28, 2008

I have a summary table with a 9-field composite primary key. Every 10 minutes, my system generates 2 files of 500,000 to 750,000 rows to be summarized into this table. I first bulk insert those into a temp table, then trigger an inner-join update query to do the updates, followed by a left-outer-join query to do the inserts. As the day goes on and millions of rows accumulate in my summary table, this process becomes too slow. Any ideas about causes/solutions?
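For clarity, this is the shape of the two statements, sketched with a 3-column key instead of the real 9 and hypothetical names throughout:

-- 1) update rows that already exist in the summary
UPDATE s
SET    s.total = s.total + t.total
FROM   dbo.Summary AS s
JOIN   #staging AS t
  ON   t.k1 = s.k1 AND t.k2 = s.k2 AND t.k3 = s.k3;  -- ...plus the remaining key columns

-- 2) insert rows that do not exist yet
INSERT INTO dbo.Summary (k1, k2, k3, total)
SELECT t.k1, t.k2, t.k3, t.total
FROM   #staging AS t
LEFT JOIN dbo.Summary AS s
  ON   s.k1 = t.k1 AND s.k2 = t.k2 AND s.k3 = t.k3
WHERE  s.k1 IS NULL;

One thing I have been wondering is whether a clustered index on #staging, in the same key order as the summary table's primary key, would cut the join cost.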

RLiss


High Volume DB Performance Problems

Mar 19, 2008

Hello, I am a master's student and I am preparing a seminar about high-volume DB performance problems. For example: if I have a table with a length of 1,000,000 records, and this length is growing exponentially over time, what problems may I face in insertion, deletion, and search in such a table? And what are the problems in processing such a DB in general?


MSDE For Large Volume And High No. Of Users?

Sep 21, 2005

Hi,

We need to use a free database for a project because of a tight budget. Is MSDE OK for handling a large volume of data and 70-80 users? My understanding is that MSDE is optimized for 5 concurrent users. Is MySQL better than MSDE?

Thanks,
Ben


High Volume Database Query Optimization

Dec 10, 2007

Hello everybody,

I am doing research about high-volume database treatment (maybe a database with terabytes of volume), so is there any optimization or specialization for queries dealing with such a database?


Performance Tuning On High Volume Server

Jan 4, 2007

I am looking to improve the performance of my sql server databases.

I currently have a dual-location system; the database server setup is basically a quad Xeon with 4 GB at my office and a dual Xeon with 4 GB at a remote webhosting location. There are separate application/web/intranet servers at each site. The two database servers are replicated, with the local server publishing to the remote server.

The relational database holds circa 26 million records, growing by a volume of 10,000 per day, there are approximately 50,000 queries performed per day.

My theory is that the replication of the two databases is causing a slowdown; despite fast network connections (averaging 200ms between servers) the replication seems to place a large load on the local server. Would it be sensible to replicate to a second local server and then replicate to the remote server, placing any burden on the second server?



I am planning to upgrade the local server to a high-capacity 4+ CPU 64-bit server. My problem is that although I have noticed a slowdown in performance over time, I am unsure how to go about measuring and quantifying this in order to diagnose the bottlenecks and ensure that investing in a new server would be worthwhile. Where would one be best advised to start this project?
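As a starting point I am thinking of capturing wait statistics; a sketch, assuming SQL Server 2005 or later (on SQL 2000 I understand DBCC SQLPERF(WAITSTATS) is the rough equivalent):

-- top wait types accumulated since the last service restart
SELECT TOP (10)
       wait_type,
       waiting_tasks_count,
       wait_time_ms
FROM sys.dm_os_wait_stats
ORDER BY wait_time_ms DESC;

If replication is the bottleneck, I would expect that to show up as network- or log-related waits rather than CPU.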


SQL Server 2008 :: High Volume Reads And Writes?

Jul 6, 2015

We are in the process of moving existing clustered SQL Server databases to AWS. There is one major database that has intensive read and write transactions. I'm wondering what the best design is to optimize performance for both reads and writes, since we have historically had constant issues in the current environment when massive updates are happening. Reads should have higher priority than writes.
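One option we are weighing, as a hedged sketch with MyDb and dbo.Orders as placeholders, is snapshot isolation, so readers are served from row versions instead of waiting on writer locks:

ALTER DATABASE MyDb SET ALLOW_SNAPSHOT_ISOLATION ON;

-- then, in each read-heavy session:
SET TRANSACTION ISOLATION LEVEL SNAPSHOT;
SELECT COUNT(*) FROM dbo.Orders;  -- any reporting query now runs without blocking on writers

The trade-off, as I understand it, is extra tempdb load for the version store.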


[Q] High Transaction Load Solution?

Nov 10, 2001

Hey guys,

I originally wrote a post here regarding some info on setting up a cluster. Upon further analysis of the problem with our system, I noted that at particular times we have tremendous amounts of UPDATE, INSERT, DELETE, etc. transactions hitting our database.

I thought originally SQL Clustering could solve this problem, but the time and upkeep that will be required to maintain such a configuration might not be feasible and more importantly it may not even fix the problem.

Next week I plan on doing some more specific performance monitoring off the database during normal business activity, but my initial suspicion is that there is a tremendous amount of I/O processing due to the high transaction load which is slowing down the application.

I was wondering what you have done to alleviate such problems? One of the solutions I have come up with is to possibly create a Master/Slave SQL Server design where the Slave handles most of the database transactions and then at a low load during the day update the Master DB. How does this sound? Any other ideas would be greatly appreciated...

Thanx


Design Solution Required

Nov 29, 2006

We are facing design issues. Could you please advise us how to proceed?

Problem description: The web app will pass a complex dynamic SQL query to the backend, and it should return the result set as fast as it can.

Issue 1: The SQL query will have a lot of JOINs and WHERE clauses.
Issue 2: Each table contains millions of records.

Requirement: The turnaround time of the SQL query should be as minimal as possible.

Could you please advise us which technology we should use, such that users get the result set in a few seconds? We are a Microsoft Partner; we use only Microsoft technology for our product development.

Your help is much appreciated.

With regards,
Sathya R
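As a baseline, whatever the physical design turns out to be, I assume the dynamic SQL should at least be parameterised through sp_executesql so plans are reused rather than recompiled per call; a minimal sketch, all names hypothetical:

EXEC sys.sp_executesql
    N'SELECT o.OrderID, o.Total
      FROM dbo.Orders AS o
      JOIN dbo.Customers AS c ON c.CustomerID = o.CustomerID
      WHERE c.Region = @region',
    N'@region nvarchar(50)',
    @region = N'EMEA';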


High Availability SQLServer2005 Infrastructure And Design

Mar 14, 2008


Hi,

We are stuck deep into a problem and need real help from you all.
The scenario goes this way: we have an OLTP database distributed on an Active/Active cluster infrastructure (storage on SAN) with a very high transaction rate; CPU utilization is above 60% and RAM utilization is above 7 GB.

We have 8 centers geographically distributed, and each center has its own separate database. We need to set up a topology in such a way that all my databases will be in sync with each other without any data loss.


All databases have a common schema, and tables have identity values set to auto-increment.

On setting up any replication we face conflict issues.

Please share your ideas on how I can sync all my databases with each other.
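One idea I am evaluating, as a sketch rather than a full replication design: give each of the 8 centers a disjoint identity stream by seeding with the center number and incrementing by the number of centers, so the streams can never collide.

-- center 1: generates 1, 9, 17, ...
CREATE TABLE dbo.Orders (
    OrderID  int IDENTITY(1, 8) NOT NULL PRIMARY KEY,
    SiteCode tinyint NOT NULL
);
-- center 2 creates the same table with IDENTITY(2, 8), center 3 with IDENTITY(3, 8), and so on

I gather merge replication's automatic identity range management is the built-in alternative to hand-rolling this.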

Regards
Sufian








Loading Huge Volume Of Data

May 31, 2007

Hi Good morning to all,



My day started with loading a huge volume of data, and my data flow task failed to do so.

My data flow has a flat file connected to an OLE DB target. This is a one-to-one mapping. My source file contains 50 lac (5 million) records and is 500 MB in size.

I'm processing the data with all the default buffer settings. I have 4 CPUs in my server.

The system process DTSDebug.exe is using more than 2 GB of page file. My average CPU usage is 70%, with one of those CPUs hitting 100% utilization.

I'm very new to SSIS, so please provide me some info on how to set my buffers, and is there any PDF on performance and tuning in SSIS?

Do we have any bulk load transformation in SSIS to load into DB2 UDB?

If so, how do I get it installed?

Thanks in advance,

Suresh N


Transaction/Data Volume Vs SQL Edition

Nov 15, 2007

I am in the process of choosing between either SQL Workgroup or Standard Edition. I see the differences in features on the comparison table, but do not see any references to the differing capabilities in handling transactions.

Are there any differences between Workgroup and Standard in terms of transaction/data handling capabilities? I.e., does Standard have the superior capability of handling X times more TPMs than Workgroup?

If not, am I correct to assume that this is totally determined by hardware configuration (# of CPUs, processor speed, HD speed, RAM)?

If the data volume / transaction handling is solely determined by hardware configuration, and I know the # of transactions and amount of R/W per second, where would be a good reference to know what kind of hardware configuration I need? (Ideally, once I know the hardware configuration, I guess I would be able to determine whether I need Workgroup or Standard.)

Thanks in advance,
benbry


Huge Volume Of Data Loading Issue

Aug 21, 2007

Hi all,

I've faced a problem with the below error when I load 1.5M rows into an Oracle database.


The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers.

Please help. Thanks


High Level & Detail Level Design Documents

Nov 19, 2007



Hi,
I am in need of SSRS 2005 design documents for a project. Can somebody let me know where I can find these documents? Thanks in advance.


T-SQL (SS2K8) :: Measuring Volume Of Data Created Temporarily To Replace Usage Of Physical Tables In Query

Sep 12, 2014

How can I measure the volume of data created temporarily to replace the usage of physical tables in an SQL query?
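The direction I have been pointed in is the tempdb space-accounting DMVs; a sketch (pages are 8 KB, and the query of interest has to run first in the same session):

SELECT session_id,
       internal_objects_alloc_page_count * 8 AS internal_objects_kb,  -- worktables, spools, sorts
       user_objects_alloc_page_count * 8 AS user_objects_kb           -- #temp tables, table variables
FROM sys.dm_db_session_space_usage
WHERE session_id = @@SPID;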


High Availability To High Protection Without Reconfiguring Mirroring

Apr 23, 2007

Hi,

Is there a way to configure mirroring to go from High Availability to High Protection without having to reconfigure Database Mirroring? Using the interface in Management Studio, I can change the configuration option to High Performance, but not High Protection despite both of them being Synchronous.

If not, what are the recommended steps to reconfigure the mirror once it has already been configured? Is it just like initially setting up the mirror, or are there any shortcuts I could take? If I stop the mirroring and remove the witness, will the High Protection option be available?
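In T-SQL terms, what I am hoping works is simply dropping the witness, since High Protection appears to be full safety without a witness; a sketch, with MyDb as a placeholder:

-- run on the principal
ALTER DATABASE MyDb SET WITNESS OFF;          -- High Availability -> High Protection
ALTER DATABASE MyDb SET PARTNER SAFETY FULL;  -- already FULL in High Availability; shown for completeness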

Thanks,
J.


High Safety Changed To High Performance After Failover?

Mar 6, 2008

Hi There

I realise this is a stupid question, but I cannot really find any confirmation of this in BOL.


If you are running High Safety with automatic failover, when failover occurs does this automatically change to High Performance mode? Since for failover to occur something has happened to the primary, it will be impossible to commit transactions on the new primary and the mirror synchronously, since one of them is no longer available.

So am I correct in assuming that automatic failover also automatically changes the mode to High Performance for that session?

Thanx


Transform Large Volume Of Data. Should It Be Processed Chunk By Chunk?

Dec 16, 2007



Hi,
I have to transform about 60 million rows of data, and it runs so slowly that it never finishes in my testing. Should I process it chunk by chunk? Or is there any other technique I can use (I am using a data flow task)? Thanks for the advice.
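In case the transform can be pushed down to T-SQL, this is the chunking shape I have in mind; a sketch with hypothetical names, written against SQL 2005 syntax:

DECLARE @lastId int, @batch int;
SET @lastId = 0;
SET @batch = 100000;

WHILE 1 = 1
BEGIN
    -- copy the next slice of rows in key order
    INSERT INTO dbo.Target (ID, Payload)
    SELECT TOP (@batch) s.ID, s.Payload
    FROM dbo.Source AS s
    WHERE s.ID > @lastId
    ORDER BY s.ID;

    IF @@ROWCOUNT = 0 BREAK;

    SELECT @lastId = MAX(ID) FROM dbo.Target;  -- assumes Target is filled only by this run
END

Inside the data flow itself, I understand DefaultBufferMaxRows and DefaultBufferSize on the Data Flow task govern how much is held in memory at once.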



Looking For A Good Data Migration Solution

Oct 24, 2006

Hi,

I am trying to create a process that will take data source that has been output from a proprietary ISAM database to import into a SQL database. This particular ISAM system cannot be accessed via OleDB or ODBC. The thing is I want the process to be able to create the required tables based on data structure information that has been somehow encoded into the data source.

Currently the solution I am going with is to spit out a CSV file that has a header with table and data format information, followed by rows of actual data, that gets parsed by a SQL script. However, I am sure that Microsoft must have some kind of preferred solution to this kind of problem, but I have not been able to find it. I have looked at the tools that are available when creating a SQL Server DTS package, as well as what seems to be available using the new Integration Services, but nothing seems to be any better than the solution I just mentioned.

Anyone have any ideas? I am willing to bet there is a much better way of doing this.


Working With Data In A Mobile Solution

Mar 19, 2006

I'm doing a project using VS2005 and SQL Server 2005. It's a mobile project, and this is my first time working with SQL and Visual Basic. I need to know how to add data to my database from the Windows form. I have, however, been able to use the TableAdapters and some SQL statements to read data from the database in my forms. Can anyone here help me?


Solution To Store Millions Of Records In A Table

Jun 27, 2007

Dear all,

I need to design a database table which will store suppliers' demand information. One supplier will probably have 10,000 records, and there is a possibility that there are 10,000 suppliers. So, in total, the number of records will be 10,000 * 10,000 = 100,000,000 (a lot!), which is a very large number of records to be inserted into a table. So, how can I design a table and structure to cater for this scenario? Thanks.
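What I have sketched so far, with hypothetical columns, clusters on SupplierID so each supplier's ~10,000 rows stay physically together and per-supplier queries touch only a few pages:

CREATE TABLE dbo.SupplierDemand (
    SupplierID int            NOT NULL,
    DemandDate datetime       NOT NULL,
    ItemID     int            NOT NULL,
    Quantity   decimal(18, 4) NOT NULL,
    CONSTRAINT PK_SupplierDemand
        PRIMARY KEY CLUSTERED (SupplierID, DemandDate, ItemID)
);

On Enterprise Edition, I gather table partitioning by SupplierID range would be the next step up.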

Hope to hear from you..


More On Mysterious Data Loss In Mssql2k - A Possible Solution?

Jul 23, 2005

Hi All,

Further to my previous long-winded question about a situation in which we appear to be mysteriously losing data from our MSSQL2K server.

We discovered an update statement, in the stored procedure we believe is at fault, after which no error check was being performed.

Under certain conditions, this update is fired against the same record in the same table as the immediately preceding update statement within the transaction. We are now suspecting that under some circumstances, these two updates get into a locking conflict that is eventually forcing the transaction to be rolled back.

However, I'm still left with three questions.

1) Where an update in a transaction gets locked, and an error isn't tested immediately afterwards (ie no 'IF @@Error<>0' test is made), would the transaction proceed as normal?

2) Most critically, would statements in the stored procedure that appear after the COMMIT TRAN statement also be executed, even if an unresolved lock existed within the transaction?

3) Assuming that (2) does happen, would a SELECT made on another connection with a 'WITH(NOLOCK)' locking hint be able to see the changes made in the locked transaction even if the server is set to READ COMMITTED, and the SELECT takes place some time after the COMMIT TRAN is issued? More to the point, given (2), how long would the locked transaction survive before being rolled back after the COMMIT TRAN has been issued? Is it possible that the COMMIT TRAN takes place, the transaction is flagged for potential rollback while a lock resolution is attempted, the stored procedure exits as though everything was fine, a subsequent SELECT (ie performed as one of the next operations in the same application) using WITH(NOLOCK) 'sees' the changes made by the transaction, reinforcing the impression that the transaction succeeded, and then at some point thereafter the lock is determined to be unresolvable and the transaction is rolled back, making it seem as though the data disappeared, even though it had been SELECTable via a different connection to the server?

Thanks, by the way, to Simon and Erland for your advice on my previous questions about this problem.

Much warmth,
M Wells


Slow Running Query With High LOB Logical Reads, LOB Data

Jun 1, 2007

We had some slowdown complaints lately, and this query seems to be the culprit almost every single time. The estimated execution plan is a clustered index seek, as there is a clustered index on the uidcustomerid column. Setting profile statistics on shows that every time it executes it does an index seek.

A profiler session showed a huge number of reads for these queries depending on the value being looked up: 1,500 through 50,000. I set profile IO on, and the culprit is LOB logical reads; everything else is 0 or very low. In this case LOB logical reads is over 1,700.

3 of the columns in the select statement are text columns. When I take them out of the query, the LOB logical reads drop to 0 and go up incrementally as I add each column back in.

Is there any way to improve the performance without changing the data types to varchar(max)?


select SID,Last_name,Name_2,First_name,Middle_initial,Descriptives,Telephone_number,mainline,Residence,ADL,
DID_number,Svce_street,Svce_town,Svce_state,Svce_appt,Mailing_street,Mailing_town,Mailing_state,Mailing_appt,
Mailing_zip,Listing,Addl_listing,Published,Listed,Gold_number,PIN,status,SSnumber,tax_jurisdiction,
Bill_date,Past_balance,Service_start_date,Service_end_date,LOA,FCC_type,Line_type,I_W,Jacks,Voice_messaging,
vms_ring_cycles,CCS,phonesmarts,ringmate,voice_dialing,Bill_detail,Contact_Number,Contact_extension,
Best_Time,suspend,suspend_start,suspend_end,credits_allowed,credits_granted,home_region,Calling_Plan,Local_Plan,
Local_Plan_Rate,Flat_Rate,Sales_agent,Community,Building_Mgmt,How_Heard,Incentive_1,Incentive_1a,Incentive_1b,
Incentive_1c,Incentive_2,Incentive_2a,Incentive_2b,Incentive_2c,Incentive_3,Incentive_3a,Incentive_3b,
Incentive_3c,block_operator,block_collect,block_group,block_adult,block_call_return,block_repeat_dialing,
block_call_trace,block_caller_id,block_anonymous,block_all_high_toll,block_regional_and_ld,block_DA_Call_Completion,
block_DA,block_3rd_party,bank,prepayment,dial_around_number,custid,waive_interest,Financial_Treatment,
Other_Feature_1_code,Other_Feature_1_rate,Other_Feature_2_code,Other_Feature_2_rate,Other_Feature_3_code,
Other_Feature_3_rate,Other_Feature_4_code,Other_Feature_4_rate,Partial_Account,mail_date,snp_1_date,snp_2_date,
terminate_date,snp1notified,snp1peak,snp1offpeak,snp2notified,snp2peak,snp2offpeak,avg_days_paid,Pulled_Ld,SNP1,
SNP2,Treatment,Collections,Installment,Nynex_BTN,LD_rate,local_discount,to_month,rounds_up,full_package_made,
local_made,PIC,LPIC,tax_exempt_local,tax_exempt_federal,CommissionedAgent,LDRateID,UidCustomerId,
accVchLineClassUSOC,block_Inter_Reg_LD,block_international,block_DA_3rd_Collect,block_DH2,block_ISP_2,block_ISP_3,
block_ISP4_3_GBAS,block_ISP3_3_GBAS,block_collect_only,block_LD_Reg_DA,block_usage_based,block_ISP5_3_GBAS,
block_ISP5_2_GBAS,block_group_adult,csr_PIC,csr_LPIC,csr_SA,csr_exception,cutover_status,cutover_datetime,
OutsideAgent,prfVchAttributes,uidResellerID,Category,uidDealID
from profiles where UidCustomerID in (352199267)
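One option I am considering, while keeping the text datatypes, is the 'text in row' table option, which stores small text values on the data page and should cut LOB logical reads for short values; a sketch, where the 1000-byte limit is illustrative:

EXEC sp_tableoption 'dbo.profiles', 'text in row', '1000';

The other angle is dropping the three text columns from any call site that does not actually need them, since the LOB fetch is avoided entirely when they are not selected.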


'Error Converting Data Type Varchar To Int' : Solution

Jul 3, 2007

Here is the solution to this error message if anyone gets one in the future... It took me about two days to figure out.

This error occurs if you save your stored procedure before executing it. Always execute first before you save. If you save before executing, the server will store its own defaults, usually integers.

This error also occurs if your datatypes do not match when passing values from code into the variables of a stored procedure.

It also occurs if the datatypes in your SQL file do not match those of the original stored procedure.



You can check to see if the file saved in the "Projects" folder matches the original by doing the following:

Expand Database, expand Programmability, expand Stored Procedures, expand (your stored procedure), expand Parameters. Read the datatypes that are revealed in the tree.

Then go to File > Open > Projects > (saved SQL file). Click Open.

View the datatypes in the file. If the datatypes in the file do not match the datatypes in the tree, one solution is to delete both the file and the stored procedure.

Then create a new stored procedure by right-clicking the Stored Procedures node. Rewrite the stored procedure, execute it, and then save. Everything should be okay.
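As a quicker cross-check, assuming SQL Server 2005 or later and with dbo.MyProc as a placeholder, the parameter datatypes can also be read straight from the catalog instead of expanding the tree by hand:

SELECT p.name,
       TYPE_NAME(p.user_type_id) AS data_type,
       p.max_length
FROM sys.parameters AS p
WHERE p.object_id = OBJECT_ID('dbo.MyProc')
ORDER BY p.parameter_id;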


SQL 2012 :: TEMPDB High Avg Write Wait Time On Data Files

Apr 15, 2014

I am currently investigating a high avg write time (ms) issue (145 ms) which seems to be occurring only on the tempdb data files. I have followed the recommended setup of TEMPDB in that:

1. Data files = number of physical cores
2. Data files and logfiles are on separate partitions away from the other databases.
3. Tempdb is presized and no incremental file increases look like they are happening with frequency.

We have SharePoint 2012 set up on other SQL servers with TEMPDB configured following the same guidelines, with far more SharePoint activity on similarly specified hardware, which is why it's confusing. File I/O auditing on the partitions themselves shows that the file I/O is very fast on the partitions holding the tempdb data files, which leads me to believe that SharePoint may be the culprit, perhaps due to excess use of tempdb with operations taking a long time to resolve.
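For reference, this is how I have been reading the per-file write stalls; a sketch against the per-file I/O DMV:

SELECT mf.name,
       mf.physical_name,
       vfs.num_of_writes,
       vfs.io_stall_write_ms / NULLIF(vfs.num_of_writes, 0) AS avg_write_stall_ms
FROM sys.dm_io_virtual_file_stats(DB_ID('tempdb'), NULL) AS vfs
JOIN sys.master_files AS mf
  ON mf.database_id = vfs.database_id
 AND mf.file_id = vfs.file_id;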


What Is The Optimal Solution To Find If All The Data Is Entered For A Predefined Condition?

Dec 10, 2006

Hi,

I want to know the optimal solution to find if all the data was entered. Let's say Table A has a date field, and for a given month I need to verify that all the days in the given month are present in Table A. Right now I have two different solutions:

1) A stored procedure which loops through all the days in the given month against a select statement on Table A.

2) A stored procedure which creates a temp table containing all the dates in the given month, and a single select statement using a where condition (select * from ... where datefield not in (select * from ...)).

I want to know which is the best solution of these two, or any other solution.

Thanks
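For reference, a set-based variant of solution 2 that avoids both the loop and the temp table, generating the month's dates with a recursive CTE and anti-joining to Table A; a sketch assuming SQL 2005+, with table and column names illustrative:

DECLARE @monthStart datetime, @monthEnd datetime;
SET @monthStart = '20061201';
SET @monthEnd   = '20061231';

WITH AllDays AS (
    SELECT @monthStart AS d
    UNION ALL
    SELECT DATEADD(day, 1, d) FROM AllDays WHERE d < @monthEnd
)
SELECT d AS missing_date
FROM AllDays
WHERE NOT EXISTS (SELECT 1 FROM dbo.TableA AS a WHERE a.DateField = AllDays.d)
OPTION (MAXRECURSION 31);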


System.Data.OleDb.OleDbException: Timeout Expired Solution

May 4, 2007

Dear All,
I have a page on the web site that will upload a CSV file. Then, it will read the file and insert the data into the SQL database (around 14,000 records).
I'm not sure why, but it seems like whenever we do the upload/insertion for the first time, it always returns this timeout error:
System.Web.HttpUnhandledException: Exception of type System.Web.HttpUnhandledException was thrown. ---> System.Data.OleDb.OleDbException: Timeout expired
After I get this error and do the upload again, the upload works fine.
So I wonder: when I get the timeout error, does SQL Server clean up its used resources or something, so my function works the next time?
Or any recommendation on this issue?
Thank you very much.

Using Data Source View In Import Export Package Within Same Solution Is Not Possible ?

Dec 15, 2005

Hello,

I want to download data from an AS/400 to a SQL 2005 database.

Within SQL Server 2000 it was possible to do more than one transformation with just one source connection and one destination.

In SSIS I have created:

- a new solution

- a data source

- a data source view (I made named queries to format the data as I want it to be in SQL)

Although it is possible to view the named query data, it does not seem to be possible to import the data in the data source view into my SQL 2005 database. (For each data flow I have to define each query again.)

I have added the data source in the connection manager.

Also, I am looking to import more than one file in one package!

I could use a DTS 2000 package, but why then use SQL Server 2005?

Hope somebody can help me.

Regards,

Ronny


SOLUTION! - VB.NET 2005 Express And SQL Server 2005 - NOT Saving Updates To DB - SOLUTION!

Nov 30, 2006

VB.NET 2005 Express and SQL Server 2005 Express - NOT saving updates to DB - SOLUTION!

-----------------------------------

The following article is bogus and confusing:

How to: Manage Local Data Files - Setting 'Copy to Output Directory' to 'Do not copy'
http://msdn2.microsoft.com/en-us/library/ms246989.aspx

You must manually copy the database file to the output directory
AFTER setting 'Copy to Output Directory' to 'Do not copy'.

Do not copy: The file is never copied or overwritten by the project system. Because your application creates a dynamic connection string that points to the database file in the output directory, this setting only works for local database files when you manually copy the file yourself.

-----------------------------------

The above article is bogus and confusing.

This is ridiculous!

This is the most vague and convoluted bunch of nonsense I've ever come across!

Getting caught out on this issue for the 10th time!
And not being able to find an exact step-by-step solution.

--------------------------

I've tried it and it doesn't work for me.

Please don't try what the article alludes to, as I'm still sorting out exactly what is supposed to be happening.

If you have a step-by-step procedure that reproduces this properly, please PM me.

I would like to test its validity and then update this exact post as a solution rather than just another discussion thread.

Many thanks.



This is the exact procedure I have come up with:

NOTE 1: DO NOT allow VB.NET to copy the database into its folders/directories.

NOTE 2: DO NOT hand copy the database to a folder/directory in your project.

Yes, I know it's hard not to do it because you want your project nice and tidy.
I just simply could NOT get it to work.
You should NOT have myData.mdf listed in the Solution Explorer. Ever.

Create a folder for your data following NOTE 2.

Copy your data to that folder. * Mine was C:\mydata\myData.mdf



Create a NEW project.

Remove any Data Connections. ( no matter what)

Save it.

Data | View Data Sources

Add New Data Source

Select NEW CONNECTION (no matter what, do it!)

Select the database. * Again, mine was C:\mydata\myData.mdf

Answer NO to the question:
Would you like to copy the file to your project and modify the connection?
- NO ( no matter what - ANSWER NO ! - Absolutely NO )
Then select the tables you want in the DataSet.
and Finish.



To Test ----------

From the Solution Explorer | click the table name drop down arrow | select details
Now Drag the table name onto the form.

The form is then populated with a Navigation control
and matching Labels with corresponding Textboxes for each field in the table.

Save it.

1) Run the app.

Add one database record to the database by pressing the Add(+) icon

Just add some quick junk data that you don't mind getting lost if it doesn't save.

YOU MUST CLICK THE SAVE ICON to save the data you just entered.

Now exit the application.

2) Run the app again.

And verify there is one record already there.

Now add a second database record to the database by pressing the Add (+) icon.

NOW add some quick junk data that you WILL intentionally loose.

*** DO NOT *** press the save icon.

Just Exit the app.

3) Again, Run the app.

Verify that the first record is still there.

Verify that the Second record is NOT there.
It's NOT there because you didn't save the data before exiting the app.

Proving that YOU MUST CLICK THE SAVE ICON to save the data you just entered.

Also proving you must add your own code to catch the changes
and ask the user to save the data before exiting or moving to another record.

As a side note, since VB.NET uses detached datasets
(a copy/snapshot of the dataset in memory, NOT directly linked to the database),
the dataset will reflect all changes made when moving around the detached datasets.
YOU MUST REMEMBER TO SUBMIT YOUR CHANGES TO THE DATABASE TO SAVE THEM.
Otherwise, they will simply be discarded without notice.

Whewh!

I hope this saves me some time the next time I want to start a new database project.

Oh, and uh, for anyone else reading this post.

Thanks,
Barry G. Sumpter

Currently working with:
Visual Basic 2005 Express
SQL Server 2005 Express

Developing Windows Forms with
101 Samples for Visual Basic 2005
using the DataGridView thru code
and every development wizard I can find within vb.net
unless otherwise individually stated within a thread.


Data Flow Contains No Components After Package Save Operation And Reopening Solution

Jul 25, 2007

I have a Data Flow task that contains 50 components.

My computer configuration: 1 GB RAM, Microsoft Windows Server 2003.

Periodically, when I try to save the package after making some changes, an "Out of memory..." exception message box appears, and soon after this a "Not fatal error occurs..." message box shows. If I close the solution and open it again, all my 50 components disappear; instead I see an empty list, and all my work is lost.

Such "not fatal errors" make a hell out of the job; every time I need to change the package I must archive a copy of it first!


Errors In The High-level Relational Engine. A Connection Could Not Be Made To The Data Source With The DataSourceID Of '

Feb 7, 2006

When I deploy the cube which is sitting on my PC (local) the following 4 errors come up:

Error 1 The datasource, 'AdventureWorksDW', contains an ImpersonationMode that is not supported for processing operations. 0 0
Error 2 Errors in the high-level relational engine. A connection could not be made to the data source with the DataSourceID of 'Adventure Works DW', Name of 'AdventureWorksDW'. 0 0
Error 3 Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Customer', Name of 'Customer' was being processed. 0 0
Error 4 Errors in the OLAP storage engine: An error occurred while the 'Customer Alternate Key' attribute of the 'Customer' dimension from the 'Analysis Services Tutorial' database was being processed. 0 0


Any Solution? Cannot Initialize The Data Source Object Of OLE DB Provider Microsoft.jet.oledb.4.0 For Linked Server (null).

Jul 22, 2007

This is a problem that never gets solved; sometimes I can use another way to avoid it, but I haven't found a solution yet. I hope I can get some more ideas here.

I am using SQL 2005. When I run



select * into #import1
from OpenRowSet('microsoft.jet.oledb.4.0','Excel 8.0;hdr=yes;database=\ws8webjeff2.xls',
'select * from [jeff2$]')



I get

Cannot initialize the data source object of OLE DB provider "microsoft.jet.oledb.4.0" for linked server "(null)".



When I try to compile a stored procedure with that statement in it, I get the same error, like:



create procedure test
as
begin

    select * into #import1
    from OpenRowSet('microsoft.jet.oledb.4.0','Excel 8.0;hdr=yes;database=\ws8webjeff2.xls',
        'select * from [jeff2$]')

end



so it seems the error may not relate to the real file, since at the compile stage, it should not check the real file?



On my live DB, after I restart the SQL service, the statement will work; after a while (one or several days) I get the same error again. I cannot restart my live DB very often, for sure, so now I have another backup DB server; I run the statement on the backup server and then read the data from there.



I have the same problem at two places, both use SQL 2005.



So far there are three questions:

1. Why does it work after a restart, but only for a while? Something about memory? The backup DB seldom needs a restart and works fine after many days.

2. Why does it give the error at the compile stage?

3. Why do two DBs in different environments have the same problem?



The most common answer I have gathered so far is a permission issue. True, I get a similar error if the import file is located in a place which SQL has no right to access, but in this case that should not be the problem.



Any other ideas or suggestions?



thanks







