Import Via DTS: DeDup In Process?

Jan 29, 2002

Which method in MSSQL 7.0 would best suit de-duping on import - basically leaving the dupes behind in the import file?

I have two processes. In the first process, I'm importing from two different tables, so any potential dupes would have their unique RecIDs given from either table. I can then de-dupe on the unique UCASE(entry) + RecID combo.

This works fine. In the second process, however, the import file has only one source, and therefore I could have real dupes. Currently I've only used a T-SQL cursor process to copy all the data into a temp table, delete the data in the live table, then use another cursor in the same process to copy only one instance of each (account_number + RecID) back into the live table.

This too works, but I'd like to make a DTS package that can do this on import in as few steps as possible. I'm thinking of using one connection and a proc(?)

TIA

JeffP....
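One way to keep it to a single package step (a sketch only, not tested against your schema - Staging, Live, account_number, RecID, and other_col are placeholder names): let the DTS transform land the file in a staging table, then run one Execute SQL task that copies a single instance of each key into the live table.

```sql
-- Hypothetical sketch: collapse duplicates while copying from staging to live.
-- GROUP BY keeps exactly one row per (account_number, RecID) combination;
-- MIN() picks an arbitrary-but-deterministic value for the remaining columns.
INSERT INTO Live (account_number, RecID, other_col)
SELECT account_number, RecID, MIN(other_col)
FROM Staging
GROUP BY account_number, RecID
```

This replaces both cursors with one statement; the dupes simply never leave the staging table.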

View 1 Replies



Max Date And Dedup

Feb 12, 2002

Hi
I have a table PRODATE with 3 fields:
ID EMAIL DATE
1 aaa@a.com 08/15/2000
2 bbb@b.com 11/20/2000
3 ccc@c.com 02/04/1999
4 bbb@b.com 05/04/1998
5 aaa@a.com 11/26/2001
6 aaa@a.com 06/08/1999

What I need returned is one record per EMAIL only (dedup), based on the most recent (MAX) date:
ID EMAIL DATE
5 aaa@a.com 11/26/2001
2 bbb@b.com 11/20/2000
3 ccc@c.com 02/04/1999

I've tried several MAX and INNER JOIN statements with no luck.

Thanks
Nathan
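One pattern that usually works here (assuming no email has two rows sharing the same MAX date) is an inner join back to a grouped derived table:

```sql
-- One row per EMAIL, picking the row carrying that email's latest DATE.
SELECT p.ID, p.EMAIL, p.[DATE]
FROM PRODATE p
INNER JOIN
    (SELECT EMAIL, MAX([DATE]) AS MaxDate
     FROM PRODATE
     GROUP BY EMAIL) m
    ON  p.EMAIL = m.EMAIL
    AND p.[DATE] = m.MaxDate
```

If ties on the date are possible, a further MAX(ID) pass over this result would break them.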

View 1 Replies View Related

Optimize Dedup TSQL?

May 30, 2004

The following is fairly basic cursor-based merge code. Order by UserName and DomainID; INSERT a single merged record for every distinct combination.

I had originally written this as a single INSERT statement with subqueries. Avoiding cursors is supposed to be faster, but comparing the execution plans for both strategies suggested the cursor-based code was MUCH faster, so this is what I'm going to try to use.

If anyone can think of any optimizations to this I would be very grateful. Thanks!


DECLARE @emailUser VARCHAR(50)
DECLARE @domainID INT
DECLARE @firstName VARCHAR(24)
DECLARE @lastName VARCHAR(24)
DECLARE @streetAddress VARCHAR(32)
DECLARE @city VARCHAR(24)
DECLARE @state VARCHAR(24)
DECLARE @postal VARCHAR(10)
DECLARE @sourceID INT
DECLARE @numNameNULLs INT
DECLARE @numAddressNULLs INT

DECLARE @rowEmailUser VARCHAR(50)
DECLARE @rowDomainID INT
DECLARE @rowFirstName VARCHAR(24)
DECLARE @rowLastName VARCHAR(24)
DECLARE @rowStreetAddress VARCHAR(32)
DECLARE @rowCity VARCHAR(24)
DECLARE @rowState VARCHAR(24)
DECLARE @rowPostal VARCHAR(10)
DECLARE @rowSourceID INT
DECLARE @rowNumNameNULLs INT
DECLARE @rowNumAddressNULLs INT

DECLARE stagingCursor CURSOR FOR
SELECT
UserName, DomainID, First, Last, StreetAddress, City, State, Postal, SourceID
, (CASE WHEN Stages.[First] IS NULL THEN 1 ELSE 0 END
+ CASE WHEN Stages.[Last] IS NULL THEN 1 ELSE 0 END) AS NumNameNULLs
,(CASE WHEN Stages.[StreetAddress] IS NULL THEN 1 ELSE 0 END
+ CASE WHEN Stages.[City] IS NULL THEN 1 ELSE 0 END
+ CASE WHEN Stages.[State] IS NULL THEN 1 ELSE 0 END
+ CASE WHEN Stages.[Postal] IS NULL THEN 1 ELSE 0 END) AS NumAddressNULLs
FROM Stages
WHERE NOT EXISTS
(SELECT UserName FROM Recipients
WHERE Stages.UserName = Recipients.UserName
AND Stages.DomainID = Recipients.DomainID)
ORDER BY UserName, DomainID

OPEN stagingCursor
FETCH NEXT FROM stagingCursor INTO
@rowEmailUser, @rowDomainID, @rowFirstName, @rowLastName, @rowStreetAddress, @rowCity, @rowState, @rowPostal, @rowSourceID, @rowNumNameNULLs, @rowNumAddressNULLs
WHILE @@FETCH_STATUS = 0
BEGIN
    IF (@emailUser = @rowEmailUser AND @domainID = @rowDomainID)
    BEGIN
        -- Merge a consecutive row for the current UserName/DomainID combination
        IF (@rowNumNameNULLs < @numNameNULLs)
        BEGIN
            SET @numNameNULLs = @rowNumNameNULLs
            SET @firstName = @rowFirstName
            SET @lastName = @rowLastName
        END
        IF (@rowNumAddressNULLs < @numAddressNULLs)
        BEGIN
            SET @streetAddress = @rowStreetAddress
            SET @city = @rowCity
            SET @state = @rowState
            SET @postal = @rowPostal
            SET @numAddressNULLs = @rowNumAddressNULLs
        END
    END
    ELSE
    BEGIN
        IF (@emailUser IS NOT NULL)
        BEGIN
            -- Finished iterating 1+ records of a UserName/DomainID combination. INSERT the merged record.
            INSERT INTO Recipients(UserName, DomainID, First, Last, StreetAddress, City, State, Postal, SourceID)
            VALUES (@emailUser, @domainID, @firstName, @lastName, @streetAddress, @city, @state, @postal, @sourceID)
        END

        -- Reached a new UserName/DomainID combination. Set current data.
        SET @emailUser = @rowEmailUser
        SET @domainID = @rowDomainID
        SET @firstName = @rowFirstName
        SET @lastName = @rowLastName
        SET @streetAddress = @rowStreetAddress
        SET @city = @rowCity
        SET @state = @rowState
        SET @postal = @rowPostal
        SET @sourceID = @rowSourceID
        SET @numNameNULLs = @rowNumNameNULLs
        SET @numAddressNULLs = @rowNumAddressNULLs
    END

    FETCH NEXT FROM stagingCursor INTO
        @rowEmailUser, @rowDomainID, @rowFirstName, @rowLastName, @rowStreetAddress,
        @rowCity, @rowState, @rowPostal, @rowSourceID, @rowNumNameNULLs, @rowNumAddressNULLs
END
CLOSE stagingCursor
DEALLOCATE stagingCursor

IF (@emailUser IS NOT NULL) BEGIN
-- Finished iterating 1+ records of a UserName/DomainID combination. INSERT the merged record.
INSERT INTO Recipients(UserName, DomainID, First, Last, StreetAddress, City, State, Postal, SourceID)
VALUES (@emailUser, @domainID, @firstName, @lastName, @streetAddress, @city, @state, @postal, @sourceID)
END


A coworker has a C++ solution that uses fast insert operations (directly through the ADODB API, like bcp or BULK INSERT). He claims to get twice the performance of the above T-SQL, although I can't use his application due to bizarre errors. The above T-SQL is obviously using plain old INSERT statements, although I hope the looping and everything would be faster since it's 100% native database code. Is there any way I can get FAST/BULK INSERT-like performance through the above T-SQL?
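For what it's worth, a set-based rewrite is possible even without ROW_NUMBER(), though it picks whole rows rather than merging name and address fields from different rows the way the loop above does. A sketch, assuming a unique key column on Stages (here a hypothetical StageID identity):

```sql
-- Per UserName/DomainID, keep the row with the fewest NULL name fields,
-- ties broken by StageID. NumAddressNULLs could be added to the ORDER BY.
INSERT INTO Recipients (UserName, DomainID, First, Last, StreetAddress,
                        City, State, Postal, SourceID)
SELECT s.UserName, s.DomainID, s.[First], s.[Last], s.StreetAddress,
       s.City, s.State, s.Postal, s.SourceID
FROM Stages s
WHERE s.StageID =
      (SELECT TOP 1 s2.StageID
       FROM Stages s2
       WHERE s2.UserName = s.UserName
         AND s2.DomainID = s.DomainID
       ORDER BY CASE WHEN s2.[First] IS NULL THEN 1 ELSE 0 END
              + CASE WHEN s2.[Last] IS NULL THEN 1 ELSE 0 END,
                s2.StageID)
  AND NOT EXISTS
      (SELECT 1 FROM Recipients r
       WHERE r.UserName = s.UserName
         AND r.DomainID = s.DomainID)
```

Timing both against a realistic data volume would be the deciding test; estimated plan costs around cursor bodies are often misleading.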

View 9 Replies View Related

How To Automate The Import Process?

Jun 12, 2007

Hopefully, there is a way to do this. I work with two SQL servers. One is our production server the other is our test server. In order to test various things, I often need to copy the source data from one server to the other. Most of our programming is in VBA. It's easy enough to open a recordset and fill it with the data I need from the production server, then upload each record, one at a time, to the test server. The problem is that I am dealing with a massive amount of data and this takes a long time.

I have found that I can use the import task in SQL Server Enterprise and it transfers the data extremely quickly. Is there a way, preferably using VBA, that I could automate this import task process?
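If the package built by the import task is saved on the server, one hedged option is to kick it off from code via the dtsrun command-line utility; the package name here is a placeholder. From T-SQL (and therefore from VBA via a pass-through command) that looks like:

```sql
-- Runs a server-stored DTS package via dtsrun:
-- /S = server, /E = trusted connection, /N = package name.
-- Requires rights to xp_cmdshell; 'MyImportPackage' is hypothetical.
EXEC master..xp_cmdshell 'dtsrun /S ProductionServer /E /N "MyImportPackage"'
```

Alternatively, VBA can drive the DTS COM object model directly (reference "Microsoft DTSPackage Object Library", then Package.LoadFromSQLServer followed by Package.Execute).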

Thanks

View 1 Replies View Related

Import Process Modifes My Float Value

Oct 17, 2007

Hello, I am trying to load the following entry into SQL 2005 via SSIS.

Account Name,Account ID,CCY,TD Cash Balance,SD Cash Balance,FX Rate,TD Cash Balance (Base),SD Cash Balance ,,EUR,8251439.34,8251439.34,1.4166,11689195.26,-11689195.26

My value in SQL appears in the 4th column as 8251439.5. I define the source as having a float field. Any idea why my value gets changed and how to work around it?
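That looks like 4-byte float precision: a REAL holds only about 7 significant digits, and near 8.25 million its representable values are 0.5 apart, so 8251439.34 rounds to 8251439.5. A quick check in T-SQL:

```sql
-- REAL (4-byte) vs FLOAT (8-byte) precision near 8.25 million.
DECLARE @r REAL, @f FLOAT
SET @r = 8251439.34   -- ~7 significant digits: stored as 8251439.5
SET @f = 8251439.34   -- ~15 significant digits: survives intact
SELECT @r AS real_value, @f AS float_value
```

In the SSIS source, defining the column as an 8-byte float (DT_R8) rather than a 4-byte one (DT_R4) - or better, as a decimal type, since these are cash balances - should preserve the value.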

Thanks,
Jam

View 2 Replies View Related

Counting # Of Imported Rows For Each Import Process

Nov 25, 1998

hi, I am importing data daily into many tables, and I want to keep track of the # of rows for each import process. I have already created a trigger as follows:
CREATE TRIGGER tr_bcp_log ON dbo.A
FOR INSERT
AS

declare @name varchar(30),
@row_count int

select @name=name , @row_count= @@rowcount
from inserted

insert into bcp_tracks (name,row_count)
values(@name,@row_count)
GO

The problem is that I am getting a row for each inserted row in table A. For instance, if I have 500 rows in table A, I will get 500 rows in the log table, like this:
table_name,#of rows
A 1
A 1
A 1
etc up to 500 rows for table A

This is not what I want. I want to capture the number of rows for every bcp process, so in the log table I want to see the following:
table_name, #of rows
A 500
B 600
C 450
A 250
etc

Any help ?

thanks

Ali
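If the load arrives as one INSERT statement (or one bcp batch with triggers enabled), the trigger fires once per statement and the inserted pseudo-table holds the whole batch, so a COUNT(*) gives the number wanted here. A sketch for table A:

```sql
-- Fires once per INSERT statement; inserted holds every row of that statement,
-- so one log row records the whole batch size.
CREATE TRIGGER tr_bcp_log ON dbo.A
FOR INSERT
AS
INSERT INTO bcp_tracks (name, row_count)
SELECT 'A', COUNT(*) FROM inserted
GO
```

If 500 log rows still appear with this version, the loader is issuing 500 single-row INSERTs, and no trigger can see that batch as a whole; comparing COUNT(*) on the table before and after each load would be the fallback.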

View 1 Replies View Related

Transaction (Process ID 135) Was Deadlocked On Lock Resources With Another Process And Has Been Chosen As The Deadlock Victim.

Nov 14, 2007



Hi,

I was trying to extract data from the source server using OLEDB Source and SQL Server Destination when i encountered this error:

"Transaction (Process ID 135) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.".

What must be done so that even if the table being queried is locked, I wouldn't experience any deadlock?
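If dirty reads are acceptable for the extract, one common workaround is to read the source without taking shared locks; the table and column names below are placeholders:

```sql
-- NOLOCK (= READUNCOMMITTED) skips shared locks, so the extract can neither
-- block writers nor be chosen as a deadlock victim over those locks.
-- Trade-off: it can read uncommitted rows that are later rolled back.
SELECT col1, col2
FROM dbo.SourceTable WITH (NOLOCK)
```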

cherriesh

View 4 Replies View Related

FCB::Open: Operating System Error 32(The Process Cannot Access The File Because It Is Being Used By Another Process.) Occurred W

Dec 3, 2007

Hello all,
I am running into an interesting scenario on my desktop. I'm running developer edition on Windows XP Professional (9.00.3042.00 SP2 Developer Edition). OS is autopatched via corporate policy and I saw some patches go in last week. This machine is also a hand-me-down so I don't have a clean install of the databases on the machine but I am local admin.

So, starting last week after a forced remote reboot (also a policy) I noticed a few of the databases didn't start back up. I chalked it up to the hard shutdown and went along my merry way. Friday however I know I shut my machine down nicely, and this morning when I booted up, I was in the same state I was last Wednesday. 7 of the 18 databases on my machine came up with

FCB::Open: Operating system error 32(The process cannot access the file because it is being used by another process.) occurred while creating or opening file 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Test.mdf'. Diagnose and correct the operating system error, and retry the operation.
and it also logs
FCB::Open failed: Could not open file C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Test.mdf for file number 1. OS error: 32(The process cannot access the file because it is being used by another process.).

I've caught references to the auto close feature being a possible culprit, no dice as the databases in question are set to False. Recovery mode varies on the databases from Simple to Full. If I cycle the SQL Server service, whatever transient issue it was having with those files is gone.
As much as I'd love to disable the virus scanner, network security would not be amused. The data and log files appear to have the same permissions as unaffected database files. Nothing's set to read only or archive as I've caught on other forums as possible gremlins. I have sufficient disk space and the databases are set for unrestricted growth.

Any thoughts on what I could look at? If everything came up in RECOVERY_PENDING it'd make more sense to me than the hit-or-miss type of thing I'm experiencing now.

View 13 Replies View Related

[Execute Process Task] Error:The Process Exit Code Was -1 While The Expected Was 0.

Mar 11, 2008

Dear list,
I'm designing a package that uses Microsoft's preplog.exe to prepare web log files to be imported into SQL Server.

What I'm trying to do is convert this cmd, which works, into an Execute Process Task:
D:\SSIS Process\Prepweblog\ProcessLoad>preplog ex.log > out.log
The above DOS cmd works 100%.



However when I use the Execute Process Task I get this error
[Execute Process Task] Error: In Executing "D:\SSIS Process\Prepweblog\ProcessLoad\preplog.exe" "" at "D:\SSIS Process\Prepweblog\ProcessLoad", The process exit code was "-1" while the expected was "0".

There are two package variables:
User::gsPreplogInput = ex.log
User::gsPreplogOutput = out.log

Here are the task properties
RequireFullFileName = True
Executable = D:\SSIS Process\Prepweblog\ProcessLoad\preplog.exe
Arguments =
WorkingDirectory = D:\SSIS Process\Prepweblog\ProcessLoad
StandardInputVariable = User::gsPreplogInput
StandardOutputVariable = User::gsPreplogOutput
StandardErrorVariable =
FailTaskIfReturnCodeIsNotSuccessValue = True
SuccessValue = 0
TimeOut = 0
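One thing worth checking: the `> out.log` redirection is a cmd.exe feature, and Execute Process Task launches the exe directly, without a shell. A hedged workaround (paths reconstructed from the post) is to run the command through cmd.exe, or drop the redirection and let StandardOutputVariable capture the output:

```
Executable       = C:\WINDOWS\system32\cmd.exe
Arguments        = /c "preplog.exe ex.log > out.log"
WorkingDirectory = D:\SSIS Process\Prepweblog\ProcessLoad
```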

thanks in advance
Dave

View 1 Replies View Related

Execute Process Task Error - The Process Exit Code Was 1 While The Expected Was 0.

Jan 30, 2007

How do I use the Execute Process Task? I am trying to unzip a file using the freeware PZUnzip.exe. I placed the entire command in a batch file and specified the working directory as the location of the batch file, but the task fails with the error:

SSIS package "IngramWeeklyPOS.dtsx" starting.

Error: 0xC0029151 at Unzip download file, Execute Process Task: In Executing "C:\ETL\POSData\IngramWeekly\Unzip.bat" "" at "C:\ETL\POSData\IngramWeekly", The process exit code was "1" while the expected was "0".

Task failed: Unzip download file

SSIS package "IngramWeeklyPOS.dtsx" finished: Success.

Then I tried to specify the exe directly in the Executable property and the arguments as the location of the zip file and the directory to unzip the files into, but this time it fails with the following message:

SSIS package "IngramWeeklyPOS.dtsx" starting.

Error: 0xC002F304 at Unzip download file, Execute Process Task: An error occurred with the following error message: "%1 is not a valid Win32 application".

Task failed: Unzip download file

SSIS package "IngramWeeklyPOS.dtsx" finished: Success.

The command in the batch file when run from the command line works perfectly and unzips the file, so there is absolutely no problem with the command, I believe it is just the set up of the variables on the execute process task editor under Process. Any input on resolving this will be much appreciated.

Thanks,

Monisha

View 1 Replies View Related

Execute Process Task - Error :The Process Exit Code Was 2 While The Expected Was 0.

Mar 20, 2008



I am designing a utility which will keep two similar databases in sync. In other words, copying the new data from db1 to db2 and updating the old data from db1 to db2.

For this I am making use of the 'Tablediff' utility which when provided with server name, database, table info will generate .sql file which can be used to keep the target table in sync with the source table.

I am using the Execute Process Task and the process parameters I am providing are:



WorkingDirectory : C:\Program Files (x86)\Microsoft SQL Server\90\COM
Executable : C:\SQL_bat_Files\SQL5\TC_CTI\customer.bat

The customer.bat file will have the following code:
tablediff -sourceserver "LV-SQL5" -sourcedatabase "TC_CTI" -sourcetable "CUSTOMER_1" -destinationserver "LV-SQL2" -destinationdatabase "TC_CTI" -destinationtable "CUSTOMER" -f "c:\SQL_bat_Files\sql5\TC_CTI\sql_files\customer1"

the .sql file will be generated at: C:\SQL_bat_Files\sql5\TC_CTI\sql_files\customer1.

The Problem:
The Execute Process Task is working fine, i.e., the tables are being compared correctly and the .SQL file is being generated as desired. But the task as such is reporting failure with the following error:

[Execute Process Task] Error: In Executing "C:\SQL_bat_Files\SQL5\TC_CTI\package_occurrence.bat" "" at "C:\Program Files (x86)\Microsoft SQL Server\90\COM", The process exit code was "2" while the expected was "0".

Some of you may suggest just setting ForceExecutionResult = Success (in fact this is what I am doing now just to get the program working), but this is not what I desire.

Can anyone help ?




View 9 Replies View Related

Integration Services :: Dataload Process - Error Capturing Process

Aug 20, 2014

I'm pulling data from an Oracle db and loading it into MS-SQL 2008. For my data type checks during the data load process, what are the options to ensure that the data being processed wouldn't fail? That is, I'd first verify the data against the target column's type, and if the format is valid load the row into the destination table, else mark it with an error flag and push it into an errors table - all of this at the row level. One way I can think of is to load into a staging table, then get the source and destination table column data types, compare them, and proceed.

Or should I just try loading the data directly and troubleshoot if it fails (which could be a difficult task, as I wouldn't know what caused the error...)?

View 3 Replies View Related

Transaction (Process ID 66) Was Deadlocked On Lock Resources With Another Process.

Feb 14, 2007

Hi Folks,

I am having a table locking issue that I need to start paying attention to, as it's getting more frequent.

The problem is that the data in the tables is live finance data that needs to be changed and viewed almost real time so what I have picked up so far is that using 'table Hints' may not be a good idea.

I have a guy at work telling me that introducing a data access layer is the only way to solve this. I am not convinced, but I haven't enough knowledge to back my own feeling up. (ASP system, not .NET.)

Thanks in advance

View 1 Replies View Related

Transaction (Process ID 65) Was Deadlocked On Lock Resources With Another Process

Jan 6, 2012

We are facing deadlock issue in our web application. The below message is coming:

> Session ID: pwdagc55bdps0q45q0j4ux55
> Location: xxx.xxx.xxx.xxx
> Error in: http://xxx.xxx.xxx.xxx:xxxx/Manhatta...Bar=&Mode=Edit
> Notes:
> Parameters:
> __EVENTTARGET:
> __EVENTARGUMENT:

[code].....

View 2 Replies View Related

ASPNETDB.MDF: The Process Cannot Access The File Because It Is Being Used By Another Process

Feb 17, 2007

Hi,
I'm trying to upload the ASPNETDB.MDF file to a hosting server via FTP, and every time when it is uploaded halfway (40% or 50%)
I get an error message saying:
"550 ASPNETDB.MDF: The process cannot access the file because it is being used by another process"
and then the upload fails.
I'm using SQL Express. Does anybody know what's the cause?
Thanks a lot

View 1 Replies View Related

Cannot Process Request Because The Process (3880) Has Exited.

Nov 15, 2007



Hi. When I try to start a package manually by clicking the Start Debugging button, I get this after a little while:


Cannot process request because the process (3880) has exited. (Microsoft.DataTransformationServices.VsIntegration)

How can I prevent this from happening? This happens every time I want to start the package, and
every time the process ID is different. Here it is 3880.

Darek

View 13 Replies View Related

System Process Or User Process

Dec 20, 2006

select * from sysprocesses
How can I determine whether a process is a system process or a user process?
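One common rule of thumb: in SQL Server 7.0/2000 the first 50 spids are reserved for the server itself, so user connections can be picked out by filtering on the spid:

```sql
-- spids 1-50 are reserved for system processes in SQL Server 7/2000,
-- so anything above 50 is a user connection.
SELECT spid, status, loginame, hostname, cmd
FROM master..sysprocesses
WHERE spid > 50
```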

View 3 Replies View Related

Identify A Process Which Locked Other Process

Oct 11, 2007

Hello,



I have had a full lock on my SQL Server, and I have a few logs to find the origin of the lock.

I know the process at the head of the lock is process 55.



Here are the information I have on this process:
Spid 55 55
ecid 5 5
Ecid 0 0
ObjId 0 1784601646
IndId 0 0
Type DB PAG
Resource 1:1976242
Mode S IS
Status TransID GRANT GRANT
TransID 0 16980
TransUOW 00000000-0000-0000-0000-000000000000
00000000-0000-0000-0000-000000000000


lastwaittype PAGEIOLATCH_SH
CMD AWAITING COMMAND
Physycal id 1059
Login time 2007-07-05 04:29:53.873
nat address DFF06EBF974D
Wait type 0x0046
HostName .
BlkBy .
DBName grpprddb
CPUTime 54331
DiskIO 1059
ProgramName


Would someone know a way to identify the origin of process 55?

I have already tried to execute the following request:
select * from SYSOBJECTS
where id=1784601646

But it returned nothing.
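One likely reason for the empty result: sysobjects is a per-database table, and the lock output above shows DBName grpprddb, so the lookup has to run inside that database:

```sql
-- ObjId values are only meaningful inside the database that owns the lock.
USE grpprddb
SELECT name, type
FROM sysobjects
WHERE id = 1784601646
```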



Regards,

Renaud

View 3 Replies View Related

The Process Cannot Access The File It Is Being Used By Another Process.

Aug 24, 2006

I have a File System Task Copy file operation to copy a file in an SSIS package. The package when scheduled as a job fails with the following error:

The process cannot access the file 'C:\ETL\Consignment\Apple\AppleRawFile.txt' because it is being used by another process.".

However when I right click on the package and execute it manually from the Integration Services it runs successfully without any problem. I am not certain on how to resolve this issue any inputs will be much appreciated.

Thanks,

Monisha

View 19 Replies View Related

The Process Cannot Access The File Because It Is Being Used By Another Process.

Feb 6, 2008


When running two File System Tasks after each other, with the same file, the file is still locked when running the second task. Resulting in an error: 0xC002F304 at Rename file 1, File System Task: An error occurred with the following error message: "The process cannot access the file because it is being used by another process.".


I found a workaround by adding an Execute Process Task before the second File System Task that pings localhost. This results in a 5-second delay, but there must be a better solution. Anyone?

View 9 Replies View Related

The Process Cannot Access The File Because It Is Being Used By Another Process.

Nov 23, 2006

While configuring log shipping using SQL 2000 Ent, the copy process failed with this message :

The process cannot access the file because it is being used by another process.

Does anyone know what causes this?

SAP R/3 is configured with the system, could this be the problem ?

View 3 Replies View Related

A Connection Was Successfully Established With The Server, But Then An Error Occurred During The Login Process. (provider: Shared Memory Provider, Error: 0 - No Process Is On The Other End Of The Pipe.)

Apr 7, 2008

I'm going nuts with the SQL Server notification thing. I have gone through this article, which tells how to set up the user: http://www.codeproject.com/KB/database/SqlDependencyPermissions.aspx. The article shows how to create a new user and set it up for SQL Server notifications. But in my case the user already existed in the database, which is a very common scenario. So I did the following (check the SQL script below), but then I get this error:
"A connection was successfully established with the server, but then an error occurred during the login process. (provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)"
This is my SQL script:
use [master]
GO
-- Ensuring that Service Broker is enabled
ALTER DATABASE [DatabaseName] SET ENABLE_BROKER
GO
-- Switching to our database
use [DatabaseName]
GO
CREATE SCHEMA schemaname AUTHORIZATION username
GO
ALTER USER username WITH DEFAULT_SCHEMA = schemaname
GO
/*
 * Creating two new roles. We're not going to set the necessary permissions
 * on the user-accounts, but we're going to set them on these two new roles.
 * At the end of this script, we're simply going to make our two users
 * members of these roles.
 */
EXEC sp_addrole 'sql_dependency_subscriber'
EXEC sp_addrole 'sql_dependency_starter'
-- Permissions needed for [sql_dependency_starter]
GRANT CREATE PROCEDURE to [sql_dependency_starter]
GRANT CREATE QUEUE to [sql_dependency_starter]
GRANT CREATE SERVICE to [sql_dependency_starter]
GRANT REFERENCES on CONTRACT::[http://schemas.microsoft.com/SQL/Notifications/PostQueryNotification] to [sql_dependency_starter]
GRANT VIEW DEFINITION TO [sql_dependency_starter]
-- Permissions needed for [sql_dependency_subscriber]
GRANT SELECT to [sql_dependency_subscriber]
GRANT SUBSCRIBE QUERY NOTIFICATIONS TO [sql_dependency_subscriber]
GRANT RECEIVE ON QueryNotificationErrorsQueue TO [sql_dependency_subscriber]
GRANT REFERENCES on CONTRACT::[http://schemas.microsoft.com/SQL/Notifications/PostQueryNotification] to [sql_dependency_subscriber]
-- Making sure that my users are member of the correct role.
EXEC sp_addrolemember 'sql_dependency_starter', 'username'
EXEC sp_addrolemember 'sql_dependency_subscriber', 'username'

View 10 Replies View Related

SQL Server Import And Export Wizard Fails To Import Data From A View To A Table

Feb 25, 2008

A view named "Viw_Labour_Cost_By_Service_Order_No" has been created and can be run successfully on the server.
I want to import the data drawn from the view into a table using the SQL Server Import and Export Wizard.
However, when I run the wizard on the server, it gives me the following error message and stops on the step Setting Source Connection:


Operation stopped...

- Initializing Data Flow Task (Success)

- Initializing Connections (Success)

- Setting SQL Command (Success)
- Setting Source Connection (Error)
Messages
Error 0xc020801c: Source - Viw_Labour_Cost_By_Service_Order_No [1]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "SourceConnectionOLEDB" failed with error code 0xC0014019. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
(SQL Server Import and Export Wizard)

Exception from HRESULT: 0xC020801C (Microsoft.SqlServer.DTSPipelineWrap)


- Setting Destination Connection (Stopped)

- Validating (Stopped)

- Prepare for Execute (Stopped)

- Pre-execute (Stopped)

- Executing (Stopped)

- Copying to [NAV_CSG].[dbo].[Report_Labour_Cost_By_Service_Order_No] (Stopped)

- Post-execute (Stopped)

Has anyone encountered this problem before and knows what is happening?

Thanks for your kind reply.

Best regards,
Calvin Lam

View 6 Replies View Related

Integration Services :: Can't Import Excel 2013 Using SSMS Import Wizard (2008 R2)

Jul 29, 2015

I am trying to import an xlsx spreadsheet into a SQL 2008 R2 database using the SSMS Import Wizard.  When pointed to the spreadsheet ("choose a data source"), the Import Wizard returns this error:

"The operation could not be completed" The Microsoft ACE.OLEDB.12.0 provider is not registered on the local machine (System.Data)

How can I address that issue? (e.g. Where is this provider and how do I install it?)

View 2 Replies View Related

Import Data From MS Access Databases To SQL Server 2000 Using The DTS Import/Export

Oct 16, 2006

I am attempting to import data from Microsoft Access databases to SQL Server 2000 using the DTS Import/Export Wizard. I have a few errors.

Error at Destination for Row number 1. Errors encountered so far in this task: 1.
Insert error column 152 ('ViewMentalTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 150 ('VRptTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 147 ('ViewAppTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 144 ('VPreTime', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Insert error column 15 ('Time', DBTYPE_DBTIMESTAMP), status 6: Data overflow.
Invalid character value for cast specification.
Invalid character value for cast specification.
Invalid character value for cast specification.
Invalid character value for cast specification.
Invalid character value for cast specification.

Could you please look into this and guide me
Thanks in advance
venkatesh
imtesh@gmail.com

View 4 Replies View Related

Error Trying To Import MS Access 2003 Database Via SQL Server Import And Export Wizard - Too Many Sessions Already Active

Nov 29, 2006

I am trying to simplify a query given to me by one of my colleagues, written using the query designer of Access. Looking at the query there seem to be some syntax differences, so to see if this was the case I thought I would import the database into my SQL Server Developer Edition.

I tried to start the wizard from within SQL Server Management Studio Express as shown in one of the articles on MSDN which did not work, but the manual method also suggested did work.

Trouble is that it gets most of the way through the import until it spews forth the following error messages:

- Prepare for Execute (Error)
Messages
Error 0xc0202009: {332B4EB1-AF51-4FFF-A3C9-3AEE594FCB11}: An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft JET Database Engine" Hresult: 0x80004005 Description: "Could not start session. Too many sessions already active.".
(SQL Server Import and Export Wizard)

Error 0xc020801c: Data Flow Task: The AcquireConnection method call to the connection manager "SourceConnectionOLEDB" failed with error code 0xC0202009.
(SQL Server Import and Export Wizard)

Error 0xc004701a: Data Flow Task: component "Source 33 - ATable" (2065) failed the pre-execute phase and returned error code 0xC020801C.
(SQL Server Import and Export Wizard).

There does not seem to be any method of specifying a number of sessions, so I don't see how to get round the problem.

Does anyone know how I can get the import to work?

View 2 Replies View Related

ASP.net, Database File, Process Cannot Access The File Because It Is Being Used By Another Process

Mar 19, 2008

Hi, I am using VWD 2008, SQL Express 2005, Reporting Services, Win-XP, IIS5.

Basically, let's say I have 2 pages:
Page1: has a SqlDataSource control that populates a GridView from a table from a database file myDB.mdf (no code behind)
Page2: has a ReportViewer control that shows a report with data from the same table from myDB.mdf from the report server (no code behind)

I have attached myDB.mdf to SQL Server Express using SQL Server Management Studio Express.

If I first open Page2 to display the ReportViewer, it works OK, also when using the Report Manager. Now this is the problem: if after that I try to open Page1, then I get an error message:

Cannot open user default database. Login failed. Login failed for user 'myServer\ASPNET'. Exception Details: System.Data.SqlClient.SqlException: Cannot open user default database. Login failed. Login failed for user 'myServer\ASPNET'.

Then I have to restart SQL Server to fix it. Now I can open Page1 OK, but if after this I try to open Page2 (ReportViewer) again, then I get this error:

"An error has occurred during report processing. Cannot create a connection to data source 'my_Datasource'."

And this error if I open the report using the Report Manager:

"An error has occurred during report processing. Cannot create a connection to data source 'my_Datasource'. Unable to open the physical file "C:\Inetpub\wwwroot\Website\App_Data\myDB.mdf". Operating system error 32: "32(The process cannot access the file because it is being used by another process.)"."

Now if I check Management Studio Express again, you can see that myDB.mdf was detached. It seems to be there, but it has no tables or definitions, so I have to attach it again.

Do you know how to fix this?
Thanks in advance,
Ed

View 5 Replies View Related

IMPORT New Data Since Last IMPORT - DTS/Stored Procs?

Jan 7, 2004

Hello:

I am not sure how to implement the following, but I believe it entails using DTS, and hopefully it is fine that I post it here b/c ultimately I will need this backend data for my frontend .aspx pages:

On a weekly basis, I need to IMPORT some data located on a remote Oracle DB into SQL Server 2k. Since there is so much data to transfer, I would only like to transfer the data that is new to the table since the last IMPORT, i.e. a week ago, and leave behind the OLD data.

Is DTS the correct way to go, or do I have more control using DTS with STORED PROCEDURES? Does anyone have any good references for me?

On a similar note, once this Oracle data is IMPORTED into a certain table, I would like to EXPORT some of these NEWLY acquired rows matching certain criteria into another table for auditing purposes. For this scenario, should I implement an INSERT trigger on the first table?

Any advice will be greatly appreciated!
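A hedged sketch of the incremental pull, assuming the Oracle table exposes a modification timestamp and is reachable through a linked server (all names here - ImportLog, LocalCopy, ORA_LINK..SRC.SOURCE_TABLE, ModifiedDate - are placeholders):

```sql
-- Pull only rows newer than the previous run, then record this run's watermark.
DECLARE @lastRun DATETIME
SELECT @lastRun = MAX(RunDate) FROM ImportLog

INSERT INTO dbo.LocalCopy (ID, Payload, ModifiedDate)
SELECT ID, Payload, ModifiedDate
FROM ORA_LINK..SRC.SOURCE_TABLE   -- four-part linked-server name
WHERE ModifiedDate > @lastRun

INSERT INTO ImportLog (RunDate) VALUES (GETDATE())
```

For the audit copy, an INSERT trigger on the target table would fire on exactly these newly imported rows and can filter on your criteria before writing to the audit table.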

View 3 Replies View Related

Error Regarding File Import Through Import Wizard

Jan 12, 2006

Hi all,

when trying to import files to our database server from a client, I keep getting an error:

- Validating (Error)
Messages
Error 0xc00470fe: Data Flow Task: The product level is insufficient for component "Source_txt" (1).
 (SQL Server Import and Export Wizard)
 
Error 0xc00470fe: Data Flow Task: The product level is insufficient for component "Data Conversion 1" (175).
 (SQL Server Import and Export Wizard)

... doing the same import when logged on to the server hasn't been giving me any errors - how come? From my client I can import tables from other DB servers without trouble, but whenever it is files it won't do it.

 

I tried as mentioned in other threads rerun setup to re-install SSIS, but as it was already installed it wouldn't re-install. My next move would be to make a clean install, but not sure it would help, as I think this is a buck.

best regards

 

Musa Rusid

View 1 Replies View Related

DTS Import Does Not Import All Rows / Records

Jul 23, 2005

Hi, I am having trouble importing data from an Excel spreadsheet into MS SQL Server 2000 using the DTS Wizard. The DTS import process is successful, no errors, but only 50 rows of approx. 1500 rows of data are imported. I tried to remove 20 rows in the Excel spreadsheet in the interval rows 0-50. When I later ran the import, only 30 rows were imported. I deleted almost every row in the interval 0-50, with the result of the import having 0 rows imported (but the job ran successfully). I decided to delete rows 0-100 in the spreadsheet in order to see if that resolved the problem, but it didn't.

As I suspected something in the Excel file to be the cause, I exported the Excel spreadsheet to a tab-delimited text file, with only one row. A DTS import resulted in importing approx. 100 rows, double the amount of the text file, but the other 1400 rows were not imported. The data in the column contains numeric values only.

Please help me! What could possibly be the cause of DTS skipping rows like that? DTS doesn't feel reliable at all :/

Regards,
Björn

View 3 Replies View Related

Process ID

Mar 21, 2001

Hi All,

I came across a weird problem this morning in SQL Server 7.0 on an NT Cluster Server.

* sp_who returned a list of system process IDs where ID = 6 was shown as a SQL Server user coming from a web server, doing a rollback on a stored procedure.
a) I tried to KILL the user but I couldn't.
b) Next I stopped the IIS service on the web server and tried to KILL the user, but still couldn't. Surprisingly, the user was still showing up as coming from the same web server even though the IIS service was not running.

* When I checked in the Current Status window of Enterprise Manager, the same process ID was shown as a SYSTEM user doing a rollback on a stored procedure.

Because of the above user problem I was not able to do any delete or update in the database. Also, a restore of the database failed because of the above user.

I could get rid of this problem only after STOPPING and STARTING the SQL Server service.

Does anyone have any idea about this type of problem? If so, please let me know why it happened and what needs to be done to keep it from happening again.

Thanks

Sekhar

View 3 Replies View Related

Process Help

Oct 17, 2006

Dear friends,
I'm working on 5 different servers; my work is to write the script and make it run successfully.

I have two databases for myself: one to test the script and the other for the final script. After that I'll implement it on the other servers, and after successful execution the testing team will take care of it. Now my problem is, sometimes I'm not able to find the exact script which ran successfully (on some servers modifications were done, because some dependencies increased).

My question is: how do I maintain the exact script? Maybe this is a silly question, but after one week I'm not able to find the exact script.

Please help me in this regard with your valuable suggestions.

Vinod

View 3 Replies View Related

Row-by-row Process

Jul 20, 2005

Hi, please help. I have 2 tables as follows:

CREATE TABLE [dbo].[Master] (
    [masitemno] [char] (10) NOT NULL ,
    [masqty] [decimal](10, 3) NOT NULL ,
    [masunitcost] [decimal](10, 2) NOT NULL
) ON [PRIMARY]
GO

CREATE TABLE [dbo].[Transaction] (
    [transeqno] [int] NOT NULL ,
    [tranitemno] [char] (10) NOT NULL ,
    [tranqty] [decimal](10, 3) NOT NULL ,
    [tranamount] [decimal](10, 2) NOT NULL ,
    [tranunitcost] [decimal](10, 2) NOT NULL
) ON [PRIMARY]
GO

ALTER TABLE [dbo].[Master] WITH NOCHECK ADD
    CONSTRAINT [PK_Master] PRIMARY KEY NONCLUSTERED ([masitemno]) ON [PRIMARY]
GO

ALTER TABLE [dbo].[Transaction] WITH NOCHECK ADD
    CONSTRAINT [PK_Transaction] PRIMARY KEY NONCLUSTERED ([transeqno]) ON [PRIMARY]
GO

Table "Transaction" has about 1,000,000 (one million) rows and table "Master" has about 500,000 rows.

I have to update the "Master" table from the "Transaction" table on a row-by-row processing basis, sorted by the primary key TRANSEQNO column. Sometimes a transaction can explicitly SET the "MASQTY" and "MASUNITCOST" columns (TRANUNITCOST <> 0) of the Master row linked by item number, and after that the AMOUNT column of the next Transaction row will use this new unit cost from Master, as in the following statements:

-------------------------------------------------
declare @count int, @max int
set @count = 1
set @max = (select max(transeqno) from [Transaction] (nolock))
while @count <= @max
begin
    update [Transaction]
    set tranamount = tranqty * (select masunitcost from Master
                                where masitemno = tranitemno)
    where transeqno = @count
      and tranunitcost = 0

    update Master
    set masqty = masqty + tranqty
    from [Transaction]
    where transeqno = @count
      and tranunitcost = 0
      and masitemno = tranitemno

    update [Transaction]
    set tranamount = tranqty * tranunitcost
    where transeqno = @count
      and tranunitcost <> 0

    update Master
    set masqty = masqty + tranqty,
        masunitcost = tranunitcost
    from [Transaction]
    where transeqno = @count
      and tranunitcost <> 0
      and masitemno = tranitemno

    set @count = @count + 1
end
-------------------------------------------------

The above sample statements take me more than 10 hrs. (I quit before it actually finished) with MS SQL Server 7.5 SP4 on a WIN2K Server (2 Xeon processors, 1 GB mem.). I tried to use a trigger but the result is not correct. Please advise on shortening the running time (to minutes, maybe) and better performance. Thank you and I appreciate any suggestions.

Nipon Wongtrakul
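The loop in the post above touches each transaction four times per iteration; the usual fix is a set-based rewrite in two UPDATEs: first price every transaction with the unit cost in effect at its sequence number (its own nonzero cost, else the latest earlier nonzero cost for the item, else the item's original master cost), then fold the quantity totals and the final unit cost into Master in one pass. SQL Server 7.0 can't be executed here, so this is a minimal sketch of that pricing rule using Python's built-in sqlite3; the sample rows are invented, the table is renamed txn (Transaction is a reserved word), and the translation of the two UPDATEs back to T-SQL is mechanical but untested.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE master (masitemno TEXT PRIMARY KEY, masqty REAL, masunitcost REAL);
CREATE TABLE txn (transeqno INTEGER PRIMARY KEY, tranitemno TEXT,
                  tranqty REAL, tranamount REAL, tranunitcost REAL);

INSERT INTO master VALUES ('A', 10, 2.0);
INSERT INTO txn VALUES (1, 'A', 5, 0, 0),    -- priced at master's 2.0
                       (2, 'A', 3, 0, 4.0),  -- sets a new unit cost
                       (3, 'A', 2, 0, 0);    -- priced at the new 4.0
""")

# Pass 1: price every row with the unit cost in effect at its seqno --
# the row's own nonzero cost, else the latest earlier nonzero cost for
# the same item, else the item's original master cost.
cur.execute("""
UPDATE txn SET tranamount = tranqty * COALESCE(
    NULLIF(tranunitcost, 0),
    (SELECT t2.tranunitcost FROM txn t2
      WHERE t2.tranitemno = txn.tranitemno
        AND t2.transeqno  < txn.transeqno
        AND t2.tranunitcost <> 0
      ORDER BY t2.transeqno DESC LIMIT 1),
    (SELECT m.masunitcost FROM master m WHERE m.masitemno = txn.tranitemno))
""")

# Pass 2: fold the quantity totals and the final unit cost into master.
cur.execute("""
UPDATE master SET
    masqty = masqty + (SELECT IFNULL(SUM(tranqty), 0) FROM txn
                        WHERE tranitemno = masitemno),
    masunitcost = COALESCE(
        (SELECT t.tranunitcost FROM txn t
          WHERE t.tranitemno = masitemno AND t.tranunitcost <> 0
          ORDER BY t.transeqno DESC LIMIT 1),
        masunitcost)
""")
conn.commit()

print(cur.execute("SELECT tranamount FROM txn ORDER BY transeqno").fetchall())
# -> [(10.0,), (12.0,), (8.0,)]
print(cur.execute("SELECT masqty, masunitcost FROM master").fetchall())
# -> [(20.0, 4.0)]
```

For a million-row table the correlated "latest earlier nonzero cost" subquery needs a supporting index, roughly (tranitemno, transeqno) including tranunitcost, or it degenerates into repeated scans.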

View 5 Replies View Related







Copyrights 2005-15 www.BigResource.com, All rights reserved