Transformation Services Question About De-dup Process
Jun 8, 2006
A standard SQL pattern that gets used a lot is a left join.
What I'm trying to do is dedup a table.
Example:
SELECT * FROM table1 AS t1
LEFT JOIN table2 AS t2
    ON  t1.firstname = t2.firstname
    AND t1.lastname = t2.lastname
    AND t1.gender = t2.gender
    AND t1.dob = t2.dob
WHERE t2.lastname IS NULL
What I thought was: within a data flow, I have one OLE DB source whose SQL statement returns the full table, and a second OLE DB source whose SQL statement returns the deduped dups. What I want to do next is remove the dups from the original table and then join the deduped dups back into the original table, now minus the dups.
In the data flow I have:
OLE DB Source: original table
OLE DB Source 1: deduped dups
I sort both OLE DB connections, and on the deduped-dups connection I put a Multicast.
Then I feed both flows into a Merge Join that I have set up as a left outer join: select all columns from the original table, linking to the right table on firstname, lastname, dob and gender. No columns from the right table are selected.
Then I send the Merge Join output into a Conditional Split to separate out the NULL values, choosing the error output and redirecting the data through the red arrow into a Union All with the dups from the other Multicast output. Then into the OLE DB destination tables.
I only get the deduped dups through the process. What is the best way to do this with transformations? Maybe I'm not using the Merge Join or the Conditional Split correctly. Please help.
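For comparison, here is a purely set-based way to do the same dedup, a minimal sketch assuming the table is dbo.table1 and that keeping an arbitrary row from each duplicate group is acceptable (ROW_NUMBER is available from SQL Server 2005 on):

-- Keep one row per (firstname, lastname, gender, dob) group and delete the rest.
WITH Ranked AS
(
    SELECT *,
           ROW_NUMBER() OVER (
               PARTITION BY firstname, lastname, gender, dob
               ORDER BY firstname) AS rn
    FROM dbo.table1
)
DELETE FROM Ranked
WHERE rn > 1;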
If you have two synchronous transformation components and the input of the second is connected to the output of the first, does the first transformation process (loop through) all rows in the buffer before outputting these rows to the second transformation? Or does the first transformation output each individual row to the second transformation as soon as it has finished processing it?
Is there a way, for example, to script a DTS package so that it can be deleted and recreated at a later date if necessary? I have quite a lot of these, but few are used regularly. The msdb database is now up to 80 MB. However, I don't want to delete them and have no way to recreate them. If I took a backup of msdb and then deleted the packages, would restoring msdb at a later date restore the packages?
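For what it's worth, DTS packages saved to SQL Server live in msdb, so a backup of msdb taken before deleting them should carry the packages, and restoring that msdb later should bring them back. A quick way to see exactly what such a backup would cover (a sketch; sysdtspackages is the msdb table DTS stores packages in):

-- List saved DTS packages and their versions as stored in msdb.
SELECT name, id, versionid, createdate
FROM msdb.dbo.sysdtspackages
ORDER BY name, createdate DESC;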
I have a service broker process that's in run-away mode and I cannot figure out how to stop it.
Here's the situation:
The stored procedure named in the activation queue is opening a transaction but not closing it. The receive loop has a timeout of 3 seconds, with MAX_QUEUE_READERS set to 5 so every 3 seconds I'm getting 15 new records in
sys.dm_tran_locks.
The server is at 99% CPU with no users on it.
The problem is I can't stop it. Here's what I have tried:
Reboot the box: clears the lock table, then it runs away again.
ALTER QUEUE <Queue Name> WITH STATUS=OFF: just sits there forever.
Drop the service, drop the queue: just sits there forever.
Dropped the stored procedure itself (!): it's still running like crazy.
So my immediate question is: how do I stop this crazy thing? And the follow-up question is, when a programmer makes a mistake, do we have to reformat our DEV box to recover from it (that's a joke)? Seriously, how do you stop a runaway Service Broker process?
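One sequence that usually breaks this cycle is to roll back the open transactions and stop all broker activity in a single statement, then switch activation off before re-enabling the broker. A minimal sketch, assuming hypothetical names MyDB and dbo.MyQueue:

-- Kill the activation procedures' open transactions and stop broker activity at once.
-- (A plain ALTER QUEUE hangs because it waits on those same open transactions.)
ALTER DATABASE MyDB SET DISABLE_BROKER WITH ROLLBACK IMMEDIATE;

-- With the runaway readers gone, make sure activation cannot restart.
ALTER QUEUE dbo.MyQueue WITH ACTIVATION (STATUS = OFF);

-- Re-enable the broker once the stored procedure is fixed.
ALTER DATABASE MyDB SET ENABLE_BROKER;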
Is there a way for a .NET application to receive a notification when a service broker queue has been updated with a new message? I tried using SqlDependency on an SB queue but I got an "invalid" error in my notification handler.
Such a notification would be much better than having to poll the queue every N seconds.
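One pattern that avoids both polling and SqlDependency is to issue a blocking RECEIVE from the application on a dedicated connection: the statement waits server-side until a message arrives or the timeout elapses, so the app wakes up only when there is work. A sketch, assuming a hypothetical queue dbo.MyQueue:

-- Blocks for up to 60 seconds; returns immediately when a message arrives.
WAITFOR (
    RECEIVE TOP (1)
        conversation_handle,
        message_type_name,
        message_body
    FROM dbo.MyQueue
), TIMEOUT 60000;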
I was trying to extract data from the source server using an OLE DB Source and a SQL Server Destination when I encountered this error:
"Transaction (Process ID 135) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.".
What must be done so that, even if the table being queried is locked, I won't experience a deadlock?
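If the extract can tolerate dirty reads, one option is to let the source query read past the locks instead of waiting on them, which also takes it out of the lock chains that produce deadlocks. A sketch, assuming a hypothetical source table dbo.SourceTable:

-- Dirty reads: rows may be uncommitted or later rolled back, but no shared locks are taken.
SELECT col1, col2
FROM dbo.SourceTable WITH (NOLOCK);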
Hello all, I am running into an interesting scenario on my desktop. I'm running developer edition on Windows XP Professional (9.00.3042.00 SP2 Developer Edition). OS is autopatched via corporate policy and I saw some patches go in last week. This machine is also a hand-me-down so I don't have a clean install of the databases on the machine but I am local admin.
So, starting last week after a forced remote reboot (also a policy), I noticed a few of the databases didn't start back up. I chalked it up to the hard shutdown and went along my merry way. Friday, however, I know I shut my machine down nicely, and this morning when I booted up, I was in the same state I was in last Wednesday. 7 of the 18 databases on my machine came up with:
FCB::Open: Operating system error 32(The process cannot access the file because it is being used by another process.) occurred while creating or opening file 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Test.mdf'. Diagnose and correct the operating system error, and retry the operation. It also logs: FCB::Open failed: Could not open file C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Test.mdf for file number 1. OS error: 32(The process cannot access the file because it is being used by another process.).
I've caught references to the auto-close feature being a possible culprit; no dice, as the databases in question have it set to False. Recovery mode varies on the databases from Simple to Full. If I cycle the SQL Server service, whatever transient issue it was having with those files is gone. As much as I'd love to disable the virus scanner, network security would not be amused. The data and log files appear to have the same permissions as unaffected database files. Nothing's set to read-only or archive, which I've caught on other forums as possible gremlins. I have sufficient disk space, and the databases are set for unrestricted growth.
Any thoughts on what I could look at? If everything came up in RECOVERY_PENDING, it'd make more sense to me than the hit-or-miss behavior I'm experiencing now.
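Once whatever is holding the file handle lets go (the virus scanner is a classic suspect for error 32 at boot), an affected database can be brought back individually instead of cycling the whole service. A sketch, using the Test database named in the error message:

-- Retry the file open for just this database; no service restart required.
ALTER DATABASE [Test] SET OFFLINE WITH ROLLBACK IMMEDIATE;
ALTER DATABASE [Test] SET ONLINE;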
Dear list, I'm designing a package that uses Microsoft's preplog.exe to prepare web log files to be imported into SQL Server.
What I'm trying to do is convert this cmd, which works, into an Execute Process Task:
D:\SSIS Process\Prepweblog\ProcessLoad>preplog ex.log > out.log
The above DOS cmd works 100%.
However, when I use the Execute Process Task I get this error: [Execute Process Task] Error: In Executing "D:\SSIS Process\Prepweblog\ProcessLoad\preplog.exe" "" at "D:\SSIS Process\Prepweblog\ProcessLoad", The process exit code was "-1" while the expected was "0".
There are two package variables: User::gsPreplogInput = ex.log and User::gsPreplogOutput = out.log.
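The likely culprit is the > redirection: it is a cmd.exe feature, and the Execute Process Task launches preplog.exe directly, so the redirection never happens and preplog fails on the unexpected argument. One workaround is to run the command through cmd.exe. A sketch of the task properties, using the paths and file names from above:

Executable:       C:\Windows\System32\cmd.exe
Arguments:        /c "preplog.exe ex.log > out.log"
WorkingDirectory: D:\SSIS Process\Prepweblog\ProcessLoad

The two package variables can then feed the Arguments property through a property expression instead of the hard-coded file names.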
How do I use the Execute Process Task? I am trying to unzip a file using the freeware PZUnzip.exe. I tried to place the entire command in a batch file and specified the working directory as the location of the batch file, but the task fails with the error:
SSIS package "IngramWeeklyPOS.dtsx" starting.
Error: 0xC0029151 at Unzip download file, Execute Process Task: In Executing "C:\ETL\POSData\IngramWeekly\Unzip.bat" "" at "C:\ETL\POSData\IngramWeekly", The process exit code was "1" while the expected was "0".
Then I tried to specify the exe directly in the Executable property, with the arguments being the location of the zip file and the directory to unzip the files into, but this time it fails with the following message:
SSIS package "IngramWeeklyPOS.dtsx" starting.
Error: 0xC002F304 at Unzip download file, Execute Process Task: An error occurred with the following error message: "%1 is not a valid Win32 application".
The command in the batch file, when run from the command line, works perfectly and unzips the file, so there is absolutely no problem with the command. I believe it is just the setup of the variables on the Execute Process Task editor under Process. Any input on resolving this will be much appreciated.
I am designing a utility that will keep two similar databases in sync; in other words, copying the new data from db1 to db2 and updating the old data in db2 from db1.
For this I am making use of the tablediff utility, which, when provided with server, database and table info, will generate a .sql file that can be used to keep the target table in sync with the source table.
I am using the Execute Process Task and the process parameters I am providing are:
The customer.bat file will have the following code: tablediff -sourceserver "LV-SQL5" -sourcedatabase "TC_CTI" -sourcetable "CUSTOMER_1" -destinationserver "LV-SQL2" -destinationdatabase "TC_CTI" -destinationtable "CUSTOMER" -f "c:\SQL_bat_Files\sql5\TC_CTI\sql_files\customer1"
The .sql file will be generated at C:\SQL_bat_Files\sql5\TC_CTI\sql_files\customer1.
The problem: the Execute Process Task is working fine, i.e., the tables are being compared correctly and the .sql file is being generated as desired. But the task as such is reporting failure with the following error:
[Execute Process Task] Error: In Executing "C:\SQL_bat_Files\SQL5\TC_CTI\package_occurrence.bat" "" at "C:\Program Files (x86)\Microsoft SQL Server\90\COM", The process exit code was "2" while the expected was "0".
Some of you may suggest just setting ForceExecutionResult = Success (in fact, this is what I am doing now just to get the program working), but this is not what I want.
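The exit code is the clue here: tablediff signals its outcome through the process exit code, and (per its documentation) 2 means table differences were found, as opposed to 0 for success with no differences and 1 for a critical error. If that holds for this version, the cleaner fix is to tell the task that 2 is the expected result rather than forcing success across the board:

SuccessValue: 2   (on the Execute Process Task, in place of ForceExecutionResult = Success)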
I'm pulling data from an Oracle DB and loading it into MS SQL 2008. For my data type checks during the data load process, what are the options to ensure that the data being processed won't fail? That is, I want to verify the data first-hand against the target data types and, if the format is valid, load the row into the destination table; otherwise, mark it with an error flag and push it into an errors table. All this at the row level. One way I can think of is to load into a staging table, then get the source and destination column data types, compare them, and proceed.
Or should I just try loading the data directly and, if it fails, try troubleshooting (which could be a difficult task, as I wouldn't know what caused the error)?
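The staging route sketched above can stay entirely in T-SQL. A minimal sketch, assuming a varchar-typed staging table stg.Orders with hypothetical columns (an int key and a date), an error_flag column, and matching target and error tables:

-- Flag rows whose text values would not convert to the target types.
-- Note: ISNUMERIC is permissive (it accepts '1e5', '$'); tighten with LIKE checks if needed.
UPDATE s
SET error_flag = 1
FROM stg.Orders AS s
WHERE ISNUMERIC(s.order_id) = 0
   OR ISDATE(s.order_date) = 0;

-- Load the clean rows into the destination table.
INSERT INTO dbo.Orders (order_id, order_date)
SELECT CAST(order_id AS INT), CAST(order_date AS DATETIME)
FROM stg.Orders
WHERE error_flag = 0;

-- Push the rejects, still as raw text, into the errors table.
INSERT INTO dbo.OrderErrors (order_id_raw, order_date_raw)
SELECT order_id, order_date
FROM stg.Orders
WHERE error_flag = 1;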
I am having this table locking issue that I need to start paying attention to, as it's getting more frequent.
The problem is that the data in the tables is live finance data that needs to be changed and viewed almost in real time, so what I have picked up so far is that using table hints may not be a good idea.
I have a guy at work telling me that introducing a data access layer is the only way to solve this. I am not convinced, but I haven't enough knowledge to back my own feeling up (it's an ASP system, not .NET).
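One database-side option worth testing before any architectural change is row versioning: with read committed snapshot isolation, readers see the last committed version of a row instead of blocking behind writers, and no table hints are needed in the queries. A sketch, assuming a hypothetical database name FinanceDb (the switch needs a short exclusive pause):

-- Readers stop blocking writers and vice versa; they read the last committed version instead.
ALTER DATABASE FinanceDb SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE;

The trade-off is version-store activity in tempdb, which is worth measuring under a live finance workload.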
Hi, I'm trying to upload the ASPNETDB.MDF file to a hosting server via FTP, and every time, when it is about halfway uploaded (40% or 50%), I get an error message saying: "550 ASPNETDB.MDF: The process cannot access the file because it is being used by another process", and the upload fails. I'm using SQL Express. Does anybody know the cause? Thanks a lot.
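SQL Server Express keeps the .mdf open while the database is attached, which is what makes the FTP client lose its read handle mid-transfer. Detaching the database before the upload releases the file; a sketch, assuming it is attached under the name aspnetdb (the path below is a placeholder):

-- Release the file lock so the .mdf can be copied, then reattach afterwards.
EXEC sp_detach_db 'aspnetdb';
-- ... upload the file via FTP ...
EXEC sp_attach_single_file_db 'aspnetdb', 'C:\path\to\ASPNETDB.MDF';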
Hi. When I try to start a package manually by clicking the Start Debugging button, I get this after a little while:
Cannot process request because the process (3880) has exited. (Microsoft.DataTransformationServices.VsIntegration)
How can I prevent this from happening? This happens every time I want to start the package, and every time the process ID is different; here it is 3880.
I have had a full lock on my SQL Server, and I have a few logs to find the origin of the lock.
I know the process at the head of the lock is process 55.
Here is the information I have on this process (two lock rows for the same spid):
Spid 55, ecid 5, Ecid 0, ObjId 0, IndId 0, Type DB, Resource (none), Mode S, Status GRANT, TransID 0, TransUOW 00000000-0000-0000-0000-000000000000
Spid 55, ecid 5, Ecid 0, ObjId 1784601646, IndId 0, Type PAG, Resource 1:1976242, Mode IS, Status GRANT, TransID 16980, TransUOW 00000000-0000-0000-0000-000000000000
Session details: lastwaittype PAGEIOLATCH_SH, CMD AWAITING COMMAND, Physical IO 1059, Login time 2007-07-05 04:29:53.873, net address DFF06EBF974D, Wait type 0x0046, HostName ., BlkBy ., DBName grpprddb, CPUTime 54331, DiskIO 1059, ProgramName (blank)
Would someone know a way to identify the origin of process 55?
I have already tried to execute the following query: select * from SYSOBJECTS where id = 1784601646
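A couple of standard probes against that spid may reveal more than sysobjects alone; a sketch (run OBJECT_NAME in the database that owns the page, here grpprddb from the session details):

-- Last statement the session submitted.
DBCC INPUTBUFFER (55);

-- Session details: login, host, program, command, wait resource.
SELECT spid, loginame, hostname, program_name, cmd, waitresource
FROM master..sysprocesses
WHERE spid = 55;

-- Resolve the locked object id in the owning database.
SELECT OBJECT_NAME(1784601646);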
I have a File System Task copy-file operation in an SSIS package. The package, when scheduled as a job, fails with the following error:
The process cannot access the file 'C:\ETL\Consignment\Apple\AppleRawFile.txt' because it is being used by another process.
However, when I right-click on the package and execute it manually from Integration Services, it runs successfully without any problem. I am not certain how to resolve this issue; any input will be much appreciated.
When running two File System Tasks one after the other on the same file, the file is still locked when the second task runs, resulting in:
Error: 0xC002F304 at Rename file 1, File System Task: An error occurred with the following error message: "The process cannot access the file because it is being used by another process.".
I found a workaround by adding an Execute Process Task before the second File System Task that pings localhost. This results in a 5-second delay, but there must be a better solution. Anyone?
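If a fixed delay really is all that is needed, an Execute SQL Task with a WAITFOR is a lighter-weight stand-in for the ping workaround (a sketch; any connection to the server will do):

-- Pause the control flow for 5 seconds between the two File System Tasks.
WAITFOR DELAY '00:00:05';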
I'm going nuts with the SQL Server notification thing. I have gone through this article, which tells how to set up the user: http://www.codeproject.com/KB/database/SqlDependencyPermissions.aspx. The article shows how to create a new user and set it up for SQL Server notifications. But in my case the user already existed in the database, which is a very common scenario. So I did the following (see the SQL script below), but then I get this error: "A connection was successfully established with the server, but then an error occurred during the login process. (provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)"

This is my SQL script:

USE [master]
GO

-- Ensuring that Service Broker is enabled
ALTER DATABASE [DatabaseName] SET ENABLE_BROKER
GO

-- Switching to our database
USE [DatabaseName]
GO

CREATE SCHEMA schemaname AUTHORIZATION username
GO

ALTER USER username WITH DEFAULT_SCHEMA = schemaname
GO

/*
 * Creating two new roles. We're not going to set the necessary permissions
 * on the user accounts, but we're going to set them on these two new roles.
 * At the end of this script, we're simply going to make our two users
 * members of these roles.
 */
EXEC sp_addrole 'sql_dependency_subscriber'
EXEC sp_addrole 'sql_dependency_starter'

-- Permissions needed for [sql_dependency_starter]
GRANT CREATE PROCEDURE to [sql_dependency_starter]
GRANT CREATE QUEUE to [sql_dependency_starter]
GRANT CREATE SERVICE to [sql_dependency_starter]
GRANT REFERENCES on CONTRACT::[http://schemas.microsoft.com/SQL/Notifications/PostQueryNotification] to [sql_dependency_starter]
GRANT VIEW DEFINITION TO [sql_dependency_starter]

-- Permissions needed for [sql_dependency_subscriber]
GRANT SELECT to [sql_dependency_subscriber]
GRANT SUBSCRIBE QUERY NOTIFICATIONS TO [sql_dependency_subscriber]
GRANT RECEIVE ON QueryNotificationErrorsQueue TO [sql_dependency_subscriber]
GRANT REFERENCES on CONTRACT::[http://schemas.microsoft.com/SQL/Notifications/PostQueryNotification] to [sql_dependency_subscriber]

-- Making sure that my users are members of the correct roles.
EXEC sp_addrolemember 'sql_dependency_starter', 'username'
EXEC sp_addrolemember 'sql_dependency_subscriber', 'username'
Hi, I am using VWD 2008, SQL Express 2005, Reporting Services, Win XP, IIS 5. Basically, let's say I have 2 pages:

Page1 has a SqlDataSource control that populates a GridView from a table from a database file myDB.mdf (no code-behind).
Page2 has a ReportViewer control that shows a report with data from the same table from myDB.mdf, from the report server (no code-behind).

I have attached myDB.mdf to SQL Server Express using SQL Server Management Studio Express. If I first open Page2 to display the ReportViewer, it works OK, as does using Report Manager.

Now this is the problem: if after that I try to open Page1, I get an error message: "Cannot open user default database. Login failed. Login failed for user 'myServer\ASPNET'. Exception Details: System.Data.SqlClient.SqlException: Cannot open user default database. Login failed. Login failed for user 'myServer\ASPNET'." Then I have to restart SQL Server to fix it.

Now I can open Page1 OK, but if after this I try to open Page2 (ReportViewer) again, I get this error: "An error has occurred during report processing. Cannot create a connection to data source 'my_Datasource'." And this error if I open the report using Report Manager: "An error has occurred during report processing. Cannot create a connection to data source 'my_Datasource'. Unable to open the physical file "C:\Inetpub\wwwroot\Website\App_Data\myDB.mdf". Operating system error 32: "32(The process cannot access the file because it is being used by another process.)"."

Now if I check Management Studio Express again, I can see that myDB.mdf was detached. It seems to be there, but it has no tables or definitions, so I have to attach it again. Do you know how to fix this? Thanks in advance, Ed
I have an almost virgin install of SQL Express running on a Win2K Pro system.
I have been able to create a DB and connect it to Access 2000 without problems.
Now I wish to extend to remote connections. Using the Surface Area Configuration tool, I changed Remote Connections to Local and Remote.
Whenever this setting includes TCP/IP and I try to restart the service, I receive the following errors:
System Log:
The SQL Server (SQLEXPRESS) service terminated with service-specific error 10013.
Application Log:
Server TCP provider failed to listen on [ 'any' <ipv4> 0]. Tcp port is already in use.
TDSSNIClient initialization failed with error 0x271d, status code 0xa.
TDSSNIClient initialization failed with error 0x271d, status code 0x1.
Could not start the network library because of an internal error in the network library. To determine the cause, review the errors immediately preceding this one in the error log.
SQL Server could not spawn FRunCM thread. Check the SQL Server error log and the Windows event logs for information about possible related problems.
If I change back to Local Connections Only, or to Local and Remote using named pipes only, the service starts up again without a problem.
After five days of investigating, researching, reinstalling and waiting I have to ask for help.
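The application log lines point at a port conflict: error 0x271d is Winsock 10013 (access denied / port unusable), and the provider says outright that the TCP port is already in use. A quick check from a command prompt (a sketch; 1433 is the conventional SQL Server port, though an Express instance may be on a dynamic port):

netstat -ano | findstr ":1433"

The last column of the output is the PID of the process that owns the port; look it up in Task Manager. Another SQL Server instance (MSDE or a full edition) already listening there is the usual suspect, and assigning SQL Express a different TCP port in SQL Server Configuration Manager also clears it.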
I have a VM with SharePoint 2013 and SQL Server 2012 SP1, and I have installed the PowerPivot plugin SP1. But I get stuck when I run the PowerPivot for SharePoint 2013 configuration tool.
Hello! I have the following problem. I developed a CLR stored procedure, "StartNotification", and deployed it to the DB. This SP calls an external web service. Furthermore, the SP is called according to a SQL Server Agent job's schedule. On my PC, SQL Server runs under the Local System account and the web service is called correctly (Executed as user: NT AUTHORITY\SYSTEM). But on the other server the following exception is raised during the job run: Date 17.04.2007 16:42:10 Log Job History (FailureNotificationJob)
Step ID 1, Server MSK-CDBPO-01, Job Name FailureNotificationJob, Step Name MainStep, Duration 00:00:00, Sql Severity 16, Sql Message ID 6522, Operator Emailed (none), Operator Net sent (none), Operator Paged (none), Retries Attempted 0
Message: Executed as user: CORP\mssqlserver. A .NET Framework error occurred during execution of user defined routine or aggregate 'StartNotification': System.Security.SecurityException: Request for the permission of type 'System.Net.WebPermission, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' failed. System.Security.SecurityException: at System.Security.CodeAccessSecurityEngine.Check(Object demand, StackCrawlMark& stackMark, Boolean isPermSet) at System.Security.CodeAccessPermission.Demand() at System.Net. The step failed.
What is the reason for this behaviour? Unfortunately I do not have direct access to that server. I have the following guesses: 1) CORP\mssqlserver may not have enough permissions to call the web service; 2) something is wrong with the SQL Server account's permissions; 3) something is wrong with the SQL Server Agent account's permissions. I will take the will for the deed. Thanks.
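A SecurityException on WebPermission usually points at the assembly's permission set rather than at the Windows accounts: assemblies cataloged as SAFE cannot open network connections at all. A sketch of the usual remedy, assuming the hypothetical assembly name NotificationAssembly and that the trust implications are acceptable:

-- Network calls from SQLCLR require at least EXTERNAL_ACCESS.
ALTER DATABASE [MyDatabase] SET TRUSTWORTHY ON;
ALTER ASSEMBLY NotificationAssembly WITH PERMISSION_SET = EXTERNAL_ACCESS;

Signing the assembly and granting EXTERNAL ACCESS ASSEMBLY to the mapped login is the tighter alternative to marking the database TRUSTWORTHY.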
We are using the whole BI package from Microsoft: SQL Server, DTS packages, Reporting Services and Analysis Services.
It would be very helpful to be able to create a metadata database where you could find all the relations between different objects (tables, views, reports, cubes, DTS packages, databases).
Just to get answers to questions like 'where is view xxx used?' and 'what does report xxx depend upon?'.
Since everything lives in SQL databases, it should be possible to do.
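Within a single database, the built-in dependency metadata gets partway there for the 'where is view xxx used' question; a sketch (sp_depends and sys.sql_dependencies only track references recorded inside that database, so cross-database and cross-tool links still need a metadata store of your own):

-- Objects that reference the view, and objects the view references.
EXEC sp_depends 'dbo.xxx';

-- The same information via the catalog views, SQL Server 2005 and later.
SELECT OBJECT_NAME(d.object_id) AS referencing_object
FROM sys.sql_dependencies AS d
WHERE d.referenced_major_id = OBJECT_ID('dbo.xxx');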
I have an initiator and a target Service Broker peer.
Both are controlled by a C# unit test. The initiator uses the Microsoft.Samples.SqlServer class. The target service uses stored procedure activation.
Sending a message from the initiator to the target saves the content of the message, along with its conversation handle, in a table in the target's database.
The unit test needs - at a later time - to instruct the target to send a message back on the same conversation handle to the initiator service.
For this, the C# unit test creates a Conversation from the saved conversation handle:
Service client = new Service("cleintservicename", conn, tran);
Conversation dialog = null;
dialog = new Conversation(client, convHandle);

Sending the message on this dialog generates an error: "Message body: <Error xmlns="http://schemas.microsoft.com/SQL/ServiceBroker/Error"><Code>-8495</Code><Description>The conversation has already been acknowledged by another instance of this service.</Description></Error>".

Is the error due to the fact that a service, via the activated stored procedure, already picked up the conversation, so that a new reference to the service cannot be created through the Service class in the CLR? If so, I might need to skip the activated stored procedure in favor of a CLR service altogether. Any help greatly appreciated.
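One way to sidestep the client-side Conversation object entirely is to send the reply from T-SQL on the saved handle; Service Broker does not care which session sends, as long as the handle is valid in that database. A sketch, with a hypothetical table dbo.SavedConversations and message type [MyReplyMessage]:

-- Reply on the conversation that was captured when the first message arrived.
DECLARE @handle UNIQUEIDENTIFIER;

SELECT TOP (1) @handle = conversation_handle
FROM dbo.SavedConversations;

SEND ON CONVERSATION @handle
    MESSAGE TYPE [MyReplyMessage]
    (CAST(N'<reply/>' AS VARBINARY(MAX)));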
We seem to be plagued by the error below from our SQL Server Agent. This happens almost every time we restart a server that has been running for a day or two.
Our SQL Server Agent uses a non-expiring domain credential. I understand that this problem only happens when the profile used by the SQL Server Agent has changed (password change). What puzzles me is that the login is A-OK and no changes have been made to its password.
We always resolve this problem by changing the login used by the SQL Server Agent to local and, after that, returning it to its original domain login. Unfortunately, we can't keep doing this every time something goes wrong.
Can anyone please help us shed some light on this? We're using SQL 2000 with SP3a. Thanks!
Error:
An error 1069 - (The service did not start due to logon failure) occurred while performing this service operation on the SQLServerAgent service.
I am trying to send a message between two SQL Server 2005 instances on two different machines. I have checked all my routes, and all my objects appear to be set up correctly. However, when running Profiler on the target machine, I receive: "This message has been dropped because the TO service could not be found. Service name: "[tcp://mydomain.com/TARGET/MyService]". Message origin: "Transport"." This is my activated stored procedure that sends the message to the target service. I am using certificate security. Any help appreciated.
CREATE PROCEDURE [usp_ProcessMessage]
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @conversation_handle uniqueidentifier
    DECLARE @message_body AS VARBINARY(MAX)

    WHILE (1 = 1)
    BEGIN
        BEGIN TRANSACTION;

        -- Wait up to a second for the next message on the initiator queue.
        WAITFOR (RECEIVE TOP (1)
                    @conversation_handle = conversation_handle,
                    @message_body = message_body
                 FROM [tcp://mydomain.com/INITIATE/MyQueue]
                ), TIMEOUT 1000;

        IF (@@ROWCOUNT = 0)
        BEGIN
            COMMIT;
            BREAK;
        END

        -- End the inbound conversation, then forward the body to the target service.
        END CONVERSATION @conversation_handle

        IF @message_body IS NOT NULL
        BEGIN
            BEGIN DIALOG CONVERSATION @conversation_handle
                FROM SERVICE [tcp://mydomain.com/INITIATE/MyService]
                TO SERVICE '[tcp://mydomain.com/TARGET/MyService]'
                ON CONTRACT [tcp://mydomain.com/INITIATE/MyMessage/v1.0]
                WITH ENCRYPTION = ON, LIFETIME = 600;

            SEND ON CONVERSATION @conversation_handle
                MESSAGE TYPE [tcp://mydomain.com/TARGET/VisitMessage]
                (@message_body);
        END

        COMMIT;
    END
END
GO
My endpoints are created like so:
CREATE ENDPOINT MyEndpoint
STATE = STARTED
AS TCP
(
LISTENER_PORT = 4022
)
FOR SERVICE_BROKER (AUTHENTICATION = CERTIFICATE MasterCertificate)
GO
GRANT CONNECT TO CertOwner
GRANT CONNECT ON ENDPOINT::MyEndpoint TO CertOwner
GO
And my routes like so:
GRANT SEND ON SERVICE::[tcp://mydomain.com/INITIATE/MyService] TO CertOwner
GO
CREATE REMOTE SERVICE BINDING [MyCertificateBinding]
TO SERVICE '[tcp://mydomain.com/TARGET/MyService]'
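For what it's worth, the statements shown are grants and a remote service binding rather than routes; if the actual CREATE ROUTE statements are missing (the post appears cut off mid-statement), the "TO service could not be found" transport drop is exactly what appears when a side cannot resolve the service name. A sketch of the two routes such a setup normally needs, with hypothetical host names:

-- On the initiator: where to deliver messages addressed to the target service.
CREATE ROUTE MyTargetRoute
    WITH SERVICE_NAME = '[tcp://mydomain.com/TARGET/MyService]',
         ADDRESS = 'TCP://targethost.mydomain.com:4022';

-- On the target: a return route for messages back to the initiator service.
CREATE ROUTE MyInitiatorRoute
    WITH SERVICE_NAME = '[tcp://mydomain.com/INITIATE/MyService]',
         ADDRESS = 'TCP://initiatorhost.mydomain.com:4022';

Also worth checking: the service name comparison is a byte-for-byte (case-sensitive) match, so the target must have a service whose name exactly matches the TO SERVICE string, brackets included.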