Synchronization With Verbose History Logging And Specify An Output File To Write....
Feb 1, 2007
I had set up merge replication across the servers, but each morning I find that replication synchronization has stopped with the following error:
The merge process was unable to access row metadata at the 'Subscriber'. When troubleshooting, restart the synchronization with verbose history logging and specify an output file to write to, or use SQL Profiler to determine the source of the failure. (Source: MSSQL_REPL, Error number: MSSQL_REPL-2147200996)
To work around this, I manually restart the synchronization agent by stopping and then starting it.
Please tell me how I can fix this message, or how I can start/stop the merge agent from the command line so that I can script the agent restart, schedule it as a job, and configure it to run every morning.
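For reference, a minimal sketch of restarting the merge agent from T-SQL: each merge agent runs as a SQL Server Agent job, so it can be stopped and started with the msdb job procedures. The job name below is a placeholder and must match the actual merge agent job on the server where the agent runs.

-- Stop and restart the merge agent job (job name is hypothetical).
EXEC msdb.dbo.sp_stop_job  @job_name = N'PUBSRV-MyPublication-SUBSRV-1';
WAITFOR DELAY '00:00:10';  -- give the agent a moment to shut down
EXEC msdb.dbo.sp_start_job @job_name = N'PUBSRV-MyPublication-SUBSRV-1';

This batch could itself be scheduled as an Agent job, or run from the command line with osql or sqlcmd.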
I've created a SQL Server job that executes my IS package on a regular schedule. Is there a way to redirect the IS logging (the Progress tab output you see when the package is executed from VS) to a flat file in the same path as my package?
When my job fails I want to know which particular step failed, and the Progress tab also contains useful information such as the duration of each step.
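One option, sketched here with placeholder job, package, and log paths: run the package from a CmdExec job step and let the step's output file capture the dtexec console output, which is roughly what the Progress tab shows.

EXEC msdb.dbo.sp_add_jobstep
    @job_name         = N'Run IS package',                                    -- hypothetical job name
    @step_name        = N'Execute package with verbose report',
    @subsystem        = N'CmdExec',
    @command          = N'dtexec /F "D:\Packages\MyPackage.dtsx" /Rep V',     -- package path is a placeholder
    @output_file_name = N'D:\Packages\MyPackage_lastrun.log';                 -- captured console output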
I cannot seem to get my flat file to record the columns in error when inserting into a SQL table. I have tried a few examples from MS and did not get anything written to my flat file output. I have set the Source Error Output on this flat file source; it uses a script task to create the error description and then writes it to a Flat File Destination.
I am new to SSIS and have not had any formal training on it. However, I am very familiar with VS.Net/c# and SQL 2000 DTS - I need to get this working ASAP as there are 45 total flat files that need to be processed. Once I have this solved for one, the rest will follow suit.
This is on a brand-new Win2003 server install with SQL Server 2005, RS 2005 and VS.NET 2005. I'm not able to log in to the Report Server site to configure/publish reports. I'm logged in to the system as administrator.
From IE6 I enter http://localhost/reportserver or http://localhost/reports and all I get is: Server Error in '/ReportServer' Application.
Compilation Error
Description: An error occurred during the compilation of a resource required to service this request. Please review the following specific error details and modify your source code appropriately.
Compiler Error Message: CS0016: Could not write to output file 'c:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reportserver\1bbf2d18\2968d42e\App_global.asax.konkdp27.dll' -- 'The directory name is invalid.'
Source Error:
[No relevant source lines] Source File: Line: 0
Show Detailed Compiler Output:
c:\windows\system32\inetsrv> "C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\csc.exe" /t:library /utf8output /R:"C:\WINDOWS\assembly\GAC_32\System.Web\2.0.0.0__b03f5f7f11d50a3a\System.Web.dll" /R:"C:\WINDOWS\assembly\GAC_MSIL\System\2.0.0.0__b77a5c561934e089\System.dll" /R:"C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reportserver\1bbf2d18\2968d42e\assembly\dl3\7d9e24c3\0719bf6_b4d0c501\ReportingServicesWebServer.DLL" /out:"C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reportserver\1bbf2d18\2968d42e\App_global.asax.konkdp27.dll" /debug- /optimize+ /w:4 /nowarn:1659;1699 "C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reportserver\1bbf2d18\2968d42e\App_global.asax.konkdp27.0.cs" "C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reportserver\1bbf2d18\2968d42e\App_global.asax.konkdp27.1.cs"
Microsoft (R) Visual C# 2005 Compiler version 8.00.50727.42 for Microsoft (R) Windows (R) 2005 Framework version 2.0.50727 Copyright (C) Microsoft Corporation 2001-2005. All rights reserved.
error CS0016: Could not write to output file 'c:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\Temporary ASP.NET Files\reportserver\1bbf2d18\2968d42e\App_global.asax.konkdp27.dll' -- 'The directory name is invalid.'
Version Information: Microsoft .NET Framework Version:2.0.50727.42; ASP.NET Version:2.0.50727.210
I've checked all the permissions everywhere, I even allowed Everyone full control of the entire C: drive, and I uninstalled and reinstalled IIS6 (and re-registered ASP.NET with aspnet_regiis.exe), but nothing worked.
I have a Lookup task that determines whether source data should be updated in or inserted into the customer table. After the Lookup, the error output path redirects rows to an insert of new data into the table, and the regular output path updates the customer table. But these two paths run at the same time, which stalls the process; it never ends.
The job is similar to what the Slowly Changing Dimension transformation does, except that it should not update the table at the same time.
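For what it's worth, a sketch of the usual workaround, with placeholder table and column names: have the update path land its rows in a staging table inside the data flow, then run one set-based UPDATE in an Execute SQL Task after the data flow finishes, so the insert and update paths never touch the customer table at the same time.

-- Runs after the data flow; Customer_Stage was filled by the "update" path.
UPDATE c
SET    c.CustomerName = s.CustomerName,
       c.Email        = s.Email
FROM   dbo.Customer       AS c
JOIN   dbo.Customer_Stage AS s
  ON   s.CustomerKey = c.CustomerKey;

TRUNCATE TABLE dbo.Customer_Stage;   -- ready for the next run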
I have been using SSIS for about two months now, and one of the main difficulties I have had is migrating SQL Agent SSIS jobs between environments.
Usually the SSIS package fails because of permissions on file systems, databases, etc., but until now I have had to discover these issues almost by guessing, because the job history never gave me any useful information. Also, from what I can see, the SSIS logs have been unreliable at best.
What I have done lately is create Operating System (CmdExec) steps as part of the job (not SSIS steps) and capture the output of dtexec.exe in the job history.
This seems to give me information that has been really hard to get otherwise.
My question (I am no DBA or ETL expert) is: is this the best way to get the package output when debugging migration issues? Why don't SSIS steps have a similar feature for writing useful information to the job history?
We have a publish job that we are trying to automate; the problem is getting the output back to the application or to a file. Originally we had PRINT statements, which worked great when we ran the proc manually in QA and could capture the output. Now that we are automating it from an application, I am not sure how to capture these PRINT statements; ideally that is what I would like to find out.
The application uses a try-catch block, so something like isql.exe will not do the trick; otherwise that is the route we would go.
I tried logging everything to a table, but those inserts get rolled back with XACT_ABORT. What about the xp procedure that logs to the event log? I thought of that, but it would make a real mess of the event log with all of our status messages.
Now we are considering using xp_cmdshell to call a batch file that outputs our status text. Is this my best option? I would prefer to capture all of the PRINT statements, so if anyone knows how to do that, that would be preferable!
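A rough sketch of the xp_cmdshell idea being considered, with a hypothetical file path: because the file write happens outside the transaction, it survives a rollback, which is the point here.

DECLARE @msg varchar(500), @cmd varchar(1000);
SET @msg = 'Publish: step 3 completed at ' + CONVERT(varchar(30), GETDATE(), 120);
SET @cmd = 'echo ' + @msg + ' >> D:\Logs\publish_status.txt';   -- path is a placeholder
EXEC master..xp_cmdshell @cmd, no_output;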
Oldest day up to which I want to purge [30 days of old data]: 03-17-2015
Running CleanupHistory procedure on server:
Number of records qualifying for deletion: 0
Deleted Sysmail Mail Items, Old history: 1
Deleted Sysmail Log Old history: 1
0 history entries purged. <<<<<<< [How is this coming??]
Deleted SQL Job Old History: 1
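For context, a hedged sketch of the msdb calls a CleanupHistory procedure along these lines typically makes; the 30-day cutoff matches the output above, and each procedure purges a different kind of history, which is why the counts differ from line to line.

DECLARE @oldest datetime;
SET @oldest = DATEADD(day, -30, GETDATE());

EXEC msdb.dbo.sysmail_delete_mailitems_sp @sent_before   = @oldest;  -- Database Mail items
EXEC msdb.dbo.sysmail_delete_log_sp       @logged_before = @oldest;  -- Database Mail log
EXEC msdb.dbo.sp_delete_backuphistory     @oldest_date   = @oldest;  -- backup/restore history
EXEC msdb.dbo.sp_purge_jobhistory         @oldest_date   = @oldest;  -- SQL Agent job history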
This is the flow in one of my packages (an ETL job): an Excel file contains monthly revenue details, and I want to import the Excel data into my database staging table, so I've created the package. It's working fine...
The problem: when we replace the contents with the next month's data and run the package, it no longer runs. It's the same file and the same format; we only delete the contents of the file except the first row of the Excel sheet and paste in the new data. The new data comes from an Oracle database in the form of an Excel sheet (they copy the data manually and send it to us).
I open the package in design mode, and when I double-click the Excel file source it says that <column name>'s metadata needs to be synchronized, and asks whether I want to fix the issue automatically with the available external column's metadata.
Clearly it is a data type issue; I have changed the corresponding data types to match the previous Excel sheet, which is equivalent to the table it is copying to.
Now the package runs with validation warnings such as 'External Column "Invoice Amount" needs to be updated', etc.; I can see two or three such warning messages in the package execution wizard.
OK, I'm ready to accept these warnings, and I want the package to run from my server. (The packages have been deployed to the centralized server; every time we want to run a package, we have an ASP.NET web page that executes it in an OnClick event.)
The package is not running from the server, and I guess that is due to the metadata change in the Excel file.
Please suggest some guidelines to resolve this metadata issue; I want the Excel sheet's metadata not to change when we put new data in it.
Otherwise, suggest a way I can validate the Excel sheet before running the package and test whether the data is in the correct format or not; a kind of data profiling activity.
I know it's somewhat crazy, but I need a permanent solution for the system instead of facing this metadata mismatch issue over and over.
Somewhat lengthy explanation, but it's needed for my dear, powerful Microsoft responders. I think I've explained my problem clearly; if I haven't, let me know your questions and I'll try my best.
Basically I'm running a number of SELECTs, using UNIONs to write out each query as a distinct line in the output. Each line needs to be multiplied by -1 in order to create an offsetting balance (yes, this is balance-sheet-related stuff) for each line. Each SELECT has a different piece of criteria.
Although I have it working, I'm thinking there's a much better or cleaner way to do it (I use the word "better" loosely).
Example:
SELECT 'Asset', 'House', TotalPrice * -1 FROM Accounts WHERE AvgAmount > 0
UNION
SELECT 'Balance', 'Cover', TotalPrice FROM Accounts WHERE AvgAmount > 0
What gets messy here is having to write a similar set of queries where the amount is < 0 or = 0
I'm thinking of something along the lines of building a table function that contains all the descriptive text and returns the relevant values based on the AvgAmount I pass to it.
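One possible cleaner shape, assuming SQL Server 2008 or later for the VALUES row constructor: generate both the original line and its offsetting line from a single scan. The labels are the placeholders from the example above; the same label pairs could instead come from a lookup table keyed on the sign of AvgAmount, which is roughly the table-function idea.

SELECT v.Category, v.Item, v.Amount
FROM   Accounts AS a
CROSS APPLY (VALUES
    ('Asset',   'House', a.TotalPrice * -1),   -- offsetting line
    ('Balance', 'Cover', a.TotalPrice)         -- original line
) AS v (Category, Item, Amount)
WHERE a.AvgAmount > 0;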
I need to write a process to get the file size in KB and the record count of a file. I was planning on writing a C# console app that takes the file path and name as a parameter; however, should I use a CLR assembly instead?
I can't put a script in the SSIS package that brings the file down, because it has been decided that we only use SSIS for file consumption.
Hi, I am trying to use BULK INSERT with a format file. All of our data has a few bytes of header in the data file which I would like to skip before doing the BULK INSERT. Is it possible to write the format file so that it skips these few bytes of header before doing the BULK INSERT? For example, I have a 1 GB data file with a 1000-byte header. Except for the first 1000 bytes, the rest of the data is good for BULK INSERT. Thanks in advance. Sorry if this is really a dumb question, as I am new to BULK INSERT and still practicing. Bob
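As far as I know, a format file describes the layout of every row and cannot skip an arbitrary byte prefix on its own; if the header happens to parse like a data row (same field and row terminators), though, FIRSTROW can skip it as if it were the first row. A hedged sketch with placeholder table, file, and format-file names:

BULK INSERT dbo.TargetTable
FROM 'C:\data\bigfile.dat'
WITH (
    FORMATFILE = 'C:\data\bigfile.fmt',
    FIRSTROW   = 2      -- skip the header "row"; only works if it is terminated like a data row
);

Otherwise the 1000-byte header usually has to be stripped in a preprocessing step before the load.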
I need to record in a table who made the change, when it was made, what field changed, and the new value of the field, whenever changes occur to an existing record.
The purpose is for users to occasionally view the changes. They'll want to be able to see the history of the record: who changed what, and when.
I figured I'd add the needed code to the stored procedure that's doing the update for the record.
When the stored procedure is called to do the update, the PK and parameters are sent.
The SP could first retain the current state of the record from disk, then do the update, then "spin" through the fields comparing the record's state before and after the update. Differences could be parsed into a "Changes" string, and in the end this string is saved in a history record along with a few other fields:
Name, DateTime, Changes
FK to changed record: some int value
Name: Joe Blow
Date: 1/1/05 12:02pm
Changes: Severity: 23  Project: Everest  Assigned Lab: 204
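A rough sketch of that approach inside the update procedure, with hypothetical table, column, and parameter names (@IssueID, @Severity, @Project, @AssignedLab would be the procedure's parameters; NULL handling and the full column list are omitted for brevity):

DECLARE @oldSeverity int, @oldProject varchar(50), @oldLab int, @changes varchar(4000);

-- retain the current state of the record
SELECT @oldSeverity = Severity, @oldProject = Project, @oldLab = AssignedLab
FROM   dbo.Issue
WHERE  IssueID = @IssueID;

-- do the update
UPDATE dbo.Issue
SET    Severity = @Severity, Project = @Project, AssignedLab = @AssignedLab
WHERE  IssueID = @IssueID;

-- "spin" through the fields, building the Changes string
SET @changes = ''
    + CASE WHEN @oldSeverity <> @Severity    THEN ' Severity: '     + CAST(@Severity AS varchar(10))    ELSE '' END
    + CASE WHEN @oldProject  <> @Project     THEN ' Project: '      + @Project                          ELSE '' END
    + CASE WHEN @oldLab      <> @AssignedLab THEN ' Assigned Lab: ' + CAST(@AssignedLab AS varchar(10)) ELSE '' END;

IF @changes <> ''
    INSERT dbo.IssueHistory (IssueID, ChangedBy, ChangedAt, Changes)
    VALUES (@IssueID, SUSER_SNAME(), GETDATE(), @changes);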
How does the above approach sound?
Is there a better way you'd suggest?
Any sample code for a system that spins through the fields, comparing one temporary record with another and looking for changes?
I would like to know if there is a way to maintain the history of changes to the reports that have been published to the report server.
I know that the report definitions get saved in the ReportServer database. But let's say a user makes a change to a published report and then saves it back to the server, and the latest change was incorrect, so I have to revert to the previous version of the published report. Is there a way to do that? Does the report server maintain a history of previous versions?
There is a history for each report, but I think that corresponds to the history of report executions (output data). I am talking about the history of the actual report definition.
Just like in DTS, where we can add an error file so that if the package fails we can see what caused the failure, do we have anything like an error logging file in SSIS? I greatly appreciate your help on this. Thanks!
Can someone tell me if SQL Server logs individual inserts when copying from a Flat File source to an OLE DB Destination? If so, can this be turned off?
Ultimately, I would like to use a Bulk Insert task, but it looks like it is pretty specific to a single DB and a single table in that DB. I would like the Bulk Insert task to be generic enough that I can specify (via parameters) the DB and table for the bulk insert, because this task will be performed in an iteration over many different flat files.
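One way to get that flexibility, sketched with placeholder names: instead of the Bulk Insert task, an Execute SQL Task can build the BULK INSERT statement dynamically from package variables (database, table, and the current file from the ForEach loop).

DECLARE @db sysname, @table sysname, @file varchar(260), @sql nvarchar(4000);

SET @db    = N'StagingDB';           -- would come from an SSIS variable
SET @table = N'MonthlyFeed';         -- would come from an SSIS variable
SET @file  = 'D:\Feeds\file01.txt';  -- current file from the ForEach enumerator

SET @sql = N'BULK INSERT ' + QUOTENAME(@db) + N'.dbo.' + QUOTENAME(@table)
         + N' FROM ''' + @file + N''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'')';

EXEC sys.sp_executesql @sql;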
Sorry I never seem to have a "simple" question. My DBA is breathing down my neck here..
After logging is configured in an SSIS package, it seems that after each execution the output is appended to the log file (we are talking about the log provider for text files in this case). As a result, the file just keeps growing. I would like to overwrite the old information with each run, but I can't find where to configure this. Does anybody know?
I have a big stored procedure which is going to alter many tables, insert data, and in general make a lot of changes.
So I want a text file (or any log file) that shows all the changes the stored procedure has made (they don't want Profiler output).
Does anybody know how to log the results of a stored procedure execution to a text file?
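One possible approach, with placeholder table, server, and path names: have the procedure insert progress messages into a log table as it goes, then dump that table to a text file at the end (here via bcp through xp_cmdshell).

-- inside the stored procedure, after each significant change:
INSERT dbo.ProcRunLog (RunAt, Message)
VALUES (GETDATE(), 'Altered table Orders: added column ShipRegion');

-- at the end of the run, write the log out to a file:
EXEC master..xp_cmdshell
    'bcp "SELECT RunAt, Message FROM MyDb.dbo.ProcRunLog ORDER BY RunAt" queryout "D:\Logs\proc_changes.txt" -T -c -S MyServer';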
Is there any real purpose to the File Connection Manager? Not the Flat File Connection Manager, just the plain old File Connection Manager.
I have one in my SSIS package because I thought it might make writing to a file easier. But from what I can tell, it doesn't really do much. I have to pull the file name from the connection manager and basically open and manage the file myself in my scripts.
I went this route because I have two scripts that write to the file in turn... one writing out parent records and the other inserting the child records after each parent.
The script looks like this:
' Requires Imports System.IO for StreamWriter
Dim outFile As String
outFile = Dts.Connections("TestFile.LDIF").AcquireConnection(Dts.Transaction).ToString()

' Open the file for append and write one entry
Dim sw As New StreamWriter(outFile, True)
sw.WriteLine()
sw.WriteLine("dn: " & Dts.Variables("GroupID").Value.ToString())
sw.WriteLine("changeType: add")

sw.Close()
I have to close the connection every time because the other script uses the file. It seems that this is quite inefficient. Am I using the File Connection Manager incorrectly? What's a more appropriate way to do this?
I could open the file in a different script and store the StreamWriter object in a global variable, I suppose...
I am looking to promote an SSIS package to various servers. The name and location of the text log file set up in BIDS for the package are expected to change. The SSIS package will be executed through a third-party scheduler as a batch command file.
I am trying to use this command to change the location: dtexec /F DASD_Database_List_export.dtsx /L "DTS.LogProviderTextFile.1;E:\Log.txt"
For which I am getting this error:
Description: The connection manager "E:\log.txt" is not found. A component failed to find the connection manager in the Connections collection.
I tried Set command approach from http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=72104
It did not work.
Has anyone been successful in using the dtexec log parameter DTS.LogProviderTextFile.1? Could you post how you got it to work? Thanks
I am trying to use a Conditional Split task so that I can check for specific fields. If the value doesn't exist, I pipe the records to a derived column task, where I add an error. I then try to send these records to a flat file destination so that I can keep track of them. However, when I execute the SSIS data flow task I get the following error:
[Log Invalid Records [5496]] Warning: The process cannot access the file because it is being used by another process.
This file isn't being used by any other process as far as I can tell, and the only process using it is the SSIS task trying to write to it.
If anyone has any ideas, I would really appreciate it.
There is some strange behaviour I've recently noticed while watching synchronization progress in Replication Monitor on SQL Server 2005 Standard with merge replication configured: the merge process seems to repeat several times.
This is the initial synchronization (a reinitialization at the subscriber). The client is using Microsoft.SQLServer.Replication objects from the .NET Framework assemblies.
The synchronization starts normally (status is "Running"). The last message of selected session box shows (among other messages): "Beginning evaluating partial replication filters" then "Finished evaluating partial replication filters" and finally "Merge completed after processing xxx changes... etc." after a few seconds. Status changes to "Completed" and then... the merge process starts again!! "Beginning evaluating partial replication filters" etc. And this repeats about 15-20 times.
And so the whole process takes about 15 minutes instead of about 45 seconds to complete the initial synchronization. The number of changes in the "Merge completed after processing ..." message never changes from the first such message.
Is this some bug in web synchronization or some invalid configuration setting? Why does the merge process repeat itself so many times?
SET @RowCnt = 1
SET @date = CONVERT(CHAR(10), GETDATE(), 110)
SET @ArchPath = '\D$EDATAWorkFoldersSendSendData'
SELECT @TotalRows = COUNT(*) FROM table1
--SELECT @ArchPath

WHILE (@RowCnt <= @TotalRows)
BEGIN
    SELECT @AccountNumber = AccountNumber,
           @output_filename = output_filename   -- assuming the file name comes from a column in table1
    FROM table1
    WHERE Identity_Number = @RowCnt
    --PRINT @AccountNumber --test

    SELECT @sql = N'bcp "SELECT h.HeaderText, d.RECORD FROM table2 d INNER JOIN table3 h ON d.HeaderID = h.HeaderID WHERE d.AccountNumber = ''' + @AccountNumber + '''" queryout "' + @ArchPath + @output_filename + '.txt" -T -c'
    --PRINT @sql
    EXEC master..xp_cmdshell @sql

    SELECT @RowCnt = @RowCnt + 1
END
I am developing a package on my local workstation. I have defined two logging service providers. One is for SQL Server and the other is for the Windows Event Log. I am using the Dts.Log method in a script task to write log entries.
Logging is working properly with the SQL Server provider and rows are being inserted into the sysdtslog90 table. However, the only events that are being logged in the Windows Event Log are the package start and end events which I believe SSIS is doing automatically anyway.
Is there something I need to do to enable Windows Event Log logging other than defining a log provider and making sure it is checked as active? Won't SSIS write to two different logs with one Dts.Log call? Any ideas on what might be going wrong with my approach?
We run Standard 2008 R2. When I deploy and run a package from the catalog, how can I get that flat-file log we always instructed SSIS to write when we ran from the command line? I believe it was the /L parameter. I'm not sure at this point whether I'll use SQL Agent or somehow use Task Scheduler to kick off the package.
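If the package really is deployed to the SSISDB catalog (which implies SQL Server 2012 or later rather than 2008 R2), the catalog captures the execution messages itself instead of a /L flat file, and they can be pulled with a query along these lines; the operation id below is a placeholder for the run to inspect.

SELECT em.message_time, em.message_source_name, em.message
FROM   SSISDB.catalog.event_messages AS em
WHERE  em.operation_id = 12345          -- execution/operation id shown in the catalog reports
ORDER BY em.message_time;

If a flat file is still required, the result set could then be written out with bcp or a SQL Agent job step's output file.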