Subscription Files (*.msf)
Aug 20, 2006
I am looking for subscription files in the *.msf format to synchronize my network. Where can I find these? Could you provide an example file? Urgent!
View 1 Replies
Hi All!
Microsoft SQL Server 2005 - 9.00.3159.00 (Intel X86)
Mar 23 2007 16:15:11
Copyright (c) 1988-2005 Microsoft Corporation
Enterprise Edition on Windows NT 5.2 (Build 3790: Service Pack 2)
40 subscribers.
When I add a new article, the snapshot has to be regenerated for every subscriber.
The tables are very big, with filters.
The regeneration copies the data again into ...\ReplData\name.bcp files, which takes 30-40 minutes per subscriber!
Altogether, a small correction to the replication setup takes 2-3 days of work....
Is it possible to initialize a subscription without copying the bcp data?
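If the schema and data already exist at the subscribers, one option worth looking at (a hedged sketch; the publication, subscriber, and database names below are placeholders) is to add the subscription with @sync_type = N'replication support only', which skips snapshot generation and the bcp copy entirely. SQL Server 2005 also offers 'initialize with backup' if the subscriber is seeded from a backup of the publication database; for merge publications the rough equivalent is sp_addmergesubscription with @sync_type = N'none'.
-- Sketch only: run at the publisher, in the publication database.
EXEC sp_addsubscription
    @publication = N'MyPublication',
    @subscriber = N'MySubscriberServer',
    @destination_db = N'MySubscriberDb',
    @sync_type = N'replication support only';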
It seems to me that if a scheduled SSRS report subscription fails (Status Message: "An error has occurred during report processing"), I actually need to delete the subscription and reconfigure it from scratch. The scheduled job doesn't try to run again automatically (say, the next Monday on a weekly Monday schedule).
Is there a way to "reset" a failed subscription without having to recreate the entire subscription?
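I'm not sure there is an officially supported "reset" button, but a commonly cited workaround, rather than recreating the subscription, is to re-queue the subscription event directly against the ReportServer catalog. A hedged sketch (the GUID is a placeholder; look it up first):
-- Find the subscription and its last status:
SELECT SubscriptionID, Description, LastStatus
FROM ReportServer.dbo.Subscriptions;

-- Fire it again without touching its definition:
EXEC ReportServer.dbo.AddEvent
    @EventType = 'TimedSubscription',
    @EventData = '00000000-0000-0000-0000-000000000000';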
Thank you
Devon Kyle
28638
SSCE_M_DUPLICATETABLE
Existing subscription already contains table included in the new subscription.
What are the possible causes of this merge replication error?
Could it be caused by a SQL Server Compact Edition User trying to sync their .sdf file after their subscription has already expired on the SQL Server?
Would you expect to see a different message if a SQL Server Compact Edition user tried to sync a subscriber database (.sdf file) with merge replication if it's been longer than the subscription retention period since their last sync?
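A hedged way to check whether the retention window is the issue (the publication name is a placeholder, and these column names may vary by version): compare the publication's retention setting with the last successful sync recorded for each subscriber at the publisher.
-- Sketch only: run at the publisher, in the publication database.
EXEC sp_helpmergepublication @publication = N'MyMergePublication';  -- shows the retention period

SELECT subscriber_server, db_name, status, last_sync_date
FROM dbo.sysmergesubscriptions;                                     -- when did each subscriber last sync?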
I proposed on a new server that we separate data files, log files, tempdb, backups, etc. onto separate LUNs on a SAN with high-speed solid state drives. I was told that with the new solid state SAN technology this would decrease performance, and that it does not work the same way as it did when you had RAID 5 arrays, etc. I thought that if things were carried out correctly by a SAN administrator, they would know how to configure for optimal performance.
View 2 Replies View Related
In the For Loop, how do I iterate from older flat files to newer flat files based on the file's timestamp? If there are older files in the folder, they should be processed first before continuing with the newer ones.
Any Suggestions?
In the first step of my SSIS package I need to get files from FTP and dump them in a local directory, but it's more than that; the logic is like this:
1. If no file(s) are found, stop executing and send an email saying no file(s) were found;
2. If file(s) are found, then compare them with the existing files in our archive folder; if the file(s) already exist in the archive folder, stop executing and send an email saying the file(s) already existed; if the file(s) are not in the archive folder yet, then transfer them to the local directory for processing.
I know I have to use a Script Task to do this, and I did some research and found examples for each of the above two steps but not both combined, so that's why I need some help here to get the logic incorporated correctly.
Thanks for the help in advance, and I apologize for the long lines of code!
example for step 1:
----------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
Imports Microsoft.VisualBasic.FileIO.FileSystem
Imports System.IO.FileSystemInfo
Public Class ScriptMain
    ' The execution engine calls this method when the task executes.
    ' To access the object model, use the Dts object. Connections, variables, events,
    ' and logging features are available as static members of the Dts class.
    ' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
    '
    ' To open Code and Text Editor Help, press F1.
    ' To open Object Browser, press Ctrl+Alt+J.

    Public Sub Main()
        Dim cDataFileName As String
        Dim cFileType As String
        Dim cFileFlgVar As String

        WriteVariable("SCFileFlg", False)
        WriteVariable("OOFileFlg", False)
        WriteVariable("INFileFlg", False)
        WriteVariable("IAFileFlg", False)
        WriteVariable("RCFileFlg", False)

        cDataFileName = ReadVariable("DataFileName").ToString
        cFileType = Left(Right(cDataFileName, 4), 2)
        cFileFlgVar = cFileType.ToUpper + "FileFlg"
        WriteVariable(cFileFlgVar, True)

        Dts.TaskResult = Dts.Results.Success
    End Sub

    Private Sub WriteVariable(ByVal varName As String, ByVal varValue As Object)
        Try
            Dim vars As Variables
            Dts.VariableDispenser.LockForWrite(varName)
            Dts.VariableDispenser.GetVariables(vars)
            Try
                vars(varName).Value = varValue
            Catch ex As Exception
                Throw ex
            Finally
                vars.Unlock()
            End Try
        Catch ex As Exception
            Throw ex
        End Try
    End Sub

    Private Function ReadVariable(ByVal varName As String) As Object
        Dim result As Object
        Try
            Dim vars As Variables
            Dts.VariableDispenser.LockForRead(varName)
            Dts.VariableDispenser.GetVariables(vars)
            Try
                result = vars(varName).Value
            Catch ex As Exception
                Throw ex
            Finally
                vars.Unlock()
            End Try
        Catch ex As Exception
            Throw ex
        End Try
        Return result
    End Function
End Class
example for step 2:
-------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
    ' The execution engine calls this method when the task executes.
    ' To access the object model, use the Dts object. Connections, variables, events,
    ' and logging features are available as static members of the Dts class.
    ' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
    '
    ' To open Code and Text Editor Help, press F1.
    ' To open Object Browser, press Ctrl+Alt+J.

    Public Sub Main()
        Try
            'Create the connection to the ftp server
            Dim cm As ConnectionManager = Dts.Connections.Add("FTP")

            'Set the properties like username & password
            cm.Properties("ServerName").SetValue(cm, "ftp.name.com")
            cm.Properties("ServerUserName").SetValue(cm, "username")
            cm.Properties("ServerPassword").SetValue(cm, "password")
            cm.Properties("ServerPort").SetValue(cm, "21")
            cm.Properties("Timeout").SetValue(cm, "0") 'The 0 setting will make it not timeout
            cm.Properties("ChunkSize").SetValue(cm, "1000") '1000 kb
            cm.Properties("Retries").SetValue(cm, "1")

            'Create the FTP object that sends the files and pass it the connection created above.
            Dim ftp As FtpClientConnection = New FtpClientConnection(cm.AcquireConnection(Nothing))

            'Connect to the ftp server
            ftp.Connect()
            'ftp.SetWorkingDirectory("..")
            ftp.SetWorkingDirectory("directoryname")

            Dim folderNames() As String
            Dim fileNames() As String
            ftp.GetListing(folderNames, fileNames)

            Dim maxname As String = ""
            For Each filename As String In fileNames
                ' whatever operation you need to do to find the correct file...
            Next

            Dim files(0) As String
            files(0) = maxname
            ftp.ReceiveFiles(files, "C:\temp", True, True)

            'Close the ftp connection
            ftp.Close()

            'Set the filename you retrieve for use in data flow
            Dts.Variables.Item("FILENAME").Value = maxname
        Catch ex As Exception
            Dts.TaskResult = Dts.Results.Failure
            Return 'Return here so the failure result is not overwritten with Success below
        End Try
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class
I need to delete old backup files via a T-SQL job. I found this solution online:
PushD "\\remoteserver\share\DIFF" && (
forfiles -m *DIFF*.sqb -d -1 -c "cmd /c del /q @path"
) & PopD
It works remotely if I run it via the command prompt, but when I add this to a T-SQL job on my remote SQL instance, it runs without deleting anything. What am I missing?
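For what it's worth, here is a hedged way to see what the command actually does when SQL Server runs it (assuming xp_cmdshell is enabled and the UNC path above is the real one): execute it through xp_cmdshell and look at the returned output. Keep in mind that inside a job step or xp_cmdshell the command runs under the SQL Server / SQL Agent service account, which needs delete rights on the share.
-- Sketch only: same path and mask as the command above.
EXEC master.dbo.xp_cmdshell
    'forfiles /p "\\remoteserver\share\DIFF" /m *DIFF*.sqb /d -1 /c "cmd /c del /q @path"';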
Brief overview... Running Windows Server 2003 Enterprise 64-bit - all service packs and patches current
SQL Server 2005 Enterprise Edition 64 bit Build Microsoft SQL Server 2005 - 9.00.3054.00 (X64) Mar 23 2007 18:41:50 Copyright (c) 1988-2005 Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2)
I cannot import any SSIS packages nor create any new folders under Stored Packages. I have googled the newsgroups and looked at BOL to no avail. HELP!
I am thinking about replacing the INSERT data script files that I have with XML files. This way I can open the XML file using an XML editor, see the values in a grid, and make changes more easily. Do you see any problem with this approach?
I managed to put together some code that exports a SQL table with its data to an XML file, and also code that reads the XML file's data and inserts it into a table. Now I am researching XSD, td:datatype, DTD... (I am new to XML) in order to figure out how I can use a single XML file that will hold the SQL Server fields, the datatypes, and their values.
If you have links to some sample code that has anything to do with the datatype export and import I am working on, can you please share them with me? Most importantly, what do you think about the idea of using XML files vs. SQL scripts? Thank you
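Not arguing for either approach, but for the comparison here is a minimal T-SQL sketch of both directions (SQL 2000-era syntax; the table and column names are placeholders): FOR XML exports rows as XML, and OPENXML shreds an XML document back into rows for INSERT.
-- Export (sketch only):
SELECT ID, Name FROM dbo.MyTable FOR XML AUTO, ELEMENTS;

-- Import from an XML document (sketch only):
DECLARE @hDoc int, @xml nvarchar(4000);
SET @xml = N'<root><MyTable><ID>1</ID><Name>Test</Name></MyTable></root>';
EXEC sp_xml_preparedocument @hDoc OUTPUT, @xml;
INSERT dbo.MyTable (ID, Name)
SELECT ID, Name
FROM OPENXML(@hDoc, '/root/MyTable', 2)
WITH (ID int, Name nvarchar(255));
EXEC sp_xml_removedocument @hDoc;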
View 4 Replies View Related
I have a scenario where I need to convert RDF files to PDF files. May I know if this is achievable in SSIS by writing C# code?
View 6 Replies View Related
I want to reduce the number of virtual log files I have. In reading through the Books Online documentation, I realized that I forgot to move the transaction log files to a different drive. Now that the server is in production, I wanted to get some input about the best way of making this change.
Can I just change the directory the log files are written to in the database properties without any adverse problems occurring?
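As far as I know, just pointing the database properties at a new directory is not enough on its own; the catalog entry and the physical file both have to move, which needs a short outage (or a detach/attach). A hedged sketch of the ALTER DATABASE route on SQL Server 2005, with placeholder database, logical file, and path names:
-- Sketch only:
ALTER DATABASE MyDb
    MODIFY FILE (NAME = MyDb_log, FILENAME = N'L:\SQLLogs\MyDb_log.ldf');

ALTER DATABASE MyDb SET OFFLINE WITH ROLLBACK IMMEDIATE;
-- Move the physical .ldf file to L:\SQLLogs at this point, then:
ALTER DATABASE MyDb SET ONLINE;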
Hi!
I'm using replication with two databases on SQL 2000. At the beginning the log files were 50 MB and the data files 150 MB, but now the log files are 2 GB and the data files are 4 GB. I would like to shrink both the log files and the data files. How do I do this?
(I tried truncating and shrinking, but nothing changes.)
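A hedged sketch for the log side on SQL 2000 (database and logical file names are placeholders): with replication involved, the log often cannot be truncated past transactions the Log Reader Agent has not yet delivered, so it is worth checking for that first. For the data files, DBCC SHRINKFILE on each data file (or DBCC SHRINKDATABASE) works the same way, though repeated shrinking tends to fragment the database.
-- Sketch only:
DBCC OPENTRAN ('MyDb');              -- shows the oldest active/replicated transaction holding the log
BACKUP LOG MyDb WITH TRUNCATE_ONLY;  -- or take a regular log backup if you need point-in-time recovery
DBCC SHRINKFILE (MyDb_log, 100);     -- target size in MB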
Thanks!!!
I need to write code to access many image files and wave files and display them on a web page. Both kinds of files are stored on the server (as many experts in this forum advised) and their locations are kept in the SQL Server database. As for the .wav files, I think I will have to convert them to MP3 first for fast delivery over the internet. Since I am a newbie, any help would be appreciated.
Thanks,
I have a problem with bcp and format files. We changed our databases from varchar to nvarchar to support Unicode. No problems so far with that; it is working fine. But now I need a format file for the Customer table and it is not working. It works fine with the old database with varchar, but with nvarchar I'm not able to copy the data. The biggest problem is that I get no error message: BCP starts copying to the table and finishes without an error message.
This is my table:
CREATE TABLE [dbo].[Customer] (
[ID] [int] NOT NULL ,
[CreationTime] [datetime] NULL ,
[ModificationTime] [datetime] NULL ,
[DiscoveryTime] [datetime] NULL ,
[Name_] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AI NULL ,
[Class] [int] NULL ,
[Subclass] [int] NULL ,
[Capabilities] [int] NULL ,
[SnapshotID] [int] NOT NULL ,
[CompanyName] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AI NOT NULL ,
[TargetRCCountry] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AI NOT NULL ,
[LocationID] [int] NULL ,
[MirrorID] [binary] (16) NULL ,
[DeleteFlag] [bit] NULL ,
[AdminStatus] [bit] NULL
) ON [PRIMARY]
GO
and this is the format file:
8.0
13
1 SQLINT 1 12 "#~@~#" 1 ID ""
2 SQLDATETIME 1 24 "#~@~#" 2 CreationTime ""
3 SQLDATETIME 1 24 "#~@~#" 3 ModificationTime ""
4 SQLDATETIME 1 24 "#~@~#" 4 DiscoveryTime ""
5 SQLNCHAR 2 510 "#~@~#" 5 Name_ SQL_Latin1_General_CP1_CI_AS
6 SQLINT 1 12 "#~@~#" 6 Class ""
7 SQLINT 1 12 "#~@~#" 7 Subclass ""
8 SQLINT 1 12 "#~@~#" 8 Capabilities ""
9 SQLINT 1 12 "#~@~#" 9 SnapshotID ""
10 SQLNCHAR 2 510 "#~@~#" 10 CompanyName SQL_Latin1_General_CP1_CI_AS
11 SQLNCHAR 2 510 "#~@~#" 11 TargetRCCountry SQL_Latin1_General_CP1_CI_AS
12 SQLINT 1 12 "#~@~#" 12 LocationID ""
13 SQLBINARY 1 33 "#~@~#
" 13 MirrorID ""
"#~@~#" is the field terminator. We have a lot of text files with all kinds of characters in them, so we think this is a sequence that will never occur in our files. Thanks for your help!
Hi All,
Are there any MDS descriptor files associated with database files? Are these associated with transaction logs? As per my information there are .mdf, .ndf, and .ldf files. Could anyone please guide me ASAP, as there is an urgent requirement from one of the clients?
Thanks in Advance
Gaurav Matkar
Is there a way to do this with some sort of task?
View 7 Replies View Related
I'm using a Script Task to rename files inside a Foreach Loop; the code is shown below:
Dim file As New System.IO.FileInfo(CStr(Dts.Variables("User::FileName").Value))
Dim newname As String = Split(file.Name, "_")(0) & ".jpg"
Dts.Variables("User::outputname").Value = file.DirectoryName & "\" & newname
Then I use an FTP Task to send the files, but it prompts: "the variable User::outputname doesn't contain file path(s)".
I tried to use a File System Task to perform the rename, but that failed too.
The Rename File operation in the File System Task can only rename a file in a specified location. Who can help me?
TIA
Hi gurus,
I've created a linked server (and set up the corresponding schema.ini file) in order to perform bulk-inserts from some CSV text files into SQL tables (from my standpoint the text files are just for reading purposes). The linked server works fine (I can select the data in the files without a problem).
Now the question: is it possible to automatically detect when one or more of those files change, in order to start the import process automatically? Something like having a trigger created on the CSV files. Or is there no easy way to do that, so I would have to, say, create a job that periodically checks whether the files have changed programmatically (for example, recording each file's timestamp every time it is imported and comparing the recorded value with the current one)?
Thanks a lot in advance!
Hi,
We are migrating our reporting services database from one server to another.
What needs to be done to the subscriptions to ensure they keep working properly? Will they automatically update as part of the migration, or is there a manual process?
We are using Reporting Services 2000 and will be using Reporting Services 2000 on the new server as well.
Thanks
Hi all,
I'm creating a data-driven subscription for parameterized reports. I'm supposed to deliver one report to six different customers, and each customer can see only his own data in the report. This is not a problem. But the problem is that I also need another report in which all the data can be seen. When I created the data-driven subscription, I chose the AutoIncrement option for the write mode under "Specify delivery extension settings for Report Server FileShare".
But then it creates a separate report for each parameter value; it does not create a report which contains all the data for all parameter values. How can I get that without creating another report?
Also, when I use the Overwrite option instead of AutoIncrement, it creates only one report. Does anyone know why that happens?
Thanks
Hello,
is it possible to create a new subscription remotely through a website and NOT in Report Manager?
Has anybody done something like this and are there any examples?
Thx
Jo-Jo
Hi,
I am doing replication in SQL 7 on NT.
In the Push Subscription wizard I used the option "Yes, initialize the schema and data...", whereas I already had the data and schema at the subscriber.
Now my question is: what will happen, and how long will this process take (for approximately 5 GB of data)?
Or should I stop the process? It's already been an hour since I started it.
Any suggestion appreciated.
I have created a pull subscription from one SQL Server to another so that the database on one SQL Server is replicated to the other. But when the subscription runs, no data appears on the subscriber SQL Server. When I check the subscription, I see that it is connecting to the publisher, but that is it. I have the login correct, so what am I missing? Do the tables in the subscriber database have to be created already, matching the ones in the publisher database, before the subscription runs, or are they automatically created when the subscription runs?
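By default the snapshot creates the destination tables at the subscriber, so they do not have to exist (or match) beforehand; if nothing arrives, it is worth checking at the publisher whether the snapshot was ever generated and whether the subscription is actually registered. A hedged sketch (the publication name is a placeholder):
-- Sketch only: run at the publisher, in the publication database.
EXEC sp_helppublication @publication = N'MyPublication';   -- is the publication there and enabled?
EXEC sp_helpsubscription @publication = N'MyPublication';  -- is the pull subscription registered, and what is its status?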
Thank you in advance
Hello all,
I have a data-driven subscription with the information about the file name, file extension, path, etc. coming from a database. The problem is that after running, the subscription status tells me that everything is done and there is no error, but the file is not generated in the specified directory, nor anywhere else on the hard disk. Can anybody please help?
The information in the database for the report is as follows
FileName : Test1
FileExtn : True
Path = \\ram\C$\Reports
Render_Format = PDF
Username = Administrator
Password = password
Writemode = Overwrite
All the fields are varchar(50) in size.
Hi, wonder if someone can help.
I've created a subscription and scheduled it to run but I get the following message:
Failure sending mail: The transport failed to connect to the server.
Does anyone have any ideas why, or what I can do to fix it?
Thanks
Because of problems trying to alter databases used for replication, my software needs to find out whether a database is a publisher or was replicated (using T-SQL). Is this possible?
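A hedged sketch, depending on the version: on SQL Server 2005 and later the flags are exposed in sys.databases, while on SQL Server 2000 DATABASEPROPERTY (or the category bits in master..sysdatabases) gives the same answer. The database name below is a placeholder.
-- SQL Server 2005 and later:
SELECT name, is_published, is_merge_published, is_subscribed, is_distributor
FROM sys.databases;

-- SQL Server 2000:
SELECT DATABASEPROPERTY('MyDb', 'IsPublished')  AS IsPublished,
       DATABASEPROPERTY('MyDb', 'IsSubscribed') AS IsSubscribed;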
My name is Tech guy,
I want to use a Windows script to execute a pull subscription installed on SQL Server 2005 Express Edition, because it is free to download.
Is there a script that can kick off an existing pull subscription?
Does anyone know how to drop a subscription from the publisher database without a connection to the subscriber (because the subscriber server has been reformatted)?
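A hedged sketch, run at the publisher on the publication database (the publication, subscriber, and database names are placeholders); which procedure applies depends on the replication type, and no connection to the rebuilt subscriber is needed.
-- Transactional / snapshot publication (sketch only):
EXEC sp_dropsubscription
    @publication = N'MyPublication',
    @article = N'all',
    @subscriber = N'LostSubscriber';

-- Merge publication (sketch only):
EXEC sp_dropmergesubscription
    @publication = N'MyMergePublication',
    @subscriber = N'LostSubscriber',
    @subscriber_db = N'MySubscriberDb';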
View 3 Replies View Related
Are there any procedural differences between Express and MSDE regarding pull subscriptions?
Thanks
abdul
Hi
I have uploaded a report to my 2005 Report Manager and want to create a daily email of the report... How do I set up the email facility? Can someone give me a link?
I'm not sure what to do because the company uses Exchange Server and not a plain SMTP server....
Thanks
Hi,
I need to create a subscription for one report in Reporting Services. I filled in all the fields and told the report server that I want an Excel file written to a shared directory on the server.
When the report server tried to process the report, I got the following message:
Failure writing file S16 : An unexpected error occurred in Report Processing.
What does the report want?
thanks.
I am unable to create a subscription.
The current action cannot be completed because the user data source credentials that are required to execute this report are not stored in the report server database. (rsInvalidDataSourceCredentialSetting)
Delivered by: Report Server E-Mail
SMTP is enabled on IIS
Reporting Services Configuration Manager:
Email Setting: setup correctly
What else would I need to check to get the subscriptions going?