I am using SSIS to send e-mails based on certain transactions in SQL Server. I need to populate the e-mail content dynamically, because the content sometimes needs to be in Japanese. I would like to use resource files for this. Could anyone please help me with using resource files in SSIS?
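One way to approach this (just a sketch, not a tested solution for your package): compile the .resx files into .resources files with resgen.exe and read them from a Script Task using System.Resources.ResourceManager. The folder, base name, resource key and variable name below are placeholders you would replace with your own.

Imports System
Imports System.Globalization
Imports System.Resources
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        ' Assumes EmailText.resources / EmailText.ja.resources were compiled
        ' from .resx files (resgen.exe) and deployed to the folder below.
        Dim rm As ResourceManager = ResourceManager.CreateFileBasedResourceManager( _
            "EmailText", "C:\SSIS\Resources", Nothing)

        ' Pick the culture per transaction; "ja-JP" pulls the Japanese strings.
        Dim culture As New CultureInfo("ja-JP")
        Dim body As String = rm.GetString("OrderConfirmationBody", culture)

        ' Hand the localized text to the rest of the package via a variable
        ' (EmailBody must be listed in the task's ReadWriteVariables).
        Dts.Variables("EmailBody").Value = body
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class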
I would like to know what the best practice is for making an .rdl file multilingual, that is, so that the same .rdl file shows the text "Amount" on a label when the user is English, or the text "Cantidad" when the user is Spanish.
Isn't there something like the resource files in a VS.NET web application?
After moving the resource database files with these commands:
ALTER DATABASE mssqlsystemresource
MODIFY FILE (NAME = data, FILENAME = 'new_path_of_master\mssqlsystemresource.mdf');
GO
ALTER DATABASE mssqlsystemresource
MODIFY FILE (NAME = log, FILENAME = 'new_path_of_master\mssqlsystemresource.ldf');
GO
I am getting this error when trying to do anything in the instance while I have it started in minimal mode (/f /t3608).
(The instance will not come up unless I use /f /T3608 from a command prompt.) Please, any suggestions? Also, I checked and the primary file is NOT read-only.
File activation failure. The physical file name "E:\MSSQL\KOCSQLDEV01\Datafiles\mssqlsystemresource.ldf" may be incorrect.
The log cannot be rebuilt when the primary file is read-only.
Msg 945, Level 14, State 2, Line 1
Database 'mssqlsystemresource' cannot be opened due to inaccessible files or insufficient memory or disk space. See the SQL Server errorlog for details.
I am new to SSIS and have been struggling recently with SSIS development. I would like to find out if there are some good books out there on SSIS besides the online resources and BOL - possibly a good book or some tutorials. Thanks.
Brief overview... Running Windows Server 2003 Enterprise 64-bit, all service packs and patches current. SQL Server 2005 Enterprise Edition 64-bit, build: Microsoft SQL Server 2005 - 9.00.3054.00 (X64) Mar 23 2007 18:41:50 Copyright (c) 1988-2005 Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2)
I cannot import any SSIS packages nor create any new folders under Stored Packages. I have googled the newsgroups and looked at BOL to no avail. Help!
Here is the deal: I am sending tab-delimited files from one folder, and I want to zip the files once they have been sent and then delete them, so that I have a backup in the form of a zip file.
Does anyone know how to do this?
i.e. folder X has 5 tab-delimited files;
once I have sent those 5 files, I want to zip them into another folder (folder Y) with a date stamp and then delete those 5 files.
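SSIS 2005 / .NET 2.0 has no built-in zip support, so one common workaround is to call a command-line archiver from a Script Task (or an Execute Process Task). A rough sketch, assuming 7-Zip is installed at the path shown and that the folder names below are replaced with the real ones:

Imports System
Imports System.IO
Imports System.Diagnostics
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        Dim sourceDir As String = "C:\FolderX"              ' folder holding the sent .txt files (assumed)
        Dim archive As String = "C:\FolderY\Backup_" & _
            DateTime.Now.ToString("yyyyMMdd") & ".zip"       ' date-stamped zip in folder Y

        ' Zip everything with a command-line archiver (7z.exe path is an assumption).
        Dim p As Process = Process.Start("C:\Program Files\7-Zip\7z.exe", _
            "a """ & archive & """ """ & sourceDir & "\*.txt""")
        p.WaitForExit()

        ' Only delete the originals if the archiver reported success.
        If p.ExitCode = 0 Then
            For Each f As String In Directory.GetFiles(sourceDir, "*.txt")
                File.Delete(f)
            Next
            Dts.TaskResult = Dts.Results.Success
        Else
            Dts.TaskResult = Dts.Results.Failure
        End If
    End Sub
End Class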
I use an XML source to load an XML file into my DB. I generated the XSD file successfully, but when I click OK I get an error: "Task mismatch of data streams [Source XML [1]]: The XML source adapter does not support the model of mixed content on complex types", "The component pipeline returned error code HRESULT 0xC02092A1 from a method call. (Microsoft.SqlServer.DTSPipelineWrap)".
I've been tasked with coming up with a process to delete old files out of a certain directory. It's preferred to have a configuration file where you can go in and change the number of days to retain files (7, for example, meaning delete files older than 7 days). I've been told I can do this with SSIS. Can anybody point me in the right direction to do this in SSIS, or even in DOS? I don't know ActiveX scripting, if that's what's needed in SSIS. I'd prefer doing something like this in DOS, but I'm not sure it's possible to automate. Any help is appreciated.
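A minimal sketch of the SSIS Script Task route, assuming the folder path and the retention days come from package variables (hypothetical names CleanupFolder and RetentionDays) that a configuration file can override:

Imports System
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        ' Both values are assumed to come from package variables that a
        ' configuration file can set (add them to the task's ReadOnlyVariables).
        Dim targetDir As String = CStr(Dts.Variables("CleanupFolder").Value)
        Dim retentionDays As Integer = CInt(Dts.Variables("RetentionDays").Value)

        Dim cutoff As DateTime = DateTime.Now.AddDays(-retentionDays)

        ' Delete anything last written before the cutoff date.
        For Each f As String In Directory.GetFiles(targetDir)
            If File.GetLastWriteTime(f) < cutoff Then
                File.Delete(f)
            End If
        Next

        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class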
I've recently started creating SSIS packages using Visual Studio. I am trying to find a way of zipping and archiving files in one of the packages. Is this possible? And if so, is it possible to date-stamp the zip files too?
I would like to have an SSIS package which loops through each XML file (.xml) in a folder on the network, and then for each file pulls out the data and inserts it into a SQL Server table. Please kindly guide me through this, i.e. what task(s) are required, etc. Thanks.
I'm downloading zipped files and would like to loop through each file that was downloaded. I'd also like to unzip each file and append all of them to one file. I have a DOS batch file that is fairly simple, and I would like to emulate it using SSIS. Here is what the DOS batch file looks like:
DATE /T >%TEMP%\D.txt
FOR /F "usebackq tokens=2,3,4 delims=/, " %%i IN ("%TEMP%\D.txt") DO SET fname=TAMF_162%%i%%j%%k-%%k.zip
Did you have any success running bat files (Execute Process Task) from SSIS through SQL Server Agent jobs? My package succeeds when I run it from my machine and when I ask the DBA to run it manually from the server, but not when we run it from a job.
The job hangs and the bat file does not seem to be executed. The Executable property does evaluate to the right path, and the package owner does have write/execute permissions on the folder where the bat file is located.
The closest thing I could find is <http://support.microsoft.com/kb/918760>. Would you shed any light on this? When I run the same bat file from a DTS package through a SQL Server Agent job, it works with no issues. Any help would be very much appreciated.
I am having a bit of trouble with SSIS Configurations.
- In BIDS, I have added a configuration file and specified the properties I want to expose.
- When I build the project, I get the standard bin\Deployment folder, which contains the package file (.dtsx), the configuration file (.dtsConfig) and the deployment manifest.
- Before deploying the package, I edit the config file to have the settings I want for the environment I am deploying to.
- The package deploys OK.
- When I work directly on the SSIS server (64-bit), I can go into SQL Management Studio, choose 'Run Package', and when I look in the connection manager window all the settings are as I desire (I haven't had to add a configuration file manually).
- When I work on a client machine, I connect to the SSIS server and choose 'Run Package' - the properties/connections are the same as on the server, but the values themselves are completely different.
Why is this? Why, when I run the package from the client (32-bit), do the configuration values appear to be completely different? How can I run the package remotely and pick up the configuration values that I deployed with? Or have I misunderstood this whole configuration feature?
We receive about 4 million rows of data from a client per week. Each file is a complete snapshot of their data; we need to be able to find the rows that are different or new in the current week's file compared to last week's file.
Is this possible using 2 Flat file sources and some kind of sort/merge join?
If you are familiar with Oracle or Postgres, it's similar to a MINUS or EXCEPT, respectively. Right now we load both files into separate tables and compare them in SQL Server, which is very slow.
http://blogs.conchango.com/jamiethomson/default.aspx has a lot of great tidbits for SQL 2005. I am currently on a tight deadline for 25 SSIS packages that need to be able to move from Dev to QA to Staging to Prod. For the life of me I cannot get any of the packages to *read* the config files created with the package configuration wizard. All I want to do is move the connection string out of the package so we can change the config file and not have to touch (hand-edit) each package. Any help is appreciated!
Hi all, new to SQL Server - trying to create an SSIS package that will look for and import a series of Visual FoxPro tables (.DBFs) when they appear in a folder. The tables are/can be all different fields, field widths, etc., with quite a bit of overlap though. The end result should be that table "ABC.DBF" is pulled into SQL Server as table "ABC".
Using: SQL Server 2005 Enterprise, SSIS, the *latest* version of VFPOLEDB downloaded from MS.
I have set up a package and tested it with several different tables and it works great - but I have to redo the data source and destination each time. I need to get this to be a somewhat automated process, pulling in all .DBFs no matter what they contain. Can I do this with SSIS alone (and variable substitution), or do I need to write a bunch of code? Thanks very much for your time and thoughts.
I have created a DTS package which imports a text file into a single SQL Server table with 8 columns (SourceData). The DTS package uses the 'Test1.txt' file. Now I have around 200 text files (Test1, Test2, ..., Test200). I need to import them one by one into the 'SourceData' table. Could you please help me get this solved?
I could not find the exact details on how to create an SSIS script that would FTP files on these forums, so I am adding my code to help save time for anyone else that might want to do something similar. Here is the VB code for my Script Task to FTP files (hope this helps someone):
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
Public Sub Main()
Try
'Create the connection to the ftp server
Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
'Set the properties like username & password
cm.Properties("ServerName").SetValue(cm, "Enter your Server Name here")
cm.Properties("ServerUserName").SetValue(cm, "Enter your FTP User Name here")
cm.Properties("ServerPassword").SetValue(cm, "Enter your FTP Password here")
cm.Properties("ServerPort").SetValue(cm, "21")
cm.Properties("Timeout").SetValue(cm, "0") 'The 0 setting will make it not timeout
'create the FTP object that sends the files and pass it the connection created above.
Dim ftp As FtpClientConnection = New FtpClientConnection(cm.AcquireConnection(Nothing))
'Connects to the ftp server
ftp.Connect()
'Build an array of all the file names that are going to be FTP'ed (in this case only one file)
Dim files(0) As String
files(0) = "Drive:FullPathYourFileName"
'ftp the file
'Note: I had a hard time finding the remote path directory. I found it by mistake by creating both the FTP connection and task in the SSIS package and it defaulted the remote path setting in the FTP task.
ftp.SendFiles(files, "/Enter Your Remote Path", True, False) ' the True makes it overwrite an existing file and the False says it is not transferring as ASCII
'Close the connection and report success
ftp.Close()
Dts.TaskResult = Dts.Results.Success
Catch ex As Exception
Dts.TaskResult = Dts.Results.Failure
End Try
End Sub
End Class
I'm trying to join the outputs of two text files, combined into one text file. The length of both input text files is the same (590). For example, if the contents of text_file_1.txt are: aaaaaaaaaaaaaaaaaaaaaaaaaaaa and the contents of text_file_2.txt are: bbbbbbbbbbbbbbbbbbbbbbbbbbbb
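The question got cut off, but if the intent is to append record N of the second file to record N of the first, a Script Task sketch along these lines might be a starting point (file paths are placeholders):

Imports System
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        ' File names are placeholders; this assumes line N of the second
        ' file should be appended to line N of the first.
        Dim r1 As New StreamReader("C:\Data\text_file_1.txt")
        Dim r2 As New StreamReader("C:\Data\text_file_2.txt")
        Dim w As New StreamWriter("C:\Data\combined.txt")

        Dim line1 As String = r1.ReadLine()
        Do While line1 IsNot Nothing
            Dim line2 As String = r2.ReadLine()
            If line2 Is Nothing Then line2 = ""
            w.WriteLine(line1 & line2)      ' 590 + 590 characters per output row
            line1 = r1.ReadLine()
        Loop

        w.Close() : r2.Close() : r1.Close()
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class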
I have a number of XLS reports in template form. I want to move these to a new location on the file server, and after they have been populated, move them to another location on the file server.
I have seen some proposed solutions, but I haven't found any that work. This should not be difficult, and I envisage using a File System Task and a Foreach Loop Container. However, passing the multiple file names to the File System Task errors repeatedly.
I have a scenario where all the stored procedures are stored in a folder (one .sql file per sproc). The stored procedures do not have 'IF EXISTS ... DROP PROCEDURE' in the script, so before creating them we have to drop all the sprocs manually.
Can anyone help me write a script / SSIS process to loop through each file in the folder and write "IF EXISTS ... DROP PROCEDURE" with the procedure name in it?
I can create a package that loops through each file in a Foreach Loop task, but I don't know how to write to the file using .NET.
I am building an SSIS package that imports multiple XML files containing data into tables, using one XSD file. I am using the XML Source for this. I can only import one file, as the primary key constraint gets violated.
I have four tables with four primary keys. The XML file does not have the primary key column data, so every time these columns get populated as 1, 2, 3, 4.
I am pretty new to XML, so I was wondering if anyone can help. Why doesn't the XML file have primary key column data?
I have created a job that executes an SSIS package which unzips some zip files. For unzipping we are using WinZip. In the package I have used a .NET Script Task for unzipping; this script uses WZUNZIP. When I execute the package directly, it unzips all the zip files. But when I execute the job that runs the SSIS package for unzipping, it keeps on executing and does not unzip the zip files, so finally I stopped the job.
I am new to SSIS and am looking to load XML files (with a DTD definition) into tables via an SSIS package. I have created an XML task and am able to load the XML and output it to a file. I have also stripped out the DTD definition and, through a data flow using an XML Source and an OLE DB Destination, am able to load and map the XML to tables in my DB. But I have no idea how to get the data in when it has a DTD definition included. I either want to put each file into a row in a table and then query it, or, from the SSIS package, load the relevant info into a set of staging tables or the real tables.
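One workaround (a sketch only, with placeholder paths) is a Script Task that writes a copy of each file with the DOCTYPE declaration stripped before the data flow reads it; this assumes the DTD reference is a simple declaration without an internal subset:

Imports System
Imports System.IO
Imports System.Text.RegularExpressions
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        ' Paths are placeholders; this rewrites a copy of the file with the
        ' <!DOCTYPE ...> declaration removed so the XML Source will accept it.
        Dim inFile As String = "C:\Incoming\orders.xml"
        Dim outFile As String = "C:\Staging\orders_nodtd.xml"

        Dim xml As String = File.ReadAllText(inFile)
        ' Drop a simple DOCTYPE declaration (no internal subset assumed).
        xml = Regex.Replace(xml, "<!DOCTYPE[^>]*>", "")
        File.WriteAllText(outFile, xml)

        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class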
Why are some SSIS files generated by the Import/Export Data wizard put into the local user's temp folder? Why are these not compiled with the package when the solution is built?
Is there some setting I am missing?
This architecture is kind of silly, as the server always needs access to the temp folder on the local machine to run.
How can I get these temp files packaged with the rest of the package and deployed to the server, so the server can run independently of the machine I develop the package on?
Take files from c:\directory1, copy them to c:\directory2 with a different name (date concatenated on the end), and delete them from c:\directory1.
Should I be using the Script Task for this? I am wondering if I should be using the File System Task, which copies directories or files, but I don't know if I could rename the files during the copy.
Also, I want the directory names to be in the configuration file, and I am not sure how to do this. If I set up a flat file source connection, I need to specify a file name, and I just want the directory names. How do I get them into the config file, and how do I read them into my script from the config file?
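A sketch of how this could look in a Script Task, assuming the two directory names live in package variables (hypothetically SourceDir and DestDir) that are exposed in the configuration file, so no flat file connection is needed at all:

Imports System
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        ' The two directory variables are assumed to be package variables
        ' exposed in the configuration file (add them to ReadOnlyVariables).
        Dim sourceDir As String = CStr(Dts.Variables("SourceDir").Value)   ' e.g. C:\directory1
        Dim destDir As String = CStr(Dts.Variables("DestDir").Value)       ' e.g. C:\directory2

        Dim stamp As String = DateTime.Now.ToString("yyyyMMdd")

        For Each f As String In Directory.GetFiles(sourceDir)
            ' The rename happens as part of the copy: original name + date stamp.
            Dim newName As String = Path.GetFileNameWithoutExtension(f) & _
                "_" & stamp & Path.GetExtension(f)
            File.Copy(f, Path.Combine(destDir, newName), True)
            File.Delete(f)
        Next

        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class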
I am using an XML file and retrieving data for my SSIS 2005 (Integration Services) package. After retrieving the data, I need to update my XML file with new data by using a Script Task or XML task.
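A minimal sketch of the Script Task route using System.Xml; the file path, XPath and variable name are placeholders for whatever your package actually uses:

Imports System
Imports System.Xml
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        ' File path, XPath and variable name are all placeholders.
        Dim doc As New XmlDocument()
        doc.Load("C:\Config\settings.xml")

        ' Write a new value into one node, then save the file back out.
        Dim node As XmlNode = doc.SelectSingleNode("/Settings/LastRunDate")
        If node IsNot Nothing Then
            node.InnerText = CStr(Dts.Variables("LastRunDate").Value)
        End If

        doc.Save("C:\Config\settings.xml")
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class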
This is a question of whether or not to use SSIS to solve a problem.
I need to copy SQL Server database backup files from a server in the DMZ to a file server inside the firewall. The SQL Server is not allowed to write its backup files directly to the file server, so they are written to local disk. A connection can be made from inside the firewall to the SQL Server to copy the files off.
So, I'm considering SSIS for the job. Is it possible to use SSIS to perform the file copies from one remote server to another? If so, is the FTP task required, or can File System tasks be used?
An alternative would be Windows scripting, xcopy, robocopy, etc., but I like the features of SSIS and would like to take advantage of its flow control, error handling, scheduling through SQL Server Agent, etc.
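For what it's worth, a plain file copy over UNC paths may be enough here, either with a File System Task in a Foreach Loop or a small Script Task like the sketch below (server and share names are placeholders; the account running the package must be able to reach both shares):

Imports System
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        ' UNC paths are placeholders; the SQL Agent / proxy account running the
        ' package needs read access on the DMZ share and write access here.
        Dim sourceDir As String = "\\dmzsqlserver\Backups"
        Dim destDir As String = "\\fileserver\SqlBackups"

        ' Copy every backup file across, overwriting any existing copy.
        For Each f As String In Directory.GetFiles(sourceDir, "*.bak")
            File.Copy(f, Path.Combine(destDir, Path.GetFileName(f)), True)
        Next

        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class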