Changing The Content Of The DtsConfig File Using Script Task
Feb 16, 2008
Hello,
I have set up a package configuration and it ran fine; however, I would like to be able to use the Script Task to change the content of the dtsConfig file. For example, I can get the newServerName using the Execute SQL Task, but then how can I assign the newServerName value to the DataSource in the package configuration file? Is this possible?
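Since a dtsConfig file is plain XML, one route is to rewrite it from a Script Task with System.Xml once the Execute SQL Task has produced the new server name. A minimal sketch, assuming a package variable User::newServerName, a made-up file path, and a configuration entry whose Path attribute targets a ConnectionString property:

' Minimal sketch: rewrite the ConfiguredValue that holds the connection string.
' The file path, variable name, and catalog below are assumptions - adjust to your package.
Imports System.Xml

Public Sub Main()
    Dim configPath As String = "C:\Configs\MyPackage.dtsConfig"
    Dim newServerName As String = CStr(Dts.Variables("User::newServerName").Value)

    Dim doc As New XmlDocument()
    doc.Load(configPath)

    ' Each <Configuration> element's Path attribute names the property it sets.
    For Each node As XmlNode In doc.SelectNodes("//Configuration")
        If node.Attributes("Path").Value.Contains("[ConnectionString]") Then
            node.SelectSingleNode("ConfiguredValue").InnerText = _
                "Data Source=" & newServerName & ";Initial Catalog=MyDB;Provider=SQLNCLI.1;Integrated Security=SSPI;"
        End If
    Next

    doc.Save(configPath)
    Dts.TaskResult = Dts.Results.Success
End Sub

One caveat: configurations are applied when a package loads, so a file edited this way affects the next execution, not the run that edits it.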
This deletes the directory content as well as the sub-directory content! Is there a way I can just set it to delete the files in the said directory and not the sub-directories?
I'm trying to easily change a value in 80 SSIS configuration files from "localhost" to "myservername". I downloaded WinGrep to do this, but it's balking on some of the files, saying that they are binary files.
Does anyone know how to make the *.dtsconfig files NOT binary in SSIS?
Any suggestions GREATLY appreciated.
I won't go into the reason I have my servername in 80 places...
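If the files WinGrep rejects are actually Unicode (UTF-16) XML, that would explain the "binary" complaint, since many grep-style tools treat UTF-16 as binary; that's a guess, but it's the usual cause. A sketch, under that assumption and with a made-up folder path, that re-encodes each file as UTF-8 and does the replacement in the same pass (back the files up first):

' Hedged sketch: re-encode *.dtsConfig files as UTF-8 and swap the server name.
Imports System.IO
Imports System.Text

For Each f As String In Directory.GetFiles("C:\SSISConfig", "*.dtsConfig")
    Dim content As String
    ' The Boolean argument makes StreamReader honor a UTF-16 byte-order mark.
    Using r As New StreamReader(f, True)
        content = r.ReadToEnd()
    End Using
    content = content.Replace("localhost", "myservername")
    ' If a file's XML declaration names an encoding, update that string too.
    File.WriteAllText(f, content, New UTF8Encoding(False))
Next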
After I attached the file in Package Configurations, I was unable to see the DB connection in Connection Managers.
I tried to attach the .dtsConfig file in a new OLE DB connection manager, but after clicking Test Connection I got: "Test connection failed because of an error in initializing provider. The ConnectionString property is not initialized."
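For what it's worth, a .dtsConfig isn't a data source you can point a connection manager at: it's an XML file that overwrites properties of connections that already exist in the package, applied when the package loads. Its shape is roughly the following (the connection name, server, and catalog are placeholders):

<DTSConfiguration>
  <Configuration ConfiguredType="Property" Path="\Package.Connections[MyOleDbConn].Properties[ConnectionString]" ValueType="String">
    <ConfiguredValue>Data Source=myServer;Initial Catalog=MyDB;Provider=SQLNCLI.1;Integrated Security=SSPI;</ConfiguredValue>
  </Configuration>
</DTSConfiguration>

So the connection manager still has to be created in the package first; the config file only supplies its ConnectionString at load time, which is also why testing a connection straight off the file fails with an uninitialized ConnectionString.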
I was requested to build and move an existing SSIS package from the developer's local PC to the Dev server. Both are running SQL 2K5. I built the following command file so SQL Agent can run the import job.
REM *** Run for FileImport
set FileImportLoadData=C:\SSISConfig\DEV\FileImportLoadData.dtsConfig
"C:\Program Files\Microsoft SQL Server\90\DTS\Binn\dtexec" /f "C:\SSISConfig\FileImport\FileImportLoadData.dtsx"
All files are in the proper place, but when the job runs the following message shows up in the history:
Message Executed as user: LIGHTHOUSE1\SQLEXEC. ...system32>set FileImportLoadData=C:\SSISConfig\DEV\FileImportLoadData.dtsConfig
C:\WINDOWS\system32>"C:\Program Files\Microsoft SQL Server\90\DTS\Binn\dtexec" /f "C:\SSISConfig\DEV\FileImportLoadData.dtsx"
Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 32-bit
Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
Started: 8:54:00 AM
Warning: 2008-05-08 08:54:01.77 Code: 0x80012014 Source: FileImportLoadData Description: The configuration file "FileImportLoadData.dtsConfig" cannot be found. Check the directory and file name. End Warning
Warning: 2008-05-08 08:54:01.77 Code: 0x80012059 Source: FileImportLoadData Description: Failed to load at least one of the configuration entries for the package. Check configurations entries and previous warnings to see descriptions of which configuration failed. End Warning
Progress: 2008-05-08 08:54:01.82 Source... Process Exit Code 0. The step succeeded.
My question is: how do I fix this, and is this really an error?
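Two notes, for what they're worth. First, the log itself answers the second question: 0x80012014 and 0x80012059 are warnings, and the step still ends with "Process Exit Code 0. The step succeeded", so the package ran, just without that configuration entry applied. Second, a hedged suggestion rather than a confirmed fix: hand the configuration file to dtexec explicitly with its /Conf switch instead of relying on the environment variable, so the path no longer depends on how the indirect configuration resolves:

"C:\Program Files\Microsoft SQL Server\90\DTS\Binn\dtexec" /f "C:\SSISConfig\DEV\FileImportLoadData.dtsx" /Conf "C:\SSISConfig\DEV\FileImportLoadData.dtsConfig"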
I use an XML configuration file for my SSIS package, and I have "Enable package configurations" checked. When I build the package, however, there is no dtsConfig file in the output folder (set to bin) or the Deployment folder, and the config file is not included in the SSISDeploymentManifest file. There are no build errors or warnings.
This was working fine the last time I worked on this package (~9 months ago), but not now. I have since installed SP2, but can't confirm whether the problem is related to a "fix" in SP2.
Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 32-bit
Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
Started: 10:01:27 PM
Warning: 2007-11-30 22:01:29.52 Code: 0x80012012 Source: Package Description: The configuration file name "C:\Development\dtsConfig.xml" is not valid. Check the configuration file name. End Warning ...
But then it goes on to read values from my Production config file. How can I stop the annoying warning about my Development config-file path (which doesn't exist on the Production machine)?
I am migrating from local to Dev, QA, and Prod. I created a .dtsConfig file containing database connection strings for the Dev database. What is the "location" on the Dev server where this .dtsConfig file needs to be deployed?
I've deployed my SSIS package to the server and created a SQL job to run it. So far, everything is fine. Today, I got a request to change some variables inside the package, which are part of the .dtsConfig. I want to edit the deployed .dtsConfig, but it won't let me and always complains that the file has been opened by another program. I am sure I've closed my SSIS designer and any Notepad windows, so why can't I edit and save the .dtsConfig file?
Hi, we are using SQLLDR to send data from SQL Server to an Oracle database. This utility creates a log file. I want to know whether there is a provision to change the contents of the file. We use this utility on multiple SQL Servers to send data to the same Oracle server. Can we add the source server name to the file, so that we can identify which server a particular log file belongs to? How?
I have a requirement where I should read a file (I will use the File System Task for this) and its header content. I should be able to read the header, validate it, and when the validation succeeds, start processing the rest of the file.
To elaborate further, take a sample example. Assume that the header holds the date information. In a given folder I should read the file and check the date (the header being the first line of the file). If it matches the current date, I will start processing, and on completion I will archive the files.
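A sketch of the header check in a Script Task, assuming the first line is just a date in yyyyMMdd form (the format and the variable names are assumptions to adjust):

Imports System.IO

Public Sub Main()
    Dim filePath As String = CStr(Dts.Variables("User::FilePath").Value)

    ' Read only the first line; no need to load the whole file for the header check.
    Dim headerLine As String
    Using r As New StreamReader(filePath)
        headerLine = r.ReadLine()
    End Using

    Dim headerDate As Date
    Dim ok As Boolean = Date.TryParseExact(headerLine.Trim(), "yyyyMMdd", _
        System.Globalization.CultureInfo.InvariantCulture, _
        System.Globalization.DateTimeStyles.None, headerDate)

    ' Drive the rest of the package (process + archive vs. skip) off this flag.
    Dts.Variables("User::HeaderIsValid").Value = ok AndAlso headerDate.Date = Date.Today
    Dts.TaskResult = Dts.Results.Success
End Sub

Both variables would be listed in the task's ReadOnlyVariables/ReadWriteVariables, and the downstream precedence constraints can test @[User::HeaderIsValid].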
Hello everyone! I'm having a problem with inserting the content of a text file into a SQL Server 2005 database. I'm reading the text file into a DataSet, and that works fine. What I can't do is what I suspect is the simple part: insert all the data into a table that has exactly the same configuration as the file. I've never worked with DataSets before, and I can't seem to find the answer to this! This is what I have done so far:

Dim i2 As Integer
Dim j As Integer
Dim File As String = Server.MapPath("..\Docs\Facts\FORM_MAN_V3_1.txt")
Dim TableName As String = "Facts"
Dim delimiter As String = Chr(9) ' tab-delimited

Dim result As DataSet = New DataSet()
Dim s As StreamReader = New StreamReader(File)

' Build the table's columns from the header line, suffixing duplicates with _1, _2, ...
Dim columns As String() = s.ReadLine().Split(Chr(9))
result.Tables.Add(TableName)
For i2 = 0 To columns.Length - 1
    Dim col As String = columns(i2)
    Dim added As Boolean = False
    Dim [next] As String = ""
    Dim i As Integer = 0
    While Not added
        Dim columnname As String = String.Concat(col, [next]).Replace(Chr(9), "")
        If Not result.Tables(TableName).Columns.Contains(columnname) Then
            result.Tables(TableName).Columns.Add(columnname)
            added = True
        Else
            i += 1
            [next] = String.Concat("_", i.ToString())
        End If
    End While
Next i2

' Split the remaining text into rows on CRLF, then each row into fields on tab.
Dim strs2 As String() = s.ReadToEnd().Split(New String() {vbCrLf}, StringSplitOptions.None)
For j = 0 To strs2.Length - 1
    Dim items As String() = strs2(j).Split(Chr(9))
    result.Tables(TableName).Rows.Add(items)
Next j

So now I have my DataSet populated with all the information, but how can I insert it into the database? If anyone can help I would appreciate it very, very much! Thank you, Paula
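One straightforward way to push that populated table into SQL Server 2005 is SqlBulkCopy. A minimal sketch, assuming a connection string of your own and that the Facts table's column names match the ones built from the file's header:

Imports System.Data.SqlClient

' Hedged sketch: bulk-insert the in-memory table into dbo.Facts.
Dim connStr As String = "Data Source=myServer;Initial Catalog=MyDB;Integrated Security=SSPI;" ' assumption
Using conn As New SqlConnection(connStr)
    conn.Open()
    Using bulk As New SqlBulkCopy(conn)
        bulk.DestinationTableName = "dbo.Facts"
        ' Map by name in case the column order differs between file and table.
        For Each col As DataColumn In result.Tables(TableName).Columns
            bulk.ColumnMappings.Add(col.ColumnName, col.ColumnName)
        Next
        bulk.WriteToServer(result.Tables(TableName))
    End Using
End Using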
I have a table with a column [FileData] of type TEXT or XML. I need to write the content of the column to a file, using sp_OAWrite for example, but the stored procedure that needs to read the content and write it to the file does not work, because the limitation is that you cannot declare a local variable of those types. What use is the datatype if I cannot use it in procedures like this?
Anyone out there who could give me a tip on how to solve that?
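Two possible angles, offered as sketches rather than confirmed fixes: on SQL Server 2005 you can CAST the TEXT column to VARCHAR(MAX), which, unlike TEXT, can be held in a local variable; or skip sp_OAWrite entirely and write the file from the client side, where the large types come back as ordinary strings. A minimal ADO.NET sketch with placeholder server, table, and path names:

Imports System.Data.SqlClient
Imports System.IO

' Hedged sketch: read the large column over ADO.NET and write it to disk.
Dim connStr As String = "Data Source=myServer;Initial Catalog=MyDB;Integrated Security=SSPI;" ' assumption
Using conn As New SqlConnection(connStr)
    conn.Open()
    Dim cmd As New SqlCommand("SELECT FileData FROM dbo.MyTable WHERE Id = @Id", conn)
    cmd.Parameters.AddWithValue("@Id", 1)
    ' TEXT and XML columns both arrive as a String through ExecuteScalar.
    Dim content As String = CStr(cmd.ExecuteScalar())
    File.WriteAllText("C:\Output\FileData.txt", content)
End Using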
Hi. I am currently building a blog application, and I need some advice... I have the post title, post date, posted by, etc. stored in a database table; however, I was wondering whether I should store the actual post content in the database as well, or in a text file. For example, if the posts get really long, would it slow down database performance, or would there not be much of a difference? Furthermore, if I wanted to keep the posts private, a text file would not be ideal, as it can be accessed easily by surfers... What do you recommend? Thanks a lot
I get error reports in simple text files like the one below, in relatively the same format. The only thing that varies is the number of error reasons, as there can be any number of error reasons for a file. Usually there is only one, but there can be a handful. What is the best way to capture the error description and the count of errors, no matter how many there are? I want to take these items and update a table I have in SQL Server 2008 R2.
Original File Name: some.file.YYYYMMDD.d.incr.02of02.1.dat
Source File ID: file02YYYYMMDD
File Receipt Date: 10/17/2014
Total records received: 1331136
Total records loaded: 1329987

ERROR REASONS
Error code: EBBW002 Error desc: Duplicate Record Total records: 1146
Error code: EABC001 Error desc: Invalid Length Record Total records: 1
Error code: ERRCM10 Error desc: Missing First Name Total records: 2

Total number of Errors encountered during the ODS update processing: 1149
*******************************************************
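Since every error reason follows the same "Error code: ... Error desc: ... Total records: N" shape, one option is a regular expression over the whole file, which naturally handles one reason or a handful. A sketch with placeholder file and output names:

Imports System.IO
Imports System.Text.RegularExpressions

' Hedged sketch: pull out every (code, description, count) triple from the report.
Dim text As String = File.ReadAllText("C:\Reports\errors.txt") ' path is an assumption
Dim pattern As String = "Error code:\s*(\S+)\s*Error desc:\s*(.+?)\s*Total records:\s*(\d+)"

For Each m As Match In Regex.Matches(text, pattern)
    Dim errCode As String = m.Groups(1).Value
    Dim errDesc As String = m.Groups(2).Value
    Dim errCount As Integer = CInt(m.Groups(3).Value)
    ' From here, update the SQL Server 2008 R2 table, e.g. with a parameterized SqlCommand.
    Console.WriteLine("{0} | {1} | {2}", errCode, errDesc, errCount)
Next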
Currently I have a single hard-coded file path to the SSRS config file, which I parse to get the Reporting Services web service URL. My question is: how would I run this same query against hundreds of servers that may or may not share the file path that is hard-coded?
Is there a way to query the registry to find the location of the config file on any server? It could be on D, E, F, H, etc.
I know I can string together the address followed by "reports" and named instance if needed, but some instances may not have used the default virtual directory name (Reports).
Am I going about this the hard way? Is there a location where the web service URL exists in a table? I could not locate anything in the ReportServer database. Basically I need to inventory all of my Reporting Services URLs.
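The registry does record where each Reporting Services instance is installed. A hedged sketch (the key names below are from memory, so verify them on one box before fanning out): the Instance Names\RS key maps each RS instance name to an instance ID, and that instance's Setup key holds the install path, under which ReportServer\rsreportserver.config lives.

Imports Microsoft.Win32

' Hedged sketch: enumerate local RS instances and derive each config file path.
Dim rsKey As RegistryKey = Registry.LocalMachine.OpenSubKey( _
    "SOFTWARE\Microsoft\Microsoft SQL Server\Instance Names\RS")
If rsKey IsNot Nothing Then
    For Each instanceName As String In rsKey.GetValueNames()
        Dim instanceId As String = CStr(rsKey.GetValue(instanceName))
        Dim setupKey As RegistryKey = Registry.LocalMachine.OpenSubKey( _
            "SOFTWARE\Microsoft\Microsoft SQL Server\" & instanceId & "\Setup")
        If setupKey IsNot Nothing Then
            Console.WriteLine(CStr(setupKey.GetValue("SQLPath")) & "\ReportServer\rsreportserver.config")
        End If
    Next
End If

For the hundreds of remote servers, RegistryKey.OpenRemoteBaseKey(RegistryHive.LocalMachine, serverName) can walk the same keys over the network, provided the Remote Registry service is running and you have rights on each box.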
I need to implement a file system for an application that allows me to roll back to a point in time. I can do this with either a journaling file system (Unix-based, such as JFS) or with a database file system such as Oracle Internet File System (now Oracle Content Services). I would MUCH prefer to use SQL Server, but cannot find anything that supports this other than a 2000 PowerPoint deck referencing the then-upcoming SQL Server .NET File System.
The application(s) in question are older and store data in proprietary data files and need to access a local (or mapped) drive in standard form (d:\programs\myprogram).
Does SQL Server 2005 or 2008 support this type of access? I have searched but cannot find anything to support this.
I want to set the value of a user-defined variable from a Script Task, with the intention of using that value as a condition in two precedence constraints to determine which of two different directions the package will take. The problem is I don't know how to reference the user-defined variable in the script of a Script Task, nor how to alter its value.
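In SSIS 2005 the usual pattern is: list the variable in the Script Task's ReadWriteVariables property (ReadOnlyVariables for read-only access), then go through the Dts.Variables collection inside the script. A minimal sketch, where User::Direction is an assumed variable name:

' Assumes an Int32 package variable User::Direction listed in ReadWriteVariables.
Public Sub Main()
    ' Read the current value.
    Dim current As Integer = CInt(Dts.Variables("User::Direction").Value)

    ' ... whatever logic decides which way the package should branch ...

    ' Assign the new value.
    Dts.Variables("User::Direction").Value = 1
    Dts.TaskResult = Dts.Results.Success
End Sub

Each precedence constraint can then use Evaluation operation = Expression (or Expression and Constraint) with expressions such as @[User::Direction] == 1 on one path and @[User::Direction] == 2 on the other.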
Hello. I use SSIS to load a Unicode file into a single table. I use a Slowly Changing Dimension task to load the destination table, and when I map a column (DT_WSTR) to a column with the datatype nvarchar(max), I get an error message saying that I can't map these columns because they do not have the same datatype.
I found a workaround: I map all my columns except those that must fill the nvarchar(max) columns, and afterwards I manually modify the two subtasks generated by the Slowly Changing Dimension task (the insert and the update). This way I get no error messages. It works fine, but is it the right way?
We have successfully built a SSIS ETL implementation for a data mart. Most of our dimension loads are using the slowly changing dimensions task. We successfully built every package and had 1 complete successful load.
We have made a modification to one of the tables, which now has 16 million records in it. I open the dimension's package and it takes 30 seconds to 1 minute to validate. When I go to the Data Flow tab and attempt to open the Slowly Changing Dimension task, SSIS freezes up. I get no response for hours; in my testing so far, SSIS has yet to come back.
My CPU usage is below 5%. devenv.exe memory usage is at 113 MB. My computer is using 400 MB of its 1 GB of memory. The status bar at the bottom says "Ready". I have let it sit for up to 2 hours with no results. When I kill the process in the Windows Task Manager, it ends instantly.
I have also tried deleting the task and adding a new Slowly Changing Dimension task. When I select the table from the drop-down, I get the same results. I have noticed this with all my million-plus-record tables that use the Slowly Changing Dimension task, but not with the smaller tables. I am also seeing the problem when I am in "Work Offline" mode.
I am accessing a SQL 2005 server over a VPN connection. I am connecting to the server using a Native OLE DB\OLE DB for SQL Server connection. My computer is running Windows 2000 SP4, with an Intel Pentium M 1.86 GHz and 1 GB of RAM.
Can a Slowly Changing Dimension task be used on large tables? I can understand the process taking a while to run, but I do not understand why I cannot edit the task.
In the Control Flow tab, I have an Execute SQL Task that outputs a full result set into a variable of Object type. Now, how can I write the contents of the full result set to a text file using a Script Task? I also want to format it the following way in the output file:
Column Name 1 : Column Value
Column Name 2: Column Value and so on
I tried writing the contents of the Object variable into a file, but the file had an output of a single word: System.__ComObject.
Code for Writing the Full Result Set into a Text File
Dim RSsqloutput As String = Dts.Variables("objVariable").Value.ToString()
Dim strVal As String = "File completed on " & Now() & vbCrLf & "------------------------------------------------------" & vbCrLf
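Calling .ToString() on the raw Object only prints the COM type name, which is exactly the System.__ComObject output above. A sketch of the usual approach, assuming the Execute SQL Task used an OLE DB connection (so the Object variable holds an ADO recordset): load it into a DataTable via OleDbDataAdapter.Fill, then write each row out as Name: Value pairs. The output path is a placeholder.

Imports System.Data.OleDb
Imports System.IO

Public Sub Main()
    ' Fill has an overload that accepts the ADO recordset held in the Object variable.
    Dim dt As New DataTable()
    Dim da As New OleDbDataAdapter()
    da.Fill(dt, Dts.Variables("objVariable").Value)

    Using w As New StreamWriter("C:\Output\resultset.txt", False)
        w.WriteLine("File completed on " & Now())
        w.WriteLine("------------------------------------------------------")
        For Each row As DataRow In dt.Rows
            For Each col As DataColumn In dt.Columns
                w.WriteLine(col.ColumnName & ": " & row(col).ToString())
            Next
            w.WriteLine() ' blank line between records
        Next
    End Using

    Dts.TaskResult = Dts.Results.Success
End Sub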
Does anyone know how to change the task name displayed within a ForEach Loop Container (or the name of the ForEach Loop Container itself) based on a variable? I am pretty familiar with setting variable values during task execution and using expressions to alter task properties based on variables. I have tried using an expression to alter the value of the Name property of the ForEach Loop Container, but the name of the ForEach Loop Container does not change during execution. Since the colors of the various tasks change during execution, I would think that the task names could be changed as well.
Hi, I need to trigger some packages upon the existence of specific files in a particular directory. Sounds like the File Watcher Task (from SQLIS) would do the job, but I am wondering what the difference is between using this tool and a ForEach Loop Container. I mean, if a file exists in a directory, the ForEach Loop Container will detect it. Since the File Watcher is not a service, the package containing it needs to be scheduled on a regular basis for the File Watcher to detect the file, right? So a ForEach Loop Container would do the job? So what would be the advantage of using the File Watcher Task?
A common issue that I run across with clients is they only want to process a file if it's finished transmitting to the server. This SQL Server 2005 task reads the properties of a file and writes the values to a series of variables. For example, you can use this task to determine if the file is in use (still being uploaded or written to) and then conditionally run the Data Flow task to load the file if it's not being used. You can also use it to determine when the file was created, in order to decide whether it must be archived.
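If you'd rather not install a custom task, the in-use check is easy to approximate in a plain Script Task: try to open the file exclusively and treat failure as "still being written". A sketch with assumed variable names:

Imports System.IO

Public Sub Main()
    Dim path As String = CStr(Dts.Variables("User::FilePath").Value)
    Dim inUse As Boolean = False
    Try
        ' FileShare.None fails if any other process still has the file open.
        Using fs As New FileStream(path, FileMode.Open, FileAccess.Read, FileShare.None)
        End Using
    Catch ex As IOException
        inUse = True
    End Try
    ' Drive a precedence constraint or expression off this variable.
    Dts.Variables("User::FileInUse").Value = inUse
    Dts.TaskResult = Dts.Results.Success
End Sub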
We find that if we deploy the OLAP database with a different name on the test server, then regardless of how we change the connection string provided to the SSIS package that processes the cube, the package fails to connect to the database. To clarify:
In development the OLAP database is called MyOlapDB and the source database is called MySqlDB. Both are on the same machine. When the application is built and released for test, the test team install the databases on a replica of the production environment (i.e. web app on one machine, OLAP DB on another, and SQL database on yet another). They also, quite rightly, implement the new test databases so they incorporate the build version number. So, MyOlapDB123 and MySqlDB123 are both from build 123.
This is when the problems start. Regardless of how the connection string is specified in the job that processes the cube, the SSIS integration package fails with the error:
[Analysis Services Execute DDL Task] Error: Errors in the metadata manager. Either the database with the ID of 'MyOlapDB' does not exist in the server with the ID of 'OurTestServer', or the user does not have permissions to access the object.
We have tried config files and job properties, but neither works. Also, simply attempting to run the package using DTEXECUI does not work either.
Looking inside the XML of the package, we clearly see the ConnectionManager object, which has the original connection string:
Data Source=localhost;Initial Catalog=MyOlapDB;Provider=MSOLAP.3;Integrated Security=SSPI;Impersonation Level=Impersonate;
However, editing the initial catalog here still does not solve the problem. Searching the XML for the string MyOlapDB reveals the OLAP database name in two other places, both within the object data of the two Analysis Services Execute DDL tasks.
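That pattern matches the error text, which complains about the database ID rather than its name: an Analysis Services database keeps the ID it was created with even if it is later renamed or deployed under a new name, and the DDL embedded in each Execute DDL task addresses the database by that ID. The embedded command typically looks something like the sketch below, and the DatabaseID element has to match the ID of the database actually deployed to the test server:

<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>MyOlapDB</DatabaseID>
  </Object>
  <Type>ProcessFull</Type>
</Process>

One way to avoid hand-editing the package XML: set the Execute DDL task's SourceType to Variable and build the XMLA string in a package variable at runtime, so the DatabaseID can come from a configuration or expression rather than being baked into the task.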
Anyone know how to solve this problem without having to hack the XML of the package?