Package With File System Task Doesn't Work Without Sensitive Data With User Keys
Dec 14, 2006
This problem is a bit weird but I'm just wondering if anybody else experienced this.
I have a package that has File System Tasks (copying .dtsx files, actually). Basically the package copies other packages to a pre-defined destination. The thing is, it only works if one of the packages it is configured to copy contains some sort of sensitive data (e.g., a connection string with a password); otherwise it reports a success message on execution but doesn't actually do anything. I've checked ForcedExecutionResult and it is set to None, for that matter.
Just wondering if anybody else experienced this problem and of course if there's a way to solve it.
I am trying to create and later read a data file from a package deployed in SSISDB; it creates the file successfully but does not read it. The same package, when run from the file system, runs successfully. Generating the .ispac and deploying it to SSISDB runs indefinitely. Is it a permissions issue?
No matter what table, view, or stored proc I pick, it always says that it doesn't exist at the source. I know it exists because I am picking it from the list of tables, etc. that the GUI provides.
Pump file data into SQL Server
Move file to "archive" directory (File System Task)
Delete file (File System Task)
End Loop
Unmap Drive (batch file)
The map/unmap code is in a batch file:
c:\windows\system32\net use \\10.10.10.10\ShareName MyPassword /USER:MyUserName /YES
Unmap: c:\windows\system32\net use \\10.10.10.10\ShareName /DELETE /YES
Here are the results when running this package:
1. Running in BIDS on a separate workstation: everything OK.
2. Running on the server by right-clicking the package in Integration Services (SSMS) and choosing "Run": everything OK.
3. Running as a job with SQL Agent: the package succeeds, but no action was taken on the files; the files in "ShareName" are still there, so no data was pumped into SQL Server.
Now, the difference is that the SQL Agent jobs run using a domain account proxy. I'm not sure how that would affect things, though. I have the tasks in the package set to fail the package if they fail, and they are not failing; the drives are being mapped OK.
The computer with the share is non-domain, but that shouldn't matter: I am specifying the local username and password in the batch file, as you can see, and it works from the workstation in BIDS on a separate machine, and on the server too as long as I don't run it as a job. The batch file sits on both the server and the local workstation at the same local path.
Any idea why the files aren't actioned when run as a job?
As pointed out recently on the SQL Express blog, http://blogs.msdn.com/sqlexpress/archive/2006/11/15/sql-express-sp2-and-windows-vista-uac.aspx (look for the section that starts "Watch out!"), the SQL Express SP2 argument ADDUSERASADMIN will not work correctly if the user is a normal user. If the user is a member of the BUILTIN\Administrators group, Vista will prompt to elevate them so that the admin rights take effect.
However, my reading of the blog post is that if they are normal users, Vista will prompt for the Administrator credentials. This effectively runs the install as the Administrator user. So the ADDUSERASADMIN argument works, but adds the Administrator rather than the normal user.
This is not what I need to happen. Is there any way around this? I have a ClickOnce application. Is there any way to restrict the ClickOnce install to require the user to be a member of the BUILTIN\Administrators group?
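For what it's worth, ClickOnce is designed for per-user, non-administrative installs and, as far as I know, offers no install-time group requirement. A partial workaround is to have the application check group membership itself at startup and skip the SQL Express step otherwise. A minimal VB.NET sketch of such a check (the function name is mine):

Imports System.Security.Principal

' Returns True if the current user is a member of BUILTIN\Administrators.
' Note: under Vista UAC a non-elevated token will report False here.
Public Function CurrentUserIsAdmin() As Boolean
    Dim principal As New WindowsPrincipal(WindowsIdentity.GetCurrent())
    Return principal.IsInRole(WindowsBuiltInRole.Administrator)
End Function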
I have several DTS packages saved 'locally' to the SQL Server. I want to duplicate a package so that I can make some changes and then replace the original. I certainly don't want to rebuild the entire package from scratch. So I open the original package, go to the 'Package' menu, choose 'Save As', give it a new name, and press OK. No errors, all appears well; the title bar even shows the new name of the package. But when I close the package and go to the 'local' package list, the new package name doesn't appear in the list. Refresh, exit SEM, reboot: it doesn't show up. I even looked in the MSDB table where packages are supposed to be stored (at least the name / package ID / etc.), and it doesn't show there either. I tried from several client machines.
OS: Windows 2000 Server (advanced) SP2 SQL: SQL 2000 Server (no SP's)
I have many jobs on SQL 2005 and all work but one. That one writes to an Access DB on the same server as SQL. The package itself works fine, but when executed in the context of the SQL Agent job, it fails.
Jobs that write to a text file work fine. The Access DB has no password required. By the way, the same job worked fine in SQL 2000.
I'm stuck on this and I have no idea how to solve it. I'm trying to migrate a DTS 2000 package from BIDS and I get this message:
This wizard will close because it encountered the following error:
Index was out of range. Must be non-negative and less than the size of the collection. Parameter name: index (mscorlib)
I go to Migrate DTS 2000 Package and select my current SQL2K production server (it has almost 600 DTS packages, although I don't think that is a problem at all).
The wizard recognizes my server without problems, and then I pick a folder to save them to, but on the next step the aforementioned message appears.
I am trying to execute an SSIS package from an MS Access 2003 database; it imports a table from the Access database into a target table in SQL 2005. I saved the package in SQL 2005 and tested it out. If I run it from the Management Studio console with Run -> Execute..., everything works just fine. However, if I try to run it using the command "Exec master.dbo.xp_cmdshell 'DTExec /SER DATAFORCE /DTS SQL2005TestPackage /CHECKPOINTING OFF /REPORTING V'", the execution always fails when the Access database is open (shared mode). The connection manager looks like this: "Data Source=E:\Test.mdb;Provider=Microsoft.Jet.OLEDB.4.0;Persist Security Info=False;Jet OLEDB:Global Bulk Transactions=1". The error is listed below:
Code: 0xC0202009 Source: NewPackage Connection manager "SourceConnectionOLEDB" Description: An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft JET Database Engine" Hresult: 0x80004005 Description: "Could not use ''; file already in use.".
Has someone managed to successfully pass a variable from a parent package to a child package? I've tried a zillion permutations and I can't get it to work. The strange thing is that I was able to do this successfully with pre-RTM builds. Basically, what I am trying to do is:
The parent package has a variable, e.g. ExecutionID, which I set using a script to System::ExecutionInstanceGUID. I verified that the variable is set correctly by dumping it to a SQL Server table. I created a child package variable with the same name. In the child package, I've created a parent package configuration that points to the ExecutionID variable. I am trying to read the variable in a Derived Column Task in which I have a column linked to @ExecutionID. This doesn't work. Step-by-step instructions from someone who managed to conquer this would be greatly appreciated.
Oh, I also didn't have any luck hitting a breakpoint in a script task inside a child package, with either in-process or out-of-process execution.
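For what it's worth, once a parent package configuration is mapped onto a child variable, the value should be visible to any task in the child, not just a Derived Column. A minimal Script Task sketch for verifying the child side (it assumes the child variable is User::ExecutionID and that it is listed in the task's ReadOnlyVariables):

' Script Task in the child package.
Public Sub Main()
    Dim executionId As String = CStr(Dts.Variables("User::ExecutionID").Value)
    ' Surface the value to confirm the parent configuration actually fired.
    MsgBox("ExecutionID = " & executionId)
    Dts.TaskResult = Dts.Results.Success
End Sub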
Hi all there! I am quite new to MS SQL administration, so let me explain how it works on your instances of SQL Server.
We have several DTS packages on our server, all of them managed on a station which has serious hardware problems. So we would like to catch two problems at one time, and have decided to develop a systematic way of manipulating DTS.
One of several aspects of this operation would be migrating from system ODBC data source definitions to file ODBC sources (.dsn files) in order to make them easier to manage (backup, for example, and even reusability on other workstations). All .dsn files would be located on a network resource (\\server\directory\...) which would be set as the default ODBC directory in the ODBC administrator on the management station.
When I began to do so, it appeared that the EM DTS Designer does not remember the path to the DSN files (for example, on the design panel I choose file DSN and with the Browse button point at a certain .dsn file, and then after the DTS is saved the path disappears).
Do you use this facility (file .dsn) in the DTS EM Designer, or has MS perhaps treated it as useless and nobody wants to use it?
Regards, K
Maybe it worked once, but most of the time it doesn't work. The query is like the one below:
select top 10 * from OpenRowSet('microsoft.jet.oledb.4.0','Excel 8.0;hdr=yes;database=\ws8webablefilessitefiles4000010 eibcactive.xls', 'select * from [crap2$]')
I get this error:
OLE DB provider "microsoft.jet.oledb.4.0" for linked server "(null)" returned message "Unspecified error". Msg 7303, Level 16, State 1, Line 1 Cannot initialize the data source object of OLE DB provider "microsoft.jet.oledb.4.0" for linked server "(null)".
but the same query runs without any problem on a SQL 2000 server in the same network.
I have created a DSN to extract data from a SQL 2000 server into Excel. This all works fine. Now I have edited the data and would like to import the updated fields into my database.
There are no new fields, just updated information. I have used the import wizard from SQL Enterprise Manager and selected to delete the current table and replace it with the one I've created.
This is where the problems begin.
When I finish the wizard I get an error about a "column reference constraint" conflict, which I think means there are links to this table that cannot simply be deleted and recreated.
I'm copying files to a folder; the naming convention in the source folder is as follows:
CM_ABC_MY_TEST.txt
In the destination folder, this filename needs to appear as:
CM_XYZ_MY_TEST.txt
In my File System Task, I'm pretty sure I'm going to need an expression with a replace, substring, etc., but I'm having a hard time nailing down the exact syntax.
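For a fixed-token swap like _ABC_ to _XYZ_, the expression language's REPLACE function may be all that is needed; something along these lines as the expression on the destination variable (User::SourceFileName and User::DestFolder are illustrative names, not from the original post):

@[User::DestFolder] + "\\" + REPLACE( @[User::SourceFileName], "_ABC_", "_XYZ_" )

Note that a backslash has to be escaped as \\ inside an SSIS expression string literal.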
Does anyone know how to do this using variables? Every time I try it, I get the
Error: Failed to lock variable for read access with error 0xc00100001.
I also tried writing a script and still got the same error. If I hard-code the values into the variables it works fine, but I will be running this every day, so it has to pull in the current date along with the filename; the values of the variables will change every day. Here is my expression:
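A note on the lock error itself: it usually means the variable is referenced without a lock, or is still locked by something else at that moment. In a Script Task the two usual remedies are to list the variable in the task's ReadOnlyVariables property, or to take the lock explicitly through the variable dispenser. A minimal sketch of the explicit form (the variable name is illustrative):

Imports Microsoft.SqlServer.Dts.Runtime

Public Sub Main()
    Dim vars As Variables = Nothing
    ' Take the read lock through the dispenser instead of Dts.Variables.
    Dts.VariableDispenser.LockForRead("User::CurrentFileName")
    Dts.VariableDispenser.GetVariables(vars)
    Try
        Dim fileName As String = CStr(vars("User::CurrentFileName").Value)
        ' ... build the dated file name from fileName here ...
    Finally
        ' Release the lock, or later references will hit the same error.
        vars.Unlock()
    End Try
    Dts.TaskResult = Dts.Results.Success
End Sub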
I was wondering if there is a way to 'Move File' with the File System Task inside of a For Each Loop container but to dynamically set the Destination path variable.
Currently, this is what I have:
FileDestinationPath variable - set to C:\TestFiles
FileSourcePath variable - set to C:\TestFiles
FileNameAndLocation variable - set to blank
For Each Loop container - iterates through a folder, C:\TestFiles, that has .txt files in it with dates in the file name, e.g. Test_09142006.txt. Sets the file path (fully qualified) to the variable mapping FileNameAndLocation.
Script Task (within the For Each Loop, first step) - sets FileDestinationPath to the correct dated folder within C:\TestFiles. For example, if the text files I want to move are for the 14th of September, it takes FileDestinationPath and appends the date folder to the end of it. The text files have a date in the file name (test_09142006.txt) and I am picking this apart (from FileNameAndLocation in the For Each Loop) to get the folder date. (dts.Variables("User::FileDestinationPath").Value = dts.Variables("User::FileDestinationPath").Value & "\" & Month & "_" & Day & "_" & Year & "\"), which gives me "C:\TestFiles\9_14_2006\".
File System Task (within the For Each Loop, second step) - this is where the action is supposed to occur. I want it to take FileDestinationPath and move the FileNameAndLocation file (from the For Each Loop) into this folder on each run.
Now for my problem. I want this package to run every day, but it has to set the FileDestinationPath variable dynamically according to that day's date. Basically, how do I get this to work, since I can't hard-code the destination path variable from the start? I have the DestinationVariable on the File System Task set to the FileDestinationPath variable, after the script task builds it. However, using FileNameAndLocation as the SourceVariable on my File System Task tells me that "Variable "FileNameAndLocation" is used as a source or destination and is empty."
Let me know if I need to clarify further... I may be missing something very simple. Any help would be greatly appreciated!
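For reference, a sketch of the script step described above, assuming the file names always look like Test_MMDDYYYY.txt. (Separately, the usual way past the "is empty" validation error is to give FileNameAndLocation a non-empty design-time default, or to set DelayValidation = True on the File System Task.)

Imports System.IO

' Script Task inside the Foreach Loop. FileNameAndLocation is filled by the
' loop's variable mapping and must be in ReadOnlyVariables;
' FileDestinationPath must be in ReadWriteVariables.
Public Sub Main()
    Dim sourceFile As String = CStr(Dts.Variables("User::FileNameAndLocation").Value)
    ' "Test_09142006.txt" -> "09142006"
    Dim stamp As String = Path.GetFileNameWithoutExtension(sourceFile).Split("_"c)(1)
    ' "09142006" -> "9_14_2006" (leading zeros dropped, as in the example above)
    Dim dated As String = CInt(stamp.Substring(0, 2)) & "_" & CInt(stamp.Substring(2, 2)) & "_" & stamp.Substring(4, 4)
    Dts.Variables("User::FileDestinationPath").Value = "C:\TestFiles\" & dated
    Dts.TaskResult = Dts.Results.Success
End Sub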
Why doesn't System.Data.SqlServerCe.dll appear in VS2005? In VS2003, System.Data.SqlServerCe.dll appears for writing WinCE applications, but it doesn't appear in VS2005 when I select a Pocket PC application. Can anybody give me the solution?
I am able to run SSIS packages as SQL Server Agent jobs with the Control Flow item "File System Task" if I move a file (test.txt) from a drive (C:) on the server (where the SQL Agent jobs run) to a subdirectory on the same drive. But if I try to move a file on a network drive, the package fails.
I tried Data Mining Add-Ins for Office 2007 - CTP December 2006. Test settings: Windows XP SP2 english with Italian regional settings, Office 2007 english (RTM), SQL Server 2005 Developer (with SP2 CTP Dec06) and Data Mining Add-ins for Office 2007 (CTP Dec06).
If I keep the regional settings as Italian, I get an error like this:
Old format or invalid type library. (Exception from HRESULT: 0x80028018 (TYPE_E_INVDATAREAD))
If I change regional settings to English, the Add-in works.
I found this description of a possible cause of the problem: http://msdn2.microsoft.com/en-us/library/ms178780(vs.80).aspx - if this is the issue, it would be necessary to change the ExcelLocale1033Attribute on the component.
Is there another workaround other than to install the Office 2007 MUI?
Marco Russo http://www.sqlbi.eu http://www.sqljunkies.com/weblog/sqlbi
Historically, I've always written a VB script to copy a file from a SharePoint library. I don't like this method because I have to put a username & password in the script and maintain a config file.
Yesterday I was playing around with using a File System Task instead. The SharePoint file has a UNC path, so why not? I created a simple test package with a single File System Task that copies the SharePoint file (addressed via UNC) to another network location. The package runs fine locally.
When I try running it on our utility server, I get a "The file name [SHAREPOINT UNC PATH] specified in the connection was not valid" error. The package is running with a proxy on the server, and the proxy account has the same permissions to the SharePoint site (so far as I can tell) as I do.
I have created a File System Task contained in a Foreach Loop container. I have .bak files from a maintenance backup plan populating a directory.
At one point I need to delete the .bak files after I've zipped them all up.
How do I set the SourceVariable to read through the directory and pick up just the .bak files in the directory to delete?
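One way is to give the Foreach Loop's file enumerator a *.bak mask over that directory and map the fully qualified name into the variable the delete task uses. Alternatively, a single Script Task can do the whole sweep; a rough sketch (the folder path is illustrative and would normally come from a package variable):

Imports System.IO

Public Sub Main()
    Dim folder As String = "D:\Backups"   ' illustrative path
    ' Delete only the .bak files, leaving the zip archives in place.
    For Each bak As String In Directory.GetFiles(folder, "*.bak")
        File.Delete(bak)
    Next
    Dts.TaskResult = Dts.Results.Success
End Sub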
I have had the same Error 29506 that a lot of people are having when installing SP2 for SQL 2005. I've tried the install as myself (a Domain Admin) and as the local Administrator, and cascaded full rights down the entire file system structure, and still no luck. One thing I'm wondering is whether it is hanging me up that all of my databases and logs are not on C:. They are on LUNs on a NetApp SAN (data on M: and logs on L:). Even the system databases (master, model, etc.) are on the LUNs. The error logs referenced permissions to the data directory under the default installation path on C:. Anyone else have this problem? Got a fix? I really don't want to migrate all of my data back to the local machine, apply the patch, then migrate back. Surely this SP should be able to read the data location from the SQL engine. And surely others have their databases on SANs.... I'm at a loss.
I want to move and rename a file and embed the date/time into it, so that each time the package runs a new file is created. For example MyFile_20060712_150000.doc.
Can someone give me a hint how to do this with the File System Task SSIS Control Flow item?
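One approach is to compute the stamped name in a Script Task into a package variable and let the File System Task (Operation = Rename File, which can move the file to another folder while renaming it) use that variable as its destination. A sketch; User::DestinationFile and the paths are made up for illustration:

' Script Task: build a destination like C:\Out\MyFile_20060712_150000.doc.
' User::DestinationFile must be listed in the task's ReadWriteVariables.
Public Sub Main()
    Dim stamped As String = "C:\Out\MyFile_" & Now.ToString("yyyyMMdd_HHmmss") & ".doc"
    Dts.Variables("User::DestinationFile").Value = stamped
    Dts.TaskResult = Dts.Results.Success
End Sub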
I have a source-files folder where files are generated every day. My goal is to pick the latest file and copy that single file to another folder. I used the Foreach Loop container, got the latest file, and stored the file name in a variable, i.e. LatestFile. Then I want to use the File System Task to copy it to the destination. At the outset I could not set up LatestFile, since I don't know its name yet, but when I set up the Source Connection property of the File System Task, it is not allowed to leave the SourceVariable blank!
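An alternative that sidesteps the loop entirely: a Script Task can find the newest file itself and write its full path into LatestFile, and giving LatestFile a non-empty design-time default (or setting DelayValidation = True on the File System Task) gets past the blank-SourceVariable complaint. A sketch, with an illustrative source folder:

Imports System.IO

Public Sub Main()
    Dim newest As FileInfo = Nothing
    ' Scan the source folder for the most recently written file.
    For Each fi As FileInfo In New DirectoryInfo("C:\SourceFiles").GetFiles()
        If newest Is Nothing OrElse fi.LastWriteTime > newest.LastWriteTime Then
            newest = fi
        End If
    Next
    If newest IsNot Nothing Then
        Dts.Variables("User::LatestFile").Value = newest.FullName
    End If
    Dts.TaskResult = Dts.Results.Success
End Sub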
Could someone please instruct me on how to use the File System Task Editor to rename a file? I place the control on the Control Flow tab and change the operation to Rename File; from there I am not sure what to do.
It's a well-known issue that one can't use dataset fields in a report header/footer directly. One approach is to create a query-based parameter that is basically =First(Fields!@FieldName@.Value, "@DataSetName@") and use that parameter value instead. But it doesn't work in my case!
My report displays a description of some entity and is parametrized with an EntityID parameter. Its header contains the entity name which, per the approach above, is queried from the data source through the EntityName report parameter. Here's the important issue: the report is displayed in a ReportViewer control that is embedded in my application, and the entity ID parameter isn't set by the user in the ReportViewer parameters area. Instead, its default value is changed by the application with the SetReportParameters() web method every time a user wants to view the report, according to the entity the user is exploring in the application. But after the report has been rendered, its header always contains the old (outdated) entity name. Nevertheless, the report body contains current data (including the entity name). If I alter the entity ID parameter in the ReportViewer or in the web-based Report Manager and refresh the report, the header displays the correct entity name.
I use the DTS 2000 Migration Wizard to migrate one of the DTS 2000 packages to SSIS. The migration failed with the following error message:
LogID=17 #Time=6:31 PM #Level=DTSMW_LOGLEVEL_ERR #Source=Microsoft.SqlServer.Dts.MigrationWizard.Framework.Framework #Message=Microsoft.SqlServer.Dts.Runtime.DtsRuntimeException: Failed to save package file "C:\Documents and Settings\fu\My Documents\Visual Studio 2005\Projects\KORTON\KORTON\ProcessCubesMF.dtsx" with error 0x80070002 "The system cannot find the file specified.". ---> System.Runtime.InteropServices.COMException (0xC001100E): Failed to save package file "C:\Documents and Settings\fu\My Documents\Visual Studio 2005\Projects\KORTON\KORTON\ProcessCubesMF.dtsx" with error 0x80070002 "The system cannot find the file specified.".
at Microsoft.SqlServer.Dts.Runtime.Wrapper.ApplicationClass.SaveToXML(String FileName, IDTSPersist90 pPersistObj, IDTSEvents90 pEvents) at Microsoft.SqlServer.Dts.Runtime.Application.SaveToXml(String fileName, Package package, IDTSEvents events) --- End of inner exception stack trace --- at Microsoft.SqlServer.Dts.Runtime.Application.SaveToXml(String fileName, Package package, IDTSEvents events) at Microsoft.SqlServer.Dts.MigrationWizard.DTS9HelperUtility.DTS9Helper.SaveToXML(Package pkg, String sFileLocation) at Microsoft.SqlServer.Dts.MigrationWizard.Framework.Framework.StartMigration(PackageInfo pInfo)
Looking at the call stack, it looks like the COM wrapper fails on SaveToXML. Can someone tell me how to work around this problem?
I placed a File System Task (which is used to copy a file, move a file, and so on). The first time, the copy works fine; the second time, the copy gives an error. I need to check a condition: if the destination folder already contains the dest.txt file, I don't need to copy the file; otherwise I do.
So I need a control for checking whether a folder contains the dest.txt file or not.
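One way to get that check, as a sketch: a Script Task sets a Boolean package variable, and the precedence constraint in front of the copy step is evaluated as an expression such as !@[User::FileAlreadyThere]. The variable name and path below are illustrative:

Imports System.IO

' User::FileAlreadyThere (Boolean) must be in the task's ReadWriteVariables.
Public Sub Main()
    Dts.Variables("User::FileAlreadyThere").Value = File.Exists("C:\Dest\dest.txt")
    Dts.TaskResult = Dts.Results.Success
End Sub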
I'm just learning 2005, so apologies in advance for the newb questions.
I am facing the following situation: each day I have to upload an assortment of .csv files with variable names (e.g., FileOneYYYYMMDD.csv, FileTwoYYYYMMDD.csv, etc.) to an FTP site, from the following directory structure:
Directory1
--- SubDirectoryA
--- SubDirectoryB
--- SubDirectoryC
I have accomplished this by setting up a package with three sequence containers (one for each subdirectory), each of which holds a Foreach loop (with a file enumerator configured as *.csv), each of which holds an FTP Task. This may not be the best way to do it, but it works. But if there's a better way I'd like to know!
Anyway, the wall I've run into is trying to move the .csv files to an archive directory after they've been uploaded. The wildcard variable doesn't seem to work with the File System Task, so I'm having a hard time figuring out how to move a bunch of variably named .csv files at different depths of a directory structure to an archive directory.
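Since the File System Task won't expand wildcards, one way out is a single Script Task that walks the whole tree and archives every .csv in one pass; a rough sketch with illustrative paths:

Imports System.IO

Public Sub Main()
    Dim archive As String = "C:\Archive"
    ' SearchOption.AllDirectories picks up the files in all three subdirectories.
    For Each csv As String In Directory.GetFiles("C:\Directory1", "*.csv", SearchOption.AllDirectories)
        File.Move(csv, Path.Combine(archive, Path.GetFileName(csv)))
    Next
    Dts.TaskResult = Dts.Results.Success
End Sub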