Hi,
Trying to dynamically set the ConnectionString property of the Excel source.
This is what I enter instead of the hardcoded Excel file path:
"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + @[User::FileName] + ";Extended Properties=Excel 8.0;HDR=YES"
I get this error every time I set the DelayValidation property of the Data Flow task to true:
Cannot detach from one or more processes.
The object invoked has disconnected from its clients
Do you want to terminate them instead?
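For reference, a minimal sketch of the property expression that has worked for this pattern, assuming a string variable User::FileName holding the full path. Note the escaped inner quotes around Extended Properties - Jet expects the value to be quoted once it carries more than one setting, and their absence is a common cause of validation failures:

"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + @[User::FileName] + ";Extended Properties=\"Excel 8.0;HDR=YES\""

Setting DelayValidation = True on the connection manager itself, rather than on the Data Flow task, is also usually part of this setup.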
Hello, I have a situation in which I need to use a Foreach Loop that will perform a transformation on each Excel file in a directory. The file names will change, but the structure will stay the same.
I was able to get this working by assigning the file path for each iteration to a variable, and then using that variable to set the ExcelFilePath in the Excel connection manager. However, for this to work I have to assign the variable to a default file.
Because of this, when I try to deploy the package I need to also add a configuration property for the variable, otherwise the first run will fail. The dummy file doesn't even really need to exist - I just have to put in a valid path and then use any name that has an .xls extension. After that it runs fine regardless of what is in the directory.
It seems odd that I would need to do this - am I missing something? Apart from creating the Excel connection manager programmatically (which I'm guessing might solve this), is there a way to avoid having to specify this dummy file?
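For reference, a sketch of the usual workaround (the property names are real; the variable name is an assumption): set DelayValidation = True on the Excel connection manager and drive its path from the loop variable with a property expression:

Property:   ExcelFilePath
Expression: @[User::FilePath]

With DelayValidation set, the design-time value is never validated at package load, so the variable's default can be an empty string instead of a dummy .xls path.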
I got your email address from your web cast. I really enjoyed the web cast and found it to be very informative.
Our company is planning to use SSIS (VS 2005 / SQL Server 2005). I have a quick question regarding the product. I have looked for the information on the web, but was not able to find relevant information.
We are getting source data from two of our clients in the form of Excel sheets. These Excel sheets are generated using Reporting Services. On examining the sheets, I found that the column names contain data themselves, so the names are not static - for example, Jan 2007 Sales, Feb 2007 Sales, and so on. Even the number of columns is not static; it depends on the range of dates selected by the user.
I wanted to know if there is a way to import an Excel sheet using Integration Services by defining the position of each column instead of its name, and I am not sure if there is a way for me to import Excel files with a dynamic number of columns.
Your help in this respect is highly appreciated!
Thanks,
Hi Anthony, I am glad the Web cast was helpful.
Kamal and I have both moved on to other teams in MSFT and I am a little rusty in that area, though in general a dynamic number of columns in any format is always tricky. I am assuming it's not feasible for you to get the source for SSIS a little closer to home, e.g. rather than using Excel output from Reporting Services, use the same (or some form of the) query/data source that RS is using.
I suggest you post a question on the SSIS forum on MSDN and you should get some good answers: http://forums.microsoft.com/msdn/showforum.aspx?forumid=80&siteid=1
Can someone tell me why the path to the configuration file can't be relative to the package? When I use a relative path (i.e. do not specify the full path to the configuration file), a package run from BIDS looks for the configuration file in the solution folder instead of the package's folder (the project folder).
Is there any way to force the package to search for configuration files in its own folder?
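One workaround sketch, assuming the package references the configuration by a relative file name: dtexec resolves relative paths against the current working directory, so launch the package from the folder that contains both files (names here are placeholders):

cd /d C:\Projects\MyPackage
dtexec /File MyPackage.dtsx /ConfigFile MyPackage.dtsConfig

BIDS runs packages with the solution folder as the working directory, which matches the behaviour described above.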
We have a SQL Server setup as a publisher to 15 subscribers. We need to change the path of the data & log files to a new drive (added a new harddisk). We plan to take a cold backup of the database and shift the data & log files to the new drive. Then we just attach the data & log files from the new path.
Will this disturb my existing replication setup? Is this the correct procedure for changing the path of the existing data & log files? What is the appropriate method for shifting the data & log files of a live database to a different location (directory/drive)?
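For reference, a sketch of the detach/attach sequence (database name and paths are assumptions). One caveat to check first: SQL Server will not detach a database while it is published, so the publication typically has to be scripted out and re-created around the move:

-- After stopping activity and taking the cold backup:
EXEC sp_detach_db 'MyDB';
-- Move MyDB.mdf and MyDB_log.ldf to the new drive, then:
EXEC sp_attach_db 'MyDB',
    'E:\SQLData\MyDB.mdf',
    'E:\SQLLogs\MyDB_log.ldf';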
I am creating an SSIS package with a Data Flow task, which reads from an Excel source and then uses a Script Component to dump the data to multiple tables in a SQL Server database.
I need to somehow make my Excel source dynamic; that is, the Excel template that I use to map the Excel columns to the Script Component's input columns would be dynamic.
In other words, I should be able to define the Excel source, the column mapping information, and the precedence constraint to the Script Component dynamically.
Hi, is there a way to read a trace file with fn_trace_gettable if that trace file resides in a path that contains a space? For example, test.trc is accessible with fn_trace_gettable if it resides in c:\traces, but throws an error if the trace file is in c:\my traces. Thanks. -Nada
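For reference, a minimal sketch of the quoted form - the function takes an ordinary string literal, so a space in the path should be fine (the second argument is the number of rollover files to read; default reads them all). If this still errors, the service account's permissions on the folder are the usual suspect:

SELECT *
FROM ::fn_trace_gettable('c:\my traces\test.trc', default);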
We have been having problems getting a linked server to an Excel file working with a UNC path. If the UNC path is to the SQL Server itself it will work, but not if the UNC path is to another server.
The SQL Server 2000 SP4 processes are running under a domain ID; we can log on interactively with that same login ID and access the Excel file via the same UNC path.
We have tried it with setting up a linked server and also linking to it 'on the fly':
SELECT * FROM OpenDataSource( 'Microsoft.Jet.OLEDB.4.0', 'Data Source="\\AnotherServer\UNCpath\excel.xls";User ID=Admin;Password=;Extended properties=Excel 5.0')...sheet1$
GO
The message we are getting is the following:
Server: Msg 7399, Level 16, State 1, Line 1
OLE DB provider 'Microsoft.Jet.OLEDB.4.0' reported an error. The provider did not give any information about the error.
OLE DB error trace [OLE/DB Provider 'Microsoft.Jet.OLEDB.4.0' IDBInitialize::Initialize returned 0x80004005: The provider did not give any information about the error.].
It appears from reading Microsoft's documentation and other topics that this should be possible. Any ideas on what we are missing?
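For comparison, a sketch of the persistent linked-server form (server, share, and sheet names are placeholders):

EXEC sp_addlinkedserver
    @server = 'XLSRC',
    @srvproduct = 'Jet 4.0',
    @provider = 'Microsoft.Jet.OLEDB.4.0',
    @datasrc = '\\AnotherServer\share\excel.xls',
    @provstr = 'Excel 8.0';
EXEC sp_addlinkedsrvlogin 'XLSRC', 'false', NULL, 'Admin', NULL;
SELECT * FROM XLSRC...[Sheet1$];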
I have two, probably related, problems.
I have a very simple site. It will be used on the local intranet only. I want to use Windows authentication so users do not have to log on. The aspnetdb has been created using the configuration tool. If I use Server Explorer in VS to work on the site on my local machine, I can open aspnetdb, look at its tables, etc. If I use Server Explorer in VS to work on the site on the server, aspnetdb will not open and gives this error message: ..."network path that is not supported for database files"... Other sites on the same server give the same message, but the sites work. I think this is because they can read the file but not write it, but that is a guess. We have tried everything; my head is ready to explode. We are using SQL Server Express. It resides on the server's C: drive (as does our data database). The web site, and hence the aspnetdb, are on the F: drive of the same machine. Does SQL Server Express treat the F: drive as a UNC drive? If so, are we in deep doo doo? I'm at a loss.
As far as logging in: if I debug from VS, the login works beautifully both on my local machine and running from the server using VS. Login does not work when accessing the site from Internet Explorer. Mystified by that one, too. I tried to implement profiles and they crash, too; that led me to consider a write problem. Don't know what to do; any help is appreciated.
Hello, Completely new to this stuff, so sorry if I'm asking something that is painfully simple to resolve!!! Also, not sure whether there are standard formats for code, errors etc.
OK. When I create a Login.aspx file in VWD Express on my local hard disk drive, the file works and runs correctly, allowing me to log in etc. However, when I do exactly the same thing using a remote network drive connection to an intranet server, when you enter the username and password and click the Log In button, the following error message appears:
The file "[Network Drive]\Authors\App_Data\aspnetdb.mdf" is on a network path that is not supported for database files. An attempt to attach an auto-named database for file S:\Authors\App_Data\aspnetdb.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share. Then I get the following information - which I'm sure is supposed to help, but doesn't!!!
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.SqlClient.SqlException: The file "S:\Authors\App_Data\aspnetdb.mdf" is on a network path that is not supported for database files. An attempt to attach an auto-named database for file S:\Authors\App_Data\aspnetdb.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share.
Source Error:
An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.
Stack Trace:
[SqlException (0x80131904): The file "S:\Authors\App_Data\aspnetdb.mdf" is on a network path that is not supported for database files. An attempt to attach an auto-named database for file S:\Authors\App_Data\aspnetdb.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share.]
System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection) +115
System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj) +346
System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj) +3244
System.Data.SqlClient.SqlInternalConnectionTds.CompleteLogin(Boolean enlistOK) +56
System.Data.SqlClient.SqlInternalConnectionTds.OpenLoginEnlist(SqlConnection owningObject, SqlConnectionString connectionOptions, String newPassword, Boolean redirectedUserInstance) +1083
System.Data.SqlClient.SqlInternalConnectionTds..ctor(DbConnectionPoolIdentity identity, SqlConnectionString connectionOptions, Object providerInfo, String newPassword, SqlConnection owningObject, Boolean redirectedUserInstance) +272
System.Data.SqlClient.SqlConnectionFactory.CreateConnection(DbConnectionOptions options, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningConnection) +687
System.Data.ProviderBase.DbConnectionFactory.CreatePooledConnection(DbConnection owningConnection, DbConnectionPool pool, DbConnectionOptions options) +82
System.Data.ProviderBase.DbConnectionPool.CreateObject(DbConnection owningObject) +558
System.Data.ProviderBase.DbConnectionPool.UserCreateRequest(DbConnection owningObject) +126
System.Data.ProviderBase.DbConnectionPool.GetConnection(DbConnection owningObject) +651
System.Data.ProviderBase.DbConnectionFactory.GetConnection(DbConnection owningConnection) +160
System.Data.ProviderBase.DbConnectionClosed.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory) +122
System.Data.SqlClient.SqlConnection.Open() +229
System.Web.DataAccess.SqlConnectionHolder.Open(HttpContext context, Boolean revertImpersonate) +114
System.Web.DataAccess.SqlConnectionHelper.GetConnection(String connectionString, Boolean revertImpersonation) +225
System.Web.Security.SqlMembershipProvider.GetPasswordWithFormat(String username, Boolean updateLastLoginActivityDate, Int32& status, String& password, Int32& passwordFormat, String& passwordSalt, Int32& failedPasswordAttemptCount, Int32& failedPasswordAnswerAttemptCount, Boolean& isApproved, DateTime& lastLoginDate, DateTime& lastActivityDate) +1105
System.Web.Security.SqlMembershipProvider.CheckPassword(String username, String password, Boolean updateLastLoginActivityDate, Boolean failIfNotApproved, String& salt, Int32& passwordFormat) +157
System.Web.Security.SqlMembershipProvider.CheckPassword(String username, String password, Boolean updateLastLoginActivityDate, Boolean failIfNotApproved) +68
System.Web.Security.SqlMembershipProvider.ValidateUser(String username, String password) +100
System.Web.UI.WebControls.Login.AuthenticateUsingMembershipProvider(AuthenticateEventArgs e) +100
System.Web.UI.WebControls.Login.OnAuthenticate(AuthenticateEventArgs e) +113
System.Web.UI.WebControls.Login.AttemptLogin() +178
System.Web.UI.WebControls.Login.OnBubbleEvent(Object source, EventArgs e) +134
System.Web.UI.Control.RaiseBubbleEvent(Object source, EventArgs args) +56
System.Web.UI.WebControls.Button.OnCommand(CommandEventArgs e) +107
System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument) +178
System.Web.UI.WebControls.Button.System.Web.UI.IPostBackEventHandler.RaisePostBackEvent(String eventArgument) +31
System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument) +32
System.Web.UI.Page.RaisePostBackEvent(NameValueCollection postData) +72
System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +3838
Version Information: Microsoft .NET Framework Version:2.0.50727.42; ASP.NET Version:2.0.50727.42
I'm getting a rule check failure ("Long path names to files on SQL Server Installation media" failed) when installing SQL 2012 Standard edition from a network share.
I am having an issue running an SSIS package which connects to an Excel file on a shared location. It works fine on the machine of the person who developed it, as he has access to that shared drive. After deploying the package to SSISDB, creating a proxy account with the developer's credentials, and running the package under that proxy from a SQL Agent job, it is failing with this error:
Load XXXXXXXXXX :Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "XXXXXXXXXX.xlsx"
failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
XXXXXXXXXX: Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft Access Database Engine" Hresult: 0x80004005 Description: "Failure creating file.".
I wrote the script below to print all folders and files located in the share path. How do I extend my script to add another column indicating whether each item is a folder or a file - a 0 or 1 flag?
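For reference, the undocumented xp_dirtree procedure returns exactly that flag when its third argument is 1 - the result set gains a "file" column that is 1 for files and 0 for folders (depth 0 means no limit; the path is a placeholder):

EXEC master.sys.xp_dirtree '\\server\share\folder', 0, 1;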
In SQL Server Management Studio, I can go to the restore window and click "Verify backup media". This checks the restore plan listed in the window and reports whether any of the files are missing. Is there any way to do this with T-SQL? I know there is a RESTORE VERIFYONLY command, but that only verifies one backup/file, not the complete restore path. I'm looking for a T-SQL solution that can check that all necessary backup files are still present on the file system and have not been moved or deleted (e.g. through a cleanup task).
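A partial sketch of one approach (database name and paths are assumptions): read the recorded file names for the restore chain out of msdb, then verify each device; physical existence alone can be probed with the undocumented xp_fileexist:

SELECT bs.backup_finish_date, bs.type, bmf.physical_device_name
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf
    ON bs.media_set_id = bmf.media_set_id
WHERE bs.database_name = 'MyDB'
ORDER BY bs.backup_finish_date;

RESTORE VERIFYONLY FROM DISK = '\\backupserver\share\MyDB_full.bak';

EXEC master.dbo.xp_fileexist '\\backupserver\share\MyDB_full.bak';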
I have a Windows Server 2012 R2 2 node cluster with SQL Server 2014 FCI installed. Data files are on a separate Windows Server 2012 R2 file server. Data files share has been permissioned to the SQL Server service and SQL Server Agent service accounts as Full Control. NTFS Permissions are Full Control.
When I try to attach a database with
CREATE DATABASE AdventureWorksDW2012 ON (FILENAME = '\\apricot\mssql_VIOLET\MSSQL12.MSSQLSERVER\MSSQL\DATA\AdventureWorksDW2012_Data.mdf') FOR ATTACH
I get this error: Msg 5120, Level 16, State 101, Line 4 Unable to open the physical file "\\apricot\mssql_VIOLET\MSSQL12.MSSQLSERVER\MSSQL\DATA\AdventureWorksDW2012_Data.mdf". Operating system error 5: "5(Access is denied.)".
If I log into the file server (called APRICOT) and look at the NTFS permissions they all look good. I have also reapplied the NTFS permissions from the root folder down.
EDIT: If I log on to one of the nodes in the cluster as the SQL Server service account, navigate to \\apricot\mssql_VIOLET\MSSQL12.MSSQLSERVER\MSSQL\DATA, and copy and paste the data file, it works fine.
EDIT2: If I log on to the file server and Enable Inheritance at the root level, then Replace all child objects with inheritable permission entries from this object, I get this error:
User Account Control settings on all nodes and the file server are set to Never notify
I was wondering if there is a way to 'Move File' with the File System Task inside of a For Each Loop container but to dynamically set the Destination path variable.
Currently, this is what I have:
FileDestinationPath variable - set to C:\TestFiles
FileSourcePath variable - set to C:\TestFiles
FileNameAndLocation variable - set to blank
For Each Loop Container - iterates through a folder C:\TestFiles that has .txt files in it with dates in the file name, e.g. Test_09142006.txt. Sets the file path (fully qualified) to the Variable Mapping FileNameAndLocation.
Script Task (within For Each Loop, first step) - sets the FileDestinationPath to the correct dated folder within C:\TestFiles. For example, if the text files I want to move are for the 14th of September, it takes FileDestinationPath and appends the date folder to the end of it. The text files have a date in the file name (test_09142006.txt) and I am picking this apart (from FileNameAndLocation in the For Each Loop) to get the folder date. (Dts.Variables("User::FileDestinationPath").Value = Dts.Variables("User::FileDestinationPath").Value & "\" & Month & "_" & Day & "_" & Year & "\", which gives me "C:\TestFiles\9_14_2006\".)
File System Task (within For Each Loop, second step) - this is where the action is supposed to occur. I want it to take the FileDestinationPath and move the FileNameAndLocation file (from the For Each Loop) into this folder on each run.
Now for my problem. I want this package to run every day, but it has to set the FileDestinationPath variable dynamically according to that day's date. Basically, how do I get this to work, since I can't hard code the destination path variable from the start? I have the DestinationVariable on the File System Task set to the FileDestinationPath variable, after the Script Task builds it. However, using FileNameAndLocation as the SourceVariable on my File System Task tells me that "Variable "FileNameAndLocation" is used as a source or destination and is empty."
Let me know if I need to clarify further... I may be missing something very simple. Any help would be greatly appreciated!
I need to write a generic CSV importer. One set of CSVs defines the formats of all the other CSVs. The format of the second set of CSVs is not known at the time of creating this project/SSIS package.
Imagine having two sets of CSV files.
Each file in the first set of CSVs defines the format of files in the second set of CSVs. The first set always has the same format: each row of the CSV file defines a field with a Name and Type. The first two columns of each row in the first set of CSVs hold a FormatName and an index. So a simple file may have:
The second set of CSVs contains records that have to comply with the format defined in the first set. So in this particular case, a CSV file in the second set may have records such as:
The problem I have is creating a generic SSIS package for this. The first task, loading the first set of CSV files, is fairly simple - the CSV format is known. But the second task is a bit trickier.
Assume I have SQL tables to load the data into: one 'Fields' record for each row in the CSVs from the first set, one 'Rows' record for each row in the CSVs from the second set, and one 'Values' record for each value in each row in the CSVs from the second set. Something like:
TABLE Fields (
    FieldID int IDENTITY,
    FormatName varchar(100),
    Position int,
    ColumnName varchar(100),
    ColumnType varchar(20)
)
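The companion tables implied above might look like this (a sketch - names and sizes are assumptions):

TABLE Rows (
    RowID int IDENTITY,
    FormatName varchar(100),
    SourceFile varchar(260)
)

TABLE Values (
    ValueID int IDENTITY,
    RowID int,
    FieldID int,
    Value varchar(4000)
)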
The options I can see are:
- Dynamically create an SSIS package based on the content of each CSV file of the first set, and execute that package for each file in the second set, selecting the correct package to execute (all records within a CSV file belong to the same format).
- Write a single SSIS package that iterates through the rows of the second set of CSVs, doing a lookup for each value of each row to find the field name, and inserting into the DB.
- Some other SSIS method?
- Don't use SSIS to parse the CSV and call a custom C# task instead?
- Don't use SSIS at all?
I am new to SQL 7 and I am trying to run a package that will import fixed-width text files in a folder and transform the data into a table. I am using a good example found on sywink.com for dynamically importing files in a directory, but I am getting an error message when there is more than one file in the folder. Also, I am able to display all the files during the loop process, but the file initially set for the source connection is the only one that transfers data. I have the Close (Transform) connection checkbox option checked, and have tried other methods of closing the connection before the new source name is called; none have helped. Does anyone have any solutions to this type of problem? Or know of other methods for my situation? Thanks, James
Hi. I am using DBF files as sources for some tables in SQL Server. The problem is that the table names in the DBF files are not all the same (e.g. frame061, frame949). I want to know the names of the tables inside the DBF files. Is there a way to query the table names, something like "select table_name from information_schema.tables" in SQL Server? By the way, those DBF files came from FoxPro. Thanks!
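If these are FoxPro free tables, each .dbf file holds exactly one table and the table name is simply the file name, so listing the files answers the question. A sketch (xp_cmdshell must be enabled; the path is a placeholder):

EXEC master..xp_cmdshell 'dir /b C:\dbfdata\*.dbf';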
I have a challenge I am trying to overcome; hopefully someone has come across this issue before.
I am creating a DTS package that will be scheduled to run at a certain time every day. A source folder exists that gets a set of new files every day. The DTS package will then read each file and copy the data into a load table in my database. The challenge is this:
I am trying to load files from a source folder into my load table. Within each file, the entries are in a specific format using pipes to separate the data that goes into each column, e.g.
example of a file entry:
column1 | column2 | column3
data1 | data2 | data3
data1 | data2 | data3
data1 | data2 | data3
And now I am using DTS to specify the file format and map the columns as appropriate to my table. All this is well and good, but my problem is that each file has a different name as well as being timestamped. How do I use DTS to specify the source folder, open each file sequentially, and read (or more appropriately, copy) the entries into my table, inserting new data from each file into my load table as well as overwriting old data in the load table from the files in the folder? Is there a way of specifying your source folder in DTS rather than specifying the file in the menu options (in the Transform Data task properties), or do I need to write a script for this (reading the file)?
Can someone please give me a solution and suggest how to approach this?
I am trying to modify the file paths (primary file, log file) of databases, but it looks like I am not able to modify their file paths directly from the database properties dialogue. Would any experts here please give me some ideas on what else I can try? Thanks a lot in advance and I am looking forward to hearing from you shortly.
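For SQL Server 2005 and later, one approach sketch (logical file names and paths are assumptions): repoint the catalog entries with ALTER DATABASE, take the database offline, move the physical files, and bring it back online:

ALTER DATABASE MyDB MODIFY FILE (NAME = MyDB_Data, FILENAME = 'E:\SQLData\MyDB.mdf');
ALTER DATABASE MyDB MODIFY FILE (NAME = MyDB_Log, FILENAME = 'E:\SQLLogs\MyDB_log.ldf');
ALTER DATABASE MyDB SET OFFLINE;
-- Move the .mdf and .ldf files to the new locations, then:
ALTER DATABASE MyDB SET ONLINE;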
Hello everybody. I am building a DTS package to transfer data from SQL Server into an Excel file.
The source query is constant, but the destination will be supplied by a parameter.
At design time I created the destination Excel file and saved a copy of it, e.g. C:\templ_excel.xls.
Presently the DTS package works in the following order:
1. Set the data source of the destination from a global variable (@@X).
2. Execute xp_cmdshell to copy C:\templ_excel.xls to the file named in @@X.
3. Run the transformation.
How do I eliminate step 2? If I run only steps 1 and 3, I get the error "table does not exist". How do I dynamically create the table in Excel and map the columns for the transfer?
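One way to eliminate the copy step (a sketch): Jet treats a worksheet as a table, so an Execute SQL task running against the Excel connection can create the sheet and its columns before the transformation runs. Column names and types below are placeholders:

CREATE TABLE `Sheet1` (
    `OrderID` INTEGER,
    `CustomerName` NVARCHAR(50),
    `Amount` DOUBLE
)

Point the transformation's destination at the new Sheet1 table afterwards; this is the same style of statement the designers generate when you create a new destination table.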
I have different files that are sent from our vendors. Some are TXT and some are XLS. Some will have the same structure. I plan on grouping these together as best as I can.
My main problem is that I would like to go from one source that matches a group of files to a single SQL table. I'm still learning about SSIS and its capabilities. If I can get pointed in the right direction or have an example to work from, that would be great.
I've tried googling to find some step-by-step examples and hints to do this right, but so far I'm at a bit of an impasse.
I have multiple flat files in a source folder (they have naming conventions with today's date, e.g. Flatfile_20082204_1, Flatfile_20082204_2, Flatfile_20082204_3). I need to extract each and every file dynamically, transform the flat file, and then load it into the destination folder with a standard prefix and today's date with a sequence letter, e.g. Flatfile_20082304_A, Flatfile_20082304_B, Flatfile_20082304_C.
Is there any way to change the delimiter in a Flat File source adapter dynamically? You can do that with the row delimiter, but there is no expression for the column delimiter... Is there any other way besides changing the XML code?
When trying to install Business Contact Manager (BCM) for Outlook 2007, the setup failed and I was referred to a log file in my Local Settings/Temp folder. The log actually says that Business Contact Manager was installed successfully! BCM is supposed to install SQL Express 2005 as an instance, or as an additional instance if SQL Express is already installed. There is an MSSMLBIZ instance in Services.
Who can I send the log file to for analysis and feedback on the fix?
When I first went into Computer Management and clicked on Services and Applications in the left panel, the error message appeared: "Snap-in failed to initialize. Name: SQL Server Configuration Manager CLSID: {CA9F8727-31DF-41D2-975C-887D84903967}". This message disappeared when I clicked on Services and Applications again. Under Services, there are 3 SQL services - one is for an application that was uninstalled 3-4 weeks ago, and I have disabled that service. The other two are SQL Server (MSSMLBIZ) and SQL Server (SQLEXPRESS). When I tried to start either of the last two, this message appeared: "Could not start the SQL Server (MSSMLBIZ) service on Local Computer. Error 3: The system cannot find the path specified." The Program Files/Microsoft SQL Server/MSSQL.1 folder is mostly empty, so it seems like the path in the registry is not valid and that nothing was installed in the MSSQL.1 folder. If so, how do I fix this?
How do I get the BCM SQL instance to install and run properly? What do the messages in Services mean, and how do I resolve them?
After updating the tempdb path to a wrong path (a folder name only, without the file name), the service is not starting. How can I solve this and start the service?
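A sketch of the standard recovery, assuming the default logical names tempdev and templog: start the instance with minimal configuration so tempdb is not opened (from a command prompt: net start MSSQLSERVER /f /T3608), connect with sqlcmd, fix the paths, then restart the service normally:

ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'C:\SQLData\tempdb.mdf');
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'C:\SQLData\templog.ldf');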
I have an SSIS package that exports data from SQL Server to an Excel file. I need help figuring out how to have the file name be "Report_02132007.xls". Basically I want to append the date to the file name. Any ideas?
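One way (a sketch, with the folder path assumed): put a property expression on the Excel connection manager's ExcelFilePath property. Backslashes are doubled in SSIS string literals, and the casts pad the date parts to two digits:

"C:\\Reports\\Report_" + RIGHT("0" + (DT_WSTR, 2) MONTH(GETDATE()), 2) + RIGHT("0" + (DT_WSTR, 2) DAY(GETDATE()), 2) + (DT_WSTR, 4) YEAR(GETDATE()) + ".xls"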
Hi All, I've been reading this article http://rafael-salas.blogspot.com/2006/12/import-header-line-tables-_116683388696570741.html
in regards to creating an Excel spreadsheet dynamically in SQL Server 2005 SSIS. However, I'm constantly hitting an issue where the tab is created but not populated. Can somebody post a clearer example?
The problem I'm trying to solve is to automate the export of a query onto a new dynamic spreadsheet each time I run this SSIS package.
This method has worked beautifully for all my SSIS pkgs thus far.
Basically, I use a Script Task to derive the name of the newest file in a local directory. Then I save the name of the file to a user variable, e.g. User::File.
Then, in my flat file properties > Expressions, I set "ConnectionString" to reference User::File.
However, when attempting to use this method with an Excel source, I get this error message:
Error at myPkg [Connection manager "Excel Connection Manager"]: The connection string format is not valid. It must consist of one or more components of the form X=Y, separated by semicolons. This error occurs when a connection string with zero components is set on database connection manager. Error at myPkg: The result of the expression "@[User::Folder]+ @[User::File]" on property "ConnectionString" cannot be written to the property. The expression was evaluated, but cannot be set on the property.
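In case it helps others: the Excel connection manager expects a complete provider connection string in its ConnectionString property, which is why an expression that evaluates to a bare file path fails with the "zero components" error. The usual fix (a sketch, using the same variables) is to put the expression on the ExcelFilePath property instead:

Property:   ExcelFilePath
Expression: @[User::Folder] + @[User::File]

Combine this with DelayValidation = True on the connection manager if the file only exists at run time.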