Add Existing Shared Folder To File Table
Jan 29, 2015. I need to add an existing shared folder to a SQL Server FileTable. I have the folder's path and I have created the FileTable; now I need to add the existing folder's contents to it.
Please can anybody help me transfer existing SSIS packages, saved in a shared folder location, from the development server 2ED to the live server TWD1?
Both servers run SQL Server 2005 and have Visual Studio 2005 installed.
Currently about 25 SSIS packages are executed from the development server, transferring data to the live server TWD1; these ETL processes are called from the development server but executed on the live server.
Now the problem is that when I call these packages from the shared folder on the live server, they crash. I need to change something to shift the whole set of packages to the live server and execute them there, instead of recreating all 25 processes from scratch. I also use "optimize for many tables" and run in a single transaction, so how can I see the mappings of the source and destination tables?
Please let me know how I can achieve this.
Thanks
George
I wrote the script below to print all folders and files located in the share path. How can I extend the script to add another column indicating whether each entry is a folder or a file, as a 0 or 1 flag?
-- List the contents of the share by shelling out to DIR via xp_cmdshell
declare @chkdirectory1 varchar(4000) = '\\shared_path\folder';
declare @finalserver3 varchar(4000);
create table #tmp (directory_name varchar(4000));

-- Build the DIR command; /B returns bare names only
SET @finalserver3 = '''DIR "' + @chkdirectory1 + '" /B''';

DECLARE @ExecCmd varchar(4000);   -- wide enough for a long UNC path
SET @ExecCmd = 'EXEC master.dbo.xp_cmdshell ' + @finalserver3;

-- Capture the command output so it can be queried
INSERT INTO #tmp (directory_name)
EXEC (@ExecCmd);

SELECT directory_name FROM #tmp WHERE directory_name IS NOT NULL;

drop table #tmp;
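One way to get the folder/file flag directly, without parsing DIR output, is the undocumented but widely used xp_dirtree procedure: called with a third argument of 1 it returns a file column that is 1 for files and 0 for folders. A minimal sketch, reusing the same share path as above:

-- xp_dirtree with 3 arguments returns subdirectory, depth and file columns
-- (file = 1 for a file, 0 for a folder); depth 1 = top level only
DECLARE @share varchar(4000) = '\\shared_path\folder';

CREATE TABLE #dir
(
    item_name varchar(4000),
    depth     int,
    is_file   bit
);

INSERT INTO #dir (item_name, depth, is_file)
EXEC master.sys.xp_dirtree @share, 1, 1;

SELECT item_name, is_file FROM #dir;

DROP TABLE #dir;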
I have a package that needs to copy a file from a remote server using a path like \\ipaddress\sharedfolder. Inside the Microsoft data tools everything runs fine, because I have access to those folders in my Windows session and can enter my credentials. However, how do I set up the package to use my credentials on the remote server?
Without that, I get the error "Executed as user: NT Service\SQLAgent$RETAIL_PRO", and this user does not exist on the remote server, so I get an access denied error.
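A common way to handle this when the package runs from a SQL Server Agent job is to create a credential for a Windows account that does exist on (or is trusted by) the remote server, wrap it in an Agent proxy for the SSIS subsystem, and run the job step under that proxy. A rough sketch, with DOMAIN\ssis_runner and the password as hypothetical placeholders:

-- Store the Windows account that has rights on the remote share
CREATE CREDENTIAL SSISRemoteCopy
    WITH IDENTITY = 'DOMAIN\ssis_runner',   -- hypothetical account
         SECRET   = 'StrongPasswordHere';

-- Expose the credential to SQL Server Agent as a proxy
EXEC msdb.dbo.sp_add_proxy
    @proxy_name      = N'SSIS Remote Copy Proxy',
    @credential_name = N'SSISRemoteCopy',
    @enabled         = 1;

-- Allow the proxy to be used by SSIS package execution job steps
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name     = N'SSIS Remote Copy Proxy',
    @subsystem_name = N'SSIS';

The SSIS job step can then select this proxy in the step's "Run as" dropdown, so the package no longer runs as the Agent service account.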
I know a WMI event watcher can be used to watch for a new file being added to a folder. However, I need to check for new folders being added to an existing folder. I haven't been able to find a post on doing this. Is there a way in WQL to check for a new folder being added instead of a new file? I've used SQL for years, but am new to SSIS.
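For folders rather than files, the WMI Event Watcher task can poll for new Win32_Directory instances instead of CIM_DataFile ones. A sketch of the WQL for the task's WqlQuerySource, with the watched drive and path as hypothetical placeholders (WITHIN 10 is the polling interval in seconds, and backslashes in the Path value are doubled because WQL escapes them):

SELECT * FROM __InstanceCreationEvent WITHIN 10
WHERE TargetInstance ISA 'Win32_Directory'
AND TargetInstance.Drive = 'D:'
AND TargetInstance.Path = '\\Watched\\Parent\\'

This fires when a new folder appears directly inside D:\Watched\Parent.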
Hi all,
I've tried searching but nothing came up specific to my problem.
I currently have all the reports separated into folders, such as Finance, HR etc., and the data sources in a Datasources folder at the root level. My problem is that when I upload reports into their appropriate folder, the shared data source information gets deleted and I need to re-map it every time.
Is there any way to have specific reports point to specific shared data source folders on the report server when I upload them?
cheers
Rob.
Can SQL Server 2005 access files on a shared folder (which SQL Server 2000 was not able to do)?
Thanks in advance
peleg
Israel - the best place to live in after heaven (but no one wants to go there so fast) :-)
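As a quick sanity check, the engine can reach a UNC path as long as its service account has permission on the share; a small sketch against a hypothetical path (xp_fileexist is undocumented but widely used, and xp_cmdshell must be enabled):

-- Returns 1 in the first column if the file is visible to the SQL Server service account
EXEC master.dbo.xp_fileexist '\\fileserver\share\somefile.txt';

-- List the share's contents (requires xp_cmdshell to be enabled)
EXEC master.dbo.xp_cmdshell 'DIR \\fileserver\share /B';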
Hi All,
I'm delivering some reports through data-driven subscriptions to a shared folder on another computer in my network. In Report Manager, under Status, it shows the following message:
Processing: 10 processed of 12 total; 0 errors.
When I check the folder, sometimes I can see only 6 reports. I rescheduled it and ran it again; then I could see only 10 out of 12 reports.
I opened Management Studio and tried to refresh the report subscription under Subscriptions. Then I got the following error message.
"
nvalid node. (Microsoft.SqlServer.SmoEnum)
------------------------------
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&LinkId=20476
------------------------------
Program Location:
at Microsoft.SqlServer.Management.Smo.XPathExpression.Load(AstNode ast)
at Microsoft.SqlServer.Management.Smo.XPathExpression.Compile(String strXPathExpression)
at Microsoft.SqlServer.Management.Smo.Urn.get_XPathExpression()
at Microsoft.SqlServer.Management.Smo.Urn.Compare(Urn u1, Urn u2, CompareOptions[] compInfoList, CultureInfo cultureInfo)
at Microsoft.SqlServer.Management.UI.VSIntegration.ObjectExplorer.NavigableItemBuilder.RebuildItem(INavigableItem item, Boolean applyItemParentFilter)
at Microsoft.SqlServer.Management.UI.VSIntegration.ObjectExplorer.NavigableItem.RequeryProperties()
at Microsoft.SqlServer.Management.UI.VSIntegration.ObjectExplorer.NavigableItem.Refresh(Boolean autoRefresh)
at Microsoft.SqlServer.Management.UI.VSIntegration.ObjectExplorer.NavigableItem.Refresh()
at Microsoft.SqlServer.Management.UI.VSIntegration.ObjectExplorer.ExplorerHierarchyNode.Refresh()
Even when I check the error log folder, I cannot see a log file for this run; the log below is actually from the previous run.
<Header>
<Product>Microsoft SQL Server Reporting Services Version 9.00.2047.00</Product>
<Locale>en-US</Locale>
<TimeZone>Pacific Standard Time</TimeZone>
<Path>C:\Program Files\Microsoft SQL Server\MSSQL.3\Reporting Services\LogFiles\ReportServerService__11_05_2007_11_29_11.log</Path>
<SystemName>DEV02</SystemName>
<OSName>Microsoft Windows NT 5.1.2600 Service Pack 2</OSName>
<OSVersion>5.1.2600.131072</OSVersion>
</Header>
ReportingServicesService!library!4!11/5/2007-11:29:12:: i INFO: Initializing ConnectionType to '0' as specified in Configuration file.
ReportingServicesService!library!4!11/5/2007-11:29:12:: i INFO: Initializing IsSchedulingService to 'True' as specified in Configuration file.
ReportingServicesService!library!4!11/5/2007-11:29:12:: i INFO: Initializing MaxQueueThreads to '0' thread(s) as specified in Configuration file.
ReportingServicesService!library!4!11/5/2007-11:29:12:: i INFO: Initializing IsWebServiceEnabled to 'True' as specified in Configuration file.
ReportingServicesService!configmanager!4!11/5/2007-11:29:12:: w WARN: WebServiceAccount is not specified in the config file. Using default: DEV02\ASPNET
ReportingServicesService!library!4!11/5/2007-11:29:12:: i INFO: Initializing MaxActiveReqForOneUser to '20' requests(s) as specified in Configuration file.
ReportingServicesService!library!4!11/5/2007-11:29:12:: i INFO: Initializing WatsonDumpOnExceptions to 'Microsoft.ReportingServices.Diagnostics.Utilities.InternalCatalogException,Microsoft.ReportingServices.Modeling.InternalModelingException' as specified in Configuration file.
ReportingServicesService!library!4!11/5/2007-11:29:12:: i INFO: Initializing WatsonDumpExcludeIfContainsExceptions to 'System.Data.SqlClient.SqlException,System.Threading.ThreadAbortException' as specified in Configuration file.
ReportingServicesService!library!4!11/5/2007-11:29:12:: i INFO: Initializing SecureConnectionLevel to '0' as specified in Configuration file.
ReportingServicesService!library!4!11/5/2007-11:29:12:: i INFO: Initializing DisplayErrorLink to 'True' as specified in Configuration file.
ReportingServicesService!library!4!11/5/2007-11:29:12:: i INFO: Initializing WebServiceUseFileShareStorage to 'False' as specified in Configuration file.
ReportingServicesService!library!8!11/5/2007-11:29:20:: e ERROR: Throwing Microsoft.ReportingServices.Diagnostics.Utilities.ReportServerDatabaseUnavailableException: The report server cannot open a connection to the report server database. A connection to the database is required for all requests and processing., ;
Info: Microsoft.ReportingServices.Diagnostics.Utilities.ReportServerDatabaseUnavailableException: The report server cannot open a connection to the report server database. A connection to the database is required for all requests and processing. ---> System.Data.SqlClient.SqlException: Cannot open database "ReportServer" requested by the login. The login failed.
Login failed for user 'ZOPAratnayake'.
at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
at System.Data.SqlClient.SqlInternalConnectionTds.CompleteLogin(Boolean enlistOK)
at System.Data.SqlClient.SqlInternalConnectionTds.OpenLoginEnlist(SqlConnection owningObject, SqlConnectionString connectionOptions, String newPassword, Boolean redirectedUserInstance)
at System.Data.SqlClient.SqlInternalConnectionTds..ctor(DbConnectionPoolIdentity identity, SqlConnectionString connectionOptions, Object providerInfo, String newPassword, SqlConnection owningObject, Boolean redirectedUserInstance)
at System.Data.SqlClient.SqlConnection.Open()
at Microsoft.ReportingServices.Library.ConnectionManager.OpenConnection()
--- End of inner exception stack trace ---
ReportingServicesService!library!8!11/5/2007-11:29:21:: Exception caught while starting service. Error: Microsoft.ReportingServices.Diagnostics.Utilities.ReportServerDatabaseUnavailableException: The report server cannot open a connection to the report server database. A connection to the database is required for all requests and processing. ---> System.Data.SqlClient.SqlException: Cannot open database "ReportServer" requested by the login. The login failed.
Login failed for user 'ZOPAratnayake'.
at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
ReportingServicesService!library!4!11/05/2007-11:43:03:: Status: The file "Members - Awaiting Approval.xls" has been saved to the "\derbyReports" file share.
ReportingServicesService!notification!4!11/05/2007-11:43:03:: Notification 77fc9843-80f6-4392-9c26-3a84b087dafb completed. Success: True, Status: The file "Members - Awaiting Approval.xls" has been saved to the "\derbyReports" file share., DeliveryExtension: Report Server FileShare, Report: Members - Awaiting Approval, Attempt 0
ReportingServicesService!dbpolling!4!11/05/2007-11:43:03:: NotificationPolling finished processing item 77fc9843-80f6-4392-9c26-3a84b087dafb
ReportingServicesService!library!4!11/5/2007-11:43:03:: i INFO: Initializing SqlStreamingBufferSize to default value of '64640' Bytes because it was not specified in Server system properties.
ReportingServicesService!library!4!11/5/2007-11:43:03:: i INFO: Initializing SnapshotCompression to 'SQL' as specified in Server system properties.
ReportingServicesService!library!d!11/05/2007-11:43:03:: Data Driven Notification for activation id 668b2d9c-d5ea-46e9-8efd-e2824e9a8c64 was saved.
ReportingServicesService!library!d!11/05/2007-11:43:03:: Status: The file "Members - Awaiting Approval.xls" has been saved to the "\derbyReports" file share.
ReportingServicesService!notification!d!11/05/2007-11:43:03:: Notification 5fa50ecb-1b0f-4e40-86c8-4c4700c4b820 completed. Success: True, Status: The file "Members - Awaiting Approval.xls" has been saved to the "\derbyReports" file share., DeliveryExtension: Report Server FileShare, Report: Members - Awaiting Approval, Attempt 0
ReportingServicesService!dbpolling!d!11/05/2007-11:43:03:: NotificationPolling finished processing item 5fa50ecb-1b0f-
ReportingServicesService!library!d!11/05/2007-11:43:03:: Data Driven Notification for activation id 668b2d9c-d5ea-46e9-8efd-e2824e9a8c64 was saved.
ReportingServicesService!library!d!11/05/2007-11:43:03:: Status: The file "Members - Awaiting Approval_1.xls" has been saved to the "\derbyReports" file share.
ReportingServicesService!notification!d!11/05/2007-11:43:03:: Notification 3a282979-9cab-44fd-b4e8-a083558c494e completed. Success: True, Status: The file "Members - Awaiting Approval_1.xls" has been saved to the "\derbyReports" file share., DeliveryExtension: Report Server FileShare, Report: Members - Awaiting Approval, Attempt 0
ReportingServicesService!dbpolling!d!11/05/2007-11:43:03:: NotificationPolling finished processing item 3a282979-9cab-44fd-b4e8-a083558c494e
ReportingServicesService!subscription!4!11/05/2007-11:43:03:: System.IO.IOException: The process cannot access the file '\derbyReportsMembers - Awaiting Approval_1.xls' because it is being used by another process.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.FileStream.Init(String path, FileMode mode, FileAccess access, Int32 rights, Boolean useRights, FileShare share, Int32 bufferSize, FileOptions options, SECURITY_ATTRIBUTES secAttrs, String msgPath, Boolean bFromProxy)
at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access)
at Microsoft.ReportingServices.FileShareDeliveryProvider.FileShareProvider.SaveReport(Notification notification, SubscriptionData data)
ReportingServicesService!subscription!4!11/05/2007-11:43:03:: Error writing file Members - Awaiting Approval_1.xls to path \derbyReports
ReportingServicesService!library!4!11/05/2007-11:43:03:: Data Driven Notification for activation id 668b2d9c-d5ea-46e9-8efd-e2824e9a8c64 was saved.
ReportingServicesService!library!4!11/05/2007-11:43:03:: Status: Failure writing file Members - Awaiting Approval_1.xls : The process cannot access the file '\derbyReportsMembers - Awaiting Approval_1.xls' because it is being used by another process.
ReportingServicesService!dbpolling!4!11/05/2007-11:43:03:: NotificationPolling finished processing item 09e28175-ce44-41cd-ab55-56199a58885f
ReportingServicesService!dbpolling!d!11/5/2007-11:43:03:: NotificationPolling processing item 41cfd72e-bda0-4e8f-87dd-a22789eb340f
ReportingServicesService!dbpolling!c!11/5/2007-11:43:03:: NotificationPolling processing 2 more items. 2 Total items in internal queue.
ReportingServicesService!dbpolling!4!11/5/2007-11:43:03:: NotificationPolling processing item 34e19ce1-b298-48e6-a2fc-b7637765968c
ReportingServicesService!subscription!11!11/05/2007-13:18:10:: Auto increment filename = Members - Awaiting Approval_5.xls
ReportingServicesService!subscription!11!11/05/2007-13:18:10:: System.IO.IOException: The process cannot access the file '\derbyReportsMembers - Awaiting Approval_5.xls' because it is being used by another process.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.FileStream.Init(String path, FileMode mode, FileAccess access, Int32 rights, Boolean useRights, FileShare share, Int32 bufferSize, FileOptions options, SECURITY_ATTRIBUTES secAttrs, String msgPath, Boolean bFromProxy)
at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access)
at Microsoft.ReportingServices.FileShareDeliveryProvider.FileShareProvider.SaveReport(Notification notification, SubscriptionData data)
ReportingServicesService!subscription!11!11/05/2007-13:26:15:: Auto increment filename = Members - Awaiting Approval_11.xls
ReportingServicesService!library!11!11/05/2007-13:26:15:: Data Driven Notification for activation id 7bd68b9c-855f-4f2a-9482-43ac5f7c9589 was saved.
ReportingServicesService!library!11!11/05/2007-13:26:15:: Status: The file "Members - Awaiting Approval_11.xls" has been saved to the "\derbyReports" file share.
ReportingServicesService!notification!11!11/05/2007-13:26:15:: Notification 76aedee6-e860-4c60-aebc-2c3aeb854bed completed. Success: True, Status: The file "Members - Awaiting Approval_11.xls" has been saved to the "\derbyReports" file share., DeliveryExtension: Report Server FileShare, Report: Members - Awaiting Approval, Attempt 1
ReportingServicesService!dbpolling!11!11/05/2007-13:26:15:: NotificationPolling finished processing item 76aedee6-e860-4c60-aebc-2c3aeb854bed
ReportingServicesService!library!11!11/5/2007-13:29:12:: i INFO: Cleaned 0 batch records, 0 policies, 12 sessions, 0 cache entries, 12 snapshots, 36 chunks, 0 running jobs, 0 persisted streams
I did not post the whole error log here, because it's too long.
Does anyone know why I cannot deliver all the reports?
I am trying to create a basic report in SSRS with 6 columns. We receive an Excel spreadsheet with different account numbers daily, and I am asked to create a report based on those account numbers. How can I set up a shared folder for users to drop their Excel spreadsheet into, and make my SSRS report filter the result set based only on those account numbers?
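One approach is to have the report's stored procedure read the dropped workbook from the share with OPENROWSET and join it to the main query, so only the listed account numbers come back. A sketch, assuming a hypothetical share path, the ACE OLE DB provider installed on the report's database server, Ad Hoc Distributed Queries enabled, and a sheet named Sheet1 with an AccountNumber column:

-- Pull the account numbers the users dropped on the share
SELECT x.AccountNumber
INTO   #accounts
FROM   OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                  'Excel 12.0;Database=\\fileserver\dropbox\accounts.xlsx;HDR=YES',
                  'SELECT AccountNumber FROM [Sheet1$]') AS x;

-- Filter the report's result set to just those accounts
SELECT t.*
FROM   dbo.MyReportTable AS t            -- hypothetical source table
JOIN   #accounts AS a
       ON a.AccountNumber = t.AccountNumber;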
Hi All,
I subscribe a few reports using the data-driven subscription, file share delivery method. Each report is supposed to be delivered to 5 different customers, but each customer can see only his/her data, so I use a parameter to generate 5 different reports, one per customer. Once the five different reports are saved in the shared file folder, is there any method to delete old files? These reports are executed once a week.
Thanks
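File share delivery has no retention option of its own, so a common workaround is a small SQL Agent job (scheduled just before the weekly subscription runs) that deletes anything older than a week from the share. A sketch using xp_cmdshell and the Windows forfiles utility, with the share path as a hypothetical placeholder:

-- Delete report files older than 7 days from the delivery share
DECLARE @cmd varchar(500);
SET @cmd = 'forfiles /p "\\fileserver\reports" /m *.xls /d -7 /c "cmd /c del @file"';
EXEC master.dbo.xp_cmdshell @cmd;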
Hi,
My code should copy files from a shared folder.
The share can be accessed by anyone who can provide a specified username and password.
I use the following code, but where do I specify the username and password to access that folder?
DECLARE @cmdstr varchar(1000)
set @cmdstr = 'copy \\servername\foldername$ C:\DataExTest'
print @cmdstr
EXEC xp_cmdshell @cmdstr
Any sort of help would be highly appreciated.
Thanks in advance
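xp_cmdshell runs under the SQL Server service account (or the xp_cmdshell proxy account), so the credentials have to be supplied to Windows before the copy, typically by mapping the share with net use. A sketch, with the username and password as hypothetical placeholders:

DECLARE @cmdstr varchar(1000);

-- Authenticate to the share first (account and password are placeholders)
SET @cmdstr = 'net use \\servername\foldername$ P@ssw0rd /user:DOMAIN\shareuser';
EXEC master.dbo.xp_cmdshell @cmdstr;

-- Now the copy runs under that authenticated connection
SET @cmdstr = 'copy \\servername\foldername$\*.* C:\DataExTest';
EXEC master.dbo.xp_cmdshell @cmdstr;

-- Drop the connection when finished
EXEC master.dbo.xp_cmdshell 'net use \\servername\foldername$ /delete';

A SQL Agent proxy or the xp_cmdshell proxy account avoids embedding the password in a script, if that is an option.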
Hi,
I'm trying to silently (quiet mode) install SQL Server 2005 Express and our application. I created a custom wrapper and included all the files and folders extracted from SQLEXPR.exe.
When I execute this customized setup it throws an error: Could not register Program Files/Microsoft SQL Server/90/Shared/msxmlsql.dll.
If I ignore that error it throws similar errors for almost all the files under the Program Files/Microsoft SQL Server/90/Shared folder.
Kindly help me to resolve the issue.
Thanks & Regards
Krish
Dear all,
I now have 2 servers, say serverA and serverB. ServerA runs SQL Server, and I am using a command in that SQL Server to access serverB. The command is:
declare @command varchar(500)
set @command = 'mkdir "\\serverB\abc\kevin_wong123"
exit'
exec master.dbo.XYRunProc
'cmd.exe',
'c:',
@command
where abc is a shared folder on serverB which can be accessed from serverA. However, the output of this execution is: c:\> mkdir "\\serverB\abc\kevin_wong123" Access is denied. It is possible to do this from a command prompt, so does anyone know what the problem is?
Thank you very much!!
Kevin
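The "Access is denied" usually means the shell command is running under an account (the service account, or the non-sysadmin proxy) that has no rights on \\serverB\abc, even though your own interactive session does. If the call is effectively going through xp_cmdshell, one fix is to configure the xp_cmdshell proxy account with a Windows login that does have rights on the share. A sketch, with the account name, password and login as hypothetical placeholders:

-- Run xp_cmdshell under a Windows account that has rights on \\serverB\abc
EXEC sp_xp_cmdshell_proxy_account 'DOMAIN\share_writer', 'P@ssw0rd';

-- Non-sysadmin callers also need explicit permission on xp_cmdshell in master
GRANT EXECUTE ON xp_cmdshell TO [some_login];   -- hypothetical login

If the SQL Server service itself runs as Local System, it has no usable network identity; running it as a domain account with modify rights on the share is the other common fix.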
Bearing in mind that this is my target server: what is the way to create the shared folder in order to perform the operation from the title (and, of course, to continue with installation of packages, etc.)? SQL Server 2008 R2.
I am using the code below at my command prompt, and it copies all the records from a particular table and drops them in flat file format in a particular folder location. The code works if I am pointing to my local database, but if I need to point to a different database outside my environment, how should I set it up, including the case where a user ID and password are required to access the database? bcp AdventureWorks.HumanResources.Department out C:\myDepartment_c_t.txt -c -t, -r -T -S.
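bcp targets a server through its -S switch and authenticates either with -T (trusted/Windows) or with -U and -P (SQL login and password), so pointing the same export at a remote instance is mostly a matter of those three switches. A sketch, with server, login and output path as hypothetical placeholders, wrapped in xp_cmdshell so it can also be run from inside SQL Server:

-- Export from a remote instance using a SQL login (server, login and path are placeholders)
EXEC master.dbo.xp_cmdshell
    'bcp AdventureWorks.HumanResources.Department out C:\export\Department_c_t.txt -c -t, -S RemoteServer\Instance -U bcp_user -P StrongPassword';

With -T instead of -U/-P, the export uses the Windows account running the command, which must exist as a login on the remote server.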
Hi Experts,
I have to find the latest file in a folder and export its data to a table in SQL Server.
The code should be something that can be incorporated into a T-SQL stored procedure.
The file name would be, for example, abc_defYYYYMMDD.xls.
Would I be able to find the latest file in the folder using the datestamp (YYYYMMDD) in the filename?
Please note I have files in other formats and names with a datestamp attached to them, so the code has to pick only the specific files whose names start with 'abc_def'
and export their data to a table.
Any help would be highly appreciated.
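Because the datestamp is embedded in the filename, sorting the bare DIR listing descending by name gives the newest abc_def file without touching file timestamps. A sketch for the stored procedure, with the folder path as a hypothetical placeholder; the import itself could then use OPENROWSET, BULK INSERT or an SSIS step against @latestFile:

DECLARE @folder     varchar(260);
DECLARE @latestFile varchar(260);
SET @folder = '\\fileserver\drop\';   -- placeholder share

CREATE TABLE #files (file_name varchar(260));

-- /B = bare names, /O:-N = sort by name descending, so the newest datestamp comes first
INSERT INTO #files (file_name)
EXEC master.dbo.xp_cmdshell 'dir "\\fileserver\drop\abc_def*.xls" /B /O:-N';

SELECT TOP (1) @latestFile = @folder + file_name
FROM   #files
WHERE  file_name LIKE 'abc_def[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9].xls'
ORDER BY file_name DESC;

SELECT @latestFile AS latest_file;   -- feed this into the import step

DROP TABLE #files;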
I have a requirement where I have to take all the data available in a SQL table and write it out as a flat file in a folder location. It's a simple table with 8-10 columns; I have to take this data on a daily basis from the SQL table and deliver it as a flat file in a folder.
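For a simple daily extract like this, a scheduled bcp queryout command is usually enough; a sketch, wrapped in xp_cmdshell so it can run from T-SQL or an Agent job step, with server, database, table and target path as hypothetical placeholders:

-- Daily extract of the table to a comma-separated flat file (names and paths are placeholders)
EXEC master.dbo.xp_cmdshell
    'bcp "SELECT * FROM MyDb.dbo.MyTable" queryout \\fileserver\extracts\MyTable.txt -c -t, -T -S MyServer';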
I am trying my first bulk load into an existing SQL table from a CSV text file. The text file's naming is exactly the same as the SQL table's, with the same attributes.
The statements:
BULK INSERT [Jedox_prod].[dbo].[B_BP_Customer]
FROM 'c:\Baan\jedox_daily\jdcom4401.txt'
WITH
[code]....
The error message is:
Msg 4864, Level 16, State 1, Line 1
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 2, column 3 (BP_Country).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)". I have checked and re-checked the BP_Country field (the first field after the key) and I am not seeing any mismatches.
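Error 4864 on the column right after the key often means the terminators do not match the file (so values shift into the wrong columns) or a header row is being loaded as data. A hedged version of the statement showing the options that usually matter; the terminators, header-row setting and error file path are assumptions to adjust to the actual file:

BULK INSERT [Jedox_prod].[dbo].[B_BP_Customer]
FROM 'c:\Baan\jedox_daily\jdcom4401.txt'          -- path as assumed above
WITH
(
    FIELDTERMINATOR = ',',            -- match the file's real delimiter
    ROWTERMINATOR   = '\n',           -- adjust if the file uses a different line ending
    FIRSTROW        = 2,              -- skip a header row if one exists
    CODEPAGE        = 'ACP',
    ERRORFILE       = 'c:\Baan\jedox_daily\jdcom4401_errors.log',  -- captures the rejected rows
    MAXERRORS       = 10
);

The ERRORFILE output shows exactly which value landed in BP_Country, which usually makes the shift or type problem obvious.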
Hi, I am getting a weekly transaction file which has two columns, trans code and trans date, that indicate whether the record was changed, added or modified; the monthly master file contains blanks in these two fields.
How do I update the rows coming from the transaction file in the tables that contain the rows from the master file?
To better explain, here is an example:
Master File
ID Name AGE Salary Transcode TransDate
2 dev 27 2777
Transaction File indicating change
ID Name AGE Salary Transcode TransDate
2 dev 27 24444 C 08072007
the output should be
2 dev 27 24444 C 08072007
replacing the existing row in the table (updating the whole row).
I have 50 columns in my table; based on the two fields I should replace the rows existing in the table, and if the ID does not match an existing row, just add it as a new row.
What transformation should I use to replace all the columns of the rows whose ID matches the current record from the trans file, and to add records as new rows when there is no matching ID?
Thanks...
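In SSIS this is the classic Lookup pattern: match on ID, send matches to an update path (OLE DB Command or a staging-table UPDATE) and non-matches to the OLE DB Destination insert. If the transaction file is staged into a table first, the same upsert can be done in one statement on SQL Server 2008 or later; a sketch with hypothetical table names and only the sample columns from the post spelled out:

-- Upsert the staged transaction rows into the master table (table names are placeholders)
MERGE dbo.MasterTable AS m
USING dbo.TransactionStage AS t
      ON m.ID = t.ID
WHEN MATCHED THEN
    UPDATE SET m.Name      = t.Name,
               m.AGE       = t.AGE,
               m.Salary    = t.Salary,
               m.Transcode = t.Transcode,
               m.TransDate = t.TransDate
               -- ...repeat for the remaining columns
WHEN NOT MATCHED BY TARGET THEN
    INSERT (ID, Name, AGE, Salary, Transcode, TransDate)
    VALUES (t.ID, t.Name, t.AGE, t.Salary, t.Transcode, t.TransDate);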
I am having a problem with moving a file from one folder to another folder. Here is the detailed scenario:
I want to move an input.csv file from a shared input folder to a shared archive folder. I am using the code below to do this:
declare @inpath varchar(100), @filestatus int
SET @inpath = 'move "\\abcdef\INPUT\input*.csv" "\\abcdef\ARCHIVE\archive.csv"'
EXEC @filestatus = master..xp_cmdshell @inpath
but the problem is that it cuts the input.csv file from the INPUT folder without pasting it into the ARCHIVE folder.
I would really appreciate it if anyone could help me solve this or suggest some workarounds.
Thanks,
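Two things are worth checking here: move with a wildcard source normally expects a destination folder rather than a single destination filename, and xp_cmdshell's text output (which contains the actual error message) is lost when only the return code is captured. A sketch that targets the archive folder and keeps the command output for inspection, with the share path carried over as an assumption:

DECLARE @inpath varchar(500);
SET @inpath = 'move "\\abcdef\INPUT\input*.csv" "\\abcdef\ARCHIVE\"';

CREATE TABLE #cmd_output (line varchar(4000));

-- Capture what the shell actually reports (e.g. "Access is denied", "The system cannot find the path specified")
INSERT INTO #cmd_output (line)
EXEC master..xp_cmdshell @inpath;

SELECT line FROM #cmd_output WHERE line IS NOT NULL;

DROP TABLE #cmd_output;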
Hi all, I have a problem and need some ideas. What I have done: I created a page to upload an Excel file into a SQL Server table along with some customer info (from the login, day, etc.). This Excel file contains several rows (some of them may be blank) and columns (also some may be blank). The file is stored in an image column. The file will be checked manually (they want to do it that way, because the content is the concern). If they say it is OK, I want to run a program that adds a record into an existing table with the request number (from the first table, where the object is stored) and all the information available from the filled rows (the first row is a header). I have a column that can be checked to tell whether a row contains data or not. Any ideas? I know how to read from and write the contents of the object to a field in the SQL table; can I use this? Thanks for any idea / code / link.
In the first step of my SSIS package I need to get files from FTP and dump them in a local directory, but it's more than that; the logic is like this:
1. If no file(s) are found, stop executing and send an email saying no file(s) were found;
2. If file(s) are found, compare them with the existing files in our archive folder; if they already exist in the archive folder, stop executing and send an email saying the file(s) already existed; if they are not in the archive folder yet, transfer them to the local directory for processing.
I know I have to use a script task to do this. I did some research and found examples for each of the above 2 steps but not both combined, which is why I need some help getting the logic incorporated correctly.
Thanks in advance for the help, and I apologize for the long lines of code!
example for step 1:
----------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
Imports Microsoft.VisualBasic.FileIO.FileSystem
Imports System.IO.FileSystemInfo
Public Class ScriptMain
' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
Dim cDataFileName As String
Dim cFileType As String
Dim cFileFlgVar As String
WriteVariable("SCFileFlg", False)
WriteVariable("OOFileFlg", False)
WriteVariable("INFileFlg", False)
WriteVariable("IAFileFlg", False)
WriteVariable("RCFileFlg", False)
cDataFileName = ReadVariable("DataFileName").ToString
cFileType = Left(Right(cDataFileName, 4), 2)
cFileFlgVar = cFileType.ToUpper + "FileFlg"
WriteVariable(cFileFlgVar, True)
Dts.TaskResult = Dts.Results.Success
End Sub
Private Sub WriteVariable(ByVal varName As String, ByVal varValue As Object)
Try
Dim vars As Variables
Dts.VariableDispenser.LockForWrite(varName)
Dts.VariableDispenser.GetVariables(vars)
Try
vars(varName).Value = varValue
Catch ex As Exception
Throw ex
Finally
vars.Unlock()
End Try
Catch ex As Exception
Throw ex
End Try
End Sub
Private Function ReadVariable(ByVal varName As String) As Object
Dim result As Object
Try
Dim vars As Variables
Dts.VariableDispenser.LockForRead(varName)
Dts.VariableDispenser.GetVariables(vars)
Try
result = vars(varName).Value
Catch ex As Exception
Throw ex
Finally
vars.Unlock()
End Try
Catch ex As Exception
Throw ex
End Try
Return result
End Function
End Class
example for step 2:
-------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
Try
'Create the connection to the ftp server
Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
'Set the properties like username & password
cm.Properties("ServerName").SetValue(cm, "ftp.name.com")
cm.Properties("ServerUserName").SetValue(cm, "username")
cm.Properties("ServerPassword").SetValue(cm, "password")
cm.Properties("ServerPort").SetValue(cm, "21")
cm.Properties("Timeout").SetValue(cm, "0") 'The 0 setting will make it not timeout
cm.Properties("ChunkSize").SetValue(cm, "1000") '1000 kb
cm.Properties("Retries").SetValue(cm, "1")
'create the FTP object that sends the files and pass it the connection created above.
Dim ftp As FtpClientConnection = New FtpClientConnection(cm.AcquireConnection(Nothing))
'Connects to the ftp server
ftp.Connect()
'ftp.SetWorkingDirectory("..")
ftp.SetWorkingDirectory("directoryname")
Dim folderNames() As String
Dim fileNames() As String
ftp.GetListing(folderNames, fileNames)
Dim maxname As String = ""
For Each filename As String In fileNames
' whatever operation you need to do to find the correct file...
Next
Dim files(0) As String
files(0) = maxname
ftp.ReceiveFiles(files, "C:\temp", True, True)
' Close the ftp connection
ftp.Close()
'Set the filename you retreive for use in data flow
Dts.Variables.Item("FILENAME").Value = maxname
Catch ex As Exception
Dts.TaskResult = Dts.Results.Failure
End Try
Dts.TaskResult = Dts.Results.Success
End Sub
End Class
Is it possible to set up two SQL Server engines to point to a shared set of database files on a shared file system? We want to use this as a type of fail-over, keeping our files separate from the machines the engines run on.
I know it is not going to work; I am just wondering if anyone could give any reasons for that: whether it was an intentional constraint, or whether Compact Edition was internally designed in a particular way that makes such usage impossible. It is a pity it doesn't work that way, as it would be a much easier transitional path for many Visual FoxPro and MS Access applications.
In my case I just want a READ-ONLY database, either for multi-user access via a shared drive or stand-alone on a local drive.
Hi
I have an Access database on a shared network drive, and I have read/write access rights on that shared network drive.
When I try to access data through the linked server, it gives me a message box saying I do not have permission to view the data.
Also, if I try to use xp_cmdshell to copy the .mdb file over to my local drive, it says 'Access denied'.
But when I copy (through a command prompt) the same file to another network drive or my local drive where I have full control, the linked server can connect successfully.
The problem is that I cannot have 'full control' permissions on the shared drive where my database resides.
Has anybody encountered this problem?
Any help will be greatly appreciated.
Urgent
Puru
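The key point is that the linked server and xp_cmdshell run under the SQL Server service account (or the mapped login), not the interactive Windows session, so that account is the one that needs rights on the share; if the service runs as Local System it has no network identity at all, and running it as a domain account with modify rights on the share usually fixes both symptoms. For reference, a typical Jet linked-server definition against an .mdb on a UNC path looks roughly like this, with the path as a placeholder:

-- Linked server to an Access database on a network share (path is a placeholder)
EXEC sp_addlinkedserver
    @server     = N'ACCESS_LNK',
    @srvproduct = N'Access',
    @provider   = N'Microsoft.Jet.OLEDB.4.0',
    @datasrc    = N'\\fileshare\data\mydb.mdb';

-- Map local logins to the (blank-password) Access Admin account
EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'ACCESS_LNK',
    @useself     = N'false',
    @rmtuser     = N'Admin',
    @rmtpassword = NULL;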
I have a huge 120 GB MDF file (it was set up as a single MDF initially); I did not anticipate that the DB would grow to that size!
Anyway, I heard that general performance would improve if I split it into filegroups.
Is there any way to split the existing MDF file into multiple files in a filegroup?
Where should I start? Can someone please direct me?
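You cannot split an existing MDF in place, but you can add new filegroups and files and then move the large tables onto them by rebuilding their clustered indexes there; SQL Server migrates the data as part of the rebuild. A sketch with the database, path, table and index names as hypothetical placeholders:

-- 1. Add a new filegroup with its own data file
ALTER DATABASE MyBigDb ADD FILEGROUP FG_Data2;

ALTER DATABASE MyBigDb
ADD FILE
(
    NAME       = N'MyBigDb_Data2',
    FILENAME   = N'E:\SQLData\MyBigDb_Data2.ndf',   -- placeholder path, ideally on another drive
    SIZE       = 20GB,
    FILEGROWTH = 1GB
) TO FILEGROUP FG_Data2;

-- 2. Move a table by recreating its clustered index on the new filegroup
CREATE UNIQUE CLUSTERED INDEX PK_BigTable
ON dbo.BigTable (BigTableID)
WITH (DROP_EXISTING = ON)
ON FG_Data2;

Heaps (tables without a clustered index) need a different approach, such as creating a clustered index on the target filegroup and then dropping it.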
How do I insert data from a flat file or .csv file into an existing SQL database?
Here is what I've come up with thus far, but it doesn't work. Can someone please help? Let me know if there is a better way to do this; ideally I'd like to write straight to the SQL database and skip the dataset altogether.
strSvr = "vkrerftg"
StrDb = "Test_DB"
'connection String
strCon = "Server=" & strSvr & ";database=" & StrDb & "; integrated security=SSPI;"
Dim dbconn As New SqlConnection(strCon)
Dim da As New SqlDataAdapter()
Dim insertComm As New SqlCommand("INSERT INTO [Test_DB_RMS].[dbo].[AIR_Ouput] ([Event], [Year], [Contract Loss],[Company Loss], " & _
"[IndInsured Loss Prop],[IndInsured Loss WC],[Event Info]) " & _
"VALUES (@Event, @Year, @ConLoss, @CompLoss, @IndLossProp, @IndLossWC, @eventsInfo)", dbconn)
insertComm.Parameters.Add("@Event", SqlDbType.Int, 4, "Event")
insertComm.Parameters.Add("@Year", SqlDbType.Float, 4, "Year")
insertComm.Parameters.Add("@ConLoss", SqlDbType.Float, 4, "Contract Loss")
insertComm.Parameters.Add("@CompLoss", SqlDbType.Float, 4, "Company Loss")
insertComm.Parameters.Add("@IndLossProp", SqlDbType.Float, 4, "IndInsured Loss Prop")
insertComm.Parameters.Add("@IndLossWC", SqlDbType.Float, 4, "IndInsured Loss WC")
insertComm.Parameters.Add("@eventsInfo", SqlDbType.NVarChar, 255, "Event Info")
da.InsertCommand = insertComm
Dim upComm As New SqlCommand("UPDATE [Test_DB_RMS].[dbo].[AIR_Ouput] " & _
"SET [Event] = @Event " & _
",[Year] = @Year " & _
",[Contract Loss] = @ConLoss " & _
",[Company Loss] = @CompLoss " & _
",[IndInsured Loss Prop] = @IndLossProp " & _
",[IndInsured Loss WC] = @IndLossWC " & _
",[Event Info] = @EventInfo", dbconn)
upComm.Parameters.Add("@Event", SqlDbType.Int, 4, "Event")
upComm.Parameters.Add("@Year", SqlDbType.Float, 4, "Year")
upComm.Parameters.Add("@ConLoss", SqlDbType.Float, 4, "Contract Loss")
upComm.Parameters.Add("@CompLoss", SqlDbType.Float, 4, "Company Loss")
upComm.Parameters.Add("@IndLossProp", SqlDbType.Float, 4, "IndInsured Loss Prop")
upComm.Parameters.Add("@IndLossWC", SqlDbType.Float, 4, "IndInsured Loss WC")
upComm.Parameters.Add("@EventsInfo", SqlDbType.NVarChar, 255, "Event Info")
da.UpdateCommand = upComm
da.Update(dsAIR, "TextDB")
************* ANY HELP WOULD BE GREATLY APPRECIATED************
THANKS
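If the goal is simply to get the .csv into the existing table and skip the dataset entirely, it may be easier to let the server load the file directly with BULK INSERT, issued from the VB code as a single SqlCommand. A sketch, with the file path a hypothetical placeholder and the table name taken from the code above; the delimiter and header settings are assumptions to adjust to the actual file:

-- Load the flat file straight into the existing table (path is a placeholder)
BULK INSERT [Test_DB_RMS].[dbo].[AIR_Ouput]
FROM 'C:\imports\air_output.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2          -- skip the header row
);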
Good morning,
I want to load data that I receive every day from my customers, in .xls (Excel) or .csv file format, into the database that I created for this purpose. But when trying to do that with SSIS, I got an error message saying that I can't import redundant data into my database column.
Best regards.
We are trying to import data from a .csv file which sits on a shared location. The package runs fine when we run it from the designer, but we are having a problem when we run it at run time (accessing it through a service). The same package runs fine if the file is on the same server.
Has anyone gone through this issue before? I appreciate any help in resolving it.
--------Log----
#Fields: event,computer,operator,source,sourceid,executionid,starttime,endtime,datacode,databytes,message
OnPreValidate,SMSPAD1125M,RFCGKommar1,GIRI_ETL_XREF,{4D456D56-B35F-4FCC-8A89-2D03AC545C76},{5395DAA0-DB96-49CA-BDE7-0DA5C623A2B0},7/17/2006 10:46:42 AM,7/17/2006 10:46:42 AM,0,0x,(null)
OnError,SMSPAD1125M,RFCGKommar1,GIRI_ETL_XREF,{4D456D56-B35F-4FCC-8A89-2D03AC545C76},{5395DAA0-DB96-49CA-BDE7-0DA5C623A2B0},7/17/2006 10:46:42 AM,7/17/2006 10:46:42 AM,-1073659875,0x,Connection "FlatFile" failed validation.
---------------------------------
Thanks,
-G
I am frustrated by issues I'm seeing within the development studio (BIDS). Here's the problem:
A fixed-width flat file is to be imported into SQL 2005. I am running from within BIDS on my workstation and sending the data remotely to the server. It has over 200 fields, and I am attempting to import them into dimension tables and one fact table. I had to manually create the columns on the flat file connection manager for this process, since the data file is fixed-width rather than delimited.
The SSIS solution has one project, and several independent data flow tasks within it. After I set up the fcm, I mapped fields using a flat file source, and hooked up a derived column and a sort transform. The idea was to use a 3rd party tool, TableDifference, to implement a Type 2 dimension. I was able to successfully build three data flow tasks that ran correctly. Later, as I was building new data flow tasks, our project people decided to change the data types of several of the fields from Datetime to character, and also rename another 40+ fields. I dutifully kept my fcm synchronized, since column mappings are automatic when the names match; data type changes were also unavoidable on some of the columns.
The problem was noticed after I changed some of the columns in the fcm. My flat file source adapters on all the tasks needed to resolve changed metadata so I had to 'edit' and save them again. The derived column, sort, and downstream transforms also had to be updated. I saw strange behavior on previously tested data flow tasks. Some of them gave me errors even connecting to their sources. Other times I saw erroneous paths being taken which led to data being changed incorrectly on the dimension tables (new records being added even if there was an existing match, records being marked as expired incorrectly, etc.).
Are there known issues with editing a shared file connection manager after it has already been referred to within data flow tasks? It appears that the only way to avoid the problems I've seen is to either:
1. Create separate fcm's for each data flow task. Use 'filler' fields for the ranges of all the columns in between each referenced column. This way if you have to change anything with it, the only rework involved is limited to the data flow task using it.
2. Create one fcm for the entire package and NEVER change anything with it. If a column name is wrong, use a derived column to rename it in the data flow. If a data type changes, try to use a data conversion transform, but be sure to rename its output column so that it is not called 'data conversion.<field name>'. I think that's done in the Advanced Editor for the sort task.
My experience using BIDS has been hot and cold. It's nice to be able to quickly build the tasks you need, but it's as if the metadata (pipeline) is much too sensitive to change and requires a lot of touching up, either upstream at the source adapter level, or across data flow tasks when you make changes to the FCM. Maybe the problem is just limited to flat files...
Are there any tips or lessons learned from anyone on this? I've got another developer willing to create the FCM for other data feeds so I can copy/paste them into other projects. It would just be nice to know there was a bug fix or a good method to avoid all this 'rework' when you have to make changes to the FCM after the fact.
Thanks in advance.
I have an SSIS package that moves data from a new CSV file in a shared location to a SQL Server database table. However, I need to get the Agent job triggered whenever a new CSV file gets added to the shared location.
What is the best strategy to do this, keeping in mind that while the package is running two new CSV files might come in, and the package should copy data from both files?
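One robust pattern is not to trigger per file at all, but to run the job on a short schedule, loop over every CSV currently in the share with a Foreach File enumerator inside the package, and move each processed file to an archive folder; files that arrive mid-run are simply picked up by the next run, so nothing is lost. If a trigger-style check is still wanted, a lightweight, frequently scheduled Agent job can look at the share and start the load job only when something is there; a sketch with the share path and job name as hypothetical placeholders:

-- "Watcher" job step: start the load job only if CSV files exist in the share
CREATE TABLE #files (file_name varchar(260));

INSERT INTO #files (file_name)
EXEC master.dbo.xp_cmdshell 'dir "\\fileserver\incoming\*.csv" /B';

IF EXISTS (SELECT 1 FROM #files WHERE file_name LIKE '%.csv')
    EXEC msdb.dbo.sp_start_job @job_name = N'Load CSV To Table';   -- placeholder job name

DROP TABLE #files;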