I have installed SSAS on a 64 bit machine and am trying to connect to Oracle but having little success.
Some of the solutions I've read suggest uninstalling SSAS and then reinstalling, making sure the reinstall does not use the Program Files (x86) folder, since Oracle has a problem with the parentheses.
Here's my question... how do you NOT install in the Program Files (x86) folder? It seems SQL Server always wants to put a small part in there.
Thanks in advance and sorry if this is a dumb question.
In the first step of my SSIS package I need to get files from FTP and drop them in a local directory, but it's more than that; the logic is like this: 1. If no files are found, stop executing and send an email saying no files were found. 2. If files are found, compare them with the existing files in our archive folder; if a file already exists in the archive folder, stop executing and send an email saying the file already exists; if it is not in the archive folder yet, transfer it to the local directory for processing.
I know I have to use a Script Task to do this. I did some research and found examples for each of the two steps above, but not both combined, so I need some help getting the logic put together correctly (a combined sketch follows the two examples below).
Thanks for the help in advance, and I apologize for the long lines of code!
example for step 1: ----------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain

' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
    Dim cDataFileName As String
    Dim cFileType As String
    Dim cFileFlgVar As String

    ' Reset all five file-type flags, then set the one matching the current file.
    WriteVariable("SCFileFlg", False)
    WriteVariable("OOFileFlg", False)
    WriteVariable("INFileFlg", False)
    WriteVariable("IAFileFlg", False)
    WriteVariable("RCFileFlg", False)

    ' The file type is the first two characters of the file name's last four characters.
    cDataFileName = ReadVariable("DataFileName").ToString()
    cFileType = Left(Right(cDataFileName, 4), 2)
    cFileFlgVar = cFileType.ToUpper() + "FileFlg"
    WriteVariable(cFileFlgVar, True)

    Dts.TaskResult = Dts.Results.Success
End Sub

Private Sub WriteVariable(ByVal varName As String, ByVal varValue As Object)
    Try
        Dim vars As Variables
        Dts.VariableDispenser.LockForWrite(varName)
        Dts.VariableDispenser.GetVariables(vars)
        Try
            vars(varName).Value = varValue
        Catch ex As Exception
            Throw ex
        Finally
            vars.Unlock()
        End Try
    Catch ex As Exception
        Throw ex
    End Try
End Sub

Private Function ReadVariable(ByVal varName As String) As Object
    Dim result As Object
    Try
        Dim vars As Variables
        Dts.VariableDispenser.LockForRead(varName)
        Dts.VariableDispenser.GetVariables(vars)
        Try
            result = vars(varName).Value
        Catch ex As Exception
            Throw ex
        Finally
            vars.Unlock()
        End Try
    Catch ex As Exception
        Throw ex
    End Try
    Return result
End Function

End Class
example for step 2: -------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
Try
'Create the connection to the ftp server
Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
' (the connection-string setup and the file transfer itself would follow here)
Dts.TaskResult = Dts.Results.Success
Catch ex As Exception
Dts.TaskResult = Dts.Results.Failure
End Try
End Sub
End Class
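Putting the two steps together, here is a minimal sketch of the combined check-compare-transfer logic. It assumes an FTP connection manager named "FTP" and hypothetical package variables User::ArchivePath, User::LocalPath, User::NoFilesFound, and User::AlreadyArchived (list them in the task's ReadOnlyVariables/ReadWriteVariables); the two "send email" outcomes would hang off the flag variables via precedence constraints to Send Mail tasks.

Imports System
Imports System.IO
Imports System.Collections.Generic
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        Try
            ' Connect using the existing FTP connection manager (assumed name: "FTP").
            Dim cm As ConnectionManager = Dts.Connections("FTP")
            Dim ftp As New FtpClientConnection(cm.AcquireConnection(Nothing))
            ftp.Connect()

            ' Step 1: list the remote files; if none, raise a flag and stop.
            Dim folderNames() As String = Nothing
            Dim fileNames() As String = Nothing
            ftp.GetListing(folderNames, fileNames)
            If fileNames Is Nothing OrElse fileNames.Length = 0 Then
                Dts.Variables("User::NoFilesFound").Value = True
                ftp.Close()
                Dts.TaskResult = Dts.Results.Success
                Return
            End If

            ' Step 2: keep only the files that are not already in the archive folder.
            Dim archivePath As String = Dts.Variables("User::ArchivePath").Value.ToString()
            Dim newFiles As New List(Of String)
            For Each remoteFile As String In fileNames
                If File.Exists(Path.Combine(archivePath, Path.GetFileName(remoteFile))) Then
                    Dts.Variables("User::AlreadyArchived").Value = True
                Else
                    newFiles.Add(remoteFile)
                End If
            Next

            ' Transfer only the genuinely new files to the local working directory.
            If newFiles.Count > 0 Then
                Dim localPath As String = Dts.Variables("User::LocalPath").Value.ToString()
                ftp.ReceiveFiles(newFiles.ToArray(), localPath, True, False)
            End If

            ftp.Close()
            Dts.TaskResult = Dts.Results.Success
        Catch ex As Exception
            Dts.TaskResult = Dts.Results.Failure
        End Try
    End Sub
End Class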
Hi all, I have multiple files, but they are stored in different directories on the server. I want to open those files and insert them into the database using bcp. Example directory/file structure:
\xyz123abc\ext1.txt
\xyz123abc\ext2.txt
\zyz123999\ext2.txt
bcp "database" in \xyz123abc\ext1.txt -c -S "servername" -U sa -P password -T
Is there a way to loop through each directory, get the files, execute the bcp, then go to the next folder? Please help. Thanks in advance. Ted Lee
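A rough sketch of that loop, driving bcp from VB.NET; the root folder, table name, server, and credentials are all placeholders:

Imports System
Imports System.Diagnostics
Imports System.IO

Module BcpLoader
    Sub Main()
        Dim rootPath As String = "\\server\data"   ' placeholder root folder

        ' Walk each subdirectory, then each file in it, and run one bcp per file.
        For Each dirName As String In Directory.GetDirectories(rootPath)
            For Each fileName As String In Directory.GetFiles(dirName, "*.txt")
                Dim args As String = String.Format( _
                    "mydb.dbo.mytable in ""{0}"" -c -S servername -U sa -P password", fileName)
                Dim proc As Process = Process.Start("bcp", args)
                proc.WaitForExit()
            Next
        Next
    End Sub
End Module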
I'm trying to copy files from an FTP address; the problem is that sometimes the FTP folder is empty, and then the FTP Task fails. Why does it fail if there are no files? Any suggestion on how to avoid the error?
I am developing an application with SQLCE 2.0, NETCF 1.0 SP1, and VS2003.
I found that some files are created under the "Temp" folder by the system, with size 0B.
Why/when are these files created? Do I need to clean them up periodically? If not, will this cause an exception like "Not enough storage is available to complete this operation"?
1) I read that the path can be changed without any problem. On the same server we have a D drive with 40GB of space, so I am planning to change the path to D:\RSTempFiles. Any experience with changing that path?
2) If the change of path is successful, how can I delete the files in RSTempFiles on the C: drive? Can I directly delete the files? Are they just previous snapshots, or will I lose something? Any experience with this would be great.
I'm wondering if it's possible to return the number of files in a folder using something like the DIR command. I'm wanting to do something like: if count(DIR) > 0 then do something.
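In .NET (e.g. from a Script Task), the equivalent of counting DIR output is a one-liner; a minimal sketch, with the folder path as a placeholder:

Imports System
Imports System.IO

Module FileCount
    Sub Main()
        ' Count the files in the folder and branch on the result.
        Dim fileCount As Integer = Directory.GetFiles("C:\inbox").Length
        If fileCount > 0 Then
            Console.WriteLine("{0} file(s) found - do something", fileCount)
        End If
    End Sub
End Module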
I have a challenge I am trying to overcome; hopefully someone will have come across this issue before.
I am creating a DTS package that will be scheduled to run at a certain time every day. A source folder exists that gets a set of new files every day. The DTS package will then read each file and copy the data into a load table in my database. The challenge is this:
I am trying to load files from a source folder into my load table. Within each file, the entries are in a specific format, using pipes to separate the data that goes into each column, e.g.
example of a file entry:
column1 | column2 | column3
data1 | data2 | data3
data1 | data2 | data3
data1 | data2 | data3
And now I am using DTS to specify the file format and map the columns as appropriate to my table. All this is well and good, but my problem is that each file has a different name as well as being timestamped. How do I use DTS to specify the source folder, open each file sequentially, and copy the entries into my table, inserting new data from each file into my load table as well as overwriting old data in the load table from the files in the folder? Is there a way of specifying your source folder in DTS rather than specifying the file in the menu options (in the Transform Data Task properties), or do I need to write a script for this (reading the files)?
Can someone please suggest a solution and how to approach this? (A sketch of the loop-and-load logic follows.)
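DTS doesn't expose a folder loop in the Transform Data Task itself, so the loop usually ends up in script. As a rough standalone illustration of the loop-and-load logic (the folder, connection string, and table name are placeholder assumptions; the parsing matches the 3-column pipe layout above):

Imports System
Imports System.Data
Imports System.Data.SqlClient
Imports System.IO

Module PipeFileLoader
    Sub Main()
        Dim sourceFolder As String = "C:\SourceFiles"   ' placeholder
        Dim connStr As String = "Server=.;Database=mydb;Integrated Security=SSPI"   ' placeholder

        Using conn As New SqlConnection(connStr)
            conn.Open()
            For Each fileName As String In Directory.GetFiles(sourceFolder, "*.txt")
                ' Build an in-memory table matching the pipe-delimited layout.
                Dim table As New DataTable()
                table.Columns.Add("column1")
                table.Columns.Add("column2")
                table.Columns.Add("column3")

                Dim lines() As String = File.ReadAllLines(fileName)
                For i As Integer = 1 To lines.Length - 1   ' skip the header row
                    Dim parts() As String = lines(i).Split("|"c)
                    If parts.Length = 3 Then
                        table.Rows.Add(parts(0).Trim(), parts(1).Trim(), parts(2).Trim())
                    End If
                Next

                ' Bulk-insert this file's rows into the load table.
                Using bulk As New SqlBulkCopy(conn)
                    bulk.DestinationTableName = "dbo.LoadTable"   ' placeholder
                    bulk.WriteToServer(table)
                End Using
            Next
        End Using
    End Sub
End Module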
I can detach/attach an SSAS database. But I have software that protects (backs up) the files of the SSAS database.
What the customer needs is to be able to take these backed-up files and move them to a different server and attach them there. But the new server complains that these files have no corresponding detach-log files.
The customer doesn't want to back up and restore the SSAS databases.
After the incremental process and full process, SSAS doesn't drop the index files #.xxx.fact.map and #.xxx.fact.map.hdr in the file.0.dim folder. We now have all the versions from 3 to 5000 sitting in the folder. The DBA team only discovered this recently, when the disk was running out of space.
We've already checked that the account running SSAS has local admin rights on the server.
Is there any config setting that might cause this issue? If not, what else could be causing it?
Hi, I am wondering how we can return each file in a folder. I am trying to get each file in a folder to perform another job. Below is the description:
while folder is not empty
foreach file
run the job
end for
end while
SET NOCOUNT ON
DECLARE @fileNameNew VARCHAR(1024) -- filename of new error report
DECLARE @fileNameOld VARCHAR(1024) -- filename of old error report
DECLARE @out_fileDate CHAR(1024)
DECLARE @fileDate DATETIME -- used for file name
We noticed SQL Server 2005 is creating Program Files\Common Files\Microsoft Shared\DW on our largest drive for each 64-bit installation. Does anyone know what this is? There appears to be no Microsoft documentation regarding this installation and whether we need to keep it. It may be .NET related, but I have no idea why it is needed.
I am looking for a way to truncate raw files without losing the metadata. The metadata of the raw file should be automatically detected at run-time. The result will be a raw file with the same metadata as the original file, but no data in it.
There are two reasons I would like such a tool. First, I want to erase the potentially sensitive data stored in the raw files. Second, I want to keep Validation enabled so that development is simpler.
Using the script task below, I am checking for the Excel file's existence; upon the file existing, the data flow task will load the Excel data into a SQL table. After the data is loaded from one file (or however many Excel files are present), I want to move those files into an archive folder with a datetime stamp added to the filenames. Please let me know how I can do that move; any help is greatly appreciated. Thanks!! (A sketch of the move follows the script.)
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports System.IO

Public Class ScriptMain
    Public Sub Main()
        If File.Exists(ReadVariable("FileNameVariable").ToString()) Then
            Dts.TaskResult = Dts.Results.Success
        Else
            Dts.TaskResult = Dts.Results.Failure
        End If
    End Sub

    'From Daniel Read's Blog - http://www.developerdotstar.com/community/node/512/
    Private Function ReadVariable(ByVal varName As String) As Object
        Dim result As Object
        Try
            Dim vars As Variables
            Dts.VariableDispenser.LockForRead(varName)
            Dts.VariableDispenser.GetVariables(vars)
            Try
                result = vars(varName).Value
            Catch ex As Exception
                Throw ex
            Finally
                vars.Unlock()
            End Try
        Catch ex As Exception
            Throw ex
        End Try
        Return result
    End Function
End Class
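For the archive step, a minimal sketch of the rename-and-move, assuming a placeholder archive folder (in the package, the source path would come from the same FileNameVariable):

Imports System
Imports System.IO

Module ArchiveMove
    Sub Main()
        Dim sourceFile As String = "C:\Inbox\data.xls"   ' in the package, read this from FileNameVariable
        Dim archiveFolder As String = "C:\Archive"       ' placeholder path

        ' Build a name like data_20080131_093055.xls and move the file.
        Dim stampedName As String = Path.GetFileNameWithoutExtension(sourceFile) & _
            "_" & DateTime.Now.ToString("yyyyMMdd_HHmmss") & Path.GetExtension(sourceFile)
        File.Move(sourceFile, Path.Combine(archiveFolder, stampedName))
    End Sub
End Module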
I have a folder which contains all the flat files used by all the packages (e.g. by their flat file connection managers) in my project. If we want to change the name of the folder, we have to change it in every package (in all connection managers) manually. That is hardcoded and time-consuming.
Is there any way to change it in one place (XML file, variable) so that the change affects all the packages?
One more doubt: if we configure the flat file connection manager in package configurations, a configuration file (e.g. XML) will be created, and we can make changes in that file regarding that connection manager only.
But I want one configuration file (e.g. XML) in which I can configure the details of all the connection managers used in all packages. (A sketch of such a shared file follows.)
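One common approach is a single shared .dtsConfig file that every package adds as an XML package configuration (ideally located indirectly via an environment variable, so the path itself isn't hardcoded). A minimal sketch, with hypothetical connection manager names and paths:

<?xml version="1.0"?>
<DTSConfiguration>
  <Configuration ConfiguredType="Property"
                 Path="\Package.Connections[FlatFileSource].Properties[ConnectionString]"
                 ValueType="String">
    <ConfiguredValue>D:\FlatFiles\input.txt</ConfiguredValue>
  </Configuration>
  <!-- one Configuration element per connection manager shared across the packages -->
</DTSConfiguration>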
As the source data of the cube is from MySQL, and the source data volume is more than 80M rows per month, I have to build multiple partitions in the cube, each partition containing only one month of data. Even so, the time to load data directly from MySQL is still too long, and because the MySQL .NET provider is not mature enough, the connection often breaks while loading, so I have to try to load from a flat file exported from MySQL. The Partition Processing Destination seems to support this, but even though it shows the partition name in the component editor, it actually processes the partition by partition ID, and there's no way to change the destination partition name for this component. So unless I change the SSIS package every month, it looks impossible to make a smart ETL program that can dynamically create a new partition with ID and name YYYYMM every month and load data directly from the flat CSV file exported from the big MySQL table.
Does anyone know how to load data from flat text files and also support a dynamic destination partition name? (See the AMO sketch below for the partition-creation half.)
I have also found a bug in the Partition Processing Destination, even in 2005 SP2: if there is more than one cube in an SSAS database, and two partitions in different cubes have the same name, you cannot set the mapping to the second one in the component editor. Even if you point to the second one and click OK, the next time you open the editor you will find it highlighting the first one. And even though it shows the name in the editor, it actually processes by partition ID instead of partition name; this makes it impossible to force the component to process the right partition by renaming the partition that needs processing to a constant name, say currentMonth.
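For the partition-creation half, one workaround is to create and process the YYYYMM partition from a script with AMO (Microsoft.AnalysisServices) before the load runs, rather than fighting the Partition Processing Destination. A rough sketch; the server, database, cube, measure group, and fact query are all placeholder assumptions:

Imports System
Imports Microsoft.AnalysisServices

Module MonthlyPartition
    Sub Main()
        Dim partitionName As String = DateTime.Now.ToString("yyyyMM")

        Dim srv As New Server()
        srv.Connect("Data Source=localhost")                        ' placeholder server
        Dim db As Database = srv.Databases.FindByName("MyOlapDb")   ' placeholder names below
        Dim cube As Cube = db.Cubes.FindByName("MyCube")
        Dim mg As MeasureGroup = cube.MeasureGroups.FindByName("MyMeasureGroup")

        ' Create the partition (named YYYYMM) if it doesn't exist yet.
        If mg.Partitions.FindByName(partitionName) Is Nothing Then
            Dim p As Partition = mg.Partitions.Add(partitionName)
            p.StorageMode = StorageMode.Molap
            p.Source = New QueryBinding(db.DataSources(0).ID, _
                "SELECT * FROM FactTable WHERE MonthKey = " & partitionName)
            p.Update()
        End If

        ' Process just this month's partition.
        mg.Partitions.FindByName(partitionName).Process(ProcessType.ProcessFull)
        srv.Disconnect()
    End Sub
End Module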
I have a small project to do in which I need to fetch a PDF file from my system and save it in the database, and also fetch its name and save that in the database.
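A minimal sketch of that, assuming a hypothetical table Documents(FileName NVARCHAR(260), FileData VARBINARY(MAX)) and placeholder path/connection string:

Imports System
Imports System.Data.SqlClient
Imports System.IO

Module SavePdf
    Sub Main()
        Dim pdfPath As String = "C:\Docs\report.pdf"   ' placeholder path
        Dim connStr As String = "Server=.;Database=mydb;Integrated Security=SSPI"   ' placeholder

        Using conn As New SqlConnection(connStr)
            conn.Open()
            Using cmd As New SqlCommand( _
                "INSERT INTO Documents (FileName, FileData) VALUES (@name, @data)", conn)
                ' Store both the file's name and its raw bytes.
                cmd.Parameters.AddWithValue("@name", Path.GetFileName(pdfPath))
                cmd.Parameters.AddWithValue("@data", File.ReadAllBytes(pdfPath))
                cmd.ExecuteNonQuery()
            End Using
        End Using
    End Sub
End Module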
Is there any way to track information about connections to SSAS cubes made through local Excel files (BI usage)?
Previously, we traced BI usage through the BI SharePoint site. Now we want to track the users who are connecting through local Excel files.
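One option, if the instance is SSAS 2008 or later, is the session DMVs, which list every open session regardless of whether it came from SharePoint or a local workbook; the column choice here is just illustrative:

SELECT SESSION_USER_NAME, SESSION_LAST_COMMAND, SESSION_START_TIME
FROM $SYSTEM.DISCOVER_SESSIONS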
I wrote the below script to print all folders and files located in the share path. How can I extend my script with another column indicating whether each item is a folder or a file, as a 0 or 1 flag?
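A rough standalone sketch of such a listing with a 0/1 is-file flag (the share path is a placeholder):

Imports System
Imports System.IO

Module ListShare
    Sub Main()
        Dim sharePath As String = "\\server\share"   ' placeholder

        ' Print every entry with 0 for a folder and 1 for a file.
        For Each entry As String In Directory.GetFileSystemEntries(sharePath)
            Dim isFile As Integer = 0
            If File.Exists(entry) Then isFile = 1
            Console.WriteLine(entry & vbTab & isFile.ToString())
        Next
    End Sub
End Module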
I'm using the FTP Task in SSIS to send files. The task succeeds, but the files get uploaded to the wrong folder: instead of being sent to the cg268301 folder, they are sent to cg268300, despite my selecting /cg268301 in the remote path field of the FTP Task editor.
I've tried uploading this file to the /cg268301 folder using an Execute Process task and it works fine. I just don't know why it won't work with the FTP Task.