How To Run XMLA File In C Sharp?
May 10, 2007
How do I run an XMLA file in C#?
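One common way to do this is with the Analysis Management Objects (AMO) library. A minimal sketch, assuming a reference to Microsoft.AnalysisServices.dll; the server name and script path are placeholders, not values from the post:

// Read an XMLA script from disk and execute it against SSAS via AMO.
using System;
using System.IO;
using Microsoft.AnalysisServices;

class RunXmla
{
    static void Main()
    {
        string xmla = File.ReadAllText(@"C:\Scripts\MyScript.xmla"); // placeholder path

        Server server = new Server();
        server.Connect("Data Source=localhost"); // placeholder server
        XmlaResultCollection results = server.Execute(xmla);

        // Print any messages (errors or warnings) the server returned.
        foreach (XmlaResult result in results)
            foreach (XmlaMessage message in result.Messages)
                Console.WriteLine(message.Description);

        server.Disconnect();
    }
}

ADOMD.NET works too (set an AdomdCommand's CommandText to the XMLA), but AMO's Server.Execute is the most direct route for management scripts.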
I am trying to run an XMLA script to save a .trc trace file to a remote drive.
Any time I try to save the file, it gets saved to the local drive where the SSAS instance is installed.
Here is what I am using within the XMLA script:
<LogFileName>C:\Trace\TraceFileName.trc</LogFileName>
How can I save it to my local C: drive or to a remote server?
Is it possible to create an extended stored procedure in C#?
This is for SQL Server 2000.
Books Online mentions that you have to use C/C++ to create an extended stored procedure.
However, has Microsoft added any support so that the same thing can be done through a simpler language like C#?
Thanks,
Alok.
I have created a backup schedule job for the cube using the following XMLA query:
Code Snippet
<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
<Object>
<DatabaseID>SalesAnalysis</DatabaseID>
</Object>
<File>D:\Cube Backups\SalesAnalysis.abf</File>
<AllowOverwrite>true</AllowOverwrite>
</Backup>
But the thing is, every day the backup file is created with the same name, "SalesAnalysis.abf", and it overwrites the previous day's file.
So I want to create a backup file with a timestamp in its name.
Thanks in advance.
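One possible approach: rather than hard-coding the file name in the XMLA, build the script at run time (for example from a small C# program or an SSIS Script Task) with the date stamped into it. A hedged sketch, with the server name as a placeholder:

// Build a Backup XMLA command with a date-stamped file name and run it via AMO.
using System;
using Microsoft.AnalysisServices;

class TimestampedBackup
{
    static void Main()
    {
        string file = string.Format(@"D:\Cube Backups\SalesAnalysis_{0:yyyyMMdd}.abf", DateTime.Now);

        string xmla =
            "<Backup xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\">" +
              "<Object><DatabaseID>SalesAnalysis</DatabaseID></Object>" +
              "<File>" + file + "</File>" +
              "<AllowOverwrite>true</AllowOverwrite>" +
            "</Backup>";

        Server server = new Server();
        server.Connect("Data Source=localhost"); // placeholder
        server.Execute(xmla);
        server.Disconnect();
    }
}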
I hope some of you can point me in the right direction. I am new to all this and
working on a project using Cubes.
The cubes need to be generated in a dynamic fashion, as the contents
(that is, the fields) of the fact tables change every quarter and are
based on an external application.
The base structure stays the same, though. So we thought we would export the
XMLA representation of the cube, munge (alter) the data in the
XMLA file, and send it someplace to create the new cube on a SQL 2005 server.
This seemed easy enough, but I have found no real information on how to
run some piece of code that would create the cube based on this XMLA code.
On another note, is there a parsing object for XMLA so I don't have to write one from
scratch?
Thanks,
Bob
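Since XMLA is plain XML, both halves of this are straightforward from .NET: System.Xml can do the munging, and AMO can submit the result to the server, so there is no need to write a parser from scratch. A hedged sketch; the paths, XPath, and names are illustrative, not taken from a real cube definition:

// Load an exported XMLA definition, alter it, and send it to SSAS via AMO.
using System.Xml;
using Microsoft.AnalysisServices;

class MungeAndDeploy
{
    static void Main()
    {
        XmlDocument doc = new XmlDocument();
        doc.Load(@"C:\Cubes\BaseCube.xmla"); // placeholder path

        // Example alteration: rename the database before redeploying.
        XmlNamespaceManager ns = new XmlNamespaceManager(doc.NameTable);
        ns.AddNamespace("as", "http://schemas.microsoft.com/analysisservices/2003/engine");
        XmlNode name = doc.SelectSingleNode("//as:Database/as:Name", ns);
        if (name != null)
            name.InnerText = "SalesAnalysis_Q2"; // hypothetical new name

        Server server = new Server();
        server.Connect("Data Source=localhost"); // placeholder
        server.Execute(doc.OuterXml);            // creates/alters the cube
        server.Disconnect();
    }
}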
Hi,
We want to synchronize one database in two instances of Analysis Services.
We are using the XMLA Synchronize command. Our script is correct.
The account executing the script has the server role on the source SSAS instance.
I cannot find anywhere what other permissions the user executing the script must have, and what permissions the SSAS service account (on the destination) must have on the source server and database.
Thanks Krasimir
Hello,
We are in a similar situation at my client... we wish to issue the CREATE and DELETE XMLA scripts for a specific cube via SSIS, however are unsure as to the control object to use...
More info regarding our processes:
Our SSIS flow is designed to:
1.) backup an existing cube
2.) create a new cube leaving the original in place for users to use while the new one is building
3.) once the newly built cube has been validated, drop the original cube and rename the newly built cube back to the original cube's name
We used DDL code files (DDL Task objects within SSIS) which contain the XMLA query code for the CREATE, DELETE, and ALTER statements used for each specific task:
The flow:
------------------------------------------------------------------------------
1.) BACKUP Original_Cube
2.) CREATE New_Cube
3.) PROCESS New Cube
4.) DELETE Original Cube
5.) ALTER New_Cube to Original Cube
------------------------------------------------------------------------------
In order to execute each DDL Task code set for a specific task, it is necessary to specify a connection to a given catalogue... however, in our case, and in the general case of using CREATE/DELETE/ALTER ddls, a given database & catalogue may or may not always be available...
It would be ideal for there to be a way to issue an XMLA script to CREATE a DUMMY CUBE (outline and model only) for all other DDLs to use as a basis (at the beginning of our SSIS flow), and then issue another XMLA script to DELETE the DUMMY cube when it is no longer needed (at the end of our SSIS flow).
Currently, the flow does work fine; however, the DUMMY CUBE must exist on the server in order to provide a connection for each DDL Task. At this point, the creation of this DUMMY CUBE has to be a manual process, and we are looking to automate it.
The "Script Task" object bumps you into a VB window asking you to provide a VB script... is there a way to have VB issue an XMLA command? Or is there an easier way around this?
THANK YOU!
Michael
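For what it's worth, a Script Task can issue XMLA directly through AMO; nothing limits you to VB-specific features, and in SSIS 2008+ the task can be written in C# (the same calls translate line for line to VB.NET for 2005). A hedged fragment of the Script Task's Main method; the variable name is an assumption:

// Inside the Script Task's generated ScriptMain class (reference
// Microsoft.AnalysisServices.dll from the script project):
public void Main()
{
    // User::DummyCubeXmla is a hypothetical package variable holding the
    // CREATE (or DELETE) script for the dummy cube.
    string xmla = Dts.Variables["User::DummyCubeXmla"].Value.ToString();

    Microsoft.AnalysisServices.Server server = new Microsoft.AnalysisServices.Server();
    server.Connect("Data Source=localhost"); // placeholder server
    server.Execute(xmla);
    server.Disconnect();

    Dts.TaskResult = (int)ScriptResults.Success;
}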
I have a very simple SSIS package with one connection and one Execute SQL Task.
The connection is a Native OLE DB for Analysis Services 9.0
The Execute SQL Task fires off this XMLA query, created through the Synchronize Database Wizard from MSAS2005.
All I want to do is dynamically replace the Source in the connection string in this script with a local variable. I have tried the @variable and the ? syntax, as I found some good info in BOL; however, neither of these works.
I suppose I could use scripting to handle this, but I didn't want to add that layer, and I feel that SSIS should be able to handle this.
Here is the raw script, which works fine within SSIS:
<Synchronize xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
<Source>
<ConnectionString>Provider=MSOLAP.3;Data Source=ServerName;ConnectTo=9.0;Integrated Security=SSPI;Initial Catalog=catalogname</ConnectionString>
<Object>
<DatabaseID>Recommended Pricing</DatabaseID>
</Object>
</Source>
<Locations />
<SynchronizeSecurity>CopyAll</SynchronizeSecurity>
<ApplyCompression>true</ApplyCompression>
</Synchronize>
Here is what I want to execute (which fails with the following error: "The following system error occurred: No such host is known."):
<Synchronize xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
<Source>
<ConnectionString>Provider=MSOLAP.3;Data Source=?;ConnectTo=9.0;Integrated Security=SSPI;Initial Catalog=catalogname</ConnectionString>
<Object>
<DatabaseID>Recommended Pricing</DatabaseID>
</Object>
</Source>
<Locations />
<SynchronizeSecurity>CopyAll</SynchronizeSecurity>
<ApplyCompression>true</ApplyCompression>
</Synchronize>
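The Execute SQL Task's ? parameter binding generally will not substitute inside an XMLA body, which would explain the literal "?" being treated as a host name. One workaround is to build the whole statement yourself and point the task at a variable instead (SQLSourceType = Variable). A hedged Script Task sketch; the variable names are assumptions:

// Compose the Synchronize script with the source server taken from a
// package variable, then store it for the downstream Execute SQL Task.
public void Main()
{
    string sourceServer = Dts.Variables["User::SourceServer"].Value.ToString();

    string xmla =
        "<Synchronize xmlns=\"http://schemas.microsoft.com/analysisservices/2003/engine\">" +
          "<Source>" +
            "<ConnectionString>Provider=MSOLAP.3;Data Source=" + sourceServer +
                ";ConnectTo=9.0;Integrated Security=SSPI;Initial Catalog=catalogname</ConnectionString>" +
            "<Object><DatabaseID>Recommended Pricing</DatabaseID></Object>" +
          "</Source>" +
          "<Locations />" +
          "<SynchronizeSecurity>CopyAll</SynchronizeSecurity>" +
          "<ApplyCompression>true</ApplyCompression>" +
        "</Synchronize>";

    Dts.Variables["User::SyncXmla"].Value = xmla;
    Dts.TaskResult = (int)ScriptResults.Success;
}

A property expression on the task's SqlStatementSource property is another route to the same end, without any scripting.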
In Analysis Services, I have multiple DBs. I need to delete them all using an XMLA script.
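A hedged sketch of one way to do this: rather than writing a Delete XMLA element per database, AMO can enumerate and drop them all. The server name is a placeholder; this is destructive, so test it somewhere safe first:

// Drop every database on an SSAS instance via AMO.
using System;
using System.Collections.Generic;
using Microsoft.AnalysisServices;

class DropAllSsasDatabases
{
    static void Main()
    {
        Server server = new Server();
        server.Connect("Data Source=localhost"); // placeholder

        // Copy to a list first, since dropping mutates the live collection.
        List<Database> databases = new List<Database>();
        foreach (Database db in server.Databases)
            databases.Add(db);

        foreach (Database db in databases)
        {
            Console.WriteLine("Dropping " + db.Name);
            db.Drop();
        }

        server.Disconnect();
    }
}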
I am working on an SSAS project and I am looking for a way to generate some documentation from XMLA code. I have seen that it's possible to do it with XML, so I think it is possible with XMLA too!
Does anyone know where I can get the source code for XMLA Thin Miner?
It would be useful to my project.
Hi:
I am an R data miner who is new to SQL and SSIS and would appreciate any help.
I want to automate the process of creating and processing decision tree models for every county in the country. I want to use a Foreach Loop to iterate through all the counties, and have the XMLA code that creates the model use the Foreach enumerator, so the county is appended to the name of the model and I get a different model for every county. I am not sure how to have the XMLA code accept the Foreach Loop enumerator.
Any help would be greatly appreciated and if you could direct me to a previously done example that would greatly benefit me.
Thank you
avneet
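One pattern that may help: treat the wizard-generated XMLA as a template with a placeholder token, and substitute the enumerator value into it each iteration (in SSIS this is typically done with an expression on the DDL task, or in a Script Task). A hedged C# sketch; the template token, file path, and county list are all assumptions:

// Create one decision-tree model per county by rewriting a template script.
using System.IO;
using Microsoft.AnalysisServices;

class PerCountyModels
{
    static void Main()
    {
        // Wizard-generated XMLA with "{MODEL_NAME}" where the model name goes.
        string template = File.ReadAllText(@"C:\Models\TreeTemplate.xmla");
        string[] counties = { "Adams", "Baker", "Clark" }; // stand-in list

        Server server = new Server();
        server.Connect("Data Source=localhost"); // placeholder
        foreach (string county in counties)
            server.Execute(template.Replace("{MODEL_NAME}", "DecisionTree_" + county));
        server.Disconnect();
    }
}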
I am trying to incrementally update a Cube to get near real time data for the end users. Currently we have a Sql server agent Job that does a FullProcess on the Cube. The Cube consists of a single Measure group which is simply one named query containing inner joins of all the dimensions and fact tables in the underlying relational database. The end users have a lot to upload during the day and they would like us to refresh the cube (near real time) to ensure the adjustments are loaded so that they could reconcile their daily PnLs. We have a MeasureId added which is an auto increment column in the Measures table.
I am trying to schedule the XMLA query below in a SQL Server Agent job and have it run every 15 minutes or even less (if possible). However, it does not seem to work and keeps throwing all sorts of errors.
DECLARE @LastMeasureId AS INT, @myXMLA nvarchar(max)
SELECT @LastMeasureId = "[Measures].[Maximum Measures Id]" FROM
OpenRowset(
'MSOLAP',
'DATA SOURCE=L68F728326574; Initial Catalog=GMDR;',
'SELECT NON EMPTY {[Measures].[Maximum Measures Id]} ON COLUMNS FROM [GMDR]');
[Code] ....
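A hedged alternative to driving this through OpenRowset: a small AMO program (or SSIS Script Task) scheduled by the Agent can issue an incremental ProcessAdd directly, which avoids quoting XMLA inside T-SQL altogether. Object indexes below are assumptions based on the single-cube, single-measure-group description; restricting the partition's source query to rows above the last loaded MeasureId stays the same as in the original approach:

// Incrementally process the single measure group's partition via AMO.
using Microsoft.AnalysisServices;

class IncrementalProcess
{
    static void Main()
    {
        Server server = new Server();
        server.Connect("Data Source=L68F728326574");

        Database db = server.Databases.FindByName("GMDR");
        Cube cube = db.Cubes[0];                 // the single cube
        MeasureGroup mg = cube.MeasureGroups[0]; // the single measure group
        Partition partition = mg.Partitions[0];

        // ProcessAdd appends new fact rows without reprocessing the cube.
        partition.Process(ProcessType.ProcessAdd);

        server.Disconnect();
    }
}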
Currently I have a single hard-coded file path to the SSRS config file; the query parses the file and provides the Reporting Services web service URL. My question is how I would run this same query against hundreds of servers that may or may not share the same file path as the one hard-coded.
Is there a way to query the registry to find the location of the config file on any server? It could be on D, E, F, H, etc.
I know I can string together the address followed by "reports" and the named instance if needed, but some instances may not have used the default virtual directory name (Reports).
Am I going about this the hard way? Is there a location where the web service URL exists in a table? I could not locate anything in the Reporting Services database. Basically I need to inventory all of my Reporting Services URLs.
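The web service URL itself is not stored in a user table as far as I know, but each instance's install path is discoverable from the registry, and the config file location can be derived from it. A heavily hedged sketch; the key layout below ("Instance Names\RS" mapping instance name to instance ID) matches what I have seen but should be verified per SQL Server version:

// Enumerate SSRS instances from the registry and guess each config path.
using System;
using Microsoft.Win32;

class FindSsrsConfig
{
    static void Main()
    {
        using (RegistryKey rs = Registry.LocalMachine.OpenSubKey(
            @"SOFTWARE\Microsoft\Microsoft SQL Server\Instance Names\RS"))
        {
            if (rs == null) { Console.WriteLine("No SSRS instances found."); return; }

            foreach (string instanceName in rs.GetValueNames())
            {
                // Value is the instance ID, e.g. "MSRS13.MSSQLSERVER".
                string instanceId = (string)rs.GetValue(instanceName);
                using (RegistryKey setup = Registry.LocalMachine.OpenSubKey(
                    @"SOFTWARE\Microsoft\Microsoft SQL Server\" + instanceId + @"\Setup"))
                {
                    string path = setup == null ? null : setup.GetValue("SQLPath") as string;
                    if (path != null)
                        Console.WriteLine(instanceName + ": " + path +
                            @"\ReportServer\rsreportserver.config"); // assumed layout
                }
            }
        }
    }
}

For remote servers, RegistryKey.OpenRemoteBaseKey(RegistryHive.LocalMachine, serverName) can replace Registry.LocalMachine, given remote registry access.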
I have a customer running RAID 5 on a Windows 2000 server. One of the drives went bad. The customer replaced the drive and the RAID rebuilt it, and everything seemed to be fine, but there is one database file that cannot be attached to SQL. The file is 15 GB, so I know there is information in it. The error states that the file is not a primary file. Any clue on how to fix this?
mdf file size 5,738,944 KB
ldf file size 10,176 KB
Greetings, I have just arrived back into the country (NZ) and back into ASP.NET.
I am having trouble with the following: An attempt to attach an auto-named database for file (file location).../Database.mdf failed. A database with the same name exists, or the specified file cannot be opened, or it is located on a UNC share.
It has only begun since I decided I wanted to use IIS. I realise VWD comes with its own localhost, but since it is only temporary, I wanted a permanent shortcut on my desktop to link to my intranet page.
Anyone have any ideas why I am getting the above error? I have searched many places on the internet and am not getting any closer.
Cheers ~ J
How can I attach an MDF file with a corrupted/deleted log file?
My log file is deleted; how can I attach the MDF file?
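If the database was shut down cleanly, SQL Server (2005 and later) can rebuild a missing log during attach. A hedged sketch, with names and paths as placeholders; it is issued from C# here only to stay consistent with the rest of the thread, and the same statement works directly in a query window:

// Attach an MDF whose log file is gone by letting SQL Server rebuild it.
using System.Data.SqlClient;

class AttachWithoutLog
{
    static void Main()
    {
        const string sql =
            @"CREATE DATABASE MyDb
              ON (FILENAME = 'C:\Data\MyDb.mdf')
              FOR ATTACH_REBUILD_LOG;";

        using (SqlConnection conn = new SqlConnection(
            "Server=localhost;Database=master;Integrated Security=SSPI"))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}

If the log was lost mid-transaction (a truly corrupted log), the rebuild can fail and harder recovery steps are needed.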
I am testing some maintenance tasks sql commands such as index rebuild, index reorg, update statistics and db integrity check on a SQL Server 2014 Database. This is a new non-production vendor database (DB Size 500 GBs, Log Size 25 GBs) which eventually will be created in production. Currently, it is in full recovery model and without log backups. The database has a whole lot of indexes. I am just trying to rebuild and reorganize all the indexes (that need it), in addition to trying to get an idea of how long these maintenance task will take and the space needed in the log file to complete these tasks/commands. I would like to execute these tasks manually (the first time) to gather the duration and space required information. Eventually, I would probably schedule a weekly job to perform this maintenance.
I ran the index rebuild task on the database and noticed that the log file grew by over 50 GBs. I killed the process and truncated and shrunk the log file back down.
1. Do the index rebuild, index reorg, update statistics, and DB integrity check commands all use the log file?
2. Does an index reorg have less impact on the log file than an index rebuild?
3. Should a truncate log and shrink log file be performed after these maintenance commands?
4. Should a full database backup be performed after these maintenance commands? Or before the maintenance commands?
I have read and understand that shrinking is not good for the database (could lead to more fragmentation and more data file growth when data is added) and I know about rebuilding indexes when fragmentation is GT 30% and reorganizing indexes when fragmentation is GT 5% and LE 30%.
Since this is a non-production database maybe I should set the recovery model to simple, run the maintenance commands and leave the database in simple recovery model unless the vendor needs it in full recovery model for some unknown reason.
5. With the simple recovery model the log file should be reused in a circular manner and not grow during these maintenance tasks. Is this correct?
Hello World,
I'm new to SSIS and would like a little assistance getting started, if possible...
Here is what I want to do:
Check if file exists (C:\DTS Upgrade\Filexxx.txt) --->
Archive file (C:\DTS Upgrade\Archive) --->
Check if file has data (true or false)
AND/OR
If there are any good websites that have good direction, let me know.
Thanks in advance for your help!!!
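A hedged sketch of the three steps in plain C# (for example inside a Script Task); the paths mirror the reconstructed ones above and are placeholders:

// Check a file exists, archive a copy, and test whether it has data.
using System;
using System.IO;

class FileChecks
{
    static void Main()
    {
        string source  = @"C:\DTS Upgrade\Filexxx.txt";
        string archive = Path.Combine(@"C:\DTS Upgrade\Archive", Path.GetFileName(source));

        if (!File.Exists(source))
        {
            Console.WriteLine("File not found.");
            return;
        }

        File.Copy(source, archive, true);               // archive step
        bool hasData = new FileInfo(source).Length > 0; // true if non-empty
        Console.WriteLine("Archived; has data: " + hasData);
    }
}

In pure SSIS terms the same flow is usually a Script Task (or a File System Task for the move) with precedence constraints driven by a Boolean variable.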
I need to write a process to get the file size in KB and the record count of a file. I was planning on writing a C# console app that takes the file path and name as a param; however, should I use a CLR?
I can't put a script in the SSIS when it's bringing the file down, because it has been deemed that we only use SSIS for file consumption.
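The console-app route is the simpler of the two; a CLR proc is mainly worth it if the result has to be consumed inside T-SQL. A minimal sketch:

// Report a file's size in KB and its record (line) count.
using System;
using System.IO;
using System.Linq;

class FileStats
{
    static void Main(string[] args)
    {
        string path = args[0];                          // file path passed in
        long sizeKb = new FileInfo(path).Length / 1024;
        int records = File.ReadLines(path).Count();     // streams the file

        Console.WriteLine("{0}: {1} KB, {2} records", path, sizeKb, records);
    }
}

Subtract one from the count if the file carries a header row that shouldn't be counted as a record.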
For a database, we have 4 data files in a particular file group and the file sizes are almost 70 GB each.
Will I run into any performance issues if I create/pre-allocate an additional data file in the same filegroup so that the existing files don't grow too much?
I have a package with only one Data Flow Task, which has three components: 1) a source, which is a SQL DB, 2) an OLE DB destination, and 3) a flat-file destination for the error output. I want the error file to be created ONLY if there is an error while dumping the data into the destination DB. But the issue is that the error flat file is being created in spite of no errors while dumping the data from source to destination.
I'm copying files to a folder with the naming convention as follows in the source folder:
CM_ABC_MY_TEST.txt
In the destination folder, this filename needs to appear as:
CM_XYZ_MY_TEST.txt
In my File System Task, I'm pretty sure I'm going to need an expression with a replace, substring, etc., but I am having a hard time nailing down the exact syntax.
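In the File System Task the destination can be an expression-driven variable; SSIS's REPLACE(expression, search, replace) handles the substitution, e.g. REPLACE(@[User::FileName], "_ABC_", "_XYZ_"). If the expression route gets fiddly, the equivalent in a Script Task is nearly a one-liner; folder names below are placeholders:

// Copy a file to the destination folder under a renamed file name.
using System.IO;

class RenameOnCopy
{
    static void Main()
    {
        string source = @"C:\Source\CM_ABC_MY_TEST.txt";
        string target = Path.Combine(@"C:\Destination",
            Path.GetFileName(source).Replace("_ABC_", "_XYZ_"));

        File.Copy(source, target, true); // overwrite if it already exists
    }
}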
I need to know how I can get the dynamically created filename from the Flat File destination for insert into a package audit table.
Scenario: I have created a package that successfully outputs dynamically named flat files (format: C:\Test\ + 'Comms_File_' + User::FileNumber + '_' + Date + '.txt',
e.g. Comms_File_1_20150724.txt, Comms_File_2_20150724.txt, etc.) using a Foreach Loop Container:
* Enumerator set to "Foreach ADO Enumerator", with the ADO object source variable selected to identify how many total loop iterations there are, i.e. let's say 4, thus 4 files to be created.
* Variable Mappings: added User::FileNumber, which indicates which file number the current loop iteration is, i.e. 1, 2, 3, 4.
For the Data Flow task I have an OLE DB Source and a Flat File Destination, where the Flat File ConnectionString is set up as:
@[User::Output_Path] + "\\Comms_File_" + @[User::FileNumber] + "_" + REPLACE((DT_WSTR, 10)(DT_DBDATE)GETDATE(), "-", "") + ".txt"
All this successfully creates these 4 files:
Comms_File_1_20150724.txt, Comms_File_2_20150724.txt, Comms_File_3_20150724.txt, Comms_File_4_20150724.txt
Now the QUESTION is: how do I get these filenames, as I need to insert them into a DB audit table? The audit table looks like this:
CREATE TABLE dbo.MMMAudit
(
AuditID INT IDENTITY(1, 1) NOT NULL,
PackageName VARCHAR(100) NULL,
FileName VARCHAR(100) NULL,
LoadTime DATETIME NULL,
NumberofRecords INT NULL
)
To save the filename and how many records are in each file in our audit table, I am using an Execute SQL Task configured like this:
Execute SQL Task
Parameter mapping: mapped the user variable (RecordsInserted) and the system variable (PackageName) to the INSERT statement as shown below
SQLStatement: INSERT INTO [dbo].[MMMAudit] (PackageName, NumberOfRecords, LoadTime) VALUES (?, ?, GETDATE())
Again, this all works terrifically and populates the dbo.MMMAudit table as shown below, BUT I also need to insert the respective file name. How do I do that?
AuditID PackageName FileName NumberOfRecords
1 MMM NULL 12
2 MMM NULL 23
3 MMM NULL 14
4 MMM NULL 1
Hi,
I am looking for tutorials about how to create DTS and DTSX files.
Thanks for your help.
Arioule.
In my script task I have the following code. The task I'm trying to accomplish is:
If the filename on FTP can be found in the local archive folder of the e: drive, then show the message "FileAlreadyThere" (I will ultimately change it to do nothing); if the filename on FTP cannot be found in the local archive folder of the e: drive, then transfer the file to the local package folder on the d: drive.
While the script task was executing I watched it closely, and the problem I saw is this:
If some files on FTP are already in the local archive folder and some are not, the files which are already in the archive folder are dumped to the package folder; then after that, the files which are not in the archive folder are dumped to the package folder. But I only want the new files on FTP to be transferred to the package folder for further processing.
Then after this is finished, I saw all the files in the package folder are refreshed one after another, after the first round of refresh the second round starts, after the second round finishes it then stopped. I saw it refreshes itself because the 'Date Modified' of the file changes. And I saw the script task turned green.
I don't see how the code below produced this result. Is something wrong in the logic of the loop? Does anyone have any idea why it's behaving the way it is now, and how to change the code to accomplish what I want? Thanks a lot!!
------------------------------------------------------------------------------
Imports System
Imports System.IO
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
Public Sub Main()
Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
cm.Properties("ServerName").SetValue(cm, "ftp2.name.com")
cm.Properties("ServerUserName").SetValue(cm, "username")
cm.Properties("ServerPassword").SetValue(cm, "password")
cm.Properties("ServerPort").SetValue(cm, "21")
cm.Properties("Timeout").SetValue(cm, "0")
cm.Properties("ChunkSize").SetValue(cm, "1000") '1000 kb
cm.Properties("Retries").SetValue(cm, "1")
Dim ftp As FtpClientConnection = New FtpClientConnection(cm.AcquireConnection(Nothing))
ftp.Connect()
ftp.SetWorkingDirectory("/directory")
Dim fileNames() As String
Dim folderNames() As String
ftp.GetListing(folderNames, fileNames)
If fileNames Is Nothing Then
MsgBox("NoFileOnFTP")
Else
Dim fileName As String
For Each fileName In fileNames
If File.Exists("C:\temp\" + fileName) Then
MsgBox("FileAlreadyThere")
Else
' Download only the current file; passing the whole fileNames array
' here is what re-transferred every file on each loop iteration.
ftp.ReceiveFiles(New String() {fileName}, "C:\temp", True, True)
End If
Next
End If
ftp.Close()
End Sub
End Class
Hey All,
Similar to a previous post (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=244646&SiteID=1), I am trying to import data into a SQL Table.
I am trying to program a small application that will import product data obtained through suppliers via CD-ROM. One supplier in particular uses fixed-width columns, and the data looks like this:
Example of Data
0124015Apple Crate 32.12
0124016Bananna Box 12.56
0124017Mango Carton 15.98
0124018Seedless Watermelon 42.98
My Table would then have:
ProductID as int
Name as text
Cost as money
How would I go about extracting the data with an XML Format file? I am stumbling over how to tell it where to start picking up data for a specific column.
Is there any way that I could trim the Name column (i.e.: "Mango Carton " --> "Mango Carton")?
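One hedged option is to skip the format file entirely and parse the fixed-width rows in C#, which also answers the trimming question; the column widths (7 for the ID, 20 for the name, the remainder for the cost) are read off the sample above and are assumptions:

// Parse fixed-width product rows and trim the padded name column.
using System;
using System.IO;

class FixedWidthImport
{
    static void Main()
    {
        foreach (string line in File.ReadLines(@"C:\Import\products.txt")) // placeholder path
        {
            int productId = int.Parse(line.Substring(0, 7));
            string name   = line.Substring(7, 20).Trim(); // strips the padding
            decimal cost  = decimal.Parse(line.Substring(27).Trim());

            Console.WriteLine("{0} | {1} | {2}", productId, name, cost);
        }
    }
}

Each parsed row can then be inserted through a parameterized SqlCommand like the one below.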
I don't know if it makes any difference, but I've been calling SQL from my code by doing this:
Code in C# Form
using (SqlConnection conn = new SqlConnection(global::SQLClients.Properties.Settings.Default.ClientPhonebookConnectionString))
using (SqlCommand cmd = new SqlCommand("INSERT INTO PhonebookTable (Name, PhoneNumber) VALUES (@name, @phone)", conn))
{
// Parameterized to avoid SQL injection from the text-box values.
cmd.Parameters.AddWithValue("@name", txtName.Text);
cmd.Parameters.AddWithValue("@phone", txtPhoneNumber.Text);
conn.Open();
cmd.ExecuteNonQuery();
}
RefreshData();
I am running Visual Studio C# Express 2005 and SQL Server Express 2005.
Thanks for your time,
Hayden.
I need to find out the number of columns in a flat file before I process that particular file. I have the file name in the @filename variable and the file path in the @filepath variable, but I do not know how to check the column names before I process the file.
@filePath = C:\Database\SourceFiles\CAHCVS\SourceFiles
And I am using a Foreach Loop container to read the files one by one and put the file name in the @filename variable. My file names look like:
Product_20120607060930.txt
Product_20130708060930.txt
[code]....
Now what I have to do is make sure that ID, Name, City, County, Phone are there in the flat file. If they are not there, then I have to send mail to the client saying the file is not valid. I also need to calculate the size of the flat file.
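A hedged sketch of the validation (for example in a Script Task inside the loop); the delimiter is assumed to be a comma, and the path reuses the reconstructed one above:

// Verify required header columns and capture the flat file's size.
using System;
using System.IO;
using System.Linq;

class HeaderCheck
{
    static void Main()
    {
        string path = @"C:\Database\SourceFiles\CAHCVS\SourceFiles\Product_20120607060930.txt";
        string[] required = { "ID", "Name", "City", "County", "Phone" };

        string[] header = File.ReadLines(path).First().Split(',');
        bool valid = required.All(c => header.Contains(c, StringComparer.OrdinalIgnoreCase));
        long sizeKb = new FileInfo(path).Length / 1024;

        Console.WriteLine(valid ? "Header OK" : "Invalid file: notify client");
        Console.WriteLine("Size: " + sizeKb + " KB");
    }
}

Setting a Boolean package variable from this result can then drive a precedence constraint into a Send Mail Task.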
The TEMPDB transaction log file keeps growing. The database server is new and the transaction log was presized to 1 GB on installation. After installing a number of databases, the log file grew over a day to 38 GB. Issuing a manual checkpoint was the only way to free some space to allow it to be shrunk back to a usable size. The usage of the file is still going up.
I am struggling to find what process is causing the log to be used so heavily. Looking at log_reuse_wait_desc for tempdb returns "NOTHING", and tempdb itself isn't being used very much or growing in size.
On one server we had file growth, and then we had to add a new hard drive and a new file on it. Now we have a new server with a huge hard drive, but all the files remain. Can I reduce these files to one data file or not?
Hi,
I am trying to use BULK INSERT with a format file. All of our data has a few bytes of header in the data file which I would like to skip before doing the BULK INSERT.
Is it possible to write a format file that skips these few bytes of header before doing the BULK INSERT? For example, I have a 1 GB data file with a 1000-byte header. Except for the first 1000 bytes, the rest of the data is good for BULK INSERT.
Thanks in advance. Sorry if it is really a dumb question, as I am new to BULK INSERT and still practicing.
Bob
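As far as I know, a format file can skip fields within a row but not an arbitrary byte prefix on the file, so one workaround is to copy the file minus its header and bulk insert the copy. A hedged sketch; the 1000-byte figure comes from the post, the paths are placeholders:

// Strip a fixed-size byte header so the remainder is clean for BULK INSERT.
using System.IO;

class StripHeader
{
    static void Main()
    {
        using (FileStream src = File.OpenRead(@"C:\Data\input.dat"))
        using (FileStream dst = File.Create(@"C:\Data\input_stripped.dat"))
        {
            src.Seek(1000, SeekOrigin.Begin); // skip the 1000-byte header
            src.CopyTo(dst);                  // copy the rest unchanged
        }
    }
}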
I've a production SQL Server 7 SP3 on Windows NT. I had an 8 GB data file, of which 5 GB were used and 3 GB were unused. I wanted to take back the unused 3 GB.
So I did the following with the EM GUI:
1. I tried to "truncate free space from end of the file". It didn't truncate the file; I believe there was no empty space at the end of the file.
2. Next I chose the option to "shrink file to 5GB". And to my horror, the data file, instead of taking just 5 GB, took the empty spaces also, and the size of the used data file went to 8 GB.
Any idea what's going on?
TIA, SP
I have an SSIS package where I need to have an Excel destination. In the Excel file, I need to have a few rows with some text and then populate data below the text. One of the texts is like this:
Data as of: 08/25/2015
If the report ran today, then "Data as of" will have yesterday's date. So if the user opens that Excel file after a week, the user should still see the same "Data as of: 08/25/2015", not today()-day(1).
I was planning to handle it on the Excel side with today()-day(1), but that only works on the day the report is run. If the Excel file is opened a few days later, it might say "Data as of: 08/30/2015", which is not true. It should still say "Data as of: 08/25/2015" on whatever date the Excel file is opened. The SSIS package runs only once.
How do I handle this so that whenever users open the file, they will see "Data as of: 08/25/2015"? This is not a column in Excel; it is like a description of the data in the Excel sheet.
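One way that should behave as wanted: compute the literal text once, when the package runs, and write it into the sheet as a constant string, so nothing re-evaluates when the workbook is opened later. A hedged Script Task fragment; the variable name is an assumption:

// Inside the Script Task's generated ScriptMain class: build the text once.
public void Main()
{
    string asOf = "Data as of: " + DateTime.Today.AddDays(-1).ToString("MM/dd/yyyy");

    // Feed this variable into whatever step writes the header row of the
    // Excel sheet (e.g. an Execute SQL Task doing an INSERT into the sheet).
    Dts.Variables["User::DataAsOfText"].Value = asOf;
    Dts.TaskResult = (int)ScriptResults.Success;
}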