I have a SQL data source on my page and I select "Table". On the next screen I pick the fields I want to show. Then I click the "Advanced" button because I want to allow inserts, updates, and deletes, but it's all greyed out and I can't check the option. The UID in the connection string I am connecting under has the correct permissions in SQL Server to do inserts, updates, and deletes too. Does anyone know why it would be greyed out? The ConnectionString property in the .aspx code is dynamic, but this shouldn't be the reason, because I have used this before with success.
Our company is doing some basic research into using Analysis Services as a solution for our customers. I've taken our database, built an Analysis Services project in BI Studio, and deployed it to Analysis Services on my local machine. I connect to Reporting Services in Management Studio, create a folder for the cube, give my Windows user account the Content Manager permission, create the data source (connection string = "Provider=SQLNCLI.1;Data Source=(local)\SQLServer2k5;Integrated Security=SSPI;Initial Catalog=STI-PDB", copied from the Analysis Services project in BI Studio), and go to "Generate Model". What I get is an rsModelGenerationError, sub [Create Perspective From Cubes], with a connection issue at the base:
Either the user, <computername>\Thomas, does not have access to the STI-PDB database, or the database does not exist. () (Microsoft SQL Server 2005 Analysis Services)
Now, in the Database Engine, my Windows account has the db_owner database role for STI-PDB, and its default schema is dbo, which all the objects are created under. What am I missing here?
I'm trying to generate an UPDATE statement based on SELECT statement results. A very basic example:
SELECT ListID FROM DListing WHERE Status = 2

Results return 3 rows: 1234, 2345, 3456
How can I take those results and turn them into the WHERE criteria for an UPDATE statement?

The generated UPDATE statement should look like this:
UPDATE DListing SET Status = 1 WHERE ListID IN (1234,2345,3456)
I have started by creating a temp table to hold my SELECT results, but I don't know how to get those results into the format for the IN condition. Right now I get 3 UPDATE statements, with 1 ListID in each IN condition.
CREATE TABLE #TempStatusUpdate (ListID INT)

INSERT INTO #TempStatusUpdate
SELECT ListID FROM DListing WHERE Status = 2

SELECT 'UPDATE DListing SET Status = 1 WHERE ListID IN (' + CONVERT(VARCHAR(30), ListID) + ') AND Status = 2'
FROM #TempStatusUpdate

DROP TABLE #TempStatusUpdate
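One way to collapse the temp-table rows into a single IN list is string concatenation into a variable. A minimal sketch, assuming SQL Server 2005 or later for VARCHAR(MAX) and reusing the #TempStatusUpdate table above:

DECLARE @idList VARCHAR(MAX)

-- Concatenate every ListID into one comma-separated string
SELECT @idList = COALESCE(@idList + ',', '') + CONVERT(VARCHAR(30), ListID)
FROM #TempStatusUpdate

-- Emit a single UPDATE statement containing the whole list
SELECT 'UPDATE DListing SET Status = 1 WHERE ListID IN (' + @idList + ') AND Status = 2'

If generating the statement text is not a hard requirement, UPDATE DListing SET Status = 1 WHERE ListID IN (SELECT ListID FROM #TempStatusUpdate) does the same work directly.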
I am exporting data from SQL Server 7.0 to a text file using DTS. A subset of the entire table is extracted based upon an indicator flag.

Question: How can I reset the flag while exporting the data? I am trying to avoid doing this in two separate steps, because other users might update the data between the two steps, which would cause some rows to be flagged as exported when they, in fact, were not.
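One pattern that avoids the two-step race is to stamp the batch in a single statement first and then export only the stamped rows. A hedged sketch; the table name and flag values are assumptions, not from the original post:

-- Stamp the rows that belong to this export in one atomic statement.
-- Rows flagged by other users after this point keep ExportFlag = 1 and wait for the next run.
UPDATE MyExportTable
SET ExportFlag = 2           -- 2 = selected for the current export
WHERE ExportFlag = 1         -- 1 = ready to export

-- DTS source query for the text file:
SELECT *
FROM MyExportTable
WHERE ExportFlag = 2

-- Final step once the export succeeds:
UPDATE MyExportTable
SET ExportFlag = 0           -- 0 = exported
WHERE ExportFlag = 2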
I am new to SQL Server 2005 SSIS. I would like to know whether the output of a data flow can be the data source of an UPDATE SQL statement or not. If yes, how?

My situation: I use a data flow task to select and transform some data from a table A. Then I transfer the output to a recordset destination and store the value in a variable X.

After the data flow task finishes, I want to use that data to perform the update SQL with an Execute SQL Task in the control flow, e.g.
update table B set fieldB1 = fieldA1 from X where B.key = X.key
I know I can store the data flow output in a temp table, run the SQL, and then drop the temp table. But is that approach slow? Is using a variable/recordset much faster?
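For reference, the staging-table variant usually boils down to one set-based statement in an Execute SQL Task. A minimal sketch; the table and column names are assumptions:

-- The data flow writes its output to dbo.StagingX, then the Execute SQL Task runs:
UPDATE B
SET B.fieldB1 = S.fieldA1
FROM TableB AS B
INNER JOIN dbo.StagingX AS S ON B.[key] = S.[key]

-- Clean up the staging table afterwards
TRUNCATE TABLE dbo.StagingX

A set-based update like this is generally faster for large row counts than issuing one parameterized UPDATE per row from a recordset variable.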
I am having a brain-fart or something. I need help!!!
I'm developing a transform that selects data from a SQL script, then transforms that data, and then needs to update the data back into the same source table. The transform is working great, but I can't figure out the update table part. I've got it adding the transformed records to the table, but that's not what I need to do. I need to update the table with the changed data.
Any sample code or blog links would be greatly appreciated!
I've had little success googling/searching for this (so far).
Given a simple spreadsheet:
StoreNumber StoreName
1 UPDStoreName_1
2 UPDStoreName_2
3 UPDStoreName_3
4 NEWStoreName_4
I want to have an SSIS package that will update a table, mystores (storenumber int, storename nvarchar(255)), which currently contains:
StoreNumber StoreName
1 StoreName_1
2 StoreName_2
3 StoreName_3
5 StoreName_5
What I need to do is insert the new rows, update the existing ones, and leave the remaining rows unchanged, i.e.:
StoreNumber StoreName
1 UPDStoreName_1
2 UPDStoreName_2
3 UPDStoreName_3
4 NEWStoreName_4
5 StoreName_5
(the UPD and NEW are added to simplify the example).
Now, the default action of an Excel source into an OLE DB destination is an insert into the table, so the PK constraint causes failures.

Now, given that the table is referred to by other tables and is part of a 24x7 website, how do I change the SSIS package so that, on a row-by-row basis, an upsert (update or insert) is performed?
The only idea I have so far is:
create temp table
insert excel data into temp table
iterate through the table, using "if exists ... update, else insert" logic - this to be done in a stored procedure (a sketch follows below)
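A minimal sketch of that stored procedure logic, written set-based rather than row by row; it assumes the Excel rows have been landed in a staging table called StoresStaging, and the names are illustrative only:

-- Update the stores that already exist
UPDATE m
SET m.storename = s.StoreName
FROM mystores AS m
INNER JOIN StoresStaging AS s ON m.storenumber = s.StoreNumber

-- Insert the stores that are new
INSERT INTO mystores (storenumber, storename)
SELECT s.StoreNumber, s.StoreName
FROM StoresStaging AS s
WHERE NOT EXISTS (SELECT 1 FROM mystores AS m WHERE m.storenumber = s.StoreNumber)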
Take a Data Flow Task with an OLE DB Source component and an OLE DB Destination component in it. The Source component's source is stored in a SQLQuery variable, and the Destination component's destination is stored in a TableName variable. The Data Flow Task is put into a For Loop container.
I just want to do one thing:
In the For Loop, each iteration sends a new value to the SQLQuery variable and the TableName variable, so I can use the same logic to transfer many tables.

My question is: I really can send new values to the variables and make it run, but every time it says "The external metadata column collection is out of synchronization with the data source columns", because the external metadata was recorded at design time and is not changed automatically on each iteration of the loop.

How can I update the source columns' metadata automatically?
I'm trying to work out how to update a table from a flat file source.
I could use the simple way of using OLE DB Command with
UPDATE <tblName> SET Column1 = ?, Column2 = ?, <etc> WHERE ID = ?
then using the mapping to join the parameter fields.
However, this is (maybe!) a slow process when you have potentially 500k+ records in the source file (and approx. 150 files to process!).

What I would like to do is the following statement, but I'm not sure if this is possible:
UPDATE <tblName> SET Column1 = sourceColumn1, Column2 = sourceColumn1, <etc> FROM <sourcefile> WHERE ID = sourceID
I've also thought about importing the source file into a Recordset Destination and trying to use the variable in the SQLCommand, but I'm not sure how to do this...
Does anybody have any ideas on how this could work or an alternative solution?
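If the flat file were loaded into a staging table first, the per-row OLE DB Command could be replaced by a single set-based statement in an Execute SQL Task. A rough sketch only; the table and column names are placeholders:

-- Load the source file into dbo.FileStaging with a fast destination, then run:
UPDATE t
SET t.Column1 = s.Column1,
    t.Column2 = s.Column2
FROM TargetTable AS t
INNER JOIN dbo.FileStaging AS s ON t.ID = s.ID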
Hi! A question just struck me that I cannot test right now (I don't have VS2005 on this computer)... Is it possible to add a database field value to an XML Source object attribute?

For instance, <Object id=1 name="void" description="" /> is my source XML... in this case I want to replace the description attribute value (string.Empty) with a value from my DB... can this be done?
We are going insane trying to start Report Manager on a SQL Reporting Services 2000 installation. The programmer/admin who originally set this up for us is gone.
We recently upgraded a database/application server to a new server, causing the data source being used by reports in reporting services to no longer be valid. Unfortunately, we do not have access to the original report project to 're-deploy' with the corrected data source.
We desperately need to get the reports that are installed to retrieve their data from the new database location/machine. We understand this can be done by specifying a new data source in Report Manager. (To clarify, we have NOT moved the report server installation or database; these remain in place - it's just the deployed reports that no longer point to the correct data source.)
For some reason, our Report Manager will no longer run - when we try to launch it, we get a Windows login dialog and nothing works there. We've tried both local and domain admins and constantly get ACCESS IS DENIED 401.3 error messages saying that we do not have permission / there are problems with ACLs.
We've gone so far as to allow EVERYONE read/write access to the ReportManager and ReportServer folders and the RS virtual directories, but nothing seems to help.
Can anyone help with this? Ideas on how to change our data source, or how to get back into Report Manager?
Since we are somewhat technical, but not experts, and don't have much more time to waste, we are willing to pay $500 for this project to someone willing to access the server and fix the problem so that the reports point to the correct database and restore access to Report Manager.
This is my flow in one of my packages (an ETL job): an Excel file contains monthly revenue details, and I want to import the Excel data into my database staging table, so I've created the package. It's working fine...

The problem: when we put in the new data for the next month and run the package, it does not run. It's the same file, same format; we only delete the contents of the file, except the first row of the Excel sheet, and paste in the new data. The new data comes from an Oracle database in the form of an Excel sheet (they manually copy the data and send it to us).

I open the package in design mode and, on double-clicking the Excel file source, it says <column name>'s metadata needs to be synchronized: "Do you want to fix this issue automatically with the available external column's metadata?"

Clearly it is a data type issue; I have changed the corresponding data types to match the previous Excel sheet, which is equivalent to the table it's copying into.

Now the package runs with validation warnings, such as External column "Invoice Amount" needs to be updated, etc. I can see some two or three warning messages in the package execution wizard.

OK, I'm ready to accept these warnings, and I want my package to run from my server (the packages have been deployed to a centralized server; every time we want to run a package, we have an ASP.NET web page that executes the package in an OnClick event).

The package is not running from the server; it's due to the metadata change in the Excel file (I guess).

Please suggest some guidelines to resolve this metadata issue; I want my Excel sheet metadata not to change when we have new updates in it.

Otherwise, suggest some solutions so that I can validate the Excel sheet before running the package and test whether the data is in the correct format or not; it's a kind of data profiling activity.

I know it's somewhat crazy, but I need to maintain the system with a permanent solution instead of facing this metadata mismatch issue!

A somewhat lengthy explanation, but it's needed for my dear powerful Microsoft responders. I think I've explained my problem clearly; if I haven't, let me know your queries and I'll try my level best.
Firstly, thanks a lot Phil and Jamie for such a helpful article on "Checking to see if a record exists and if so update else insert".

Here is my question:
I have about 10 tables and their respective working tables. For example: A, B, C, D, E, ... and WorkA, WorkB, WorkC, ...

Notes:
1) When I execute a package, these work tables (WorkA, WorkB, ...) get populated with certain rows, say about 5.
2) It's not that all the work tables are populated on every execution.
3) Tables A, B, C, ... have thousands of records in them.
4) Each work table has the same structure as its parent table; e.g. WorkA has the same structure as A.
5) Table A and WorkA (and so on) are linked by a KeyID.

Now I want to build an SSIS package that can:
1) Get the data from these multiple tables (WorkA, WorkB, ...).
2) Process each row of these tables WorkA, WorkB, ...
3) Depending upon the KeyID of WorkA, WorkB, etc., update a Flag column of table A, B, ... where the KeyID is equal to the KeyID of the work table.
4) After updating, insert that processed row of WorkA, WorkB, ... into table A, B, ... (a sketch of steps 3 and 4 for one pair follows below).

I can do this if I have one source table and one destination table. Here I have, say, 10 source tables going to their respective destinations. All I could think of was creating 10 different packages, as advised in Jamie's article, but I am sure there must be some other alternative.

Can somebody advise me on the best practice for doing this? Thanks a lot in advance.
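For a single table pair, a minimal sketch of steps 3 and 4 above; the Flag and other column names are assumptions:

-- Step 3: flag the parent rows that match a work row
UPDATE a
SET a.Flag = 1
FROM A AS a
INNER JOIN WorkA AS w ON a.KeyID = w.KeyID

-- Step 4: insert the processed work rows into the parent table
INSERT INTO A (KeyID, SomeColumn, Flag)
SELECT w.KeyID, w.SomeColumn, 0
FROM WorkA AS w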
Hello everybody,
I was configuring a SqlDataSource control using SQL Authentication mode. I first added a database file (testdb.mdf) through Solution Explorer > Add New Item. Then, through Database Explorer, I created a table named "info". Then, while configuring the SqlDataSource control, I used SQL Authentication mode and attached the "testdb.mdf" database file. Test Connection showed success, but when I hit the OK button of the wizard it displayed the following error message:
Failed to generate a user instance of SQL Server. Only an integrated connection can generate a user instance.
While configuring the SqlDataSource control I clicked "New Connection". Under the Data Source section I tried both Microsoft SQL Server and Microsoft SQL Server Database File, and in both cases I attached a database file (testdb.mdf). Please enlighten me on this.
Thanks and regards,
Sankar.
I want to update @Stop.UserField with the value from @UpdateSource where @UpdateSource.HasPathway = @Stop.UserField... but I need to use the @FieldDescription table to determine how to map the columns.
I have got a package whose source is a SQL table with 50 columns, of which we are using only 10. Recently one column name changed, and the package now throws an "invalid mapping" error. When I opened the source to make the change, I noticed that all the columns are now preselected and the data types have reverted to the defaults (I had changed the data types to suit my requirements when I developed the package). So now I have to select the required columns from the source again and redo the data type changes in the Advanced Editor. Is there any option that doesn't disturb these settings, so that we just need to correct the mapping alone?
Hi, I want to transform an XML flow into an HTML flow. For this, I create an XmlDataDocument, add to it everything I want, and store it in a file (in a data flow task). After that, in an XML Task, I set the input properties like this: OperationType: XSLT, SourceType: File connection, Source: ras.xml.

All works fine. But now I want to use a variable to store my XML data. So, instead of storing my data in ras.xml, I store it in a variable like this:
Dim v As Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSVariables90
Me.VariableDispenser.LockOneForWrite("RAS", v)
v("RAS").Value = doc.OuterXml
v.Unlock()
and in the XML Task, I set the configuration to use the RAS variable as the source.

But when I execute, I get the following exception:
Error: 0xC002F304 at Format HTML Mail, XML Task: An error occurred with the following error message: "Root element is missing.".
Error: 0xC002928F at Format HTML Mail, XML Task: Property "New Source" has no source Xml text; Xml Text is either invalid, null or empty string.
Furthermore, when I try to debug it by replacing the XML Task with a Script Task, I can see that my RAS variable contains the XML data, and it is well formed.
I was trying to load data using SSIS: a Data Flow Task with an OLE DB Source (the source was a view) and an OLE DB Destination (SQL Server). This view returns 420,591 rows in Query Analyzer in 21 seconds; the row length is 925. When I tried to execute the Data Flow Task from SSIS, I had to stop the process after 30 minutes because only 2,000 rows had been retrieved. I modified the view to return the TOP 440,000 rows and reran; this time all 420,591 rows were retrieved and written in 22 seconds. Next, I tried TOP 100 PERCENT; again, only 2,000 rows were returned after 30 minutes. TempDB is on a separate SAN RAID group with 200 GB free, the databases are on a separate drive with 200 GB free, the server has 13 GB of memory, and no other processes were executing.

The only way I could populate the table from SSIS was by using an Execute SQL Task with a hard-coded INSERT into the table selecting data from the view (35 seconds).

Has anyone else experienced this or a similar issue? Does anyone have a solution or explanation?
I have a huge XML file with nested sections, and I also have an XSD file for that XML. I have a destination table that the data from the XML should be loaded into, and I am using the XML Source component. But to get all the data I need to use multiple Merge Joins to get the data into a single row that I can insert into the destination; I was not quite convinced about using so many joins.

So I tried using a script source component instead, where I use XML objects to get the nodes and dynamically construct the data row, and the output is then inserted into the destination. Comparing the two approaches, the one using the script source works much faster than the XML Source component.

I wanted to know whether there is any limitation to using the script source to parse XML files. I would also like to know of any better way of getting the data from the XML Source without using the joins.
I am trying to create a program that transfers tables to flat files. At this point in time, I have succeeded in creating one that creates delimited files.

However, I am now trying to create fixed-width files, as you can do with the SSIS designer, but programmatically.

Is there a way to programmatically determine the width of a column from the source table? I cannot seem to find any kind of function or member that stores this information or allows me to retrieve it.
I know what I need to change in order to set a width for a column, but I just don't know how to find the width without just asking the user to provide one.
Hi, in this code how can I create a new data source, a new data source view, and the model and structure so that it runs dynamically? In this code I have a lot of errors; they are about the server and database not being defined in the current code. In this code, should I first define the server or not?

How can I create the data source, data source view, model, and structure? Please show the code for that and guide me; databasename and srv are unknown. Do I need to add another reference for Analysis Services? Please explain this code:

1) RelationalDataSource dsNew = new RelationalDataSource(
       datasourceName,
       Utils.GetSyntacticallyValidID(datasourceName, typeof(RelationalDataSource)));
I have set up a new connection as a connection from data source, but I cannot see how to use this connection to create my Data Flow Source. I have tried using an OLE DB connection, but this is painfully slow! The process of loading 10,000 rows takes 14 - 15 minutes. The same process in Access using SQL on a linked table via DSN takes 45 seconds.
Have I missed something in my set up of the OLE DB source / connection? Will a DSN source be faster?
Hi! When MS published the starter kits, there were .sql files in App_Data. These files contained some sample data for a DB. How do I create such files when I have a database with data? Jarod
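One way to produce such a .sql file by hand is to have a query emit the INSERT statements and save the output to a file. A rough sketch; the info table and its two columns are just an example:

SELECT 'INSERT INTO info (Id, Name) VALUES ('
     + CONVERT(VARCHAR(20), Id) + ', '''
     + REPLACE(Name, '''', '''''') + ''');'
FROM info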
One of my first tasks on my current project was to generate all of the SQL scripts needed to rebuild the database. Of course, one of the catches was that everything needed to be separated out (table columns in a different script from PKs, different from FKs, different from DEFAULTs, etc.). And we wanted our own comment block inserted. So, I couldn't just use Generate SQL Script from Enterprise Manager and be done with it; I was going to need to do some more work. Well, here's one of the scripts I wrote to build the FK scripts. Execute this script, then copy and paste the results into a new window. Break apart into separate .sql files as desired.
NOTE: Be sure to show results in Text (not grid) and change the display options in QA to return more than just 256 characters.
/***************************************************************************************************
Author:  Mark Caldwell
Date:    11/15/2002
Descrip: Generate script commands to ADD FOREIGN KEY CONSTRAINTS for constraints already in the DB.

NOTE: Be sure to set your Tools/Options/Results/Maximum characters per column to a large enough
      number (such as 2000) to display the entire command.
***************************************************************************************************/
SELECT '----------------------------------------------------------------------------------------------------
-- ' + so2.name + ' to ' + so3.name + '
----------------------------------------------------------------------------------------------------
IF OBJECTPROPERTY(OBJECT_ID(N''' + so1.name + '''), ''IsForeignKey'') = 1
BEGIN
    ALTER TABLE ' + so2.name + ' DROP CONSTRAINT ' + so1.name + '
    PRINT ''  -- DRP - ' + so1.name + '''
END
G' + 'O
'
FROM sysforeignkeys sfk
JOIN sysobjects so1 ON sfk.constid = so1.id
JOIN sysobjects so2 ON sfk.fkeyid = so2.id
JOIN sysobjects so3 ON sfk.rkeyid = so3.id
JOIN INFORMATION_SCHEMA.COLUMNS ISC1 ON so2.name = ISC1.TABLE_NAME AND sfk.fkey = ISC1.ORDINAL_POSITION
JOIN INFORMATION_SCHEMA.COLUMNS ISC2 ON so3.name = ISC2.TABLE_NAME AND sfk.rkey = ISC2.ORDINAL_POSITION
ORDER BY so2.name
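For the ADD side, here is a minimal companion sketch in the same style; it assumes single-column foreign keys (composite keys would need extra handling):

SELECT 'ALTER TABLE ' + so2.name
     + ' ADD CONSTRAINT ' + so1.name
     + ' FOREIGN KEY (' + ISC1.COLUMN_NAME + ')'
     + ' REFERENCES ' + so3.name + ' (' + ISC2.COLUMN_NAME + ')'
FROM sysforeignkeys sfk
JOIN sysobjects so1 ON sfk.constid = so1.id
JOIN sysobjects so2 ON sfk.fkeyid = so2.id
JOIN sysobjects so3 ON sfk.rkeyid = so3.id
JOIN INFORMATION_SCHEMA.COLUMNS ISC1 ON so2.name = ISC1.TABLE_NAME AND sfk.fkey = ISC1.ORDINAL_POSITION
JOIN INFORMATION_SCHEMA.COLUMNS ISC2 ON so3.name = ISC2.TABLE_NAME AND sfk.rkey = ISC2.ORDINAL_POSITION
ORDER BY so2.name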
Hi everybody, I want to create a data source and data source view for data mining using C#. I have created a data source and data source view and exported them to an XML file, but when I move to another computer and run those XML files, it returns an error when I run the statement to create and build the mining model. What can I change in the XML, or how can I run the XML on another computer successfully? And if I have to build the data source and data source view there, how do I do it?
Today I was making a few reports. When I tested the reports in Visual Studio, they worked great: I got the expected results. But when I deployed the reports to our report server, the problems started. When I click on the directory in which my reports are deployed, I see my 4 reports; up to that point everything works correctly. But when I click on a report to view the results, it goes wrong. I get an error: "Cannot create a connection to data source 'Live'" (Live is the name of our data source).

We are using Windows logons, and I am sure that I have all the rights on the server; I gave myself 'sysadmin' rights, so it should work. I have also tried it with all the roles assigned to my account, but it still won't work.

When I modify the data source and set it to another server and database, it works. The data source 'Live' is on an x64 MS SQL server, and the other data source is on an x86 MS SQL server. Maybe that is the problem?