We desperately need RetainSameConnection set to True on our ADO.NET - ODBC connection manager. Unfortunately, RetainSameConnection always reverts to False when you open the package.
* Is RetainSameConnection supposed to work for the ADO.NET - ODBC combination?
* Is it a bug that it reverts to False for the ADO.NET - ODBC combination?
I am a little bit confused about transactions, the connection manager property "RetainSameConnection", and the execution of a child package...
I have a parent package which should execute some child packages in a loop, depending on a SQL dataset. Doing this using variables works fine. I am also setting the variables for source and destination, and I have a connection manager for source and destination in the parent package.
The child packages also have source and destination connection managers that depend on variables coming from the parent package.
I am forced to use transactions on the destination so that I can roll back if an error occurs.
When I initiate the transaction in the parent and call a child package whose own connection managers depend on variables coming from the parent: will the child run in the same transaction, and will it be possible to commit that transaction?
If this is not a proper way to do it, is there another way to handle this?
My package creates a set of #temp tables in Execute SQL Tasks that are then used for lookups in a data flow. All of the #temp tables are created using the same OLE DB connection manager, and all of the lookup transforms are using this same connection manager. I have set RetainSameConnection=true for the connection manager.
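For illustration, each Execute SQL Task runs statements along these lines (the table and column names here are made up; the real ones differ):

CREATE TABLE #TempTableName (LookupKey int PRIMARY KEY, LookupValue varchar(50))

INSERT INTO #TempTableName (LookupKey, LookupValue)
SELECT SomeKey, SomeValue
FROM dbo.SomeSourceTable

-- the lookup transforms then query the retained session, e.g.:
-- SELECT LookupKey, LookupValue FROM #TempTableName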
However, when the data flow task is caching the lookup rows in pre-execute, it fails on one of the #temp tables. The error message I receive is:
Invalid object name '#TempTableName'
When I watch what's happening in SQL Server Profiler, I can see that all of the lookup tables are created within the same process. The caching is likewise performed by this same process, EXCEPT for the caching of the #TempTableName table. For some reason, when the caching of this lookup table is starting, I see a new login, and the caching SQL is executed under this new login, which of course does not have access to the temp table created on the previous connection.
Does anyone know why the connection manager would be connecting a second time for this lookup?
I've seen variations on this question in this forum, but none of the suggestions work for me, so any ideas would be greatly appreciated!
I have an SSIS package whose function is to monitor a directory for flat files and distribute the information in these files into a SQL 2005 DB for further processing later. I've configured transactions for the package as serializable and Supported, and I have a ForEach container set to serializable and Required. All SQL and Data Flow tasks take place within this container. I have a single connection manager to pull in the flat input file, and another for the connection to the SQL DB. This setup functions, and rolls back if necessary, but I take a huge hit on the continual establishment of connections to the DB.

If I set the RetainSameConnection property on the SQL DB connection manager, I get an error when executing my first Data Flow task to move data from the flat input file to working tables within the DB: 0xC001A004 - Incompatible transaction context was specified for a retained connection. If I turn off transaction support, the RetainSameConnection property works as expected.
I've been running Profiler traces and can verify that no connection has been established to the SQL Server prior to execution of this task, so it seems like the context should be OK. I would welcome any feedback, suggestions, or alternatives.
Service Broker will let you create two services with the same name (one with a contract and another without):
CREATE SERVICE [Order Msg Recieve] AUTHORIZATION [dbo] ON QUEUE [dbo].[Order Return Msg Queue]
CREATE SERVICE [Order Msg Receive] AUTHORIZATION [dbo] ON QUEUE [ODS].[Order Return Msg Queue] ([OrderSubmission])
When you drop the service...
drop service [Order Msg Recieve]
It will only drop the first one. In BOL there is no syntax for telling it to drop the second one; however, you can drop it from SQL Server Management Studio.
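For what it's worth, you can see both rows side by side by querying the catalog directly (a quick check; only the service names come from the example above):

SELECT s.name, s.service_id, q.name AS queue_name
FROM sys.services AS s
JOIN sys.service_queues AS q ON q.object_id = s.service_queue_id
WHERE s.name LIKE 'Order Msg%'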
When upgrading a database from MSDE to SQL Server 2005 Express, I've noticed that the system tables in the msdb database don't get upgraded. For example, the sysschedules table doesn't get added. Is this by design, or a bug?
Try this in SQL Server 2005:

SELECT COALESCE(a1, CHAR(254)) AS c1
FROM (SELECT 'Z' AS a1 UNION SELECT 'Ya' AS a1 UNION SELECT 'Y' AS a1 UNION SELECT 'W' AS a1) AS b1
GROUP BY a1 WITH ROLLUP
ORDER BY c1

SELECT COALESCE(a1, CHAR(255)) AS c1
FROM (SELECT 'Z' AS a1 UNION SELECT 'Ya' AS a1 UNION SELECT 'Y' AS a1 UNION SELECT 'W' AS a1) AS b1
GROUP BY a1 WITH ROLLUP
ORDER BY c1
The only difference is that the first one uses 254 and the second one uses 255. The first sorts like this: W, Y, Ya, Z, þ. The second one sorts like this: W, Y, ÿ, Ya, Z. Is this expected behavior?
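A quick way to see where the two placeholder characters land on their own, without the rollup, is to sort them alongside the literals (same derived-table trick as above):

SELECT a1
FROM (SELECT 'W' AS a1 UNION SELECT 'Y' AS a1 UNION SELECT 'Ya' AS a1
      UNION SELECT 'Z' AS a1 UNION SELECT CHAR(254) AS a1 UNION SELECT CHAR(255) AS a1) AS b1
ORDER BY a1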
Has anyone noticed that if you create a DTS package and then try to change its connection properties (i.e., change the DSN or redirect the DTS package to a different server or database), as soon as you click OK it does not save the new password, and hence the package no longer works? In my case, to move a database from the development to the production server, I would have to recreate all DTS packages. Is there any way around this?
CREATE TABLE ItemInformation([Description] varchar(80))
GO

INSERT INTO ItemInformation([Description])
SELECT 'CHOCOLATE CHIP‚' UNION ALL
SELECT '‚COOKIES‚' UNION ALL
SELECT '‚CROISSANTS *PLAIN*‚' UNION ALL
SELECT '‚DONUTS‚' UNION ALL
SELECT '‚DONUTS *DOZEN*‚' UNION ALL
SELECT '‚MUFFINS‚' UNION ALL
SELECT '‚BAGELS‚' UNION ALL
SELECT '‚ROLLS‚' UNION ALL
SELECT '‚CUPCAKES‚' UNION ALL
SELECT '‚CRISPIES‚' UNION ALL
SELECT '‚DANISH/SWEET ROLLS‚' UNION ALL
SELECT '‚FUDGE BROWNIES‚' UNION ALL
SELECT '‚PUFF PASTRIES/ECCLES‚' UNION ALL
SELECT '‚STICKY BUNS‚' UNION ALL
SELECT '‚TURNOVERS‚' UNION ALL
SELECT '‚BLACK & WHITE COOKIES‚' UNION ALL
SELECT '‚LINZER TARTS‚' UNION ALL
SELECT '‚SCONES/BISCUITS‚' UNION ALL
SELECT '‚SCUFFINS‚' UNION ALL
SELECT '‚SINFULL BITS‚'
GO
SELECT * FROM ItemInformation
GO

UPDATE ItemInformation SET [Description] = REPLACE([Description],',','')
GO

SELECT [Description], LEN([Description]) FROM ItemInformation
GO
SELECT REPLACE([Description],',','') FROM ItemInformation
SELECT REPLACE([Description],'C','') FROM ItemInformation
SELECT CHARINDEX(',',[Description]) FROM ItemInformation
GO

INSERT INTO ItemInformation([Description])
SELECT 'CHOCOLATE, CHIP‚' UNION ALL
SELECT 'CHOCOLATE, CHIP‚' UNION ALL
SELECT ',CHOCOLATE, CHIP‚' UNION ALL
SELECT ',CHOCOLATE, CHIP‚ ' UNION ALL
SELECT ',CHOCOLATE, CHIP‚ A' UNION ALL
SELECT ',CHOCOLATE, CHIP‚ , '
GO

SELECT REPLACE([Description],',','') FROM ItemInformation
GO
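To confirm whether the trailing character really is a comma (ASCII 44) or just something that looks like one, a diagnostic along these lines can help (my addition, not part of the original script):

SELECT [Description], ASCII(RIGHT([Description], 1)) AS last_char_code
FROM ItemInformation
GO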
Currently, our Reporting Services site is set up so that all Domain Users can access it. We are starting to use Report Builder now. I have enabled the My Reports folder feature, which creates a user folder for anyone who logs onto the site. Is there a way to create user folders only for certain people?
I just installed SQL Server Express, with Reporting Services, on my own laptop.
However, when I go to http://localhost/Reports/ I can see my uploaded reports, but the subscription links are not available. In Properties, I cannot see "Execution" either.
Can someone help me get the Subscriptions feature enabled again?
Hi! I am having a terrible time with the report feature. First of all, my queries through it do not match my queries through SQL Server 2005 Express Management Studio. When I enter my fields through the auto-query function, it moves them, and when I try to put them where I think they belong (by editing the SQL code), my queries get all messed up.
Is there a way to import my SQL query and run it through the report feature?
I am using Windows XP Pro, Microsoft Visual Studio 2005 version 8.0.50727.42, and Microsoft SQL Server Reporting Services Designers version 9.00.2047.00.
There's a feature in Oracle that allows you to modify tables, columns, values, and data from its Enterprise Console, the same way you can in SQL Server. In Oracle, however, there's a button called 'Show SQL' that allows you to see and copy/paste the resulting SQL for the changes made via the console.
I would imagine that SQL Server has a similar option. The reason I ask is that I would like to learn more fully how to do this through Query Analyzer and get more familiar with the SQL involved, and I would be able to do this if I could see the resulting SQL from Enterprise Manager.
Hope this makes sense.
I did find something in SQL Server called 'Generate SQL', but this doesn't update automatically as you make changes.
I installed the full-text feature on a previously installed SQL Server. It seemed to install and did not give me an error, but when I run "EXEC sp_fulltext_database 'enable'" it gives me a message that no full-text search feature is installed. Any idea?
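One quick sanity check is the standard server property, which reports whether the engine itself considers full-text installed (1) or not (0):

SELECT SERVERPROPERTY('IsFullTextInstalled')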
I need some help... I'm trying to execute a stored procedure and I'm getting this message:
Run-Time Error '-2147217887 (80040e21)':
[Microsoft][ODBC SQL Server Driver]Optional feature not implemented
Here is the code:

Public Function D2L(sconnect As Variant, dDate As Variant) As Variant
    Dim rsDate As ADODB.Recordset
    Dim cmdDate As ADODB.Command
    Dim prmDate As ADODB.Parameter

    Set cmdDate = New ADODB.Command
    Set ADOConn = New ADODB.Connection
    ADOConn.Open sconnect

    Set cmdDate.ActiveConnection = ADOConn
    cmdDate.CommandText = "dbo.UP_CVRT_DATE_TO_LONG"
    cmdDate.CommandType = adCmdStoredProc

    Set prmDate = New ADODB.Parameter
    prmDate.Type = adDate
    'prmDate.Size = 32
    prmDate.Direction = adParamInput
    prmDate.Value = dDate
    cmdDate.Parameters.Append prmDate
Microsoft Internet Information Services (IIS) is either not installed or is disabled. IIS is required by some SQL Server features. Without IIS, some SQL Server features will not be available for installation. To install all SQL Server features, install IIS from Add or Remove Programs in Control Panel or enable the IIS service through the Control Panel if it is already installed, and then run SQL Server Setup again. For a list of features that depend on IIS, see Features Supported by Editions of SQL Server in Books Online.
How do I get hold of Internet Information Services? I am installing on Vista Premium.
I have a package which loads the fact data from Stage into the Warehouse database. This package normally handles early-arriving facts: I use a lookup to check which dimensions exist, and where they don't, I populate the dimension and use the surrogate key to load the facts. This works fine.
I had a request to load 7 years' worth of historical data. Instead of re-writing the package, I took the package which handles early-arriving facts and deleted the section which handles them. I knew all the dimensions already existed, and I didn't want to hinder performance when loading millions of rows. During testing I found something very interesting.
If you have configured an error path in the lookup component and removed that error path later, the package will NOT fail (won't produce an error) even if the lookup can't find matching values.
Correct Behaviour Example 1:
[1] Stage fact table has 2 records, with product codes 1 and 2.
[2] Warehouse Product table has only product code 1.
[3] Source - Lookup - Destination in the data flow task. Error port on lookup is not configured.
[4] From source we read 2 records, and the package will fail at lookup as it can't find product code 2.

Correct Behaviour Example 2:
[1] Stage fact table has 2 records, with product codes 1 and 2.
[2] Warehouse Product table has only product code 1.
[3] Source - Lookup - Destination in the data flow task. Error port on lookup is configured to go to RowCount.
[4] From source we read 2 records, and the package will run successfully. It will put one record into the warehouse table and send the invalid record to RowCount.

Incorrect Behaviour Example 3:
[1] Stage fact table has 2 records, with product codes 1 and 2.
[2] Warehouse Product table has only product code 1.
[3] Source - Lookup - Destination in the data flow task. Delete the configured error port from the lookup.
[4] From source we read 2 records, and the package will run successfully. It will put one record into the warehouse table and discard the other.
My understanding is that if the error port is NOT configured as shown in Example 2, the package should fail as shown in Example 1.
Am I missing a point, is this supposed to be correct behaviour, or is it a bug?
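For anyone who wants to reproduce this, a minimal version of the two tables from the examples would be something like (names are mine, not the real warehouse schema):

CREATE TABLE StageFact (ProductCode int)
INSERT INTO StageFact VALUES (1)
INSERT INTO StageFact VALUES (2)

CREATE TABLE DimProduct (ProductCode int)
INSERT INTO DimProduct VALUES (1)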
I tried SQL mirroring in Beta 1; then it was gone until SP1.
Now I cannot set up mirroring. It would be fine if it were just hard to set up, but it seems to be full of bugs! Mirroring has to be stable, since I am trying to mirror a production DB; what a disaster if something goes wrong.
I am trying two servers, both on build 9.0.3042. First I tried to set it up on my home machine, VPNed into my network. After I configured security, I see two connection strings:
and it doesn't show that error anymore. I am not sure why the connection string should be in the latter format, but in any case, how come SMO cannot get it right?
Then I get another error: SQL Server doesn't exist or cannot be accessed. I searched on the Internet, and it seems that error could mean almost anything, including that the mirror DB is not in restore mode.
But I did set the mirror DB to restore mode, and both sql5 and sql8 are in the same domain, physically close to each other.
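In case it helps diagnose, the equivalent T-SQL partnership setup I would expect to work looks like this (a sketch only: the database name, which server is principal, and the endpoint port 5022 are all placeholder assumptions, and it presumes the mirror was restored WITH NORECOVERY):

-- on the mirror server (sql8), point at the principal:
ALTER DATABASE ProductDb SET PARTNER = 'TCP://sql5.mydomain.com:5022'

-- then on the principal server (sql5), point at the mirror:
ALTER DATABASE ProductDb SET PARTNER = 'TCP://sql8.mydomain.com:5022'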
I am trying to set up a situation that includes both transactions and checkpoints, to make sure that when something goes wrong (a) I don't get data corruption (hence the transactions) and (b) I don't have to completely restart my 2-hour run (hence the checkpoints). However, I ran into something where I cannot tell whether it is intended behaviour or simply a bug.
Here's the deal: I have an SSIS package in which I enable checkpoints (CheckpointUsage: IfExists and SaveCheckpoints: True). I have 2 data flows which follow each other (the first data flow prepares data for the second data flow to edit). Because I want to make sure that my data is secure, I put a separate transaction on each of the data flows.
And here my problem arises. If I run my package now and the second data flow breaks, my checkpoint sends me back to the first data flow and my initial insert is executed again, which isn't meant to happen (I enabled checkpoints precisely to prevent rerunning items). Somehow my checkpoint does not register the fact that the first data flow has already been executed, and it will execute it again upon rerun.
However, if I put a random task between the 2 transacted data flows (for example an empty Script Task), it works as intended, as long as this inserted item doesn't have a transaction; if it does, the problem comes back. With the extra task in place, when I execute the package my checkpoint shows that the first data flow has already been executed, so it does not execute it again and starts at the second data flow upon re-execution.
I can work around it (with the empty Script Task), but I am still wondering why this is happening. I am very interested to hear whether this is really a bug or intended behaviour (and if it is intended, why?).
I have two input columns (both DT_I4) in the column collection of an Aggregate transform. I am doing a Group By on one column and a Count on the other.
To my surprise, the output column's data type is changed by the Count operation (to DT_UI8), and I have to add an extra Data Conversion transform to get my DT_I4 data type back.
A new developer tried to import a data file using the Import/Export Wizard. Unfortunately, the file had several format errors, and the process aborted after reading one bad line. (I imported it using a BIDS package with error handling; the file had 454 bad rows.)
Are there any options in the Import/Export Wizard to log or skip errors but continue processing? After all, the Import/Export Wizard is designed for quick-and-dirty use, and if it can't handle erroneous data gracefully, it is not a very practical tool.
I am currently using the SSIS logging feature in my SSIS package. I have defined a destination log file, and each time the package is executed the log file gets appended with that day's log.
I'm trying to figure out how best to keep the log file name static (it gets emailed out, and my email client looks for a particular log file name) yet have it include only today's log information, appending the rest of the log information to a history log file or something like that.
Has anyone tried doing something similar, or have any ideas on how best this can be accomplished?
Good day all, I'm looking for suggestions on how to handle calculating scores for search results. Primarily: would it be best to calculate the score on the SQL Server side, or in the application's logic after the results have been retrieved? I already have an idea of the calculations I want to do, which will be pretty simple: just a basic point system for containing all the queried terms, with additional points for the number of times those words appear on the page. Feedback or links to articles on this would be appreciated. I'm leaning towards doing it on the SQL side myself, but my SQL skills are not as polished as I would like for jumping into that, so examples would be good.
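To make the idea concrete, here is roughly the kind of scoring query I have in mind on the SQL side (the Pages table, its columns, and the two search terms are all made up for illustration):

SELECT PageId,
       (CASE WHEN Content LIKE '%apple%' THEN 10 ELSE 0 END
      + CASE WHEN Content LIKE '%pie%' THEN 10 ELSE 0 END
      + (LEN(Content) - LEN(REPLACE(Content, 'apple', ''))) / LEN('apple')   -- occurrences of 'apple'
      + (LEN(Content) - LEN(REPLACE(Content, 'pie', ''))) / LEN('pie')       -- occurrences of 'pie'
       ) AS Score
FROM Pages
ORDER BY Score DESC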
I am looking for a solution to allow users to generate data by selecting tables-->columns-->where clauses on the fly.
I am thinking of maintaining relationships, joins, etc., in some configuration tables. Based on the fields the user selects, I can read these conditions, generate a query, execute it, and export to Excel. Any ideas on a good way of storing relationships and the like in configuration tables?
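For concreteness, the configuration tables I have in mind would look something like this (just a sketch; all names are placeholders):

CREATE TABLE ConfigTables (TableId int PRIMARY KEY, TableName sysname)

CREATE TABLE ConfigColumns (ColumnId int PRIMARY KEY,
                            TableId int REFERENCES ConfigTables(TableId),
                            ColumnName sysname)

-- one row per allowed join between two columns
CREATE TABLE ConfigJoins (JoinId int PRIMARY KEY,
                          LeftColumnId int REFERENCES ConfigColumns(ColumnId),
                          RightColumnId int REFERENCES ConfigColumns(ColumnId))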
Also, please suggest a good Excel add-in for ad hoc reporting, or point me to any other ways of doing this.
If I have to do it from scratch, I'll probably use VB.NET.
I'm using SQL 2000 with SP4 (build 2187). I have created a DB maintenance plan with the wizard, where purging of data older than 1 day is set.
However, this feature does not seem to be working, even though I tried it two ways: 1) deleting the scheduled job and recreating it - not successful; 2) deleting the maintenance plan and recreating it - still not successful.
After installing the Microsoft SQL Server 2000 DTS Designer Components feature pack and then restarting Management Studio, I still cannot see or edit my DTS packages. I also tried editing them in the Integration Services console, but no luck there. I see them under msdb but have no option to edit them.