Hopefully one of you gurus can offer a solution to this problem:
First off I do not have the ability to use SQLMail - so that is not an option.
I have a DTS package that performs the following steps in order to populate an excel sheet with some data and email it off to a coworker:
1) An FTP task to copy the Excel sheet template from one folder to another on the same machine
2) A data pump between the SQL Server connection (local) and the copied Excel sheet
3) An Execute SQL task that truncates a table on the SQL Server
4) An ActiveX Script task containing the following code, which uses CDO to attach the just-populated Excel sheet to an outgoing email:
Function Main()
    ' Build the CDO message (the original code created a second, unused CDO.Message object)
    Dim objMail
    Set objMail = CreateObject("CDO.Message")
    objMail.From = "from@from.com"
    objMail.To = "to@to.com"
    ' Note the backslash in the path; CDO needs the full path to the populated sheet
    objMail.AddAttachment "F:\copiedexcelsheet.xls"
    objMail.Subject = "subject"
    objMail.HTMLBody = "Body"
    ' Depending on the environment you may also need to set SMTP settings via objMail.Configuration
    objMail.Send
    Set objMail = Nothing
    Main = DTSTaskExecResult_Success
End Function
The result of DTS execution is a failure on the activeX step:
Error Description: The process cannot access the file because it is being used by another process.
I assume this means Excel still has its grubby paws on the file, but I've tried alleviating this by (1) using a WAITFOR DELAY between the Excel connection and the ActiveX Script task, and (2) using the Execute Package task to run a separate package that contains only the ActiveX mail-sender script (thinking the parent package, and with it the Excel process, would terminate before the next package ran; maybe I'm wrong here?).
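If the lock does clear once the Excel connection's work finishes, one workaround sketch is to have the ActiveX task itself poll the workbook until it can be opened for writing before attempting the send. The path and timeout below are just examples; note the lock is usually held by the Jet/Excel provider inside the DTS process rather than by Excel.exe, so if it never clears until the package ends, the separate-package approach may still be needed:

Function Main()
    If WaitForFile("F:\copiedexcelsheet.xls", 60) Then
        ' ... build and send the CDO message as above ...
        Main = DTSTaskExecResult_Success
    Else
        Main = DTSTaskExecResult_Failure
    End If
End Function

' Poll until the workbook can be opened for writing, or the timeout (seconds) expires
Function WaitForFile(sPath, iMaxSeconds)
    Dim oFSO, oShell, oFile, i
    Set oFSO = CreateObject("Scripting.FileSystemObject")
    Set oShell = CreateObject("WScript.Shell")
    WaitForFile = False
    For i = 1 To iMaxSeconds
        On Error Resume Next
        Set oFile = oFSO.OpenTextFile(sPath, 8)  ' append mode fails while the file is locked
        If Err.Number = 0 Then
            oFile.Close
            WaitForFile = True
            Exit For
        End If
        Err.Clear
        On Error GoTo 0
        ' Crude one-second pause; WScript.Sleep is not available inside a DTS ActiveX task
        oShell.Run "cmd /c ping -n 2 127.0.0.1 > nul", 0, True
    Next
End Function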
I have a package the looks for any Excel files in a folder, moves the data to a SQL table, then archives the file to one of two archive folders--a success folder or an error folder. I have an OnError handler on the Data Flow that sets a flag that lets the archive process know where to move the file.
This works when the processing is successful. It also works when the error in the Data Flow occurs right off the bat, i.e., in the Source. When the error occurs later on, say in the Destination, it doesn't work correctly. In this case, the OnError sets the flag, but when the archive process tries to move the Excel file, it can't because it's locked. I assume this is because OnError interrupted the Data Flow before the Excel file could be closed properly.
Any ideas on how I can avoid this problem? Can I manually get the Data Flow to close the Excel connection somehow?
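One stopgap that sometimes works, assuming the handle is eventually released: put a Script Task between the Data Flow and the archive step that polls until the file can be opened exclusively, and only then let the File System Task move it. A sketch for SSIS 2005 follows (the variable name User::ExcelPath is hypothetical). Be aware that if the errored Data Flow keeps the connection open for the life of the package, no amount of waiting inside the same package will help, and moving the archive step to a separate package or job step may be necessary:

' Goes inside ScriptMain of an SSIS 2005 Script Task
Imports System.IO
Imports System.Threading

Public Sub Main()
    Dim path As String = CStr(Dts.Variables("User::ExcelPath").Value)
    Dim unlocked As Boolean = False
    For i As Integer = 1 To 60                ' give up after roughly 60 seconds
        Try
            ' Opening with FileShare.None succeeds only when no other handles remain
            Using fs As FileStream = File.Open(path, FileMode.Open, FileAccess.ReadWrite, FileShare.None)
            End Using
            unlocked = True
            Exit For
        Catch ex As IOException
            Thread.Sleep(1000)
        End Try
    Next
    If unlocked Then
        Dts.TaskResult = Dts.Results.Success
    Else
        Dts.TaskResult = Dts.Results.Failure
    End If
End Sub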
Has anybody seen this? Is there a configuration setting or something that needs to be changed? I'm working with the development version of SQL 2005 that comes with Visual Studio.
Problem: I can't seem to get Integration Services to work with Excel 2007 files. I've tried both BIDS and the Import/Export Wizard.
I've got the connection set to use ACE. The data source is: c:\BranchList - 20080331.xlsx. The Extended Properties property is: Excel 12.0;HDR=Yes. Test Connection succeeds. Preview succeeds. If I save the Excel file as tab-delimited text, BULK INSERT succeeds.
But when I try to run the package against Excel I get this error, over and over. I have not yet been able to get SQL 2005 to import Excel 2007 data:
TITLE: SQL Server Import and Export Wizard ------------------------------ Could not connect source component. Error 0xc0202009: Source - 'Branches $' [1]: An OLE DB error has occurred. Error code: 0x80004005.
Error 0xc02020e8: Source - 'Branches $' [1]: Opening a rowset for "`'Branches $'`" failed. Check that the object exists in the database.
------------------------------ ADDITIONAL INFORMATION: Exception from HRESULT: 0xC02020E8 (Microsoft.SqlServer.DTSPipelineWrap)
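For reference, a connection setup that has worked for reading .xlsx files from SSIS 2005 looks like the sketch below (the file path is the one from the post; IMEX=1 is optional but often helps with mixed-type columns). Two common causes of the 0x80004005 error at execution time, even when Test Connection and Preview succeed, are the ACE provider ("2007 Office System Driver: Data Connectivity Components") not being installed on the machine that actually runs the package, and the package running as 64-bit, since ACE was 32-bit only at the time:

Provider=Microsoft.ACE.OLEDB.12.0;
Data Source=c:\BranchList - 20080331.xlsx;
Extended Properties="Excel 12.0;HDR=YES;IMEX=1"

If it is the 64-bit issue, running the package through the 32-bit DTExec (under Program Files (x86)) or setting Run64BitRuntime to False in the BIDS project properties is the usual workaround.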
I have a package for which I have set ProtectionLevel to DontSaveSensitive.
I connect to a SQL Database which supports mixed authentication.
To connect to this server in a flexible way, I have enabled SSIS package configuration.
I have edited my dtsconfig file with notepad and put the right password there (I am able to use management studio to connect to that database using same password).
Now when I come back to BIDS and double-click my connection manager, it doesn't read the configuration from the config file (it just has a blank password). So it seems that BIDS doesn't read the config file.
What can I do to make BIDS read the config file?
Also, is there a way to encrypt the password (without keys) in the configuration file?
Well, I'm to the point of giving up. If there was a 1-800 # that charged $400 /minute for support at this point I would pay it! I have been developing an ASP.net 2.0 web application that basically has login security. I have been struggling to get this thing to work. On my local machine it works fine, but when I copy the code (including the ASPNETDB.mdf that is in the APP_Data directory) over to a web server, I get to the login screen but when I attempt to login I get an error. I've tried using the SSEUTIL.exe and was unsuccessful. At this point I don't even know where to start. Can someone please help me??? The error is listed below: Server Error in '/' Application.
Unable to open the physical file "C:\Inetpub\WIWeb\App_Data\aspnetdb.mdf". Operating system error 32: "32(The process cannot access the file because it is being used by another process.)".
Unable to open the physical file "C:\Inetpub\WIWeb\App_Data\aspnetdb_log.ldf". Operating system error 32: "32(The process cannot access the file because it is being used by another process.)".
Cannot open user default database. Login failed.
Login failed for user 'NT AUTHORITY\NETWORK SERVICE'.
File activation failure. The physical file name "C:\Inetpub\WIWeb\App_Data\aspnetdb_log.ldf" may be incorrect.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.SqlClient.SqlException: the same "Unable to open the physical file..." messages as above.
Source Error:
An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below. Stack Trace:
[SqlException (0x80131904): Unable to open the physical file "C:\Inetpub\WIWeb\App_Data\aspnetdb.mdf". Operating system error 32: "32(The process cannot access the file because it is being used by another process.)". Unable to open the physical file "C:\Inetpub\WIWeb\App_Data\aspnetdb_log.ldf". Operating system error 32: "32(The process cannot access the file because it is being used by another process.)". Cannot open user default database. Login failed. Login failed for user 'NT AUTHORITY\NETWORK SERVICE'. File activation failure. The physical file name "C:\Inetpub\WIWeb\App_Data\aspnetdb_log.ldf" may be incorrect.]
   System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection) +684835
   System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj) +207
   System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj) +1751
   System.Data.SqlClient.SqlInternalConnectionTds.CompleteLogin(Boolean enlistOK) +32
   System.Data.SqlClient.SqlInternalConnectionTds.OpenLoginEnlist(SqlConnection owningObject, SqlConnectionString connectionOptions, String newPassword, Boolean redirectedUserInstance) +601
   System.Data.SqlClient.SqlInternalConnectionTds..ctor(SqlConnectionString connectionOptions, Object providerInfo, String newPassword, SqlConnection owningObject, Boolean redirectedUserInstance) +159
   System.Data.SqlClient.SqlConnectionFactory.CreateConnection(DbConnectionOptions options, Object poolGroupProviderInfo, DbConnectionPool pool, DbConnection owningConnection) +346
   System.Data.ProviderBase.DbConnectionFactory.CreatePooledConnection(DbConnection owningConnection, DbConnectionPool pool, DbConnectionOptions options) +28
   System.Data.ProviderBase.DbConnectionPool.CreateObject(DbConnection owningObject) +445
   System.Data.ProviderBase.DbConnectionPool.UserCreateRequest(DbConnection owningObject) +66
   System.Data.ProviderBase.DbConnectionPool.GetConnection(DbConnection owningObject) +304
   System.Data.ProviderBase.DbConnectionFactory.GetConnection(DbConnection owningConnection) +85
   System.Data.ProviderBase.DbConnectionClosed.OpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory) +105
   System.Data.SqlClient.SqlConnection.Open() +111
   System.Web.DataAccess.SqlConnectionHolder.Open(HttpContext context, Boolean revertImpersonate) +126
   System.Web.DataAccess.SqlConnectionHelper.GetConnection(String connectionString, Boolean revertImpersonation) +239
   System.Web.Security.SqlMembershipProvider.GetPasswordWithFormat(String username, Int32& status, String& password, Int32& passwordFormat, String& passwordSalt, Int32& failedPasswordAttemptCount, Int32& failedPasswordAnswerAttemptCount, Boolean& isApproved) +815
   System.Web.Security.SqlMembershipProvider.CheckPassword(String username, String password, Boolean updateLastLoginActivityDate, Boolean failIfNotApproved, String& salt, Int32& passwordFormat) +80
   System.Web.Security.SqlMembershipProvider.CheckPassword(String username, String password, Boolean updateLastLoginActivityDate, Boolean failIfNotApproved) +42
   System.Web.Security.SqlMembershipProvider.ValidateUser(String username, String password) +78
   System.Web.UI.WebControls.Login.OnAuthenticate(AuthenticateEventArgs e) +161
   System.Web.UI.WebControls.Login.AttemptLogin() +94
   System.Web.UI.WebControls.Login.OnBubbleEvent(Object source, EventArgs e) +101
   System.Web.UI.Control.RaiseBubbleEvent(Object source, EventArgs args) +35
   System.Web.UI.WebControls.Button.OnCommand(CommandEventArgs e) +115
   System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument) +134
   System.Web.UI.WebControls.Button.System.Web.UI.IPostBackEventHandler.RaisePostBackEvent(String eventArgument) +7
   System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument) +11
   System.Web.UI.Page.RaisePostBackEvent(NameValueCollection postData) +33
   System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +5670
Version Information: Microsoft .NET Framework Version:2.0.50215.44; ASP.NET Version:2.0.50215.44
I am writing a package to process Perfmon logs. The issue I have come across is that the Perfmon process holds onto the log file, and SSIS fails because it wants exclusive read access. Is there any way of getting SSIS to not take an exclusive read lock on the file?
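I don't know of a way to change how the source adapter opens the file, but since Perfmon normally keeps its log open with shared read access, a plain copy usually succeeds even while the collector is writing. A sketch of a Script Task (variable names are hypothetical) that copies the live log so the rest of the package can process the copy:

' Goes inside ScriptMain of an SSIS 2005 Script Task
Imports System.IO

Public Sub Main()
    Dim src As String = CStr(Dts.Variables("User::PerfmonLog").Value)
    Dim dst As String = CStr(Dts.Variables("User::WorkingCopy").Value)
    ' Perfmon holds its log open, but normally with shared read access,
    ' so a plain copy succeeds; the package then reads the copy instead.
    File.Copy(src, dst, True)
    Dts.TaskResult = Dts.Results.Success
End Sub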
I have an Excel Connection Manager and Source to read the contents of an Excel file. For some reason, a couple of numeric fields from the Excel worksheet are brought over as NULLs even though they have values of 300 and 150. I am not sure why this is happening. I looked into the format of the fields and they are set to General in Excel; I tried setting them to Numeric and that did not help.
All the other content from the Excel file is coming through except for the two numeric fields.
I tried to bring the contents from the Excel source to a text file in CSV format, and for some reason the two numeric fields came out as blank.
Any inputs on getting this addressed will be much appreciated.
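This is usually the Jet provider guessing each column's type from the first several rows: once it decides a column is numeric or text, values of the other kind come back NULL. A workaround that often helps (a sketch; the file path is a placeholder) is adding IMEX=1 to the Extended Properties so mixed-type columns are read as text:

Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\myfile.xls;Extended Properties="Excel 8.0;HDR=YES;IMEX=1"

The number of rows sampled is controlled by the TypeGuessRows value under HKLM\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel.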
Is there any way to send an Excel file from SSIS using the Send Mail task without saving the Excel file locally? I need to automate a process which involves loading the Excel file from the database and sending it to some people.
I'm trying to load an Excel file on a server where Excel is not installed. BIDS is on the server, but when I try to create an Excel Source I am not able to. What is the workaround for this? How can I load an Excel file without Excel installed on the server?
I am using an Excel Source to get the data from an Excel file into a SQL Server 2005 table. A couple of columns are coming in as double-precision float, but some values have characters in them, and those values are coming out as NULL, even though I changed the data type from float to Unicode string. Any inputs on resolving this will be much appreciated.
We have 10 sheets in an Excel file, and the 10th sheet contains error data. How can we load the data from the 9 good sheets into one destination and the error data into another destination?
I have an SSIS package that exports to an Excel file. This works fine. The problem is that it appends the data instead of overwriting the file. Is there any way to overwrite the file like you can with a flat file? I have to email the file every week and don't want to have to clear it out manually. Any help would be appreciated.
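A commonly cited recipe for this (a sketch; the sheet and column names below are hypothetical and should match what your Excel Destination expects) is to run two Execute SQL Tasks against the Excel connection manager before the Data Flow, dropping and recreating the worksheet so every run starts empty. Jet treats the worksheet as a table, so plain DDL works:

-- Execute SQL Task 1 (connection: the Excel connection manager): remove last week's data
DROP TABLE `Sheet1`

-- Execute SQL Task 2: recreate the sheet, empty, with the columns the Data Flow writes
CREATE TABLE `Sheet1` (
    `OrderID` LONG,
    `Customer` LONGTEXT,
    `Amount` CURRENCY
)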
Good day to all. Hope you could help me with my project. I'm creating a DTS package. The source data will be coming from an Excel file going into my SQL table. The DTS package is scheduled to execute daily, but the source data will be coming from a different Excel filename each day. For example, today the DTS will get data from Data092506.xls; tomorrow, the data will be coming from Data092606.xls. How can I do this? The DTS package I've already done has a fixed source data file. Please help. Thank you so much. God bless.
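The usual DTS approach is an ActiveX Script task, placed before the transform, that rewrites the Excel connection's DataSource each day. A sketch, assuming the Excel connection is named "ExcelSource" and the files live in C:\Data (both assumptions), with the file name built in MMDDYY form to match Data092506.xls:

Function Main()
    Dim sFile
    ' Build today's file name, e.g. Data092506.xls
    sFile = "C:\Data\Data" & _
            Right("0" & Month(Date), 2) & _
            Right("0" & Day(Date), 2) & _
            Right(CStr(Year(Date)), 2) & ".xls"
    ' Re-point the Excel connection before the data pump runs
    DTSGlobalVariables.Parent.Connections("ExcelSource").DataSource = sFile
    Main = DTSTaskExecResult_Success
End Function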
I have an SSIS package where I need an Excel destination. In the Excel file, I need a few rows with some text and then data populated below the text. One of the text rows is like this:
Data as of: 08/25/2015
If the report ran today, then "Data as of" should show yesterday's date. But if the user opens that Excel file after a week, the user should still see the same "Data as of: 08/25/2015", not TODAY()-1.
I was planning to handle this on the Excel side with TODAY()-1, but that only works on the day the package runs. If the Excel file is opened a few days later, it might show "Data as of: 08/30/2015", which is not true. It should still say "Data as of: 08/25/2015" on whatever date the Excel file is opened. The SSIS package runs only once.
How do I handle this so that whenever users open the file, they will see "Data as of: 08/25/2015"? This is not a column in Excel; it is more like a description of the data in the sheet.
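Since the requirement is a frozen date, the cleanest route is to never put a formula in the workbook at all: compute yesterday's date in the package at run time and write it into the sheet as literal text. A sketch (the sheet and column names are hypothetical): create a String variable, say User::HeaderSql, with EvaluateAsExpression = True and the expression below, then use it as the source of an Execute SQL Task that runs against the Excel connection manager before the data loads:

"INSERT INTO [Sheet1$] (Header) VALUES ('Data as of: "
    + RIGHT("0" + (DT_WSTR, 2) MONTH(DATEADD("day", -1, GETDATE())), 2) + "/"
    + RIGHT("0" + (DT_WSTR, 2) DAY(DATEADD("day", -1, GETDATE())), 2) + "/"
    + (DT_WSTR, 4) YEAR(DATEADD("day", -1, GETDATE())) + "')"

Because the value lands in the cell as a plain string, it reads the same no matter when the workbook is opened.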
Here is the flow in one of my packages (an ETL job). An Excel file contains monthly revenue details, and I want to import the Excel data into my database staging table, so I've created the package. It works fine...
The problem: when we swap in the new data for the next month and run the package, it fails. It is the same file and same format; we only delete the contents of the file (except the first row of the Excel sheet) and paste in the new data. The new data comes from an Oracle database in the form of an Excel sheet (they copy the data manually and send it to us).
When I open the package in design mode and double-click the Excel source, it says: <column name>'s metadata needs to be synchronized. Do you want to fix this issue automatically with the available external column's metadata?
It is clearly a data type issue; I changed the corresponding data types to match the previous Excel sheet, which is equivalent to the table it copies into.
Now the package runs with validation warnings, such as: External Column "Invoice Amount" needs to be updated. I can see two or three warning messages in the package execution wizard.
OK, I'm ready to accept these warnings, and I want my package running from my server (the packages have been deployed to a centralized server; every time we want to run a package, we have an ASP.NET web page that executes it in an On_Click event).
The package is not running from the server, and I guess it's due to the metadata change in the Excel file.
Please suggest some guidelines to resolve this metadata issue; I want my Excel sheet's metadata not to change when we get new updates to it.
Otherwise, suggest a way I can validate the Excel sheet before running the package, testing whether the data is in the correct format or not: a kind of data profiling activity (see the sketch below).
I know it's somewhat crazy, but I need to maintain the system with a permanent solution instead of facing this metadata mismatch issue every month!
Somewhat lengthy explanation, but I think I've explained my problem clearly; if not, let me know your questions and I'll try my level best.
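For the validation route, one hedged sketch (the column names, variable name, and sheet name here are all hypothetical) is a Script Task placed before the Data Flow that opens the workbook with OLE DB and checks that the columns it expects are present, failing the package with a readable message instead of a metadata error:

' Goes inside ScriptMain of an SSIS 2005 Script Task
Imports System
Imports System.Data
Imports System.Data.OleDb

Public Sub Main()
    Dim expected As String() = {"Invoice Amount", "Customer", "Revenue Month"} ' hypothetical list
    Dim connStr As String = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & _
        CStr(Dts.Variables("User::ExcelFile").Value) & _
        ";Extended Properties=""Excel 8.0;HDR=YES;IMEX=1"""

    Using conn As New OleDbConnection(connStr)
        conn.Open()
        ' Read the column names Jet sees on the sheet
        Dim cols As DataTable = conn.GetOleDbSchemaTable(OleDbSchemaGuid.Columns, _
            New Object() {Nothing, Nothing, "Sheet1$", Nothing})
        For Each name As String In expected
            If cols.Select("COLUMN_NAME = '" & name & "'").Length = 0 Then
                Dts.Events.FireError(0, "ValidateExcel", "Missing column: " & name, "", 0)
                Dts.TaskResult = Dts.Results.Failure
                Return
            End If
        Next
    End Using
    Dts.TaskResult = Dts.Results.Success
End Sub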
I am programmatically deploying an Excel 2007 file to a SQL Server 2005 Reporting Services server. The problem is that if a file with the same name already exists, that file isn't replaced; I would like the opposite to happen. I'm using the following code:
--Executable
set svr=http://w3sdwsqld1/reportserver
set src_fld="\\w3sdwsqld1\deploy\SAD\ECRANS\UPDATES_20061127_190000\Ecrans\AM\Associados\"
set dest_fld="Associados"
set script="\\w3sdwsqld1\deploy\SAD\ECRANS\UPDATES_20061127_190000\Ecrans\AM\Associados\PublishReports.rss"
REM Sample: deploy.bat http://w3sdwsqld1/reportserver "\\w3sdwsqld1\deploy\SAD\ECRANS\UPDATES_20061127_190000\Ecrans\AM\Associados\" "Associados" "\\w3sdwsqld1\deploy\SAD\ECRANS\UPDATES_20061127_190000\Ecrans\AM\Associados\PublishReports.rss"
for /R %src_fld% %%f in (*.xlsx) do rs -i %script% -s %svr% -v ParentFolder=%dest_fld% -v reportP="%%~nf" -v path=%src_fld%
PAUSE
--rss Code
' Script Variables
'
' Variables that are passed on the command line with the -v switch:
'
' (a) parentFolder - corresponds to the folder that the script creates and uses
'                    to contain your published reports
' (b) reportP      - corresponds to the report to publish
' (c) path         - corresponds to the folder containing the .xlsx files

Dim ROOT As String = "/SAD/Ecrans/Ecrans/AM"
Dim definition As [Byte]() = Nothing
Dim warnings As Warning() = Nothing
Dim parentPath As String = ROOT + "/" + parentFolder
Dim filePath As String = path
Dim report As String = reportP

Public Sub Main()
    'Create the parent folder
    Try
        rs.CreateFolder(parentFolder, ROOT, Nothing)
        Console.WriteLine("Parent folder {0} created successfully", parentFolder)
    Catch e As Exception
        Console.WriteLine(e.Message)
    End Try

    'Create shared data source
    'CreateSampleDataSource("Solucao_Integrada", "OLEDB-MD", "Data Source=dwareas1;Initial Catalog=SAD_Solucao_Integrada")

    'Publish the sample reports
    PublishReport(report)
End Sub

Public Sub CreateSampleDataSource(name As String, extension As String, connectionString As String)
    'Define the data source definition.
    Dim definition As New DataSourceDefinition()
    definition.CredentialRetrieval = CredentialRetrievalEnum.Integrated
    definition.ConnectString = connectionString
    definition.Enabled = True
    definition.EnabledSpecified = True
    definition.Extension = extension
    definition.ImpersonateUser = False
    definition.ImpersonateUserSpecified = True
    'Use the default prompt string.
    definition.Prompt = Nothing
    definition.WindowsCredentials = False

    Try
        rs.CreateDataSource(name, ROOT, False, definition, Nothing)
    Catch e As Exception
        Console.WriteLine(e.Message)
    End Try
End Sub

Public Sub PublishReport(ByVal reportName As String)
    Try
        Dim stream As FileStream = File.OpenRead(filePath + reportName + ".xlsx")
        Console.WriteLine(reportName)
        definition = New [Byte](stream.Length) {}
        stream.Read(definition, 0, CInt(stream.Length))
        stream.Close()
    Catch e As IOException
        Console.WriteLine(e.Message)
    End Try

    Try
        'The third argument (Overwrite) must be True for an existing report
        'with the same name to be replaced on the server.
        warnings = rs.CreateReport(reportName, parentPath, True, definition, Nothing)
    Catch e As Exception
        Console.WriteLine(e.Message)
        Console.WriteLine("Failed to publish report")
    End Try
End Sub
--------------------------------------------------------------------------------------------------------------------
I hope someone can help me with this. I started receiving this error message in the past month or so when I open a CSV report and save it as an Excel file in a folder I use on my VPN and in My Documents. It does not show up when I save it to my desktop.
I have Microsoft Office Student and Teacher and Office XP Professional installed on my notebook. I tried to uninstall Office XP and it would not let me; something about a "patch could not be opened......"
I'm importing a multi-tab spreadsheet using the Import Wizard, which I understand uses the same internals as SSIS. The total number of columns in the spreadsheet will be over 500. The Import Wizard defaults everything to varchar(255). I understand there is an XML file I can manipulate to change this, and they are located
I have a DTS setup that copies a couple of databases from one server to another in order to have a hot backup server. When I execute the DTS package from the DTS portion of the Enterprise Manager tree it runs fine. When I schedule it, it fails with the following error:
Much of the data in the Comments field already has carriage returns in it. When I use bcp out with -c, bcp uses the newline character to figure out where a new row starts. However, the data in that column has carriage returns! Hence, when it detects a carriage return in the user data (comments) itself, it pushes the remainder of that text to the second line. Then, when I try to BCP IN, it tries to push it into the ID column!
The text file looks something like this:
ID  COMMENTS
--  ---------
1   This is a very long comment with a line return now.
    User hits return here. This should be part of the previous record but gets detected as a new row
2   Some Text
3   Some Text
So, on the BCP IN, it is trying to put 'This should be part of...' into the ID column, and I am getting an error.
Is there any way I can get BCP to NOT recognize carriage returns in user data as the end of a record?
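Two approaches are commonly used here (a sketch; dbo.CommentsTable is a placeholder name): strip the embedded CR/LF on the way out through a view, or keep the data intact and pick field/row terminators that cannot occur in the text, so bcp stops treating the user's carriage returns as row boundaries:

-- Option 1: strip embedded CR/LF on the way out, then bcp the view with -c
CREATE VIEW dbo.CommentsExport
AS
SELECT ID,
       REPLACE(REPLACE(Comments, CHAR(13), ' '), CHAR(10), ' ') AS Comments
FROM dbo.CommentsTable
GO

-- Option 2: keep the data intact and use terminators that cannot occur in it:
--   bcp mydb.dbo.CommentsTable out comments.dat -c -t "|~|" -r "|~\n" -S myserver -T
--   bcp mydb.dbo.StagingTable  in  comments.dat -c -t "|~|" -r "|~\n" -S myserver -T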
We've had RS running on an internal network for a while now and initially installed it with no SSL certificate. We recently decided that we would like to make some reports available through the web, and that we should secure the data with SSL. So, we registered and set up the subdomain, bought and installed the certificate, closed off all but port 443 on the box in question, and opened it up to the world. We tweaked a few options in the RS config files (after reading around):
RSReportServer.config:
- SecureConnectionLevel changed from 0 to 2 (recommended) - this changed all but the help file links to https://.
- UrlRoot changed from HTTP to HTTPS, and also changed from internal to external name (to match the SSL certificate).
RSWebApplication.config:
- ReportServerUrl changed from HTTP to HTTPS and changed from internal to external name.
After this was changed, along with some IIS tweaks, we were able to get to the Report Manager through the web, and force SSL only. The problem right now is that we are intermittently getting "Underlying connection closed" errors (described here: http://support.microsoft.com/kb/915599). The odd thing is that we first get the Windows login prompt, wait about a minute, then get the error (which is encapsulated by the Report Manager page style). After a quick browser refresh, the Report Manager appears almost instantly, with no second request for a Windows login.
My question: Are there any IIS or RS config settings I can look at, or any log file entries I should look for in order to determine the cause of this problem? My guess at this point is the error has to do with some sort of timeout, but to be honest, I have no clue.
I want to be able to intersect many tables. I am building my query from VB code in ASP.NET based on key fields entered in a search engine box. My query should look like this, which will return the rows that have the values var, var_2, and var_3 in any columns. All three must be in a row for it to be a hit. I can't get this to work in MS SQL; I don't know if it supports this feature.

SELECT * FROM t WHERE column1 LIKE '%var%' OR column2 LIKE '%var%' OR column3 LIKE '%var%'
INTERSECT
SELECT * FROM t WHERE column1 LIKE '%var_2%' OR column2 LIKE '%var_2%' OR column3 LIKE '%var_2%'
INTERSECT
SELECT * FROM t WHERE column1 LIKE '%var_3%' OR column2 LIKE '%var_3%' OR column3 LIKE '%var_3%'

I also googled around and found WHERE EXISTS... but I can't seem to figure out how to do multiple tables:

SELECT * FROM t WHERE EXISTS (SELECT * FROM t WHERE column1 LIKE '%var_3%' OR column2 LIKE '%var_3%' OR column3 LIKE '%var_3%')

I would like to add multiple conditions to the WHERE EXISTS table. Could anyone please tell me what I am doing wrong?
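For what it's worth, SQL Server versions before 2005 do not support INTERSECT at all, but the same result can be had by ANDing the three column-group predicates together in one WHERE clause. A sketch against the table and columns named above (note that '_' is itself a LIKE wildcard, so patterns such as var_2 should be escaped with [_] if the underscore is meant literally):

SELECT *
FROM t
WHERE (column1 LIKE '%var%'     OR column2 LIKE '%var%'     OR column3 LIKE '%var%')
  AND (column1 LIKE '%var[_]2%' OR column2 LIKE '%var[_]2%' OR column3 LIKE '%var[_]2%')
  AND (column1 LIKE '%var[_]3%' OR column2 LIKE '%var[_]3%' OR column3 LIKE '%var[_]3%')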
Taking the advice from one of the postings, I removed the 'truncate log on checkpoint' setting. I have set up a batch job that does a DBCC CHECKTABLE(syslogs) and a DUMP TRANSACTION WITH NO_LOG every 15 minutes. We are a development shop and do not need the transaction log; I would have liked to use 'truncate log on checkpoint'. I found that the transaction log seems to grow indefinitely (viewing DBCC SQLPERF and sp_spaceused) if I do not first issue the CHECKTABLE command. Why is this needed? This is the same problem I had with the 'truncate log on checkpoint' option. Has anyone else encountered a similar problem? We have an application that does single-row inserts, multithreaded, at a rate of 2500 rows a second. Without performing the DBCC CHECKTABLE, the log filled to over 80% in a 45-minute period. Running the stored procedure with the two commands keeps the log under 40%.
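In plain terms, the job runs these two statements every 15 minutes, per the description above (dev_db is a placeholder database name; DUMP TRANSACTION is the old-style syntax this era of SQL Server used):

DBCC CHECKTABLE(syslogs)
DUMP TRANSACTION dev_db WITH NO_LOG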
Trying to get my jobs to send mail when a job fails. Should be easy, but it's giving me a headache.
I've had a whole slew of issues. Outlook is installed, with an Outlook mail profile set up that can send mail in Outlook. I can create a SendMail DTS package and execute it to send mail.
Email works in these scenarios:
1. I create a DTS package in SQL Server with just SendMail using the same profile "ABC", click Execute, and it sends.
2. I can send using Outlook with the same profile "ABC".
3. I can run DTSRun from the command prompt calling the DTS package, and it sends fine.
However, it won't send in these 3 scenarios (where I need it!):
1. I go to Operator and put my profile "ABC" in the Email Name, click Test, and I get this error: "Error 22022: SQLServerAgent Error: The SQLServerAgent mail session is not running; check the mail profile and/or the SQLServerAgent service startup account in the SQLServerAgent Properties." SQL Server Agent is running.
2. If I add the DTS package "EmailTest" to one of my jobs as a step to go to if the 1st step fails, nothing gets sent.
3. In Jobs -> Notification, if I set the E-mail operator to my operator, nothing gets sent.
I set the Mail Profile to the mail profile (in SQL Server Agent's properties). However, when I click Test, I get this error: "Error 0: Cannot load the DLL sqlmap70.dll, or one of the DLLs it references."
I am using SQL Server version 8.00.194 on Windows 2003.
I don't know if anyone can help me but I have two issues with SQL Server 2000 SP4 (version 8.00.2039 - Desktop Engine) running on W2K and W2K3. I'm also running SQLXML 3.0 (msxml2.dll version is 8.30.9530.0).
Is it me or is sp_xml_preparedocument a crippled fat dog that is blind?...not that I have anything against crippled fat dogs that are blind :)
In all the stored procs I have developed, I pass a text var as an input parameter and return an IStream to ADO (using an sqlxml provider) in COM+. All has been very well and fine...until the passed text parameter resembles a data object of any decent size.
The first error I was noticing was a "XML Parsing Error: not enough storage is available to complete this operation". Well, I thought I would debug logically in a step fashion and just prepare the doc first and then do a return and then do a return on the next segment of code to find out where the issue is. I was amazed to find that sp_xml_preparedocument is taking 7 seconds to load a simple 1MByte text input var and around three minutes to load a 7 MByte file.
I believe these long load times are causing issues with transaction timeouts etc., so I thought I would try to solve the speed issue with sp_xml_preparedocument and then see if the "XML Parsing Error" continues.
So, my first question is:
Should sp_xml_preparedocument take 7 seconds to load a 1MByte text variable and nearly three odd minutes to load a 7 MByte file? Surely there is something wrong somewhere?
I'm also running these tests on two machines - one is 2 GHz and the other is 2.4 GHz P4's.
SELECT T1.*
FROM Cust_Table T1
INNER JOIN (
    SELECT Family_Name
    FROM Cust_Table
    WHERE Cust_Name IN ('Billy', 'John')
      AND RowNum < 100
    GROUP BY Family_Name
    HAVING COUNT(*) > 1
) T2 ON T1.Family_Name = T2.Family_Name
WHERE RowNum < 100
(The code above finds all the families that contain either multiple Billys and/or multiple Johns, and displays all the duplicates ordered by Family_Name.)
The problem is that what I want it to do is search through the whole table and find, within each family (Family_ID), who has both a sibling called Billy AND one called John (Cust_Name). Whether they have multiple Johns and multiple Billys I don't mind, as long as they have a minimum of 1 of each. I then want to output only those rows, nothing else. An example of the table is below. I hope this helps. Thanks for your advice.
BEFORE
Family_Name  CUST_Name
Bruce        Billy
Bruce        John
Bruce        Mike
Bruce        Oli
Smith        Billy
Smith        Billy
Harold       John

AFTER
Family_Name  CUST_Name
Bruce        Billy
Bruce        John
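A hedged sketch of a query that should return exactly the AFTER set: group by family, keep only families whose Billy/John rows cover both names, then pull back just the Billy and John rows for those families (re-add the RowNum filter from the original if it is still wanted):

SELECT T1.Family_Name, T1.Cust_Name
FROM Cust_Table T1
INNER JOIN (
    SELECT Family_Name
    FROM Cust_Table
    WHERE Cust_Name IN ('Billy', 'John')
    GROUP BY Family_Name
    HAVING COUNT(DISTINCT Cust_Name) = 2   -- both names present at least once
) T2 ON T1.Family_Name = T2.Family_Name
WHERE T1.Cust_Name IN ('Billy', 'John')
ORDER BY T1.Family_Name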
I'm baffled by this error, but I'm guessing once someone points it out, it will be an "oh, duh!" moment. I have a proc. I want to get the results of the proc into a temp table. I've tried both SELECT INTO and INSERT INTO. Both give me the error "Invalid object name '#HACK'", which is the temp table name I've used.
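In case it helps to see the shape of the thing: SELECT ... INTO cannot take a proc as its source at all, and INSERT ... EXEC requires the temp table to exist first, with columns matching the proc's result set, in the same session as the INSERT. A minimal sketch (the proc name and columns are hypothetical):

-- The temp table must be created before the INSERT ... EXEC,
-- with columns that match the proc's result set exactly.
CREATE TABLE #HACK (
    ID   INT,
    Name VARCHAR(50)
)

INSERT INTO #HACK (ID, Name)
EXEC dbo.MyProc        -- hypothetical procedure returning (ID, Name)

SELECT * FROM #HACK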
I'm experiencing a problem using transactions within a package, and would be grateful if anyone can help out. A search on the forum has turned up a number of similar posts but I don't think any of them deal specifically with my problem (and I didn't find an existing solution), but apologies if this is considered a duplicate!
I have two Sequence Containers in a package, each of which contains a number of Data Flow Tasks responsible for copying data from one SS2k5 database to another (on the same server). There is a precedence constraint between the two Sequence Containers (within each container the Data Flow Tasks run in parallel, although I don't think this is relevant).
I need each Sequence Container to execute transactionally i.e. within each Container the Data Flows must either all succeed or all fail. However I don't want the package as a whole to execute transactionally i.e. I don't want the two Sequence Containers within a single transaction, but rather to each start a transaction of their own. So, accordingly I have set the TransactionOption property of the package to Supported, and the TransactionOption property of each of the Sequence Containers set to Required. Each Data Flow Task has TransactionOption of Supported.
All is well when I run the package by itself, but unfortunately within my ETL this package is invoked as the child of another package, and when this happens the first sequence container fails with error message:
"The SSIS Runtime has failed to enlist the OLE DB connection in a distributed transaction with error 0x8004D00A "Unable to enlist in the transaction.""
Some quick Googling turned up the following MS Support article that seems to pretty much describe my situation and acknowledge it as a SSIS bug:
http://support.microsoft.com/?kbid=914375
However there is a bit of a problem: the article claims this problem is fixed in Service Pack 1 ... but I'm already running Service Pack 1 (Build 2047). Additionally, the alternative workaround given doesn't help me (unless I'm misunderstanding) since I don't want the two Sequence Containers running in the same transaction and so can't set their TransactionOption to Supported and rely on them joining an existing transaction.
I am aware of the post-SP1 SS hotfix but had some problems with my AS installation when I upgraded to this a while back, so I'm keen to stay with my fresh SP1 unless someone can assure me that the hotfix addresses this issue (and I can't see any mention of an issue such as this in the hotfix notes).
I guess an alternative workaround would be to use native SQL Server transactions instead of MSDTC by including explicit T-SQL transaction commands within my Sequence Containers (and RetainSameConnection on the Connection Manager), but I'm reluctant to modify my packages to do this unless necessary, since I believe the intrinsic transaction support should be able to cover what I'm trying to achieve?
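For the record, the native-transaction alternative mentioned above is typically wired up as below (a sketch, and only viable because all the Data Flows target the same server): set RetainSameConnection=True on the shared OLE DB Connection Manager, then bracket each Sequence Container's contents with Execute SQL Tasks on that connection:

-- Execute SQL Task at the start of each Sequence Container
BEGIN TRANSACTION

-- ... the container's Data Flow Tasks all run on the same retained connection ...

-- Execute SQL Task on the container's success path
COMMIT TRANSACTION

-- Execute SQL Task wired to the container's failure/OnError path
ROLLBACK TRANSACTION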
Has anybody else experienced this problem even on SP1, or am I perhaps misunderstanding how to use transactions in SSIS (this is the first time I've used them)?