I am having trouble understanding how the SSIS data pump decides when the final commit for the data insertion starts and ends. On some tasks the rows are inserted one at a time every few milliseconds (shown by a default getdate() in a datetime column). In others the final commit occurs, as I would expect, at the end of the data pump task.
There are times I want the data pump task to commit all records that are successful, row by row, and there are times I want an all-or-none situation. Can somebody explain why this behaviour occurs and how I can control which commit option the data pump tasks use?
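To make the two desired behaviors concrete, here is a rough T-SQL sketch of what each commit style amounts to (table and column names are made up):

    -- All-or-none: one transaction around the whole load; a failure rolls back every row
    BEGIN TRANSACTION;
    INSERT INTO target_table (col1) SELECT col1 FROM staging_table;
    COMMIT TRANSACTION;

    -- Row-by-row: each statement auto-commits on its own,
    -- so rows that succeeded stay even if a later row fails
    INSERT INTO target_table (col1) VALUES ('row 1');
    INSERT INTO target_table (col1) VALUES ('row 2');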
I'm currently creating an SSIS package that takes data from 3 different databases: a SQL Server DB, a FoxPro DB, and an Oracle DB. The data is pulled, cleansed, and put into a single SQL 2005 table. The data is then pulled from this table every 15 minutes, formatted to a given specification, and uploaded to an FTP site. This part is done. My question is this:
This package needs to run around the clock, non-stop. How can the package be set up to do this? It needs to pull data from the 3 DBs and put it in the common table, wait 15 minutes, and do it again. Wait 15 more minutes and do it again. And so forth. A problem I'm having is that I don't see a way to set up an SSIS package so that it runs around the clock.
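One approach worth sketching: let SQL Server Agent own the around-the-clock part by scheduling the package's job every 15 minutes, rather than keeping the package itself running forever. A hedged example (the job name is an assumption):

    EXEC msdb.dbo.sp_add_jobschedule
        @job_name = N'Load common table',  -- assumed job name
        @name = N'Every 15 minutes',
        @freq_type = 4,                    -- daily
        @freq_interval = 1,
        @freq_subday_type = 4,             -- repeat in units of minutes
        @freq_subday_interval = 15,
        @active_start_time = 000000;       -- from midnight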
On the same premise, I have another issue. When I try to take data from the common table and there is nothing there, it causes an error. Is there some way to run a test like this (a runnable T-SQL version of the pseudocode, keeping _table_ as a placeholder)?

IF NOT EXISTS (SELECT 1 FROM _table_ WHERE is_sent = 0)
    WAITFOR DELAY '00:15:00'   -- no results: wait 15 minutes, test again
ELSE
    BEGIN
        -- write flat file, then wait 15 minutes
        WAITFOR DELAY '00:15:00'
    END
This has to be done in the Control Flow scope, so I can't use a conditional split. This is a pretty big deal as this needs to run around the clock. Thank you in advance for your assistance.
My client is using SQL Server 7.0 to store realtime data like heat, temp, pressure etc., inserted every second. He wants me to provide a solution to transfer the summarised data to an Oracle server on a regular basis, say once every 5 minutes.
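For the summarising step itself, a minimal T-SQL sketch of a 5-minute rollup (the table and column names here are assumptions):

    SELECT DATEADD(minute, (DATEDIFF(minute, 0, reading_time) / 5) * 5, 0) AS interval_start,
           AVG(heat) AS avg_heat,
           AVG(temp) AS avg_temp,
           AVG(pressure) AS avg_pressure
    FROM sensor_readings
    GROUP BY DATEADD(minute, (DATEDIFF(minute, 0, reading_time) / 5) * 5, 0);

The transfer to Oracle would then be a separate scheduled step (DTS on 7.0, for example) that ships each interval's rows.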
I have to populate [CreateDate] in the data pump from my Flat File Source into my OLE DB Destination SQL Server table. Should I do it with a variable within the SSIS package, or with a Derived Column task within the data flow between the Flat File Source and OLE DB Destination?
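If it helps to make the Derived Column option concrete, it amounts to a single expression in the Derived Column transform (a sketch; CreateDate is the target column, GETDATE() is the standard SSIS expression function):

    Derived Column Name: CreateDate
    Expression:          GETDATE()

One difference worth noting: a package variable is evaluated once when the package starts, while GETDATE() in a Derived Column is evaluated as the rows flow through, so the two can stamp slightly different times.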
I have a largish DTS package built generically from VB. It uses a combination of DTSExecuteSQLTask and DTSDataPumpTask (using SQL statements for the source). 18 tasks are failing (1 and 17 respectively of the above types).
When I try to execute the tasks individually, I get messages like "Column name xxxx was not found". The column does exist in the table specified in the SQL statements, and furthermore the SQL statements execute OK in Query Analyzer.
If I select Properties, Transformations for a task, I am presented with the Verifying Transformations dialog (i.e. indicating there are errors). If I select the third option (Remove all transformations and redo auto-mapping) and save my changes, the task then executes okay.
I did not see a forum for the SQL Server 2000 DTS.
I have a flat file feeding a table via a data pump. The table is only used by this process. It will run for about 30 minutes and then fail. The message in the history does not give any detail on why it is failing. Below is the message I get; if I rerun the job, it works fine. Can anyone help me, please?
Date: 07/23/2007 6:00:02 AM
Log: Job History (Daily: Load EOL from MVS1 (First Run))
Step ID: 1
Server: PIT-CS-M608
Job Name: Daily: Load EOL from MVS1 (First Run)
Step Name: Daily: Load tblCaseMasterSched
Duration: 00:28:05
Sql Severity: 0
Sql Message ID: 0
Operator Emailed:
Operator Net sent:
Operator Paged:
Retries Attempted: 0

Message
Executed as user: PIT-CS-M608\SYSTEM. ...rt: DTSStep_DTSActiveScriptTask_1
DTSRun OnFinish: DTSStep_DTSActiveScriptTask_1
DTSRun OnStart: DTSStep_DTSExecuteSQLTask_1
DTSRun OnFinish: DTSStep_DTSExecuteSQLTask_1
DTSRun OnStart: DTSStep_DTSDataPumpTask_1
DTSRun OnProgress: DTSStep_DTSDataPumpTask_1; 1000 Rows have been transformed or copied.; PercentComplete = 0; ProgressCount = 1000
DTSRun OnProgress: DTSStep_DTSDataPumpTask_1; 2000 Rows have been transformed or copied.; PercentComplete = 0; ProgressCount = 2000
DTSRun OnProgress: DTSStep_DTSDataPumpTask_1; 3000 Rows have been transformed or copied.; PercentComplete = 0; ProgressCount = 3000
DTSRun OnProgress: DTSStep_DTSDataPumpTask_1; 4000 Rows have been transformed or copied.; PercentComplete = 0; ProgressCount = 4000
DTSRun OnProgress: DTSStep_DTSDataPumpTask_1; 5000 Rows have been transformed or copied.; PercentComplete = 0; ProgressCount = 5000
DTSRun OnProgress: DTSStep_DTSDataP... Process Exit Code 1. The step failed.
I have a vbscript to read all files from a directory and, if the file is valid, I would like my DTS to process it. I tried using the vbscript as an ActiveX workflow script in the DTS, but it does not execute the data pump until it has completed looping through all the files, so only the last file read is sucked into the database (utilizing a global variable as the filename). Is there a way to execute the data pump task from within the ActiveX script? I can't seem to find any documentation about executing a DTS task. Basically, the workflow I want is:

1) Read files from directory (the number and names may change each time). (done with vbs)
2) For each file, send it through the transformation into the database.
3) When the information is in the database, append a date to the file and move it to the archive folder. (done with vbs)

If I am going about this the wrong way and you see something that is not obvious to me, please let me know. Thanks in advance!
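One workaround sketch: instead of looping inside the package, drive the package from outside, once per file, passing the file name in as a string global variable via DTSRun. Here it is wrapped in xp_cmdshell for illustration; the server name, package name, variable name, and path are all made up:

    EXEC master..xp_cmdshell
        'DTSRun /S MYSERVER /E /N LoadOneFile /A "FileName":"8"="C:\inbox\file1.txt"';

    -- /E = trusted connection, /N = package name,
    -- /A name:typeid=value passes a global variable (typeid 8 = string)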
Say I do File -> New -> Project and make a new project called RedApple under C:\. For simplicity I chose the root, but I would normally designate a location for all my SSIS projects. I also check the box "Create directory for solution".
I see that this creates C:\RedApple.
Under this, I have another RedApple directory along with the .sln and .suo files. In this 2nd RedApple directory, I have .database, .dtproj, and .dtproj.user files.
I deleted the default package1, so I don't have any .dtsx files yet.
As far as the above files are concerned (.sln, .suo, .database, .dtproj, .dtproj.user), I understand these are some kind of system or metadata files and I shouldn't be messing around with them. And that's as far as I need to concern myself with these files.
As I add packages to my project, I see that it is creating .dtsx files under the 2nd RedApple directory. So, I'm thinking these are the package files, one for each package I see in my solution explorer, correct?
MY QUESTION: Now, I am looking at another SSIS directory tree, for a project that I created before I started to pay attention to all this. In this other directory tree, I see that there is also a bin directory, which I do not yet have in C:\RedApple\RedApple. In this bin directory I see more .dtsx files: the same ones as I see in the directory above it, but also other .dtsx files that have the names of other packages that I once created and deleted during the development of this particular project. What are these .dtsx files?
I built a package in SSIS with the import/export utility. It created a Package.dtsx and Package1.dtsx. Both of these files seem to be XML files. I want to understand how these files work. For example, in the package I built I had about 80 tables exporting and importing data. For some of them I want to allow identity insert and delete the rows first; for others I want to append the data. How can I find the code or settings that control this? Or where can I find the options in the GUI to change these settings? When I search the code I can't even find some of the tables that are being transferred.
I'm able to connect to the Oracle database to insert the data into multiple tables using an OLEDB connection via the Oracle Provider for OLEDB. However, I wish to create a transaction so that I'm able to roll back all the data in the case where the insertion fails in one of the tables. May I know where I should start?
I have a stored proc that is returning the results I need for output to a .txt file.
Is there a way in SSIS to commit 50K (or whatever number) row batches at a time or should I just handle this in the stored proc?
SELECT * INTO #TempTable FROM SomeTable;

-- throttle: commit/emit batches of 50K rows at a time
WHILE EXISTS (SELECT 1 FROM #TempTable)
BEGIN
    BEGIN TRANSACTION;
    DELETE TOP (50000) FROM #TempTable
    OUTPUT deleted.*;            -- each batch comes back as a result set
    COMMIT TRANSACTION;
END

DROP TABLE #TempTable;
But if I'm doing this in SSIS, I can't drop the #temp table, otherwise I have nothing to output, right?
I've created a package that runs fine from BIDS when logged in with my domain account. I have created a SQL Agent Proxy on the server with that same account. In the Job Step on the server, I edit the connection strings so that the username and password are there for both my source Access connection and the destination SQL Server. Here is the connection string I create for MS Access:
Data Source=\\10.210.226.202\OTM Reports for Symmetrics\CDRD001.MDB;User ID=admin;Password=;Provider=Microsoft.Jet.OLEDB.4.0;
Here is the error:
Executed as user: DOMAIN\MRUSER. ...ethod call to the connection manager "MSAccessDB" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed. End Error
Error: 2008-01-30 09:49:19.66 Code: 0xC0047017 Source: Cost DTS.Pipeline Description: component "Cost" (1) failed validation and returned error code 0xC020801C. End Error
Error: 2008-01-30 09:49:19.66 Code: 0xC004700C Source: Cost DTS.Pipeline
I have tried various settings in the package for "ProtectionLevel", such as "DontSaveSensitive" and "EncryptSensitiveWithUserKey". I would think that, using my account with the proxy, the last option would work when running it on the server, since it is essentially the same user running the package, but I'm new to playing with the proxy.
I tried using package configurations but got an error there too; I think it couldn't access the file, even though it was on an accessible share--accessible to my account.
How do I make use of BEGIN TRANSACTION and COMMIT TRANSACTION in SSIS?
As I am not able to commit changes made by certain update commands, I want to explicitly write BEGIN and COMMIT statements. But when I use BEGIN and COMMIT in an OLE DB Command stage, it throws the following error:
HRESULT: 0x80004005
Description: "Syntax error or access violation".
It's definitely not a syntax error, as I executed it in SQL Server. Also, when I use it in an Execute SQL Task outside the data flow container, it doesn't throw any error, but that task still doesn't serve my purpose of saving/committing the update changes in the database.
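For what it's worth, one commonly described pattern (assuming it fits this package) is to keep the explicit transaction in Execute SQL Tasks that share a single connection: set the connection manager's RetainSameConnection property to True so every task uses the same session, then:

    -- Execute SQL Task placed before the data flow:
    BEGIN TRANSACTION;

    -- (the data flow / OLE DB Command work happens here on the same connection)

    -- Execute SQL Task placed after the data flow:
    COMMIT TRANSACTION;

Without RetainSameConnection, each task can get its own pooled connection, so the BEGIN and the COMMIT land in different sessions and nothing is actually held open.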
The following SQL is lifted from one of the Reporting Services / AdventureWorks2000 sample reports. I'm a little baffled about how the inner joins are working, specifically the INNER JOIN Locale and INNER JOIN ProductModel. I'm used to seeing INNER JOIN SomeTable ON Something = Something, but how these joins work is lost on me. Can someone give a quick overview (or point me to a reference) so I can better understand?
Thanks!
SELECT ProductSubCategory.Name AS ProdSubCat,
       ProductModel.Name AS ProdModel,
       ProductCategory.Name AS ProdCat,
       ProductDescription.Description,
       ProductPhoto.LargePhoto,
       Product.Name AS ProdName,
       Product.ProductNumber,
       Product.Color,
       Product.Size,
       Product.Weight,
       Product.DealerPrice,
       Product.Style,
       Product.Class,
       Product.ListPrice
FROM ProductSubCategory
INNER JOIN Locale
    INNER JOIN ProductDescriptionXLocale
        ON Locale.LocaleID = ProductDescriptionXLocale.LocaleID
    INNER JOIN ProductDescription
        ON ProductDescriptionXLocale.ProductDescriptionID = ProductDescription.ProductDescriptionID
    INNER JOIN ProductModel
        INNER JOIN Product
            ON ProductModel.ProductModelID = Product.ProductModelID
        INNER JOIN ProductModelXProductDescriptionXLocale
            ON ProductModel.ProductModelID = ProductModelXProductDescriptionXLocale.ProductModelID
        ON ProductDescriptionXLocale.LocaleID = ProductModelXProductDescriptionXLocale.LocaleID
        AND ProductDescriptionXLocale.ProductDescriptionID = ProductModelXProductDescriptionXLocale.ProductDescriptionID
    ON ProductSubCategory.ProductSubCategoryID = Product.ProductSubCategoryID
INNER JOIN ProductCategory
    ON ProductSubCategory.ProductCategoryID = ProductCategory.ProductCategoryID
LEFT OUTER JOIN ProductPhoto
    ON Product.ProductPhotoID = ProductPhoto.ProductPhotoID
WHERE (Locale.LocaleID = 'EN')
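For what it's worth, this is SQL's nested join syntax: an INNER JOIN can be followed by further joins before its own ON clause, in which case the innermost pair joins first and the ON clauses match up inside-out. A minimal illustration with made-up tables:

    SELECT *
    FROM A
    INNER JOIN B
        INNER JOIN C
            ON B.BID = C.BID    -- innermost pair: B joins C first
        ON A.AID = B.AID        -- then the (B JOIN C) result joins to A

This is logically equivalent to FROM A INNER JOIN B ON A.AID = B.AID INNER JOIN C ON B.BID = C.BID; graphical query designers often emit the nested form.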
I'm trying to get the following poll working: http://www.codeproject.com/useritems/Site_Poll_Control.asp

It looks like it's exactly what I was looking for, but it doesn't come with much in the way of instructions. I have the following function:

Public Function CastVote(ByVal PollId As Integer, ByVal Answer As Integer, ByVal MemberId As Integer) As Boolean
    Dim cmd As New SqlCommand("InsertPollResult", New SqlConnection(Connection))
    With cmd.Parameters
        .AddWithValue("@PollId", PollId)
        .AddWithValue("@PollChoice", Answer)
        .AddWithValue("@MemberId", MemberId)
    End With
    Return (SqlExecuteInsertSp(cmd) > 0)
End Function

This calls SqlExecuteInsertSp(cmd), which is:

Public Function SqlExecuteInsertSp(ByVal cmd As SqlCommand) As Integer
    Dim i As Integer
    cmd.CommandType = CommandType.StoredProcedure
    Try
        cmd.Connection.Open()
        i = cmd.ExecuteNonQuery()
    Catch ex As Exception
        ErrorMessage = "ProDBObject.SqlExecuteInsertSp(SqlCommand): " & ex.Message.ToString
    Finally
        cmd.Connection.Close()
    End Try
    Return i
End Function

I can't figure out what this is doing. The best I can figure is it determines if we have a good connection. Is this right? In my code CastVote keeps returning false, and I don't know why. The answer seems to be in the i = cmd.ExecuteNonQuery() line, but I can't figure out what that line is supposed to be doing.

Diane
Hi guys, I have written quite a big stored procedure which creates a (multi-session) temporary table and updates it. All the statements are encapsulated in a single transaction which is explicitly declared in the code. What happens is that the server puts a schema modification lock (Sch-M) on that table, preventing any type of operation on it (including a simple SELECT).
Now, I want to be able to read that table from within another transaction. Why is it that I cannot use the NOLOCK table hint in the SELECT statement?
Here is some code which reproduces my problem.
Query A:
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
BEGIN TRAN TR_DEMO;
CREATE TABLE ##TBL1(
    Oidx int not null primary key identity(1,1),
    Name nvarchar(30) not null,
    Type char(1) not null
);
My question is: in what situations will @@ERROR be set?
I'd like to run some logic when an error occurs in a particular statement.
The docs say the @@ERROR value will be set if an error occurs in a statement, that control will move to the next statement without exiting(???) the procedure, and that the @@ERROR value can be used in that statement.
But when I execute the procedure below, execution is terminated when the error occurs, without moving to the next statement. Please help me understand SQL Server's @@ERROR and the situations in which it will be set.
-----------------------------------------------------------------------
CREATE PROCEDURE VALUE_ERROR_TEST
AS
BEGIN
    DECLARE @adv_error INT
    DECLARE @errno INT
    DECLARE @var int

    SELECT @var = '101 a'
    SELECT @errno = @@ERROR
    print @errno
END
go
-----------------------------------------------------------------------
The procedure compiles successfully. When executed, it says:
Server: Msg 245, Level 16, State 1, Procedure VALUE_ERROR_TEST, Line 10 Syntax error converting the varchar value '101 a' to a column of data type int.
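For comparison, here is a minimal sketch where the error terminates only the statement, not the batch, so @@ERROR can be read on the next line:

    DECLARE @errno int
    SELECT 1/0                -- divide-by-zero: the statement fails, the batch continues
    SELECT @errno = @@ERROR
    PRINT @errno              -- prints 8134

The conversion error in the procedure above (Msg 245) is a batch-aborting error, which is why execution never reaches the SELECT @errno line.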
Please consider the following example.

CREATE TABLE test (
    an_ndx int NOT NULL primary key identity(1,1),
    a_var varchar(48) NOT NULL,
    last_edit_timestamp datetime NOT NULL default CURRENT_TIMESTAMP
);
CREATE TABLE test_history (
    an_ndx int NOT NULL,
    a_var varchar(48) NOT NULL,
    last_edit_timestamp datetime NOT NULL,
    current_edit_timestamp datetime NOT NULL default CURRENT_TIMESTAMP
);
GO
CREATE TRIGGER update_history ON test FOR UPDATE
AS
BEGIN
    INSERT INTO test_history (an_ndx, a_var, last_edit_timestamp)
    SELECT * FROM deleted;
    UPDATE inserted SET last_edit_timestamp = CURRENT_TIMESTAMP;
END;

The question is, does this do what I think it should do? What I intended: an insert into test results in default values for an_ndx and last_edit_timestamp. An update to test results in the original row(s) being copied to test_history, with a default value for current_edit_timestamp, and the value of last_edit_timestamp being updated to the current timestamp. Each record in test_history should have the valid time interval (last_edit_timestamp to current_edit_timestamp) for each value a_var has had for the "object" or "record" identified by an_ndx. If not, what change(s) are needed to make it do what I want it to do? Will the trigger I defined above behave properly (i.e. as I intended) if more than one record needs to be updated?

Thanks
Ted
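One thing worth noting, as a hedged sketch of a fix: SQL Server will not let a trigger UPDATE the inserted pseudo-table (it is read-only), so the timestamp update needs to target the base table, joined to inserted:

    UPDATE test
    SET last_edit_timestamp = CURRENT_TIMESTAMP
    FROM test
    INNER JOIN inserted ON test.an_ndx = inserted.an_ndx;

Written this way it also handles multi-row updates, since the join touches every updated an_ndx.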
I am using SQL Server Express and Visual Studio 2005. I am new to batches and am trying to understand how they work. I am trying to write a query that creates an assembly and the functions that are contained in it. Here is my query:
USE ProductsDRM
GO

IF NOT EXISTS (SELECT 'True' FROM sys.assemblies WHERE name = 'ComputedColumnFunctions')
BEGIN
CREATE ASSEMBLY ComputedColumnFunctions
FROM 'C:\Websites\AssemblyTest\StoredFunctions\StoredFunctions\bin\StoredFunctions.dll'
GO

CREATE FUNCTION fImageFileName
(
    @ProductID int,
    @ImageSizeCode nvarchar(4000)
)
RETURNS nvarchar(4000)
AS EXTERNAL NAME [ComputedColumnFunctions].[StoredFunctions.UserDefinedFunctions].ImageFileName
GO

CREATE FUNCTION fTestInt
(
    @ProductID int
)
RETURNS int
AS EXTERNAL NAME [ComputedColumnFunctions].[StoredFunctions.UserDefinedFunctions].TestInt
GO

CREATE FUNCTION fTestInt2
(
    @TestInt int
)
RETURNS int
AS EXTERNAL NAME [ComputedColumnFunctions].[StoredFunctions.UserDefinedFunctions].TestInt2
END
ELSE
BEGIN
PRINT 'The assembly named "ComputedColumnFunctions" already exists. No new assembly was created.'
END

GO
I read in a book about SQL Server 2005 about including a test for whether the object (such as assembly in this case) exists before trying to create it. If I only include the CREATE ASSEMBLY statement and the FROM line below it and delete the next GO down through the last CREATE FUNCTION (just before the END ELSE), it works fine. If I leave it as is, I get a runtime error on the GO line just after the CREATE ASSEMBLY statement. What am I doing wrong?
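A hedged sketch of the usual workaround: GO is a batch separator understood by the query tools, not a T-SQL statement, so it cannot appear inside IF ... BEGIN ... END; and since CREATE FUNCTION must be the first statement in its batch, the conditional branch has to wrap it in dynamic SQL:

    IF NOT EXISTS (SELECT 1 FROM sys.assemblies WHERE name = 'ComputedColumnFunctions')
    BEGIN
        CREATE ASSEMBLY ComputedColumnFunctions
        FROM 'C:\Websites\AssemblyTest\StoredFunctions\StoredFunctions\bin\StoredFunctions.dll';

        -- CREATE FUNCTION must start its own batch, so issue it via EXEC
        EXEC('CREATE FUNCTION fTestInt (@ProductID int) RETURNS int
              AS EXTERNAL NAME [ComputedColumnFunctions].[StoredFunctions.UserDefinedFunctions].TestInt');
    END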
I have a situation where I need to distribute a db to 'subscribers' for use during network and application downtime. Currently, we do this by employing MS-Access. We update our db on the 'subscriber' by sending a text file with the new data using FTP.
When the 'subscriber' opens their local copy of the Access db application, a macro fires off to check for any new file in the ftproot folder, and if one is detected that is newer than the last update, it truncates the existing table and imports the text file using a predefined import specification format. The process works well enough as is. However, we were hoping to move beyond our dependency on MS-Access for a variety of reasons, so we are looking at developing Windows Forms apps using the new ASP.NET v2.0 technology with SQL Server 2005 and SQL Server Express.
I need some clarification on how replication works. Does it allow a 'snapshot' db to be created on a subscriber that can be used when the network is down? If not, do we have alternatives? For example, I guess we could export/import to the subscriber in some manner.
Hope I've made the case clear enough for some responses. Thanks in advance for your thoughts and help. :)
Please forgive the basic-ness of my question, but I have only been using DTS and SSIS for 2 weeks now.
What I'd like to know is: is my understanding of SSIS package development correct?
In SQL Server 2005, I use BIDS to develop my SSIS packages. During development, I simply store my project on the local C drive.
However, once I am finished writing and testing my projects, I move them to SQL Server using Save Copy As... (for now, I'm only using one server for both development and production).
Then, any time I need to change or modify the package, I change the local file-system copy of the package, then do another Save Copy As... to SQL Server.
That is, once the package is on SQL Server, if you need to change it, you can only do so through BIDS, correct?
I just want to know if I'm on the right track here.
I just learned some basic FTP commands for the first time and was able to transfer some files from one machine to another. Now I'm trying to do this using the FTP Task.
First of all, I understand that the destination machine has to be set up with something called an FTP Site. And I noticed that when I do a "cd" command in my FTP session, I have visibility only to those directories that are set up as an FTP Site.
Ok, onto the FTP Task. First I created and test-connected my FTP Connection Manager. Next, I went into the FTP Task Editor, and I'm on the File Transfer page. Looking at the properties, I guess this is where I tell it what file to move where.
So, in IsRemotePathVariable, I selected "False". When I did this, I was expecting to be able to navigate the various FTP Sites I have set up on my destination computer. However, when I click on RemotePath, I only see "/" (Root). Was I wrong to expect to see the FTP Sites here? What am I doing wrong?
Hi, I'm fairly new to coding with ADO in C++. My previous experience with it was in VB 6.0 and VB.NET. I'm having difficulty deciphering the HRESULTs that come back from it when an error occurs. Up until this point I was using DXGetErrorString9() and DXGetErrorDescription9() to interpret any HRESULT that got sent back to my program, but now that I'm using ADO they always return "Unknown".
Is there a Microsoft provided group of functions that help programmers to better understand ADO's HRESULTs?
Can someone help me understand a stored procedure I am learning about? At line 12 below, the code calls a function named "ttg_sfGroupsByPartyId". I ran the function manually and it returns several rows/records from the query. So I am wondering: does a call to the function return a temporary table? And if so, is the temporary table named PartyId? If so, the logic seems strange to me, because earlier they use the name PartyId as a variable name that is passed in.
1  ALTER PROCEDURE [dbo].[GetPortalSettings]
2  (
3      @PartyId uniqueidentifier,
4
5  AS
6  SET NOCOUNT ON
7  CREATE TABLE #Groups
8      (PartyId uniqueidentifier)
9
10 /* Cache list of groups user belongs in */
11 INSERT INTO #Groups (PartyId)
12 SELECT PartyId FROM ttg_sfGroupsByPartyId(@PartyId)
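To the question: ttg_sfGroupsByPartyId looks like a table-valued function, so the call returns a table result (not a temporary table), and PartyId on line 12 is a column of that result; the #Groups table created on line 7 simply uses the same name for its column. A minimal sketch of what such a function might look like (the body and table names here are assumptions, for illustration only):

    CREATE FUNCTION ttg_sfGroupsByPartyId (@PartyId uniqueidentifier)
    RETURNS TABLE
    AS
    RETURN (
        -- assumed membership table, just to show the shape
        SELECT GroupPartyId AS PartyId
        FROM GroupMembers
        WHERE MemberPartyId = @PartyId
    );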
I've a couple of questions about SQL Server 2000 and would greatly appreciate it if somebody out there would please answer them.
Questions:
I have a SQL 7 server dump file and I was wondering if I can directly load this file into SQL Server 2000. If yes, would this count as an upgrade of the SQL 7.0 database to SQL Server 2000? Or is this just backward compatibility support in 2000?
Well, the main objective of this is: we are planning to upgrade one of our production SQL 7.0 servers, and I was wondering if we can just take the SQL 7.0 dump file and load it into the SQL 2000 server. If yes, what are the drawbacks? And if no, should we upgrade the SQL 7.0 server itself and, along with it, all the databases sitting on it? Please shed some light here.
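If the "dump file" is a standard backup, the load itself is a plain restore on the 2000 server, which upgrades the database during recovery. A hedged sketch (database name, logical file names, and paths are assumptions):

    RESTORE DATABASE MyDatabase
    FROM DISK = 'C:\backups\sql7_dump.bak'
    WITH MOVE 'MyDatabase_Data' TO 'D:\data\MyDatabase.mdf',
         MOVE 'MyDatabase_Log'  TO 'E:\logs\MyDatabase.ldf';

After the restore, the database still runs at compatibility level 70 until it is changed with sp_dbcmptlevel.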