My client is using SQL Server 7.0 to store real-time data such as heat, temperature, and pressure, inserted every second. He wants me to provide a solution that transfers the summarised data to an Oracle server on a regular basis, say once every 5 minutes.
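A minimal sketch of the summarisation step, assuming a hypothetical dbo.Readings table with a ReadingTime column; the 5-minute aggregate could then be pushed to Oracle through a linked server or a scheduled DTS task:

-- Summarise the last 5 minutes of sensor readings (hypothetical schema).
SELECT
    SensorId,
    MIN(ReadingTime) AS PeriodStart,
    AVG(Temperature) AS AvgTemperature,
    MAX(Pressure)    AS MaxPressure
FROM dbo.Readings
WHERE ReadingTime >= DATEADD(minute, -5, GETDATE())
GROUP BY SensorId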
I have to populate [CreateDate] in the data pump from my Flat File Source into my OLE DB Destination SQL Server table. Should I do this with a variable within the SSIS package, or with a Derived Column task within the Data Flow between the Flat File Source and the OLE DB Destination?
I have a largish DTS package built generically from VB. It uses a combination of DTSExecuteSQLTask and DTSDataPumpTask (using SQL statements for the source). 18 tasks are failing (1 and 17 respectively, by the above types).
When I try to execute the tasks individually I get messages like "Column name xxxx was not found". The column does exist in the table specified in the SQL statements, and furthermore the SQL statements execute fine in Query Analyzer.
If I select Properties, Transformations for a task, I get presented with the Verifying Transformations dialog (i.e. indicating there are errors). If I select the third option (Remove all transformations and redo auto-mapping) and save my changes, the task then executes okay.
I did not see a forum for the SQL Server 2000 DTS.
I have a flat file feeding a table via a data pump. The table is only used by this process. The job runs for about 30 minutes and then fails. The message in the history does not give any detail on why it is failing. Below is the message I get; if I rerun the job, it works fine. Can anyone help me, please?
Date: 07/23/2007 6:00:02 AM
Log: Job History (Daily: Load EOL from MVS1 (First Run))
Step ID: 1
Server: PIT-CS-M608
Job Name: Daily: Load EOL from MVS1 (First Run)
Step Name: Daily: Load tblCaseMasterSched
Duration: 00:28:05
Sql Severity: 0
Sql Message ID: 0
Operator Emailed:
Operator Net sent:
Operator Paged:
Retries Attempted: 0
Message: Executed as user: PIT-CS-M608\SYSTEM.
...rt: DTSStep_DTSActiveScriptTask_1
DTSRun OnFinish: DTSStep_DTSActiveScriptTask_1
DTSRun OnStart: DTSStep_DTSExecuteSQLTask_1
DTSRun OnFinish: DTSStep_DTSExecuteSQLTask_1
DTSRun OnStart: DTSStep_DTSDataPumpTask_1
DTSRun OnProgress: DTSStep_DTSDataPumpTask_1; 1000 Rows have been transformed or copied.; PercentComplete = 0; ProgressCount = 1000
DTSRun OnProgress: DTSStep_DTSDataPumpTask_1; 2000 Rows have been transformed or copied.; PercentComplete = 0; ProgressCount = 2000
DTSRun OnProgress: DTSStep_DTSDataPumpTask_1; 3000 Rows have been transformed or copied.; PercentComplete = 0; ProgressCount = 3000
DTSRun OnProgress: DTSStep_DTSDataPumpTask_1; 4000 Rows have been transformed or copied.; PercentComplete = 0; ProgressCount = 4000
DTSRun OnProgress: DTSStep_DTSDataPumpTask_1; 5000 Rows have been transformed or copied.; PercentComplete = 0; ProgressCount = 5000
DTSRun OnProgress: DTSStep_DTSDataP...
Process Exit Code 1. The step failed.
I have a vbscript to read all files from a directory and, if the file is valid, I would like my DTS to process it. I tried using the vbscript as an ActiveX workflow script in the DTS, but it does not execute the data pump until it has completed looping through all the files, so only the last file read is sucked into the database (utilizing a global variable as the filename). Is there a way to execute the data pump task from within the ActiveX script? I can't seem to find any documentation about executing a DTS task. Basically, the workflow I want is:
1) Read files from directory (the number and names may change each time). (done with vbs)
2) For each file, send it through the transformation into the database.
3) When the information is in the database, append a date to the file and move it to the archive folder. (done with vbs)
If I am going about this the wrong way and you see something that is not obvious to me, please let me know. Thanks in advance!
I'm currently creating an SSIS package that takes data from 3 different databases: a SQL Server DB, a FoxPro DB, and an Oracle DB. The data is pulled, cleansed, and put into a single SQL 2005 table. The data is then pulled from this table every 15 minutes, formatted to a given specification, and uploaded to an FTP site. This part is done. My question is this:
This package needs to run around the clock, non-stop. How can the package be set up to do this? It needs to pull data from the 3 DBs and put it in the common table, wait 15 minutes, and do it again; wait 15 more minutes and do it again; and so forth. The problem I'm having is that I don't see a way to set up an SSIS package so that it runs around the clock.
On the same premise, I have another issue. When I try to take data from the common table and there is nothing there, it causes an error. Is there some way to run a test like
SELECT * FROM _table_ WHERE is_sent = 0
if results == 0 { wait 15 minutes and test again. }
else { write flat file, wait 15 minutes. }
This has to be done in the Control Flow scope, so I can't use a Conditional Split. This is a pretty big deal, as this needs to run around the clock. Thank you in advance for your assistance.
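One hedged approach, following the common Execute SQL Task pattern: return a count into an SSIS variable (say User::RowCount, a name I'm assuming here) and gate the downstream tasks with a precedence constraint expression such as @[User::RowCount] > 0, while a SQL Agent schedule, rather than the package itself, provides the every-15-minutes loop. The table and column names below are placeholders:

-- Executed by an Execute SQL Task; the single-row result set is mapped
-- to an SSIS variable such as User::RowCount.
SELECT COUNT(*) AS RowCount
FROM dbo.CommonTable   -- placeholder for the common staging table
WHERE is_sent = 0;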
I am having trouble understanding how the SSIS data pump decides when "the final commit for the data insertion has started/ended". In some tasks the rows are inserted one at a time every few milliseconds (shown by a default GETDATE() in a datetime column). In others, the final commit occurs, as I would expect, at the end of the data pump task.
There are times I want the data pump task to commit all records that are successful, row by row, and there are times I want an all-or-nothing situation. Can somebody explain why this behaviour occurs and how I can control which commit option the data pump tasks use?
I've created a package that runs fine from BIDS when logged in with my domain account. I have created a SQL Agent proxy on the server with that same account. In the job step on the server, I edit the connection strings so that the username and password are there for both my source Access connection and the destination SQL Server. Here is the connection string I create for MS Access:
Code Snippet:
Data Source=\\10.210.226.202\OTM Reports for Symmetrics\CDRD001.MDB;User ID=admin;Password=;Provider=Microsoft.Jet.OLEDB.4.0;
Here is the error:
Code Snippet:
Executed as user: DOMAIN\MRUSER. ...ethod call to the connection manager "MSAccessDB" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed. End Error
Error: 2008-01-30 09:49:19.66 Code: 0xC0047017 Source: Cost DTS.Pipeline Description: component "Cost" (1) failed validation and returned error code 0xC020801C. End Error
Error: 2008-01-30 09:49:19.66 Code: 0xC004700C Source: Cost DTS.Pipeline
I have tried various settings in the package for "ProtectionLevel", such as "DontSaveSensitive" and "EncryptSensitiveWithUserKey". I would think that, using my account with the proxy, the last option would work when running it on the server, since it is essentially the same user running the package, but I'm new to playing with the proxy.
I tried using package configurations but got an error there too; I think it couldn't access the file, even though it was on a share accessible to my account.
I am wondering if it is possible to use SSIS to sample a data set into a training set and a test set fed directly to my data mining models, without saving them somewhere, as they occupy too much space? I really need guidance on that.
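For what it's worth, SSIS does ship Percentage Sampling and Row Sampling transformations that split a pipeline in flight without landing the data. As a rough T-SQL alternative for producing a random train/test split at the source (a sketch only; dbo.SourceData and the 70/30 ratio are assumptions):

-- Tag roughly 70% of rows as training data and the rest as test data.
-- CHECKSUM(NEWID()) yields a pseudo-random value per row.
SELECT *,
       CASE WHEN ABS(CHECKSUM(NEWID())) % 100 < 70
            THEN 'train' ELSE 'test' END AS SplitGroup
FROM dbo.SourceData;   -- placeholder source table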
I have used both data readers and data adapters (with datasets) in the projects I have worked on. I am trying to get some clarification on when I should be using which one. I think I am doing this correctly, but I want to be sure I am developing good habits.
As the name might suggest, it seems like a DataReader is for only reading data. I have read that the data adapter and dataset are for a disconnected architecture, or at least that they can be used for that type of setup. I have been using the data adapter and datasets when writing to a database, and the DataReader when reading from a database.
Is this how these should be used? Is the data reader the best choice for reading data? Am I doing this the optimal way from a performance standpoint?
Thanks in advance.
We have already integrated different client data into MDS with the MS Excel plug-in; now we want to push updated or newly added records back to the source database. Is it possible to do this using MDS? Is there any background sync process which automatically syncs data to and from the subscriber and MDS?
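As far as I know, MDS does not push to subscribers; it exposes outbound data through subscription views that the source system polls. A hedged sketch of that pull pattern (the view name is a placeholder, and the audit columns shown are my assumption about what the view exposes):

-- A subscription view created in MDS can be polled on a schedule to
-- pull changed members back out into the source system.
SELECT Code, Name, EnterDateTime, LastChgDateTime
FROM mdm.CustomerSubscriptionView   -- placeholder subscription view
WHERE LastChgDateTime > DATEADD(day, -1, GETDATE());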
When I enter over 4000 chars in any ntext field in my SQL Server 2005 database (directly in the database and through the application), I get an error saying that the data could not be updated because string or binary data would be truncated. Has anyone ever seen this? I cannot figure out what is causing it; ntext should be able to hold a lot more data than this...
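A quick way to rule the column itself out is to write a long value directly in T-SQL; if this succeeds, the truncation is likely happening in an intermediate layer (for example a parameter or variable declared as nvarchar(4000)). A sketch against a hypothetical table:

-- Build a 10,000-character string; the CONVERT to nvarchar(max) is needed
-- so REPLICATE does not itself truncate at 4000 characters.
DECLARE @big nvarchar(max);
SET @big = REPLICATE(CONVERT(nvarchar(max), N'x'), 10000);

UPDATE dbo.Notes            -- placeholder table with an ntext column
SET NoteText = @big
WHERE NoteId = 1;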
I have a requirement to implement CDC for 50+ tables, to feed incremental data changes to warehouse/reporting rather than exporting the whole table's data. The largest table has more than half a billion rows.
The warehouse uses a daily copy of the OLTP db (daily DB refresh). How can I accomplish this? Is there a downside to implementing CDC just for the sake of taking incremental changes on the tables?
Is there any performance impact if we enable CDC on the OLTP db?
Can we make use of the CDC tables in an environment where we do a daily db refresh, so that the queries don't hit the OLTP database?
What is the best way to implement CDC to take incremental changes for reporting?
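For reference, enabling CDC is a two-step call, once per database and once per table; the schema and table names below are placeholders:

-- Enable CDC at the database level, then per table.
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Orders',   -- placeholder table
    @role_name     = NULL;        -- NULL = no gating role

Changes are then read through the generated cdc.fn_cdc_get_all_changes_<capture_instance> function rather than by querying the OLTP tables directly.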
Hi, this is driving me nuts. I have a table that stores notes regarding an operation in an IMAGE data type field in MS SQL Server 2000. I can read and write no problem from Access using the StrConv function, and I can update the field correctly in T-SQL using:

DECLARE @ptrval varbinary(16)
SELECT @ptrval = TEXTPTR(BITS_data)
FROM mytable_BINARY WHERE ID = 'RB215'
WRITETEXT OPERATION_BINARY.BITS @ptrval 'My notes for this operation'

However, I just cannot seem to be able to convert the information back to text once it is stored, using T-SQL. My selects keep returning binary data. How do I do this? Thanks for your help. SD
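One common trick (a hedged sketch; it assumes the notes are plain single-byte text no longer than 8000 bytes) is a double conversion, since image cannot be converted to varchar directly:

-- image -> varbinary -> varchar; each hop is an allowed explicit conversion.
SELECT CONVERT(varchar(8000), CONVERT(varbinary(8000), BITS)) AS NoteText
FROM OPERATION_BINARY
WHERE ID = 'RB215';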
I'm using a Script Component to load data into an Oracle DB due to a poor performance issue. Now I have found that it is missing some data during the transmission. Please see the errors below:
[DTS.Pipeline] Error: "component "Excel Source" (1)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
and also this:
[Excel Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "Fiscal Week" needs to be updated in the external metadata column collection. The column "Fiscal Year" needs to be updated in the external metadata column collection. The column "1st level" needs to be added to the external metadata column collection. The column "2nd level" needs to be added to the external metadata column collection. The column "3rd level" needs to be added to the external metadata column collection. The "external metadata column "1st Level" (16745)" needs to be removed from the external metadata column collection. The "external metadata column "3rd Level" (16609)" needs to be removed from the external metadata column collection. The "external metadata column "2nd Level" (16272)" needs to be removed from the external metadata column collection.
I tried going to data flow -> excel connection -> advanced editor for excel source -> input and output properties, and tried to refresh the affected columns. It seems that somehow the 3 columns are not read in from the source file? And also, Fiscal Year and Fiscal Week are not set up properly in my data destination? Has anyone faced such errors before?
When I execute the below stored procedure I get the error that "Arithmetic overflow error converting expression to data type int".
USE [FileSharing]
GO
/****** Object: StoredProcedure [dbo].[xlaAFSsp_reports] Script Date: 24.07.2015 17:04:10 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
[Code] .....
Msg 8115, Level 16, State 2, Procedure xlaAFSsp_reports, Line 25 Arithmetic overflow error converting expression to data type int. The statement has been terminated. (1 row(s) affected)
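Msg 8115 fires whenever an expression's value exceeds the range of int (2,147,483,647); a minimal repro, independent of the procedure above:

-- Fails with Msg 8115: the value does not fit in an int.
SELECT CONVERT(int, 3000000000.0);

-- Works: widen the target type instead.
SELECT CONVERT(bigint, 3000000000.0);

So the fix is usually to find the expression on the reported line (line 25 here) and widen the type it converts or aggregates into, e.g. int to bigint.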
Is there a step-by-step paper to get there? Here is what I need to consider: I will have many customers that will need their own set of records and access pages "branded for their company"; each customer will have many clients. I am hosting this application on a Windows 2003 server with SQL Server 2005 Enterprise.
I am using Windows authentication. I created a username in Windows, then added the Windows user in SQL Management Studio under Security, granted "DB Read" and "DB Write", and did so again under the database's Security tab. Still, authentication from the web fails. I must be missing a step or two?
I expect to set up a username for each database as I set up new customers.
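A frequently missed step is that a server-level login is not enough on its own: each customer database also needs a user mapped to that login. A hedged sketch with placeholder names:

-- Server level: create a login for the Windows account.
CREATE LOGIN [MYDOMAIN\WebAppUser] FROM WINDOWS;

-- Database level: map a user to that login and grant read/write roles.
USE CustomerDb;   -- placeholder database
CREATE USER [MYDOMAIN\WebAppUser] FOR LOGIN [MYDOMAIN\WebAppUser];
EXEC sp_addrolemember N'db_datareader', N'MYDOMAIN\WebAppUser';
EXEC sp_addrolemember N'db_datawriter', N'MYDOMAIN\WebAppUser';

Also worth checking: if the web application runs under a different identity (e.g. the ASP.NET worker process account), it is that account, not yours, that needs the login and database user.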
RE: XML Data source .. Expression? Variable? Connection? Error: unable to read the XML data.
I want my XML Data source to be an expression as i will be looping through a directory of xml files.
I don't see the expression property or the connection property??
I tried setting the XMLData property to @[User::filename], but that results in:
Information: 0x40043006 at Load XML Files, DTS.Pipeline: Prepare for Execute phase is beginning.
Error: 0xC02090D0 at Load XML Files, XML Source [108]: The component "XML Source" (108) was unable to read the XML data.
Error: 0xC0047019 at Load XML Files, DTS.Pipeline: component "XML Source" (108) failed the prepare phase and returned error code 0xC02090D0.
Information: 0x4004300B at Load XML Files, DTS.Pipeline: "component "OLE DB Destination" (341)" wrote 0 rows.
Task failed: Load XML Files
Information: 0xC002F30E at Bad, File System Task: File or directory "d:jcpxmlLoadjcp2.xml.bad" was deleted.
Warning: 0x80019002 at Package: The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "Package.dtsx" finished: Failure.
The program '[3312] Package.dtsx: DTS' has exited with code 0 (0x0).
I set up this package to import data from a SharePoint list to a SQL Server table. The primary key of my SQL table is mapped to the Title column of my SharePoint list. There is a possibility that duplicate values will be entered in the Title field of the SharePoint list, so when importing data into my table via SSIS, my package always errors out when it comes across duplicate values. How have others managed data integrity when importing from a SharePoint list with the Title column mapped to the primary key of a table?
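One hedged pattern is to land the list in a keyless staging table first and then move only new, de-duplicated titles across; all table and column names below are placeholders:

-- Insert one row per Title that is not already in the target.
INSERT INTO dbo.TargetTable (Title, OtherColumn)
SELECT s.Title, MAX(s.OtherColumn)        -- pick one row per duplicate Title
FROM dbo.StagingTable s                   -- staging table has no primary key
WHERE NOT EXISTS (SELECT 1 FROM dbo.TargetTable t WHERE t.Title = s.Title)
GROUP BY s.Title;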
"pRecordSet" is an ADO recordset. The database column "MyColumn" is of type "decimal(19,10)".
The most important question for me is whether the regional settings of the database server or the regional settings of the client PC are considered during the conversion from the string to the decimal value. For example, in standard French regional settings the "." would not be recognized as the decimal separator.
I am also wondering whether the language of the database instance in which this data is saved, or any other settings of this database instance, are considered during this conversion.
So my general question is: does anybody know exactly what rules apply during the above-mentioned conversion?
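On the server side, at least, T-SQL's own string-to-decimal conversion is not affected by the session language or regional settings; it always expects a period. A quick check (my assumption is that the ADO/OLE DB layer converts the string on the client before it ever reaches T-SQL, so this only covers the server half of the question):

SET LANGUAGE French;
SELECT CONVERT(decimal(19,10), '1.5');    -- succeeds: '.' is always the separator
-- SELECT CONVERT(decimal(19,10), '1,5'); -- would fail with a conversion error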
I have a question about how to handle structural data model changes in a data source for PowerPivot. Suppose I'm developing a star model in SQL Server, and sometimes a data type changes or the name of a field changes in a table. It seems to me that PowerPivot does not handle this as gracefully as Analysis Services (multidimensional) mostly does. I received an error because of a wrong field name, and even no error when a data type changed in PowerPivot. Is this common, or am I doing something wrong here? Does this mean that every time the data model changes, the PowerPivot model should be recreated? Or am I missing the clue here?
I have to extract, daily, a list of contacts on an Exchange server into a table in our EDW on SQL Server 2005. Is it possible to get the information directly from a data flow, or will I have to develop a script task?
I am getting ErrorCode 8 while loading the data from stage to model. I have checked my error view; it states that "Member Code is Inactive".
Initially I loaded the same set of data into the model from the MDS stage table, but then deleted it with ImportType = 5, which removed all the data from the MDM model.
Now I want to load it back, but it's giving Error Code 8. Before loading the same data, I changed the stage table's ImportType to 2 and ImportStatus_ID to 0.
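For context, ImportType = 5 deactivates (soft-deletes) members, which is why a plain re-insert reports the code as inactive, while ImportType = 2 overwrites and should reactivate a deactivated member. A hedged sketch of the reload; the entity staging table and batch tags are placeholders:

-- Re-stage the members with ImportType 2 so deactivated codes are overwritten.
UPDATE stg.Customer_Leaf                 -- placeholder entity staging table
SET ImportType = 2, ImportStatus_ID = 0, BatchTag = N'Reload_01'
WHERE BatchTag = N'Original_01';

-- Kick off the staging process for that batch (procedure generated per entity).
EXEC stg.udp_Customer_Leaf
     @VersionName = N'VERSION_1',
     @LogFlag     = 1,
     @BatchTag    = N'Reload_01';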
Hi, I'm having problems following the tutorial on creating a data access layer - http://www.asp.net/learn/dataaccess/tutorial01cs.aspx?tabid=63 - when I try to compile in Visual Studio 2005 I get "namespace could not be found". I followed the tutorial exactly - I created a dataset and added this code to my aspx page:

<asp:GridView ID="GridView1" runat="server" CssClass="DataWebControlStyle">
    <HeaderStyle CssClass="HeaderStyle" />
    <AlternatingRowStyle CssClass="AlternatingRowStyle" />

In my C# file I added these lines...

using NorthwindTableAdapters; <<<<< this is the problem - where does this come from?

protected void Page_Load(object sender, EventArgs e)
{
    ProductsTableAdapter productsAdapter = new ProductsTableAdapter();
    GridView1.DataSource = productsAdapter.GetProducts();
    GridView1.DataBind();
}

Thanks in advance
My vendor requires data to be sent in Excel format. Some of my tables have over 65,536 rows, so I need to use Excel 2007 (max of 1,048,576). Right now my data sits in SQL 2000. I am using MS SQL Enterprise Manager 8.0 to prepare the data. Is there some kind of add-on or selection I am missing that would let DTS export from SQL to Excel 2007? Thanks in advance.
I think I am definitely thrashing and am not getting anywhere on something I think should be pretty simple to accomplish: I need to pull the total amounts for compartments with different products which are under the same manifest and the same document number, conditionally based on whether the document types are "Starting" or "Ending", while the values come from the "Adjust" records.
So here is the DDL, sample data, and the ideal return rows
CREATE TABLE #InvLogData (
    Id BIGINT,          -- is actually an identity column
    Manifest_Id BIGINT,
    Doc_Num BIGINT,
    Doc_Type CHAR(1),   -- S = Starting, E = Ending, A = Adjust
    Compart_Id TINYINT,
[Code] ....
I have tried a combination of the below statements, but I keep coming back to not being able to actually grab the correct rows.
SELECT DISTINCT(column X) FROM #InvLogData GROUP BY X HAVING COUNT(DISTINCT X) > 1
One further minor problem: I need to make this a set-based solution. This table grows by a couple hundred thousand rows a week; a co-worker suggested using a <shudder/> cursor to do the work, but it would never be performant.
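Since the DDL above is truncated, the following is only a hedged sketch: it assumes an Amount column holds the values, and it totals the Adjust rows for manifest/document pairs that also have a Starting or Ending row:

SELECT a.Manifest_Id,
       a.Doc_Num,
       a.Compart_Id,
       SUM(a.Amount) AS TotalAmount            -- Amount is an assumed column
FROM #InvLogData a
WHERE a.Doc_Type = 'A'                         -- values come from Adjust rows
  AND EXISTS (SELECT 1
              FROM #InvLogData se
              WHERE se.Manifest_Id = a.Manifest_Id
                AND se.Doc_Num = a.Doc_Num
                AND se.Doc_Type IN ('S', 'E')) -- gated by Starting/Ending rows
GROUP BY a.Manifest_Id, a.Doc_Num, a.Compart_Id;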
I was wondering if there is a best-practice minimum set of permissions for a SQL login used when setting up a new shared data source for SSRS Report Manager?
Something along the lines of read access on the DB and permission to update tempdb?
I would have thought it not advisable for the login to be able to update the main db...
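A hedged sketch of a read-only reporting login (names are placeholders; note that ordinary logins don't need explicit tempdb grants, since temporary objects are created under the session's own credentials):

-- Server level: a dedicated login for SSRS data sources.
CREATE LOGIN ssrs_reader WITH PASSWORD = N'UseAStrongPasswordHere1!';

-- Database level: read-only access to the reporting database.
USE ReportSourceDb;   -- placeholder database
CREATE USER ssrs_reader FOR LOGIN ssrs_reader;
EXEC sp_addrolemember N'db_datareader', N'ssrs_reader';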
I need to create a function that replaces the data in a column with 'X's based on the LEN of the data in the column. I created one that does a replacement, but it fills the column based on the maximum data length, not the current length of the string or integer. An example of what I'm trying to accomplish:
Original data in a varchar(30) column: thisisavalue thisisanothervalue thisisanothervalueagain shortval
Replaced with: xxxxxxxxxx xxxxxxxxxxxxxxxx xxxxxxxxxxxxxxxxxxxx xxxxxxx
My current function is replacing the data like this: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
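The key is to use LEN (or DATALENGTH) of each value rather than the column's declared size; a minimal sketch:

-- Mask a value with one 'x' per character of its actual length.
CREATE FUNCTION dbo.MaskValue (@value varchar(30))
RETURNS varchar(30)
AS
BEGIN
    -- LEN ignores trailing spaces; use DATALENGTH if those must count too.
    RETURN REPLICATE('x', LEN(@value));
END;
GO

-- Usage (hypothetical table and column names):
-- UPDATE dbo.MyTable SET MyColumn = dbo.MaskValue(MyColumn);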
Hello all. Before my arrival at my current employer, our consultants physically set up our MSSQL 7 server as follows:

drive c: contains the mssql engine
drive d: contains the transaction log
drive e: contains the data files

No filegroups were set up, and the data files consist of only 1 large physical file. Currently, our data file is >10GB. When I was trained on the physical aspects of SQL Server, I was told to never create physical files > 2048MB each. If I did, I could expect inefficient physical storage of data and slower performance (due to the OS). Our server has 2 RAID-5 arrays. Drives c: and e: are located on the first array and drive d: on the second. We're running Windows NT 4.0 Server SP6 with NTFS. Can someone comment on the use of 1 single large data file vs. more, smaller data files?
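If splitting ever becomes desirable, files can be added to the existing database and SQL Server will spread new allocations proportionally across them; a hedged sketch (database name, path, and sizes are placeholders):

-- Add a second data file so new allocations spread across both files.
ALTER DATABASE MyDb
ADD FILE (
    NAME = N'MyDb_Data2',
    FILENAME = N'e:\mssql\data\MyDb_Data2.ndf',
    SIZE = 2048MB,
    FILEGROWTH = 256MB
);

Note this only affects new allocations; existing pages stay in the first file unless the data is rebuilt or moved.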
I am trying to find a reference for a client that lists the fields available to be substituted into a data-driven subscription from the query, along with the expected data types. For example, the field for whether or not to include a link to the report seems to expect a bit data type. I have searched and can't seem to find anything. I guess I could walk through the interface and try different data types, but if a list exists, that would be better.