Save Time In Script Component Which Is Almost Forwarding
Feb 12, 2007
Hi,
I have to pass the fields coming from the source through AS IS, but I also need to add the current date as a column. What I do today is add a Script Component, define each and every output column along with its type, and write an "assignments" script.
Is there any way to save time in scenarios where I am essentially just passing the information on to the next level in the data flow?
Any input regarding this will sincerely be appreciated.
I would like to save only the time into a datetime field in a table. When I do this, it is saved as "1/1/1900 15:23:00". All I would like to have is the time! Is there a way to update it so that it will only save the time?
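A datetime column always carries a date portion, which is why the 1/1/1900 prefix shows up. A hedged sketch of two common workarounds follows; the time data type requires SQL Server 2008 or later, and the table and column names here are only illustrative:

    -- Option 1 (SQL Server 2008+): store the value in a time column instead of datetime.
    ALTER TABLE dbo.Schedule ADD ApptTimeOnly time(0) NULL;
    UPDATE dbo.Schedule SET ApptTimeOnly = CAST(ApptDateTime AS time(0));

    -- Option 2: keep the datetime column and format only the time when reading it.
    SELECT CONVERT(varchar(8), ApptDateTime, 108) AS TimeOnly   -- style 108 = hh:mi:ss
    FROM dbo.Schedule;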
What I want to do is remove the time component from a datetime value. That is to say, I want 2008-04-18 00:00:00.000 instead of 2008-04-18 13:46:57.983. I have heard that the only way to do this is to convert the datetime to a string, strip off the time portion and then convert back to a date. Isn't there a simpler way?
I recently stumbled upon this construct: { d '2008-04-18' }. It appears that this might accomplish what I am looking for above, except that it seems to only accept literal strings and not variables. I think this construct comes from the ODBC "improvements" realm, so I am leery of depending on it as I am not sure it is standard. Any insights and links to more documentation on using this (new, for me) construct would be appreciated.
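The { d '...' } form is the ODBC date escape sequence, and it does only take a literal. String conversion is not needed to drop the time, though; a hedged sketch of two approaches that work on a variable (the date type requires SQL Server 2008 or later):

    DECLARE @dt datetime;
    SET @dt = '2008-04-18 13:46:57.983';

    -- Classic integer-math approach; also works on SQL Server 2000/2005.
    SELECT DATEADD(dd, DATEDIFF(dd, 0, @dt), 0);   -- 2008-04-18 00:00:00.000

    -- SQL Server 2008+: round-trip through the date type.
    SELECT CAST(CAST(@dt AS date) AS datetime);    -- 2008-04-18 00:00:00.000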
I have a flat file that contains detail in each record as indicated below:
HEADER_ID, ATTRIB_A(1..10), ATTRIB_B(1..10), ATTRIB_C(1..10), etc.
The index of the attribute relates it to other attributes with the same index. It needs to look like this in the detail table:
HEADER_ID, ATTRIB_A1, ATTRIB_B1, ATTRIB_C1
HEADER_ID, ATTRIB_A2, ATTRIB_B2, ATTRIB_C2
I need to pivot these attributes into a detail table that relates back to the header information. Because of the sheer number of attributes, I don't want to use the Unpivot transformation. I was hoping to move the complexity into a Script Component where I could read one line and transform it into a normalized state.
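A Script Component with an asynchronous output is one way to emit several detail rows per input row. If the wide rows land in a staging table first, another option is to unpivot in T-SQL. A minimal sketch, assuming a hypothetical staging table and showing only the first three index positions (CROSS APPLY with a VALUES constructor requires SQL Server 2008 or later):

    SELECT s.HEADER_ID, v.ATTRIB_A, v.ATTRIB_B, v.ATTRIB_C
    FROM dbo.StagingWide AS s
    CROSS APPLY (VALUES
        (s.ATTRIB_A1, s.ATTRIB_B1, s.ATTRIB_C1),
        (s.ATTRIB_A2, s.ATTRIB_B2, s.ATTRIB_C2),
        (s.ATTRIB_A3, s.ATTRIB_B3, s.ATTRIB_C3)
    ) AS v (ATTRIB_A, ATTRIB_B, ATTRIB_C);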
I have a package that has a data flow task. This task imports data from a DB2 database (using the IBM OLE DB provider for DB2) and adds it to a SQL Server database table. The package was created on the server. Then, through version control (using TFS source control), I check out the package on my local machine, and when I open the package I get the following 3 errors.
Error 1 Validation error. Import Account Num from BMGP_BDR: DTS.Pipeline: The component metadata for "component "DataReader Source" (1113)" could not be upgraded to the newer version of the component. The PerformUpgrade method failed.
Error 2 Error loading BMAG Download Xref Tables - bmag.dtsx: Microsoft.SqlServer.Dts.Pipeline.ComponentVersionMismatchException: The version of component "DataReader Source" (1113) is not compatible with this version of the DataFlow. [[The version or pipeline version or both for the specified component is higher than the current version. This package was probably created on a new version of DTS or the component than is installed on the current PC.]] at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostCheckAndPerformUpgrade(IDTSManagedComponentWrapper90 wrapper, Int32 lPipelineVersion)
Error 3 Error loading BMAG Download Xref Tables - bmag.dtsx: The component metadata for "component "DataReader Source" (1113)" could not be upgraded to the newer version of the component. The PerformUpgrade method failed.
I have a package which reads an Access file from a folder. My connection manager for this file is .NET Providers for OleDb\Microsoft Jet 4.0 OLE DB Provider.
The package works from my computer, but when I execute it on the server as a SQL Agent job, I get:
The component metadata for "component "DataReader Source" (1) could not be upgraded to the newer version of the component. The PerformUpgrade method failed.
I copied the mdb file to a folder on the server which my packages have no problem reading data from.
My packages run under the same domain account as defined in proxies.
Does anyone have a good resource for a forwarding example?
I have a distributed Service Broker application working, but I would like to add a forwarder to the mix. All I can find in BOL is that you must create an endpoint for forwarding and create the correct routes on the forwarder, but that's it.
Is that all you require? An instance with a forwarding endpoint and the correct routes?
I am looking for an actual example; I cannot find anything in BOL on how to actually implement it.
I downloaded the forwarder1.1 example from getdotnet.com, but that example sucks; it doesn't even create routes.
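A minimal sketch of what BOL seems to describe: an endpoint with message forwarding enabled on the forwarding instance, plus routes, noting that a forwarder looks up routes for forwarded messages in msdb. The port, endpoint, service and server names below are placeholders:

    -- On the forwarding instance: a Service Broker endpoint with forwarding enabled.
    CREATE ENDPOINT BrokerForwarder
        STATE = STARTED
        AS TCP (LISTENER_PORT = 4022)
        FOR SERVICE_BROKER (
            AUTHENTICATION = WINDOWS,
            MESSAGE_FORWARDING = ENABLED,
            MESSAGE_FORWARD_SIZE = 10   -- max MB of forwarded messages held in memory
        );

    -- Still on the forwarder: routes for forwarded messages live in msdb.
    USE msdb;
    CREATE ROUTE RouteToTarget
        WITH SERVICE_NAME = '//Contoso/TargetService',   -- placeholder service name
             ADDRESS = 'TCP://targetserver:4022';        -- placeholder address

    -- On the initiator: point the conversation at the forwarder instead of the target.
    USE InitiatorDB;
    CREATE ROUTE RouteViaForwarder
        WITH SERVICE_NAME = '//Contoso/TargetService',
             ADDRESS = 'TCP://forwarderserver:4022';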
I have been trying to find an explanation for this error everywhere, but so far I have been unsuccessful:
Cyclical forwarding detected for event 17806: Event source was 'SP-MINCL-WDB04' which matches the current forwarding server. The event was not forwarded.
Hi, I have a program continuously updating a SQL database (using Microsoft SQL Server 2005 Management Studio), and I need to set up a way so that whenever a certain column or tag is written to the database, an alert is forwarded to my e-mail, through VB or C or whatever can handle this. Any help would be terrific, thanks.
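One way to get there on SQL Server 2005 is Database Mail plus a trigger on the table the program writes to. A rough sketch only, with hypothetical table, column, profile and address names, and assuming a Database Mail profile is already configured:

    CREATE TRIGGER dbo.trg_AlertOnTag
    ON dbo.SensorLog                -- hypothetical table the program writes to
    AFTER INSERT, UPDATE
    AS
    BEGIN
        SET NOCOUNT ON;
        -- Only send mail when the column/tag of interest appears in the new rows.
        IF EXISTS (SELECT 1 FROM inserted WHERE TagName = 'ALARM')   -- hypothetical column/value
        BEGIN
            EXEC msdb.dbo.sp_send_dbmail
                @profile_name = 'AlertsProfile',        -- hypothetical Database Mail profile
                @recipients   = 'me@example.com',
                @subject      = 'Alert: monitored tag written',
                @body         = 'A monitored tag was written to dbo.SensorLog.';
        END
    END

Sending mail directly from a trigger ties the e-mail call to the original transaction, so on a busy table it is usually better to have the trigger write to an audit/queue table and let a SQL Agent job or Service Broker activation do the actual sending.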
I am trying to centralize my alert and event forwarding using this feature in SQL Server 2000. I set up event forwarding on every one of my satellite SQL Servers; however, I am not getting any events or notifications.
Has anybody worked on this, or does anyone know of any good information on this topic?
Are there any concerns or problems with forwarding a port to SQL Server from our Internet firewall so that you can access the databases over the Internet? Is it standard practice to do this in order to remotely access the SQL Server, or is there a better way? Also, if it is OK to do this, are there any things you need to do to enhance your security, given the vulnerabilities of being open to the Internet?
Any suggestions or comments would be appreciated on this subject.
I know that SQL Servers can be set up to forward alerts to a single SQL Server. I am wondering if the same would be true for the Maintenance Plan reports. Basically, I want to have only one SQL Server on the network that has the SQL Agent Mail profile set up, and to use that server for all e-mail and/or pager notifications for all alerts and events raised by any of the SQL Servers on the network, including Maintenance Plan reports. Is this possible? Is anything required to set this up beyond alert forwarding?
Alexey Aksyonenko
Hello: I didn't find any documentation that notes save point names are case sensitive, but I guess they are... Stored proc to reproduce:

/* START CODE SNIPPET */
If Exists (Select * From sysobjects Where Type = 'P' and Name = 'TestSaveTran')
    Drop Procedure dbo.TestSaveTran
Go

Create Procedure dbo.TestSaveTran
As
Begin
    Declare @tranCount int

    --Transaction Handling
    Select @tranCount = @@TRANCOUNT
    If (@tranCount = 0)
        Begin Tran localtran
    Else
        Save Tran localtran

    Begin Try
        --Simulate Error While Processing
        RAISERROR('Something bad happened', 16, 1)

        /* If this proc started the transaction then commit it,
           otherwise return and let the caller handle the transaction */
        IF (@tranCount = 0)
            Commit Tran localtran
    End Try
    Begin Catch
        --Rollback to save point
        Rollback Tran LOCALTRAN --<< NOTE case change
        --Log Error
        --Reraise Error
    End Catch
End
Go

--Execute Stored Proc
Exec dbo.TestSaveTran

/* Should receive the following message:
   Cannot roll back LOCALTRAN. No transaction or savepoint of that name was found. */
/* END CODE SNIPPET */

What is really strange: if there is a transaction open, then no error is thrown. So if you execute it like this:

/* START CODE SNIPPET */
Begin Tran
--Execute Stored Proc
Exec dbo.TestSaveTran
/* END CODE SNIPPET */

there is no "Cannot roll back LOCALTRAN..." message.

Questions:
1) Can someone confirm that save point names are case sensitive and this is not happening because of a server setting?
2) Is this a logic error that I am not seeing in the example code above?

We have changed our code to store the save point name in a variable, which will hopefully mitigate this "problem". Thx.
A user and I have the same issue: when we hit forward and type our reply, spell check checks the entire email. I want it to check only what I wrote.
We both have "Ignore original message text in reply or forward" checked under Tools, Options, Spelling.
I am new to SQL Server Management Studio Express, but a long-time Query Analyzer user. This is a very basic question.
I want to change the default directory in SQL Server Management Studio Express so that when I go to save a query, it already points to the correct one. Where do I change that?
I have found that when I'm debugging a custom component in BIDS that I've created in another instance of Visual Studio, every time I rebuild the component I have to shut down and restart BIDS and then reattach to the BIDS process, which is pretty time consuming. And if I find a small error in my custom component while debugging, I don't seem to be allowed to make any changes to the code unless I stop debugging and go through the process above.
Am I missing something here? Or do I really have to manually go through these steps every time I want to change code in the component I'm debugging?
Can I automate the process with MSBuild or NAnt? If so, is there an example of this anywhere?
In a Data Flow, I need to use an SSIS variable of type "Object" inside a Script Component and assign to it the contents of n variables of string type. On exiting from the script, the Object variable should contain something like the following lines: AAAAAAAAAAAAAAAAAAAAAAAAAAAAA BBBBBBBBBBBBBBBBBBBBBBBBBBBBB CCCCCCCCCCCCCCCCCCCCCCCCCCCCC DDDDDDDDDDDDDDDDDDDDDDDDDDDDD ... On exiting from the data flow, I will use the Object variable in a Script Task, reading each element in a cyclic fashion. Has anyone experienced something like this? Could anyone provide an example? Thanks in advance!
Hi all, I'm on a project which uses a lot of views for joining two or more tables. Using the MERGE component in SSIS will be a huge effort because it only has two inputs and I have to SORT the inputs too. Isn't it possible to have a view-like component that joins more than two tables and doesn't need sorting? (I've thought about creating views in the database engine, but that breaks my data flow in SSIS and isn't a practical solution.)
I am writing a custom dataflow transformation component, and I need to get the name of the preceding component.
I have been trying to find a way to get a reference to the Package object, the MainPipe object or the IDTSPath90 object (connecting to the IDTSInput90 of my component) from within my component, because I think from there I can get to the information I want.
No idea where this bug crept in from. Have been using SSIS for 1.5 years now without hitting this problem.
I had a script component opening an XML document and parsing it using XPATH. I added some code that uses StreamReader / Streamwriter (closing one stream before starting the other). The code works without issue in my C# app.
And it ran without issue 2-3 times in SSIS. Then suddenly after running my package again, the script component says it completes successfully, yet nothing happens. I set a breakpoint on the first line of code - it never hits it. I add a msgbox as the first line of code - and it never displays.
I then close my package / exit out of ssis ... and then re-open it. When i open my script component, all of my code is GONE. All references that I added are gone.
I tried adding the streamreader/writer process to a dll I created from my c# app ... and added the DLL to the package -- same result.
I can reproduce this on 2 different computers.
Anyone experience this problem ? Any idea how to stop it ? Or debug it ?
Here is a slimmed down code sample of what causes the error :
Public Class ScriptMain
    Public Sub Main()
        Try
            Dim xmlDoc As New XmlDocument
            xmlDoc.Load("c:ulkasync_86281519_20070628045850225_4.xml")
            MsgBox("xmlLoaded") 'this doesn't display once the package starts "acting up"
        Catch ex As Exception
            MsgBox(ex.Message)
            UpdateXML("c:ulkasync_86281519_20070628045850225_4.xml", ex.Message)
        End Try
        Dts.TaskResult = Dts.Results.Success
    End Sub

    Private Sub UpdateXML(ByVal fileName As String, ByVal message As String)
        Try
            Dim invalidChar As String = message.Trim().Substring(message.Trim().IndexOf("0x"), 4)
            Dim rd As StreamReader = New StreamReader(fileName)
            Dim xml As String = rd.ReadToEnd()
            xml = xml.Replace(invalidChar, String.Empty)
            xml = xml.Replace("", String.Empty)
            xml = xml.Replace("<![CDATA[<![CDATA[", "<![CDATA[")
            xml = xml.Replace("]]>]]>", "]]>")
            MsgBox("replaced")
            rd.Close()
            Dim wr As StreamWriter = New StreamWriter(fileName)
            wr.Write(xml)
            wr.Close()
            Dim xdoc As XmlDocument = New XmlDocument()
            xdoc.Load(fileName)
        Catch ex As Exception
            UpdateXML(fileName, ex.Message)
        End Try
    End Sub
End Class
I have a problem forwarding n parameters from Visual Studio 2005 (using VB) to a SQL Server 2005 stored procedure. I have to save n rows in my stored procedure as one transaction: if all rows are not saved successfully I have to roll back, otherwise I also update another table afterwards. I cannot work out how to send a variable number of parameters from Visual Studio to SQL Server. My requirement is to use the stored procedure to store all the rows in the base table and related tables, and then update another table based on the updates made. Please help.
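One common workaround on SQL Server 2005 is to send all the rows as a single XML parameter and shred it inside the procedure, wrapping the inserts and the follow-up update in one transaction. A rough sketch only; the table, column and element names are hypothetical:

    CREATE PROCEDURE dbo.SaveRows
        @Rows xml
    AS
    BEGIN
        SET NOCOUNT ON;
        BEGIN TRY
            BEGIN TRAN;

            -- Shred the XML, e.g. '<Rows><Row><Col1>1</Col1><Col2>abc</Col2></Row>...</Rows>'
            INSERT INTO dbo.BaseTable (Col1, Col2)
            SELECT r.value('(Col1)[1]', 'int'),
                   r.value('(Col2)[1]', 'varchar(50)')
            FROM @Rows.nodes('/Rows/Row') AS x(r);

            -- Follow-up update once every row made it in.
            UPDATE dbo.OtherTable SET LastLoaded = GETDATE();

            COMMIT TRAN;
        END TRY
        BEGIN CATCH
            IF @@TRANCOUNT > 0 ROLLBACK TRAN;
            RAISERROR('SaveRows failed; all rows rolled back.', 16, 1);
        END CATCH
    END

On the VB side, the rows could be serialized to an XML string (for example from a DataTable with WriteXml) and passed as a single SqlParameter of type SqlDbType.Xml.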
USE [Testing]
GO
/****** Object: Table [dbo].[Testing] Script Date: 4/25/2014 11:08:18 AM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
[Code] ....
It seems to work fine with one million records.
Each primary key is unique, but the begindate is non-unique, and I guess even if I use datetime2 and add nanoseconds, from what I have read there is still a chance of a duplicate datetime, since the data is imported via XML from multiple sources.
Is there a way to keep track in real time of how long a stored procedure has been running? What I want to do is fire off a trace if that stored procedure has been running for more than about 5 minutes.
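One way to check this in real time is to poll sys.dm_exec_requests, which exposes the start time of every running request; a SQL Agent job could run a check like the sketch below every minute and start the trace (or just log) when it finds a match. The 5-minute threshold and the procedure-name filter are placeholders:

    SELECT r.session_id,
           r.start_time,
           DATEDIFF(MINUTE, r.start_time, GETDATE()) AS minutes_running,
           t.text AS running_batch
    FROM sys.dm_exec_requests AS r
    CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
    WHERE DATEDIFF(MINUTE, r.start_time, GETDATE()) >= 5        -- threshold
      AND t.text LIKE '%usp_MyLongProcedure%';                  -- hypothetical proc name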
I am trying to load the previous day's data at 3 AM via an SSIS job.
The Date variable is initialized as DATEADD("dd", -1, GETDATE()) in the For Loop.
Now, as this job runs at 3 AM and I set the variable as GETDATE() - 1, it excludes the data from 12 AM to 3 AM from the result set, because Date is set to YYYY-MM-DD 03:00:00:000. I need it to be set to YYYY-MM-DD 00:00:00:000.
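A common SSIS expression idiom for this, assuming the variable is of type DateTime, is to round-trip the value through DT_DBDATE (which keeps only the date) and cast back, so the time portion comes out as midnight:

    (DT_DATE)(DT_DBDATE)DATEADD("dd", -1, GETDATE())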
I hope to update a DateTime column value with a Time input parameter. My attempt below is a poor one, but the @ApptTime parameter comes in as 10:45:00.0000000, and I might have an existing @SendOnDate such as 2015-10-05 07:00:00.000. I hope to end up with 2015-10-05 10:45:00.000.
ALTER PROCEDURE [dbo].[SendEditUPDATE]
    @QuePoolID int = null,
    @ApptTime time(7),
    @SendOnDate datetime
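A sketch of one way to splice the time onto the existing date inside such a procedure, without going through strings; the table name is hypothetical and the casts assume SQL Server 2008 or later:

    UPDATE q
    SET SendOnDate = CAST(CAST(q.SendOnDate AS date) AS datetime)
                   + CAST(@ApptTime AS datetime)   -- time converts to 1900-01-01 hh:mm:ss, so '+' adds just the time
    FROM dbo.QuePool AS q                          -- hypothetical table name
    WHERE q.QuePoolID = @QuePoolID;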
I am using VS2005 (VB) to develop a PPC WM5.0 program, and I am using SQL CE 3.0. My PPC hardware runs at 400MHz.
The question is: when the program tries to insert the first record into the sdf database after each time the program is started, it takes a long time. Does anyone know why, and how can I fix it?
I load the whole database into a dataset when the program starts, do all the "Insert", "Update" and "Delete" operations in this dataset, and fill it back into the database after each action.
cn.Open()
sda = New SqlCeDataAdapter(SQL, cn)   'SQL = Select * From Table
scb = New SqlCeCommandBuilder(sda)
sda.Update(dataset)
cn.Close()
I checked sda.Update(); it normally takes about 0.08s to fill one record into the database. But:
1. Start the PPC Program
2. Load DB into dataset
3. Create a ONE new record in dataset
4. Fill back to DB
When I take these four steps each time, the filling time is almost 1s or even more!
Actually, 0.08s is just the normal case. Sometimes it still takes over 1s to fill back a dataset that has only one inserted record while the program is running. (Even when all inserted records are exactly the same in data, just different in the integer key.)
However, when I give up the dataset and use the following code:
cn.Open()
Dim cmd As New SqlCeCommand(SQL, cn)   ' I have built the insert SQL before (Insert Into Table Values(XXXXXXXXXXXXXXX all fields))
cmd.ExecuteNonQuery()
cn.Close()
I found that the first inserted record still takes more time, but only about 0.2s, and the normal insert time is around 0.02s. It is 4 times faster!!!