Loading Data From Xml Http Stream

May 8, 2007

Hi,

I have a problem which I don't entirely know how to tackle:

Essentially, is it possible to query a web service (via HTTP) using SQL Server 2000, and then import that data into the database?

I have seen many posts on OPENXML and SQL Server's bulk load facilities, but nobody seems to mention whether you can open an HTTP stream and read the XML in from there.
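
One approach that works on SQL Server 2000 (assuming the OLE Automation procedures are enabled) is to fetch the response with the MSXML2.ServerXMLHTTP COM object via the sp_OA* procedures and then shred it with OPENXML. A minimal sketch; the URL, element paths and target table are placeholders, and responses longer than 8000 characters would need a different buffering strategy on 2000:

DECLARE @obj int, @hr int, @xml varchar(8000), @doc int

-- fetch the XML over HTTP
EXEC @hr = sp_OACreate 'MSXML2.ServerXMLHTTP', @obj OUT
EXEC @hr = sp_OAMethod @obj, 'open', NULL, 'GET', 'http://example.com/service', 'false'
EXEC @hr = sp_OAMethod @obj, 'send'
EXEC @hr = sp_OAGetProperty @obj, 'responseText', @xml OUT
EXEC @hr = sp_OADestroy @obj

-- shred the response into rows and insert them
EXEC sp_xml_preparedocument @doc OUTPUT, @xml
INSERT INTO dbo.ImportedData (Id, Name)
SELECT Id, Name
FROM OPENXML(@doc, '/root/item', 2)
     WITH (Id int 'Id', Name varchar(50) 'Name')
EXEC sp_xml_removedocument @doc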

Any help would be greatly appreciated.

Thanks.

View 3 Replies



The Incoming Tabular Data Stream (TDS) Remote Procedure Call (RPC) Protocol Stream Is Incorrect

May 22, 2006

I've read the other posts related to this issue, but I'm just REALLY confused as to what's happening in my case. Like everyone else, it was working fine in SQL 2000 but now in SQL 2005 there is an issue. I'm calling a stored procedure with parameters defined like this:

@action varchar(10),
@GLTransactionID int = NULL OUTPUT ,
@GLBatchID int = NULL ,
@GLAccountID int = NULL ,
@CurrencyID int = NULL ,
@LocalDebit decimal(28, 13) = NULL ,
@LocalCredit decimal(28, 13) = NULL ,
@BaseDebit decimal(28, 13) = NULL ,
@BaseCredit decimal(28, 13) = NULL ,
@TransID int =NULL,
@Description varchar(255) = NULL

I am calling this proc from VS.NET 2003 using the .NET SqlClient Data Provider (C#). I'm setting the values of the parameters like this:

cm.Parameters.Add("@action", "insert");
cm.Parameters.Add("@GLBatchID", _gLBatchID.DBValue);
cm.Parameters.Add("@GLAccountID", _gLAccountID.DBValue);
cm.Parameters.Add("@CurrencyID", _currencyID.DBValue);
cm.Parameters.Add("@LocalDebit", _localDebit.DBValue);
cm.Parameters.Add("@LocalCredit", _localCredit.DBValue);
cm.Parameters.Add("@BaseDebit", _baseDebit.DBValue);
cm.Parameters.Add("@BaseCredit", _baseCredit.DBValue);
cm.Parameters.Add("@TransID", _transID.DBValue);
cm.Parameters.Add("@Description", _description.DBValue);

When I execute the call to the stored proc I get this:

"The incoming tabular data stream (TDS) remote procedure call (RPC) protocol stream is incorrect. Parameter 8 ("@BaseDebit"): The supplied value is not a valid instance of data type numeric. Check the source data for invalid values. An example of an invalid value is data of numeric type with scale greater than precision."

Using the VS.NET command window I then inspect that parameter to see what the heck is going on and get this:

?cm.Parameters["@BaseDebit"].SqlDbType
Decimal
?cm.Parameters["@BaseDebit"].Precision
0
?cm.Parameters["@BaseDebit"].Scale
22
?cm.Parameters["@BaseDebit"].DbType
Decimal
?cm.Parameters["@BaseDebit"].Value
1000000
[System.Decimal]: 1000000

So I set a decimal parameter to 1,000,000; that parameter in the DB is defined as decimal(28, 13), so it should fit with no problem, but it seems the SqlClient data provider is confused and thinks 1,000,000 is decimal(0, 22)?

View 5 Replies View Related

The Stream Cannot Be Found. The Stream Identifier That Is Provided To An Operation Cannot Be Located In The Report Server Database

Apr 6, 2008

I receive this error when I make a deployment to our new server (a virtual server).
The report works fine in Report Manager. In my application, I use the RenderStream method to retrieve the images and embed them in the web form. I googled it and found some people having the same issue because of the cookie, so they set 'UseSessionCookies' = false in the ConfigurationInfo table of the ReportServer database. I tried this, but no luck.

Also, there is a hotfix from Microsoft: http://support.microsoft.com/kb/913363.
I have requested a copy, but I'm not sure whether it's going to be helpful.


Any clues or suggestions welcome.

Thanks

View 18 Replies View Related

Error Loading Http://localhost/ReportServer

Dec 1, 2006

Hi!

I've installed the Report Server but when I try to go to http://localhost/ReportServer I get the following error:
Reporting Services Error
An internal error occurred on the report server. See the error log for more details. (rsInternalError)

And the log in the c:\Program Files\Microsoft SQL Server\MSSQL.3\Reporting Services\LogFiles\SQLDUMPER_ERRORLOG.log file is this:

12/01/06 12:51:40, ERROR , SQLDUMPER_UNKNOWN_APP.EXE, AdjustTokenPrivileges () failed (00000514)
12/01/06 12:51:40, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, Input parameters: 4 supplied
12/01/06 12:51:40, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ProcessID = 5344
12/01/06 12:51:40, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ThreadId = 0
12/01/06 12:51:40, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, Flags = 0x0
12/01/06 12:51:40, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, MiniDumpFlags = 0x0
12/01/06 12:51:40, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, SqlInfoPtr = 0x0A0F2A04
12/01/06 12:51:40, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, DumpDir = <NULL>
12/01/06 12:51:40, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ExceptionRecordPtr = 0x00000000
12/01/06 12:51:40, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ContextPtr = 0x00000000
12/01/06 12:51:40, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ExtraFile = <NULL>
12/01/06 12:51:40, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, InstanceName = <NULL>
12/01/06 12:51:40, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ServiceName = <NULL>
12/01/06 12:51:40, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, Callback type 11 not used
12/01/06 12:51:42, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, Callback type 7 not used
12/01/06 12:51:42, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, MiniDump completed: C:\Archivos de programa\Microsoft SQL Server\MSSQL.3\Reporting Services\LogFiles\SQLDmpr0058.mdmp
12/01/06 12:51:42, ACTION, aspnet_wp.exe, Watson Invoke: No


Does anybody know what the problem could be???

Thanks in advance! :)

View 5 Replies View Related

Data Stream And SQL

Jul 28, 2004

Hi all,

I am just trying to establish if there is a way that I can capture a direct data stream(feed) from another server and have it input into a table.

The scenario is -
I have a PABX that is outputting a stream of data with each record being 83 characters long. Each of the fields is separated by a space. The end of the record is marked by a line feed. The port it is being sent to on my local machine is port 5000.

Is this possible to do or not? I know that some of our DB guys who use Oracle can do it; I would like to have this in an MSSQL database so that I can use it and manipulate it a bit further.

Thanks.

View 2 Replies View Related

Weird Replication Bug. Bulk Data Stream Was Incorrectly Specified As Sorted.

Apr 17, 2006

Before anyone even says it, I checked the collation order on everything and it's the same. I get the error when the snapshot is being bulk copied to the subscriber.

I'm on SQL2K SP4; server and db collations are SQL_Latin1_General_CP1_CI_AS. Here's a repro. First, run this in a blank db:

if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[Event_Transactions]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
drop table [dbo].[Event_Transactions]
GO

CREATE TABLE [dbo].[Event_Transactions] (
[EventTransactionId] [int] IDENTITY (1, 1) NOT NULL ,
[OrphanedFlag] [bit] NOT NULL ,
[ProcessedFlag] [bit] NOT NULL ,
[ProcessedTimeStamp] [datetime] NULL ,
[EventTimeStamp] [datetime] NOT NULL
) ON [PRIMARY]
GO

CREATE CLUSTERED INDEX [EventTransactions_IDX_ProcessedOrphanedEventTimeStamp] ON [Event_Transactions] (
[ProcessedFlag],
[OrphanedFlag],
[EventTimeStamp]
) ON [PRIMARY]
GO

ALTER TABLE [dbo].[Event_Transactions] ADD
CONSTRAINT [PK_Event_Transactions] PRIMARY KEY NONCLUSTERED
(
[EventTransactionId]
) ON [PRIMARY]
GO

insert into Event_Transactions (
OrphanedFlag
,ProcessedFlag
,ProcessedTimeStamp
,EventTimeStamp
)
values (
1
,0
,NULL
,'2004-05-07 15:15:24.000'
)

insert into Event_Transactions (
OrphanedFlag
,ProcessedFlag
,ProcessedTimeStamp
,EventTimeStamp
)
values (
0
,1
,'2004-07-08 13:04:01.513'
,'2004-07-07 16:52:08.000'
)

Now, use transactional replication to replicate it to another db, taking all the defaults. When the distribution agent tries to apply the snapshot, it fails with the message mentioned in the title.

Has anyone ever seen this? It's keeping us from considering MS replication for one of our major products. Thanks.

View 1 Replies View Related

Change Reporting Services Default URL (From Http://server/reports To Http://CompanyReports)

Jul 31, 2007

I looked online and couldn't find anything to help me make this change. I want to change the default URL for reporting services to another url. Is this possible? Any assistance would be greatly appreciated.

View 3 Replies View Related

Access Cube Data Via HTTP

Feb 26, 2003

Is there any way to use a web based thin client to access OLAP without installing analysis services on the web server?

Thanks for your help!

View 1 Replies View Related

Downloading Data From A Http Website

Jun 23, 2008

Hi guys,
I want to use an SSIS package to download files from an HTTP website.
I tried using the HTTP connection manager and the Web Service task, but it asks me for the WSDL file, and I don't know where that file is.
Can anyone please guide me, or suggest any other way to do this?

View 3 Replies View Related

Integration Services :: SSIS VB Script Loading Data Into Oracle DB Missing Some Data

Nov 10, 2015

I'm using a Script Component to load data into an Oracle DB due to a poor performance issue. Now I have found that it is missing some data during the transmission. Please see the screenshot below:

SQL Server:
Oracle:
DDL:

create table Person
(
BusinessEntityID Integer,
FirstName nvarchar2(50),
MiddleName nvarchar2(50),
LastName nvarchar2(50)
);

Result:

I followed this article: [URL] ....

VB Script: 
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

[Code] ..........

View 8 Replies View Related

Master Data Services :: Error Code 8 While Loading Data From MDS Stage To Model

Apr 22, 2015

I am getting ErrorCode 8 while loading the data from stage to model. I have checked my error view; it states that "Member Code is Inactive".

Initially I loaded the same set of data into the model from the MDS stage table, but then deleted it with ImportType = 5, which removed all the data from the MDM model.

Now I want to load it back, but it's giving Error Code 8. Before loading the same data I changed the stage table ImportType to 2 and ImportStatus_ID to 0.
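
If it is the deactivated members causing this, resubmitting the failed rows with ImportType 2 and then running the staging sweep procedure again is the usual pattern. A rough sketch (entity name, batch tag and version name are placeholders; the procedure follows the stg.udp_<EntityName>_Leaf naming, so adjust to your model):

UPDATE stg.MyEntity_Leaf
SET ImportType = 2,         -- replace existing data and reactivate deactivated members
    ImportStatus_ID = 0,    -- mark the rows as ready to be processed again
    BatchTag = N'Reload_01'
WHERE ImportStatus_ID = 2;  -- rows that failed on the previous run

EXEC stg.udp_MyEntity_Leaf
     @VersionName = N'VERSION_1',
     @LogFlag = 1,
     @BatchTag = N'Reload_01';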

View 6 Replies View Related

Do We Have To Always Use Slowly Changing Dimensions (SCD) Component In The Data Flow For The Loading Of Table Data?

Feb 28, 2008

Hi, all experts here,
Do we always have to use the SCD component when loading data into a data warehouse to handle changes to rows?
I am looking forward to hearing from you and thank you very much in advance for your help.
With best regards,

View 4 Replies View Related

Problem Loading Data From FlatFile Source Data For Column Overflowed The Disk I/O Buffer

Sep 10, 2007



Hi, I am trying to do a straightforward load from a flat file source. I have defined the columns according to the lengths defined in the data dictionary provided, but when I try to run the task I encounter this error:

The column data for column "Column 20" overflowed the disk I/O buffer.

I tried to add another column 21 at the end and truncate it, or leave that column unmapped to the destination, but the same problem occurs for column 21. What should I do to overcome this?

In case of bad data, how do I clean up the source? Please help me with this.








View 5 Replies View Related

Loading Data From A Text File To SQL Database

Sep 10, 2007

Hello!! Searching for information about how to migrate some data from an old database (of any type) to SQL, I've found this:
LOAD DATA [LOW_PRIORITY | CONCURRENT] [LOCAL] INFILE 'file_name.txt'
[REPLACE | IGNORE]
INTO TABLE tbl_name
[FIELDS
[TERMINATED BY 'string']
[[OPTIONALLY] ENCLOSED BY 'char']
[ESCAPED BY 'char' ]
]
[LINES
[STARTING BY 'string']
[TERMINATED BY 'string']
]
[IGNORE number LINES]
[(col_name_or_user_var,...)]
[SET col_name = expr,...]
Does anybody know how it works and how to use it? I'd like to know because I have to load data from a text file into a SQL database, and this seems to be the fastest and easiest way to do it... Thanks!!!! Bye!
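
Note that LOAD DATA INFILE is MySQL syntax and won't run on SQL Server; the closest built-in equivalent there is BULK INSERT (or the bcp utility). A rough sketch, with a made-up table and file path:

BULK INSERT dbo.MyTable
FROM 'C:\data\file_name.txt'
WITH (
    FIELDTERMINATOR = ',',   -- character that separates the columns
    ROWTERMINATOR = '\n',    -- character that ends each line
    FIRSTROW = 2             -- skip a header row, if the file has one
);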

View 1 Replies View Related

SSIS - Data Loading Job -- Update Col B With Col A If Col B Is NULL In The Data File?

May 10, 2007

How do you achieve this during the SSIS data load execution itself?

Update Col B with Col A's value if Col B is NULL in the data file?
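
If it doesn't strictly have to happen inside the data flow (a Derived Column transform with an ISNULL test would do it there), one simple alternative is to load the file as-is into a staging table and run a small update afterwards; table and column names below are placeholders:

UPDATE dbo.StagingTable
SET ColB = ColA        -- copy Col A's value into Col B
WHERE ColB IS NULL;    -- only for rows where Col B came in empty from the file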

View 1 Replies View Related

Http://localhost/reportserver Works, Http://<servername>/reportserver Doesn't

Aug 1, 2007

Hi All,

I have set up SSRS 2000 and gotten it to work, but I am having trouble with SSRS 2005. I can't access ReportServer from anywhere on the network. The only way to get to ReportServer is to Terminal Services into the server and hit it with http://localhost/reportserver. The server is Windows 2003 Server Standard Edition running SQL 2005 SP2 and SharePoint Portal Server 2007. Can somebody please help? Thank you.

View 11 Replies View Related

Data Missing When Loading Data Into Sql Server 2005

Jan 17, 2008



Hi, Experts

The project is a C/S data analysis system built with .NET 2.0 in a Windows environment. OS: Microsoft Windows 2003 R2 Standard Edition Service Pack 2; the database used in this project is SQL Server 2005. As a data analysis system, we need to load a large amount of data from files into the database, and we do it by creating a DTS package and then loading the data by executing "m_Package.Execute(null, variables, m_PackageEvents, null, null)".

The problem is, we found that the load misses some data randomly sometimes, and we can't find the pattern so far. For example, we have data as follows in the data file, with all data fields split by '|':
11234|26341|2007-09-03 00:00|0|0|0.0|0|0.011470833793282509|1|0.045497223734855652|0|0|1|0|3|2929|13130|43|0|2|0|0|40|1|0|0|0|0|0|1||0|0|3|0|0|0|0|0||0|3|0|0|43|43|0|41270|0|3|0|0|10|3|0|0|0|0|0||0|1912|0|0|3|0|0|0|0|0|0|0|3|0|0|5|0|40|0|9|0|0|0|0|0|0|0|0|29|1|1|24|24.0|16|16.0|0|0|0.0|0|0|24|23.980693817138672|0|0.0|0|0.0|0|0.0|0|0.0|11|2.0764460563659668|43|2|0|0|30|11|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|3|3|0|0|0|0|0|0|0|0|0|6|0|0|0|0|0|6|0|0|45|1|0|0|0|2|42|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|2|0|0|0|2|0|0|0|0|0|0|51|47|85|0|0|||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|||||||||||||0|0|0|0|0|97.117401123046875|0|0|83|57|||0.011738888919353485|0|1|0.065955556929111481|0|4|||0.00026658669230528176|1|0.00014440112863667309|1|68|53|12|2|1|2.0562667846679688|10|94|2|0|0|30|11|47|4|13902|7024|6878|18|85|4.9072666168212891|5|0.0|0|0.0|0|0.0|0|0.0|0|358|6448|6324|0|0|0|0||0||462|967|0|41|39|2|0|0|0|1|0|0|0|0|0|0|0|0|3|0|0|3|0|0|0|0|0|0|0|0|0|3|0|3|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|46|0|1|0|1|37|0|0|46|0|1|0|1|37|0|0|0|0|0|0|0|0|0.0|0|0|6|4|2|0|0|2|1|0|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|1|0.012290795333683491|0|44|44.0|0|0.0|0|0|0|30|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|2|0|2|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|2|1|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|1|0|0|0|0|0|0|0|0|0|0|0|0|27|0|0|2112|411|411|45|437|2|0|2|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|4|0|4|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|1|0|0|0|0|0|0|0|0|0|0|0|6|6|0|3|2|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|5|5.0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|600|600|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|6|0|0|0|0|0|0|6|0|9|1|2|2|3|0|1|0|0|0|0|0|0|0|0|0|0|0|13|3|2|5|1|1|1|0|0|0|102|0|1|1|0|0|0|3|3|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||0|0|0|0|0|0|0|0|0|0|0||0|0|0|0|0|0|0|0|0||||||||||0|0|0|0|0|0|0||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|46.0|46|0.0|0|0.0|0|0.011469460092484951|1|0.0|0|0.0|0|3|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|0.0|0|0|0|0|0|0|0|0|0|0|0|0|0|||0|100.0|100.0|0|1|0|1|0|0|0.02481505274772644|1|0.014951236546039581|1|0|0|0|0|0|0|0|0|0|0|0|0|0|||||||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|||0|||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|4695.5556640625|42260|7126.66650390625|62040|||||||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||||||0|0||||||||||

11213|26341|2007-09-03 00:00|0|0|0.0|0|0.068584725260734558|2|0.14375555515289307|27|0|2|1|11|3966|13162|97|0|13|0|0|83|3|2|3|0|0|0|26||0|0|11|0|0|0|1|0||0|1|0|3|97|98|0|246795|0|11|1|0|3|14|0|0|0|0|0||0|1912|0|0|12|0|0|0|0|0|0|0|12|0|0|17|0|83|2|2|2|0|0|0|0|0|0|0|73|3|1|24|24.0|16|16.0|0|0|0.0|3|0|24|23.905364990234375|2|0.0040000001899898052|0|0.0|0|0.0|0|0.0|11|2.0772171020507812|97|7|0|0|80|10|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|12|12|0|0|0|0|0|0|0|0|0|41|0|0|0|0|0|41|0|0|99|25|0|0|0|0|74|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|2|0|0|0|0|0|0|0|0|0|0|177|158|332|0|0|||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|3|||||||||||||0|0|0|0|0|0.0|0|0|321|198|||0.041950233280658722|0|2|0.1999288946390152|0|5|||0.00088306842371821404|1|0.00051651173271238804|1|529|113|4|8|2|2.0671226978302002|10|274|7|0|0|80|10|66|17|13934|7024|6910|31|332|4.7173333168029785|5|0.000033333333703922108|1|0.000033333333703922108|1|0.000033333333703922108|1|0.0|0|358|6448|6356|0|0|0|0||0||1609|3423|0|83|78|5|0|0|26|0|0|0|0|0|0|0|0|0|2|0|1|1|0|0|0|0|3|0|0|0|0|2|0|2|0|0|0|0|0|0|0|0|0|0|2|0|0|0|0|0|0.0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|65|0|1|0|1|72|0|0|65|0|1|0|1|72|0|0|0|0|0|0|0|0|0.0|0|0|12|7|0|2|3|16|5|5|6|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|2|0.04131799191236496|0|48|48.0|5|5.0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|0|0|0|0|0|1|0|0|0|0|0|0|9|0|5|1|0|0|0|1|0|0|0|1|1|0|0|0|0|0|0|0|0|0|0|4|2|1|0|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|3|0|0|0|0|3|0|0|0|0|0|0|0|0|121|0|1410|6046|558|1400|192|2467|10|0|5|1|0|0|0|2|0|0|0|1|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|15|0|10|0|0|0|0|3|0|1|0|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|1|0|0|0|0|0|0|0|0|0|0|0|21|9|12|10|3|1|0|1|4|0|1|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|163|4|144|91|92|2|92|0|0|0|0|0|101|92|0|0|0|0|101|0|0|0|0|600|596|1|0|0|3|0|0|0|0|0|0|0|0|0|0|0|9|0|0|1|0|0|0|8|0|34|3|4|14|7|2|3|0|1|0|0|0|0|0|0|0|0|0|41|6|4|23|5|2|1|0|0|0|289|0|7|7|0|0|0|11|11|0|0|4|4|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||0|0|0|0|0|0|0|0|0|0|0||0|0|4|0|0|0|0|0|4||||||||||3|0|0|0|0|0|3||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|0|0|0|0|0|0|0|0|0|55.814689636230469|47.931411743164062|48|0.0|0|0.0|0|0.068584725260734558|2|0.0|0|0.0|0|14|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|0.0|0|0|0|0|1|0|0|1|0|0|0|0|0|||0|100.0|100.0|0|26|26|0|0|0|0.088263392448425293|2|0.056161552667617798|2|0|0|0|0|0|0|0|0|0|5|22|0|0|||||||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|||0|||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|16308.888671875|146780|23162.22265625|184840|||||||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||||||0|0||||||||||

11220|26341|2007-09-03 00:00|0|0|0.0|0|0.309184730052948|2|0.17757916450500488|0|0|7|4|18|3925|13682|172|0|19|0|0|164|10|5|4|0|0|0|2||0|0|20|0|0|1|4|0||0|5|0|4|172|172|0|1165085|0|20|4|0|20|8|0|0|0|0|0||0|1912|0|0|24|0|0|1|0|0|0|0|23|0|0|30|0|164|4|6|8|0|0|0|0|0|0|0|121|10|15|24|24.0|16|16.0|0|0|0.0|4|0|24|23.646148681640625|1|0.0040013887919485569|0|0.0|0|0.0|0|0.0|11|2.0849723815917969|172|5|0|0|123|44|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|26|24|0|0|0|2|0|0|0|0|0|192|1|0|0|0|0|191|0|0|190|12|0|0|0|0|178|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|3|0|0|0|0|0|1|0|0|0|0|1008|953|2758|0|5|||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|4|||||||||||||0|0|0|0|0|84.418106079101562|0|0|2626|1420|||0.29022222757339478|0|5|1.5045944452285767|0|5|||0.0058597764000296593|2|0.0046600494533777237|2|1340|1114|80|119|27|2.2584490776062012|10|1180|5|0|0|123|44|953|55|14462|7024|7438|52|2758|3.0037333965301514|5|0.021266667172312737|1|0.00036666667438112199|1|0.0|0|0.0|0|362|6440|6880|0|0|0|0||0||13711|27667|0|159|149|10|0|0|1|1|0|0|0|0|0|0|0|0|7|0|0|7|0|0|0|0|4|0|0|0|0|7|0|7|0|0|0|0|0|0|0|0|0|2|3|0|0|0|0|0|0.0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|842|0|111|0|102|1702|0|1|842|0|111|0|0|1703|0|0|0|0|0|0|0|0|0.0|0|0|47|26|11|3|7|37|1|20|16|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|4|0.24921548366546631|0|44|44.0|0|0.0|0|0|0|1003|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|10|0|8|2|0|0|0|0|0|0|0|0|0|0|81|1|60|10|10|0|0|0|0|0|0|0|0|0|1|1|0|0|0|0|0|0|0|25|16|4|2|3|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|53|27|17|4|5|0|0|0|0|0|0|0|0|0|421|0|8685|67179|2138|12104|80|26285|104|1|73|16|14|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|87|1|77|7|2|0|0|0|0|0|0|0|0|0|1|0|0|1|0|0|0|0|0|0|0|0|0|0|16|0|9|5|2|0|0|0|0|0|0|0|0|0|155|155|0|105|51|32|9|13|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|1|0|0|0|0|0|0|0|0|0|0|0|5|5.0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|102|0|0|0|0|0|600|445|4|20|32|63|16|15|4|1|0|0|0|0|0|0|0|37|0|0|5|0|0|0|32|0|230|7|10|99|68|22|21|0|3|0|0|0|0|0|0|0|0|0|286|18|10|182|53|17|6|0|0|0|2528|0|10|10|0|0|0|22|22|0|0|25|25|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||0|0|0|0|0|0|0|0|0|0|0||0|0|30|0|0|0|0|0|30||||||||||23|0|0|0|0|0|23||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|1|0|0|0|0|0|0|0|0|0|0|0.0|45.998283386230469|46|0.0|0|0.0|0|0.30917638540267944|2|0.0|0|0.0|0|8|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0.0|0|0.0|0|0|0|0|1|1|0|0|0|0|0|0|0|||0|100.0|100.0|0|2|1|0|0|1|0.73375397920608521|5|0.41600942611694336|6|0|0|0|0|0|0|0|0|0|0|0|0|0|||||||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|||0|||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|98115.5546875|865520|176626.671875|1159360|||||||||||||||||||||||||||||||||||||||||||||||||||||||0|0|0|0|0|0|0|0|0|0|0|0|0|0|0|0||||||||||0|0||||||||||

We found that some of the data fields become 'null' after the load action finishes. If we load the same data again, the problem disappears; we can't reproduce this issue 100% of the time and we don't know why. Can anybody here help us solve this issue or give us some clue?


View 3 Replies View Related

SQL Server 2005 Integration Service (SSIS) Reading HTTP Data

Jan 16, 2008



I need to read in general web pages (not a web service) from a typical web site using SSIS and make them available to other SSIS transformations (a Script Component). I tried using the XML Source, but this appears to require well-formed XML and will not accept HTML, which is what I am likely to be getting from the web pages.
I tried an HTTP Connection Manager with a DataReader Source, but that seems to only accommodate web services.

Can this be done? If someone has an example (tutorial) of how to accomplish this I would greatly appreciate a copy.
James

View 1 Replies View Related

Deleting Existing Data Before Loading New Data

Apr 10, 2007

I have a package which loads data from a flat file (CSV) into 4 tables in a database.
The load is incremental.

I want to clear the data from all 4 tables (in the database) before loading the data from the flat file each time. How can I do this?
I am using 4 OLE DB destinations, 1 multicast and 1 source component to do this.
Can it also happen as a transaction? Because if it deletes the existing data and then can't load the new data, there will be a problem! How do I avoid this?
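
One common way: put an Execute SQL Task ahead of the data flow that clears the four tables, and wrap the whole thing in a transaction (TransactionOption = Required on the package, or on a sequence container holding both the Execute SQL Task and the data flow) so the delete is rolled back if the load fails. Table names below are placeholders:

-- Executed by an Execute SQL Task before the data flow
DELETE FROM dbo.Table1;
DELETE FROM dbo.Table2;
DELETE FROM dbo.Table3;
DELETE FROM dbo.Table4;
-- With TransactionOption = Required on the enclosing container, these deletes
-- and the subsequent load commit or roll back together.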

View 4 Replies View Related

Loading Data

May 18, 2001

Hi,
I am loading data from a mainframe to SQL Server on Windows NT.
Normally the DTS job was taking 35 minutes.
For the last two days it has run for more than six hours and the job still doesn't finish.
I am at a loss to know what to do and how to fix this problem. The mainframe people said SQL Server is fetching the data very slowly.
If anyone knows the solution, please post it.

View 5 Replies View Related

Data Loading

Apr 10, 2000

What is the best way to load large amounts of data? I am working on a project where I will need to load data into approx. 20 tables. Into several of the tables I will need to load around 400,000 records. I am familiar with the concepts involved in using BCP but was hoping I could avoid the step of going to text files. I am pulling data from Access (either 97 or 2000). Any suggestions would be welcome.

View 1 Replies View Related

Loading Data From Xml

Sep 17, 2006

I have an xml file I want to use as the source. It's not overly complicated, but not simple either. It has one hierarchy and one optional field, and looks like this:



a
  1 'text'
  2 'text'
b
  1 'text'
  2 'text' 'optional field'



Ok, now I want the data to load like this:

a,1,text
a,2,text
b,1,text
b,2,text, optional field

but when I try to use the XML Source it won't create the XSD... anything I can do?

View 9 Replies View Related

Help Loading XML Data

May 1, 2008



I am trying to enhance an existing package that does an XML-to-table load. This package uses a script component to parse the XML file and load it into multiple tables (3 tables) using VB.NET code. I want to add a component so that if any XML rows error out, they get redirected to a different table; this log table has only one column, and the XML records are supposed to be loaded into this column in XML format. This is the existing design and I have to live with it, and at the same time I am not a big .NET coder, so any help is appreciated.

Thanks

View 5 Replies View Related

Loading Data To Db2

Jun 29, 2006

Hi,

Trying to load data from SQL 2000 to DB2 UDB 7.2, building an SSIS package using the wizard.

case 1:

source: OLE DB - SQL 2000; destination: IBM OLE DB Provider for DB2

data conversion component placed on the data flow page; the package runs fine

case 2:

source: OLE DB - SQL 2000; destination: Microsoft OLE DB Provider for DB2

data conversion component NOT placed on the data flow page, just source and destination; the package fails (can't detect DB2 code page)

questions

If I transfer 1,000,000 rows from SQL to DB2, what will happen?

In case 1:

Is it a single transaction?

Is it possible to set "MaxInsertCommitSize" or some other property and commit every N rows?

Case 2:

If I remember correctly, the Microsoft OLE DB driver for DB2 (used in Host Integration Server) forces a commit on every row (I might be wrong).

Again, is there any property that could be set to commit every N rows?

Thanks

Alex



View 1 Replies View Related

Stream Aggregate

Jul 24, 2001

What I'm trying to solve:
I have an application that generates SQL queries, and sometimes uses
DISTINCT where the result set has no dupe rows. In terms of database
resources, I'm trying to figure out if it's worth it to change to app to be
smart enough to not use DISTINCT where it won't serve any purpose, or
whether to let it do the DISTINCT and save added complexity to the query
building application. I.e. what is the cost of DISTINCT where there are no
dupe rows?

What I want to know:
Can someone explain how the stream aggregate operator actually goes about
doing its work?

Does this always create a temp table for sorting and discarding duplicates (for DISTINCT)? If the answer is "no or sometimes", how does it do so in the case where a temp table is not involved? I noticed the estimated I/O for this operator was zero for some queries I wrote against pubs. Does this mean that the optimizer believes the temp table needed will fit in memory and creates it in-memory? Or does the estimated I/O figure not include disk writes for work tables?
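
Not a full answer, but one way to see what the stream aggregate is actually doing for a given query is to compare the plan with and without the DISTINCT, for example against pubs:

SET SHOWPLAN_TEXT ON
GO
-- title_id is the primary key, so DISTINCT cannot remove any rows here
SELECT DISTINCT title_id, title FROM titles
GO
SELECT title_id, title FROM titles
GO
SET SHOWPLAN_TEXT OFF
GO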

tia for any info
Bill

View 1 Replies View Related

What Is My Best Strategy For Loading Data.

Apr 22, 2007

I have been developing a genealogy application using a SQL Server 2000 database and ASP .NET 2.0.  In this application a process, Ged.Parse, converts data from the GEDCOM standard format (a hierarchical file format that looks as if it was designed for 80-column cards) into my SQL Server database.
As we started to load reasonable quantities of data into the system we found that the on-line response became abysmal.  This problem was fixed by defining a number of secondary indexes (response times dropped to under a second, from previously exceeding 2 minutes and often timing out).  Unfortunately however the processing time of Ged.Parse then tripled, and it may now take up to an hour to process a GEDCOM. I believe that this is a byproduct of defining several indexes that are not needed by Ged.Parse itself, but which are of course maintained as Ged.Parse inserts new records into the database.  
I am wondering what my best strategy is, apart from putting Ged.Parse into a background task and just letting it trickle away.  (I will probably do this anyway). What I'd like to be able to do is to have Ged.Parse load records without creating the secondary indexes, and then create the indexes for the newly-added records as a penultimate step just before it makes them available for general use.  Of course there is no way that you can do this:  records in a table are either indexed or they are not.
Proposed change: recode Ged.Parse to load data into temporary tables, say NewPeople, NewFacts, etc., with these tables having only the indexes required by Ged.Parse. Then, as the last step, Ged.Parse would run a SQL procedure with code like:
            Insert into People Select * From NewPeople
            Delete from NewPeople
            etc.
This is a reasonable amount of programming, so before I make this change could somebody tell me:  will this be significantly faster overall, or is this likely to make little or no improvement compared to the present process in which Ged.Parse loads data directly into People, Facts, etc?   Two facts that may influence the answer.  First, all record relationships are through GUIDs, so records in NewPeople, NewFacts, etc would already have their final key values.  Second: although Ged.Parse needs to form relationships between records, these relationships are only within the new records (created from the same GEDCOM), and Ged.Parse does not need to relate any of these new records to earlier records.
Thank you,
Robert Barnes.
 

View 2 Replies View Related

Loading Data Into SQL7

Jun 26, 2001

Hi!
I need to load text data into SQL 7. The tricky part (at least for me) is that this data may be duplicated.
How can I load this data while discarding the duplicated rows? AFAIK, the SQL job (or DTS) will fail if a primary key is violated.
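
One pattern that works on 7.0 is to bulk load the file into a keyless staging table first (BCP, DTS or BULK INSERT), then move only the rows that won't violate the key across; table and column names below are placeholders:

INSERT INTO dbo.TargetTable (KeyCol, Col1, Col2)
SELECT DISTINCT s.KeyCol, s.Col1, s.Col2
FROM dbo.StagingTable s
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.TargetTable t
                  WHERE t.KeyCol = s.KeyCol)   -- skip rows already loaded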
TIA,
Fabio Aneas

View 1 Replies View Related

Can I Automate Data Loading Through DTS?

Sep 19, 2000

Using SQL Server 7.0, I need to watch for a file to be placed in a directory and then load it automatically. What is the best way to do this? I have the Bulk Insert process set up in DTS and would like to just add another process if possible.
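
One way to do it without touching the DTS package itself is a scheduled job step that polls for the file with xp_fileexist and, when it appears, launches the package with dtsrun; the server, path and package names below are made up:

DECLARE @exists int
EXEC master.dbo.xp_fileexist 'C:\Import\incoming.txt', @exists OUTPUT
IF @exists = 1
BEGIN
    -- run the existing DTS package that does the Bulk Insert
    EXEC master.dbo.xp_cmdshell 'dtsrun /S MyServer /E /N "LoadIncomingFile"'
END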

View 2 Replies View Related

Loading Data Into Sql Table

Apr 25, 2008

I want to read a file from c:\abc.txt through SQL programming and insert this file's data into a table.

Suppose c:\abc.txt has the following data:
aman 10 50,000
sumesh 20 40,000

Suppose I have created the table abc (a varchar(30), b int, c int).
How will I do it?
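
Assuming the fields in c:\abc.txt are separated by a single space, BULK INSERT can do it in one statement (note the third value "50,000" would need the comma removed, or column c changed to varchar, to load as an int):

BULK INSERT dbo.abc
FROM 'c:\abc.txt'
WITH (
    FIELDTERMINATOR = ' ',   -- columns are separated by a space
    ROWTERMINATOR = '\n'     -- one record per line
);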

Thanks
Ajay

View 3 Replies View Related

Loading Data Into A Database

Sep 13, 2007

Greetings,

I am new to SQL and I have been asked to load some data into a database. I was given a file that has the extension .sql. Shown below are the first few lines:

quote:TRUNCATE TABLE henrylee.[HenryLee]
GO
INSERT INTO henrylee.[HenryLee] ( [custno], [company], [address1], [address2], [city], [state], [zip], [phone], [email] ) VALUES ( 'C00001', 'SUPREME NOVELTY', '5954 S PULASKI ROAD', '', 'CHICAGO', 'IL', '60629', '', '' )

several more rows after this

This looks to me like it was built to be run as a script, or to use some function of SQL to create and populate the table... does that make sense? Anyway, is there an easy way to insert this data into a table?
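
Yes, that file is just a ready-made script (a TRUNCATE followed by INSERT statements), so rather than importing it as data you simply execute it against the database: open it in Query Analyzer / Management Studio and run it, or run it from the command line with osql (server and database names are placeholders):

osql -S MyServer -d MyDatabase -E -i "C:\path\to\file.sql"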

Thanks
-B

View 15 Replies View Related

Loading Data To A Table

Feb 20, 2008

Hi,
I created a package to load a fact table which loads more than 7 million records.
When loading the table it took nearly 15 minutes.
Then indexes were created on the table at the DB level, after which the time to load the same number of records increased.
How do I resolve this delay?
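
That is the expected trade-off: every inserted row now has to maintain the indexes as well. A common pattern for big fact loads is to disable (or drop) the nonclustered indexes before the load and rebuild them afterwards, e.g. two Execute SQL Tasks around the data flow. Index and table names below are placeholders, and the clustered index should be left alone:

-- before the data flow
ALTER INDEX IX_FactSales_DateKey ON dbo.FactSales DISABLE;
ALTER INDEX IX_FactSales_ProductKey ON dbo.FactSales DISABLE;

-- after the data flow
ALTER INDEX IX_FactSales_DateKey ON dbo.FactSales REBUILD;
ALTER INDEX IX_FactSales_ProductKey ON dbo.FactSales REBUILD;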

View 4 Replies View Related

Loading Data From Sql To Access

Oct 25, 2007

Hi, the scenario is:

I have a remote website which uses a SQL Server database. A SQL job runs on a server called goofy which imports the data from the remote SQL database into an Access database sitting on another server called dumpy. I get the error below when trying to import data. I checked, but no one is accessing this data.


Executed as user: GOOFYSYSTEM. ...e Package Utility Version 9.00.1399.06 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 7:45:29 PM Progress: 2007-10-25 19:45:29.44 Source: Data Flow Task 1 Validating: 0% complete End Progress Progress: 2007-10-25 19:45:29.44 Source: Data Flow Task 1 Validating: 33% complete End Progress Progress: 2007-10-25 19:45:29.60 Source: Data Flow Task 1 Validating: 66% complete End Progress Error: 2007-10-25 19:45:29.68 Code: 0xC0202009 Source: importdata Connection manager "SponsorshipData 1" Description: An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft JET Database Engine" Hresult: 0x80004005 Description: "The Microsoft Jet database engine cannot open the file '\dumpyd$CompanyCCNASponsorshipData.mdb'. It is already opened exclusively by another user, or you need permission to view its data.". End Error Error: 20. The step failed.

My other question is: there is a column of type Memo in Access, but in the remote SQL database I have that field as ntext. How could I also pull the ntext into the Memo field? I saw this discussed in forums but found it too complex to understand; is there any step-by-step guide to pulling ntext data into a Memo field in SSIS?

thanks in advance.

View 3 Replies View Related

Loading Data From XML To Tables

Dec 1, 2007

Hi,

I have got an XML file more than 2 GB in size. I have to load this file into tables. On a 32-bit platform I am unable to load this file using SSIS. RAM is 8 GB, but it is still bombing out. As far as I know it uses the XML DOM parser and tries to shred the file in memory, and because of the memory limitation it fails. I have already written code in C# using the XmlTextReader object (a SAX-style, forward-only parser) to load the data into tables, but I want to keep this loading process within the DBAs' domain.

I am stuck. Can someone guide me through the situation?
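
One option that keeps it in the DBAs' hands is the SQLXML 4.0 XML Bulk Load component: it streams the document instead of building a DOM, so the 2 GB size isn't an issue. It is normally driven from VBScript, but it can also be called through the OLE Automation procedures; the connection string, mapping schema and file paths below are placeholders:

DECLARE @obj int, @hr int
EXEC @hr = sp_OACreate 'SQLXMLBulkLoad.SQLXMLBulkLoad.4.0', @obj OUT
EXEC @hr = sp_OASetProperty @obj, 'ConnectionString',
     'Provider=SQLOLEDB;Server=(local);Database=MyDb;Integrated Security=SSPI'
-- Execute takes the mapping schema (XSD) and the XML data file
EXEC @hr = sp_OAMethod @obj, 'Execute', NULL, 'C:\load\mapping.xsd', 'C:\load\bigfile.xml'
EXEC @hr = sp_OADestroy @obj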

Thanks for your help!

Navnish

View 8 Replies View Related






