Data Conversion Mapping Weirdness

Mar 4, 2008

Hi,

I have an Excel source > Data Conversion task > OLE DB Destination.

In the Data Conversion task I rename all the outputs to match the column names in my destination table.

However, when I go to map the columns in the OLE DB Destination mapping tab, it's a mess. Some fields are prefaced by "Data Conversion" as in "Data Conversion.column1", others are prefaced by "Excel Source" and others have no prefix at all.

What's confusing is, since all the columns are going through the Data Conversion task, how come I don't have ALL fields prefaced by "Data Conversion" ?

That is, only SOME of the fields have a "Data Conversion" prefix, and some don't. The ones that aren't prefaced by "Data Conversion" have no corresponding "Excel Source" so I'm assuming that the ones without the prefix are from the Data Conversion task.

It's inconsistent. Any ideas?

Thanks

View 3 Replies



Web Based Data Mapping/Conversion Tool

Oct 3, 2007

Can anyone recommend an EXTREMELY user-friendly web-based data mapping / conversion tool? I want to transform csv, xml, and database sources into csv, xml, or database outputs. There are a ton of Windows-based applications (e.g. www.altova.com / MapForce). I am looking for something I can expose via the web...

Users should be able to upload a csv file and then define how the data would be mapped to an output source (say another csv or xml), with functions like concatenation, sum, if-exists, etc...

Does SSIS have a user-friendly interface, or has someone written interactive tools on top of SSIS?

Any information will be greatly appreciated.

Thanx

View 1 Replies View Related

SSIS Parameter Mapping With Oracle Data Type Mapping!

Mar 19, 2008

Hi Friends,

I have a small problem with parameter mapping for an Execute SQL Task.
I am using a delete statement with two conditions,
followed by another Execute SQL Task which contains a commit statement.

delete from tname where c1 = ? and c2 =?

where c1 is of NUMBER(4) datatype and c2 is of VARCHAR2(20) datatype in Oracle.


The connection manager I am using is the Oracle OLE DB provider.
I am passing 2 global variables, i.e. g_v1 of Int32 and g_v2 of String type.

In the parameter mapping of the Execute SQL Task, I am mapping these 2 variables to
c1 and c2 and changed the datatypes inside the parameter mapping to Numeric for c1 and Varchar for c2.

I also set the property BypassPrepare = True.

When I execute the package I get an INVALID NUMBER error.
I believe SSIS is unable to perform the implicit datatype conversion.

For the next run, I changed the g_v1 variable datatype to Double and also changed the parameter mapping for c1 to the Double datatype.
This time it is working fine. I can see the green signal for the 2 SQL Tasks.

But when I connected to Oracle and checked the count in the table, the data had not been deleted.

Also,
I set the property RetainSameConnection = TRUE for the Oracle connection manager.
I am not able to trace this logical error.

The same package works fine on my local machine,
but I am facing the problem when I deploy it on the client machine.


Is there any problem with the parameter mapping?
What is the equivalent datatype for the Oracle NUMBER datatype that should be used inside the SSIS package when declaring the global variable and
inside the parameter mapping?

Any thoughts!
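One hedged workaround, sketched with the table and column names from the post, is to make the conversion explicit on the Oracle side so the provider does not have to convert the bind values implicitly (keeping the SSIS variable as String or Double):

delete from tname
where c1 = CAST(? AS NUMBER(4))
  and c2 = CAST(? AS VARCHAR2(20))

Since the commit is issued from a second Execute SQL Task, it is also worth confirming that both tasks really run on the same connection (RetainSameConnection = TRUE on the connection manager, as already set); otherwise the commit cannot apply to the delete's transaction.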

View 5 Replies View Related

Mapping Of SQL Server Data Types To Integration Services Data Type

Oct 14, 2005

Does anyone know of any cross-references between SQL Server data types and the new data types introduced with SQL Server Integration Services? 
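For what it's worth, a partial cross-reference from memory (worth verifying against Books Online):

int -> DT_I4, bigint -> DT_I8, smallint -> DT_I2, tinyint -> DT_UI1
char/varchar -> DT_STR, nchar/nvarchar -> DT_WSTR
datetime/smalldatetime -> DT_DBTIMESTAMP
decimal/numeric -> DT_NUMERIC, money -> DT_CY, float -> DT_R8, real -> DT_R4
bit -> DT_BOOL, uniqueidentifier -> DT_GUID, binary/varbinary -> DT_BYTES
text -> DT_TEXT, ntext -> DT_NTEXT, image -> DT_IMAGE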

View 6 Replies View Related

Weirdness In .NET Sqlclient?

Mar 4, 2008

Hi All: I'm having problems creating a data connection between a Windows 2000 server running .NET 2.0 and a SQL 2005 server (Server 2003). I'm trying to create a connection in VWD 2005 Express, and when I use the visual wizard, I get a connection error with something about "named pipes" when I'm trying to connect using an alias on the server that uses TCP/IP (the alias is created in cliconfg).
If I click on "advanced properties" in the connection and select "TCP/IP" for the Network Library, what's displayed is "TCP/IP (DBMSGNET)" ...but shouldn't it say "DBMSSOCN"?
I'm wondering if this is correct, or a cosmetic bug in the .net sqlclient GUI, or indicative of any underlying problem that's preventing me from creating the connection?
If I manually override the connection string in web.config by entering "Network Library: DBMSSOCN", I still can't connect, getting a timeout error that the sql2005 server is not responding. But I know the server is working properly, with other .NET DB connections working within the same application...so I'm thinking this must be a problem with VWD and/or the .net framework on my local computer (not the server)?
I don't know where to begin to look to debug this problem?
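As a sanity check, a hedged sketch of a connection string that forces the TCP/IP library explicitly (server, database and credentials are placeholders):

Data Source=MyServer,1433;Initial Catalog=MyDatabase;User ID=myUser;Password=myPassword;Network Library=DBMSSOCN

Also worth noting: aliases created in cliconfg are per-machine, so an alias defined on the SQL Server box itself is not visible to the web server; from the client you'd need the real host name (or host,port) or an alias defined locally.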

View 2 Replies View Related

Servername Weirdness

Mar 30, 2001

Hey folks,
Today's been way too much fun...
Someone changed the name of a machine in our department and now Win2k thinks the machine's name is (for example) "Fred", but if I do a select @@servername, SQL returns (again, for example) "Barney".

The machine's name started out as "Barney".

The last time this happened, it was on an NT4 box, and we just reinstalled sql and away we went.

This time, they reinstalled sql and released the box to the team, only to find out later that reinstalling sql didn't do squat this time.

Which brings me to the question: How do I convince SQL that it's really, truly supposed to be named "Fred"?
Besides that, what do I tell Wilma, Betty, Pebbles and Bamm Bamm?
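For what it's worth, the usual fix is to re-register the local server name and restart the SQL Server service; a sketch using the example names from the post (run in Query Analyzer):

sp_dropserver 'Barney'
GO
sp_addserver 'Fred', 'local'
GO
-- after the service restart, select @@servername should return 'Fred'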

Any assistance would be greatly appreciated.

: )

Thanks,

Tom

View 3 Replies View Related

Inner Join Weirdness In DTS

Nov 23, 2005

I am trying to import data from Access 2000 into SQL Server 2000 using DTS. One of the tasks requires a multi-table join but I am getting syntax errors if I generate the query with Build Query.

With just a single join like this it works fine:

FROM Tracker INNER JOIN
bdmanager ON Tracker.bdmanager = bdmanager.name, country

But as soon as I get it to generate an extra join, e.g.

FROM Tracker INNER JOIN
bdmanager ON Tracker.bdmanager = bdmanager.name
INNER JOIN
country ON Tracker.country = country.country

... I get "syntax error (missing operator)". The weird thing is that it generated the syntax itself! I can paste the query into Access and it works fine.

Why is this happening and what's the best workaround?

Thanks
Andy
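One common workaround (a sketch, using the table and column names from the post) is to give the Jet/Access dialect the nested parentheses it expects when there is more than one join:

FROM (Tracker INNER JOIN
      bdmanager ON Tracker.bdmanager = bdmanager.name)
     INNER JOIN
      country ON Tracker.country = country.country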

View 3 Replies View Related

Sqldatasource Weirdness On Postback

Jan 7, 2006

If I alter the SqlDataSource select command in code and then bind to a gridview, I run into problems. When I do a sort or go to the next page (basically any postback), the datasource goes back to its original state. It is like the SqlDataSource's modified state is not maintained. I end up having to re-alter the SqlDataSource select command on every Page_Load. Is this by design or is this a bug?
Is the SqlDataSource any "smarter" than doing it the old-fashioned way by populating a DataSet on (!IsPostBack)? For example, if I have a bunch of data in a paged gridview, is the SqlDataSource smart enough not to bother filling the entire dataset if I don't need it for that page's display? I know the SqlDataSource provides for update/insert/delete, but I am not doing that in this application; it is just a query/report page.
Thanks in advance

View 4 Replies View Related

Stored Procedure Weirdness

Apr 15, 2008

I've been writing stored procedures for a while, but right now I'm stumped on something. I get an error when I try to use OPENROWSET in my stored procedure telling me I need to set ANSI_NULLS and ANSI_WARNINGS. So I put SET ANSI_NULLS ON GO SET ANSI_WARNINGS ON GO into my stored procedure, clicked Check Syntax (this is all in Enterprise Manager), it was okay, I clicked OK, and it told me I needed to set those options again. So I played around some more, deleting the SET ANSI statements, adding them, moving them around, and eventually it worked after I deleted them again.

Anyone know why this happens? Where exactly are my SET ANSI_NULLS statements supposed to go? As a workaround I've been simply running them in Query Analyzer.
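A hedged sketch of where the settings usually go (procedure name and query are placeholders): the SET options belong in the batch that creates the procedure, not inside its body, since ANSI_NULLS is captured at create time and GO is a batch separator rather than a T-SQL statement.

SET ANSI_NULLS ON
SET ANSI_WARNINGS ON
GO

CREATE PROCEDURE dbo.usp_MyOpenrowsetProc   -- hypothetical name
AS
SELECT *
FROM OPENROWSET('SQLOLEDB',
                'Server=(local);Trusted_Connection=yes;',
                'SELECT 1 AS SanityCheck')  -- placeholder query
GO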

View 1 Replies View Related

Data Conversion Failed. The Data Conversion For Column Value Returned Status Value 4 And Status Text Text Was Truncated Or On

Jan 7, 2008

Hi Experts,

I am extracting data from SQL Server 2005 to a flat file destination. I am using a SQL command to specify the data selection query. One of my queries uses the Replicate function to derive a column value. When I execute this package it fails with the error "Data conversion failed. The data conversion for column "value" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page".

The reason for the problem is that it is taking the InputColumnWidth of the flat file destination as 8000, while I specified the OutputColumnWidth as 4.

If I change the OutputColumnWidth to 8000, it works without any error but results in a column width of 8000.

I tried using the Derived Column transformation's type cast and the Data Conversion transformation, but I still get the same error in the respective transformation components.

Can anyone suggest how to solve this issue?
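One hedged approach (a sketch; the column name, length and table are placeholders) is to CAST the REPLICATE expression inside the SQL command itself, so the source metadata already carries the small width instead of the 8000 characters that REPLICATE reports:

SELECT CAST(REPLICATE('X', 4) AS varchar(4)) AS [value]
FROM dbo.SourceTable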

View 11 Replies View Related

Data Mapping And Migration

May 28, 2004

Has anyone used DTS packages for migrating old data to a new schema?

If so are there any tutorials on this?

I'd prefer not to do this by hand. ;-)

View 3 Replies View Related

Mapping XML Data To Variable

Jun 20, 2007

I can't figure out how to map xml data stored in a table to a variable in Integration Services.

For example:
I would like to use a "for each loop container" to iterate through a row set selected from the database. Each row has three columns: an integer, a string, and an xml data column. In the variable mappings, I can map the integer column and the string column to a variable of type Int and a variable of type String. But I am having trouble mapping the xml data column to any variable. I tried using either a string variable or an object. It always reports an error like "variable mapping number X to variable XXX can't apply".
Any help?
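A hedged workaround (a sketch; table and column names are placeholders) is to cast the xml column to nvarchar(max) in the source query, so the Foreach ADO enumerator can map it to a plain String variable:

SELECT IntCol,
       StrCol,
       CAST(XmlCol AS nvarchar(max)) AS XmlText
FROM dbo.MyTable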

View 1 Replies View Related

Data Mapping And Importing App For SQL Server

Nov 4, 2006

Hello,

Our company often receives data from outside sources to add to our application. This data is usually provided to us in Excel, CSV, XML, etc. The files that we receive usually have different columns from the columns in our database, so we have to map these columns to our table structure to import.

I'm looking for an application that will easily allow me to load up the data file (whatever type it may be), expose the columns in the data file, allow me to map these columns to our SQL Server tables, then import the data. I know that this can be done with DTS; however, I'm looking for alternatives. Does anyone have any recommendations?

Thanks in advance.

View 1 Replies View Related

Data Mapping And Migration Tool?

Jun 21, 2014

I am developing an automation tool for data migration from one table to another. I am looking for a function or SP to which I will pass a source column and a destination column as input parameters, with an output parameter that returns true when the source column data is compatible to copy into the destination column, and false if not.

For example, if the source column is varchar and the destination column is integer, the script should check whether all the data in the source column is good enough to move into the integer column and return the output flag. I want a script that works this way for all data types.
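A hedged sketch of the core check for one varchar-to-integer case (assuming SQL Server 2012 or later for TRY_CONVERT; table and column names are placeholders); a wrapper procedure could build the equivalent test dynamically for other type pairs:

SELECT CASE
           WHEN EXISTS (SELECT 1
                        FROM dbo.SourceTable
                        WHERE SourceCol IS NOT NULL
                          AND TRY_CONVERT(int, SourceCol) IS NULL)
           THEN 0 ELSE 1
       END AS IsCompatible;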

View 2 Replies View Related

Major Query Optimiser Weirdness With UDFs And SPs On SQL 2000

Jul 20, 2005

There is something very strange going on here. Tested with ADO 2.7 and MSDE/2000. At first, things look quite sensible.

You have a simple SQL query, let's say

select * from mytab where col1 = 1234

Now, let's write a simple VB program to do this query back to an MSDE/2000 database on our local machine. Effectively, we'll

rs.open sSQL
rs.close

and do that 1,000 times. We won't bother fetching the result set, it isn't important in this example.

No problem. On my machine this takes around 1.6 seconds, and modifying the code so that the column value in the where clause changes each time (i.e. col1 = nnnn) doesn't make a substantial difference to this time. Well, that all seems reasonable, so moving right along...

Now we do it with a stored procedure

create procedure proctest(@id int)
as
select * from mytab where col1 = @id

and we now find that executing

proctest nnnn

1,000 times takes around 1.6 seconds whether or not the argument changes. So far so good. No obvious saving, but then we wouldn't expect any. The query is very simple, after all.

Well, get to the point!

Now create a table-returning UDF

create function functest(@id int) returns table as
return (select * from mytab where col1 = @id)

try calling that 1,000 times as

select * from functest(nnnn)

and we get around 5.5 seconds on my machine if the argument changes, otherwise 1.6 seconds if it remains the same for each call.

Hmm, looks like the query plan is discarded if the argument changes. Well, that's fair enough I guess. UDFs might well be more expensive... gotta be careful about using them. It's odd that discarding the query plan seems to be SO expensive, but hey, waddya expect? (perhaps the UDF is completely rebuilt, who knows)

Last test, then. Create an SP that calls the UDF

create procedure proctest1(@id int)
as
select * from functest(@id)

Ok, here's the $64,000 question. How long will this take if @id changes each time? The raw UDF took 5.5 seconds, remember, so this should be slightly slower.

But... IT IS NOT. It takes 1.6 seconds whether or not @id changes. Somehow, the UDF becomes FOUR TIMES more efficient when wrapped in an SP.

My theory, which I stress is not entirely scientific, goes something like this:

I deduce that SQL Server decides to reuse the query plan in this circumstance but does NOT when the UDF is called directly. This is counter-intuitive but it may be because SQL Server's query parser is tuned for conventional SQL, i.e. it can say

well, I've got
select * from mytab WHERE [something or other]
and now I've got
select * from mytab WHERE [something else]
so I can probably re-use the query plan from last time.

(I don't know if it is this clever, but it does seem to know when two textually-different queries have some degree of commonality)

Whereas with

select * from UDF(arg1)
and
select * from UDF(arg2)

it goes... hmm, mebbe not... I better not risk it.

But with

sp_something arg1
and
sp_something arg2

it goes... yup, I'll just go call it... and because the SP was already compiled, the internal call to the UDF already has a query plan.

Anyway, that's the theory. For more complex UDFs, by the way, the performance increase can be a lot more substantial. On a big complex UDF with a bunch of joins, I measured a tenfold increase in performance just by wrapping it in an SP, as above.

Obviously, wrapping a UDF in an SP isn't generally a good thing; the idea of UDFs is to allow the column list and where clause to filter the rowset of the UDF, but if you are repeatedly calling the UDF with the same where clause and column list, this will make it a *lot* faster.

View 3 Replies View Related

Mapping SQL Server Data Types To C DataTypes

Jul 20, 2005

Hi,

Can anyone point me to a document somewhere that shows a mapping of SQL Server 2000 datatypes to C datatypes? I am writing some extended stored procedures which need to be able to process pretty much any data type, so I want to make sure I am taking them from SRVPROC and storing them in the correct C data type.

Thanks,
Bruce

View 1 Replies View Related

One To Many Column Mapping In Data Flow Task

Sep 27, 2007



Hi,
Is there a way to accomplish one-to-many, many-to-one, or many-to-many column mappings in the SSIS data flow task, or using any other tasks? We were able to do this in the DTS Transform Data task. Also, is it possible to edit the mapping like:
dest column1 = Right(dest column1, 3)
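A hedged note: in SSIS this kind of edit usually goes into a Derived Column transformation rather than the destination mapping; the expressions below are only a sketch (col_a and col_b are invented names, column1 is taken from the example above):

RIGHT(column1, 3)            (one-to-one edit that replaces an existing column)
col_a + " " + col_b          (many-to-one: combining two input columns into one output)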

Thanks.

View 4 Replies View Related

SSIS Data Cleansing Mapping Problem

Apr 18, 2007

I am just starting out with SSIS and trying to get the feel of data cleansing using it. But on my very first data cleansing project I've run into this weird error.



My data flow is very simple, it has a OLE DB source, Fuzzy Lookup and OLE DB destination. I've built three tables for this purpose, one is source, one is reference (it will be used to match for the real entries in fuzzy lookup) and the last is the destination table.



In all the three tables I've a field of City which I'd like to Fuzzy lookup in the reference table and if it crosses certain confidence level, I'd like to insert to the destination table. City in all the tables has the same datatype, defined in the same way, it is varchar(50).



But when in the fuzzy lookup I try to map the Source tables City field to reference tables City field, it gives me this error:



The following columns cannot be mapped:

[City, CityRef]

One or more columns do not have supported data types, or their data types do not match.



Although, as I have mentioned before, both have the same data types and are defined in the same manner (i.e. I've just selected the datatypes for those columns and all the other settings are left at their defaults). I just cannot understand why this is happening; please help me with this. FYI, I've also tried giving the City column different datatypes in all the tables, like varchar(max) and text, only to be greeted with the same error message.



Waiting for your reply!!

Regards,

Sajid.

View 6 Replies View Related

Mapping Un Structured Report Data Into SQL Server Database

Feb 29, 2008

Hi, I need to map the unstructured report to my SQL Server data source. Is there any method to achieve this in ASP.NET, or is there any good tool available in the market which supports ASP.NET 2.0 with SQL Server 2005? My requirement is to load/parse the existing data into my application, which comes in various formats for each client. Please help me on this.

View 2 Replies View Related

Data Type In Parameter Mapping For An Execute SQL Task

Jul 12, 2006

Hi, I am trying to use an integer as an input parameter for my task and I am getting stuck on the parameter data type.

The input parameter is defined as the @Control_ID variable of type Int32 in SSIS. When I go into the parameter mapping of the Execute SQL Task, I don't find an Int32 data type. I tried Short, Numeric, Decimal and so on, but none of those data types worked, and it returns the following error message:

SSIS package "DCLoading.dtsx" starting.
Error: 0xC002F210 at Update Control_ID, Execute SQL Task: Executing the query "use DCAStaging

update DCA_HFStaging set
[dbo].[Control_ID] = P0 where [Control_ID] is null
" failed with the following error: "The multi-part identifier "dbo.Control_ID" could not be bound.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
Task failed: Update Control_ID
Warning: 0x80019002 at DCLoading: The Execution method succeeded, but the number of errors raised (1) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "DCLoading.dtsx" finished: Failure.

Any help?

View 6 Replies View Related

Custom Data Flow Component Column Mapping Question

Feb 7, 2007

Hi,

I'm having my first go at developing a destination adapter which will send data to an update Web Service.

I've got some rather big gaps in my understanding. I've been following the various samples I've found on the net, have validated my mapping, and have picked up all the available column names and datatypes, which appear in the Input and Output Properties tab of the Advanced Editor, but I only have a tab for "Input Columns" and not "Column Mappings".

Which method defines the available columns for the user to map?

Let me know if I haven't given enough information.

cheers

View 1 Replies View Related

SSRS -- Data Driven Subscription And Pivoting For Dynamic Parameter Mapping

Feb 12, 2007

Hi,

For the Data Driven Subscription in SSRS we are using the following stored procedure

In Step 3 - Create a data-driven subscription



create procedure spRSGetReportSettings
(
    @ReportID as integer
) as
begin
    set nocount on

    declare @t as table(y int not null primary key)
    declare
        @cols as nvarchar(max),
        @y as int,
        @sql as nvarchar(max)

    set @cols = stuff(
        (select N',' + quotename(y) as [text()]
         from (select ParameterName as y from ReportSettings where reportid = 1) as Y
         order by y
         for xml path('')), 1, 1, N'');

    set @sql = N'select * from
        (select reportid, parametername, parametervalue from ReportSettings where reportid = ' + cast(@ReportID as varchar(5)) + ') as D
        pivot(min(parametervalue) for parametername in(' + @cols + N')) as p'

    exec sp_executesql @sql
end



Basically the idea is to maintain a single report parameter setting table for multiple reports.

Structure of the table is as given below

ReportID, ParameterName, ParameterValue.

Using Pivot we can generate the ParameterName/ParameterValue combinations for each report. This stored procedure works fine in the query editor (Management Studio).

But in SSRS it is not giving any results.

In Step 4 - Create a data-driven subscription,

under "Get the value from the database", the drop-down is not showing any database columns.

Please help.

Kumar

View 3 Replies View Related

SQL Server 2008 :: Review Data Type Mapping (Import And Export Wizard)

Jun 5, 2015

Not seeing the Review Data Type Mapping Screen in SQL Server Import and Export Wizard?

Is there only a certain version where that screen shows up?

I am trying to import data from an MS Access application to SQL Server and all of the connections are good, but some of the data isn't, and if I let it migrate using this tool it crashes on the bad data and no data migrates at all. The Review Data Type Mapping screen would allow me to bypass the records in error and load the rest; however, I can't do that if I cannot see the screen.

View 9 Replies View Related

Analysis :: Sales And Mapping Data - Apply Join To Get Result Into SSRS Report

May 28, 2015

I have sales data in an SSAS cube and mapping data in an RDBMS table. I want to apply a join to get the result into an SSRS report.

Here we should get yesterday's data from the time dimension of the cube.

Time is in [Time].[FiscalYearHierarchy].[Fiscal Day].&[2015-05-28T00:00:00] format.

View 4 Replies View Related

Mapping Surrogate Keys Of Level 2 Dimensions To Fact Table In SSIS Data Flow

Aug 16, 2007



Hi,
I use lookups to map the surrogate keys of level 1 dimensions to my fact tables in SSIS.
But how do I handle a level 2 dimension with a ValidFrom and a ValidUntil date field?
I do not use an IsCurrent column, because that could be a problem with late-arriving facts.


- In DTS I used an SQL statement like this:

update SA
SET SA.DimProdRef = Dim.RecordID
FROM SAWarenEingang SA, DimProd Dim
where SA.ProduktNumber = Dim.ProduktNumber
and SA.ArtikelkontoBewegungsdatum between Dim.ValidFrom and Dim.ValidUntil


Now in SSIS I want to handle the whole thing in the data flow without using a staging table:
- Using Lookups: I would have to pass the date column of each fact row into the lookup. That does not work.
- Using Execute SQL in the data flow: this would be very slow, because the statement would be executed for every row in the data flow.
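One hedged approach (a sketch, using the column names from the post): set the Lookup to partial cache or no cache and override the SQL on the Advanced tab with a parameterized query, mapping the fact's product number and its movement date to the two parameters:

select Dim.RecordID, Dim.ProduktNumber, Dim.ValidFrom, Dim.ValidUntil
from DimProd Dim
where Dim.ProduktNumber = ?
  and ? between Dim.ValidFrom and Dim.ValidUntil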


Any ideas?


Best regards,
Stefoon

View 10 Replies View Related

Data Conversion From String To Decimal When Saving Data To SQL Server 2005 Using An ADO Recordset

Feb 12, 2008

Hello,

I am wondering what conversion rules apply when a string which contains a number is saved to SQL Server 2005 into a column of type decimal.

This is the code I'm using (C++):

CString cValue = "0.75";
_variant_t vtFieldValue;
vtFieldValue = _variant_t(cValue);
pRecordSet->Fields->Item["MyColumn"]->Value = vtFieldValue;

"pRecordSet" is an ADO recordset. The database column "MyColumn" is of type "decimal(19,10)".

The most important question for me is, if the regional settings of the database server or the regional settings of the client PC are considered during the conversion from the string to the decimal value. For example in standard French regional settings the "." would not be recognized as decimal separator.

I am also wondering if the language of the database instance, in which this data is saved, is considered during this conversion or any other settings of this database instance.

So my general question is: Does anybody know exactly what rules apply during the above mentioned conversion?

Thank you for your help.

Regards,
Volker

View 2 Replies View Related

SSIS - Data Conversion Failed - The Value Could Not Be Converted Because Of A Potential Loss Of Data.

Aug 3, 2006

Hello

 

I have an odd problem that is driving me nutz. I have a very simple SSIS package that imports a 5 column flat file into a SQL Server 2005 table.

When I created this package with the wizard, it executed perfectly fine and processed all rows into the destination table.

But when I hit F5 to execute it manually, it fails before inserting a single row.

The error it generates is (Spalte 5 is a datetime in the format DD.MM.YYYY):

Error: 0xC02020A1 at Datenflusstask, Source - Daten_NC_1_txt [1]: Data conversion failed. The data conversion for column "Spalte 5" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".

Error: 0xC0209029 at Datenflusstask, Source - Daten_NC_1_txt [1]: The "output column "Spalte 5" (25)" failed because error code 0xC0209084 occurred, and the error row disposition on "output column "Spalte 5" (25)" specifies failure on error. An error occurred on the specified object of the specified component.

Error: 0xC0202092 at Datenflusstask, Source - Daten_NC_1_txt [1]: An error occurred while processing file "C:WorkDaten_NC_1.txt" on data row 177.

 

 Edit: Modified the Title so it properly reflects the Problem & the Solution

View 3 Replies View Related

Data Conversion Failed Due To Potential Loss Of Data

Aug 29, 2007



Hi,

I am getting this error when my ssis package is running

Data Conversion failed due to Potential Loss of data

The input column is in string format and the output is a SQL Server bigint.

The error occurs when there is an empty string in the input. What should I do to overcome this?

It is an ID field. Should I convert it to bigint or leave it as a char datatype? Is that a good solution, or is there a way to overcome this?
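One hedged option (a sketch; the column name is a placeholder) is a Derived Column expression that turns the empty string into a NULL before the cast, so the conversion to DT_I8 never sees an empty value:

TRIM(id_col) == "" ? NULL(DT_I8) : (DT_I8)id_col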

View 4 Replies View Related

IsNumeric Does Not Work On Data From Data Conversion Task

Jan 3, 2008

Hi,

I have another issue. I have an excel file that I pipe through a "data conversion" task. I have set all the column data types to strings, because there's no way to know beforehand if a particular column will be number or text because the file is very non-standard (it looks more like a formatted report).

After the data conversion, I send all the rows to a script task. In the script task, I do a check on the numeric fields.

for example:

If Not IsNumeric(Row.Price) Then
    Row.Price_IsNull = True
End If


However, this check fails each and every time, even if the field contains a number! I don't have this problem when using flat file sources.

So, none of my numeric fields are getting loaded to my ole db destination.

Help, is there a way around this? Or am I forced to just skip this number check altogether? I'd prefer not to.

Thanks

View 10 Replies View Related

How To Let Result Data Place At Temptable Temperary For Other Mapping? Should I Create Temptable Firstly?

Dec 11, 2007

Hi!
I have several problems with my coding; please give me some advice.


1. How do I place the result data in a temp table temporarily for other mapping? Should I create the temp table first?

2. When I get the duplicated record result, I need to map it back to the main tables (tblROrder & tblSOrder) to find out the record reference no. Please advise how to get the reference no. into the outcome.

Many thanks,
New Learner


***Coding as follows

SELECT code, SMSNo, holderNo, count(*) From tblROrder

WHERE Day = @Day

GROUP BY code, SMSNo, SholderNo

HAVING COUNT(*) > 1<=====result will be found, but since i didn't get the RefNo at the listing, please advise how to let the outcome with RefNo

INTO #tmpOne (code, SMSNo, holderNo) <====error found




SELECT code, SMSNo, holderNo, GrossAmt, count(*) From tblSOrder

WHERE Day = @prmDay between D1 AND D14

GROUP BY code, SMSNo, holderNo, GrossAmt

HAVING COUNT(*) > 1<=====result will be found, but since i didn't get the RefNo at the listing, please advise how to let the outcome with RefNo

INTO #tmpTwo (code, SMSNo, holderNo) <====error found
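A hedged sketch of one way to restructure the first query (the same idea applies to the second; RefNo is the assumed name of the reference-number column): SELECT ... INTO creates the temp table, so it goes before FROM and takes no column list, and the reference numbers can then be picked up by joining the temp table back to the main table.

SELECT code, SMSNo, holderNo
INTO #tmpOne
FROM tblROrder
WHERE [Day] = @Day
GROUP BY code, SMSNo, holderNo
HAVING COUNT(*) > 1

SELECT r.RefNo, t.code, t.SMSNo, t.holderNo
FROM #tmpOne t
INNER JOIN tblROrder r
        ON r.code = t.code
       AND r.SMSNo = t.SMSNo
       AND r.holderNo = t.holderNo
WHERE r.[Day] = @Day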



View 4 Replies View Related

System.Data.SqlClient.SqlException: The Conversion Of A Char Data Type To A Datetime Data Type Resulted In An Out-of-range Datetime Value.

Dec 14, 2005

After testing out the application I wrote on the local PC, I deployed it to the web server to test it out. I get this error.

System.Data.SqlClient.SqlException: The conversion of a char data type to a
datetime data type resulted in an out-of-range datetime value.

Notes: all pages that have this error have either a repeater or a datagrid which loads data during page load.

At first I thought the problem was with the date, but then I can see
that some other pages that have a datagrid (with a date field) work
just fine.

Has anyone had this problem before? Hopefully you guys can help.
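A hedged guess worth checking (the server's language setting, rather than the data, may be the culprit): if a char value is concatenated into a SQL statement, the login's language/DATEFORMAT setting on that server decides how day and month are read, so an unambiguous literal avoids the out-of-range error. A sketch:

-- 'yyyymmdd' is read the same way regardless of language / dateformat settings
SELECT CONVERT(datetime, '20051214')

-- or make the expected day/month order explicit with a style code (103 = dd/mm/yyyy)
SELECT CONVERT(datetime, '14/12/2005', 103)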

Thanks,

View 4 Replies View Related

Data Conversion

Dec 6, 2000

I have a column (of type varbinary) which has a datetime stored as binary.
How do I convert this binary value back to datetime?

When I do:
convert(datetime, column_name), I get an error message "Syntax error converting datetime from binary/varbinary string".

When I do:
convert(datetime, convert(binary, column_name)), I get all the dates as 1900-01-01 00:00:00.000.

thanx.

View 1 Replies View Related

Data Conversion

Jun 15, 2000

We are currently trying to extract data from a SQL Server (10 tables) to insert into another data source (Notes). Would anyone out there have any tips on the best way to go about this?

View 1 Replies View Related






