SSIS Import Of CSV Doesn't Recognize NULL Values For VARCHAR Fields
Sep 4, 2007
Hi, I'm new to SSIS and trying to import some CSV files (comma delimited) into SQL Server. A NULL value mapped to a CHAR column is correctly recognized as NULL in SQL Server, but a NULL value mapped to a VARCHAR column is not recognized correctly and I get the value 'NULL' in SQL Server (including the single quotes).
Sample:
The CSV file contains columns A and B. Both A and B contain the text NULL.
Column A is mapped to a CHAR field, and column B is mapped to a VARCHAR field in SQL Server.
After the import, SQL Server has the following values: column A contains a real NULL, while column B contains 'NULL' as text.
Did anyone else have this problem?
Thanks so much for any help.
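A common workaround (not part of the original post) is to convert the literal string NULL back into a real NULL after the load, for example with an update on the destination table (the table name below is just a placeholder):

UPDATE dbo.MyImportTable
SET B = NULL
WHERE B = 'NULL'

The same conversion could also be done in the data flow with a Derived Column before the row reaches the destination.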
View 4 Replies
May 1, 2001
Hi,
I'm using SQL SERVER 7.0.
I'm driving myself crazy on this one. I have 2 tables that look like this:
T1                                T2
C1       C2     C3               C1       C2     C3
JOE      1      OTTAWA           JOE      1      TORONTO
MARC     1      OTTAWA           MARC     4      OTTAWA
GAVIN    NULL   HALIFAX          GAVIN    3      HALIFAX
DARRIN   NULL   HALIFAX          DARRIN   3      HALIFAX
DENISE   NULL   PITTSBURGH       DENISE   NULL   PITTSBURGH
LOUISE   NULL   RUSSELL          LOUISE   2      RUSSELL
ANDREA   3      STITTSVILLE      ANDREA   NULL   STITTSVILLE
MARIO    66     PITTSBURGH       GEORGE   6      KINCARDINE
                                 LARRY    6      KINCARDINE
                                 MARIO    66     PITTSBURGH
What I need to do is get all of the records from T2 that don't match EXACTLY to a record in T1. So I figured a LEFT OUTER JOIN should work:
SELECT T2.*
FROM T2 LEFT OUTER JOIN T1
    ON  T2.C1 = T1.C1
    AND T2.C2 = T1.C2
    AND T2.C3 = T1.C3
WHERE T1.C1 IS NULL
But this statement returns the DENISE record when I do this (which has an EXACT match).
So, my thoughts took me to the NULL values in T1.C2 and T2.C2 for this record and I thought that, perhaps, the NULL values aren't being recognized as being equal (as they are UNKNOWN).
So I started digging around and found SET ANSI_NULLS OFF. I tried it but with no luck. Can you offer any insight on this one? What can I do to have NULL values recognized as being equal?
This is the result set that I would like to have returned in this example:
JOE      1      TORONTO
MARC     4      OTTAWA
GAVIN    3      HALIFAX
DARRIN   3      HALIFAX
LOUISE   2      RUSSELL
ANDREA   NULL   STITTSVILLE
GEORGE   6      KINCARDINE
LARRY    6      KINCARDINE
I've included a script to build and populate the tables below.
Any help on this will be greatly appreciated.
Thanks in advance,
Darrin
------------------------------------------------
IF EXISTS(
    SELECT *
    FROM SYSOBJECTS
    WHERE NAME = 'T1'
)
DROP TABLE T1
GO
IF EXISTS(
    SELECT *
    FROM SYSOBJECTS
    WHERE NAME = 'T2'
)
DROP TABLE T2
GO
CREATE TABLE [dbo].[T1] (
[C1] [varchar] (50) NULL ,
[C2] [varchar] (50) NULL ,
[C3] [varchar] (50) NULL
) ON [PRIMARY]
GO
CREATE TABLE [dbo].[T2] (
[C1] [varchar] (50) NULL ,
[C2] [varchar] (50) NULL ,
[C3] [varchar] (50) NULL
) ON [PRIMARY]
GO
INSERT T1 VALUES('JOE', '1', 'OTTAWA')
INSERT T1 VALUES('MARC', '1', 'OTTAWA')
INSERT T1 VALUES('GAVIN', NULL, 'HALIFAX')
INSERT T1 VALUES('DARRIN', NULL, 'HALIFAX')
INSERT T1 VALUES('DENISE', NULL, 'PITTSBURGH')
INSERT T1 VALUES('LOUISE', NULL, 'RUSSELL')
INSERT T1 VALUES('ANDREA', '3', 'STITTSVILLE')
INSERT T1 VALUES('MARIO', '66', 'PITTSBURGH')
GO
INSERT T2 VALUES('JOE', '1', 'TORONTO')
INSERT T2 VALUES('MARC', '4', 'OTTAWA')
INSERT T2 VALUES('GAVIN', '3', 'HALIFAX')
INSERT T2 VALUES('DARRIN', '3', 'HALIFAX')
INSERT T2 VALUES('DENISE', NULL, 'PITTSBURGH')
INSERT T2 VALUES('LOUISE', '2', 'RUSSELL')
INSERT T2 VALUES('ANDREA', NULL, 'STITTSVILLE')
INSERT T2 VALUES('GEORGE', NULL, 'KINCARDINE')
INSERT T2 VALUES('LARRY', NULL, 'KINCARDINE')
INSERT T2 VALUES('MARIO', '66', 'PITTSBURGH')
GO
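As a sketch of one possible fix (not part of the original post), the join condition can be made NULL-safe so that a NULL in C2 on both sides counts as a match; the same pattern applies to C1 and C3 if they can also be NULL:

SELECT T2.*
FROM T2 LEFT OUTER JOIN T1
    ON  T2.C1 = T1.C1
    AND (T2.C2 = T1.C2 OR (T2.C2 IS NULL AND T1.C2 IS NULL))
    AND T2.C3 = T1.C3
WHERE T1.C1 IS NULL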
View 3 Replies
View Related
Sep 6, 2007
I have an Excel file (.xls) that I need to import into a SQL Server database. I am trying to use the Import wizard. However, when I select the file and click Next I get the warning "External table is not in the expected format. (Microsoft JET Database Engine)". I opened the .xls file using a text editor. It looks like an HTML document.
<html xmlns:o="urn:schemas-microsoft-com:office:office"
xmlns:x="urn:schemas-microsoft-com:office:excel"
xmlns="http://www.w3.org/TR/REC-html40">
<head>.................
I also used OPENROWSET to see if it works.
SELECT * FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
'Excel 12.0 xml;HDR=Yes;Database=C:\myexcelfile.xls', 'SELECT * FROM [Table1$]');
SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
'Excel 8.0;Database=C:\data\myexcelfile.xls', 'SELECT * FROM [Table1$]')
I get the following errors.
OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "(null)" returned message "External table is not in the expected format.".
Msg 7303, Level 16, State 1, Line 1
Cannot initialize the data source object of OLE DB provider "Microsoft.ACE.OLEDB.12.0" for linked server "(null)".
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7303, Level 16, State 1, Line 1
Cannot initialize the data source object of OLE DB provider "Microsoft.Jet.OLEDB.4.0" for linked server "(null)".
When I open the file and save it to overwrite the existing .xls file it works fine.
Is there a way to import such files using SSIS without having to open and save it?
Thanks in advance!
Tkaram
View 25 Replies
View Related
Nov 16, 2007
I have a stored procedure that accepts a parameter of type varchar(32). This parameter gets tested against NULL as part of an IF statement. The behavior I'm seeing is that this test never evaluates to TRUE, even in cases where NULL has been passed in as the value of the parameter.
CREATE PROCEDURE uspNewVital
@Description varchar(32),
@Type tinyint,
@DeviceType varchar(32),
@Dimension varchar(32),
@Unit varchar(32)
AS
IF (@Dimension != NULL AND @Unit != NULL)
BEGIN
RAISERROR ('You must specify either a dimension or a unit.', 15, 1) ;
RETURN 0 ;
END
IF NULL = @Dimension
BEGIN
DECLARE @UnitID AS int ;
SELECT @UnitID = ID FROM tblUnit WHERE Description=@Unit ;
IF @@ROWCOUNT = 0
BEGIN
RAISERROR ('Unit not found.', 15, 1) ;
RETURN 0 ;
END
INSERT INTO tblVitalType VALUES (@DeviceTypeID, @DatumID, NULL, @UnitID) ;
END
ELSE
BEGIN
DECLARE @DimensionID AS int ;
SELECT @DimensionID = ID FROM tblDimension WHERE Description=@Dimension ;
IF @@ROWCOUNT = 0
BEGIN
RAISERROR ('Dimension not found.', 15, 1) ;
RETURN 0 ;
END
INSERT INTO tblVitalType VALUES (@DeviceTypeID, @DatumID, @DimensionID, NULL) ;
END
GO
When I pass in a NULL for @Dimension, code execution still goes to the ELSE block.
At first I thought this might have something to do with the ANSI_NULLS option, but the database has this defaulting to OFF.
Any insights would be greatly appreciated. Thank you.
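For reference (not part of the original post): under ANSI_NULLS ON, any comparison with NULL using = or != evaluates to UNKNOWN, so these tests are normally written with IS NULL / IS NOT NULL. A minimal sketch:

DECLARE @Dimension varchar(32) ;
SET @Dimension = NULL ;
IF @Dimension IS NULL            -- evaluates to TRUE
    PRINT 'Dimension is NULL' ;
IF NULL = @Dimension             -- UNKNOWN under ANSI_NULLS ON, so this never prints
    PRINT 'This branch is never reached' ;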
View 4 Replies
View Related
May 22, 2008
Hi all
I have a calculated field in Report Designer such as a + b + c + d. In SQL Server, if I run this query
select a + b + c + d from table1
and any of a,b,c or d is null, the result is null.
whereas in calculated fields, it does not return null but in fact ignores the null value and treats it as zero.
I want my calculated field to be null if any of the values are null.
Is it possible? I cannot use the IsNothing function because I have too many fields and it would be quite cumbersome.
Thanks
View 4 Replies
View Related
Jul 11, 2006
Hey everyone,
This is probably a very simple question, but I am just stumped. I am storing different name parts in different fields, but I need to create a view that will pull all of those fields together for reports, dropdowns, etc.
Here is my current SELECT statement:
SELECT m.FName + SPACE(1) + m.MName + SPACE(1) + m.LName + ', ' + m.Credentials AS Name,
m.JobTitle,
m.Company,
m.Department,
m.Address,
m.City + ', ' + m.State + ' ' + m.Zipcode AS CSZ,
m.WorkPhone,
m.FAX,
m.Email,
c.Chapter,
m.Active,
s.Sector,
i.Industry
FROM tblMembers m
LEFT OUTER JOIN tblChapters c
ON m.ChapterID = c.ChapterID
LEFT OUTER JOIN tblSectors s
ON m.SectorID = s.SectorID
LEFT OUTER JOIN tblIndustries i
ON m.IndustryID = i.IndustryID
WHERE m.DRGInclude = 1
My problem is that I don't know how to test for NULL values in a field. When you concatenate fields that contain NULL values, the entire concatenated field returns NULL. I am not aware of an IF statement that is available within the SELECT statement.
The first thing I would like to accomplish is to test to see if MName contains NULL. If it does I do not want to include + SPACE(1) + m.MName in the clause. Then, if Credentials contains NULL I do not want to include + ', ' + m.Credentials in the clause.
Can someone tell me what I am missing? Is there a function that I can use for this?
Thanks,
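One common approach (a sketch, not suggested in the original post) is to wrap the optional parts in ISNULL, relying on the fact that concatenating a NULL yields NULL, so the whole optional piece collapses to an empty string:

SELECT m.FName
       + ISNULL(SPACE(1) + m.MName, '')
       + SPACE(1) + m.LName
       + ISNULL(', ' + m.Credentials, '') AS Name
FROM tblMembers m
WHERE m.DRGInclude = 1

When m.MName is NULL, SPACE(1) + m.MName is NULL and ISNULL substitutes ''; the same happens for the Credentials part.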
View 8 Replies
View Related
Jul 12, 1999
Hi,
When I try to insert a new record into a table that has a datetime field that allows nulls, a default 01/01/1900 date is inserted instead of null. I recreated the table and set the datatype to smalldatetime and I still get the error. What have I missed?
View 1 Replies
View Related
Jan 15, 2008
Hi all,
I have a projects table with 2 foreign key fields that both link to the same employees table because a project has a Package Engineer (PkgEngineerID) and a Contract Administrator (PkgContrAdmin). When I try to insert a record with null values for either or both of these foreign keys I get an error:
The data in row xxx was not commited. The record can't be added or changed. Referential integrity rules require a related record in table 'tblEmployees'. The transaction ended in the trigger. The batch has been aborted.
An insert statement for the above would look something like the following:
INSERT INTO tblPackages (PkgNo, PkgName, PkgEngineerID, PkgContrAdmin, PkgRemark)
VALUES (1234, 'My Package', NULL, NULL, 'My Package remark')
And the create table statements are:
USE [PASSQL]
GO
/****** Object: Table [dbo].[tblPackages] Script Date: 01/15/2008 23:25:26 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[tblPackages](
    [ID] [int] IDENTITY(1,1) NOT NULL,
    [PkgNo] [nvarchar](20) NULL,
    [PkgName] [nvarchar](255) NULL,
    [PkgEngineerID] [int] NULL,
    [PkgContrAdmin] [int] NULL,
    [PkgRemark] [nvarchar](255) NULL,
    [upsize_ts] [timestamp] NULL,
    CONSTRAINT [aaaaatblPackages_PK] PRIMARY KEY NONCLUSTERED ( [ID] ASC )
    WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
ALTER TABLE [dbo].[tblPackages] WITH NOCHECK ADD CONSTRAINT [FK_tblPackages_tblEmployees] FOREIGN KEY([PkgEngineerID])
REFERENCES [dbo].[tblEmployees] ([ID])
GO
ALTER TABLE [dbo].[tblPackages] CHECK CONSTRAINT [FK_tblPackages_tblEmployees]
GO
ALTER TABLE [dbo].[tblPackages] WITH NOCHECK ADD CONSTRAINT [FK_tblPackages_tblEmployees1] FOREIGN KEY([PkgContrAdmin])
REFERENCES [dbo].[tblEmployees] ([ID])
GO
ALTER TABLE [dbo].[tblPackages] CHECK CONSTRAINT [FK_tblPackages_tblEmployees1]
And:
USE [PASSQL]
GO
/****** Object: Table [dbo].[tblEmployees] Script Date: 01/15/2008 23:28:01 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[tblEmployees](
    [ID] [int] IDENTITY(1,1) NOT NULL,
    [EmpName] [nvarchar](255) NULL,
    [EmpShort] [nvarchar](255) NULL,
    [upsize_ts] [timestamp] NULL,
    CONSTRAINT [aaaaatblEmployees_PK] PRIMARY KEY NONCLUSTERED ( [ID] ASC )
    WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
Any ideas on how to accomplish this would be great!
Thanks in advance!
View 4 Replies
View Related
Jul 25, 2006
I am using a DAL and I want to insert a new row where one of the columns is a DATE and it can be NULL.
I am assigning SqlTypes.SqlDateTime.Null.
But when the date is saved in the database, I get the min value (1/01/1900). Is there a way to put a NULL value in the database using the DAL? How can I put an empty date in the database?
THANK YOU!!!
View 3 Replies
View Related
Aug 9, 2004
Hi,
I have fields in my table which allow nulls. Is it more efficient to not insert anything (the field automatically shows up as null in this case) and leave it that way, or to store some value in it? The field is a smallint field.
Thanks
View 2 Replies
View Related
Jul 20, 2005
Question: Why would I not be able to import an Access 97 table in which some records have null values in fields that allow null values? Wouldn't the table's design be imported first, bringing the columns' "allow nulls" attribute with it? I'm dealing with both text and numeric columns. Not all columns containing nulls cause an error.
Thanks,
Bob C.
View 2 Replies
View Related
Nov 16, 2005
Hello, I am trying to import a CSV file into my SQL Server database. This file was originally generated from another database table (on another server) with the same structure. The table contains two columns of real datatype with Allow Null Value set to true for those columns, and the CSV file contains the value NULL for these columns. I am facing a problem when importing this file. This may be because DTS tries to represent the values as strings and then convert them to the real datatype, which results in trying to transform the value "NULL" to real. I receive an error message saying:
Error during Transformation 'DirectCopyXform' for Row number 1. Errors encountered so far in this task: 1.
TransformCopy 'DirectCopyXform' conversion error: Conversion invalid for datatypes on column pair 8 (source column 'Col008' (DBTYPE_STR), destination column 'zip_longitude' (DBTYPE_R4)).
TransformCopy 'DirectCopyXform' conversion error: Conversion invalid for datatypes on column pair 7 (source column 'Col007' (DBTYPE_STR), destination column 'zip_latitude' (DBTYPE_R4)).
How can I work around this problem? Any help would be appreciated.
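One workaround (not part of the original post; the staging and destination table names are made up) is to import the file into varchar staging columns first and then convert, using NULLIF to turn the literal string 'NULL' into a real NULL before the cast to real:

INSERT INTO dbo.ZipCodes (zip_latitude, zip_longitude)
SELECT CONVERT(real, NULLIF(Col007, 'NULL')),
       CONVERT(real, NULLIF(Col008, 'NULL'))
FROM dbo.ZipStaging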
View 2 Replies
View Related
Dec 9, 2013
I have SQL Server 2012 SSIS with an Excel source and an OLE DB destination. I have a problem importing the CustomerSales column. CustomerSales has values like 1000.00, 2000.10, 3000.30, NotAvailable. So I have decimal and nvarchar values mixed in one Excel column; this is a requirement for the solution. However, SSIS reads only the numeric values correctly and the nvarchar values are set to NULL. Why?
CREATE TABLE [dbo].[Import_CustomerSales](
[CustomerId] [nvarchar](50) NULL,
[CustomeName] [nvarchar](50) NULL,
[CustomerSales] [nvarchar](50) NULL
) ON [PRIMARY]
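A workaround that is often suggested for mixed-type Excel columns (not confirmed in the original post, and the file path below is just an example) is to force the driver to read the column as text via IMEX=1 in the Excel connection string, usually combined with setting TypeGuessRows to 0 in the registry:

Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Data\CustomerSales.xlsx;Extended Properties="Excel 12.0 Xml;HDR=YES;IMEX=1"

The values then arrive as text and can be converted (or left as nvarchar, as in the table above) inside the data flow.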
View 5 Replies
View Related
Jul 21, 2007
I am copying a simple table from a Sql Server 2005 database to an *.sdf mobile database.
I am brand new to SSIS and I am probably doing something wrong. But after executing the SSIS package all the rows and all the fields are NULL in the destination database. I put a datagrid viewer between the OLE DB Source and the Sql Server compact edition destination and I can see the real data which is obviously not ALL NULL.
Does anyone have a clue as to why it would be doing this?
Any help would be much appreciated.
Thanks...
View 1 Replies
View Related
Sep 3, 2015
I have an SSIS package that imports data from an Excel file, replaces any value in Excel that reads "NULL" to "", then writes the data to a couple of databases.
What I have discovered today, is I have two columns of dates, an admit date and discharge date column, and what I need to do is anywhere I have a null value in the discharge date column, I have to replace it with the value in the admit date column.
I have searched around online and tried a few things using the Replace function in Derived Columns, but no dice so far.
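A Derived Column expression along these lines is one way to do it (a sketch; the column names DischargeDate and AdmitDate are assumed, and the derived column would be configured to replace DischargeDate):

ISNULL(DischargeDate) ? AdmitDate : DischargeDate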
View 3 Replies
View Related
Jan 8, 2008
Hi, how do I convert dates with SSIS? When I receive dates I have no problem; with empty fields I have to go through a transformation such as:
trim(field_name) == "" ? "999999" : substring....... (derive transform)
After this I have to update each field holding '999999' to NULL (using a SQL task).
Is there an easier way of doing this?
Thanks
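One possible shortcut (a sketch, not from the original post) is to have the Derived Column write a real NULL of the target type instead of the '999999' placeholder, which removes the need for the follow-up SQL task:

trim(field_name) == "" ? NULL(DT_DBTIMESTAMP) : (DT_DBTIMESTAMP)field_name

Here NULL(DT_DBTIMESTAMP) produces a typed NULL; the cast on the right-hand side stands in for whatever substring/conversion the original expression already performs.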
View 1 Replies
View Related
Jan 25, 2008
I have a pivot transform that pivots a batch type. After the pivot, each batch type has its own row with null values for the other batch types that were pivoted. I want to group on two fields and take the max() of the remaining batch types so that the multiple rows are collapsed onto one row. I tried using the aggregate transform, but since the batch type field is a string, the max() function fails in the package. Is there another transform, or can I use the aggregate transform another way, so that the max() will work on a string?
-- Ryan
View 7 Replies
View Related
Jul 14, 2006
Hi all,
I'm seeing some strange behavior here, or maybe I'm doing something wrong; I'm not sure.
Anyway, I have a csv file, the Flat File Connection Manager is configured like this:
Row delimiter: {CR}{LF}
Column delimiter: {;}
For some rows in my file the last two columns are empty and there is no semicolon for these empty columns, but these rows still end with a CRLF. However, SSIS does not treat the CRLF as the end of the row; it takes the first two columns of the next row as the last two columns of the current row.
Sample:
CSV file:
Col 1;Col 2;Col 3;Col 4;Col 5
AAAA;BBB;CCC;;
AAA1;BBB1;CCC1;;
AAA2;BBB2;CCC2
AAA3;BBB3;CCC3
Imported rows in SSIS:
Col 1 Col 2 Col 3 Col 4 Col 5
AAAA BBB CCC
AAA1 BBB1 CCC1
AAA2 BBB2 CCC2 AAA3 BBB3;CCC3
Any idea ?
Sébastien
View 8 Replies
View Related
Apr 20, 2010
I'm trying to import some XLS files that I receive from some suppliers. The problem is that every time, they send some columns with text values but formatted as numbers. When I read those columns with the SSIS Excel Source, they all come in with null values.
I don't want to change the columns data types every time, so I would like to know if there's a way to bypass the column types that are already there.
I tried to use both the Jet driver and the Office 12 driver. I've already used the IMEX=1 on ExtendedProperties too with no success. Is there a way to force reading the columns as text, even if they have data types assigned to them?
View 15 Replies
View Related
May 8, 2009
I'm using SSIS 2005 Enterprise edition, I'm creating a package that reads an excel (xls) file using the "excel source" component, and it dumps the data into an OLEDB destination (a sql server). When I drag the excel source component and create the excel connection to my file the component automatically reads the columns and their datatypes.
The problem is that I have a column which has numeric data, and the package uploads as NULL every number that starts with a zero. (Note: in Excel this column is formatted as "text", even though it has only numbers, because it's the only way Excel keeps the leading zeros.)
So I checked the data types by right-clicking the Excel source component -> Show Advanced Editor, and to my surprise this column's data type is detected as double-precision float, and it doesn't let me change it. URL... but it only works when the first row of data has a number beginning with zero in this column. How do I get the data imported correctly?
View 15 Replies
View Related
Jul 6, 2015
While importing data from an Excel source, some columns are getting null values even though the Excel column has values. To resolve the issue we tried the following registry changes:
For .xlsx (ACE provider): HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Office\14.0\Access Connectivity Engine\Engines\Excel
For .xls (Jet provider): HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel
1. Change the value of TypeGuessRows from 8 (the default) to 0 and set ImportMixedTypes = Text.
The connection string of the Excel source:
UPPER(REVERSE(SUBSTRING(REVERSE(@[User::VarInputExcelFile]), 1, 5))) == ".XLSX" ? "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + @[User::VarInputExcelFile] + ";Extended Properties=\"Excel 12.0;HDR=Yes;IMEX=1\";" : "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + @[User::VarInputExcelFile] + ";Extended Properties=\"EXCEL 8.0;HDR=Yes;IMEX=1\";"
Even with the above settings, the column is coming through as null from the Excel source even though there is data in Excel.
View 2 Replies
View Related
Nov 20, 2007
I have looked far and wide and have not found anything that works to allow me to resolve this issue.
I am moving data from DB2 using the MS OLEDB Provider for DB2. The OLEDB source sees the column of data as DT_TEXT. I setup a destination to SQL Server 2005 and everything looks good until I try and run the package.
I get the error:
[OLE DB Source [277]] Error: An OLE DB error has occurred. Error code: 0x80040E21. An OLE DB record is available. Source: "Microsoft DB2 OLE DB Provider" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
[OLE DB Source [277]] Error: Failed to retrieve long data for column "LIST_DATA_RCVD".
[OLE DB Source [277]] Error: There was an error with output column "LIST_DATA_RCVD" (324) on output "OLE DB Source Output" (287). The column status returned was: "DBSTATUS_UNAVAILABLE".
[OLE DB Source [277]] Error: The "output column "LIST_DATA_RCVD" (324)" failed because error code 0xC0209071 occurred, and the error row disposition on "output column "LIST_DATA_RCVD" (324)" specifies failure on error. An error occurred on the specified object of the specified component.
[DTS.Pipeline] Error: The PrimeOutput method on component "OLE DB Source" (277) returned error code 0xC0209029. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Any suggestions on how I can get the large string data in the varchar column in DB2 into the varchar(max) column in SQL Server 2005?
View 10 Replies
View Related
Nov 20, 2006
For the life of me I cannot figure out why SSIS will not convert varchar data. Instead of using the table-to-table method, I wrote a SQL query so that I could transform the ntext datatype to varchar(512), understanding that natively MS is going towards all-Unicode applications.
The source fields from Access are int, int, int and varchar(512). The same is true of the destination within SQL Server 2005. The field 'Answer' is the varchar field in question....
I get the following error
Validating (Error)
Messages
Error 0xc02020f6: Data Flow Task: Column "Answer" cannot convert between unicode and non-unicode string data types.
(SQL Server Import and Export Wizard)
Error 0xc004706b: Data Flow Task: "component "Destination - Query" (28)" failed validation and returned validation status "VS_ISBROKEN".
(SQL Server Import and Export Wizard)
Error 0xc004700c: Data Flow Task: One or more component failed validation.
(SQL Server Import and Export Wizard)
Error 0xc0024107: Data Flow Task: There were errors during task validation.
(SQL Server Import and Export Wizard)
DTS used to be a very strong tool, but a simple import such as this is causing me extreme grief and making me wonder if SQL 2005 is ready for prime time. FYI, SP1 is installed. I am running this from a workstation and not on the server, if that makes a difference...
Any help would be appreciated.
View 7 Replies
View Related
Sep 24, 2005
What are some good strategic approaches to using freeform text fields for data that needs to be queried? We have a product whose tables we can't change, and I need to count on a "description" field for storing a value. Two, actually. I'm thinking of adopting this convention:
InvoiceNumber@VendorAcronym
There'd be a lot of vendors.
Additional issue: sometimes these values would be referred to in the description field, and I'd need to distinguish them as referrals rather than as original recorded instances of the values. For that, I imagine either:
InvoiceNumber@@VendorAcronym
or
InvoiceNumber&VendorAcronym
InvoiceNumber//VendorAcronym
etc. -- something like that.
I'm just wondering if there's best practice for doing anything this stupid (hey, I'm stuck with this as our only option just now; hopefully it's only temporary). How to parse out whatever I end up implementing -- well, it needs to be tractable.
Thoughts?
--Scott
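As a sketch of how a value following the InvoiceNumber@VendorAcronym convention could be parsed back apart in T-SQL (the table and column names here are made up, and a referral variant such as @@ would need to be checked for first):

SELECT LEFT(Description, CHARINDEX('@', Description) - 1) AS InvoiceNumber,
       SUBSTRING(Description, CHARINDEX('@', Description) + 1, LEN(Description)) AS VendorAcronym
FROM dbo.SomeTable
WHERE CHARINDEX('@', Description) > 0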
View 21 Replies
View Related
Jul 20, 2005
I am setting up a database that will receive a lot of data from two separate telephone centers; the log table will in a short time have over 1 million lines, and I was wondering if I should use one identifying field or two:
case 1:
[Id] [int] IDENTITY (1, 1) NOT NULL
[ServerId] [int] NOT NULL
case 2:
[Id] [varchar(20)] IDENTITY NOT NULL
Where in case 1 I would just use a combination of Id and ServerId to identify the line, whereas in case 2 I would have the Id field as a varchar that would look something like A-000001, A-000002 for server 1 and B-000001, B-000002 for server 2.
Which solution will be faster when searching for a record once the table has over 1 million lines?
View 3 Replies
View Related
Feb 23, 2007
I have a DTSX package which reads values from a fixed-length text file using a data reader and writes some of the column values from the file to an Oracle table. We have used this DTSX several times without incident but recently the process started inserting NULL values for some of the columns when there was a valid value in the source file. If we extract some of the rows from the source file into a smaller file (i.e 10 rows which incorrectly returned NULLs) and run them through the same package they write the correct values to the table, but running the complete file again results in the NULL values error. As well, if we rerun the same file multiple times the incidence of NULL values varies slightly and does not always seem to impact the same rows. I tried outputting data to a log file to see if I can determine what happens and no error messages are returned but it seems to be the case that the NULL values occur after pulling in the data via a Data Reader. Has anyone seen anything like this before or does anyone have a suggestion on how to try and get some additional debugging information around this error?
View 12 Replies
View Related
Mar 22, 2000
I have a varchar field which holds IDs like (1, 3, 5, 19, 23). When I order it, I get it in ASCII order like (1, 19, 23, 3, 5) rather than (1, 3, 5, 19, 23). Even if I convert it to int, I won't be able to order it.
Is there any way I can order a varchar field numerically?
Angel
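Two sketches that are commonly used for this (not from the original post; MyTable and IDField are placeholders, and both assume every value in the column is numeric):

SELECT IDField FROM MyTable ORDER BY CONVERT(int, IDField)
-- or, keeping the sort on the varchar itself (works for positive integers without leading zeros):
SELECT IDField FROM MyTable ORDER BY LEN(IDField), IDField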
View 1 Replies
View Related
May 10, 2006
Hi, I'm starting a new application in Java using the jTDS JDBC driver (http://jtds.sourceforge.net) and SQL Server 2005 Express. I have to design the database from scratch, and my doubt is whether I should use varchar or nvarchar fields to store string data. Any experience with performance issues using nvarchar instead of varchar (considering that Java internally works in Unicode too)?
Thanks in advance,
Davide.
View 4 Replies
View Related
May 29, 2008
I am trying to add a carriage return to the end of a text field through a script. This is what I'm trying:
UPDATE Table_Name SET Column_TEXT = Column_TEXT) + '
' WHERE Column_TEXT = 'Some text'
I also tried
UPDATE Table_Name SET Column_TEXT = Column_TEXT) + '<cr>' WHERE Column_TEXT = 'Some text'
But I keep getting the error
The data types text and varchar are incompatible in the equal to operator.
Help!!
Thanks in advance
Mangala
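For reference (not part of the original post): a carriage return/line feed can be appended with CHAR(13) + CHAR(10), and a text column has to be cast to varchar before it can be compared or concatenated. A rough sketch, which truncates values longer than 8000 characters:

UPDATE Table_Name
SET Column_TEXT = CAST(Column_TEXT AS varchar(8000)) + CHAR(13) + CHAR(10)
WHERE CAST(Column_TEXT AS varchar(8000)) = 'Some text'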
View 3 Replies
View Related
Mar 29, 2006
Hi
Can somebody explain to me how I can assign a NULL value to a datetime type field in the Script Transformation editor in a data flow task?
In the script hereunder, Row.Datum1_IsNull is true, but still Row.OutputDatum1 will be assigned a value '0001-01-01' which generates an error (not a valid datetime). All alternatives known to me (CDate("") or Convert.ToDateTime("") or Convert.ToDateTime(System.DBNull.Value)) were not successful.
Leaving out the ELSE clause generates the following error: Error: Year, Month, and Day parameters describe an un-representable DateTime.
If Not Row.Datum1_IsNull Then
Row.OutputDatum1 = Row.Datum1
Else
Row.OutputDatum1 = CDate(System.Convert.DBNull)
End If
Any help welcome.
View 1 Replies
View Related
Mar 30, 2004
(I'm using MS SQL 2000)
I've two tables that I've made from some query subsets. Each table has a varchar field with notes/memos and I want to concatenate the fields into one long field.
The problem I'm running into is that when I run the query to check the concatenation, the field is truncated maybe 256 chars in.
I tried converting and casting the field as nvarchar 4000, and I've also done the same for the fields in the two tables, but that doesn't seem to help.
I can query for the fields from each table and none of them are truncated by themselves. It only happens after I concatenate them.
I've created a new table and inserted the results into it, but the field in it is also truncated.
Am I missing something obvious here?
View 2 Replies
View Related
Mar 13, 2015
I'm archiving some data. In a 2 step process.
1. Copy old data from each table in LiveDB to same table in ArchiveDB.
2. Delete the data from each table in LiveDB which is in ArchiveDB
Both DBs SIMPLE recovery mode.
Each table has a clustered PK on a single int value. In both DBs
The tables with varchar(max) columns are taking a very long time to copy over.
Is there anything I can change in the ArchiveDB to make it run faster?
It is the insert that is taking the time. I've tried dropping the clustered PKs in ArchiveDB tables and then rebuilding afterwards, but it has not made any difference. After all, I am adding data to the ArchiveDB in clustered index order, so I wouldn't have expected it to.
I can change the Archive DB but cannot touch the schema/settings of the Live DB.
View 9 Replies
View Related
Nov 21, 2006
I'm using SSIS to import data from a table (SQL) containing varchar fields. The problem is that those varchar fields change over time (sometimes shrinking and sometimes expanding), e.g. from varchar(16) to varchar(20).
When I create my SSIS package, the package seem to store information about the length of each source-field. At runtime, if the field-length is larger then what the package expects an error is thown.
Is there anyway around this problem?
Oh, yeah... My destination fields are a lot wider than the source fields, so the problem is not that the varchar values don't fit in my destination table, but that the package expects the source to be smaller...
Regards Andreas
View 3 Replies
View Related