Hello, I have a column in my table that is an nvarchar. The information that we need to store in this column has exceeded the limit. Can we simply change the datatype to 'text'? Will there be any issues that we might experience? Thank you in advance.
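A minimal sketch of one option, assuming SQL Server 2005 or later, where nvarchar(max) is generally preferred over the deprecated text/ntext types (the table and column names are hypothetical):

-- Widen the column in place rather than switching to text
ALTER TABLE dbo.MyTable ALTER COLUMN MyColumn NVARCHAR(MAX) NULL  -- restate the column's existing NULL/NOT NULL setting here

On SQL Server 2000, ntext is the fallback, with the usual limitations around string functions and regular indexes.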
Hi all, I need to store data into about 104 columns. This is problematic with MSSQL, since it doesn't support rows over 8 KB in total size. Most of the columns are of type NVARCHAR(255), which means we can't have more than 8092/(255*2) = 15 columns of this type. With a row length of more than 8 KB, SQL gives a warning that any rows over that amount will be truncated.

So far I'm seeing two possible solutions to this problem:
1. Split the data into multiple tables with the same ID column across all tables, and then join them in SELECT statements.
2. Use NTEXT instead of NVARCHAR. NTEXT's in-row length is 16 bytes because it contains a pointer to the actual value stored somewhere else. However, NTEXT doesn't support regular indexing, only searching through a Full-Text Index catalog. In this case I'll need to use "WHERE CONTAINS(columnName, 'sometext')" to perform searches, which is bearable.

I'm inclined toward #2. However, I haven't used Full-Text indices before and don't know their limitations. Will I run into problems with NTEXT? Is there a better solution? Thanks. -Oleg.
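For comparison, a minimal sketch of option 1, with all table and column names hypothetical: the wide row is split into two (or more) tables that share the same ID, and SELECTs join them back together.

CREATE TABLE DataPart1 (ID INT NOT NULL PRIMARY KEY, Col01 NVARCHAR(255), Col02 NVARCHAR(255) /* ... up to the row-size limit */)
CREATE TABLE DataPart2 (ID INT NOT NULL PRIMARY KEY REFERENCES DataPart1(ID), Col60 NVARCHAR(255), Col61 NVARCHAR(255) /* ... */)

SELECT p1.ID, p1.Col01, p2.Col60
FROM DataPart1 p1
JOIN DataPart2 p2 ON p2.ID = p1.ID

The join cost is modest when both tables share the same key, and regular indexes keep working, which is what option 2 gives up.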
I have an application with highly compressible strings (gzip encoding usually achieves somewhere between a 20-50x reduction). My 350 MB base database is mostly made up of these slowly changing (or even static) strings. I would like to compress these so that my disk I/O and memory footprint are greatly reduced. Some databases have the ability to provide a compressed table or compressed column, or provide a user-defined function to compress an individual field [a la COMPRESS() and DECOMPRESS()]. I could write a UDF with an extended procedure if I need to, but I'm wondering if there are any other known methods to do this in MS SQL Server 2000 today? -- Frederick Staats, frederick dot w dot staats at intel dot com (I hate junk mail :-)
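For reference, SQL Server 2000 has no built-in column compression, so a UDF backed by an extended procedure is the usual route there; later versions (2016 onward) do ship GZIP-based COMPRESS() and DECOMPRESS() built-ins. A hedged sketch of how that looks on a modern server, with hypothetical names:

-- Store gzip-compressed bytes (COMPRESS/DECOMPRESS are SQL Server 2016+)
CREATE TABLE dbo.Documents (ID INT PRIMARY KEY, BodyCompressed VARBINARY(MAX))
INSERT INTO dbo.Documents (ID, BodyCompressed) VALUES (1, COMPRESS(N'some long, highly repetitive text ...'))
-- Read it back; the CAST must match the original type (nvarchar here)
SELECT CAST(DECOMPRESS(BodyCompressed) AS NVARCHAR(MAX)) AS Body FROM dbo.Documents WHERE ID = 1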
A SQL Server 2005 db has a table with an nvarchar(max) column containing text in paragraph format. When displayed in a Windows Forms textbox, each paragraph ends with a CRLF. When the table is opened in Management Studio, the paragraph text is separated by two boxes at each CRLF.
I would like to insert an additional CRLF (or whatever is required) so that when viewed in a textbox each paragraph has a blank line separating it from the next paragraph. Much easier to read!!!
Can this be done? If so, how? Thanks in advance for any help you can provide.
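One way that should work is to double up the CR/LF pairs with REPLACE; a hedged sketch, assuming a hypothetical table and column and that paragraphs are currently separated by a single CHAR(13)+CHAR(10):

UPDATE dbo.MyTable
SET MyText = REPLACE(MyText, CHAR(13) + CHAR(10), CHAR(13) + CHAR(10) + CHAR(13) + CHAR(10))

Alternatively, leave the stored text alone and apply the same REPLACE only when loading the value into the textbox, so the data keeps its original single-CRLF form.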
Hi, this is probably an easy question for someone, so any help would be appreciated. I have changed the columns in a table that were nvarchar to the same size of type varchar, to halve the space needed for them. I have done this a) because this is never going to be an international application, b) we are running out of space and c) there are 100 million rows. I have done this with the ALTER TABLE statement, which seems to work, but the space used in the database hasn't altered. I'm presuming that, the way the records are structured within the table, there is just now more free space within each page??? Is there a way of re-shrinking just an individual table to free up some of the space in there, or am I missing the point somewhere? Thanks in advance, Ian
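ALTER TABLE ... ALTER COLUMN changes the definition but does not rewrite the existing rows, so the space is not released until the table is physically rebuilt. A hedged sketch (the table name is hypothetical); on SQL Server 2000, rebuilding the table's indexes with DBCC DBREINDEX is the usual way to compact a single table:

-- Rebuild all indexes on just this table, then check the space actually used
DBCC DBREINDEX ('dbo.MyBigTable')
EXEC sp_spaceused 'dbo.MyBigTable'

This works per table, so there is no need to shrink the whole database; expect the rebuild to take a while and log heavily with 100 million rows.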
Hi, I have a table with an nvarchar column. I want to send this column's value as Unicode content to a customer's mailbox, but when I send it, the customer receives a mail full of '?'. How can I accomplish this? Thanks
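One common cause, offered as a guess without seeing the code: the column value gets concatenated with non-Unicode literals (or assigned to a varchar variable) on its way into the mail body, which silently converts it and turns unsupported characters into '?'. Keeping everything nvarchar end to end avoids that; a small sketch with hypothetical names:

DECLARE @body NVARCHAR(4000)
-- Note the N prefix: without it the literal is varchar and the whole expression can degrade to varchar
SELECT @body = N'Dear customer, ' + UnicodeColumn FROM dbo.Customers WHERE CustomerID = 1

The mail component also has to send the body with a Unicode-capable encoding (UTF-8 or UTF-16), otherwise the characters are lost again at that stage.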
Hi, I was wondering if any SQL Server gurus out there could help me...
I have a table I'm trying to apply a full-text catalog to; however, no results are ever returned, because the column being cataloged is a varbinary(max) that is populated from a converted nvarchar(max) value.
To re-create the problem quickly...
If I populate the column via CONVERT(varbinary(max), 'test text') then there is no problem, I get results as expected.
However if I populate the column via CONVERT(varbinary(max), CAST('test text' as nvarchar(max))) no results are ever returned.
Is this a bug with SQL Server 2005 Full Text Indexing? I'm happily creating full text catalogs when an nvarchar is not getting converted into a varbinary.
I'm setting the Document Type column to '.html' (I've tried changing this to '.txt' in case it was a fault with the html ifilter but the problem persists so I believe I can rule this out).
The reason I need to convert an nvarchar to varbinary is that the table holds multi-lingual text, and I'm adding an HTML meta tag <META NAME="MS.LOCALE" CONTENT="ES"> to the beginning so that the full-text indexing word breaker selects the correct language to catalog the text with. The aim is to provide more relevant searches in users' native languages (I've read a few articles that describe this technique, but it's the first time I've tried to apply it).
Any pointers / suggestions would be greatly appreciated. Cheers, Gavin.
Below is a T-SQL script you can run to demonstrate the effect I'm experiencing...
-- Create test database
CREATE DATABASE FullTextTest
GO
USE FullTextTest
GO
-- Create test data table
CREATE TABLE TestTable (
    pk UNIQUEIDENTIFIER NOT NULL CONSTRAINT tablePK PRIMARY KEY,
    varbinarycol VARBINARY(MAX),
    documentExtension VARCHAR(5)
)
GO
-- The below single entry WILL BE FOUND (the text source is being entered directly)
INSERT INTO TestTable (pk, varbinarycol, documentExtension)
VALUES (NEWID(), CONVERT(VARBINARY(MAX), '<META NAME="MS.LOCALE" CONTENT="EN">test entry 1'), '.html')
-- The two entries below WILL NOT BE FOUND (the text source is taken from an NVARCHAR(MAX) value)
INSERT INTO TestTable (pk, varbinarycol, documentExtension)
VALUES (NEWID(), CONVERT(VARBINARY(MAX), CAST('<META NAME="MS.LOCALE" CONTENT="EN">test entry 2' AS NVARCHAR(MAX))), '.html')
INSERT INTO TestTable (pk, varbinarycol, documentExtension)
VALUES (NEWID(), CONVERT(VARBINARY(MAX), CAST('<META NAME="MS.LOCALE" CONTENT="EN">test entry 3' AS NVARCHAR(MAX))), '.html')
GO
-- Create the full text catalog
EXEC sp_fulltext_database 'enable'
GO
CREATE FULLTEXT CATALOG TEST AS DEFAULT
GO
CREATE FULLTEXT INDEX ON TestTable (varbinarycol TYPE COLUMN documentExtension LANGUAGE 1033)
KEY INDEX tablePK
GO
-- NOTE: You might need to give the catalog a chance to build before running the script below.
-- Now do a search that SHOULD RETURN 3 ROWS of data, but ONLY 1 ROW IS RETURNED
SELECT CAST(varbinarycol AS NVARCHAR(MAX))
FROM TestTable
WHERE CONTAINS(varbinarycol, 'test')
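A possible explanation, offered as an assumption rather than a confirmed cause: the bytes produced from the NVARCHAR(MAX) cast are UTF-16 but carry no byte-order mark, so the HTML filter may misdetect the encoding and index nothing. A hedged variation of the failing insert that prepends the UTF-16 LE BOM (0xFFFE):

-- Sketch: prefix the byte-order mark so the filter can detect the Unicode encoding
INSERT INTO TestTable (pk, varbinarycol, documentExtension)
VALUES (NEWID(),
        0xFFFE + CONVERT(VARBINARY(MAX), CAST('<META NAME="MS.LOCALE" CONTENT="EN">test entry 4' AS NVARCHAR(MAX))),
        '.html')

If a row inserted this way is found by the CONTAINS query after the catalog repopulates, the missing BOM is the culprit; if not, this guess can be ruled out.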
I am facing a problem while using SQL Server with a VB application: "Implicit conversion from data type text to nvarchar is not allowed. Use the convert function to run this query." When I look at the trace file, I see one stored procedure called, but no lines of code get executed, and immediately after that a ROLLBACK TRANSACTION occurs and the application fails. But to my surprise, I am able to do the same thing on a different machine using the same application and the same database on the same server with the same user ID. Can anyone explain the reason for this problem? I need this very urgently, so I would be obliged if anyone can come up with a quick response. Kind Regards, Amit Kumar
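As for the error itself, text cannot be compared with or assigned to nvarchar implicitly; the usual workaround (a sketch with hypothetical names) is to convert the text side explicitly, as the error message suggests:

-- SQL Server 2000: convert the text column before comparing it with an nvarchar value
SELECT *
FROM dbo.MyTable
WHERE CONVERT(NVARCHAR(4000), TextColumn) = @SomeNvarcharValue

Why it fails on only one machine is harder to say from here; differing client driver or component versions between the two machines would be a plausible place to start looking.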
I know that if I have an nvarchar column I can use an equality like = N'supersqlstring' so it doesn't get implicitly cast as a varchar, like it would if I were to write = 'supersqlstring'. And then I'll be a big SQL hero and all my stored procedures will run before a millisecond can whisper.
But if I'm comparing an nvarchar column to a varchar column, is it better to cast the varchar 'up' to an nvarchar or cast the nvarchar 'down' to a varchar?
For instance:
cast(a.varchar as nvarchar(100)) = an.nvarchar
or
cast(an.nvarchar as varchar(100)) = a.varchar
Leaving aside non-matching cases, like (at least I don't think) SQL considering the varchar n to be equal to the nvarchar ń, what's the best way to handle this?
Pretend for a moment that each column contains a mixed letter and number ID with no accented or wiggly-squiggly Unicode characters; it's just designs clashing.
Is there a performance hitch doing it one way or another? Should I use COLLATE? Should one of the columns be altered?
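Casting the varchar side up to nvarchar is generally the safer direction: varchar to nvarchar is lossless, and it matches what SQL Server's data type precedence does implicitly anyway, whereas casting the nvarchar side down can silently turn characters into '?'. A hedged sketch with hypothetical tables and columns:

SELECT *
FROM dbo.VarcharSide a
JOIN dbo.NvarcharSide an
    ON CAST(a.VarcharCol AS NVARCHAR(100)) = an.NvarcharCol

The performance caveat is that converting a.VarcharCol (explicitly or implicitly) can prevent an index seek on that column; if this comparison is hot, altering one of the columns so both sides share the same type is usually a cleaner long-term fix than sprinkling CAST or COLLATE.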
I have an issue where I am storing various international characters in nvarchar columns, but need to branch the data at one point of processing so that ASCII characters are run through an additional cleansing process and all non-ASCII characters are set aside.
Is there a way to identify which nvarchar values are within the ASCII range and can be converted to varchar without corruption? Also, the strings may contain a mix of english and international character sets, so the entire string must be checked and not just the first character.
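One approach that should work is a round-trip test: a value converts to varchar without corruption exactly when converting it to varchar and back yields the original string. A hedged sketch (the names are hypothetical); note it really tests convertibility to the column's code page, which is a superset of strict 7-bit ASCII:

-- Rows that survive the round trip can go through the varchar cleansing path; the rest are set aside
SELECT *,
       CASE WHEN CAST(CAST(NvarcharCol AS VARCHAR(4000)) AS NVARCHAR(4000)) = NvarcharCol COLLATE Latin1_General_BIN
            THEN 'convertible' ELSE 'set-aside' END AS Bucket
FROM dbo.SourceTable

The binary collation makes the comparison exact, so an accent-insensitive collation cannot mask a lossy conversion; if strict ASCII is required, an additional check for characters above NCHAR(127) can be layered on top.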
The following is the full error message. I am posting the code after.
Server Error in '/XprtDr' Application.
The data types text and nvarchar are incompatible in the equal to operator.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.SqlClient.SqlException: The data types text and nvarchar are incompatible in the equal to operator.
Source Error:
An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below. Stack Trace:
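This error typically comes from an auto-generated WHERE clause that compares a text column with an nvarchar parameter. Two common ways out, sketched with hypothetical names: convert the text column inside the query, or change the column type once so direct comparison works.

-- Option 1: convert in the query
SELECT * FROM dbo.Articles WHERE CAST(Body AS VARCHAR(MAX)) = @Body
-- Option 2 (SQL Server 2005+): retype the column, then compare it directly
ALTER TABLE dbo.Articles ALTER COLUMN Body NVARCHAR(MAX)

Option 2 is generally preferable going forward, since text/ntext are deprecated and the (max) types support the normal operators and functions.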
Hi, we are in the process of converting the data types of all fields from CHAR/VARCHAR/TEXT to NCHAR/NVARCHAR/NTEXT (DBCS). With more than 900 stored procedures, it looks like a real pain to make the modification in all of the SPs.
After failing to find any help on Google, I am posting this request. I am basically looking for an automated tool that can convert the data types in an SP based on the fields of the tables used in it, or at least something that can give me a list that would be helpful for manual refactoring.
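I don't know of a tool that rewrites the procedures automatically, but a query against the catalog views can at least produce the work list. A hedged sketch for SQL Server 2005 and later (on SQL Server 2000, syscomments serves the same purpose but splits long definitions across multiple rows):

-- Procedures whose definition still mentions the non-Unicode types; the patterns are rough
-- (for example '%varchar%' also matches nvarchar), so refine them as needed
SELECT SCHEMA_NAME(p.schema_id) AS SchemaName, p.name AS ProcedureName
FROM sys.procedures AS p
JOIN sys.sql_modules AS m ON m.object_id = p.object_id
WHERE m.definition LIKE '%varchar%'
   OR m.definition LIKE '%char(%'
   OR m.definition LIKE '% text%'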
Hi - I am changing a field from type nvarchar to type text, given that I need to store strings longer than 255 characters. To do this I change the data type in SQL Server, then I change the parameter code in the calling procedure, as per below:

cmd.Parameters.Append(cmd.CreateParameter("@title", adVarWChar, adParamInput, 255, title));

becomes:

cmd.Parameters.Append(cmd.CreateParameter("@title", adLongVarWChar, adParamInput, 1073741823, title));

However, when I do this, for some reason, the field is still limited to 255 characters - when I try to update the field with 256 characters, the error 'Application uses a value of the wrong type for the current operation.' occurs. Why is this? I've checked that the correct data is contained in the parameter. When I look at the data in the database, the column in question shows the content, whereas the next column, which has always been of type text, shows '<LongText>' - does this mean anything? Do I need to do something special to convert the column from nvarchar to text? Many thanks, Iain
Hi, I have a table in SQL Express named Contacts: ID (int) -primary key-, name (varchar(30)), lastname (varchar(30)), phone (varchar(15)), fax (varchar(15)), desc (text). In my default.aspx page I have a GridView that has the connection to this table. The GridView has the Editing and Deleting checkboxes enabled, but my problem is that I can't edit or delete any row when the page is running, and the message is this: "The data types text and nvarchar are incompatible in the equal to operator". It should work, but I don't know what is happening. Please, any help!
Hi, We have moved our Access 2003 database to SQL Server 2005. A number of the original tables contained MEMO fields and these became nvarchar(MAX) fields in SS 2005. We continue to use the original Access application and therefore have linked the tables to the Access application via ODBC connection. However, the fields that were originally Memo in Access native tables are brought over as TEXT fields (255 char limit) by the linking process.
Is there a way I can link the tables and have those big text fields be Memo fields in the linked tables?? Or is there some way to work around this? (Moving to Access 2007 is not an option at this time.) I would really appreciate any suggestions! Thanks.
I am using a DTS package to copy columns from one database to another. How do I do the same thing, but first check whether the data already exists in the destination table and only copy it if it doesn't?
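If the copy can be expressed as an Execute SQL task (or a plain INSERT run against the destination) rather than a straight column copy, a NOT EXISTS guard keeps rows that are already present from being inserted again. A hedged sketch with hypothetical database, table, and key names:

INSERT INTO DestDB.dbo.DestTable (ID, ColA, ColB)
SELECT s.ID, s.ColA, s.ColB
FROM SourceDB.dbo.SourceTable AS s
WHERE NOT EXISTS (SELECT 1 FROM DestDB.dbo.DestTable AS d WHERE d.ID = s.ID)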
I'm trying to import some images (BMP, JPEG and GIF) into a table using the TEXTCOPY tool. However, when doing this I keep getting the error:
Error: Text or Image pointer and timestamp retrieval failed
Does anyone have an idea what can be done about this? I'm running TEXTCOPY from a stored procedure I created that calls TEXTCOPY through the xp_cmdshell stored procedure. This isn't the problem though, as I get the same error when running TEXTCOPY from the command line.
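One thing worth checking, offered as an assumption since TEXTCOPY needs a valid text pointer to write to: the image column of the target row must not be NULL before TEXTCOPY runs, otherwise the pointer/timestamp retrieval can fail exactly like this. Initializing the column first may help; a hedged sketch with hypothetical names:

-- Give the row a non-NULL image value so a text pointer exists, then run TEXTCOPY against it
UPDATE dbo.Pictures SET Img = 0x0 WHERE PictureID = 42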
Can you not add a text column to a full text index? If I change it to an nvarchar it works fine, but if I change it to a text column it won't index. Anyone know how to fix this?
First I have to apologize for writing here; this is not the right place for my problem, but I could not find any better place to ask for help. I'm creating a new table and I have to copy some columns from the old one to the new. The description of the old table is something like this: Table1: name char(32) not null primary key, variable1 char(32), variable2 char(32). Now I would like to copy columns variable1 and variable2 to the new table (called Table2). I need to change the names to id-numbers. (Don't ask me why, I just have to :) I have created Table2: id int not null primary key, newvariable1 char(32), newvariable2 char(32). Now the problem is: how can I copy just columns variable1 and variable2 without the name column (which is the primary key)? I just get an error message: DB21034E The command was processed as an SQL statement because it was not a valid Command Line Processor command. During SQL processing it returned: SQL0407N Assignment of a NULL value to a NOT NULL column "TBSPACEID=2, TABLEID=578, COLNO=0" is not allowed. SQLSTATE=23502. It just says (if I understood the message) that I can't leave the id column blank. How can I generate the keys for Table2? It would be nice to get keys 1, 2, 3... Please help me. :)
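Since the id column cannot be left NULL, one option is to generate the keys while copying. A hedged sketch using ROW_NUMBER(), which DB2 supports, numbering the rows 1, 2, 3, ... ordered by name:

INSERT INTO Table2 (id, newvariable1, newvariable2)
SELECT ROW_NUMBER() OVER (ORDER BY name), variable1, variable2
FROM Table1

Alternatively, Table2 could be recreated with id defined as a generated identity column, after which only variable1 and variable2 need to be selected from Table1.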
I have data in a table. I want to transpose it so that the values in the rows are placed in columns and the columns become rows. For example, there is one table:

name id dept
a    1  x
b    2  y
c    3  z

I want the resultant table to look like this:

a b c
1 2 3
x y z
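A transpose like this is usually done by unpivoting the columns into rows and then aggregating them back per name. A hedged sketch, assuming a hypothetical table Src(name, id, dept) with exactly the names a, b, c shown above:

SELECT MAX(CASE WHEN name = 'a' THEN val END) AS a,
       MAX(CASE WHEN name = 'b' THEN val END) AS b,
       MAX(CASE WHEN name = 'c' THEN val END) AS c
FROM (
    SELECT name, 'id' AS attr, CAST(id AS VARCHAR(30)) AS val FROM Src
    UNION ALL
    SELECT name, 'dept', dept FROM Src
) AS unpivoted
GROUP BY attr

This produces one output row per original column (id, dept) with the names as column headers; for an arbitrary or unknown set of names, the same statement has to be built dynamically.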
I want to copy two columns from one database to another database. I managed to do this using an OLE DB source and an OLE DB destination.
Now I want to merge two columns into one. Source database: Column A: first name, Column B: last name. Destination database: Column 1: first and last name of the customer.
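The merge itself is plain string concatenation, which can be done either in the source query or with a Derived Column transformation in the data flow. A hedged T-SQL sketch with hypothetical names:

-- ISNULL guards keep the whole result from becoming NULL when one part is missing
SELECT ISNULL(FirstName, '') + ' ' + ISNULL(LastName, '') AS CustomerFullName
FROM SourceDB.dbo.Customers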
I have 1 table in my db that I have imported from a txt file using DTS.
In my database "DOJ" I have a table named "DOJGRAB" with ALL my data fresh from a import using DTS.
Also in "DOJ" are 6 other tables (NAME, PERSON, etc...)
How can I parse the columns out from my DOJGRAB table to fill existing columns in my other 6 tables? Basically a copy of specific columns from DOJGRAB to their proper places in the other tables.
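Each destination table can be filled with an INSERT ... SELECT that picks just the columns it needs from DOJGRAB. A hedged sketch; the column names are made up:

INSERT INTO dbo.NAME (LastName, FirstName)
SELECT LastName, FirstName
FROM dbo.DOJGRAB

Repeat with the appropriate column list for PERSON and the other tables, adding DISTINCT or a WHERE clause if DOJGRAB carries duplicates.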
Can somebody post an example of the syntax I would use to copy a column from one table to another? Thanks!
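If the destination rows already exist and only the values need to be copied across, an UPDATE with a join does it; a hedged sketch with hypothetical table, key, and column names:

UPDATE t
SET t.SomeColumn = s.SomeColumn
FROM dbo.TargetTable AS t
JOIN dbo.SourceTable AS s ON s.ID = t.ID

If the rows do not exist yet, an INSERT INTO dbo.TargetTable (ID, SomeColumn) SELECT ID, SomeColumn FROM dbo.SourceTable is the equivalent.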
I would like to dynamically copy a table in Transact SQL, like this:
SELECT * INTO NEW_TABLE FROM MY_TABLE WHERE 1 = 2
This creates an empty NEW_TABLE as desired. However, NEW_TABLE retains the column nullability of MY_TABLE; I would like NEW_TABLE to have all its columns nullable.
I tried to write Transact SQL logic to update the isnullable column in syscolumns for NEW_TABLE, but was told that the entire SQL Server would have to be reconfigured to allow this.
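Rather than editing syscolumns, the supported route is an ALTER TABLE ... ALTER COLUMN ... NULL per column, and those statements can be generated from the metadata. A rough sketch that ignores decimal precision/scale and (max) types, so treat it as a starting point rather than a finished script:

SELECT 'ALTER TABLE NEW_TABLE ALTER COLUMN [' + COLUMN_NAME + '] ' + DATA_TYPE
       + CASE WHEN CHARACTER_MAXIMUM_LENGTH > 0
              THEN '(' + CAST(CHARACTER_MAXIMUM_LENGTH AS VARCHAR(10)) + ')'
              ELSE '' END
       + ' NULL'
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'NEW_TABLE'

Run the generated statements (after checking them) and every column of NEW_TABLE becomes nullable without touching the system tables.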