Does anyone have an idea of how VARCHAR and CLOB compare in terms of performance? For example, if I want to insert or update 4,000 characters in a field of a row and I am unsure whether that column would be better declared as VARCHAR or CLOB, what do you suggest?
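For illustration, these are the two declarations I am weighing (Oracle syntax assumed; the table and column names are just placeholders):

-- option 1: inline storage, capped at 4000 characters
CREATE TABLE doc_v (id NUMBER, body VARCHAR2(4000));
-- option 2: LOB storage, no practical length cap
CREATE TABLE doc_c (id NUMBER, body CLOB);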
I would like to know how SQL Server 2012 treats a literal string in a comparison like the one below. I want to ensure that the server isn't implicitly converting the value as it runs the SQL; if it is, I would rather change the data type in one of my tables, as Unicode isn't required.
Declare @T Table (S varchar(2))
Declare @S nvarchar(255)

Insert into @T Values ('AR'), ('AT'), ('AW')
Set @S = 'Auto Repairs'

Select *
from @T T
where case @S
          when 'Auto Repairs' then 'AR'
          when 'Auto Target'  then 'AT'
          when 'Auto Wash'    then 'AW'
      end = T.S

To summarise: in the above, would 'AR', 'AT' and 'AW' in the CASE expression be treated as nvarchar, since that is the type of the variable the CASE is wrapped around, or as varchar, since that is the type of the column they are being compared to?
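For what it's worth, here is a quick sketch of how I thought I could check what type the CASE expression actually ends up as (SQL_VARIANT_PROPERTY just reports the base type of whatever expression it is given):

DECLARE @S nvarchar(255) = N'Auto Repairs'
SELECT SQL_VARIANT_PROPERTY(
           CASE @S WHEN 'Auto Repairs' THEN 'AR'
                   WHEN 'Auto Target'  THEN 'AT'
                   WHEN 'Auto Wash'    THEN 'AW' END,
           'BaseType') AS case_result_type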
There are so many ways to use a database in ASP.NET/ADO.NET that I'm a bit confused about how they differ from a performance point of view.

Apparently SqlDataSource in DataReader mode is faster than DataSet mode, at the cost of losing some of the built-in convenience features. What about SqlDataSource in DataReader mode versus manual binding in code? Say, creating a SqlDataSource ds1 and setting DataSourceID on the GridView, versus manually creating the SqlConnection, SqlCommand and SqlDataReader objects, assigning the reader to the GridView's DataSource and calling DataBind().

Also, GridView is a very convenient control for many basic tasks, but more complex scenarios require a lot of customization and modification. If I do not use GridView at all and build the entire thing from scratch with basic web controls such as Table and Label controls, manually reading and displaying everything from a DataReader object, how would the performance compare to the GridView data-binding route?
I recently converted a column on one of my tables from int to bigint. The column holds a generic row id and contains duplicate values. I am trying to perform a self join via the following:

SELECT a.row_id
FROM test_db a
INNER JOIN test_db b ON b.row_id < a.row_id

This code used to work when the column was an int, but since the conversion to bigint I am seeing high CPU usage. I don't understand why the change to bigint would cause such an issue. The OS and SQL Server are both 64-bit.
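For reference, this is the sort of measurement I am using to compare the before/after cost while I investigate (standard SET STATISTICS output; nothing here changes the query itself):

SET STATISTICS IO, TIME ON
SELECT a.row_id
FROM test_db a
INNER JOIN test_db b ON b.row_id < a.row_id
SET STATISTICS IO, TIME OFF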
We use SQL Server 7.0 and I would like some guidelines for choosing between the varchar and char data types. I have heard that varchar takes more computation than char, which would suggest declaring every character column as char; but then it takes more storage. Is there a guideline for when to choose varchar? And also, is the NULL value type the same as the varchar data type?
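For example, this is the kind of storage difference I mean (just a quick illustration; the sizes are arbitrary):

DECLARE @c char(10), @v varchar(10)
SET @c = 'abc'
SET @v = 'abc'
-- char pads to its declared length, varchar stores only the data entered
SELECT DATALENGTH(@c) AS char_bytes, DATALENGTH(@v) AS varchar_bytes  -- returns 10 and 3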
We have a few stored procedures that use the nvarchar data type. This was not an issue on SQL Server 7.0, but on SQL Server 2000 it has become a big one. For example, a query that runs for 3 minutes on SQL Server 2000 runs in 2 seconds once NVARCHAR is replaced with VARCHAR. The biggest challenge I have is with tables and user-defined data types of NVARCHAR that have been bound to the tables. How can I alter those without data corruption?
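For the plain columns (not the ones using the bound user-defined types), the kind of change I have in mind is just an ALTER COLUMN; the table and column names below are placeholders, and I realize the UDT-bound columns will need more than this:

ALTER TABLE dbo.SomeTable ALTER COLUMN SomeColumn varchar(255) NOT NULL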
Does using varchar in SQL Server 2005 significantly affect performance on updates?
Why or why not?
I have seen many SQL Server databases with many varchar columns. In databases other than SQL Server, the advice is often not to use varchar because it significantly impacts performance.

I am trying to weigh when it is worth trading space for performance.
Hi, my package dumps errors into a table. The problem is that it cannot write the Error Output column to a varchar field. I have added a script component in between to transform it to a string, but with no success.
I tried ErrorOutput.GetBlobData(0, ErrorOutput.Length)
but when I query the database, the column just contains "System.Byte[]".
Using a code snippet borrowed from a co-worker, I have put together a query that, among other things, pulls a list value out of an xml clob field and displays it in the query results. My query as it stands right now is below, followed by a snippet from the xml clob that I am pulling from.
select * from (
    Select Wtr_Service_Tag,
           Wtr_Tran_Origin,
           Wtr_Send_Date,
           Wtr_Receive_Date,
           to_char(substr(wtr_req_xml,
                          instr(substr(wtr_req_xml, 1, 8000), 'SID') + 8,
                          12)) Asset_Tag
    from ws_transactions
    Where Wtr_Service_Tag In ('20458749610')
[Code] ....
This query is only able to pull the first value in the list.
I have two questions...
[1] How can I edit this query to pull all of the list items when there are more than one? I have another field, in a separate table, that I can pull from to get that number.
[2] This one may be more complex. As currently written, the query pulls a fixed number of characters from the XML CLOB and either returns not enough data or too much, because the values I need to pull can be of varying lengths, and I have no way to query what those lengths might be.
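For both questions, here is a rough sketch of the direction I was thinking of instead of the fixed-length SUBSTR/INSTR, using XMLTABLE; I am guessing at the XML structure (a repeating SID element) and at the column size, so treat it only as an outline:

SELECT t.wtr_service_tag,
       x.asset_tag
FROM   ws_transactions t,
       XMLTABLE('//SID'
                PASSING XMLTYPE(t.wtr_req_xml)
                COLUMNS asset_tag VARCHAR2(50) PATH '.') x
WHERE  t.wtr_service_tag IN ('20458749610')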
The Microsoft JDBC driver doesn't accept CLOB and BLOB fields. Does anyone know about this problem? Is it possible to find another driver that works correctly and is free?
My source data is an XML file stored in a compressed CLOB column in Oracle. I need to migrate the data to SQL Server 2012, uncompressing the XML along the way.

Do I need to define an XML column in SQL Server 2012 for storing the uncompressed CLOB values?

How can I uncompress the CLOB and extract the required data from the XML using SSIS?
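For the first question, the kind of staging table I am considering on the SQL Server side looks roughly like this (the names are placeholders):

CREATE TABLE dbo.Stg_SourceXml
(
    source_id   int NOT NULL,
    payload_xml xml NULL  -- the uncompressed CLOB contents would land here
)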
I have two tables, Encounter and Encounter_History. They have the same columns, and one of the columns is of type CLOB. My requirement is to retrieve all the distinct records from both tables, ordered by a date column. The problem is that UNION does not work with the CLOB data type.

I know it would work if I used UNION ALL, but that returns duplicate records.

Please give me suggestions on how to solve this problem.

For example, the following query does not work, since column1 is a CLOB data type:
select column1 from table1 union select column1 from table2
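One workaround I have been sketching is to keep UNION ALL for the CLOB column and remove the duplicates by key instead; the key column (id) and date column (encounter_date) below are just placeholders for whatever actually identifies a duplicate row:

SELECT id, column1, encounter_date
FROM (
    SELECT e.*, ROW_NUMBER() OVER (PARTITION BY id ORDER BY encounter_date DESC) AS rn
    FROM (
        SELECT id, column1, encounter_date FROM table1
        UNION ALL
        SELECT id, column1, encounter_date FROM table2
    ) e
) d
WHERE d.rn = 1
ORDER BY encounter_date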
Hi everyone, I encountered the error "Need to run the object to perform this operation. Code execution exception: EXCEPTION_ACCESS_VIOLATION" when trying to import data from Oracle to MS SQL Server with Enterprise Manager (version 8.0) using the DTS Import/Export Wizard. There are 508 rows in the Oracle table, and the first 42 rows were imported into SQL Server. Does anyone know what the above error message means and what causes the rest of the rows to fail? Thanks very much in advance! Rene Z.
I have looked far and wide and have not found anything that resolves this issue.

I am moving data from DB2 using the Microsoft OLE DB Provider for DB2. The OLE DB source sees the column as DT_TEXT. I set up a destination to SQL Server 2005 and everything looks good until I try to run the package.
I get the error: [OLE DB Source [277]] Error: An OLE DB error has occurred. Error code: 0x80040E21. An OLE DB record is available. Source: "Microsoft DB2 OLE DB Provider" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
[OLE DB Source [277]] Error: Failed to retrieve long data for column "LIST_DATA_RCVD".
[OLE DB Source [277]] Error: There was an error with output column "LIST_DATA_RCVD" (324) on output "OLE DB Source Output" (287). The column status returned was: "DBSTATUS_UNAVAILABLE".
[OLE DB Source [277]] Error: The "output column "LIST_DATA_RCVD" (324)" failed because error code 0xC0209071 occurred, and the error row disposition on "output column "LIST_DATA_RCVD" (324)" specifies failure on error. An error occurred on the specified object of the specified component.
[DTS.Pipeline] Error: The PrimeOutput method on component "OLE DB Source" (277) returned error code 0xC0209029. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Any suggestions on how I can get the large string data in the varchar column in DB2 into the varchar(max) column in SQL Server 2005?
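One thing I plan to try is casting the column in the DB2 source query so the provider no longer treats it as long data; the schema/table name and the length below are guesses:

SELECT CAST(LIST_DATA_RCVD AS VARCHAR(8000)) AS LIST_DATA_RCVD
FROM   MYSCHEMA.MYTABLE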
I am trying to create a stored procedure in the SQL Server Management Studio console and I keep getting errors. Here is my stored procedure:
CREATE PROCEDURE [dbo].[sqlOutlookSearch]
    -- Add the parameters for the stored procedure here
    @OLIssueID int = NULL,
    @searchString varchar(1000) = NULL
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    -- Insert statements for procedure here
    IF @OLIssueID <> 11111
        SELECT * FROM [OLissue], [Outlook]
        WHERE [OLissue].[issueID] = @OLIssueID
          AND [OLissue].[issueID] = [Outlook].[issueID]
          AND [Outlook].[contents] LIKE + ''%'' + @searchString + ''%''
    ELSE
        SELECT * FROM [Outlook]
        WHERE [Outlook].[contents] LIKE + ''%'' + @searchString + ''%''
END
And the error I kept getting is:
Msg 402, Level 16, State 1, Procedure sqlOutlookSearch, Line 18
The data types varchar and varchar are incompatible in the modulo operator.
Msg 402, Level 16, State 1, Procedure sqlOutlookSearch, Line 21
The data types varchar and varchar are incompatible in the modulo operator.
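For reference, the filter I am actually trying to express is a simple contains-style match, something like this (sketch only):

SELECT * FROM [Outlook]
WHERE [Outlook].[contents] LIKE '%' + @searchString + '%'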
For the life of me I cannot figure out why SSIS will not convert this varchar data. Instead of using the table-to-table method, I wrote a SQL query so that I could transform the ntext data type to varchar(512), understanding that Microsoft is natively moving toward all-Unicode applications.

The source fields from Access are int, int, int and varchar(512). The same is true of the destination within SQL Server 2005; the field 'Answer' is the varchar field in question.
I get the following error
Validating (Error)
Messages
Error 0xc02020f6: Data Flow Task: Column "Answer" cannot convert between unicode and non-unicode string data types. (SQL Server Import and Export Wizard)
Error 0xc004706b: Data Flow Task: "component "Destination - Query" (28)" failed validation and returned validation status "VS_ISBROKEN". (SQL Server Import and Export Wizard)
Error 0xc004700c: Data Flow Task: One or more component failed validation. (SQL Server Import and Export Wizard)
Error 0xc0024107: Data Flow Task: There were errors during task validation. (SQL Server Import and Export Wizard)
DTS used to be a very strong tool, but a simple import such as this is causing me extreme grief, and I am wondering if SQL Server 2005 is ready for prime time. FYI, SP1 is installed. I am running this from a workstation and not on the server, if that makes a difference.
I have a table that contains a lot of demographic information. The data is usually small (fewer than 20 characters) but occasionally needs to handle large values (up to 250 characters). Right now the column is set up as varchar(max), and I don't think that is what I want.

How does varchar(max) store data differently from varchar(250)? Either way it has to hold the length information, doesn't it? So the word "Crackers" has 8 characters plus the information saying it is 8 characters long in both cases, meaning it takes up the same amount of space?

My other concern is running queries against it: does varchar(max) choke up queries because the field cannot be properly analyzed? Is varchar(250) any better?

Should I just go with char(250) and watch my database size explode?

Usually the 250-character data contains a lot of blank space that is removed by a stored procedure, so it is not 250 characters for long.
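To make the "Crackers" question concrete, this is the comparison I have in mind (just a sketch):

DECLARE @small varchar(250), @big varchar(max)
SET @small = 'Crackers'
SET @big = 'Crackers'
-- both report 8 bytes of actual data; the declared type caps the maximum, not the stored size
SELECT DATALENGTH(@small) AS small_bytes, DATALENGTH(@big) AS big_bytes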
I am new to T-SQL. I am trying to use T-SQL to merge the contents of two tables (table1 and table2) into one table, making sure there is no duplication.

I wonder if anybody could let me have some simple code.
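Something along these lines is what I am after, assuming both tables share the same columns (col1 and col2 here are placeholders):

INSERT INTO table1 (col1, col2)
SELECT t2.col1, t2.col2
FROM table2 t2
WHERE NOT EXISTS (SELECT 1
                  FROM table1 t1
                  WHERE t1.col1 = t2.col1
                    AND t1.col2 = t2.col2)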
Hey guys, what would be the easiest way to create a report of value changes for particular records from one day to the next? Any suggestions would be greatly appreciated.
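For illustration, something like this self join against a daily snapshot is what I mean (the table and column names are made up):

SELECT cur.record_id,
       prev.value_col AS yesterday_value,
       cur.value_col  AS today_value
FROM daily_snapshot cur
JOIN daily_snapshot prev
  ON prev.record_id = cur.record_id
 AND prev.snapshot_date = DATEADD(day, -1, cur.snapshot_date)
WHERE prev.value_col <> cur.value_col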
Hi there! Can anyone help me out? I need to compare a date from the database and the system date, which will be coded in a stored procedure in SQL.
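Roughly, this is the comparison I am trying to write inside the procedure (table and column names are placeholders):

IF EXISTS (SELECT 1
           FROM dbo.Orders
           WHERE CONVERT(varchar(10), OrderDate, 120) = CONVERT(varchar(10), GETDATE(), 120))
    PRINT 'There is a record for today'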
I have a table with a column of the bit data type. When I execute the stored procedure line "if @bitvalue = 1 begin ..." and the value is passed as 0, the statements beneath the BEGIN still execute. What am I doing wrong?
Is there a way to compare two similar tables? I am mainly interested in finding out whether the data content is exactly the same between the two tables.
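For example, something like this pair of queries is the kind of check I mean, assuming both tables have identical column lists and the server is SQL Server 2005 or later (EXCEPT is not available before that):

SELECT * FROM TableA EXCEPT SELECT * FROM TableB  -- rows in A missing from B
SELECT * FROM TableB EXCEPT SELECT * FROM TableA  -- rows in B missing from A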
We are converting our project to a new version. A lot of changes have been made in the new version, including normalization/denormalization. I need to compare the old and new databases. Do you have any standard script or procedure, along the lines of the SQL Compare software? Let me know what possibilities we need to check. Your help is appreciated. Thanks, Ravi
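A crude column-level diff like the one below is the sort of thing I could script myself; the database names are placeholders, it assumes SQL Server 2005 or later for EXCEPT, and it only covers columns, not constraints or indexes:

SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
FROM OldDb.INFORMATION_SCHEMA.COLUMNS
EXCEPT
SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
FROM NewDb.INFORMATION_SCHEMA.COLUMNS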
Assume a table with a column defined as char or varchar. I have a SQL query like this:
Select * from table1 where column1='Building'
It returns the same result as
Select * from table1 where column1='BUILDING'
It is my understanding that SQL Server (version 7 or 2000; I tried both) is case-insensitive by default when it is installed. If I want SQL Server to be case-sensitive with my char or varchar columns, where can I set that?

Is this setting at the database level or the server level? What is the setting that controls it?
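For example, is a per-query or per-column collation along these lines the right direction? (This is SQL Server 2000 syntax; the collation name is just one example of a case-sensitive collation, and the varchar(50) length is a guess.)

SELECT * FROM table1 WHERE column1 = 'Building' COLLATE Latin1_General_CS_AS
ALTER TABLE table1 ALTER COLUMN column1 varchar(50) COLLATE Latin1_General_CS_AS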
Best Regards, Alain Gagne, Lead DBA gagnea@msagroup.com
Morning! Folks, I want some links to solid material on new features, compliance levels, performance comparisons, TPC tests, etc. for SQL Server 2000 and especially Yukon. I have to submit a document on the top database features to be used for a medical-billing application that is being planned.
I have two tables that I need to link by their date fields. In one table the date field is defined as varchar(23), and the time portion is always zeros. Example: '2005-12-27 00:00:00.000'

The other table's field is defined as datetime, and it does contain both the date and the time. Example: 2005-12-27 08:00:35.000

The problem I am having is that 2005-12-27 00:00:00.000 does not equal 2005-12-27 08:00:35.000.

Because I will never have more than one record on the same date, I would like to compare only the date portion. Example: 2005-12-27 = 2005-12-27

Since the fields are two different data types, this is giving me a problem. Could someone please help? I have tried everything I know to do.

What I really need is a way to format the datetime field's date into a string such as '2005-12-27'.
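This is roughly the join I am trying to make work (table and column names are placeholders; CONVERT style 120 trimmed to 10 characters gives the 'yyyy-mm-dd' form):

SELECT *
FROM TableWithVarchar a
JOIN TableWithDatetime b
  ON LEFT(a.date_text, 10) = CONVERT(varchar(10), b.date_time, 120)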