Is It Possible To Use A Varchar Variable As A Table Name?
Mar 31, 2008
Hi,
What I am trying to achieve is renaming a table by appending a timestamp value if the table exists.
I am using a varchar variable to achieve this.
I feel stuck because what I need to do is rename the table and then use the new name to query that table.
This is what I am doing.
DECLARE @NewTableName varchar(100)
IF OBJECT_ID('TableName') IS NOT NULL
BEGIN
SET @NewTableName = 'ExistingTableName' + CONVERT(VARCHAR(25),GETDATE(),126)
END
Please let me know if there is a way to achieve the table renaming and querying the renamed table!
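A minimal sketch of one way to do this, assuming the table is dbo.ExistingTableName: rename it with sp_rename, then query the renamed table through dynamic SQL, since a table identifier cannot come straight from a variable.
DECLARE @NewTableName sysname, @sql nvarchar(max)
IF OBJECT_ID('dbo.ExistingTableName', 'U') IS NOT NULL
BEGIN
    -- build a new name without characters that are awkward in identifiers
    SET @NewTableName = 'ExistingTableName_' + REPLACE(REPLACE(REPLACE(CONVERT(varchar(25), GETDATE(), 126), ':', ''), '-', ''), '.', '')
    EXEC sp_rename 'dbo.ExistingTableName', @NewTableName
    -- the new name is only known at run time, so querying it needs dynamic SQL
    SET @sql = N'SELECT * FROM dbo.' + QUOTENAME(@NewTableName)
    EXEC sp_executesql @sql
END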
View 3 Replies
May 19, 2004
Guys.
I have some data like this one:
'Uploadfiles\CompanyID\ProjectCode\Files\File1.doc'
P.S: the values may vary each time.
I need to programmatically extract from that kind of string only File1.doc.
How can I do it dynamically? Any function? Statement? Please help.
The idea is getting only the string after the last '\'.
Thanks
Fr
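A minimal sketch of one way to do this in T-SQL, assuming the separator is a backslash:
DECLARE @path varchar(200)
SET @path = 'Uploadfiles\CompanyID\ProjectCode\Files\File1.doc'
SELECT CASE WHEN CHARINDEX('\', @path) = 0 THEN @path
            ELSE RIGHT(@path, CHARINDEX('\', REVERSE(@path)) - 1)
       END AS FileName   -- returns 'File1.doc'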
View 1 Replies
View Related
Jul 26, 2005
Hi, is there a data type for which we don't need to specify the length but that can grow dynamically? Instead of using varchar(2000), a type that acts like a varchar but with unlimited length.... Thanks,
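For what it's worth, on SQL Server 2005 and later varchar(max) behaves like a varchar but can grow to roughly 2 GB; on SQL Server 2000 the closest option is the text data type. A minimal sketch:
CREATE TABLE dbo.Notes (NoteId int IDENTITY(1,1) PRIMARY KEY, Body varchar(max))
INSERT INTO dbo.Notes (Body) SELECT REPLICATE(CONVERT(varchar(max), 'x'), 100000)   -- far beyond 8,000 characters
SELECT NoteId, LEN(Body) AS BodyLength FROM dbo.Notes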
View 5 Replies
View Related
Feb 26, 2008
I'm trying to learn how to calculate table size.
I understand some of it, but I'm stuck at calculating the varchars.
Ex. I have 2 varchar columns:
- varchar(50)
- varchar(100)
Am I supposed to find the average length for them?
Am I supposed to use that to add up to my ROW SIZE?
And also, after I get the average, do I add 2 bytes for variable-column overhead and another 2 bytes for the row pointer in the row offset array?
Please help me ASAP, before tomorrow night.
Thanks!
I have a test.
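Rather than estimating the averages by hand, you can measure them; a minimal sketch where dbo.MyTable, Col1 and Col2 are placeholder names:
SELECT AVG(DATALENGTH(Col1)) AS AvgCol1Bytes, AVG(DATALENGTH(Col2)) AS AvgCol2Bytes
FROM dbo.MyTable   -- average bytes actually stored in each varchar column
SELECT avg_record_size_in_bytes
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.MyTable'), NULL, NULL, 'DETAILED')   -- actual average row size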
View 2 Replies
View Related
Mar 13, 2004
Why can a varchar variable only hold 4,000 bytes?
For example:
in a storedprocedure
declare @aa varchar(8000)
......
while
select @aa=@aa+@otherinfo
end
when the length is more than 4,000, the data at the end is lost
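One common cause, sketched below as an assumption about the original code: if any operand in the concatenation is nvarchar, the whole expression is evaluated as nvarchar and capped at 4,000 characters, even though @aa itself is varchar(8000). Keeping every operand varchar (or using varchar(max)) avoids the cut-off.
DECLARE @aa varchar(8000), @otherinfo nvarchar(100), @i int
SET @aa = ''
SET @otherinfo = REPLICATE(N'x', 100)
SET @i = 0
WHILE @i < 100
BEGIN
    SET @aa = @aa + @otherinfo   -- mixed varchar/nvarchar: the intermediate result is nvarchar(4000)
    SET @i = @i + 1
END
SELECT LEN(@aa)   -- 4000, not 8000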
View 1 Replies
View Related
Dec 28, 2007
Hi there, I have a question regarding data type conversions... I can't convert the above-mentioned data type. I always get an error message that the conversion fails, no matter whether CONVERT or CAST is used.
How would you convert the above-mentioned variable into float?
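A minimal sketch of the usual culprits and fixes (the value here is a made-up example): the string contains something float cannot parse, such as spaces, thousands separators, a comma as the decimal mark, or it is empty.
DECLARE @v varchar(50)
SET @v = ' 1234,56 '
SELECT CONVERT(float, REPLACE(LTRIM(RTRIM(@v)), ',', '.'))   -- clean the value first, then convert
-- on SQL Server 2012 or later, TRY_CONVERT(float, @v) returns NULL instead of raising an error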
View 8 Replies
View Related
Jun 2, 2008
I cannot use dynamic SQL in the current context, so I need some help regarding this. I am developing a stored procedure to update a table, sending column names as parameters, but I am not able to use them as given below: INSERT INTO Books (@Column1, @Column2) VALUES... Is there any way to execute this without using dynamic SQL? Thanks.
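For what it's worth, identifiers genuinely cannot be passed as parameters, so without dynamic SQL the realistic option is to branch over a known set of column names. A minimal sketch with hypothetical Books columns:
CREATE PROCEDURE dbo.InsertBook
    @ColumnChoice sysname,
    @Value1 varchar(100),
    @Value2 varchar(100)
AS
BEGIN
    IF @ColumnChoice = 'Title'
        INSERT INTO dbo.Books (Title, Author) VALUES (@Value1, @Value2)
    ELSE IF @ColumnChoice = 'ISBN'
        INSERT INTO dbo.Books (ISBN, Author) VALUES (@Value1, @Value2)
    ELSE
        RAISERROR('Unsupported column name.', 16, 1)
END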
View 1 Replies
View Related
Apr 2, 2008
Hi,
I'm seeing a problem with printing very long strings using the PRINT
command on a VARCHAR(MAX) variable. After a certain number of
characters the string is truncated... it looks like the limit is
around 8,000 characters.
Does anyone know of a solution or a workaround for this?
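A minimal sketch of a common workaround: PRINT only emits the first 8,000 characters of a varchar(max), so print the value in 8,000-character chunks (or return it with SELECT instead).
DECLARE @s varchar(max), @pos int
SET @s = REPLICATE(CONVERT(varchar(max), 'abcdefghij'), 2000)   -- 20,000 characters
SET @pos = 1
WHILE @pos <= LEN(@s)
BEGIN
    PRINT SUBSTRING(@s, @pos, 8000)
    SET @pos = @pos + 8000
END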
View 16 Replies
View Related
Jan 31, 2008
I have created a table named Table, with name as varchar and id as int. I started inserting rows like: insert into Table values ('arun', 20). Yes, I have inserted a row in the table. Now I have got the values "arun's", 50: insert into Table values ('arun's', 20). SQL Server gives me an error instead of inserting the row. How would you solve this problem?
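A minimal sketch of the two usual fixes: double the embedded single quote inside the literal, or pass the value as a parameter so no escaping is needed (dbo.[Table], name and id mirror the names used above).
INSERT INTO dbo.[Table] (name, id) VALUES ('arun''s', 20)
EXEC sp_executesql
    N'INSERT INTO dbo.[Table] (name, id) VALUES (@name, @id)',
    N'@name varchar(50), @id int',
    @name = 'arun''s', @id = 20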
View 3 Replies
View Related
Aug 5, 2015
declare @var varchar(8000)
set @var='Name1~50~20~50@Name2~25.5~50~63@Name3~30~80~43@Name4~60~80~23'
---------------------
Create table #tmp(id int identity(1,1),Name varchar(20),Value1 float,Value2 float,Value3 float)
Insert into #tmp (Name,Value1,Value2,Value3)
Values ('Name1',50,20,50 ), ('Name2',25.5,50,63 ), ('Name3',30,80,43 ), ('Name4',60,80,23)
select * from #tmp
I want to convert @var into the same shape as the #tmp table:
"@" - delimiter splits into rows
"~" - delimiter splits into columns
View 6 Replies
View Related
Nov 4, 2015
CREATE TABLE #T(branchnumber VARCHAR(4000))
insert into #t(branchnumber) values (005)
insert into #t(branchnumber) values (090)
insert into #t(branchnumber) values (115)
insert into #t(branchnumber) values (210)
insert into #t(branchnumber) values (216)
[code]....
I have a parameter which should take multiple values and pass them to the code that I use. For this I created a parameter, and temporarily, for testing, I am passing some values into it. Using dynamic SQL I am converting the multiple values into multiple records as rows in another variable (called @QUERY). My question is: how do I insert the values from the variable into a table (table variable, temp table or CTE)? Or is there any way to parse the multiple values into a table, so that if we pass multiple values into a parameter, they go into a table as rows?
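A minimal sketch of parsing a delimited parameter straight into a table, with no dynamic SQL (note that unquoted values such as 005 arrive as the integer 5, so quote them if the leading zeros matter):
DECLARE @branches varchar(4000), @pos int
SET @branches = '005,090,115,210,216'
DECLARE @T TABLE (branchnumber varchar(4000))
WHILE LEN(@branches) > 0
BEGIN
    SET @pos = CHARINDEX(',', @branches)
    IF @pos = 0
    BEGIN
        INSERT INTO @T (branchnumber) VALUES (@branches)
        SET @branches = ''
    END
    ELSE
    BEGIN
        INSERT INTO @T (branchnumber) VALUES (LEFT(@branches, @pos - 1))
        SET @branches = SUBSTRING(@branches, @pos + 1, LEN(@branches))
    END
END
SELECT branchnumber FROM @T
-- on SQL Server 2016 or later: INSERT INTO @T SELECT value FROM STRING_SPLIT(@branches, ',')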
View 6 Replies
View Related
Aug 29, 2007
Which is more efficient? Which takes less memory? How is the memory allocation done for both of the types?
View 1 Replies
View Related
Jan 17, 2008
Hi,
I have a Users table that I use for membership. I am using username varchar(30) as the primary key for this table since username will always be unique.
The question I have is regarding how SQL Server actually stores data:
I see that when I add users, they are always stored alphabetically sorted on username.
I was expecting that all users will appear on the users table in the order they were added.
Example: I have 3 users (john, jonah, wilson). Now I add a 4th user with username = 'bob'.
If I execute select * from users, it returns (bob, john, jonah, wilson). Look, bob has become the first row of the table.
My question: Is SQL Server moving the 3 older rows to make room for 'bob', and is it also rebuilding part of the index due to this new username 'bob'?
If this is the case, then it will have a big impact if I have 100K users and I add one user that becomes the first row. In that case 99,999 rows will have to move.
Bottom line, inserts and deletes will be very expensive.
I know sql server keeps data physically sorted on PK. But I am concerned here since rows are losing the order in which they were inserted.
Thanks
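For what it's worth, the rows are not moved one at a time: the clustered index (here the primary key) places each new row on the page where it belongs in key order and splits a page only when that page is full. If that cost is still a concern, a common pattern is to cluster on an ever-increasing surrogate key and enforce username uniqueness separately; a minimal sketch:
CREATE TABLE dbo.Users
(
    UserId   int IDENTITY(1,1) NOT NULL,
    Username varchar(30) NOT NULL,
    CONSTRAINT PK_Users PRIMARY KEY CLUSTERED (UserId),
    CONSTRAINT UQ_Users_Username UNIQUE NONCLUSTERED (Username)
)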
View 8 Replies
View Related
Jan 3, 2001
It appears that when I insert data into a varchar(8000) field, SQL Server truncates everything after the 256th byte. When I change the field to text, this problem is eliminated. Can someone give me an explanation of why? And can I actually do the insert with the field being a varchar(8000) instead of a text data type? This would do wonders for the size and indexing.
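A minimal sketch (dbo.MyTable and MyColumn are placeholders) to check whether the value is truncated in storage or only in the client display, since older query tools cap displayed text at roughly 255 characters even when the full value is stored:
SELECT DATALENGTH(MyColumn) AS StoredBytes, LEN(MyColumn) AS StoredChars
FROM dbo.MyTable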
View 1 Replies
View Related
Nov 11, 2011
I have a table that was imported as all varchar. Most of these columns need to be in a numerical format. How can I convert a table with columns named column0 (needs to be int), column1 (stays varchar), column2 (needs to be int), and column3 (needs to be int)?
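A minimal sketch (dbo.ImportedTable is a placeholder name): find any rows that would block the conversion, then change the column types in place.
SELECT * FROM dbo.ImportedTable WHERE ISNUMERIC(column0) = 0   -- rows that will not convert cleanly
ALTER TABLE dbo.ImportedTable ALTER COLUMN column0 int
ALTER TABLE dbo.ImportedTable ALTER COLUMN column2 int
ALTER TABLE dbo.ImportedTable ALTER COLUMN column3 int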
View 4 Replies
View Related
Jul 19, 2012
I don't know if it's a local issue, but I can't use a temp table or table variable in a PP query (so not in a stored procedure).
Environment: W7 enterprise desktop 32 + Office 2012 32 + PowerPivot 2012 32
Simple example:
declare @tTable(col1 int)
insert into @tTable(col1) values (1)
select * from @tTable
This works perfectly in SQL Server Management Studio, and the database connection is OK too, as I can generate PP tables using complex (or simple) queries without difficulty.
But when trying to get this same result in a PP table I get an error; the same happens when replacing the table variable with a temporary table.
Message: OLE DB or ODBC error. .... The current operation was cancelled because another operation in the transaction failed.
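A minimal sketch of a workaround that often gets past this: external clients such as PowerPivot tend to reject batches that create temporary objects, but a CTE (or derived table) returns the same result without creating anything in tempdb.
;WITH tTable (col1) AS
(
    SELECT 1
)
SELECT col1 FROM tTable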
View 11 Replies
View Related
Sep 17, 2007
In a table-valued UDF, does the UDF use a table variable or a temp table to form the resultset returned?
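For what it's worth: a multi-statement TVF materializes its rows in the table variable declared after RETURNS, while an inline TVF has no intermediate table at all and is expanded into the calling query like a view. A minimal sketch of the two forms:
CREATE FUNCTION dbo.fnMultiStatement ()
RETURNS @Result TABLE (n int)
AS
BEGIN
    INSERT INTO @Result (n) SELECT 1
    RETURN
END
GO
CREATE FUNCTION dbo.fnInline ()
RETURNS TABLE
AS
RETURN (SELECT 1 AS n)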
View 1 Replies
View Related
Oct 6, 2014
I am trying to use a stored procedure to update a column in a SQL table using the values from a table variable, but I am getting errors because my syntax is not correct. I think table aliases are not allowed in UPDATE statements.
This is my statement:
UPDATE [dbo].[sessions_teams] stc
SET stc.[Talks] = fmt.found_talks_type
FROM @Find_Missing_Talks fmt
WHERE stc.sessionid IN (SELECT sessionid FROM @Find_Missing_Talks)
AND stc.coupleid IN (SELECT coupleid FROM @Find_Missing_Talks)
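A minimal sketch of the usual fix: declare the alias in the FROM clause and reference it in UPDATE (joining on both keys is an assumption about the intended match):
UPDATE stc
SET stc.[Talks] = fmt.found_talks_type
FROM [dbo].[sessions_teams] AS stc
JOIN @Find_Missing_Talks AS fmt
  ON stc.sessionid = fmt.sessionid
 AND stc.coupleid = fmt.coupleid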
View 2 Replies
View Related
Jul 27, 2004
I have a table in my database that has almost 45 columns and a row size of 10,468 bytes. Most of the columns have varchar data types, and I think, because of poor knowledge of the data, most of the varchar columns were given more length than they need. Now I want to decrease the size of those columns so that the row size would be around 8K bytes. If I do this now, does it affect table performance much? In fact, can I do this at all, given that there is a lot of data (almost 2 million rows) in the table? If it is possible, is there anything to take care of before changing the column lengths?
Thanks.
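A minimal sketch (the names are placeholders): check that no existing value exceeds the new length, then alter the column. Since varchar stores only the characters actually present, shrinking the declared maximum should not change the stored data itself, but the statement still takes a schema-modification lock and will fail if any value is too long.
SELECT MAX(DATALENGTH(SomeColumn)) AS MaxBytesUsed
FROM dbo.BigTable   -- confirm the new length is big enough
ALTER TABLE dbo.BigTable ALTER COLUMN SomeColumn varchar(100) NULL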
View 2 Replies
View Related
Feb 17, 2012
I am currently working on transferring the table contents from SQL Server into a CSV file. I have created a stored procedure which does this, but the problem I am facing is that the data in the table is not clean: it contains tabs, newlines, etc., so I had to clean the data. I applied the following procedure:
declare @columns varchar(8000), @sql varchar(8000), @sql1 varchar(8000),@data_file varchar(100)
set @columns=''
--'@columns+''replace(replace(replace(''+column_name +'',Char(10),''''''''),Char(13),''''''''),Char(19) ,'''''''')'''
--print @sql
select
@columns=@columns+'replace(replace(replace('+column_name+',Char(10),''''),Char(13),''''),Char(19),'''') as '+column_name+', '
from
[code]...
The table contains around 300 columns, with column names of at least 20 characters. When I run the above query I get only a few columns instead of all of them, so I tried to find the length of @columns, which gives me the result 4000. I had declared @columns as varchar(8000), so I don't know why this issue is coming up. Is there any other way I can clean the data and then transfer it into the file?
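A minimal sketch of the likely cause (an assumption, since the full query is not shown): COLUMN_NAME in the catalog views is nvarchar, so the whole concatenation is evaluated as nvarchar and capped at 4,000 characters regardless of the varchar(8000) declaration; building the string as varchar(max), or casting COLUMN_NAME, removes the cap.
DECLARE @columns varchar(max)
SET @columns = ''
SELECT @columns = @columns
    + 'replace(replace(replace(' + CONVERT(varchar(128), COLUMN_NAME)
    + ',char(10),''''),char(13),''''),char(9),'''') as ' + CONVERT(varchar(128), COLUMN_NAME) + ', '   -- char(9) is the tab character
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'table_name'   -- placeholder table name
SELECT LEN(@columns)   -- no longer stops at 4,000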
View 9 Replies
View Related
Jun 1, 2007
Should I change my current design, which has a lookup Child Table and a Parent Table containing an ID link to the Child Table, and instead just put a varchar for the Value in the Parent Table?
The issue is that the value is unique for a large percentage of cases (although some specific values reoccur very frequently).
However, the parent table has a high number of inserts, and that in turn often requires inserts to the Child Table, which in turn pushes the Child Table out of shape, and performance suffers.
The data is very seldom used (usually for debugging only)
Should I de-Normalise this for performance?
Details of the actual circumstances:
I have a table that Logs each "page hit" on our web site. That stores the "Referrer".
We store the Referrer for the SESSION separately (the first page viewed will have an external referrer, thereafter all pages will have an "internal" referrer).
This Referrer data is used VERY LITTLE - just for the occasional debugging issue.
Within the data, some URLs are frequently used (www.mydomain.com/HOME.ASP) and others contain &Parameters and such, and are either relatively unique (www.mydomain.com/ViewBasket.ASP&BasketID=1234) or variable (www.mydomain.com/ViewProduct.ASP&ProductID=1234).
Currently I store the Referrer in a Child "Lookup" Table, and the PK ID in the "Logging" Table.
Some analysis of the data shows:
119,509 rows (pages logged [purged after a couple of days, only includes data where the Referrer is not blank!])
Of those there are 10,760 distinct Referrer values. An average of 11 Page Hits per Referrer value.
On the face of it that suggests that using a child table is good! However, for each row inserted into the Page Hit table I have to do the following to get the Referrer ID:
SELECT @MyReferrerID = ID FROM MyLookupTable WHERE MyValue = @MyReferrerValue
IF @@ROWCOUNT = 0
INSERT ... @MyReferrerValue ...
SELECT @MyReferrerID = @@IDENTITY
(MyLookupTable has a PK on [ID] and a Unique Index on [MyValue] )
(In a busy month we will add 100,000 rows to this Lookup table, in a slack month 20,000. These get "purged" after a while, so the table size stays about the same. The table is defragged [if required] daily and the Stats updated. There are 350,000 rows in MyLookupTable)
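A minimal sketch of the same lookup-or-insert step with two suggested tweaks (not your current code): SCOPE_IDENTITY() instead of @@IDENTITY, which is safer if a trigger ever fires on the insert, and locking hints so two concurrent page hits cannot both miss the SELECT and insert the same value.
DECLARE @MyReferrerID int, @MyReferrerValue varchar(500)
SET @MyReferrerValue = 'http://www.mydomain.com/HOME.ASP'   -- example value
BEGIN TRAN
    SELECT @MyReferrerID = ID
    FROM MyLookupTable WITH (UPDLOCK, HOLDLOCK)
    WHERE MyValue = @MyReferrerValue
    IF @@ROWCOUNT = 0
    BEGIN
        INSERT INTO MyLookupTable (MyValue) VALUES (@MyReferrerValue)
        SELECT @MyReferrerID = SCOPE_IDENTITY()
    END
COMMIT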
Obviously some values in MyLookupTable reoccur, others are unique - and get purged after a while. Inserting these new entries pushes MyLookupTable out of shape, the SHOW_STATISTICS start to look awful, an sp_recompile sorts it out (Why? The query plan doesn't change ... although obviously the "shape" of the table has, maybe the Stats got auto-updated, would that change anything?)
So I'm thinking about just storing @MyReferrer as a VARCHAR in the Logging table, and throwing away MyLookupTable.
So really I'm just looking at performance, not Normalisation ...
What do you think?
Kristen
View 2 Replies
View Related
Nov 20, 2007
I have looked far and wide and have not found anything that works to allow me to resolve this issue.
I am moving data from DB2 using the MS OLEDB Provider for DB2. The OLEDB source sees the column of data as DT_TEXT. I set up a destination to SQL Server 2005 and everything looks good until I try to run the package.
I get the error:
[OLE DB Source [277]] Error: An OLE DB error has occurred. Error code: 0x80040E21. An OLE DB record is available. Source: "Microsoft DB2 OLE DB Provider" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
[OLE DB Source [277]] Error: Failed to retrieve long data for column "LIST_DATA_RCVD".
[OLE DB Source [277]] Error: There was an error with output column "LIST_DATA_RCVD" (324) on output "OLE DB Source Output" (287). The column status returned was: "DBSTATUS_UNAVAILABLE".
[OLE DB Source [277]] Error: The "output column "LIST_DATA_RCVD" (324)" failed because error code 0xC0209071 occurred, and the error row disposition on "output column "LIST_DATA_RCVD" (324)" specifies failure on error. An error occurred on the specified object of the specified component.
[DTS.Pipeline] Error: The PrimeOutput method on component "OLE DB Source" (277) returned error code 0xC0209029. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Any suggestions on how I can get the large string data in the varchar column in DB2 into the varchar(max) column in SQL Server 2005?
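One workaround that has worked for others, sketched here as an assumption about your DB2 source query (the schema and other column names are placeholders): cast the column to a plain VARCHAR on the DB2 side so the provider no longer treats it as long/LOB data; alternatively, a Data Conversion transformation in the data flow can convert DT_TEXT to a string type before the destination.
SELECT OTHER_COL,
       CAST(LIST_DATA_RCVD AS VARCHAR(32000)) AS LIST_DATA_RCVD
FROM MYSCHEMA.MYTABLE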
View 10 Replies
View Related
Jan 4, 2008
I am trying to create a stored procedure inside the SQL Server Management Studio console and I keep getting errors. Here's my stored procedure.
Code Block
CREATE PROCEDURE [dbo].[sqlOutlookSearch]
-- Add the parameters for the stored procedure here
@OLIssueID int = NULL,
@searchString varchar(1000) = NULL
AS
BEGIN
-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
SET NOCOUNT ON;
-- Insert statements for procedure here
IF @OLIssueID <> 11111
SELECT * FROM [OLissue], [Outlook]
WHERE [OLissue].[issueID] = @OLIssueID AND [OLissue].[issueID] = [Outlook].[issueID] AND [Outlook].[contents] LIKE + ''%'' + @searchString + ''%''
ELSE
SELECT * FROM [Outlook]
WHERE [Outlook].[contents] LIKE + ''%'' + @searchString + ''%''
END
And the error I kept getting is:
Msg 402, Level 16, State 1, Procedure sqlOutlookSearch, Line 18
The data types varchar and varchar are incompatible in the modulo operator.
Msg 402, Level 16, State 1, Procedure sqlOutlookSearch, Line 21
The data types varchar and varchar are incompatible in the modulo operator.
Any help is appreciated.
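For what it's worth, the message appears because ''%'' inside a normal query is parsed as '' % '', i.e. the modulo operator between two empty strings; doubled quotes are only needed inside dynamic SQL strings. A minimal sketch of the corrected body (rewritten with an explicit JOIN purely for readability):
IF @OLIssueID <> 11111
BEGIN
    SELECT o.*
    FROM [OLissue] AS i
    JOIN [Outlook] AS o ON o.[issueID] = i.[issueID]
    WHERE i.[issueID] = @OLIssueID
      AND o.[contents] LIKE '%' + @searchString + '%'
END
ELSE
BEGIN
    SELECT *
    FROM [Outlook]
    WHERE [contents] LIKE '%' + @searchString + '%'
END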
View 5 Replies
View Related
Jan 6, 2004
Hi there,
Can someone tell me if it is possible to add an index to a table variable that is declared as part of a table-valued function? I've tried the following but I can't get it to work.
ALTER FUNCTION dbo.fnSearch_GetJobsByOccurrence
(
@param1 int,
@param2 int
)
RETURNS @Result TABLE (resultcol1 int, resultcol2 int)
AS
BEGIN
CREATE INDEX resultcol2_ind ON @Result
-- do some other stuff
RETURN
END
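For what it's worth, CREATE INDEX cannot target a table variable (at least before SQL Server 2014), but a PRIMARY KEY or UNIQUE constraint declared with the RETURNS table does build an index on it; a minimal sketch:
ALTER FUNCTION dbo.fnSearch_GetJobsByOccurrence
(
    @param1 int,
    @param2 int
)
RETURNS @Result TABLE
(
    resultcol1 int NOT NULL PRIMARY KEY,
    resultcol2 int NOT NULL UNIQUE
)
AS
BEGIN
    -- populate @Result here
    RETURN
END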
View 2 Replies
View Related
Feb 26, 2008
Hi all,
my stored procedure has one table variable (@t_Replenishment_Rpt). I want to create an index on this table variable. Please advise on any of the approaches below...
Below is my table variable, and I need to create 3 indexes on it...
DECLARE @t_Replenishment_Rpt TABLE
(
Item_Nbr varchar(25) NULL,
Item_Desc varchar(255) NULL,
Trx_Date datetime NULL,
Balance int NULL,
Trx_Type char(10) NULL,
Issue_Type char(10) NULL,
Location char(25) NULL,
Min_Stock int NULL,
Order_Qty int NULL,
Unit char(10) NULL,
Issue_Qty int NULL,
Vendor varchar(10) NULL,
WO_Nbr varchar(10) NULL,
Lead_Time int NULL,
PO_Nbr char(10) NULL,
PO_Status char(10) NULL,
Currency char(10) NULL,
Last_Cost money NULL,
Dept_No varchar(20) NULL,
MSDSNbr varchar(10) NULL,
VendorName varchar(50) NULL,
Reviewed varchar(20) NULL
)
I tried all the scenarios below... each one gives an error...
--Indexing the @t_Replenishment_Rpt table on the columns Item Number, Vendor, Department Number
--EXEC sp_executesql(CREATE UNIQUE CLUSTERED INDEX Replenishment_index ON @t_Replenishment_Rpt (Item_Nbr))
--CREATE UNIQUE CLUSTERED INDEX Idx1 ON @t_Replenishment_Rpt.Item_Nbr
INDEX_COL ( '@t_Replenishment_Rpt' , ind_Replenishment_id , Item_Nbr )
--EXEC sp_executesql('SELECT INDEXPROPERTY('+ '@t_Replenishment_Rpt' + ', ' + 'Item_Nbr' + ',' + 'IsPadIndex' + ')')
--EXEC sp_executesql(SELECT INDEXPROPERTY('@t_Replenishment_Rpt', 'Vendor','IsPadIndex'))
--EXEC sp_executesql(SELECT INDEXPROPERTY('@t_Replenishment_Rpt', 'Dept_No','IsPadIndex'))
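A minimal sketch of what does work: the index has to be declared together with the table variable as a PRIMARY KEY or UNIQUE constraint (SQL Server 2014 and later also allow inline INDEX definitions); dynamic SQL cannot see the variable at all, which is why the sp_executesql attempts fail. This assumes Item_Nbr is unique and never NULL; only the indexed columns are shown, the rest stay as declared above.
DECLARE @t_Replenishment_Rpt TABLE
(
    Item_Nbr varchar(25) NOT NULL PRIMARY KEY,
    Vendor   varchar(10) NULL,
    Dept_No  varchar(20) NULL,
    -- ...the remaining columns exactly as in the original declaration...
    UNIQUE (Vendor, Item_Nbr),
    UNIQUE (Dept_No, Item_Nbr)
)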
View 3 Replies
View Related
Jul 29, 2015
I've looked at several different methods for removing leading zeros from a column, but I need to remove trailing data from a VARCHAR column. For some reason, the old database saved the time alongside the date in my client's app.
For example:
The old database format "2015-07-28 00:00:00"
I need the data in this column in the new database to be only the date, "2015-07-28"; there are a lot of rows with this issue.
Is there a query I can run to remove the 00:00:00 from all of the rows? Some of the fields actually have a real time in there, like this: 2015-07-28 12:15:35. With those I don't think it's going to be easy, but if I could at least remove the 00:00:00 from all the rows that have it, that would be a good start.
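A minimal sketch (the table and column names are placeholders): the date part is always the first 10 characters, so trimming is enough.
UPDATE dbo.NewTable
SET DateCol = LEFT(DateCol, 10)
WHERE DateCol LIKE '% 00:00:00'   -- only the rows with a midnight suffix
-- or, to strip the time from every row regardless of its value:
-- UPDATE dbo.NewTable SET DateCol = LEFT(DateCol, 10) WHERE LEN(DateCol) > 10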
View 9 Replies
View Related
Jul 22, 2015
Below is one of the sample values in my table. I need to convert it to decimal(18,5).
DECLARE @i VARCHAR
SET @i = '0.9'
The output I am looking for is 0.90000.
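A minimal sketch (note that DECLARE @i VARCHAR with no length gives a one-character variable, so spell the length out):
DECLARE @i varchar(10)
SET @i = '0.9'
SELECT CONVERT(decimal(18,5), @i) AS Result   -- 0.90000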
View 16 Replies
View Related
Nov 20, 2006
For the life of me I cannot figure out why SSIS will not convert varchar data. Instead of using the table-to-table method, I wrote a SQL query so that I could transform the ntext data type to varchar(512), understanding that natively MS is going towards all-Unicode applications.
The source fields from Access are int, int, int and varchar(512). The same is true of the destination within SQL Server 2005. the field 'Answer' is the varchar field in question....
I get the following error
Validating (Error)
Messages
Error 0xc02020f6: Data Flow Task: Column "Answer" cannot convert between unicode and non-unicode string data types.
(SQL Server Import and Export Wizard)
Error 0xc004706b: Data Flow Task: "component "Destination - Query" (28)" failed validation and returned validation status "VS_ISBROKEN".
(SQL Server Import and Export Wizard)
Error 0xc004700c: Data Flow Task: One or more component failed validation.
(SQL Server Import and Export Wizard)
Error 0xc0024107: Data Flow Task: There were errors during task validation.
(SQL Server Import and Export Wizard)
DTS used to be a very strong tool, but a simple import such as this is causing me extreme grief, and I am wondering if SQL 2005 is ready for prime time. FYI, SP1 is installed. I am running this from a workstation and not on the server, if that makes a difference...
Any help would be appreciated.
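For what it's worth, two workarounds that usually clear this validation error: add a Data Conversion transformation in the data flow to convert the Unicode column to a non-Unicode string, or make the destination column Unicode so no conversion is needed at all. A minimal sketch of the latter (the table name is a placeholder):
ALTER TABLE dbo.DestinationTable ALTER COLUMN Answer nvarchar(512) NULL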
View 7 Replies
View Related
Jul 8, 2015
I have a table with column "Data" as VARCHAR, with entries like below.
1
11
2
A1
A10
A11
246
AB1
AB10
100
256
B1
B2
124
20
B21
B31
32
68
I want to select the data by converting varchar to int for the numeric values; the alphanumeric values should be displayed as they are.
SELECT CAST(Data AS INT) FROM record_tab
I am getting the error below:
Conversion failed when converting the varchar value 'A1'
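A minimal sketch: a single result column can only have one data type, so either return the numeric interpretation alongside the original value, or convert only where the value is purely digits.
SELECT Data,
       CASE WHEN Data NOT LIKE '%[^0-9]%' THEN CONVERT(int, Data) END AS NumericValue
FROM record_tab
-- or keep one column but sort numeric values numerically and the rest alphabetically
SELECT Data
FROM record_tab
ORDER BY CASE WHEN Data NOT LIKE '%[^0-9]%' THEN CONVERT(int, Data) END, Data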
View 9 Replies
View Related
Jan 26, 2007
Hi All, hope someone can help me... I'm trying to highlight the advantages of using table variables as opposed to temp tables within a single scope. My manager seems to believe that table variables are not advantageous because they reside in memory. He also seems to believe that temp tables do not use memory... Does anyone know how SQL Server could read data from a temp table without passing the data contained therein through memory??? Is this a valid advantage/disadvantage of table variables vs temp tables?
View 2 Replies
View Related
Jul 20, 2005
SQL-challenged, be gentle -- trying to create code that will drop a table using a variable as the table name.
DECLARE @testname as char(50)
SELECT @testname = 'CO_Line_of_Business_' + SUBSTRING(CAST(CD_LAST_EOM_DATE AS varchar), 5, 2) + '_' + LEFT(CAST(CD_LAST_EOM_DATE AS varchar), 4) + '_' + 'EOM'
FROM TableName
Print @testname gives 'blah...blah...blah' (which is the actual table name on the server).
How can I use this variable (@testname) to drop the table? Under severe time constraints, so any help would be greatly appreciated.
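A minimal sketch of the missing piece: build the DROP TABLE statement as a string and run it with dynamic SQL, since an identifier cannot be taken from a variable directly.
DECLARE @sql nvarchar(500)
SET @sql = N'DROP TABLE ' + QUOTENAME(RTRIM(@testname))   -- RTRIM because char(50) pads the name with spaces
EXEC sp_executesql @sql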
View 2 Replies
View Related
Jan 17, 2008
In a previous post "Could #TempTable within SP cause lock on tempdb?" http://forums.microsoft.com/msdn/showpost.aspx?postid=2691763&siteid=1
It was obvious that we have to limit the use of #Temp tables to a minimum. Let's assume that some of the temp tables are really difficult to replace and we have to live with them.
Would it be easier on tempdb if the #TempTable is replaced by a table variable? Or do they all end up in tempdb?
Thanks in advance for any help.
View 4 Replies
View Related