I'm trying to go deeper into SQL Server's XML support. While looking at a recent forum post, I got curious to see whether I could achieve the same result with XPath and FLWOR. Here's the XPath query I'm working with:
select data.mynodes.value('@Name', 'varchar(50)') Name
from #data
cross apply myXMLData.nodes('/order/Product/Drops/Drop[@Number="1"]/Area') data(mynodes)
Results:
Name
43001PBOX
[Code] .....
Here's my first attempt at a FLWOR expression. Note that it doesn't produce the same results.
select myXmlData.query('
for $drop in /order/Product/Drops/Drop
for $Name in $drop/Area/@Name
where $drop/@Number=1
return data($Name)
')
from #data
Results:
43001PBOX 43001R001 43001R002 43023PBOX 43023R001 43023R002 43031PBOX 43031R001
How do I build a FLWOR-based query that produces the same result as the pure XPath one?
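For reference, a minimal sketch of one way to get one row per Area, assuming the same #data table and myXMLData column as above: keep the FLWOR expression but run it inside nodes(), so its result is shredded back into rows, and then read @Name from each returned element.
select d.n.value('@Name', 'varchar(50)') Name
from #data
cross apply myXMLData.nodes('
for $drop in /order/Product/Drops/Drop
where $drop/@Number = 1
return $drop/Area
') d(n)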
I am using a custom SQL query to import data into Tableau. The problem is that I need to change a varchar column that currently returns values formatted like 18/01/2014 08:35:13 into the date format 05/21/2014 3:55:51 PM before I can use it.
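A sketch of the usual first step (the table and column names here, dbo.SourceTable and RecordDate, are assumptions): convert the text to a real datetime with the style that matches dd/mm/yyyy, and let Tableau handle the display format.
-- style 103 tells CONVERT the text is dd/mm/yyyy; the trailing time portion is parsed as-is
select convert(datetime, RecordDate, 103) as RecordDateAsDatetime
from dbo.SourceTable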
I have a column named DateofRecord and it is nvarchar type. All the values in this column look like this:
"04/24/2013", "05/01/2014", etc.
My requirement is to convert this column to datetime.
I tried many ways using the CAST and CONVERT functions, like cast(dateofrecord, datetime) or convert(datetime, replace(DateofRecord, '"', '''')), but they didn't work.
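A minimal sketch, assuming the values are mm/dd/yyyy wrapped in stray double quotes and using a made-up table name dbo.Records: strip the quotes first, then convert with the matching style.
-- style 101 = mm/dd/yyyy; TRY_CONVERT (SQL Server 2012+) returns NULL for any leftover bad value instead of failing
select try_convert(datetime, replace(DateofRecord, '"', ''), 101) as DateofRecordAsDatetime
from dbo.Records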
SELECT ROUND('6.465', 2) --- result 6.46
SELECT ROUND(6.465, 2) --- result 6.47
The explanation I got was: it's because you're relying on an implicit conversion from a string to a decimal data type, which SQL Server will do to 2 decimal places by default...
Alright:
SELECT ROUND(CONVERT(DECIMAL(3,2), '6.465'), 2) --- result 6.47
Now please explain this:
SELECT ROUND('0.285', 2) --- result 0.28
SELECT ROUND(0.285, 2) --- result 0.29
SELECT ROUND(CONVERT(DECIMAL(3,2), '0.285'), 2) --- result 0.29
The string value does not seem to be converted to a decimal with 2 decimal places.
Microsoft is on the safe side in mentioning that the last digit is always an estimate. But because the result of the estimate is always the same, I would like to know:
* how exactly is a string value implicitly converted?
* how exactly does the estimation work that, in case of doubt, rounds a value up or down?
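A sketch that is consistent with the results above, under the assumption that a string argument to ROUND is implicitly converted to float rather than decimal: the float nearest to 6.465 (and to 0.285) lies slightly below the exact value, so the rounding direction flips, whereas a decimal literal or an explicit DECIMAL conversion rounds half away from zero.
-- making the assumed implicit conversion explicit reproduces the string results
select round(cast('6.465' as float), 2)        -- 6.46: the nearest float is just below 6.465
select round(cast(6.465 as decimal(4,3)), 2)   -- 6.47: the literal is decimal(4,3), rounded half away from zero
select round(cast('0.285' as float), 2)        -- 0.28: the nearest float is just below 0.285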
Within a Visual Studio 2012 solution, I have several projects, one of which is a database project. I am defining several tables. The one in question is:
CREATE TABLE [dbo].[tblKppHierarchyPcl] (
    [ID]               NUMERIC(18,0) NOT NULL,
    [Name]             VARCHAR(500),
    [PartStructureKey] NUMERIC(18,0) NOT NULL,
    [PartNumber]       VARCHAR(500) NOT NULL,
    [ParentPartNumber] VARCHAR(500) NULL,
[code]...
Error SQL72014: .Net SqlClient Data Provider: Msg 245, Level 16, State 1, Line 76 Conversion failed when converting the varchar value 'Coolant Quick Disconnect' to data type int. So it has a problem with inserting 'Coolant Quick Disconnect' into the Name column. The Name column is clearly a varchar column, but somehow it thinks it's an int column.
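A guess at the usual cause, sketched with hypothetical values: a post-deployment or generated data script that matches VALUES to columns by position, so a missing or reordered column pushes the varchar Name value into one of the NUMERIC columns. An explicit column list sidesteps that.
-- the values here are invented; the point is the explicit column list
INSERT INTO [dbo].[tblKppHierarchyPcl] ([ID], [Name], [PartStructureKey], [PartNumber], [ParentPartNumber])
VALUES (1, 'Coolant Quick Disconnect', 100, 'ABC-123', NULL)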
I have a lot of rows of hours, set up like this: 0745, 0800, 2200, 1145 and so on (varchar(5), for some reason).
These are converted into a smalldatetime like this:
CONVERT(smalldatetime, STUFF(timestarted, 3, 0, ':')) [this would give output like this - 1900-01-01 11:45:00]
This code has been in place for years...and we stick the date on later from another column.
But recently, it's started to fail for some rows, with "The conversion of a varchar data type to a smalldatetime data type resulted in an out-of-range value".
My assumption is that new data being added is junk. If I query for these values and just list them (rather than also adding a column that converts them), that's fine, of course. I've checked all of the stuffed (but not yet converted, so 11:45 rather than 1145) output with ISDATE(), and every value passes. There are no times with hours greater than 23 or minutes greater than 59 either.
If I add the CONVERT in, I see the error message. But here's the oddity: if I place all of the rows into a holding table and retry the conversion, there is no error. It's this last bit that is puzzling me. Plus I can't see any errors in the hours data that would cause a conversion problem.
I've also put the whole thing into a cursor to try to trap the error rows, but everything processes fine. Why would it fail only when NOT in a cursor?
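One way to hunt for the offending rows, sketched with an assumed table name and assuming SQL Server 2012 or later: TRY_CONVERT returns NULL instead of raising the out-of-range error, so it can list the bad raw values even if the optimizer evaluates the conversion against rows a filter was supposed to exclude (a common reason the same data converts fine from a holding table).
select timestarted
from dbo.TimesTable
where try_convert(smalldatetime, stuff(timestarted, 3, 0, ':')) is null
  and timestarted is not null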
I have been trying to convert a datetime but keep getting this error: "Conversion failed when converting date and/or time from character string." It works just fine if I don't use EXECUTE sp_executesql.
DECLARE @tablename AS nvarchar(max)
DECLARE @SQLQuery AS NVARCHAR(MAX)
DECLARE @ParameterDefinition AS NVARCHAR(100)
DECLARE @startdate datetime
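A sketch of the parameterized pattern (dbo.SomeTable and CreatedOn are made-up names): passing @startdate as a typed parameter to sp_executesql avoids the implicit varchar conversion that happens when the value is concatenated into the dynamic string, which is the usual source of this error.
DECLARE @SQLQuery NVARCHAR(MAX) = N'SELECT * FROM dbo.SomeTable WHERE CreatedOn >= @startdate'
DECLARE @ParameterDefinition NVARCHAR(100) = N'@startdate datetime'
DECLARE @startdate datetime = '20140101'
EXEC sp_executesql @SQLQuery, @ParameterDefinition, @startdate = @startdate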
I am working on code that will convert a query's output into HTML; that is, the output of the query will come with HTML tags so that we can save it as an HTML file.
The stored procedure I have used was downloaded from the Symantec website and is working fine. Below is the code of that stored procedure.
/****** Object: StoredProcedure [dbo].[sp_Table2HTML] Script Date: 12/21/2011 09:04:30 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [dbo].[sp_Table2HTML] (
    @TABLENAME NVARCHAR(500),
    @OUTPUT NVARCHAR(MAX) OUTPUT,
[Code] ....
The code works fine and I am able to get the output with HTML tags. The problem occurs when I insert a stylesheet into the code. I tried to enforce styles with a stylesheet for the table returned by my SQL code, so that it will look good. Below is the stylesheet code that I inserted between the <head> and </head> tags.
If I run the procedure without the stylesheet code, it works fine, but when I run it with the stylesheet code I get the errors below.
Msg 105, Level 15, State 1, Line 98 Unclosed quotation mark after the character string '</table></body><'.
Msg 102, Level 15, State 1, Line 98 Incorrect syntax near '</table></body><'.
[Code] .....
I checked the code and I am not able to find the mistake. I tried changing the quotation marks, but that didn't work.
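A guess at the usual culprit, shown as a sketch: when the stylesheet is concatenated into the dynamic SQL string the procedure builds, every single quote inside the CSS (for example in type='text/css') must be doubled, otherwise the string literal ends early and produces exactly these unclosed-quotation-mark and incorrect-syntax errors.
-- doubled quotes ('') inside the literal keep the dynamic SQL string intact
DECLARE @style NVARCHAR(MAX) = N'<style type=''text/css''>
  table { border-collapse: collapse; }
  td, th { border: 1px solid #999; padding: 4px; }
</style>'
-- @style can then be concatenated into the <head> section the procedure emits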
I need to get this data into an SQL table in the following form so I can use it to further manipulate the data and update several other tables. I am thinking that UNPIVOT or CROSS APPLY might be the way to go, but am not sure how to code it.
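Since the source layout isn't shown, here is only a generic UNPIVOT sketch with made-up table and column names, to illustrate the shape of the syntax: each wide column becomes one (name, value) row.
select AccountId, MonthName, Amount
from dbo.WideTable
unpivot (Amount for MonthName in ([Jan], [Feb], [Mar])) as u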
I am writing a CASE statement to create a unified field for account type and vendor. In the code below I am receiving the error in the subject, because the account numbers have alpha characters in the string. I need to mark them as OTHER if the first 2 left characters are alpha; I thought the way I have ISNUMERIC would handle that, but I am still getting the error.
I am also including an example of how the account_numbers are formatted.
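A sketch of one alternative, with hypothetical table and column names: testing the first two characters with a LIKE pattern avoids ISNUMERIC's quirks (it returns 1 for strings such as '1e5' or '$5') and keeps the alpha-prefixed account numbers out of any numeric conversion.
select case
           when left(account_number, 2) like '[A-Za-z][A-Za-z]%' then 'OTHER'
           else account_number   -- the existing type/vendor CASE logic would go here
       end as unified_account_type
from dbo.Accounts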
I am converting a varchar field to float, summing it with GROUP BY, and then inserting the result into a varchar field in another table.
While inserting, the float value is converted to exponential notation, e.g. 1.04177e+006, but if I execute only the SELECT statement the actual float value is displayed, e.g. 1041765.726.
My question is: why is it converted while inserting, and how do I avoid it?
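A sketch of why and one workaround: a direct float-to-varchar conversion uses at most six significant digits and may switch to scientific notation, so the INSERT (which forces that conversion) looks different from the SELECT (where the client formats the float). Casting to decimal first keeps fixed-point notation.
declare @v float = 1041765.726
select cast(@v as varchar(30)) as default_conversion,                  -- 1.04177e+006
       cast(cast(@v as decimal(18, 3)) as varchar(30)) as via_decimal  -- 1041765.726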
OK, so I have this query where I'm trying to subtract values like this. When I do, I get "Arithmetic overflow error converting varchar to data type numeric." I have tried many different things and none of them work; they either raise the error or return 0 because the result loses the .XXXXX part.
Convert(DECIMAL(10,7), CAST([TIME_OF_COMPLETION] as DECIMAL(10,7))) - Convert(DECIMAL(10,7), CAST([OPR_DELIVERED_TIME] as DECIMAL(10,7)))
round(cast(cast(hist.[TIME_OF_COMPLETION] AS float) as DECIMAL(15, 5)) - CAST(hist.[OPR_DELIVERED_TIME] AS FLOAT), 1)
SELECT convert(FLOAT, CAST('735509.00053' AS DECIMAL(10,5))) - convert(FLOAT, CAST('735509.00047' AS DECIMAL(10,5)))
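A sketch of the likely fix: with values around 735509.xxxxx, DECIMAL(10,7) leaves only three digits in front of the decimal point (and DECIMAL(10,5) only five), so the cast itself overflows. Widening the precision keeps both the integer part and the fractional difference.
select cast('735509.00053' as decimal(18, 7)) - cast('735509.00047' as decimal(18, 7))  -- 0.0000600
-- the same widened precision applied to the columns: cast(hist.[TIME_OF_COMPLETION] as decimal(18, 7)) - cast(hist.[OPR_DELIVERED_TIME] as decimal(18, 7))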
When I set the SecondOperand to /* the output is a concatenated text string of DeviceID and UserID values. I'm trying to get just the DeviceID but perhaps my understanding of XPath syntax is wrong. I've tried setting the SecondOperand to the following:
SELECT XmlField.value('(//step/@userName)[1]', 'VARCHAR(100)') AS userName FROM theTable
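For comparison, a sketch of pulling a single attribute in T-SQL, assuming DeviceID is an attribute on the same <step> element as userName (the names simply mirror the example above and may not match the real schema):
SELECT XmlField.value('(//step/@DeviceID)[1]', 'VARCHAR(100)') AS DeviceID FROM theTable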
I would like to have some kind of indicator in my T-SQL results of the number of each <step> node. For instance, saf2 would be 3. I think this is similar to what a ROW_NUMBER() function does for normal table data.
Is there a good way to get a 1, 2, 3, and 4 value for this sample data?
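A sketch using the same assumed names (theTable, XmlField, the userName attribute): counting how many <step> siblings precede each node with the << node-order comparison yields 1, 2, 3, 4 in document order, much like ROW_NUMBER() does for rows.
select s.stp.value('for $s in . return count(../step[. << $s]) + 1', 'int') as StepNumber,
       s.stp.value('@userName', 'varchar(100)') as userName
from theTable
cross apply XmlField.nodes('//step') s(stp)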
I'm moving data from one database to another (INSERT INTO ... SELECT ... FROM ....) and am encountering this error:
Msg 8114, Level 16, State 5, Line 6 Error converting data type varchar to numeric.
My problem is that Line 6 is:
set @brn_pk = '0D4BDE66347C440F'
so that is obviously not the problem, and my query has almost 200 columns. I can go through them one by one and compare which column is int in my destination table and which is varchar in my source tables, but that could take quite a while. How can I work out which column is causing the problem?
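One shortcut, sketched with made-up names and assuming SQL Server 2012+: run the SELECT side alone with TRY_CONVERT on each suspect column; any non-NULL source value that comes back NULL is a value (and therefore a column) that will not convert to the destination's numeric type.
select SomeVarcharColumn
from dbo.SourceTable
where try_convert(numeric(18, 2), SomeVarcharColumn) is null
  and SomeVarcharColumn is not null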
My question is how I can save the first result in a temp array (or table, I don't know) so I can get the result and split it up again with the delimiter ','.
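A sketch under two assumptions: the first result can be captured with SELECT ... INTO a temp table, and the server is SQL Server 2016+ so STRING_SPLIT is available (older versions need a user-defined splitter).
select ResultValue
into #first_result
from dbo.SomeSourceTable          -- hypothetical source of the first result

select s.value
from #first_result t
cross apply string_split(t.ResultValue, ',') s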
I have a very large database running in a production environment. The backup files are getting very difficult to manage. They are at 70gb as of now and growing at a rate of 10gb per month, due to document images being stored in one table in the database. I have read a little about a feature in SQL Server 2012 called FileTable. If we migrate to SQL Server 2012 and convert to the FileTable structure, will this decrease the size of the backups? And would backup of the document images then be taken care of by our nightly file system backups?
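For reference, a sketch of the FileTable setup (the database and directory names are invented, and it assumes FILESTREAM is already enabled at the instance level); whether backups shrink depends on whether the FILESTREAM filegroup is included in the database backup or handled separately.
ALTER DATABASE MyDocsDb ADD FILEGROUP DocFiles CONTAINS FILESTREAM
GO
ALTER DATABASE MyDocsDb ADD FILE (NAME = 'DocFiles', FILENAME = 'D:\Data\DocFiles') TO FILEGROUP DocFiles
GO
ALTER DATABASE MyDocsDb SET FILESTREAM (NON_TRANSACTED_ACCESS = FULL, DIRECTORY_NAME = N'MyDocsDb')
GO
CREATE TABLE dbo.DocumentStore AS FILETABLE
    WITH (FILETABLE_DIRECTORY = 'Documents', FILETABLE_COLLATE_FILENAME = database_default)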
Is there an easy way to convert Access queries to SQL views without doing it manually? I have used the database tool to migrate tables, but cannot seem to find something similar for queries.
I have a stored procedure in which we are deriving some flags. So, we used series of CASE statements.
For example:
CASE
    WHEN LEFT(CommissionerCode, 3) IN ('ABC', 'DEF', ...) THEN 1
    WHEN PracticeCode IN (.......) THEN 1
    WHEN (CommissionerCode IN (.....) OR PracticeCode NOT IN (.....) OR .....) THEN 1
    ELSE 0
END
I need to put these conditions in a config table and generate dynamic SQL.
What is the best way to do this, especially for the third condition, with OR logic across multiple columns?
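A sketch of one possible shape, with every name invented here and assuming SQL Server 2017+ for STRING_AGG (older versions would need FOR XML PATH concatenation): each config row stores one WHEN condition as text, which also covers the third case since an OR expression over several columns is just a longer condition string, and the CASE expression is rebuilt as a string for sp_executesql.
CREATE TABLE dbo.FlagRuleConfig
(
    RuleId       int IDENTITY PRIMARY KEY,
    FlagName     varchar(50)   NOT NULL,
    ConditionSql nvarchar(max) NOT NULL,   -- e.g. N'LEFT(CommissionerCode, 3) IN (''ABC'', ''DEF'')'
    ResultValue  int           NOT NULL
)

DECLARE @case nvarchar(max)
SELECT @case = N'CASE ' +
       STRING_AGG(N'WHEN ' + ConditionSql + N' THEN ' + CAST(ResultValue AS nvarchar(10)), N' ')
           WITHIN GROUP (ORDER BY RuleId) +
       N' ELSE 0 END'
FROM dbo.FlagRuleConfig
WHERE FlagName = 'MyFlag'
-- @case is then embedded in the dynamic statement passed to sp_executesql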
I need to take a temporary table that has various times stored in a text field (4:30 pm, 11:00 am, 5:30 pm, etc.), convert them to military time, and then cast them as integers with an update statement, kind of like:
Update myTable set MovieTime = REPLACE(CONVERT(CHAR(5),GETDATE(),108), ':', '')
How can this be done while my temp table is in session?
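A sketch, assuming a hypothetical #myTable with the text column MovieTime and SQL Server 2008+ for the time type: casting the text to time handles the am/pm, style 108 renders it as 24-hour hh:mm, and stripping the colon leaves digits that cast cleanly to int (so '4:30 pm' becomes 1630).
-- in-place update; if MovieTime stays a text column the int result is stored back as '1630'
update #myTable
set MovieTime = cast(replace(convert(char(5), cast(MovieTime as time), 108), ':', '') as int)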
Hi folks, I was working on an MS SQL Server 2005 Evaluation edition where I had built a number of databases. However, I discovered that the evaluation version expired before I finished my work. I have now uninstalled SQL Server 2005 and installed the SQL Server Express edition. My concern is: how can I keep my databases so they will work with the Express edition? Thank you very much in advance.
Hi, I am developing a project using one of the starter kits, which uses the MS SQL Express database. The project is almost ready to launch. A few questions:
I am looking for a good host with good support at a reasonable price. What are my options if I would like to convert from the current database to another database such as MySQL or MS SQL Server? Which tools can help with this conversion? That's all, thanks.
So where I work is thinking about one day moving to SQL Server. Right now they have indexed files that aren't normalized, with repeating fields in them and lots of repeated data and blank space (so a customer number in one file may be stored literally in 10 other files that are easily related). In the interest of saving time and money, I think that they will not normalize, index, or otherwise rework any of these files. From what I hear it will be a straight field-by-field creation for the most part, preserving the primary keys.
My question: I keep thinking this is going to be a massive hit on performance and maintenance. How much would converting in such a manner hurt the performance of their database, and how much could it potentially add to maintenance?
I'm new to SQL Server. I've been working with Oracle and now need to convert this statement to SQL Server, but I get this error:
Server: Msg 170, Level 15, State 1, Line 1 Line 1: Incorrect syntax near 'A'.
Server: Msg 156, Level 15, State 1, Line 6 Incorrect syntax near the keyword 'where'.
My statement is:
update data_grup A
set new_grup_code = (
    select cod_grup_2
    from data_grup_new B
    where A.cod_grup = B.cod_grup
)
where zone = '001';
Can anyone help me convert this to SQL Server? I don't know why I get this error. Thanks.
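A sketch of the usual SQL Server rewrite: T-SQL doesn't accept an alias directly after the table name in UPDATE, but the UPDATE ... FROM form with a join expresses the same thing.
update A
set A.new_grup_code = B.cod_grup_2
from data_grup A
join data_grup_new B on A.cod_grup = B.cod_grup
where A.zone = '001'
Note that the inner join only touches matching rows; the Oracle correlated subquery would set new_grup_code to NULL where no match exists, so a LEFT JOIN would be needed if that behaviour must be preserved.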