Comma Appears In The Decimal Node Of XML File, How To Handle?
Nov 1, 2007
We defined an XML schema in which one node was declared as the decimal data type.
When the incoming value for this node is greater than 999, the source adds a thousands separator, so the figure looks like 1,024.00.
SSIS then cannot treat this as a decimal.
Any experience to share?
I don't want to change the schema from decimal to string.
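If the feed can't be fixed at the source, one common workaround is to read the node as text in a staging step and strip the separator before the cast. A minimal T-SQL sketch (the variable name and precision are assumptions):

-- strip the thousands separator, then convert; decimal(18,2) is a guess at the precision
DECLARE @raw varchar(20);
SET @raw = '1,024.00';
SELECT CAST(REPLACE(@raw, ',', '') AS decimal(18,2)) AS CleanValue;  -- 1024.00

The same REPLACE-then-cast idea also works as a Derived Column expression inside the SSIS data flow.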
I wanted to convert a dataset from VB.NET (2.0) to an .XLS file via MS Jet. My national standard uses decimal commas, not decimal points, to mark the beginning of the decimal places in numbers, but the MS Jet engine uses the decimal point by default. Therefore, in the Excel file only string-formatted cells can accept this data, not number-formatted ones. How can I solve or get around this problem (with Jet, if possible)? iviczl
I am designing some reports for a German branch of my company and need to replace the decimal point with a comma and the thousands-separator comma with a point,
e.g. €1,500,123.00 to €1.500.123,00.
Is there a property that I can change in the report designer to make this happen, or is this something I need to convert in a stored procedure?
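On SQL Server 2012 or later, one option is FORMAT with an explicit culture; a small sketch (the literal stands in for the report column):

SELECT FORMAT(1500123.00, 'N2', 'de-DE');  -- returns 1.500.123,00

In the report designer itself, setting the report's or textbox's Language property to de-DE achieves the same result without touching the stored procedure.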
I have a table named People(#Code, Name, eMailList), where the column eMailList holds a list of email addresses separated by commas, like "someone@hotmail.com,someone@gmail.com,someone@yahoo.com".
For each person, I would like to extract the email addresses in such a way that I can insert them into the following table: EMail(#Code, #ID, Address), where Address holds a single email address.
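A set-based sketch of the split, assuming SQL Server 2005 or later for the recursive CTE (#ID is treated here as a per-person sequence number, and the # key markers are dropped from the column names):

WITH Split AS (
    SELECT Code,
           CAST(LEFT(eMailList, CHARINDEX(',', eMailList + ',') - 1) AS varchar(255)) AS Address,
           CAST(STUFF(eMailList, 1, CHARINDEX(',', eMailList + ','), '') AS varchar(max)) AS Rest,
           1 AS ID
    FROM People
    UNION ALL
    SELECT Code,
           CAST(LEFT(Rest, CHARINDEX(',', Rest + ',') - 1) AS varchar(255)),
           CAST(STUFF(Rest, 1, CHARINDEX(',', Rest + ','), '') AS varchar(max)),
           ID + 1
    FROM Split
    WHERE Rest <> ''
)
INSERT INTO EMail (Code, ID, Address)
SELECT Code, ID, Address FROM Split;

On SQL Server 2016 and later, the built-in STRING_SPLIT(eMailList, ',') does the splitting in a single call.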
Using the Flat File Connection Manager, I am specifying Text Qualifier = double quote ("), and I have a TXT file with one column holding last name and first name as "LN,FN". The settings are comma-delimited, yet the connection manager is creating two different columns for LN and FN.
I have a file which contains comma-separated columns. One of the columns contains names of companies. Sometimes a company name has a comma as part of the name; for those, the value is surrounded by double quotes.
But it seems that SSIS ignores the double quotes and ONLY looks for the column separator. This causes my value to be split in half.
Traditionally, I thought parsers that deal with this type of import do not automatically take the first comma following the double quote as the column separator, but instead look for the first comma following the ending quote (i.e. look at how Excel performs imports...).
I cannot set the column delimiter to double-quote-comma, since only those values that HAVE a comma in them are qualified.
The last entry should be imported as 12 in the first column, "Peter, Paul, Mary" in the second column and 09643 in the third, but instead ends up as 12 in the first, "Peter in the second column and Paul, Mary", 09643 in the last.
(Oddly enough, if I remove the first column of numbers the import works as it is supposed to.)
I have a file where there is a partial row at the end. It doesn't cause an error, but I get a "partial row" warning during execution.
What do most people do with these partial rows? Do they just ignore them as long as they don't cause errors? Or is it better to handle the partial row with a conditional split, for example?
Just wondering what other people's thoughts on this are. I tend to be of the "get rid of it" camp, but maybe that's overkill? Just looking for opinions, best practices.
I have not used log shipping before and find myself in a position where I need to reboot the secondary node and then the primary node, and I don't actually need to fail over.
Is there anything I need to be aware of? When rebooting the secondary node, I assume the transactions will be held in the primary node's log until the secondary comes back, and everything just carries on once it's back up?
When rebooting the primary node, nothing needs to be done and the log shipping will just start again once it has come back?
But I'm not sure if I have to install SQL Server on node 2 first and then add it to the cluster, or whether adding it to the cluster also installs the software.
I'm contemplating running two availability groups on a two-node WSFC. The WSFC is set up with a file share witness (i.e. no shared storage). Can I safely run one AG with its primary on one node, and the other AG with its primary on the other node? Each AG would have its replica on the passive node. This would effectively allow both servers to be in use at the same time. In a failover event, I understand that both workloads would transfer to a single server, so the box needs to be sized appropriately.
We are in the process of building a 3 node SQL Server Cluster (Server 2012/ SQL Server 2012), and we have configured the quorum so that all 3 nodes have a vote (no file share witness as we already have an odd number of nodes).
As I understand it, this should allow the cluster to run as long as 2 of the nodes remain online.
However, the validation report states that 2 node failures would be acceptable and, when we tested this by powering off two of the nodes, the cluster did indeed continue to run on a single node.
Hi, I'm pretty new to Microsoft Visual C# .NET and I want to upload a comma-delimited text file from my local machine into a table in a SQL Server database through a web app. How would I go about programming this, and what controls do I need? Any help would be much appreciated. Thanks in advance.
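On the database side, once the uploaded file has been saved somewhere the server can read, a single BULK INSERT can load it; a sketch only, with the table name, path, and layout all assumed:

-- load a comma-delimited file into an existing table
BULK INSERT dbo.ImportTable
FROM 'C:\uploads\data.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

In the web app, a FileUpload control plus a button handler that saves the file and executes this command is a typical arrangement; ADO.NET's SqlBulkCopy is an alternative that avoids giving the web account file-system rights on the database server.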
I'm trying to upload a small Web application with a one-table database. The hosting company, GoDaddy, requires that I upload the database as a comma-delimited file. I created the database in Visual Web Developer Express but also have Visual Studio and SQL Server Management Studio Express. I can't figure out how to export the database to a comma-delimited file using any of these tools. This should be simple, like it is in Access, but that doesn't seem to be the case. This is holding up deploying my Web application.
Hi, on SQL Server 2000 I have a table with the following structure: MYTABLE (col1 char, col2 date, col3 number).

My objective: externally (from a command line), select all columns and write the output into a file delimited by a comma.

My method:
1. Probably use OSQL or BCP to do this.
2. Use the following syntax: select RTRIM(col1) +','+ RTRIM(col2) +','+ RTRIM(col3) from MYTABLE;

My 3 problems:
1) If there is a NULL column, the result of concatenating any value with NULL is NULL. How can I work around this? I still want to record this column as null; from the example above, if col2 is null the row should come out as: APPLE,,5
2) The time format when querying the database is 2003-06-24 15:10:20. However, in the file the data becomes 24 JUN 2003 3:10PM. How can I preserve the YYYY-MM-DD HH:MM:SS format? Notice that I also lost the seconds.
3) Which utility is better, BCP or OSQL? OSQL has a "-s" flag which gives me the option of specifying a column separator, but the result is "APPLE ,14 JUN 2003 , 5" and I don't need the extra space. While for BCP, there is no column separator flag.

You will notice from my inquiry above that my background in SQL Server is not very good. Thanks in advance!
Regards, Ricky
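For what it's worth, a sketch that addresses all three points with BCP, which does in fact accept a field terminator via -t (database and server names are placeholders): ISNULL keeps NULL columns as empty strings, and CONVERT style 120 pins the YYYY-MM-DD HH:MM:SS format:

bcp "SELECT RTRIM(col1) + ',' + ISNULL(CONVERT(varchar(19), col2, 120), '') + ',' + ISNULL(RTRIM(CAST(col3 AS varchar(20))), '') FROM mydb.dbo.MYTABLE" queryout out.csv -c -T -S myserver

Because the commas are concatenated into a single output column, no -t switch is needed here; alternatively, select the three columns separately and pass -t, to bcp.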
I have a problem... When I insert data from a comma-delimited file using this method (a flat file connection, sorting, a merge join, and inserting into the database), I get quotes around all the data! The quotes end up around the column names and everything! I had to go in and manually remove the quotes in the text file to get some of my data conversions to work. I know there is a better way. How do I get SSIS to load the data without the quotes? This is an example of the data in the file:
"1007","1","A","","Congratulations - No Health Violations Found","11/02/2005","1007"
When I remove the quotes I do not have any problems. How do I do this without modifying the underlying data? Any ideas would be greatly appreciated!!
Here is my idea, but I am looking for the best practice.
Each record can have 3 possibilities.
I would read and write the data 3 times to different tables, add an identity key to all 3 files, then reassemble the data on the identity key and map it to a fourth and final table.
I configured a Windows 2003 R2 and SQL 2005 two-node cluster. When I move the cluster resource from one node to another node, it takes around 30 seconds to come online, so any query running during that time stops responding.
OK, I am faced with working with XML on a regular basis, which is fine.
DECLARE @ViewSN INT

IF NOT EXISTS (SELECT NULL FROM tblviews WHERE viewcode = 'loadAtTerm')
    --<workflowEventType>loadAtTerminal</workflowEventType>
    INSERT INTO tblviews (ViewName, Description, OutBoundForm, StoredProcSN, TriggersReply, ViewCode, DispXactLayer, DispXactViewType, DispXfcTag, Comments)
    SELECT 'QC:WF-LoadAtTerminal', 'This View Corresponds to the XML for loadAtTerminal in Omnitracs Workflow', '0', NULL, '0', 'loadAtTerm', 'MCOM', 'MCOM', NULL, NULL
[code]...
What would be really useful is to be able to take any XML file and automatically parse the node names into a memory (table) variable, and then the fields of each node into another.
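For the node-name part, SQL Server's XQuery support (2005 and later) can pull element names out of an arbitrary document; a sketch with made-up sample XML:

DECLARE @x xml;
SET @x = '<order><id>7</id><customer>Acme</customer></order>';
SELECT n.c.value('local-name(.)', 'sysname')  AS NodeName,
       n.c.value('text()[1]', 'varchar(200)') AS NodeValue
FROM @x.nodes('//*') AS n(c);

This returns one row per element (order, id, customer) with its text, ready to be inserted into a table variable.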
Hello all, I have read many topics about this error, but none of them fixes my packages in my particular case. The problem is that I access an Oracle database using the OLE DB Provider for Oracle.
I get the mentioned error when I try to run a job from the Agent.
1 - When I set ProtectionLevel to "DontSaveSensitive" I get this error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E4D. An OLE DB record is available. Source: "Microsoft OLE DB Provider for Oracle" Hresult: 0x80040E4D Description: "ORA-01017: invalid username/password". This is normal, because the connection needs a password.
2 - When I set ProtectionLevel to "EncryptSensitiveWithUserKey" I get the mentioned error: Failed to decrypt protected XML node "DTS:Password"...
The other ProtectionLevel options don't work either. I have tried writing the password in the required field...
I have tried many possibilities, but nothing works.
Does anybody get this error with Oracle as the source?
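One workaround that often gets such jobs running (a sketch, not confirmed from this thread): set ProtectionLevel to EncryptSensitiveWithPassword, give the package a password, and have the Agent job step supply it at run time, e.g. via dtexec (the path and password are placeholders):

dtexec /F "C:\packages\LoadFromOracle.dtsx" /De "MyPackagePassword"

EncryptSensitiveWithUserKey can only be decrypted by the user who saved the package, which is why the same package fails under the Agent's account.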
Hi, I'm trying to deploy my Web site to GoDaddy. They told me I have to export the SQL Server Express database to a comma-delimited file and then upload that file. The export procedure is simple in Access, but I don't see any way to do it in SQL Server, or from Visual Web Developer or Visual Studio. Also, I can ask them, but I assume I have to export each table separately and also export the ASPNETDB as well. Thanks for the help.
I was wondering if anyone might be able to say how I could export data captured via a view into a comma-delimited CSV file.
So far I have tried using BCP to access my view and export to a CSV file, but the CSV file isn't comma-delimited. I tried finding examples but couldn't see what I should do to get a comma-delimited file. (I'm getting a bit tired now, so I might be missing something!)
I have created a bat file containing the following code:
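For reference, a minimal bat file along these lines (server, database, and view names are placeholders) passes bcp the -t switch to set the field terminator:

bcp "SELECT * FROM MyDb.dbo.MyView" queryout "C:\export\view.csv" -c -t, -T -S MYSERVER

Without -t, bcp's default field terminator is a tab, which would explain output that isn't comma-delimited.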
I have 2 SQL tables: a header table and a detail table. How can I append the header record to a text file and then append the detail records to the same text file, comma-delimited?
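One approach is to build a single ordered result set and export that; a sketch in which all table and column names are invented, with a seq column sorting each header ahead of its details:

SELECT CAST(h.OrderID AS varchar(10)) + ',' + h.CustomerName AS Line,
       h.OrderID, 0 AS seq
FROM   dbo.OrderHeader h
UNION ALL
SELECT CAST(d.OrderID AS varchar(10)) + ',' + d.ItemCode + ',' + CAST(d.Qty AS varchar(10)),
       d.OrderID, 1
FROM   dbo.OrderDetail d
ORDER BY OrderID, seq;

Feeding this query to bcp queryout (or an SSIS flat-file destination) writes each header line followed by its detail lines into one comma-delimited file.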
I have created a Test SSIS Package within BIDS (VS 2K8, v 9.0.30729.4462 QFE; .NET v 3.5 SP1) that connects to our Test Listener.
There is only one Connection Manager object, an OLE DB Provider for SQL Server.
The ConnectionString lists: Provider=SQLOLEDB.1;Integrated Security=SSPI
The Test Connection within BIDS works.
The Package Control Flow has just one object, an Execute SQL Task that performs an EXEC on an SP containing only a SELECT (read).
The Package runs within BIDS.
I've placed this Package within a Job on the Primary Node. I've run the job successfully with the 32-bit runtime both on and off. The location of the file on the server happens to be on a share that resides on what is currently the Secondary Node.
When I try to run an exact copy of this Job on the Secondary Node (which has been set up for Read All Connections: Yes), I get an error regardless of the 32-bit runtime option. At this point, the location of the file is on the Secondary Node.
The Error is: "Login failed for user 'OurDomainAgent_Account'".
The Agent is a member of NT Service\SQLServerAgent on both instances, and that account is a member of sysadmin. Adding the Agent account as well, and giving that account sysadmin, makes no difference either.
I want to join different tables and export the data into a comma-delimited text file. There will be a lot of checks (if/then/else) to manipulate the data. I want to use a stored procedure but don't know how to output to a text file. Is there any utility which can be used in a stored procedure? In the future this will run as an automated job. Thanks in advance.
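From inside a stored procedure, one common (if permission-dependent) route is to shell out to bcp via xp_cmdshell once the if/then/else logic has shaped the data into a view or table. A sketch: names and path are placeholders, and xp_cmdshell must be enabled by an administrator:

DECLARE @cmd varchar(500);
SET @cmd = 'bcp "SELECT * FROM MyDb.dbo.ExportView" queryout "C:\export\extract.csv" -c -t, -T -S MYSERVER';
EXEC master..xp_cmdshell @cmd;

Alternatively, keep the procedure pure T-SQL and let the automated job (a SQL Agent CmdExec step or an SSIS package) handle the file output.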
I want to insert rows into SQL Server from an ASCII file, but I also want to add logic with if-then-else statements. I guess the only way is to write a stored procedure.
How do you read a text file into variables and then write the data into the database?
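A common pattern is to bulk load the file into a staging table first and express the if-then-else as a CASE when moving rows to the real table; a sketch in which all names and the sample rule are invented:

-- stage the raw file
BULK INSERT dbo.Staging
FROM 'C:\data\input.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- apply the conditional logic set-based while inserting
INSERT INTO dbo.Target (Code, Amount)
SELECT Code,
       CASE WHEN Amount < 0 THEN 0 ELSE Amount END
FROM dbo.Staging;

If true row-by-row variables are needed, a cursor over dbo.Staging with IF ... ELSE blocks works too, just more slowly.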