Normalizing The Data
Jun 9, 2006
Hi,
I have a table like this:
Col1 First_Year Last_Year
a 1990 1993
I want this data to be converted to
a 1990
a 1991
a 1992
a 1993
Is there any simple way to do it in SSIS without using Script Component?
Thx
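If a plain T-SQL source query is acceptable instead of an SSIS transformation, a recursive CTE can expand the ranges before the rows ever reach the data flow. A minimal sketch, assuming the source table is called dbo.YearRanges:
;WITH Expanded AS (
    SELECT Col1, First_Year AS Yr, Last_Year
    FROM dbo.YearRanges
    UNION ALL
    SELECT Col1, Yr + 1, Last_Year
    FROM Expanded
    WHERE Yr < Last_Year
)
SELECT Col1, Yr
FROM Expanded
ORDER BY Col1, Yr
OPTION (MAXRECURSION 0);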
View 3 Replies
Nov 7, 2005
I have a view with patient data. It looks like below
patid date pulmdc pulmstatus endodc endostatus
100 4/1/05 10 Good null null
100 5/1/05 10 Good 12 Poor
I want to write SQL which, for each patient, by date, unpivots these four fields to get
100 4/1/05 pulmdc-10 pulmstatus-Good
100 5/1/05 pulmdc-10 pulmstatus-Good
100 5/1/05 endodc-12 endostatus-Poor
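One hedged sketch of such a query, assuming the view is called dbo.PatientView and that a NULL dc value means the measurement was not recorded:
SELECT patid, [date],
       'pulmdc-' + CAST(pulmdc AS varchar(10)) AS dc,
       'pulmstatus-' + pulmstatus AS status
FROM dbo.PatientView
WHERE pulmdc IS NOT NULL
UNION ALL
SELECT patid, [date],
       'endodc-' + CAST(endodc AS varchar(10)),
       'endostatus-' + endostatus
FROM dbo.PatientView
WHERE endodc IS NOT NULL
ORDER BY patid, [date];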
View 1 Replies
Apr 27, 2006
Greetings all,
I have created an SSIS package that takes data from a very large table (301 columns) and puts it in a new database in smaller tables. I am using views to control what data goes to the new tables. I also specified that it drop the destination table and recreate it prior to copying the data. The reason for this is so that old data removed from the larger database will get removed from the normalized databases.
I have 2 things I am trying to figure out...
1. I would like to have the package set a specific column in each new table as the primary key (this will allow us to use relationships when querying the data).
2. I decided I wanted to sort the data as it copies. I am using the BI Visual Studio for my editing. In the Data Flow view I cannot seem to disconnect the output from the Source block so I can connect it to the Sort block and then feed that to the output block. What am I missing here?
Thanks
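For question 1, one common workaround is an Execute SQL Task placed after the data flow that puts the primary key back on the freshly recreated table. A sketch, with the table and column names assumed:
ALTER TABLE dbo.CustomerSubset
ADD CONSTRAINT PK_CustomerSubset PRIMARY KEY (CustomerID);
For question 2, selecting the connector (the green path) between the source and the destination and pressing Delete should remove it, after which it can be re-routed through the Sort transformation.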
View 7 Replies
Nov 6, 2007
I'm new to SSIS and have run into a problem I'm hoping someone can help me with.
Basically, I have a flat file that looks something like:
ID,Type,Description,Results
1,Test1,This is a test,5
2,Test1,This is also a 1 test,7
3,Test1,This is also a 1 test,13
4,Test2,This is a second test,14
5,Test2,This is also a second test,18
I'm trying to normalize the data by extracting out individual rows that have the same "Type" column value. So what I want is to extract each unique type and description into a separate table. This would give me two new rows, one for a type of Test1, and one for a type of Test2, with the descriptions. Does this make sense? Then I could relate the individual results to these test types. In my scenario, I don't care which description is used; I just want to take the first description that shows up with the associated "Type."
Does anyone have any idea of how I could go about doing this? I could pull out all unique "Types" from the rows with the Aggregate transformation, but I'm trying to figure out how to get the description that goes along with it.
Thanks,
Brian
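If the flat file is first staged into a table (or if the work doesn't have to stay inside an Aggregate transformation), a grouping query gets both columns in one pass. A sketch, assuming a staging table dbo.TestStage and using MIN as a stand-in for "any one description per Type":
SELECT [Type], MIN([Description]) AS [Description]
FROM dbo.TestStage
GROUP BY [Type];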
View 1 Replies
Aug 21, 2015
I'm presented with a problem where I have a database table which must be migrated via a "custom tool", moving the data into a new table which has special-character restrictions that didn't exist in the source database. My data resides in a SQL Server 2008 R2 instance.
I envision a one-time query which will loop through selected records and replace the offending characters with --; however, I'm having trouble understanding how this works.
There are roughly 2500 records which meet the criteria of "contains bad characters", frequently containing multiple separate bad chars, and the table contains roughly 100000 rows.
Special Characters are defined as #%&*:<>?/{}|~ and ..
While the field is called "Filename", it doesn't always hold file names; it is a parent/child table where folder names are also stored.
Example data:
Tablename = Items
ItemID Filename ListID
1 Badfile<2015>.docx 15
2 Goodfile.docx 15
3 MoreBad#.docx 15
4 Dog&Cat#17.pdf 15
5 John's "Special" Folder 16
The examples I'm finding are all oriented around SELECT statements, to change the output of what I see returned, however I'd rather just fix the entire column using an UPDATE. Initial testing using REPLACE fails because I don't always have a single character as the bad thing in a string.
In a better solution, I found an example using a User Defined Function to modify the output of a select, but I cannot use that UDF in an UPDATE.
My alternative is to learn enough C# to modify the "migration tool" to do this in-transit, but I know even less about C# than I do of SQL.
I gather I want to use @@ROWCOUNT to loop through the rows but I really can't put it all together in a cohesive way.
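A set-based UPDATE per character avoids the row-by-row loop entirely, so @@ROWCOUNT isn't needed. A rough sketch against the Items table shown above (the character list and the "--" replacement are assumptions taken from the description; add any further characters such as quotes as needed):
DECLARE @bad varchar(32) = '#%&*:<>?/{}|~';
DECLARE @i int = 1;

WHILE @i <= LEN(@bad)
BEGIN
    -- replace every occurrence of the current bad character in one pass
    UPDATE dbo.Items
    SET [Filename] = REPLACE([Filename], SUBSTRING(@bad, @i, 1), '--')
    WHERE CHARINDEX(SUBSTRING(@bad, @i, 1), [Filename]) > 0;

    SET @i = @i + 1;
END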
View 3 Replies
Nov 8, 2006
I have created a table Delivary with these columns (DelivaryId, DelivaryNo, QtyRecieved, DelivaryDate, ProductId) and a table Product with these columns (ProductId, ProductCode, ProductName, ProductPrice). As you can see, the Product table keeps a record of products while the Delivary table keeps a record of stock supplied. I would like to create another table (an Invoice table) that will keep a record of stock sold out, based on the quantity received in the Delivary table.
Please help
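A minimal sketch of such a table, with the new column names assumed rather than prescribed and ProductId assumed to be the primary key of Product; the foreign key ties each invoice line back to the Product table:
CREATE TABLE Invoice
(
    InvoiceId   int IDENTITY(1,1) PRIMARY KEY,
    InvoiceNo   varchar(20) NOT NULL,
    ProductId   int NOT NULL REFERENCES Product (ProductId),
    QtySold     int NOT NULL,
    InvoiceDate datetime NOT NULL
);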
View 6 Replies
Aug 10, 2005
So I'm creating an administrative back end for a site that's already been created, and whoever made the tables the site uses didn't know much about database design. So I need to normalize this table of Links so it can be easier to have someone make changes and updates to it, but then I need to put all my normalized tables back together to create a View exactly like the old table, which the old site can select from. Basically the stipulation is I can't change the code for the old site, so I have to make it think it's still selecting from the same table with the same type of parameters. Is it worth doing all this? Or should I just tough it out with this really ugly table?
Here's the table: and here's the site that uses this table: http://waahp.byu.edu/links.asp
Thanks!
~Cattrah~
View 3 Replies
Aug 28, 2006
Hi
Please can someone point me in the right direction. I built a very badly designed database consisting of only one huge table when I first started with databases; since learning about normalization I have designed and set up a new database which consists of many more tables instead of just the one. My question is: where do I start in transferring the data from the old single-table database to my new multi-table database?
I have MS SQL Server 2005 Management Studio if that helps, and want to transfer around 200,000 rows of data into the new database. Both the new and old databases are on the same server.
thanks in advance
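A common starting point is a set of INSERT ... SELECT DISTINCT statements run against the old table, loading parent/lookup tables first and then the child tables that look those new keys back up. A sketch with made-up table and column names:
-- 1) load a parent/lookup table from the old wide table
INSERT INTO NewDb.dbo.Category (CategoryName)
SELECT DISTINCT CategoryName
FROM OldDb.dbo.BigTable;

-- 2) load the child table, resolving the new surrogate key by joining back
INSERT INTO NewDb.dbo.Item (ItemName, CategoryID)
SELECT b.ItemName, c.CategoryID
FROM OldDb.dbo.BigTable AS b
JOIN NewDb.dbo.Category AS c ON c.CategoryName = b.CategoryName;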
View 11 Replies
Feb 24, 2004
I am a beginner, so please bear with me. I get very confused on how to normalize my database.
Firstly: the employees in the company I work for are in various departments and can have more than one title and work in more than one department.
Example: John Smith can work in the engineering department as a detailer and an engineer and at the same time work as a project manager for the management department.
How do I setup this table structure?
Employees Table
Login (PK) | First | Last | Extension.......
---------------------------------------------
jsmith | John | Smith | 280
Department Title Breakdown
Department | Title
--------------------------
Engineering | Detailer
Engineering | Engineer
Management | ProjectManager
Job Description
Login | Title
-------------------------
jsmith | Engineer
jsmith | Detailer
jsmith | ProjectManager
This is important to break this down because for each project the following is saved:
Project Listing
Project | Detailer | Estimator | Sales | Engineer |....... | Location
10001 | jsmith | jdoe | mslick | sjunk | ...... | Las Vegas
Or should the project be broken down as well
Project Listing
Project | Location
10001 | Las Vegas
Project Team
Project | Member | Activity
10001 | jsmith | Engineer
10001 | mstevens | Detailer
Any thoughts on how to normalize this?
Mike B
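The second project layout is the more normalized one; a sketch of the team table with a composite key so the same person can appear on a project in more than one role (names and lengths assumed from the examples above):
CREATE TABLE ProjectTeam
(
    Project  int         NOT NULL,
    Member   varchar(20) NOT NULL,
    Activity varchar(30) NOT NULL,
    CONSTRAINT PK_ProjectTeam PRIMARY KEY (Project, Member, Activity)
);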
View 6 Replies
Jul 13, 2006
I have this table...
CREATE TABLE #Test (ID char(1), Seq int, Ch char(1))
INSERT #Test SELECT 'A',1,'A'
INSERT #Test SELECT 'A',2,'B'
INSERT #Test SELECT 'A',3,'C'
INSERT #Test SELECT 'B',1,'D'
INSERT #Test SELECT 'B',2,'E'
INSERT #Test SELECT 'B',3,'F'
INSERT #Test SELECT 'B',4,'G'
...and am searching for this query...
SELECT ID, Pattern=...?? FROM #Test....??
...to give this result, where Pattern is the ordered concatenation of Ch for each ID:
ID Pattern
A ABC
B DEFG
Thanks for any help!
Jim
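On SQL Server 2005 and later, one common way to build the Pattern column is a correlated subquery with FOR XML PATH(''); a sketch against the #Test table above (safe here because Ch contains no XML-reserved characters):
SELECT t.ID,
       Pattern = (SELECT '' + t2.Ch
                  FROM #Test AS t2
                  WHERE t2.ID = t.ID
                  ORDER BY t2.Seq
                  FOR XML PATH(''))
FROM #Test AS t
GROUP BY t.ID;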
View 2 Replies
Aug 24, 2006
I re-designed a predecessor's database so that it is more properly normalized. Now, I must migrate the data from the legacy system into the new one. The problem is that one of the tables is a CROSSTAB TABLE. Yes, the actual table is laid out in a cross-tabular fashion. What is a good approach for moving that data into normalized tables? This is the original table:
CREATE TABLE [dbo].[Sensitivities](
[Lab ID#] [int] NULL,
[Organism name] [nvarchar](60) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[Source] [nvarchar](20) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[BACITRACIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[CEPHALOTHIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[CHLORAMPHENICOL] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[CLINDAMYCIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[ERYTHROMYCIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[SULFISOXAZOLE] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[NEOMYCIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[OXACILLIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[PENICILLIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[TETRACYCLINE] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[TOBRAMYCIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[VANCOMYCIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[TRIMETHOPRIM] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[CIPROFLOXACIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[AMIKACIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[AMPICILLIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[CARBENICILLIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[CEFTAZIDIME] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[GENTAMICIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[OFLOXACIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[POLYMYXIN B] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[MOXIFLOXACIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[GATIFLOXACIN] [nvarchar](2) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[SENSI NOTE] [nvarchar](255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
) ON [PRIMARY]
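One hedged approach is UNPIVOT, which turns each antibiotic column into a row and drops the NULLs along the way; only a few of the columns are listed here for brevity (all of them are nvarchar(2), so they unpivot cleanly). The result set can then feed an INSERT into the normalized sensitivity table:
SELECT [Lab ID#], [Organism name], [Source], Antibiotic, Result
FROM dbo.Sensitivities
UNPIVOT
(
    Result FOR Antibiotic IN
        ([BACITRACIN], [CEPHALOTHIN], [CHLORAMPHENICOL], [CLINDAMYCIN],
         [ERYTHROMYCIN], [PENICILLIN], [TETRACYCLINE], [VANCOMYCIN])
) AS u;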
View 5 Replies
Dec 27, 2003
THE LAYOUT:
I have two tables: "Applicant_T" and "StreetSuffix_T"
The "Applicant_T" table contains fields for the applicant's current address, previous address and employer address. Each address is broken up into parts (i.e., street number, street name, street suffix, etc.). For this discussion, I will focus on the street suffix. For each of the addresses, I have a street suffix field as follows:
[Applicant_T]
CurrSuffix
PrevSuffix
EmpSuffix
The "StreetSuffix_T" table contains the postal service approved street suffix names. There are two fields as follows:
[StreetSuffix_T]
SuffixID <-----this is the primary key
Name
For each of the addresses in the Applicant_T table, I input the SuffixID of the StreetSuffix_T table.
THE PROBLEM:
I have never created a view that would require the primary key of one table to be associated with multiple fields of another table (i.e., SuffixID-->CurrSuffix, SuffixID-->PrevSuffix, SuffixID-->EmpSuffix). I want to create a view of the Applicant_T table that will show the suffix name from the StreetSuffix_T table for each of the suffix fields in the Applicant_T table. How is this done?
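This just takes three joins to the same lookup table, one per suffix column, each under its own alias. A sketch (the view name and the NULL-tolerant LEFT JOINs are assumptions):
CREATE VIEW dbo.ApplicantAddress_V
AS
SELECT a.*,
       cs.[Name] AS CurrSuffixName,
       ps.[Name] AS PrevSuffixName,
       es.[Name] AS EmpSuffixName
FROM Applicant_T AS a
LEFT JOIN StreetSuffix_T AS cs ON cs.SuffixID = a.CurrSuffix
LEFT JOIN StreetSuffix_T AS ps ON ps.SuffixID = a.PrevSuffix
LEFT JOIN StreetSuffix_T AS es ON es.SuffixID = a.EmpSuffix;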
View 6 Replies
Jan 23, 2006
Hi!
I have a table with the following columns:
account_nr, account_totaling_members, account_type
the account_totaling_members column contains a pipe-separated list of accounts in a varchar: "1001|1002|1003"
I need to normalize this so that I get records like:
"10", "1001", "sum"
"10", "1002", "sum"
"10", "1003", "sum"
...and so forth
Does anyone have any idea how to accomplish this?
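A recursive CTE can peel one account off the front of the list per pass. A sketch, with the table assumed to be called Accounts and the list capped at varchar(200):
;WITH Split AS (
    SELECT account_nr,
           CAST(LEFT(account_totaling_members + '|',
                     CHARINDEX('|', account_totaling_members + '|') - 1) AS varchar(200)) AS member,
           CAST(STUFF(account_totaling_members + '|', 1,
                      CHARINDEX('|', account_totaling_members + '|'), '') AS varchar(200)) AS rest
    FROM Accounts
    UNION ALL
    SELECT account_nr,
           CAST(LEFT(rest, CHARINDEX('|', rest) - 1) AS varchar(200)),
           CAST(STUFF(rest, 1, CHARINDEX('|', rest), '') AS varchar(200))
    FROM Split
    WHERE rest <> ''
)
SELECT account_nr, member
FROM Split
ORDER BY account_nr, member
OPTION (MAXRECURSION 0);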
View 3 Replies
Jun 21, 2006
I am copying data from one denormalized table to a COUPLE of normalized ones.
I am using multicast, following advices from the forum.
The problem I have is that the two destination tables (A and B) share a foreign key relationship. Filling in A is no problem, but when I want to fill in B, I don't know how to populate its foreign key, since the multicast doesn't know the corresponding primary key in table A.
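One way around it is to load table A first and then populate B in a second pass that looks the new key up by a business key both rows share. In SSIS that is typically a Lookup transformation against A; the same idea in plain T-SQL, with staging, table, and column names assumed:
INSERT INTO dbo.B (A_ID, OtherColumn)
SELECT a.A_ID, s.OtherColumn
FROM dbo.Staging AS s
JOIN dbo.A AS a
    ON a.BusinessKey = s.BusinessKey;   -- the natural key both rows carry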
View 9 Replies
Aug 20, 2015
CREATE TABLE CATEGORIES(CATEGORYID VARCHAR(10), CATEGORYLIST VARCHAR(200))
INSERT INTO CATEGORIES(CATEGORYID, CATEGORYLIST) VALUES('1000', 'S01:S03, S09:S20, S22:S24')
INSERT INTO CATEGORIES(CATEGORYID, CATEGORYLIST) VALUES('1001', 'S11:S12')
INSERT INTO CATEGORIES(CATEGORYID, CATEGORYLIST) VALUES('1002', 'S30:S32, S34:S35, S60')
INSERT INTO CATEGORIES(CATEGORYID, CATEGORYLIST) VALUES('1003', 'S40')
The CATEGORYLIST strings are composed of value ranges, with the start and end of a range separated by a colon (:) and multiple ranges separated by commas.
The results I need are:
CATEGORYID STARTRANGE ENDRANGE
1000 S01 S03
1000 S09 S20
1000 S22 S24
1001 S11 S12
1002 S30 S32
1002 S34 S35
1002 S60 S60
1003 S40 S40
I have tried taking the original data and parsing it out as an XML file. Is there a less cumbersome way to do this in T-SQL?
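One less cumbersome sketch: split on the commas with a recursive CTE, then split each piece on the colon, treating a single value as its own start and end:
;WITH Split AS (
    SELECT CATEGORYID,
           CAST(LEFT(CATEGORYLIST + ',', CHARINDEX(',', CATEGORYLIST + ',') - 1) AS varchar(200)) AS item,
           CAST(STUFF(CATEGORYLIST + ',', 1, CHARINDEX(',', CATEGORYLIST + ','), '') AS varchar(200)) AS rest
    FROM CATEGORIES
    UNION ALL
    SELECT CATEGORYID,
           CAST(LEFT(rest, CHARINDEX(',', rest) - 1) AS varchar(200)),
           CAST(STUFF(rest, 1, CHARINDEX(',', rest), '') AS varchar(200))
    FROM Split
    WHERE rest <> ''
)
SELECT CATEGORYID,
       LTRIM(CASE WHEN item LIKE '%:%'
                  THEN LEFT(item, CHARINDEX(':', item) - 1)
                  ELSE item END) AS STARTRANGE,
       LTRIM(CASE WHEN item LIKE '%:%'
                  THEN SUBSTRING(item, CHARINDEX(':', item) + 1, 200)
                  ELSE item END) AS ENDRANGE
FROM Split
ORDER BY CATEGORYID, STARTRANGE;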
View 1 Replies
May 22, 2015
I have a large data set with 10s of millions of rows of contact information. The data is in CSV format and contains 48 columns of information (First name, MI, last name, 4 part address, 10+ demographic points, etc.) and I'm struggling with how I should design the database and normalize this data, or if I should normalize this data.
My 2 thoughts for design were either:
1. Break the columns into logical categorical tables (e.g. BasicContactInfo, Demographics, Financials, Interests, etc.)
2. Keep the entire row in one table, and pull the "objects" out into lookup tables (e.g. States, ZIPCodes, EmploymentStatus, EthnicityCodes, etc.)
The data will be immutable for the most part, and when I get new data, I'll just create a new database and replace the old one.
The reason I like option 1 is because it makes importing easier, since I can just insert the appropriate columns from each row into the appropriate tables. Option number 2 feels like it would be faster to get metrics on the data, like how many contacts live in which states, or what is the total number of unique occupations in the data set. Plus I'll be able to make relationships between the tables, like which state is tied to which zipcode, which city is tied with which county, etc. Importing that data might be more tricky, since I don't think SQL Bulk Copy will allow for inserting into normalized tables like that.
The primary use for this data is to allow our sales force to create custom lists of contact information based on a faceted search page. The sales person would create the filter, and then I will provide them with the resulting data so they can start making business contacts. Search performance needs to be good. Insert, update, and deletes won't happen once the data has been imported.
What should I look for in designing this database? Any good articles on designing tables around wide data sets like my contact information?
View 6 Replies
Oct 17, 2012
I need to normalise comma separated strings of tags (SQL Server 2008 R2).
E.g. (1, 'abc, DEF, xyzrpt') should become
(1, 'abc')
(1, 'DEF')
(1, 'xyzrpt')
I have written a procedure in T-SQL that can handle this. But it is slow and it would be better if the solution was available as a view, even a slow view would be better.
Most solutions I found go the other way round: from (1, 'abc'), (1, 'DEF') and (1, 'xyzrpt'), generate (1, 'abc, DEF, xyzrpt').
If memory serves, it used "FOR XML PATH". But it's been a while and I may be totally wrong.
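A set-based XML split avoids the loop and works inside a view. A sketch on SQL Server 2008 R2, with the source table and columns assumed to be dbo.TagStrings(Id, TagCsv); it assumes the tag text contains no XML-reserved characters such as & or <:
CREATE VIEW dbo.TagList
AS
SELECT t.Id,
       LTRIM(n.x.value('.', 'varchar(100)')) AS Tag
FROM dbo.TagStrings AS t
CROSS APPLY (SELECT CAST('<t>' + REPLACE(t.TagCsv, ',', '</t><t>') + '</t>' AS xml) AS doc) AS d
CROSS APPLY d.doc.nodes('/t') AS n(x);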
View 2 Replies
May 28, 2015
I have a script that resolves data into XML like this, for example:
<root>
<title>A</title>
<id>1</id>
<nodes>
<node>
<id>2</id>
<title>A.1</title>
</node>
</nodes>
</root>
It works perfectly, but how do I make sure every item has a "nodes" element? The case here is the child leaves, obviously. On the client I have to inject this "nodes" element into a JSON version of this XML, and I just wanted to avoid normalizing the structure on the client.
For the root I am using
FOR XML PATH('root'),TYPE; and for the hierarchy that follows
FOR XML RAW ('node'), root('nodes'), ELEMENTS
View 0 Replies
Nov 24, 2006
Hi, all here,
Thank you very much for your kind attention.
I am wondering if it is possible to use SSIS to sample a data set into a training set and a test set and feed them directly to my data mining models, without saving them somewhere, as they would occupy too much space. I really need guidance on that.
Thank you very much in advance for any help.
With best regards,
Yours sincerely,
View 5 Replies
Dec 14, 2005
After testing out the application I wrote on the local PC, I deployed it to the web server to test it out, and I get this error.
System.Data.SqlClient.SqlException: The conversion of a char data type to a
datetime data type resulted in an out-of-range datetime value.
Notes: all pages that have this error have either a repeater or a datagrid which loads data when the page loads.
At first I thought the problem was with the date, but then I can see that some other pages that have a datagrid (with a date field) work just fine.
Has anyone had this problem before? Hopefully you guys can help.
Thanks,
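A likely cause is that the web server's regional/language settings interpret the date text differently from the local PC (mm/dd/yyyy vs dd/mm/yyyy), so some values fall out of range. Two hedged ways to take the guesswork out of the conversion:
-- be explicit about the incoming text format (style 103 = dd/mm/yyyy)
SELECT CONVERT(datetime, '14/12/2005', 103);

-- or fix the interpretation for the session before implicit conversions
SET DATEFORMAT dmy;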
View 4 Replies
Dec 4, 2007
I have used both data readers and data adapters(with datasets) in the projects that I have worked on. I am trying to get some clarification on when I should be using which one. I think I am doing this correctly but I want to be sure I am developing good habits.
As the name might suggest, it seems like a datareader is for only reading data. I have read that the data adapter and dataset are for a disconnected architecture. Or, that they can be used for this type of set up. I have been using the data adapter and datasets when writing to a database and the datareader when reading from a database.
Is this how these should be used? Is the data reader the best choice for reading data? Am I doing this the optimal way from a performance stand point?
Thanks in advance.
View 1 Replies
Nov 2, 2015
We have already integrated different client data into MDS with the MS Excel plugin; now we want to push updated or newly added records back to the source database. Is it possible to do this using MDS? Is there any background sync process which automatically syncs data to and from the subscriber and MDS?
View 4 Replies
Oct 18, 2006
When I enter over 4000 chars in any ntext field in my SQL Server 2005 database (directly in the database and through the application) I get an error saying that the data could not be updated because string or binary data would be truncated. Has anyone ever seen this? I cannot figure out what is causing it; ntext should be able to hold a lot more data than this...
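An ntext column itself holds up to roughly 1 GB, so the truncation usually comes from something in between: a parameter, variable, or intermediate column declared as nvarchar(4000) or smaller. A hedged sketch of that kind of fix, with the procedure and table names invented for illustration:
ALTER PROCEDURE dbo.SaveNote
    @NoteID int,
    @Body   nvarchar(max)   -- not nvarchar(4000), which silently caps the text
AS
    UPDATE dbo.Notes
    SET Body = @Body
    WHERE NoteID = @NoteID;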
View 7 Replies
Aug 12, 2015
I have a requirement to implement CDC for 50+ tables to capture incremental data changes for warehouse/reporting, rather than exporting whole tables. The largest table has more than half a billion records.
The warehouse uses a daily copy of the OLTP db (daily DB refresh). How can I accomplish this? Is there a downside in implementing CDC just for the sake of taking incremental changes on the tables?
Is there any performance impact if we enable CDC on the OLTP db?
Can we make use of the CDC tables in the environment where we do the daily db refresh, so that the queries don't hit the OLTP database?
What is the best way to implement CDC to take incremental changes for reporting?
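For reference, CDC is enabled per database and then per table; a minimal sketch with the database, schema, and table names assumed:
USE SalesOLTP;
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'Orders',
     @role_name     = NULL;   -- no gating role
CDC does add some overhead on the OLTP side (capture jobs read the transaction log), which is why it is usually weighed against the volume of changes being tracked.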
View 0 Replies
Jul 20, 2005
Hi, this is driving me nuts. I have a table that stores notes regarding an operation in an IMAGE data type field in MS SQL Server 2000.
I can read and write no problem from Access using the StrConv function, and I can update the field correctly in T-SQL using:
DECLARE @ptrval varbinary(16)
SELECT @ptrval = TEXTPTR(BITS_data)
FROM mytable_BINARY WHERE ID = 'RB215'
WRITETEXT OPERATION_BINARY.BITS @ptrval 'My notes for this operation'
However, I just cannot seem to be able to convert the information back to text once it is stored, using T-SQL. My selects keep returning binary data. How to do this? Thanks for your help.
SD
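For reading it back, a cast through varbinary usually does it; on SQL Server 2000 a direct cast is limited to 8000 bytes (READTEXT is the fallback for longer values). A sketch using the column and table names from the WRITETEXT above:
SELECT CAST(CAST(BITS AS varbinary(8000)) AS varchar(8000)) AS NotesText
FROM OPERATION_BINARY
WHERE ID = 'RB215';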
View 1 Replies
Nov 10, 2015
I'm using a Script Component to load data into an Oracle DB due to a poor performance issue. Now I have found it is missing some data during the transmission. Please see the screenshot below:
SQL Server:
Oracle:
DDL:
create table Person
(
BusinessEntityID Integer,
FirstName nvarchar2(50),
MiddleName nvarchar2(50),
LastName nvarchar2(50)
);
Result:
I followed this article: [URL] ....
VB Script:
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
[Code] ..........
View 8 Replies
Apr 16, 2008
Hi all, I got this error:
[DTS.Pipeline] Error: "component "Excel Source" (1)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
and also this:
[Excel Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "Fiscal Week" needs to be updated in the external metadata column collection. The column "Fiscal Year" needs to be updated in the external metadata column collection. The column "1st level" needs to be added to the external metadata column collection. The column "2nd level" needs to be added to the external metadata column collection. The column "3rd level" needs to be added to the external metadata column collection. The "external metadata column "1st Level" (16745)" needs to be removed from the external metadata column collection. The "external metadata column "3rd Level" (16609)" needs to be removed from the external metadata column collection. The "external metadata column "2nd Level" (16272)" needs to be removed from the external metadata column collection.
I tried going to Data Flow -> Excel connection -> Advanced Editor for Excel Source -> Input and Output Properties and tried to refresh the affected columns.
It seems that somehow the 3 columns are not read in from the source file?
And also Fiscal Year and Fiscal Week are not set up properly in my data destination?
Has anyone faced such errors before?
Thanks
View 13 Replies
Jul 24, 2015
When I execute the below stored procedure I get the error that "Arithmetic overflow error converting expression to data type int".
USE [FileSharing]
GO
/****** Object: StoredProcedure [dbo].[xlaAFSsp_reports] Script Date: 24.07.2015 17:04:10 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
[Code] .....
Msg 8115, Level 16, State 2, Procedure xlaAFSsp_reports, Line 25
Arithmetic overflow error converting expression to data type int.
The statement has been terminated.
(1 row(s) affected)
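Without the procedure body it is only a guess, but the most common cause is an aggregate or expression whose int result exceeds 2,147,483,647; widening to bigint before the arithmetic is the usual fix. An illustrative sketch with an assumed table name:
-- fails once the running total passes the int limit
SELECT SUM(Quantity)                  FROM dbo.BigFact;

-- widen the operand first, so the aggregate is computed as bigint
SELECT SUM(CAST(Quantity AS bigint))  FROM dbo.BigFact;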
View 10 Replies
Feb 5, 2007
Is there a step-by-step paper to get there? Here is what I need to consider: I will have many customers that will need their own set of records and access pages "branded for their company", and each customer will have many clients. I am hosting this application on a Windows 2003 server with SQL Server 2005 Enterprise.
I am using Windows authentication. I have created a username in Windows, then I added the Windows user in SQL Management Studio under Security, granted "DB Read" and "DB Write", and again under the database's Security tab. Still, from the web, authentication fails. I must be missing a step or two?
I expect to set up a username for each database as i setup new customers.
View 1 Replies
Feb 23, 2008
RE: XML Data source .. Expression? Variable? Connection? Error: unable to read the XML data.
I want my XML Data Source to be an expression, as I will be looping through a directory of XML files.
I don't see the expression property or the connection property??
I tried setting the XMLData property to @[User::filename], but that results in:
Information: 0x40043006 at Load XML Files, DTS.Pipeline: Prepare for Execute phase is beginning.
Error: 0xC02090D0 at Load XML Files, XML Source [108]: The component "XML Source" (108) was unable to read the XML data.
Error: 0xC0047019 at Load XML Files, DTS.Pipeline: component "XML Source" (108) failed the prepare phase and returned error code 0xC02090D0.
Information: 0x4004300B at Load XML Files, DTS.Pipeline: "component "OLE DB Destination" (341)" wrote 0 rows.
Task failed: Load XML Files
Information: 0xC002F30E at Bad, File System Task: File or directory "d:jcpxmlLoadjcp2.xml.bad" was deleted.
Warning: 0x80019002 at Package: The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "Package.dtsx" finished: Failure.
The program '[3312] Package.dtsx: DTS' has exited with code 0 (0x0).
Thanks for any help or information.
View 3 Replies
Sep 28, 2015
I set up this package to import data from a SharePoint list to a SQL Server data table. The primary key of my SQL table is mapped to the Title column of my SharePoint list. There is a possibility that duplicate values will be entered in the Title field of the SharePoint list, so when importing data into my table via SSIS, my package always errors out when it comes across duplicate values. How have others managed data integrity when importing from a SharePoint list with the Title column mapped to the primary key of a table?
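One hedged pattern is to land the list in a staging table first and only move rows whose Title isn't already in the target, so the primary key is never violated mid-load; the names below are assumptions:
INSERT INTO dbo.TargetTable (Title, OtherColumn)
SELECT s.Title, s.OtherColumn
FROM dbo.SharePointStaging AS s
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.TargetTable AS t
                  WHERE t.Title = s.Title);
If the list itself can contain the same Title twice, the staging query also needs a GROUP BY or ROW_NUMBER() to keep one row per Title.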
View 4 Replies
Feb 12, 2008
Hello,
I am wondering what conversion rules apply, when a string, which contains a number, is saved to a SQL Server 2005 into a column of type decimal.
This is the code I'm using (C++):
CString cValue = "0.75";
_variant_t vtFieldValue;
vtFieldValue = _variant_t(cValue);
pRecordSet->Fields->Item["MyColumn"]->Value = vtFieldValue;
"pRecordSet" is an ADO recordset. The database column "MyColumn" is of type "decimal(19,10)".
The most important question for me is, if the regional settings of the database server or the regional settings of the client PC are considered during the conversion from the string to the decimal value. For example in standard French regional settings the "." would not be recognized as decimal separator.
I am also wondering if the language of the database instance, in which this data is saved, is considered during this conversion or any other settings of this database instance.
So my general question is: Does anybody know exactly what rules apply during the above mentioned conversion?
Thank you for your help.
Regards,
Volker
View 2 Replies
Oct 12, 2015
I have a question about how to handle structural data model changes in a PowerPivot data source. Suppose I'm developing a star model in SQL Server, and sometimes a data type changes or the name of a field changes in a table. It seems to me that PowerPivot does not handle this as gracefully as Analysis Services MD does (mostly). I received an error because of a wrong field name, or even no error when a data type changes in PowerPivot. Is this common or am I doing something wrong here? Does this mean that every time the data model changes the PowerPivot model should be recreated? Or am I missing the clue here?
View 6 Replies