Which Of The 2 Approaches Is Better
Apr 29, 2008
Hi
I have 2 approaches to fetch the data from the database.
In the first approach, I have defined a function, say GetResult(), in a common file, say function.cs, to fetch the data and bind it to the GridView.
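For context, here is a minimal sketch of what such a shared helper could look like. Only GetResult() and function.cs come from the post; the DataHelper class name, connection string, and sample query are assumptions.

// function.cs - sketch of a shared data-access helper
using System.Data;
using System.Data.SqlClient;

public static class DataHelper
{
    // Runs the supplied query and returns the results as a DataTable.
    public static DataTable GetResult(string connectionString, string query)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlDataAdapter adapter = new SqlDataAdapter(query, conn))
        {
            DataTable table = new DataTable();
            adapter.Fill(table);   // Fill opens and closes the connection itself
            return table;
        }
    }
}

// In the page code-behind, the result can then be bound to the GridView:
// GridView1.DataSource = DataHelper.GetResult(connString, "SELECT * FROM Orders");
// GridView1.DataBind();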
Jul 14, 2006
I need to import a CSV file with a few million records and 50 fields into a table. Only one column in the file needs to be transformed, and a second column needs to be checked for data validity (e.g. I don't want to let someone pass in 'CA' for an integer field). Two approaches come to mind:
1. Use SSIS to read the file directly into the table, then apply T-SQL to do a mass update to the single field that needs to be transformed. (With this approach, though, it is not clear how to check the data validity of each row via T-SQL.)
2. Use SSIS to import the file one row at a time, transforming the data and checking its validity as it goes; a rough sketch of what that per-row check might look like follows. I suspect this approach will be much slower than 1), but I haven't tried it yet.
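For illustration only, the per-row transform and validity check in option 2 could be written inside an SSIS Script Component along these lines. The column names (ProductCode, QuantityText, Quantity) are assumptions, and the Input0Buffer members shown are the ones SSIS would generate for whatever columns are actually configured.

// SSIS Script Component (transformation) sketch - column names are assumed
public class ScriptMain : UserComponent
{
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        // Transform the single column that needs it (example: normalise a code value).
        Row.ProductCode = Row.ProductCode.Trim().ToUpperInvariant();

        // Validity check: catch values like 'CA' arriving in an integer field.
        int quantity;
        if (int.TryParse(Row.QuantityText, out quantity))
        {
            Row.Quantity = quantity;
        }
        else
        {
            // Redirect or flag the bad row here, e.g. via the DirectRowTo... method
            // SSIS generates for an error output configured on the component.
        }
    }
}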
Which way do you think would be the fastest?
TIA,
barkingdog
Jun 26, 2006
I am retrieving data from an AS400. I used the Import Data utility to pull the data in initially and to create the table for it in SQL Server. There are a couple of text fields that represent dates in the data I imported.
The Import data utility imported these fields as character data. That was fine by me.
Now, when I pull new data from the same AS400 data source, I have to use the DataReader source (the OLE DB components cannot handle a SQL command; i.e., they work fine when pulling the entire table but fail with even the simplest SQL statement). The DataReader source insists on interpreting the text fields containing date data from the AS400 as the DT_DBTIMESTAMP data type. The net effect is that the data it pulls in, in that timestamp format, is too large to fit into the 10-character varchar field that was created when the entire table was imported.
I know I can get around this in a couple of ways, but my question is this: why will the DataReader source not let me specify how I want the incoming data to be interpreted? There seems to be no way to change how the external column is interpreted. By this I mean I can change the external column's representation on the DataReader source to indicate that I want it treated as a 10-character text field, but I cannot change the output column's representation. When I try to do so, I get an error message telling me the data type of the output columns on the DataReader source component cannot be changed. It seems a bit odd to me that character data coming in cannot be treated as character data on the way out.
It looks like I will need to use a script component to force the data conversion that I want and it just doesn't make a lot of sense to me as to why I can't do what I need to do in a conversion component.
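If it helps, a bare-bones sketch of that script-component conversion is below, assuming the incoming timestamp column is named OrderDateIn and the 10-character output column is OrderDateText; both names, and the generated _IsNull members, are assumptions rather than anything from the actual package.

// Script Component sketch: turn the incoming timestamp back into a 10-character date string
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    if (Row.OrderDateIn_IsNull)
    {
        Row.OrderDateText_IsNull = true;
    }
    else
    {
        // Keep only the date portion so the value fits the existing varchar(10) column.
        Row.OrderDateText = Row.OrderDateIn.ToString("yyyy-MM-dd");
    }
}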
I would feel better if I could understand what is going on behind the scenes that makes this state of affairs desirable.
Thanks in advance.
Dec 11, 2014
I am trying to put together options for creating a test environment for our Dynamics NAV system. The environment will mainly be used to test new releases and changes ahead of applying them to production. The 2 options I am considering are…
1. Create a second test instance on our production SQL Server to host a test database.
2. Purchase a set of SQL Developer licences and have a totally separate server for our test environment.
My preference would be option 2; however, I need to build a convincing case that this is the best way forward. I wondered if I could tap into the thoughts of the SQL Central community and see how others approach this.