My client has an app that is A2k FE with A2k BE. They have asked me
to move the BE to SQL Server.
I have a bit of experience with SQL Server, and I'm happy with
scripting the database etc.
However, when it comes time to move the data itself, I have a teensy
little concern.
Let's take our typical Customers -> Orders relation, where CustomerID
is the Foreign Key in the Orders table.
Let's say I have 4 customers, but I used to have 6. The CustomerID is
an AutoNum column, and they are 1, 2, 4, 6. However, when I insert
these records into the matching SQL Server table with an Identity
column, they will (presumably) be 1, 2, 3, 4.
So what can I do about the matching orders? Is there any alternative
to the following approach, which seems very long-winded?
1. Insert the records as above, but also copy the old Access
CustomerID value into a temporary column (called OldCustomerID) in the
new SQL Server Customers and Orders tables.
2. Insert the Orders records into the new SQL Server table.
3. Run an update query thus:
UPDATE O
SET CustomerID = C.CustomerID
FROM Orders O
INNER JOIN Customers C ON O.OldCustomerID = C.OldCustomerID
Can I do this any easier? For example, is there any way in SQL Server
to maintain the values of the AutoNum field in the Insert but have
them still be Identity fields? I know that I can import the AutoNum
data into a simple int column, and once the import is complete, turn
ON the identity, but is this reliable? Might I lose any data
integrity?
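For what it's worth, SQL Server does allow explicit values to be inserted into an IDENTITY column via SET IDENTITY_INSERT, so the original CustomerID values (1, 2, 4, 6) can be kept and the Orders rows then match without any remapping. A minimal sketch, assuming hypothetical table and column names and that the Access data has already been landed in a staging table:

SET IDENTITY_INSERT dbo.Customers ON;

INSERT INTO dbo.Customers (CustomerID, CustomerName)   -- an explicit column list is required here
SELECT CustomerID, CustomerName
FROM dbo.Customers_Staging;                            -- hypothetical staging copy of the Access data

SET IDENTITY_INSERT dbo.Customers OFF;

DBCC CHECKIDENT ('dbo.Customers', RESEED);             -- so new customers continue after the highest existing value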
TIA
Edward
--
The reading group's reading group:
http://www.bookgroup.org.uk
Our current SQL Server 7 is overloaded and I have been asked to move some data to a new box which is to be acquired. My query is as follows:
1. What should be the ideal spec of a SQL Server box to allow for Windows 2000 in the future? 2. How do I go about moving data from one server to the other? I am new to SQL and this is my first assignment.
I have a server that will be going off-line in a couple of weeks, and in the meantime, I need to backup everything and move it to a new location. This involves a lot of data and databases.
What I have is 33 databases in MS SQL Server 2005. The target machine is MS SQL Server 2008. Each of these databases has a myriad of stored procedures and compiled functions that will need to be moved as well. Essentially the task is to completely replicate the entire MS SQL environment.
Is there a shortcut to making this happen or am I relegated to backing up each database individually and then restoring them individually?
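Not a full shortcut, but one labour-saving trick (a sketch; the backup folder is a placeholder) is to generate the 33 BACKUP statements from the catalog on the 2005 box rather than typing them. A SQL Server 2005 backup restores, and is upgraded, cleanly on 2008, and the stored procedures and functions travel inside each database:

SELECT 'BACKUP DATABASE [' + name + '] TO DISK = ''D:\Backups\' + name + '.bak'' WITH INIT;'
FROM sys.databases
WHERE database_id > 4;   -- skips master, tempdb, model and msdb

Run the generated statements, copy the .bak files to the new server, and build the matching RESTORE ... WITH MOVE statements the same way on the 2008 side.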
I need to write a stored procedure in SQL Server 2000 that moves data from a table in SQL Server 2000 to the same table in Access. The .mdb file is located on the same computer and I know its location (path).
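One way to do this entirely in T-SQL is the Jet OLE DB provider through OPENROWSET. A rough sketch, with the .mdb path and the table and column names as placeholders (ad hoc distributed queries must be allowed on the server):

CREATE PROCEDURE dbo.usp_ExportToAccess
AS
BEGIN
    INSERT INTO OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                           'C:\Data\Target.mdb'; 'Admin'; '',
                           Customers)                    -- target table inside the .mdb
    SELECT CustomerID, CustomerName
    FROM dbo.Customers;
END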
As mentioned in the title, I'm trying to move data from Excel into a table in a SQL Server 2000 database. We have spreadsheets that produce data in Excel using QueryTables.
We want a system that works the other way: type data into a spreadsheet, press a button, and the data is pushed into a SQL Server table.
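The button itself would have to live on the Excel side (VBA/ADO pushing the rows over a connection), but if a server-side pull of the finished sheet is acceptable, something along these lines works against SQL Server 2000; the workbook path, sheet name and columns are placeholders:

INSERT INTO dbo.TargetTable (Col1, Col2)
SELECT Col1, Col2
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\Data\Input.xls;HDR=YES',
                'SELECT Col1, Col2 FROM [Sheet1$]');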
I have about 50,000 data entries to move from MS Access to SQL Server 2005 Express. There is no DTS in the tools. I already have the tables, just need to move the data. Appreciate any and all help.
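One DTS-free option is to point a linked server at the .mdb and run plain INSERT ... SELECT statements per table; 50,000 rows is well within reach of this. A sketch, with the path and names as placeholders:

EXEC sp_addlinkedserver
     @server     = 'ACCESSDB',
     @provider   = 'Microsoft.Jet.OLEDB.4.0',
     @srvproduct = 'Access',
     @datasrc    = 'C:\Data\Source.mdb';

INSERT INTO dbo.Customers (CustomerID, CustomerName)
SELECT CustomerID, CustomerName
FROM ACCESSDB...Customers;    -- four-part name; catalog and schema stay empty for Jet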
Hello, I have been having a bit of trouble finding help on the safest way to move data files to a different disk on the same server. Most help is about moving data files to a different SQL Server. I just want to move the files to a different drive on the same server. Any help would be appreciated. Thanks, David
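The usual route for a same-server move is detach, copy the files, re-attach; a sketch, with the database name and new paths as placeholders (take the database out of use first):

EXEC sp_detach_db @dbname = 'MyDatabase';

-- copy MyDatabase.mdf and MyDatabase_log.ldf to the new drive at the operating-system level, then:

EXEC sp_attach_db @dbname    = 'MyDatabase',
                  @filename1 = 'E:\SQLData\MyDatabase.mdf',
                  @filename2 = 'E:\SQLData\MyDatabase_log.ldf';

On SQL Server 2005 and later, ALTER DATABASE ... MODIFY FILE plus taking the database offline achieves the same thing without detaching.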
I am wishing to:
* take a table of data into a C# CLR procedure and store it in an array
* take a second table of data into this procedure row by row, and return a row (into a third table) for each row (and this returned row is based on the data in the array and the data in this row).
I am new to CLR programming, and pulling my hair out at the moment. I'm sure that what I'm trying to do is simple, but I am not making any progress with the on-line help and error messages. I have now described what I am trying to do in more detail, firstly in English, and then in a T-SQL implementation that works (for this simple example). I'm looking for the C# CLR code to be returned, containing preferably two parts:
* the C# code, and
* the CLR code to 'make it live',
since I am not sure where my coding is going wrong. (I think it should be possible to read in the one table, and then loop through the second table, calculating and returning the necessary output as you do the calculations...)
Problem in English:
Consider a situation where there are three tables: DATATABLE, PARAMETERTABLE and RESULTSTABLE.
For each row in PARAMETERTABLE, I will calculate a row for RESULTSTABLE based on calculations involving all the entries from DATATABLE.
I am wishing to do this in a C# CLR for performance reasons, and because the functions I will be using will be significantly more complex, and recursively built up.
The values of the results table are calculated as the sum (for i = 1 to the number of rows in DATATABLE) of (ParameterA + i*ParameterB - DataValue)^2, taking i and DataValue from each row of DATATABLE, which leads to the values shown below.
T-SQL Implementation
-- Set up database and tables to use in example
USE master
GO
CREATE DATABASE QuestionDatabase
GO
sp_configure 'clr enabled', 1
GO
USE QuestionDatabase
GO
RECONFIGURE
GO
CREATE TABLE DataTable (i INT NOT NULL, DataValue REAL NOT NULL)
GO
CREATE TABLE ParameterTable (ParameterNumber INT NOT NULL, ParameterA REAL NOT NULL, ParameterB REAL NOT NULL)
GO
CREATE TABLE ResultsTable (ParameterNumber INT NOT NULL, ResultsValue REAL NOT NULL)
GO
--Initialise the Tables
INSERT INTO DataTable (i, DataValue) VALUES (1, 12.5)
INSERT INTO DataTable (i, DataValue) VALUES (2, 10)
INSERT INTO DataTable (i, DataValue) VALUES (3, 14)
INSERT INTO DataTable (i, DataValue) VALUES (4, 17.5)

INSERT INTO ParameterTable (ParameterNumber, ParameterA, ParameterB) VALUES (1, 10, 2)
INSERT INTO ParameterTable (ParameterNumber, ParameterA, ParameterB) VALUES (2, 11.7, 1.1)
-- The TSQL to be rewritten in C#, to produce the Output as hoped
INSERT INTO ResultsTable (ParameterNumber, ResultsValue)
SELECT ParameterNumber,
       SUM((ParameterA + i*ParameterB - DataValue) * (ParameterA + i*ParameterB - DataValue)) AS b
FROM DataTable
CROSS JOIN ParameterTable
GROUP BY ParameterNumber
-- Output as hoped
SELECT * FROM DataTable
SELECT * FROM ParameterTable
SELECT * FROM ResultsTable
-- which produced
-- DataTable
1   12.5
2   10
3   14
4   17.5

-- ParameterTable
1   10     2
2   11.7   1.1

-- ResultsTable
1   20.5
2   18.26
-- but I hope to do the same with something like:
CREATE ASSEMBLY *** FROM 'C:***.dll'
CREATE PROCEDURE GenerateResultsInCLR(@i int, @r1 real, @r2 real) RETURNS TABLE (i int, r real) EXTERNAL NAME ***.***.***
EXEC GenerateResultsInCLR
This is a simple example that can easily be written in T-SQL. I am looking to develop things that are recursive in nature, which makes them unsuited to T-SQL unless one uses cursors, and that becomes very slow when the parameter table has 1m records and the data table 100k records. This is why the data table must be read in once and manipulated many times, and the manipulation will need to be in the form of a loop.
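One detail on the 'make it live' part: a CLR routine that returns a table is registered as a table-valued function rather than with CREATE PROCEDURE ... RETURNS TABLE. A sketch of the T-SQL registration, with the assembly, class and method names as placeholders (the parameter list would mirror whatever the C# method takes):

CREATE ASSEMBLY ResultsAssembly
FROM 'C:\CLR\ResultsAssembly.dll'
WITH PERMISSION_SET = SAFE;
GO
CREATE FUNCTION dbo.GenerateResultsInCLR()
RETURNS TABLE (ParameterNumber INT, ResultsValue REAL)
AS EXTERNAL NAME ResultsAssembly.[UserDefinedFunctions].GenerateResults;
GO
SELECT * FROM dbo.GenerateResultsInCLR();

On the C# side the method behind it needs to return IEnumerable and carry a SqlFunction attribute naming a FillRow method, with data access enabled if it reads the tables over the context connection.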
Hi All, We have a Progress DB in our company. We are trying to move all the data to SQL Server 2005. When I try to run the Import Data task from SQL Server 2005, the radio button called "Copy data from one or more tables or views" is disabled, and it asks me to write a SQL script to copy all the data from 120 tables.
I am using a .NET Framework Data Provider for ODBC when I run the Import Data task from SQL Server 2005.
I am using the DataDirect 4.1 32-Bit Progress SQL92 v9.1D ODBC Driver to connect to the Progress DB. I get a connection, but the copy option stays disabled.
Can anyone please help me resolve this issue? Even a few pointers would be really helpful.
I have a project where I am creating a test environment. My objective is to give the client staff a set of instructions to follow to make sure the test mirrors production. I have migrated the SRS databases over to the new test server. All of the reports are there, as is the security, but where I run into an issue is the data sources for SRS: they are all pointed at the production servers. Is there an easy way to make the change, possibly using a SQL update statement, or by exporting all the reports to XML and doing a find and replace? I went through the shared data sources manually and changed them, but there were 40 data sources. Now I have realized some of the reports have embedded data connections, and there are hundreds of reports to go through.
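One way to at least find the affected reports is to query the ReportServer catalog and search each stored RDL for the old server name; the schema below is an assumption, and editing the catalog directly is unsupported, so treat this as read-only reconnaissance:

SELECT c.[Path], c.Name
FROM dbo.Catalog AS c
WHERE c.[Type] = 2      -- 2 = report
  AND CONVERT(NVARCHAR(MAX),
              CONVERT(XML, CONVERT(VARBINARY(MAX), c.Content))) LIKE '%OldProdServer%';

For actually changing the embedded connections, exporting the RDL files, doing the find and replace, and republishing them (for example with the rs.exe scripting utility) is safer than updating the catalog.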
I am trying to load data from an Excel spreadsheet file into SQL Server 2005 Express. I understand that DTS is the best tool for doing this but from my research it appears that DTS is not available with the Express edition and the import wizard that does come with Express is not well suited for this type of conversion.
Does anyone have any suggestions for how to achieve this objective? Thanks for any help you can provide.
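One workaround, sketched here with a placeholder path and table, is to save the worksheet as a CSV file and bulk load it:

BULK INSERT dbo.ImportTarget
FROM 'C:\Data\spreadsheet.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);   -- FIRSTROW = 2 skips the header row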
Problem: Moving data from MySQL to SQL Server 2005
I am trying to pull data over from MySQL to SQL Server. First, the import wizard greys out the table-copy option, so I have to put in one query at a time, which is a pain. And second, it does not even work! It takes me through to the end of the wizard, lets me click Finish, and then says there was an error.
Anyway, I tried going down the SSIS route because it's going to be a nightly job. I used the ADO.NET ODBC connection. It worked, but the performance is really not acceptable: it took 5 minutes to import 24,000 rows, whereas DTS was doing this in 1 second. I wish I could use the native MySQL ODBC 3.51 connector for the import. Can someone give me step-by-step instructions on how to do that?
I hear someone mentioned using an Execute SQL Task, which can use the MySQL ODBC 3.51 driver, but since I am new, how do I get it to work? Say, for example, in the Execute SQL Task I run a statement like SELECT * FROM addr. Then what?
Eventually I want the result saved in a SQL Server table called addr. How can I get the result from that Execute SQL Task and put it inside an addr table in SQL Server? Should I save the result to a variable of type Object? But then how do I get the data out of the Object and tell SQL Server in the designer that the result contains these columns and that they need to map to these columns in the addr table?
I'm very confused. I wish the first option had given me the results that an enterprise ETL tool gives, but apparently it is too slow to be acceptable in a production environment, when I will have millions of rows coming in.
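A step in that direction, sketched with placeholder server, database and login values and not verified against this exact setup: create a linked server through MSDASQL using the MySQL ODBC 3.51 driver, then pull with OPENQUERY straight into the target table. It won't match a tuned data flow, but for 24,000 rows a night it should be comfortable:

EXEC sp_addlinkedserver
     @server     = 'MYSQL_LINK',
     @srvproduct = 'MySQL',
     @provider   = 'MSDASQL',
     @provstr    = 'DRIVER={MySQL ODBC 3.51 Driver};SERVER=mysqlhost;DATABASE=mydb;UID=someuser;PWD=secret';

INSERT INTO dbo.addr
SELECT *
FROM OPENQUERY(MYSQL_LINK, 'SELECT * FROM addr');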
We have a production SQL 7 server, QA, and Development. From time to time, I want to move just the data from the production server to the other two servers without modifying objects that may have been changed, such as stored procedures and rights. Is there a way, using the SQL tools provided, that we can move just the data? Because what also happens is that the rights on the objects change, which means my developers no longer have access to the tables for selects in QA, since their changes were overwritten by production, where they do not have rights.
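If the object and permission copying is the problem, a plain data-only refresh per table never touches procedures or rights. A sketch run on the QA server, with the linked server, database and table names as placeholders:

-- PRODSRV is a linked server pointing at production
TRUNCATE TABLE dbo.Customers;          -- use DELETE instead if foreign keys reference this table

INSERT INTO dbo.Customers (CustomerID, CustomerName)
SELECT CustomerID, CustomerName
FROM PRODSRV.ProductionDB.dbo.Customers;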
I am getting ready to start a project where I am charged with moving old data out of production into a newly created historical DB. We have about 8 internal audit tables that are big and full of old data. These tables are barely used and are taking up way too much space and time for maintenance.
I would like to create a way (SSIS?) to look at the date field in each of the 8 tables and copy anything older than two years into my newly created history DB, then delete the older records from the source DB.
I don't know if SSIS is the best method to use. If it is, which containers should I use to move the data over, and then how do I do the delete from the source?
Can I do the mass deletes on my audit source tables without impacting performance/indexes/fragmentation?
I'm a newbie on SSIS and am trying to grasp my way through this.
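For what it's worth, the copy-then-delete can also be plain T-SQL inside an SSIS Execute SQL Task or an Agent job step. A sketch for one of the eight tables, with the table and date column names as placeholders, deleting in batches so the log and blocking stay under control:

INSERT INTO HistoryDB.dbo.AuditTable1
SELECT *
FROM   ProdDB.dbo.AuditTable1
WHERE  AuditDate < DATEADD(YEAR, -2, GETDATE());

WHILE 1 = 1
BEGIN
    DELETE TOP (10000)
    FROM  ProdDB.dbo.AuditTable1
    WHERE AuditDate < DATEADD(YEAR, -2, GETDATE());

    IF @@ROWCOUNT = 0 BREAK;
END

An index on the date column helps both statements, and rebuilding the indexes afterwards deals with the fragmentation the deletes leave behind.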
I am trying to copy data from a Sql Server 2000 database to a simplified table in Sql Server 2005 database.
What I want is to move the data to a staging table, then drop the main table and rename the staging table to the main table, to minimize the down-time of the data. I can't get the workflow to work, because the staging table has to exist when I run the package. I thought I could use an "Execute SQL" task to generate the table before running the data flow, but that doesn't work. Am I going about this the wrong way? Is there an optimal solution to this problem, so my data can be accessible as much as possible?
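One pattern, sketched with placeholder table names: create the staging table once so the package validates (or set DelayValidation to True on the data flow), have an Execute SQL Task empty or rebuild it at the start of each run, and do the drop/rename swap in a final Execute SQL Task so the main table is only unavailable for an instant:

-- first Execute SQL Task: make sure the staging table exists and is empty
IF OBJECT_ID('dbo.MainTable_Staging') IS NOT NULL
    DROP TABLE dbo.MainTable_Staging;

SELECT *
INTO   dbo.MainTable_Staging
FROM   dbo.MainTable
WHERE  1 = 0;                          -- copies the structure only

-- final Execute SQL Task, after the data flow has loaded dbo.MainTable_Staging
BEGIN TRAN;
    DROP TABLE dbo.MainTable;
    EXEC sp_rename 'dbo.MainTable_Staging', 'MainTable';
COMMIT;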
I have looked far and wide and have not found anything that works to allow me to resolve this issue.
I am moving data from DB2 using the MS OLEDB Provider for DB2. The OLEDB source sees the column of data as DT_TEXT. I setup a destination to SQL Server 2005 and everything looks good until I try and run the package.
I get the error: [OLE DB Source [277]] Error: An OLE DB error has occurred. Error code: 0x80040E21. An OLE DB record is available. Source: "Microsoft DB2 OLE DB Provider" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
[OLE DB Source [277]] Error: Failed to retrieve long data for column "LIST_DATA_RCVD".
[OLE DB Source [277]] Error: There was an error with output column "LIST_DATA_RCVD" (324) on output "OLE DB Source Output" (287). The column status returned was: "DBSTATUS_UNAVAILABLE".
[OLE DB Source [277]] Error: The "output column "LIST_DATA_RCVD" (324)" failed because error code 0xC0209071 occurred, and the error row disposition on "output column "LIST_DATA_RCVD" (324)" specifies failure on error. An error occurred on the specified object of the specified component.
[DTS.Pipeline] Error: The PrimeOutput method on component "OLE DB Source" (277) returned error code 0xC0209029. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Any suggestions on how I can get the large string data in the varchar column in DB2 into the varchar(max) column in SQL Server 2005?
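One thing that sometimes gets around long-data retrieval errors, offered here as an assumption rather than something verified against this provider, is casting the column to a fixed-length VARCHAR in the source query so it is no longer exposed as a long/LOB type:

-- source query for the OLE DB source (DB2 SQL); schema, table and length are placeholders,
-- and values longer than the chosen length would be truncated
SELECT CAST(LIST_DATA_RCVD AS VARCHAR(8000)) AS LIST_DATA_RCVD
FROM   MYSCHEMA.MYTABLE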
I have 2 databases on one server that I want to consolidate into one database. I'm just learning SQL Server 2005. What is the easiest way to move my 3 tables from one database to a new one on the same server? Do I have to use SSIS to do it, or can a simple query be written? I'm new, so please be a little detailed in your answer. Thanks in advance for any comments.
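For three tables on the same server, no SSIS is needed; a cross-database query per table does it. A sketch, with the database and table names as placeholders:

-- copies the columns and data into the new database in one go
SELECT *
INTO   NewDatabase.dbo.Table1
FROM   OldDatabase.dbo.Table1;

-- or, if the table already exists in the new database:
INSERT INTO NewDatabase.dbo.Table1
SELECT * FROM OldDatabase.dbo.Table1;

Keys, indexes and constraints do not come across with SELECT INTO, so script those separately from Management Studio.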
I have a scenario where I need to refresh a database that is in 7.0 (converted from a 6.5 database) from the original database. Is there an easy way to do this? I have tried creating a DTS package, but the data never seems to make it across.
I'm having problems using the upgrade wizard to move data from a 6.5 SQL Server (on another machine) to a 7.0 server sitting on a PDC. The wizard dies (and passes me over to Dr Watson) when the login fails for the 6.5 machine.
I am sure I have the right password (I've tried variations as well) and have updated the hosts file so that the machine is known by its name (I think one of the FAQ answers suggested that)...
What other possibilities are there for moving the data? I looked at bcp, but that seems a rather long-winded route (the data contains timestamps, so I suppose there will be a problem reading them in on the 7.0 side)...
What would be the best way to move 59 million rows from one table to another? The table has no constraints, but has three indexes. The table has only four columns. It will be going from SQL 2000 to SQL 2000.
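For that volume on SQL 2000, a single set-based insert with a table lock is the usual starting point, ideally after dropping the three indexes on the target and recreating them once the load finishes so they are not maintained row by row. A sketch with placeholder names:

INSERT INTO dbo.TargetTable WITH (TABLOCK) (Col1, Col2, Col3, Col4)
SELECT Col1, Col2, Col3, Col4
FROM   dbo.SourceTable;

If transaction-log growth is a concern, bcp out/in or BULK INSERT in batches is the other common route.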
Hi all, I have an MS Access table and an MS SQL table. I am running a Windows service on my localhost where the data from the MS Access table is copied to the MS SQL table every one minute. Before copying the data, the MS SQL table is flushed in order to avoid duplicates. Now I want to copy only the latest records, updated within the last minute in the MS Access table, to the MS SQL table.

My MS Access table:
Name     Id
jas      100
meena    101
viji     102

My MS SQL table:
Name     Id
jas      100
meena    101
viji     102

After 1 minute, say 2 records are added to my MS Access table:
Name     Id
jas      100
meena    101
viji     102
bhuvana  103
pinky    104

Now I want to insert only the latest records from MS Access into MS SQL, so it ends up as:
Name     Id
jas      100
meena    101
viji     102
bhuvana  103
pinky    104

How do I do this? Thanks in advance. Jasmeeta.
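Rather than flushing the SQL table every minute, the job can insert only the rows that are not there yet. A sketch from the SQL Server side using the Jet provider, with the .mdb path and the table names as placeholders:

INSERT INTO dbo.SqlTable (Name, Id)
SELECT a.Name, a.Id
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'C:\Data\Source.mdb'; 'Admin'; '',
                AccessTable) AS a
WHERE NOT EXISTS (SELECT 1 FROM dbo.SqlTable s WHERE s.Id = a.Id);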
I am wondering the best way to go about a task I have been assigned. We have two similar websites, but each is located on a different network. One network is secure, so it cannot be accessed from the normal WWW. The secure network will contain the master database. I need to write a program, or do something with SQL Server, to retrieve all records from the WWW site and get them onto the secure database. In the future I will also need to update records from the WWW site if they have been updated. What is the easiest way to move data from one network to the other when I cannot connect to both databases simultaneously? Thanks, Matt
I have a web app that has been relegated to a disconnected PC. It's running IIS and .NET 2.0 with SQL Server Express, but no connectivity. I have changes that are made to some of the data in the db (data, not schema). There is one particular table that I cannot overwrite and must extract the data from. What methods are available to do this swap of data between databases?
I was thinking of doing something like this: track the last date that the remote db was updated; upload the updated database into the data directory; loop through the records for all affected tables, and for any date past the logged date, update the record if it exists or insert a new record; then loop through the remote db and delete the records that don't exist in the updated db. This seems intensive and slow, especially as the tables get bigger, but I can't think of another solution that can be done by a user using some type of web interface.
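The loop described above can be collapsed into three set-based statements against the attached copy of the uploaded database; a sketch in which the database, table, key and date column names are all placeholders, and which scales much better than row-by-row processing as the tables grow:

-- rows changed since the locally held copy
UPDATE t
SET    t.Payload = s.Payload, t.UpdatedOn = s.UpdatedOn
FROM   LocalDb.dbo.Items AS t
JOIN   UploadedDb.dbo.Items AS s ON s.ItemID = t.ItemID
WHERE  s.UpdatedOn > t.UpdatedOn;

-- brand new rows
INSERT INTO LocalDb.dbo.Items (ItemID, Payload, UpdatedOn)
SELECT s.ItemID, s.Payload, s.UpdatedOn
FROM   UploadedDb.dbo.Items AS s
WHERE  NOT EXISTS (SELECT 1 FROM LocalDb.dbo.Items t WHERE t.ItemID = s.ItemID);

-- rows deleted upstream
DELETE t
FROM   LocalDb.dbo.Items AS t
WHERE  NOT EXISTS (SELECT 1 FROM UploadedDb.dbo.Items s WHERE s.ItemID = t.ItemID);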
I am trying to move a data and log backup file from my C drive to another drive to free my C drive. Does anybody know the easiest way to do it? Thanks.
This is (probably) a simple question, but I'm new to SQL Server scripting.
What I want be able to do is move data from one table in a database to another table in the same database, once one of the fields (a date field) reaches a certain value.
Specifically, we are inserting rows into a table that are stamped with an insert date and an expiry date. When the expiry date is reached, we want to move the applicable row from the original table to an identically structured table.
Ideally, we want this to be a script that is run daily.
Does anyone have any ideas or examples that we could use?
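A daily job really only needs two statements wrapped in a transaction, so a row cannot be deleted without having been copied; a sketch with placeholder table and column names, intended to be scheduled as a SQL Server Agent job:

DECLARE @cutoff DATETIME;
SET @cutoff = GETDATE();

BEGIN TRAN;

INSERT INTO dbo.ExpiredRows
SELECT *
FROM   dbo.ActiveRows
WHERE  ExpiryDate <= @cutoff;

DELETE FROM dbo.ActiveRows
WHERE  ExpiryDate <= @cutoff;

COMMIT;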
I imported a little over 9 million records into my database without "cleaning" the data.
I'm looking for suggestions on what would be the most efficient way to get all the fields to the correct types and to insert decimals on the fields that need them.
My only idea thus far is to make another table with the correct field types and append the data to the new table.
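That append-to-a-clean-table idea works, and the cleanup can happen on the way in with CAST/CONVERT. A sketch with entirely hypothetical column names; the divide-by-100 line illustrates inserting an implied decimal point:

INSERT INTO dbo.CleanTable (RecordID, Amount, RecordDate)
SELECT CAST(RecordID AS INT),
       CAST(Amount AS DECIMAL(12, 2)) / 100.0,    -- e.g. amounts stored without their decimal point
       CONVERT(DATETIME, RecordDate, 112)         -- style 112 = yyyymmdd strings
FROM   dbo.RawImport;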
I've been given the task of moving data from an old system to a new one. That's easy enough, apart from the fact that some of the data fields do not match in the least:
Example: In the old system (tableOld) there is a field that contains text data such as 'Laboratory','Field','Market','Demo','Development', but this one field could contain one or more of these. In the new system (tableNew) there are boolean fields for each of the above.
What I need is a way to check for these and then make sure the correct field(s) gets ticked (1 entered into the field) in the new system.
Any suggestions? I'm just trying to get my head around SSIS.
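This does not necessarily need SSIS; a single INSERT ... SELECT with CASE/LIKE tests can set the flags. A sketch in which the source column name and the bit column names are placeholders, and the LIKE tests assume the category words do not overlap as substrings:

INSERT INTO tableNew (Id, IsLaboratory, IsField, IsMarket, IsDemo, IsDevelopment)
SELECT o.Id,
       CASE WHEN o.Categories LIKE '%Laboratory%'  THEN 1 ELSE 0 END,
       CASE WHEN o.Categories LIKE '%Field%'       THEN 1 ELSE 0 END,
       CASE WHEN o.Categories LIKE '%Market%'      THEN 1 ELSE 0 END,
       CASE WHEN o.Categories LIKE '%Demo%'        THEN 1 ELSE 0 END,
       CASE WHEN o.Categories LIKE '%Development%' THEN 1 ELSE 0 END
FROM tableOld AS o;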