I'm from Russia, so sorry if my English is not very good.
Here's the case:
1)------------------------------- I made a DTS package in SQL 2000 that transfers a SQL table into a DBF file via Jet 4.0. First, I create (in Delphi) an empty DBF with the same name and the same columns as the SQL table. Second, I run my DTS package with variables for the source and destination table names.
In DTS there are a source, a destination, and a transformation. After I pass in the variables (the table names), the transformation "arrow" needs to be "refreshed" so that the column names in the two tables correspond to each other. For that, in the transformation I chose ActiveX Script mode and wrote this VBScript:
' Copy each source column to the destination column
Function Main()
    Dim i
    For i = 1 To DTSSource.Count
        DTSDestination(i) = DTSSource(i)
    Next
    Main = DTSTransformStat_OK
End Function
And it works.
2)------------------------------ I want to do the same thing in SQL 2005 SSIS but can't figure out how. I managed to make a package that receives the table names in variables and runs correctly. But after I change those variables to any other table names, it crashes with: Description: "component "OLE DB Source" (1)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
Of course this happens because I didn't "refresh" the transformation (and maybe also the source and destination), but I don't know how.
Is it possible/advisable, when transferring very large amounts of data from server to server, to first transfer the data to a new table, and second alter the new table, adding indexes, defaults, etc. based on the original table?
If it is, what flow item would be used to transfer/alter the indexes and defaults?
I'm very new to SSIS, so the more detail you can give the better.
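For the second step, one option is an Execute SQL Task placed after the Data Flow Task. A hedged sketch with placeholder table, column, and constraint names (in practice these statements would be scripted from the original table):

ALTER TABLE dbo.NewTable
    ADD CONSTRAINT DF_NewTable_Status DEFAULT ('active') FOR [Status]

CREATE NONCLUSTERED INDEX IX_NewTable_CustomerID
    ON dbo.NewTable (CustomerID)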
I am new to SQL Server 2005 SSIS packages. I want to transfer data from multiple tables from SQL Server to an Oracle database. I cannot use the Export Wizard, as it creates new tables in the destination (Oracle) DB, and I already have the tables created in the destination DB. When I created an SSIS package, it only allowed me to create a package to transfer data for one table from source to destination. I have created DTS packages in 2000, where you have a source DB and a destination DB and just add links for multiple tables. Is there a way I can do this in SSIS? Please let me know.
I'm using a Business Intelligence project to copy stored procedures and tables from one database to another across servers. I'm having trouble copying tables or stored procedures using the Management.SMO.Transfer class.
I tried copying stored procedures with the property transfer.CopyAllStoredProcedures = true. This didn't work. As a workaround, I used the StringCollection property and executed every string as SQL.
Now I'm having trouble copying tables. I don't want to copy all the tables in the database; how do I go about selecting which tables to copy? I tried using the ObjectList property and provided the names of the tables in an ArrayList. I get the error:
"Transfer cannot process System.String. You need to pass an instance class object."
How can I pass an "instance class object" for something that's in the database? The ScriptTransfer method fails so I can't even see the script that is being generated. There is virtually no documentation for this class.
I need to transfer data from SQL Server 2000 to SQL Server 2005 nightly: replicate the DB1.T1 table in SQL 2000 to the DB2.T1 table (same table structure) in SQL 2005. What are my options? I need to extract the daily data and do this every day. An example would be greatly appreciated. Thanks,
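One low-ceremony option is a linked server on the 2005 instance pointing back at the 2000 instance, driven by a nightly SQL Agent job. A hedged sketch, where the linked server name SQL2000SRV and the columns (including a LoadDate datetime column used to pick out the daily slice) are placeholders:

-- Runs on the SQL 2005 instance; copies yesterday's rows from 2000 to 2005.
INSERT INTO DB2.dbo.T1 (Col1, Col2, LoadDate)
SELECT Col1, Col2, LoadDate
FROM SQL2000SRV.DB1.dbo.T1
WHERE LoadDate >= DATEADD(day, -1, GETDATE())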
I have versions of sp_hexadecimal and sp_help_revlogin that I used with SQL 2000 to transfer logins. Will they work with SQL 2005 as-is, or might I need updated versions?
I am using the "Transfer SQL Server Objects Task" to copy some tables from database A to database B including data.
The tables, primary key constraints, foreign keys, and data all transfer nicely, except for the DEFAULT constraints on the tables.
I have failed to find any option in the "Transfer SQL Server Objects Task" to explicitly say "copy default constraints", so I guess it should logically happen automatically, but it doesn't. I hope it is not a bug :-)
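As a workaround, the default constraints could be scripted out of database A and the generated statements run against database B. A hedged sketch using the SQL 2005 catalog views:

-- Generates one ALTER TABLE ... ADD CONSTRAINT ... DEFAULT statement per
-- default constraint; run the resulting script against database B.
SELECT 'ALTER TABLE ' + QUOTENAME(SCHEMA_NAME(o.schema_id)) + '.' + QUOTENAME(o.name)
     + ' ADD CONSTRAINT ' + QUOTENAME(dc.name)
     + ' DEFAULT ' + dc.definition
     + ' FOR ' + QUOTENAME(c.name)
FROM sys.default_constraints AS dc
JOIN sys.objects AS o ON o.object_id = dc.parent_object_id
JOIN sys.columns AS c ON c.object_id = dc.parent_object_id
                     AND c.column_id = dc.parent_column_id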
I made the jump from SQL 2000 to 2005 (I'm not a DBA but like to play around with such nice tools) and tried to convert my DTS package into a 2005 one.
I've got a csv file containing some log in information.
As users log in everyday, this keeps getting added to the file.
I'm only interested in the latest appearance of a certain machine on the network, so I filter by creating a unique index.
The CSV file gets imported into a temporary table first, and then I executed the following on SQL 2000:
CREATE TABLE [patchings].[dbo].[allfilter] (
[username] varchar (20) NULL,
[machine] varchar (20) NULL,
[os] varchar (20) NULL,
[sp] varchar (20) NULL,
[date] smalldatetime NULL,
[time] varchar(10) NULL,
[logonserver] varchar (20) NULL,
[country] varchar (20) NULL,
[domain] varchar (20) NULL,
[ip] varchar (20) NULL, -- column name lost in the original post; inferred from the sample rows below
[brand] varchar (20) NULL,
[model] varchar (20) NULL,
[speed] int NULL
)
CREATE UNIQUE INDEX [my filter] ON [dbo].[allfilter]([machine], [os]) WITH IGNORE_DUP_KEY ON [PRIMARY]
GO
INSERT INTO [allfilter] SELECT * FROM temptable ORDER BY [date] DESC
This worked perfectly in SQL2000.
When I try to do the same in SQL 2005, it doesn't insert the most recent machines (filtered on the date) but seemingly random ones.
Running the SELECT * FROM temptable ORDER BY [date] DESC shows me the correct order of the machines by date, so I'd presume that it should add the most recent combination of machine and operating system from "today" and ignore the older appearances.
If the temptable contained, for instance:
MY-MACHINE MY_USER Windows XP Service Pack 2 2006/06/13 10:02:34 Mydomaincontroller BE MY_DOMAIN my_ip IBM type 2593
MY-MACHINE MY_USER Windows XP Service Pack 2 2006/06/15 9:59:59 Mydomaincontroller BE MY_DOMAIN my_ip IBM type 2593
MY-MACHINE MY_USER Windows XP Service Pack 2 2006/06/16 12:33:09 Mydomaincontroller BE MY_DOMAIN my_ip IBM type 2593
MY-MACHINE MY_USER Windows XP Service Pack 2 2006/06/19 10:48:56 Mydomaincontroller BE MY_DOMAIN my_ip IBM type 2593
MY-MACHINE MY_USER Windows XP Service Pack 2 2006/06/20 11:03:49 Mydomaincontroller BE MY_DOMAIN my_ip IBM type 2593
under SQL 2000 I'd see one record in the final "allfilter" table:
MY-MACHINE MY_USER Windows XP Service Pack 2 2006/06/20 11:03:49 Mydomaincontroller BE MY_DOMAIN my_ip IBM type 2593
under SQL 2005 I don't get the last one, though the code is the same:
MY-MACHINE MY_USER Windows XP Service Pack 2 2006/06/16 12:33:09 Mydomaincontroller BE MY_DOMAIN my_ip IBM type 2593
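The likely explanation is that with IGNORE_DUP_KEY, which duplicate "wins" depends on the order in which rows reach the unique index, and SQL 2005 does not guarantee that an ORDER BY on the SELECT of an INSERT ... SELECT controls that order (it happened to work in 2000). A deterministic rewrite for SQL 2005 is to pick the newest row per (machine, os) explicitly, e.g. with ROW_NUMBER(); a hedged sketch, using the column list from the CREATE TABLE above:

INSERT INTO [allfilter]
SELECT username, machine, os, sp, [date], [time], logonserver,
       country, domain, ip, brand, model, speed
FROM (
    -- rn = 1 marks the most recent row for each machine/os combination
    SELECT *, ROW_NUMBER() OVER (PARTITION BY machine, os
                                 ORDER BY [date] DESC) AS rn
    FROM temptable
) AS t
WHERE rn = 1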
We have a maintenance plan running every day for rebuilding and reorganizing indexes, but somehow it keeps failing. Here is the script that we are running for the rebuild/reorg:
/* Script to handle index maintenance.
   Tuning constants are set in-line; current values are:
   SET @MinFragmentation
   SET @MaxFragmentation
   SET @TrivialPageCount
No matter what table, view, or stored proc I pick, it always says that it doesn't exist at the source. I know it exists because I am picking it from the list of tables, etc. that the GUI provides.
I am accessing a SQL 2005 Express SP1 server via TCP/IP (all machines fully up to date with Windows Update). For ease of configuration, the server IP is mapped to an alias in Native Client Configuration Manager. On the client, Force Protocol Encryption is set to Yes and Trust Server Certificate is set to No. The clients access the DB via SQL 2005 Management Studio Express and MS Access 2003 (an Access .adp front end to the SQL database).
As a test I turned off Force Encryption on the server. A network trace shows no encryption! Re-enabling Force Encryption on the server corrects this.
As a further test, I configured the server with a certificate the client doesn't trust. No error is generated by the untrusted certificate! The only way to trigger the certificate error is to enable encryption in the connection properties of each application. Why isn't the Native Client Configuration Manager utility enforcing this?
This brings me to a second (off-topic) problem. In my Access .adp file, if I enable the "Use Encryption for Data" advanced option under the database connection dialog, everything works as expected (encryption + certificate verification). However, I am programmatically updating the connection in VBA:
Application.CurrentProject.OpenConnection "Provider=SQLOLEDB.1;Data Source=AliasName;Use Encryption for Data=True;..."
If I do this, no encryption is enabled! Access seems to just ignore the encryption setting.
Has anybody got any ideas as to why either of these issues is occurring, or whether there are any workarounds?
How do I move a table (test1) to one physical filegroup, and the table's indexes to another filegroup? There is one clustered index and 3 nonclustered indexes. Thanks!
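A hedged sketch for SQL 2005, assuming the clustered index is PK_test1 on column id and that filegroups FG_DATA and FG_INDEX already exist (all names are placeholders). Rebuilding the clustered index on FG_DATA moves the table's data with it; each nonclustered index is then rebuilt on FG_INDEX:

-- Moving the clustered index moves the table itself.
CREATE UNIQUE CLUSTERED INDEX PK_test1
    ON dbo.test1 (id)
    WITH (DROP_EXISTING = ON)
    ON FG_DATA

-- Repeat for each of the three nonclustered indexes.
CREATE NONCLUSTERED INDEX IX_test1_col1
    ON dbo.test1 (col1)
    WITH (DROP_EXISTING = ON)
    ON FG_INDEX

DROP_EXISTING also works when the clustered index enforces a primary key, so the constraint does not have to be dropped first.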
I have a customer table and it has only two indexes. But when I view it from Query Analyzer, it lists ten indexes with names like the following:
"_WA_Sys_status_01EAB64E"
I do not know what the above index does, and I remember that I did not create it. When I try to drop it, it says it's not an index. I don't know what to do. Would somebody please advise me on this?
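Names starting with _WA_Sys_ are statistics that SQL Server creates automatically (when AUTO_CREATE_STATISTICS is on), not real indexes, which is why DROP INDEX refuses them. They are harmless and normally best left alone, but they can be listed and removed as statistics; a sketch, with dbo.customer standing in for the table name:

-- List all statistics (including auto-created ones) on the table.
EXEC sp_helpstats 'dbo.customer', 'ALL'

-- Drop one of the auto-created statistics, if really needed.
DROP STATISTICS dbo.customer._WA_Sys_status_01EAB64E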
Can someone point me to where I can find out if a SQL Server table has an index? I was in Enterprise Manager and brought up the properties for a given table, but was unable to find the information.
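Besides the GUI, a quick way is the sp_helpindex system procedure, which lists every index on a table along with its keys; a one-line sketch, with dbo.MyTable as a placeholder name:

EXEC sp_helpindex 'dbo.MyTable'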
I have 2 tables with this design: one has ArticleID as primary key plus multiple other fields, and one has GroupID as primary key plus multiple other fields. Each article can belong to multiple groups, so I created a new table called articleGroups with only 2 fields, ArticleID and GroupID, to show the groups associated with each article. There is a relation between this table and each of the 2 main tables. My question is: in the articleGroups table, does it make sense to create an index on ArticleID, on GroupID, or on both? Since the groups are needed for each article, the groups will be queried every time the article is queried, so I am not sure if an index is needed.
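A common layout for a junction table like this is a composite primary key, which doubles as an index led by ArticleID (good for "groups for this article"), plus a second index led by GroupID for lookups in the other direction. A hedged sketch, where the parent table and column types are assumptions:

CREATE TABLE dbo.articleGroups (
    ArticleID int NOT NULL REFERENCES dbo.Articles (ArticleID),
    GroupID   int NOT NULL REFERENCES dbo.Groups (GroupID),
    CONSTRAINT PK_articleGroups PRIMARY KEY CLUSTERED (ArticleID, GroupID)
)

-- Covers "which articles are in this group" queries.
CREATE NONCLUSTERED INDEX IX_articleGroups_GroupID
    ON dbo.articleGroups (GroupID, ArticleID)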
I transfer tables from one server to the other on a regular basis. I'm doing this using the DTS Import Wizard (saved it as a package and scheduled a job). But I noticed that if the table structure changes on the source tables, the DTS package will fail.
I want to make a DTS package that imports 6 tables; how do I make it so that if the structure of the tables changes, it should still work? That is, every time the tables are transferred, the old tables on the target server must be dropped and re-created with the source server's table definition (see the sketch below). I hope this is doable.
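One pattern that sidesteps structure drift is to have the package drop each target table first (in an Execute SQL task) and let the copy step re-create it from the current source definition. A hedged sketch of the drop step for one table, with dbo.MyTable as a placeholder:

IF EXISTS (SELECT * FROM sysobjects
           WHERE id = OBJECT_ID(N'dbo.MyTable') AND xtype = 'U')
    DROP TABLE dbo.MyTable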
I need to drop and recreate indexes in some of my tables that are currently being replicated. I am not sure how this will affect my ongoing replication. Will this cause a problem for me? Please help.
I was trying to find out how much space is available in a SQL 2000 DB for allocation to tables and indexes, i.e. the amount of space that has to be used up before another allocation is automatically made to the database. I looked at sp_spaceused, but BOL is rather sketchy at defining what the numbers it returns really mean. Is the "unallocated space" value the one I am looking for?
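For reference, a minimal sketch of the call; running it with @updateusage first refreshes the figures, and "unallocated space" is the space reserved in the database files but not yet allocated to any table or index, which matches what is being asked:

EXEC sp_spaceused @updateusage = N'TRUE'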
I have a pretty large database with tables that will contain millions of rows. I will predominantly be using views just to select the data (I will not be performing any updates or inserts). I propose creating indexes on the views. My question is: if I create indexes on my views, do I have to create them on the tables as well? Is it good practice to create indexes on the tables by default, even if I am not going to run SELECT statements directly against the tables but only via my indexed views? Any advice is appreciated.
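A hedged sketch of an indexed view, with placeholder names and assuming the summed column is NOT NULL: the view must be created WITH SCHEMABINDING, must include COUNT_BIG(*) if it aggregates, and its first index must be unique and clustered. The base tables still need their own primary key/clustered index, but extra nonclustered indexes on the tables are not required just to serve queries answered by the indexed view:

CREATE VIEW dbo.vSalesTotals
WITH SCHEMABINDING
AS
SELECT ProductID,
       SUM(Amount) AS TotalAmount,
       COUNT_BIG(*) AS RowCnt      -- required for an indexed view with GROUP BY
FROM dbo.Sales
GROUP BY ProductID
GO
CREATE UNIQUE CLUSTERED INDEX IX_vSalesTotals ON dbo.vSalesTotals (ProductID)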
In SQL Server 2000 you could DBREINDEX every index that exists in a given database, and you can do the same in SQL Server 2005. But how can this be done with the new ALTER INDEX command? It does not allow me to pass in a variable for the object. Any ideas on how to get this done with ALTER INDEX in 2005? Thanks!
This I can't get to work:
DECLARE @TableName nvarchar(100)
SET @TableName = 'Account'

USE database;
GO
ALTER INDEX ALL ON @TableName REBUILD
GO
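ALTER INDEX won't take a variable for the object name, so the statement has to be built as a string and run dynamically; a minimal sketch:

DECLARE @TableName sysname
DECLARE @sql nvarchar(400)
SET @TableName = N'Account'
-- QUOTENAME guards against odd table names in the dynamic statement.
SET @sql = N'ALTER INDEX ALL ON ' + QUOTENAME(@TableName) + N' REBUILD'
EXEC sp_executesql @sql

To hit every table in the database, this can be wrapped in a cursor over sys.tables (or driven by sp_MSforeachtable).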
Hi! I have 2 tables (both have the same structure):
ID -> bigint (identity, not for replication, primary key)
Url -> nvarchar(1000)
MainUrl -> nvarchar(1000)
Tbl1 contains about 0.5 million records, and tbl2 about 1 million. What I need is to copy records from tbl2 to tbl1, but records in tbl1 are unique and can't change (only "Url" must be unique; ID too, but that's automatic). How can I do this fast? Right now I'm using a SELECT for each record in tbl2 to see if it exists in tbl1, but it's a bit slow... Is there a faster method? (One thing: I'm a beginner with databases, so I wrote a VB application to transfer the records. How can I do it using only Microsoft SQL Server?) I forgot to mention: I'm using MS SQL 2005.
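A set-based sketch for SQL 2005 that avoids the row-by-row lookups: insert, in one statement, only those URLs from tbl2 that tbl1 doesn't already have (assuming Url is what defines uniqueness, as described):

INSERT INTO tbl1 (Url, MainUrl)
SELECT t2.Url, t2.MainUrl
FROM tbl2 AS t2
WHERE NOT EXISTS (SELECT 1 FROM tbl1 AS t1 WHERE t1.Url = t2.Url)

If tbl2 itself contains duplicate URLs, de-duplicate the SELECT first, and an index on Url in both tables will speed up the existence check considerably.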
I have two tables. I want to transfer data from Table1 to Table2, but the target column depends on BRK_TITLE. For example, if BRK_TITLE is "Testing" then Table1.BRK_QUANT would go to Table2.Qty1; if BRK_TITLE = "My Testing" then it would go to Table2.Qty2, and so on. The problem is that the BRK_TITLE values change over time; they are not constant titles (e.g. a title may be 'abc', '134', etc.).
If data belongs to title3, then it would go to the corresponding quantity column, i.e. Table2.Qty3.
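If the titles were fixed, one CASE expression per target column would do it; since they change, a realistic route is to keep the title-to-column mapping in a small lookup table and build this statement dynamically. A hedged sketch of the fixed-title version, using the sample titles from the post:

INSERT INTO Table2 (Qty1, Qty2, Qty3)
SELECT CASE WHEN BRK_TITLE = 'Testing'    THEN BRK_QUANT END,
       CASE WHEN BRK_TITLE = 'My Testing' THEN BRK_QUANT END,
       CASE WHEN BRK_TITLE = 'title3'     THEN BRK_QUANT END
FROM Table1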
hi,
if exists (select * from sysobjects
           where id = object_id('dbo.MRDD_FINAL') and sysstat & 0xf = 3)
    drop table dbo.MRDD_FINAL
This code was generated when I used the "create a script" option to build a table from an existing table. Is there a way to check whether a table contains data or not? The whole idea is: if table A contains data, I need to truncate the table; otherwise I do nothing. Regards
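A minimal sketch of the conditional truncate; EXISTS stops at the first row it finds, so the check stays cheap even on a large table:

IF EXISTS (SELECT 1 FROM dbo.MRDD_FINAL)
    TRUNCATE TABLE dbo.MRDD_FINAL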
I have come across a database system which isn't designed to work optimally. It is fairly large (~400GB), and the performance of loading and querying is degrading (improper data types, fragmented indexes, a non-unique clustering key, and other problems). So I have quite a task in front of me, but I am up for the challenge. I figure this is not a unique situation; many of us must have come across this before. I have done this before too, but only for smaller databases; some of the operations here I expect to take a couple of hours or more to complete (depending on load, infrastructure speed, etc., I know).
My plan is thus:
+ Take a full backup of the database
+ Set the recovery model of the DB to Simple
+ Drop nonclustered indexes
+ Drop clustered indexes
+ Remove PKs (wrong data types, too large!)
+ Narrow data types (add a new column, update it in batches to the old value, rename the new column to the old name; see the sketch after this list)
+ Add PKs, which will create clustered indexes automatically based on the PK ID
+ Create nonclustered indexes
+ Run a SHRINKDB (in normal operations I would never do this, but this is a special case; ensure the log file is truncated to a logical size, especially after all those table modifications...)
+ Set the recovery model of the DB to Full
+ Ensure everything works OK or better
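For illustration, a hedged sketch of the "narrow data types" step, with placeholder names and assuming the old column is NOT NULL (otherwise the loop below would never finish): add the narrower column, backfill it in batches to keep each transaction and the log small, then swap the names:

ALTER TABLE dbo.BigTable ADD id_new int NULL
GO
-- Backfill 10,000 rows at a time until every row has been converted.
WHILE EXISTS (SELECT 1 FROM dbo.BigTable WHERE id_new IS NULL)
BEGIN
    UPDATE TOP (10000) dbo.BigTable
    SET id_new = CONVERT(int, id_old)
    WHERE id_new IS NULL
END
GO
-- Swap the columns; the old one can be dropped once verified.
EXEC sp_rename 'dbo.BigTable.id_old', 'id_old_drop', 'COLUMN'
EXEC sp_rename 'dbo.BigTable.id_new', 'id_old', 'COLUMN'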