Losing Unique Index When Exporting To Another SQL Server
May 1, 2007
I have 2 machines, each running the same version of SQL Server. I'm trying to use DTS to export my data from one to the other, and it works fine, except for the fact that I lose my unique index on a field. Is there a way I can keep this from happening, or am I going to have to recreate it each time? I wouldn't think I would have to; it seems like bad design to me.
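If it does have to be recreated each time, the statement itself is at least short. A minimal sketch, assuming the table is dbo.MyTable and the field is MyField (both names hypothetical):

-- Recreate the unique index that the transfer dropped, on the destination server
CREATE UNIQUE NONCLUSTERED INDEX ix_MyTable_MyField
    ON dbo.MyTable (MyField);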
I'm fairly new to MS SQL and I have a problem copying tables from one database to another. When I do an import or export and select Copy tables, the tables and the data get transferred, but all the Identities are set to No. My databases have 188 tables, and it's a pain to reset all of them manually.
Is there a setting I don't know about that will transfer the tables and maintain the Identity property? (I'm using SQL Server Management Studio 9.0)
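A hedged sketch of the reload step, assuming the destination table was recreated with its IDENTITY column intact (e.g. via Generate Scripts rather than the copy wizard); all table and column names are hypothetical:

-- Reload the data while preserving the original identity values
SET IDENTITY_INSERT dbo.MyTable ON;
INSERT INTO dbo.MyTable (Id, Name)
    SELECT Id, Name
    FROM SourceDB.dbo.MyTable;
SET IDENTITY_INSERT dbo.MyTable OFF;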
Msg 2601, Level 14, State 1, Procedure DFP_report_load, Line 161
Cannot insert duplicate key row in object 'dbo.DFP_Reports_History' with unique index 'ix_report_history_creative_id'. The duplicate key value is (40736326382, 1, 2015-07-03, 67618862, 355324).
Msg 3621, Level 0, State 0, Procedure DFP_report_load, Line 161
The statement has been terminated.
Exception in Task: Cannot insert duplicate key row in object 'dbo.DFP_Reports_History' with unique index 'ix_report_history_creative_id'. The duplicate key value is (40736326382, 1, 2015-07-03, 67618862, 355324).
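One way to make the load idempotent is to guard the insert so rows already present under the unique index are skipped. This is a hedged sketch, not the actual procedure; the staging table and the key column names are guesses based on the duplicate key value in the message:

-- Skip rows whose unique key already exists in the history table
INSERT INTO dbo.DFP_Reports_History (creative_id, line_item_id, report_date, order_id, ad_unit_id)
SELECT s.creative_id, s.line_item_id, s.report_date, s.order_id, s.ad_unit_id
FROM dbo.DFP_Reports_Staging AS s
WHERE NOT EXISTS (
    SELECT 1
    FROM dbo.DFP_Reports_History AS h
    WHERE h.creative_id  = s.creative_id
      AND h.line_item_id = s.line_item_id
      AND h.report_date  = s.report_date
      AND h.order_id     = s.order_id
      AND h.ad_unit_id   = s.ad_unit_id
);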
Hi, I'm working on a project that connects to a database on every view of a site. It works fine for a few hours, but eventually it just stops connecting to the database. There's nothing in the application logs to suggest where to look; it seems like the connections are just dying. I am creating the connection in a pretty normal manner:

using (SqlConnection myAwesomeConnection = new SqlConnection(myAwesomeConnectionString))
using (SqlCommand myAwesomeCommand = new SqlCommand(myAwesomeSQL, myAwesomeConnection))
{
    myAwesomeCommand.CommandType = CommandType.Text; // Otherwise known as awesomeText
    myAwesomeConnection.Open();
    myAwesomeCommand.ExecuteScalar();
}

I also use a SqlDataAdapter and fill it for grabbing some of my more complex data, but it's all happy System.Data.SqlClient stuff. But yeah, the thing seems to just give up after a few hours. Everything I have read about connection pooling in ADO.NET says that the connections should be reused. And yes, I believe I am closing all of my connections after I use them. The site is getting, oh, 7500 hits an hour maybe? Somewhere around that, maybe more, maybe a little less. Anyone have any ideas?
This is an infrequent problem, but very annoying when it happens. Every so often I go into Enterprise Manager only to find that all of my SQL Server registrations have disappeared from the group, and I have to add them all again. Since it happens so infrequently, I don't know if there's something I'm doing on my machine to make them all disappear.
We have a clustered SQL Server 2000 installation on Windows 2000. We recently lost the select into/bulkcopy option from a database. The most likely candidate appears to be a failover just before - after this happened, the setting disappeared. Any ideas?
Thanks
Tom
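Re-enabling the option is straightforward, in case that helps; a sketch using SQL Server 2000's sp_dboption (the database name is hypothetical):

-- Turn the select into/bulkcopy option back on
EXEC sp_dboption 'MyDatabase', 'select into/bulkcopy', 'true';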
I am supporting a system that needs to allow users to have access to TempDB.
I set these users up using the GUI, but whenever the server is restarted these users' permissions are wiped out - the db_owner permission is lost and I have to go in manually and re-apply the permissions for the database to work again. It happens on most reboots, but not all.
Is there any way to keep these users' permissions when the server is rebooted?
Your help is most appreciated.
P.S. Could I create a stored procedure that, whenever the server is rebooted, would recreate these permissions?
If I need to do this, how would I go about it?
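On the P.S.: yes, a procedure in master can be flagged to run at every startup via sp_procoption. A hedged sketch, with a hypothetical user name (CREATE USER is the 2005 syntax; on SQL 2000 it would be sp_grantdbaccess):

-- Procedure that re-creates the tempdb user after each restart
USE master;
GO
CREATE PROCEDURE dbo.usp_RestoreTempdbPermissions
AS
BEGIN
    EXEC ('USE tempdb;
           CREATE USER [MyDomain\AppUser] FOR LOGIN [MyDomain\AppUser];
           EXEC sp_addrolemember ''db_owner'', ''MyDomain\AppUser'';');
END;
GO
-- Flag it to run automatically whenever SQL Server starts
EXEC sp_procoption N'dbo.usp_RestoreTempdbPermissions', 'startup', 'on';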
I have two machines: one running Windows Server 2003 and the other Windows XP Pro.
The 2003 machine has SQL Server 2000 Standard Edition, whereas the XP Pro machine has MSDE 2000.
I'm trying to run a Windows-based app from the 2003 server, and it connects to a database on the XP Pro PC.
When running the application I keep getting the following exception:
"A transport-level error has occurred when sending the request to the server (Provider: Named Pipes Provider, error: 0 - The specified network name is no longer available)"
After using Performance Monitor on the XP machine, it seems that when the app starts and the user logs in, it makes a SQL connection, but then the connection is lost and the exception pops up. I click Continue and try again and it works; then a couple of seconds later, while using the system, it pops up again. The Performance Monitor test shows that the SQL connection keeps being lost. The network is fine; it has something to do with SQL and Named Pipes.
When running the app on the XP machine with the connection made to the 2003 machine, there is no problem.
There are other connections made to the Windows XP machine, but I doubt they affect the app.
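One thing worth trying, since the failures point at Named Pipes: force the client onto TCP/IP in the app's connection string. A sketch, with hypothetical server and database names ("Network Library=DBMSSOCN" selects the TCP/IP sockets library):

Server=XPPROBOX;Database=MyAppDb;Integrated Security=SSPI;Network Library=DBMSSOCN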
I have a stored procedure, and one of the fields in the SELECT is a datetime type. On SELECT I get the date and the time, but no AM/PM marker. The datetime in the query is returned in this format: 2008-03-21 08:37:20.000. Any help is appreciated.
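If the goal is an AM/PM marker in the output, CONVERT with style 100 (or 109) produces one. A small sketch with hypothetical column and table names:

-- Style 100 renders e.g. 'Mar 21 2008  8:37AM'
SELECT CONVERT(varchar(30), my_datetime_col, 100) AS display_value
FROM dbo.MyTable;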
We're using SQL Server 7 with an Access 2000 MDB as a front end with ODBC linked tables. I recently created a new set of tables for the app, and users are complaining that unsaved data is being lost when they move to a new record. This seems to be the case when there are multiple users; when there is a single user using it, we don't seem to have that problem.
It seems that we had this problem when we first converted from an MDB back end to a SQL 7 back end, years ago, but we haven't had this problem in a while. These are the first "entirely new" tables created in several years, and we seem to be having that problem again.
Is this something with SQL 7 when it's dealing with new tables? Any ideas on what to do?
Thanks!
Neil
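One commonly suggested check for Access front ends over ODBC: make sure each new SQL Server table has a timestamp column, which Access uses to detect concurrent record changes reliably. A sketch with a hypothetical table name (re-link the tables in Access afterwards):

-- Add a timestamp (rowversion) column so Access/ODBC can track record changes
ALTER TABLE dbo.MyNewTable ADD RowVer timestamp;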
I'm a newbie to both Windows Forms development and SQL Server CE, so please bear with me...
I am developing a vocabulary builder application, and my data keeps reverting to the data set I originally imported. I keep losing the records I've added, whether through the application I'm developing or from within VS2005. I know about the 'Copy to Output Directory' option on the database file in Solution Explorer, and I set it to 'Do not copy'.
What gives? Not sure what other information you might need, but I suspect it's something really basic... Thanks!
A UNIQUE INDEX must inherently impose a unique constraint, and a UNIQUE CONSTRAINT is most likely implemented via a unique index. So what is the difference? When you create one in Enterprise Manager, you must select one or the other.
What's the difference in effect between the following:

CREATE UNIQUE NONCLUSTERED INDEX titleind ON dbo.titles (title);

and

ALTER TABLE dbo.titles ADD CONSTRAINT titleind UNIQUE NONCLUSTERED (title);
I found there are two settings in the Indexes/Keys dialog box of Management Studio: Is Unique, and Type. The DDL statements above are generated by setting Is Unique to Yes with Type set to Index, and by setting just Type to Unique Key, respectively. What's the difference between them?
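One concrete difference between the two forms above: an index that backs a UNIQUE constraint cannot be removed with DROP INDEX; it has to go through the table. A sketch against the same dbo.titles example:

-- Works only for the plain unique index
DROP INDEX titleind ON dbo.titles;

-- Required form when the index enforces a UNIQUE constraint
ALTER TABLE dbo.titles DROP CONSTRAINT titleind;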
When I copy tables from one database to another (using the DTS Wizard) I lose my settings - primary keys and default values! Any help would be appreciated. Thanks.
This isn't really DB related, except for the connection, but I was wondering if any of you had run into this before.
I have an ODBC System DSN set up for a connection to my MSSQL2K server. My users will occasionally lose the Default Database setting (the drop-down during DSN setup), forcing me to go in and reset it. It's getting to be a pain in the ass.
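A possible workaround, sketched here: set the default database on the SQL login itself, so it matters less if the DSN drop-down resets (login and database names hypothetical):

-- SQL Server 2000: change the login's default database
EXEC sp_defaultdb 'MyLogin', 'MyDatabase';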
Following on from some problems I have been having with flat files, I seem to be stumbling from one issue to another.
I have coded my text file with some very contrived delimiters to ensure that my real data isn't tripping up the package. I am stripping all line break chars, tabs etc.
Now I seem to get all rows imported, but numerous integer columns within the file are being set to zero.
When I toggle the Keep Nulls option in the data flow task, it will either import the row, but with the zeros, or not import the row at all. I have opened the file in text editors and in Excel and it all seems fine - in fact, the same source file works OK with DTS/2000.
At a guess, it seems as though these columns are seen as null by the package, so with the option switched on they default to zero - but they are most certainly not null in the file!! The table is a staging table and so is very basic in structure (no defaults or constraints).
SSIS seems a bit buggy, or at the very least oversensitive, to me - I am at the point of abandoning it!!
Hi, I am having a problem when I try to execute an SSIS package from a job in SQL Server 2005. First of all, when I create a new connection in the designer, I fill in server, user, and password, and the connection tests successfully. Then I execute the whole package and it works fine. After building it, I tried to execute the package using the file that had just been created in the bin folder, but the connections didn't have the password attribute, so after the user id I added ;password=xxxx to every connection and it worked fine. But every time I want to execute it, I have to type the password in again. I also tried editing the file the designer created, but when I open it again, the password isn't there. Finally, I tried to create a job that executes this file. I have to type the password in, just as I do when executing directly from the file, but the job won't run - it has an authentication problem. If I open the step configuration again, the password is not there, the same as every time, and now I can't run the package at all.
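The symptoms match the package ProtectionLevel default, EncryptSensitiveWithUserKey, which discards saved passwords when anyone other than the author (including the Agent account) loads the package. A hedged example of supplying the password at run time with dtexec; the path, connection manager name, and password are hypothetical:

dtexec /F "C:\packages\MyPackage.dtsx" /SET "\Package.Connections[MyConnection].Properties[Password];xxxx"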
Hi - I am attempting to run an SSIS package through a SQL Agent job step. The package is fairly basic: I run some Execute SQL tasks, build an ADO recordset, enter a For-Each loop, and perform some table updates based on the ADO recordset. So far so good.
Approximately 1/3 of the records in my For-Each loop are processed, then I get a "Failed to acquire connection" error. All I am doing in each iteration of the For-Each loop is updating one record in a table based upon the current value being processed from the ADO recordset.
I am running the job on a 64-bit box, version 9.00.3050.00. I have also run this on two other 64-bit boxes (versions 9.00.1399.06 and 9.00.1406.00) and had the same issue. The owner of the Agent job is a sysadmin on the box. I don't understand why some items would be processed and then the connection drops midway through processing. What's even odder is that the connection is dropped at the same point every time I run it.
The only connection defined in my package is of provider type "Native OLE DB\SQL Native Client", and I am using Windows authentication. The version of Visual Studio that I developed this package in is 8.0.50727.42. Please help.
I have written a little bit of VB.NET code that basically takes three strings, transforms them, and returns a single string to be stored in my table.
I am running into a strange problem, however... for some reason, a number of my processed rows are missing a pair of double quotes (").
The vast majority of the records are formatted properly, and have the double quotes in the expected locations.
The most frustrating thing about it is that I have included the offending input strings in my Test.sql test script, and when I step through the entire routine, the return value is perfect...
I apologize for being the worst ever at posting questions here; please let me know if I can add anything.
I am facing a weird problem while sync'ing. When I do a synchronize from a mobile device to the server DB, one of my table columns fails to update while all the other columns get updated.
The table uses row-level tracking (I tried making it column-level too), and the field that fails to update is a date field (nullable).
This happens when syncing after updating more than one record on the device. When syncing after updating just one record on the device, the date field gets updated as expected.
Note: I have filters set for this table. The filter downloads only rows where this date field = null.
I need to download all records meeting this filter condition, but at the same time the filter should not be applied while uploading changes, as I think this is where the problem is.
If anyone has faced this and got a solution, please let me know.
We have several servers running SQL Server 2005, each of which was upgraded from SQL Server 2000. Each server has about 50 databases on it. Each database has a table with the same name, containing login information for the database the table resides in.
We have a VB 6.0 application that allows our Help Desk to do a SELECT on the table that contains login information. All it does is a SELECT; no UPDATEs, DELETEs, INSERTs, or DDL operations are performed by this application. The application logs into the databases using a Security Group login (let's call it MyGroup). Each member of the Help Desk team is a member of MyGroup.
About twice a month, the MyGroup security group loses SELECT permissions on all of the databases on a server. The server affected usually is different each time, but sometimes the same server will be affected two times in a row.
So far, we just run a script to update the SELECT permissions on the table in each of the databases, and this takes care of the problem (for now). But the problem seems to be recurring regularly.
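For reference, a sketch of the kind of re-grant script we run, assuming the common table is named dbo.LoginInfo (sp_MSforeachdb is the undocumented per-database helper):

-- Re-grant SELECT to the group in every database that has the table
EXEC sp_MSforeachdb 'USE [?];
    IF OBJECT_ID(''dbo.LoginInfo'') IS NOT NULL
        GRANT SELECT ON dbo.LoginInfo TO MyGroup;';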
What would make the server lose all permissions for a particular security group? The other logins continue to work fine. Only this one security group seems to be affected.
When I restore a set of databases from backup, all login information is lost. The front end of the application provides a way to create new users, but it is a very long process. I am looking for a script or stored procedure that will help re-attach the users to the database.
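The usual fix for restored databases is re-linking the orphaned database users to the server logins. A sketch, with hypothetical database and user names:

-- List users whose SIDs no longer match a server login
USE MyRestoredDb;
EXEC sp_change_users_login 'Report';

-- Re-link an orphaned user to the login of the same name
EXEC sp_change_users_login 'Auto_Fix', 'MyUser';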
I backed up and restored a database successfully except for the users. The only user that shows is dbo, but I can't add a user with the proper user name from my old server; I get the error that the user or role already exists in the current database.
When I go to roles, under public, there is the user I need to add. I can't remove it as it's a table owner, but I need to add it to the list of users, so I need to remove it and then re-add it.
I tried using EXEC sp_changeobjectowner to no avail.
I'm testing the use of application roles for security. The customer I work for still has a lot of ASP intranet applications running. We're migrating the databases to a SQL Server 2005 server.
I've changed the connection string to a user with no permissions other than to log on. After that I use an application role for permission to select from different tables and to execute stored procedures.
The first queries execute, but after that I get "Permission denied", as if I no longer have the application role.
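This matches a known interaction with ADO connection pooling: when a pooled connection is reset and reused, the application-role context set by sp_setapprole is gone, so the role has to be activated again on every connection. A sketch of the activation call (role name and password hypothetical):

-- Must run once per connection, right after it is opened
EXEC sp_setapprole 'MyAppRole', 'MyPassword';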
On my SSIS package I have 2 connection managers, both connecting to an Oracle db. I enter the ID and password, then click 'Test Connection', and I'm able to connect to the database fine. I then go to the data flow of one of my control flow tasks, open up my OLE DB Source, and click Preview. The SQL query executes with no issues, and I can see the data from my connection manager source.
I then go and run the package and I get this error message:
[OLE DB Source [1]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "OracleConnection" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
I've been racking my brain on this for a few days and still have no luck getting this to work.
We have a server running SQL Server 2000 SP4 with a database that's being replicated. There are 17 subscribers running MSDE SP4 using merge replication. Replication is started manually.
Initially we tested this with two subscriptions and everything went well, but now, for the past 3 months, we have been facing a weird problem while sync'ing. We have massive data loss on records that were inserted at the subscribers. Records seem to disappear, but only records that have a foreign key constraint. What I mean is that, for example, a record is inserted in the table that holds our client records with primary key 'ClientID', and then a record is inserted in a table of actions for that client with a foreign key 'ClientID' referring to the client table. After sync'ing, the client record is inserted correctly in the database on the publisher, but the records in the actions table are gone.
As far as I know the tables are correctly formed, with identity set to NOT FOR REPLICATION and so on. In short, I can't find any problem, especially since it doesn't always happen.
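For reference, the form usually recommended for merge replication: foreign keys created NOT FOR REPLICATION so the merge agent's inserts are not rejected at the publisher. A sketch with hypothetical table and constraint names:

-- FK that the merge agent is allowed to bypass
ALTER TABLE dbo.ClientActions
    ADD CONSTRAINT FK_ClientActions_Clients
    FOREIGN KEY (ClientID) REFERENCES dbo.Clients (ClientID)
    NOT FOR REPLICATION;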
If anyone has faced this and got a solution, please let me know. Thanks.
1. Create Temp Table (Execute SQL)
2. Load Temp Table (Execute SQL)
3. Data Flow Task using the temp table
4. Data Flow Task using the temp table
5. Drop Temp Table (Execute SQL)
Tasks 3 & 4 are currently setup to run parallel/concurrently - meaning they both branch directly off of task #2. They both also lead to task #5. Also, my connection manager is set to retain its connection.
Here's the problem. When executing this flow, task #3 will sometimes fail, other times it will succeed - without me changing anything between runs (I simply end debugging and run it again). When it does fail it is always because it cannot find the temp table. One thing to note is that task #4 (running at the same time as #3 and using the same temp table) always finishes first because it does quite a bit less work. Last thing to note: if I set up task 4 to run after 3 it works everytime, but of course it takes much longer for the total package to execute.
How is it that the task can find the temp table on some runs, but not on others? If this is just quirky behavior that I am going to have to live with for the time being, is there a way to force a task to re-run X number of times before giving up and moving on?
I've experienced a problem (bug?) when deploying reports to the report server. My users kept complaining about losing authentication to reports whenever I deploy a new report. I found the problem, but I don't understand why it is happening. My colleague developed the earliest reports and logs in with a different user on Remote Desktop to develop them. When a report is deployed, the data sources are deployed too.
Now when I log in on Remote Desktop with a different user than my colleague, open the same report project, and check the data sources, the credentials are gone...?! So when I deploy a report, the data source with the empty credentials is deployed too, which causes all the authentication problems.
My question: how come I can't see the user credentials my colleague filled in? When I open the report project the credentials are gone; I log out, my colleague logs in, and the credentials are back.
I suppose this has something to do with Remote Desktop and the users connecting to it?
I have written a simple SQL Server 2005 (SSIS) package to pull some data from Oracle (using ODBC) and pump it into SQL Server. When I run it from the server in debug mode in VS, it works fine. When I schedule the job, it errors out with "ORA-01005: null password given; logon denied." The password is there. Has anyone experienced this? Is there a security setting somewhere preventing me from saving passwords? Is there a workaround? Thanks.
Hi, report data sources keep losing their credentials. We use custom data sources and store the credentials on the server, yet for some reason the credentials get removed and have to be re-entered.