Is it possible to turn off transaction logging when making a table schema change, for example when expanding a varchar field from 10 to 40 characters? This is occurring on a hosted site for a table with about 150,000 records. The db size is 200 MB. If I try this normally with transaction logging enabled, I get the dreaded 'log file is full' message, even if I truncate the log first.
I was thinking about doing the following instead:
taking the db offline
creating a backup
disabling logging
changing the schema
re-enabling logging
putting the db back online
If a problem occurs during the schema change, I would just restore from the backup. Please let me know the following:
Is it possible to turn off the transaction logging for the schema change?
Do you see any problems with the above approach?
Alternatives?
One alternative is to look into increasing the db size. A second is to add a new field (named temp) to the table, copy the old field to the new field, delete the old field, add another field (with the original field name) with the new schema, copy the temp field to the new field, and finally delete the temp field; this should require less space in the transaction log (a sketch follows below). Unfortunately, it could possibly affect some linked Access databases, since the field order changes.
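Here is a minimal T-SQL sketch of that second alternative, assuming the table is called MyTable and the field being widened is MyField (both names are illustrative); it uses sp_rename to avoid the second copy:

-- 1. add a temp column with the new, wider definition
ALTER TABLE MyTable ADD temp varchar(40) NULL

-- 2. copy the data across (still logged, but a smaller hit than a full table rebuild)
UPDATE MyTable SET temp = MyField

-- 3. drop the old column and rename the temp column back
ALTER TABLE MyTable DROP COLUMN MyField
EXEC sp_rename 'MyTable.temp', 'MyField', 'COLUMN'

If even the single UPDATE overflows the log, it can be broken into batches with SET ROWCOUNT in a loop, truncating the log between batches.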
I am running SQL Server 2000 Enterprise Edition on Windows NT 4.0 Service Pack 6. Whenever I run a long query, the transaction log seems to fill up. How do you turn this transaction log off? Please let me know ASAP.
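For reference, SQL Server has no switch that turns logging off entirely; the closest options on SQL Server 2000 are the SIMPLE recovery model, or truncating and shrinking the log. A hedged sketch, with MyDb and MyDb_Log as illustrative names:

-- keep the log from growing between checkpoints (SQL Server 2000 syntax)
ALTER DATABASE MyDb SET RECOVERY SIMPLE

-- or reclaim space in the existing log (this breaks the log backup chain)
BACKUP LOG MyDb WITH TRUNCATE_ONLY
DBCC SHRINKFILE (MyDb_Log, 100)   -- target size in MB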
I need to know if anyone has had success using any sort of tracking tool to manage schema in SQL 7. I'm responsible for several SQL servers, each with upwards of 10 databases, most still in the development phase. We use Erwin to store our models and Visual SourceSafe for stored procs and SQL scripts. But now, as a DBA, I'm looking to either:
(1) create a custom database to keep track of schema changes (as they move from development to test to prod), OR
(2) buy a 3rd-party tool to keep track of schema changes? Platinum?
It's getting to be too much for a single DBA to manage a busy development shop with 20 dbs. Schema changes are coming in too often, and only some of our dbs are in Erwin.
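As a starting point for option (1), a minimal sketch of a tracking table that works on SQL Server 7.0; all names are illustrative:

CREATE TABLE SchemaChangeLog (
    ChangeID     int IDENTITY(1,1) PRIMARY KEY,
    DatabaseName sysname     NOT NULL,
    ObjectName   sysname     NOT NULL,
    Environment  varchar(10) NOT NULL,  -- 'dev', 'test' or 'prod'
    ChangeScript text        NOT NULL,  -- the DDL that was applied
    AppliedBy    sysname     NOT NULL DEFAULT SUSER_SNAME(),
    AppliedOn    datetime    NOT NULL DEFAULT GETDATE()
)

Every script that moves from development to test to prod gets an INSERT into this table as its last statement, which at least gives one queryable history per server.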
I have created a table on SQL Server from SAS. The table gets created fine. However, the table schema has my user ID in it (AD-ENT\myuserid.Table1). How can I change the schema to dbo (dbo.Table1)? It's fine if I have to make this change in SQL Server Management Studio.
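Assuming SQL Server 2005 or later (implied by Management Studio), moving the table into the dbo schema is a single statement; adjust the bracketed schema name to match what SAS created:

ALTER SCHEMA dbo TRANSFER [AD-ENT\myuserid].Table1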
There is a report when you click the server name, then Reports, and run Schema Change History. I have had my SQL 2005 instance running for a few weeks and there are many entries listed from the day it started. Is there a way to recycle this and clean it up on a weekly basis?
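That report is built on the default trace, which recycles on its own as its files roll over, so the history trims itself; if the goal is a weekly view instead, the trace can be queried directly and filtered by date. A sketch, assuming the default trace is enabled:

-- find the current default trace file
DECLARE @path nvarchar(260)
SELECT @path = path FROM sys.traces WHERE is_default = 1

-- object created/altered/deleted events from the last 7 days
SELECT t.StartTime, t.DatabaseName, t.ObjectName, t.LoginName
FROM fn_trace_gettable(@path, DEFAULT) AS t
WHERE t.EventClass IN (46, 47, 164)   -- Object:Created, Object:Deleted, Object:Altered
  AND t.StartTime > DATEADD(day, -7, GETDATE())
ORDER BY t.StartTime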
Simple question, I hope. I need to add a column to a table of a database that is mirrored. How do I have to do that? Do I need to stop mirroring, or is it sufficient to simply pause mirroring? If I make the change on the principal db, what do I need to do to make the same change on the mirror?
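As far as I know, DDL against the principal is written to the transaction log and shipped to the mirror like any other change, so mirroring can stay running and nothing needs to be done on the mirror side; a sketch with illustrative names:

-- run on the principal only; the mirror applies it through the log stream
ALTER TABLE dbo.MyTable ADD NewColumn varchar(50) NULL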
How can I keep schema changes in the subscriber's database when I replicate the snapshot from the publisher?
I just want to move data from a remote server, but it seems that the tables are being dropped, which is not good, as we use the subscriber as a development box.
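The drop happens because a snapshot article's pre_creation_cmd property defaults to dropping the subscriber object. A hedged sketch of changing it, with publication and article names as placeholders (a new snapshot is typically needed afterwards):

-- keep the existing subscriber table instead of dropping it
EXEC sp_changearticle
    @publication = N'MyPublication',
    @article     = N'MyTable',
    @property    = N'pre_creation_cmd',
    @value       = N'none'   -- alternatives: 'delete', 'truncate', 'drop'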
Is it possible to create a trigger to capture schema changes (ALTER TABLE, DROP TABLE) only for certain tables? I don't want schema changes captured for the entire database.
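DDL triggers (SQL Server 2005 and later) fire at database scope, but the body can filter on the table name from EVENTDATA(), which gives the per-table effect. A sketch; the audit table and the table list are illustrative:

CREATE TRIGGER trg_CaptureSchemaChanges
ON DATABASE
FOR ALTER_TABLE, DROP_TABLE
AS
BEGIN
    DECLARE @evt xml, @obj sysname
    SET @evt = EVENTDATA()
    SET @obj = @evt.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(128)')

    -- only audit the tables we care about
    IF @obj IN (N'Customers', N'Orders')
        INSERT INTO dbo.SchemaChangeAudit (EventType, ObjectName, CommandText, EventTime)
        SELECT @evt.value('(/EVENT_INSTANCE/EventType)[1]', 'nvarchar(128)'),
               @obj,
               @evt.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'nvarchar(max)'),
               GETDATE()
END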
I am working on a dtsx package where I am sending data from an OLE DB Source (SQL Server) to an OLE DB Destination (Oracle). For development purposes I use the DEVELOPMENT environment on Oracle, but for unit testing I have to use QA or some other schema. When I use the DEVELOPMENT schema in the OLE DB destination, tables are accessed under the schema name, e.g. "DEVELOPMENT"."EMPLOYEE", but when I change the schema name to QA, the table names do not change to "QA"."EMPLOYEE". The Data Flow Task keeps pushing the data to the DEVELOPMENT environment only.
Can anyone suggest a remedy for this? Or is this one more BUG in SQL Server 2005?
I am still having a problem making a view automatically update itself when the underlying table schema changes. Running sp_recompile on the view's table doesn't seem to work either, as I am still getting the old format from the view (in Design mode the view returns the right info, but not when I open the view by doing Open View) even though the underlying schema has changed. Right now I find that I have to go into the view and change it a bit to force a recompilation.
And even if sp_recompile did work, it would require that I manually run it each time I change a table. Any ideas?
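For reference, sp_refreshview exists for exactly this; it rebinds the metadata of a non-schema-bound view after the underlying tables change (view name here is illustrative):

EXEC sp_refreshview 'dbo.MyView'

To avoid running it by hand after every table change, it can be looped over all views in the database, or (on SQL Server 2005 and later) fired from a DDL trigger on ALTER_TABLE.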
I'm admittedly a bit new to the world of replication, so please bear with me. I've got two SQL Server 2000 servers running in different locations. Server A does transactional replication over a push subscription to server B. If I need to make a minor change to one of the replicated tables (for example, dropping a no-longer-used column or changing a varchar field's length), do I need to drop the subscription, make the changes, and then re-initialize the schemas and data?
For minor changes, I really hate having to knock out the site running off server B while the subscription is re-initialized and data is bulk copied back over. If I want to just make the changes manually on both servers, will that cause problems down the line?
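SQL Server 2000 has dedicated procedures for column changes on replicated tables that avoid reinitializing, though they cover adding and dropping columns rather than resizing one. A hedged sketch, run at the publisher, with illustrative names:

-- add a column to a published table and propagate it to subscribers
EXEC sp_repladdcolumn
    @source_object = N'MyTable',
    @column        = N'NewCol',
    @typetext      = N'varchar(40) NULL'

-- drop a no-longer-used column from a published table
EXEC sp_repldropcolumn
    @source_object = N'MyTable',
    @column        = N'OldCol'

Making the changes manually on both sides tends to bite later, because the publisher's article definition no longer matches what is actually in the tables.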
Question re merge replication (pull) and processing order. We have a group of changes associated with an app upgrade, and the scripts run fine on the publisher. Part of the change includes the creation of a new table, followed by altering a view to use the new table. Following the change at the publisher, when the sync is kicked off from the subscriber, it fails: the alter of the view throws an 'invalid object' error with regard to the new table. It seems as if the view alter is attempted before the dependent table has been created.
I have tried to amend the processing order of the view using sp_changearticle, which executes (quickly) with a 0 return code, but it is to no avail; the error still occurs. Is it possible to change the processing order for a view article so that it will be applied to schema changes?
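One possibility, offered tentatively: sp_changearticle applies to transactional publications, which may be why it returns success without changing merge behavior; for a merge article the counterpart is sp_changemergearticle, and its processing_order property controls the order in which articles are applied (lower values first). A sketch with placeholder names:

EXEC sp_changemergearticle
    @publication = N'MyMergePub',
    @article     = N'MyView',
    @property    = N'processing_order',
    @value       = N'100',
    @force_invalidate_snapshot = 1,
    @force_reinit_subscription = 0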
One of our databases has at some point in its dark past had the owner of the guest schema changed to be a named user, rather than the default guest user. Correcting this feels like it would be easy enough by running the following...
ALTER AUTHORIZATION ON SCHEMA::guest TO guest
...but that results in:
Msg 15150, Level 16, State 2, Line 3
Cannot alter the schema 'guest'.
I realise the guest schema is a special one, and cannot be dropped, but I'm not trying to do that. End goal is to export the database to a SQL Azure DB, and this guest schema assignment is blocking that process from completing.
Locally I develop in SQL Server 2005 Enterprise. Recently I recreated my db on the server of my hosting company (SQL Server 2005 Express); I basically recreated the tables and copied the data into them. I now receive the following error when I hit the DB:
The 'System.Web.Security.SqlMembershipProvider' requires a database schema compatible with schema version '1'. However, the current database schema is not compatible with this version. You may need to either install a compatible schema with aspnet_regsql.exe (available in the framework installation directory), or upgrade the provider to a newer version.
I heard something about running aspnet_regsql.exe, but I don't have that access to the DB. Also, I don't know if this command does anything more than creating the membership tables and filling them with some default data... Any other solutions/thoughts on what this could be? Thanks!
Hello everybody! I'm using ASP.NET 3.5 and MSSQL 2005. I bought virtual web hosting. On new user registrations I get an error:
The 'System.Web.Security.SqlMembershipProvider' requires a database schema compatible with schema version '1'. However, the current database schema is not compatible with this version. You may need to either install a compatible schema with aspnet_regsql.exe (available in the framework installation directory), or upgrade the provider to a newer version.
On my virtual machine it works fine, but on the web hosting I get the error. What can you propose?
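Both of the membership errors above generally mean the aspnet_* tables (and the version row they carry) are missing from the new database. If the host allows remote connections, aspnet_regsql.exe can install the schema from your own machine; the server name and credentials below are placeholders:

aspnet_regsql.exe -S myHostServer -U myDbUser -P myDbPassword -d myDatabase -A mr

-A mr installs the membership and role-manager schema; -A all installs every ASP.NET feature. It creates tables, stored procedures, and version rows, and should not touch your own tables.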
I would like to use SSIS tool to move the data from one database schema to another database schema.
For example:
Source table has
1. UserName (varchar 20) (no null)
2. Email (varchar 50) (can be null)
Destination table has
1. UserID (uniqueidentifier - GUID)
2. UserName (varchar 50) (no null)
3. EmailAddress (nvarchar 50) (can be null)
4. DateTime
Questions:
1. What controls do I use in my Data Flow to move data between databases with different data types, and to include a new value in UserID as a new GUID and DateTime as a date (GETDATE)?
OLE DB Source, OLE DB Destination, Data Conversion and .....
How do I insert a GUID and a date at the same time? (A sketch follows below.)
2. I have many tables to move data for. Any suggestions? How do I architect my project? If I create a separate data flow for each table, it will look complicated.
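One way to sidestep generating the GUID and date inside the data flow is to let the destination table supply them as defaults, so the OLE DB Destination only maps UserName and EmailAddress. A sketch, assuming the destination table can be altered (table and constraint names are illustrative):

-- defaults fire for any column the data flow leaves unmapped
ALTER TABLE dbo.DestUsers
    ADD CONSTRAINT DF_DestUsers_UserID DEFAULT NEWID() FOR UserID
ALTER TABLE dbo.DestUsers
    ADD CONSTRAINT DF_DestUsers_Date DEFAULT GETDATE() FOR [DateTime]

Inside the package itself, a Data Conversion transformation covers the varchar(20)-to-varchar(50) and varchar-to-nvarchar widening; the SSIS expression language has GETDATE() for a Derived Column but no NEWID(), so generating the GUID in the pipeline takes a Script Component. As for the many tables, one data flow per table is the normal pattern; a parent package that runs them in sequence keeps it manageable.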
In the process of migrating a big db from server 1 to server 2, we had to roll back the change. I started by taking a full db backup and restoring it on server 2 with NORECOVERY, then a couple of logs with NORECOVERY, and then the last log with RECOVERY.
Is there some way to continue this chain now? I mean, to change the db back to NORECOVERY, or some other way to restore logs.
I don't want to do a new full backup.
If I try to do a log restore now I get the message:
Msg 3117, Level 16, State 4, Line 1
The log or differential backup cannot be restored because no files are ready to rollforward.
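For reference, once a database has been recovered (the final RESTORE ... WITH RECOVERY), the roll-forward stops for good and only a new backup can restart the chain. The way to keep the door open is to end every step WITH NORECOVERY, or WITH STANDBY if the database must be readable between restores. A sketch, with file names illustrative:

RESTORE DATABASE MyDb FROM DISK = 'C:\backups\mydb_full.bak' WITH NORECOVERY
RESTORE LOG MyDb FROM DISK = 'C:\backups\mydb_log1.trn' WITH NORECOVERY
RESTORE LOG MyDb FROM DISK = 'C:\backups\mydb_log2.trn'
    WITH STANDBY = 'C:\backups\mydb_undo.dat'   -- readable, yet more logs can follow
-- only when no further logs will ever be applied:
RESTORE DATABASE MyDb WITH RECOVERY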
I'm getting this when executing the code below. Going from W2K/SQL2k SP4 to XP/SQL2k SP4 over a dial-up link.
If I take away the BEGIN TRAN and COMMIT it works but, of course, if one statement fails I want a rollback. I'm executing this from a Delphi app, but I get the same from Query Analyzer.
I've tried both with and without the SET XACT_ABORT, and also tried with SET IMPLICIT_TRANSACTIONS OFF.
SET XACT_ABORT ON
BEGIN DISTRIBUTED TRAN

UPDATE OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TRANSACTIONMAIN
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE TSADMIN.TRANSACTIONMAIN
SET REPFLAG = 0 WHERE REPFLAG = 1 AND DONE = 1

UPDATE OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.WBENTRY
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE TSADMIN.WBENTRY
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.FIXED
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE TSADMIN.FIXED
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.ALTCHARGE
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE TSADMIN.ALTCHARGE
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TSAUDIT
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE TSADMIN.TSAUDIT
SET REPFLAG = 0 WHERE REPFLAG = 1

COMMIT TRAN
It's got me stumped, so any ideas gratefully received. Thx
I have designed an SSIS package for an ETL process. In my package I have to read the data from some tables and then insert it into another table of the same structure.
For reading the data I have written dynamic T-SQL based on some conditions; it uses 25 different functions to populate the data into 25 different columns. The T-SQL returns correct data and works fine in Enterprise Manager, but in my SSIS package it shows me a timeout ERROR.
I have increased and decreased the timeout to catch the error, but it is still there; I have also tried setting the CommandTimeout property to 0.
If I use 0 for CommandTimeout, then I get 'Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.'
and
Failed to open a fastload rowset for "[dbo].[P@@#$%$%%%]". Check that the object exists in the database.
I am getting this error: 'Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.' Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. Exception Details: System.Data.OleDb.OleDbException: Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction. Does anybody have an idea?!
I have a sequence container, and in my sequence container I have a script task to drop the existing tables. This sequence container is connected to another sequence container, and all of these are in a For Each Loop container. When I run the package it works fine for the first loop, but it gives me an error on the second execution.
The message is like this:
Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.
I am getting this error: "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."
My transactions are done using a LINKED SERVER. When I manually call the stored procedure from Server 1 it works, but when I call it through Service Broker it doesn't work and gives me this error.
I used SSEUtil to add a schema to my database but I am having problems. I used these steps:

SSEUtil -c
> USE "c:\Rich.mdf"
> GO
> !RUN Resume.SQL
//indicates success
> SELECT * FROM SYS.XML_SCHEMA_COLLECTIONS
> GO
//schema not shown in list
> USE master
> GO
> SELECT * FROM SYS.XML_SCHEMA_COLLECTIONS
> GO
//schema is shown in the query

It appears that the schema is not added to the desired database, so when I try to use the schema in Visual Studio, it does not appear when I connect to the Rich.mdf database. Any ideas on what I am doing wrong or why this might be happening?
Thanks
Kevin