What Is The Impact Of Adding A Column To A Big Table?
Dec 10, 2007
Hello, everyone:
I want to add a column, INT NOT NULL DEFAULT 0, to a table. There are about 9 million records and 57 columns in the table, on SQL Server 2000 on Windows 2003. What impact might this have?
1) Will there be downtime for the database or the server?
2) Is it possible to insert records while the new column is being added?
3) Roughly how long will it take?
Thanks a lot.
ZYT
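For reference, a minimal sketch of the statement in question (the table and constraint names below are hypothetical):

ALTER TABLE dbo.BigTable
    ADD NewFlag INT NOT NULL
    CONSTRAINT DF_BigTable_NewFlag DEFAULT 0

On SQL Server 2000 a NOT NULL column with a default has to be written to every existing row, so the ALTER holds a schema-modification lock for the duration: inserts are blocked while it runs, and the time taken depends mostly on row size and I/O throughput for the ~9 million rows.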
View 1 Replies
Apr 16, 2003
What causes SQL Server 2000 to create tables with default column sizes of 8000 for a varchar datatype??
I would assume this can cause significant performance problems.
(see attached)
View 2 Replies
View Related
Jul 3, 2001
Hi All,
I want to know what the impact will be of changing the primary key on a table which already has a lot of data.
For example, column A is currently the unique primary key, and I want to make column B the unique primary key instead.
Can I do that? What will be the impact on database performance?
Thanks
Sri
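As a rough illustration (constraint and table names are hypothetical), swapping the primary key means dropping the existing constraint and creating a new one:

ALTER TABLE dbo.MyTable DROP CONSTRAINT PK_MyTable;
ALTER TABLE dbo.MyTable ADD CONSTRAINT PK_MyTable_B PRIMARY KEY (B);  -- B must be unique and NOT NULL

If the old key is the clustered index, dropping it and building a new one rebuilds the table and its nonclustered indexes, which can be expensive and blocking on a large table, so it is worth scheduling in a maintenance window.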
View 1 Replies
View Related
Nov 7, 2007
Just a quick, easy question. If I alter a table (add a column to the table), will it take the table offline during the ALTER process? I am adding the column to the end of the table, not in the middle. I know that if I add it in the middle it will take the table offline.
View 4 Replies
View Related
Oct 12, 2004
SQL Server 2000:
Question 1
If you drop / rename a table and then recreate the table with the SAME NAME, what impact does it have on stored procedures that use these tables? From a system perspective, do you have to rebuild / recompile ALL the stored procedures that use this table?
I had a discussion with someone that said that this is a good idea, since the IDs of the tables change in sysobjects and from a SQL SERVER query plan perspective, this needs to be done...
Question 2
If you truncate a table as part of a BEGIN TRANSACTION, what happens if an error occurs? Will it roll back? The theory is that it won't, because TRUNCATE doesn't use the log, whereas DELETE FROM does?
Thanks!
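Regarding Question 2, this is easy to verify with a quick test (table name hypothetical). TRUNCATE TABLE is minimally logged (it logs the page deallocations rather than individual row deletes), but it is still transactional and rolls back:

BEGIN TRANSACTION
    TRUNCATE TABLE dbo.TestTable
    SELECT COUNT(*) FROM dbo.TestTable   -- 0 inside the transaction
ROLLBACK TRANSACTION
SELECT COUNT(*) FROM dbo.TestTable       -- original row count is back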
View 1 Replies
View Related
May 13, 2014
What is the impact on the users of dropping an index on a table while it is in use? I will recreate the index afterwards. The table is used constantly by three processes/users at all times.
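A minimal sketch of the operations in question (index and table names are hypothetical). While the index is gone, queries that relied on it fall back to scans, and an offline CREATE INDEX blocks data modifications on the table while it builds:

DROP INDEX IX_Orders_CustomerID ON dbo.Orders;
-- ... later ...
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID ON dbo.Orders (CustomerID);

-- On Enterprise Edition, an online rebuild in place may avoid dropping the index at all:
ALTER INDEX IX_Orders_CustomerID ON dbo.Orders REBUILD WITH (ONLINE = ON);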
View 3 Replies
View Related
Sep 27, 2006
For example, I have a table "authors" with a column "author_name", and it has three values: "Anne Ringer", "Ann Dull", and "Johnson White". I want to create a new table using a SELECT statement whose columns come from the values of the "author_name" column.
Can you tell me how I can do this in SQL?
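One common approach, assuming the new columns only need to flag which names are present, is a CASE-based cross-tab combined with SELECT ... INTO (only a sketch; the target table name and what you actually put in each column are up to you):

SELECT
    MAX(CASE WHEN author_name = 'Anne Ringer'   THEN 1 ELSE 0 END) AS [Anne Ringer],
    MAX(CASE WHEN author_name = 'Ann Dull'      THEN 1 ELSE 0 END) AS [Ann Dull],
    MAX(CASE WHEN author_name = 'Johnson White' THEN 1 ELSE 0 END) AS [Johnson White]
INTO dbo.AuthorsPivot
FROM authors;

If the set of names is not fixed, the column list would have to be built with dynamic SQL instead.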
View 2 Replies
View Related
Mar 29, 2008
OK, I have figured out how to hide the sys views and Information_Schema views from users but before I try it on the live database I have a question:
If I DENY SELECT to the public role in the master database on the sys views and INFORMATION_SCHEMA views, what impact will this have on users, other than not being able to see those views? Does anyone know?
Your feedback is greatly appreciated.
Thanks.
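For context, a hedged sketch of the kind of statements being discussed (the specific views are just examples):

USE master;
DENY SELECT ON sys.sysobjects TO public;
DENY SELECT ON INFORMATION_SCHEMA.TABLES TO public;

Be aware that many client tools and data-access drivers query these views behind the scenes, so denying SELECT to public can break object browsing and some application features, not only ad-hoc metadata queries.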
View 5 Replies
View Related
Jan 18, 2001
I want to determine the performance impact caused by the extensive use of the 'select into #' statement in a production environment. The current situation is that our reports team extensively uses the 'select into #' statement to build smaller subsets of data. These subsets are then used as the basis to create summary style reports and exports. All this is accomplished via the use of SQL pass-through.
After these reports/exports are completed and tested, they are then released to our operations department and the users. The reports/exports then can be run against the production server at the discretion of the user, provided they have the appropriate permissions. These reports/exports target the live data on the primary production server that already has been designated for the use of the application software.
Now I know that reporting against a transactional-based server, where the users run the application, is not a very good idea. (Inherited) I am currently migrating all reports/exports to a reporting server. Although it will still be transaction-based, the reports/exports will be isolated from user activity. Eventually we will be moving toward a warehouse scenario.
I also know that the extensive use of the 'select into #' statement is not a good coding practice for production. I provided several alternatives to this practice:
1) insert..select 2) insert..execute - from a stored procedure (a rough sketch of both follows)
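Here is a minimal sketch of the alternatives (temp table and procedure names are hypothetical); creating the temp table explicitly up front avoids the SELECT INTO pattern entirely:

-- 1) CREATE TABLE + INSERT..SELECT instead of SELECT INTO #
CREATE TABLE #Subset (CustomerID INT, OrderTotal MONEY)
INSERT INTO #Subset (CustomerID, OrderTotal)
SELECT CustomerID, SUM(TotalDue)
FROM dbo.Orders
GROUP BY CustomerID

-- 2) INSERT..EXECUTE, capturing a stored procedure's result set
INSERT INTO #Subset (CustomerID, OrderTotal)
EXEC dbo.usp_GetOrderTotals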
I have read that in SQL 6.5 this may cause severe performance and locking problems in the system databases and tempdb. However, the following document on the Microsoft Knowledge Base indicates that SQL 7.0 may have corrected this issue.
Q153441 - FIX SELECT INTO Locking Behavior.htm
Despite the indication that this has been corrected, I am still not convinced. I am frequently seeing drastic performance hits, especially when several of the reports are running (which is very common). My concern is that moving these reports/exports to a reporting server may give the users some relief, but I believe it may simply migrate the problem to another location. I will be working with the developers to optimize their code and will investigate index issues.
To make a long story short, I would like someone who has experience with this to provide me with the top 5+ reasons not to use the 'select into #' methodology in a production environment. Further, if anyone has any documentation, I would surely like the info.
Thanks, Dave
View 2 Replies
View Related
Feb 19, 2004
Hi,
Does anybody have any idea how much performance (as a percentage) is affected if we use the varchar data type instead of char?
Thanks,
Ravi
View 2 Replies
View Related
Oct 4, 2006
Hi,
I am looking for a tool that is similar to SQL Impact (Quest). Quest has discontinued the tool.
This tool should be able to detect all database object dependencies for SQL Server, Sybase and Oracle. The objects should include tables, views, stored procedures, indexes and other objects. This should also detect DB object dependencies in front end applications as well.
Any suggestions are greatly appreciated...
Thanks!
View 2 Replies
View Related
Apr 18, 2008
I have been collecting information from about 20 performance counters (memory, IO, CPU, SQL) that refresh every 15 seconds. Would that have any performance hit on the server? What are best practices when collecting information via performance counters?
Thanks
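As a point of comparison, the same kind of data can be sampled from inside SQL Server 2005+ with a lightweight query (a sketch; pick the counters you care about):

SELECT object_name, counter_name, instance_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Page life expectancy', 'Batch Requests/sec');

Sampling a couple of dozen counters every 15 seconds is generally negligible overhead; the more common problem is the volume of collected data, so it is worth planning how long you keep the samples.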
View 2 Replies
View Related
Jan 15, 2008
I want to use "on delete cascade" in one of my tables but I'm worried though whether this can affect the performance when having millions of records. To explain more I'm working on a social networking website and I have two tables UserAccounts, in which I only keep the username and password and a few related fields, and Profiles in which I keep the profile data for users, I want to be sure that I won't have any records in the Profiles table without corresponding records in the UserAccounts table. Please see the DDL below to understand more the structure of the tables:
CREATE TABLE UserAccounts
(
UserID INT NOT NULL IDENTITY(1,1) PRIMARY KEY,
UserName VARCHAR(20) NOT NULL,
Password VARCHAR(20) NOT NULL,
--other fields (e.g. last login .. etc)
)
CREATE TABLE Profiles
(
UserID INT NOT NULL REFERENCES UserAccounts(UserID),
-- other fields (e.g. birthdate, nationality .. etc)
)
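For illustration (the constraint name is hypothetical), the cascade is just an extra clause on that foreign key; instead of the inline REFERENCES above, it could be declared as a named constraint:

ALTER TABLE Profiles
    ADD CONSTRAINT FK_Profiles_UserAccounts
    FOREIGN KEY (UserID) REFERENCES UserAccounts(UserID)
    ON DELETE CASCADE;

The per-delete cost is roughly the same as issuing the child DELETE yourself; the main thing to check with millions of rows is that Profiles.UserID is indexed so the cascade does not have to scan the Profiles table.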
Any suggestions are highly appreciated...
View 6 Replies
View Related
Jan 24, 2008
Hi,
Does anyone know how the key influencers impact values are calculated? Thanks!!
Kate
View 3 Replies
View Related
May 13, 2008
Hi,
I am currently working on the ASP encryption of my application. I've tried to test the encryption of the connection string using capicom.dll in my local environment, and it works successfully. However, I am not quite sure if this will still work after my OS is upgraded to Win2K3 (my current OS is WinXP). Will this DLL component be impacted by the OS upgrade, or will there be no impact at all?
Any inputs from you guys would be much appreciated.
Thank you.
View 1 Replies
View Related
Sep 15, 2006
Hi all, I am not overly familiar with SQL; I am a VB programmer. I simply need to achieve the following within Enterprise Manager.
I have two tables with different designs and different numbers of rows. I simply need to check whether the contents of a column in the first table exist in a column in the second table: just a table/column to table/column check for the same data content.
Easy Peasy for you guys, any help would be appreciated.
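A minimal sketch of the kind of queries involved (table and column names are hypothetical), which can be run from a Query Analyzer window:

-- Values from Table1.ColA that do NOT appear in Table2.ColB
SELECT t1.ColA
FROM Table1 t1
WHERE NOT EXISTS (SELECT 1 FROM Table2 t2 WHERE t2.ColB = t1.ColA)

-- Values that appear in both
SELECT DISTINCT t1.ColA
FROM Table1 t1
INNER JOIN Table2 t2 ON t2.ColB = t1.ColA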
View 6 Replies
View Related
Jan 4, 2005
Thanks to all participants.
I am using SQL Server 2000 with replication objects for two locations. The log size on the publisher grows to up to 25 times the data file size; I mean an 80 MB data file maintains a 2 GB log file, and it is the same for all five companies working on the same Windows 2000 Advanced Server box.
Since last week the server randomly gets disconnected from user applications, and at those times a few tables cannot be opened on the server.
Can anyone give a reason why SQL Server 2000 misbehaves in this way?
Thanks.
View 11 Replies
View Related
Jul 13, 2007
I have a question regarding FUll and differential backup.
When we take a full or differential backup, does it create a lot of log activity? I.e., does a full or differential backup have any impact on log size?
Thanks
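For reference, a sketch of the three backup types (database name and paths are hypothetical). Full and differential backups do not truncate the transaction log; only a log backup does that, so in the FULL recovery model the log keeps growing until BACKUP LOG runs:

BACKUP DATABASE MyDb TO DISK = 'D:\Backups\MyDb_full.bak'
BACKUP DATABASE MyDb TO DISK = 'D:\Backups\MyDb_diff.bak' WITH DIFFERENTIAL
BACKUP LOG MyDb TO DISK = 'D:\Backups\MyDb_log.trn'   -- this is what frees log space for reuse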
View 5 Replies
View Related
Sep 1, 2007
We have the following scenario:
Server A replicates Database A to Server B.
Server C has Database A on it as well, but in standby mode. We are applying the transaction logs generated by Database A on Server A to the database on Server C leaving it in standby mode each time.
Let's say we had planned maintenance for Server A and dumped the last set of transactions on Server A in standby mode to be applied to Server C. What happens to the replica on Server B? When I start to use Server C, can I back up its transactions and apply them to Server A, and then have those transactions replicated to Server B? And then what do I do when the maintenance is complete so that I can switch back to Server A and have the replication continue on to Server B as before the maintenance?
Thanks
View 1 Replies
View Related
May 17, 2015
What would happen to the queries that are under execution when I change the MAXDOP value from, say, 0 to 1?
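For context, a minimal sketch of the change in question. The degree of parallelism for a query is fixed when its execution starts, so queries already running keep the parallelism they started with, while new queries pick up the new setting after RECONFIGURE:

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'max degree of parallelism', 1
RECONFIGURE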
View 11 Replies
View Related
Mar 31, 2014
I have a stored procedure, and I am altering it by adding some logic. How can I test whether there is any functional impact from altering that stored procedure? How can I prove to the team that the modifications don't impact any functionality?
View 5 Replies
View Related
Aug 5, 2015
Our SQL Server 2008 R2 has the Latin general collation.
My databases 'DB1' and 'DB2' have their collation set to Japanese Unicode.
My SQL team has informed me they cannot change the server collation, as the instance hosts other databases as well.
My query: what will be the impact of changing the database collation from Japanese to Latin?
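One concrete effect worth testing first (a sketch with hypothetical table names): comparisons between columns of the old and new collation, or between your tables and #temp tables created under the server/tempdb collation, can fail with a collation-conflict error unless an explicit COLLATE clause is added:

SELECT a.Name
FROM dbo.Customers a
JOIN #Staging b
  ON a.Name = b.Name COLLATE DATABASE_DEFAULT   -- forces a common collation for the comparison

Sorting and case/accent-sensitivity rules for character data would also change, and any Japanese text kept in non-Unicode (char/varchar) columns is at risk of conversion loss if those columns are converted to the new collation; nvarchar columns are the safer case.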
View 1 Replies
View Related
Nov 14, 2006
We are using SSIS for the first time. My team is working on a project that involves putting a date time stamp into a series of tables. These tables are being assembled in a series of child packages being executed by the parent. When the parent runs, we evaluate our timestamp variable as a GETDATE() expression and pass it to the children to be included as a derived column. We don't want the actual runtime of each step to be the timestamp, just the start of the batch (parent).
In order to get the variable to pass over to the child, we needed to set the package location to "file system" instead of "SQL Server". It seems unusual that this would be so. Are we doing something wrong?
What implications does this have for deployment? Will we need to customize the packages for each instance we plan to run this on? Can you have a parent run a child package on a different instance? This would be a performance plus since we have really huge source databases and would like to distribute the processing.
Hmmm, my boss just told me to scratch the whole idea of parent-child and go with a control table to store the variable for all the packages to access. Oh well, I'm still interested in why this is so cumbersome when really its just passing a parameter from one procedure to another.
Oh, and I think you could use a spellchecker on this message box. At least I could use one.
View 2 Replies
View Related
Jun 21, 2007
Hello All,
When creating my database I have modeled some of the tables after the Adventureworks sample database.
There are some fields or entire tables in AdventureWorks that I do not see an immediate use for; however, I would hate to omit them only to find out later that they would have been beneficial (e.g. the territory table).
In general terms, what would the impact be on the size and performance of a database which contains tables or fields that do not contain data?
Thanks for your help!
View 1 Replies
View Related
Feb 20, 2007
System Configuration :
OS : Windows 2003 latest SP
SQL Server : Standard Edition, SP2
Microsoft SQL Server 2005 - 9.00.3042.00 (Intel X86)
Feb 9 2007 22:47:07
Copyright (c) 1988-2005 Microsoft Corporation
Standard Edition on Windows NT 5.2 (Build 3790: Service Pack 1)
DownTime : This is not a 24x7 kind of machine. It can have downtime
Reference : http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1198044&SiteID=1
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1026884&SiteID=1
I have been discussing these Service Broker issues in this forum for quite some time, and I'm sorry to bother you all again... To make things clearer before implementing this in a production environment, I have a few doubts that should be clarified.
As discussed in the first link, we can clear sys.conversation_endpoints just by issuing ALTER DATABASE GPx SET NEW_BROKER WITH ROLLBACK IMMEDIATE. But my apprehension is: if we run this command on the production server and it truncates this table, what will be the impact and overall overhead on the system? Is it recommended to issue this statement on the production server daily at low-traffic times to clear this table, or will it have adverse effects? I repeat, I have downtime; I can even shut down this server daily.
I also just want to know why Microsoft has not looked into this aspect... why the system itself is not clearing the expired conversations. What is the thought behind this architecture?
I have the script to run as a batch... but in high-level meetings it is always difficult to justify this architecture/process.
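For what it's worth, a gentler alternative to resetting the broker is to end the completed conversations explicitly in a scheduled batch (a sketch; the filter would need to match your own conversation lifecycle):

DECLARE @handle UNIQUEIDENTIFIER
DECLARE c CURSOR FAST_FORWARD FOR
    SELECT conversation_handle
    FROM sys.conversation_endpoints
    WHERE state_desc = 'CLOSED'
OPEN c
FETCH NEXT FROM c INTO @handle
WHILE @@FETCH_STATUS = 0
BEGIN
    END CONVERSATION @handle WITH CLEANUP   -- removes the endpoint immediately instead of waiting out the retention period
    FETCH NEXT FROM c INTO @handle
END
CLOSE c
DEALLOCATE c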
Thanks in advance
View 5 Replies
View Related
Aug 28, 2006
I noticed that a database I am working with has a compatibility level set to SQL Server 2000. The instance is actually SQL Server 2005. I'm guessing that it was created like this because the database originally existed on 2000 and was created via backup/restore.
I'm trying to figure out if this needs to be changed and, if so, how to go about making the change in a non-disruptive manner. What features of 2005 are turned off as a result of having a 2000 compatibility level?
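If it does need to change, the switch itself is a one-line call and can be reverted the same way; it is still worth testing first, since level 90 enables stricter parsing of some old T-SQL constructs (the database name is hypothetical):

-- Check the current level
SELECT name, compatibility_level FROM sys.databases WHERE name = 'MyDb'

-- On SQL Server 2005 the change is made with sp_dbcmptlevel
EXEC sp_dbcmptlevel 'MyDb', 90   -- 80 = SQL 2000 behavior, 90 = SQL 2005 behavior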
View 4 Replies
View Related
Oct 12, 2007
I am using SQL 2005 build 9.0.2227
I have a custom conflict resolver - which fires on update conflicts (using row level tracking)
I have had a couple of occasions when the resolver has failed with the following error:
"The schema of the custom Dataset object implemented in the business logic handler does not match the schema of the source Dataset object. Verify that the custom Dataset object has been correctly defined"
In both cases I found that the row for which a conflict was being handled was not a conflict at all. One was a straightforward non conflicting update at the publisher and the other was a similar update at the subscriber.
I got round the problem by temporarily using a fix version of the conflict resolver dll that either set the custom Dataset to the publisher dataset or the subscriber dataset - depending on where the update had occurred.
When the first error (publisher update) occurred - the resolver code was basing the custom dataset on the publisher dataset - which was presumably empty - so I changed the code to base the custom dataset on the subscriber dataset. The second error therefore occurred when the custom dataset was based on the subscriber dataset - which again was presumably empty
Note that the tables involved in each occasion were different and neither table is filtered.
Is there a known bug in this area?
I am considering trying to change the resolver code to identify false conflicts in order to work around the problem - but this would be difficult to test as I can't reproduce the problem.
aero1
View 2 Replies
View Related
Jun 18, 2007
Hi, all experts here,
Thanks for your kind attention.
I want to use the time series algorithm to mine data from my case table and nested table. The case table is the date table, while the nested table is the fact table. E.g., I want to predict the monthly sales amount for different regions (I have a region table related to the fact table). How can I achieve this?
Thanks a lot and I hope it is clear for your help and I am looking forward to hearing from you shortly.
With best regards,
Yours sincerely,
View 6 Replies
View Related
Jun 4, 2007
Hi, all experts here,
Thank you very much for your kind attention.
I am confused about the key column of the case table and the key time column of the nested table when using the Time Series algorithm.
In my case, the case table structure is as below:
Territory key text (the ID is actually dimrisk_key, in this case, I use the name column binding to combine the Territory column of case table Dimrisks),
While the nested table structure is as below:
Cal_month key time (in this case, the ID is actually dimdate_key; again, I used the name column binding property to bind Cal_month to the ID)
So my question is: as the key column of the case table has been set to Territory, does the model training still cover all the cases (rows) based on the ID of the table?
Also, in the nested table, as the key time column has been set to Cal_month rather than the nested table's Dimdate_key, would the single series be based on Cal_month?
Hope it is clear for your advices and help.
And I am looking forward to hearing from you shortly.
With best regards,
Yours sincerely,
View 1 Replies
View Related
Nov 14, 2011
I would like to know the impacts (if any) of adding a nonclustered index with included columns to large tables (these tables are populated by bulk insert from text files).
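For reference, a minimal sketch of the kind of index in question (table, column, and index names are hypothetical); the main impacts to weigh are the extra storage and the cost of maintaining the index during the bulk inserts:

CREATE NONCLUSTERED INDEX IX_Sales_CustomerID
    ON dbo.Sales (CustomerID)
    INCLUDE (OrderDate, TotalDue)

A common pattern for bulk loads is to disable or drop the index before the load and rebuild it afterwards, which is often cheaper than maintaining it row by row during the insert.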
View 3 Replies
View Related
Jul 27, 2015
For example, in a SELECT statement we have many tables, and we have a WHERE clause with many conditions combined with AND. Does SQL Server apply the WHERE clause only after fetching everything, or can it dynamically decide which of the related tables in the SELECT statement to access, and in what order, based on the WHERE clause predicates? (I.e., would SQL Server avoid fetching data from those tables where the AND conditions in the WHERE clause fail or would logically be fruitless/unrelated?)
View 5 Replies
View Related
Jun 15, 2007
If I have 6-8 queries running in parallel, does having a single connection manager (for the same source) for all of the extracts perform faster, or does having a distinct connection manager for each extract perform faster?
Regards
Subhash Subramanyam
View 1 Replies
View Related
Sep 5, 2013
I am upgrading my application's SQL Server from 2008 R2 to 2012.
As discussed in the URL below, I am able to see the identity values jump after I upgrade and the server is restarted.
Since I cannot afford this, and at the moment I do not have the time to create a sequence with NOCACHE and test it again, I have to go ahead and add trace flag 272 as a startup parameter, as this is the only solution I can implement and even roll back without much hassle.
[URL] ....
From what I understand, this flag will disable the new IDENTITY behavior implemented in SQL Server 2012 and make it work the way it did in SQL Server 2008 R2.
But I want to know whether implementing this flag would impact any other feature or the performance of SQL Server (other than the performance of IDENTITY).
View 8 Replies
View Related