Reindex Tables

Apr 16, 2002

Hi guys,
I need help with this question. In SQL 2000, can I run update, insert or delete queries while the indexes of a table are being rebuilt? Will I get blocked by the DBCC DBREINDEX process?

thank you
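
For anyone wanting to test this, a minimal sketch of the command in question, with a hypothetical table and index name. Rebuilding one index at a time rather than the whole table at least keeps each lock held for a shorter period:

USE mydatabase
GO
-- DBCC DBREINDEX is an offline operation: it takes locks on the table
-- while it runs, so concurrent DML against that table can be blocked.
DBCC DBREINDEX ('dbo.Orders', 'IX_Orders_CustomerId', 90)
GO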

View 1 Replies



Error 1105 When Using SQLMaint To Reindex Tables

Apr 22, 2004

I have a standard reorganise/reindex job running against a 32GB database on SQL Server 2000. When I try to run the job it fails and returns Error 1105 <'PRIMARY' filegroup is full>. What's confusing me is that I have 53GB free on the drive on which my PRIMARY filegroup sits.

Has anyone else come across this problem when trying to set up a regular reindex job?

(More detail: the maintenance plan only includes the reorganisation/reindex task; no other tasks - including backing up the DB - are included. The DB in question is the only DB on the server: it's a test server.)
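
One thing worth checking, sketched here assuming the database is named mydatabase: whether the files in the PRIMARY filegroup are allowed to autogrow or have hit a maximum size, since error 1105 reports the filegroup full regardless of how much free disk there is:

USE mydatabase
GO
-- size and maxsize are in 8 KB pages; maxsize = 0 means the file cannot
-- grow, -1 means growth is unrestricted.
SELECT name, filename, size, maxsize, growth
FROM sysfiles
GO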

View 2 Replies View Related

DBCC Reindex

Feb 19, 2002

Hi guys.
I have an application here developed by a third-party software house.
In the past, for some reason, the database would fail daily. The software
house recommended that we run DBCC DBREINDEX on all tables within the
databases twice daily. This was scheduled and is now running, and the
database no longer fails.
The fix works, but I don't understand why. Why would reindexing twice
daily solve the problem?
It seems excessive to have to reindex every user table twice daily.
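
For reference, a minimal sketch of how a reindex-all-tables job like this is often scripted (the database name is hypothetical). sp_MSforeachtable is undocumented but widely used; its '?' placeholder is replaced with each table's name in turn:

USE mydatabase
GO
-- Rebuild every index on every user table with the default fill factor.
EXEC sp_MSforeachtable 'DBCC DBREINDEX (''?'')'
GO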


Parg

View 1 Replies View Related

DBCC REINDEX

Apr 26, 2001

I tried to run DBCC DBREINDEX on all user tables in a database. There are no
clustered indexes, but there are multiple non-clustered indexes on each table.

The output file from DBCC DBREINDEX shows that it should have worked. But when I
run DBCC SHOWCONTIG, the scan density of the indexes that were in bad shape did
not improve.
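
One thing that may be worth ruling out, sketched with hypothetical names: with no clustered index, DBCC SHOWCONTIG against just the table scans the heap, which DBCC DBREINDEX does not reorganise. Naming a specific non-clustered index scans that index instead:

USE mydatabase
GO
-- Scan the named non-clustered index rather than the underlying heap.
DBCC SHOWCONTIG ('dbo.MyTable', 'IX_MyTable_Col1')
GO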

Any ideas?
Thanks,
Ben Reeder

View 1 Replies View Related

Is Any Way To Reindex 27 Gig Db Without Logging

Sep 25, 2002

Hello everybody.
1. I have a 28 Gig database with 4 tables above 4 Gig each with very bad
fragmentation; each table has between 3 and 5 indexes.
2. The database is set for full recovery and I use custom log shipping to restore
the db on a standby server every 15 min.

I tried to run DBCC INDEXDEFRAG on one index on a 4 Gig table. The
following took place:
1. It took 4 hrs for DBCC INDEXDEFRAG to complete.
2. Log shipping failed.
3. A log file of 2 Gig was generated after DBCC INDEXDEFRAG completed.

I tried dropping and recreating the clustered index on the table and it created the same
problem - the log growing and log shipping failing.

(If log shipping fails and the standby database gets suspended, it takes 6-8
hrs to restore it from backup and apply all the logs.)

So my question is:

What would be the best way to rebuild/reindex/defragment a 28 Gig database
when it is set to full recovery with log shipping?
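
There is no truly non-logged way to rebuild indexes while the database stays in full recovery, but one common pattern is to work through the indexes in small batches with log backups in between, so no single shipped log gets huge. A sketch with hypothetical names and paths:

USE mydatabase
GO
-- Defragment one index, then back up the log before moving to the next.
-- INDEXDEFRAG is fully logged, but batching keeps each log backup small.
DBCC INDEXDEFRAG (mydatabase, 'dbo.BigTable1', 'IX_BigTable1_Col1')
BACKUP LOG mydatabase TO DISK = 'E:\LogShip\mydatabase_log.trn'
GO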


Thank you
Alex

View 1 Replies View Related

SQL Import And Reindex

May 1, 2008

I have a process that runs each day and imports about 550K records into a database. My question is: it appears I have to reindex the database after each import, otherwise the sp's that I have written just run and run and run. After the reindex job, things run within 60 seconds. I am just looking for some insight on why - I understand what a reindex does, but I don't know if I understand why I have to reindex every time.
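
A sketch worth trying, with a hypothetical table name: a large daily load can leave the optimiser with stale statistics even when fragmentation is modest, and refreshing statistics is far cheaper than a full reindex. If this alone restores the fast plans, it is the statistics rather than the physical index layout that the import is invalidating:

USE mydatabase
GO
-- Refresh statistics on the loaded table after the daily import.
UPDATE STATISTICS dbo.ImportTarget WITH FULLSCAN
GO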

View 9 Replies View Related

Reindex Error

Jul 12, 2007

Hi experts, I would like to ask about this error that occurs when executing my reindexing script.

Here is my script:

USE mydatabase

DBCC DBREINDEX('outpatient', '', 70)
go

Then this error message appears. I researched the error, and according to the site I found, the table is corrupted and I need to restore from a better backup..?

The statement has been terminated.
Msg 824, Level 24, State 2, Line 1
SQL Server detected a logical consistency-based I/O error: torn page (expected signature: 0x55555555; actual signature: 0x55555545). It occurred during a read of page (1:353409) in database ID 10 at offset 0x000000ac902000 in file 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\Bizbox_HS7.mdf'. Additional messages in the SQL Server error log or system event log may provide more detail. This is a severe error condition that threatens database integrity and must be corrected immediately. Complete a full database consistency check (DBCC CHECKDB). This error can be caused by many factors; for more information, see SQL Server Books Online.
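
For what it's worth, the error text itself points at the next step: a consistency check of the whole database. A minimal sketch, assuming the database is the one the reindex script targets:

-- Report every consistency error; the output indicates the minimum repair
-- level, but restoring from a clean backup is the safest fix for a torn page.
DBCC CHECKDB ('mydatabase') WITH ALL_ERRORMSGS, NO_INFOMSGS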


Darren Bernabe Blanco

View 4 Replies View Related

DBCC Reindex

Jul 18, 2007

How often should it be done, and if I don't do it, what will happen?

=============================
http://www.sqlserverstudy.com

View 1 Replies View Related

Defrag / Reindex

Oct 21, 2007

I've been doing disaster recovery on a web box that died today.

So I thought I'd do some "downtime" maintenance on the DB server

I ran a DBREINDEX on all tables, all indexes. (I know this is the 2000 way, but I assume it's as good as the proper 2005 way??)

5 minutes on a 10GB database. Not bad!

I checked the DEFRAG and UPDATE STATS processes that run overnight.

They basically defrag only the tables where SHOWCONTIG indicates fragmentation, and then do an UPDATE STATISTICS WITH FULLSCAN on all tables.

That is taking an average of 30 minutes ...

Is DBREINDEX the equivalent of an UPDATE STATISTICS WITH FULLSCAN, or is it in some way a smaller-sample version?

I'm wondering why I don't just lock the DB and do a REINDEX of everything in 5 minutes ...
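
For comparison, a sketch of the 'proper 2005 way' with hypothetical names: ALTER INDEX ... REBUILD is the documented replacement for DBCC DBREINDEX in SQL Server 2005, and a rebuild refreshes the statistics on the rebuilt indexes with the equivalent of a full scan as a side effect:

USE mydatabase
GO
-- Rebuild every index on the table; index statistics are refreshed as part
-- of the operation (column statistics are not).
ALTER INDEX ALL ON dbo.MyTable REBUILD WITH (FILLFACTOR = 90)
GO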

Kristen

View 4 Replies View Related

Question On Reindex

Sep 27, 2006

Hi,
I have several tables in production whose contents are renewed totally in 1 week. So every day we delete ~15% of the records and then insert 15% new. And after a few days, the performance drops:

TABLE level scan performed.
- Pages Scanned................................: 169617
- Extents Scanned..............................: 21630
- Extent Switches..............................: 153827
- Avg. Pages per Extent........................: 7.8
- Scan Density [Best Count:Actual Count].......: 13.78% [21203:153828]
- Logical Scan Fragmentation ..................: 45.06%
- Extent Scan Fragmentation ...................: 52.66%
- Avg. Bytes Free per Page.....................: 5042.5
- Avg. Page Density (full).....................: 37.70%

I can't schedule a dbcc reindex every day because of concurrent access (it locks the tables too long); actually I can only schedule it on Sunday. What else can I do? I can adjust the fill factor, but how do I find a good value if I don't want to waste space? The total size of the database is ~150GB.
Thx
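
One option that fits the concurrency constraint, sketched with hypothetical names: DBCC INDEXDEFRAG is an online operation that holds only short-lived locks, so it can run on weekdays between the Sunday rebuilds, at the cost of being slower and less thorough than a full reindex:

USE mydatabase
GO
-- Online defragmentation: users can keep working against the table.
DBCC INDEXDEFRAG (mydatabase, 'dbo.VolatileTable', 'PK_VolatileTable')
GO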

View 1 Replies View Related

DBCC REINDEX Command

Oct 2, 2001

I am currently running the Back Office Resource Kit log shipping option for a database running on a SQL 7 installation. As part of the ongoing maintenance work that we are being asked to perform by the application vendor, I need to run DBCC DBREINDEX on most of the tables in the database. Currently this is done by stopping the log shipping routine, running the reindex script, taking a full backup and restoring it to the secondary server, then restarting the log shipping scripts. This is a very time-consuming task that has to be performed at unsociable hours.

Has anybody got an opinion as to whether the reindex would work at the same time as the log shipping scripts, or do I have to continue as at present?

Responses gratefully received.

Regards
Phil Corby
Geest IT Services

View 3 Replies View Related

Update Statistics Vs. Reindex

Dec 14, 1999

I am maintaining a large table with millions of rows that has two non-clustered indexes and frequently changing data, so I need to keep the indexes fresh. UPDATE STATISTICS runs much quicker than a reindex. What is the appropriate situation for each, and why?
Thanks in advance.
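
Side by side, sketched with a hypothetical table name. The first command only recomputes the statistics the optimiser uses (cheap, no change to the physical pages); the second physically rebuilds the index pages, removing fragmentation, and refreshes the index statistics along the way (expensive):

USE mydatabase
GO
-- Cheap: refresh optimiser statistics only.
UPDATE STATISTICS dbo.BigTable WITH FULLSCAN
-- Expensive: rebuild the physical index structures.
DBCC DBREINDEX ('dbo.BigTable')
GO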

View 1 Replies View Related

Disk Space After Reindex

Nov 29, 2007

All,

I first ran INDEXDEFRAG on a table with 1.5 billion rows.
Logical fragmentation was at 95%.
Logical frag went down to 3% with no real effect on disk.

DBCC reindex had previously been bombing undetected.


Now I've run a reindex on this table:
Reindex job with fillfactor = 100
Ran in 3:05
Free disk went from ~150GB before the operation to 49GB
File4 went from 347GB to 504GB

Why has so much free disk been consumed by this operation and not released?

Is my only choice to shrink the data file?
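
If reclaiming the disk is the priority, a sketch of the shrink in question. The logical file name File4 is taken from the post; the target size (in MB) is hypothetical. A rebuild needs room for the old and new copies of the index at the same time, that workspace stays allocated to the file afterwards, and shrinking tends to re-fragment the indexes that were just rebuilt:

USE mydatabase
GO
-- Shrink the data file back towards its pre-rebuild size (target in MB).
DBCC SHRINKFILE (File4, 360000)
GO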

thanks

Env.
Win2k ENT os
SQL 2k5 std 64bit

View 4 Replies View Related

Food For Thought (ReIndex And Log Shipping)

Dec 29, 2003

I have a production 60GB database set to full recovery, and every 15 minutes I log ship to a standby server.

During production hours there are no problems, but at night when I run DBCC DBREINDEX the log grows to 22GB, and because of this I have a problem sending it over the network to the standby server.

I tried changing the recovery model to BULK_LOGGED but there is no difference in the log backup size.

Any ideas?

View 1 Replies View Related

Reindex Script In Replication Environment

Jun 19, 2008

Dear All,
Is it OK to run an index rebuild script on the publisher and after that on the subscriber? What steps do I need to take to do this?

Will it affect the replication? Please help me here.

Arnav
Even you learn 1%, Learn it with 100% confidence.

View 3 Replies View Related

Dbcc Reindex Issue - - I Don't Understand!!

Jun 26, 2006

Hi Folks,
SQL Server 2000 SP3 on Windows 2000. I have a database on which I ran the command:

dbcc dbreindex ('tablename')
go

for all tables in the database. Then I compared the dbcc showcontig with all_index output from before and after the reindex, and on the largest table in the database I found this. First output is prior to the reindex:

Table: 'PlannedTransferArchive' (1975014117); index ID: 1, database ID: 7
TABLE level scan performed.
- Pages Scanned................................: 184867
- Extents Scanned..............................: 23203
- Extent Switches..............................: 23324
- Avg. Pages per Extent........................: 8.0
- Scan Density [Best Count:Actual Count].......: 99.07% [23109:23325]
- Logical Scan Fragmentation ..................: 11.13%
- Extent Scan Fragmentation ...................: 35.46%
- Avg. Bytes Free per Page.....................: 60.0
- Avg. Page Density (full).....................: 99.26%

Second output is from after the reindex:

DBCC SHOWCONTIG scanning 'PlannedTransferArchive' table...
Table: 'PlannedTransferArchive' (1975014117); index ID: 1, database ID: 8
TABLE level scan performed.
- Pages Scanned................................: 303177
- Extents Scanned..............................: 37964
- Extent Switches..............................: 42579
- Avg. Pages per Extent........................: 8.0
- Scan Density [Best Count:Actual Count].......: 89.00% [37898:42580]
- Logical Scan Fragmentation ..................: 43.19%
- Extent Scan Fragmentation ...................: 24.78%
- Avg. Bytes Free per Page.....................: 75.1
- Avg. Page Density (full).....................: 99.07%

Following are my concerns:

The following numbers are all higher after the reindex than before it: pages scanned, extent switches, logical scan fragmentation, avg. bytes free per page, avg. page density. Scan density is lower after the reindex than before.

It seems to me that the numbers that are higher after the reindex should be lower, and the numbers that are lower should be higher. I didn't specify the fill factor in the dbcc dbreindex command, so it should have used the default fill factor. The fill factor has never been changed on this machine.

Am I missing something?
Thanks,
Raziq.

View 3 Replies View Related

Does Dbcc Reindex Update Usage Information?

Feb 3, 2000

Does running DBCC DBREINDEX update the space allocated columns in sysindexes? I understand that running DBCC UPDATEUSAGE updates the space allocated columns in the sysindexes table. But I cannot find any documentation indicating whether dynamically rebuilding the indexes, as opposed to dropping and recreating them, updates the space allocated columns in sysindexes.
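
For reference, a sketch of the command mentioned above, assuming a database named mydatabase. It corrects the space-allocation columns in sysindexes and reports any values it changed:

-- Correct the pages/rows counts stored in sysindexes for every table.
DBCC UPDATEUSAGE ('mydatabase')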

Any information would be helpful.
Thanks.
Gail Wade
Database Administration
Raymond James Financial
gwade@it.rjf.com

View 3 Replies View Related

SQL Server 2008 :: Table Size After Reindex

Feb 22, 2015

I'm trying to understand what is happening to one large table.

In a SQL 2008 R2 DB, I'm trying to track a rapidly increasing DB size. It's due to one recently added table.

Despite the table's NoRows increasing, its size reduces after a scheduled re-index.

I've recorded the space used by this table by running EXEC sp_spaceused 'tableName'.

---
NoRows     reserved     data         index_size  unused
12886451   2300384 KB   2290928 KB   9432 KB     24 KB     AFTER reindex
12886451   5406232 KB   5366184 KB   39280 KB    768 KB    BEFORE reindex

N.B. The only thing I'm aware of happening in the time period is the reindex that runs as part of a scheduled task. I could be missing something else happening.

The table has only one index, the PK, which is clustered. No fill factor is specified; the server default fill factor = 0.

I read that fillfactor = 0 = 100, which will always try to fill the pages, so space used will be minimised?

After running the reindex, the index fragmentation is very low. I've not recorded the fragmentation before the reindex.

I can see the data is not added in clustered index order.
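
A sketch for capturing the 'before' picture next time, with the database and table names as placeholders: on 2008 R2, sys.dm_db_index_physical_stats reports both fragmentation and page fullness, which together explain most reindex-related size swings:

USE mydatabase
GO
-- Fragmentation and page fullness for every index on the table.
SELECT index_id, avg_fragmentation_in_percent,
       avg_page_space_used_in_percent, page_count
FROM sys.dm_db_index_physical_stats
     (DB_ID(), OBJECT_ID('dbo.BigTable'), NULL, NULL, 'SAMPLED')
GO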

View 2 Replies View Related

SQL 2012 :: Weekly Reindex Job Failed Because Of Deadlock

May 18, 2015

I have a weekly Maintenance Plan reindex job that has failed because of a deadlock. My question seems simple enough, and I'm ashamed to say I ought to know the answer, but here goes: does the rest of a given job continue after such a failure (this one was maybe 3/4 of the way through the log) occurs?

View 6 Replies View Related

SQL Script To Set A DB To Simple Then Run A Reorg And Reindex Then Set DB Back 2 Full

Jan 18, 2006

Does anyone know what the commands would be? I am trying to create a job that puts a DB in simple mode, then launches a reorg and re-index, then sets the DB back to full when it is complete. This way I can eliminate large transaction logs being created.
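
A minimal sketch of such a job, with the database, table, and backup path as placeholders. In SIMPLE recovery the rebuild is minimally logged and log space is reused as the job runs, which is what keeps the log small; the caveat is that switching to SIMPLE breaks the log backup chain, so a full (or differential) backup is needed afterwards before log backups work again:

ALTER DATABASE MyDB SET RECOVERY SIMPLE
GO
USE MyDB
GO
-- The reorganise/re-index work goes here (2005 syntax shown).
ALTER INDEX ALL ON dbo.MyTable REBUILD
GO
ALTER DATABASE MyDB SET RECOVERY FULL
GO
-- Re-establish the log backup chain.
BACKUP DATABASE MyDB TO DISK = 'E:\Backups\MyDB.bak'
GO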

Any help would be great!
Thanks

View 1 Replies View Related

SQL Server 2008 :: Alter Column And Reindex 100M Row Table?

Jun 3, 2015

I inherited a system which has an index on a set of columns which allows more than 900 bytes of data in the key. We know one of the fields can be shortened to shrink the potential key size below 900 bytes.

The problem is the table is about 120m rows, and the index currently on that column is seeked (sought?) about 2.5m times a day.

At its simplest, I want to drop the existing index, alter the column to shrink the varchar size, and then rebuild the index on the newly shortened column (see the sketch after the questions below).

On a smaller, less-used table I might just do this outside of business hours and call it a day, but I'm concerned that this will take a long time and block a lot of operations.

1) IIRC, shrinking a column, unlike widening it, is much more expensive, even if there are no values which would actually end up truncated. Is this right?

2) I did a few tests on some other, smaller (2m+ row) tables and was still able to select data out of the table. I don't think this covered all the read scenarios, but are there known scenarios which would simply not work during an index build?

3) I haven't yet tried DML operations against the table while it's doing either the column update or an index build. Which scenarios would or would not be blocked?
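
The sequence being described, sketched with hypothetical names. WITH (ONLINE = ON) (Enterprise Edition) lets reads and writes continue during the index build; the ALTER COLUMN itself still holds a schema-modification lock while existing data is validated against the new length:

USE mydatabase
GO
DROP INDEX IX_Wide ON dbo.BigTable
GO
-- Shrinking a varchar requires validating existing values; a
-- schema-modification lock is held for the duration.
ALTER TABLE dbo.BigTable ALTER COLUMN WideCol varchar(400) NOT NULL
GO
-- Online rebuild keeps the table readable and writable while it runs.
CREATE INDEX IX_Wide ON dbo.BigTable (WideCol, OtherCol) WITH (ONLINE = ON)
GO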

View 0 Replies View Related

What Does "?" Mean In SQL SERVER DBCC REINDEX('?')

Sep 25, 2007

Here is an example: DBCC REINDEX('?')
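
The '?' is usually a placeholder rather than literal SQL: when a command is run through the undocumented sp_MSforeachtable procedure, '?' is substituted with each table's name in turn. A sketch, assuming the command meant is DBCC DBREINDEX:

-- '?' is replaced by each user table's name on each iteration.
EXEC sp_MSforeachtable 'DBCC DBREINDEX (''?'')'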

Thanks!

View 2 Replies View Related

Why Would Tables Pulled In From ODBC In Access Be Different Than Tables In SQL 2005 Tables?

Jan 24, 2008

I'm new to my company, although not new to SQL 2005, and I found something interesting. I don't have an ERD yet, so I was asking a co-worker what table some data was in; they told me a table that is NOT in SQL Server 2005's list of tables, views or synonyms.

I thought that was strange, so I searched over and over again and still couldn't find it. Then I ran a select statement against the table that Access thinks exists and SQL Server does not show, and to my shock the select statement pulled in data!

So how did this happen? How can I find the object in SSMS's folder listing of tables/views or whatever, and what am I overlooking?
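
A quick way to hunt for it server-side, sketched with a placeholder name. This searches every schema in the current database (a non-dbo schema is easy to overlook in SSMS), and checks sys.synonyms separately in case the name Access uses is an alias:

USE mydatabase
GO
-- Search all schemas for the object name Access is using.
SELECT SCHEMA_NAME(schema_id) AS schema_name, name, type_desc
FROM sys.objects
WHERE name = 'mystery_table'
-- Also check synonyms, which point at objects defined elsewhere.
SELECT name, base_object_name FROM sys.synonyms
GO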

Thanks,
Keith

View 4 Replies View Related

Track The Changes To Normalised Tables And Update The Denormalised Tables Depending On The Changes To Normalised Tables

Dec 7, 2006

We have 20-30 normalized tables in our database. We also have 4 tables where we store calculated data from those normalized tables. The reason we have these 4 denormalized tables is that when we try to do the calculation on the fly, our site becomes very slow, so we have precalculated the results and stored them in the 4 tables.

The process we use to do the precalculation does the full calculation and stores the result in a temp table. It then compares the temp table with the denormalized tables, inserts new rows, deletes the old ones, and updates any changes. This process takes about 20 mins - 60 mins. It takes a long time because it first does the calculation regardless of what changed, and then does a compare to see what is different, removing deleted rows and inserting and updating the rest.

Now we would like to capture the rows/columns changed in the normalized tables and apply only those changes to the denormalized tables, which we hope will reduce the processing time by at least 50%.

We have upgraded to SQL SERVER 2005, so we would like to use the new technology for this process. I have to design a model that captures the changes and applies only those changes. I have the list of normalized tables and the columns which will affect the end results. I thought of using triggers or the OUTPUT clause to capture the changes. Please help me with any ideas on how to design the new process.
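
A sketch of the OUTPUT-clause idea mentioned above, with hypothetical table names. New in SQL Server 2005, OUTPUT captures the rows affected by a DML statement into a change-queue table within the same statement, with no trigger needed:

-- Record which rows changed so the denormalised tables can be patched later.
UPDATE dbo.OrderLines
SET Quantity = Quantity + 1
OUTPUT inserted.OrderLineId, GETDATE()
INTO dbo.PendingChanges (OrderLineId, ChangedAt)
WHERE OrderId = 42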

View 3 Replies View Related

SQL 2012 :: Extract All Tables Names And Their Row Counts From Linked Server Tables

Oct 7, 2015

I am using the following select statement to get the row count from a SQL linked server table.

SELECT Count(*) FROM OPENQUERY (CMSPROD, 'Select * From MHDLIB.MHSERV0P')

MHDLIB is the library name in the IBM DB2 database. The above query gives me only the row count of table MHSERV0P. However, I need to get the names, row counts, and sizes of all tables that exist in the MHDLIB library. Is it possible at all?
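
It should be possible by querying the DB2 catalog through the same linked server. A sketch, with the strong caveat that the catalog view and column names used here (QSYS2.SYSTABLESTAT with NUMBER_ROWS and DATA_SIZE) are an assumption that applies to DB2 for i; other DB2 platforms expose the same information under different catalog names:

-- Catalog view and column names below are assumptions for DB2 for i.
SELECT * FROM OPENQUERY (CMSPROD,
    'SELECT SYSTEM_TABLE_NAME, NUMBER_ROWS, DATA_SIZE
     FROM QSYS2.SYSTABLESTAT
     WHERE SYSTEM_TABLE_SCHEMA = ''MHDLIB''')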

View 1 Replies View Related

Solution!-Create Access/Jet DB, Tables, Delete Tables, Compact Database

Feb 5, 2007

From Newbie to Newbie,



Add reference to:

'Microsoft ActiveX Data Objects 2.8 Library

'Microsoft ADO Ext.2.8 for DDL and Security

'Microsoft Jet and Replication Objects 2.6 Library

--------------------------------------------------------

Imports System.IO

Imports System.IO.File





Code Snippet

'BACKUP DATABASE

Public Shared Sub Restart()

End Sub



'You have to have a BackUps folder included into your release!

Private Sub BackUpDB_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles BackUpDB.Click
Dim addtimestamp As String
Dim f As String
Dim z As String
Dim g As String
Dim Dialogbox1 As New Backupinfo


addtimestamp = Format(Now(), "_MMddyy_HHmm")
z = "C:\Program Files\VSoftAppMiss\NewAppDB.mdb"
g = addtimestamp & ".mdb"


'Add timestamp and .mdb ending to NewAppDB
f = "C:\Program Files\VSoftAppMiss\BackUps\NewAppDB" & g



Try

File.Copy(z, f)

Catch ex As System.Exception

System.Windows.Forms.MessageBox.Show(ex.Message)

End Try



MsgBox("Backup completed succesfully.")
If Dialogbox1.ShowDialog = Windows.Forms.DialogResult.OK Then
End If
End Sub






Code Snippet

'RESTORE DATABASE

Private Sub RestoreDB_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles RestoreDB.Click
Dim Filename As String
Dim Restart1 As New RestoreRestart
Dim overwrite As Boolean
overwrite = True
Dim xi As String


With OpenFileDialog1
.Filter = "Database files (*.mdb)|*.mdb|" & "All files|*.*"
If .ShowDialog() = Windows.Forms.DialogResult.OK Then
Filename = .FileName



'Strip the timestamp from the restored database's name
xi = "C:\Program Files\VSoftAppMiss\NewAppDB.mdb"
File.Copy(Filename, xi, overwrite)
End If
End With


'Notify user
MsgBox("Data restored successfully")


Restart()
If Restart1.ShowDialog = Windows.Forms.DialogResult.OK Then
Application.Restart()
End If
End Sub








Code Snippet

'CREATE NEW DATABASE

Private Sub CreateNewDB_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles CreateNewDB.Click
Dim L As New DatabaseEraseWarning
Dim Cat As ADOX.Catalog
Cat = New ADOX.Catalog
Dim Restart2 As New NewDBRestart
If File.Exists("C:\Program Files\VSoftAppMiss\NewAppDB.mdb") Then
If L.ShowDialog() = Windows.Forms.DialogResult.Cancel Then
Exit Sub
Else
File.Delete("C:\Program Files\VSoftAppMiss\NewAppDB.mdb")
End If
End If
Cat.Create("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Program Files\VSoftAppMiss\NewAppDB.mdb;Jet OLEDB:Engine Type=5")

Dim Cn As ADODB.Connection
'Dim Cat As ADOX.Catalog
Dim Tablename As ADOX.Table
'Tailor these to your needs - add as many columns as you need.
Dim col As ADOX.Column = New ADOX.Column
Dim col1 As ADOX.Column = New ADOX.Column
Dim col2 As ADOX.Column = New ADOX.Column
Dim col3 As ADOX.Column = New ADOX.Column
Dim col4 As ADOX.Column = New ADOX.Column
Dim col5 As ADOX.Column = New ADOX.Column
Dim col6 As ADOX.Column = New ADOX.Column
Dim col7 As ADOX.Column = New ADOX.Column
Dim col8 As ADOX.Column = New ADOX.Column

Cn = New ADODB.Connection
Cat = New ADOX.Catalog
Tablename = New ADOX.Table



'Open the connection
Cn.Open("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Program Files\VSoftAppMiss\NewAppDB.mdb;Jet OLEDB:Engine Type=5")


'Open the Catalog
Cat.ActiveConnection = Cn



'Create the table (you can name it anyway you want)
Tablename.Name = "Table1"


'Tailor to your needs - add as many columns as you need. Watch the DataType!
col.Name = "ID"
col.Type = ADOX.DataTypeEnum.adInteger
col1.Name = "MA"
col1.Type = ADOX.DataTypeEnum.adInteger
col1.Attributes = ADOX.ColumnAttributesEnum.adColNullable
col2.Name = "FName"
col2.Type = ADOX.DataTypeEnum.adVarWChar
col2.Attributes = ADOX.ColumnAttributesEnum.adColNullable
col3.Name = "LName"
col3.Type = ADOX.DataTypeEnum.adVarWChar
col3.Attributes = ADOX.ColumnAttributesEnum.adColNullable
col4.Name = "DOB"
col4.Type = ADOX.DataTypeEnum.adDate
col4.Attributes = ADOX.ColumnAttributesEnum.adColNullable
col5.Name = "Gender"
col5.Type = ADOX.DataTypeEnum.adVarWChar
col5.Attributes = ADOX.ColumnAttributesEnum.adColNullable
col6.Name = "Phone1"
col6.Type = ADOX.DataTypeEnum.adVarWChar
col6.Attributes = ADOX.ColumnAttributesEnum.adColNullable
col7.Name = "Phone2"
col7.Type = ADOX.DataTypeEnum.adVarWChar
col7.Attributes = ADOX.ColumnAttributesEnum.adColNullable
col8.Name = "Notes"
col8.Type = ADOX.DataTypeEnum.adVarWChar
col8.Attributes = ADOX.ColumnAttributesEnum.adColNullable



Tablename.Keys.Append("PrimaryKey", ADOX.KeyTypeEnum.adKeyPrimary, "ID")


'You have to append all your columns you have created above
Tablename.Columns.Append(col)
Tablename.Columns.Append(col1)
Tablename.Columns.Append(col2)
Tablename.Columns.Append(col3)
Tablename.Columns.Append(col4)
Tablename.Columns.Append(col5)
Tablename.Columns.Append(col6)
Tablename.Columns.Append(col7)
Tablename.Columns.Append(col8)



'Append the newly created table to the Tables Collection
Cat.Tables.Append(Tablename)



'User notification
MsgBox("A new empty database was created successfully")


'clean up objects
Tablename = Nothing
Cat = Nothing
Cn.Close()
Cn = Nothing


'Restart application
If Restart2.ShowDialog() = Windows.Forms.DialogResult.OK Then
Application.Restart()
End If

End Sub








Code Snippet



'COMPACT DATABASE

Private Sub CompactDB_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles CompactDB.Click
Dim JRO As JRO.JetEngine
JRO = New JRO.JetEngine


'The first source is the original; the second is the compacted database under another name.
JRO.CompactDatabase("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Program Files\VSoftAppMiss\NewAppDB.mdb;Jet OLEDB:Engine Type=5", "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Program Files\VSoftAppMiss\NewAppDBComp.mdb;Jet OLEDB:Engine Type=5")


'The original (non-compacted) database is deleted.
File.Delete("C:\Program Files\VSoftAppMiss\NewAppDB.mdb")


'The compacted database is renamed to the original database's name.
Rename("C:\Program Files\VSoftAppMiss\NewAppDBComp.mdb", "C:\Program Files\VSoftAppMiss\NewAppDB.mdb")


'User notification
MsgBox("The database was compacted successfully")

End Sub

End Class

View 1 Replies View Related

Can I Export Tables So That Existing Tables In Destination Database Will Be Modified?

Jul 20, 2005

I'm working on an ASP.Net project where I want to test code on a local machine using a local database as a back-end, and then export it to the production machine where it uses the hosting provider's SQL Server database on the back-end. Is there a way to export tables from one SQL Server database to another in such a way that if a table already exists in the destination database, it will be updated to reflect the changes to the local table, without existing data in the destination table being lost? E.g. suppose I change some tables in my local database by adding new fields. Can I "export" these changes to the destination database so that the new fields will be added to the destination tables (and filled in with default values), without losing data in the destination tables?

If I run the DTS Import/Export Wizard that comes with SQL Server and choose "Copy table(s) and view(s) from the source database" and choose the tables I want to copy, there is apparently no option *not* to copy the data, and since I don't want to copy the data, that choice doesn't work. If instead I choose "Copy objects and data between SQL Server databases", then in the following options I can uncheck the "Copy Data" box to prevent data being copied. But for the "Create Destination Objects" choices, I have to uncheck "Drop destination objects first" since I don't want to lose the existing data. But when I uncheck that and try to do the copy, I get collisions between the properties of the local table and the existing destination table, e.g.:

"Table 'wbuser' already has a primary key defined on it."

Is there no way to do what I want using the DTS Import/Export Wizard? Can it be done some other way?

-Bennett

View 3 Replies View Related

Saving Tables That Are Generated By Queries As HTML File Or Sub-tables

Oct 17, 2006

I have trade data tables (about 10) and I need to retrieve information based on input parameters. Each table has about 3-4 million rows.

The table has columns like Commodity, Unit, Quantity, Value, Month, Country

A typical query I use to select data is "Select top 10 commodity, sum(value), sum(quantity), column4, column5, column6 from table where month=xx and country=xxxx".

Column4 = (column2)/(total sum of value) and column5 = (column3)/(total sum of quantity). Column6 = column5/column4.
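
For what it's worth, those derived columns can be computed in one pass with window aggregates (SQL Server 2005 and later). A sketch with hypothetical names, assuming value and quantity are decimal columns; the OVER () totals run across all commodities matching the filter:

-- Share-of-total columns computed alongside the grouped sums.
SELECT TOP 10
    commodity,
    SUM(value)    AS sum_value,
    SUM(quantity) AS sum_quantity,
    SUM(value)    / SUM(SUM(value))    OVER () AS column4,
    SUM(quantity) / SUM(SUM(quantity)) OVER () AS column5,
    (SUM(quantity) / SUM(SUM(quantity)) OVER ())
        / (SUM(value) / SUM(SUM(value)) OVER ()) AS column6
FROM dbo.TradeData
WHERE month = 1 AND country = 'XX'
GROUP BY commodity
ORDER BY SUM(value) DESC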

It takes about 3-4 minutes for the query to complete, and that's a long time, especially since I need to pull this information from a webpage.

I wanted to know if there is an alternate way to pull the data from the server?

I mean, can I write a script that creates tables for all the input combinations, i.e. month x country (12x228), and saves them (sub-table per table) with a naming convention, so that from the web I can just pull the table whose name the input parameters map to, without running any runtime queries on the database?

OR

Can I write a script that creates an HTML file for each table for all input combinations and saves them?

OR

Is there any other solution?

View 1 Replies View Related

Exporting Data From Excel Tables To SQL Server Tables

Dec 9, 2007

Hi all,
I have a large Excel file with one large table which contains data. I've built a SQL Server database and I want to fill it with the data from the Excel file.
 
How can it be done?
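
One common approach, sketched with a hypothetical file path and sheet name: query the workbook directly via OPENROWSET and the Jet provider (the server must allow ad hoc distributed queries for this to run):

-- Pull the worksheet into a new table in one statement.
SELECT *
INTO dbo.ImportedFromExcel
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\Data\MyWorkbook.xls',
                'SELECT * FROM [Sheet1$]')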
 
 
Thanks, Michael.

View 1 Replies View Related

Insert Records From Foxpro Tables To SQL Server Tables

Apr 22, 2004

Hi,

Currently, I'm using the following steps to migrate millions of records from Foxpro tables to SQL Server tables:

1. Transfer the Foxpro records to .dat files and then bcp them into SQL Server tables in a dummy database. All the SQL tables have the same columns as the Foxpro tables.
2. Manipulate the data in the SQL tables of the dummy database and save the manipulated data into the SQL tables of the real database, where the tables may have a different structure from the corresponding Foxpro tables.

I only know the following ways to import Foxpro data into SQL Server:

#1. Transfer Foxpro records to .dat files and then bcp to SQL Server tables
#2. Transfer Foxpro records to .dat files and then Bulk Insert to SQL Server tables
#3. DTS Foxpro records directly to SQL Server tables

I'm wondering whether the following choices would be better than the current way:

1st choice: Change step 1 to use #2 instead of #1
2nd choice: Change step 1 to use #3 instead of #1
3rd choice: Use #3 plus manipulating in DTS to replace step 1 and step 2
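
For the #2 route, a minimal sketch with hypothetical paths and names. BULK INSERT runs inside the server (no separate bcp process) and can be minimally logged under the right recovery model, which often makes it the fastest of the file-based options:

-- Load a delimited .dat file exported from Foxpro into the staging table.
BULK INSERT dummydb.dbo.Staging
FROM 'C:\Export\customers.dat'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK)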

Thank you for any suggestion.

View 2 Replies View Related

Temp. Tables / Variables / Process Keyed Tables ?

Feb 22, 2008

I have 3 checkbox-list panels that query the DB for their items. Panels 2 and 3 need to know the selection in panel 1. The panels allow multiple item selection. Multiple users may use this at the same time, and I wanted to have full separation between the application and the DB - the ASP.net application always uses stored procedures to access the DB. What's the best course of action? Using a permanent 'temp' table on the SQL Server? Accomplishing everything on the client side?

[Web application being built on ASP.net 3.5 (IIS7), connected to SQL Server 2005]
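
If the server-side route is chosen, the usual shape of a permanent 'temp' (process-keyed) table is sketched below with hypothetical names: each caller tags its rows with a session key, so concurrent users never collide, and the stored procedures read back only their own rows:

-- One shared table; rows are scoped per user/session by SessionId.
CREATE TABLE dbo.PanelSelection (
    SessionId uniqueidentifier NOT NULL,
    PanelNo   tinyint          NOT NULL,
    ItemId    int              NOT NULL,
    CreatedAt datetime         NOT NULL DEFAULT GETDATE(),
    CONSTRAINT PK_PanelSelection PRIMARY KEY (SessionId, PanelNo, ItemId)
)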

View 1 Replies View Related

Dynamic Tables Names And Temporary Tables Options

Oct 5, 2007

Firstly, I consider myself quite an experienced SQL Server user, and am now using SQL Server 2005 Express as the main backend of my software.

My problem is thus: the boss needs to run reports; I have designed these reports as SQL procedures, to be executed through an ASP application. Basic and even medium-sized (10,000+ records) reporting runs at an acceptable speed, but for anything larger, IIS timeouts and query timeouts often cause problems.

I subsequently came up with the idea that I could reduce processing times by up to two-thirds by writing information from each calculation stage to a number of tables as the reporting procedure runs, i.e.:

stage 1 writes to table xxx1,
stage 2 reads table xxx1 and writes to table xxx2,
stage 3 reads table xxx2 and writes to table xxx3,
etc, etc, etc
the procedure reads the final table and outputs the information.

This works wonderfully, EXCEPT that two people can't run the same report at the same time, because as one procedure creates and writes to table xxx2, the other procedure tries to drop the table, or read a table that has already been dropped....

Does anyone have any suggestions about how to get around this problem? I have thought about generating the table names dynamically using 'sp_execute', but the statement I need to run is far too long (apparently there is a maximum length you can pass to it), and even breaking it down into sub-procedures is soooooooooooooooo time-consuming and inefficient, having to format statements as strings (replacing quotes and so on).

How can I use multiple tables, or indeed process HUGE procedures, with dynamic table names, or temporary tables?

All answers/suggestions/questions gratefully received. Thanks
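
A sketch of the temporary-table variant, with hypothetical names: # tables are private to each session, so two users running the same procedure get separate copies of every staging table, the drop/read collisions disappear, and no dynamic SQL is needed:

CREATE PROCEDURE dbo.RunBigReport
AS
BEGIN
    -- Each caller gets its own #stage tables; no collisions between sessions.
    SELECT CustomerId, SUM(Amount) AS Total
    INTO #stage1
    FROM dbo.Orders
    GROUP BY CustomerId

    SELECT CustomerId, Total, Total * 0.1 AS Commission
    INTO #stage2
    FROM #stage1

    SELECT * FROM #stage2
    -- The # tables are dropped automatically when the procedure ends.
END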

View 2 Replies View Related

How To Drop An Identity Column From All Tables Tables In A Database

Mar 2, 2008

Does anyone have a script that can drop the identity columns from all the tables in a database? Thanks
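
A sketch of a generator for this, using the SQL Server 2005+ catalog views; review the output before running it. One caveat: dropping the column discards its data; if the aim is only to remove the IDENTITY property while keeping the values, the column has to be rebuilt instead:

-- Generate one DROP COLUMN statement per identity column in the database.
SELECT 'ALTER TABLE ' + QUOTENAME(SCHEMA_NAME(t.schema_id)) + '.'
     + QUOTENAME(t.name) + ' DROP COLUMN ' + QUOTENAME(c.name) + ';'
FROM sys.columns c
JOIN sys.tables t ON t.object_id = c.object_id
WHERE c.is_identity = 1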

View 1 Replies View Related






