Reindexing Tables
Sep 17, 2001
I've just recently tried to perform a scheduled reindexing job with the following command:
EXEC sp_MSforeachtable @command1="print '?' DBCC DBREINDEX ('?')"
Unfortunately, this command has finished only once in the five or six times I've tried to run it. The times it has failed, it has deadlocked behind what seems to be a succession of other transactions. How can I make sure this command finishes?
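A hedged variation on the same command, splitting the PRINT and the DBCC into separate commands so each table's rebuild runs as its own statement (the fill factor of 90 is purely a placeholder); this does not by itself prevent deadlocks, so scheduling the job in a quieter window may still be needed:
EXEC sp_MSforeachtable
     @command1 = "PRINT '?'",
     @command2 = "DBCC DBREINDEX ('?', '', 90)"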
View 1 Replies
Dec 27, 2002
Hello Everyone,
I need to rebuild all indexes on one table. Is there any command that will do this automatically, instead of dropping and recreating them all?
Any help will be appreciated.
Thanks
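A minimal sketch, assuming SQL Server 2000-era syntax and a placeholder table name: DBCC DBREINDEX with an empty index name rebuilds every index on the table in one statement.
DBCC DBREINDEX ('dbo.Orders', '', 0)
-- '' = all indexes on the table; fill factor 0 = keep each index's original fill factor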
View 3 Replies
View Related
May 27, 2008
Hi,
I have been reindexing tables.
Does reindexing a table really improve performance?
View 1 Replies
View Related
Sep 5, 2006
I need to reindex all tables in my database and would like to do this without using a cursor. What is the simplest way to achieve this?
Cheers
Nat
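A hedged sketch without an explicit cursor, using the undocumented sp_MSforeachtable procedure to loop over every user table in the current database (SQL Server 2000 syntax; on 2005 the same loop could issue ALTER INDEX ALL ON ? REBUILD instead):
EXEC sp_MSforeachtable 'DBCC DBREINDEX (''?'')'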
View 2 Replies
View Related
Nov 3, 2005
Hello,
I'm looking for the query or command that will go through all the user tables and tell me which indexes need to be reindexed.
We are having a problem with some of the tables, and we don't know when our tables need to be reindexed other than when operations come to a stop for our company.
Thanks,
Ron
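One hedged way to get a fragmentation report for every index (SQL Server 2000 syntax); a common rule of thumb is to reindex once logical fragmentation climbs above roughly 30%:
DBCC SHOWCONTIG WITH TABLERESULTS, ALL_INDEXES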
View 2 Replies
View Related
Apr 15, 2002
Attempt to fetch logical page (1:166354) in database 'pm_pmc_prod' belongs to object '837120', not to object 'ng_visit'.
I got this error while importing data into the database. When I ran DBCC CHECKDB it gave me Msg 8928, 8942, 8976, etc., with severity level 16...
Is it possible to fix these corrupted tables?
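If restoring from a clean backup is not an option, a heavily hedged last-resort sketch is a repair run; REPAIR_ALLOW_DATA_LOSS can discard data, so the DBCC CHECKDB output should be reviewed first (database name taken from the error message above):
ALTER DATABASE pm_pmc_prod SET SINGLE_USER WITH ROLLBACK IMMEDIATE
DBCC CHECKDB ('pm_pmc_prod', REPAIR_ALLOW_DATA_LOSS)
ALTER DATABASE pm_pmc_prod SET MULTI_USER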
View 2 Replies
View Related
Jun 22, 1999
We load tables from text files for inquiries.
My procedure is to:
1) Truncate the table
2) Use DTS to move the text file into the table
3) Run the command DBCC DBREINDEX (TABLE, '')
Am I wasting my time?
Does SQL 7 rebuild the indices as it loads the data from the text file?
View 1 Replies
View Related
Aug 16, 2006
Hello guys,
Two things:
1) Could somebody explain to me how to reindex all tables in my db?
2) How do I know when I should reindex my db?
Thank you very much for any help!
Regards,
Fabian
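A minimal sketch for question 1, assuming SQL Server 2005 syntax (the undocumented sp_MSforeachtable procedure loops over every user table); for question 2, the usual answer is to check index fragmentation, e.g. with DBCC SHOWCONTIG, and reindex once it gets high:
EXEC sp_MSforeachtable 'ALTER INDEX ALL ON ? REBUILD'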
View 4 Replies
View Related
Nov 15, 2007
Hi All,
Does reindexing tables really improve performance?
What I mean to say is, I've got one script which automatically drops all the indexes in a database and reconstructs them with the same names.
Is it really worth doing that?
thankyou very much
Vinod
Even you learn 1%, Learn it with 100% confidence.
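For what it's worth, a hedged alternative to dropping and recreating indexes by name (SQL Server 2005 syntax; the table name is a placeholder) is an in-place rebuild, which achieves the same defragmentation without ever losing the index definitions:
ALTER INDEX ALL ON dbo.Orders REBUILD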
View 10 Replies
View Related
Oct 18, 2007
All,
I've got a medium sized database in a mirror configuration with witness. The database size is about 300 GB and I would like to reindex all of the tables in the database. My process would go something like this:
1) Backup principal
2) Break the mirror
3) Set the principal database to simple recovery mode
4) Perform the reindexing
5) Backup the principal and transfer that backup to the mirror
6) Restore the backup
7) Re-establish the mirror
Does anyone see any issues with the process itself?
Regards,
Ian
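A rough, hedged T-SQL outline of steps 2-7 (database name, backup path, and endpoint are placeholders); note that mirroring requires FULL recovery, so the database has to be switched back and a fresh full backup taken before the mirror can be rebuilt:
ALTER DATABASE MyDb SET PARTNER OFF                      -- 2) break the mirror
ALTER DATABASE MyDb SET RECOVERY SIMPLE                  -- 3) limit log growth during the rebuild
USE MyDb
EXEC sp_MSforeachtable 'ALTER INDEX ALL ON ? REBUILD'    -- 4) reindex every table
ALTER DATABASE MyDb SET RECOVERY FULL
BACKUP DATABASE MyDb TO DISK = 'D:\Backups\MyDb.bak'     -- 5) full backup, copy to the mirror server
-- On the mirror server:
-- RESTORE DATABASE MyDb FROM DISK = 'D:\Backups\MyDb.bak' WITH NORECOVERY   -- 6) restore
-- ALTER DATABASE MyDb SET PARTNER = 'TCP://mirrorserver:5022'               -- 7) re-establish the mirror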
View 4 Replies
View Related
Dec 27, 2007
Hi experts,
For defragmentation and reindexing they are using the cursor below, and now they have asked me to remove the cursor and schedule a job that does the same thing. What other way can we do this without a cursor? Can you please let me know the solution?
DECLARE @Database VARCHAR(255)
DECLARE @Table VARCHAR(255)
DECLARE @cmd NVARCHAR(500)
DECLARE @fillfactor INT

SET @fillfactor = 90

DECLARE DatabaseCursor CURSOR FOR
    SELECT database_name FROM dbadmin.dbo.tdbstatus WHERE status = 'y'
      AND database_name NOT IN ('master','model','msdb','tempdb')
    ORDER BY 1

OPEN DatabaseCursor
FETCH NEXT FROM DatabaseCursor INTO @Database
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = 'DECLARE TableCursor CURSOR FOR SELECT table_catalog + ''.'' + table_schema + ''.'' + table_name as tableName
        FROM ' + @Database + '.INFORMATION_SCHEMA.TABLES WHERE table_type = ''BASE TABLE'''
    -- create table cursor
    EXEC (@cmd)

    OPEN TableCursor
    FETCH NEXT FROM TableCursor INTO @Table
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @cmd = 'ALTER INDEX ALL ON ' + @Table + ' REBUILD WITH (FILLFACTOR = ' + CONVERT(VARCHAR(3),@fillfactor) + ')'
        EXEC (@cmd)
        FETCH NEXT FROM TableCursor INTO @Table
    END
    CLOSE TableCursor
    DEALLOCATE TableCursor

    FETCH NEXT FROM DatabaseCursor INTO @Database
END
CLOSE DatabaseCursor
DEALLOCATE DatabaseCursor
Your help will be appreciated.
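One hedged, cursor-free sketch uses the undocumented sp_MSforeachdb and sp_MSforeachtable procedures; @replacechar avoids the clash between the two procedures' '?' tokens, and the fill factor of 90 mirrors the original script (the dbadmin.dbo.tdbstatus filter is left out of this sketch):
EXEC sp_MSforeachdb 'USE [?];
IF DB_NAME() NOT IN (''master'',''model'',''msdb'',''tempdb'')
    EXEC sp_MSforeachtable
         @command1    = ''ALTER INDEX ALL ON # REBUILD WITH (FILLFACTOR = 90)'',
         @replacechar = ''#'';'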
View 1 Replies
View Related
Mar 24, 2005
In the Enterprise Manager of SQL Server 2000 I have set up a maintenance plan which rebuilds my indexes. I've studied the documentation, and from what I've learned, what happens behind the curtain is that several DBCC DBREINDEX commands are issued.
Question:
If I have 20 tables and 40 indexes: will SQL Server run the maintenance plan in one single transaction, or will it divide it up into, e.g., 20 or 40 transactions?
-h
View 1 Replies
View Related
Nov 22, 2004
Hi
We are upgrading from SQL 7 to 2000. During the upgrade process, do we have to reindex all tables, or will updating statistics take care of that?
Or do we have to do both?
What is the difference between reindexing and updating statistics?
Thanks
Madhukar Gole
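A hedged side-by-side sketch on a placeholder table: DBCC DBREINDEX rebuilds the physical indexes (and refreshes their statistics as a side effect), while UPDATE STATISTICS only resamples the statistics and does nothing about fragmentation:
DBCC DBREINDEX ('dbo.Orders')
UPDATE STATISTICS dbo.Orders WITH FULLSCAN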
View 5 Replies
View Related
Dec 14, 2004
Hi All,
Just after some feedback on a scenario where we have full recovery (full logging) set up on one of the databases, and the transaction logs are backed up every 60 minutes. Between 00:00 and 01:00 the log jumps from a few thousand KB up to over 1.7 GB.
I did some profiling for this time, and it appears that this jump is related to the reindexing of the indexes on the database.
Is this normal for the log file to jump in size so much? Or is this an indication of some other issue (potentially with the indexes)?
Is there any way that the reindexing can be excluded from the log files or is this a necessity?
Thanks in advance for your help.
Cheers
Troy
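One hedged option, if point-in-time recovery can be sacrificed for the maintenance window, is to switch to BULK_LOGGED recovery around the rebuild; index rebuilds are minimally logged there, although the following log backup still carries the changed extents (database name and path are placeholders):
ALTER DATABASE MyDatabase SET RECOVERY BULK_LOGGED
-- ... run the reindex job here ...
ALTER DATABASE MyDatabase SET RECOVERY FULL
BACKUP LOG MyDatabase TO DISK = 'D:\Backups\MyDatabase_log.trn'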
View 4 Replies
View Related
Jul 23, 2005
Folks, I work on a system which is growing rapidly, with the number of transactions we process growing on a daily basis. While this is good news for the business, maintenance is starting to become an issue, as the database is the backend for a website which cannot be down for a lengthy period of time.
While I do defrag the indexes, periodically the indexes do need to be rebuilt. When this happens, the process locks pages and transactions start getting bounced out.
Are there any third-party utilities which will rebuild an index without this locking occurring? Any help in pointing me in the right direction would be appreciated.
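Rather than a third-party tool, one hedged option on SQL Server 2005 Enterprise Edition is an online rebuild, which avoids most of the long blocking locks of an offline rebuild (the table name is a placeholder):
ALTER INDEX ALL ON dbo.Orders REBUILD WITH (ONLINE = ON)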
View 1 Replies
View Related
Apr 30, 2007
Does someone know if doing a reindex on a clustered or non-clustered index causes the snapshot file to grow? In other words, is the data that makes up the snapshot copied from the source to the snapshot database? If a normal reindex is done on the underlying database, will it block users from accessing the snapshot? Any help would be appreciated.
View 1 Replies
View Related
Apr 25, 2008
Hi,
We have scheduled a job for DB reindexing (maintenance plan) for an OLTP database on Sunday.
We have used mirroring for automatic failover with a witness server; now the DB reindexing job fails after 30 minutes without any error.
Please let me know why the database reindexing job fails.
Regards
Sufian
View 16 Replies
View Related
Jul 18, 2007
Hello. When reviewing the DBCC SHOWCONTIG immediately after reindexing all indexes on a database, I see the ExtentFragmentation has values like 50 to 70%... These are SQL 2005 tables with clustered PK's, no large varchars/blobs, and at least 100 pages in the index... The numbers related to PAGE fragmentation are ok after reindexing, but not the EXTENT fragmentation numbers.
I noticed the drive is in need of being defragged at the disk level. Is that a reason why reindexing doesn't fix the extent fragmentation numbers? Any other ideas on this? I can try defragging the disk over the weekend, bringing the database offline then, but any other thoughts on why the extents show these high percentages? Is there any command to reset them that maybe isn't happening? For example, must I run DBCC UPDATEUSAGE to get valid extent fragmentation numbers?
If there were MANY autogrows on the files, is that a different level of fragmentation? And how could all those small pieces of files be pulled back together? Thanks, Bruce
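For a second opinion on the 2005 box, a hedged sketch of the DMV view of fragmentation, which reports logical fragmentation per index (the extent scan fragmentation figure from SHOWCONTIG is known to be unreliable when data is spread over multiple files, so it is usually safe to ignore):
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name AS index_name,
       ips.avg_fragmentation_in_percent,
       ips.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.page_count >= 100
ORDER BY ips.avg_fragmentation_in_percent DESC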
View 7 Replies
View Related
Jan 24, 2008
I'm new to my company, although not new to SQL 2005, and I found something interesting. I don't have an ERD yet, and so I was asking a co-worker what table some data was in; they told me a table that is NOT in SQL Server 2005's list of tables, views or synonyms.
I thought that was strange, and so I searched over and over again and still I couldn't find it. Then I ran a select statement against the table that Access thinks exists and SQL Server does not show, and to my shock, the select statement pulled in data!
So how did this happen? How can I find the object in SSMS's folder listing of tables/views or whatever, and what am I overlooking?
Thanks,
Keith
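A hedged sketch for tracking the mystery object down: search the system catalog for the name across every object type and schema, since it may be a synonym, a view, or a table in a non-dbo schema that the folder listing is hiding ('MysteryTable' is a placeholder):
SELECT s.name AS schema_name, o.name AS object_name, o.type_desc
FROM sys.objects AS o
JOIN sys.schemas AS s ON s.schema_id = o.schema_id
WHERE o.name LIKE '%MysteryTable%'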
View 4 Replies
View Related
Dec 7, 2006
We have 20-30 normalized tables in our database. We also have 4 tables where we store the calculated data from those normalized tables. The reason we have these 4 denormalized tables is that when we try to do the calculation on the fly, our site becomes very slow, so we have precalculated the data and stored it in the 4 tables.
The process we use for the precalculation does the calculation and stores the result in a temp table. It then compares the temp table with the denormalized tables, inserts new rows, deletes the old ones, and updates any changes. This process takes about 20-60 minutes. It takes a long time because we first do the calculation regardless of changes and then do a compare to see what has changed, remove any rows that were deleted, insert new rows, and update the changes.
Now we would like to capture the rows/columns changed in the normalized tables and apply only those changes to the denormalized tables, which we hope will reduce the processing time by at least 50%. We have upgraded to SQL Server 2005, so we would like to use the new technology for this process. I have to design a model to capture the changes and update only those changes. I have the list of normalized tables and the columns which will affect the end results. I thought of using triggers or the OUTPUT clause to capture the changes. Please help me with any ideas on how to design the new process.
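A minimal, hedged sketch of the trigger approach: log just the keys of changed rows into a staging table, so the recalculation can be limited to those keys (all names here - dbo.Orders, dbo.OrderChanges, OrderID - are placeholders):
CREATE TABLE dbo.OrderChanges
(
    OrderID   INT      NOT NULL,
    ChangedAt DATETIME NOT NULL DEFAULT GETDATE()
)
GO
CREATE TRIGGER trg_Orders_Capture
ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.OrderChanges (OrderID)
    SELECT OrderID FROM inserted
    UNION
    SELECT OrderID FROM deleted;
END
GO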
View 3 Replies
View Related
Oct 7, 2015
I am using the following select statement to get the row count from SQL linked server table.
SELECT Count(*) FROM OPENQUERY (CMSPROD, 'Select * From MHDLIB.MHSERV0P')
MHDLIB is the library name in the IBM DB2 database. The above query gives me only the row count of table MHSERV0P. However, I need to get the names, row counts, and sizes of all tables that exist in the MHDLIB library. Is that possible at all?
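It may be possible by querying the DB2 catalog through the same linked server; a heavily hedged sketch, since the catalog view and column names vary by DB2 platform and version and should be checked against IBM's documentation (row counts and sizes, where exposed, come from the companion statistics views):
SELECT *
FROM OPENQUERY (CMSPROD,
     'SELECT TABLE_SCHEMA, TABLE_NAME, TABLE_TYPE
      FROM QSYS2.SYSTABLES
      WHERE TABLE_SCHEMA = ''MHDLIB''')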
View 1 Replies
View Related
Feb 5, 2007
From Newbie to Newbie,
Add reference to:
'Microsoft ActiveX Data Objects 2.8 Library
'Microsoft ADO Ext.2.8 for DDL and Security
'Microsoft Jet and Replication Objects 2.6 Library
--------------------------------------------------------
Imports System.IO
Imports System.IO.File
Code Snippet
'BACKUP DATABASE
Public Shared Sub Restart()
End Sub
'You have to have a BackUps folder included into your release!
Private Sub BackUpDB_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles BackUpDB.Click
Dim addtimestamp As String
Dim f As String
Dim z As String
Dim g As String
Dim Dialogbox1 As New Backupinfo
addtimestamp = Format(Now(), "_MMddyy_HHmm")
z = "C:Program FilesVSoftAppMissNewAppDB.mdb"
g = addtimestamp + ".mdb"
'Add timestamp and .mdb ending to NewAppDB
f = "C:Program FilesVSoftAppMissBackUpsNewAppDB" & g & ""
Try
File.Copy(z, f)
Catch ex As System.Exception
System.Windows.Forms.MessageBox.Show(ex.Message)
End Try
MsgBox("Backup completed succesfully.")
If Dialogbox1.ShowDialog = Windows.Forms.DialogResult.OK Then
End If
End Sub
Code Snippet
'RESTORE DATABASE
Private Sub RestoreDB_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles RestoreDB.Click
Dim Filename As String
Dim Restart1 As New RestoreRestart
Dim overwrite As Boolean
overwrite = True
Dim xi As String
With OpenFileDialog1
.Filter = "Database files (*.mdb)|*.mdb|" & "All files|*.*"
If .ShowDialog() = Windows.Forms.DialogResult.OK Then
Filename = .FileName
'Strips restored database from the timestamp
xi = "C:Program FilesVSoftAppMissNewAppDB.mdb"
File.Copy(Filename, xi, overwrite)
End If
End With
'Notify user
MsgBox("Data restored successfully")
Restart()
If Restart1.ShowDialog = Windows.Forms.DialogResult.OK Then
Application.Restart()
End If
End Sub
Code Snippet
'CREATE NEW DATABASE
Private Sub CreateNewDB_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles CreateNewDB.Click
Dim L As New DatabaseEraseWarning
Dim Cat As ADOX.Catalog
Cat = New ADOX.Catalog
Dim Restart2 As New NewDBRestart
If File.Exists("C:Program FilesVSoftAppMissNewAppDB.mdb") Then
If L.ShowDialog() = Windows.Forms.DialogResult.Cancel Then
Exit Sub
Else
File.Delete("C:Program FilesVSoftAppMissNewAppDB.mdb")
End If
End If
Cat.Create("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:Program FilesVSoftAppMissNewAppDB.mdb;
Jet OLEDB:Engine Type=5")
Dim Cn As ADODB.Connection
'Dim Cat As ADOX.Catalog
Dim Tablename As ADOX.Table
'Tailor these according to your needs - add as many columns as you need.
Dim col As ADOX.Column = New ADOX.Column
Dim col1 As ADOX.Column = New ADOX.Column
Dim col2 As ADOX.Column = New ADOX.Column
Dim col3 As ADOX.Column = New ADOX.Column
Dim col4 As ADOX.Column = New ADOX.Column
Dim col5 As ADOX.Column = New ADOX.Column
Dim col6 As ADOX.Column = New ADOX.Column
Dim col7 As ADOX.Column = New ADOX.Column
Dim col8 As ADOX.Column = New ADOX.Column
Cn = New ADODB.Connection
Cat = New ADOX.Catalog
Tablename = New ADOX.Table
'Open the connection
Cn.Open("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:Program FilesVSoftAppMissNewAppDB.mdb;Jet
OLEDB:Engine Type=5")
'Open the Catalog
Cat.ActiveConnection = Cn
'Create the table (you can name it anyway you want)
Tablename.Name = "Table1"
'Tailor according to your needs - add as many columns as you need. Watch the DataType!
col.Name = "ID"
col.Type = ADOX.DataTypeEnum.adInteger
col1.Name = "MA"
col1.Type = ADOX.DataTypeEnum.adInteger
col1.Attributes = ADOX.ColumnAttributesEnum.adColNullable
col2.Name = "FName"
col2.Type = ADOX.DataTypeEnum.adVarWChar
col2.Attributes = ADOX.ColumnAttributesEnum.adColNullable
col3.Name = "LName"
col3.Type = ADOX.DataTypeEnum.adVarWChar
col3.Attributes = ADOX.ColumnAttributesEnum.adColNullable
col4.Name = "DOB"
col4.Type = ADOX.DataTypeEnum.adDate
col4.Attributes = ADOX.ColumnAttributesEnum.adColNullable
col5.Name = "Gender"
col5.Type = ADOX.DataTypeEnum.adVarWChar
col5.Attributes = ADOX.ColumnAttributesEnum.adColNullable
col6.Name = "Phone1"
col6.Type = ADOX.DataTypeEnum.adVarWChar
col6.Attributes = ADOX.ColumnAttributesEnum.adColNullable
col7.Name = "Phone2"
col7.Type = ADOX.DataTypeEnum.adVarWChar
col7.Attributes = ADOX.ColumnAttributesEnum.adColNullable
col8.Name = "Notes"
col8.Type = ADOX.DataTypeEnum.adVarWChar
col8.Attributes = ADOX.ColumnAttributesEnum.adColNullable
Tablename.Keys.Append("PrimaryKey", ADOX.KeyTypeEnum.adKeyPrimary, "ID")
'You have to append all your columns you have created above
Tablename.Columns.Append(col)
Tablename.Columns.Append(col1)
Tablename.Columns.Append(col2)
Tablename.Columns.Append(col3)
Tablename.Columns.Append(col4)
Tablename.Columns.Append(col5)
Tablename.Columns.Append(col6)
Tablename.Columns.Append(col7)
Tablename.Columns.Append(col8)
'Append the newly created table to the Tables Collection
Cat.Tables.Append(Tablename)
'User notification
MsgBox("A new empty database was created successfully")
'clean up objects
Tablename = Nothing
Cat = Nothing
Cn.Close()
Cn = Nothing
'Restart application
If Restart2.ShowDialog() = Windows.Forms.DialogResult.OK Then
Application.Restart()
End If
End Sub
Code Snippet
'COMPACT DATABASE
Private Sub CompactDB_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles CompactDB.Click
Dim JRO As JRO.JetEngine
JRO = New JRO.JetEngine
'The first source is the original, the second is the compacted database under another name.
JRO.CompactDatabase("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:Program
FilesVSoftAppMissNewAppDB.mdb; Jet OLEDB:Engine Type=5", "Provider=Microsoft.Jet.OLEDB.4.0;
Data Source=C:Program FilesVSoftAppMissNewAppDBComp.mdb; JetOLEDB:Engine Type=5")
'The original (non-compacted) database is deleted.
File.Delete("C:Program FilesVSoftAppMissNewAppDB.mdb")
'The compacted database is renamed to the original database's name.
Rename("C:Program FilesVSoftAppMissNewAppDBComp.mdb", "C:Program FilesVSoftAppMissNewAppDB.mdb")
'User notification
MsgBox("The database was compacted successfully")
End Sub
End Class
View 1 Replies
View Related
Jul 20, 2005
I'm working on an ASP.Net project where I want to test code on a local machine using a local database as a back-end, and then export it to the production machine, where it uses the hosting provider's SQL Server database on the back-end. Is there a way to export tables from one SQL Server database to another in such a way that if a table already exists in the destination database, it will be updated to reflect the changes to the local table, without existing data in the destination table being lost? E.g. suppose I change some tables in my local database by adding new fields. Can I "export" these changes to the destination database so that the new fields will be added to the destination tables (and filled in with default values), without losing data in the destination tables?
If I run the DTS Import/Export Wizard that comes with SQL Server and choose "Copy table(s) and view(s) from the source database" and choose the tables I want to copy, there is apparently no option *not* to copy the data, and since I don't want to copy the data, that choice doesn't work. If instead of "Copy table(s) and view(s) from the source database" I choose "Copy objects and data between SQL Server databases", then on the following options I can uncheck the "Copy Data" box to prevent data being copied. But for the "Create Destination Objects" choices, I have to uncheck "Drop destination objects first", since I don't want to lose the existing data. But when I uncheck that and try to do the copy, I get collisions between the properties of the local table and the existing destination table, e.g.:
"Table 'wbuser' already has a primary key defined on it."
Is there no way to do what I want using the DTS Import/Export Wizard? Can it be done some other way?
-Bennett
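If only the schema delta is needed, one hedged alternative to the wizard is scripting the changes directly against the destination, which preserves existing rows; 'NewField' and its default are placeholders for whatever was added locally (the 'wbuser' table name comes from the error above):
ALTER TABLE dbo.wbuser
    ADD NewField INT NOT NULL CONSTRAINT DF_wbuser_NewField DEFAULT 0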
View 3 Replies
View Related
Oct 17, 2006
I have trade data tables (about 10) and I need to retrieve information based on input parameters. Each table has about 3-4 million rows.
The table has columns like Commodity, Unit, Quantity, Value, Month, Country
A typical query I use to select data is "Select top 10 commodity , sum(value), sum(quantity) , column4, column5, column6 from table where month=xx and country=xxxx"
The column4 = (column2)/(total sum of value) and column 5=(column3)/(total sum of quantity). Column6=column5/column4.
It takes about 3-4 minutes for the query to complete, and that's a long time, especially since I need to pull this information from a webpage.
I wanted to know if there is an alternate way to pull the data from the server?
I mean, can I write a script that creates tables for all the input combinations, i.e. month x country (12 x 228), and saves them as sub-tables with a naming convention, so that from the web I can just pull the table whose name maps to the input parameters instead of running any queries against the database at runtime?
OR
Can I write a script that creates an HTML file for each table for all input combinations and saves them?
OR
Does any other solution exist?
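A hedged alternative to 12 x 228 separate tables is a single pre-aggregated summary table keyed by (Month, Country, Commodity); the web query then only filters this much smaller table (all names are placeholders based on the column list above):
SELECT [Month], Country, Commodity,
       SUM(Value)    AS TotalValue,
       SUM(Quantity) AS TotalQuantity
INTO   dbo.TradeSummary
FROM   dbo.TradeData
GROUP BY [Month], Country, Commodity

CREATE CLUSTERED INDEX IX_TradeSummary ON dbo.TradeSummary ([Month], Country, Commodity)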
View 1 Replies
View Related
Dec 9, 2007
Hi all,
I have a large Excel file with one large table which contains data. I've built a SQL Server database and I want to fill it with the data from the Excel file.
How can this be done?
Thanks, Michael.
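One hedged sketch, assuming the Jet 4.0 provider is installed on the server and ad hoc distributed queries are enabled (file path, sheet name, and target table are placeholders):
SELECT *
INTO   dbo.ImportedData
FROM   OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                  'Excel 8.0;Database=C:\Data\LargeFile.xls;HDR=YES',
                  'SELECT * FROM [Sheet1$]')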
View 1 Replies
View Related
Apr 22, 2004
Hi,
Currently, I'm using the following steps to migrate millions of records from Foxpro tables to SQL Server tables:
1. Transfer Foxpro records to .dat files and then bcp to SQL Server tables in a dummy database. All the SQL tables have the same columns as the Foxpro tables.
2. Manipulate the data in the SQL tables of the dummy database and save the manipulated data into the SQL tables of the real database where the tables may have different structure from the corresponding Foxpro tables.
I only know the following ways to import Foxpro data into SQL Server:
#1. Transfer Foxpro records to .dat files and then bcp to SQL Server tables
#2. Transfer Foxpro records to .dat files and then Bulk Insert to SQL Server tables
#3. DTS Foxpro records directly to SQL Server tables
I'm wondering whether the following choices would be better than the current approach:
1st choice: Change step 1 to use #2 instead of #1
2nd choice: Change step 1 to use #3 instead of #1
3rd choice: Use #3 plus manipulating in DTS to replace step 1 and step 2
Thank you for any suggestion.
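For reference, a hedged sketch of option #2 (BULK INSERT from the .dat file straight into the staging table); the path, table name, and delimiters are placeholders and must match how the FoxPro export was written:
BULK INSERT DummyDB.dbo.StagingTable
FROM 'D:\Export\customers.dat'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', TABLOCK)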
View 2 Replies
View Related
Feb 22, 2008
I have 3 CheckBoxList panels that query the DB for their items. Panels nº 2 and 3 need to know the selection on panel nº 1. The panels allow multiple item selection. Multiple users may use this at the same time, and I wanted to have a full separation between the application and the DB. The ASP.NET application always uses stored procedures to access the DB. What's the best course of action? Using a permanent 'temp' table on the SQL Server? Accomplishing everything on the client side?
[Web application being built on ASP.NET 3.5 (IIS7) connected to SQL Server 2005]
View 1 Replies
View Related
Oct 5, 2007
Firstly, I consider myself quite an experienced SQL Server user, and am now using SQL Server 2005 Express for the main backend of my software.
My problem is thus: the boss needs to run reports; I have designed these reports as SQL procedures, to be executed through an ASP application. Basic, and even medium sized (10,000+ records), reporting runs at an acceptable speed, but for anything larger, IIS timeouts and query timeouts often cause problems.
I subsequently came up with the idea that I could reduce processing times by up to two-thirds by writing information from each calculation stage to a number of tables as the reporting procedure runs, i.e.
stage 1 writes to table xxx1,
stage 2 reads table xxx1 and writes to table xxx2,
stage 3 reads table xxx2 and writes to table xxx3,
etc, etc, etc,
and the procedure reads the final table and outputs the information.
This works wonderfully, EXCEPT that two people can't run the same report at the same time, because as one procedure creates and writes to table xxx2, the other procedure tries to drop the table, or reads a table that has already been dropped...
Does anyone have any suggestions about how to get around this problem? I have thought about generating the table names dynamically using 'sp_execute', but the statement I need to run is far too long (apparently there is a maximum length you can pass to it), and even breaking it down into sub-procedures is extremely time consuming and inefficient, having to format statements as strings (replacing quotes and so on).
How can I use multiple tables, or indeed process HUGE procedures, with dynamic table names or temporary tables?
All answers/suggestions/questions gratefully received. Thanks
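One hedged sketch of the temporary-table route: session-scoped #temp tables give each caller their own working tables, so two users running the same report no longer collide (the staging columns are placeholders for the real calculation stages):
CREATE TABLE #Stage1 (AccountID INT, Amount MONEY)
INSERT INTO #Stage1 (AccountID, Amount)
SELECT AccountID, SUM(Amount)
FROM dbo.Transactions
GROUP BY AccountID

CREATE TABLE #Stage2 (AccountID INT, RunningTotal MONEY)
-- ... stage 2 reads #Stage1, stage 3 reads #Stage2, and so on;
-- the tables are dropped automatically when the procedure or session ends.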
View 2 Replies
View Related
Mar 2, 2008
Does anyone have a script that can drop the Identity columns from all the tables in a database? Thanks
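Not a ready-made script, but a hedged sketch (assuming SQL Server 2005) that generates, without executing, a DROP COLUMN statement for every identity column; the output needs review, since constraints and indexes on those columns will block the drop:
SELECT 'ALTER TABLE ' + QUOTENAME(SCHEMA_NAME(t.schema_id)) + '.' + QUOTENAME(t.name) +
       ' DROP COLUMN ' + QUOTENAME(c.name) AS drop_stmt
FROM sys.tables AS t
JOIN sys.columns AS c ON c.object_id = t.object_id
WHERE c.is_identity = 1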
View 1 Replies
View Related
Aug 12, 2006
Hello all,
Being still a relative newcomer to SQL Server (people may say I'm trying to take on too much being somewhat inexperienced once they read about the problem I'm trying to tackle, but alas...) I'm running into the following problem: I need to create tables in my user database on the fly (using stored procedures) so that each table can be created many times in the database but only once for every user. The tables should be named something like "username.Table1", "username.Table2" etc. as opposed to "dbo.Table1". I then want to use the stored procedure from .NET/C# in my web application, so that I can create the complete set of user tables for each of my clients.
Now, I tackled the stored procedure part (that is, it creates all the tables I need with all the parameters I want) and am able to use it from my web application (which took some time to learn but proved quite easy to do), but I cannot seem to get it coupled to the current user (instead of the dbo). Every time I try, the tables are created as dbo.Table1, and when I try to create a new set, it gives a warning ("table with name such and so already exists..."). I made sure to log in as an authenticated user (using forms authentication) before trying to create the tables, but this gives the aforementioned result.
What am I doing wrong? I use Visual Web Developer Express, SQL Server 2005 Express and IIS version 5.1
Please help :-D
Greetingz,
DJ Roelfsema
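A hedged sketch of one way to get per-user tables: create a schema per user and build the qualified table name explicitly in the procedure, rather than relying on the caller's default schema (@UserName, Table1, and the column list are placeholders):
CREATE PROCEDURE dbo.CreateUserTables
    @UserName SYSNAME
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @sql NVARCHAR(MAX);

    -- Create the user's schema once
    IF NOT EXISTS (SELECT 1 FROM sys.schemas WHERE name = @UserName)
    BEGIN
        SET @sql = N'CREATE SCHEMA ' + QUOTENAME(@UserName);
        EXEC (@sql);
    END

    -- Create the user's copy of Table1 as username.Table1
    SET @sql = N'CREATE TABLE ' + QUOTENAME(@UserName) + N'.Table1 (ID INT PRIMARY KEY)';
    EXEC (@sql);
END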
View 6 Replies
View Related
Nov 17, 2007
hi. I am very new to databases.
At the moment I have one big database table with columns such as author, book title, rating, ISBN, publisher, publish date, copyright date, graphical image review, availability, and genre.
I want to break this large table down into, say, four smaller relational tables, partly because it is so large and partly to make it easier to bind the data to the web site. I also want to keep the large table.
How can I do this? I use SQL Server Management Studio Express. The database is on my host's server. I also use VWD 2005 Express Edition.
thanks for your time
nick
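A hedged sketch of one possible split, keeping the original wide table and adding lookup tables for the repeated values (all names and types are guesses from the column list above):
CREATE TABLE dbo.Authors    (AuthorID    INT IDENTITY PRIMARY KEY, AuthorName    NVARCHAR(200) NOT NULL)
CREATE TABLE dbo.Publishers (PublisherID INT IDENTITY PRIMARY KEY, PublisherName NVARCHAR(200) NOT NULL)
CREATE TABLE dbo.Genres     (GenreID     INT IDENTITY PRIMARY KEY, GenreName     NVARCHAR(100) NOT NULL)

CREATE TABLE dbo.Books
(
    ISBN          VARCHAR(13)   PRIMARY KEY,
    Title         NVARCHAR(400) NOT NULL,
    AuthorID      INT NOT NULL REFERENCES dbo.Authors(AuthorID),
    PublisherID   INT NOT NULL REFERENCES dbo.Publishers(PublisherID),
    GenreID       INT NOT NULL REFERENCES dbo.Genres(GenreID),
    Rating        DECIMAL(3,1) NULL,
    PublishDate   DATETIME NULL,
    CopyrightDate DATETIME NULL,
    Availability  BIT NULL
)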
View 8 Replies
View Related
Dec 19, 2003
OK, say I would like to build a table for the following questions (say 6 questions for the sake of argument):
Do I just store the index of the RadioButtonList? What are some resources that I could look at? Should I make a lookup table?
5) If money were no object, I would live . . .
Prefer not to say
On a tropical island
In a New York penthouse
In an English castle
On a Texas ranch
In a Malibu beach house
In a mountain retreat (Selected)
On the moon
None of the above
1 --->
2 --->
3 --->
4 --->
5) --->6 This is the question we are looking at.
6 --->
How should I create the database table for the above example?
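A hedged sketch of the lookup-table route: one row per question, one row per answer option, and one row per response storing the chosen option's key rather than the RadioButtonList index (all names are placeholders):
CREATE TABLE dbo.Questions
(
    QuestionID   INT IDENTITY PRIMARY KEY,
    QuestionText NVARCHAR(500) NOT NULL
)
CREATE TABLE dbo.AnswerOptions
(
    OptionID   INT IDENTITY PRIMARY KEY,
    QuestionID INT NOT NULL REFERENCES dbo.Questions(QuestionID),
    OptionText NVARCHAR(200) NOT NULL,
    SortOrder  INT NOT NULL
)
CREATE TABLE dbo.Responses
(
    ResponseID INT IDENTITY PRIMARY KEY,
    UserID     INT NOT NULL,
    OptionID   INT NOT NULL REFERENCES dbo.AnswerOptions(OptionID),
    AnsweredAt DATETIME NOT NULL DEFAULT GETDATE()
)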
View 1 Replies
View Related