How To Quickly Fill A Database
Apr 23, 2008
I have a database (currently about 300 MB) with a couple of tables that I need to expand to 4 gigabytes in order to run some tests.
Does anyone have a script or some method that would quickly populate my tables with random data, so that I can grow the database to the desired size for testing?
Thanks
I have SQL Server 2005 Express, with Management Studio installed too.
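One quick approach (a minimal sketch, not a tested utility; the dbo.FillerData table and the 1 KB payload are assumptions) is to create a throwaway table and keep doubling it with INSERT ... SELECT until the database reaches the size you need; you could equally cross-join your existing tables into themselves:

-- hypothetical filler table; drop it once the tests are finished
CREATE TABLE dbo.FillerData
(
    Id int IDENTITY(1,1) PRIMARY KEY,
    Payload char(1000) NOT NULL,
    RandomValue uniqueidentifier NOT NULL
)

-- seed one row, then double the row count on every pass;
-- 2^22 rows of roughly 1 KB each is in the region of 4 GB
INSERT INTO dbo.FillerData (Payload, RandomValue) VALUES (REPLICATE('x', 1000), NEWID())

DECLARE @i int
SET @i = 0
WHILE @i < 22
BEGIN
    INSERT INTO dbo.FillerData (Payload, RandomValue)
    SELECT Payload, NEWID() FROM dbo.FillerData
    SET @i = @i + 1
END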
View 4 Replies
May 19, 2004
There have been several threads about changing a database's collation, but none have come up with an easy answer before.
The suggestion before was to create an empty database with the correct collation and then copy the data across.
However, this is hard work, as you have to populate the tables in a specific order so as not to violate foreign keys, etc. You can't just DTS the whole data set across.
The scripts we have written to do the job follow. If people use them, please add to this thread whether or not they worked for you.
First we change the default collation, then change all the types in the database to match the new collation.
===================
--script to change database collation - James Agnini
--
--Replace <DATABASE> with the database name
--Replace <COLLATION> with the collation, eg SQL_Latin1_General_CP1_CI_AS
--
--After running this script, run the script to rebuild all indexes
ALTER DATABASE <DATABASE> COLLATE <COLLATION>
exec sp_configure 'allow updates',1
go
reconfigure with override
go
update syscolumns
set collationid = (select top 1 collationid from systypes where systypes.xtype=syscolumns.xtype)
where collationid <> (select top 1 collationid from systypes where systypes.xtype=syscolumns.xtype)
go
exec sp_configure 'allow updates',0
go
reconfigure with override
go
===================
As we have directly edited system tables, we need to run a script to rebuild all the indexes. Otherwise you will get strange results, such as string comparisons between different tables not working.
The indexes have to actually be dropped and recreated in separate statements.
You can't use DBCC DBREINDEX or CREATE INDEX with the DROP_EXISTING option, as they won't do anything (thanks to SQL Server "optimization").
This script loops through the tables and then loops through the indexes and unique constraints in separate sections. It gets the index information, then drops and re-creates each one.
(The script could probably be tidied up by moving the duplicated code into a stored procedure.)
====================
--Script to rebuild all table indexes, Version 0.1, May 2004 - James Agnini
--
--Database backups should be made before running any set of scripts that update databases.
--All users should be out of the database before running this script
print 'Rebuilding indexes for all tables:'
go
DECLARE @Table_Name varchar(128)
declare @Index_Name varchar(128)
declare @IndexId int
declare @IndexKey int
DECLARE Table_Cursor CURSOR FOR
select TABLE_NAME from INFORMATION_SCHEMA.tables where table_type != 'VIEW'
OPEN Table_Cursor
FETCH NEXT FROM Table_Cursor
INTO @Table_Name
--loop through tables
WHILE @@FETCH_STATUS = 0
BEGIN
print ''
print @Table_Name
DECLARE Index_Cursor CURSOR FOR
select indid, name from sysindexes
where id = OBJECT_ID(@Table_Name) and indid > 0 and indid < 255 and (status & 64)=0 and
not exists(Select top 1 NULL from INFORMATION_SCHEMA.TABLE_CONSTRAINTS
where TABLE_NAME = @Table_Name AND (CONSTRAINT_TYPE = 'PRIMARY KEY' or CONSTRAINT_TYPE = 'UNIQUE') and
CONSTRAINT_NAME = name)
order by indid
OPEN Index_Cursor
FETCH NEXT FROM Index_Cursor
INTO @IndexId, @Index_Name
--loop through indexes
WHILE @@FETCH_STATUS = 0
begin
declare @SQL_String varchar(256)
set @SQL_String = 'drop index '
set @SQL_String = @SQL_String + @Table_Name + '.' + @Index_Name
set @SQL_String = @SQL_String + ';create '
if( (select INDEXPROPERTY ( OBJECT_ID(@Table_Name) , @Index_Name , 'IsUnique')) =1)
set @SQL_String = @SQL_String + 'unique '
if( (select INDEXPROPERTY ( OBJECT_ID(@Table_Name) , @Index_Name , 'IsClustered')) =1)
set @SQL_String = @SQL_String + 'clustered '
set @SQL_String = @SQL_String + 'index '
set @SQL_String = @SQL_String + @Index_Name
set @SQL_String = @SQL_String + ' on '
set @SQL_String = @SQL_String + @Table_Name
set @SQL_String = @SQL_String + '('
--form column list
SET @IndexKey = 1
-- Loop through index columns, INDEX_COL can be from 1 to 16.
WHILE @IndexKey <= 16 and INDEX_COL(@Table_Name, @IndexId, @IndexKey)
IS NOT NULL
BEGIN
IF @IndexKey != 1
set @SQL_String = @SQL_String + ','
set @SQL_String = @SQL_String + index_col(@Table_Name, @IndexId, @IndexKey)
SET @IndexKey = @IndexKey + 1
END
set @SQL_String = @SQL_String + ')'
print @SQL_String
EXEC (@SQL_String)
FETCH NEXT FROM Index_Cursor
INTO @IndexId, @Index_Name
end
CLOSE Index_Cursor
DEALLOCATE Index_Cursor
--loop through unique constraints
DECLARE Contraint_Cursor CURSOR FOR
select indid, name from sysindexes
where id = OBJECT_ID(@Table_Name) and indid > 0 and indid < 255 and (status & 64)=0 and
exists(Select top 1 NULL from INFORMATION_SCHEMA.TABLE_CONSTRAINTS
where TABLE_NAME = @Table_Name AND CONSTRAINT_TYPE = 'UNIQUE' and CONSTRAINT_NAME = name)
order by indid
OPEN Contraint_Cursor
FETCH NEXT FROM Contraint_Cursor
INTO @IndexId, @Index_Name
--loop through indexes
WHILE @@FETCH_STATUS = 0
begin
set @SQL_String = 'alter table '
set @SQL_String = @SQL_String + @Table_Name
set @SQL_String = @SQL_String + ' drop constraint '
set @SQL_String = @SQL_String + @Index_Name
set @SQL_String = @SQL_String + '; alter table '
set @SQL_String = @SQL_String + @Table_Name
set @SQL_String = @SQL_String + ' WITH NOCHECK add constraint '
set @SQL_String = @SQL_String + @Index_Name
set @SQL_String = @SQL_String + ' unique '
if( (select INDEXPROPERTY ( OBJECT_ID(@Table_Name) , @Index_Name , 'IsClustered')) =1)
set @SQL_String = @SQL_String + 'clustered '
set @SQL_String = @SQL_String + '('
--form column list
SET @IndexKey = 1
-- Loop through index columns, INDEX_COL can be from 1 to 16.
WHILE @IndexKey <= 16 and INDEX_COL(@Table_Name, @IndexId, @IndexKey)
IS NOT NULL
BEGIN
IF @IndexKey != 1
set @SQL_String = @SQL_String + ','
set @SQL_String = @SQL_String + index_col(@Table_Name, @IndexId, @IndexKey)
SET @IndexKey = @IndexKey + 1
END
set @SQL_String = @SQL_String + ')'
print @SQL_String
EXEC (@SQL_String)
FETCH NEXT FROM Contraint_Cursor
INTO @IndexId, @Index_Name
end
CLOSE Contraint_Cursor
DEALLOCATE Contraint_Cursor
FETCH NEXT FROM Table_Cursor
INTO @Table_Name
end
CLOSE Table_Cursor
DEALLOCATE Table_Cursor
print ''
print 'Finished, Please check output for errors.'
====================
Any comments are very welcome.
View 1 Replies
View Related
Nov 13, 2000
Good morning one and all,
I need to transfer a database (containing one table) with over 35 million records from one server to another. I have two options at present:
(a) Use DTS to do the transfer
(b) Copy the MDF file across and sp_attach_db it
Does anyone have a better idea, or does anyone know which of the two methods will be quicker?
TIA
Gurmi
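For what it's worth, a third option is a plain backup and restore, which avoids taking the source database offline the way detach/attach does. A rough sketch; the paths and logical file names are placeholders (RESTORE FILELISTONLY shows the real logical names):

-- on the source server
BACKUP DATABASE BigDb TO DISK = 'C:\Backups\BigDb.bak' WITH INIT

-- copy the .bak file to the destination server, then:
RESTORE DATABASE BigDb FROM DISK = 'C:\Backups\BigDb.bak'
WITH MOVE 'BigDb_Data' TO 'D:\Data\BigDb.mdf',
     MOVE 'BigDb_Log'  TO 'D:\Logs\BigDb.ldf'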
View 1 Replies
View Related
Apr 28, 2008
I am using SQL Server to create a rather complicated client database for a nonprofit organization. I have access to an ancient version of the database in Access format, but would rather create a new database from scratch instead of "up-sizing" the old one. Although the old database is mostly useless, it contains a goldmine of names and addresses that I could use to populate the new database I'm creating. My question is this: is there any relatively easy way to cut and paste from external data sources into a new SQL database? For example, I would love to just select twenty rows of "first names" from the old database and then paste them into my new table. Can anyone suggest any quick and easy tricks for populating a new database with place-holder content? Thanks!
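One quick trick, sketched here on the assumption that the old file is a Jet (.mdb) database, that the 32-bit Jet OLE DB provider is installed, and that ad hoc distributed queries are allowed: pull the rows straight out of the .mdb with OPENROWSET. The path, table and column names below are placeholders:

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'Ad Hoc Distributed Queries', 1
RECONFIGURE

-- copy names and addresses from the old Access file into the new table
INSERT INTO dbo.Clients (FirstName, LastName, Address1)
SELECT FirstName, LastName, Address1
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'C:\OldData\clients.mdb'; 'Admin'; '',
                tblClients)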
View 2 Replies
View Related
Apr 24, 2007
Hello everyone,
I am currently facing a problem where I could use some input in tracking down the source.
System: SQL Server 9.0
I fill Database B from Database A with triggers, and everything works fine. On Database B there is a stored procedure that checks the records and adds additional information accordingly; this stored procedure is normally called by the application on update and insert on the corresponding table.
When I try to call this stored procedure from Database A, the trigger does not work any more. Even if I wrap the whole trigger in a TRY/CATCH, it never reaches the CATCH or the insert I try to do there to capture the error message.
On both databases the user that executes the trigger exists and is db_owner of both databases. If I execute the stored procedure manually after an insert or update on Database B, everything works fine.
I also tried checking on Database B whether there was an insert or update from Database A and, if so, executing the stored procedure there, with the same result: nothing happens any more, neither the update on Database A nor on Database B. And I can't catch the error, as the TRY/CATCH is not working.
I hope I have explained it understandably, and maybe someone remembers having had the same problem.
Thanks & best regards
Pascal
View 2 Replies
View Related
Aug 19, 2002
The tables in my database are somehow getting set with a fill factor of 90. In the server properties, under Database Settings, the "Fixed" option is unchecked. Last Friday I reset each table to a fill factor of zero, but when I came in today the tables had reset themselves to a fill factor of 90. Any ideas why this is happening and how I can stop it? Your help is greatly appreciated.
Mark
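Two things are worth checking, since both quietly apply their own fill factor: the server-wide default (the sp_configure 'fill factor' option) and any scheduled maintenance plan whose "Reorganize data and index pages" step specifies free space per page. A quick way to look at the server default (a sketch, nothing more):

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'fill factor'
-- a run_value of 90 here, or a maintenance plan set to 10% free space, would explain the behaviour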
View 1 Replies
View Related
Feb 19, 2008
OK, so I can connect to the database without any errors; however, I'm not sure about the syntax for filling a table. Here's what I have so far in the Page_Load. Like I said, this all works without any errors. Thanks in advance for the help.
testDS = New DataSet()
testDataTable = New DataTable("Tbl")
testDataTable.Columns.Add("username")
testDataTable.Columns.Add("datecompleted")
testDataTable.Columns.Add("lastfive")
testDS.Tables.Add(testDataTable)
Me.dgrdSearch0.DataSource = testDS.Tables("Tbl")
Dim Conn As Object = Server.CreateObject("ADODB.Connection")
Dim strConn = "DRIVER={SQL Server};SERVER=serverName;UID=userID;PWD=password;DATABASE=net"
Dim DSNtest As String = strConn
Dim sql As String = "SELECT * FROM Tbl"
Conn.open(DSNtest)
Conn.close()
View 1 Replies
View Related
Apr 30, 2007
Hi all,
In my VB 2005 Express, I created a Windows Forms application "shcDataSet" that used 1 SqlConnection, 1 SqlDataSet and 3 SqlDataAdapters associated with the Northwind database in my SQL Server Management Studio Express. The SqlConnection had "User Instance" in the following ConnectionString: Data Source=.\SQLEXPRESS;AttachDbFilename="C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\northwnd.mdf";Integrated Security=True;Connect Timeout=30;User Instance=True. The 3 SqlDataAdapters were for the Northwind tables "Customers", "Orders" and "Order Details" used in the SqlDataSet "AllOrders" with an AllOrders.xsd file.
I ran the "shcDataSet" application and it worked fine. After the execution of "shcDataSet", I checked the Northwind database in my SQL Server Management Studio Express and clicked on "+" in front of the Northwind database. I did not see any files show up, and I got the following error message: Failed to retrieve data for this request. (Microsoft.SqlServer.Express.SmoEnum) Additional information: One or more files do not match the primary file of the database. If you are attempting to attach a database, retry the operation with the correct files. If this is an existing database, the file may be corrupted and should be restored from a backup. (Microsoft SQL Server, Error: 5173). If I executed the "shcDataSet" application again, I got a new error: SqlException was unhandled - Cannot open user default database. Login failed. Login failed for user 'myPC##myName' -------> daCustomers.Fill(AllOrders11, "Customers").
I have 3 questions to ask: (1) How can I re-install the Northwind database in SQL Server Management Studio Express? (2) When I use the .Fill method, do I have to tell SQL Server Management Studio Express to load and/or unload the Northwind tables "Customers", "Orders" and "Order Details"? (3) After I execute "shcDataSet", should I upload the 3 Northwind tables back to SQL Server Management Studio Express? Please help and advise.
Thanks, Scott Chang
View 7 Replies
View Related
Mar 20, 2007
We have a server with some pesky RAID5, which has on 3 separate occasions corrupted the databases when a drive failed. So we had a maintenance window and decided to change it to RAID10.
We started the reconfiguration at 13:00 today; it's now 18:00 and it has done 25% ... is that normal?
It's 8 disks (so 4 mirrored pairs), and it will have around 300 GB of usable space when it's done.
What would happen if we needed to do this in a time-critical window? (For this debacle we have moved the database onto the web server, so we can survive for a few more hours ...)
Kristen
View 4 Replies
View Related
Jun 15, 2007
Hi, All:
I know Oracle SQL; now I need to do a lot of SQL queries on Microsoft SQL Server. Can anyone point out a place where I can find the syntax of SQL Server SQL statements? Since this is just a short-term assignment, I don't want to buy a book; I'm hoping I can learn what I need quickly online. I don't need anything deep, just some simple syntax so I can do joins, count, concatenation, min(), max(), sum(), etc.
Thanks in advance.
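As a quick crib (a sketch of common translations, not an exhaustive list; the table and column names are made up), joins and aggregates work just as in Oracle, and the main differences you will hit are concatenation, NULL handling and row limiting:

-- joins and aggregates: same ANSI syntax as Oracle
SELECT c.CustomerId,
       COUNT(*)         AS OrderCount,
       MIN(o.OrderDate) AS FirstOrder,
       MAX(o.OrderDate) AS LastOrder,
       SUM(o.Amount)    AS Total
FROM dbo.Customers AS c
JOIN dbo.Orders    AS o ON o.CustomerId = c.CustomerId
GROUP BY c.CustomerId

-- concatenation uses + instead of ||, ISNULL instead of NVL, GETDATE() instead of SYSDATE
SELECT FirstName + ' ' + ISNULL(LastName, '') AS FullName, GETDATE() AS Today
FROM dbo.Customers

-- TOP instead of ROWNUM
SELECT TOP 10 * FROM dbo.Orders ORDER BY OrderDate DESC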
View 5 Replies
View Related
Oct 22, 2007
I am trying to search for stored files (for example, from date 15/12/2003 to 24/6/2006), and when I press Search no results appear. The following is the database code:
public DataTable searchData(string fileNo, string Title, string dFrom, string dTo, string brief)
{
    string str = "";

    str = "select * from Tb_File where Active = 1 ";

    if (fileNo != "")
        str += " and FileNo='" + fileNo + "'";
    if (Title != "")
        str += " and Title like '%" + Title + "%' ";
    if (brief != "")
        str += " and Brief like '%" + brief + "%' ";
    if (dFrom != "")
        str += " and DFrom >= convert(datetime,'" + Convert.ToDateTime(dFrom).ToShortDateString() + "',103) ";
    if (dTo != "")
        str += " and DTo < convert(datetime,'" + Convert.ToDateTime(dTo).ToShortDateString() + "',103) ";

    ole.Open();
    SqlDataAdapter DA = new SqlDataAdapter(str, ole);
    DataTable DT = new DataTable();
    DA.Fill(DT);
    ole.Close();
    return DT;
}
I am using SQL 2000 with Visual Studio 2005.
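One likely culprit (a guess, but easy to verify): ToShortDateString() formats the date using the machine's regional settings, while CONVERT style 103 expects dd/mm/yyyy, so a US-formatted string either errors out or silently swaps day and month, and the WHERE clause then matches nothing. Passing the dates as SqlParameter values of type DateTime avoids the string round trip entirely. The T-SQL below just demonstrates the mismatch:

-- style 103 expects day/month/year
SELECT CONVERT(datetime, '15/12/2003', 103)   -- 15 Dec 2003, as intended
SELECT CONVERT(datetime, '06/05/2006', 103)   -- 6 May 2006, even if the string meant June 5th
-- a US-formatted string such as '12/15/2003' raises an out-of-range error with style 103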
View 14 Replies
View Related
Mar 26, 2007
Good day,
I have a table of approximately 10 million rows. The table has 3 fields making up the key, namely:
ID, Date, Program
I need to extract all the distinct Programs from the table.
I have done so with:
SELECT DISTINCT Program FROM table
This unfortunately takes roughly 2 minutes, which is far too long. Is there something I can do to help speed this process up?
Thanks in advance.
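A nonclustered index on just the Program column usually helps a query like this a great deal, because the DISTINCT can then scan a narrow index instead of the whole 10-million-row table. A minimal sketch (the table name is a placeholder):

CREATE NONCLUSTERED INDEX IX_MyTable_Program ON dbo.MyTable (Program)

SELECT DISTINCT Program FROM dbo.MyTable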
View 14 Replies
View Related
Mar 6, 2008
I am very new to SSIS. Can someone give me a basic outline for this problem? I kind of understand control flow tasks, data flow, etc., but not in detail (I've watched a couple of webcasts). I need to see something like the below in action to understand this better.
Basically, I need to process a flat CSV file on a daily basis and load it into a table. As I am loading the records, I need to check (on a key column) whether each record already exists in the table. If so, just update the record; otherwise insert a new one. When I find an existing record, I possibly need to do a checksum on a set of columns before I update, so the update only happens if that set of columns differs between the file and the table. I also need to keep performance in mind, as I am processing these records one at a time and looking each one up. I am thinking this should be fairly easy, but I am getting a little lost in control flow and data flow and what goes where. By the way, I am using Visual Studio 2005 and SQL Server 2005.
I would appreciate your help. Thanks again. I don't mind an example solution file.
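One common shape for this on SQL Server 2005 (which has no MERGE statement) is to let the data flow bulk-load the whole file into a staging table and then do the compare-and-apply as set-based T-SQL in an Execute SQL task, instead of looking records up one at a time. A rough sketch, with all table and column names as placeholders:

-- update rows that already exist but whose tracked columns have changed
UPDATE t
SET    t.Col1 = s.Col1,
       t.Col2 = s.Col2,
       t.Col3 = s.Col3
FROM   dbo.TargetTable  AS t
JOIN   dbo.StagingTable AS s ON s.KeyCol = t.KeyCol
WHERE  CHECKSUM(t.Col1, t.Col2, t.Col3) <> CHECKSUM(s.Col1, s.Col2, s.Col3)
-- note: CHECKSUM can collide; compare the columns directly if exact change detection matters

-- insert rows that are not in the target at all
INSERT INTO dbo.TargetTable (KeyCol, Col1, Col2, Col3)
SELECT s.KeyCol, s.Col1, s.Col2, s.Col3
FROM   dbo.StagingTable AS s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.TargetTable AS t WHERE t.KeyCol = s.KeyCol)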
View 3 Replies
View Related
Jul 10, 2007
I have five small tables that I need to insert to a SQL CE database.
I am using the 2.0 Compact Framework with the 2.0 System.Data.SqlServerCe.
My table definition is dynamic, so I never know its design.
1- If I go row by row using this.ExecuteNonQuery(_global, par); it takes about 26 seconds to insert 5 tables of 330 rows.
2- If I use
StringBuilder sbColumns = new StringBuilder();
foreach (DataColumn dc in table.Columns)
{
if (sbColumns.ToString() != "")
sbColumns.Append(",");
sbColumns.Append(dc.ColumnName);
}
SqlCeDataAdapter da = new SqlCeDataAdapter("SELECT " + sbColumns.ToString() + " FROM " + _tablename, m_con);
SqlCeCommandBuilder cb = new SqlCeCommandBuilder(da);
da.MissingMappingAction = MissingMappingAction.Passthrough;
da.InsertCommand = cb.GetInsertCommand();
da.Update(table);
da.Dispose();
it takes about 46 seconds.
How can I write it faster, or is this the fastest it can go?
Thanks
View 1 Replies
View Related
Apr 8, 2015
I just did index defragmentation for some databases, including msdb. I noticed there are 3 indexes in the msdb database that fragment quickly (I did a rebuild last night at 10 PM and the fragmentation level became zero, but by 9 AM today it was back to 80%). The indexes are backupsetuuid, backupmediafamilyuuid and backupmediasetuuid. I am thinking of setting the fill factor for each of those indexes to 80.
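Setting a fill factor of 80 is done per index with ALTER INDEX ... REBUILD. A sketch, assuming the index and table names mentioned above (verify them in msdb's sys.indexes first, as they vary slightly between versions):

USE msdb
ALTER INDEX backupsetuuid         ON dbo.backupset         REBUILD WITH (FILLFACTOR = 80)
ALTER INDEX backupmediafamilyuuid ON dbo.backupmediafamily REBUILD WITH (FILLFACTOR = 80)
ALTER INDEX backupmediasetuuid    ON dbo.backupmediaset    REBUILD WITH (FILLFACTOR = 80)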
View 6 Replies
View Related
May 1, 2015
This application runs on a SQL Server 2008 R2 database. It receives messages from an integration module. It has a core table: Table-A. Each message is inserted as 1 row into Table-A. Then, when it is processed, that row in Table-A is updated.
There are two environments which are both connected to the same integration, so in both environments Table-A has exactly the same records inserted and updated. In both environments Table-A has around 80 million rows, with an extra 150,000 rows being inserted and then updated every day. Table-A has 8 indexes. For some reason unknown to me, the 8 indexes fragment really quickly in one environment but not in the other.
For example, in Environment-1 the index fragmentation ranges from 0 - 19%, and this environment has not been re-indexed for over 2 months. BUT a reindex was performed in Environment-2, and only 2 days later the index fragmentation ranges from 72 - 99.93%!
Our DBA has confirmed the re-index in Environment-2 completed successfully and has shown stats before and after the reindex to show that the 8 indexes for Table-A in Environment-2 went down to 0% fragmentation.
My question is, how can the indexes in Environment-2 fragment so much more quickly than the indexes in Environment-1? Both environments are on exactly the same hardware and have exactly the same inbound messages. The database on Environment-1 is actually a clone from Environment-2. The only known differences between the 2 databases is Environment-1 is STANDARD edition - SQL Server 2008 R2 (SP2) whereas Environment-2 is ENTERPRISE edition - SQL Server 2008 R2 (SP1). Could this difference be due to the Service Pack levels or even because one is STANDARD and the other ENTERPRISE?
This is what I have checked so far:
1) In both Environments all 8 indexes have "Set Fill Factor" unchecked and "Automatically recompute statistics", "Use row locks...", "Use page locks..." checked.
2) The "Index Usage Statistics" report in both Environments shows a similar amount of #UserUpdates and #UserScans
View 9 Replies
View Related
Dec 15, 2006
We have a system with 35 million conversations piled up. We didn't know to explicitly end each conversation once its processing had completed. Oops. Now our production box has 35 million of them sitting in the table, and we have run into the problem where the amount in sys.conversation_endpoints has exceeded memory and they are being spilled into tempdb, which is killing our disk space and thus bringing the box down. We have fixed the code to end the conversations, but we now have to end the existing conversations in a hurry. If we select them one by one out of the table and end each via END CONVERSATION, it is slow. Very slow. It will finish in a few months. :(
Does anyone know how to get rid of these conversations in a hurry? All of the messages have been applied to our system, so killing the conversations will (should) have no effect on the processed data. Something like a TRUNCATE statement?
Thank you so much in advance,
John Hennesey
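Two approaches worth knowing about (both sketched here, neither tested against a 35-million-row backlog): END CONVERSATION ... WITH CLEANUP removes both endpoints without the normal close handshake, and ALTER DATABASE ... SET NEW_BROKER throws every conversation away in one shot, at the cost of changing the broker identifier, so any remote services would need reconfiguring.

-- option 1: clean up conversation by conversation (still a loop, but no message exchange)
DECLARE @handle uniqueidentifier
SELECT TOP 1 @handle = conversation_handle FROM sys.conversation_endpoints
WHILE @handle IS NOT NULL
BEGIN
    END CONVERSATION @handle WITH CLEANUP
    SET @handle = NULL
    SELECT TOP 1 @handle = conversation_handle FROM sys.conversation_endpoints
END

-- option 2: drop everything at once (new broker GUID; only if no in-flight work matters)
-- ALTER DATABASE MyDb SET NEW_BROKER WITH ROLLBACK IMMEDIATE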
View 5 Replies
View Related
Mar 9, 2015
I have a table with hundreds of millions of records and need to update an xml column to null. I do something like this:
UPDATE [Mydb].[MySchema].[MyTable]
SET [XMLColumn] = null
Currently it is taking around 6 hours.
Is there a quicker way to do this?
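Batching the update usually helps: a single giant UPDATE has to log the whole change as one transaction, whereas small batches keep the log manageable and can be stopped and restarted. A sketch (the batch size is arbitrary); and if the column is being cleared for every row, it may also be worth testing whether dropping and re-adding the column (a metadata-only change) is acceptable instead:

DECLARE @rows int
SET @rows = 1

WHILE @rows > 0
BEGIN
    UPDATE TOP (50000) [Mydb].[MySchema].[MyTable]
    SET [XMLColumn] = NULL
    WHERE [XMLColumn] IS NOT NULL

    SET @rows = @@ROWCOUNT
END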
View 1 Replies
View Related
May 2, 2007
Hi! I have a textbox and a Search button. When the user inputs a value and presses the button, I want a datagrid to be filled.
The following code runs:
Dim connect As New Data.SqlClient.SqlConnection( _
"Server=SrvnameSQLEXPRESS;Integrated Security=True; UID= ;password= ; database=dtbsname")
connect.Open()
Dim cmd As New SqlCommand
Dim valor As New SqlParameter("@valor", SqlDbType.VarChar, 50)
cmd.CommandText = "Ver_Contactos_Reducido"
cmd.CommandType = CommandType.StoredProcedure
cmd.Connection = connect
cmd.Parameters.Add(valor)
cmd.Parameters("@valor").Value = texto
Dim adapter As New SqlDataAdapter(cmd)
Dim ds As New System.Data.DataSet()
adapter.Fill(ds)
GridViewContactos.DataSource = ds
GridViewContactos.DataBind()
connect.Close()
I dragged a datagrid onto the page and only changed its Id.
In the SQL database the stored procedure is the following:
ALTER PROCEDURE [dbo].[Ver_Contactos_Reducido]
(@Valor VARCHAR(100))
AS
BEGIN
SET NOCOUNT ON;
SELECT NombreRazonSocial, Nombre, Apellido,TelefonoLaboral,
Interno, TelefonoCelular, Email1, Organizacion
FROM CONTACTOS
WHERE NombreRazonSocial = @Valor OR Nombre = @Valor OR Apellido = @Valor OR TelefonoLaboral = @Valor OR Interno = @Valor OR
TelefonoCelular =@Valor OR Email1 = @Valor OR Organizacion= @Valor
END
When I run the page I can't see the datagrid, and after I enter a text and press the button, nothing happens. What am I doing wrong??
Thks!!
View 2 Replies
View Related
Mar 28, 2015
Our system runs on a SQL Server 2012 database. It has a table (table_a) which has over 10 million records. Our system has to receive a data file from a previous system daily, which contains approximately 3 million updated or new records for table_a. My job is to update table_a with the new data.
The initial solution is:
1 Create a table (table_b) whose structure is the same as table_a
2 Use BCP to import updated records into table_b
3 Remove outdated data in table_a:
delete a from table_a a inner join table_b b on a.key_fields = b.key_fields
4 Append updated or new data into table_a:
insert into table_a select * from table_b
As the test results show, this solution is very inefficient; step 3 alone costs several hours. How can I improve it?
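Two things usually fix step 3: index the staging table's join key (BCP loads table_b as an unindexed heap, so the join has nothing to seek on), and run the delete in batches so the log does not balloon. A sketch, with key_fields standing in for the real key column(s):

-- index the staging table's key after the BCP load
CREATE CLUSTERED INDEX IX_table_b_key ON dbo.table_b (key_fields)

-- delete the outdated rows in batches
DECLARE @rows int = 1
WHILE @rows > 0
BEGIN
    DELETE TOP (100000) a
    FROM dbo.table_a AS a
    JOIN dbo.table_b AS b ON b.key_fields = a.key_fields

    SET @rows = @@ROWCOUNT
END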
View 9 Replies
View Related
May 7, 2008
hi,
How can I fill a dropdownlist through code, not through the designer? And I need to know the fastest and easiest way to do it for a web application using the query below.
cmd.Connection = conn
conn.Open()
Dim ds As New DataSet
cmd.CommandText = "SELECT Emp_Name,Emp_ID FROM Employee "
da.Fill(ds, "data")
cmd.ExecuteNonQuery()
conn.Close()
View 4 Replies
View Related
May 11, 2006
Can a SqlDataSource be used to fill a DataSet? If so, will you show me how it's done?
View 2 Replies
View Related
Apr 3, 2001
Currently we have tables (in SQL 6.5), many of which do not have primary keys.
While I was trying to re-index (re-org), many of them got an error:
"fillfactor 204 is not a valid percentage; fillfactor must be between 1 and 100."
(Many tables' fill factors exceed 100.)
How can I fix them so I can upgrade to SQL 7?
Thank you for your help.
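One hedged suggestion (6.5 tooling is limited, so test on a copy first): rebuilding each affected index with an explicit, valid fill factor rewrites the bogus value stored in sysindexes, and DBCC DBREINDEX can do that per table without dropping the indexes, for example:

-- rebuild all indexes on one table with an explicit 80% fill factor
DBCC DBREINDEX ('dbo.MyTable', '', 80)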
View 3 Replies
View Related
Aug 13, 2001
I am really confused about this whole fill factor thing. The way I understand it, if you have a table whose data remains pretty much static, you should use a higher fill factor. Suppose you had a database with at most 150 transactions a day that change the data; should the fill factor be left at the default (0) or increased? How do you determine how much to increase it? Is there a rule of thumb that says if you have x changes against a table, you should have a fill factor between y and z percent?
Please Help
Chris
View 2 Replies
View Related
Mar 12, 2001
Hi all,
While creating indexes for a table, I specified a fill factor of 70%. I then inserted a few hundred rows into the table. Is it possible to check what percentage of each page is full after the rows have been inserted?
Thanks in advance,
Praveena
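DBCC SHOWCONTIG reports this directly; its "Avg. Page Density (full)" figure shows how full the leaf pages actually are, which can be compared against the 70% you asked for. A sketch (SQL Server 2000 accepts the table name; on 7.0 the command takes the object ID instead):

DBCC SHOWCONTIG ('MyTable')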
View 1 Replies
View Related
Jul 16, 1999
You have a DB with 50,000 records and you want to add 100,000 more. What should the right fill factor be? Is there a way to "calculate" a fill factor if you don't want to use the default? Any help is appreciated. Thank you.
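A rough calculation, assuming the new rows will be spread fairly evenly across the existing key range: you want the pages to end up more or less full once all 150,000 rows are in place, so they should start out about 50,000 / 150,000, roughly 33% full, which suggests a fill factor somewhere around 30-35. If instead the new rows are appended at the end of an ever-increasing key, the existing pages never split and the default fill factor is fine.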
View 3 Replies
View Related
Jan 18, 2005
If a fill factor of 100 is specified for a table, what will the impact be? I want to know whether updates and inserts will still be possible.
View 3 Replies
View Related
Aug 27, 2007
Hellow, everyone"
I have a web online table that is inserted about 1500 record one day. Each night, a DST is running to pull all data to anther database. How to set fill factor on a one column index to get the best performance? Current fill factor is 80%.
Thanks
ZYT
View 7 Replies
View Related
Jul 13, 2007
Hi experts, I would like to ask about FILL FACTOR. I have observed that our system's loading is a bit slow, and some of the modules take 1 to 2 minutes to load. Maintenance activity is regularly executed based on the scheduled sets. I then checked the tables' indexes/keys, and it turns out that the FILL FACTOR is set to ZERO (0). I would like to know whether a FILL FACTOR of zero could be a factor in the system slowing down.
Darren Bernabe Blanco
View 2 Replies
View Related
Oct 4, 2007
How do you fill out an order form?
There is an Order (OrderID, CustomerID, Subtotal, Tax, Total) and an OrderDetail (OrderID, ProductID, Qty, UnitPrice, ExtendedPrice).
How do I get those two together in the same form? It can be called order or invoice, it doesn't matter, as long as I can get them in the same form, numbered like order 1, then order 2, etc. Is it done with a stored procedure, or with ADO.NET, or both? And is there anywhere I can find information, like a book or a website?
For example;
Order no. 1, Customer 3,
Product 3, Qty 2, UnitPrice $20.00, ExtendedPrice $40.00
Product 2, Qty 3, UnitPrice $10.00, ExtendedPrice $30.00
Product 4, Qty 2, UnitPrice $5.00, ExtendedPrice $10.00 ---here I can be adding as many as needed
Subtotal $80.00
Tax $5.00
Total $85.00
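On the database side this is usually just one stored procedure that returns the header and the matching detail rows as two result sets, which ADO.NET can load into one DataSet (with a DataRelation between the two tables) and bind to a master-detail form. A sketch of the T-SQL half, using the column names from the post; the procedure name is made up:

CREATE PROCEDURE dbo.GetOrder
    @OrderID int
AS
BEGIN
    SET NOCOUNT ON

    -- first result set: the order header
    SELECT OrderID, CustomerID, Subtotal, Tax, Total
    FROM dbo.[Order]
    WHERE OrderID = @OrderID

    -- second result set: the lines for that order
    SELECT OrderID, ProductID, Qty, UnitPrice, ExtendedPrice
    FROM dbo.OrderDetail
    WHERE OrderID = @OrderID
END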
View 1 Replies
View Related
Oct 28, 2007
I have one complicated problem with SQL data files. Say that we have four data files:
F1=2000MB
F2=1000MB
F3=3000MB
F4=1500MB
We want to add 3000 MB across these files, and at the same time we want to make the files proportional to one another.
So we will do:
F1 + 500=2500MB
F2 + 1500=2500MB
F4 + 1000=2500MB
With my logic, the final result will be:
F1=2500 MB
F2=2500 MB
F3=3000MB
F4=2500MB
Is there an algorithm or a formula to calculate the right value to add to each of the files F1, F2, ..., Fn, having sizes S1, S2, ..., Sn, for a total value T to add? Thank you in advance.
P.S. The problem gets more complicated when we deal with more than one filegroup and with disk space availability.
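There is a standard way to work this out (sometimes called water filling): pick a target level L such that the total growth equals T, i.e. the sum over the files smaller than L of (L - Si) equals T. In practice, sort the files by size, keep adding the smallest ones to a candidate set A, and compute L = (T + sum of the sizes in A) / |A|; stop as soon as L is no larger than the next file outside A. With the numbers above, A = {F2, F4, F1} gives L = (3000 + 1000 + 1500 + 2000) / 3 = 2500 MB, which matches the expected result, and F3 stays at 3000 MB because it is already above the level. The same calculation can be run per filegroup, with each file's growth capped by the free disk space on its drive.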
View 2 Replies
View Related
Jul 23, 2005
All,
SQL Server uses a proportional fill method to add data to filegroups where a filegroup has multiple files. In general this is a good thing. Unfortunately, the method appears to get switched off when a new file (or files) is added. Other than exporting the data out of the filegroup, recreating the filegroup and putting the data back in, is there a way to respread the existing data and switch the proportional fill method back on?
Thanks,
Danny
View 2 Replies
View Related
Jul 14, 2006
How do I find out what fill factor has been specified for a given table?
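On SQL Server 2005 and later the value is stored per index and is easy to query (0 means the server default, which effectively behaves like 100). A sketch, with the table name as a placeholder:

SELECT OBJECT_NAME(object_id) AS table_name,
       name                   AS index_name,
       fill_factor
FROM sys.indexes
WHERE object_id = OBJECT_ID('dbo.MyTable')
-- on SQL Server 2000: SELECT INDEXPROPERTY(OBJECT_ID('dbo.MyTable'), 'IndexName', 'IndexFillFactor')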
View 1 Replies
View Related