Hard Drive Defragmentation Opinions
May 9, 2008
I'm wondering what other people do with regard to running hard drive defragmentation programs on SQL Server 2005 servers (assume 64-bit and Windows 2003). From what I can tell, the most common opinions are:
1. Don't defragment because it doesn't help and it can cause problems.
2. Use Diskeeper
3. Use the built-in Windows defragmenter
Other respected defragmentation programs are PerfectDisk, O&O Defrag, JkDefrag, and Contig.
What is your hard drive defragmentation strategy?
View 1 Replies
Nov 14, 2000
Hi, I have an NT server which has a 500 MB C: drive and a 44 GB D: drive.
I know that the person who set up this server did not give enough space to the C: drive; here is the problem. I am running SQL Server 7.0, which has 30 GB of data on the D: drive. I need to reconfigure the NT hard drives so I can allocate 2 GB to C: and 42 GB to D:.
What is the best, safest method to accomplish this task?
Ahmed
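Whatever disk tool does the repartitioning, a hedged safety net on the SQL side is a backup/restore cycle. The sketch below assumes a database named Prod with logical file names Prod_Data and Prod_Log (all names hypothetical), and that the backup goes somewhere that survives the repartition, such as tape or a network share:
-- back up before touching the partitions (path hypothetical)
BACKUP DATABASE Prod TO DISK = 'D:\Backup\Prod.bak'
-- after repartitioning, restore and point the files at the new layout
RESTORE DATABASE Prod FROM DISK = 'D:\Backup\Prod.bak'
WITH MOVE 'Prod_Data' TO 'D:\MSSQL7\Data\Prod.mdf',
     MOVE 'Prod_Log' TO 'D:\MSSQL7\Data\Prod_log.ldf'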
View 3 Replies
Sep 27, 1999
After experiencing a hard drive failure I have reinstalled MSSQL 7 on one drive, and I have a database, which I need to recover, on a separate physical drive. How can I go about doing this?
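If the mdf/ldf files on the separate physical drive survived intact, one hedged option is simply attaching them to the fresh install; sp_attach_db ships with SQL Server 7.0 (database name and paths below are hypothetical):
EXEC sp_attach_db @dbname = 'MyDb',
    @filename1 = 'E:\Data\MyDb.mdf',
    @filename2 = 'E:\Data\MyDb_log.ldf'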
View 1 Replies
Jul 17, 2003
Hi,
I ran:
1. xp_fixeddrives, and got the result:
drive MB free
----- -----------
C 1708
D 16311
2. I ran the Backup Wizard in EM and was able to see only the above drives.
3. But if I run a backup in EM, I am able to see more than 10 drives (like C, D, H, I, J, M, N, etc.).
Why do I see this difference?
How do I find out exactly how many drives there are on this server, without going directly to that server?
I appreciate your valuable answer.
Thanks,
Ravi
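For what it's worth, xp_fixeddrives reports only local fixed disks that have free space, which may explain the difference: the backup dialog browses everything the server's file system exposes. A hedged way to compare the two views remotely, assuming xp_cmdshell is available to you:
EXEC master.dbo.xp_fixeddrives            -- local fixed disks only
EXEC master.dbo.xp_cmdshell 'net use'     -- any mapped network drives on the server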
View 8 Replies
Apr 7, 2004
Hello, everyone:
Does anyone have an idea for hard drive configuration on SQL Server 2K? How do you determine the RAID level for the data file and log file? Thanks
ZYT
View 2 Replies
May 18, 2004
I was wondering if anyone has played around with changing the allocation unit size when formatting the hard drive the SQL server is running on. I would think that setting it higher, to account for the larger size of the database files, would help, but I'm not sure.
Anyone have any thoughts?
View 3 Replies
Feb 2, 2007
Hi guys,
I have a general question. Would SQL Server have slower performance if you placed the ldf or mdf files on a dynamic disk setup, or should it always be basic? I noticed that a server had 2 dynamic drives, and the log files and mdf files are located on these drives. Usually I see all the drives as basic, not dynamic. Does this even matter?
thanks
View 1 Replies
Oct 19, 1998
I want to perform backups to a network drive. I need to know if I can access the backup drive via UNC. I have not been able to get it to work and, for now, I would just like to know if what I am trying to do SHOULD work.
For example, I want to back up to the device \\mdtnts_prod02\LM2\BackupName\Back.DAT.
Thanks, Bob
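Backing up to a UNC path should work in principle; the usual catch is that it is the MSSQLServer service account, not the logged-in user, that needs write permission on the share, and a service running as LocalSystem cannot reach network shares at all. A minimal sketch (server and share names hypothetical):
BACKUP DATABASE MyDb
TO DISK = '\\server\share\Back.DAT'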
View 1 Replies
Apr 5, 2007
Hi. Has anyone else had any problems with their SQL Server 2005 going nuts after installing SP2 + the above hotfix?
We reboot the SQL Server and, as soon as SQL receives a command of any sort, the SCSI RAID 5 hard drives light up and start thrashing. As a result, every query, and even Windows response time, is painfully slow or fails to respond. When we look at Activity Monitor and sp_who2 there is a process running a Select Into from Microsoft SQL Server Management Studio, using the sa account, against the last database name in our system. We can't kill the process, as it says "Cannot use KILL to kill your own process. (Microsoft SQL Server, Error: 6104)".
The last Transact-SQL command from this process is:
create table #tmpDBCCinputbuffer ([Event Type] nvarchar(512),
[Parameters] int, [Event Info] nvarchar(512))
insert into #tmpDBCCinputbuffer exec ('DBCC INPUTBUFFER(56)')
select [Event Info] from #tmpDBCCinputbuffer
The only other issue I should mention is that we had to uninstall and reinstall SQL Server yesterday, as SP2 had only half installed. The SP2 install log indicated that SQL SP1 wasn't in a good state, as we had previously (3 months ago) performed an upgrade from Windows 2000 Server to Windows 2003 R2 Server.
Our other 3 Windows 2003 R2 Servers which were a clean install of both Windows and SQL are running fine.
I've been insisting that we do a new clean Windows and SQL install, but the other team members insist that it was fine until we installed SQL Server SP2.
So can anyone provide any help or evidence to say who is right either way?
Thanks.
View 3 Replies
May 12, 2007
In our SQL Server database we will have a table that will be populated with about 2000 records per day; that is, 2000 records per day for 5 days per week. Currently the computer we are using has about 50 gigabytes of available hard drive space on it. We are concerned that maybe we will need a bigger hard drive, based solely on the number of records entered into this table per day. The problem is I don't know how to calculate how much hard drive space we need. I think I read that by using varchar, SQL Server 2005 really optimizes a database. Here is a typical example of data in our database. I put dots on a line between the first and last sample record just to illustrate that there are many records in between.
Basically we only need 8 months of data at a time in the table, and then we can purge records older than 8 months. Can someone help me approximate how much hard drive space I might need for 8 months of data, given the following sample record in the database?
Sample table in my DB, just for illustration:
(PPsquare inch) (Diameter) (Weight gm) (coeffOfSatFriction)
34.5 4.08 10.6 .0012
...
21.7 3.54 6.22 .019
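A back-of-envelope estimate suggests space is not a concern here, assuming roughly 50 bytes per row (four numeric columns plus row overhead; the exact figure depends on the chosen data types):
SELECT 2000 * 5 * 35 AS ApproxRows8Months,          -- 2000/day x 5 days x ~35 weeks = 350,000 rows
       2000 * 5 * 35 * 50 / 1048576.0 AS ApproxMB   -- about 17 MB; indexes might double it
Even with generous padding, 8 months of this table would occupy a tiny fraction of the 50 GB available.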
View 4 Replies
Nov 26, 2007
Hi,
I use SQL Server Management Studio Express. I have created a table on my host's remote database and I want to copy the table (or the data), in some format or other, to my hard drive. Does anyone have any good ideas how I may do this, either through Management Studio Express or by other means?
thanks a lot
nick
View 2 Replies
Feb 20, 2004
Hi,
I am running SQL Server 7 with a 200+ GB database. I have one table with the following fields:
IIINDEX
DOCTYPE
IMAGE BLOB
I need to dump all the information from this table to the hard drive.
I have tried with Delphi ADO and Delphi ODBC (1 MB limit); somehow, when I run the program, it gives me an E_timeout error message.
How can I dump this information without using Delphi?
Any help will be highly appreciated.
If you have any code that can help please email me samirp@ix.netcom.com
Thanks.
Samir
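One hedged Delphi-free approach is driving bcp from T-SQL via xp_cmdshell, exporting one image per file with queryout. Every name below (database, table, column, server, output folder) is hypothetical; note also that bcp's native format adds a length prefix rather than writing raw bytes, and the textcopy.exe utility in the SQL 7 Binn folder is the classic alternative for raw image export:
DECLARE @id int, @cmd varchar(512)
DECLARE c CURSOR FOR SELECT IIINDEX FROM MyDb.dbo.ImageTable
OPEN c
FETCH NEXT FROM c INTO @id
WHILE @@FETCH_STATUS = 0
BEGIN
    SELECT @cmd = 'bcp "SELECT [IMAGE] FROM MyDb.dbo.ImageTable WHERE IIINDEX = '
                + CONVERT(varchar(12), @id) + '" queryout D:\dump\'
                + CONVERT(varchar(12), @id) + '.img -n -S MYSERVER -T'
    EXEC master.dbo.xp_cmdshell @cmd   -- one file per row
    FETCH NEXT FROM c INTO @id
END
CLOSE c
DEALLOCATE c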
View 1 Replies
Jul 23, 2005
Dear group: I have removed my hard drive from my laptop (which is now toast) and have managed to recover nearly all the data from it by installing the drive into my desktop. I was hoping to reboot the desktop to see if I could load the operating system on the laptop's hard drive so I could do a manual backup of the SQL database on it. This does not work. Does anyone know of a way to recover my SQL database and all its tables given the circumstances above? TIA, ISZ
View 5 Replies
May 5, 2006
Installed SQL Server 2000 Enterprise trial a week ago on XP Pro. Installed a new Seagate 80 GB HD and used Seagate's utility to copy the old C: to the new drive as the new boot drive. All seems to work fine except that, when booting up, SQL Server doesn't start. When I try to start it manually I get the following:
Could not start SQLSERVER service on the local computer. Error 3: the system cannot find the specified path.
1. What could be wrong?
2. How do I fix it?
View 9 Replies
Oct 18, 2007
Hi all,
I have one table with a column of type 'image'. There are many types of files saved in that column (i.e. .doc, .xls, .pdf, .jpg, .gif, etc.). What I want is to read those files from the database and save them in a temp folder on the D: drive of the server. Can anyone help me with my problem?
Thanx in advance
View 1 Replies
Jan 31, 2006
Hello.
One of our hard drives has crashed and as a result we have lost our master mdf/ldf and user db mdf/ldf files. It's not that critical a system by any means, but if the hard drive crashes and the master mdf/ldf and user db mdf/ldf files are lost, is there any way of restoring the system?
I'm thinking we probably need to re-install SQL Server completely, and restore the user db from a backup.
Any advice/suggestions would be much appreciated.
Thanks in advance,
View 1 Replies
Apr 25, 2006
Hi,
I've downloaded the SQL Server 2005 Management Studio Express executable to my second hard drive, as there is no room on my main drive.
However, when I try to install, the program shows me both drives, and shows that there is not enough room on my main drive, but it does not allow me to change the installation to my second drive.
I have previously installed the main SQL Express product to my second drive with Visual Studio. There's loads of room.
Can anyone give me a clue as to why this might be happening?
Thanks
John
View 1 Replies
Apr 9, 2007
I need to create a SQL Server database and add some tables to it, then access it with a C# application. The problem is that the new SQL Server database and its tables must reside on an external hard drive. How do I point SQL Server to this external drive, so that I can create a database on the drive and then create tables and access data on it?
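A minimal sketch, assuming the external drive is mounted as E: and the SQL Server service account can write to it (all names hypothetical); once the database is created this way, the C# application connects with an ordinary connection string and never needs to know which drive the files live on:
CREATE DATABASE ExternalDb
ON (NAME = ExternalDb_data, FILENAME = 'E:\SqlData\ExternalDb.mdf')
LOG ON (NAME = ExternalDb_log, FILENAME = 'E:\SqlData\ExternalDb_log.ldf')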
View 7 Replies
Jul 20, 2004
Hi
I have a db with 2 tables and log shipping running every 5 minutes. The tables are 10 GB and 14 GB in size, and each table has 6 indexes.
I want to defragment each of the 2 tables. I tried to use DBCC INDEXDEFRAG, but it created a 3 GB log file and log shipping froze on the restore of that log. I tried to drop and recreate the indexes, but the log grew to 8 GB.
Is there any way I can avoid huge logs and defragment the tables, or rebuild the indexes one by one?
Thanks
Alex
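One hedged approach: because DBCC INDEXDEFRAG works in many small transactions rather than one large one, defragmenting one index at a time and backing the log up between steps lets log shipping move several small logs instead of one huge one (names and paths hypothetical):
DBCC INDEXDEFRAG (MyDb, 'dbo.BigTable', 1)   -- index id 1 = the clustered index
BACKUP LOG MyDb TO DISK = 'D:\LogShip\MyDb_ix1.trn'
DBCC INDEXDEFRAG (MyDb, 'dbo.BigTable', 2)
BACKUP LOG MyDb TO DISK = 'D:\LogShip\MyDb_ix2.trn'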
View 6 Replies
Dec 21, 2001
How is database defragmentation done in SQL Server? Index defragmentation is done using DBCC INDEXDEFRAG, and the results are seen using DBCC SHOWCONTIG.
Thanks in Advance
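A minimal before/after sketch of the two commands mentioned above (table and index names hypothetical):
DBCC SHOWCONTIG ('MyTable') WITH ALL_INDEXES   -- note Scan Density and Logical Scan Fragmentation
DBCC INDEXDEFRAG (MyDb, 'MyTable', 'MyIndex')
DBCC SHOWCONTIG ('MyTable') WITH ALL_INDEXES   -- compare the same figures afterwards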
View 2 Replies
May 28, 2007
Dear Experts, what exactly is defragmentation in SQL Server, and what is it used for?
Vinod
Even you learn 1%, Learn it with 100% confidence.
View 4 Replies
Oct 22, 2004
Hi,
I have a SQL Server with 2 processors, 2 GB memory and RAID 5, and a 40 GB db (Pricing) working with a 3rd-party application.
In db Pricing:
table A = 180,000 rows (20 columns and 5 indexes, including the clustered primary)
table B = 1,789,000 rows (25 columns and 6 indexes, including the clustered primary)
table C = 10,005 rows (15 columns and 4 indexes, including the clustered primary)
Users started complaining about poor performance when selecting data from tables A, B and C.
I used Profiler to capture all queries running against tables A, B and C where the duration was more than 1 second.
Profiler showed 0 CPU and very high reads for all captured queries.
I ran INDEXDEFRAG on tables A, B and C, then reran Profiler.
No change at all in the Profiler readings: the same low CPU and high reads.
If there is no change in performance after INDEXDEFRAG, can I conclude the following:
everything has been done to optimize tables A, B and C, and the query code or table structure should be modified?
Or could some other improvement be made without modifying code?
Thank you
Alex
View 10 Replies
Aug 6, 2015
I have a number of databases in the full recovery model whose log files are many times the size of their data files. This is because the databases were copied from the development servers, where transaction log backups were not being taken, whereas once in production it is ensured that a transaction log backup is taken at least once a day. I plan to shrink the log files using the DBCC commands. However, I am afraid that it may lead to severe defragmentation and performance hits.
We are using Sql Server 2008R2 enterprise edition which is clustered.
In this context my questions are:-
1) What is the best course for shrinking the log without defragmentation?
2) Can I do the shrinking while the database is in use or online in production?
3) Will shrinking the log file improve performance in any manner, such as I/O operations or paging?
4) Can I shrink the log files alone, without shrinking the corresponding data files?
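For what it's worth, shrinking the log cannot fragment table data at all (the log is not a b-tree); the usual worries are file-system fragmentation from repeated grow/shrink cycles and the cost of the log re-growing later, and the operation itself can be done while the database is online. A minimal sketch, assuming the logical log file name is MyDb_log (names and path hypothetical):
BACKUP LOG MyDb TO DISK = 'X:\Backup\MyDb.trn'   -- clear the active log first
DBCC SHRINKFILE (MyDb_log, 2048)                 -- target size in MB; leave working room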
View 10 Replies
Apr 4, 2008
Hi
I have been trying to use OPENROWSET with a shared drive, and even though the share has "full control" permissions granted to "everyone", and the account that SQL runs under has been granted explicit full control permissions, I am unable to open the file, which itself has no security on it.
Can I not use a \\ path, and only use mapped drives?
Thanks
below works...
SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0','Excel 8.0;Database=C:\5People.xls', [Sheet1$])
below doesn't work...
SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0','Excel 8.0;Database=\\cluster02\FileManager\5People.xls', [Sheet1$])
View 3 Replies
Nov 7, 2002
Hi all.
We are currently running SQL 7 on an NT4 server (dual 800 MHz, 1 GB RAM) and it is being pounded mercilessly 24/7!
We are currently in the market to upgrade, and I would like to get your opinions on this setup. Maybe someone has experience with this box, or with other issues in upgrading to a new OS and new version of SQL...
Box:
Compaq Proliant ML530
Processors:
2 Xeon 2.8GHz/512KB with 400Mhz System Bus
Memory:
2GB (4x512)
Drives:
2 18.2GB U3 SCSI in Raid 1 (for Operating System)
3 72.8GB U3 SCSI in Raid 5 (for Backups/TLogs/OS Swap file)
4 36.4GB U3 SCSI in Raid 5 (for SQL data)
Operating System:
Windows 2000 Server
SQL Server:
MS SQL 2000 Standard Edition
Any thoughts/advice are appreciated. :)
View 8 Replies
Apr 21, 2004
Hi, I have probably exhausted the topic of shapes etc., but I am still having a hard time determining the best solution to my problem:
I have several products, each with several specific properties:
Double Tee
-----------------------------------------
Width | Height |Flange | Leg | Count
Column
------------------------
Width | Height
Round Column
-----------------
Radius
Now originally I wanted to create a scalable table structure, so with the help of several people on this site (and SQL Team) I have developed the following :
tbShape
------------------
ShapeID | Shape | XSectionFormula
-------------------------------------------
1 | Rect | Length X Width
tbShapeAttributes
---------------------------------------
fkShapeID | AttributeID | Attribute
----------------------------------------
1 | 1 | Length
1 | 2 | Width
tbProduct
---------------------------------------
ProductID | fkShapeID | Product
--------------------------------------
1 | 1 | Column
tbProductAttributeValues
--------------------------------------------
fkProductID | fkAttributeID | Value
---------------------------------------------
1 | 1 | 10
1 | 2 | 10
From the above table structure I was able to select a product and, by obtaining the formula from the tbShape table, using a cursor to replace the attribute names in the formula with the attribute values from the tbProductAttributeValues table, and then using dynamic SQL, determine the cross section of any selected product (a sketch of this step follows below).
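A minimal sketch of that substitute-and-execute step, using the table and column names above; the translation of ' X ' into ' * ' is an assumption about how the stored formula encodes multiplication:
DECLARE @formula nvarchar(400), @attr varchar(100), @val varchar(40)
SELECT @formula = XSectionFormula FROM tbShape WHERE ShapeID = 1
DECLARE attrCur CURSOR FOR
    SELECT a.Attribute, CONVERT(varchar(40), v.Value)
    FROM tbShapeAttributes a
    JOIN tbProductAttributeValues v ON v.fkAttributeID = a.AttributeID
    WHERE a.fkShapeID = 1 AND v.fkProductID = 1
OPEN attrCur
FETCH NEXT FROM attrCur INTO @attr, @val
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @formula = REPLACE(@formula, @attr, @val)   -- 'Length X Width' -> '10 X 10'
    FETCH NEXT FROM attrCur INTO @attr, @val
END
CLOSE attrCur
DEALLOCATE attrCur
SET @formula = REPLACE(@formula, ' X ', ' * ')
EXEC ('SELECT ' + @formula + ' AS CrossSection')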
The problem now is: what if I need to apply different functions to the data for any given product? This proves to be very difficult, because the attributes for the products are not necessarily consistent.
For example, let's say the above was a slab 10 feet by 1 foot, giving a cross section of 10 square feet. Because it is simple to get the cross-sectional area, I can easily figure out the cubic feet of concrete used by multiplying the cross section by a length. But let's say the user wants to get the cost per square foot. How is the application sure which attribute is the width of the product?
I guess what I am getting at is: why is the structure below not any better than the one above?
tbTemplateCategories
---------------------------------------
CategoryID | Category
tbTemplates
----------------------------------------
TemplateID | fkCategoryID | Template |
-----------------------------------------
tbDoubleTeeTemplates
------------------------------------------
fkTemplateID | Width | Height | Flange | Avg. Leg Width | Leg Count
tbWallTemplates
-----------------------------------------
fkTemplateID | Width | Height
Now there would be a 1-to-1 relationship between tbTemplates and tbDoubleTeeTemplates ON TemplateID - fkTemplateID. To add a new product, simply add the category and the new table, and then alter the stored procs, which would use if() / else if() statements based on the category to go to the appropriate template table.
Also, now I can write any customized functions for any product without worrying about a user misspelling an attribute between the formula and the attributes, etc.
Any opinions, thoughts on this would be appreciated!
Mike B
View 4 Replies
May 13, 2004
I've always used the identity field in SQL Server to maintain the unique id for a table. With the new DB design at work we brought in a DBA, and she made us move away from letting SQL maintain the unique field, having us maintain the unique field in code instead. To do that we had to start a transaction, do a select max(id) + 1, insert into the table, and commit the transaction. Doing it this way, I'm starting to see deadlocks due to the transactions locking the table.
Getting down to what I wanted to know: what are the pros/cons you guys see in maintaining the unique ID this way, and is there a better way of creating a unique id in T-SQL code?
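One hedged middle ground, if the DBA insists on code-maintained ids, is a key table updated with a single atomic statement, which avoids the select-max race entirely (table and column names hypothetical):
BEGIN TRAN
DECLARE @id int
UPDATE KeyTable SET @id = NextId = NextId + 1 WHERE TableName = 'Orders'   -- @id receives the incremented value
INSERT INTO Orders (OrderId, CustomerName) VALUES (@id, 'example')
COMMIT TRAN
The single UPDATE both reads and increments the counter under one lock, so no two sessions can be handed the same id.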
Thanks
View 2 Replies
Nov 9, 2004
Hi all,
I took a search through the archives for related topics (and got Des in trouble along the way :( ) but couldn't find a directly related thread. If I missed one, feel free to tell me where to go (hey... watch that... only if I MISSED one!)
I wrote what is, essentially, a data verification stored proc that goes out to each of the FOUR servers we have, each one running a mirror database. In a nutshell, there is one table that contains a row with a column in it that, if everything has gone well in the daily processing in all 4 databases, will match identically across all 4 DBs.
So, that said, here is the output: Job 'Index - Verify PortfolioIndex Across Servers' : Step 1, 'PortfolioIndex Check across all servers and portfolios' : Began Executing 2004-11-09 15:30:00
------------------- BEGINNING PortfolioIndex VERIFICATION -------------------- [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 2 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 3 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 11 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 67 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 72 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 84 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 90 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 92 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 100 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 105 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 110 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 115 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 120 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 125 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 130 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 135 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 140 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 145 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 150 on 11/09/2004! [SQLSTATE 01000]
WHOO-HOO!!! EVERYTHING MATCHES for porfolio number 155 on 11/09/2004! [SQLSTATE 01000]
UH-OH - TROUBLE!!! CloseIndex mismatch for porfolio number 160 on 11/09/2004! [SQLSTATE 01000]
--> Server TA1 shows an index of 110.582 [SQLSTATE 01000]
--> Server TRADEANALYSIS shows an index of 110.582 [SQLSTATE 01000]
--> Server RECEIVE1 shows an index of NULL [SQLSTATE 01000]
--> Server RECEIVE2 shows an index of NULL [SQLSTATE 01000]
UH-OH - TROUBLE!!! CloseIndex mismatch for porfolio number 1000 on 11/09/2004! [SQLSTATE 01000]
--> Server TA1 shows an index of 189.623 [SQLSTATE 01000]
--> Server TRADEANALYSIS shows an index of 189.623 [SQLSTATE 01000]
--> Server RECEIVE1 shows an index of NULL [SQLSTATE 01000]
--> Server RECEIVE2 shows an index of NULL [SQLSTATE 01000]
UH-OH - TROUBLE!!! CloseIndex mismatch for porfolio number 1001 on 11/09/2004! [SQLSTATE 01000]
--> Server TA1 shows an index of 164.058 [SQLSTATE 01000]
--> Server TRADEANALYSIS shows an index of 164.058 [SQLSTATE 01000]
--> Server RECEIVE1 shows an index of NULL [SQLSTATE 01000]
--> Server RECEIVE2 shows an index of NULL [SQLSTATE 01000]
UH-OH - TROUBLE!!! CloseIndex mismatch for porfolio number 1002 on 11/09/2004! [SQLSTATE 01000]
--> Server TA1 shows an index of 255.978 [SQLSTATE 01000]
--> Server TRADEANALYSIS shows an index of 255.978 [SQLSTATE 01000]
--> Server RECEIVE1 shows an index of NULL [SQLSTATE 01000]
--> Server RECEIVE2 shows an index of NULL [SQLSTATE 01000]
UH-OH - TROUBLE!!! CloseIndex mismatch for porfolio number 1003 on 11/09/2004! [SQLSTATE 01000]
--> Server TA1 shows an index of 159.009 [SQLSTATE 01000]
--> Server TRADEANALYSIS shows an index of 159.009 [SQLSTATE 01000]
--> Server RECEIVE1 shows an index of NULL [SQLSTATE 01000]
--> Server RECEIVE2 shows an index of NULL [SQLSTATE 01000]
UH-OH - TROUBLE!!! CloseIndex mismatch for porfolio number 1004 on 11/09/2004! [SQLSTATE 01000]
--> Server TA1 shows an index of 318.981 [SQLSTATE 01000]
--> Server TRADEANALYSIS shows an index of 318.981 [SQLSTATE 01000]
--> Server RECEIVE1 shows an index of NULL [SQLSTATE 01000]
--> Server RECEIVE2 shows an index of NULL [SQLSTATE 01000]
UH-OH - TROUBLE!!! CloseIndex mismatch for porfolio number 1005 on 11/09/2004! [SQLSTATE 01000]
--> Server TA1 shows an index of 145.921 [SQLSTATE 01000]
--> Server TRADEANALYSIS shows an index of 145.921 [SQLSTATE 01000]
--> Server RECEIVE1 shows an index of NULL [SQLSTATE 01000]
--> Server RECEIVE2 shows an index of NULL [SQLSTATE 01000]
UH-OH - TROUBLE!!! CloseIndex mismatch for porfolio number 1006 on 11/09/2004! [SQLSTATE 01000]
--> Server TA1 shows an index of 141.035 [SQLSTATE 01000]
--> Server TRADEANALYSIS shows an index of 141.035 [SQLSTATE 01000]
--> Server RECEIVE1 shows an index of NULL [SQLSTATE 01000]
--> Server RECEIVE2 shows an index of NULL [SQLSTATE 01000]
UH-OH - TROUBLE!!! CloseIndex mismatch for porfolio number 1007 on 11/09/2004! [SQLSTATE 01000]
--> Server TA1 shows an index of NULL [SQLSTATE 01000]
--> Server TRADEANALYSIS shows an index of NULL [SQLSTATE 01000]
--> Server RECEIVE1 shows an index of NULL [SQLSTATE 01000]
--> Server RECEIVE2 shows an index of NULL [SQLSTATE 01000]
UH-OH - TROUBLE!!! CloseIndex mismatch for porfolio number 1008 on 11/09/2004! [SQLSTATE 01000]
--> Server TA1 shows an index of 123.179 [SQLSTATE 01000]
--> Server TRADEANALYSIS shows an index of 123.179 [SQLSTATE 01000]
--> Server RECEIVE1 shows an index of NULL [SQLSTATE 01000]
--> Server RECEIVE2 shows an index of NULL [SQLSTATE 01000]
------------------- COMPLETE -------------------- [SQLSTATE 01000]
This was cut-n-pasted here from a log file created by the actual SQL SERVER 2000 job created to run the afore-mentioned stored procedure.
After all that... my quandary is this:
What is the best way to send this info out in an email to interested parties? Currently I have the job send out an email notification on completion, but that still requires my lazy buttocks to go look at the log file for the job (or, more accurately, on the server in the logfile directory).
I want to get the actual DATA as shown above into the email.
As I see it, my options are:
(1) write the data out to a flat file during the run (or, as is done now, into a log file by the SQL Server scheduled job) and then attach that FILE to the email - this still requires my lazy buttocks to OPEN the attachment that comes with the email.
or (2) write the message out a line at a time to a table with an IDENTITY column (used to order them on the select) and a VARCHAR(128) column that each line in the log would be written to. This option allows me to just do a SELECT in the call to xp_sendmail to get the data into the actual email... but I just really hate the idea of creating a permanent table for this cheesy solution.
I tried it with a temp table within my stored proc but, of course, when I made the call to xp_sendmail, it couldn't see my temp table to select from (mind you, it's not that I mind USING a cheesy table, just that I don't want it to have a lifespan longer than the time I need to use it and toss it aside).
I know the common denominator here is "My Lazy Buttocks", but I really can't overstate the laziness of my buttocks, so this is a valid concern ;)
Any thoughts? How do people get status messages like this into an email without using an attachment or a cheesy middleman table?
Sorry, as always, about the miniseries...just trying to set the mood before popping the question ;)
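One known workaround in the SQL 2000 toolbox: a global temp table (##) is visible to other sessions, including the one xp_sendmail uses for its @query, unlike a #local temp table, and it still dies by itself once every session using it is done. A minimal sketch (names hypothetical):
CREATE TABLE ##VerifyLog (LineId int IDENTITY(1,1), Msg varchar(128))
-- ... have the verification proc INSERT its WHOO-HOO / UH-OH lines here ...
EXEC master.dbo.xp_sendmail
    @recipients = 'team@example.com',
    @subject = 'PortfolioIndex check',
    @query = 'SELECT Msg FROM ##VerifyLog ORDER BY LineId'
DROP TABLE ##VerifyLog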
View 1 Replies
Jul 30, 2007
I've a core component that is a Win32 DLL; this DLL implements some basic math calculations and conversions between several video systems: PAL, NTSC and so on. In addition, this DLL has a memory-mapped file that stores a system value that is the current time of day (house clock); it is written by a service app and read by external apps that need this frame-accurate value.
I have full control of the DLL, and now I have to decide whether to use this DLL from SQL Server with P/Invoke, or whether it would be better to port it to C#. Both solutions have pros and cons. (The DLL is written in Delphi 32 and cannot be easily ported.)
- The most important pro is "code reuse": SQL implements the same math as the applications, and once the DLL is bug-free, SQL math behaves the same as the apps. No need to write code twice, and so on.
- The most important con is about code security: I am not sure whether a hard failure in the DLL could take down the whole server. The situation is very unlikely to happen, given the type of DLL... but never say never.
I'd like to hear some comment from you...
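For the P/Invoke route, a sketch of the SQL side: the managed wrapper assembly has to be catalogued as UNSAFE, since calling into native code is off-limits to SAFE and EXTERNAL_ACCESS assemblies (assembly, class and method names below are hypothetical):
CREATE ASSEMBLY VideoMathWrapper
FROM 'C:\Libs\VideoMathWrapper.dll'
WITH PERMISSION_SET = UNSAFE
GO
CREATE FUNCTION dbo.FramesToTimecode (@frames int) RETURNS nvarchar(16)
AS EXTERNAL NAME VideoMathWrapper.[VideoMath.Converters].FramesToTimecode
GO
That UNSAFE requirement is itself a data point for the decision: a crash in the native DLL can take the whole process with it, which argues for the C# port if reliability is the priority.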
_________________________________
« www.carlop.com × carlop-dev.blogspot.com »
View 1 Replies
Jan 24, 2007
I would like some opinions on how you would deal with the following scenario:
We probably have somewhere from 500 to 1000 reports (written in Crystal). We also have about 120 clients; each client has their own database. One of the reasons for so many reports is that a lot of our customers want report A, but with this or that extra column added, so we end up with a lot of custom versions of one report for a particular client. My question is: in converting over to Reporting Services, how would you set up the folder structure?
Right now, we are thinking that every client is going to have their own folder which would contain all Reports, Models, and Datasources. What do you guys think?
View 1 Replies
Sep 15, 2004
Hearing complaints from users about speed on the db server (I have almost no control over the design; it just has to work), I ran Profiler looking for all SQL statements over 4000 ms, and in one hour it returned over 715 T-SQL statements. Over 300 of these were over 10000 ms. This is on an 8-way Dell with 8 GB of RAM. Looking for opinions: how bad does this look compared to other servers you are taking care of? The cache hit ratio is at 99% and the system queue length is still under 1, but this does not look good.
View 2 Replies
Oct 8, 2007
Hi all, I was wondering if I could get some experienced opinions on SQL hardware to run an ERP app on SQL 2000. The app does not yet support SQL 2005. The ERP app has 25 users and likely won't exceed 30 users for several years. All traffic is on the LAN. The ERP clients basically submit SQL requests for reads and writes. The app makes heavy use of temp tables and temp views, but not many stored procedures. The current size of the db is 6 GB and will likely double in 4 years.
Planned server:
Windows Server 2003
4 GB RAM
SQL 2000 Standard (ERP app does not yet support SQL 2005)
RAID 1 for OS
RAID 10 for SQL data
RAID 1 for SQL logs
RAID 1 for temp db
Dual, teamed NICs
I would try to get 15K SCSI drives. Any thoughts on SATA instead of SCSI? Could I expect much of an improvement by using SQL 2000 Enterprise, since it can use more RAM? I would rather wait for SQL 2005 to be supported. Does anyone have a Dell or HP server configured in an email-able cart that they would care to share? Thank you.
View 3 Replies
Jul 20, 2005
Has anyone here tried ListCleaner, by a company called WinPure (http://www.winpure.co.uk/lists), a data-deduping software, in comparison to other products that may be out there? It looks like some companies, like Hewlett Packard, use this. I am looking for a good (and inexpensive) data-cleansing tool to dedupe and standardise lists, and I am happy to extract the data from the database and re-import it (which it seems WinPure does) before incorporating it into other BI tools. I tried MatchIt from Help It Systems but it is a bit cumbersome. I am particularly interested in a tool with a good phonetic matching engine that handles multiple lists. I mostly work with Oracle, SQL and Access. Recommendations appreciated. dbdb
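Worth noting, since the lists already pass through SQL Server: T-SQL has built-in phonetic helpers, SOUNDEX and DIFFERENCE, which can serve as a rough first-pass match before a dedicated tool (table and column names hypothetical):
SELECT a.FullName, b.FullName
FROM ListA a
JOIN ListB b ON SOUNDEX(a.FullName) = SOUNDEX(b.FullName)
WHERE DIFFERENCE(a.FullName, b.FullName) >= 3   -- scale 0-4; 4 = strongest match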
View 2 Replies