Getting Larger Drive Array

Aug 31, 2006

We are planning to install more disk space and need to temporarily move the data and log files and move them back after the disk space has been added.

Is Detach/Attach the best way to handle this?
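A minimal sketch of the detach/attach approach, assuming a hypothetical database name and file paths:

-- detach, move the .mdf/.ldf files at the OS level, then re-attach from the new location
EXEC sp_detach_db 'MyDatabase'
-- (copy MyDatabase.mdf and MyDatabase_log.ldf to the temporary drive here)
EXEC sp_attach_db 'MyDatabase',
     'E:\TempLocation\MyDatabase.mdf',
     'E:\TempLocation\MyDatabase_log.ldf'
-- repeat in reverse once the new disk space is in place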

View 3 Replies



SQL Server Moving To Larger Drive

Nov 27, 2007

I have a Windows 2003 server with SQL Server 2005 installed. The server is on a small drive and we would like to upgrade to much larger hard drives. I've been hearing of problems using Ghost to get an image and placing the image onto the new drive. I think this is more of a Windows 2003 problem, but this server is for nothing but the SQL Server databases. Does anyone have a clear method of moving this server to the larger drives? TIA.

View 3 Replies View Related

RS2005: Export To Excel Error: Destination Array Was Not Long Enough. Check DestIndex And Length, And The Array's Lower Bounds.

Jan 25, 2007

All,

I am using Reporting Services 2005. One of my reports is getting the following error when I try to export to Excel. It will export to .CSV though.

"Destination array was not long enough. Check destIndex and length, and the array's lower bounds."

Any suggestions would be greatly appreciated. Please copy me at machelle.a.chandler@intel.com.

Machelle

View 10 Replies View Related

How Would I Send A String Array As An Integer Array?

Jun 25, 2007

I have a stored procedure with a parameter that accepts a string of values. In the user interface I use a StringBuilder to concatenate the values (2,4,34,35, etc.) and send them to the stored procedure. The problem is that the query fails, because the field "Officer_UID" is an integer data type and can't be compared against a string-typed parameter.
What would I need to do to convert it to an integer array?
@OfficerIDs as varchar(200) 
Select Officer_UID From Officers Where Officer_UID in (@OfficerIDs)
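One possible approach is dynamic SQL, sketched below. The procedure name is hypothetical, and note that the string concatenation is open to SQL injection, so splitting the string into a table of integers is often preferred:

CREATE PROCEDURE GetOfficersByIds      -- hypothetical name
    @OfficerIDs VARCHAR(200)
AS
BEGIN
    DECLARE @sql NVARCHAR(500)
    -- expand the comma-separated list into the IN (...) clause before the query is parsed
    SET @sql = N'SELECT Officer_UID FROM Officers WHERE Officer_UID IN (' + @OfficerIDs + N')'
    EXEC sp_executesql @sql
END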
 Thanks

View 5 Replies View Related

Can I Used A Shared Drive Rather Than A Mapped Drive With OpenRowSet?

Apr 4, 2008



Hi

I have been trying to use OPENROWSET with a shared drive, and even though the share has "full control" permissions granted to "everyone", and the account that SQL runs under has been granted explicit full control permissions, I am unable to open the file, which itself has no security on it.

Can I not use a \\ (UNC) path, and only use mapped drives?

Thanks

below works...

SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0','Excel 8.0;Database=C:\5People.xls', [Sheet1$])

below doesn't work...

SELECT * FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0','Excel 8.0;Database=\\cluster02\FileManager\5People.xls', [Sheet1$])

View 3 Replies View Related

How To Move A Log File From 'e' Drive To 'f' Drive....

Nov 9, 2000

I am trying to move a log file from one drive to another.

What I have done is add another file to my file group. So now my log has a file on the 'e' drive and one on the 'f' drive. I now want to remove the file on the 'e' drive. I have emptied the file on the 'e' drive. When doing the command:

ALTER DATABASE Uniprodruntime
REMOVE FILE m_rk_runtime_log

I get the following error message..

Server: Msg 5020, Level 16, State 1, Line 1
The primary data or log file cannot be removed from a database.

I have also gone into enterprise manager and tried to delete the file and it does nothing.

Has anyone run into this?
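A hedged sketch: msg 5020 usually means the file being dropped is the database's original (primary) log file, which can never be removed. Checking which file is which, and how a removal or relocation typically goes:

USE Uniprodruntime
SELECT fileid, name, filename FROM sysfiles       -- fileid 2 is the original log file; it cannot be removed
-- a secondary log file can be removed once it holds no active log:
BACKUP LOG Uniprodruntime WITH TRUNCATE_ONLY      -- SQL 2000-era syntax; clears the inactive portion
ALTER DATABASE Uniprodruntime REMOVE FILE m_rk_runtime_log
-- if m_rk_runtime_log turns out to be the original log file, moving it means
-- detaching the database, moving the .ldf at the OS level, and re-attaching instead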

View 2 Replies View Related

SAN Drive Speed Vs Local Drive

Feb 12, 2007

How do you compare a SAN drive vs. a local drive on a 32-bit server?

Is it a good idea to move my DB files to a SAN instead of local disk?

Canada DBA

View 4 Replies View Related

TempDB Keeps Getting Filled And It Is In C Drive But It Should Be In T Drive

Nov 28, 2015

Server: SQL 2008 R2

1: TempDB keeps getting filled. A restart of the server has not fixed it. I shrink it, but the space gets filled again. Now I can't even shrink it anymore.
2: TempDB is at the wrong location. Its current location is: C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLPROD6\MSSQL\DATA\tempdb

How do I change its location? 

C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLPROD6\MSSQL\DATA\tempdb
The correct location of TempDB should be the TempDB (T:) drive, but it's not there.
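A minimal sketch: point tempdb's files at the T: drive and restart the SQL Server service (tempdb is recreated at the new path on startup). The logical names below (tempdev / templog) are the usual defaults; confirm them with sys.master_files, and the T:\TempDB folder is an assumed path:

USE master
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'T:\TempDB\tempdb.mdf')
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'T:\TempDB\templog.ldf')
-- then restart the SQL Server service so tempdb is created at the new location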

View 13 Replies View Related

Help Me : Insert Larger String

Jul 2, 2007

I am using SQL Server 2000 Developer Edition. My table definition is as follows:

CREATE TABLE query_table (id INT IDENTITY (1, 1) NOT NULL, qtext CHAR(4000))

When I try to insert data with a length of more than 256 characters it only inserts the first 256 characters. Please let me know why this is happening.
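A hedged check, in case the data is actually stored in full and only displayed truncated (Query Analyzer's "Maximum characters per column" option defaults to 256):

INSERT query_table (qtext) VALUES (REPLICATE('x', 1000))
SELECT id, LEN(qtext) AS stored_chars FROM query_table   -- LEN ignores the CHAR(4000) padding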

View 2 Replies View Related

Restoring Smaller DB Into Larger DB

Mar 27, 2000

I have a situation where I need to migrate data from an older platform to a newer one. The data from the old system(s) will be available on DAT tapes. All database construction on the new system will be identical to the old one in size and schema, except for one table (call it "ARCHIVE").

If the ARCHIVE table on the old system is 210MB, and the ARCHIVE table on the new system has the same attributes but has been expanded to 380MB in size, can I simply restore the dump for the old table into the new ARCHIVE?

Empirically it works (I have done it with apparent success two times) but I seem to recall that backups are done by pages, and I'm concerned that there may be conditions not being met by simply doing the restore the way I'm planning to do it.

Also, are there any tests or checks built into SQL which I can use to check table integrity on the target ARCHIVE table after the restore?
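On the integrity question, the built-in checks look roughly like this (the database name is hypothetical):

DBCC CHECKTABLE ('ARCHIVE')        -- verifies page linkage and index consistency for one table
DBCC CHECKDB ('MyDatabase')        -- runs the same checks across the whole database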

Any help is greatly appreciated.
Best rgds,
Kevin

View 4 Replies View Related

Data Larger Than 255 Characters

Jul 20, 1998

Is it possible to make a text-field or something like that, which can contain more than 255 characters ?
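A minimal sketch, assuming the TEXT data type is available (it holds character data well beyond 255 characters, up to about 2 GB):

CREATE TABLE notes (id INT IDENTITY(1,1), body TEXT)
INSERT notes (body) VALUES ('a value longer than 255 characters ...')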

View 1 Replies View Related

Alert If Database Is Larger Than XxxGB

Apr 18, 2008

Hi,

I want to be alerted when any of my databases is larger than xxx GB.

I know one way to do that, using a SQL Agent performance-condition alert, but with that approach I need to create a separate alert for every DB.

Do you know a better way to achieve this?
THX
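A hedged alternative: a single scheduled job step that checks every database's total file size and raises one alert, instead of one performance-condition alert per database (the threshold below is an example value):

DECLARE @limit_mb INT
SET @limit_mb = 50 * 1024                                 -- e.g. alert above 50 GB
SELECT DB_NAME(database_id) AS database_name,
       SUM(CAST(size AS BIGINT)) / 128 AS size_mb         -- size is stored in 8 KB pages
FROM sys.master_files
GROUP BY database_id
HAVING SUM(CAST(size AS BIGINT)) / 128 > @limit_mb
-- if any rows come back, the job step can RAISERROR or send an alert e-mail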

View 3 Replies View Related

Evaluating 2 Fields And Returning The Larger

May 22, 2007

I have two int fields in my database, CEOAnnualBonus and CEOBonus, and I want to return whichever one has the larger value as CEOBonusCombined. I thought using COALESCE would do the trick, like below, but there are many cases where either CEOAnnualBonus or CEOBonus has a zero value instead of NULL, so it doesn't work.

SELECT
COALESCE(CEOAnnualBonus, CEOBonus)
AS CEOBonusCombined
FROM tbenchmarktemp
WHERE Ticker='F'
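A minimal sketch using CASE to pick the larger value directly, treating NULL as 0 so a zero in one column doesn't hide a real value in the other:

SELECT CASE WHEN ISNULL(CEOAnnualBonus, 0) >= ISNULL(CEOBonus, 0)
            THEN ISNULL(CEOAnnualBonus, 0)
            ELSE ISNULL(CEOBonus, 0)
       END AS CEOBonusCombined
FROM tbenchmarktemp
WHERE Ticker = 'F'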

Thanks for any help

View 3 Replies View Related

When I Delete Data, The Database Gets Larger?

Jan 8, 2008



I have written code to combine and delete redundant data in my system. The table structure remains the same, except I changed some INTs to TINYINTs.

When I run sp_spaceused, it tells me that the number of rows is smaller (which is correct), but the data size and index size are significantly larger AFTER the deletions.

I tried using shrink, but that doesn't seem to change anything.

When I right-click the database, and choose PROPERTIES, it also confirms that the database got significantly larger.

I am confused about how deleting data and changing to TINYINTs could make my database bigger. What would cause this?
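A hedged sketch: after large deletes and column-type changes the space is typically not reclaimed until the indexes (including the clustered index) are rebuilt; shrinking alone won't compact the pages. The table name below is hypothetical:

ALTER INDEX ALL ON dbo.MyTable REBUILD        -- SQL 2005+ syntax; DBCC DBREINDEX on SQL 2000
EXEC sp_spaceused 'dbo.MyTable'               -- re-check the size afterwards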




View 9 Replies View Related

Backup/recovery Larger Databases...

Apr 10, 2008

Greetings all and thanks for reading this post.

Here is my situation... I have 2 fairly large databases. Full backups are 83 GB and 63 GB. I am in the process of moving these databases to a new data center. I've taken full backups of these databases and shipped them to the new center. I have been taking transaction log backups (the larger db every 24 hrs, the smaller db every 15 min, from log shipping).

I want to restore these databases in the new data center. I've gone ahead and restored the dbs in the new location.

Question on the final cutover: can I just apply the transaction logs to the databases at cut-over, or do I have to restore the database backup first and then apply the transaction logs?

Is there an other way to do this that I'm missing?
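A hedged sketch: if the full backups were restored WITH NORECOVERY (or STANDBY), only the remaining log backups need to be applied at cut-over; if they were restored WITH RECOVERY, the full backup has to be restored again first. Database and file names below are hypothetical:

RESTORE DATABASE BigDb FROM DISK = 'D:\backups\BigDb.bak' WITH NORECOVERY    -- already done earlier
RESTORE LOG BigDb FROM DISK = 'D:\backups\BigDb_log_1.trn' WITH NORECOVERY
RESTORE LOG BigDb FROM DISK = 'D:\backups\BigDb_log_2.trn' WITH RECOVERY     -- final log at cut-over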

Thanks.
Kurt

View 3 Replies View Related

Subscriber Database MUCH Larger Than Publisher

Dec 7, 2006

I have replication set up between our main site and a remote site, and have recently noticed that the .MDF file of the database at the remote site is about three times as large as the main site's. This doesn't seem to make sense, since essentially all of our data is replicated between the two servers. Can anyone suggest why this might be happening and what is safe to do to shrink the remote file?

TIA

Ron L

View 8 Replies View Related

Problem Accessing SDF File Larger Than 128 MB

Feb 20, 2007

Hello.

I created SDF database files for SQL Compact with a size larger than 128 MB (which is the default maximum at creation). Now when I try to open these files with VS2005 I get the error "The size of the database file exceeds the configured maximum... Required Max Database size (in MB; 0 if unknown)" (translated from German). The real problem is that I cannot change the connection string in VS, because all fields which show the connection string are read-only (greyed out). I know I have to set the option "Max Database Size = 512" or so in the connection string to get things running, but I don't know any way to do that in VS2005.
My attempt to access the SDF files when copied to the device via ActiveSync results in the same error message.

This seems to me to be a design flaw, because I cannot add the optional parameter to the connection string (not even in the details form, where only "DataSource" and "Password" fields are displayed).
- Does anybody know a solution in VS2005?
- Does anybody have a workaround for me?
- Does anybody know where the connectionstrings of VS2005 are stored, so
that I may "hack" the connection string?
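For reference, a hedged example of the connection string with the size option in question set (the .sdf path is hypothetical):

Data Source=\Program Files\MyApp\MyData.sdf;Max Database Size=512;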

Thanks so far.

View 7 Replies View Related

Moving A SQL Server 2000 Database From A Local Drive To Another Local Drive

Jan 31, 2008

Being a very novice SQL Server administrator, I need to ask the experts a question.

How do I go about moving a database from one drive to another? The source drive (C:) is local to the server, but the target drive (E:) is on a Storage Area Network (SAN), although it is still a local drive for the server. I want to move the database from C: to E:. Can someone provide me with instructions?

Thanks,
Rick

View 4 Replies View Related

Dbuse Fails When Database Name Is Larger Than 28 Characters

Jul 20, 2005

Hi, we have trouble when we try to use the 'dbuse' calls with database names larger than 28 characters; it looks like dbuse truncates the name after that. Any ideas?

View 4 Replies View Related

Analysis :: Power BI On Larger Data Sources

Jun 1, 2015

We have been trying for a while to use Power BI tools (Power Query, Power BI Designer (Desktop) and Power BI Designer (Cloud)) against larger data sources (2-7 million records in the fact table), unfortunately so far with unsatisfactory results. It seems that these tools are just not designed to handle this kind of data volume?

To my understanding it seems that the approach with PowerBI components is to always try to create a local (or cloud based) cache that only has a limited capacity. 2m+ records already seem to be exceeding this limit. So as opposed to firing off a query on demand (for example only based on distinct filter options as opposed to entire fact set) only the intermediary cache of the model can be used.

Our initial focus was to use a normalized table in SQL Server with around 4m records. The first problem is that Power Query/Power BI Designer fails to provide a complete list of the distinct filter items.

This I can understand in some way as doing a distinct on a large data set like that is not trivial. As a workaround I could imagine to setup a star scheme dimension table with the distinct "dimension members".  I.e. that filters are driven by this table which has only a few thousand rows. The filters there are then applied to the fact table. I haven't found a way to do this effectively with Power Query. 

Another option that we have tried was using the cloud-based Power BI service in conjunction with a SQL cloud service holding the same data set. That unfortunately didn't work. The setup of the source works fine, but as soon as we try to start a query by dragging a value field onto the dashboard, errors occur.

View 5 Replies View Related

Pagesize Of Report Is Much Larger Than Expected When Printed

Jun 14, 2006

I created a report with RS in VS.NEt and set the width and height to 8.5in by 11in from the property window. When I designed it , everything fit on one page nice and neat.

When printed it prints on two pages and the font size comes out much larger than expected. The whole document seems to have been blown up bigger, and the right side of the document has been cut off. Why is this? Do I need to configure VS.NET for printing?

Am I missing some setting somewhere?

Other documents print out fine on this printer, so it is not the printer.

Any help would be greatly appreciated, thank you

This seems small, but if I can't get the report to print out right then...

View 5 Replies View Related

Transfer Over To More Powerful Backend Due To Need For Larger Size Capacity

Mar 27, 2012

I have an existing Access database that I need to transfer over to a more powerful back-end due to the need for larger size capacity. We need a backend that can grow to just about any size, because we scan in documents over ODBC. With Access I know I was limited to about 4 GB, and when split onto my current SQL Server I have heard I will be stuck at 10 GB? If so, can you recommend a better backend? My main question, though, is about the front end: I hear a Windows WPF front end can be linked to SQL Server, but does this limit the size as well?

View 3 Replies View Related

Problem Running Package With 'larger' Amount Of Data

Jun 9, 2006

Dear,

I created a package getting data from files and database sources, doing some transformations, retrieving dimension id's and then inserting it into a fact table.

Running this package with a limited amount of data (a few hundred thousand records) does not result in any errors and everything goes fine.

Now running the same package (still in debug mode) with more data (about 2,000,000 rows) doesn't result in any errors either, but it just stops running. In fact, it doesn't really stop, but it doesn't continue either. If I had only been waiting for some minutes or hours, I could think it's still processing, but I waited for about a day and it is still 'processing' the same step.

Any ideas on how to dig further into this in order to find the problem? Or is this a known problem?



Thanks for your ideas,

Jievie

View 4 Replies View Related

SQLServer2000 Xp_sendmail Message Larger Than 8000 Chars

Sep 3, 2007

Books Online gives a way to send a message larger than the VARCHAR max of 8000 chars, but the @query argument to xp_sendmail is a simple text string and my data is much more complex and formatted. Also, BOL shows an example using a temporary text file, but it is not clear precisely how you write your insert statements. I tried the following, which writes out all the data and sends it OK, except that after each row there is about a page of blank spaces. What is wrong with my syntax?

SET LANGUAGE British
GO
DECLARE @msgstr VARCHAR(80)
DECLARE @cmd VARCHAR(80)
DECLARE @PMID INT
DECLARE @forename VARCHAR(30)
CREATE TABLE ##texttab (c1 text)
SET @msgstr = 'THE FOLLOWING QUOTES ARE CURRENTLY MARKED AS PENDING:'
INSERT ##texttab SELECT @msgstr
DECLARE C2 CURSOR FOR SELECT ProjMgrID FROM surdba.SVY_QUOTES WHERE StatusID=6
OPEN C2
FETCH NEXT FROM C2 INTO @PMID
WHILE @@FETCH_STATUS = 0
BEGIN
IF @PMID > 1000
SELECT @forename = ISNULL(Forename,' ') FROM surdba.SVY_PERSONNEL_GENERAL WHERE EmployeeID = @PMID
ELSE
SET @forename = ' '
INSERT ##texttab values (RTRIM(@forename))
FETCH NEXT FROM C2 INTO @PMID
END
CLOSE C2
DEALLOCATE C2
INSERT ##texttab values ( ' - This information is autogenerated from the Survey database.')
SET @cmd = 'SELECT c1 FROM ##texttab'
EXEC master.dbo.xp_sendmail @recipients = 'Robin Pearce',
@subject = 'ALL PENDING QUOTES',
@query = @cmd,
@no_header = 'TRUE'
DROP TABLE ##texttab
GO
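A hedged guess at the trailing blanks: the c1 column is TEXT, and xp_sendmail pads each result column out to its display width. One common suggestion is to cast the column down to a fixed-width VARCHAR and/or set xp_sendmail's optional @width parameter (default 80), roughly as sketched below (untested):

SET @cmd = 'SELECT CONVERT(VARCHAR(250), c1) FROM ##texttab'
EXEC master.dbo.xp_sendmail @recipients = 'Robin Pearce',
     @subject   = 'ALL PENDING QUOTES',
     @query     = @cmd,
     @no_header = 'TRUE',
     @width     = 256          -- characters per output line before wrapping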


Would appreciate any help on this one, I do not have time to learn HTML,
thanks
Robin Pearce

View 1 Replies View Related

Moving Database Instance To Larger Storage Area?

Sep 16, 2015

I have a SQL server with multiple instances on it and would like to move one of them to a drive with more storage.  

I have SQL 2010 on a server with 2 partitions.

The database is located on the C: drive (original build) but the drive isn't partitioned to handle a db of the size that this one will grow to. I would like to move the full DB instance to another partition.

View 6 Replies View Related

SQL 2012 :: How To Backup Half Of DBs From A Server On C Drive And Other Half On D Drive

Jan 16, 2015

How do I back up half of the databases on a server to the C drive and the other half to the D drive, and vice versa (first half to D and the other half to C), using only one job and one stored procedure?

Using scheduling, add 2 schedules to the job so that the first schedule backs up the first half to C and the second half to D, and the second schedule backs up the first half to D and the second half to C.
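A hedged sketch: one procedure that splits the databases alphabetically in half and alternates the target drive based on a parameter the two job schedules pass in. Procedure name and backup folder are hypothetical:

CREATE PROCEDURE dbo.BackupHalves
    @FirstHalfDrive CHAR(1)                    -- 'C' or 'D'
AS
BEGIN
    DECLARE @name SYSNAME, @rank INT, @total INT, @drive CHAR(1), @path VARCHAR(260)

    SELECT @total = COUNT(*) FROM sys.databases WHERE database_id > 4   -- skip system DBs

    DECLARE dbs CURSOR FOR
        SELECT name, ROW_NUMBER() OVER (ORDER BY name)
        FROM sys.databases WHERE database_id > 4
    OPEN dbs
    FETCH NEXT FROM dbs INTO @name, @rank
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- first half goes to @FirstHalfDrive, second half to the other drive
        IF @rank <= @total / 2 SET @drive = @FirstHalfDrive
        ELSE SET @drive = CASE @FirstHalfDrive WHEN 'C' THEN 'D' ELSE 'C' END

        SET @path = @drive + ':\Backups\' + @name + '.bak'
        BACKUP DATABASE @name TO DISK = @path WITH INIT

        FETCH NEXT FROM dbs INTO @name, @rank
    END
    CLOSE dbs
    DEALLOCATE dbs
END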

View 1 Replies View Related

Replication :: Drop One Of Larger Table Indices On Subscriber Database

Jul 31, 2015

We have a database we are replicating to about 8 SQL Express subscribers from a SQL 2012 SP2 publisher.  The size of the database grew too large for the 10GB license limit for SQL Express and now replication refuses to replicate any of our deletions on the publisher to reduce the size of the database.  I've come up with a few options below.

1) Drop one of the larger table indices on the subscriber database to get below the size restriction. Permit the replication to replicate the deleted records and then rebuild the index (a disable/rebuild sketch follows below). (I'm not sure how important an index is to this table. Is it merely performance related?)

2) "Upsize" SQL Express to SQL Standard on the affected boxes.  Allow the deletes to replicate.  Backup the database, downgrade to SQL Express and restore the database back to SQL a new SQL express instance.  This would involve a lot of work on each box. I'd like to avoid it if possible.

View 2 Replies View Related

The Database File Is Larger Than The Configured Maximum Database Size.

Mar 20, 2007

I'm getting this error while trying to insert records into a SQL Server Compact Edition database. I have pasted my connection string that was used when creating the database as well as for accessing that same database from my Windows application.

Thanks for any help any of you can give!

Data Source=OnTheGo.sdf;Encrypt Database=True;Password=<password>;Max Database Size=4091

View 3 Replies View Related

Recovery :: Differential Backup Much Larger Than Database / Full Backup

Nov 16, 2015

I have a database that is just over 1.5 GB, yet the full backup is 13 GB. I'm not sure how this can be, since we have compression on for full backups and my other full backups are much smaller than their respective databases. The full backup is taken every Sunday night and differentials are taken every 6 hours after the full backup. I have been thrown into this DBA role with little to no experience, just what I have picked up and read, so my understanding of backups is limited. What I think I understand is that we take a full backup and the differential only captures what changes in the database afterwards, so my question is: why is my database 1.5 GB but my differential 15.4 GB? I have other databases on the same instance that don't seem to have this problem. I also just noticed that we do not rebuild the indexes before a full backup like we do on other instances...

View 13 Replies View Related

Database File Is Larger Than The Configured Max Database Size.

Feb 18, 2008

Hello,

I am developing a smart device application with Visual Studio .Net 2005 and SQL Server Compact Edition database. And also using merge replication to synchronize the data from the mobile device to the SQL Server.

My database size is around 350 MB, so when I try to synchronize, this is the error message that I get:
"The database file is larger than the configured maximum database size. The setting takes effect on the first concurrent database connection only. [Required Max Database size (in MB; 0 if unknown)=129]."

I tried changing the Max Database Size in the connection string, shown below, and still did not have any luck.

connstr= "Data Source=Storage CardItems.sdf;Max Database Size=500;"

Any help regarding this would be appreciated.

Thank you
.

View 6 Replies View Related

Smaller Tables From Larger Tables

Nov 17, 2007

hi. I am very new to databases.
At the moment I have one big database table that has column titles such as author, book title, rating, isbn, publisher, publish date, copyright date, graphical image review, availability, genre.
I want to break this large table down into, say, four smaller relational tables, partly because it is so large and partly to make it easier to bind the data to the web site. I also want to keep the large table.
How can I do this? I use SQL Server Management Studio Express. The database is on my host's database server. I also use VWD 2005 Express Edition.
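A hedged sketch of one possible split; table and column names follow the fields listed above and the source table name is hypothetical, so adjust to the real schema:

CREATE TABLE Authors    (AuthorId    INT IDENTITY(1,1) PRIMARY KEY, AuthorName    VARCHAR(200))
CREATE TABLE Publishers (PublisherId INT IDENTITY(1,1) PRIMARY KEY, PublisherName VARCHAR(200))
CREATE TABLE Genres     (GenreId     INT IDENTITY(1,1) PRIMARY KEY, GenreName     VARCHAR(100))
CREATE TABLE Books (
    BookId        INT IDENTITY(1,1) PRIMARY KEY,
    Title         VARCHAR(300),
    ISBN          VARCHAR(20),
    AuthorId      INT REFERENCES Authors(AuthorId),
    PublisherId   INT REFERENCES Publishers(PublisherId),
    GenreId       INT REFERENCES Genres(GenreId),
    Rating        INT,
    PublishDate   DATETIME,
    CopyrightDate DATETIME,
    Availability  VARCHAR(50)
)
-- populate the lookup tables from the existing big table, e.g.:
INSERT Authors (AuthorName) SELECT DISTINCT author FROM BigBookTable   -- hypothetical source table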
 
thanks for your time
nick
 

View 8 Replies View Related

Array

Jul 17, 2007

How can I send an array from ASP.NET to SQL Server?

View 3 Replies View Related

Get Array Value From A Loop?

Jun 19, 2006

Hi, I am trying to loop while a list of values is assigned ('CHP,CNH,COW'), using a comma separator to get each list value, but it doesn't really do what I am trying to do. How do I loop through each value and do the rest?
=====================================
DECLARE @ABBR AS NVARCHAR(50), @SEP AS NVARCHAR(5), @ITEM AS NVARCHAR(50)
SET @ABBR = 'CHP,CNH,COW' + ','    -- trailing separator so the last value is picked up too
SET @SEP = ','
WHILE CHARINDEX(@SEP, @ABBR) > 0
BEGIN
    SET @ITEM = LEFT(@ABBR, CHARINDEX(@SEP, @ABBR) - 1)               -- current list value
    SET @ABBR = SUBSTRING(@ABBR, CHARINDEX(@SEP, @ABBR) + 1, LEN(@ABBR))
    -- do the rest with @ITEM
END

View 1 Replies View Related






