In MS SQL Server 2000, how can I expand the transaction log or use multiple transaction log files? The Windows hard disk I am using doesn't have more than 4 GB free, and the query I want to run needs more space than that.
I have another HDD with 20-30 GB of free space, and I want to use that disk either to hold a second transaction log file or to move the existing log onto it.
Can this be done, and how?
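For reference, both approaches work in SQL Server 2000. A minimal sketch, assuming a hypothetical database named MyDatabase and the bigger disk mounted as D::

-- Option 1: add a second log file on the other disk
ALTER DATABASE MyDatabase
ADD LOG FILE (
    NAME = MyDatabase_log2,
    FILENAME = 'D:\SQLLogs\MyDatabase_log2.ldf',
    SIZE = 500MB,
    FILEGROWTH = 10%
)

-- Option 2: move the existing log by detaching and reattaching
EXEC sp_detach_db 'MyDatabase'
-- (copy the .ldf file to D:\SQLLogs\ in the file system, then:)
EXEC sp_attach_db 'MyDatabase',
    'C:\SQLData\MyDatabase.mdf',
    'D:\SQLLogs\MyDatabase_log.ldf'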
I want to back up my hourly transaction log backups directly to a mass storage unit as opposed to the local server. However, when trying to set this up, it only gives me the option of backing up to local drives, even though I have a drive mapping to the mass storage unit. I'm sure there is a simple way around this. Would appreciate any advice. Many thanks.
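For reference: the SQL Server service usually can't see drive letters mapped in an interactive session, but a UNC path works, provided the service account has permissions on the share. A minimal sketch (server, share, and database names are made up):

BACKUP LOG MyDatabase
TO DISK = '\\MassStorageUnit\SqlBackups\MyDatabase_log.trn'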
How to implement distinct storage tiers on SQL Remote BLOB Storage (RBS)?
I want to use this SQL feature to move files (images, videos, PDF files) from a database to a separate database dedicated to RBS. Then I want to have several storage tiers, where objects are saved and moved according to access frequency. Old data will be archived on cheap storage, but it must always be accessible if needed.
Description:
- 1st and main tier: new and frequently accessed objects stored on high-performance storage;
- 2nd tier: older or less frequently accessed objects automatically moved to a different, inexpensive storage tier;
- in all cases, all objects must be accessible to all users, but access to archived objects (2nd tier) will be much slower; a sketch of the bookkeeping side follows below.
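As far as I know, RBS does not move blobs between tiers on its own, so the tier decision has to come from your own bookkeeping, and the actual move would go through the RBS provider API. A purely hypothetical sketch of the metadata side (all names invented):

-- track when each blob was last fetched
CREATE TABLE dbo.BlobAccess (
    BlobId     uniqueidentifier NOT NULL PRIMARY KEY,
    Tier       tinyint  NOT NULL DEFAULT 1,   -- 1 = fast storage, 2 = archive
    LastAccess datetime NOT NULL DEFAULT GETDATE()
);

-- candidates for demotion to the archive tier, e.g. run from a nightly job
SELECT BlobId
FROM dbo.BlobAccess
WHERE Tier = 1
  AND LastAccess < DATEADD(month, -6, GETDATE());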
I am a Windows developer for the IBM Tivoli Storage Manager Server (TSMS) product. Our product installation is built with InstallShield and uses the Windows Installer.
On a new installation of Windows 2003 x64 Storage Server R2, at a customer's site, the TSMS product fails to install. The install of the OS has version 3.01.400.3959 of the Windows Installer and I see no newer version that installs.
Part of our product is 32-bit (console) and another part is x64 (server). When installing, I can see that the install's default is being redirected/reset to C:\Program Files (x86)\Tivoli\TSM after it is explicitly set by a custom action to ..\Program Files\.. . I further observe that our custom actions to write 64-bit registry entries are being refused.
REGSAM samMask = KEY_ALL_ACCESS;
if ( regIsWow64Process() )               // our helper: is this a 32-bit process on 64-bit Windows?
    samMask = samMask | KEY_WOW64_64KEY; // ask for the 64-bit registry view

lStatus = RegCreateKeyEx( hLocalConnectKeyRoot, // parent key
                          szSubkey,             // subkey to create
                          0L,                   // reserved
                          NULL,                 // class
                          REG_OPTION_NON_VOLATILE,
                          samMask,              // desired access, incl. KEY_WOW64_64KEY
                          NULL,                 // default security attributes
                          hKey,                 // receives the created key (PHKEY)
                          &dw );                // receives the disposition

The above fails to create the key.
We have tried four versions of our TSMS spanning many changes, but the install acts the same. This does not happen on any other Windows OS we test on, but we do not test on Windows 2003 Storage Server R2, since it is an OEM product. We did test on Windows Server 2003 R2 x64 and do not see this problem.
Do you have any suggestions on how to tackle this problem? I have full installation traces, but I can only see that the registry work is being refused; I can't see why.
I'm getting this when executing the code below. Going from W2K/SQL2k SP4 to XP/SQL2k SP4 over a dial-up link.
If I take away the BEGIN TRAN and COMMIT it works, but of course, if one statement fails I want a rollback. I'm executing this from a Delphi app, but I get the same from Query Analyzer.
I've tried both with and without the SET XACT_ABORT, and also tried with SET IMPLICIT_TRANSACTIONS OFF.
SET XACT_ABORT ON

BEGIN DISTRIBUTED TRAN

UPDATE OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TRANSACTIONMAIN
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE TSADMIN.TRANSACTIONMAIN
SET REPFLAG = 0 WHERE REPFLAG = 1 AND DONE = 1

UPDATE OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.WBENTRY
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE TSADMIN.WBENTRY
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.FIXED
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE TSADMIN.FIXED
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.ALTCHARGE
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE TSADMIN.ALTCHARGE
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TSAUDIT
SET REPFLAG = 0 WHERE REPFLAG = 1

UPDATE TSADMIN.TSAUDIT
SET REPFLAG = 0 WHERE REPFLAG = 1

COMMIT TRAN
It's got me stumped, so any ideas gratefully received. Thanks.
I have designed an SSIS package for an ETL process. In my package I have to read data from tables and then insert it into another table with the same structure.
To read the data I have written dynamic T-SQL based on some conditions; depending on those, it uses 25 different functions to populate 25 different columns. The T-SQL returns correct data and works fine in Enterprise Manager, but in my SSIS package it gives me a timeout ERROR.
I have increased and decreased the timeout to catch the error, but it is still there. I have also tried setting 0 for the CommandTimeout property.
If I use 0 for CommandTimeout, then I get "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."
and
Failed to open a fastload rowset for "[dbo].[P@@#$%$%%%]". Check that the object exists in the database.
I am getting this error: "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction." Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code. Exception Details: System.Data.OleDb.OleDbException: Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction. Does anybody have an idea?!
I have a sequence container; inside it I have a script task that drops the existing tables. This sequence container is connected to another sequence container, and all of these are inside a For Each Loop container. When I run the package it works fine on the first loop, but it gives me an error on the second execution.
The message is like this:
Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction.
I am getting this error: "Distributed transaction completed. Either enlist this session in a new transaction or the NULL transaction."
My transactions are done using a LINKED SERVER. When I manually call the stored procedure from Server 1 it works, but when I call it through Service Broker it doesn't work and gives me this error.
Hello Everyone and thanks for your help in advance. I am developing a document storage application for an intranet that will store various Word, Excel, and PDF documents. Most of the examples I see utilize SQL Server and an image field rather than the FileSystem Object to store documents. My concern with this method is that some of the documents may be several hundred pages (not exactly sure of the actual file size yet, but they must be fairly large). My question is, where does the use of SQL Server become impractical for this type of application? Any insight would be greatly appreciated. Thanks.
Does anyone know the upper limit of data size that one SQL 2K table can hold? I've seen 50 GB tables on some warehousing servers, but what is the true limit? Does the limit vary with the SQL 2K version?
I have an MSDE installation on Windows Server 2003. It looks like the C: drive is taking the brunt of the data when I load up the database. I would like to specify a different drive for data. Is there a way to do this?
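For reference, file placement is set per database: either create the database with its files on the other drive, or detach/reattach an existing one after moving its files. A minimal sketch with hypothetical paths (MSDE accepts the same T-SQL as SQL Server 2000, e.g. via osql):

CREATE DATABASE MyDatabase
ON PRIMARY (
    NAME = MyDatabase_data,
    FILENAME = 'E:\SQLData\MyDatabase.mdf'
)
LOG ON (
    NAME = MyDatabase_log,
    FILENAME = 'E:\SQLData\MyDatabase_log.ldf'
)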
How can I find out the size of the tables in the DB? Suppose my DB has 5 tables and the size of the DB is 500 MB. How can I know the size of each individual table?
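sp_spaceused reports this per table ('dbo.MyTable' below is just an example name):

EXEC sp_spaceused 'dbo.MyTable'

-- or, for every user table at once, via the undocumented but widely used helper:
EXEC sp_MSforeachtable 'EXEC sp_spaceused ''?'''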
I'm migrating an images DB for a system. I know the structure of the data tables and all the types of data in them, but how can I learn about the STORAGE of IMAGES in SQL Server? Where can I find information about that? I need to know how image storage is usually done.
I am currently developing a system that stores exchange stats in a DB. Since our customers are companies with anywhere from 20 employees up to 5,000, there is a big difference in the volume of data that needs to be stored.
We are currently thinking of supplying a SQL Server Express DB to the small customers and suggesting a full SQL Server to the bigger ones.
But since I would like to use the same structure for both types of customers, I wonder how I should design the storage.
There could be from 500 records a day up to 20,000. They are quite simple records with only simple data types: about 15 fields of no more than 10 characters each, mostly 2.
Should I separate the data into different tables per week or day, etc.? Since I am only going to filter data on 1 or 2 fields, the data will be easily indexed.
The reports generated will almost always use only 1-3 months of data, but historical reports have to be possible.
My question is, of course: what's the best solution for me?
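For scale: even the worst case, 20,000 rows a day, is only around seven million small rows a year, which a single well-indexed table handles comfortably, so splitting into per-day or per-week tables is probably unnecessary. A minimal sketch of the single-table approach (all names invented):

CREATE TABLE dbo.ExchangeStats (
    StatDate   datetime    NOT NULL,
    CustomerId int         NOT NULL,
    Field1     varchar(10) NOT NULL,
    -- ...the remaining dozen or so small fields...
    Field15    varchar(2)  NOT NULL
);

-- a clustered index on the date keeps the typical 1-3 month report range cheap
CREATE CLUSTERED INDEX CIX_ExchangeStats_Date
    ON dbo.ExchangeStats (StatDate, CustomerId);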
In addition to ordinary data, what other types of information can be stored in SQL databases? I need to store pictures and MP3s. Can that be done? If not, do you know what storage can be used for this purpose?
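Yes: binary files go into a BLOB column, and the bytes themselves are written and read from client code as a binary parameter. A minimal sketch (table and column names invented; the BLOB type is image on SQL Server 2000 and varbinary(max) on 2005 and later):

CREATE TABLE dbo.MediaFiles (
    MediaId  int IDENTITY PRIMARY KEY,
    FileName nvarchar(260) NOT NULL,
    MimeType varchar(100)  NOT NULL,  -- e.g. 'image/jpeg' or 'audio/mpeg'
    Content  image         NOT NULL   -- use varbinary(max) on SQL Server 2005+
);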
I have recently designed and built my first database using SQL server 2005 express. I have included an image (BLOB) column in one of the database tables. This is a bad idea according to some experts, and some say it is OK!
I am currently carrying out a trial with just 3 pictures via Visual Basic 2005 express forms, and there is no problem so far as the images are displayed for each record. But I anticipate between 300 - 1000 images for the table, and this could pose real problems for SQL server 2005 express and Visual Basic 2005 express, I guess.
I have just been reading that the cost of storing large images in the database is too high! I have also read that it's better to store images (BLOBs) in the file system because it is cheaper to store them there, no matter how many there are.
But the question is how I can reference an image at a path like C:\Picture\Product\Grocery 0052745.jpg in the database table, so that when I select a record in the Visual Basic 2005 forms the image is displayed accordingly, similar to when it is stored directly in the database table? Your help is very much appreciated.
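One common pattern is to store just the path in a plain text column and let the client load the file from disk. A sketch with invented names:

CREATE TABLE dbo.Product (
    ProductId int IDENTITY PRIMARY KEY,
    Name      nvarchar(100) NOT NULL,
    ImagePath nvarchar(260) NOT NULL  -- e.g. 'C:\Picture\Product\Grocery 0052745.jpg'
);

On the Visual Basic side, the form then reads ImagePath from the selected record and loads the file into the PictureBox (e.g. with Image.FromFile) instead of streaming bytes out of the table.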
From what I've read, if a row contains more than 8060 bytes and has varchar(MAX) columns in it, the data in those varchar(MAX) columns will be stored off-row. But what happens if you have two varchar(8000) columns instead and both contain more than 4030 bytes, is the data for both stored off-row? If so, just for that row, or for all rows in that table? And is there ever a good reason to have two varchar(8000) columns in a SQL Server 2005 table, instead of using varchar(MAX)?
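As I understand it, row-overflow is decided per row: when an individual row won't fit in 8,060 bytes, variable-length columns are pushed off-row (largest first) until it fits, so shorter rows in the same table stay fully in-row. You can watch it happen in SQL Server 2005 with the catalog views; a sketch, assuming a non-partitioned table so the partition_id join is safe:

CREATE TABLE dbo.TwoVarchars (Col1 varchar(8000), Col2 varchar(8000));
INSERT dbo.TwoVarchars VALUES (REPLICATE('a', 5000), REPLICATE('b', 5000));

-- check whether the ROW_OVERFLOW_DATA allocation unit actually holds pages
SELECT au.type_desc, au.total_pages
FROM sys.allocation_units AS au
JOIN sys.partitions AS p ON au.container_id = p.partition_id
WHERE p.object_id = OBJECT_ID('dbo.TwoVarchars');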
Hi all, from the following link http://msdn2.microsoft.com/en-us/library/ms143432.aspx I got this information:
Bytes per row: 8,060
Bytes per varchar(max), varbinary(max), xml, text, or image column: 2^31 - 1
Characters per ntext or nvarchar(max) column: 2^30 - 1
So does this mean we can store up to 2^31-1 bytes in a varchar(max) column? On the other hand, we can store only 8,060 bytes in a row, so the concept of row-overflow data exceeding 8 KB comes into the picture. My question is: suppose I declare a variable of type varchar(max); I can then store up to 2^31-1 bytes in it, and SQL Server on its own determines how and where to store it?
Hello group, is it possible to do a storage snapshot of a running MS SQL database without losing transactions? What tasks must be done before such a snapshot? Thanks in advance, Bernhard
I want to know how much data SQL Server Compact Edition can store. I've got a DB on a Pocket PC with a table that has about 2,000 records in it; is that too many records?
It sounds like a column can be added to each row in a table that is the CHECKSUM or BINARY_CHECKSUM of an expression. How many bytes does each of these occupy? Does the answer depend on the number and/or length of the items in the expression?
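Both CHECKSUM and BINARY_CHECKSUM return int, so a computed column based on either occupies 4 bytes regardless of how many items are in the expression or how wide they are. A quick sketch (names invented):

CREATE TABLE dbo.Example (
    Col1    varchar(50),
    Col2    int,
    RowHash AS BINARY_CHECKSUM(Col1, Col2)  -- int: 4 bytes, independent of the inputs
);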
I am using the Compact Framework to create applications, and now I am facing a problem.
I am getting a "Not enough storage is available to complete this operation" exception.
I am testing my application on an O2 Pocket PC running SQL Mobile Edition, and this error is causing me trouble. It happens especially when I am opening a SQL Server connection or running complex joined queries. Any ideas?
I am new to Windows Mobile development, and I am looking to create an application (C#) which uses SQL Server CE. I am curious what to do about data loss. I have a Windows Mobile 2003 PDA, which means half of my storage is main memory (volatile) and the other half is built-in storage (non-volatile).
I expect I would want to install my application to main memory, along with the database. I planned to give the user the ability to select a subset of properties for running the application and to store them in the DB. But if the user's battery runs out, I wouldn't want them to lose all of the information in the database (i.e. the configurable property selections chosen by the user).
If the user loses the application, I plan to have them reinstall it (which would recreate the database). But I would like them to be able to load a 'configuration' file from their built-in storage area when the app is reinstalled, so that they don't lose all of their settings.
Any experience with this? Or recommendations on how to go about it?
Hi all, I have a field named Information of type varchar(8000), but sometimes the data exceeds 8000 characters. My client told me to make this field store unlimited data. How can I achieve this? I am using VS 2003 (ASP.NET with VB.NET) with SQL 2000. Thanks, Shally
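For reference: on SQL 2000 the type for this is text (varchar(max) only arrived in SQL 2005), and as far as I recall a varchar column can't be ALTERed in place to text. One migration sketch, with an assumed table name:

-- add a text column alongside the old one
ALTER TABLE dbo.MyTable ADD InformationText text NULL

-- copy the existing data across
UPDATE dbo.MyTable SET InformationText = Information

-- drop the old column and take over its name
ALTER TABLE dbo.MyTable DROP COLUMN Information
EXEC sp_rename 'dbo.MyTable.InformationText', 'Information', 'COLUMN'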
Does anyone know if SQL 7 will work with Storage Area Networks (SANs)? I've read that SQL 2000 implements something called a Virtual Interface System Area Network (VI SAN) that allows communication with devices connected via a SAN.
My site is installing a SAN, and I need to know whether SQL 7 can utilize those resources (storage, etc.) and, if so, how reliable it is. Randy
I am having trouble trying to import a big file (approx. 250 MB in size) into a SQL Server database, and I keep getting the message:
"Not enough storage is available to complete this operation".
The application tries to import the file by executing a stored procedure:
CREATE PROCEDURE sp_updateMaterialBlob
    @MaterialId int,
    @BLOB image
AS
BEGIN
    UPDATE Material
    SET blob = @BLOB
    WHERE id = @MaterialId
END
The application uses an ADO connection. I've tried increasing the memory of the client machine, but that didn't work. Whenever I run the import, nearly all the memory on the machine is used up, and every time, after several hours, I get the same error message. What is the cause of the problem, and how do I resolve it? Ideally I want to use my application to do the import rather than anything bespoke.
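Passing a 250 MB value as a single parameter forces ADO to hold the whole file in client memory, which fits these symptoms. One common workaround on SQL 2000 is appending to the image column in pieces with UPDATETEXT. A hypothetical sketch of a chunk-append procedure that the client would call repeatedly with, say, 1 MB pieces:

CREATE PROCEDURE sp_appendMaterialBlobChunk
    @MaterialId int,
    @Chunk image
AS
BEGIN
    DECLARE @ptr binary(16)

    -- make sure the column is non-NULL so TEXTPTR returns a valid pointer
    UPDATE Material SET blob = 0x WHERE id = @MaterialId AND blob IS NULL

    SELECT @ptr = TEXTPTR(blob) FROM Material WHERE id = @MaterialId

    -- a NULL insert offset appends the chunk to the end of the current value
    UPDATETEXT Material.blob @ptr NULL 0 @Chunk
END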
Can anyone explain to me how a column defined with a "bit null" data type is physically stored in MSSQL? Is it stored like a "tinyint null" physically? In other words, how many bytes on the row on the page does a "bit null" data type consume (assuming the current value is a non-null 0 or 1)?
Is there any good documentation about the physical storage layout for a data page?
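For poking at actual pages there is also the undocumented (but long-standing) DBCC PAGE command; the database name, file id, and page id below are placeholders:

DBCC TRACEON(3604)                  -- route DBCC output to the client session
DBCC PAGE('MyDatabase', 1, 100, 3)  -- db, file id, page id; print option 3 = per-row detail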
I am using Delphi 7 and need to save multiple documents that range from 2K KB to 200K KB (roughly 2 MB to 200 MB). These need to be stored in the database for inclusion in the database backup.
When I am working with a 200K KB file I continually run out of memory. How can I resolve this?
I'm in a situation where we are suffering from poor performance on our SAN storage (VPlex), mainly due to the quantity of data of different types that is on it (other applications, other I/O profiles, bad storage usage...). As we plan to dedicate an ESX host to SQL Server, we decided on a new storage type, so we will go with NetApp Clustered Data ONTAP on NAS technology.
The storage team wants to enable only the NFS protocol, so I'm wondering how SQL Server will handle that. I've read that NFS isn't optimized for SQL Server and that block-level storage (SAN) should be preferred.
I need to bulk insert a very large amount of data into several MSSQL tables. The first data model definition used identities to maintain the relationships between those tables, but we found that natural (compound) keys are better for bulk insert (there is no need to obtain the identity first). My question is: will changing the identities to natural keys (in some tables on the order of 4 or 5 attributes) enlarge my database storage? I think MSSQL implements relationships with pointers (or hash codes), so the storage size will be similar, right? Regards,