Error When Trying To Disable Data Compression In 2014

Jun 16, 2015

I tried to disable data compression for granular backup purposes on SQL 2014 with the following query: EXEC [dbo].[prc_EnablePrefixCompression] @online = 0, @disable = 1

I received the following error message: Msg 2812, Level 16, State 62, Line 1...Could not find stored procedure 'dbo.prc_EnablePrefixCompression'.
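
Msg 2812 simply means the procedure does not exist in the database the query is running against, and prc_EnablePrefixCompression does not appear to be a procedure that ships with SQL Server itself, so it has to be created or located in the right database/schema first. If the goal is just to remove compression from a table, the native syntax is one option - a minimal sketch, with dbo.MyTable as a placeholder name:

-- List what is currently compressed in this database
SELECT OBJECT_NAME(p.object_id) AS table_name,
       p.index_id,
       p.data_compression_desc
FROM sys.partitions AS p
WHERE p.data_compression <> 0;

-- Remove compression from the heap/clustered index and from all indexes
ALTER TABLE dbo.MyTable
    REBUILD WITH (DATA_COMPRESSION = NONE);

ALTER INDEX ALL ON dbo.MyTable
    REBUILD WITH (DATA_COMPRESSION = NONE, ONLINE = OFF);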

View 7 Replies



SQL Server Admin 2014 :: DB Growth Report With Compression Turned On?

Apr 14, 2015

I am confused while working on a DB growth report for 2012 databases that have compression on. I am analyzing the DB growth based on the msdb.dbo.backupset table, which stores the backup information.

But here is where it gets tricky. In previous versions we used the "backup_size" column to get the actual backup size and estimated DB growth based on all the previous backup file info. But now that compression is on in 2012, the "backup_size" column gives a compressed file size (if I am right), so how do you know the actual backup size to estimate DB growth over a period of time?
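
If I remember correctly, backup_size continues to report the uncompressed size of the data backed up, while the compressed_backup_size column (added in SQL Server 2008) holds what was actually written to disk, so the growth estimate can stay on backup_size. A sketch against the 2008+ msdb schema:

SELECT database_name,
       backup_finish_date,
       backup_size            / 1048576.0 AS backup_size_mb,            -- uncompressed data size
       compressed_backup_size / 1048576.0 AS compressed_backup_size_mb  -- size on disk
FROM msdb.dbo.backupset
WHERE type = 'D'                                                        -- full backups only
ORDER BY database_name, backup_finish_date;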

View 7 Replies View Related

SQL Server Admin 2014 :: How To Disable Reboot Pending Through Command

Mar 16, 2015

Is there any command through which I can delete/disable the registry value "PendingFileRenameOperations", so that the SQL install work does not fail because of it? I would like to script this as one of the pre-check options.
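
The value lives under HKLM\SYSTEM\CurrentControlSet\Control\Session Manager. A hedged sketch of deleting it from T-SQL by shelling out to reg.exe; the same reg delete command can be run directly from cmd or PowerShell as a pre-check step, and this assumes xp_cmdshell is enabled and the service account has rights to that key:

-- Delete the pending-rename value so setup's reboot check passes
EXEC master..xp_cmdshell
    'reg delete "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager" /v PendingFileRenameOperations /f';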

View 0 Replies View Related

DTS Data Compression

Jan 17, 2001

Does DTS do any data compression when transferring data from a table on Server A to another table on server B?

View 1 Replies View Related

Data Compression

Feb 28, 2001

Can anyone tell me whether there is any data compression in SQL 6.5? I have concerns about network traffic, and was wondering if data compression is a function of SQL 6.5, or if the data compression has to be coded into the actual database?

View 1 Replies View Related

SQL Server Admin 2014 :: SSMS - Disable Check For Memory Optimized Tables?

Oct 2, 2014

I have the following setup:

- An MSSQL 2014 Standard server that houses multiple small databases (in excess of a hundred).
- These databases are frequently dropped and restored by an application that uses this SQL Server.
- There is a business need for this setup at this time, so I can't get away from it. Therefore answers like "don't have so many small databases that are frequently dropped and restored" would not be useful.

This is the problem I have:

- When I connect SSMS 2014 to the server and expand the "Databases" node, it takes forever to display. In comparison, SSMS 2008 connected to SQL 2008R2 server with the same number of databases displays the Databases tree very quickly.

I ran a trace to see what exactly SSMS 2014 is doing. When the "Databases" node is expanded, it runs a query that checks each database for Memory-Optimized Tables (a new and wonderful feature of SQL 2014 for sure, but I'm not using it, at least not yet). Naturally, when you have to loop through over a hundred DBs, it takes time. Worse yet, if one of these DBs is in the process of being restored, the query sits and waits to time out before proceeding to the next DB. Sometimes this causes outright timeouts. Here is the query:

use [MyDatabase]
SELECT
ISNULL((select top 1 1 from sys.filegroups FG where FG.[type] = 'FX'), 0) AS [HasMemoryOptimizedObjects]

To be sure, this is NOT a SQL Server performance issue. This server processes a rather heavy workload and has been doing so for over a month, and the workload completes within expected time limits or better. Even so I've done some basic performance measuring, and the server itself is quite all right.

Moreover, if I connect SSMS 2008 to it, I get an error message (Index out of bounds or somesuch), but SSMS 2008 does connect, and displays the Databases tree much faster than SSMS 2014.

I'd like to turn off the option to check for Memory Optimized Objects altogether, as I'm not using the feature.

View 3 Replies View Related

Data Compression Over Low Bandwidth Lines

Feb 28, 2005

Does anyone know a way to compress data between two database connections over "low bandwidth" lines in order to speed up data transfer?
The connections used are Oracle<->SQL2000 and SQL2000<->SQL2000.

View 3 Replies View Related

SQL 2012 :: Data Compression In Server

Apr 1, 2014

Where can I get detailed notes on "Data compression in SQL Server"? It would be useful if the notes were accompanied by examples.
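
As a starting point, the two moving parts are estimating the savings and then rebuilding with the chosen compression setting. A minimal sketch (schema, table and index names are placeholders):

-- Estimate how much space PAGE compression would save
EXEC sp_estimate_data_compression_savings
     @schema_name      = 'dbo',
     @object_name      = 'SalesOrderDetail',
     @index_id         = NULL,   -- all indexes
     @partition_number = NULL,   -- all partitions
     @data_compression = 'PAGE';

-- Apply it (a rebuild of the table's heap or clustered index)
ALTER TABLE dbo.SalesOrderDetail
    REBUILD WITH (DATA_COMPRESSION = PAGE);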

View 4 Replies View Related

Transact SQL :: Data Compression And Rebuilding Indexes

Aug 31, 2015

If I'm doing data compression (page level), does it rebuild indexes too? And how about stats, does it update stats too?
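
Changing the compression setting is itself a rebuild of the index or heap, so in that sense the answer is yes; as far as I know, compressing the table does not touch the nonclustered indexes, which have to be rebuilt with the compression option separately, and the rebuild refreshes the statistics belonging to the rebuilt indexes but not separately created column statistics. A sketch with a placeholder table name:

-- Compress every index on the table (each one is rebuilt in the process)
ALTER INDEX ALL ON dbo.MyTable
    REBUILD WITH (DATA_COMPRESSION = PAGE);

-- Column statistics are not refreshed by the rebuild; do that explicitly if needed
UPDATE STATISTICS dbo.MyTable;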

View 4 Replies View Related

SQL 2012 :: Check Size Of Database After Implementing Data Compression Across All Tables

Dec 4, 2014

I found this pretty interesting. I checked the size of a database before implementing compression across all the user tables, and I checked the size of the database again after implementing compression.

I did not find any difference. If I expand a table and check Properties -> Storage, I can see that PAGE compression is implemented across all the tables, but there is no compaction in the size of the db. It still remains the same.
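
That matches how compression behaves: it frees pages inside the data files, but the files keep their allocated size until they are shrunk, so the saving shows up as unallocated space rather than a smaller database. A quick sketch to confirm (file_id 1 is assumed to be the primary data file):

-- database_size stays the same; "unallocated space" should have grown
EXEC sp_spaceused;

-- Only if reclaiming the disk space is actually required (shrinking fragments indexes)
DBCC SHRINKFILE (1, TRUNCATEONLY);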

View 6 Replies View Related

Can I Disable A Database On Error?

May 10, 2006

If I got a TORN PAGE error, or something like that, in normal operation could something automated mark the database as Read-Only, or somesuch, so that all further updates to the database were disabled until we had a chance to look at the problem?

Kristen
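
One hedged approach is a SQL Server Agent alert on the relevant error number that runs a job flipping the database to read-only. In the sketch below the database name, alert name and the job 'Quarantine MyDb' are all placeholders, the job is assumed to already exist, and error 824 applies to SQL Server 2005 and later (the torn-page error number differs on older versions):

EXEC msdb.dbo.sp_add_alert
     @name          = N'MyDb - logical I/O error (torn page)',
     @message_id    = 824,
     @database_name = N'MyDb',
     @job_name      = N'Quarantine MyDb';

-- T-SQL job step that the alert fires:
ALTER DATABASE MyDb SET READ_ONLY WITH ROLLBACK IMMEDIATE;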

View 11 Replies View Related

Disable Repl Error

Feb 2, 2007

One week ago, for testing purposes, I set up transactional replication, transactional replication with updateable subscriptions, and snapshot replication using the same instance.

configuration is like this,

publisher and distributor are on the same server, 2 remote subscribers, one is 2000, the other is 2005.

It worked OK. Today I am trying to disable the replication and clean up the machine, but I keep getting these errors:

an exception occurred while executing a T-SQL statement or batch
only replication jobs, or job schedules, can be added, modified, dropped or viewed through replication stored procedures
could not update the distribution database subscription table, the subscription status could not be changed
changed database context to 'master' (MSSQL SERVER error 22538)
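
Those errors usually point at leftover replication metadata. A hedged cleanup sketch (database name is a placeholder): strip replication objects out of the published database first, then drop the distributor with the no-checks options:

-- Remove all replication objects and settings from the published database
EXEC sp_removedbreplication @dbname = N'PublishedDb';

-- On the distributor, force removal even if some metadata is inconsistent
EXEC sp_dropdistributor @no_checks = 1, @ignore_distributor = 1;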

View 2 Replies View Related

Error 8157 When Trying To Disable Publishing

Aug 6, 2001

I am trying to disable publishing on a server. When I attempt to do so
I get the following error:

"Error 8157: All the queries in a query expression containing a UNION
operator must have the same number of expressions in their select lists.
Could not use view or function 'sysextendedarticlesview' because of binding errors."

There is more error message text but it's a repeat of the UNION and sysextendedarticlesview problem.

Can anyone help me with this?

Thanks!

Kurt

View 1 Replies View Related

SQL Server Admin 2014 :: Error While Updating Data Using Oracle Linked Server

Sep 11, 2015

We have an Oracle linked server created on one of our SQL Server 2008 Standard instances. We fetch data from Oracle and update some records in SQL Server. Previously this was working fine, but we are suddenly facing the issue below.

The error below occurred during the process:

OLE DB provider "OraOLEDB.Oracle" for linked server "<linkedservername>" returned message "".
Msg 7346, Level 16, State 2, Line 1
Cannot get the data of the row from the OLE DB provider "OraOLEDB.Oracle" for linked server "<linked server name>".

View 7 Replies View Related

Possible To Disable Well-formedness Checking When Inserting XML Data?

Mar 20, 2006

Hi All,

Can anyone help with the following problem? I have a database which contains a table with a 'text' field, and the text field contains an XML document - typically 50-100K. Now I'd like to make use of SQL Server 2005's xml data type. To do this I have created a new field of the xml datatype, and run an SQL statement to update the contents from one field to the other - hoping to end up with a complete table of XML (based on the old text field).

The problem I have is that after a minute or so it must come across a badly-formed XML fragment, because I get the following message:

Msg 9436, Level 16, State 1, Line 1
XML parsing: line 1, character 67640, end tag does not match start tag

Can I turn off the checking during the update, or is it not possible to add badly formed data to the xml field? Any help appreciated, as the table runs into tens of thousands of rows, so I can't really check the contents of each!

Many thanks,
Duncan.
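
Well-formedness checking on the xml type cannot be switched off, so the practical route is to find the offending rows and fix or skip them. A hedged sketch, assuming a key column Id, the old text column OldText and the new xml column XmlData (all placeholder names):

DECLARE @id int;
DECLARE @bad TABLE (Id int);

DECLARE c CURSOR LOCAL FAST_FORWARD FOR SELECT Id FROM dbo.MyTable;
OPEN c;
FETCH NEXT FROM c INTO @id;
WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        -- Convert one row at a time so a parse failure only skips that row
        UPDATE dbo.MyTable SET XmlData = CONVERT(xml, OldText) WHERE Id = @id;
    END TRY
    BEGIN CATCH
        INSERT INTO @bad (Id) VALUES (@id);   -- parse failed; remember the row
    END CATCH;
    FETCH NEXT FROM c INTO @id;
END
CLOSE c;
DEALLOCATE c;

SELECT Id FROM @bad;   -- rows whose text is not well-formed XML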

View 1 Replies View Related

Using Disable Property For Data Flow Task Dynamically?

Mar 18, 2008

hello guys,

I am having trouble using the Disable property in an expression for a data flow task. Here is the issue, as explained below:

Let's say I have 3 tables, TableA, TableB, TableC, from which I need to export data. So I create a table (TableList) where I save these table names and a unique ID for each table, e.g.

TableList will have:
TableName TableID
TableA 1
TableB 2
TableC 3


In the SSIS package I select these table names and IDs from TableList in an Execute SQL Task, and assign the result set to an object variable (@TableList).

Then I use a For Each Loop container (Foreach ADO enumerator) to loop through these table names and IDs.

Inside this loop container I define three data flow tasks, one for each table. So I have DataFlowTaskA (for TableA), DataFlowTaskB (for TableB), and DataFlowTaskC (for TableC).

Now, for the table selected in a given iteration, only the corresponding data flow task should be executed. E.g. in the 1st iteration, if TableA is selected then only DataFlowTaskA should be executed and DataFlowTaskB & C should be skipped.

In order to achieve this, I am using 3 variables, @FlagA, @FlagB, @FlagC (type Boolean), one for each table, and I use the value of these flags for the "Disable" property of the data flow tasks (so @FlagA is used for the Disable property in the expression for DataFlowTaskA, and so on).

So as the first step inside the loop I use a Script Task (input for the script task: the read variable is @TableID and the read/write variables are these 3 flags).

In this script task, I initialize these flags to true or false appropriately. This is what I do:


If (Dts.Variables("TableID").Value.ToString = "1") Then
Dts.Variables("@FlagA").Value = False
Else
Dts.Variables("@FlagA").Value = True

End If


If (Dts.Variables("TableID").Value.ToString = "2") Then
Dts.Variables("@FlagB").Value = False
Else
Dts.Variables("@FlagB").Value = True

End If


So in the 1st iteration (if TableA comes), @FlagA = False and B & C will be True.
So the Disable property for DataFlowTaskA will be set to False and for the others it will be set to True. Thus, only DataFlowTaskA will be executed.

And this action should be repeated for each input table. This is the logic.



However, only in the 1st iteration (say TableA is selected) does it behave as above, i.e. DataFlowTaskA is executed and DataFlowTaskB & C are skipped. But in the 2nd iteration (say TableB is selected), it again executes DataFlowTaskA and doesn't execute B & C (where it should have executed B and skipped A & C).

I do set DelayValidation to true for all of these, but it still doesn't work as expected. I even checked the values of all the flags for each iteration and they seem to get the correct values. But somehow the Disable property in the expression is not behaving as it should.

Am I missing anything? Do I need to set any other property to make this work?


I appreciate any help.

Thanks

View 3 Replies View Related

Compression

Apr 2, 2004

Hello,

I have been wanting to compress my database, but I am not really sure how this is done. I was looking in Enterprise Manager and if you right-click on the db and go to All Tasks, there is an option to shrink the database. Is this the way you would compress your database, or are there other ways of doing this?

Thanks for all the help.
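
"Compress" in Enterprise Manager terms generally means shrinking the files, which only releases unused space back to the operating system rather than compressing the stored data. A minimal sketch (database name is a placeholder):

-- Shrink the database files, leaving 10 percent free space inside them
DBCC SHRINKDATABASE (MyDatabase, 10);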

View 5 Replies View Related

Log Shipping And Compression

Jan 14, 2003

Has anyone tried compressing/uncompressing transaction log files when implementing log shipping?

I saw an article on the SQL LiteSpeed product, which supports compression with log shipping, but I was trying to do it without any third-party product.

View 3 Replies View Related

Query Compression

Apr 10, 2008

Hi all,

My application sends/retrieves large amounts of data to and from the database over the internet. Is there any way or method by which I can compress the query before submitting it to the database server? I'd really appreciate any advice and comments.

regards
Desmond

View 4 Replies View Related

Message Compression

Sep 15, 2005

We have run some tests on our application. The average message is about 2.5 MB. Messages are sent once every 30 minutes. This is 3.5 GB per month for one site. We already have three sites that will be sending these messages. This will be a VERY high load on the WAN channel, and will cost us a LOT of money.

View 2 Replies View Related

Does Replication Use Compression?

Sep 29, 2006

Does SQL Server replication implement any kind of compression? It seems to me that this would be very helpful for congested WAN links and costly merge replication.

View 5 Replies View Related

Column Compression

Dec 19, 2006

Hi, I was looking for column compression functionality in SQL Server Compact and it seems that it doesn't exist (maybe I'm wrong?). I wonder if the SQL Server team plans to implement column compression and, if yes, when can we expect it? Thank you for your help!

View 3 Replies View Related

SQL Server Admin 2014 :: Change Data Capture(CDC) For Data Warehouse / Reporting?

Aug 12, 2015

I have a requirement to implement CDC for 50+ tables to feed incremental data changes to the warehouse/reporting layer rather than exporting the whole table data. The largest table has more than half a billion records.

The warehouse uses a daily copy of the OLTP db (daily DB refresh). How can I accomplish this? Is there a downside to implementing CDC just for the sake of taking incremental changes on the tables?

Is there any performance impact if we enable CDC on the OLTP db?

Can we make use of the CDC tables in the environment where we do the daily db refresh, so that the queries don't hit the OLTP database?

What is the best way to implement CDC to take incremental changes for reporting?
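
A minimal sketch of enabling CDC and pulling an incremental slice (database, schema and table names are placeholders). CDC reads the transaction log asynchronously, so the overhead on the OLTP workload is mainly the log reader job and the change-table storage rather than the DML itself:

USE MyOltpDb;
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'Orders',
     @role_name     = NULL;        -- no gating role

-- Incremental extract between two LSN points (capture instance defaults to dbo_Orders)
DECLARE @from binary(10) = sys.fn_cdc_get_min_lsn('dbo_Orders'),
        @to   binary(10) = sys.fn_cdc_get_max_lsn();

SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_Orders(@from, @to, N'all');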

View 0 Replies View Related

SQL Server 2014 :: How To Insert CSV Data Into DB Where Some Data Don't Have Double Quotes

Aug 11, 2015

Example data in the CSV is as follows:

"XXX","0001",-990039739 ,0 ,0 ,0 ,0 ,0 ,0
"ABC"," ",-3422054702 ,0 ,481385 ,0 ,0 ,0 ,0
"JJZ","0001",0 ,0 ,0 ,0 ,0 ,0 ,0Here's my format:
12.0
10
1 SQLCHAR 0 0 """ 0 "" ""
2 SQLCHAR 0 5 "","" 1 OKCCY SQL_Latin1_General_CP1_CI_AS

[Code] ....
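
For what it's worth, a non-XML format file handles the quotes by folding them into the field terminators (as the sample above does), which gets awkward when only some fields are quoted; an alternative is to bulk load into a staging table and strip the quotes with REPLACE afterwards. A hedged sketch with placeholder paths and table name:

BULK INSERT dbo.Staging
FROM 'C:\import\data.csv'
WITH (FORMATFILE = 'C:\import\data.fmt');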

View 5 Replies View Related

Database Backups && Compression

Jun 19, 2007

Hi All

I have a database which is 72GB, which is backed up every night as part of the maintenance plan. I have plenty of storage space, and the server that runs the database is fairly powerful (quad-processor 3.2ghz, 64bit, 48GB RAM) and is part of an active-passive cluster. The database backup is also copied to a SAN location.

My issue is with the size of the backup file. As part of the Disaster Recovery plan, I need to copy this database backup file across the network to a remote site, so that in the event of a disaster at the site, business can continue at the remote site after restoring the database backup file. However, my database backup file is so big that I cannot copy it across the network in time for the next morning. I have tried using WinRAR and have managed to achieve a file about 20% of its original size, but it takes 2 hours to produce this file.

Is there any recommended reading for this type of issue? Log shipping / mirroring has been investigated and will be part of the DR model, but the 'powers that be' insist on having a full copy performed to the remote site.

Any suggestions? Thanks in advance guys n gals :-)
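
If an upgrade is an option, native backup compression (SQL Server 2008 Enterprise, and Standard from 2008 R2) compresses while the backup is written, which avoids the separate WinRAR pass. A sketch with placeholder names and paths (the UNC target is only to illustrate; backing up locally and copying is just as valid):

BACKUP DATABASE MyBigDb
TO DISK = N'\\remote-site\backups\MyBigDb.bak'
WITH COMPRESSION, CHECKSUM, STATS = 10;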

View 4 Replies View Related

Use Backup Compression On Few Servers

Jul 26, 2015

I am wanting to use backup compression on a few SQL servers (2008 R2 and 2012). I have never touched compression before and have always just gone with the default. I use Ola Hallengren's scripts to do the backups which, if not specifically specified, will use the server default. Backups (FULL and LOG) have been happening successfully on these servers for years.

SQL won't care that the previous backup in the set was uncompressed, or that the hourly transaction log backups previously taken were uncompressed? Restore statements (T-SQL) will be identical?

From everything I am reading it is simply a case of setting the configuration and acting like nothing changed, but I just wanted to be 100% certain. One of the servers is a SharePoint backend. All of the backup files from all servers are picked up by the Commvault backup system.
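
My understanding is that flipping the instance default only affects backups taken from then on: RESTORE reads compressed and uncompressed backups alike with identical syntax, and the one caveat is that compressed and uncompressed backups cannot share a single media set (separate backup files per backup, which Ola Hallengren's scripts produce, are fine; the scripts also expose a @Compress parameter, from memory, if a per-job setting is preferred). A sketch:

-- Make compression the instance-wide default for new backups
EXEC sp_configure 'backup compression default', 1;
RECONFIGURE;

-- A per-backup override remains possible either way
BACKUP LOG MyDb TO DISK = N'D:\Backups\MyDb.trn' WITH NO_COMPRESSION;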

View 4 Replies View Related

Print Compression When Deployed

Dec 14, 2006

I have a report being utilized for return address labels, conforming to Avery 5167. I have tried designing it both as a table and as data in rectangles. Since these are return labels there is only one instance of data, replicated for all textboxes, therefore the columns are of consistent length.

The report has seven columns of precise measurement: the data-filled columns are set at 1.75in, 0.25in, and between the data columns are blank columns set to 0.3125in, 0.25in. I have also tried filling data into the blank columns and setting the font color to white. All report margins are set to 0in and the table location is 0.04167in, 0.125in. All textboxes have the properties to increase/decrease to accommodate turned off.

The biggest issue I am having is in printing from the deployed report versus printing while designing. I have adjusted for the glitch in margins in RS2005 and have printed successfully prior to deployment, adhering to the specific measures for the label, report margins, textbox height and width, and blank column spacing.

However, after deploying the report the data spacing seems to be compressed when printed. I am not getting the same measure between data fields as I had when designing the report. The entire printout seems to be slightly compressed.

When printed from design, columns 1, 3, 5, and 7 start respectively at 0.3125in, 2.375in, 4.3756in, and 6.4375in from the edge of the page.

When printed from the deployed report, columns 1, 3, 5, and 7 start respectively at 0.3125in, 2.3125in, 4.28125in, and 6.28125in.

The compression of the measurements seems to increase from left to right, in that by the time the report is printed via the deployed report the "tab" area differs by 0.15625in, which creates a problem when printing to a very precise template format. As well as being a headache when one thinks the report is correct before deployment.

View 1 Replies View Related

Transaction Log Shipping Over WAN With Compression

Jan 18, 2006

Hi guys,

I have a server in a datacenter (SQL 2005 Enterprise) that collects large quantities of data from our visitors. I need to set up a secondary database in our office (different geographic location) that will serve two purposes: 1) a backup of the database, and 2) allowing us to perform complex queries on the data.

There is no updating of the data on the secondary server so no changes need to go back to the primary server. A database in standby mode is fine and users on the secondary server can be disconnected when it's being updated.

I have transaction log shipping working well in a staging environment (LAN). My first question is: is there any reason why transaction log shipping would not work over a WAN with a VPN connection?

And my second question: can I compress the trn files for transport over the WAN? If I manually compress the files with WinZip they compress by 98%. That translates into a huge saving when I am leasing a line to transport these files.

Thanks in advance

Stephen

View 4 Replies View Related

SQL 2012 :: Backup Compression Default Should Be On With DPM?

May 21, 2014

We're changing from NetBackup to DPM, but I'm not sure whether the "backup compression default" option should be ON or OFF, or whether it doesn't matter.

View 3 Replies View Related

Delta Compression Of Query Results

Feb 14, 2007

Suppose a database server and client are separated by a low bandwidth link such as DSL, and the client repeatedly issues a query for, say, a current product list.

Suppose the product list is large, but only a handful of entries have typically changed between queries. It would be nice if only the changes from the last query to the current one could be sent, saving bandwidth.

Is there any way to do this?

Thanks,
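
There is no built-in delta compression of result sets, but the usual workaround is to let the client ask only for rows changed since its last call, e.g. via a rowversion column. A hedged sketch, assuming the product table has (or gains) a rowversion column named RowVer and the other names are placeholders; deletes are not captured this way and need separate handling:

DECLARE @last_version binary(8);
SET @last_version = 0x0;   -- the client persists the value it saw last time

SELECT ProductID, Name, ListPrice, RowVer
FROM dbo.Products
WHERE RowVer > @last_version
ORDER BY RowVer;

-- The new high-water mark for the next call is MAX(RowVer) from this result
-- (or SELECT @@DBTS for the current database-wide value).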

View 3 Replies View Related

SQL 2005, Service Broker And XML Compression

Sep 28, 2007

We have a number of large binary xml messages. I would like to compress the XML before converting to VARBINARY as much as possible. What is the recommended way to handle this if the conversions are happening in TSQL? I will also need to be able to decompress in .NET code, so I'm wondering if the best way is to just use CLR Integration for the marshaling routines, rather than building the marshaled structure in SQL. Any suggestions on reducing the XML payload size for extremely large messages?
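
On SQL Server 2005 a CLR routine wrapping GZipStream is the usual answer, since the same stream class can then decompress on the .NET side. For reference, later versions (SQL Server 2016 and up) ship GZip-compatible COMPRESS/DECOMPRESS built-ins that make the T-SQL side trivial - a sketch, with the message content as a placeholder:

DECLARE @msg xml = N'<order id="1"><line qty="3"/></order>';

-- GZip the serialized XML into the varbinary payload that gets sent
DECLARE @payload varbinary(max) = COMPRESS(CONVERT(nvarchar(max), @msg));

-- Round-trip on the SQL side (the .NET side can use GZipStream instead)
SELECT CONVERT(xml, CONVERT(nvarchar(max), DECOMPRESS(@payload))) AS original_xml;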

View 1 Replies View Related

SQL Express 2008 And Row Level Compression

Dec 7, 2007

Will the 2008 edition of SQL Server Express contain the new row level compression features? Is it available in the current CTP for us to try out?

Thanks

View 1 Replies View Related

Snapshot Replication Compression Limit.

Nov 20, 2006

My snapshot is exceeding the compression limit.

Is there a way to configure the server to have a higher compression limit?

thanks,

joey

View 1 Replies View Related






