I have written code to combine and delete redundant data in my system. The table structure remains the same, except that I changed some INTs to TINYINTs.
When I run sp_spaceused, it tells me that the number of rows is smaller (which is correct), but the data size and index_size are significantly larger AFTER the deletions.
I tried using shrink, but that doesn't seem to change anything.
When I right-click the database and choose Properties, it also confirms that the database got significantly larger.
I am confused about how deleting data and changing columns to TINYINT could make my database bigger. What would cause this?
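Two checks that might explain the numbers, sketched with a hypothetical table name: sp_spaceused can report stale figures until usage is recalculated, and space freed by narrowing a fixed-length column is not reclaimed until the table's indexes are rebuilt.

EXEC sp_spaceused @objname = N'dbo.MyTable', @updateusage = N'TRUE'  -- refresh the usage statistics first
ALTER INDEX ALL ON dbo.MyTable REBUILD  -- SQL 2005+; DBCC DBREINDEX is the SQL 2000 equivalent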
I have been using Master Data Services for a couple of months now. I can load, update, merge, and soft delete data in MDS. Occasionally we even have to hard delete data from MDS. If we keep soft deleting records in an MDS table, eventually there will be a huge number of soft-deleted records. Is there an easy way to hard delete all the soft-deleted records from all MDS tables in a specific model?
I'm getting this error while trying to insert records into a SQL Server Compact Edition database. I have pasted the connection string that was used when creating the database, as well as for accessing that same database from my Windows application.
Thanks for any help any of you can give!
Data Source=OnTheGo.sdf;Encrypt Database=True;Password=<password>;Max Database Size=4091
I am developing a smart device application with Visual Studio .NET 2005 and a SQL Server Compact Edition database, and I am using merge replication to synchronize the data from the mobile device to the SQL Server.
My database size is around 350 MB, so when I try to synchronize, this is the error message I get: "The database file is larger than the configured maximum database size. The setting takes effect on the first concurrent database connection only. [Required Max Database Size (in MB; 0 if unknown) = 129]."
I tried changing the Max Database Size in the connection string (my connection string looks as follows) and still did not have any luck.
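A hypothetical connection string with the cap raised above the 129 MB the error mentions (file name illustrative):

Data Source=MyData.sdf;Max Database Size=512;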
I want to be alerted when any of my databases is larger than xxx GB.
I know one way to do that: a SQL Agent alert on a performance condition, but with that approach I would need to create a separate alert for every database.
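An alternative might be a single scheduled Agent job that runs a query like the following sketch (SQL 2005+) and raises an alert when anything comes back; the 100 GB threshold is a placeholder:

-- size in sys.master_files is counted in 8 KB pages
SELECT d.name, SUM(mf.size) * 8 / 1024 AS size_mb
FROM sys.master_files AS mf
JOIN sys.databases AS d ON d.database_id = mf.database_id
GROUP BY d.name
HAVING SUM(mf.size) * 8 / 1024 > 100 * 1024  -- e.g. 100 GB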
I have replication set up between our main site and a remote one, and have recently noticed that the remote site database's .MDF file is about 3 times as large as the main site's. This doesn't seem to make sense, since essentially all of our data is replicated between the two servers. Can anyone suggest why this might be happening and what is safe to do to shrink the remote file?
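As for the shrink itself (separate from finding the cause of the growth), the file-level command takes the logical file name and a target size in MB; a sketch with hypothetical names:

DBCC SHRINKFILE (N'RemoteDb_Data', 51200)  -- shrink the data file toward 50 GB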
Hi, we have trouble when we try to use 'dbuse' calls with database names longer than 28 characters; it looks like dbuse truncates the name after that. Any ideas?
We have been trying for a while to use the Power BI tools (Power Query, Power BI Designer (Desktop), and Power BI Designer (Cloud)) with larger data sources (2-7M records in the fact table), unfortunately so far with unsatisfactory results. It seems that these tools are just not designed to handle data volumes of this kind?
To my understanding, the approach of the Power BI components is to always create a local (or cloud-based) cache that has only a limited capacity, and 2M+ records already seem to exceed this limit. So instead of firing off a query on demand (for example, based only on distinct filter options rather than the entire fact set), only the intermediary cache of the model can be used.
Our initial focus was a normalized table in SQL Server with around 4M records. The first problem is that Power Query/Power BI Designer fails to provide a complete list of the distinct filter items.
This I can understand to some extent, as doing a DISTINCT on a large data set like that is not trivial. As a workaround, I could imagine setting up a star-schema dimension table with the distinct "dimension members", i.e. the filters are driven by this table, which has only a few thousand rows, and are then applied to the fact table. I haven't found a way to do this effectively with Power Query.
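If the dimension table can be built on the SQL Server side rather than in Power Query, it might be as simple as the following sketch (table and column names hypothetical):

SELECT DISTINCT CustomerName
INTO dbo.DimCustomer
FROM dbo.FactSales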
Another option that we tried was using the cloud-based Power BI service in conjunction with a SQL cloud service holding the same data set. That unfortunately didn't work either. The setup of the source works fine, but as soon as we try to start a query by dragging a value field onto the dashboard, errors occur.
I have a SQL server with multiple instances on it and would like to move one of them to a drive with more storage.
I have SQL 2010 on a server with 2 partitions.
The database is located on the C: drive (original build), but that drive isn't partitioned to handle a DB of the size this one will grow to. I would like to move the full DB instance to another partition.
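For a single user database (moving a whole instance, including the system databases, is more involved), the usual file-move pattern looks like this sketch, with hypothetical names and paths:

ALTER DATABASE MyDb SET OFFLINE
ALTER DATABASE MyDb MODIFY FILE (NAME = MyDb_Data, FILENAME = 'D:\SQLData\MyDb.mdf')
ALTER DATABASE MyDb MODIFY FILE (NAME = MyDb_Log, FILENAME = 'D:\SQLData\MyDb_log.ldf')
-- copy the physical files to the new location, then:
ALTER DATABASE MyDb SET ONLINE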
I created a package that gets data from files and database sources, does some transformations, retrieves dimension IDs, and then inserts the result into a fact table.
Running this package with a limited amount of data (a few hundred thousand records) does not produce any errors, and everything goes fine.
Now, running the same package (still in debug mode) with more data (about 2,000,000 rows) doesn't produce any errors either, but it just stops running. In fact, it doesn't really stop, but it doesn't continue either. If I had only been waiting a few minutes or hours, I could believe it was still processing, but I have waited about a day and it is still 'processing' the same step.
Any ideas on how to dig further into this in order to find the problem? Or is this a known problem?
We have a database we are replicating to about 8 SQL Express subscribers from a SQL 2012 SP2 publisher. The database grew too large for SQL Express's 10 GB license limit, and now replication refuses to replicate any of the deletions on the publisher that would reduce the size of the database. I've come up with a few options below.
1) Drop one of the larger table indexes on the subscriber database to get below the size restriction, let replication apply the deleted records, and then rebuild the index. (I'm not sure how important an index is to this table. Is it merely performance-related?)
2) "Upsize" SQL Express to SQL Standard on the affected boxes, allow the deletes to replicate, back up the database, downgrade to SQL Express, and restore the database to a new SQL Express instance. This would involve a lot of work on each box, and I'd like to avoid it if possible.
Hi. I'm trying to create a C# application that would insert, update, and delete data from a database. Could anyone please point me in the right direction? Thanks in advance.
I have a database that is just over 1.5 GB, and its full backup is 13 GB. I'm not sure how this can be, since we have compression turned on for full backups, and my other full backups are much smaller than their respective databases. The full backup is taken every Sunday night, and differentials are taken every 6 hours after the full backup. I have been thrown into this DBA role with little to no experience beyond what I have picked up and read, so my understanding of backups is limited. What I think I understand is that we take a full backup and the differential only captures what changes in the database, so my question is: why is my database 1.5 GB but my differential 15.4 GB? Other databases on the same instance don't seem to have this problem. I also just noticed that we do not rebuild the indexes before a full backup like we do on other instances...
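To see what the server itself recorded for each backup (and whether compression actually took effect), msdb keeps the history; a sketch, with the compressed column available on SQL 2008 and later:

SELECT database_name,
       type,  -- 'D' = full, 'I' = differential
       backup_size / 1048576.0 AS size_mb,
       compressed_backup_size / 1048576.0 AS compressed_mb  -- SQL 2008+
FROM msdb.dbo.backupset
ORDER BY backup_start_date DESC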
I am restoring a database with 10 years' worth of data in monthly partitions, but I would like to keep only 5 years of data after the restore is done. What is the best/fastest approach to delete the older 5 years of data without deleting the partitions, as that may leave the DB inaccessible?
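One pattern that avoids row-by-row deletes is switching each old partition out to an empty staging table and truncating it; a sketch, assuming a staging table with an identical schema on the same filegroup (all names and partition numbers hypothetical):

ALTER TABLE dbo.FactSales SWITCH PARTITION 12 TO dbo.FactSales_Stage
TRUNCATE TABLE dbo.FactSales_Stage
-- repeat per partition older than 5 years; the partition scheme itself stays intact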
The following is the code I am trying to use; it throws an error and doesn't work. Error details: Unable to cast object of type 'System.Data.Linq.DataQuery`1[tbl_temp_bank]' to type 'tbl_temp_bank'. Source code (aspx.vb file):

Dim c As New temp_business_bankDataContext
Dim tag = From t In c.tbl_temp_banks
          Where t.TIN = Convert.ToInt32(tin.Text)
          Select t
c.tbl_temp_banks.DeleteOnSubmit(tag)
c.SubmitChanges()
I am using SQL Server 2000 Developer Edition. My table definition is as follows:

CREATE TABLE query_table (
    id int IDENTITY(1, 1) NOT NULL,
    qtext char(4000)
)

When I try to insert data with a length of more than 256 characters, it only inserts the first 256 characters. Please let me know why this is happening.
I have a situation where I need to migrate data from an older platform to a newer one. The data from the old system(s) will be available on DAT tapes. All database construction on the new system will be identical to the old one in size and schema, except for one table (call it "ARCHIVE").
If the ARCHIVE table on the old system is 210MB, and the ARCHIVE table on the new system has the same attributes but has been expanded to 380MB in size, can I simply restore the dump for the old table into the new ARCHIVE?
Empirically it works (I have done it with apparent success twice), but I seem to recall that backups are done by pages, and I'm concerned that there may be conditions not being met by simply doing the restore the way I'm planning to.
Also, are there any tests or checks built into SQL that I can use to check table integrity on the target ARCHIVE table after the restore?
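On the integrity question, the built-in consistency check can be pointed at a single table; for example:

DBCC CHECKTABLE ('ARCHIVE')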
I have two int fields in my database, CEOAnnualBonus and CEOBonus, and I want to return the value of whichever one is larger as CEOBonusCombined. I thought using COALESCE would do the trick, as below, but there are many cases where either CEOAnnualBonus or CEOBonus has a zero value instead of NULL, and then it doesn't work.
SELECT COALESCE(CEOAnnualBonus, CEOBonus) AS CEOBonusCombined FROM tbenchmarktemp WHERE Ticker='F'
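Since COALESCE only skips NULLs and never compares magnitudes, the comparison has to be explicit; one sketch that treats NULL as zero:

SELECT CASE WHEN ISNULL(CEOAnnualBonus, 0) >= ISNULL(CEOBonus, 0)
            THEN ISNULL(CEOAnnualBonus, 0)
            ELSE ISNULL(CEOBonus, 0)
       END AS CEOBonusCombined
FROM tbenchmarktemp
WHERE Ticker = 'F'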
I have a Windows 2003 server with SQL Server 2005 installed. The server is on a small drive and we would like to upgrade to much larger hard drives. I've been hearing of problems using Ghost to get an image and placing the image onto the new drive. I think this is more of a Windows 2003 problem, but this server is for nothing but the SQL Server databases. Does anyone have a clear method of moving this server to the larger drives? TIA.
Here is my situation... I have 2 fairly large databases; the full backups are 83 GB and 63 GB. I am in the process of moving these databases to a new data center. I've taken full backups of them and shipped them to the new center, and I have been taking transaction log backups (for the larger DB every 24 hours, for the smaller DB every 15 minutes, from log shipping).
I want to restore these databases in the new data center. I've gone ahead and restored the dbs in the new location.
A question about the final cutover: can I just apply the transaction logs to the databases at cutover, or do I have to restore the database backup first and then apply the transaction logs?
Is there another way to do this that I'm missing?
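For what it's worth, a restore chain only needs the full backup restored once: as long as it was restored WITH NORECOVERY, the remaining logs can be applied on top at cutover. A sketch, with hypothetical names and paths:

RESTORE LOG BigDb FROM DISK = N'D:\Backups\BigDb_1.trn' WITH NORECOVERY
RESTORE LOG BigDb FROM DISK = N'D:\Backups\BigDb_2.trn' WITH NORECOVERY
RESTORE DATABASE BigDb WITH RECOVERY  -- final step: bring the database online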
I created SDF database files for SQL Compact with a size larger than 128 MB (which is the default maximum at creation). Now, when I try to open these files with VS2005, I get the error "The size of the database file exceeds the configured maximum... Required Max Database Size (in MB; 0 if unknown)" (translated from German). The real problem is that I cannot change the connection string in VS, because all fields that show the connection string are read-only (greyed out). I know I have to set something like "Max Database Size = 512" in the connection string to get things running, but I don't know of any way to do that in VS2005. My attempt to access the SDF files copied to the device via ActiveSync results in the same error message.
This seems to me to be a design flaw, since I cannot add the optional parameter to the connection string (not even in the details form, where only the "DataSource" and "Password" fields are displayed).
- Does anybody know a solution in VS2005?
- Does anybody have a workaround for me?
- Does anybody know where the connection strings of VS2005 are stored, so that I could "hack" the connection string?
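For reference, the keyword itself is just part of the connection string; a hypothetical string with the cap raised:

Data Source=MyDatabase.sdf;Password=<password>;Max Database Size=512;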
I created a report with RS in VS.NET and set the width and height to 8.5in by 11in in the property window. When I designed it, everything fit on one page, nice and neat.
When printed, it comes out on two pages and the font size is much larger than expected. The whole document seems to have been blown up bigger, and the right side of the document has been cut off. Why is this? Do I need to configure VS.NET to print?
Am I missing some setting somewhere?
Other documents print out fine on this printer, so it is not the printer.
Any help would be greatly appreciated, thank you
This seems small, but if I can't get the report to print out right then...
I have an existing Access database that I need to transfer over to a more powerful back end due to the need for larger storage capacity. We need a back end that can grow to just about any size, because we scan in documents via ODBC. With Access I know I was limited to about 4 GB, and when split onto my current SQL Server I have heard I will be stuck at 10 GB? If so, can you recommend a better back end? My main question, though, is about the front end: I hear Windows WPF can be linked to SQL Server, but does this limit the size as well?
Books Online gives a way to send a message larger than the VARCHAR maximum of 8,000 characters, but the @query argument to xp_sendmail is a simple text string, and my data is much more complex and formatted. BOL also shows an example using a temporary text file, but it is not clear precisely how to write the insert statements. I tried the following, which writes out all the data and sends it OK, except that after each row there is about a page of blank spaces. What is wrong with my syntax?
SET LANGUAGE British
GO
DECLARE @msgstr VARCHAR(80)
DECLARE @cmd VARCHAR(80)
DECLARE @PMID INT
DECLARE @forename VARCHAR(30)

CREATE TABLE ##texttab (c1 text)

SET @msgstr = 'THE FOLLOWING QUOTES ARE CURRENTLY MARKED AS PENDING:'
INSERT ##texttab SELECT @msgstr

DECLARE C2 CURSOR FOR
    SELECT ProjMgrID FROM surdba.SVY_QUOTES WHERE StatusID = 6
OPEN C2
FETCH NEXT FROM C2 INTO @PMID
WHILE @@FETCH_STATUS = 0
BEGIN
    IF @PMID > 1000
        SELECT @forename = ISNULL(Forename, ' ')
        FROM surdba.SVY_PERSONNEL_GENERAL
        WHERE EmployeeID = @PMID
    ELSE
        SET @forename = ' '
    INSERT ##texttab VALUES (RTRIM(@forename))
    FETCH NEXT FROM C2 INTO @PMID
END
CLOSE C2
DEALLOCATE C2

INSERT ##texttab VALUES (' - This information is autogenerated from the Survey database.')

SET @cmd = 'SELECT c1 FROM ##texttab'
EXEC master.dbo.xp_sendmail
    @recipients = 'Robin Pearce',
    @subject = 'ALL PENDING QUOTES',
    @query = @cmd,
    @no_header = 'TRUE'

DROP TABLE ##texttab
GO
Would appreciate any help on this one; I do not have time to learn HTML. Thanks, Robin Pearce.
This has probably been addressed before, but I was unable to get the search to work properly on this site. I need a script/way of deleting all rows from a DB except one record for each set of rows with duplicate column data. Example:

Row 1: Field1 = 12345, Field2 = xxxxx, Field3 = yyyyy, Field4 = zzzzz, etc.
Row 2: Field1 = 12345, Field2 = zzzzzz, Field3 = xxxxxx, Field4 = yyyyyy, etc.
Row 3: Field1 = 12345, Field2 = 20202, Field3 = 11111, Field4 = zzzzz, etc.
Row 4: Field1 = 54321, Field2 = xxxxx, Field3 = yyyyy, Field4 = zzzzz, etc.
Etc.

I want to be able to find the duplicates on Field1 and then delete all but one of those rows. (I don't care which one I keep, just so only one is left.) The data in the other fields may or may not be unique.
I know how to find the duplicates; it's just the deleting part I am having problems with. Any help would be much appreciated. Thanks.
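A common sketch for this on SQL 2005 or later, keeping one arbitrary row per Field1 value (table name hypothetical):

WITH dupes AS (
    SELECT ROW_NUMBER() OVER (PARTITION BY Field1 ORDER BY Field1) AS rn
    FROM dbo.MyTable
)
DELETE FROM dupes
WHERE rn > 1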
I have 3 tables:

User (userid (P.K.), username, userpassword, usertypeid (F.K.))
UserRights (userrightsid (P.K.), insert, update, delete, select, modulename)
User_UserRights (user_userrightsid (P.K.), userid (F.K.), userrights_id (F.K.))

I want to delete all the data from these tables for a particular user ID. How can I do it? I want a stored procedure for this; can anyone please provide the full procedure?
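A minimal sketch of such a procedure, assuming only the user's own rows should go (junction rows first, then the user row; shared UserRights rows are left alone):

CREATE PROCEDURE dbo.usp_DeleteUser
    @userid INT
AS
BEGIN
    SET NOCOUNT ON;
    DELETE FROM User_UserRights WHERE userid = @userid  -- child rows first
    DELETE FROM [User] WHERE userid = @userid           -- then the parent row
END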
Hello, I am using SqlDataSource (2005). I am using this:

<asp:SqlDataSource ID="a"
    DeleteCommand="DELETE FROM table WHERE rec_key = @rec_key">
    <DeleteParameters>
        <asp:SessionParameter SessionField="session_rec_key" Type="String" />
    </DeleteParameters>

This works fine, but if I use this:

    DeleteCommand="DELETE FROM table WHERE rec_key = @rec_key AND name = @name"
    <DeleteParameters>
        <asp:SessionParameter SessionField="session_rec_key" FieldName="rec_key" Type="String" />
        <asp:SessionParameter SessionField="session_name" FieldName="name" Type="String" />
    </DeleteParameters>

it gives an error saying the scalar variable @name is not defined.
But the value is being passed properly. Please help me find the solution. Thanks.
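For comparison, the usual attribute for mapping a SessionParameter to a named placeholder is Name rather than FieldName; a sketch of the second parameter block with that change:

<DeleteParameters>
    <asp:SessionParameter Name="rec_key" SessionField="session_rec_key" Type="String" />
    <asp:SessionParameter Name="name" SessionField="session_name" Type="String" />
</DeleteParameters>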
Hello group, I am new to working with a SQL database. I am trying to delete a record from the database, but the code below just reloads the page. I can't find any info on deleting just from the database. Can someone tell me where I can find out how to delete from the database, not from a DataGrid/databind? Or can you tell me what is wrong with the code below? Thanks, Michael
Dim delSQL As String = "DELETE FROM tbEmail WHERE (ID = @IDnum)"
Dim SqlConn As New SqlConnection(ConnStr)
Dim delCmd As New SqlCommand(delSQL, SqlConn)

Dim dbParam_fldcode As System.Data.IDataParameter = New SqlParameter
dbParam_fldcode.ParameterName = "@IDnum"
dbParam_fldcode.Value = IDNum
dbParam_fldcode.DbType = System.Data.DbType.String
delCmd.Parameters.Add(dbParam_fldcode)

SqlConn.Open()
Try
    delCmd.ExecuteNonQuery()
Finally
    SqlConn.Close()
End Try
A database disappeared from one of our QA servers last night, yet when we looked at the logs, no record of the DROP DATABASE command was to be found; likewise, no record of CREATE DATABASE. Can anyone tell me where we can look in SQL Server, or on the Windows 2000 server, for a record of these events?
I've got a DB that had a problem restoring, and now its status is showing (Loading). I'm unable to detach it, delete it, or do anything with it. If I delete it from Enterprise Manager and then do a refresh, it shows back up again.
Is there a way to positively, absolutely nuke this database?
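Two things commonly tried for a database stuck in (Loading), sketched with a hypothetical name; the first finishes an interrupted restore sequence, the second drops the half-restored database outright:

RESTORE DATABASE MyDb WITH RECOVERY
-- or
DROP DATABASE MyDb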