What is the easiest way to clean up all MS SQL 2005 tables?
For example, a database table named Customers has triggers, a primary key, and foreign keys. Now I clean up the Customers table using the following statement
delete Customers
And MS SQL 2005 asks me to remove all triggers and foreign keys before it will allow me to clean up the Customers table. Is there a way to clean up all tables without removing the triggers and foreign keys?
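The kind of single script I am hoping exists would look something like this (a sketch built on the undocumented sp_MSforeachtable procedure; I have not tested whether it copes with every foreign key arrangement):

EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'            -- disable FK checks
EXEC sp_MSforeachtable 'ALTER TABLE ? DISABLE TRIGGER ALL'               -- disable triggers
EXEC sp_MSforeachtable 'DELETE FROM ?'                                   -- empty every table
EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL'   -- re-enable and re-validate FKs
EXEC sp_MSforeachtable 'ALTER TABLE ? ENABLE TRIGGER ALL'                -- re-enable triggers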
Is there some recommended way to clean input before submitting it to the database? We'd like to develop a library that can be used in our ASP/ASP.NET apps to filter input before it's sent to the SQL Server and Oracle databases. Is there a way to create a .NET DLL that can be used by both ASP.NET and classic ASP apps? Thanks.
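For comparison, the alternative we keep being pointed at is to parameterize the calls rather than filter strings; on the SQL Server side that looks roughly like this (a sketch; the table, column, and value are made up):

-- the parameter value travels separately from the SQL text, so quotes in the
-- input cannot change the statement
EXEC sp_executesql
    N'SELECT * FROM dbo.Customers WHERE LastName = @LastName',
    N'@LastName nvarchar(50)',
    @LastName = N'O''Brien';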
I posted this in the SQL Server Express forum as well...
My app is deployed via ClickOnce. It ships with a database creation script which creates the database in the ClickOnce data directory. Upon uninstall, the entire data folder is deleted successfully. But if you then reinstall the app, SQL Server Express still thinks the database exists. Right now, to get a re-install to work, I have to go into Management Studio Express, click on the db (and get a pop-up error, as the physical db file is no longer there) and then click delete.
So my question is... what system proc should I execute in the db creation script before I run the CREATE DATABASE command?
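What I am picturing at the top of the creation script is something along these lines (a sketch; MyAppDb stands in for the real database name, and I am not sure how DROP DATABASE behaves when the physical files are already gone):

-- drop any leftover registration of the database before recreating it
IF DB_ID(N'MyAppDb') IS NOT NULL
    DROP DATABASE MyAppDb;
GO
CREATE DATABASE MyAppDb;   -- the existing creation script continues from here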
WHERE (dbo.document.docVisible = 'yes') AND (dbo.document.docHide = 'NO') AND (dbo.t_files.fileMoved = '1') AND (dbo.documentCategory.docCategoryName LIKE '%olic%') AND (dbo.minisite.sectionLive = 1) AND (dbo.t_files.fileName <> N'A LINK') AND (dbo.t_files.fileName NOT IN (SELECT exFileID FROM t_exclude))
OR (dbo.document.docVisible = 'yes') AND (dbo.document.docHide = 'NO') AND (dbo.t_files.fileMoved = '1') AND (dbo.minisite.sectionLive = 1) AND (dbo.t_files.fileName <> N'A LINK') AND (dbo.t_files.fileName NOT IN (SELECT exFileID FROM t_exclude)) AND (dbo.document.docDesc LIKE '%olic%')
OR (dbo.document.docVisible = 'yes') AND (dbo.document.docHide = 'NO') AND (dbo.t_files.fileMoved = '1') AND (dbo.minisite.sectionLive = 1) AND (dbo.t_files.fileName <> N'A LINK') AND (dbo.t_files.fileName NOT IN (SELECT exFileID FROM t_exclude)) AND (dbo.document.docType BETWEEN 1 AND 4)
OR (dbo.document.docVisible = 'yes') AND (dbo.document.docHide = 'NO') AND (dbo.t_files.fileMoved = '1') AND (dbo.minisite.sectionLive = 1) AND (dbo.t_files.fileName <> N'A LINK') AND (dbo.t_files.fileName NOT IN (SELECT exFileID FROM t_exclude))
OR (dbo.t_files.fileName IN ('127-2006-1-27-3321130.pdf', '127-2006-1-30-3726619.pdf', '127-2006-1-27-5700042.pdf', '127-2006-1-27-5678586.pdf', '127-2006-1-27-5693574.pdf', '127-2006-1-27-5873392.pdf'))
ORDER BY dbo.document.docDesc
Right - as you can see, some of the lines look pretty similar. When I try to do:
(one AND two AND three) AND (four OR five), I get
(one AND two AND three AND four) OR (one AND two AND three AND five)
when using Enterprise Manager. Is there any way to keep the first way of doing it? Otherwise, every time I add another OR statement in, I'll have to create a new line!
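The grouped form I would rather maintain, using the predicates from the query above, is roughly this (a sketch; I have not verified it returns exactly the same rows as the expanded version Enterprise Manager generates):

WHERE (dbo.document.docVisible = 'yes'
       AND dbo.document.docHide = 'NO'
       AND dbo.t_files.fileMoved = '1'
       AND dbo.minisite.sectionLive = 1
       AND dbo.t_files.fileName <> N'A LINK'
       AND dbo.t_files.fileName NOT IN (SELECT exFileID FROM t_exclude)
       AND (dbo.documentCategory.docCategoryName LIKE '%olic%'
            OR dbo.document.docDesc LIKE '%olic%'
            OR dbo.document.docType BETWEEN 1 AND 4))
   OR dbo.t_files.fileName IN ('127-2006-1-27-3321130.pdf', '127-2006-1-30-3726619.pdf',
                               '127-2006-1-27-5700042.pdf', '127-2006-1-27-5678586.pdf',
                               '127-2006-1-27-5693574.pdf', '127-2006-1-27-5873392.pdf')
ORDER BY dbo.document.docDesc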
I received a message that my log file is full, and it does not let me do a database backup before freeing up disk space. How do I clean up the log file (_log.ldf)?
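The direction I have pieced together so far looks like this (a sketch that assumes SQL Server 2005 or earlier; MyDb and MyDb_log are placeholders, and WITH TRUNCATE_ONLY discards log records, so a full backup should follow once there is space):

USE MyDb;
GO
BACKUP LOG MyDb WITH TRUNCATE_ONLY;   -- empty the log without writing it anywhere
DBCC SHRINKFILE (MyDb_log, 100);      -- shrink the .ldf file to roughly 100 MB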
I have the following query which strips out middle initial data from the first_name column and populates the middle_initial column with the relevant data.
What my query does not handle is when a first name is a single character (i.e. "A"). In its current state, the "A" would be moved to the middle_initial field with a period added to the end (i.e. "A."). The first_name column would also still contain "A". Basically, when a single-character first name is found in the first_name column, I do not want to populate the middle_initial field.
Hope this does not sound too cryptic; query is below along with some sample data when run.
SELECT first_name,
       CASE WHEN SUBSTRING(LTRIM(RTRIM(first_name)), LEN(LTRIM(RTRIM(first_name))) - 1, 1) = ' '
            THEN SUBSTRING(LTRIM(RTRIM(first_name)), LEN(LTRIM(RTRIM(first_name))), 1) + '.'
            ELSE NULL
       END AS 'middle_initial',
       CASE WHEN SUBSTRING(LTRIM(RTRIM(first_name)), LEN(LTRIM(RTRIM(first_name))) - 1, 1) = ' '
            THEN LEFT(LTRIM(RTRIM(first_name)),
                      CASE WHEN LEN(LTRIM(RTRIM(first_name))) >= 2
                           THEN LEN(LTRIM(RTRIM(first_name))) - 2
                           ELSE LEN(LTRIM(RTRIM(first_name)))
                      END)
            ELSE LTRIM(RTRIM(first_name))
       END AS 'first_name_removing_initials'
FROM contact
Sample data:
first_name | middle_initial | first_name_removing_initials
Paul       | NULL           | Paul
A          | A.             | A
A Fred     | NULL           | A Fred
Aaron      | NULL           | Aaron
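One direction I have been considering is to guard the check with a length test, roughly like this (a sketch showing only the middle_initial expression; the root cause seems to be that SUBSTRING(..., 0, 1) returns an empty string, which compares equal to ' '):

SELECT first_name,
       CASE WHEN LEN(LTRIM(RTRIM(first_name))) > 1
             AND SUBSTRING(LTRIM(RTRIM(first_name)), LEN(LTRIM(RTRIM(first_name))) - 1, 1) = ' '
            THEN SUBSTRING(LTRIM(RTRIM(first_name)), LEN(LTRIM(RTRIM(first_name))), 1) + '.'
            ELSE NULL
       END AS middle_initial   -- the same LEN() > 1 guard would be added to the other CASE as well
FROM contact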
I have a db server that had served both my application data and reporting services. I have since installed a second db server to be the reporting services server; the primary box still stores the application data. I am keeping reporting services installed and set up on the primary box, since I would like it to act as a warm standby for the reporting services function.
However, the ReportServerTempDB on my primary box is still quite large. The ChunkData table is quite close to 10 gig. The CleanupCycleMinutes property is set to the default of 10. Clearly any data that might be in this table is far older than that.
Is there some recommended method of clearing this data? Can I just delete the rows or do a truncate on the table? I would prefer not to uninstall/reinstall reporting services, as this machine is actively serving my application data.
Hi, I need some advice on log shipping. The log files on the primary server get cleaned up according to what I have specified in the maintenance plan. But the log files that get shipped to the secondary server stay there forever, wasting my hard disk. Will it cause any problems if I remove them, or can I set it up to remove all files older than the past 2 hours? Please advise. Thanks
My problem is that I cannot completely clean the buffer cache on SQL Server 2005 version 9.00.2047.00 (probably SP1).
Right after I run DBCC DROPCLEANBUFFERS in the context of my database (this is a development server, and so far I am the only one working with this particular database), I run a script that queries the sys.dm_os_buffer_descriptors view, also from the context of my database, to make sure that the buffer cache is really clean. However, it shows a large number of entries totalling 42 MB.
I have run both the DBCC and the script in the past too, and they always showed nothing in the results, which means the buffers really were clean. The reason I am running this is to benchmark an existing and a new application.
Does anybody have any ideas or suggestions on how to troubleshoot this issue? I have already closed all connections to this database, but rebooting the server is not an option since other people are also working on it.
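For reference, the reset sequence I have in mind is roughly this (a sketch; the CHECKPOINT is an addition I have seen suggested, since DROPCLEANBUFFERS only discards pages that are already clean):

USE MyDevDb;   -- placeholder for the actual database
GO
CHECKPOINT;               -- flush dirty pages so they become clean
DBCC DROPCLEANBUFFERS;    -- then drop the clean pages
GO
-- verify how much of this database is still in the buffer pool (pages are 8 KB)
SELECT COUNT(*) * 8 / 1024.0 AS buffered_mb
FROM sys.dm_os_buffer_descriptors
WHERE database_id = DB_ID();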
I have a custom Data Flow task that creates temp files to the system temp directory during processing. A lot of times, we'll use SSIS to do one data transformation, running and tweaking the package along the way... we do this in the designer ... if we notice something that's incorrect in the data view, we just hit the stop button and fix it. However, when we do this, the Cleanup() function isn't called, and my temp files are left in the temp directory, when they really ought to be disposed of.
Is there a method that gets called every time when the DtsDebugHost quits, whether it finished, didn't finish properly, or was stopped in the middle? What would be a good way (other than having some service that monitors what temp files are used by what processes) to clean up temp files after we don't need them?
I am importing Visual FoxPro (6) views into SQL 2000 tables and I am looking for a snippet of code to "clean up" date fields upon insertion. I can insert the views into the SQL tables fine using the tables/views selection of DTS as my source. However, I would like to ensure the date fields are in fact valid dates and not garbage, using the SQL query option of the DTS source. I would like to do the insert and cleanup in one step. Do you have a snippet of code to validate a date field that I can use? Thank you for any assistance in advance, Terry.
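The sort of thing I am picturing in the DTS source query is roughly this (a sketch; the view and column names are placeholders for my FoxPro objects):

-- keep only values ISDATE() accepts; everything else becomes NULL
SELECT order_id,
       CASE WHEN ISDATE(order_date) = 1 THEN CAST(order_date AS datetime)
            ELSE NULL
       END AS order_date_clean
FROM my_foxpro_view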
I have an IS package that contains some legacy DTS tasks. The package runs successfully on some machines, but on others not at all, generating a weird error that says "Error: The task failed to load. The contact information for the task is "Execute DTS 2000 Package Task;Microsoft Corporate...." There's more to the error, but searching on it hasn't turned up any results, so I'm not posting the entire thing. I can if requested.
On the servers where the DTS packages fail, regular IS packages that contain no legacy DTS tasks run successfully. The only difference between the machines that work and those that don't is that SQL Server 2000 is not and never was installed on the failing machines, whereas the machines that work are running both 2000 and 2005. The servers where the tasks fail are version 9.0.3050.
One thing I have noticed is that on the machines where DTS works there is an extra file in C:\Program Files\Microsoft SQL Server\90\DTS\Binn called Microsoft.SqlServer.Exec80PackageUtil.dll. Copying the file to a non-working server did not solve the issue. I tried installing the backwards compatibility update, but it said that a newer version was already installed.
If anyone could help I would greatly appreciate it. If you need more information just ask and I would be happy to provide it.
A customer is about to replace their PDC, which has SQL Server running on it. My company's role is to build the new server and get their SQL Server running again. I need to have the devices, databases, tables, data, and the users copied across. I cannot use transfer manager, as there will be only one instance of SQL Server running on the network.
If all we have is an NT backup performed on the entire disk, i.e. the MSSQL/DATA directory, how do we recreate the databases and their associated tables, and also how do we recreate the user accounts? Any ideas would be appreciated.
Please feel free to e-mail me with a response NEILA@ANGLIABC.CO.UK
Quick history, I have substantial experience using MS Access, VBA, and Regular Expressions to clean up messy text files for use in MS Access.
I want to upgrade an MS Access back end to SQL Server (2008 to be specific). I need to upload several delimited files on a daily basis. The files are full of formatting that I strip out using VBA and Regular Expressions. I've poked around on the site and found some UDFs that implement the VbScript Regular Expressions object as a stored procedure in Transact SQL.
My question is, am I better off processing the files externally using a procedural language like VB, or can SQL Server handle the processing as gracefully as MS Access/VBA/RegEx does?
Thanks for letting me tap into your experience.
I could simply puzzle it out myself by implementing it in SQL Server but I'd rather not waste the time as a newb to transact SQL and then find out that I'd be better off doing it externally.
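For what it's worth, the UDF approach I found is built around OLE Automation, roughly like this (a sketch of the pattern rather than a tested procedure; it needs 'Ole Automation Procedures' enabled and has no error handling):

CREATE PROCEDURE dbo.usp_RegexReplace
    @input       nvarchar(4000),
    @pattern     nvarchar(255),
    @replacement nvarchar(255),
    @output      nvarchar(4000) OUTPUT
AS
BEGIN
    DECLARE @obj int;
    EXEC sp_OACreate 'VBScript.RegExp', @obj OUT;    -- create the COM regex object
    EXEC sp_OASetProperty @obj, 'Pattern', @pattern;
    EXEC sp_OASetProperty @obj, 'Global', 1;
    EXEC sp_OAMethod @obj, 'Replace', @output OUT, @input, @replacement;
    EXEC sp_OADestroy @obj;
END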
I cannot log in to SQL Server 2005 Dev Edition in my local machine using Windows Authentication. The server returned "Login failed..." when connecting with SQL Server Management Studio.
I have not changed anything since installing this server.
This problem happens in RTM and SP1 versions, both running on Windows Vista RTM.
Anyone having this kind of problem too? Any solution? I'm guessing it's Vista-related.
We are using SQL Server 2005 SP2 to do transactional replication.
We have a separate service account for the SQL Agents (sqladmin) vs. the SQL Replication Agents (sqlrepadmin). It is my understanding this is a replication security best practice. The sqlrepadmin account has full permissions on the snapshot share folder and its subdirectories. The sqladmin account has no permissions at all.
I have been getting an error message when we run the distribution clean up job: Executed as user: PROD\sqladmin. Could not remove directory '\\Tes01box\Repldata\unc\qabox01_DB01_TO_ORACLE\20070905104896'. Check the security context of xp_cmdshell
I dropped the publication and recreated it, which is what appears to have caused the error.
From http://technet.microsoft.com/en-us/library/ms151151.aspx
Note:
If a publication is dropped, replication attempts to remove the snapshot folder under the security context of the SQL Server service account. If this account does not have sufficient privileges, log in with an account that does have sufficient privileges and remove the folder manually. Removing a folder requires the Modify privilege if the folder is a local path or the Full Control privilege if the folder is a network path.
The note above implies that the SQL Server service account (sqladmin) needs permissions on the snapshot folder as well.
Finally, my questions: Is there a workaround that will allow the distribution cleanup job to run as sqlrepadmin and perform the delete?
If both sqlrepadmin and sqladmin need permissions to the snapshot folder, what is the reasoning, from a security perspective, for separating them out?
I have a problem when I create clusters. I created a domain account and gave it permission to create computer objects and read all properties, but there is a problem: when cleaning up the cluster, an error always pops up. Later I tried using my domain admin account and it worked perfectly. Is there some permission missing for the cluster, or is Create Computer Objects and Read All Properties all I need?
I have a database that I have been creating and testing. I have added some junk data and some data into lookup tables. Is there a way to create a clean copy of the db and keep the lookup table data? Also will I be able to create the db under a new name?
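One route I have considered is a plain backup/restore under a new name, followed by emptying the junk tables, roughly like this (a sketch; paths, logical file names, and the table names are placeholders):

BACKUP DATABASE DevDb TO DISK = N'C:\Backups\DevDb.bak';

RESTORE DATABASE CleanDb
    FROM DISK = N'C:\Backups\DevDb.bak'
    WITH MOVE 'DevDb'     TO N'C:\Data\CleanDb.mdf',
         MOVE 'DevDb_log' TO N'C:\Data\CleanDb_log.ldf';

-- then empty only the junk/transactional tables, leaving the lookup tables alone
DELETE FROM CleanDb.dbo.Orders;      -- hypothetical junk table
DELETE FROM CleanDb.dbo.Customers;   -- hypothetical junk table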
I have a number of test databases which were created from the same backup. I set up SQLMaint to run on them each night. SQLMaint seems to run cleanly on all of them, but on two the Scheduled Task shows "Failed". All tasks have been set up exactly the same.
Has anyone run into something like this?
Here's the output from the maintenance job for one of the failing db's:
Microsoft (R) SQLMaint Utility, Version 6.50.240 Copyright (C) Microsoft Corporation, 1995 - 1996
Logged on to SQL Server 'COSTGUARD2' as 'sa' (trusted) Starting maintenance of database 'Test15' on Thu Jun 17 23:10:01 1999
[1] Check Data and Index Linkage...
** Execution Time: 0 hrs, 0 mins, 1 secs **
[2] Check Data and Index Allocation...
** Execution Time: 0 hrs, 0 mins, 2 secs **
[3] Check System Data...
** Execution Time: 0 hrs, 0 mins, 1 secs **
[4] Update Statistics...
** Execution Time: 0 hrs, 0 mins, 27 secs **
End of maintenance for database 'Test15' on Thu Jun 17 23:10:30 1999
What is the difference between CleanExpiredCache and FlushReportFromCache? Do we need to run both SPs to clear all the SSRS reporting cache? Is it possible to clean all the cache information and retain the logs? If yes, how can we do so? Would deleting the contents of the ReportServerTempDB.dbo.ExecutionCache table achieve this?
'You have to have a BackUps folder included in your release!
Private Sub BackUpDB_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles BackUpDB.Click
    Dim addtimestamp As String, f As String, z As String, g As String
    Dim Dialogbox1 As New Backupinfo
    'NOTE: the install path below is a guess (separators were lost in the original post); adjust to your own folder layout.
    addtimestamp = Format(Now(), "_MMddyy_HHmm")
    z = "C:\Program Files\VSoftAppMiss\NewAppDB.mdb"
    g = addtimestamp + ".mdb"
    'Add timestamp and .mdb ending to NewAppDB
    f = "C:\Program Files\VSoftAppMiss\BackUps\NewAppDB" & g
    Try
        File.Copy(z, f)
    Catch ex As System.Exception
        System.Windows.Forms.MessageBox.Show(ex.Message)
    End Try
    MsgBox("Backup completed successfully.")
    If Dialogbox1.ShowDialog = Windows.Forms.DialogResult.OK Then
    End If
End Sub
'RESTORE DATABASE
Private Sub RestoreDB_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles RestoreDB.Click
    Dim Filename As String, xi As String
    Dim Restart1 As New RestoreRestart
    Dim overwrite As Boolean = True
    With OpenFileDialog1
        .Filter = "Database files (*.mdb)|*.mdb|" & "All files|*.*"
        If .ShowDialog() = Windows.Forms.DialogResult.OK Then
            Filename = .FileName
            'Copy the chosen backup over the live database, dropping the timestamped name
            xi = "C:\Program Files\VSoftAppMiss\NewAppDB.mdb"
            File.Copy(Filename, xi, overwrite)
        End If
    End With
    'Notify user
    MsgBox("Data restored successfully")
    Restart()
    If Restart1.ShowDialog = Windows.Forms.DialogResult.OK Then
        Application.Restart()
    End If
End Sub
'CREATE NEW DATABASE
Private Sub CreateNewDB_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles CreateNewDB.Click
    Dim L As New DatabaseEraseWarning
    Dim Cat As New ADOX.Catalog
    Dim Restart2 As New NewDBRestart
    If File.Exists("C:\Program Files\VSoftAppMiss\NewAppDB.mdb") Then
        If L.ShowDialog() = Windows.Forms.DialogResult.Cancel Then
            Exit Sub
        Else
            File.Delete("C:\Program Files\VSoftAppMiss\NewAppDB.mdb")
        End If
    End If
    Cat.Create("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Program Files\VSoftAppMiss\NewAppDB.mdb;Jet OLEDB:Engine Type=5")

    Dim Cn As New ADODB.Connection
    Dim Tablename As New ADOX.Table
    'Tailor these to your needs - add as many columns as you need.
    Dim col As New ADOX.Column, col1 As New ADOX.Column, col2 As New ADOX.Column
    Dim col3 As New ADOX.Column, col4 As New ADOX.Column, col5 As New ADOX.Column
    Dim col6 As New ADOX.Column, col7 As New ADOX.Column, col8 As New ADOX.Column

    'Open the connection
    Cn.Open("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Program Files\VSoftAppMiss\NewAppDB.mdb;Jet OLEDB:Engine Type=5")
    'Open the catalog
    Cat.ActiveConnection = Cn
    'Create the table (you can name it any way you want)
    Tablename.Name = "Table1"
    'Tailor these to your needs - watch the DataType!
    col.Name = "ID" : col.Type = ADOX.DataTypeEnum.adInteger
    col1.Name = "MA" : col1.Type = ADOX.DataTypeEnum.adInteger : col1.Attributes = ADOX.ColumnAttributesEnum.adColNullable
    col2.Name = "FName" : col2.Type = ADOX.DataTypeEnum.adVarWChar : col2.Attributes = ADOX.ColumnAttributesEnum.adColNullable
    col3.Name = "LName" : col3.Type = ADOX.DataTypeEnum.adVarWChar : col3.Attributes = ADOX.ColumnAttributesEnum.adColNullable
    col4.Name = "DOB" : col4.Type = ADOX.DataTypeEnum.adDate : col4.Attributes = ADOX.ColumnAttributesEnum.adColNullable
    col5.Name = "Gender" : col5.Type = ADOX.DataTypeEnum.adVarWChar : col5.Attributes = ADOX.ColumnAttributesEnum.adColNullable
    col6.Name = "Phone1" : col6.Type = ADOX.DataTypeEnum.adVarWChar : col6.Attributes = ADOX.ColumnAttributesEnum.adColNullable
    col7.Name = "Phone2" : col7.Type = ADOX.DataTypeEnum.adVarWChar : col7.Attributes = ADOX.ColumnAttributesEnum.adColNullable
    col8.Name = "Notes" : col8.Type = ADOX.DataTypeEnum.adVarWChar : col8.Attributes = ADOX.ColumnAttributesEnum.adColNullable

    'You have to append all the columns you created above
    Tablename.Columns.Append(col) : Tablename.Columns.Append(col1) : Tablename.Columns.Append(col2)
    Tablename.Columns.Append(col3) : Tablename.Columns.Append(col4) : Tablename.Columns.Append(col5)
    Tablename.Columns.Append(col6) : Tablename.Columns.Append(col7) : Tablename.Columns.Append(col8)

    'Append the newly created table to the Tables collection
    Cat.Tables.Append(Tablename)
    'User notification
    MsgBox("A new empty database was created successfully")
    'Restart application
    If Restart2.ShowDialog() = Windows.Forms.DialogResult.OK Then
        Application.Restart()
    End If
End Sub
'COMPACT DATABASE
Private Sub CompactDB_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles CompactDB.Click
    Dim JRO As New JRO.JetEngine
    'The first connection string is the original database; the second is the compacted copy under another name.
    JRO.CompactDatabase("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Program Files\VSoftAppMiss\NewAppDB.mdb", _
                        "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Program Files\VSoftAppMiss\NewAppDBComp.mdb;Jet OLEDB:Engine Type=5")
    'The original (non-compacted) database is deleted
    File.Delete("C:\Program Files\VSoftAppMiss\NewAppDB.mdb")
    'The compacted database is renamed to the original database's name
    Rename("C:\Program Files\VSoftAppMiss\NewAppDBComp.mdb", "C:\Program Files\VSoftAppMiss\NewAppDB.mdb")
    'User notification
    MsgBox("The database was compacted successfully")
End Sub
I'm working on an ASP.Net project where I want to test code on a local machine using a local database as a back-end, and then export it to the production machine where it uses the hosting provider's SQL Server database on the back-end. Is there a way to export tables from one SQL Server database to another in such a way that if a table already exists in the destination database, it will be updated to reflect the changes to the local table, without existing data in the destination table being lost? E.g. suppose I change some tables in my local database by adding new fields. Can I "export" these changes to the destination database so that the new fields will be added to the destination tables (and filled in with default values), without losing data in the destination tables?
If I run the DTS Import/Export Wizard that comes with SQL Server and choose "Copy table(s) and view(s) from the source database" and choose the tables I want to copy, there is apparently no option *not* to copy the data, and since I don't want to copy the data, that choice doesn't work. If instead of "Copy table(s) and view(s) from the source database", I choose "Copy objects and data between SQL Server databases", then on the following options I can uncheck the "Copy Data" box to prevent data being copied. But for the "Create Destination Objects" choices, I have to uncheck "Drop destination objects first" since I don't want to lose the existing data. But when I uncheck that and try to do the copy, I get collisions between the properties of the local table and the existing destination table, e.g.: "Table 'wbuser' already has a primary key defined on it."
Is there no way to do what I want using the DTS Import/Export Wizard? Can it be done some other way?
-Bennett
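PS: for the simple case I describe (adding new fields filled in with default values), the per-table change I could also script by hand against the destination looks like this (a sketch; only the wbuser table name comes from my example, the column, type, and default are made up):

ALTER TABLE dbo.wbuser
    ADD new_field int NOT NULL
        CONSTRAINT DF_wbuser_new_field DEFAULT 0;   -- existing rows get the default value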
I am looking for the table where the Maintenance Cleanup Task configuration is stored. For example, "Delete files older than the following", which is 2 days. Which table in msdb can I retrieve the setting from?
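The closest I have found so far is that in SQL Server 2005 the maintenance plan is saved as a package in msdb, so the retention setting appears to live inside the package definition rather than in an ordinary column (a sketch of where to look; I am not certain sysdtspackages90 is the right table on every build):

-- the plan definition, including the cleanup task's "older than" setting,
-- is stored in the packagedata column
SELECT name, packagedata
FROM msdb.dbo.sysdtspackages90
WHERE name LIKE N'%Maintenance%';   -- placeholder filter for the plan name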
OK, say I would like to build a table for the following questions (say 6 questions for the sake of argument). Do I just store the index of the RadioButtonList? What are some resources that I could look at? Should I make a lookup table?
5) If money were no object, I would live . . .
- Prefer not to say
- On a tropical island
- In a New York penthouse
- In an English castle
- On a Texas ranch
- In a Malibu beach house
- In a mountain retreat (Selected)
- On the moon
- None of the above
The questions run 1 ---> 2 ---> 3 ---> 4 ---> 5 ---> 6; question 5 above is the one we are looking at.
How should I create the database table for the above example?
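The lookup-table layout I am asking about would presumably look something like this (a sketch with made-up names; it stores the chosen option's key rather than the RadioButtonList index):

-- one row per question
CREATE TABLE dbo.Question (
    QuestionID   int IDENTITY(1,1) PRIMARY KEY,
    QuestionText varchar(255) NOT NULL
);

-- one row per possible answer for a question
CREATE TABLE dbo.AnswerOption (
    AnswerOptionID int IDENTITY(1,1) PRIMARY KEY,
    QuestionID     int NOT NULL REFERENCES dbo.Question(QuestionID),
    OptionText     varchar(255) NOT NULL,
    SortOrder      int NOT NULL
);

-- one row per respondent per question, recording the selected option
CREATE TABLE dbo.Response (
    RespondentID   int NOT NULL,
    QuestionID     int NOT NULL REFERENCES dbo.Question(QuestionID),
    AnswerOptionID int NOT NULL REFERENCES dbo.AnswerOption(AnswerOptionID),
    PRIMARY KEY (RespondentID, QuestionID)
);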
In the Microsoft docs on the topic of Buffer Pool Extensions with SSDs in SQL Server 2014, it is written that only clean pages are written to disk. Does this mean data pages that have not been modified yet? Or also data pages that have already been modified, where the log has finished writing and the transaction has been marked as committed?
Why are clean data pages being written to the L2 cache to make space for other unmodified pages? Shouldn't they be modified first, before letting other unmodified data pages into the cache? They still have to be modified; it makes no sense to me to page them out and page them in again just to make room for other data pages.
I am new to SSIS. I have been struggling with this for the past week. I have a weird task. I need to import several tables from one database to a different server with a new database name. We need to do this at the end of every year. The main problem here is that the number of tables varies every year. You may not have all the same tables as last year, or you may have more tables. So I need to create a dynamic task that takes care of this every year without changing the package.
I have performed the following tasks:
1. Create a new dynamic database. (I have used an Execute SQL Task to do this)
2. Copy all the table structures. (I have used an Execute SQL Task to do this)
3. Import Data. This is the main problem. I was trying to create a dynamic connection string with variables as suggested in several forums but I finally came to know that this cannot be done if the table structures are different as the metadata cannot be refreshed at runtime.
4. The final step is to create a process to validate the data (the row count from each table for both source and destination). I think this can be done with an Execute SQL Task.
What is the best method to do this? My DBA does not like the "Transfer SQL Objects Task" or the "Transfer Database Task". I would like to create this as a dynamic process.
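The fallback I keep coming back to for step 3, if the data flow metadata cannot be refreshed, is an Execute SQL Task running dynamic SQL over a linked server, roughly like this (a sketch; the linked server, database names, and schema are placeholders, it assumes step 2 already created matching table structures, and it ignores identity columns):

DECLARE @table sysname, @sql nvarchar(max);

DECLARE table_cursor CURSOR FOR
    SELECT TABLE_NAME
    FROM [SourceServer].[SourceDb].INFORMATION_SCHEMA.TABLES
    WHERE TABLE_TYPE = 'BASE TABLE';

OPEN table_cursor;
FETCH NEXT FROM table_cursor INTO @table;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- copy each source table's rows into the same-named table in the new database
    SET @sql = N'INSERT INTO [NewYearDb].dbo.' + QUOTENAME(@table) +
               N' SELECT * FROM [SourceServer].[SourceDb].dbo.' + QUOTENAME(@table) + N';';
    EXEC (@sql);
    FETCH NEXT FROM table_cursor INTO @table;
END
CLOSE table_cursor;
DEALLOCATE table_cursor;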