I have two transaction log files, one on the C: drive and the other on the D: drive. I want to drop the log file on the C: drive from that database. Can anyone help me out with the complete syntax?

I have a downtime window of an hour and a half to accomplish this task.

Here is what I think:
Take a full database backup
Take transaction log backup
Shrink the file using DBCC SHRINKFILE with the TRUNCATEONLY option
Empty the file using DBCC SHRINKFILE with EMPTYFILE, or drop the file using ALTER DATABASE ... REMOVE FILE
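A minimal sketch of that last step, with placeholder names (MyDatabase, and a logical file name MyDatabase_Log_C, which you would look up first):

USE master;
GO
BACKUP LOG MyDatabase TO DISK = N'D:\Backups\MyDatabase_log.trn';
GO
-- find the logical name of the log file that lives on C:
SELECT name, filename FROM MyDatabase..sysfiles;
GO
ALTER DATABASE MyDatabase REMOVE FILE MyDatabase_Log_C;
GO
-- if REMOVE FILE reports the file is still in use, take another log backup
-- (and, if needed, DBCC SHRINKFILE on that file) and retry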
I am having trouble finding the right syntax to DROP a column with a constraint and recreate it; I get an error.
if exists (select * from INFORMATION_SCHEMA.COLUMNS
           where TABLE_NAME = 'myTable' and COLUMN_NAME = 'myDate')
    ALTER TABLE [dbo].[myTable] DROP COLUMN myDate
GO

ALTER TABLE [dbo].[myTable] WITH NOCHECK ADD myDate datetime CONSTRAINT [DF_myDate] DEFAULT (GetDate())
GO
Query Analyzer says:

Server: Msg 5074, Level 16, State 1, Line 5
The object 'DF_myDate' is dependent on column 'myDate'.
Server: Msg 4922, Level 16, State 1, Line 5
ALTER TABLE DROP COLUMN myDate failed because one or more objects access this column.
Server: Msg 2705, Level 16, State 4, Line 2
Column names in each table must be unique. Column name 'myDate' in table 'dbo.myTable' is specified more than once.
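The errors point at two things: the default constraint DF_myDate has to be dropped before the column can be, and the DROP and the ADD must run in separate batches so the parser doesn't still see the old column. A sketch (I dropped the WITH NOCHECK, which only matters for check/foreign key constraints):

if exists (select * from INFORMATION_SCHEMA.COLUMNS
           where TABLE_NAME = 'myTable' and COLUMN_NAME = 'myDate')
begin
    ALTER TABLE [dbo].[myTable] DROP CONSTRAINT [DF_myDate]  -- drop the default first
    ALTER TABLE [dbo].[myTable] DROP COLUMN myDate
end
GO
ALTER TABLE [dbo].[myTable] ADD myDate datetime CONSTRAINT [DF_myDate] DEFAULT (GetDate())
GO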
Hi all, I am having a serious problem removing and re-adding a user in a database.

Microsoft SQL Server 2000 - 8.00.760 (Intel X86) Dec 17 2002 14:22:05 Copyright (c) 1988-2003 Microsoft Corporation Enterprise Edition on Windows NT 5.2 (Build 3790: Service Pack)

Somehow, the user was created earlier but was not able to run a query assigned to him. As a result I wanted to drop and recreate the user. I have tried many possible things to create the user again with the same user ID but failed.

1. I tried to delete this user from Enterprise Manager: Security --> Logins --> user1. It deletes, but when I try to add the user again it gives me an error message (Error 15023: user or role 'user1' already exists).
2. Then I tried, in the database: delete sysusers where name='user1'. It deletes user1.
3. I tried adding the user again and got the message in 1.
4. Then I tried:

use db1
EXEC sp_change_users_login 'Update_One', 'user1', 'user1'

Server: Msg 15291, Level 16, State 1, Procedure sp_change_users_login, Line 88
Terminating this procedure. The User name 'user1' is absent or invalid.

I also tried the master database and ran the following:

EXEC sp_droplogin 'user1'

The login 'user1' does not exist.

But if I try to add the login user1, I get the error message: Error 15023: user or role 'user1' already exists.

I also ran the following when I got an ad hoc update error message:

execute sp_configure "allow updates", 1
go
reconfigure with override
go

Could you please tell me how I can solve this problem? I highly appreciate your help. Thanks a million in advance.

best regards,
mamun
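A minimal sketch of the usual repair path for an orphaned SQL 2000 user (this assumes the login still exists and the database user's SID no longer matches it; the direct DELETE FROM sysusers is what typically leaves things half-removed):

USE db1
EXEC sp_change_users_login 'Report'             -- list orphaned users and their SIDs
EXEC sp_change_users_login 'Auto_Fix', 'user1'  -- re-link user1 to the login of the same name

-- the supported drop/re-create order, instead of DELETE FROM sysusers:
-- EXEC sp_revokedbaccess 'user1'   (run in the database)
-- EXEC sp_droplogin 'user1'        (run in master)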
I just upgraded from SQL 2000 to SQL 2005. When I already have a query open for a database and I drag and drop a script in, I get the login box. Is there any way to stop this from happening? I would like the script, when dragged and dropped, to load in a new query window using the database connection I already had. (This is what SQL 2000 did.) Thanks.
I get the following error message while dropping a unique index.
Server: Msg 3723, Level 16, State 5, Line 1
An explicit DROP INDEX is not allowed on index 'dbo.ALEG.IX_ALEG'. It is being used for UNIQUE KEY constraint enforcement.
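Per the message, IX_ALEG is enforcing a UNIQUE KEY constraint, so it has to be dropped as a constraint rather than with DROP INDEX. A sketch using the names from the error:

ALTER TABLE dbo.ALEG DROP CONSTRAINT IX_ALEG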
Has anybody had experience dropping system-generated indexes?

I tried to drop some auto-generated indexes, which usually have names like _WA_Sys_[column name]_07F6335A, following the table.indexname rule. It always says no such index exists in the database, but I can query these indexes in sysindexes and verify they are there. What went wrong?
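The _WA_Sys_* entries appear in sysindexes but they are auto-created column statistics, not real indexes, which is why DROP INDEX cannot find them; they are removed with DROP STATISTICS instead. A sketch (myTable and the column name are placeholders):

-- confirm the entry is statistics-only
SELECT name FROM sysindexes
WHERE id = OBJECT_ID('myTable')
  AND INDEXPROPERTY(id, name, 'IsStatistics') = 1

DROP STATISTICS myTable._WA_Sys_myColumn_07F6335A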
I'm familiar with how to check for the existence of a table before dropping it using the following command:
if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[xxx]') and OBJECTPROPERTY(id, N'IsUserTable') = 1) drop table [dbo].[xxx]
How does one check for the existence of a temp table (using the # syntax) before dropping it? I've tried various flavors of this command and none work. One flavor is:
use tempdb if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[#xxx]') and OBJECTPROPERTY(id, N'IsUserTable') = 1) drop table [dbo].[#xxx]
CREATE TABLE #xxx ( NumID INTEGER IDENTITY(1,1), Exhibitor_Id INTEGER NOT NULL, Company_Id INTEGER NOT NULL )
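Temp tables are created in tempdb under a uniquified name (the # name padded with underscores plus a numeric suffix), so object_id(N'[dbo].[#xxx]') finds nothing. OBJECT_ID resolves the # name directly; a minimal sketch:

IF OBJECT_ID('tempdb..#xxx') IS NOT NULL
    DROP TABLE #xxx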
I have a database that is about 300 GB. I am setting up replication to a reporting server. We are doing a series of mock loads, and I will need to drop the tables and reload the main database a few times before we go live. To do that, I plan to stop replication and drop all the articles, drop the subscription, then load the new data, then reinitialize and restart replication.

The first time I tried this, dropping the articles seemed to trigger a "clean up" of the distribution database on the reporting server, and that took a couple of hours. The distribution database is about 40 GB.

Is this the correct behavior in SQL 2005 replication? Is there a way to avoid it? I have all the replication pieces scripted out and would like to just drop replication, reload, and then run my scripts to recreate replication. But this cleanup is going to cause me a lot of headaches if I don't figure out what is going on.

Am I going down the wrong road here? Is there an easier way to do this? Any comments would be great!
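For what it's worth, dropping the subscription and the whole publication in one go, rather than article by article, may sidestep the per-article cleanup passes. A hedged sketch run at the publisher (the publication name is a placeholder):

EXEC sp_dropsubscription @publication = N'MyPub', @article = N'all', @subscriber = N'all'
EXEC sp_droppublication @publication = N'MyPub'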
Currently I have a single hard-coded file path to the SSRS config file, which I parse to get the Reporting Services web service URL. My question is: how would I run this same query against hundreds of servers that may or may not share the same file path as the hard-coded one?

Is there a way to query the registry to find the location of the config file on any server? It could be on D:, E:, F:, H:, etc.

I know I can string together the address followed by "Reports" and the named instance if needed, but some instances may not have used the default virtual directory name (Reports).

Am I going about this the hard way? Is there a location where the web service URL exists in a table? I could not locate anything in the Reporting Services database. Basically, I need to inventory all of my Reporting Services URLs.
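One hedged option is reading the instance's Reporting Services install path from the registry with the undocumented xp_regread; the key and value names below are assumptions that vary by SSRS version, so verify them on your builds:

DECLARE @instanceId nvarchar(100), @key nvarchar(300), @rsPath nvarchar(512);
EXEC master.dbo.xp_regread
     @rootkey    = N'HKEY_LOCAL_MACHINE',
     @key        = N'SOFTWARE\Microsoft\Microsoft SQL Server\Instance Names\RS',
     @value_name = N'MSSQLSERVER',                 -- named instances use their own value name
     @value      = @instanceId OUTPUT;             -- e.g. 'MSRS10_50.MSSQLSERVER'
SET @key = N'SOFTWARE\Microsoft\Microsoft SQL Server\' + @instanceId + N'\Setup';
EXEC master.dbo.xp_regread
     @rootkey    = N'HKEY_LOCAL_MACHINE',
     @key        = @key,
     @value_name = N'SQLPath',
     @value      = @rsPath OUTPUT;
SELECT @rsPath + N'\ReportServer\rsreportserver.config' AS ConfigFile;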
I have a customer running RAID 5 on a Windows 2000 server. One of the drives went bad. The customer replaced the drive and the RAID rebuilt it, and everything seemed to be fine, but there is one database file that cannot be attached to SQL. The file is 15 GB, so I know there is data in it. The error states that the file is not a primary file. Any clue how to fix this?
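That error usually means the file being attached is a secondary data file (.ndf) or log file rather than the .mdf, or the .mdf's header page was damaged in the rebuild. A sketch of the era-appropriate attach, which must name the real primary file first (all paths are placeholders):

EXEC sp_attach_db @dbname = N'MyDb',
     @filename1 = N'E:\Data\MyDb.mdf',      -- the primary file must come first
     @filename2 = N'E:\Data\MyDb_log.ldf'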
Greetings. I have just arrived back in the country (NZ) and back into ASP.NET. I am having trouble with the following:

An attempt to attach an auto-named database for file (file location).../Database.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share.

It has only begun since I decided I wanted to use IIS. I realise VWD comes with its own localhost, but since that is only temporary, I wanted a permanent shortcut on my desktop linking to my intranet page. Does anyone have any ideas why I am getting the above error? I have searched many places on the internet and am not getting any closer. Cheers ~ J
I am testing some maintenance-task SQL commands such as index rebuild, index reorganize, update statistics, and DB integrity check on a SQL Server 2014 database. This is a new non-production vendor database (DB size 500 GB, log size 25 GB) which will eventually be created in production. Currently it is in the full recovery model and without log backups. The database has a whole lot of indexes. I am trying to rebuild and reorganize all the indexes (that need it), and to get an idea of how long these maintenance tasks will take and the space needed in the log file to complete them. I would like to execute these tasks manually (the first time) to gather the duration and space information. Eventually, I would probably schedule a weekly job to perform this maintenance.

I ran the index rebuild task on the database and noticed that the log file grew by over 50 GB. I killed the process and truncated and shrank the log file back down.
1. Do the index rebuild, index reorganize, update statistics, and DB integrity check commands all use the log file?
2. Does an index reorganize have less impact on the log file than an index rebuild?
3. Should a log truncate and shrink be performed after these maintenance commands?
4. Should a full database backup be performed after these maintenance commands, or before them?
I have read and understand that shrinking is not good for the database (could lead to more fragmentation and more data file growth when data is added) and I know about rebuilding indexes when fragmentation is GT 30% and reorganizing indexes when fragmentation is GT 5% and LE 30%.
Since this is a non-production database maybe I should set the recovery model to simple, run the maintenance commands and leave the database in simple recovery model unless the vendor needs it in full recovery model for some unknown reason.
5. With the simple recovery model the log file should be reused in a circular manner and not grow during these maintenance tasks. Is this correct?
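On 1 and 2: all of these operations do write to the log. In full recovery an index rebuild is fully logged at roughly the size of the index, while a reorganize works in many small transactions whose log space can be reclaimed by log backups as it runs. A rough sketch for measuring duration and log use per step (the table name is a placeholder):

DBCC SQLPERF(LOGSPACE);   -- log size and percent used, per database, before the step
GO
ALTER INDEX ALL ON dbo.MyBigTable REBUILD;   -- or REORGANIZE
GO
DBCC SQLPERF(LOGSPACE);   -- after: the delta approximates the log space the step needed
GO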
I need to write a process to get a file's size in KB and the record count in the file. I was planning on writing a C# console app that takes the file path and name as parameters, but should I use a CLR assembly instead?

I can't put a script in the SSIS package that brings the file down, because it has been decided that we only use SSIS for file consumption.
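A hedged T-SQL-only alternative, in case neither a console app nor CLR is wanted; this assumes xp_cmdshell is enabled and a one-column-per-line format file exists (both paths are hypothetical):

DECLARE @rows int;
SELECT @rows = COUNT(*)
FROM OPENROWSET(BULK 'C:\Files\MyFile.txt',                  -- hypothetical path
                FORMATFILE = 'C:\Files\OneColumn.fmt') AS r; -- one varchar column per line
EXEC master..xp_cmdshell 'dir /-c "C:\Files\MyFile.txt"';    -- file size in bytes in the dir output
SELECT @rows AS RecordCount;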
For one database, we have 4 data files in a particular filegroup, and the file sizes are almost 70 GB each.

Will I run into any performance issues if I create/pre-allocate an additional data file in the same filegroup so that the existing files don't grow too much?
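One point to weigh: SQL Server fills files in a filegroup proportionally to their free space, so a new, mostly empty file will attract most new allocations until it catches up. Pre-sizing it in a quiet window avoids autogrow stalls later. A sketch with placeholder names and paths:

ALTER DATABASE MyDb
ADD FILE (
    NAME = N'MyDb_Data5',
    FILENAME = N'E:\Data\MyDb_Data5.ndf',
    SIZE = 70GB,
    FILEGROWTH = 1GB
) TO FILEGROUP [MyFileGroup];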
I have a package with only one Data Flow Task, and it has only three components: 1) a source, which is a SQL database, 2) an OLE DB destination, and 3) a flat-file destination for the error output. I want the error file to be created ONLY if there is an error while loading the data into the destination DB. The issue is that the error flat file is created even when there are no errors loading the data from source to destination.
I'm copying files to a folder with the naming convention as follows in the source folder:
CM_ABC_MY_TEST.txt
In the destination folder, this filename needs to appear as:
CM_XYZ_MY_TEST.txt
In my File System Task, I'm pretty sure I'm going to need an expression with a replace, substring, etc. But am having a hard time nailing down the exact syntax.
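One hedged option: build the destination path in a string variable via a property expression and point the File System Task's DestinationVariable at that variable. Assuming variables named User::SourceFileName and User::DestinationFolder (both are assumptions), the expression could be:

@[User::DestinationFolder] + REPLACE( @[User::SourceFileName], "_ABC_", "_XYZ_" )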
I need to know how I can get the dynamically created filename from the Flat File destination, to insert it into a package audit table.

Scenario: I have created a package that successfully outputs dynamically named flat files (format: C:\Test\Comms_File_<FileNumber>_<Date>.txt; e.g. Comms_File_1_20150724.txt, Comms_File_2_20150724.txt, etc.) using a Foreach Loop Container:

* Enumerator set to "Foreach ADO Enumerator", with the ADO object source variable selected to identify how many total loop iterations there are, i.e. let's say 4, thus 4 files to be created.
* Variable Mappings: added User::FileNumber, which indicates which file number the current loop iteration is, i.e. 1, 2, 3, 4.

The Data Flow task has an OLE DB source and a Flat File destination, where the Flat File connection string is set up as:
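One hedged approach: keep the evaluated file name in a string variable (the same expression the connection string uses), and after the Data Flow, still inside the loop, run an Execute SQL Task that maps that variable as parameter 0. The audit table here is hypothetical:

INSERT INTO dbo.PackageAudit (FileName, LoadDate)
VALUES (?, GETDATE());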
In my Script Task I have the following code. The task I'm trying to accomplish is: if a filename on the FTP server can be found in the local archive folder on the e: drive, then show the message "FileAlreadyThere" (I will ultimately change it to do nothing); if the filename on the FTP server cannot be found in the local archive folder, then transfer the file to the local package folder on the d: drive.

While the Script Task was executing I watched it closely, and the problem I saw is this: if some files on the FTP server are already in the local archive folder and some are not, then the files that are already in the archive folder get dumped to the package folder, and after that the files that are not in the archive folder get dumped to the package folder as well. But I only want the new files on the FTP server to be transferred to the package folder for further processing.

After this finished, I saw all the files in the package folder being refreshed one after another; after the first round of refreshes a second round started, and after the second round finished it stopped. I could tell the files were being refreshed because the 'Date Modified' of each file changed. Then the Script Task turned green.

I don't see how the code below produced this result. Is something wrong in the logic of the loop? Does anyone have any idea why it's behaving the way it is, and how to change the code to accomplish what I want? Thanks a lot!
Imports System
Imports System.IO
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
        cm.Properties("ServerName").SetValue(cm, "ftp2.name.com")
        cm.Properties("ServerUserName").SetValue(cm, "username")
        cm.Properties("ServerPassword").SetValue(cm, "password")
        cm.Properties("ServerPort").SetValue(cm, "21")
        cm.Properties("Timeout").SetValue(cm, "0")
        cm.Properties("ChunkSize").SetValue(cm, "1000") '1000 KB
        cm.Properties("Retries").SetValue(cm, "1")

        Dim ftp As FtpClientConnection = New FtpClientConnection(cm.AcquireConnection(Nothing))
        ftp.Connect()
        ftp.SetWorkingDirectory("/directory")

        Dim fileNames() As String
        Dim folderNames() As String
        ftp.GetListing(folderNames, fileNames)

        If fileNames Is Nothing Then
            MsgBox("NoFileOnFTP")
        Else
            Dim fileName As String
            For Each fileName In fileNames
                If File.Exists("c:\temp\" + fileName) Then   ' path was garbled in the post; "c:\temp" assumed
                    MsgBox("FileAlreadyThere")
                Else
                    ftp.ReceiveFiles(fileNames, "c:\temp", True, True)
                End If
            Next
        End If

        ftp.Close()                              ' the post was cut off here; these
        Dts.TaskResult = Dts.Results.Success     ' closing lines are assumed
    End Sub
End Class
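For what it's worth, the Else branch above passes the entire fileNames array to ReceiveFiles, so every pass through it transfers every file on the FTP server; that would explain both the already-archived files landing in the package folder and the repeated rounds of 'Date Modified' refreshes. A minimal per-file sketch in the same style (the e:\archive and d:\package paths are taken from the description and are assumptions):

For Each fileName In fileNames
    If File.Exists("e:\archive\" + fileName) Then
        ' already processed: do nothing
    Else
        ' transfer only this one file, not the whole array
        ftp.ReceiveFiles(New String() {fileName}, "d:\package", True, True)
    End If
Next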
Similar to a previous post (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=244646&SiteID=1), I am trying to import data into a SQL Table.
I am trying to program a small application that will import product data obtained through suppliers via CD-ROM. One supplier in particular uses fixed-width columns, and the data looks like this:
Example of Data
0124015Apple Crate 32.12
0124016Bananna Box 12.56
0124017Mango Carton 15.98
0124018Seedless Watermelon 42.98

My table would then have: ProductID as int, Name as text, Cost as money.
How would I go about extracting the data with an XML Format file? I am stumbling over how to tell it where to start picking up data for a specific column. Is there any way that I could trim the Name column (i.e.: "Mango Carton " --> "Mango Carton")?
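A sketch of an XML format file for this layout, assuming widths of 7 characters for ProductID and 20 for Name, with the price running to the line end (adjust LENGTH values to the real layout). Trailing spaces in fixed-width char fields survive the load, so the Name column can be trimmed afterwards:

<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <RECORD>
    <FIELD ID="1" xsi:type="CharFixed" LENGTH="7"/>
    <FIELD ID="2" xsi:type="CharFixed" LENGTH="20"/>
    <FIELD ID="3" xsi:type="CharTerm" TERMINATOR="\r\n" MAX_LENGTH="12"/>
  </RECORD>
  <ROW>
    <COLUMN SOURCE="1" NAME="ProductID" xsi:type="SQLINT"/>
    <COLUMN SOURCE="2" NAME="Name" xsi:type="SQLVARYCHAR"/>
    <COLUMN SOURCE="3" NAME="Cost" xsi:type="SQLMONEY"/>
  </ROW>
</BCPFORMAT>

Then load and trim (table and paths are placeholders):

BULK INSERT dbo.Products FROM 'C:\Data\products.txt' WITH (FORMATFILE = 'C:\Data\products.xml');
UPDATE dbo.Products SET Name = RTRIM(Name);   -- strip the fixed-width padding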
I don't know if it makes any difference, but I've been calling SQL from my code by doing this:
Code in C# Form
SqlConnection SqlConnection = new SqlConnection(global::SQLClients.Properties.Settings.Default.ClientPhonebookConnectionString);
SqlCommand cmd = new SqlCommand();

SqlConnection.Open();
cmd.ExecuteNonQuery();
SqlConnection.Close();
RefreshData();

I am running Visual Studio C# Express 2005 and SQL Server Express 2005.
I need to find out the number of columns in a flat file before I process that particular file. I have the file name in a @filename variable and the file path in a @filepath variable, but I do not know how to check the column names before I process the file.

@filePath = C:DatabaseSourceFilesCAHCVSSourceFiles. I am using a Foreach Loop container to read the files one by one and put each file name in the @filename variable.

Now what I have to do is make sure that ID, Name, City, County, Phone are present in the flat file. If they are not there, then I have to send mail to the client saying that the file is not valid. I also need to calculate the size of the flat file.
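One hedged way to check the header row in T-SQL: read the file's text and compare the first line to the expected column list. SINGLE_CLOB loads the whole file, so this suits smallish files; the path below is a placeholder:

DECLARE @contents varchar(max), @firstLine varchar(4000);
SELECT @contents = BulkColumn
FROM OPENROWSET(BULK 'C:\SourceFiles\MyFile.csv', SINGLE_CLOB) AS f;  -- hypothetical path
SET @firstLine = LEFT(@contents, CHARINDEX(CHAR(10), @contents + CHAR(10)) - 1);
IF @firstLine NOT LIKE 'ID,Name,City,County,Phone%'
    RAISERROR('File header is not valid', 16, 1);  -- hook the client e-mail (e.g. sp_send_dbmail) here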
The TEMPDB transaction log file keeps growing. The database server is new, and the transaction log was presized to 1 GB on installation. After installing a number of databases, the log file grew over a day to 38 GB. Issuing a manual checkpoint was the only way to free some space to allow it to be shrunk back to a usable size. The usage of the file is still going up.

I am struggling to find what process is causing the log to be used so heavily. Looking at log_reuse_wait_desc for tempdb returns "NOTHING", and tempdb itself isn't being used very much or growing in size.
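A sketch for finding the sessions holding tempdb log space:

DBCC OPENTRAN ('tempdb');   -- oldest active transaction in tempdb

SELECT s.session_id, s.login_name, t.database_transaction_log_bytes_used
FROM sys.dm_tran_database_transactions t
JOIN sys.dm_tran_session_transactions st ON st.transaction_id = t.transaction_id
JOIN sys.dm_exec_sessions s ON s.session_id = st.session_id
WHERE t.database_id = 2    -- tempdb
ORDER BY t.database_transaction_log_bytes_used DESC;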
On one server we had file growth, so we had to add a new hard drive and a new data file on it. Now we have a new server with a huge hard drive, but all the files remain. Can I consolidate these files into one data file or not?
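If the remaining files have room, the extra file can be emptied into the rest of the filegroup and then dropped; the primary file cannot be removed this way. A sketch with placeholder logical names:

USE MyDb;
DBCC SHRINKFILE (N'MyDb_Data2', EMPTYFILE);   -- migrate its pages to the other files
ALTER DATABASE MyDb REMOVE FILE MyDb_Data2;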
Hi, I am trying to use BULK INSERT with a format file. All of our data has a few bytes of header in the data file which I would like to skip before doing the BULK INSERT.

Is it possible to write the format file to skip these few bytes of header before doing the BULK INSERT? For example, I have a 1 GB data file with a 1000-byte header. Except for the first 1000 bytes, the rest of the data is good for BULK INSERT.

Thanks in advance. Sorry if it is really a dumb question, as I am new to BULK INSERT and still practicing.

Bob
I have a production SQL Server 7 SP3 on Windows NT. I had an 8 GB data file of which 5 GB were used and 3 GB were unused. I wanted to take back the unused 3 GB, so I did the following with the EM GUI:

1. I tried to "truncate free space from the end of the file". It didn't truncate the file; I believe there was no empty space at the end of the file.
2. Next I chose the option to "shrink file to 5GB". And to my horror, the data file, instead of taking just 5 GB, took the empty space as well, and the size of the used data file went to 8 GB.

Any idea what's going on?

TIA,
SP
I have an SSIS package where I need an Excel destination. In the Excel file, I need a few rows with some text and then the data populated below the text. One of the text lines is like this:
Data as of: 08/25/2015
If the report ran today, then "Data as of" should show yesterday's date. And if the user opens that Excel file after a week, they should still see "Data as of: 08/25/2015", not TODAY()-1.

I was planning to handle it on the Excel side with TODAY()-1, but that only works on the day the report is run. If the Excel file is opened a few days later, it might show "Data as of: 08/30/2015", which is not true. It should still say "Data as of: 08/25/2015" on whatever date the Excel file is opened. The SSIS package runs only once.

How do I handle this so that whenever users open the file, they see "Data as of: 08/25/2015"? This is not a column in Excel; it is like a description of the data in the sheet.
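One hedged approach: compute the text once, at package run time, and write it into the sheet as a literal string rather than an Excel formula, e.g. from an Execute SQL Task or the source query. Style 101 gives the mm/dd/yyyy format shown above:

SELECT 'Data as of: ' + CONVERT(varchar(10), DATEADD(day, -1, GETDATE()), 101) AS HeaderText;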
Does anyone know how to do this using variables? Every time I try it, I get:

Error: Failed to lock variable for read access with error 0xC0010001.

I also tried writing a script and still get the same error. If I hard-code the values into the variables it works fine, but I will be running this every day so that it pulls in the current date along with the filename, so the values of the variables will change every day. Here is my expression:
This is the flow in one of my packages (an ETL job). An Excel file contains monthly revenue details, and I want to import the Excel data into my database staging table, so I created the package. It works fine...

The problem: when we swap in the new data for the next month and run the package, it does not run. It is the same file and the same format; we only delete the contents of the file, except the first row of the Excel sheet, and paste in the new data. The new data comes from an Oracle database in the form of an Excel sheet (they copy the data manually and send it to us).

I opened the package in design mode, and when I double-click the Excel source it says: <column name>'s metadata needs to be synchronized. Do you want to fix this issue automatically with the available external columns' metadata?

It is clearly a data type issue; I changed the corresponding data types to match the previous Excel sheet, which is equivalent to the table it is copying into.

Now the package runs with validation warnings: External column "Invoice Amount" needs to be updated... etc. I can see two or three such warning messages in the package execution wizard.

OK, I am ready to accept these warnings, and I want my package to run from my server. (Packages have been deployed to a centralized server; every time we want to run a package, an ASP.NET web page executes the package in an OnClick event.)

The package is not running from the server, and I guess it is due to the metadata change in the Excel file.

Please suggest some guidelines to resolve this metadata issue; I want my Excel sheet's metadata to stay unchanged when we put new data in it.

Otherwise, suggest a way I can validate the Excel sheet before running the package, testing whether the data is in the correct format or not; a kind of data profiling activity.

I know it is somewhat crazy, but I need to maintain the system with a permanent solution instead of facing this metadata mismatch issue again and again!

Sorry for the somewhat lengthy explanation; I think I have explained my problem clearly. If not, let me know your questions and I will try my best to answer them.
What is the easiest way to accomplish this task with SSIS?
Basically, I have a stored procedure that unions multiple queries between databases. I need to export its output to a text file on a daily basis and add a "Total records:" row at the end of the text file.
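A rough T-SQL sketch for staging the output plus the trailer row; a plain Data Flow (or bcp) can then write the single column to the text file. The procedure name is a placeholder:

DECLARE @n int;
CREATE TABLE #out (rowid int IDENTITY(1,1), line varchar(4000));
INSERT INTO #out (line) EXEC dbo.MyUnionProc;  -- assumes the proc returns one text column;
                                               -- otherwise stage typed columns and concatenate
SET @n = @@ROWCOUNT;
INSERT INTO #out (line) VALUES ('Total records: ' + CAST(@n AS varchar(20)));
SELECT line FROM #out ORDER BY rowid;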
How do I use a Microsoft Data Link (.udl) file to connect to SQL Server using SQL Native Client? With an OLE DB connection I can connect to the database using a UDL file, but how can I use such a file for a connection using SQL Native Client? Is there another kind of data link file that can be used to connect to SQL Server with SQL Native Client?
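A .udl file is just a text file whose last line is an OLE DB init string, and SQL Native Client is itself an OLE DB provider, so pointing the UDL at the SQLNCLI provider should work. A sketch of the file contents (server and database names are placeholders; the provider is SQLNCLI.1 for the 2005 Native Client, SQLNCLI10 for 2008):

[oledb]
; Everything after this line is an OLE DB initstring
Provider=SQLNCLI.1;Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI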