Could anyone tell me if and how I can automate processing in SQL Server? For example, I would like to automate reading from a source file, populating the respective tables, and then running a view to process the desired tables. I have already done all of the above manually; I just want to know if there is a way to automate the whole process.
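A minimal sketch of the usual shape of this, assuming the source is a flat file (table, view, and path names are all placeholders): the load and the view query sit in one T-SQL script, and a SQL Server Agent job runs that script on whatever schedule is needed.

-- load the source file into its table (all names are placeholders)
BULK INSERT dbo.SourceTable
FROM 'C:\feeds\source.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

-- then run the processing view over the freshly loaded tables
INSERT INTO dbo.ProcessedResults
SELECT * FROM dbo.vw_ProcessDesiredTables

Paste the script into a job step (SQL Server Agent > Jobs > New Job in Management Studio) and attach a schedule to the job.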
I am trying to create a process to automate the FTP step and then import the file to a table. Once the import is complete, I would like to move the file off the FTP server and rename it. I tested the commands via CMD and they work. I then created a stored procedure with the DOS commands to rename/move the source file. The command executes and deletes the file from the source; however, the files copied to a different destination disappeared. I searched my local hard drive (I am testing with local folders), but can't find where the copied/renamed file went.
Here is the relevant portion of the commands:
SET @cmd = 'Copy "C:\Documents and Settings\egllare\My Documents\Box\testData.txt" "C:\Documents and Settings\egllare\My Documents\Archive\Copybox.%random%.bak"'
EXEC master..xp_cmdshell @cmd
GO
Can you please tell me what's wrong with this, or direct me to a sample stored procedure I can model?
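For comparison, a minimal sketch of the same copy with the destination name generated in T-SQL rather than by cmd's %random%, so the resulting file name is known in advance (the paths are the ones from the post and may need adjusting):

DECLARE @cmd varchar(500), @stamp varchar(8)
-- build a date-based suffix so the destination file is easy to find afterwards
SET @stamp = CONVERT(varchar(8), GETDATE(), 112) -- yyyymmdd
SET @cmd = 'copy "C:\Documents and Settings\egllare\My Documents\Box\testData.txt" '
         + '"C:\Documents and Settings\egllare\My Documents\Archive\Copybox.' + @stamp + '.bak"'
EXEC master..xp_cmdshell @cmd

Also worth checking: xp_cmdshell runs under the SQL Server service account, not your own login, so folder permissions for that account can make a copy fail even when the same command works from your own CMD window.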
I was wondering if anyone can share the procedure(s) used to set up an automated MS SQL database backup through Tivoli Storage Manager (5.2.7)?
The powers that be have decided that they would like to automatically run a trace on one of our Analysis Services servers when it processes a cube in the early hours of the morning. I have no problem creating a SQL Profiler trace to run and store the results in a database table for them, but I have no idea how to automate it so that it runs every day; any help would be greatly appreciated. Apologies for not putting this in the correct forum.
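If the trace targets the relational engine, one hedged sketch is to script it server-side with the sp_trace_* procedures and schedule that script in a daily SQL Server Agent job; Profiler can also generate the full definition for you (File > Export > Script Trace Definition in the 2005 tooling). The path, event, and columns below are illustrative only. Tracing Analysis Services itself works differently (through an XMLA trace definition), so this covers only the relational side.

DECLARE @TraceID int, @maxfilesize bigint
SET @maxfilesize = 50 -- max MB per trace file

-- create a server-side trace writing to a file (placeholder path)
EXEC sp_trace_create @TraceID OUTPUT, 0, N'C:\Traces\cube_processing', @maxfilesize, NULL

-- capture SQL:BatchCompleted (event 12): TextData (column 1) and Duration (column 13)
EXEC sp_trace_setevent @TraceID, 12, 1, 1
EXEC sp_trace_setevent @TraceID, 12, 13, 1

-- start the trace; a later job step can stop it with sp_trace_setstatus @TraceID, 0
EXEC sp_trace_setstatus @TraceID, 1

The finished .trc file can then be pulled into a table with fn_trace_gettable for the requested database storage.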
INSERT INTO SaltInvWhOpen (StockCode, Warehouse, TrnMonth, TrnYear, OpenBalQty, OpenBalCost)
SELECT StockCode, Warehouse,
       MONTH(GETDATE()) AS TrnMonth, YEAR(GETDATE()) AS TrnYear,
       OpenBalQty12, OpenBalQty12 * UnitCost AS Expr1
FROM InvWarehouse
In older versions I would have used a DTS package (it's been a long time, and I'd have needed a book to guide me through it), but as that is gone now, I am unsure where to even begin looking.
If someone could assist by giving me the current method, and dare I ask, walk me through it, the help would be much appreciated.
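For what it's worth, a minimal sketch of the post-DTS route: put the INSERT above into a SQL Server Agent job step and attach a schedule, either through the Management Studio UI (SQL Server Agent > Jobs > New Job) or in T-SQL with the msdb procedures. All names and times below are placeholders.

USE msdb
GO
EXEC dbo.sp_add_job @job_name = N'Monthly SaltInvWhOpen load'
EXEC dbo.sp_add_jobstep @job_name = N'Monthly SaltInvWhOpen load',
    @step_name = N'Insert opening balances',
    @subsystem = N'TSQL',
    @database_name = N'YourDatabase', -- placeholder
    @command = N'INSERT INTO SaltInvWhOpen ... FROM InvWarehouse' -- the query above
EXEC dbo.sp_add_jobschedule @job_name = N'Monthly SaltInvWhOpen load',
    @name = N'First of the month',
    @freq_type = 16, @freq_interval = 1,  -- monthly, on day 1
    @active_start_time = 10000            -- 01:00:00
EXEC dbo.sp_add_jobserver @job_name = N'Monthly SaltInvWhOpen load'

SSIS replaced DTS for the larger import/export pipelines, but for a single INSERT like this an Agent job is all that's needed.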
After SQL Server Express 2012 is installed, is there a way to automate setting a few options? I need to enable both Windows and SQL Server authentication. Can this be done with PowerShell or another tool? This is a situation where the user has absolutely no computer skills and there is no one available in their office who can set the options manually, so I thought some sort of script would be useful.
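One hedged possibility: the authentication mode lives in the registry as the LoginMode value for the instance, and the (undocumented but long-standing) xp_instance_regwrite procedure can set it from a script; the service must be restarted for the change to take effect.

-- switch to mixed mode (Windows + SQL Server authentication); restart the service afterwards
EXEC master..xp_instance_regwrite
    N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'LoginMode',
    N'REG_DWORD',
    2 -- 1 = Windows authentication only, 2 = mixed mode

A PowerShell wrapper could run this with Invoke-Sqlcmd (or sqlcmd), keeping the end user to a single double-click on a script; the account running it needs admin rights on the machine.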
We're trying to put a view of data maintained in desktop Access databases online and into SQL Server. The desktop Access system uses separate databases instead of tables within one database; it's a strange design, but it can't be changed. We have been importing all of the separate databases into a single, new Access database, then upsizing the new database to SQL Server, then uploading it. This is not going to work long term, because we are stuck with a 250 MB Access database to upsize and upload, when we never need to update more than 2 or 3 of the tables or upload more than 2 MB. We'd like to be able to upload only the tables -- preferably the Access *.mdb's -- that have changed, and then replace the SQL Server tables with the new information. And we'd like to automate it as much as possible, without upsizing wizardry. I don't know where to even begin looking for information about how this might be done. Any suggestions would be deeply appreciated. - Tinker
Hi, I would appreciate it if someone could demonstrate how to automate FTP from the command line within a batch file. I want to move certain files from one server to another via the FTP command line, automatically, by running the batch file periodically.
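A minimal sketch, with placeholder host, credentials, and paths: ftp.exe reads its commands from a text file via the -s switch, so the batch file is one line and the script file carries the transfer. The batch file can then be scheduled with Windows Task Scheduler.

Contents of a script file such as C:\scripts\ftpcmds.txt:

open ftp.example.com
myuser
mypassword
binary
put C:\outbound\data.txt
bye

The batch file itself:

ftp -s:C:\scripts\ftpcmds.txt

Note the password sits in the script file in clear text, so lock down the folder's permissions.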
I want to automate exporting data from an Excel workbook to SQL Server using DTS; I don't want to do this manually. Can you help me solve this? What are the things I have to do? Expecting your earliest reply.
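One hedged sketch of the usual flow: build the Excel-to-SQL transfer once with the DTS Import/Export Wizard, save it as a package (placeholder name below), and run it unattended with the dtsrun utility, which SQL Server Agent can schedule:

dtsrun /S MYSERVER /E /N ExcelToSqlPackage

/S is the server, /E uses a trusted connection, and /N names the saved package; use /U and /P instead of /E for SQL authentication.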
I'd like to build a process that will identify all the tables in a database with a scan density less than 100% and generate 'dbcc dbreindex' statements for them. The 'dbcc showcontig' command displays the information I want, but I don't know how to access this information from within a script. Any ideas out there?
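A sketch of one approach on SQL Server 2000, hedged because INSERT ... EXEC needs an exact column match and the TABLERESULTS layout below should be verified against your build. DBCC SHOWCONTIG with no table argument covers every table in the current database, WITH TABLERESULTS turns the output into a rowset, and the rowset can then be filtered on ScanDensity.

CREATE TABLE #frag (
    ObjectName sysname, ObjectId int, IndexName sysname NULL, IndexId int,
    [Level] int, Pages int, [Rows] bigint,
    MinimumRecordSize int, MaximumRecordSize int, AverageRecordSize float,
    ForwardedRecords bigint, Extents int, ExtentSwitches int,
    AverageFreeBytes float, AveragePageDensity float, ScanDensity float,
    BestCount int, ActualCount int,
    LogicalFragmentation float, ExtentFragmentation float
)

-- capture the fragmentation report for every table/index in the database
INSERT INTO #frag
EXEC ('DBCC SHOWCONTIG WITH TABLERESULTS, ALL_INDEXES')

-- emit a DBCC DBREINDEX statement for each table below 100% scan density
SELECT DISTINCT 'DBCC DBREINDEX (''' + ObjectName + ''')'
FROM #frag
WHERE ScanDensity < 100

DROP TABLE #frag

The generated statements can be copied out and run, or concatenated and executed with EXEC from the same script.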
I am testing a procedure to automate the transaction log dump. I am following the steps in Chapter 22 of the Microsoft SQL Server Administrator's Companion ("Automatic Transaction Log Dumps Using Performance Monitor"). The alert in Performance Monitor appears to be starting when the log is 75% full, but the alert is not firing off the file that contains the DUMP TRANSACTION SQL command. For the 'Run Program on Alert' box this is what I have:

isql -Ssvrname -Usa -P -id:\apps\mssql\binn\dump.sql

The dump.sql file contains:

dump transaction pubs with no_log

I have also tried the following 4 steps: 1) created a SQL alert message, 2) created an NT Performance Monitor threshold alert to run sqlalrtr to issue a certain error when the pubs log is 75% full, 3) created a TSQL task, and 4) created a SQL Server alert to run the task created in step 3. This appears to do the same thing: the alert is fired off, but the task is never executed. Note: I am able to execute the task from within the Scheduled Tasks window.

I am using Standard Security with SQL Server 6.5 (SP5a) running on NT4. Thanks for your help in advance.
A question: I have heard a bit about the Control Macro Generator but have not found much reading material on it. Here at our company we have employees fill out a weekly timesheet, and clerks typically enter the information into TM.PTA.00 (Project Timesheet with Rate/Amount Entry) and submit it that way. Say we have this timesheet in a nice little Excel sheet with columns that match the input boxes on the form; how can we get it to input the data automatically instead of having a person sit there and type it all in?
I would like to automate a simple process which involves truncating a table and importing records back into it. What would be the simplest solution for automating this kind of process? I'd like to run it twice a day.
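A minimal sketch, assuming the records come from a flat file (table and path are placeholders): the truncate and the load fit in one T-SQL job step, and the SQL Server Agent job gets a schedule that fires every 12 hours (or two daily schedules).

-- empty the table, then reload it from the feed file
TRUNCATE TABLE dbo.ImportTarget

BULK INSERT dbo.ImportTarget
FROM 'C:\feeds\records.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')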
How can I do this with VBScript or C#?
- Copy backup files down from a network share into the data directory of my local SQL 2005 instance
- Perform a restore using the files copied above
- Execute a DTS package
More info: our databases are scripted and exist in the typical development and testing environments. So as I get ready to start a new application, I want my local SQL instance to be updated with the structure changes as well as the data. So I have to apply the changes from the scripted sources and pull over the data, and I would naturally like to automate this.
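For the restore piece, a hedged T-SQL sketch (database, backup, and file names are placeholders), which VBScript or C# can send through any SQL connection (ADO, or SqlCommand in System.Data.SqlClient); the copy step is ordinary file I/O, and the DTS package can be shelled out to dtsrun:

-- find the logical file names to use in the MOVE clauses
RESTORE FILELISTONLY FROM DISK = N'C:\LocalData\prod.bak'

-- restore over the local copy, relocating the files
RESTORE DATABASE DevCopy
FROM DISK = N'C:\LocalData\prod.bak'
WITH MOVE N'prod_Data' TO N'C:\LocalData\DevCopy.mdf',
     MOVE N'prod_Log'  TO N'C:\LocalData\DevCopy_log.ldf',
     REPLACE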
1) Is it possible to run stored procedures at specified intervals without using the job system (through T-SQL)? I want the schedule to be independent of the MSDB database in case of temporary failures, etc.
2) Would extended stored procedures be helpful in this scenario?
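On 1), one msdb-free technique, offered as a hedged sketch: mark a procedure in master as a startup procedure and let it loop forever, sleeping with WAITFOR DELAY between calls to the real work (placeholder names below). On 2), extended stored procedures aren't needed; plain T-SQL covers this.

USE master
GO
-- startup procedures must live in master and take no parameters
CREATE PROCEDURE dbo.usp_PollingLoop
AS
BEGIN
    WHILE 1 = 1
    BEGIN
        EXEC MyDb.dbo.usp_DoScheduledWork  -- placeholder for the real procedure
        WAITFOR DELAY '00:15:00'           -- interval between runs
    END
END
GO
-- run the loop automatically whenever the service starts
EXEC sp_procoption N'dbo.usp_PollingLoop', N'startup', N'on'

The trade-off is a schedule tied to service uptime rather than clock time, and an unhandled error in the loop ends the polling until the next restart, so wrap the work call defensively.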
I am new to SQL Server 7. I need to populate some tables from a SQL Server 7 database at the end of the day. How can I automate this process? I also need to export these populated tables to a text file on a daily basis. I know I can use DTS to do this, but is there any way to automate that as well? Or is there a third-party tool to do all of this?
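The export half has a long-standing answer in the bcp utility, sketched here with placeholder names; the end-of-day population can be a T-SQL step and this command a CmdExec step in the same scheduled SQL Server Agent job:

bcp MyDb.dbo.DailyExtract out C:\exports\daily.txt -c -S MYSERVER -T

-c writes plain character data, -S names the server, and -T uses a trusted connection (use -U/-P for standard security).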
In order to automate regular maintenance on SQL Server (6.5 SP3), I'm trying to automate putting the database into single-user mode without having to stop/start the server.
sp_dboption won't allow single-user mode while users are still connected to the database, so I was trying the following script:
DECLARE @to_kill int

/* Declare cursor */
DECLARE spid_cursor CURSOR FOR
SELECT spid FROM sysprocesses WHERE dbid = 6

/* Open cursor */
OPEN spid_cursor

/* Loop through all spids */
FETCH NEXT FROM spid_cursor INTO @to_kill
WHILE (@@fetch_status <> -1)
BEGIN
    IF (@@fetch_status <> -2)
    BEGIN
        KILL @to_kill
    END
    FETCH NEXT FROM spid_cursor INTO @to_kill
END
This, however, gives a syntax error on the KILL command (it seems that KILL will not accept a variable). I've tried making the @to_kill variable smallint, int, and varchar without any success.
Is there any automated way to kill user connections without cycling the server?
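KILL indeed refuses variables, so the usual workaround (sketched here; EXEC of a string has been available since 6.0, so it should work on 6.5) is to build each KILL as a string and execute it dynamically:

DECLARE @to_kill smallint, @cmd varchar(30)

DECLARE spid_cursor CURSOR FOR
SELECT spid FROM master..sysprocesses WHERE dbid = 6

OPEN spid_cursor
FETCH NEXT FROM spid_cursor INTO @to_kill
WHILE (@@fetch_status <> -1)
BEGIN
    IF (@@fetch_status <> -2)
    BEGIN
        /* KILL only takes a constant, so run it as dynamic SQL */
        SELECT @cmd = 'KILL ' + CONVERT(varchar(5), @to_kill)
        EXEC (@cmd)
    END
    FETCH NEXT FROM spid_cursor INTO @to_kill
END

CLOSE spid_cursor
DEALLOCATE spid_cursor

With the connections gone, sp_dboption can then set single-user mode in the same script.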
I'm really puzzled by this. I am new to MS SQL and need help with the primary key. When I was using Access, I would open the database table and just enter values for the various columns; each time a new row was begun, the computer would insert a unique number there. OK, so now I am using Visual Studio to work on a SQL database. I created a new table, and the first field I specified was "ID", which I then chose as the primary key. For the default value, I tried both auto number and auto increment. These don't seem to work: when I type values into fields and try to move to the next row, having left the ID field blank, I get an error which reads "String or binary data may be truncated". What is going on? What do I need to do so the field automatically populates with an ID number? Some research sort of led me to the concept of a "trigger" to do this. Is this the approach? I was able to right-click on the table and saw an option for "create trigger" or "new trigger". How do I learn to do this? Is there a generic trigger to do an auto-increment of a primary key ID field?
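For what it's worth, the SQL Server equivalent of an Access AutoNumber is the IDENTITY property on the column, not a default value or a trigger; a minimal sketch with an invented table name:

CREATE TABLE dbo.Widgets (
    ID   int IDENTITY(1,1) NOT NULL PRIMARY KEY, -- numbers rows automatically, starting at 1
    Name nvarchar(50) NOT NULL
)

-- leave ID out of the insert and SQL Server fills it in
INSERT INTO dbo.Widgets (Name) VALUES (N'first row')

In the Visual Studio table designer the same setting appears as 'Identity Specification' in the column properties.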
I have been experiencing a problem with a job I am trying to automate. My team requires two instances of a database on development. Every week, db1 is detached and reattached as db1_copy, and a new copy of db1 from production is copied over. I have used cursors for the dropuser, addlogin, grantdbaccess, addrole, and addrolemember parts of the restore process. Also, we use SQL authentication, not Windows authentication. The issue is that when the Agent encounters a minor issue, e.g. a login that already exists in that database, or adding a member to a role it already belongs to, I get an error. I need to ensure that at such a step the job simply proceeds, because it is a minor step in the process (the same thing happens when the script is run manually, but then I am present and can override it and just let it run in one step instead of many). Is there any way to do this without adding a multitude of steps?
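One hedged way to keep the job in a single step is to guard each call so it only fires when it is actually needed; the login and user names below are placeholders:

-- add the login only if it is not already there
IF NOT EXISTS (SELECT 1 FROM master.dbo.syslogins WHERE name = N'appuser')
    EXEC sp_addlogin N'appuser', N'password'

-- grant database access only if the user is not already mapped
IF NOT EXISTS (SELECT 1 FROM dbo.sysusers WHERE name = N'appuser')
    EXEC sp_grantdbaccess N'appuser'

The same IF NOT EXISTS pattern extends to sp_addrole and sp_addrolemember, so the benign reruns stop raising errors at all.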
I am able to import/export manually; how can I schedule this import/export in a job? I want the table to be dropped and rebuilt each time it runs. Since linking servers isn't recommended, I won't go with that approach.
Hi, I need some advice on how to automate record selection in a stored proc. Here is my situation. I have a stored proc that I use in Crystal Reports with two parameters, Acctcode and Subacct. When a user enters '*' for these parameters, it means report on all accounts; otherwise, report only on the specific account. Here is my select statement, with line numbers:

Create proc rb_SubledgerRpt
@Acctcode varchar(4), @SubAcct varchar(3)
As
3 Select AcctCode, SubAcct
4 From GLDetails
5 Where SubAcct <> ' '
6 and AcctCode = @Acctcode -- for specific acctcode
7 and SubAcct = @SubAcct -- for specific subacct
8 Go

If a user wants to see all Acctcode and all Subacct, how do I disable lines 6 and 7? Thank you in advance for your help. Edgar
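A common pattern for exactly this, sketched against the proc above: turn each filter into a two-part OR so the '*' sentinel neutralizes it, which makes lines 6 and 7 self-disabling.

Create proc rb_SubledgerRpt
@Acctcode varchar(4), @SubAcct varchar(3)
As
Select AcctCode, SubAcct
From GLDetails
Where SubAcct <> ' '
  and (AcctCode = @Acctcode or @Acctcode = '*')  -- '*' switches the acctcode filter off
  and (SubAcct  = @SubAcct  or @SubAcct  = '*')  -- '*' switches the subacct filter off
Go

If users really pass ' *' with a leading space, compare against LTRIM(@Acctcode) instead.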
Is there any way to automate configuring Reporting Services after it is installed? I have many servers on which I need to install SQL 2005 with Analysis Services and Reporting Services, and it would be good if I could run a tool to configure Reporting Services with some default values.
I have written a Data Processing extension for my application and can deploy it on my development machine, no problem. My question is: what is the "correct" way of deploying an extension to an end user's machine? Do I have to write a special program to find and modify the Report Server config files and copy the extension over? Surely many developers have the same need, so there must be a generic solution to this problem; however, I haven't managed to find one.
I first came across this issue in SQL 2000, and I thought/hoped it would be rectified in SQL 2005, but it appears not to be (unless I'm missing something).
I am able to do these actions interactively in SQL 2005 (not Developer or Enterprise Edition, just SQL 2005 Management Studio) and want to script/batch them so they run automatically at a preselected time.
First: I am able to delete the table by right-clicking it in Management Studio and clicking Delete. I verify the table is completely gone with a refresh. (I pulled the code that does this, the DROP TABLE etc., into Notepad.)
Second: I am able to import the table (again from Management Studio) and have saved this action as an SSIS package. I execute the package and, voila, I have all 17K rows of data. I pulled this CREATE TABLE code into Notepad also.
Now I put the code for both of the above actions (drop table and create table) together into one SQL query and execute it. This does not give me the same result as above; instead, my table is now empty.
Maybe there is a better way. The business problem I am attempting to solve: I refresh the data in an AS/400 table weekly. I want that refreshed data to be available in the SQL 2005 database without my having to press buttons first thing Monday morning. Can anyone help? Thanks in advance.
Below is the Code:
USE [400kas]
GO
/****** Object: Table [dbo].[navar100] Script Date: 09/07/2007 16:09:04 ******/
DROP TABLE [dbo].[navar100]
GO
USE [400kas]
GO
/****** Object: Table [dbo].[Query] Script Date: 09/07/2007 16:12:31 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[navar100](
    [CMPNO] [decimal](3, 0) NOT NULL,
    [ARTDT] [datetime] NOT NULL,
    [AUDDT] [datetime] NOT NULL,
    [ARDDT] [datetime] NOT NULL,
    [CCUS#] [decimal](6, 0) NOT NULL,
    [CCNAM] [nvarchar](25) NOT NULL,
    [CUSNO] [decimal](6, 0) NOT NULL,
    [CNAME] [nvarchar](25) NOT NULL,
    [SHPNO] [decimal](4, 0) NOT NULL,
    [ARRCD] [nvarchar](1) NOT NULL,
    [AUDUS] [nvarchar](10) NOT NULL,
    [INVNO] [decimal](6, 0) NOT NULL,
    [CUSPO] [nvarchar](15) NOT NULL,
    [REFNO] [decimal](6, 0) NOT NULL,
    [COMNT] [nvarchar](10) NOT NULL,
    [SHPPO] [nvarchar](15) NOT NULL,
    [AMONT] [decimal](13, 2) NOT NULL,
    [AMOUNT] [decimal](24, 8) NOT NULL,
    [REMAN] [decimal](13, 2) NOT NULL,
    [INREG] [decimal](3, 0) NOT NULL,
    [INSAL] [decimal](3, 0) NOT NULL,
    [TMCOD] [nvarchar](2) NOT NULL,
    [CRHLD] [nvarchar](1) NOT NULL,
    [CRLIM] [decimal](13, 0) NOT NULL,
    [CRDAY] [decimal](3, 0) NOT NULL,
    [TCRCD] [nvarchar](3) NOT NULL,
    [TEXRT] [decimal](11, 6) NOT NULL,
    [R1RGL] [decimal](13, 2) NOT NULL,
    [TAXAM] [decimal](13, 2) NOT NULL,
    [TFRTX] [decimal](13, 3) NOT NULL,
    [TFRGT] [decimal](13, 2) NOT NULL,
    [TSPCH] [decimal](13, 2) NOT NULL,
    [SPCST] [decimal](13, 2) NOT NULL,
    [IRPFT] [decimal](13, 2) NOT NULL
) ON [PRIMARY]
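Worth noting: DROP TABLE plus CREATE TABLE only recreates an empty shell; the 17K rows arrive via the saved SSIS package, so the package has to run as part of the schedule too. A hedged sketch of one arrangement (assuming an edition with SQL Server Agent, and a placeholder package path): a job with two steps, the first a T-SQL step holding the drop/create script above, the second an Operating system (CmdExec) step that runs the package:

dtexec /F "C:\Packages\Import_navar100.dtsx"

If the package was saved into the msdb store instead of a file, /SQL packagename replaces /F. A weekly schedule for early Monday morning removes the button-pressing.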
I have a SQL query that I would like to run every morning, with the output going to a CSV file. Unfortunately, I can't seem to find simple instructions on how to use SQL Server Agent to do that.
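A minimal sketch: make the Agent job step an Operating system (CmdExec) step that runs sqlcmd, which handles both the query and the file output in one line; server, query, and path are placeholders.

sqlcmd -S MYSERVER -E -Q "SET NOCOUNT ON; SELECT col1, col2 FROM MyDb.dbo.MyTable" -o "C:\exports\morning.csv" -s "," -W

-E uses Windows authentication, -s "," makes the output comma-separated, -W trims trailing spaces, and SET NOCOUNT ON keeps the rows-affected footer out of the file; add -h -1 if the column-header line is unwanted. Attach a daily schedule with a morning start time and the job does the rest.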
We have the following scenario: we receive CSV files every month, and SSIS packages were built to process the data. The following problems occur from time to time:
1. The structure of the CSV file changed (e.g. a column was added or removed)
2. There were no footers in the data, but footers started to appear
3. The date format changed (e.g. it used to be mm/dd/yyyy, but became mm.dd.yyyy)
4. The number format changed (e.g. from 2000 to 2,000)
Currently we have a person who manually opens each file and, using our "validation document", checks that none of these or similar problems occur. We would like to move away from this manual process if possible. I understand that items 3 and 4 could be caught by loading the data into a staging table with VARCHAR data types and performing validation before moving it any further (a sketch of this appears after the list of options below).
Item 2 is a bit questionable (meaning that, depending on the footer size, the SSIS load might fail or might not).
Item 1, however, is a sure failure for an SSIS package that loads the data directly into a table.
Thus I feel the two possible options are:
1. Create a custom script that runs through the file row by row, applies all the necessary validations, and reports an error, or continues if everything checks out
2. Use a 3rd-party tool to validate the files (semi-manually) before kicking off the SSIS processing.
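To make the staging-table idea concrete, a hedged T-SQL sketch with invented names: land every field as VARCHAR so nothing fails on load, then flag rows whose date or number fields will not convert (items 3 and 4) before anything moves on to the real tables.

-- everything lands as text, so format drift cannot break the load itself
CREATE TABLE dbo.StagingFeed (
    TradeDate varchar(20),
    Amount    varchar(30)
)

-- after loading, surface the rows that fail conversion under the server's settings
SELECT *
FROM dbo.StagingFeed
WHERE ISDATE(TradeDate) = 0
   OR ISNUMERIC(REPLACE(Amount, ',', '')) = 0

Item 1 can be caught the same way by comparing the file's header line against an expected column list before the data flow starts.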