Hello, SQL Server 2005 Enterprise and new hardware have been ordered for our department. We currently run SQL Server 2000 (sp4). We have almost 500 DTS packages, 293 Jobs, and 14 user databases with hundreds of objects within.
Is there any documentation out there on how to scrutinize a current system? I have searched, and most of what I can find addresses migration planning with the assumption that the databases, packages, jobs, security, etc. are ready to move over. We have a lot to think about before we can do that. We know we have redundancy problems (like view proliferation), table schema issues, obsolete DTS packages and jobs, and otherwise a host of opportunities to 'clean house' and/or improve. We would really like to get a handle on what we are migrating before we migrate.
If you have any ideas or resources you feel would be worth looking at, please share.
Hi, I have two jobs, J1 and J2, and each one has 10 steps. Now I want J2 to become the 11th step of J1, and I do not want to manually retype all the steps of J2 as steps 11-20 of J1. Is there an easy way to do this through T-SQL? sp_add_jobstep only works when the step is an OS command or a script, not another job.
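To make the question concrete, here is the kind of T-SQL I have been sketching (untested; only a few sysjobsteps columns are copied, so a complete copy would also carry retry settings, output files, flags, proxy info, and so on):

-- Rough sketch: append the steps of job J2 to the end of job J1.
-- Assumes the copied step names do not collide with existing J1 step names.
-- Note: J1's current last step must also be changed (sp_update_jobstep) so its
-- on-success action becomes "go to the next step" instead of "quit".
DECLARE @offset INT;

SELECT @offset = MAX(s.step_id)
FROM msdb.dbo.sysjobsteps AS s
JOIN msdb.dbo.sysjobs     AS j ON j.job_id = s.job_id
WHERE j.name = N'J1';

DECLARE @step_id INT, @new_step_id INT, @step_name SYSNAME,
        @subsystem NVARCHAR(40), @command NVARCHAR(MAX),   -- nvarchar(3200) on SQL Server 2000
        @database_name SYSNAME;

DECLARE step_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT s.step_id, s.step_name, s.subsystem, s.command, s.database_name
    FROM msdb.dbo.sysjobsteps AS s
    JOIN msdb.dbo.sysjobs     AS j ON j.job_id = s.job_id
    WHERE j.name = N'J2'
    ORDER BY s.step_id;

OPEN step_cur;
FETCH NEXT FROM step_cur
    INTO @step_id, @step_name, @subsystem, @command, @database_name;

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @new_step_id = @offset + @step_id;

    EXEC msdb.dbo.sp_add_jobstep
        @job_name          = N'J1',
        @step_id           = @new_step_id,
        @step_name         = @step_name,
        @subsystem         = @subsystem,
        @command           = @command,
        @database_name     = @database_name,
        @on_success_action = 3;   -- go to the next step; set the final copied step back to "quit with success" afterwards

    FETCH NEXT FROM step_cur
        INTO @step_id, @step_name, @subsystem, @command, @database_name;
END

CLOSE step_cur;
DEALLOCATE step_cur;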
I'm using SQL Server 7.0. I have a job which runs DTS packages (1 package per step). When a task fails within my DTS package, I'd like an error returned for that step in the job, thus stopping the job and not starting the next step (DTS package). As it stands right now, if a task fails within the DTS package, that step in the job still reports successful completion. Has anyone seen this before, and is there something I can do to get DTS to return a failure for that step in the job?
I am going to be moving multiple databases to a new server. Everything should go smoothly, but I need to change a lot of the DTS packages that reference the old server name and replace it with the database's DNS record.
Is there an easy way to get a list of which DTS packages reference the old server explicitly (rather than using the DNS record)?
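To show the kind of thing I am after, here is an untested sketch against msdb ('OLDSERVER' is a placeholder; since sysdtspackages.packagedata is a binary blob, the LIKE test is only a rough heuristic over the first few thousand characters, and a dependable scan would mean loading each package through the DTS object model and walking its connections):

-- Rough sketch, SQL Server 2000: list the latest version of each DTS package
-- and flag ones whose package data appears to contain the old server name.
SELECT p.name,
       p.owner,
       p.createdate,
       CASE WHEN CAST(CAST(p.packagedata AS VARBINARY(8000)) AS NVARCHAR(4000))
                 LIKE N'%OLDSERVER%'            -- old server name: placeholder
            THEN 'maybe' ELSE 'unknown' END AS references_old_server
FROM msdb.dbo.sysdtspackages AS p
JOIN (SELECT name, MAX(createdate) AS createdate
      FROM msdb.dbo.sysdtspackages
      GROUP BY name) AS latest
  ON latest.name = p.name
 AND latest.createdate = p.createdate
ORDER BY p.name;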
Hi! I try to connect to my local SQL Server instance but it always fails. Can you please tell me the steps and options to use to install SQL Server on my machine? I think I need to use the Personal Edition rather than Standard, as it will be on my machine and not on a server. Please help.
Can anybody tell me how many steps it's possible to put in one job? The reason I ask is that we have a job with over 500 steps (importing data from Excel files into SQL tables), and every time it runs, different steps fail.
Does the fact that the Excel file was dropped and recreated change the DTS ID?
Hi, I am new to replication. I have to replicate a database on SQL 7.0 SP5. It's going to be transactional. Is there an article that explains everything, step by step, from where to start to where to end? TIA.
Here is an interesting problem I can't figure out. I have a job with 6 steps as follows:
Step 1 - Import text file 1
Step 2 - Import text file 2
Step 3 - Delete all data from address tables 1 and 2
Step 4 - Copy data from imported table 1 to address table 1
Step 5 - Copy data from imported table 2 to address table 2
Step 6 - Delete imported text file tables 1 and 2
Now, when I run each of these steps individually, running the DTS packages and stored procedures myself, it all works fine and the data in my tables updates. But when I set the job to run automatically, it says completed with no errors, yet my data hasn't updated. The job must be doing something close to what it is meant to, as it took about 40 seconds, which is normal.
Generally speaking, when you want to optimise an application that relies on a database, what is the order of priority of the following optimization techniques?
a) optimizing the spread of the physical elements of the database across different disks of the server
b) optimizing the use of the RAM
c) optimizing the SQL
d) optimizing the OS
I've created SQL Server Agent jobs through management studio on SQL Server 2005. I can view and edit these jobs when I am logged into the server via remote desktop, but when trying to administer these jobs through Management Studio on a different machine, the steps do not appear in the job properties window. Anybody else see this behavior? Know why it occurs? Is it a bug, or another wonderful "feature" of Manglement Studio?
Welcome everybody. I want to publish my SQL 2005 server through my ISA 2004, so I did the following steps, and I want to know if anything in them is wrong or if another step is missing:
1) I edited the router configuration file to NAT requests on my real IP to the external interface of my ISA server.
2) On ISA I made a SQL publishing rule to forward requests to the IP of the SQL server (from: anywhere; to: IP of the SQL server; listener: external; protocol: Microsoft SQL Server; requests: appear from ISA, not the original client; ports: default port 1433).
3) On the SQL server I enabled remote and local connections over TCP only.
4) On the SQL server I enabled Remote Desktop.
5) On the SQL server I enabled the firewall and, on the exceptions tab, added Remote Desktop and port 1433.
But still, when I try to connect from the Internet using Management Studio Express with the real IP address (tcp:{my real ip address}) and the SQL login information, an error occurs and no connection is opened. Note: SCW was installed and I uninstalled it.
So what is the problem? Why can't SQL be published? I also made another ISA rule to allow Remote Desktop to my SQL server using the RDP protocol, but when I try a remote connection to the SQL server it fails, while connecting to any other internal server works successfully.
What in SSIS replaces DTS task steps? In DTS you could build tasks and assign them an order in which to execute. How is this done in the SSIS control flow? Thanks.
I am trying to create a SQL Agent job with 3 steps.
I want to delete three tables.
Step 1 ...delete table "X"
Step 2 ...delete table "X"
Step 3 ...delete table "X"
The problem is that after I create the three steps and start the job, it never seems to finish the first step and none of the other steps run; the job looks like it is executing and never finishes. If I break the job into three jobs, each completes fine. I need this to run as one job. Any ideas?
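For reference, here is a minimal T-SQL sketch of the three-step job I am trying to end up with (job, database and table names are placeholders, and the DROP TABLE commands stand in for my real delete steps; each non-final step is set to go to the next step on success):

USE msdb;

EXEC dbo.sp_add_job
    @job_name = N'Drop three tables';

EXEC dbo.sp_add_jobstep
    @job_name = N'Drop three tables', @step_id = 1,
    @step_name = N'Drop table X1',
    @subsystem = N'TSQL', @database_name = N'MyDatabase',
    @command = N'DROP TABLE dbo.X1;',
    @on_success_action = 3,   -- go to the next step
    @on_fail_action    = 2;   -- quit reporting failure

EXEC dbo.sp_add_jobstep
    @job_name = N'Drop three tables', @step_id = 2,
    @step_name = N'Drop table X2',
    @subsystem = N'TSQL', @database_name = N'MyDatabase',
    @command = N'DROP TABLE dbo.X2;',
    @on_success_action = 3,
    @on_fail_action    = 2;

EXEC dbo.sp_add_jobstep
    @job_name = N'Drop three tables', @step_id = 3,
    @step_name = N'Drop table X3',
    @subsystem = N'TSQL', @database_name = N'MyDatabase',
    @command = N'DROP TABLE dbo.X3;',
    @on_success_action = 1,   -- quit reporting success
    @on_fail_action    = 2;

-- The job must be attached to a server before it can run.
EXEC dbo.sp_add_jobserver
    @job_name = N'Drop three tables',
    @server_name = N'(local)';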
Hello, thanks for reviewing my question. I am trying to install SQL Server 2005, but I keep running into the same error: "SQL Server Setup failed to obtain system account information for the ASPNET account. To proceed, reinstall the .NET Framework, and then run SQL Server Setup again." The only documentation I can find on this is about configuring the isolation mode in IIS 6.0. Any help on this will be appreciated. Peter
Hi all, when creating steps in jobs, is it possible to execute many DOS commands (CmdExec) in one step, instead of creating several steps with a single DOS command in each? For example:
If I create a job executing those 3 commands in 3 separate steps, it works fine. But if I put all 3 DOS commands in one step, it won't work. Somehow, SQL doesn't 'understand' that it should execute each command after the end of the previous one, or I missed something here (apparently so!). I know that if I put all 3 DOS commands into a DOIT.BAT and execute that, it will work. But I want to use a SQL job to schedule it to run on a regular basis.
Has anyone run into this same problem? Thanks in advance. David Nguyen.
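One workaround I am considering (a sketch with a placeholder job name and placeholder commands, assuming the CmdExec step text is handed to cmd.exe as a single command line) is chaining the commands with cmd.exe's own operators in one step:

-- Sketch: one CmdExec step whose command chains three DOS commands.
-- "&&" only runs the next command if the previous one succeeded;
-- use "&" instead to run them unconditionally. Commands are placeholders.
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'MyNightlyJob',
    @step_name = N'Run three DOS commands',
    @subsystem = N'CmdExec',
    @command   = N'copy D:\in\*.txt D:\work\ && D:\tools\process.exe D:\work && del /q D:\work\*.txt';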
My end game is to automate some of my monthly queries in a Job in SQL Server Agent. Right now I have two metric tables. One table is the name and comment with the PK. The secondary table is attributes/detail, such as reporting month, target and actuals.
I am currently running all different types of queries to get the aggregates. I'd like to get these into a job so it would run automatically and update the reporting table.
Would you recommend making one step or multiple steps for each query? I am trying to use an intelligent approach to begin to load the tables.
Hi all, can anyone help me with this issue? I am new to SQL Server. By creating scripts from the server system, I got the tables without data on my system, but I want to get the full database as it exists on the server. I have to convert my existing MS Access front-end and SQL back-end app into an ASP.NET web app with a SQL Server back end. The version is 2000. Please help, it's urgent.
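To be clear, what I am after is a full copy of the database rather than schema-only scripts; a minimal backup/restore sketch of that (database name, paths and logical file names are placeholders, and the logical names would really come from RESTORE FILELISTONLY) would be something like:

-- On the server: take a full backup of the database.
BACKUP DATABASE MyAppDb
TO DISK = N'C:\Backups\MyAppDb.bak'
WITH INIT;

-- On my machine: restore it, relocating the files to local paths.
RESTORE DATABASE MyAppDb
FROM DISK = N'C:\Backups\MyAppDb.bak'
WITH MOVE N'MyAppDb_Data' TO N'C:\Data\MyAppDb.mdf',
     MOVE N'MyAppDb_Log'  TO N'C:\Data\MyAppDb_log.ldf',
     RECOVERY;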
Hi. I have an 'Attendance' table like this:

PIN  Year  Category    Days
1    2006  Authorized  1
1    2006  Available   2
1    2006  Personal    3
2    2006  Authorized  4
2    2006  Available   5
2    2006  Personal    6
3    2006  Authorized  7
3    2006  Available   8
3    2006  Personal    9
4    2006  Authorized  10
4    2006  Available   11
4    2006  Personal    12
1    2007  Authorized  13
1    2007  Available   14
1    2007  Personal    15
2    2007  Authorized  16
2    2007  Available   17
2    2007  Personal    18
3    2007  Authorized  19
3    2007  Available   20
3    2007  Personal    21
4    2007  Authorized  22
4    2007  Available   23
4    2007  Personal    24

I need to sum the days by PIN, Year and Category (that's easy...) AND obtain a layout like this:

PIN  Auth 2006  Avail 2006  Pers 2006  Auth 2007  Avail 2007  Pers 2007
1    1          2           3          13         14          15
2    4          5           6          16         17          18
3    7          8           9          19         20          21
4    10         11          12         22         23          24

How can I do this with queries without writing too many intermediate steps? What I have done is this (5 queries: 2, 3, and 4 building on top of 1, and 5 building on 2, 3, 4).

1 = Table1_Crosstab:

TRANSFORM Sum(Table1.Days) AS SumOfDays
SELECT Table1.PIN, Table1.Year
FROM Table1
GROUP BY Table1.PIN, Table1.Year
PIVOT Table1.Category;

Then, based on that,

2 = Authorized:

TRANSFORM First([1 = Table1_Crosstab].Authorized) AS FirstOfAuthorized
SELECT [1 = Table1_Crosstab].PIN
FROM [1 = Table1_Crosstab]
GROUP BY [1 = Table1_Crosstab].PIN
PIVOT [1 = Table1_Crosstab].Year;

3 = Available:

TRANSFORM First([1 = Table1_Crosstab].Available) AS FirstOfAvailable
SELECT [1 = Table1_Crosstab].PIN
FROM [1 = Table1_Crosstab]
GROUP BY [1 = Table1_Crosstab].PIN
PIVOT [1 = Table1_Crosstab].Year;

and

4 = Personal:

TRANSFORM First([1 = Table1_Crosstab].Personal) AS FirstOfPersonal
SELECT [1 = Table1_Crosstab].PIN
FROM [1 = Table1_Crosstab]
GROUP BY [1 = Table1_Crosstab].PIN
PIVOT [1 = Table1_Crosstab].Year;

and finally

5 = All:

SELECT [2 = Authorized].PIN,
       [2 = Authorized].[2006] AS [Auth 2006],
       [3 = Available].[2006] AS [Avail 2006],
       [4 = Personal].[2006]  AS [Pers 2006],
       [2 = Authorized].[2007] AS [Auth 2007],
       [3 = Available].[2007] AS [Avail 2007],
       [4 = Personal].[2007]  AS [Pers 2007]
FROM ([2 = Authorized]
      INNER JOIN [3 = Available] ON [2 = Authorized].PIN = [3 = Available].PIN)
      INNER JOIN [4 = Personal]  ON [3 = Available].PIN  = [4 = Personal].PIN;

It works, but... I am sure that this is an awkward way of doing it. Is there any other, more elegant, way, please? Besides, what if I had not 3, but 15 categories, for example?

Thanks a lot for your time reading this, Alex
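One alternative I have been wondering about is collapsing the whole thing into a single query with conditional aggregation, roughly like the sketch below (written in SQL Server syntax against the Table1 used above; in Access/Jet the CASE expressions would become IIf(...), and each additional category or year is just one more column expression):

-- Sketch: the whole crosstab as one query via conditional aggregation.
SELECT PIN,
       SUM(CASE WHEN [Year] = 2006 AND Category = 'Authorized' THEN [Days] END) AS [Auth 2006],
       SUM(CASE WHEN [Year] = 2006 AND Category = 'Available'  THEN [Days] END) AS [Avail 2006],
       SUM(CASE WHEN [Year] = 2006 AND Category = 'Personal'   THEN [Days] END) AS [Pers 2006],
       SUM(CASE WHEN [Year] = 2007 AND Category = 'Authorized' THEN [Days] END) AS [Auth 2007],
       SUM(CASE WHEN [Year] = 2007 AND Category = 'Available'  THEN [Days] END) AS [Avail 2007],
       SUM(CASE WHEN [Year] = 2007 AND Category = 'Personal'   THEN [Days] END) AS [Pers 2007]
FROM Table1
GROUP BY PIN
ORDER BY PIN;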
Over the last few days I built SSIS packages for a warehouse (WH) database, and now I need to write documentation defining the packages' steps. I need to write the documentation in a professional way. Can anyone help me with this, for example with a sample document, a concept, or an approach?
I have a few reports. I want to run each report as one step of a SQL Agent job. Only if a report succeeds should the job move to the next step. Is that possible?
I am able to make separate jobs for separate reports but not as steps of one job.
I am taking my first steps with the new Service Broker. I (think I) understand what it can do, so I executed some example scripts I found on the Internet, but with none of them do I receive any message in a queue. Can somebody help me? The example I use is below.
----------------------------------------
USE AdventureWorksDW
GO

CREATE MESSAGE TYPE HelloMessage
VALIDATION = NONE
GO

CREATE CONTRACT HelloContract
    (HelloMessage SENT BY INITIATOR)
GO

CREATE QUEUE SenderQueue
CREATE QUEUE ReceiverQueue
GO

CREATE SERVICE Sender ON QUEUE SenderQueue
CREATE SERVICE Receiver ON QUEUE ReceiverQueue (HelloContract)
GO

DECLARE @conversationHandle UNIQUEIDENTIFIER
DECLARE @message NVARCHAR(100)

BEGIN
    BEGIN TRANSACTION;

    BEGIN DIALOG @conversationHandle
        FROM SERVICE Sender
        TO SERVICE 'Receiver'
        ON CONTRACT HelloContract;

    SET @message = N'Hello, World';

    SEND ON CONVERSATION @conversationHandle
        MESSAGE TYPE HelloMessage (@message);

    COMMIT TRANSACTION
END
GO
-----------
From here I do a SELECT * FROM the queue, and 0 rows are returned:

select * from ReceiverQueue

or

RECEIVE CONVERT(NVARCHAR(max), message_body) AS message FROM ReceiverQueue
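A few checks I have collected while troubleshooting are sketched below: whether the broker is enabled in the database, whether the message is stuck in the transmission queue, and retrying the dialog with ENCRYPTION = OFF, since without a database master key the default dialog encryption can leave messages sitting in sys.transmission_queue.

-- Is Service Broker enabled in this database at all?
SELECT name, is_broker_enabled
FROM sys.databases
WHERE name = N'AdventureWorksDW';
-- If not: ALTER DATABASE AdventureWorksDW SET ENABLE_BROKER;

-- Did the message get stuck on its way out, and why?
SELECT to_service_name, transmission_status
FROM sys.transmission_queue;

-- Retry the dialog without encryption to rule out the missing-master-key case.
DECLARE @conversationHandle UNIQUEIDENTIFIER;
DECLARE @message NVARCHAR(100);
SET @message = N'Hello, World';

BEGIN DIALOG CONVERSATION @conversationHandle
    FROM SERVICE Sender
    TO SERVICE 'Receiver'
    ON CONTRACT HelloContract
    WITH ENCRYPTION = OFF;

SEND ON CONVERSATION @conversationHandle
    MESSAGE TYPE HelloMessage (@message);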
Hi, I have one data flow with 10 sources and destinations in the flow. For the sources I'm using the DataReader source over ODBC, and for the destinations I'm using the OLE DB destination. By default these run in parallel when executed. Is there a way within the data flow to run them step by step, instead of creating 10 different data flow tasks in the control flow? Or is it better to have 10 different data flow tasks?
Good evening! First and foremost, any help would be greatly appreciated! My question is how to use a temp table within two (2) separate called stored procedures. I am using MS Reporting Services and I call a stored procedure that creates a temp table and executes some summarization code, so the scenario is as follows:
1. I select the first Reporting Services report from a menu item, which creates the temp table (within a stored procedure), and the report is displayed.
2. Within report (1) I have defined a navigation link to call another report, which uses a stored procedure from which I would like to access the temp table created in step 1.
I know that #Temp_Table will only be available within step 1, and I have tried to use ##Temp_Table, which should be available for my session. Maybe I am not understanding this correctly, in that I thought a ##Temp_Table would be available for my session until no other T-SQL scripts are trying to access it or I sign off. I don't really want to create a permanent table and have it "hang around" until someone selects the menu option again, at which point the T-SQL would drop the table. Any insight or help is sincerely appreciated! Best regards
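To illustrate the scoping I am fighting with, here is a small sketch (procedure and table names are made up): a #temp table created inside a procedure is dropped as soon as that procedure returns, while a ##temp table is visible to other sessions only as long as the creating session stays connected and someone still references it, which matters because Reporting Services may run the two reports on different connections.

-- Local temp table: dropped as soon as the creating procedure returns.
CREATE PROCEDURE dbo.usp_BuildSummary
AS
BEGIN
    CREATE TABLE #Summary (Id INT, Total MONEY);
    INSERT INTO #Summary VALUES (1, 100);
    SELECT * FROM #Summary;       -- works inside the procedure
END;
GO
EXEC dbo.usp_BuildSummary;
-- SELECT * FROM #Summary;        -- would fail here: #Summary no longer exists
GO

-- Global temp table: visible to other sessions, but only while the creating
-- session is still connected (and something still references it).
CREATE PROCEDURE dbo.usp_BuildSummaryGlobal
AS
BEGIN
    IF OBJECT_ID('tempdb..##Summary') IS NOT NULL
        DROP TABLE ##Summary;
    CREATE TABLE ##Summary (Id INT, Total MONEY);
    INSERT INTO ##Summary VALUES (1, 100);
END;
GO
EXEC dbo.usp_BuildSummaryGlobal;
SELECT * FROM ##Summary;          -- works from this or another open session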
Previously I was populating my DataMart and then processing my cubes via 2 separate jobs scheduled one after the other, with about 30 minutes of idle time in between (just in case).
I have now tried to make a single job with both of the above as 2 steps (one after the other).
However, when I was done and right-clicked to run the newly created job, instead of what I was hoping for (that the job would start, perform the 1st step, then the 2nd), it popped up a dialog box asking (I think) which step should be executed. :(
Is there a solution so that when the job is scheduled to run, it starts performing the steps in order rather than waiting for someone to select which step to perform?
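In case it matters, here is a T-SQL sketch of how I would like to kick the job off, where it simply begins at the job's defined start step (the job name is a placeholder):

-- Start the job at its defined start step.
EXEC msdb.dbo.sp_start_job @job_name = N'Load DataMart and process cubes';

-- Optionally start at an explicit step instead:
-- EXEC msdb.dbo.sp_start_job @job_name  = N'Load DataMart and process cubes',
--                            @step_name = N'Process cubes';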
I have set up my confirmation system to work as follows:
1) User registers.
2) Registration goes into a temp table with a code.
3) User gets an MD5 code in his inbox.
4) User clicks back.
5) The click-back page moves the registration data from the TEMP table to the USER table and sends the user to the sign-in page.
6) ...
Now, my question is: how do I handle this final step in account confirmation? Should the first sign-in act as the final confirmation of the user's account? This makes sense. Should the data in the USER table self-delete after a day if there has been no first sign-in? Should I have a column in the USER table to show whether the first login has happened? How should I do this?
I am sure I could mess around with this but it would be great to get feedback from somebody that has done this multiple times and has a sense of what the best practice is (based on large volume examples).
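To make the options concrete, here is a rough SQL sketch of the two ideas above (table and column names are hypothetical): stamp the first sign-in on the USER table, and let a scheduled cleanup remove accounts that never signed in within a day.

-- Hypothetical column: track whether/when the first sign-in happened.
ALTER TABLE Users ADD FirstLoginAt DATETIME NULL;

-- On the first successful sign-in, stamp the account as confirmed.
DECLARE @UserId INT;
SET @UserId = 42;   -- placeholder for the signed-in user's id
UPDATE Users
SET FirstLoginAt = GETDATE()
WHERE UserId = @UserId
  AND FirstLoginAt IS NULL;

-- Scheduled cleanup (e.g. a daily SQL Agent job): drop accounts that were
-- moved over from the temp table but never signed in within a day.
DELETE FROM Users
WHERE FirstLoginAt IS NULL
  AND CreatedAt < DATEADD(day, -1, GETDATE());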
I am going nuts trying to get this to work. Maybe someone can help me. I am running sql server 2000 and am using a dts package. The package runs fine on sql server. When I access it using asp.net I get the following error: ---------------- The execution of the following DTS Package succeeded:
Step 'Copy Data from Sheet1$' to [dbname].[dbo].[Maps] Step' failed
Step Error Source: Microsoft JET Database Engine Step Error Description:Failure creating file. Step Error code: 80004005 Step Error Help File: Step Error Help Context ID:5003436
Step Execution Started: 6/7/2006 3:33:12 PM Step Execution Completed: 6/7/2006 3:33:17 PM Total Step Execution Time: 5.015 seconds Progress count in Step: 0 ---------- I searched through several forums and found that this seems to be a permissions problem. I set the IIS process to Low, I also added the <identity impersonate="true" /> tag to my web.config file. After adding the tag I get a new error message: ---------- No Steps have been defined for the transformation Package. Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Runtime.InteropServices.COMException: No Steps have been defined for the transformation Package. ------------------
I am at a standstill trying to get any further. Can anyone tell me what else I might be able to try to resolve this problem? Here is my code for executing the package: -----------------
Sub Page_Load(Src As Object, E As EventArgs)

    Dim cnnstring As String = "Data Source=NS32;Initial Catalog=dbname;Pooling=False;Min Pool Size=100;Max Pool Size=200;User ID=userid;Password=password"
    Dim cnn As SqlConnection
    Dim cmd As SqlCommand
    Dim rs As SqlDataReader
    Dim sql As String = "Truncate Table Maps"

    'Empty Equipment Contract Pricing table
    cnn = New SqlConnection(cnnstring)
    cnn.Open()
    cmd = New SqlCommand(sql, cnn)
    cmd.ExecuteNonQuery()   'run the TRUNCATE (this call was missing, so the table was never truncated)
    sql = "DELETE FROM Maps WHERE Name IS NULL"
    cmd = New SqlCommand(sql, cnn)
    rs = cmd.ExecuteReader()

    'check to see if table is empty
    If rs.HasRows Then
        Response.Write("<p><b>Failed to empty table.</b></p>")
    Else
        Response.Write("<p><b>Table successfully emptied.</b><br><br>Importing Data...<br>Please wait...</p>")
    End If
    rs.Close()

    'declare variables for DTS (late-bound, so Option Strict must be off)
    Dim objDTSPackage, objDTSStep, strResult, blnSuccess
    Const DTSSQLStgFlag_Default = 0
    Const DTSStepExecResult_Failure = 1

    'Use the package stored on sql server to import data
    objDTSPackage = Server.CreateObject("DTS.Package")
    blnSuccess = True

    'Load package from sql server
    objDTSPackage.LoadFromSQLServer("NS32", "user", "pass", DTSSQLStgFlag_Default, "pass", "", "", "MapsImport")
    'Explanation: LoadFromSQLServer ("ServerName", "Username", "Password", "Flags", "PackagePassword", "PackageGUID", "PackageVersionGUID", "Package Name", "PersistsHost")

    objDTSPackage.Execute()

    'walk through steps and check for errors
    For Each objDTSStep In objDTSPackage.Steps
        If objDTSStep.ExecutionResult = DTSStepExecResult_Failure Then
            strResult = strResult & "Package " & objDTSStep.Name & " failed.<br><br>"
            blnSuccess = False
        Else
            strResult = strResult & "Package " & objDTSStep.Name & " succeeded.<br><br>"
        End If
    Next

    'display success or failure message
    If blnSuccess Then
        Response.Write("<p><b>Package Succeeded.</b></p>")
    Else
        Response.Write("<p><b>Package Failed.</b></p>")
    End If

End Sub
I have a problem with a SQL Server (2000) Agent job with 3 steps that doesn't produce the desired outcome when the job is invoked as a whole, but does produce it if the first step is split off from the last two steps.
1) The first step uses third party software to generate PDFs to a folder.
2) The second step executes the following command to ftp the PDFs to a folder and move these PDFs to a backup folder:
3) The third step determines whether any files remain in the folder that contained the PDFs. If files are still in the folder, then the FTP-and-move step (step 2) did not work, and a stored procedure is invoked to send an email to the appropriate administrator:
DECLARE @result int
EXEC @result = master..xp_cmdshell 'Dir "\\networkdrive\Archive\General\*.pdf" | find /i "file"'
IF @result = 0
    EXEC ibc_sp_Email_Report_Failure 'FTP General Report'
When I invoke this job from Enterprise Manager and view the job history, it says that all steps executed successfully (which I understand may be the case even if the files were not FTPed or moved, since the command can still return a code indicating success even though it didn't do what I expected). In fact, the PDF is generated and written to \\networkdrive\Archive\General\reportname.pdf. Steps 2 and 3 do not do what I expect: the PDFs still remain in the folder.
But when I start the job from step 2, the files are then moved. So, invoking the entire job does not move the files; invoking the job from step 2 moves the files.
Simple, you think, it is obvious that you didn't set up your job steps correctly in that step 1 does not go to the next step upon success. But I already checked that. Step 1 goes to the next step upon success.
Anybody ever come upon this problem? Any suggestions as to what else I can look at?
I have a job that runs every half an hour and has about 30 steps. If a step fails, it moves on to the next step (the steps are not dependent on one another). However, I would like to receive notification if that occurs. I have SQL 2005, with Database Mail enabled and set up (it works fine for other jobs). The question is: how do I get it to send me an email if a single step fails, even though the job as a whole succeeds?
Do I need to set up a step between each job?
Step 1: An actual step; on failure: go to step 2; on success: go to step 3.
Step 2: A step that emails the failure of step 1, then goes to step 3.
Step 3: An actual step.
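As a sketch of that layout for one pair of steps (job, step, database, mail profile and recipient names are placeholders, and the real work steps are assumed to exist already), the failure branch of the real step can jump to an email step built on sp_send_dbmail while the success branch skips over it:

-- 1) Add an email step after the real step (added here as step 2 for the sketch).
EXEC msdb.dbo.sp_add_jobstep
    @job_name      = N'HalfHourlyJob',
    @step_id       = 2,
    @step_name     = N'Email failure of step 1',
    @subsystem     = N'TSQL',
    @database_name = N'msdb',
    @command       = N'EXEC msdb.dbo.sp_send_dbmail
                           @profile_name = N''MailProfile'',
                           @recipients   = N''dba@example.com'',
                           @subject      = N''HalfHourlyJob: step 1 failed'';',
    @on_success_action = 3,    -- carry on with the next real step
    @on_fail_action    = 3;

-- 2) Rewire step 1: on success skip the email step, on failure run it.
EXEC msdb.dbo.sp_update_jobstep
    @job_name = N'HalfHourlyJob',
    @step_id  = 1,
    @on_success_action = 4, @on_success_step_id = 3,
    @on_fail_action    = 4, @on_fail_step_id    = 2;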