Output File (.txt) Running Package From SQL Server Agent
Mar 19, 2008
Hello,
I have created a package that runs without problem.
I run the package with the command dtexec /F "package_name.dtsx" > package_name.txt.
Then I run the same package from SQL Server Agent, and everything is OK.
Then I tried to edit the command line to add the output file, but I got an error.
The command line is:
dtexec /F "package_name.dtsx" /MAXCONCURRENT "-1" /CHECKPOINTING OFF /REPORTING E > package_name.txt
(/MAXCONCURRENT "-1" /CHECKPOINTING OFF /REPORTING E are added by default)
SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft JET Database Engine" Hresult: 0x80004005 Description: "'I:\MyFolder\MyRpts\MyApplication\MyDatabase.mdb' is not a valid path. Make sure that the path name is spelled correctly and that you are connected to the server on which the file resides.".
The above path "I:...." is mapped on the server that SQL Server is running on.
This package runs with NO problems in MS Visual Studio.
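For illustration only: the Agent service does not see drive letters mapped in an interactive session, so a sketch of the job step as a CmdExec command, using UNC paths everywhere, might look like the following. The job name, package path and share below are placeholders (and the same substitution would apply to the I:\ path inside the package's Jet connection string); it assumes the job itself already exists.

-- Hypothetical CmdExec step: run dtexec and redirect console output to a
-- text file, referencing everything by UNC path instead of a mapped drive.
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'Run package_name',              -- placeholder job name
    @step_name = N'dtexec with output file',
    @subsystem = N'CmdExec',
    @command   = N'dtexec /F "D:\Packages\package_name.dtsx" /MAXCONCURRENT "-1" /CHECKPOINTING OFF /REPORTING E > "\\ServerName\Share\package_name.txt"';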
Okay, I've got an ETL package which works flawlessly; we created a scheduled job for it which also worked. This ETL accesses a flat file located on a server at a different location in the same domain.
Last week we started a migration to a new network. The location where the flat file is stored has already moved to the new domain, but the SQL Server Agent is still in the old domain. There is a trust between the two domains, and the network user from the old domain has been added on the server in the new domain where the flat file is located.
If I run the ETL package manually it works flawlessly, but if I run it using SQL Server Agent it won't run anymore.
I get the following error:
Executed as user: OldDomain\dwh. ...n 9.00.1399.06 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 21:15:00 Error: 2007-10-02 21:15:02.10 Code: 0xC001401E Source: Rolmer_Product Connection manager "Flatfile Product" Description: The file name "\\fs-cha-fs01\dwh\produit.csv" specified in the connection was not valid.
Don't tell me it's a problem related to the user that the SQL Server Agent is set to, because it's set to OldDomain\dwh, which is the same user I use to log in on the server, where I can run the package manually without problems. The problem only occurs when I run it using the SQL Server Agent, which uses that same user to execute packages.
This is a very difficult problem; I doubt anyone here will be able to crack this nut, but if they do I'll be very impressed, because it would take a really smart network admin with SQL experience to figure this one out...
I have set up a step in SQL Server Agent to run an SSIS package that I have created. However the step fails straight away and refers me to the history log, which doesn't seem to show what the problem is.
I've tried running the package manually through dtexec.exe and it runs through fine. Does anyone know what the problem could be?
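For anyone hitting the same wall: the summarized history view often truncates the real error, but the full step message is kept in msdb and can be pulled out with something like this (the job name is a placeholder):

-- Pull the complete message text for every step execution of a given job.
SELECT  j.name      AS job_name,
        h.step_id,
        h.step_name,
        h.run_date,
        h.run_time,
        h.run_status,          -- 0 = failed, 1 = succeeded
        h.message
FROM    msdb.dbo.sysjobhistory AS h
JOIN    msdb.dbo.sysjobs       AS j ON j.job_id = h.job_id
WHERE   j.name = N'MyJob'      -- placeholder job name
ORDER BY h.instance_id DESC;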
My package connects to an external data provider using an OLE DB driver. The package runs fine in debug mode. When I tried to run the same package from SQL Server Agent it failed to acquire the connection. The OLE DB provider does not contain much information (connection string, initial catalog, blank user name and password). The same package executes successfully if I run dtexec from a BAT file. But if I use dtexec in a SQL Server job step as an operating system command, the job fails, reporting "cannot acquire the connection".
I'm trying to run a task that executes a script file (cmd). When I run it within BIDS with my own user (domain admin) it works. When I start a cmd prompt and try to run the cmd file directly from the network location where it resides, it works (both with my own rights and with the SQL Server Agent user).
Now when I try to run it from SSMS > Agent Jobs > Job and run the job, it never completes. I'm not getting any error message either; it just keeps running on that step. It seems like a rights issue, but the account running the SQL Server Agent is able to execute the cmd file directly from the command prompt.
There are no errors in any error logs anywhere and no error is displayed...
P.S. I'm running the job step as an Integration Services package.
I recently installed a standalone version of SQL 2014 Standard on my work computer. I used Access before, but I want to use a SQL server instead.
We have a shared drive where a file gets deposited every day at midnight. I want to be able to get this file and import it into the server (it's basically a list of names).
Here is what I have done so far:
I created the database
Created the file and successfully imported data into it using the Import Data feature.
I saved the SSIS package
Scheduled an Agent job for this package to run at a certain time, daily
At first the jobs would fail with an Access Is Denied error. I added a user under Credentials with my network account (I have admin rights on the work computer). I also added a Proxy for the Credential user I made.
Jobs fail with a “Cannot open data file” error. I tried changing things here and there, but I can’t get it to work.
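For reference, a minimal sketch of the credential/proxy setup described above, assuming the package step runs under the SSIS subsystem and the shared drive is referenced by UNC path rather than a mapped drive letter; every name, account and password below is a placeholder:

-- Create a credential for the Windows account that can read the share.
USE master;
CREATE CREDENTIAL ImportFileCredential
    WITH IDENTITY = N'MYDOMAIN\myaccount',     -- placeholder account
         SECRET   = N'account-password';        -- placeholder password

-- Create an Agent proxy on top of the credential.
USE msdb;
EXEC dbo.sp_add_proxy
    @proxy_name      = N'ImportFileProxy',
    @credential_name = N'ImportFileCredential',
    @enabled         = 1;

-- Allow the proxy to run SSIS package execution steps (subsystem_id 11).
EXEC dbo.sp_grant_proxy_to_subsystem
    @proxy_name   = N'ImportFileProxy',
    @subsystem_id = 11;

-- Point the existing job step at the proxy instead of the Agent service account.
EXEC dbo.sp_update_jobstep
    @job_name   = N'Daily name list import',    -- placeholder job name
    @step_id    = 1,
    @proxy_name = N'ImportFileProxy';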
I'm testing my IS package on my local PC; it extracts data from one SQL Server table to another SQL Server database. In BIDS, it runs without error. I also tried it in cmd with DTExec /f "c:\package.dts", and it also ran OK. I tried assigning it to SQL Agent as a job with the same DTExec command. This time it ran successfully but got 0 rows. I don't know what went wrong, or why it suddenly cannot get data from the source.
I'm new here and hope you will be able to help me.
I have created several SSIS packages with Visual Studio 2005. They all work fine in debug mode. I have been able to make them work with an ODBC connection by using an ADO.NET connection.
Then I exported them to the file system on my SQL Server 2005 server and created a task in SQL Agent to run them.
All the packages using the ODBC connection fail with the following error :
Login failed for user 'XXX'. Error: 18456; Severity: 14; State: 8
This error is a password mismatch.
I tried several database users and checked the passwords multiple times.
It looks like SQL Agent is not able to retrieve the password, although it is stored in both the ODBC connection and the SSIS connection.
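One purely illustrative workaround sketch, assuming the password is being stripped by the package protection level when the package is exported: supply the ODBC connection string (including the password) on the dtexec command line from a CmdExec job step, instead of relying on what is stored in the package. The job name, connection manager name, DSN, user and password below are all invented, and the job is assumed to exist already.

-- Hypothetical CmdExec step overriding the connection at run time with /CONNECTION.
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'Run ODBC package',          -- placeholder job name
    @step_name = N'dtexec with explicit connection string',
    @subsystem = N'CmdExec',
    @command   = N'dtexec /F "D:\Packages\MyPackage.dtsx" /CONNECTION "MyOdbcConnection";"Dsn=MyDsn;Uid=XXX;Pwd=secret;"';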
I have a number of packages that I have moved from an old server. Each package was scheduled with a SQL Agent job. On the old server everything ran fine. All of the packages run fine from VS, from DTEXECUI and I have tried one from the command line with DTEXEC and it worked.
When I run from the SQL Agent job, I don't get a failure, the package just hangs. I let one of the agent jobs sit for an hour with no progress. The package typically takes about 15 minutes to complete.
Below is the output from my package log up to the point that it hangs:
Thanks for the valuable information all the time!!! Saved me a lot of time...
Our team develops a system for text data improvement using SSIS.
We have a few heavy packages that we want to execute on two separate "SSIS servers" that will be dedicated to running these packages only, and repeatedly. The main SQL Server will be placed on a different machine.
My question is:
What is the best way to do this?
If we schedule these packages as SQL Agent jobs, does that mean the packages will be executed on the SQL Server machine (which is not what we want)? Or could we define a remote machine to run the package on, and specify our "SSIS servers"?
Or should we use a simple scheduler on the "SSIS servers" with a dtexec command? But then I lose the benefits of using SQL Agent, such as logging, notifications, etc.
Do we need to install SQL Server on these "SSIS servers"?
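For what it's worth, a SQL Agent job step executes on the machine whose Agent runs it, so one hedged sketch is to create the job on each "SSIS server"'s own Agent and have the step pull the package from the main server's msdb. Server, job and package names below are placeholders.

-- Run this on each "SSIS server" (its own Agent), so the dtexec process runs
-- on that machine while the package stays stored in msdb on the main server.
EXEC msdb.dbo.sp_add_job       @job_name = N'Heavy text package';
EXEC msdb.dbo.sp_add_jobstep   @job_name  = N'Heavy text package',
                               @step_name = N'Execute package locally',
                               @subsystem = N'CmdExec',
                               @command   = N'dtexec /SQL "\HeavyTextPackage" /SERVER "MainSqlServer"';
EXEC msdb.dbo.sp_add_jobserver @job_name = N'Heavy text package', @server_name = N'(LOCAL)';

Running dtexec this way needs at least Integration Services installed on the "SSIS servers"; using the local Agent additionally needs a Database Engine instance there. Otherwise the same dtexec command could be run from Windows Task Scheduler, at the cost of the Agent's logging and notifications.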
I'm trying to deploy an SSIS package and am having issues getting it to execute from a SQL Agent job. I've read this article http://support.microsoft.com/?kbid=918760 over and over, and although it's helpful I must still be missing something.
The SSIS package uses a configuration file for connection string information and the ProtectionLevel is set to DontSaveSensitive. I'm using a proxy account with the appropriate permissions. I can execute the package from SQL Management Studio and from dtexec.exe just fine. I even copied the command line out of the job and used that to execute it with dtexec.exe and it worked.
I'm not sure what to try next. It's like it's not even getting to the SSIS package. I have logging turned on in the package and I'm not getting back any errors from it. Any ideas? One thing that I think is strange is that the job is running on an instance of SQL but Integration Services doesn't seem to support instances. Could that be the issue?
I don't know what else to try at this point...any advice is greatly appreciated!
I wonder if anybody can shed any light on this problem. I have a SQL Agent job which has three steps, each step runs an SSIS package.
The job is scheduled to start at 11.00 pm, which it does successfully. However, it has been taking between 2 and 3 hours to run, which is way longer than it should.
When I've looked at the logging, I've found that although the job starts at 11.00 pm, the first package (in job step 1) does not start executing until about 11.30. It finishes in about 5 minutes, then there is about an hour's delay before the second package (in job step 2) starts. This finishes in about 10 minutes, then there is another hour's delay before the third package (in job step 3) starts.
I've tried configuring the steps as SSIS jobs, and also as cmd jobs using dtexec, both exhibit the same behaviour.
Any ideas about what could be causing this delay? The packages are stored in msdb on the same server as the SQL Agent job, if that makes any difference.
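To pin down where the time goes, a query along these lines against msdb shows the recorded start time and duration of each step (the job name is a placeholder):

-- run_time and run_duration are stored as integers in HHMMSS form, which is
-- enough to see where the gaps between steps are.
SELECT  h.step_id,
        h.step_name,
        h.run_date,
        h.run_time,        -- step start time (HHMMSS)
        h.run_duration,    -- step duration (HHMMSS)
        h.run_status
FROM    msdb.dbo.sysjobhistory AS h
JOIN    msdb.dbo.sysjobs       AS j ON j.job_id = h.job_id
WHERE   j.name = N'Nightly SSIS job'    -- placeholder job name
ORDER BY h.instance_id DESC;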
1) Gave the SQL Agent account permissions to the msdb database and other databases
2) Checked the paths for "Data Sources" under the properties of my agent job in the SSIS properties
3) Set the authentication to Windows and used the SQL Agent account (a domain account that I set up) to run the agent. I also tried my login. So it's not a permission issue, because I do not get permission errors...it's past this.
What works:
1) Other SQL Agent jobs...they are not running an SSIS package, just SQL
2) Running the SSIS package by right-clicking on it in Management Studio. No errors found in the execution, and the package does its intended function
3) Package runs fine with absolutely no errors in VS 2005
What doesn't work:
1) My agent job that points to my SSIS package. All this job does is run that package. When I run it, I get a minimal error message in the job history logs stating the following:
03/09/2006 12:49:58,Run EN Process,Error,0,BG-22SQL,Run EN Process,(Job outcome),,The job failed. The Job was invoked by User domain\my_account. The last step to run was step 1 (Run EN SSIS Package).,00:00:02,0,0,,,,0
03/09/2006 12:49:58,Run EN Process,Error,1,BG-22SQL,Run EN Process,Run EN SSIS Package,,Executed as user: domain\sqlagent_account. The package execution failed. The step failed.,00:00:01,0,0,,,,0
When I run a package from a command window using dtexec, the job immediately says success. DTExec: The package execution returned DTSER_SUCCESS (0). Started: 3:37:41 PM Finished: 3:37:43 PM Elapsed: 2.719 seconds
However the job is still in the Agent and the status is executing. The implications of this are not good. Is this how the SQL Server Agent job task is supposed to work by design?
I created a SSIS package which will generate an output file and place it on a remote fileshare location which will look something like this
\\RemoteServerName\RemoteFilePath
The package executes fine when I run it through BIDS or through the Execute Package Utility, writing the output file to the remote file share location.
I created a SQL job for the package and ran the job. It then throws an error saying:
Executed as user: Domain\User. Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 10:33:06 AM Error: 2008-03-10 10:33:22.22 Code: 0xC020200E Source: DFT_Generate Output File Description: Cannot open the datafile "\\RemoteServerName\RemoteFilePath\OutputFileName.txt". End Error Error: 2008-03-10 10:33:22.34 Code: 0xC004701A Source: DFT_Generate Output File DTS.Pipeline Description: component "FF_DST_Output" (160) failed the pre-execute phase and returned error code 0xC020200E. End Error DTExec: The package execution returned DTSER_FAILURE (1). Started: 10:33:06 AM Finished: 10:33:22 AM Elapsed: 15.891 seconds. The package execution failed. The step failed.
Domain\User has all the permissions on the remote file share location. SQL Server Agent is running with the log-on account Domain\User (same as above).
I've created a stored procedure that creates a script to create a number of objects within the database (based on what existing objects are in the database). From Management Studio, this works fine, and the output is exactly as I want it.
I'm now trying to create a job that will execute this stored procedure, and deposit the results into a file somewhere on the server. When the job runs, the script is created in the correct place and is essentially ok.
However, there are a couple of questions I'd like to ask.
Why does SQL Server Agent put a header at the top of the output file? I was hoping to be able to use that output file 'as is' and execute it automatically to recreate my objects when required. (Obviously, I can manually remove the header, but this is an inconvenience in this situation). How do I stop it?
Also, when executed from SSMS, the output is correctly line-spaced. But the output from the scheduled job adds an extra line between each line of text, which is, again, inconvenient. Why does it do this, and how can I prevent it (again, without manually editing the output)?
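One hedged workaround sketch for the header problem: write the file with sqlcmd from a CmdExec step instead of using the step's output-file option, so the Agent never writes its own header into the file. Server, database, procedure, job name and path below are placeholders, and the job is assumed to exist already.

-- Hypothetical CmdExec step: -h -1 suppresses column headers, -W trims
-- trailing spaces, -E uses Windows authentication.
EXEC msdb.dbo.sp_add_jobstep
    @job_name  = N'Generate object script',     -- placeholder job name
    @step_name = N'Write script with sqlcmd',
    @subsystem = N'CmdExec',
    @command   = N'sqlcmd -S MyServer -d MyDatabase -E -Q "SET NOCOUNT ON; EXEC dbo.usp_GenerateObjectScript" -o "D:\Scripts\recreate_objects.sql" -h -1 -W';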
When running a package in VS you can see something like this in the output window:
SSIS package "logging.dtsx" starting. Information: 0x40043006 at Data Flow Task, DTS.Pipeline: Prepare for Execute phase is beginning. Information: 0x40043007 at Data Flow Task, DTS.Pipeline: Pre-Execute phase is beginning. Information: 0x402090DC at Data Flow Task, Flat File Source [1]: The processing of file "C: est ssis loggingad_data1.txt" has started. Information: 0x4004300C at Data Flow Task, DTS.Pipeline: Execute phase is beginning. Warning: 0x8020200F at Data Flow Task, Flat File Source [1]: There is a partial row at the end of the file. Information: 0x402090DE at Data Flow Task, Flat File Source [1]: The total number of data rows processed for file "C: est ssis loggingad_data1.txt" is 477. Information: 0x402090DF at Data Flow Task, OLE DB Destination [1011]: The final commit for the data insertion has started. Information: 0x402090E0 at Data Flow Task, OLE DB Destination [1011]: The final commit for the data insertion has ended. Information: 0x40043008 at Data Flow Task, DTS.Pipeline: Post Execute phase is beginning. Information: 0x402090DD at Data Flow Task, Flat File Source [1]: The processing of file "C: est ssis loggingad_data1.txt" has ended. Information: 0x40043009 at Data Flow Task, DTS.Pipeline: Cleanup phase is beginning. Information: 0x4004300B at Data Flow Task, DTS.Pipeline: "component "DataReaderDest" (87)" wrote 0 rows. Information: 0x4004300B at Data Flow Task, DTS.Pipeline: "component "OLE DB Destination" (1011)" wrote 1 rows. SSIS package "logging.dtsx" finished: Success.
This is exactly what I need to see when a package is running, but I want to be able to see it without using Visual Studio. I would do it in Reporting Services, but I need to find out how to get the information. The SSIS logging feature in a package does not provide that kind of info.
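If the package is configured with the SSIS log provider for SQL Server, the same pipeline messages land in a table that Reporting Services can query. A rough example, assuming the provider's default dbo.sysdtslog90 table and that OnInformation/OnProgress/OnWarning events are selected in the package's logging configuration:

-- Show the messages for the most recent execution recorded in the log table.
SELECT  starttime,
        event,        -- OnInformation, OnProgress, OnWarning, OnError, ...
        source,       -- task or component that raised the event
        message
FROM    dbo.sysdtslog90
WHERE   executionid = (SELECT TOP 1 executionid
                       FROM dbo.sysdtslog90
                       ORDER BY starttime DESC)
ORDER BY starttime;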
I'd like to know if there's a way to pass parent package parameters to a package executed by a SQL Server Agent job. It appears that sp_start_job doesn't have any parameter that could accommodate this.
I have an SSIS package which has 2 tasks: first it builds an Excel sheet with data on the server from where I am running the SSIS package, and the second task (a SQL process task) runs a BAT file that basically copies the Excel sheet created in the first step to another server. The problem is that the SSIS package runs successfully when I run it from BIDS, but when I run it as a job, it gets hung. It completes the first step; it is the second step that gets hung. I noticed that when I run it from BIDS, it comes up with an Open File - Security Warning window saying the publisher could not be verified, are you sure you want to run this software? Why does it do that? The file I am asking it to run is a .bat file; it should automatically use the cmd.exe app to run the bat file. Please help!
I have packages stored in the SQL store. I was letting users run the packages from a .NET app that I made with
Microsoft.SqlServer.Dts.Runtime
Now I have noticed this causes the packages to run on the client PC's CPU, and the network traffic also goes via the client PC; in my particular case this is slow.
From the docs and this forum I have found that you can run a package on the server CPU through SQL Agent, by letting packages be run in a SQL job. After that you can start a package from an application with sp_start_job.
But how do you set a User:: variable in a package if you have to start the package from a SQL Agent job?
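One hedged approach, sketched below with placeholder names: bake the variable assignment into the job step's dtexec command with /SET, then fire the job from the application with sp_start_job. If the value has to change per call, the step command could be updated with sp_update_jobstep before starting the job, or the package could read the value from a table instead.

-- Create a job whose step runs the package with a /SET override, then start it.
EXEC msdb.dbo.sp_add_job       @job_name = N'Run MyPackage on server';
EXEC msdb.dbo.sp_add_jobstep   @job_name  = N'Run MyPackage on server',
                               @step_name = N'dtexec with /SET',
                               @subsystem = N'CmdExec',
                               @command   = N'dtexec /SQL "\MyPackage" /SERVER "MyServer" /SET "\Package.Variables[User::MyVariable].Properties[Value]";"SomeValue"';
EXEC msdb.dbo.sp_add_jobserver @job_name = N'Run MyPackage on server', @server_name = N'(LOCAL)';

-- Fired from the application:
EXEC msdb.dbo.sp_start_job @job_name = N'Run MyPackage on server';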
It is SQL Server 7. Scheduled jobs were running fine until the hard drive crashed. We replaced the hard drive and restored the databases. However, none of the scheduled jobs can run. I created new jobs using a database maintenance plan and it still doesn't work. The same error occurs when viewing the job history.
sqlmaint.exe failed. [SQLSTATE 42000] (Error 22029). The step failed.
However, if I create a job by right-clicking on the database and selecting Backup Database, that job runs fine.
What should I do to make the jobs created by the database maintenance plan run again?
I want to make a very simple package: export all rows in a table to a flat file. I can create this package pretty much using only the wizards. Now to my problems:
1) The file needs header, detail and end records: H is a header record, in this case followed by the date and time; D is a detail record, one for each row that was exported; E is an end record, containing only the number of rows in the file, including the H and E records.
2) I need to set the file name dynamically, preferably using date and time to name the file.
I've done this very same thing in T-SQL, like so:
USE AVK
GO
SET TRANSACTION ISOLATION LEVEL SNAPSHOT;
GO
SELECT * FROM tempProducts
GO
CREATE VIEW EXPORT_ORDERS AS
SELECT 1 AS ROW_ORDER, 'H' + REPLACE(CONVERT(char(8), GETDATE(), 112) + CONVERT(char(8), GETDATE(), 108), ':', '') AS Data_Line
UNION ALL
SELECT 2 AS ROW_ORDER, 'D' + COALESCE(CONVERT(char(10), LBTyp), '') + COALESCE(CONVERT(char(50), Description), '') + COALESCE(CONVERT(char(5), Volume), '') AS Data_Line FROM dbo.tempProducts
UNION ALL
SELECT 3 AS ROW_ORDER, 'E' + RIGHT('0000000000' + RTRIM(CONVERT(char(13), COUNT(*) + 2)), 11) AS Data_Line FROM dbo.tempProducts AS tempProducts_1
GO
IF @@ROWCOUNT > 0
BEGIN
    BEGIN TRANSACTION
    SELECT * FROM tempProducts
    DECLARE @date char(8)
    DECLARE @time char(8)
    DECLARE @sql VARCHAR(150)
    SELECT @date = CONVERT(char(8), getdate(), 112)
    SELECT @time = CONVERT(char(8), getdate(), 108)
    SELECT @time = REPLACE(@time, ':', '')
    DECLARE @dt char(14)
    SELECT @dt = @date + '_' + @time
    SELECT @sql = 'bcp "SELECT Data_Line FROM avk..EXPORT_ORDERS ORDER BY ROW_ORDER" queryout "c:\AVK_' + @dt + '.txt" -c -t -U sa -P dalla'
    EXEC master..xp_cmdshell @sql
    --WAITFOR DELAY '0:00:10';
    DELETE FROM tempProducts
    COMMIT TRANSACTION
END
DROP VIEW EXPORT_ORDERS
GO
But I'm sure it can be done in SSIS as well, giving me some nice options for e.g. error handling. Pointers please!
I have created a Test SSIS Package within BIDS (VS 2K8, v 9.0.30729.4462 QFE; .NET v 3.5 SP1) that connects to our Test Listener.
There is only 1 Connection Manager Object, and OLE DB Provider for SQL Server.
The ConnectionString lists: Provider=SQLOLEDB.1;Integrated Security=SSPI
The Test Connection within BIDS works.
The Package Control Flow has just 1 Object, and Execute SQL Task that performs an Exec on an SP that contains only a Select (Read).
The Package runs within BIDS.
I've placed this Package within a Job on the Primary Node. I've run the job successfully with the 32-bit runtime both on and off. The location of the file on the server happens to be on a share that resides on what is currently the Secondary Node.
When I try to run the exact copy of this Job on the Secondary Node (which has been set up for Read All Connections: Yes), I get an error, regardless of the 32-bit runtime option. At this point, the location of the file is on the Secondary Node.
The Error is: "Login failed for user 'OurDomain\Agent_Account'".
The Agent is a member of NT Service\SQLServerAgent on both instances, and that account is a member of SysAdmin. Adding the Agent account as well, and giving that account SysAdmin, makes no difference either.
Started: 5:05:48 PM Error: 2014-08-21 17:05:50.64 Code: 0xC000F427 Source: File System Task Description: To run a SSIS package outside of SQL Server Data Tools you must install File System Task of Integration Services or higher. End Error
If a job is currently running and I want to disable it so that it does not run as scheduled for the following night, will it disrupt the current process?
I've tried finding an answer to this on this website but I don't see any. I'm running SQL Enterprise Manager v8.0 and have a job in the SQL Server Agent folder that won't run as scheduled, nor will it run manually if I right-click and choose Start Job.
It's designed to run a DTS package. I can run the DTS package manually without errors, but the job won't run it for me. I will try to provide as many details in this post as I can in hopes that someone who is more experienced at this can help me.
DTS DETAILS The DTS package is designed to transfer data from a table on one server through a firewall to another server, into a table on that server, and then delete the entries from the originating table. This works very well if I run it manually. The purpose of deleting the entries from the web server is security: once the data is on the other side of the firewall it's safe.
The job, however, will not run either manually or from its schedule. I've tried creating a new job and setting up a very simple schedule that runs daily every 2 minutes, for example. I've played around with different times but no luck. So I will try to provide you with the job details below:
JOB DETAILS Name: TransferData Enabled, Target = Local Server Category: Uncategorized Local Description: Execute package: TransferData
STEPS ID: 1 Step Name: TransferData Type: Operating System Command On Success: Quit with Success On Failure: Quit with Failure
PROPERTIES OF STEP Step Name: TransferData Type: Operating System Command (CmdExec) Command: ( I have no idea what this translates to as I did not write it) DTSRun/~Z0xF6EE76ED6259A60AC370F465DE7F3095F4B37F7176C05B8E72E396DEBE60135E53ACFEDB00E1F216DD8EC8206782439BA435771E3C1E8A7A5D0DAE3F3AAA9F48BF08A9A9FA23C010A85573546F6B6B900323F76DC8A27EACAFD63904464DD5A9B28137
Nothing special in the advanced tab
SCHEDULES TAB ID: 20 Name: TransferData Enabled: No Description: Occurs every 1 day(s), every 2 minute(s) between 12:00:00 AM and 11:59:59 PM, and is set up as Recurring. Nothing in Notifications Tab
I only have two SQL books to reference. This is a production fire to put out and I really need the help here as I am not a certified SQL guru.
I hope I have provided enough information here for someone to help. I'm happy to provide an email address for easier contact should you request it.
Hello All,
I am trying to create a DTS package. I have two tables, tbl_A and tbl_B, with similar data/rows but no primary keys. tbl_A is the master.
I would like this package to query tbl_A and tbl_B and find 1) all rows in tbl_A that are different in tbl_B, 2) all rows in tbl_A that are not present in tbl_B, and 3) all rows in tbl_B that are not present in tbl_A, and then just show those rows. Can this be done with a simple UNION? Perhaps this could produce a temp table that can be dropped once the DTS package exits successfully.
The 2nd part, after all the above rows are retrieved, is that I would like to add an additional column to the retrieved data called STATUS which has 3 possible values (letters) at the end of each row...
M (modified) means that row exists in tbl_B but has 1 or more different columns
A (add) means this row exists in tbl_A but not in tbl_B
D (delete) means this row exists in tbl_B but not in tbl_A
I'm hoping this DTS package would output a nice comma-separated TXT file with only...
1) rows from tbl_A that are different in tbl_B (STATUS M)
2) rows from tbl_A that are not present in tbl_B (STATUS A)
3) rows from tbl_B that are not present in tbl_A (STATUS D)
Can a DTS package in MS SQL be used to perform all of the above tasks? I would very much appreciate any help or any advice. Thanks in advance :-)
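Purely as an illustration of the SQL side of it (a DTS Execute SQL or Transform Data task could then write the result to the comma-separated file): assuming both tables share some natural key column, here called KeyCol, plus two data columns Col1 and Col2 standing in for the real column list, a query along these lines classifies the rows as M/A/D:

-- Sketch only; KeyCol, Col1 and Col2 are placeholders for the real columns.
SELECT  a.KeyCol, a.Col1, a.Col2, 'M' AS STATUS          -- in both, but different
FROM    tbl_A AS a
JOIN    tbl_B AS b ON b.KeyCol = a.KeyCol
WHERE   ISNULL(a.Col1, '') <> ISNULL(b.Col1, '')
     OR ISNULL(a.Col2, '') <> ISNULL(b.Col2, '')
UNION ALL
SELECT  a.KeyCol, a.Col1, a.Col2, 'A' AS STATUS          -- only in tbl_A
FROM    tbl_A AS a
WHERE   NOT EXISTS (SELECT 1 FROM tbl_B AS b WHERE b.KeyCol = a.KeyCol)
UNION ALL
SELECT  b.KeyCol, b.Col1, b.Col2, 'D' AS STATUS          -- only in tbl_B
FROM    tbl_B AS b
WHERE   NOT EXISTS (SELECT 1 FROM tbl_A AS a WHERE a.KeyCol = b.KeyCol);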
The master package has a configuration file, specifying the connect strings. The master package passes these connect strings to the child packages in a variable. Both the master package and the child packages have connection managers, set up to use localhost. This is done deliberately, to be able to test the packages on individual development PCs. We do not want to change anything inside the packages when deploying to test, and from test to production. All differences will be in the config files (which are pretty fixed; they very seldom change). That way we can be sure that we can deploy to production without any changes at all.
The package is run from the file system, through a job-schedule.
We experience the following when running on a non-default SQL Server instance (called dkms5253\uedw):
Case 1: The master package starts by executing three sql-scripts (drop foreign keys, truncate tables, create foreign keys). This works fine.
The master package then executes the first child package. In the sysdtslog we then get:
Error - cannot connect to database xxx
Info - package is preparing to get connection string from parent
The child package then executes OK, does all its work, and finish. Because there has been an error, the master package then stops with an error.
Case 2: When we run exactly the same, but with the connection strings in the config file pointing to the default instance (dkms5253), everything works fine.
Case 3: When we run exactly the same, again against the dkms5253\uedw instance, but now with the exact same databases defined in the default instance, it also works perfectly.
Case 4: When we then stop the sql-server on the default instance, the package faults again, this time with
Error - timeout when connecting to database xxx
Info - package is preparing to get connection string from parent
And then it continues as in the first case.
From all this we conclude that the child package tries to connect to the database before it knows the connection string that gets passed in the variable from the master package. It therefore tries to connect to the default instance, and this only works if the default instance is running and has the same databases defined. As far as we can see, the child package does no work against the default instance (no logging etc.).
We have tried delayed validation in the packages and in the connection managers, but with the same results (error).
So we are desperately hoping that someone can help us solve this problem.
Hi, I have attached the BMP doc that shows my screen snapshot. My SQL Server Agent service indicator is not showing green like on the other server. I have started the service and it is running fine. I don't have any problem running the job. I'm just curious why that indicator is not showing up. Thanks, Ravi
I have created a stored procedure for a routine task to be performed periodically in my application. Say I want to execute my stored procedure at 12:00 AM daily.
How can I add my stored procedure to the SQL Server Agent jobs?
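A minimal sketch, with placeholder job, database and procedure names: create a job with one T-SQL step that executes the procedure, and attach a daily schedule starting at 12:00 AM.

USE msdb;
-- Create the job and its single T-SQL step.
EXEC dbo.sp_add_job         @job_name = N'Nightly maintenance proc';
EXEC dbo.sp_add_jobstep     @job_name      = N'Nightly maintenance proc',
                            @step_name     = N'Run stored procedure',
                            @subsystem     = N'TSQL',
                            @database_name = N'MyDatabase',              -- placeholder
                            @command       = N'EXEC dbo.usp_MyRoutineTask;';  -- placeholder
-- Schedule it to run every day at midnight.
EXEC dbo.sp_add_jobschedule @job_name          = N'Nightly maintenance proc',
                            @name              = N'Daily at midnight',
                            @freq_type         = 4,        -- daily
                            @freq_interval     = 1,        -- every 1 day
                            @active_start_time = 000000;   -- 12:00:00 AM
-- Attach the job to the local server so the Agent will run it.
EXEC dbo.sp_add_jobserver   @job_name = N'Nightly maintenance proc',
                            @server_name = N'(LOCAL)';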
Hi. I was wondering if anyone has ever had a problem where nothing seems to be wrong with the server, and SQL Server Agent is up and running, but jobs fail. We had a job run at midnight, and it was the last successful run. Every job failed after that, with an error stating "Unable to retrieve steps for job..." There doesn't seem to be any reason for the problem. The jobs were kicked off manually and they all ran.
The only other error messages we could find were in the Application Event Viewer on the server. But the first error happened days ago and said this: "The description for Event ID ( 0 ) in Source ( .NET Runtime ) cannot be found. The local computer may not have the necessary registry information or message DLL files to display messages from a remote computer. You may be able to use the /AUXSOURCE= flag to retrieve this description; see Help and Support for details. The following information is part of the event: Unable to open shim database version registry key - v2.0.50727.00000."
Since then, every few seconds this error occurred: "Windows cannot load extensible counter DLL MSSQLServerOLAPService, the first DWORD in data section is the Windows error code."
We turned off the performance monitor counters and that error stopped. But we still have no idea if that was the cause of the problem.
This isn't the first time we've had an issue with jobs failing because the steps couldn't be 'found', while SQL Server Agent was up and running.
I was working on some reports in SSRS. Now I can't view those reports because I cannot open Report Manager. I noticed that SQL Server and SQL Server Agent have stopped running on my local machine, but the other services are running properly (SSIS, SSRS, SSAS). When I go to SQL Server Configuration Manager and try to start them, I get the following error message:
"The request failed or the service did not respond in a timely fashion. Consult the event log or other applicable error logs for details."
So now I cannot connect to the database engine on my local machine through Management Studio. When I try to do that, I get the following error message:
"An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server) (.Net SqlClient Data Provider)"
Also, I cannot access the ReportServer database and ReportServerTempDB.
If I reinstall SQL Server, what would happen? Would I still be able to use and view my old reports? Can anyone please help me get this working again correctly?