Low Virtual Memory When Running SSIS Package As SQL Job
Oct 18, 2007
I see the following error when I execute an SSIS package as part of a job from within SQL Server:
OnInformation,006-CIS-SQL,apdsvcPM2SQL,VistaMain,{F902B487-D543-4F31-AC80-EF088CD0CBA4},{74325B35-DC59-4B51-AE8E-756BCC879633},10/18/2007 6:15:12 AM,10/18/2007 6:15:12 AM,1074036748,0x,The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 4 buffers were considered and 4 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked.
SQL Server has 6 GB memory allocated to it. How can I best troubleshoot this issue?
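One thing worth checking: when the package runs as a job step on the database server itself, the SSIS pipeline allocates its buffers outside the SQL Server buffer pool, so a 6 GB grant to SQL Server can leave very little for the package. The sketch below is VB.NET (the server name is taken from the log above; the 4096 MB cap is only an example figure, not a recommendation) and lowers 'max server memory' so the pipeline has some headroom. Treat it as a hedged starting point for testing, not a definitive fix.

    ' Minimal sketch, assuming you want to cap SQL Server so the SSIS pipeline
    ' (which allocates outside the buffer pool) has headroom. The 4096 MB value
    ' is an assumed example only.
    Imports System.Data.SqlClient

    Module CapSqlServerMemory
        Sub Main()
            Using cn As New SqlConnection("Data Source=006-CIS-SQL;Initial Catalog=master;Integrated Security=SSPI")
                cn.Open()
                Dim sql As String = _
                    "EXEC sp_configure 'show advanced options', 1; RECONFIGURE; " & _
                    "EXEC sp_configure 'max server memory (MB)', 4096; RECONFIGURE;"
                Using cmd As New SqlCommand(sql, cn)
                    cmd.ExecuteNonQuery()   ' takes effect immediately, no service restart needed
                End Using
            End Using
        End Sub
    End Module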
I am using SSIS packages for data transfer. When I run a package on a virtual server it takes more time than when run on a PC. After analysing, I found that the package, when run on the virtual server, spends around 50 seconds or so in startup. Could anyone give me a more detailed description of why it runs slowly?
Hey, I have a few jobs which call SSIS packages. If I run the SSIS package itself, it runs fine, but if I try to run the job which calls the package, it fails. Can someone help me troubleshoot this issue? None of my jobs that call an SSIS package work; all of them fail.
I am running Visual Studio 2005. I have an SSIS package which is consuming a huge amount of memory. During the execution of the package the memory keeps increasing, until finally I get an Out of Memory exception. I have run this package using dtexec and in BIDS; no difference. I do have some script components and have added some code to list the assemblies in the current AppDomain. I can see that one particular assembly count is increasing on every loop: the number of VBAssembly entries increases by 6 every time execution hits the script component, and along with it the memory is climbing. What is this VBAssembly being used for? Is there an update to SQL Server Integration Services that I need?
I'm running a package that has an XML Task in the control flow. This task transforms an XML file with an XSLT stylesheet.
The file is about 2 MB on a daily basis, but at the end of the month there is a full dump of data that makes the file around 400 MB. That is where my problem is.
I run this on my 2 GB memory workstation, and when the memory gauge in Task Manager reaches about 1.5 GB the package fails with an "Out of memory" exception.
I also ran this package on an 8 GB RAM server, and the same applies.
Is there any way of making this package utilize all the available memory? I even increased the virtual memory to see if that helped, but nothing.
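A possible explanation for both machines failing at roughly the same point: the XML Task runs inside a 32-bit process (BIDS or 32-bit dtexec), which is limited to about 2 GB of virtual address space no matter how much RAM is installed. If the transform can be moved into a Script Task, a hedged sketch like the one below at least gives control over the readers and writers; note that XslCompiledTransform still materializes the input document internally, so running under 64-bit dtexec or splitting the month-end file may be the real fix. All paths below are placeholders.

    ' Hedged sketch of replacing the XML Task with a direct XslCompiledTransform call.
    ' Paths are placeholders. The transform engine still buffers the input document,
    ' so this mainly buys control over I/O; it is not a guaranteed memory fix.
    Imports System.Xml
    Imports System.Xml.Xsl

    Public Class TransformLargeFile
        Public Shared Sub Run()
            Dim xslt As New XslCompiledTransform()
            xslt.Load("C:\xsl\monthly.xslt")                 ' placeholder stylesheet path

            Using reader As XmlReader = XmlReader.Create("C:\data\monthend.xml")       ' placeholder input path
                Using writer As XmlWriter = XmlWriter.Create("C:\data\monthend_out.xml", xslt.OutputSettings)
                    xslt.Transform(reader, writer)           ' output is written straight to disk
                End Using
            End Using
        End Sub
    End Class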
I am in the process of moving from a 32-bit SQL Server 2005 Enterprise (9.0.3054) to a 64-bit SQL Server 2005 Enterprise (9.0.3054, with 4 CPUs and 8 GB of memory on Windows 2003 SP2), and the process has been very frustrating to say the least. I am having a problem with packages that I created on my 64-bit SQL Server. I am importing a few tables from the 32-bit SQL Server into the 64-bit SQL Server using Tasks --> Import to create the package.
Sometimes when I am creating a package I get the following error in a message box:
SQL Server Import and Export Wizard
The SSIS Runtime object could not be created. Verify that DTS.dll is available and registered. The wizard cannot continue and it will terminate.
Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (System.Windows.Forms)
Other times when I run a package that has run successfully before I get the following error:
Faulting application dtexecui.exe, version 9.0.3042.0, stamp 45cd726d, faulting module unknown, version 0.0.0.0, stamp 00000000, debug? 0, fault address 0x025d23f0.
The package appears to hang when running. By this I mean that the Package Execution Progress shows progress up to a point then it just stops. (The package takes about 17 seconds to run normally) CPU usage is at 1% and the package cannot be stopped.
I have deleted and re-created the package several times and I have also re-installed the service pack on the SQL Server (9.0.3054) but that did not help.
The master package has a configuration file specifying the connect strings. The master package passes these connect strings to the child packages in a variable. Both the master package and the child packages have connection managers, set up to use localhost. This is done deliberately, to be able to test the packages on individual development PCs. We do not want to change anything inside the packages when deploying to test, or from test to production. All differences will be in the config files (which are pretty fixed; they very seldom change). That way we can be sure that we can deploy to production without any changes at all.
The package is run from the file system, through a job-schedule.
We experience the following when running on a non-default SQL Server instance (called dkms5253uedw):
Case 1: The master package starts by executing three SQL scripts (drop foreign keys, truncate tables, create foreign keys). This works fine.
The master package then executes the first child package. In the sysdtslog we then get:
Error - "cannot connect to database xxx"  Info - "package is preparing to get connection string from parent ..."
The child package then executes OK, does all its work, and finishes. Because there has been an error, the master package then stops with an error.
Case 2: When we run exactly the same, but with the connection strings in the config file pointing to the default instance (dkms5253), everything works fine.
Case 3: When we run exactly the same, again against the dkms5253uedw instance, but now with the exact same databases defined in the default instance, it also works perfectly.
Case 4: When we then stop the SQL Server on the default instance, the package faults again, this time with:
Error - "timeout when connect to database xxx"  Info - "package is preparing to get connection string from parent ..."
And then it continues as in the first case.
From all this we conclude, that the child package tries to connect to the database before it knows the connection string it gets passed in the variable from the master package. It therefore tries to connect to the default instance, and this only works if the default instance is running and has the same databases defined. As far as we can see, the child package does no work against the default instance (no logging etc.).
We have tried delayed validation in the packages and in the connection managers, but with the same results (error).
So we are desperately hoping that someone can help us solve this problem.
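One hedged workaround, assuming the parent variable really is populated before the child's connections are first used: make the first task in each child package a small Script Task that copies the parent-supplied connection string into the connection manager, and keep DelayValidation = True on the connection managers and the data flow tasks so nothing validates against localhost at load time. "User::ParentConnStr" and "SourceDB" below are placeholder names for the variable and the child's connection manager, and the variable must be listed in the Script Task's ReadOnlyVariables.

    ' Hedged sketch of a first-step Script Task in the child package (SSIS 2005 template).
    ' It pushes the connection string received from the master package into the
    ' connection manager before anything else validates or uses it.
    Public Class ScriptMain
        Public Sub Main()
            ' "User::ParentConnStr" and "SourceDB" are placeholder names
            Dim connStr As String = Dts.Variables("User::ParentConnStr").Value.ToString()
            Dts.Connections("SourceDB").ConnectionString = connStr
            Dts.TaskResult = Dts.Results.Success
        End Sub
    End Class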
I have an SSIS Package that loads data from a log file. Prior to loading the data I need to prepare the file. I run a script that cleans the file. Then I import the flat file into SQL Server.
Log File Management Task
1. Run Unix Log File Task
2. Import the new log file (flat file) into SQL Server
Error i.Unix.dtsx Message: The script threw an exception: Exception of type 'System.OutOfMemoryException' was thrown.
Is this because the system is running out of memory? The RAM on the server is 4 GB. Below is a sample of the script. The job doesn't always fail; there are times when it executes with success and other times when it fails.
Script Source Code
-----------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports System.Collections.Generic
Imports System.IO
Imports System.Text
Imports System.Diagnostics
Imports System.Globalization
Imports Microsoft.VisualBasic
Imports System.Text.RegularExpressions

Public Class ScriptMain

    '********** Begin Error Log Settings **********
    'Dim sSource As String = "i.SSIS.Unix.FileManager"
    'Dim sLog As String = "Application"
    'Dim sMachine As String = "."
    'Dim ELog As New EventLog(sLog, sMachine, sSource)
    '********** End Error Log Settings **********

    Public Sub Main()
        ' NOTE: path separators were lost when this script was posted;
        ' the path literals below are kept exactly as they appeared.

        'variables for the unix log file
        Dim newFile As String = "D:iLogunixlog.txt"
        Dim copyFile As String = "\server16iLogunixlog.txt"

        'variables for working log files
        Dim oldFile As String = "D:i empunixlog.txt"
        Dim difFile As String = "D:i empunixdiff.txt"
        Dim trimdiff As String = "D:i empunixdifft.txt"
        Dim formatTemp As String = "D:i empunixlog_formatted.txt"
        Dim errorFile As String = "D:i empunixlog_bad.txt"

        'delete unixlog.txt, copy unixlog.txt
        'if the file is on the local server delete it and copy the new file over
        'if the file is not present copy the new file over
        Try
            If File.Exists(newFile) Then
                File.Delete(newFile)
                File.Copy(copyFile, newFile)
            Else
                File.Copy(copyFile, newFile)
            End If
            While Not File.Exists(newFile)
                System.Threading.Thread.Sleep(1000)
            End While
            'ELog.WriteEntry("FileCopy.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
        Catch ex As Exception
            'ELog.WriteEntry("FileCopy.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
        End Try

        'open the old file; read backwards until we reach the carriage
        'return and store that "seek" position; now open the new file and
        'seek to that stored position. finally, read the rest of the file
        'and write that data to the difference file.

        ' determine position of last line in the old file
        Dim lastLine As Long = GetLastLinePosition(oldFile)

        ' get all data in new file starting at position determined above
        ' (the whole tail of the file is read into one in-memory byte array,
        '  which is the most likely source of the OutOfMemoryException when
        '  the difference is large)
        Dim fi As New FileInfo(newFile)
        Dim buffer(fi.Length - lastLine) As Byte
        Dim fs As New FileStream(newFile, FileMode.Open)
        Try
            fs.Seek(lastLine, SeekOrigin.Begin)
            fs.Read(buffer, 0, buffer.Length)
            fs.Close()
            ' write that new data to the difference file
            fs = New FileStream(difFile, FileMode.OpenOrCreate, FileAccess.Write, FileShare.None)
            fs.Write(buffer, 0, buffer.Length)
            fs.Close()
            'ELog.WriteEntry("FileCopy.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
        Catch ex As Exception
            'ELog.WriteEntry("FileCopy.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
        End Try

        'remove the partial row from the difference file
        Try
            TrimFinal(difFile, trimdiff)
            'ELog.WriteEntry("TrimFinal.Call.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
        Catch ex As Exception
            'ELog.WriteEntry("TrimFinal.Call.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
        End Try

        'perform the file formatting
        sFormatFile(trimdiff, formatTemp, errorFile)

        ' Dts.TaskResult = Dts.Results.Success
    End Sub

    ' returns the stream position just after the last line break in the file,
    ' or -1 if no line break was found
    Function GetLastLinePosition(ByVal fileName As String) As Long
        Dim pos As Long = -1
        Dim fs As New FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite)
        Try
            fs.Seek(-2, SeekOrigin.End) ' -2 to skip a potential vbcrlf at the end of file
            While fs.Position > 0
                fs.Seek(-1, SeekOrigin.Current)
                If fs.ReadByte = 10 Then
                    pos = fs.Position
                    Exit While
                Else
                    fs.Seek(-1, SeekOrigin.Current)
                End If
            End While
            fs.Close()
            'ELog.WriteEntry("GetLastLinePosition.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
        Catch ex As Exception
            'ELog.WriteEntry("GetLastLinePosition.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
        End Try
        Return pos
    End Function

    ' copies every line except the last (partial) one into the trimmed file
    Sub TrimFinal(ByVal difFile As String, ByVal trimdiff As String)
        Dim fi2 As New FileStream(difFile, FileMode.OpenOrCreate, FileAccess.Read)
        Dim fo2 As New FileStream(trimdiff, FileMode.OpenOrCreate, FileAccess.Write)
        Dim sr2 As New StreamReader(fi2)
        Dim sw2 As New StreamWriter(fo2)
        Dim line2 As String
        Try
            Do While sr2.Peek <> -1
                line2 = sr2.ReadLine()
                If (sr2.Peek <> -1) Then
                    sw2.WriteLine(line2)
                End If
            Loop
            sw2.Flush() : sw2.Close()
            sr2.Close()
            fi2.Close() : fo2.Close()
            'ELog.WriteEntry("TrimFinal.Success".ToString(), EventLogEntryType.SuccessAudit, 4, CType(4, Short))
        Catch ex As Exception
            'ELog.WriteEntry("TrimFinal.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 2, CType(4, Short))
        End Try
    End Sub

    ' re-quotes good rows into the formatted file and routes bad rows to the error file
    Sub sFormatFile(ByVal currentFile As String, ByVal tempFile As String, ByVal errorFile As String)
        Dim tfp As New Microsoft.VisualBasic.FileIO.TextFieldParser(currentFile)
        Dim sw As New System.IO.StreamWriter(tempFile)
        Dim swErrorFile As New System.IO.StreamWriter(errorFile)
        tfp.TextFieldType = FileIO.FieldType.Delimited
        tfp.SetDelimiters(",")
        tfp.HasFieldsEnclosedInQuotes = True
        tfp.TrimWhiteSpace = True
        Dim fields() As String
        Try
            While Not tfp.EndOfData
                Try
                    fields = tfp.ReadFields()
                    If fields.Length <> 23 Then
                        'write bad rows to error-file
                        swErrorFile.WriteLine(String.Join(",", fields))
                    Else
                        ' NOTE: the "HH:mms" format string appears exactly as posted; it may have been "HH:mm:ss" originally
                        If fields(3) = "" And fields(13) = "" Then
                            sw.WriteLine(Chr(34) & String.Join(Chr(34) & "," & Chr(34), fields) & Chr(34))
                        ElseIf IsDate(fields(3)) = True OrElse fields(3) = Format(CDate(fields(3)), "yyyy-MM-dd HH:mms") And fields(13) = "" Then
                            sw.WriteLine(Chr(34) & String.Join(Chr(34) & "," & Chr(34), fields) & Chr(34))
                        ElseIf IsDate(fields(13)) = True OrElse fields(13) = Format(CDate(fields(13)), "yyyy-MM-dd HH:mms") And fields(3) = "" Then
                            sw.WriteLine(Chr(34) & String.Join(Chr(34) & "," & Chr(34), fields) & Chr(34))
                        ElseIf IsDate(fields(3)) = True OrElse fields(3) = Format(CDate(fields(3)), "yyyy-MM-dd HH:mms") _
                            And IsDate(fields(13)) = True OrElse fields(13) = Format(CDate(fields(13)), "yyyy-MM-dd HH:mms") Then
                            sw.WriteLine(Chr(34) & String.Join(Chr(34) & "," & Chr(34), fields) & Chr(34))
                        Else
                            swErrorFile.WriteLine(String.Join(",", fields))
                        End If
                    End If
                Catch ex As Exception
                    'ELog.WriteEntry("sFormatFile.TFP.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 0, CType(4, Short))
                End Try
            End While
            tfp.Close()
            sw.Close()
            swErrorFile.Close()
            File.Delete(currentFile)
            File.Move(tempFile, currentFile)
            'ELog.WriteEntry("sFormatFile.Success".ToString(), EventLogEntryType.SuccessAudit, 0, CType(4, Short))
        Catch ex As Exception
            'ELog.WriteEntry("sFormatFile.Failure" & ControlChars.CrLf & ex.ToString(), EventLogEntryType.Error, 0, CType(4, Short))
        Finally
            GC.Collect()
        End Try
    End Sub

End Class
-------------------------
Does my script seem okay for releasing the server memory usage?
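On the memory question: the GC.Collect at the end probably helps less than making sure every stream is closed on the failure path. In the posted script the Close calls sit inside the Try blocks, so an exception in TrimFinal or GetLastLinePosition leaves FileStreams open. Below is a hedged rewrite of TrimFinal using Using blocks; the behavior is intended to match the original (copy every line except the last), and it can drop straight into the script above, which already imports System.IO.

    ' Hedged rewrite of TrimFinal: Using blocks close the streams (and release
    ' their buffers) even if an exception is thrown mid-loop.
    Sub TrimFinal(ByVal difFile As String, ByVal trimdiff As String)
        Using sr2 As New StreamReader(New FileStream(difFile, FileMode.OpenOrCreate, FileAccess.Read))
            Using sw2 As New StreamWriter(New FileStream(trimdiff, FileMode.OpenOrCreate, FileAccess.Write))
                Dim line2 As String
                Do While sr2.Peek <> -1
                    line2 = sr2.ReadLine()
                    If sr2.Peek <> -1 Then
                        sw2.WriteLine(line2)
                    End If
                Loop
            End Using   ' StreamWriter.Dispose flushes and closes the underlying stream
        End Using
    End Sub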
I have an SSIS package. When I run it, SQL Server memory usage shoots up considerably. How can I check and solve this problem? When I run the same query as a stored procedure it does not take that much memory. Please help me understand how to control this SQL memory spike.
I have an SSIS package that is constantly running out of virtual memory. Right now I am on a development server, running only this package. The package is moving data from one table into another on the same server, in the same database. The server has 3 GB of memory and is only running SQL Server 2005 and SSIS. I am a local admin on the server and am running the package through BIDS, once again for our initial testing. I tried setting the property BufferTempStoragePath to our E drive so it can utilize the 100 GB of free space we have, but that doesn't seem to work either. I have also tried setting the MaxRowSize to many different values, to no avail. I am constantly getting an error (see below for the exact error) when it gets through roughly half the load. Moreover, it reports this error about 500 times in the progress report if I let the package run to completion. Finally, when all is said and done, the package has moved the data successfully, but the package always shows as failing.
I have googled continuously on this problem but have not found a resolution. I did see a post here where it was recommended to run the package out of process, however I don't see the benefit at this point when this is the only package I am running. I also don't understand why it would report the error so many times and fail the package when it is completing successfully. Source and destination have the same number of records at the end of the task. Could someone please try to make sense of this?
Getting Error: [DTS.Pipeline] Information: The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 4 buffers were considered and 4 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked
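For what it's worth, BufferTempStoragePath and the buffer sizing properties live on each Data Flow task, and the spill path only helps if the account running the package can actually write to it. Below is a hedged sketch that uses the SSIS runtime object model to point every top-level Data Flow task at the E: drive and shrink the buffers; the package path, the "E:\SSISBuffers" folder and the 5000-row figure are placeholders, not recommendations, and loop containers would need a recursive walk.

    ' Hedged sketch: set BufferTempStoragePath and DefaultBufferMaxRows on the
    ' pipeline tasks of a package via the SSIS object model. Paths and values
    ' are assumed placeholders.
    Imports Microsoft.SqlServer.Dts.Runtime

    Module TunePipelineBuffers
        Sub Main()
            Dim app As New Application()
            Dim pkg As Package = app.LoadPackage("C:\Packages\MyLoad.dtsx", Nothing)

            For Each exe As Executable In pkg.Executables
                Dim taskHost As TaskHost = TryCast(exe, TaskHost)
                If taskHost Is Nothing Then Continue For
                For Each prop As DtsProperty In taskHost.Properties
                    ' only Data Flow tasks expose these pipeline properties
                    If prop.Name = "BufferTempStoragePath" Then
                        prop.SetValue(taskHost, "E:\SSISBuffers")
                    ElseIf prop.Name = "DefaultBufferMaxRows" Then
                        prop.SetValue(taskHost, 5000)
                    End If
                Next
            Next

            app.SaveToXml("C:\Packages\MyLoad.dtsx", pkg, Nothing)
        End Sub
    End Module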
I am using a Lookup component to split records: if the record exists, update it; otherwise insert it as a new record. Every time I start executing, I get a low virtual memory error on the Lookup component.
If anyone has a suggestion, it will be appreciated.
I have an SBS 2003 SP2 server running Exchange 2003 SP2. It was running fine until one day I got this low virtual memory error. I checked the memory usage in Task Manager and it is using 7 GB of virtual memory. I increased the max size from 6 GB to 8 GB just to see what happens, and now it is using 9 GB of virtual memory. I don't have any clue as to what the problem is! Please help!
When I am trying to execute a package in SSIS, the errors given below come up many times. How do I fix this? Can anybody help?
In SSIS the default buffer size is 10 MB.
The source is iSeries DB2 on AS/400 on the production server,
and the destination is DB2 UDB on Windows on the dev server.
The user space page size in DB2 is 16-32 KB.
The server has 4 GB of RAM, running Windows Server 2003 Standard Edition.
The errors are:
Information: 0x4004800D at CHDRPF 312-315, DTS.Pipeline: The buffer manager failed a memory allocation call for 15728400 bytes, but was unable to swap out any buffers to relieve memory pressure. 3 buffers were considered and 3 were locked. Either not enough memory is available to the pipeline because not enough are installed, other processes were using it, or too many buffers are locked.
Error: 0xC0047012 at CHDRPF 312-315, DTS.Pipeline: A buffer failed while allocating 15728400 bytes.
Error: 0xC0047011 at CHDRPF 312-315, DTS.Pipeline: The system reports 83 percent memory load. There are 3488509952 bytes of physical memory with 558743552 bytes free. There are 2147352576 bytes of virtual memory with 222920704 bytes free. The paging file has 7416537088 bytes with 3703283712 bytes free.
Error: 0xC0047056 at CHDRPF 312-315, DTS.Pipeline: The Data Flow task failed to create a buffer to call PrimeOutput for output "DataReader Source" (15437) on component "DataReader Output" (15442). This error usually occurs due to an out-of-memory condition.
Error: 0xC0047021 at CHDRPF 312-315, DTS.Pipeline: Thread "SourceThread0" has exited with error code 0x8007000E.
Error: 0xC0047039 at CHDRPF 312-315, DTS.Pipeline: Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
Error: 0xC0047021 at CHDRPF 312-315, DTS.Pipeline: Thread "WorkThread0" has exited with error code 0xC0047039.
We have a server that is set to use 80 MB of a 128 MB machine. The machine is also set to use up to 300 MB of virtual memory as needed.
After running for 10 days, we got the message "Low on virtual memory". We did some looking and found SQL Server was using 300 MB between real and virtual memory.
So the question: is there a manual method to get SQL Server to release extra memory without stopping and restarting the service?
I have two instances of SQL Server running on my development machine. I am having some performance problems, and while investigating the problem I saw with Process Explorer from Sysinternals that both instances each consume 800 MB of memory! I experimented with sp_configure and with giving both instances a fixed memory size. Both methods do not seem to have any effect. Can anybody explain to me why SQL Server is using so much memory? Thanks for any information. Evert Wiesenekker. PS: Besides the Northwind database I only have one simple extra database (70 MB in size) installed.
I am wondering if there is a way to solve a virtual memory error. We randomly get the following error when trying to run sync over HTTP web sync. Some clients have 512 MB running SQL Express; others are full instances that have 1.5 GB.
The merge process could not allocate memory for an operation; your system may be running low on virtual memory. Restart the Merge Agent.
For the last six months or so my PC has been shutting down all applications for no apparent reason when a 'low virtual memory' bubble appears. I have removed dozens of items, such as games and image editors (all programs that require a lot of memory), but it is no good. Every 40 minutes or so the PC decides to shut everything down, and it becomes impossible to start any further applications unless I log off and on or shut down the PC myself. I really am fed up with this; it's so annoying. Is it because of a virus, or do I still have too much on my PC?
I'm trying to run an SSIS package (dtsx) from inside a SQL job (SQL Server Agent). This works fine if the user running the step (the "run as" account) is a local admin on the server. If it's not, I get the error message "The package could not be loaded. The step failed". This happens even if the user has all possible server roles such as "sysadmin" etc. in SQL.
So, my question is: is there any way to load an SSIS package without being a local admin on the machine? If there is, what is needed?
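The usual route, as far as I know, is not to make the user a local admin but to run the job step under a SQL Server Agent proxy for the SSIS subsystem, backed by a Windows credential that can read the package and reach its sources. A hedged sketch that creates the credential and proxy from VB.NET is below; the account, password and names are placeholders, and afterwards you pick the proxy in the job step's "Run as" list.

    ' Hedged sketch: create a credential and a SQL Agent proxy for the SSIS
    ' subsystem. Account, password and names are placeholders; run as someone
    ' with the necessary rights in msdb.
    Imports System.Data.SqlClient

    Module CreateSsisProxy
        Sub Main()
            Using cn As New SqlConnection("Data Source=.;Initial Catalog=msdb;Integrated Security=SSPI")
                cn.Open()
                Dim sql As String = _
                    "CREATE CREDENTIAL SsisRunner WITH IDENTITY = N'DOMAIN\SsisRunner', SECRET = N'placeholder-password'; " & _
                    "EXEC msdb.dbo.sp_add_proxy @proxy_name = N'SsisProxy', @credential_name = N'SsisRunner', @enabled = 1; " & _
                    "EXEC msdb.dbo.sp_grant_proxy_to_subsystem @proxy_name = N'SsisProxy', @subsystem_name = N'SSIS';"
                Using cmd As New SqlCommand(sql, cn)
                    cmd.ExecuteNonQuery()
                End Using
            End Using
        End Sub
    End Module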
I have an issue when a job is scheduled to run an SSIS package. The package (which exports a table to a text file) runs fine from Microsoft Visual Studio, but when I create a job and run it, I get the following error:
[298] SQLServer Error: 15404, Could not obtain information about Windows NT group/user 'VOLCANOAdministrator', error code 0x534. [SQLSTATE 42000] (ConnIsLoginSysAdmin)
Hi, we have a prod server running SQL Server 2000 64-bit. It is a 4-CPU server with 16 GB of RAM. We have a max memory setting of 15.5 GB for SQL Server. In spite of 15 GB being available for SQL Server, it still uses paging file space, a lot. Looking through Task Manager we can see SQL Server showing 15.5 GB of memory usage and 22 GB of virtual memory usage. I don't understand why it should even be using close to 7 GB of paging space when it has so much memory. How does SQL Server use virtual memory vs physical memory? Has anyone seen this before? Thanks, GG
Hi, is there any setting in IS (Integration Services) that I should have adjusted in order to avoid this message?
Information: 0x4004800C at EXTRACT from MSCRM and AX (From Source to Working Tables for Dimension), DTS.Pipeline: The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 124 buffers were considered and 124 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked
I have a nightly job (an SSIS package) scheduled using MS. The package loads data from the OLTP db to the warehouse. The server has 256 GB memory, of which 211 GB is free.
The job usually runs without any problems, but sometimes it fails with the following error: "DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005."
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "The statement has been terminated.".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Violation of PRIMARY KEY constraint '<var>PrimaryKeyName</var>'. Cannot insert duplicate key in object '<var>TableName</var>'.".
When I researched this error I found out that it's because of a memory issue. We have 222 GB of free memory, so how is this possible? Is there a way, in the package or anywhere else, to specify how much (what percentage) of the memory the SSIS package should use (something like the SSRS memory threshold level)?
Visual Studio runs out of memory when trying to use an SSIS package. I am trying to create and run an SSIS package that validates and imports some large XML files (>200 MB). Validation fails because Visual Studio cannot open large files without running out of memory.
The SSIS package throws this error when I run the package, at the validation task.
Error: 0xC002F304 at Validate bio_fixed, XML Task: An error occurred with the following error message: "Exception of type 'System.OutOfMemoryException' was thrown.".
How do I increase the amount of RAM that Visual Studio can use? I have plenty of RAM on my workstation (>3 GB), but VS chokes on files of maybe around 100 MB.
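For the validation step specifically, a hedged alternative to opening the whole file in the designer or the XML Task is to validate with a streaming XmlReader, which checks the document against the XSD without loading it all into memory. The namespace, schema path and file path below are placeholders for whatever bio_fixed actually uses.

    ' Hedged sketch: schema-validate a large XML file with a streaming XmlReader
    ' so memory use stays roughly flat regardless of file size. Paths and the
    ' target namespace are placeholders.
    Imports System
    Imports System.Xml
    Imports System.Xml.Schema

    Module StreamingValidate
        Sub Main()
            Dim settings As New XmlReaderSettings()
            settings.ValidationType = ValidationType.Schema
            settings.Schemas.Add("http://example.org/bio", "C:\schemas\bio_fixed.xsd") ' placeholder namespace and schema path
            AddHandler settings.ValidationEventHandler, AddressOf OnValidationEvent

            Using reader As XmlReader = XmlReader.Create("C:\data\bio_fixed.xml", settings) ' placeholder input path
                While reader.Read()   ' reading to the end performs the validation; the document is never loaded whole
                End While
            End Using
            Console.WriteLine("Validation pass complete.")
        End Sub

        Private Sub OnValidationEvent(ByVal sender As Object, ByVal e As ValidationEventArgs)
            Console.WriteLine(e.Severity.ToString() & ": " & e.Message)
        End Sub
    End Module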
Hi, I get an error when I run my SSIS package. Here is the message:
Error: 0xC02020A1 at import file, Flat File Source [1]: Data conversion failed. The data conversion for column "su_supplier_code" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.". Error: 0xC020902A at import file, Flat File Source [1]: The "output column "su_supplier_code" (61)" failed because truncation occurred, and the truncation row disposition on "output column "su_supplier_code" (61)" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.
The funny thing about this is that when I run the job a second time it works fine.
Has anyone any idea about this error, or why the job would work fine the second time?
I'm new here and hope you will be able to help me.
I have created several SSIS packages with Visual Studio 2005. They all work fine in debug mode. I have been able to make them work with an ODBC connection by using an ADO.NET connection.
Then I exported them to the file system on my SQL Server 2005 server and created a job in SQL Agent to run them.
All the packages using the ODBC connection fail with the following error :
Login failed for user XXX Error : 18456; Severity : 14 , State : 8
This error is a password mismatch.
I tried several database users and checked the passwords multiple times.
It looks like SQL Agent is not able to retrieve the password, although it is stored in both the ODBC connection and the SSIS connection.
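This pattern usually points at the package ProtectionLevel rather than SQL Agent itself: the default, EncryptSensitiveWithUserKey, encrypts the ODBC password with the profile of whoever saved the package, so when the Agent account loads it the password comes back blank and the login fails with error 18456. A hedged sketch that re-saves a package with EncryptSensitiveWithPassword is below; the path and password are placeholders, and the job step then has to supply the same password (dtexec's /Decrypt option), or you can switch to DontSaveSensitive and keep the password in a configuration.

    ' Hedged sketch: re-save a package so its sensitive values are encrypted with
    ' a package password instead of the author's user key. Path and password are
    ' placeholders.
    Imports Microsoft.SqlServer.Dts.Runtime

    Module ReprotectPackage
        Sub Main()
            Dim app As New Application()
            Dim pkg As Package = app.LoadPackage("C:\Packages\OdbcExtract.dtsx", Nothing)

            pkg.ProtectionLevel = DTSProtectionLevel.EncryptSensitiveWithPassword
            pkg.PackagePassword = "placeholder-password"

            app.SaveToXml("C:\Packages\OdbcExtract.dtsx", pkg, Nothing)
        End Sub
    End Module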
I am running into an error when running a package using a scheduled job under the SQL Server Agent account. I am getting the following error:
Date 6/12/2007 4:19:15 PM
Log Job History (VistaODSFeed)
Step ID 0
Server 006-DEVSQL2005
Job Name VistaODSFeed
Step Name (Job outcome)
Duration 00:00:00
Sql Severity 0
Sql Message ID 0
Operator Emailed
Operator Net sent
Operator Paged
Retries Attempted 0
Message The job failed. Unable to determine if the owner (ACIsnasir) of job VistaODSFeed has server access (reason: Could not obtain information about Windows NT group/user 'ACIsnasir', error code 0x6ba. [SQLSTATE 42000] (Error 15404)).
ACIsnasir is not the account under which the SQL Server Agent service runs. However, ACIsnasir has sa privileges. I am not sure why I am getting ACIsnasir in the error and not the account under which SQL Server Agent runs.