Package Succeeds, But Doesn't Work When Run As Job
Mar 3, 2008
I have an SSIS package that has this logic:
Map drive(batch file)
For each file (csv) loop
Pump file data into sql server
Move file to "archive" directory(file system task)
Delete File (file system task)
End Loop
Unmap Drive (batch file)
The Map/unmap code is in a batch file
c:\windows\system32\net use \\10.10.10.10\ShareName MyPassword /USER:MyUserName /YES
Unmap:
c:\windows\system32\net use \\10.10.10.10\ShareName /DELETE /YES
Here are the results when running this package:
1. Running in BIDS on a separate workstation: everything OK.
2. Running on the server by right-clicking the package in Integration Services (SSMS) and choosing "Run": everything OK.
3. Running as a job with SQL Agent: the package succeeds, but no action was taken on the files; the files in "ShareName" are still there, and therefore no data was pumped into SQL Server.
Now, the difference is that the SQL Agent job runs under a domain account proxy. I'm not sure how that would affect things, though: I have the tasks in the package set to fail the package if they fail, and they are not failing, so the drives are being mapped OK.
The computer with the share is non-domain, but that shouldn't matter; I am specifying the local username and password in the batch file as you can see, and it works from the workstation in BIDS on a separate machine, and on the server too as long as I don't run it as a job. The batch file sits on both the server and the workstation at the same local path.
Any idea why the files aren't actioned when run as a job?
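For reference, this is roughly how the credential/proxy side is wired up on the server; a minimal sketch only, with hypothetical credential, proxy, and account names (none of these are taken from the post), and the subsystem name should be checked against msdb.dbo.syssubsystems on the actual build:

-- Hedged sketch: a SQL Agent proxy for SSIS package steps, with placeholder names.
USE master;
CREATE CREDENTIAL SsisRunCredential
    WITH IDENTITY = N'MYDOMAIN\SsisRunAccount',   -- domain account the job step runs as
         SECRET   = N'StrongPasswordHere';

USE msdb;
EXEC dbo.sp_add_proxy
    @proxy_name      = N'SsisRunProxy',
    @credential_name = N'SsisRunCredential',
    @enabled         = 1;

-- Allow the proxy to run SSIS package steps (verify the subsystem name in msdb.dbo.syssubsystems)
EXEC dbo.sp_grant_proxy_to_subsystem
    @proxy_name     = N'SsisRunProxy',
    @subsystem_name = N'SSIS';

Whatever account that credential maps to is the one that has to be able to reach \\10.10.10.10, which is why behaviour can differ from running the package interactively.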
I have several DTS packages saved 'locally' to the SQL Server. I want to duplicate a package so that I can make some changes and then replace the original. I certainly don't want to rebuild the entire package from scratch. So I open the original package, go to the 'Package' menu and choose 'Save As', give it a new name and press OK. No errors, all appears well, and the title bar even shows the new name of the package. But when I close the package and go to the 'local' package list, the new package name doesn't appear in the list. Refresh, exit SEM, reboot: it doesn't show up. I even looked in the MSDB table where packages are supposed to be stored (at least the name / package id / etc.), and it doesn't show there either. Tried from several client machines.
OS: Windows 2000 Server (advanced) SP2 SQL: SQL 2000 Server (no SP's)
I have many jobs on sql 05 and all work but one. This one writes to an Access DB on the same server as SQL. The package works fine. But when executed in the context of the SQL Agent job, it fails.
Jobs that write to a text file work fine. The Access DB has no password required. By the way, that job in sql 2000 worked fine.
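One thing worth checking is which account or proxy the failing step actually runs under, since that account needs file-level access to the Access .mdb; a rough diagnostic query against msdb (a sketch for SQL 2005, not verified against this particular setup):

-- Sketch: list job steps and any proxy assigned to them.
SELECT j.name AS job_name,
       s.step_id,
       s.step_name,
       s.subsystem,
       p.name AS proxy_name
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobsteps AS s ON s.job_id = j.job_id
LEFT JOIN msdb.dbo.sysproxies AS p ON p.proxy_id = s.proxy_id
ORDER BY j.name, s.step_id;

A NULL proxy_name on a non-T-SQL step means the step runs as the SQL Agent service account.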
I'm stuck with this and I have no idea how to solve it. I'm trying to migrate a DTS 2000 package from BIDS and I get this message:
This wizard will close because it encountered the following error:
Index was out of range. Must be non-negative and less than the size of the collection. Parameter name: index (mscorlib)
I go to Migrate DTS 2000 Package and select my current SQL 2000 production server (it has almost 600 DTS packages, although I don't think that is a problem at all).
The wizard recognizes my server without problems, and I then choose a folder to save the packages to, but on the next step the aforementioned message appears.
I am trying to execute an SSIS package from an MS Access 2003 database that imports a table from the Access database into a target table in SQL 2005. I saved the package in SQL 2005 and tested it out. If I run it from the Management Studio console with Run->Execute, everything works just fine. However, if I try to run it using the command line "Exec master.dbo.xp_cmdshell 'DTExec /SER DATAFORCE /DTS SQL2005TestPackage /CHECKPOINTING OFF /REPORTING V'", the execution always fails when the Access database is open (shared mode). The connection manager looks like this: "Data Source=E:\Test.mdb;Provider=Microsoft.Jet.OLEDB.4.0;Persist Security Info=False;Jet OLEDB:Global Bulk Transactions=1". The error is listed below:
Code: 0xC0202009 Source: NewPackage Connection manager "SourceConnectionOLEDB" Description: An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft JET Database Engine" Hresult: 0x80004005 Description: "Could not use ''; file already in use.".
Has anyone managed to successfully pass a variable from a parent package to a child package? I've tried a zillion permutations and I can't get it to work. The strange thing is that I was able to do this successfully with pre-RTM builds. Basically, what I am trying to do is:
The parent package has a variable, e.g. ExecutionID, which I set using a script to System::ExecutionInstanceGUID. I verified that the variable is set correctly by dumping it to a SQL Server table. I created a child package variable with the same name. In the child package, I've created a parent package configuration that points to the ExecutionID variable. I am trying to read the variable in a Derived Column task in which I have a column linked to @ExecutionID. This doesn't work. Step-by-step instructions from someone who managed to conquer this would be greatly appreciated.
Oh, I also didn't have any luck hitting a breakpoint in a script task inside a child package, with either in-process or out-of-process execution.
This problem is a bit weird but I'm just wondering if anybody else experienced this.
I have a package that has file system tasks (copying dtsx files actually). Basically the package copies other packages to a pre-defined destination. Thing is, it only works if one of the packages it is configured to copy has some sort of sensitive data (e.g., a connectionstring with a password), otherwise it reports a success message on execution but doesn't actually do anything. I've checked the forcedexecutionresult and it is set to None for that matter.
Just wondering if anybody else experienced this problem and of course if there's a way to solve it.
I'm pretty new to DTS, so forgive me if this is basic. I created a simple DTS package to run a query and export it to a text file. I can execute the package fine from my workstation through EM, but when I try to execute the job to run the package I get this error: Error = -2147467259 (80004005) Error string: Error opening datafile: Access is denied.
I think that maybe SQL Agent doesn't have the right permissions to write to that network drive. What should the permissions be?
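If xp_cmdshell is available, a quick way to see which Windows account the job actually executes under (and therefore whose permissions apply to the network drive) is something like the following; a sketch only, and whoami may not be present on older Windows builds (the UNC path below is a placeholder):

-- Run as a T-SQL job step to see the Windows account the job runs under.
EXEC master.dbo.xp_cmdshell 'whoami';
-- Then test whether that account can reach the target share (placeholder path):
EXEC master.dbo.xp_cmdshell 'dir \\yourserver\yourshare';

Whatever account shows up there is the one that needs write permission on the network location the text file is exported to.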
This is probably very simple, but I can't get passed this problem.
I have a report in MS Access that uses info generated by a query. One of the text fields in the query contains either the word 'Select' or the name of a course. The report should display a space if the value is 'Select', or the actual value of the field in any other case. The field can never contain a null value.
I've used: =IIf([optVoc1]="Select","",[optVoc1]) in the text box on the report, but this only returns #error regardless of the actual content of the field.
I created and scheduled a SQL job to run every minute to update a table based on a certain condition, but it doesn't work. The job history says successful every time, but the table doesn't get updated.
However, if I move the statement to Query Analyzer and run it as dba, it works. Thinking that it may have to do with the user the job runs as, I then changed the run-as user from self to dba. But the SQL job still won't update my table.
Is there anything about user permissions or security that I can check? Or is there any other possibility?
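To see exactly what context the job step executes in, it can help to log it from inside the job itself; a minimal sketch using a throwaway table (the table and its name are placeholders):

-- Add this to the job step (or a step just before it) to capture the execution context.
CREATE TABLE dbo.JobContextLog
(
    LoggedAt  datetime NOT NULL DEFAULT (GETDATE()),
    LoginName sysname  NOT NULL,
    UserName  sysname  NOT NULL,
    DbName    sysname  NOT NULL
);

INSERT INTO dbo.JobContextLog (LoginName, UserName, DbName)
SELECT SUSER_SNAME(), USER_NAME(), DB_NAME();

The DbName column is worth a look too: a job step that silently runs in the wrong database would also report success while leaving the intended table untouched.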
When I run the SELECT it's fine, but I cannot delete. I have done this many times and it has worked. I cannot see the error; what am I missing?
select eqnow.empnumber, eqnow_names.empnumber, eqnow_names.names
--delete
from eqnow
inner join eqnow_names on eqnow.empnumber = eqnow_names.empnumber
where eqnow_names.names is null
I get this error: Server: Msg 156, Level 15, State 1, Line 4 Incorrect syntax near the keyword 'inner'.
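For comparison, SQL Server does not accept a join written directly after DELETE FROM; the usual pattern names the target table and then repeats it in a FROM clause. A sketch of that form, reusing the tables from the post (treat it as an illustration of the syntax, not a tested statement against this data):

-- Delete-with-join form: name the delete target, then join it in FROM.
DELETE eqnow
FROM eqnow
INNER JOIN eqnow_names
    ON eqnow.empnumber = eqnow_names.empnumber
WHERE eqnow_names.names IS NULL;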
When I try to install the product I get the following error.
The SQL Server service failed to start. For more information, see the SQL Server Books Online topics, "How to: View SQL Server 2005 Setup Log Files" and "Starting SQL Server Manually."
The log tells me nothing useful. I can't start the thing manually because after clicking cancel on the error message, the installer proceeds to roll back the installation.
This is the autogenerated code from the SelectCommand of my DataAdapter, except the red text. The DataAdapter is used to fill a DataGrid. What I want to do is calculate the total memory (4 slots) per PC. This code sums the memory of all PCs together. I'm not sure if the GROUP BY clause is needed here...

Me.OleDbSelectCommand1.CommandText = _
    "SELECT PC.ID, PC.Nummer, PC.Netwerknaam, Case_Type.Type AS Case_Type, " & _
    "Processor_Type.Type AS Processor_Type, Processor_Snelheid.Snelheid AS Processor_Snelheid, " & _
    "(SELECT SUM(Memory) FROM Memory, PC, RAM WHERE RAM.PcID = PC.ID AND RAM.GrootteID = Memory.ID) AS Memory, " & _
    "OS.Naam AS OS, OS_SP.Nummer AS OS_SP, Gebruiker.Naam AS Gebruiker_Naam, Status.Status, PC.Tagged " & _
    "FROM (Status RIGHT OUTER JOIN ((((((((PC LEFT OUTER JOIN (RAM LEFT OUTER JOIN Geheugen ON RAM.GrootteID = Geheugen.ID) ON PC.ID = RAM.PcID) " & _
    "LEFT OUTER JOIN Case_Type ON PC.Case_TypeID = Case_Type.ID) " & _
    "LEFT OUTER JOIN OS_SP ON PC.OS_SpID = OS_SP.ID) " & _
    "LEFT OUTER JOIN Gebruiker ON PC.GebruikersID = Gebruiker.ID) " & _
    "LEFT OUTER JOIN Processor_Snelheid ON PC.Processor_SnelheidID = Processor_Snelheid.ID) " & _
    "LEFT OUTER JOIN Processor_Type ON PC.Processor_TypeID = Processor_Type.ID) " & _
    "LEFT OUTER JOIN OS ON PC.OsID = OS.ID) " & _
    "LEFT OUTER JOIN Switchbox_Details ON PC.ID = Switchbox_Details.PcID) ON Status.ID = PC.StatusID) " & _
    "GROUP BY PC.ID, PC.Nummer, PC.Netwerknaam, Case_Type.Type, Processor_Type.Type, Processor_Snelheid.Snelheid, " & _
    "OS.Naam, OS_SP.Nummer, Gebruiker.Naam, Status.Status, PC.Tagged"

I would like to know how to calculate the total memory for each separate PC. Hope you can help me.
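The shape I would expect for a per-PC total is a correlated subquery, where the inner SUM is tied to the outer PC row instead of scanning every PC. A sketch only; the size column on Geheugen (Grootte below) is an assumed name, so the column names are placeholders:

-- Correlated subquery: sums memory only for the PC of the current outer row.
-- Geheugen.Grootte is an assumed column name for the module size.
SELECT PC.ID,
       PC.Nummer,
       (SELECT SUM(Geheugen.Grootte)
        FROM RAM
        INNER JOIN Geheugen ON RAM.GrootteID = Geheugen.ID
        WHERE RAM.PcID = PC.ID) AS TotalMemory
FROM PC;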
I am trying to copy a DB from one server to another. On the target server an older version of the DB has been deleted, and I am now trying to attach the new version using "sp_attach_db DBname, Filelocation", but I always get an error: "Device activation error. The physical file name 'D:\mssql7\data\Agency_log.ldf' may be incorrect." "Database 'Agency' cannot be created"
To me it seems that the database is looking for the log file (now deleted). I've tried pointing it at a new log file I created in the same location as the mdf. I've tried creating a new database and replacing the mdf file, but nothing works.
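If the data file was detached or copied after a clean shutdown, the single-file attach procedure can rebuild the log; a sketch (the data file path below is a guess based on the log path in the error message):

-- Sketch: attach just the .mdf and let SQL Server build a new log file.
-- Only works if the database was shut down cleanly before the file was copied.
EXEC sp_attach_single_file_db
    @dbname   = 'Agency',
    @physname = 'D:\mssql7\data\Agency_Data.mdf';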
I wanted to create a new trigger, but Enterprise Manager tells me about an "Incorrect syntax near @UpdatedByID, line 28". I double-checked everything, but it still does not work.
Any hints?
TIA,
-Gernot
Here is the statement (line 28 is marked with ***):
CREATE TRIGGER TransferToABII ON [dbo].[CALGeneral] FOR INSERT AS
BEGIN TRANSACTION
BEGIN
    DECLARE @Event varchar(255), @BBaseUID int, @StartDate smalldatetime, @EndDate smalldatetime,
            @Details varchar(255), @AddressID int, @ProjectID int, @UpdatedByID int, @ActID int, @EventID int
    BEGIN
        EXEC BrainBase.dbo.BB_NEW_CREATE_NoteTask_Ret
        *** (@UpdatedByID, @AddressID, @ProjectID, @BBaseUID, @StartDate, GetDate(), @Event,
             NULL, NULL, NULL, NULL, @Details text,
             @ActID = @ActID OUTPUT, @EventID = @EventID OUTPUT)
    END
    BEGIN
        UPDATE CALGeneral SET ActID = @ActID WHERE ID = INSERTED.ID
    END
END
IF @@ERROR <> 0
BEGIN
    RAISERROR('Error occured',16,1)
    ROLLBACK TRANSACTION
END
COMMIT TRANSACTION
I'm combining first name, last name, middle name, and an ID number together into an alias. Then I need to match that alias against a variable passed to the page (it's a search results page). The problem is it claims that there is no table with the name of my alias. Anyone know what I'm doing wrong?
A mockup of the SQL looks like this:
SELECT UserID, Last_Name + ', ' + First_Name + ' ' + Middle_Name + '.' AS name FROM Table WHERE name LIKE 'variable%'
Everything looks right with the results; if I take out the WHERE clause, name is displayed properly and joined together with the rest of the data in the results.
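For anyone hitting the same wall: a column alias defined in the SELECT list isn't visible to that same query's WHERE clause, so the filter has to either repeat the expression or wrap the query. A sketch of the derived-table form, with [Table] and 'variable%' standing in for the real table name and the passed-in search value:

-- Wrap the concatenation in a derived table so the alias exists by the time WHERE runs.
SELECT UserID, name
FROM (SELECT UserID,
             Last_Name + ', ' + First_Name + ' ' + Middle_Name + '.' AS name
      FROM [Table]) AS t
WHERE name LIKE 'variable%';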
Thanks in advance for any help that can be provided!
I have a query that doesn't work when I use the four-part naming convention instead of OPENQUERY. The message is below. Anyone know what is going on? Both queries are the same, but one doesn't work.
-- works SELECT TOP 1 * FROM OPENQUERY(AS400_PROD, 'SELECT * FROM PPTREASUSA.ORDDET')
-- doesn't work SELECT TOP 1 * FROM AS400_PROD.S1030Y3M.PPTREASUSA.ORDDET
Server: Msg 7399, Level 16, State 1, Line 1 OLE DB provider 'MSDASQL' reported an error. [OLE/DB provider returned message: Unspecified error] [OLE/DB provider returned message: [IBM][iSeries Access ODBC Driver][DB2 UDB]CPF5715 - File ORDDET01 in library QTEMP not found.] OLE DB error trace [OLE/DB Provider 'MSDASQL' IDBSchemaRowset::GetRowset returned 0x80004005: ].
Hi all, I made a stored procedure that uses sp_send_dbmail to send mail. SQL Server displays the message "Mail queued" but nothing is received.
Here is the code of the stored procedure I made:
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'Exams',
    @recipients = 'me@domain.com',
    @Body_format = 'HTML',
    @subject = 'Room Preparation',
    @body = 'hi there';
So can anyone help with this issue? Thanks in advance.
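When Database Mail reports "Mail queued" but nothing arrives, the msdb mail tables usually say why; a quick diagnostic sketch (standard Database Mail views, nothing specific to this setup):

-- Check the status of queued/sent items and any errors logged by Database Mail.
SELECT sent_status, subject, recipients, send_request_date
FROM msdb.dbo.sysmail_allitems
ORDER BY send_request_date DESC;

SELECT log_date, event_type, description
FROM msdb.dbo.sysmail_event_log
ORDER BY log_date DESC;

A sent_status of 'failed' or 'unsent' together with an entry in sysmail_event_log normally points at the SMTP server or account configuration rather than the calling procedure.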
I have a drillthrough that passes four parameters. Three are passed from the current selections in that report's parameters, and the fourth needs to be the customer name they click on in the body of the report, so it's passed as Fields!fieldname.Value. When I click on the customer name, the drillthrough fires but the report simply doesn't load. If I remove the parameter from the field clicked on and just pass the three parameters, it goes to the drillthrough correctly and that fourth parameter just takes the default defined for it in that report.
I can then simply check that parameter and select the value from the list that is exactly the same as the value I was attempting to pass in the drillthrough, and the report refreshes correctly.
Whatever it is, it's something specific to the way the value is passed in the drillthrough.
Hello, I've rescued an MDF and an LDF file off a client's old server, and I wanted to attach them to our own, but I can't seem to get the command to work. Basically I have these two files, which I've dropped on our server:
So when I do a SP_ATTACH_SINGLE_FILE_DB 'somedb','C:\Program Files\Microsoft SQL Server\MSSQL\Data\MYCLIENTNAME_Data.MDF'
it says the LDF path may be incorrect, and that there are two other files that are missing: MYCLIENTNAME_LOG (no extension) and extra_log (no extension).
I thought the whole point of the command is that you only need a single file? It's very hard to go back to the client's old server and try to find these two files, and it doesn't really matter if we lose a bit of data, so long as the bulk of it is available.
Update: I think I've found the answer... it's not possible to do this; it really needs all the log files. Any workarounds?
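If this is SQL Server 2005 or later and the database was cleanly shut down before the files were copied, there is an attach option that rebuilds missing logs; a hedged sketch (path copied from the attempt above, and it will still fail if the database was not shut down cleanly):

-- Sketch: attach the data file and rebuild the log(s).
-- Only succeeds for a cleanly shut down database; otherwise the original logs are required.
CREATE DATABASE somedb
ON (FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL\Data\MYCLIENTNAME_Data.MDF')
FOR ATTACH_REBUILD_LOG;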
I have been looking at this for over a day now. I cannot see why this procedure does not work; it's so simple. No matter what happens it always returns 0. If it locates a record, it doesn't update it, yet it still returns 0. It should not be returning 0 if it's not updating, so I can't figure out why it does. Why does this always return 0?
Create Procedure CreateNewCategory
    @title nvarchar(100),
    @description nvarchar(1000),
    @displayOrder int
AS
    DECLARE @Result as int
    IF EXISTS(SELECT categoryTitle FROM categories WHERE categoryTitle = @title)
    BEGIN
        SELECT @Result = 1
    END
    ELSE
    BEGIN
        INSERT INTO categories(categoryTitle, categoryDescription, displayOrder)
        VALUES(@title, @description, @displayOrder)
        /* If no error was encountered, 0 will be returned. */
        SELECT @Result = @@Error
    END
GO
Thanks!
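Reading it back, one thing that stands out is that @Result is set but never handed back, so the caller only ever sees the procedure's default return code. A sketch of the same logic with an explicit RETURN, renamed so it doesn't collide with the original procedure (the name is hypothetical):

-- Sketch: same procedure, but the computed value is actually returned to the caller.
CREATE PROCEDURE CreateNewCategoryWithReturn
    @title nvarchar(100),
    @description nvarchar(1000),
    @displayOrder int
AS
    DECLARE @Result int
    IF EXISTS (SELECT categoryTitle FROM categories WHERE categoryTitle = @title)
        SET @Result = 1
    ELSE
    BEGIN
        INSERT INTO categories (categoryTitle, categoryDescription, displayOrder)
        VALUES (@title, @description, @displayOrder)
        SET @Result = @@ERROR   -- capture immediately after the INSERT
    END
    RETURN @Result   -- without an explicit RETURN, callers only see the default 0
GO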
Since I have to go across the network, I'm trying to use the UNC. However, this won't even work when I'm using the UNC to point to the server on which this is run. I'm trying to restore a single table on 6.5. What is the obvious piece that I'm missing?
This works.
LOAD TABLE address FROM DISK = 'd:\MSSQL\backup\DBBackup.DAT' WITH source='address'
This doesn't.
LOAD TABLE address FROM DISK = '\\server1\d$\MSSQL\backup\DBBackup.DAT' WITH source='address'
Hello, I upgraded to Beta 3 and now mail isn't working (I hate mail!). I found 2 error messages: one when trying to add an operator's email ID, and the other when I try to "test" mail under server properties.
- Unable to start a mail session on the server with this mail profile.
- no mail profile defined.
The account that I am using is used for all our SQL Servers and was working on this server on 6.5 before the upgrade. I have checked the new SQLServerAgent service and it's using the correct account.
Any help would be appreciated.
And to all you that are off for a long Labor Day weekend...I'm jealous!
Hi all, I have a problem with this trigger. It seems to be very simple, but it doesn't work...
I created a trigger on a table, and I want it to update a field of a table in a different DB (Intranet). When I test it normally (a real situation), it doesn't work, but when I do an explicit update ("UPDATE AccesCard SET LastMove = getDate();" for example) it works.
If anyone could help me, I would appreciate it.
NB: Is there a special way, in a trigger, to update a table when the table to update is in another DB?
Francois
This is the trigger: ------------------------------------------------------------
ALTER TRIGGER UStatus ON AccesCard AFTER UPDATE, INSERT AS
DECLARE @noPerson int
SET NOCOUNT OFF
IF UPDATE(LastMove)
BEGIN
    SELECT @noPerson = Person FROM INSERTED
    UPDATE Intranet.dbo._Users SET Intranet.dbo._Users.status = 1 WHERE personNo = @noPerson;
END
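Cross-database updates from a trigger are fine with the three-part name already used here. One thing the single-variable pattern misses is multi-row inserts/updates, so a set-based version of the same update, joining directly to INSERTED, may behave more predictably. A sketch of that statement only (it is meant to replace the SELECT/UPDATE pair inside the trigger body, since INSERTED only exists there):

-- Sketch: same update written against the whole INSERTED set,
-- so multi-row changes to AccesCard are handled too.
UPDATE u
SET u.status = 1
FROM Intranet.dbo._Users AS u
INNER JOIN INSERTED AS i
    ON u.personNo = i.Person;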
I have the following situation: A webserver in a DMZ which connects to a DbServer in a Domain.
but when I try to make an ODBC connection on the webserver I get the well-known "Server does not exist or access denied" error.
I tried using IP address as well as FQDN. I turned off the firewall in the router (I'm able to access shares and what not from the webserver so the firewall is really turned off).
I think, however, that I have narrowed down the problem to the TCP/IP connectivity of the SQL Server. Normally when you run "telnet dbserver 1433" from a command prompt you get a connection. But when I try this to the specific DbServer I get an "unable to connect" error, even when I try it on the DbServer itself (so telnet localhost 1433). Shouldn't that always work?
In the network configuration of the SQL Server, the Named Pipes as well as the TCP/IP protocols are enabled. TCP/IP is of course configured on port 1433.
Hi again, all...here I am again, trying to work on my CloseIndex thang again...Same subject, different tack...
Basically, I have two tables...for the sake of simplicity, let me define them as:
PortfolioIndex
    PortfolioID int
    CreateDate smalldatetime
    CloseIndex float

PortfolioPerformance
    PortfolioID int
    CreateDate smalldatetime
    PrevDate smalldatetime
    DailyPerChg float
UPDATE PortfolioIndex
SET CloseIndex = CASE WHEN PPI.CloseIndex IS NULL THEN 100.00
                      ELSE (PPI.CloseIndex + (PPI.CloseIndex * PP.DailyPerChg / 100)) END
FROM PortfolioIndex AS P
    INNER JOIN PortfolioIndex AS PPI ON (P.PortfolioID = PPI.PortfolioID),
    PortfolioPerformance AS PP
WHERE (P.PortfolioID = PP.PortfolioID)
    AND ((P.CreateDate = PP.CreateDate)
    AND (P.CreateDate = @CreateDate)
    AND (PPI.CreateDate = PP.PrevDate))
What I am trying to do is...get the previous day's portfolioIndex row's CloseIndex and create a new one for today's row.
As ugly as it is, it works when I execute it in Query Analyzer, but when I try to create the stored procedure, the syntax check complains that the PortfolioIndex reference in the UPDATE... part is AMBIGUOUS. Yet when I define it in the SP as P.PortfolioIndex, it fails at run time saying there is no object named P.PortfolioIndex (well, of course there isn't!).
How can I make this work (and if possible, make it prettier too! *L* ;) )
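One pattern that usually resolves the ambiguity complaint is to make the alias itself the UPDATE target, so only one reference to PortfolioIndex remains unaliased. A sketch of the same statement in that form, with the comma join rewritten as explicit joins but the conditions otherwise unchanged (treat it as a starting point, not a tested rewrite):

-- Sketch: update the aliased instance P instead of the bare table name.
UPDATE P
SET CloseIndex = CASE WHEN PPI.CloseIndex IS NULL THEN 100.00
                      ELSE (PPI.CloseIndex + (PPI.CloseIndex * PP.DailyPerChg / 100)) END
FROM PortfolioIndex AS P
INNER JOIN PortfolioIndex AS PPI
    ON P.PortfolioID = PPI.PortfolioID
INNER JOIN PortfolioPerformance AS PP
    ON P.PortfolioID = PP.PortfolioID
   AND P.CreateDate = PP.CreateDate
   AND PPI.CreateDate = PP.PrevDate
WHERE P.CreateDate = @CreateDate;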