SQL 2012 :: Job Fails On First Wednesday Of Every Month?
Jun 26, 2014
Got a job here that fails on the first Wednesday of every month. It's a complex job (66 steps at last count) that takes roughly 7 hours a night. It starts at 1am, and should be finished by 8am.
The first step normally takes under 5 minutes and is one of the quicker steps. However, on the first Wednesday of every month, the first step never finishes; it'll still be running when I come in at 8.20 (ish...). Kill the job, start it again, and no matter what it will not get past step 1. The only solution is to restart the service (not the server!), and then restart the job. When we do this it then goes through in the usual less than 5 minutes.
According to all the monitoring I'm doing, the actual step is not doing anything; there doesn't appear to be any processing, no CPU usage, no reads, no writes etc. It just will not process.
I've tried clearing the cache, both ad-hoc and proc (the step runs a SP). Although I have to confess I've not tried running the SP from step 1 on its own to see if that works, and then running the job from step 2. Part of the issue is we need to get the process running as quickly as possible, and I don't generally have time to experiment.
Ops and Tech Support say there is nothing running on the server (servers - as it's a cluster) - no AV downloads or anything. The fact that I only have to restart the service, rather than the server, would imply it's a SQL thing.
I use a Derived Column to convert a string date from a flat file, like "Jan 02 2005", into a datetime. I have seen in the forum the suggestion to use: (DT_DATE)(SUBSTRING(mydate,5,2) + "-" + SUBSTRING(mydate,1,3) + "-" + SUBSTRING(mydate,8,4)) However, even though it produces a string like '02-Jan-2005', the subsequent cast to DT_DATE fails. I have also tried inverting month and day, and year/month/day, but all with the same result:
[Derived Column [73]] Error: The component "Derived Column" failed because error code 0xC0049064 occurred, and the error row disposition on "output column"...
I think the cast fails because of the month format. Therefore the only solution would be to code it into a lookup table: Jan, 01 | Feb, 02 | ... ????
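One way to avoid a separate lookup table is to fold the mapping straight into the Derived Column expression. This is only a sketch in SSIS expression syntax, assuming the incoming column is called mydate as in the post: the month abbreviation is swapped for its number and the parts are rearranged into an ISO-style "yyyy-mm-dd" string, which casts cleanly to DT_DATE.

(DT_DATE)(SUBSTRING(mydate,8,4) + "-" +
    (SUBSTRING(mydate,1,3) == "Jan" ? "01" :
     SUBSTRING(mydate,1,3) == "Feb" ? "02" :
     SUBSTRING(mydate,1,3) == "Mar" ? "03" :
     SUBSTRING(mydate,1,3) == "Apr" ? "04" :
     SUBSTRING(mydate,1,3) == "May" ? "05" :
     SUBSTRING(mydate,1,3) == "Jun" ? "06" :
     SUBSTRING(mydate,1,3) == "Jul" ? "07" :
     SUBSTRING(mydate,1,3) == "Aug" ? "08" :
     SUBSTRING(mydate,1,3) == "Sep" ? "09" :
     SUBSTRING(mydate,1,3) == "Oct" ? "10" :
     SUBSTRING(mydate,1,3) == "Nov" ? "11" : "12")
    + "-" + SUBSTRING(mydate,5,2))

Note the comparisons are case-sensitive, so the month abbreviations must match the file exactly.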
My goal is to select values from the same date range for a month-on-month view, to compare values month over month. I've tried using the date trunc function but I'm not sure what the best way to attack this is. My thoughts are I need to somehow select the first day of every month + interval 'x days' (but I don't know the syntax). In other words, I want to see:
Select Jan 1-23rd, Feb 1-23rd, March 1-23rd, April 1-23rd, value from table
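A minimal T-SQL sketch of one way to get that, assuming a table SalesData(SaleDate, Value) - both names are placeholders: only the days of each month up to today's day-of-month are kept, then the remainder is grouped per month.

SELECT
    DATEFROMPARTS(YEAR(SaleDate), MONTH(SaleDate), 1) AS MonthStart,
    SUM(Value) AS MonthToDateValue
FROM SalesData
WHERE DAY(SaleDate) <= DAY(GETDATE())   -- same day-of-month cutoff (e.g. the 23rd) in every month
GROUP BY DATEFROMPARTS(YEAR(SaleDate), MONTH(SaleDate), 1)
ORDER BY MonthStart;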
Hello, what I'd like is to display the following in a matrix report:
Parameter selected: 3 (March), 2008 (Year)
ArtNo    Monthly TO (March)    Summed up (<= March)
1210     20,500                50,900
1220     21,200                64,000
1230     15,400                40,300
...      ...                   ...
So, in the rows I have the articles and in the column the selected month via parameter. In another column I need to sum up all monthly values up to the selected month, meaning in this example the sum of jan, feb and mar per article.
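A minimal sketch of the dataset such a matrix could be built on, assuming a table MonthlyTurnover(ArtNo, [Year], [Month], TO_Amount) - the table and column names are placeholders - and the two report parameters as integers:

DECLARE @Month int = 3, @Year int = 2008;

SELECT ArtNo,
       SUM(CASE WHEN [Month] = @Month THEN TO_Amount ELSE 0 END) AS [Monthly TO],
       SUM(TO_Amount)                                            AS [Summed up <= Month]
FROM MonthlyTurnover
WHERE [Year] = @Year
  AND [Month] <= @Month
GROUP BY ArtNo;

Filtering on [Month] <= @Month makes the plain SUM the year-to-selected-month figure, while the CASE picks out the selected month on its own.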
I am trying to add a month to a date. Here is my code:
declare @CollectionDate date = '10-28-2014'
select @CollectionDate
;WITH CTemp AS
(
    SELECT TransactionDate = CAST(@CollectionDate AS DATE)
          ,RemainingTransaction = 1
    UNION all
[Code] ....
It is working fine. But when I give the date '10-30-2014' it shows me the error:
Msg 241, Level 16, State 1, Line 3 Conversion failed when converting date and/or time from character string.
I can understand that the problem is with the month of February. But how do I overcome the situation?
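One sketch of a way around it, assuming the goal is simply one row per month for a year: build each new date with DATEADD(MONTH, ...) rather than assembling it as a string. DATEADD clamps to the last valid day of the target month (a month after '2014-01-30' is '2014-02-28'), so the Msg 241 conversion error never arises.

DECLARE @CollectionDate date = '2014-10-30';

;WITH CTemp AS
(
    SELECT TransactionDate = @CollectionDate,
           RemainingTransaction = 1
    UNION ALL
    SELECT DATEADD(MONTH, 1, TransactionDate),
           RemainingTransaction + 1
    FROM CTemp
    WHERE RemainingTransaction < 12
)
SELECT TransactionDate, RemainingTransaction
FROM CTemp;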
I'm trying to find the most succinct way to get the last occurrence of April 1st given a date.
At the moment I'm using this:
DECLARE @Date DATE = '20131217'
SELECT CONVERT(DATE, CAST(DATEPART(YEAR,
    IIF( --If we're at the start of a year we'll need to go back a year
        DATEPART(MONTH, @Date) IN (1,2,3),
        DATEADD(YEAR, -1, @Date),
        @Date
    )) AS VARCHAR(4)) + '0401')
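A possibly more succinct alternative, sketched with DATEFROMPARTS (available from SQL Server 2012), which avoids the string build-up entirely:

DECLARE @Date date = '20131217';

SELECT DATEFROMPARTS(
           YEAR(@Date) - CASE WHEN MONTH(@Date) < 4 THEN 1 ELSE 0 END,
           4, 1) AS LastAprilFirst;   -- previous year's April 1st for Jan-Mar dates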
What's the best way to calculate a customers age and value by month and year?
I need to be able to calculate customer value by month and year, and then to calculate their age at each month in time. I've found a way of grouping sales by month and year that includes age for a particular contact like this:
select fh.contact_number , concat(year(fh.transaction_date), '-', month(fh.transaction_date)) as transaction_month_year , cast(fh.transaction_date as date) as transaction_date , sum(fh.amount) as ttl_amount_in_month
[Code] .....
It seems to work, but any better way to achieve this?
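One possible refinement, sketched under assumptions (the post's fh alias standing in for the transaction table, plus a contacts table c(contact_number, date_of_birth)): grouping on a real first-of-month date keeps the months sorting correctly ('2014-2' would otherwise sort after '2014-10' as a string), and the age is worked out as of the transaction date rather than stored.

select fh.contact_number
     , datefromparts(year(fh.transaction_date), month(fh.transaction_date), 1) as transaction_month
     , sum(fh.amount) as ttl_amount_in_month
     -- age as of the earliest transaction in the month
     , min(datediff(year, c.date_of_birth, fh.transaction_date)
           - case when dateadd(year, datediff(year, c.date_of_birth, fh.transaction_date), c.date_of_birth)
                       > fh.transaction_date then 1 else 0 end) as age_in_month
from fh
join c on c.contact_number = fh.contact_number
group by fh.contact_number
       , datefromparts(year(fh.transaction_date), month(fh.transaction_date), 1);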
In the above data, no records exist for 201403, 201404 and 201405. The query I wrote returns only the months for which a LeftCount exists, but I am looking for a query which gets the data in the format below.
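A minimal sketch of the usual approach, assuming the real table is something like LeftData(YearMonth char(6), LeftCount int) - both names are placeholders: generate every month of the range first, then left-join the data so the empty months still come back.

DECLARE @Start date = '2014-01-01', @End date = '2014-12-01';

;WITH Months AS
(
    SELECT @Start AS MonthStart
    UNION ALL
    SELECT DATEADD(MONTH, 1, MonthStart) FROM Months WHERE MonthStart < @End
)
SELECT CONVERT(char(6), m.MonthStart, 112) AS YearMonth,   -- e.g. 201403
       ISNULL(d.LeftCount, 0)              AS LeftCount
FROM Months m
LEFT JOIN LeftData d
       ON d.YearMonth = CONVERT(char(6), m.MonthStart, 112)
ORDER BY m.MonthStart;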
I have a 12-month report and I need to show the volume and the difference between the current and previous month's volume. What is the smart way to do this? Do I need to put the previous month's value onto the same row horizontally? I think there should be some other smart way - I heard about the LEAD function?
This is what I think for now. It should also be listed per ClientID; in my example I have a single ClientID for simplicity.
I tried to use LEAD but without success.
/* IF OBJECT_ID('tempdb..#t') is not null drop table #T; WITH R(N) AS (SELECT 1 UNION ALL SELECT N+1 FROM R WHERE N <= 12 ) SELECT N as Rn, 10001 ClientID, DATENAME(MONTH,DATEADD(MONTH,-N,GETDATE())) AS [Month],
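For the difference itself, LAG (the backwards-looking cousin of LEAD) may be simpler than putting the previous value on the same row by hand. A minimal sketch, assuming the monthly figures already sit in a table or CTE MonthlyVolume(ClientID, MonthStart, Volume) - names are placeholders:

SELECT ClientID,
       MonthStart,
       Volume,
       LAG(Volume) OVER (PARTITION BY ClientID ORDER BY MonthStart)          AS PrevVolume,
       Volume - LAG(Volume) OVER (PARTITION BY ClientID ORDER BY MonthStart) AS Diff
FROM MonthlyVolume
ORDER BY ClientID, MonthStart;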
I need to build a report that compares a count on a certain day of the week by month by year by stacks. That is, for the first Monday in October 2012 against the first Monday in October 2013 for stack DM1, against the first Monday in October 2014 for stack DM1, and the same for the second Monday, first Tuesday, second Tuesday, etc. Attached is a sample dataset and what I want to achieve.
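A minimal sketch of the grouping, assuming a table StackCounts(CountDate date, Stack varchar(10), Cnt int) - all names are placeholders; the occurrence of a weekday within its month ("first Monday", "second Monday", ...) falls straight out of the day number.

SELECT DATENAME(WEEKDAY, CountDate)   AS DayOfWeek,
       (DAY(CountDate) - 1) / 7 + 1   AS OccurrenceInMonth,   -- 1 = first, 2 = second, ...
       MONTH(CountDate)               AS [Month],
       YEAR(CountDate)                AS [Year],
       Stack,
       SUM(Cnt)                       AS TotalCount
FROM StackCounts
GROUP BY DATENAME(WEEKDAY, CountDate),
         (DAY(CountDate) - 1) / 7 + 1,
         MONTH(CountDate),
         YEAR(CountDate),
         Stack;

Pivoting the years into columns (or comparing them with LAG over [Year]) then gives the year-on-year view.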
I have @Year and @Month as parameters , both integers , example @Year = '2013' , @Month = '11'
SELECT T.[Year], T.[Month]
-- select the sum for each year/month combination using a correlated subquery (each result from the main query causes another data retrieval operation to be run) , (SELECT SUM(Stores) FROM #ABC WHERE [Year] = T.[Year] AND [Month] = T.[Month]) AS [Sum_Stores], (SELECT SUM(SalesStores)
[code]....
What I want to do is add more columns to the query which show the difference from the last month, as shown below. Example: the Diff beside the Sum_Stores shows the difference in the Sum_Stores from last month to this month.
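A minimal sketch of those extra columns, reusing the #ABC temp table from the post (its exact columns are an assumption): aggregate once, then let LAG hand each row the previous month's totals so the Diff columns become plain subtractions.

;WITH Monthly AS
(
    SELECT [Year], [Month],
           SUM(Stores)      AS Sum_Stores,
           SUM(SalesStores) AS Sum_SalesStores
    FROM #ABC
    GROUP BY [Year], [Month]
)
SELECT [Year], [Month],
       Sum_Stores,
       Sum_Stores      - LAG(Sum_Stores)      OVER (ORDER BY [Year], [Month]) AS Diff_Stores,
       Sum_SalesStores,
       Sum_SalesStores - LAG(Sum_SalesStores) OVER (ORDER BY [Year], [Month]) AS Diff_SalesStores
FROM Monthly
ORDER BY [Year], [Month];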
Based on the description below on average how many hours a month would it take to monitor and maintain the MSSQL Server databases?
Description of IT infrastructure: All Windows Servers and MSSQL Servers are up to date on patches and best practices.
Corporate site with 3 remote sites.
All remote sites have one DC and one MSSQL Server.
The corporate site has one MSSQL Server.
Replication is performed between the remote MSSQL databases and the corporate office MSSQL database.
There is no in-house DBA. All DBA services will have to be outsourced. I am trying to determine what is reasonable in budgeting for time involved for this service.
There is one project written in MS Access using Visual Basic for Applications (VBA) with the backend residing on these databases.
The question is, on average, approximately how many hours a month would it take a MSSQL DBA to monitor and maintain the health of the MSSQL Server databases? The DBA will not have to create any user reporting, queries, etc. - just maintain the existing MSSQL Server databases.
I have the table below and would like to create a view to show the number of days the property was vacant or void, and the rent loss, per month. The explanation below describes the required output.
For example we have a property (house/unit/apartment) and the tenant vacates on 06/09/2014. Let's say we fill the property back on 15/10/2014. From this we know the property was empty or void for 39 days. Now we need to calculate the rent loss. Based on the market rent of the property we can get this. Let's say the market rent for our property is $349/pw. So the rent loss for 39 days is 349/7*39 = $1944.43.
Now the tricky part, and what I'm trying to achieve. Since the property was void or empty across 2 months, I want to know how many days the property was empty in the first month and the rent loss in that month, and how many days the property was empty in the second month and the rent loss incurred in that month. Most of the properties are filled in the same month, and only in a few cases is the property empty across two months.
As shown below we are splitting the period 06/09/2014 - 15/10/2014 and then calculating the void days and rent loss per month
Period                     No of Void Days    Rent Loss
06/09/2014 - 30/09/2014    24                 349/7*24 = 1196.57
01/10/2014 - 15/10/2014    15                 349/7*15 = 747.85
I have uploaded a screenshot of the result at this link: [URL] ....
Declare @void Table
(
    PropCode VARCHAR(10)
   ,VoidStartDate date
   ,LetDate date
   ,Market_Rent Money
)
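A minimal sketch against the @void table above (populating it is assumed): a recursive CTE walks the months each void period touches, each slice running from the previous month-end (or the vacate date) to its own month-end (or the let date), so the per-month day counts add back up to the total void days and the rent loss is simply weekly rent / 7 * days.

;WITH MonthSlices AS
(
    SELECT PropCode, Market_Rent, LetDate,
           VoidStartDate AS SliceFrom,
           CASE WHEN EOMONTH(VoidStartDate) < LetDate
                THEN EOMONTH(VoidStartDate) ELSE LetDate END AS SliceTo
    FROM @void
    UNION ALL
    SELECT PropCode, Market_Rent, LetDate,
           SliceTo,
           CASE WHEN EOMONTH(SliceTo, 1) < LetDate
                THEN EOMONTH(SliceTo, 1) ELSE LetDate END
    FROM MonthSlices
    WHERE SliceTo < LetDate
)
SELECT PropCode,
       SliceFrom,
       SliceTo,
       DATEDIFF(DAY, SliceFrom, SliceTo)                   AS VoidDays,
       Market_Rent / 7 * DATEDIFF(DAY, SliceFrom, SliceTo) AS RentLoss
FROM MonthSlices
ORDER BY PropCode, SliceFrom;

For the example above this returns 24 void days for September and 15 for October, matching the split shown earlier.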
How do I find the sales trend of an employee by comparing current month and previous month sales?
The query I have got so far is the following:
;WITH SalesOrderHeader As ( SELECT ROW_NUMBER() OVER (ORDER BY SUM(H.SUBTOTAL)) AS ROWNUMBER, SUM(H.SUBTOTAL),H.SALESPERSONID,
[Code]....
I am getting the following error: The ORDER BY clause is invalid in views, inline functions, derived tables, subqueries, and common table expressions, unless TOP, OFFSET or FOR XML is also specified.
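A minimal reworking that avoids that error - ORDER BY lives only inside the OVER clause and on the outermost SELECT - and uses LAG for the previous month. The table and column names follow the AdventureWorks-style names in the post but are otherwise assumptions:

;WITH MonthlySales AS
(
    SELECT SalesPersonID,
           YEAR(OrderDate)  AS [Year],
           MONTH(OrderDate) AS [Month],
           SUM(SubTotal)    AS MonthTotal
    FROM Sales.SalesOrderHeader
    GROUP BY SalesPersonID, YEAR(OrderDate), MONTH(OrderDate)
)
SELECT SalesPersonID,
       [Year], [Month],
       MonthTotal,
       MonthTotal - LAG(MonthTotal) OVER (PARTITION BY SalesPersonID
                                          ORDER BY [Year], [Month]) AS ChangeFromPrevMonth
FROM MonthlySales
ORDER BY SalesPersonID, [Year], [Month];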
I would like to create an SQL view to divide an amount of 300,000 across 12 months, starting from July 2014 through June 2015, as shown below:
Amount    Month        Year
25,000    July         2014
25,000    August       2014
25,000    September    2014
25,000    October      2014
25,000    November     2014
25,000    December     2014
25,000    January      2015
25,000    February     2015
...       ...          ...
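A minimal sketch of the SELECT behind such a view, with the amount and start month hard-coded as assumptions (300,000 / 12 = 25,000 per month); in an actual view the two variables would become literals, and the ORDER BY would move to the query that reads the view.

DECLARE @Total money = 300000,
        @Start date  = '2014-07-01';

;WITH Months AS
(
    SELECT 0 AS n
    UNION ALL
    SELECT n + 1 FROM Months WHERE n < 11
)
SELECT @Total / 12                                AS Amount,
       DATENAME(MONTH, DATEADD(MONTH, n, @Start)) AS [Month],
       YEAR(DATEADD(MONTH, n, @Start))            AS [Year]
FROM Months
ORDER BY n;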
If I take a backup of a database in SQL 2012 with T-SQL as follows:
Backup database db1 to disk='c:\myfullbkup.bak' with init
go
This code works absolutely fine.
But if I try to take the backup through SSMS (Tasks -> Back Up in Object Explorer) and give the same path on the C: drive, or any other folder outside SQL's default "Backup" folder, it gives an access denied error. This was not the case in SQL Server 2008.
I'm using SQL Server 2012 and I need to run a query against my database that will output the difference between 2 dates (namely, DateOfArrival and DateOfDeparture) into the correct month column in the output.
Both DateOfArrival and DateOfDeparture are in the same table (let's say GuestStay). I will also need some other fields from this table and do some joins on some other tables but I will simplify things so as to solve my main problem here. Let's say the fields needed from the GuestStay table looks like below:
I need my query to output in the following format:
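Without the exact output format, here is only a sketch of the middle step, assuming a hypothetical GuestStay(GuestID, DateOfArrival, DateOfDeparture): explode each stay into one row per night, count the nights per month, and pivot those month buckets afterwards (in the report or with PIVOT).

;WITH Nights AS
(
    SELECT GuestID, DateOfArrival AS Night, DateOfDeparture
    FROM GuestStay
    UNION ALL
    SELECT GuestID, DATEADD(DAY, 1, Night), DateOfDeparture
    FROM Nights
    WHERE DATEADD(DAY, 1, Night) < DateOfDeparture
)
SELECT GuestID,
       YEAR(Night)  AS [Year],
       MONTH(Night) AS [Month],
       COUNT(*)     AS NightsInMonth
FROM Nights
GROUP BY GuestID, YEAR(Night), MONTH(Night)
OPTION (MAXRECURSION 0);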
I have a table with dates and values and other columns. In a proc I need to get the result as the month and the values for all the months, whether or not data exists for the month.
A similar table would be:
create table testing( DepDate datetime, val int)
insert into testing values ('2014-01-10 00:00:00.000', 1)
insert into testing values ('2014-05-19 00:00:00.000', 10)
insert into testing values ('2014-08-15 00:00:00.000', 20)
insert into testing values ('2014-11-20 00:00:00.000', 30)
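A minimal sketch against the testing table above (the year 2014 and the zero default are assumptions): build the 12 months first, then left-join so the months without rows still appear.

;WITH Months AS
(
    SELECT 1 AS MonthNo
    UNION ALL
    SELECT MonthNo + 1 FROM Months WHERE MonthNo < 12
)
SELECT m.MonthNo,
       DATENAME(MONTH, DATEFROMPARTS(2014, m.MonthNo, 1)) AS [Month],
       ISNULL(SUM(t.val), 0)                              AS val
FROM Months m
LEFT JOIN testing t
       ON MONTH(t.DepDate) = m.MonthNo
      AND YEAR(t.DepDate)  = 2014
GROUP BY m.MonthNo, DATENAME(MONTH, DATEFROMPARTS(2014, m.MonthNo, 1))
ORDER BY m.MonthNo;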
We are running SQL Server 2012 SP1 64-Bit EE on Windows Server 2008 R2 SP1. I have a SSIS Package which connects to a FTP Site and downloads a file. Then it truncates a table and loads the file data into a table. This package works okay when executed from within VSS and SSMS (In SSISDB, right click on the package and execute). However, when I execute it as a Job it does not run and appears to be failing on the first task which is the FTP Task. SQL job step - Type: SSIS Package; Run as: SQL Server Agent Service Account (domain account called playuser); Authentication: Windows Authentication.In the All Executions Standard Report for the SSISDB Catalog, it only says: FTP Download File: Errors: There were errors during task validation.
Is this because my domain account does not have access to the FTP site? Is this where I need to come up with a proxy account with credentials? Do I need to set up a SQL Server login (proxy account) with the same username being used in the FTP batch file?
FTP commands in a batch file:
username
password
cd omb
asc
get STRMASTER
quit
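If the Agent service account is the problem, a credential plus an SSIS proxy is the usual route. A minimal sketch - the Windows account, password and object names are placeholders, not values from the post:

USE master;
CREATE CREDENTIAL SSIS_FTP_Credential
    WITH IDENTITY = 'DOMAIN\ssis_ftp_user',   -- hypothetical Windows account with FTP access
         SECRET   = '<password>';
GO
USE msdb;
EXEC dbo.sp_add_proxy
     @proxy_name      = N'SSIS_FTP_Proxy',
     @credential_name = N'SSIS_FTP_Credential',
     @enabled         = 1;
EXEC dbo.sp_grant_proxy_to_subsystem
     @proxy_name   = N'SSIS_FTP_Proxy',
     @subsystem_id = 11;   -- 11 = SSIS package execution
GO

The job step's "Run as" is then changed from the Agent service account to SSIS_FTP_Proxy (non-sysadmin job owners also need sp_grant_login_to_proxy).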
I have a SSIS package set up that will transfer a file from a location on the network drive and transfer it over FTP to another location.
When I manually run the package, the file is transferred with no errors. But when the job is automated (via Job Activity Monitor) the transfer fails.
I have set the ProtectionLevel of the package to "EncryptSensitiveWithUserKey" and also converted the package to a Development Model. The settings for the FTP is saved within the package.
What am I missing? Below is the error message:
Executed as user: UHBInfoSQLAgent. Microsoft (R) SQL Server Execute Package Utility Version 11.0.5058.0 for 32-bit Copyright (C) Microsoft Corporation. All rights reserved. Started: 08:43:02 Error: 2014-10-13 08:43:03.72 Code: 0xC001405F Source: ResearchWebsite
I have a two node SQL 2012 AlwaysOn HADR cluster (v11.0.3412) with 4 availability groups configured. The AG groups are set to synchronous mode and the secondary is not readable (we do not want the synchronous replica readable so we do not risk any reads causing contention so we maintain fast performance).
On the secondary we are getting a persistent failure with the Data Collector job called Collection_Set_3_Upload. The failure occurs within the second job step. That job step is executing the following command:
dcexec -u -s 3 -i "$(ESCAPE_DQUOTE(MACH))\$(ESCAPE_DQUOTE(INST))"
The error message is as follows:
Log Job History (collection_set_3_upload)
Step ID: 2
Server: CLUSTERNODE2
Job Name: collection_set_3_upload
Step Name: collection_set_3_upload_upload
Duration: 00:00:07
[Code] ....
I know I can prevent this error message by enabling readable secondaries, but we do not want this.
I have tried stopping the data collection jobs and purging the cache directory but to no avail. It will succeed the first time then persistently fail again with the same message every time after that.
In addition, if I set the one failing AG group to readable secondary the job succeeds. So that means that 3/4 work fine, only this one is having an issue.
All I get back is an error message of "Analysis Services Processing Task Error: A Connection cannot be made. Ensure the Server is running". The server is running; I can process the cube by connecting to the AS instance and right-click processing it.
I can process the cube by running the SSIS task inside of SSDT. Only when I deploy the SSIS package (in Project mode) and then execute it do I get the error message.
SQL Server, SSAS, and SSIS processes are all running under the same account. SSAS is on a separate server from SSIS and SQL if that matters.
Our backups by default go to a network location, but I'd like to modify our maintenance plans to backup to an alternative location if the primary location isn't available. I've setup two Backup Database Tasks where the second one runs only if the first one Fails, and if the second one runs (on first one's failure) it then sends a notification to me so I know this occurred.
The Plan is running as expected: when I simulate a bad path in the first Backup Task, the second one runs and the notification is sent, but the Job shows failure. I'd like to show the job as Successful when this occurs, since I'm handling the issue and notification within the Plan, but I'm unable to find out how. I've set FailParentOnFailure to False on the Plan and I've changed the MaximumErrorCount to 2 with the assumption that this would work, but neither did.
Also I'm running into this in both SQL 2008 and SQL 2012.
I have a package on the default instance which runs and completes successfully. When that package is moved to the same SQL server, but a different instance, running under the same service account, it fails. The error is below with some specific stuff removed:
Delete Data from Level 1:Error: Executing the query "-- Variable to capture FileID's DECLARE @DeleteFil..." failed with the following error: "The DELETE permission was denied on the object '[name removed]', database '[]', schema '[]'.". Possible failure reasons:
Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
This makes me think that the package under the non-default instance ends up running under a different security context.
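A small diagnostic sketch for pinning that down - run it through the same connection manager (for example from an Execute SQL Task) against the named instance; the GRANT at the end is only illustrative and the table name is hypothetical:

SELECT SUSER_SNAME()    AS login_name,
       USER_NAME()      AS database_user,
       ORIGINAL_LOGIN() AS original_login;

-- If the login is the expected service account, granting the missing right
-- on the named instance may be all that is needed:
-- GRANT DELETE ON dbo.FileStaging TO [DOMAIN\serviceaccount];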