Is there a way to tell if a report actually got sent? I have a report with an email subscription. As a test, I changed the data source credentials to something invalid. Then I ran the subscription using this:
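One way to check the outcome after the fact, assuming the default report server catalog database name "ReportServer" (adjust if yours differs), is to look at the subscription's last status in the catalog; a sketch:

SELECT  s.SubscriptionID,
        c.[Name]      AS ReportName,
        s.LastRunTime,
        s.LastStatus          -- e.g. "Mail sent to ..." or the delivery error text
FROM    ReportServer.dbo.Subscriptions AS s
JOIN    ReportServer.dbo.[Catalog]     AS c ON c.ItemID = s.Report_OID
ORDER BY s.LastRunTime DESC;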
I'm troubleshooting a performance issue. Looking at Profiler for the given statement, I'm getting the following figures. Why would there be such a disparity between them, and how can I go about finding out why?
In Report Builder 2.0, I cannot determine how to display the number of times a field has been filled out when a one-to-many relationship is involved. I get an error every time I use two aggregates.
Why do I get the message "Operation can't be completed" when I try to save a stored procedure? When I create a new stored procedure and copy the code into it, it works fine.
I am running a stored procedure which runs for around 20 minutes, as it is filtering data from lakhs of records. After completion it shows the message "query completed with errors" with 466814 records affected. Why is this happening when it does not display any error in the Messages window? Is it because of the size of the data? Please guide me on how to debug it.
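To surface the underlying error, one option (SQL Server 2005 or later; the procedure name below is just a placeholder) is to run the call inside TRY/CATCH and select the error details explicitly:

BEGIN TRY
    EXEC dbo.MyLongFilteringProc;   -- placeholder for the procedure in question
END TRY
BEGIN CATCH
    SELECT ERROR_NUMBER()   AS ErrorNumber,
           ERROR_SEVERITY() AS ErrorSeverity,
           ERROR_LINE()     AS ErrorLine,
           ERROR_MESSAGE()  AS ErrorMessage;
END CATCH;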
How can I tell if a task completed successfully from a stored procedure?
I have a task which is executed from a stored procedure. The sp_runtask only returns whether the task started successfully. How can I tell if it completed successfully?
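sp_runtask (like its SQL Agent successor sp_start_job) only queues the task, so the usual approach is to poll the job history afterwards for the outcome row; a rough sketch, assuming SQL Server Agent and a placeholder job name:

SELECT TOP (1)
       j.name,
       h.run_date,
       h.run_time,
       h.run_status          -- 0 = failed, 1 = succeeded, 2 = retry, 3 = canceled, 4 = in progress
FROM   msdb.dbo.sysjobs       AS j
JOIN   msdb.dbo.sysjobhistory AS h ON h.job_id = j.job_id
WHERE  j.name = N'MyTask'     -- placeholder job name
  AND  h.step_id = 0          -- step 0 holds the overall job outcome
ORDER BY h.instance_id DESC;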
Does anybody know of a command to verify that a database backup completed? In the backup wizard, there is an option "Verify backup upon completion". Is there a SQL command for this? Thanks.
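The wizard option corresponds to RESTORE VERIFYONLY. A minimal sketch with a placeholder path (the CHECKSUM options need SQL Server 2005 or later):

BACKUP DATABASE MyDb TO DISK = N'D:\Backups\MyDb.bak' WITH CHECKSUM;    -- placeholder path
RESTORE VERIFYONLY FROM DISK = N'D:\Backups\MyDb.bak' WITH CHECKSUM;    -- checks the backup is complete and readable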
I have 2 tables, one filtered and the other not. I am trying to do a merge between them and everything appears to work fine; I reach the end where it says 'Applied the snapshot and merged 1 data change', but nothing is actually changed or updated on either end.
Hi, I have a message queue system using SQL 2005 Service Broker. The code and setup are the same on both the dev and live databases, but soon after I restored a live backup to dev, the queue stopped working on dev (live is OK though). After some troubleshooting, I found that the server is not sending the message at all, but it says "Command(s) completed successfully" without any error messages.
setup:
-----------------------
CREATE MESSAGE TYPE TestQueryMessage VALIDATION = NONE;
CREATE CONTRACT TestQueryContract (TestQueryMessage SENT BY INITIATOR);
CREATE QUEUE TestSenderQueue;
CREATE SERVICE TestSenderService ON QUEUE TestSenderQueue;
CREATE QUEUE TestQueueReceiver;
CREATE SERVICE TestServiceReceiver ON QUEUE TestQueueReceiver (TestQueryContract);
send message:
-------------------------
DECLARE @conversationhandle uniqueidentifier;

BEGIN DIALOG @conversationhandle
    FROM SERVICE [TestSenderService]
    TO SERVICE 'TestServiceReceiver'
    ON CONTRACT [TestQueryContract]
    WITH ENCRYPTION = OFF;

SEND ON CONVERSATION @conversationhandle
    MESSAGE TYPE [TestQueryMessage] ('blah blah blah');
result:
----------------------------------
Command(s) completed successfully.
But when I do "select * from TestQueueReceiver", there's nothing, and I'm sure nothing else has picked up the messages.
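Because this started right after restoring a live backup to dev, one thing worth checking is whether the restore left Service Broker disabled (or carrying the old broker identity), and whether the messages are stuck in the transmission queue; a sketch, with the dev database name as a placeholder:

SELECT name, is_broker_enabled, service_broker_guid
FROM   sys.databases
WHERE  name = N'DevDb';                 -- placeholder database name

SELECT to_service_name, transmission_status, enqueue_time
FROM   sys.transmission_queue;          -- run in the sending database; transmission_status explains undelivered messages

-- If the broker is disabled after the restore, and no existing conversations need to survive,
-- one option is: ALTER DATABASE DevDb SET NEW_BROKER WITH ROLLBACK IMMEDIATE;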
I use a VB.NET transaction to update a SQL Server 2000 database. I call a number of stored procedures within this transaction. The stored procedures update tables, and these tables use triggers. My question is: when does the trigger get called? Is it after each stored procedure, or is it after the whole transaction? Jag
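For what it's worth, an AFTER trigger fires once per triggering statement, at the point each statement runs inside the transaction, not at commit time. A small sketch (table and trigger names are made up) that shows this:

CREATE TABLE dbo.TrigDemo (Id int);
GO
CREATE TRIGGER trg_TrigDemo_Ins ON dbo.TrigDemo AFTER INSERT
AS PRINT 'trigger fired';                 -- runs once per INSERT statement
GO
BEGIN TRAN;
INSERT INTO dbo.TrigDemo VALUES (1);      -- 'trigger fired' prints here, before the commit
INSERT INTO dbo.TrigDemo VALUES (2);      -- and again here
COMMIT;                                   -- nothing trigger-related happens at commit time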
I have 2 questions: 1. I chose Windows and SQL authentication during setup. Will this have any impact on my connection string in .NET? 2. If I want to host my db on my computer, what other protocols do I need to enable? Currently I only have shared memory enabled.
I have an SSIS package where the last three tasks in the control flow are: stop SSAS; then, "on success", execute a batch file that copies a bunch of files to a remote server using the robocopy command; then, "on success", start SSAS again. I tested all three tasks individually and they all work fine. The problem is between the second-to-last task and the last task: the package moves on to the last task as soon as the batch file has been launched, and does not wait until the actual robocopy job is completed. This causes a problem in the robocopy process. Thanks.
I am trying to use the export wizard to export to an MS Access file, using the 'provide source query' option. I get the error below when pasting the query in. The query does run successfully in SSMS; it is a long-running query, taking about 8 minutes to complete.
TITLE: SQL Server Import and Export Wizard
------------------------------
The statement could not be parsed.
------------------------------
ADDITIONAL INFORMATION:
Deferred prepare could not be completed. Query timeout expired (Microsoft SQL Native Client)
I have 2 SQL servers, and on the first one I have added the second as a linked server. When I run a SQL statement on the linked server I get the following message: Server: Msg 7202, Level 11, State 1, Line 1. Could not find server 'PROD' in sysservers. Execute sp_addlinkedserver to add the server to sysservers. [OLE/DB provider returned message: Deferred prepare could not be completed.] The SQL statement that I am running is: Select * from openquery(PROD,'Select * from PROD.GMS.dbo.qryDispCL'). But when I run only the SQL statement "Select * from PROD.GMS.dbo.qryDispCL" it works perfectly. But I need to have the first statement running. Please help. Your valuable feedback is greatly appreciated.
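The usual cause of Msg 7202 in this pattern is that the text inside OPENQUERY is executed on PROD itself, where no linked server named PROD exists, so the pass-through query should not repeat the server name; a hedged rewrite:

-- The outer query runs locally; the inner text runs on PROD, so it only needs the three-part name.
SELECT *
FROM OPENQUERY(PROD, 'SELECT * FROM GMS.dbo.qryDispCL');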
I execute a script that someone else wrote and I get "Query completed with errors", but there is no error message. If I highlight and execute parts of the script, it completes successfully.
I have an existing EE setup that captures all failing queries (see code below). The problem is that I also want to somehow capture rpc_starting so that I can see which parameters are passed in whenever a query fails. Is there a way to somehow capture those two events (error_reported & rpc_starting), but only capture rpc_starting when there is actually an error reported? Or maybe just an event on rpc_starting and somehow filter to only capture when there is an error?
Existing error_reported EE code:
CREATE EVENT SESSION [what_queries_are_failing] ON SERVER
ADD EVENT sqlserver.error_reported
(
    ACTION(sqlserver.sql_text, sqlserver.tsql_stack, sqlserver.database_id, sqlserver.session_id,
           package0.collect_system_time, sqlserver.transaction_id, sqlserver.username, sqlserver.client_hostname)
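Event predicates are evaluated when each event fires, so rpc_starting cannot be filtered on an error that has not happened yet. One workable approach (a sketch, not tested against your environment; the target file name and severity filter are assumptions) is to capture both events in one session with causality tracking turned on, then match each error_reported to the rpc_starting that precedes it via the activity ID when you read the data:

CREATE EVENT SESSION [failing_rpc_with_params] ON SERVER
ADD EVENT sqlserver.rpc_starting
(
    ACTION(sqlserver.sql_text, sqlserver.session_id, sqlserver.client_hostname)
),
ADD EVENT sqlserver.error_reported
(
    ACTION(sqlserver.sql_text, sqlserver.tsql_stack, sqlserver.database_id, sqlserver.session_id,
           sqlserver.username, sqlserver.client_hostname)
    WHERE ([severity] > 10)                                -- assumption: skip informational messages
)
ADD TARGET package0.event_file (SET filename = N'failing_rpc_with_params')   -- placeholder file name
WITH (TRACK_CAUSALITY = ON);                               -- stamps events with an activity ID for correlation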
Hello, I am trying to use the Import/Export wizard to create a package, using the 'provide source query' option. If I just copy the query from a text file and try to paste it, SQL only accepts it partially, so I saved it as a .sql file and then opened it in the window. However, when I click 'Next' or 'Parse', I get the error below.
TITLE: SQL Server Import and Export Wizard
------------------------------ The statement could not be parsed.
------------------------------ ADDITIONAL INFORMATION: Deferred prepare could not be completed. Query timeout expired (Microsoft SQL Native Client)
The query is pretty big, but it executes successfully in the Management Studio query window. I had no problem creating a package using DTS with the same query in SQL 2000. I also tried to migrate the package already existing in SQL 2000, but even though it migrates successfully, the package does not execute in SQL 2005. I also tried other queries which are as big as this one; again, the query source window during import/export does not seem to accept large queries. I depend heavily on large queries for my packages, which I run daily, and I have not had any issues with this in SQL 2000. Can someone help me with this?
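One workaround that often helps, since the wizard parses and prepares the source query with a fairly short timeout, is to wrap the long query in a view (or stored procedure) on the source server and hand the wizard a trivial query instead; the object name here is just a placeholder:

CREATE VIEW dbo.vwExportSource
AS
SELECT 1 AS PlaceholderColumn;            -- replace this body with the actual long query
GO
-- In the wizard's source-query box, use only: SELECT * FROM dbo.vwExportSource;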
I would like to add a calculation to provide an annualized Gross Amount value. Most solutions I have seen are for annualizing down to the day. For instance, if today is the 123rd day of the year, the calculation would be Amount / 123 * 365. What I want to do is annualize the amount based only on completed periods. Today is 8/26.
The last completed quarter ended on 6/30. Therefore I want to basically take the sum of Q1 and Q2, divide by two, and multiply by four to get an annualized amount based on completed quarters. Eventually I want to do something similar with months.
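In plain T-SQL terms the arithmetic being described is just (sum of completed quarters) / (number of completed quarters) * 4; a toy sketch with made-up figures:

DECLARE @Q1 money = 16000000000,          -- made-up amounts for illustration
        @Q2 money = 17000000000,
        @CompletedQuarters int = 2;       -- as of 8/26 only Q1 and Q2 are complete

SELECT CASE WHEN @CompletedQuarters > 0
            THEN (@Q1 + @Q2) / @CompletedQuarters * 4
            ELSE 0
       END AS AnnualizedGrossAmount;      -- 0 when no quarter has completed yet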
So here's what I have so far. In my DSV my DimDate is actually based on a view. The view has a Date column with the actual date, all of the normal fields you would expect in a DimDate table, and two columns that look like this:
CASE WHEN [Date] < DATEADD(QQ, DATEDIFF(QQ, 0, GETDATE()), 0) THEN 'Y' ELSE 'N' END AS CompletedQuarter,
[Code] ....
Note that for dates between 1/1 and 3/31 the CompletedQuarterCount will be 0. I want any annualized amount to be 0 in that case because I only want to use completed quarters in my calculation.
I added both to my Pay Date dimension (which uses vwDimDate from the DSV) in the cube. I have tried the calculation below and in my Excel pivot table I'm getting blanks for the calculated field.
CREATE MEMBER CURRENTCUBE.[Measures].[Gross Amount Annualized by Pay Date Quarter] AS
    SUM(
        IIF([Pay Date].[Pay Date Completed Quarter Count] > 0,
            [Gross Amount] / [Pay Date].[Pay Date Completed Quarter Count] * 4,
            0)
    ),
FORMAT_STRING = "Currency",
VISIBLE = 1,
ASSOCIATED_MEASURE_GROUP = 'Fact Claim';
One of our DBAs runs a process every night to update the database with a daily data file received from an external source. He was testing on a new SQL Server 2012/Windows 2012 R2 cluster that has an Availability Group. While trying to process INSERTs, the process failed with an error: "could not allocate a new page for database X because of insufficient disk space in filegroup PRIMARY."
The log also contains "Operating system error 1237 (The operation could not be completed. A retry should be performed.) encountered".
However, there is 300 GB free on the data drive (E:) where the .mdf file is located. The SQL Server service account has the "Perform Volume Maintenance Tasks" permission (instant file initialization).
All of the disks are VMware 5.1 or 5.5 VM's and the E: disk has thick/eager zero provisioning.
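Free space on the drive does not help if the data file itself is capped or cannot grow, so one quick check is the file's max size and autogrowth settings; a sketch (substitute the database named in the error):

-- size and max_size are in 8 KB pages; max_size = -1 means unlimited growth, 0 means growth is disabled.
SELECT  name,
        physical_name,
        size / 128 AS size_mb,
        CASE max_size WHEN -1 THEN NULL ELSE max_size / 128 END AS max_size_mb,
        growth,
        is_percent_growth
FROM    X.sys.database_files;             -- replace X with the database from the error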
Suddenly I'm getting a "Connect to Server" error dialog box that says...
Cannot connect to server\instance name.
Additional information: The operation could not be completed. (WinMgmt)
...when trying to connect to the RS2005 server in Management Studio. I can reach Report Manager through IE and run my reports.
Is it possible that my setting IIS to Basic authentication (and turning off Windows Integrated) might be preventing me from connecting in Management Studio, perhaps because Management Studio has to go through the RS service and doesn't know what Basic auth is?
I'm temporarily unable to set IIS back to Windows Auth because the server is being used by users to test reports.
I use SQL Server 2000. When I begin a transaction and update a set of records in a table, then until I commit or roll back the transaction nobody can select the previous values of those records; they must wait until I complete the transaction. I have tested this in Oracle and it doesn't have this limitation. Don't you think this is a SQL Server weakness?
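The blocking you describe is the default READ COMMITTED locking behaviour, and SQL Server 2000 has no row-versioning alternative. From SQL Server 2005 onwards you can get Oracle-like behaviour, where readers see the last committed version instead of waiting; a sketch (placeholder database name, and the switch needs the database free of other active connections):

ALTER DATABASE MyDb SET READ_COMMITTED_SNAPSHOT ON;    -- SQL Server 2005+ only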
I have been following the tutorial/blog post HERE to create an annualization (or "run rate") of my Gross Amount measure. What I want to do now is exclude any period that is not "complete".
For example, if today is 9/9 then Q3 is not complete - only Q1 and Q2 are complete. And if I'm looking at it monthly then January through August are complete, but September is not.
Cells B5 and D5 look exactly as I expect and want them to be. On row 6 below under each "Gross Annualized" value I have basically just put the formula for what it's actually doing.
What I would like to see in the blue cell is $67,211,697,268 - essentially the most recent annualization for a completed period. The annualization for Q3 is misleading because the quarter has not yet completed. There's $16b in Q1 and $17b in Q2, so the measly $78m in Q3 (yellow cell) is dragging the annualization down significantly. Even worse, the Gross Amount for Q4 is being treated as a $0.00, which is further dragging down the "2015 Gross Annualized" amount in blue. In a T-SQL average calculation, for example, I think the green cell would be treated as NULL rather than $0. That's kind of the behavior I want.
So I would like to do two things:
1. Create a calculation (probably just a 0 or 1 flag) that indicates whether the current period is complete or not. Again, using 9/9/2015 as an example, Q2 would be complete but Q3 would not be. And August would be complete, but September would not.
2. Make the blue cell show $67,211,697,268 - an annualization based on completed quarters only.
For whatever it's worth here is the current calculation for Gross Annualized.
CREATE MEMBER CURRENTCUBE.[Measures].[Gross YTD] AS
    AGGREGATE(
        YTD([Pay Date].[Calendar].CurrentMember),
        [Measures].[Gross Amount]),
FORMAT_STRING = "Currency",
VISIBLE = 0;
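For item 1 above, one low-tech option is to follow the same pattern as the earlier DSV view columns and compute the flag in T-SQL rather than MDX; a sketch, where the view body and table name are assumptions:

-- Added to the date view behind the date dimension: 1 when the row's quarter/month
-- has fully elapsed as of today, 0 otherwise (mirrors the CompletedQuarter column above).
SELECT  [Date],
        CASE WHEN [Date] < DATEADD(QQ, DATEDIFF(QQ, 0, GETDATE()), 0) THEN 1 ELSE 0 END AS QuarterIsComplete,
        CASE WHEN [Date] < DATEADD(MM, DATEDIFF(MM, 0, GETDATE()), 0) THEN 1 ELSE 0 END AS MonthIsComplete
FROM    dbo.DimDate;                      -- assumed underlying date table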
I am running the following query, trying to return server properties across a linked server. I want to store the results in a table on the server where I am running the query.
DECLARE @BuildClrVersionx nvarchar(128)
SET @BuildClrVersionx =
(SELECT *
FROM OPENQUERY(LKMSSQLXYZ01, 'CONVERT(nvarchar(128),SERVERPROPERTY("BuildClrVersion")'))
I am getting the following errors:
OLE DB provider "SQLNCLI" for linked server "LKMSSQLADM01" returned message "Deferred prepare could not be completed.".
Msg 8180, Level 16, State 1, Line 1
Statement(s) could not be prepared.
Msg 156, Level 15, State 1, Line 1
Incorrect syntax near the keyword 'CONVERT'.
If you have any ideas how I can run this query across a linked server I would appreciate it.
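The pass-through text given to OPENQUERY has to be a complete SELECT statement, and string literals inside it need doubled single quotes rather than double quotes. A sketch of the shape I'd expect to work (linked server name as in your post):

DECLARE @BuildClrVersionx nvarchar(128);

SELECT @BuildClrVersionx = BuildClrVersion
FROM   OPENQUERY(LKMSSQLXYZ01,
       'SELECT CONVERT(nvarchar(128), SERVERPROPERTY(''BuildClrVersion'')) AS BuildClrVersion');

SELECT @BuildClrVersionx AS BuildClrVersion;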
We have a scheduled weekly full backup job running through a maintenance plan. We are using SQL 2008 R2 and our backup server is Windows 2008 R2. There is plenty of space on the backup server, but my database full backup still failed with the following error.
Executing the query "BACKUP DATABASE [Test] TO DISK = N'C:fullbackup." failed with the following error: "Write on "C:fullbackup est_backup_2015_04_12_200003.bak" failed: 665(The requested operation could not be completed due to a file system limitation) BACKUP DATABASE is terminating abnormally.
I'm working on an application that allows users to set up scheduled, time-based reports. Each scheduled report creates a SQL Agent job associated with a schedule. The default time to fire these off is 8:00 AM, and there are several hundred of them. The DWH has no trouble running hundreds of reports all fired off at the same time.
There are several ETL processes and occasionally they don't complete before our verbal SLA of 8:00 AM.
My problem is that on days where the ETL runs past 8:00 AM, I want to hold these scheduled jobs from firing off.
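One option, assuming the report jobs can be identified somehow (the naming convention below is purely an assumption), is to have the ETL disable them as its first step and re-enable them when it finishes; note that occurrences which come due while a job is disabled are skipped rather than queued:

DECLARE @name sysname;
DECLARE jobs CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM msdb.dbo.sysjobs WHERE name LIKE N'Report - %';   -- assumed naming convention
OPEN jobs;
FETCH NEXT FROM jobs INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC msdb.dbo.sp_update_job @job_name = @name, @enabled = 0;       -- run again with @enabled = 1 afterwards
    FETCH NEXT FROM jobs INTO @name;
END
CLOSE jobs;
DEALLOCATE jobs;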
I am playing with the DBCC command to check the constraints on a particular table (DBCC CHECKCONSTRAINTS ('myTable') WITH ALL_CONSTRAINTS), and it always gives the following result:
DBCC execution completed. If DBCC printed error messages, contact your system administrator.
I executed it in SQL Server Management Studio Express and I got: "Commands completed successfully." I do not know where the result is or how to get it to display. Please help and advise.
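DBCC CHECKCONSTRAINTS only returns a result set when it finds rows that violate a constraint; if everything checks out you just get "Commands completed successfully." A small self-contained demo (made-up table) that produces a visible result:

CREATE TABLE dbo.DemoCheck (Qty int CONSTRAINT CK_DemoCheck_Qty CHECK (Qty >= 0));
ALTER TABLE dbo.DemoCheck NOCHECK CONSTRAINT CK_DemoCheck_Qty;   -- disable so a bad row can get in
INSERT INTO dbo.DemoCheck (Qty) VALUES (-5);                     -- violates the (disabled) constraint
DBCC CHECKCONSTRAINTS ('dbo.DemoCheck') WITH ALL_CONSTRAINTS;    -- now returns a Table / Constraint / Where row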
I'm getting this when executing the code below. Going from W2K/SQL2k SP4 to XP/SQL2k SP4 over a dial-up link.
If I take away the begin tran and commit it works, but of course, if one statement fails I want a rollback. I'm executing this from a Delphi app, but I get the same from Query Analyzer.
I've tried both with and without the Set XACT . . ., and also tried with Set Implicit_Transactions off.
set XACT_ABORT ON
Begin distributed Tran
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TRANSACTIONMAIN set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.TRANSACTIONMAIN set REPFLAG = 0 where REPFLAG = 1 and DONE = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.WBENTRY set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.WBENTRY set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.FIXED set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.FIXED set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.ALTCHARGE set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.ALTCHARGE set REPFLAG = 0 where REPFLAG = 1
update OPENDATASOURCE('SQLOLEDB','Data Source=10.10.10.171;User ID=*****;Password=****').TRANSFERSTN.TSADMIN.TSAUDIT set REPFLAG = 0 where REPFLAG = 1
update TSADMIN.TSAUDIT set REPFLAG = 0 where REPFLAG = 1
COMMIT TRAN
It's got me stumped, so any ideas gratefully received. Thx
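If it helps to rule OPENDATASOURCE out, one alternative sketch (server, login and password are placeholders) is to register the remote instance as a linked server once and use four-part names inside the distributed transaction; MSDTC still has to be running and reachable on both machines over the dial-up link:

EXEC sp_addlinkedserver   @server = N'REMOTESRV', @srvproduct = N'',
                          @provider = N'SQLOLEDB', @datasrc = N'10.10.10.171';
EXEC sp_addlinkedsrvlogin @rmtsrvname = N'REMOTESRV', @useself = 'false',
                          @rmtuser = N'*****', @rmtpassword = N'*****';
GO
SET XACT_ABORT ON;
BEGIN DISTRIBUTED TRAN;
UPDATE REMOTESRV.TRANSFERSTN.TSADMIN.TRANSACTIONMAIN SET REPFLAG = 0 WHERE REPFLAG = 1;
UPDATE TSADMIN.TRANSACTIONMAIN SET REPFLAG = 0 WHERE REPFLAG = 1 AND DONE = 1;
-- ...repeat the same pattern for the WBENTRY, FIXED, ALTCHARGE and TSAUDIT pairs...
COMMIT TRAN;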