IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[table_Data]') AND type IN (N'U'))
    DROP TABLE [dbo].[table_Data]
GO

/****** Object: Table [dbo].[table_Data]    Script Date: 04/21/2015 22:07:49 ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[table_Data]') AND type IN (N'U'))
I have an SSRS 2012 table report with groups; each group breaks to a new page, i.e. one group per page, so there are multiple groups across multiple pages.
The 'GroupName' column has multiple values: X, Y, Z, ...
I group on 'GroupName' by those values, i.e. value X is on page 1, value Y on page 2, value Z on page 3, and so on.
Now I need to display another column (ABC) in this table report, outside the group column 'GroupName'. This outside column is itself another column header (not a group header) in the table, and it derives its name partly from the 'GroupName' values:
Example:
Value X for GroupName on page 1 means that, on page 1, the column name of the ABC column must be ABC-X.
Value Y for GroupName on page 2 means that, on page 2, the column name of the ABC column must be ABC-Y.
Value Z for GroupName on page 3 means that, on page 3, the column name of the ABC column must be ABC-Z.
i.e. the column name of ABC (Clm ABC) must be dynamic, based on the GroupName values (X, Y, Z, ...).
Page1:
GroupName          Clm ABC-X
X
Page2:
GroupName          Clm ABC-Y
Y
Page3:
GroupName          Clm ABC-Z
Z
I have been able to use First(ReportItems!GroupName.Value) in the Page Header to get GroupNames displayed in each page; I get X in page 1, Y in page 2, Z in page 3.....
However, when I use ReportItems (that refers to a group name) in the Report Body outside the group,
I get the following error:
Report item expressions can only refer to other report items within the same grouping scope or a containing grouping scope
I need to get the X, Y, Z ... in each page for the column ABC.
I have been able to use this - First(Fields!GroupName.Value); however, I get ABC-X, ABC-X, ABC-X in each of the pages for the ABC column, instead of ABC-X in page 1, ABC-Y in page 2, ABC-Z in page 3, ...
I have a job where the first step starts and checks for a condition. If it's not true, I want the job to reset itself and start again in 10 minutes. I'm using sp_stop_job and sp_update_jobschedule and, initially, it looks like it works. But since it's a daily job, the 'Next Run Date' increments to the following day. Even though I'm using sp_update_jobschedule to keep the active_start_date as the same day, it still increments. I've tried updating sysjobschedules directly, but get the same results.
Any thoughts much appreciated! Here's my code:

USE msdb

-- This is the part that goes in the job step
-- and increments the next_run_time if the condition is true.
If (Select count('x') from mytable (NoLock) Where PublicationDate > getdate()) < 1
BEGIN
    Declare @ActiveStartDate int
    Declare @ActiveStartTime int

    -- Read the schedule's current start date/time so they can be written back.
    Select @ActiveStartDate = active_start_date from msdb.dbo.sysjobschedules (NoLock) Where schedule_id = 61
    Select @ActiveStartTime = active_start_time from msdb.dbo.sysjobschedules (NoLock) Where schedule_id = 61
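    -- A hedged sketch (not the poster's actual code) of how the remainder of this step
    -- might look: bump the start time forward 10 minutes while keeping the same date.
    -- The job/schedule names are placeholders, and the naive +1000 on the HHMMSS
    -- integer ignores minute/hour rollover.
    Declare @NewStartTime int
    Set @NewStartTime = @ActiveStartTime + 1000   -- active_start_time is an HHMMSS integer, so +1000 = +10 minutes

    Exec msdb.dbo.sp_update_jobschedule
         @job_name          = N'MyJob',           -- placeholder
         @name              = N'MySchedule',      -- placeholder
         @active_start_date = @ActiveStartDate,   -- keep the same day
         @active_start_time = @NewStartTime
END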
I am trying to develop an SQL statement that will create a recordset of the min (or max) values in x-minute increments over a period of time.
E.g., over a period of 7 days, I have data that was collected at 1-minute intervals. I need to know the min (or max) value in each 10-minute interval over that same period of time.
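A minimal sketch of one way to do this in T-SQL, assuming a table and columns I'll call dbo.Readings(ReadingTime datetime, ReadingValue) - the names are placeholders:

-- Floor each timestamp to its 10-minute boundary, then aggregate per bucket.
SELECT
    DATEADD(minute, (DATEDIFF(minute, '20000101', ReadingTime) / 10) * 10, '20000101') AS IntervalStart,
    MIN(ReadingValue) AS MinValue,
    MAX(ReadingValue) AS MaxValue
FROM dbo.Readings
WHERE ReadingTime >= DATEADD(day, -7, GETDATE())   -- the 7-day window
GROUP BY DATEADD(minute, (DATEDIFF(minute, '20000101', ReadingTime) / 10) * 10, '20000101')
ORDER BY IntervalStart;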
Hey all, I need to find the ratio of the difference between two datetime variables to the difference between another two datetime variables. I figured the best way to do it is to convert both the numerator and denominator differences to a number of minutes.
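That approach looks like this in T-SQL - a sketch with placeholder variable names:

DECLARE @Start1 datetime, @End1 datetime, @Start2 datetime, @End2 datetime;
-- ...assign the four datetime values here...
SELECT DATEDIFF(minute, @Start1, @End1) * 1.0
       / NULLIF(DATEDIFF(minute, @Start2, @End2), 0) AS Ratio;   -- * 1.0 forces decimal division; NULLIF avoids divide-by-zero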
I am running an SQL query from a .NET application. It normally returns the rows in 10 seconds, but from time to time it takes 2 or 3 minutes and the application nearly crashes (or does crash).
I have a result that comes out as a number of seconds, but I need to see it converted to hours, minutes, and seconds. Is there a convert function that would do this?
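As far as I know there is no single built-in function for this, but a short T-SQL sketch along these lines works (assuming the value is an integer number of seconds; @Seconds is a placeholder):

DECLARE @Seconds int;
SET @Seconds = 3725;
SELECT @Seconds / 3600          AS Hrs,    -- 1
       (@Seconds % 3600) / 60   AS Mins,   -- 2
       @Seconds % 60            AS Secs;   -- 5
-- If the total is under 24 hours, CONVERT style 108 formats it as hh:mm:ss directly:
SELECT CONVERT(varchar(8), DATEADD(second, @Seconds, 0), 108);   -- '01:02:05'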
Consider a table that holds Internet browsing history for users/machines, date/timed to the minute. The object is to tag all times that are separated from the previous and subsequent times by x number of minutes or less (it could vary, and wouldn't necessarily be a convenient round number). This will enable reporting "active time" for users (a dubious inference, but hey). There are a lot of derivative ways of seeing this information that might be good to get to: What's the first and last of these sets of times? What percentage of a given period is spanned by active times, and not? What is the average duration of such periods? What is the average interval between web hits during such periods? During other times? Blah, blah.

The basic problem is my principal problem. I don't have much experience with cursors, but from what I understand it would be very good indeed to spare them, given the number of records I anticipate working with. I'd be glad of any pointers.

--Scott
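A cursor-free "gaps and islands" sketch of the core tagging problem, assuming a table I'll call dbo.BrowseHistory(UserId, HitTime), a 10-minute threshold, and SQL Server 2012 or later for LAG() and the windowed SUM() (all names are placeholders):

DECLARE @GapMinutes int = 10;

WITH Flagged AS (
    -- A hit starts a new "active period" when it is more than @GapMinutes
    -- after the previous hit for that user (or is the user's first hit).
    SELECT UserId, HitTime,
           CASE WHEN DATEDIFF(minute,
                              LAG(HitTime) OVER (PARTITION BY UserId ORDER BY HitTime),
                              HitTime) <= @GapMinutes
                THEN 0 ELSE 1 END AS IsNewPeriod
    FROM dbo.BrowseHistory
),
Numbered AS (
    -- A running total of the flags gives each active period its own number.
    SELECT UserId, HitTime,
           SUM(IsNewPeriod) OVER (PARTITION BY UserId ORDER BY HitTime
                                  ROWS UNBOUNDED PRECEDING) AS PeriodNo
    FROM Flagged
)
SELECT UserId, PeriodNo,
       MIN(HitTime) AS FirstHit,
       MAX(HitTime) AS LastHit,
       DATEDIFF(minute, MIN(HitTime), MAX(HitTime)) AS ActiveMinutes
FROM Numbered
GROUP BY UserId, PeriodNo;

The first/last times, durations, and averages asked about above then fall out of grouping or averaging over these periods.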
Hi, the SQL DB freezes and no one can access the DB for 5 or 10 minutes. Even SELECT queries don't execute; nothing is displayed.
Only when we kill the process ID that's blocking in the SQL Activity Monitor is the DB released so people can work again. What does it mean that no query executes until we kill that process ID? What could it be?
Also, we recently created indexes and ran the Tuning Advisor; is it possible that the creation of indexes caused the DB to freeze?
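What you describe (everything waits until one session is killed) is blocking. A hedged diagnostic sketch for SQL Server 2005 and later, to run while the freeze is happening:

-- Which sessions are waiting, what they're waiting on, and which session is blocking them.
SELECT session_id, blocking_session_id, wait_type, wait_time, status, command
FROM sys.dm_exec_requests
WHERE blocking_session_id <> 0;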
I have SQL 2005 (full version) installed on a remote server, and when I start my computer I can connect to the database from SQL Management Studio and from a program I'm making. After about 15 minutes I find that I cannot connect using either program, but if I enter the server's IP address in Management Studio I can connect (this does not work with the program I'm making). I have to log off my computer before I can connect again.
I want to create a trigger that will run every 5 minutes. This trigger would actually run a stored procedure named, say, "SP_SetRooms". This SP will in turn run and update the rooms table. Would anyone know how to help me get started on creating a trigger with the info I've provided? Thank you, Dharmendra parihar
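Note that a trigger only fires in response to data changes, so running something on a timer usually means a SQL Agent job schedule instead. A hedged sketch for SQL Server 2005 or later, assuming a job named 'SetRooms' already exists whose step runs the stored procedure (all names are placeholders):

DECLARE @ScheduleId int;
EXEC msdb.dbo.sp_add_schedule
     @schedule_name        = N'Every5Minutes',
     @freq_type            = 4,                 -- daily
     @freq_interval        = 1,
     @freq_subday_type     = 4,                 -- sub-day unit = minutes
     @freq_subday_interval = 5,                 -- every 5 minutes
     @schedule_id          = @ScheduleId OUTPUT;

EXEC msdb.dbo.sp_attach_schedule
     @job_name    = N'SetRooms',                -- placeholder job
     @schedule_id = @ScheduleId;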
I've the following simple code bound to a button on a Web page
string time = (string)Cache["KEY"];
if (time == null)
{
    // Cache miss: build a command-based SqlCacheDependency and cache the current time.
    SqlConnection sqlConnection = new SqlConnection(@"Server=BIZYUSUFSQL2005;Database=Deneme;User Id=sa;Password=;");
    SqlCommand command = new SqlCommand(@"select KOLON1 from dbo.CACHE", sqlConnection);
    sqlConnection.Open();

    SqlCacheDependency dependency = new SqlCacheDependency(command);
    time = System.DateTime.Now.ToString();
    Cache.Insert("KEY", time, dependency);
    command.ExecuteNonQuery();   // the command must actually be executed for the notification to register
    sqlConnection.Close();
}
return time;

This code has to return the time value from the cache, and when a record is inserted into the CACHE table, the cache item has to be invalidated so the new time value is returned. The code works properly for 2-3 minutes, but when there is no activity for 5 minutes, the cache invalidation no longer works.
The company I work for has no experience with replication, and neither have I. Now I received a functional specifications document, and apparently what they want is to use replication to maintain a copy of a production database on the data warehouse server. The database is 70 GB, and they want to do it every 15 minutes (so their reports will contain up-to-date data).
As I said, I know nothing about replication, but to me this sounds like madness. I fear that this will (at least) have a negative impact on the performance of the production database.
So is it possible? Does replication have a big impact on the database, or is it hardly noticeable? I expect about 500 new records every 15 minutes; the production database and the data warehouse are on different servers; both are SQL 2005.
I have two fields and I need to find the difference (I guess) between them. One is called clm_dout (process date) and the other is clm_rcvd (received date). These are both date and time fields, and the values look like this: 2006-03-17 00:00:00.000. I need to create a formula that will give me the difference, broken into hours and minutes, because I am trying to create a turnaround report. This is what I had...
TimeValue({clm_doubt}-{clm_rcvd})
This is not working for me. Can someone give me a suggestion please.
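If the calculation can be pushed to the database instead of the report formula, a T-SQL sketch of the idea (assuming the columns really are clm_dout and clm_rcvd, in a table I'll call dbo.claims):

SELECT clm_rcvd, clm_dout,
       DATEDIFF(minute, clm_rcvd, clm_dout) / 60 AS TurnaroundHours,    -- whole hours
       DATEDIFF(minute, clm_rcvd, clm_dout) % 60 AS TurnaroundMinutes   -- leftover minutes
FROM dbo.claims;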
Hi, can anyone help me convert a number of minutes to give the result in hours and minutes? For example, 195 as 3:15 or 210 as 3:30. We are trying to create a report showing hours and minutes worked without having to export to Excel.
I've had a look around the net and this seems to be quite a difficult function in SQL Server.
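The arithmetic itself is just integer division and modulo; a T-SQL sketch, assuming the number is a count of minutes (@Minutes is a placeholder):

DECLARE @Minutes int;
SET @Minutes = 195;
SELECT CAST(@Minutes / 60 AS varchar(10)) + ':' +
       RIGHT('0' + CAST(@Minutes % 60 AS varchar(2)), 2) AS HoursAndMinutes;   -- '3:15'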
I have two date columns; one is sent_date and the other is approved_date. My requirement is to find the difference between the two dates, which can be minutes/hours/days. Using the DATEDIFF function I am able to get it in minutes or hours, but my output should be in the format hh:mm, e.g. 23:10 (i.e. 23 hrs and 10 min) or 48:00 (for 2 days).
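Since the hours part can exceed 24 (the 48:00 case), it is easiest to work from total minutes rather than a date format; a sketch, assuming a table I'll call dbo.MyTable:

SELECT sent_date, approved_date,
       CAST(DATEDIFF(minute, sent_date, approved_date) / 60 AS varchar(10)) + ':' +
       RIGHT('0' + CAST(DATEDIFF(minute, sent_date, approved_date) % 60 AS varchar(2)), 2) AS Elapsed_hhmm
FROM dbo.MyTable;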
Hi, I want to get the count of rows for each 20 minutes; is that possible? Is there a date function, or any other function, that I can use in the GROUP BY clause which will group the data into 20-minute intervals and give me the count? Thank you. Vidya
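Yes - one common sketch is to round each timestamp down to its 20-minute boundary and group on that (dbo.MyEvents and EventTime are placeholder names):

SELECT DATEADD(minute, 20 * (DATEDIFF(minute, 0, EventTime) / 20), 0) AS IntervalStart,
       COUNT(*) AS RowsInInterval
FROM dbo.MyEvents
GROUP BY DATEADD(minute, 20 * (DATEDIFF(minute, 0, EventTime) / 20), 0)
ORDER BY IntervalStart;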
I have a question regarding whether or not Data Mining can be utilized in a specific problem I have to solve.
Situation: I'm going to simplify the problem by explaining it in terms of a "pizza manufacturer". Suppose I wanted to predict the run minutes + downtime minutes (I use these to get an hourly rate: Pizzas / (run hrs + delay hrs) = Pizzas per hour) by looking at a set of input properties.
My properties could be something like the following:
# of Toppings
# of Special Pricing Stickers
Cardboard Box Indicator
Case Indicator (0 represents auto-casing, 1 represents putting in case by hand)
Machine Type (0 or 1 ... 0 represents an older, slower machine, 1 is newer)
Quantity of Run
(there could be up to 15 other properties that may or may not impact our rate)
Measured Values:
Run Minutes
Delay (down) Minutes
Steps I've Done So Far: I've created a couple different data mining models for this as I was unsure which one(s) to use. I checked the lift chart while feeding back in the original data set and my scatter plot appeared fairly inaccurate.
I've attempted to use Excel to create a linear regression, however my r squared value was always around .30. I decided to try to use SQL Server Data Mining to see if it could be something to help predict our accuracy better than a linear formula.
I've played with a couple different algorithms in Data Mining, and it appeared that none of them did exceptionally well with prediction. I even checked the lift chart using the same table as I used to train the model.
What algorithm(s) might work the best? Can I reasonably expect a prediction within a fairly strict tolerance (I'm guessing the answer to this is: "yes, if your source data represents a consistent pattern")? How can I best utilize Data Mining to give an answer like "historically, your run rate has been between these 2 values with a probability of X". I'm thinking I can utilize the predictprobability and stdev to some extent.
Any suggestions would be greatly appreciated.
If anyone needs further clarification, please let me know.
Does anyone know what would cause my log file (.LDF) to grow at a rate of over 1 MB per second and quickly fill up the hard drive? My experience is in Oracle, but I'm assuming you can set a maximum size for a log file, for starters? Not sure why it would be growing at this rate anyway, though. I could use some quick answers on this one. Thanks!
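You can cap the file, but note that if something is preventing log truncation (an open transaction, or the FULL recovery model with no log backups), transactions will simply start failing with a full-log error once the cap is hit. A hedged sketch for SQL Server 2005 and later - MyDb and MyDb_log are placeholders; check sys.database_files for the real logical file name:

-- Why can't the log be reused?
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'MyDb';

-- Put a ceiling on the log file's growth.
ALTER DATABASE MyDb
MODIFY FILE (NAME = MyDb_log, MAXSIZE = 10GB);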
Hi everyone - I have an ETL package which loads about 10 million rows from SQL 2005 staging tables into new, empty tables (no indexes or constraints) in another SQL 2005 DB, to be SWITCHed into the main partitioned data tables.
Both databases reside on the same SQL Server instance - it is a dev server, so the disks aren't super fast/SAN speeds, but it has plenty of RAM/CPU and SCSI disks.
The insert takes about 45 minutes - can I get this working any faster, or is this typical for 10 million rows? I've messed about with the data flow a few times but I can't seem to get any significant improvements.
Any tips anyone??
I perform several lookups on dimensions - these are not cached.
I do query the source table concurrently with different WHERE clauses & run two pipelines processing the data into 2 destination tables.
Would it be better to query the base table once & use a conditional split instead of the two separate queries??
I also multicast from each pipeline and use a UNION ALL to log some of the rows from each pipeline to another destination table.
Hope this makes sense? Any ideas or tips on how I can speed up this kind of transform would be appreciated.
I'm using OLE DB connections.
Hope this makes some kind of sense! Thanks for any advice!
I've posted a feedback with Microsoft to see if we can get them to fix the issue described below, but so far no one from Microsoft has commented to let us know what they're doing about this problem! I'm posting this here to see if maybe we can get more people to rate this feedback or chime in on what a pain it is! Please feel free to add your own comments or how you had to work around this issue and whether or not you think this is something Microsoft should be addressing NOW.
Provide Individual Page Numbering per Group and Total Pages per Group
Currently in a Reporting Services report, you can't readily reset the page number for each group in a table, nor can you display the total number of pages per group. For example, if I'm printing invoices and each invoice is a separate group, I'd like to be able to print "Page 1 of 5", "Page 2 of 5" etc. for the first invoice, then "Page 1 of 3" when the next invoice begins, and so on. This was easy in Crystal Reports. I realize that Crystal Reports has a two-pass process that enables that kind of pagination. However, this is REALLY important functionality that's just missing from Reporting Services and I'm hoping you'll provide it REALLY SOON!

Yeah, I know there are work-arounds if you can know exactly how many rows of information there are on each page. But gosh! That's not practical, especially if you have second-level groups inside the main group or text blocks in rows that can 'grow' to more than one line. I've read a couple of work-arounds, but none of them works correctly and consistently when more than one user is running the same report or when you print the report while you're looking at it on the screen.

I still may need access to the overall report page number and the overall total number of pages, so don't get rid of that. It's just that if you're doing this already for the entire report, I don't see why you can't do it per group! Lots of people have been asking for this for years, and I don't understand why it hasn't been implemented.
I've read a few articles on this topic, but no one has come up with a decent work around. My theory is that Microsoft should be addressing this immediately. This is major functionality that's just plain missing from SSRS and should have been there from the start. If anyone from Microsoft can let us know what's going on with this issue or if anyone would like for me to clarify this further, feel free to let me know.
I have an SSRS report with groups that, when exported to Excel, contains drill-ins (plus marks on the left side). The issue I have is that for all the groups in the drill-in, those cells become merged. I want to keep the group drill-in but have the cells UNMERGED. I have heard this can be done with the RDL XML, but I don't know what to modify to accomplish this.