Historical Reporting On Changing Data
Apr 27, 2008
I've got a customer who wants reproducible/historical reporting. The problem is that the underlying data changes.
I tried to explain that this can't be done (can it?), but he doesn't
understand.
To illustrate the situation - Let's say a teacher wants to track
spelling test scores for her students.
Below are the scores for students A, B, and C (for January, February, March):
A: {70, 80, 85}
B: {70, 65, 80}
C: {100, 90, 100}
So, I can generate a historical report that charts the class average
and student trend - that's pretty easy.
Now, in April, we find that the school board has mandated that the
British spelling of words is OK, so now the cumulative scores (for
January, February, March, April) are:
A: {90, 80, 85, 100}
B: {80, 65, 80, 80}
C: {100, 90, 100, 75}
He wants a report showing the January average as (70+70+100)/3 = 80,
when really it is (90+80+100)/3 = 90.
Now imagine that there are actually thousands of data points changing like this...
Now also imagine that we add and remove students on a regular basis...
He and his office manager get frustrated when I explain that the
reports are not simple - in their minds they are. They have determined
that the solution is to get a report writer and buy Crystal Reports...
I've tried to explain that the problem is that the report
specification is unclear (basically, they don't understand what they want). The situation is OK for now; I'm just trying to plan for when they figure out that buying Crystal Reports won't change their situation (except that they'll be down several thousand dollars)...
Any tips?
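One way to square this circle is to stop reporting off the live data and report off period snapshots instead: when the January report is first published, copy the scores as they stood into a snapshot table, and have every historical report read the snapshot. A rough sketch, assuming SQL Server; dbo.Scores and all column names here are made up for illustration:

CREATE TABLE dbo.ScoreSnapshot (
    ReportMonth datetime    NOT NULL,   -- the month the report covers
    StudentId   varchar(10) NOT NULL,
    Score       int         NOT NULL,
    CapturedAt  datetime    NOT NULL DEFAULT GETDATE(),
    CONSTRAINT PK_ScoreSnapshot PRIMARY KEY (ReportMonth, StudentId)
)

-- Capture January as it stood when the report was first published
INSERT INTO dbo.ScoreSnapshot (ReportMonth, StudentId, Score)
SELECT '2008-01-01', StudentId, Score
FROM dbo.Scores
WHERE ScoreMonth = '2008-01-01'

-- Historical reports always read the snapshot, so the January average stays 80
SELECT ReportMonth, AVG(CAST(Score AS decimal(5,2))) AS ClassAverage
FROM dbo.ScoreSnapshot
GROUP BY ReportMonth

The customer can then choose, per report, between "as it was published" (the snapshot) and "as the data stands today" (the live tables) - which is really the distinction he hasn't articulated yet.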
View 20 Replies
Aug 14, 2007
We have an issue in an SCD where a number of records may be presented that have changes to their attributes of EVERY type.
Example,
BusinessKey: xxxxxxxx
BuildingTypeId: 7
BusinessUnitHistoryId: 4019
BusinessUnitId: 4019
CurrencyId: 26
DevelopmentTypeId: 14
MarketId: 182
Name: abcdefgh
CurrencyId is a fixed attribute
MarketId & BuildingTypeId and the BusinessUnitId & BusinessUnitHistoryIds are historical attributes
Name is a changing attribute
The behaviour of the ETL seems to suggest that if fixed-attribute changes are detected, those rows will error out, and therefore the changing and historical attributes will NOT be amended during the SCD transformation. Is this correct? It seems to be what is happening.
View 7 Replies
Sep 29, 2006
I have a Slowly Changing Dimension that I am using to populate a dimension table. My problem is this: when I run the package and any of the fields are marked as Historical Attributes, it adds an additional row regardless of the fact that the incoming data and the data in the warehouse match exactly.
I've tried several things to fix this problem, but so far none of them have worked. Some of the things I have tried that haven't worked are matching all the data types (which I have to do anyway), trimming the strings, and adding just one column.
I am using a data conversion to convert them from varchar (the source data type) to nvarchar (the warehouse data type).
I'm at a dead end here and don't know where to go; any help would be greatly appreciated.
Thanks
View 5 Replies
Jun 15, 2015
I am looking for a command (cmd/PowerShell/C#) for setting/updating the credentials of an SSRS data source.
View 2 Replies
Sep 19, 2005
Hi Everyone.
I need some advice on how to create a historical database.
What is the best way of doing this? For example, should I create a new row if a column is changed in a row, and timestamp all records? What happens when I have child tables linked to a header table?
I have been looking on the net for methods of creating a historical database, but I can't find any.
Thanks in advance
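One common pattern is effective-dated rows: never update in place, instead close the current row and insert a new version. A minimal sketch (table and column names are hypothetical):

CREATE TABLE dbo.CustomerHistory (
    CustomerHistoryId int IDENTITY(1,1) PRIMARY KEY,
    CustomerId        int           NOT NULL,   -- business key
    Name              nvarchar(100) NOT NULL,
    ValidFrom         datetime      NOT NULL,
    ValidTo           datetime      NULL        -- NULL = current row
)

-- Applying a change: close the current row, then add the new version
UPDATE dbo.CustomerHistory
SET ValidTo = GETDATE()
WHERE CustomerId = 42 AND ValidTo IS NULL

INSERT INTO dbo.CustomerHistory (CustomerId, Name, ValidFrom, ValidTo)
VALUES (42, N'New name', GETDATE(), NULL)

Child tables can then reference CustomerHistoryId to pin themselves to a specific version, or CustomerId if they should always follow the current version.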
View 11 Replies
Oct 20, 2005
A general data design question: We have data which changes every week. We had considered separating historical records and current records into two different tables with the same columns, but thought it might be simpler to have them all together in one table and just add a WeekID int column to indicate which week it represents (and perhaps an isCurrent bit column to make querying easier). We have a number of tables like this, holding weekly data, and we'll have to query for historical data often, but only back through the last year -- we have historical data going back to 1998 or so which we'll rarely if ever look at.
Is the all-in-one-table approach better, or the separation of current and historical data? Will there be a performance hit to organizing data this way? I don't think the extra columns will make querying too much more awkward, but is there anything I'm overlooking in this?
Thanks.
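For what it's worth, the single-table version might look like the sketch below (names are made up); an index that leads on the columns you filter by keeps the recent-year queries from wading through the 1998-era rows:

CREATE TABLE dbo.WeeklyData (
    WeekID    int   NOT NULL,   -- e.g. 200542 = week 42 of 2005
    ItemID    int   NOT NULL,
    Amount    money NOT NULL,
    isCurrent bit   NOT NULL DEFAULT 0,
    CONSTRAINT PK_WeeklyData PRIMARY KEY (WeekID, ItemID)
)

CREATE INDEX IX_WeeklyData_Current ON dbo.WeeklyData (isCurrent, ItemID)

-- Typical query: only the last year's worth of weeks
SELECT ItemID, SUM(Amount) AS TotalAmount
FROM dbo.WeeklyData
WHERE WeekID >= 200442   -- roughly 52 weeks back, given the encoding above
GROUP BY ItemID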
View 1 Replies
Apr 7, 2008
From what I've read this is called 'slowly changing dimensions'. Basically, the system I'm working on needs to store the history of certain data so that at any time a user can look up an old project and view it exactly as it was, even though the associated parts might have had certain changes over time. From what I can tell, Type 2 (current and historical records are stored in the same table) seems to be the most popular. Type 4 (current records in one table and historical records in a separate history table) seems like it would also work, but I've been unable to find any articles comparing the two. Does anybody have any info on the advantages and disadvantages of one vs. the other?
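For comparison, rough sketches of the two shapes (all names hypothetical). Type 2 keeps every version in the dimension itself; Type 4 keeps only the current version there and pushes old versions into a history table:

-- Type 2: current and historical versions in one table
CREATE TABLE dbo.DimPart (
    PartKey       int IDENTITY(1,1) PRIMARY KEY,  -- surrogate key, one row per version
    PartNumber    varchar(20)   NOT NULL,         -- business key
    Description   nvarchar(100) NOT NULL,
    EffectiveFrom datetime      NOT NULL,
    EffectiveTo   datetime      NULL,             -- NULL = current version
    IsCurrent     bit           NOT NULL
)

-- Type 4: a current table plus a separate history table with the same columns
CREATE TABLE dbo.Part (
    PartNumber  varchar(20) PRIMARY KEY,
    Description nvarchar(100) NOT NULL
)
CREATE TABLE dbo.PartHistory (
    PartHistoryId int IDENTITY(1,1) PRIMARY KEY,
    PartNumber    varchar(20)   NOT NULL,
    Description   nvarchar(100) NOT NULL,
    ValidFrom     datetime      NOT NULL,
    ValidTo       datetime      NOT NULL
)

The usual trade-off: Type 2 makes "as of" lookups a single join at the cost of a fatter dimension table; Type 4 keeps the current table small and fast, but any query that needs history has to hit the second table as well.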
View 3 Replies
May 3, 2004
We have a database that adds over 100,000 records per day. This goes back to 2002, and I only need historical data for 6 months. Presently we can only delete 1000 rows at a time. Is there a faster way of deleting? We seem to continually run out of disk space. Urgent!
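One common approach is to delete in a loop in moderate batches so the transaction log and locking stay under control; a sketch, assuming a table dbo.BigTable with a datetime column RecordDate (SET ROWCOUNT is used because it also works on SQL Server 2000):

SET ROWCOUNT 10000

WHILE 1 = 1
BEGIN
    DELETE FROM dbo.BigTable
    WHERE RecordDate < DATEADD(month, -6, GETDATE())

    IF @@ROWCOUNT = 0 BREAK
END

SET ROWCOUNT 0

Backing up (or, in simple recovery, letting the server truncate) the log between batches is what actually frees disk space. If almost everything qualifies for deletion, it can be cheaper to copy the six months you want to keep into a new table, drop the old one, and rename.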
View 7 Replies
Apr 24, 2008
Hi,
I have a SQL2005 db for tracking the prices of products at multiple retailers. The basic structure is, 'products' table lists individual products, 'retailer_products' table lists current prices of the products at multiple retailers, and 'price_history' table records when the price of a product changes at any retailer. The prices are checked from each retailer daily, but a row is added to the 'price_history' only when the price at the retailer changes.
Database create script:
http://www.boltfile.com/directdownload/db_create_script.sql
Full database backup:
http://www.boltfile.com/directdownload/database.bak
Database diagram:
http://www.boltfile.com/directdownload/diagram_0.pdf
I have the following query to retrieve the price history of a given product at multiple retailers:
SELECT
price_history.datetimeofchange, retailer.name, price_history.price
FROM
product, retailer, retailer_product, price_history
WHERE
product.id = 'b486ed47-4de4-417d-b77b-89819bc728cd'
AND
retailer_product.retailerid = retailer.id
AND
retailer_product.associatedproductid = product.id
AND
price_history.retailer_productid = retailer_product.id
This gives the following results:
2008-03-08 Example Retailer 22.3
2008-03-28 Example Retailer 11.8
2008-03-30 Example Retailer 22.1
2008-04-01 Example Retailer 11.43
2008-04-03 Example Retailer 11.4
The question(s) I have are how can I:
1 - Get the price of a product at a given retailer at a given date/time
For example, get the price of the product at Retailer 2 on 03/28/2008. The table only contains data for Retailer 1 for this date; the behaviour I want is, when there is no data available, for the query to find the last date at which there was data from that retailer and use the price from that point - i.e. for this example the query should return 22.3 as the price, given that was the last recorded price change from that retailer (03/08/2008).
2 - Get the average price of a product at a given retailer at a given date/time
In this case we would need to perform (1) across all retailers, then average the results
I'd really appreciate anyone's help on this :)
many thanks in advance,
dg
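For (1), one approach is to take the most recent price_history row at or before the date in question; a sketch against the tables above (the retailer id and the dates are placeholders):

SELECT TOP 1 price_history.price
FROM price_history
     INNER JOIN retailer_product
         ON price_history.retailer_productid = retailer_product.id
WHERE retailer_product.associatedproductid = 'b486ed47-4de4-417d-b77b-89819bc728cd'
  AND retailer_product.retailerid = 2               -- the retailer of interest
  AND price_history.datetimeofchange <= '2008-03-28'
ORDER BY price_history.datetimeofchange DESC

For (2), run the same "latest row at or before the date" logic once per retailer (for example via a correlated subquery that picks MAX(datetimeofchange) per retailer_product) and take the AVG of the resulting prices.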
View 17 Replies
Feb 17, 2008
I have a database which contains time series data (historical stock prices) which I have to search for patterns on a day-to-day basis. But searching this historical data for patterns is very time consuming, not only in writing the complex T-SQL scripts but also in executing them.
Table structure for one min data:
[Date] [Time] [Open] [High], [Low], [Close], [Adjusted_Close], [MA], [DI].....
Tick Data:
[Date] [Time] [Trade]
The most time-consuming queries are the ones with lots of inner joins. For example, if I have to compare the first few minutes of data, I have to do an inner join like:
With IntervalData AS
(
SELECT [Date], Sum(CASE WHEN 1430 = [Time] THEN [PriceRange] END) AS '1430',
Sum(CASE WHEN 1431 = [Time] THEN [PriceRange] END) AS '1431',
Sum(CASE WHEN 1432 = [Time] THEN [PriceRange] END) AS '1432'
FROM [INDU_1] GROUP BY [Date]
)
SELECT [Date] ,[1430], [1431], [1432], [1431] - [1430] As 'Range' from IntervalData
WHERE ([1430] > 0 AND [1431] < 0 AND [1432] < 0) OR ([1430] < 0 AND [1431] > 0 AND [1432] > 0)
------------------------------------------------------------------------
select ind1.[Time], ind1.PriceRange,ind2.[Time], ind2.PriceRange from INDU_1 ind1
INNER JOIN INDU_1 ind2 ON ind1.[Time] = ind2.[Time] - 1 AND ind1.[Date] = ind2.[Date]
where (ind1.[Time] = 2058) AND ((ind1.PriceRange > 0 AND ind2.PriceRange >0) OR (ind2.PriceRange < 0 AND ind1.PriceRange < 0))
ORDER BY ind1.[Date] DESC;
Is there any way I can use SQL 2005 data mining models to make this searching faster?
View 1 Replies
Aug 2, 2007
I am working on a project, which involves displaying trends of certain aggregate values over time. For example, suppose we want to display how the number of active and inactive users changed over time.
One issue is how to store historical data. First of all, should I create a separate database for each historical snapshot or should I use one database for all snapshots? Second, our database size is a couple of gigabytes and replicating the entire database on a daily basis is not feasible. An alternative solution is to back up aggregate values, but how do I back up results of aggregate queries, where the user can specify a date range in the WHERE-clause? Another solution is to create fact tables from our relational schema and back those up.
Another issue is how to query historical data. Using multiple databases to store historical snapshots makes it harder to query.
As you can see there are several design alternatives and I would like to know how this sort of problem is generally solved in the industry. Does SQL Server provide any support for solving this problem?
Thanks.
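Rather than snapshotting the whole database, one option is to snapshot only the aggregates into a small fact table on a daily schedule; the date-range queries then run against that table instead of the live schema. A sketch with illustrative names (dbo.Users is hypothetical):

CREATE TABLE dbo.DailyUserCounts (
    SnapshotDate  datetime NOT NULL PRIMARY KEY,
    ActiveUsers   int      NOT NULL,
    InactiveUsers int      NOT NULL
)

-- Run once a day, e.g. from a SQL Agent job
INSERT INTO dbo.DailyUserCounts (SnapshotDate, ActiveUsers, InactiveUsers)
SELECT CONVERT(varchar(10), GETDATE(), 120),
       SUM(CASE WHEN IsActive = 1 THEN 1 ELSE 0 END),
       SUM(CASE WHEN IsActive = 0 THEN 1 ELSE 0 END)
FROM dbo.Users

-- Trend over an arbitrary user-supplied range
SELECT SnapshotDate, ActiveUsers, InactiveUsers
FROM dbo.DailyUserCounts
WHERE SnapshotDate BETWEEN '2007-01-01' AND '2007-06-30'
ORDER BY SnapshotDate

The snapshot table grows by one narrow row per day, so keeping years of history is cheap compared with keeping whole database copies.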
View 5 Replies
Oct 25, 2005
My application is to capture employee locations. Whenever an employee arrives at a location (whether it is arriving for work, or at one of the company's other sites) they scan the barcode on their employee badge. This writes a record to the tblTSCollected table (DDL and dummy data below).
The application needs to be able to display to staff in a control room the CURRENT location of each employee. From the data I've provided, this would be:
EMPLOYEE ID LOCATION CODE
963 VB002
964 VB003
966 VB003
968 VB004
977 VB001
982 VB001
Note that, for example, Employee 963 had formerly been at VB001 but was more recently logged in at VB002, so therefore the application is not concerned with the earlier record.
What would also be particularly useful would be the NUMBER of staff at each location - viz.
LOCATION CODE NUM STAFF
VB001 2
VB002 1
VB003 2
VB004 1
Can anyone help? Many thanks in advance
Edward
NOTES ON DDL:
THE BARCODE IS CAPTURED BECAUSE THE COMPANY MAY RE-USE BARCODE NUMBERS (WHICH IS DERIVED FROM THE EMPLOYEE PIN), SO THEREFORE THE BARCODE CANNOT BE RELIED UPON TO BE UNIQUE.
THE COLUMN fldRuleAppliedID IS NULL BECAUSE THAT PARTICULAR ROW HAS NOT BEEN PROCESSED. THERE ARE BUSINESS RULES CONCERNING EMPLOYEE HOURS WHICH OPERATE ON THIS DATA. ONCE A ROW HAS BEEN PROCESSED FOR UPLOADING TO THE PAYROLL APPLICATION, THE fldRuleAppliedID COLUMN WILL CONTAIN A VALUE. IN THE PRODUCTION SYSTEM, THEREFORE, ANY SQL AS REQUESTED ABOVE WILL CONTAIN IN ITS WHERE CLAUSE (fldRuleAppliedID IS NULL).
if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[tblTSCollected]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
drop table [dbo].[tblTSCollected]
GO
CREATE TABLE [dbo].[tblTSCollected] (
[fldCollectedID] [int] IDENTITY (1, 1) NOT NULL ,
[fldEmployeeID] [int] NULL ,
[fldLocationCode] [varchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[fldTimeStamp] [datetime] NULL ,
[fldRuleAppliedID] [int] NULL ,
[fldBarCode] [varchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
) ON [PRIMARY]
GO
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (963, 'VB001', '2005-10-18 11:59:27.383', 45480)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (963, 'VB002', '2005-10-18 12:06:17.833', 45480)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (964, 'VB001', '2005-10-18 12:56:20.690', 45481)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (964, 'VB002', '2005-10-18 15:30:35.117', 45481)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (964, 'VB003', '2005-10-18 16:05:05.880', 45481)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (966, 'VB001', '2005-10-18 11:52:28.307', 97678)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (966, 'VB002', '2005-10-18 13:59:34.807', 97678)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (966, 'VB001', '2005-10-18 14:04:55.820', 97678)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (966, 'VB003', '2005-10-18 16:10:01.943', 97678)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (968, 'VB001', '2005-10-18 11:59:34.307', 98374)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (968, 'VB002', '2005-10-18 12:04:56.037', 98374)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (968, 'VB004', '2005-10-18 12:10:02.723', 98374)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (977, 'VB001', '2005-10-18 12:05:06.630', 96879)
INSERT INTO dbo.tblTSCollected (fldEmployeeID, fldLocationCode, fldTimeStamp, fldBarCode) VALUES (982, 'VB001', '2005-10-18 12:06:13.787', 96697)
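A sketch of one way to get both results from the DDL above, keyed on the latest timestamp per employee (add the fldRuleAppliedID IS NULL filter for the production system, as noted):

-- Current location of each employee
SELECT t.fldEmployeeID, t.fldLocationCode
FROM dbo.tblTSCollected t
     INNER JOIN (SELECT fldEmployeeID, MAX(fldTimeStamp) AS LastSeen
                 FROM dbo.tblTSCollected
                 GROUP BY fldEmployeeID) latest
         ON latest.fldEmployeeID = t.fldEmployeeID
        AND latest.LastSeen = t.fldTimeStamp
ORDER BY t.fldEmployeeID

-- Number of staff currently at each location
SELECT t.fldLocationCode, COUNT(*) AS NumStaff
FROM dbo.tblTSCollected t
     INNER JOIN (SELECT fldEmployeeID, MAX(fldTimeStamp) AS LastSeen
                 FROM dbo.tblTSCollected
                 GROUP BY fldEmployeeID) latest
         ON latest.fldEmployeeID = t.fldEmployeeID
        AND latest.LastSeen = t.fldTimeStamp
GROUP BY t.fldLocationCode
ORDER BY t.fldLocationCode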
View 4 Replies
Aug 12, 2014
I want to load historical data from an old system into a new one. The thing is, the old system stored dates as datetime and the new one uses datetimeoffset.
All data was collected in the same time zone... but with Daylight Saving Time (DST) in effect for part of the year.
The offset is either +04:00 or +05:00, based on the calendar date. To add to the complexity, the rules for DST changed a couple of years ago.
To determine the offset, I'd need to know what was or would have been the server Timezone for each historical date.
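Since the historical DST rules can't be derived reliably from the server's current settings, one approach is to build a small lookup table of the local offset periods and join through it with TODATETIMEOFFSET. A sketch; the boundary dates below are placeholders rather than the real rules, and dbo.OldSystemData is a hypothetical source table:

-- One row per contiguous period with a known UTC offset (fill with the real DST rules)
CREATE TABLE dbo.OffsetPeriods (
    PeriodStart datetime   NOT NULL,
    PeriodEnd   datetime   NOT NULL,
    UtcOffset   varchar(6) NOT NULL   -- '+04:00' or '+05:00'
)

INSERT INTO dbo.OffsetPeriods VALUES ('2010-03-28 02:00', '2010-10-31 03:00', '+05:00')
INSERT INTO dbo.OffsetPeriods VALUES ('2010-10-31 03:00', '2011-03-27 02:00', '+04:00')
-- ... one pair of rows per year, adjusted for the year the rules changed ...

-- Converting the old datetime column
SELECT s.EventDate,
       TODATETIMEOFFSET(s.EventDate, p.UtcOffset) AS EventDateWithOffset
FROM   dbo.OldSystemData s
       INNER JOIN dbo.OffsetPeriods p
           ON s.EventDate >= p.PeriodStart
          AND s.EventDate <  p.PeriodEnd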
View 1 Replies
Aug 13, 2014
Recently, I partitioned one of my largest tables into multiple monthly filegroups. The current month is attached to my "Active" table; the older records are kept in the "Historical" table. I need an efficient way to pull records when I have a date range that can be spread across both tables.
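One simple option is a view that unions the two tables, so the date-range queries don't have to care where the rows live; a sketch with hypothetical names:

CREATE VIEW dbo.vwAllOrders
AS
SELECT OrderID, OrderDate, Amount FROM dbo.ActiveOrders
UNION ALL
SELECT OrderID, OrderDate, Amount FROM dbo.HistoricalOrders
GO

-- A range that spans the current month and older months
SELECT OrderID, OrderDate, Amount
FROM dbo.vwAllOrders
WHERE OrderDate >= '2014-06-15' AND OrderDate < '2014-08-16'

If both tables carry a CHECK constraint on the date column, the optimizer can usually eliminate the table that cannot contain the requested range (the classic partitioned-view pattern), so you don't pay for scanning the historical table when the range is entirely current.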
View 9 Replies
May 12, 2015
I'm looking to get counts on historical data where records exist on or after May 1 in any given year. I've got the total number of records for each year worked out, but am now looking for the number of records that exist after a specific date. Here's what I have so far.
SELECT p.FY10,p.FY11,p.FY12,p.FY13,p.FY14,p.FY15
FROM
(
SELECT COUNT(recordID) AS S,
CASE DateFY
[code]...
View 2 Replies
Sep 29, 2015
I am getting ready to start a project where I am charged with moving out old data from production into a newly created historical DB. We have about 8 tables that are internal audit tables, that are big and full of old data. These tables are barely used and are taking up way too much space and time for maintenance.
I would like to create a way (SSIS?) to look at the date field in each of the 8 tables and copy anything older than two years into my newly created history DB, then delete the older records from the source DB.
I don't know if SSIS is the best method to use. If it is, which containers should I use to move the data over, and how do I then delete from the source?
Can I do the mass deletes on my audit source tables without impacting performance/indexes/fragmentation?
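This can be done in plain T-SQL (an SSIS Execute SQL Task can simply wrap it); a sketch for one audit table, with hypothetical database/table/column names and the history database on the same instance:

DECLARE @cutoff datetime
SET @cutoff = DATEADD(year, -2, GETDATE())

-- Copy the old rows across (list the columns explicitly if there is an identity column)
INSERT INTO HistoryDB.dbo.AuditTrail
SELECT *
FROM   ProdDB.dbo.AuditTrail
WHERE  AuditDate < @cutoff

-- Delete from the source in batches to keep logging and blocking under control
WHILE 1 = 1
BEGIN
    DELETE TOP (10000) FROM ProdDB.dbo.AuditTrail
    WHERE AuditDate < @cutoff

    IF @@ROWCOUNT = 0 BREAK
END

The mass delete will still generate log and fragment the indexes, so it is worth running it off-hours and rebuilding or reorganizing the affected indexes afterwards.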
View 5 Replies
Jul 10, 2015
Ok, I'm looking to get counts on historical data where the records fall between two dates with different years. Ex: give me the number of records that are dated between Oct 1, 2013 and July 1, 2014.
A previous post of mine was similar, where I needed to get records after a specific date. The solution provided for that one was the following; it let me get any records that occurred after May 1 per given fiscal year.
SELECT
MAX(CASE WHEN DateFY = 2010 THEN Yr_Count ELSE 0 END) AS [FY10],
MAX(CASE WHEN DateFY = 2010 THEN May_Count ELSE 0 END) AS [May+10],
MAX(CASE WHEN DateFY = 2011 THEN Yr_Count ELSE 0 END) AS [FY11],
MAX(CASE WHEN DateFY = 2011 THEN May_Count ELSE 0 END) AS [May+11],
MAX(CASE WHEN DateFY = 2012 THEN Yr_Count ELSE 0 END) AS [FY12],
[Code] ....
I basically need to have CASE WHEN MONTH(OccuranceDate) between Oct 1 (beginning year) and July 1 (ending year).
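If OccuranceDate is a real date/datetime column, the simplest thing is to compare against full dates rather than MONTH() alone, since the month test can't tell the two years apart. A sketch of both a plain count and the CASE/pivot style used above (table and column names follow the earlier snippets but are otherwise assumptions):

-- Records dated between Oct 1, 2013 and July 1, 2014 (inclusive)
SELECT COUNT(recordID) AS RecordCount
FROM   dbo.MyRecords
WHERE  OccuranceDate >= '2013-10-01'
  AND  OccuranceDate <  '2014-07-02'

-- The same idea folded into the CASE style, one bucket per fiscal year
SELECT
    SUM(CASE WHEN OccuranceDate >= '2013-10-01' AND OccuranceDate < '2014-07-02'
             THEN 1 ELSE 0 END) AS [FY14_OctToJul],
    SUM(CASE WHEN OccuranceDate >= '2014-10-01' AND OccuranceDate < '2015-07-02'
             THEN 1 ELSE 0 END) AS [FY15_OctToJul]
FROM dbo.MyRecords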
View 4 Replies
May 7, 2008
Is there a way to change the overall Reporting Services look? For example, can I change the font? Can I add a logo? Change the color?
View 1 Replies
Sep 27, 2007
Hi,
I am formatting a currency field within Reporting Services using an expression. However, I get the $ sign; I need the £ sign. How can I change this?
Thanks
David
View 1 Replies
Dec 19, 2007
Hi everyone!
I have a problem regarding the generation of reports using SSRS (SQL Server Reporting Services). As I've read on other forums, the service needs access to the Windows temp folder, which by default is C:\WINDOWS\Temp. The path is found in either the TMP or the TEMP environment variable. This means that there must be sufficient access to this folder in order for SSRS to properly generate the report. The problem is that my client, for security reasons, does not want to grant special rights on this folder to the ASP.NET account. Is there any way we can change the location of the temporary folder used by SSRS without having to change the value of the TMP and TEMP environment variables? Thanks in advance.
View 1 Replies
Nov 5, 2015
Fields!TaskStartDate.Value & Fields!TaskName.Value & Fields!StatusForExecutiveReporting.Value & Fields!NotesForExecutiveReport.Value
This is my expression in a cell. When I run my report, the task start date value looks something like this:
01/25/2015 8:00:00AM
I want this date to be in this form: Jan 2015
View 3 Replies
Apr 27, 2007
Hi all
Sorry, I'm new to using Reporting Services and even more inexperienced with using cubes.
My situation is as follows: I perform dynamic grouping (the user selects the view via a parameter). Depending on the view selected, I need to change the dimension filter in the dataset. Is this possible?
Regards,
Neil
View 5 Replies
Nov 13, 2007
When I am working on an Integration Services project and I open a Reporting Services file, it is displayed in the Code view. Is there a way to display it in the Design view? Or do I have to close the project and open a Reporting Services project?
Fred
View 1 Replies
Nov 6, 2007
I have SQL Server Reporting Services 2005 configured on my system, but now I'm NOT able to start the report server. I get this error on http://<serverurl>/Reports/Home.aspx:
The report server cannot decrypt the symmetric key used to access sensitive or encrypted data in a report server database. You must either restore a backup key or delete all encrypted content. Check the documentation for more information. (rsReportServerDisabled) (rsRPCError) Get Online Help
The report server cannot decrypt the symmetric key used to access sensitive or encrypted data in a report server database. You must either restore a backup key or delete all encrypted content. Check the documentation for more information. (rsReportServerDisabled)
Key not valid for use in specified state. (Exception from HRESULT: 0x8009000B)
I don't want to lose any data. Please help me resolve this error.
FYI, I modified the SQL Reporting Services account's Log On As info in the properties of the "SQL Server Reporting Services (MSSQLSERVER)" service, in Windows Service Management.
Earlier I had .\useraccount as the username; I modified it to domainname\useraccount, and then suddenly everything broke for the report server.
View 4 Replies
Jun 29, 2015
I want to be able to open URLs in a new window in my SSRS report, and the URLs keep changing in every record. The problem is these links are not the only things present in the record; there is other data present. I keep seeing examples that assume the record contains only a link and no other data.
Also the data in the record includes HTML tags and I checked the option that says "HTML - Interpret HTML tags as styles" on the placeholder properties window.
Example of data in the record:
Hello! Welcome to google, the most used search engine.
Click on www.google.com to go the website
So when I click on the www.google.com URL, it should open in a new window.
View 2 Replies
Feb 28, 2008
Hi, all experts here,
Do we always have to use the SCD component when loading data into the data warehouse to handle changes to rows?
I am looking forward to hearing from you and thank you very much in advance for your help.
With best regards,
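Not necessarily - for large dimensions many people stage the incoming rows and apply the changes in set-based T-SQL rather than the row-by-row SCD component. On SQL Server 2008 or later, a Type 1 (overwrite) load can be a single MERGE; a sketch with hypothetical names:

MERGE dbo.DimCustomer AS tgt
USING staging.Customer AS src
    ON tgt.CustomerBusinessKey = src.CustomerBusinessKey
WHEN MATCHED AND tgt.CustomerName <> src.CustomerName
    THEN UPDATE SET tgt.CustomerName = src.CustomerName
WHEN NOT MATCHED BY TARGET
    THEN INSERT (CustomerBusinessKey, CustomerName)
         VALUES (src.CustomerBusinessKey, src.CustomerName);

Type 2 changes need an extra step (expire the old row, then insert the new version), but even that is usually faster than the SCD wizard on sizeable loads.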
View 4 Replies
Apr 16, 2014
I have a source file and I have to load it into the database, changing the data type of the columns, in SSIS.
View 1 Replies
Oct 3, 2007
I have a fairly simple data flow task that loads data from one table (OLE DB Source) into another table (OLE DB Destination). The data type for one of the pairs of columns is nVarChar(120) and it contains version information that looks like a decimal. When I run the export, the destination has a trailing zero added after the decimal point as if it were a numeric column which invalidates our comparisons (string 1.0 is not the same as string 1.00). There is no cast or convert done to this column, it is a straight copy. Any ideas what could be causing this or how to fix?
View 6 Replies
Aug 29, 2006
I have created a SSIS package that transfer data from a Foxpro database to an instance of SQL Server 2005 Express. I used the wizard to create the package but I load and execute the package within a custom application that I have written in C#.
The way the custom application is intended to work is that the user can have the database in any location on the computer; all he has to do is specify the location, and the application programmatically changes the location of the source in the package it has loaded and then executes it. When I initially run the package the first time (using the original path), it works fine and transfers the data. However, every subsequent time I run the application and specify a different path, the database on the SQL Server side gets created as expected but the data is not transferred!
Where am I going wrong? Do I need to save the package after I modify the source then reload and run it again or do i need to change something else in the Data Flow to make this work?
View 4 Replies
Mar 13, 2008
Hi,
I have a question about a historical table. I have a table in SQL Server that keeps the history of my clients for the last 3 months.
This table has a PK of client_code and rep_date_id. I want to write a query that returns each client and extracts the date when the client was registered for the first time.
I thought about a cursor, but I don't know how to use it.
My table structure looks like this:
- client_code
-activity_code
-country_code
-district_code
-...
-rep_date_id_n
And I want to return
--activity_code
-country_code
-district_code
-...
-start_date
-current_date
Thanks
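A cursor shouldn't be needed here - grouping on the client and taking the minimum report date gives the first date each client appears. A sketch, with dbo.ClientHistory standing in for the real table name:

SELECT client_code,
       MIN(rep_date_id) AS start_date,
       MAX(rep_date_id) AS latest_date
FROM   dbo.ClientHistory
GROUP BY client_code

-- If the attribute columns are also needed, join back to the earliest row
SELECT h.client_code, h.activity_code, h.country_code, h.district_code,
       f.start_date
FROM   dbo.ClientHistory h
       INNER JOIN (SELECT client_code, MIN(rep_date_id) AS start_date
                   FROM dbo.ClientHistory
                   GROUP BY client_code) f
           ON f.client_code = h.client_code
          AND f.start_date = h.rep_date_id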
View 1 Replies
Jul 20, 2005
How do I build a TRIGGER that always copies the data from the table on which the transaction occurs to the historical table? Thanks, Fernand
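A minimal sketch of such a trigger, assuming a main table dbo.Client and a history table dbo.ClientHistory with matching columns plus an audit timestamp (all names hypothetical); the inserted and deleted pseudo-tables carry the affected rows:

CREATE TRIGGER trg_Client_History
ON dbo.Client
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON

    -- New or updated versions of the rows
    INSERT INTO dbo.ClientHistory (ClientId, Name, ChangeType, ChangedAt)
    SELECT ClientId, Name, 'INS/UPD', GETDATE()
    FROM inserted

    -- Rows that were actually removed (skip the pre-update images)
    INSERT INTO dbo.ClientHistory (ClientId, Name, ChangeType, ChangedAt)
    SELECT ClientId, Name, 'DEL', GETDATE()
    FROM deleted
    WHERE NOT EXISTS (SELECT 1 FROM inserted i WHERE i.ClientId = deleted.ClientId)
END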
View 1 Replies
Apr 4, 2007
I have about 45000 records in a CSV file, which I am using as HTTP request parameters to query a website and store some results in a database. This is the kind of application which runs 24/7, so database grows really quickly. Every insert fires up a trigger, which has to look for some old records based on some criteria and modify the last inserted record. My client is crazy about performance on this one and suggested to move the old records into another table, which has exactly the same structure, but would serve as a historical table only (used to generate reports, statistics, etc.), whilst the original table would store only the latest rows (so no more than 45k at a given time, whereas the historical table may grow to millions of records). Is this a good idea? Having the performance in mind and the fact that there's that trigger - it has to run as quickly as possible - I might second that idea. Is it good or bad? What do you think?
I read a similar post here which mentioned SQL Server 2005 partitioning; I might as well try this, although I've never used it before.
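For reference, the SQL Server 2005 partitioning setup is roughly the sketch below (names and boundary dates are only an example). With a monthly sliding window, ALTER TABLE ... SWITCH can then move a whole old partition into an archive table as a metadata-only operation, instead of shuffling rows one by one:

-- Partition by month on the insert date
CREATE PARTITION FUNCTION pfByMonth (datetime)
AS RANGE RIGHT FOR VALUES ('2007-01-01', '2007-02-01', '2007-03-01', '2007-04-01')
GO
CREATE PARTITION SCHEME psByMonth
AS PARTITION pfByMonth ALL TO ([PRIMARY])
GO
CREATE TABLE dbo.Requests (
    RequestId   int          IDENTITY(1,1) NOT NULL,
    RequestDate datetime     NOT NULL,
    Payload     varchar(500) NULL,
    CONSTRAINT PK_Requests PRIMARY KEY (RequestDate, RequestId)
) ON psByMonth (RequestDate)

Whether this beats a separate history table depends mostly on whether the trigger's lookups can be satisfied from the newest partition alone.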
View 5 Replies
Mar 21, 2006
I have an SCD that contains a historical output path. If I throw a data viewer in the flow before the SCD, I can see that it should trigger a trip down that lane, but it's not. In the advanced editor, all looks good. Is there anything I can check? BTW, the data type of the fields that should trigger the SCD change is datetime.
thanks in advance
View 1 Replies