Calculating Average From Row Entries
Mar 19, 2008
Dear all,
I have a table named HISTORY_MEASURE which is a collection of different measure values issued from different measuring devices.
Values in this table are represented as follows:
Id Name Value
==============================
1 Diameter1 0.45
2 Diameter2 1.23
3 Temperature2 123
4 Temperature2 200
5 Diameter1 0.65
Out of this table what I need to do is calculate the average value for each identical [Name]. As you can see from the sample set above, Diameter1 has 2 entry values, which get stored at different times of course.
So if I take the example of Diameter1, I need to calculate and display the average result in a field.
The result would be
Name Average
=====================
Diameter1 .....
Diameter2 ....
Temperature2 ....
Temperature1 ....
How can I perform this?
Or would it be better to get a view of the table above which gets displayed as follows:
Diameter1 Temperature2 Diameter2
0.45 123 1.23
0.65 200 0
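A minimal sketch of the grouped average, assuming the HISTORY_MEASURE table and columns shown above:

SELECT Name, AVG(Value) AS Average
FROM HISTORY_MEASURE
GROUP BY Name;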
Thanks for your help
regards
serge
View 7 Replies
Mar 29, 2008
I am trying to create an SSRS report that needs to show averages for both parent and child rows. In the example below I need to show an average of the ErrorRecAge column for Company, then for cic_Group, and one grand total/average. How can this be done from a single query (as below) using SSRS built-in functions/code? I am also curious to know if this can be done from T-SQL directly (e.g. something similar to a running sum/total). I appreciate the help.
Select
CustomerCorrection.Id As ParentId,
cc_company As cc_company2,
ccError.Id As ErrorId,
COALESCE(cic_Group,'Other Errors') As cic_Group,
COALESCE(cic_Code,'Unknown') As ErrorCode,COALESCE(cic_Description,'Unknown') As ErrorDescription,
DateDiff(hh,cc_entrydt,getdate())/24.00 AS ErrorRecAge
From
CustomerCorrection Inner Join CCError On
CustomerCorrection.Id = CCError.CustomerCorrectionId
Left Outer Join CustomerImportControl On
cic_code = ce_errno
Where cc_company = 'IWA' And CustomerCorrection.Id In (129,1004,3228)
Order By CustomerCorrection.Id
That Returns
129 IWA 992 Other Errors Unknown Unknown 399.500000
129 IWA 1089 Other Errors Unknown Unknown 399.500000
129 IWA 1760 Other Errors Unknown Unknown 399.500000
1004 IWA 952 Other Errors Unknown Unknown 365.333333
1004 IWA 1853 Other Errors Unknown Unknown 365.333333
3228 IWA 10 Other Errors Unknown Unknown 329.375000
Here is my temp workaround:
Select a.*,b.*
From
(
Select
cc_Company, Count(Distinct CustomerCorrection.Id) as RecCount, Avg(DateDiff(hh,cc_entrydt,getdate()))/24.00 As RecAge
From
CustomerCorrection
Group By cc_Company
) As a
Inner Join
(
Select
cc_company As cc_company2,
COALESCE(cic_Group,'Other Errors') As cic_Group,
Count(Distinct ccError.Id) As ErrorRecCount,
AVG(DateDiff(hh,cc_entrydt,getdate()))/24.00 As ErrorRecAge
From
CustomerCorrection Inner Join CCError On
CustomerCorrection.Id = CCError.CustomerCorrectionId
Left Outer Join CustomerImportControl On
cic_code = ce_errno
Group By cc_company, COALESCE(cic_Group,'Other Errors')
) as b
On a.cc_company = b.cc_company2
Order By cc_Company,cic_group
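A sketch of how windowed aggregates could return both group averages alongside each detail row in a single query (assuming SQL Server 2005 or later and the same tables as above):

Select
CustomerCorrection.Id As ParentId,
cc_company As cc_company2,
ccError.Id As ErrorId,
COALESCE(cic_Group,'Other Errors') As cic_Group,
DateDiff(hh,cc_entrydt,getdate())/24.00 AS ErrorRecAge,
AVG(DateDiff(hh,cc_entrydt,getdate())/24.00)
    OVER (PARTITION BY cc_company, COALESCE(cic_Group,'Other Errors')) AS GroupAvgAge,
AVG(DateDiff(hh,cc_entrydt,getdate())/24.00)
    OVER (PARTITION BY cc_company) AS CompanyAvgAge
From
CustomerCorrection Inner Join CCError On
CustomerCorrection.Id = CCError.CustomerCorrectionId
Left Outer Join CustomerImportControl On
cic_code = ce_errno
Where cc_company = 'IWA'
Order By CustomerCorrection.Id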
View 2 Replies
May 2, 2007
I'm trying to get the average fuel consumption for a manufacturer that produces two or more cars; so far I've only been able to find every manufacturer's average fuel consumption.
Here's what I have so far:
Select aManufacturer.MName, avg(FuelCons)
From aCar
Join aBuilts On aBuilts.CName = aCar.CName
Join aManufacturer On aBuilts.MName = aManufacturer.MName
Group by aManufacturer.MName
This produces nearly all I want, only I need to be able to get just the manufacturers who produce two or more cars. I've tried implementing a few COUNT statements but nothing is working. Any ideas?
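A sketch using HAVING to keep only manufacturers with at least two cars, assuming the tables from the query above:

Select aManufacturer.MName, avg(FuelCons)
From aCar
Join aBuilts On aBuilts.CName = aCar.CName
Join aManufacturer On aBuilts.MName = aManufacturer.MName
Group by aManufacturer.MName
Having Count(Distinct aCar.CName) >= 2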
View 4 Replies
Feb 12, 2013
I want to calculate the average of each student's grades and get the highest one with a SQL command.
I have 2 tables:
Students:
*StudentId
*StudentName
___________
Grades:
*StudentId
*Grade
___________
I need to calculate average of each student and then get the highest.
My try:
Code:
SELECT Students.StudentId,Students.StudentName,AVG(Grades.Grade) AS avg_grade FROM Students s JOIN Grades g ON Grades.StudentId =Students.StudentId
GROUP BY Students.StudentId, Students.StudentName
ORDER BY avg_grade
LIMIT 1 FROM Students;
I encounter a problem with this code; maybe it's completely wrong...
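A possible correction as a sketch: use the table aliases consistently, sort descending, and take the top row (TOP is the SQL Server equivalent of LIMIT; this assumes only the two tables above):

SELECT TOP 1 s.StudentId, s.StudentName, AVG(g.Grade) AS avg_grade
FROM Students AS s
JOIN Grades AS g ON g.StudentId = s.StudentId
GROUP BY s.StudentId, s.StudentName
ORDER BY avg_grade DESC;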
View 5 Replies
Jul 4, 2014
How do I calculate an average salary for the scenario below?
I have a table with 12 columns: jan, feb, ..., dec.
Now I want to calculate the average salary for each record, but the condition is that if any month's salary is zero then that column is also excluded from the average calculation.
For example: if the jan and feb column values are zero then I want to calculate (mar+apr+...+dec)/10.
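A sketch that unpivots the twelve month columns and averages only the non-zero values (SQL Server 2008 or later for the VALUES row constructor; the MonthlySalary table name and EmpId column are hypothetical, the month columns follow the description above):

SELECT s.EmpId,
       x.AvgSal                 -- NULL if every month is zero
FROM MonthlySalary AS s
CROSS APPLY (
    SELECT AVG(v.val * 1.0) AS AvgSal
    FROM (VALUES (s.jan),(s.feb),(s.mar),(s.apr),(s.may),(s.jun),
                 (s.jul),(s.aug),(s.sep),(s.oct),(s.nov),(s.[dec])) AS v(val)
    WHERE v.val <> 0            -- zero months drop out of the denominator too
) AS x;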
View 5 Replies
Dec 3, 2013
I am calculating the average number of patients per day like this:
COUNT(DISTINCT PATIENTNAME) * 1.0/NullIf(COUNT(DISTINCT COALESCE(ARRIVEDATE,DEPARTDATE)),0) AS [AvgNo.ofpatients PerDay]
but I am getting results like this: 5.111111111111, 8.000000000000, 1.000000000000.
We don't want that many digits after the point; we want only two digits, like 5.11, 8.00 (or 8), 1.00 (or 1).
How can I do this?
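A small sketch showing two ways to trim the value to two decimal places (the sample variable stands in for the expression from the post):

DECLARE @avg DECIMAL(18, 12);
SET @avg = 5.111111111111;
SELECT CAST(@avg AS DECIMAL(10, 2)) AS CastToTwoPlaces,   -- 5.11
       ROUND(@avg, 2)               AS Rounded;           -- 5.110000000000 (rounded, but keeps the original scale)

Wrapping the whole [AvgNo.ofpatients PerDay] expression in CAST(... AS DECIMAL(10, 2)) gives the 5.11 / 8.00 / 1.00 style output.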
View 2 Replies
Jul 5, 2007
Hi all,
I have a problem which needs to be sorted out immediately in an Analysis Services cube. My requirement is as follows.
The following data explains the average value of each employee in corresponding level.
Level1 - > E1 – (25hrs /25days) =1 hrs/day
Level2 ---------- >E2 – (125hrs /25days) = 5 hrs/day
Level3 ------------------------ >E4 – (150hrs /25days) = 6hrs/day
Level4 --------------------------------------- > E6 – (100hrs /25days) = 4hrs/day
Level4 --------------------------------------- > E7 – (75hrs/25days) = 3hrs/day
Level4 --------------------------------------- > E8 – (175hrs/25days) = 7hrs/day
Level3 ------------------------ >E5 – (75hrs/25days) = 3hrs/day
Level2 ---------- >E3 – (100hrs /25days) = 4hrs/day
Eg:
I have productivity records for each day and each employee. I need to calculate the average of each last-level employee's productivity by month. Then each last-level employee's productivity average must be added up with their immediate head's. But when I define a measure item as an average in the cube, it sums all the values of the last-level employees and the head and divides by the number of records (a normal average).
My requirement is to calculate each head's average as the sum of each last-level employee's average divided by the number of employees. If the head itself has a value, it too is included. Again, the head's average is added up to its immediate head.
The following calculation gives average value at each level.
Average of Level 3 (E4) => (4+3+7+6)/4 = 5 hrs/day = (E6+E7+E8+E4)/4
Average of Level 2 (E2) => (5+3+5)/3 = 4.333 = (avg(E4) + avg(E5) + E2)/3
Average of Level 1 (E1) => (4.333+4+1)/3 = 3.111 = (avg(E2) + avg(E3) + E1)/3
Formula for the average of a level: (sum of the children's values + the corresponding head's value) / (no. of children + 1)
I want to calculate average of each employee as well as average of each level in cube (SQL Server Analysis Services).
Thanks in advance
Thiru
View 1 Replies
Jul 20, 2005
Firstly, sorry for the long post, but I've included a fair bit of sample data.
I'm doing a comparison of 10yr Bond prices and CPI adjustments, with an 18 week moving average of the CPI. I'm using a nested sub-query to calculate the moving average, but I'm having difficulty selecting exactly 18 data points (i.e. when I include the 'HAVING COUNT(C1.Closes) = 18' line, I get no results).
Can anyone help?

-- Some sample data:
CREATE TABLE Bond10 (
Closes [datetime] NOT NULL,
Prices [smallmoney] NOT NULL
)
INSERT INTO Bond10
SELECT '1994-01-14', 6.57 UNION SELECT '1994-01-21', 6.53 UNION SELECT '1994-01-28', 6.44 UNION
SELECT '1994-02-04', 6.51 UNION SELECT '1994-02-11', 6.54 UNION SELECT '1994-02-18', 6.89 UNION
SELECT '1994-02-25', 7.18 UNION SELECT '1994-03-04', 7.43 UNION SELECT '1994-03-11', 7.43 UNION
SELECT '1994-03-18', 7.44 UNION SELECT '1994-03-25', 7.66 UNION SELECT '1994-04-01', 7.96 UNION
SELECT '1994-04-08', 8.07 UNION SELECT '1994-04-15', 8.24 UNION SELECT '1994-04-22', 8.23 UNION
SELECT '1994-04-29', 8.45 UNION SELECT '1994-05-06', 8.82 UNION SELECT '1994-05-13', 8.86 UNION
SELECT '1994-05-20', 8.44 UNION SELECT '1994-05-27', 8.75 UNION SELECT '1994-06-03', 8.79 UNION
SELECT '1994-06-10', 8.77 UNION SELECT '1994-06-17', 9.24 UNION SELECT '1994-06-24', 9.63 UNION
SELECT '1994-07-01', 9.66 UNION SELECT '1994-07-08', 9.59 UNION SELECT '1994-07-15', 9.41 UNION
SELECT '1994-07-22', 9.56 UNION SELECT '1994-07-29', 9.58 UNION SELECT '1994-08-05', 9.31

CREATE TABLE AvgCPI (
Closes [datetime] NOT NULL,
AvgCPI [smallmoney] NOT NULL
)
INSERT INTO AvgCPI
SELECT '1994-01-14', 2.04 UNION SELECT '1994-01-21', 2.04 UNION SELECT '1994-01-28', 2.04 UNION
SELECT '1994-02-04', 2.04 UNION SELECT '1994-02-11', 2.04 UNION SELECT '1994-02-18', 2.04 UNION
SELECT '1994-02-25', 2.04 UNION SELECT '1994-03-04', 1.51 UNION SELECT '1994-03-11', 1.51 UNION
SELECT '1994-03-18', 1.51 UNION SELECT '1994-03-25', 1.51 UNION SELECT '1994-04-01', 1.51 UNION
SELECT '1994-04-08', 1.51 UNION SELECT '1994-04-15', 1.51 UNION SELECT '1994-04-22', 1.51 UNION
SELECT '1994-04-29', 1.51 UNION SELECT '1994-05-06', 1.51 UNION SELECT '1994-05-13', 1.51 UNION
SELECT '1994-05-20', 1.51 UNION SELECT '1994-05-27', 1.51 UNION SELECT '1994-06-03', 1.80 UNION
SELECT '1994-06-10', 1.80 UNION SELECT '1994-06-17', 1.80 UNION SELECT '1994-06-24', 1.80 UNION
SELECT '1994-07-01', 1.80 UNION SELECT '1994-07-08', 1.80 UNION SELECT '1994-07-15', 1.80 UNION
SELECT '1994-07-22', 1.80 UNION SELECT '1994-07-29', 1.80 UNION SELECT '1994-08-05', 1.80

-- My query so far:
SELECT A1.Closes, A1.Prices, B1.AvgCPI, SUM(C1.AvgCPI) AS MovSumCPI,
AVG(C1.AvgCPI) AS MovAvgCPI, COUNT(C1.AvgCPI) AS Counter
FROM (SELECT Closes, Prices FROM Bond10) A1
LEFT JOIN (SELECT Closes, AvgCPI FROM AvgCPI) B1
ON A1.Closes = B1.Closes
LEFT JOIN (SELECT Closes, AvgCPI FROM AvgCPI) C1
ON C1.Closes >= A1.Closes AND DATEADD(Week,-18,C1.Closes) < A1.Closes
GROUP BY A1.Closes, A1.Prices, B1.AvgCPI, C1.AvgCPI
-- HAVING COUNT(C1.Closes) = 18
ORDER BY A1.Closes

DROP TABLE Bond10
DROP TABLE AvgCPI

Expected Results
Closes Bon10 AvCPI MovAvg
========== ==== ==== ====
14-Jan-94 6.57 2.04
14-Jan-94 6.57 2.04
21-Jan-94 6.53 2.04
21-Jan-94 6.53 2.04
28-Jan-94 6.44 2.04
28-Jan-94 6.44 2.04
4-Feb-94 6.51 2.04
4-Feb-94 6.51 2.04
4-Feb-94 6.51 2.04
11-Feb-94 6.54 2.04
11-Feb-94 6.54 2.04
11-Feb-94 6.54 2.04
18-Feb-94 6.89 2.04
18-Feb-94 6.89 2.04
18-Feb-94 6.89 2.04
25-Feb-94 7.18 2.04
25-Feb-94 7.18 2.04
25-Feb-94 7.18 2.04 2.04
4-Mar-94 7.43 1.51 2.01
4-Mar-94 7.43 1.51 1.98
11-Mar-94 7.43 1.51 1.95
11-Mar-94 7.43 1.51 1.92
18-Mar-94 7.44 1.51 1.89
18-Mar-94 7.44 1.51 1.86
25-Mar-94 7.66 1.51 1.83
25-Mar-94 7.66 1.51 1.80
1-Apr-94 7.96 1.51 1.78
1-Apr-94 7.96 1.51 1.75
8-Apr-94 8.07 1.51 1.72
8-Apr-94 8.07 1.51 1.69
15-Apr-94 8.24 1.51 1.66
15-Apr-94 8.24 1.51 1.63
22-Apr-94 8.23 1.51 1.60
22-Apr-94 8.23 1.51 1.57
29-Apr-94 8.45 1.51 1.54
29-Apr-94 8.45 1.51 1.51
6-May-94 8.82 1.51 1.51
6-May-94 8.82 1.51 1.51
13-May-94 8.86 1.51 1.51
13-May-94 8.86 1.51 1.51
20-May-94 8.44 1.51 1.51
20-May-94 8.44 1.51 1.51
27-May-94 8.75 1.51 1.51
27-May-94 8.75 1.51 1.51
3-Jun-94 8.79 1.8 1.53
10-Jun-94 8.77 1.8 1.54
17-Jun-94 9.24 1.8 1.56
24-Jun-94 9.63 1.8 1.57
1-Jul-94 9.66 1.8 1.59
8-Jul-94 9.59 1.8 1.61
15-Jul-94 9.41 1.8 1.62
22-Jul-94 9.56 1.8 1.64
29-Jul-94 9.58 1.8 1.66
5-Aug-94 9.31 1.8 1.67

Thanks,
Stephen
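One likely reason the HAVING clause returns nothing is that C1.AvgCPI is part of the GROUP BY, so each bond date is split into one group per distinct CPI value and no group ever reaches 18 rows. A sketch with that column removed from the grouping, assuming a trailing 18-week window is what the expected results describe:

SELECT A1.Closes, A1.Prices, B1.AvgCPI,
       SUM(C1.AvgCPI) AS MovSumCPI,
       AVG(C1.AvgCPI) AS MovAvgCPI,
       COUNT(C1.Closes) AS Counter
FROM Bond10 A1
LEFT JOIN AvgCPI B1
       ON B1.Closes = A1.Closes
LEFT JOIN AvgCPI C1
       ON C1.Closes <= A1.Closes
      AND C1.Closes > DATEADD(Week, -18, A1.Closes)   -- trailing 18 weekly points
GROUP BY A1.Closes, A1.Prices, B1.AvgCPI
HAVING COUNT(C1.Closes) = 18
ORDER BY A1.Closes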
View 6 Replies
Jun 9, 2015
I am having trouble calculating AVG in a matrix. I have done the subtotal but I would like to know how to calculate the average. I would also like the AVG below the Total.
Link : [URL] ....
I have also added an image of how the matrix looks.
View 5 Replies
Aug 18, 2014
I need help developing a query to get the average count by the following:
Day - use daily info for the last ??? days
Weekly - average
- Add all days and divide by 7
- As of Saturday midnight
Monthly - average
- Add all days and divide by days in the month
- As of last save on last day of month
Quarter - average
- Add all days and divide by number of days in the quarter
- As of last day of quarter
Year - average
I don't have requirements for year as of yet.
How can I get the average count for these timeframes?
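A sketch of one way to frame the daily/weekly/monthly averages, assuming a hypothetical table DailyCounts(CountDate, Cnt) that already holds one count per day; EOMONTH needs SQL Server 2012 or later:

-- weekly average: add all days and divide by 7
SELECT DATEPART(year, CountDate) AS Yr, DATEPART(week, CountDate) AS Wk,
       SUM(Cnt) / 7.0 AS WeeklyAvg
FROM DailyCounts
GROUP BY DATEPART(year, CountDate), DATEPART(week, CountDate);

-- monthly average: add all days and divide by the number of days in that month
SELECT DATEPART(year, CountDate) AS Yr, DATEPART(month, CountDate) AS Mo,
       SUM(Cnt) * 1.0 / DAY(EOMONTH(MIN(CountDate))) AS MonthlyAvg
FROM DailyCounts
GROUP BY DATEPART(year, CountDate), DATEPART(month, CountDate);

The quarterly figure follows the same pattern with DATEPART(quarter, CountDate) and a count of days in the quarter; the "as of Saturday midnight" / "last save" cut-offs would be extra WHERE conditions on CountDate.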
View 7 Replies
Apr 27, 2015
How would you calculate the average read/write latency experienced by a SQL Server instance during a specific time window, in order to monitor this for multiple instances? From this MSDN blog, I know that you have to take multiple samples and do some calculations to get the correct latency.
[URL] ...
However, the SQLServer:Resource Pool Stats object tracks these numbers per resource pool and we want to get one number for the whole server. Since there can be a different base value for each resource pool, you can't simply sum the numerator values together. Here's some sample data from a server that illustrates the problem.
object_name counter_name instance_name cntr_value cntr_type
SQLServer:Resource Pool Stats Avg Disk Read IO (ms) default 307318919 1073874176
SQLServer:Resource Pool Stats Avg Disk Read IO (ms) Base default 25546724 1073939712
SQLServer:Resource Pool Stats Avg Disk Read IO (ms) internal 2045730 1073874176
SQLServer:Resource Pool Stats Avg Disk Read IO (ms) Base internal 208270 1073939712
I'm thinking I would need to do some sort of weighted average, but I'm not sure if that will result in the correct value. Here's the formula I am thinking about using currently before doing the calculation over time
((default * default[base]) + (internal * internal[base]))/(default[base] + internal[base])
Then to do the calculation over time, I'd use the changes in the calculated numerator and denominator to get the average.
Does this sound like the correct way to get this value? Is there a good way to verify it?
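A sketch of the server-wide figure taken straight from the DMV. Because these are PERF_AVERAGE_BULK counters, the raw cntr_value of the 'Avg' counter is a cumulative numerator and the matching 'Base' counter is its denominator, so summing numerators and bases across pools is equivalent to the base-weighted average of the per-pool values; sampling this twice and using the deltas then gives the average for the window:

SELECT SUM(CASE WHEN RTRIM(counter_name) = 'Avg Disk Read IO (ms)'      THEN cntr_value END) AS read_ms_total,
       SUM(CASE WHEN RTRIM(counter_name) = 'Avg Disk Read IO (ms) Base' THEN cntr_value END) AS read_ops_total,
       SUM(CASE WHEN RTRIM(counter_name) = 'Avg Disk Read IO (ms)'      THEN cntr_value END) * 1.0
       / NULLIF(SUM(CASE WHEN RTRIM(counter_name) = 'Avg Disk Read IO (ms) Base' THEN cntr_value END), 0)
         AS avg_read_ms_since_start
FROM sys.dm_os_performance_counters
WHERE object_name LIKE '%Resource Pool Stats%'      -- covers default and named instances
  AND counter_name LIKE 'Avg Disk Read IO (ms)%';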
View 2 Replies
Jun 22, 2015
I have got this matrix and I am trying to calculate the average amount of working days in a month. At the moment, I have divided the total number of jobs by 21 for every month which is a hard coded value. However, I am not sure how to retrieve this value dynamically. Is there any formula that can find out the working days?
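A sketch that counts the weekdays (Monday to Friday, ignoring public holidays) in the month of a given date, which could replace the hard-coded 21; the @AnyDate variable is only an illustration and English weekday names are assumed:

DECLARE @AnyDate DATE;
SET @AnyDate = '2015-06-15';

SELECT COUNT(*) AS WorkingDays
FROM (SELECT DATEADD(DAY, ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1,
                     DATEADD(DAY, 1 - DAY(@AnyDate), @AnyDate)) AS d    -- dates from the 1st onwards
      FROM sys.all_objects) AS x                                         -- sys.all_objects only serves as a row source
WHERE MONTH(d) = MONTH(@AnyDate) AND YEAR(d) = YEAR(@AnyDate)
  AND DATENAME(WEEKDAY, d) NOT IN ('Saturday', 'Sunday');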
View 7 Replies
Oct 15, 2004
heyas,
I want to know how I can select only entries with at least 2 rows.
What I mean is that I want to run a select * from table1 where columnA = 'Something' order by columnB,
but additionally I want it to return only records whose columnB value occurs at least 2 times.
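A sketch using a grouped subquery to keep only the columnB values that occur at least twice (table and column names as in the question):

select t.*
from table1 as t
join (select columnB
      from table1
      where columnA = 'Something'
      group by columnB
      having count(*) >= 2) as dup
  on dup.columnB = t.columnB
where t.columnA = 'Something'
order by t.columnB;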
thanks in advance!
View 2 Replies
Mar 25, 2008
I have a table with four columns: id, value1, value2, and name. I need to select all duplicate rows with the same name in the name field. Once selected I need to average the value1 and value2 rows with each other. Then I would like to delete the duplicate rows and replace with the new averaged values. Should I first move the duplicate rows to a temp table and then average? Any help would be appreciated.
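A sketch of one way to do it with a temp table, assuming a hypothetical table MyTable(id, value1, value2, name); only names that actually have duplicates are touched:

-- 1) average the duplicated names into a temp table
SELECT MIN(id) AS id, AVG(value1) AS value1, AVG(value2) AS value2, name
INTO #averaged
FROM MyTable
GROUP BY name
HAVING COUNT(*) > 1;

-- 2) remove the duplicate rows
DELETE FROM MyTable WHERE name IN (SELECT name FROM #averaged);

-- 3) insert one averaged row per name
INSERT INTO MyTable (id, value1, value2, name)
SELECT id, value1, value2, name FROM #averaged;

DROP TABLE #averaged;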
View 1 Replies
Sep 11, 2014
I have a table like the one below:
create table staff_attendance
(
attendance_id int,
attendace_date datetime,
staff_id int,
working_year int,
hours int
)
values like
1 2014-06-30 00:00:00.0ST10121
2 2014-06-30 00:00:00.0ST10122
3 2014-06-30 00:00:00.0ST10122 ----same entry like previous one
4 2014-07-01 00:00:00.0ST10121
5 2014-07-01 00:00:00.0ST10122
6 2014-07-02 00:00:00.0ST10121
7 2014-07-02 00:00:00.0ST10122
8 2014-06-30 00:00:00.0ST10221
9 2014-06-30 00:00:00.0ST10222
10 2014-07-01 00:00:00.0ST1022 1
11 2014-07-01 00:00:00.0ST102 22
12 2014-07-02 00:00:00.0ST102 21
13 2014-07-02 00:00:00.0ST102 22
I need to find the duplicate rows, i.e. identical entries that occur more than once. How do I find them?
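A sketch that lists the combinations appearing more than once, grouping on every column except the id (which is what makes rows 2 and 3 above duplicates); column names follow the CREATE TABLE statement:

SELECT attendace_date, staff_id, working_year, hours, COUNT(*) AS occurrences
FROM staff_attendance
GROUP BY attendace_date, staff_id, working_year, hours
HAVING COUNT(*) > 1;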
View 3 Replies
Apr 9, 2008
1. Is it possible to use the subtotal as a field for calculating values?
2. Can I add another row to the subtotal area, having two subtotal rows?
I need to achieve the following output:
Months
1 2 3
101200 Cust1 2008 50 40
2007 45 45
Subtotal +5 -5
102300 Cust2 2008 70 80
Subtotal 0 0
What I want is to use the subtotal as a variance field for the two years selected by the user. It could happen that a customer doesn't generate turnover in one year; then the subtotal has to be 0 or 100 or whatever.
If my suggestions aren't applicable, then I'd like to know if there is another possibility to display the desired results.
Thanks!
View 8 Replies
Aug 23, 2007
Hello forum.
I have three columns in my table with the following datatypes:
Date - DateTime
Height - Decimal
Change_From_last_Height - Decimal
I am using "SQL Server 2005 Express Edition". I'm fairly new to SQL and would greatly appreciate any help or advice I can get.
The "date" column increases by an extra day in every new row and I then enter the new height of the plant. What I want to know is how I can get SQL server express to automatically enter the difference in height between the current row's height and that of the previous row.
Is it possible to automate the entry in the Change_From_Last_Height column in SQL?
Put another way, I know how to find the difference between two values in the same row but different columns, but how do I calculate the difference between values in adjacent rows (i.e. rows next to each other)?
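A sketch for SQL Server 2005 (which has no LAG function): number the rows by date and join each row to the previous one. The table name PlantHeights is hypothetical; the columns follow the post:

WITH ordered AS (
    SELECT [Date], Height,
           ROW_NUMBER() OVER (ORDER BY [Date]) AS rn
    FROM PlantHeights
)
SELECT cur.[Date],
       cur.Height,
       cur.Height - prev.Height AS Change_From_Last_Height   -- NULL for the first row
FROM ordered AS cur
LEFT JOIN ordered AS prev ON prev.rn = cur.rn - 1;

Rather than storing Change_From_Last_Height, this calculation could live in a view so it never goes stale.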
Thank you.
View 3 Replies
Jul 23, 2005
Hi,
I'm looking for a little help. I need to create a SQL view which joins a few tables, and I need to return an average for a particular column where a second duplicate ID exists...
Here's an example of how the results could be returned...
ID | Name | Order No. | Value
---+------+-----------+---------
5 | test | 1234 | 3
5 | test2| 1234 | 4
5 | test3| 1234 | 3
5 | void | 1235 | 5
5 | void2| 1235 | 6
5 | void3| 1235 | 5
5 | void4| 1235 | 7
ID is my main join which joins the tables.
Name is a unique name.
Order No is the same for the different names; I only need to return one row with this order no, and the first name (the rest are irrelevant).
Value is the field which I wish to return as an average of all 3, 4 or however many rows are returned that share the same order no. This is where I get totally lost, as I am pretty new to SQL. Can anyone provide any help on how I would go about limiting this query to the unique order no's and returning the average of the value field, and I can take it from there with my own tables.
Thanks for your help
str8
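A sketch of the grouping, run against a hypothetical source that matches the example layout (in practice it would be the join of the real tables):

SELECT ID,
       MIN(Name) AS Name,                              -- "first" name for the order
       [Order No.],
       AVG(CAST(Value AS DECIMAL(10, 2))) AS AvgValue  -- cast so integer averages keep decimals
FROM OrderLines                                         -- hypothetical name for the joined result
GROUP BY ID, [Order No.];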
View 3 Replies
Oct 2, 2014
I have two tables that look like this (below). One tells me the name of my product, the Amazon Category it is in & the amount that I want to sell it for. The other tells me the Category & the fee for that category. So far so good. Though it gets tricky in the sense that some categories have two tiers. So in Electronics, the fee for $0.00 - $100.00 is 15%. But from $100 and up it is 8%.
Since it has two columns & both of the new columns pertain to the fee of my product, I can't figure out how to use both at once. For my $599.99 example it would be ($100 * 0.15) + ($499.99 * 0.08) = $55.00. Would I pivot the data? If not, how would I group it to be considered together?
Category Example
ID | AmazonCategoryID | AmazonCategoryName | FeePercentage | StartPrice | EndPrice
1 | aps | AllDepartments | 0.15 | 0.00 | 0.00
2 | instant-video | AmazonInstantVideo | 0.00 | 0.00 | 0.00
3 | appliances | Appliances | 0.15 | 0.00 | 0.00
Product Example
1 | Product1 | Electronics | 9.99
2 | Product3 | Electronics | 99.99
3 | Product2 | Electronics | 599.99
Raw SQL
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE #Amzn_Category_FeeStructure(
[ID] [int] IDENTITY(1,1) NOT NULL,
[Code] ....
I use Microsoft SQL 2008
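A sketch of the two-tier fee calculation, assuming hypothetical table and column names modelled on the examples above (one fee row per category and tier, with StartPrice/EndPrice marking the tier boundaries):

SELECT p.ProductName,
       p.Price,
       CASE WHEN t2.FeePercentage IS NULL OR p.Price <= t1.EndPrice
            THEN p.Price * t1.FeePercentage                          -- single tier, or price within tier 1
            ELSE t1.EndPrice * t1.FeePercentage
                 + (p.Price - t1.EndPrice) * t2.FeePercentage        -- split across the two tiers
       END AS Fee
FROM Product AS p
JOIN Amzn_Category_FeeStructure AS t1
  ON t1.AmazonCategoryName = p.Category
 AND t1.StartPrice = 0
LEFT JOIN Amzn_Category_FeeStructure AS t2
  ON t2.AmazonCategoryName = p.Category
 AND t2.StartPrice = t1.EndPrice
 AND t2.StartPrice > 0;

For the $599.99 Electronics example this gives (100 * 0.15) + (499.99 * 0.08) = 54.9992, i.e. the $55.00 figure in the post.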
View 2 Replies
Jun 21, 2015
Calculation of an average using DAX's AVERAGE and AVERAGEX. This is the manual calculation in the DW, using SQL. In the tabular project (where I've already noticed that these 4 percentages are in themselves strange), I first noticed that I would have to divide by 100 to get the same values as in the DW, so I've used AVERAGEX:
Avg_AMP:=AVERAGEX('Fct Sales';'Fct Sales'[_AMP]/100)
Avg_AMPdollar:=AVERAGEX('Fct Sales';'Fct Sales'[_AMPdollar]/100)
Avg_FMP:=AVERAGEX('Fct Sales';'Fct Sales'[_FMP]/100)
Avg_FMPdollar:=AVERAGEX('Fct Sales';'Fct Sales'[_FMPdollar]/100)
The results were, respectively: 701,68; 2120,60...; -669,441; and finally -694,74 for Avg_FMPdollar. I can't understand the difference to the SQL calculation, since these calculations are similar to the other ones. After that I've tried:
test:=SUM([_FMPdollar])/COUNTROWS('Fct Sales') and the value was equal to SQL: -672,17
test2:=AVERAGE('Fct Sales'[_Frontend Margin Percent ACY]), and here, without dividing by 100 at the end: -696,74...
So AVERAGE and AVERAGEX behave differently from the SUM divided by COUNTROWS, and even more strangely, test2 doesn't need the division by 100 to come close to the AVERAGEX result.
I even calculated the number of blanks and the number of zeros in each column, thinking it could be a difference in the denominator (so a division by a different number of rows), but they are equal for each column.
View 2 Replies
Dec 2, 2013
If I wanted to search for jobs at a particular status (e.g. 0130) and wanted to keep showing the jobs at this status until they have reached 0500, 0125, or 0900 in a subsequent status log entry, how can I write the SQL to achieve it?
I have the following SQL which searches for the jobs at 0130, but I don't know how to develop it further to cover the requirement above.
------ SQL -------
SELECT
job.job_number,
(SELECT MAX(jsl.job_log_number)
FROM job_status_log jsl
WHERE
job.job_number = jsl.job_number AND
jsl.status_code = '0130') as Last_Early_Warning_Status_Entry
[code].....
In the job_status_log table above, there is a job_log_number field which increments by 1 when there is a new status log entry.
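A sketch of one interpretation: return jobs that have a 0130 entry with no later 0500/0125/0900 entry, using job_log_number as the ordering (table and column names as in the post; the join back to the job table follows the truncated query above):

SELECT DISTINCT job.job_number
FROM job
JOIN job_status_log AS s130
  ON s130.job_number = job.job_number
 AND s130.status_code = '0130'
WHERE NOT EXISTS (SELECT 1
                  FROM job_status_log AS later
                  WHERE later.job_number = job.job_number
                    AND later.job_log_number > s130.job_log_number
                    AND later.status_code IN ('0500', '0125', '0900'));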
View 1 Replies
Feb 15, 2008
I have a temp_max column and a temp_min column with data for every day for 60 years. I want the average temp for jan of yr1 through yr60, averaged...
I.E. the avg temp for Jan of yr1 is 20 and the avg temp for Jan of yr2 is 30, then the overall average is 25.
The complexity lies in calculating a daily average by month, THEN a yearly average by month, in one statement.
?confused?
Here's the original query.
accept platformId CHAR format a6 prompt 'Enter Platform Id (capital letters in ''): '
SELECT name, country_cd from weather_station where platformId=&&platformId;
SELECT to_char(datetime,'MM') as MO, max(temp_max) as max_T, round(avg((temp_max+temp_min)/2),2) as avg_T, min(temp_min) as min_temTp, count(unique(to_char(datetime, 'yyyy'))) as TOTAL_YEARS
FROM daily
WHERE platformId=&&platformId and platformId = platformId and platformId = platformId and datetime=datetime and datetime=datetime
GROUP BY to_char(datetime,'MM')
ORDER BY to_char(datetime,'MM');
with a result of:
NAME                 CO
-------------------- --
OFFUTT AFB           US

MO   MAX_T   AVG_T   MIN_TEMTP   TOTAL_YEARS
--   -----   -----   ---------   -----------
01     21    -5.31     -30           60
02     26    -2.19     -28.3         61
03     31.1   3.61     -26.1         60
04     35.6  11.07     -12.2         60
05     37.2   17.2      -3.3         60
06     41.1  22.44       5           60
07     43.3  24.92       7.2         60
08     40.6  23.71       5.6         60
09     40    18.84      -2.2         59
10     34.4   12.5      -8.9         59
11     29     4.13     -23.9         60
12     21    -2.52     -28.3         60
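A sketch of the two-step average (first a monthly mean per year, then the mean of those yearly figures), keeping the Oracle-style functions and the &&platformId substitution variable used in the original query:

SELECT mo, ROUND(AVG(avg_t_year), 2) AS avg_t
FROM (
    SELECT TO_CHAR(datetime, 'MM')   AS mo,
           TO_CHAR(datetime, 'YYYY') AS yr,
           AVG((temp_max + temp_min) / 2) AS avg_t_year   -- monthly average within one year
    FROM daily
    WHERE platformId = &&platformId
    GROUP BY TO_CHAR(datetime, 'MM'), TO_CHAR(datetime, 'YYYY')
)
GROUP BY mo                                               -- then average the yearly figures per month
ORDER BY mo;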
View 4 Replies
Jan 19, 2008
Hi,
I'm working on a database for a financial client, and part of what I need to do is calculate a value from two separate rows in the same table and insert the result in the same table as a new row. I have a way of doing so, but I consider it to be extremely inelegant and I'm hoping there's a better way of doing it. A description of the existing database schema (which I have control over) will help in explaining the problem:
Table Name: metrics_ladder
id security_id metric_id value
1 3 80 125.45
2 3 81 548.45
3 3 82 145.14
4 3 83 123.32
6 4 80 453.75
7 4 81 234.23
8 4 82 675.42
.
.
.
Table Name: metric_details
id metric_id metric_type_id metric_name
1 80 2 Fiscal Enterprise Value Historic Year 1
2 81 2 Fiscal Enterprise Value Current Fiscal Year
3 82 2 Fiscal Enterprise value Forward Fiscal year 1
4 83 2 Fiscal Enterprise Value Forward Fiscal Year 2
5 101 3 Calendar Enterprise value Historic Year 1
6 102 3 Calendar Enterprise Value Current Fiscal Year
5 103 3 Calendar Enterprise value Forward Year 1
6 104 3 Calendar Enterprise Value Forward Year 2
Table Name: metric_type_details
id metric_type_id metric_type_name
1 1 Raw
2 2 Fiscal
3 3 Calendar
4 4 Calculated
The problem scenario is the following: because a certain number of the securities have a fiscal year end that is different from the calendar year end, in addition to having fiscal data (such as fiscal enterprise value and fiscal earnings etc...) for each security I also need to store calendarised data. What this means is that if the security with security_id = 3 has a fiscal year end of October, then using rows with ids = 1, 2, 3 and 4 from the metrics_ladder table I need to calculate metrics with metric_id = 101, 102, 103 and 104 (as described in the metric_details table) and insert the following 4 new records into metrics_ladder:
id security_id metric_id value
1 3 101 <calculated value>
2 3 102 <calculated value>
3 3 103 <calculated value>
4 3 104 <calculated value>
Metric with metric_id = 101 (Calendar Enterprise value Historic Year 1) will be calculated by taking 10/12 of the value for metric_id 80 plus 2/12 of the value for metric_id 81.
Similarly, metric_id 102 will be equal to 10/12 of the value for metric_id 81 plus 2/12 of the value for metric_id 82,
metric_id 103 will be equal to 10/12 of the value for metric_id 82 plus 2/12 of the value for metric_id 83 and finally
metric_id 104 will be NULL (determined by business requirements as there is no data for forward year 3 to use).
As I could think of no better way of doing this (and hence the reason for this thread), I am currently achieving this by pivoting the relevant data from metrics_ladder so that the required data for each security is in one row, storing the result in a new column, then unpivoting again to store the result in the metrics_ladder table. So the above data in metrics_ladder becomes:
security_id 80 81 82 83 101 102
----------- -- -- -- -- -- --
3 125.45 548.45 145.14 123.32 <calculated value> <calculated value>
4 ...
.
.
.
which is then unpivoted.
The SQL that achieves this is more or less as follows:
*********
START SQL
*********
declare @calendar_averages table (security_id int, [101] decimal(38,19), [102] decimal(38,19), [103] decimal(38,19), [104] decimal(38,19),etc...)
-- Dummy year variable to make it easier to use MONTH() function
-- to convert 3 letter month to number. i.e. JAN -> 1, DEC -> 12 etc...
DECLARE @DUMMY_YEAR VARCHAR(4)
SET @DUMMY_YEAR = 1900;
with temp(security_id, metric_id, value)
as
(
select ml.security_id, ml.metric_id, ml.value
from metrics_ladder ml
where ml.metric_id in (80,81,82,83,84,85,86,87,88,etc...)
-- only consider securities with fiscal year end not equal to december
and ml.security_id in (select security_id from company_details where fiscal_year_end <> 'dec')
)
insert into @calendar_averages
select temppivot.security_id
-- Net Income
,(CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))/12*[80])
+((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)))/12*[81]) as [101]
,(CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))/12*[81])
+((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)))/12*[82]) as [102]
,(CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))/12*[82])
+((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)))/12*[83]) as [103]
,NULL as [104]
-- Share Holders Equity
,(CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))/12*[84])
+((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)))/12*[85]) as [105]
,(CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))/12*[85])
+((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)))/12*[86]) as [106]
,(CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))/12*[86])
+((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)))/12*[87]) as [107]
,NULL as [108]
-- Capex
-- Sales
-- Accounts payable
etc...
..
..
from temp
pivot
(
sum(value)
for metric_id in ([80],[81],[82],[83],[84],[85],[86],[87],[88],etc...)
) as temppivot
inner join company_details cd on temppivot.security_id = cd.security_id
*********
END SQL
*********
The result then needs to be unpivoted and stored in metrics_ladder.
And FINALLY, the question! Is there a more elegant way of achieving this??? I have complete control over the database schema, so if creating mapping tables or anything along those lines would help, it is possible. Also, is SQL not really suited for such operations, and would it therefore be better done in C#/VB.NET?
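A sketch of a join-based alternative that skips the pivot/unpivot round trip: drive the calculation from a small mapping of fiscal metric to next fiscal metric to target calendar metric. The mapping CTE is an assumption (not part of the existing schema), and the fiscal-year-end month trick from the original query is reused:

WITH metric_map (fiscal_id, next_fiscal_id, calendar_id) AS (
    SELECT 80, 81, 101 UNION ALL
    SELECT 81, 82, 102 UNION ALL
    SELECT 82, 83, 103
)
INSERT INTO metrics_ladder (security_id, metric_id, value)
SELECT cur.security_id,
       m.calendar_id,
       (fy.fy_month / 12.0) * cur.value + ((12 - fy.fy_month) / 12.0) * nxt.value
FROM metric_map AS m
JOIN metrics_ladder AS cur
  ON cur.metric_id = m.fiscal_id
JOIN metrics_ladder AS nxt
  ON nxt.security_id = cur.security_id
 AND nxt.metric_id = m.next_fiscal_id
JOIN company_details AS cd
  ON cd.security_id = cur.security_id
 AND cd.fiscal_year_end <> 'dec'
CROSS APPLY (SELECT MONTH(CAST(cd.fiscal_year_end + ' 1900' AS DATETIME)) AS fy_month) AS fy;

The NULL forward-year rows (metric_id 104, 108, ...) would still be inserted separately, and the mapping only needs one extra row per additional metric family (84/85/86 -> 105/106/107 and so on).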
Many thanks (if you've read this far!)
M.
View 6 Replies
Dec 1, 1998
Below are the registry entries I want to update but "Trusted_Connection" = "Yes" does not seem to have any effect when it is run. Any ideas??
REGEDIT4
[HKEY_CURRENT_USER\Software\ODBC\ODBC.INI\AAED]
"Driver"="C:\WINNT\System32\sqlsrv32.dll"
"Description"="AAED"
"Server"="SIM0108"
"Trusted_Connection"="Yes"
[HKEY_CURRENT_USER\Software\ODBC\ODBC.INI\ODBC Data Sources]
"AAED"="SQL Server"
View 2 Replies
Jan 27, 2005
Hi, I am searching for the simplest SQL solution:
Let's say I have 500000 rows of cars in one table, with 30000 different car IDs.
This table contains no keys and no date values, but I need the last row for each of these 30000 different cars. How do I get them without adapting the table structure and without using cursors?
Thx for reply.
dajm
View 11 Replies
Sep 24, 2006
Hello
my table :
Report :
R_id (PK)
RName
RDate
I have a few tens of thousands of rows and I want to keep at most the last 10 rows (or fewer, if there aren't 10 in the table) for each name.
I can have 100 reports per name (100 rows with the same name; of course R_id and RDate are different).
How can I do it?
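A sketch that deletes everything beyond the 10 most recent rows per name (assumes SQL Server 2005 or later for ROW_NUMBER; table and columns as listed above):

WITH ranked AS (
    SELECT R_id,
           ROW_NUMBER() OVER (PARTITION BY RName ORDER BY RDate DESC) AS rn
    FROM Report
)
DELETE FROM ranked   -- deleting through the CTE removes the underlying Report rows
WHERE rn > 10;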
thanks a lot for helping
View 1 Replies
Mar 30, 2012
I have a table which stores phone numbers of a customer in a table.
Say this table is as below
CustomerName - PhoneNumber
Customer 1 - Phone number 1
Customer 2 - Phone number 2
Customer 2 - Phone number 3
Customer 3 - Phone number 4
What would be the best approach to prevent adding another entry against Customer 2? I should be able to add new customers and add multiple phone numbers against all other customers. The restriction should apply only to Customer 2.
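A sketch using a trigger to reject inserts for that one customer; the table name CustomerPhones is hypothetical and the columns follow the example above:

CREATE TRIGGER trg_BlockCustomer2Phones
ON CustomerPhones
AFTER INSERT
AS
BEGIN
    IF EXISTS (SELECT 1 FROM inserted WHERE CustomerName = 'Customer 2')
    BEGIN
        RAISERROR('New phone numbers cannot be added for Customer 2.', 16, 1);
        ROLLBACK TRANSACTION;   -- undoes the offending insert
    END
END;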
View 4 Replies
May 14, 2004
I am trying to pull the last three entries for each item from a table in my database, but I am having trouble writing the correct query. The database has multiple entries for each item, but I want to pull just the last three for each item. I have tried the TOP function with the items ordered in descending order, but that only works when I define each item separately. I need to know how to pull the last three entries for every item I have in the table.
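A sketch using ROW_NUMBER (SQL Server 2005 or later; on SQL Server 2000 a correlated TOP 3 subquery per item would be needed instead). ItemTable, ItemId and EntryDate are hypothetical names standing in for the real table:

WITH ranked AS (
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY ItemId ORDER BY EntryDate DESC) AS rn
    FROM ItemTable
)
SELECT *
FROM ranked
WHERE rn <= 3              -- the three most recent entries per item
ORDER BY ItemId, EntryDate DESC;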
Thanks,
Tim
View 3 Replies
Nov 8, 2005
Hello
Without going into technical information and DB design, I wish to know the maximum number of entries in MS SQL, in millions. I guess Access has a limit of around 2 million entries and a 2GB DB.
How much is it for MS SQL, in maximum entries and in GB?
A lot of thx for your feedback.
Dominique Javet
View 4 Replies
Mar 28, 2006
I'm extracting data from a log (log_history) of patients where nurses perform various actions on a call, such as assessing and reassessing, despatching etc. This is the script:
Select
L.URN,
LH.THE_TIMESTAMP,
LH.ACTION_TYPE,
LH.ACTION_BY,
LH.ACTION_REQD,
LH.NOTE,
em.position_type_ref
From
LOG L
Join Log_history LH on (L.URN = LH.LOG_URN)
left outer join employee em on (em.code = LH.action_by)
Where
(L.Taken_at >= :DateFrom and L.Taken_at <= :DateTo) and (LH.ACTION_TYPE = 'D') and (em.position_type_ref ='NU')
Order By
L.URN ASC, LH.THE_TIMESTAMP DESC
The result I get shows duplicate 'timestamp' entries and I only want to return unique timestamp entries. Does anyone have any ideas? I'm self-taught and have hit a wall.
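A sketch that keeps a single row per call and timestamp by ranking the duplicates, assuming the database supports ROW_NUMBER() (otherwise a GROUP BY with MIN()/MAX() over the remaining columns would do the same job):

SELECT URN, THE_TIMESTAMP, ACTION_TYPE, ACTION_BY, ACTION_REQD, NOTE, position_type_ref
FROM (
    SELECT L.URN, LH.THE_TIMESTAMP, LH.ACTION_TYPE, LH.ACTION_BY, LH.ACTION_REQD, LH.NOTE,
           em.position_type_ref,
           ROW_NUMBER() OVER (PARTITION BY L.URN, LH.THE_TIMESTAMP
                              ORDER BY LH.ACTION_BY) AS rn
    FROM LOG L
    JOIN Log_history LH ON L.URN = LH.LOG_URN
    LEFT OUTER JOIN employee em ON em.code = LH.action_by
    WHERE L.Taken_at >= :DateFrom AND L.Taken_at <= :DateTo
      AND LH.ACTION_TYPE = 'D' AND em.position_type_ref = 'NU'
) x
WHERE rn = 1
ORDER BY URN, THE_TIMESTAMP DESC;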
View 7 Replies
Jun 12, 2006
Hi,
I need help desperately. The following is an example of the data that I have.
I have two tables one name Customers and the other named Orders. The tables have the following attributes
CUSTOMER
CustomerNum, CustomerName, OrderNum
ORDER
OrderNum, OrderDesc, CustomerNum
Bear in mind that the CustomerNum can have multiple customers attached to it, for example a whole family (I know it's bad database design, but it's too late to change). Let's use the following info for the tables respectively:
CUSTOMER
0001 Sharon Bigbsy 1234
0001 Dale Bigbsy 1235
0001 Omar Bigbsy 1236
ORDERS
1234 Chips 0001
1235 Gatorade 0001
1236 Candy 0001
The query I'm using is: select CUSTOMER.CustomerNum, CUSTOMER.CustomerName, ORDERS.OrderDesc from CUSTOMER, ORDERS where CUSTOMER.CustomerNum = ORDERS.CustomerNum
The results are as follows
0001 Sharon Bigbsy Chips 1234
0001 Sharon Bigbsy Gatorade 1235
0001 Sharon Bigbsy Candy 1236
0001 Dale Bigbsy Chips 1234
0001 Dale Bigbsy Gatorade 1235
0001 Dale Bigbsy Candy 1236
0001 Omar Bigbsy Chips 1234
0001 Omar Bigbsy Gatorade 1235
0001 Omar Bigbsy Candy 1236
It's giving all the orders placed against each customer name, but I need only the following:
0001 Sharon Bigbsy Chips 1234
0001 Dale Bigbsy Gatorade 1235
0001 Omar Bigbsy Candy 1236
I'm using SQL in MS Access 2003. Please disregard the abnormalities of the tables; it's someone else's work I inherited, with over 6000 entries.
Any help will be greatly appreciated
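A sketch of one fix: since each CUSTOMER row already carries the OrderNum it belongs to, join on OrderNum as well as CustomerNum (tables as described above, Access-style SQL):

SELECT CUSTOMER.CustomerNum, CUSTOMER.CustomerName, ORDERS.OrderDesc, ORDERS.OrderNum
FROM CUSTOMER
INNER JOIN ORDERS
        ON (ORDERS.CustomerNum = CUSTOMER.CustomerNum)
       AND (ORDERS.OrderNum = CUSTOMER.OrderNum);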
View 8 Replies
Feb 7, 2007
I have an application that allows the user to enter data into a table. There are multiple users so I put in some code that, I thought, would keep 2 users from creating a new record at the same time. The IDs for the records are identical and this is causing a problem.
The IDs are in the format of ####-mmyy. at the start of each month the #### part goes back to 1.
We tried a test today where we had 2 users click on the New button at exactly the same time. The IDs that were created were identical. Is there any way on the database side that I can prevent this from happening?
Here is how I create the new record id:
I get the MAX(ID) from the table
I add 1 to the ID and then insert a new record with the new ID into the table.
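A sketch that folds the MAX + 1 and the INSERT into a single transaction and takes an update/range lock while reading, so two sessions cannot compute the same number. The Records table and Id column are hypothetical stand-ins for the real ones; Id is assumed to hold the '####-mmyy' string:

DECLARE @period CHAR(4), @next INT;
SET @period = RIGHT('0' + CAST(MONTH(GETDATE()) AS VARCHAR(2)), 2)
            + RIGHT(CAST(YEAR(GETDATE()) AS VARCHAR(4)), 2);          -- 'mmyy'

BEGIN TRAN;

    SELECT @next = ISNULL(MAX(CAST(LEFT(Id, CHARINDEX('-', Id) - 1) AS INT)), 0) + 1
    FROM Records WITH (UPDLOCK, HOLDLOCK)       -- blocks a concurrent session doing the same read
    WHERE RIGHT(Id, 4) = @period;

    INSERT INTO Records (Id)
    VALUES (CAST(@next AS VARCHAR(10)) + '-' + @period);

COMMIT;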
Any help is appreciated.
Thanks,
enak
View 5 Replies
Jan 7, 2008
Hi all
Can someone please let me know how to configure the error logs? What I observe is that on some of the servers the error log has entries for every connection, database backup, log backup, etc., but on some there are entries only about database backups.
thanks in advance
View 2 Replies