ETL: OLTP -> Data Store

Jul 23, 2005

Greetings all, I was wondering if any of you would share some of your
experiences with loading a data store from an OLTP source. We are using
Analysis Services in a BI product that requires data to be pulled from one of
our products, an OLTP database. The design is to first run an ETL process from
the OLTP source into an operational data store; from there, Analysis Services
will pull its data to do its thing. Now, for small OLTP databases (< 1 GB) the
stored procs I have written to do the extraction work well; they are relatively
fast and efficient. However, we have a few databases that are 10 GB, and the
load could end up taking several hours. During this long load the OLTP source
may be in use, and I want to avoid write blocks; on the other hand, if I were
to use "SELECT ... WITH (NOLOCK)" I could bring over dirty data. I could use
BCP or bulk copy for some of the big tables, but I wanted to see if anyone has
dealt with this issue and what their resolution was for their specific problem.
It is my hope that by seeing how others have dealt with this I will be able to
architect a solution for my own problem.

Regards, TFD.
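One pattern worth considering, if the source is SQL Server 2005 or later, is running the extract under snapshot isolation: readers work from row versions in tempdb instead of taking shared locks, so the load neither blocks writers nor reads dirty data the way NOLOCK does. A minimal sketch, where SourceOltp, ods.OrdersStage and the ModifiedDate watermark column are all hypothetical names:

------------------------------------------------------------------------
-- one-time setting; every database the snapshot transaction touches
-- needs this enabled (row versions are kept in tempdb)
ALTER DATABASE SourceOltp SET ALLOW_SNAPSHOT_ISOLATION ON;

-- inside the extraction proc: read a consistent snapshot without blocking writers
DECLARE @LastExtractTime datetime;
SET @LastExtractTime = '20050701';

SET TRANSACTION ISOLATION LEVEL SNAPSHOT;

INSERT INTO ods.OrdersStage (OrderID, CustomerID, OrderDate, Amount)
SELECT OrderID, CustomerID, OrderDate, Amount
FROM SourceOltp.dbo.Orders
WHERE ModifiedDate >= @LastExtractTime;
------------------------------------------------------------------------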

View 1 Replies



ETL Architecture - Streaming Data Into An OLTP

Oct 15, 2007

Interested in feedback from the SQL grand wizards (and would-be wizards) that haunt these forums.

Let's say you need to constantly stream data into an OLTP system. We are talking multiple level hierarchies totaling upwards of 300 MB a day spread out not unlike a typical human sleep cycle (lower data during off-peak, still 24/7 requirements). All data originates from virtual machines running proprietary algorithms. The VM/data capture infrastructure needs to be massively scalable, meaning that incoming data is going to become more and more frequent and involve many different flat record formats.

The data has tremendous value when viewed both historically and in real time (95% of real-time access will be read-only). The database infrastructure is in its infancy now and I'm trying to develop a growth plan that can meet the needs of the business as the data requirements grow. I have no doubt that the system will need to work with multiple terabytes of data within a year.

Current database environment is a single server composed of a Dell PowerEdge 2950 (Intel Quad Core 5355, 16 GB RAM, 2 x 73 GB 15K RPM SAS ) with an attached Dell PowerVault MD1000 (15 x 300 GB 10K RPM SAS in RAID 5+0 [2x7] w/hot spare) running Win 2k3 64-bit and SQL Server 2005 x64 Standard, 1-CPU.

I am interested in answering the following questions:

Based on the scaling requirements of the data capture and subsequent ETL, what transmission method would you find most favorable? For instance, we are weighing direct database writes via stored procedures from all VM systems against establishing processes to collect, aggregate and stream CSVs into a specialized ETL environment running SSIS packages that load the data and then call SQL stored procedures to scrub and prepare it for production import. The data will require scrub routines that need access to current production data, so distributing the core data structures to multiple ETL processing systems would be expensive and undesirable. (A rough staging-load sketch follows these questions.)
Cost is very important to the overall solution design. In terms of database infrastructure, how would you maximize business value while keeping cost as low as possible? For instance, do you think there is more value in an ACTIVE/ACTIVE cluster (2 x CPU licenses) where one system acts as ETL and the other as OLTP, or would you favor replication of production data from ETL to OLTP (or vice versa)? With the second scenario, am I mistaken in thinking we could get away with a Server/CAL licensing model for the ETL server?
Are there any third-party tools that I should research that would greatly aid me here?
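For the CSV route, a rough sketch of the staging side (stg.VmReadings, the file path and usp_ScrubAndImportVmReadings are made-up names; an SSIS flat-file source would do the same load as the BULK INSERT shown here):

------------------------------------------------------------------------
-- staging table whose columns match the incoming flat-record layout
CREATE TABLE stg.VmReadings (
    SourceVmId  int           NOT NULL,
    CapturedAt  datetime      NOT NULL,
    MetricName  varchar(100)  NOT NULL,
    MetricValue decimal(18,4) NOT NULL
);

-- load one collected batch of CSV data
BULK INSERT stg.VmReadings
FROM 'D:\feeds\vm_batch_20071015.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);

-- scrub/merge procedure that has access to current production data
EXEC dbo.usp_ScrubAndImportVmReadings;
------------------------------------------------------------------------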


I appreciate all feedback, criticism, and thoughts.

Best Regards,

Shane

View 5 Replies View Related

Data Warehousing :: Can Use Dimensional Model For OLTP System?

Jul 9, 2015

Question: Is it feasible to use a star schema dimensional model for an OLTP system that handles only a few Sales Order transactions (750 per day)?

Background: My customer wants to replace an existing OLTP system database because it runs on Oracle and their in-house expertise is in SQL Server.  The original database developers that designed the Oracle DB have apparently retired.  The Oracle database has been over-normalized, to say the least.  The number of sales orders being entered daily is small: about 500-750 per day.  These entries are done at the five clerks' convenience, from a paper form, and are very unlikely to ever be entered in quick succession.  Nothing else gets regularly entered into this database except for the occasional change to a customer, but new customers are very few and far between.  

I've designed a star schema for the replacement database with the Sales Order Header and Sales Order detail table combined into a single 'fact' table, and I've introduced some duplication into dimension tables (like customer) in order to eliminate some of the joins (and confusion) that were built into the original database.

I've never tried this before.  Is there any reason this would not or should not work?
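For what it's worth, a minimal sketch of what a combined header/detail 'fact' table and a denormalized customer dimension might look like; the table and column names here are illustrative, not taken from the poster's actual design:

------------------------------------------------------------------------
CREATE TABLE dbo.DimCustomer (
    CustomerKey   int IDENTITY(1,1) PRIMARY KEY,
    CustomerCode  varchar(20)  NOT NULL,
    CustomerName  varchar(100) NOT NULL,
    City          varchar(50)  NULL,
    Region        varchar(50)  NULL   -- duplicated here instead of joining a separate table
);

CREATE TABLE dbo.FactSalesOrder (
    SalesOrderKey int IDENTITY(1,1) PRIMARY KEY,
    OrderNumber   varchar(20) NOT NULL,   -- header attributes repeated on every line
    OrderDate     datetime    NOT NULL,
    CustomerKey   int         NOT NULL REFERENCES dbo.DimCustomer (CustomerKey),
    LineNumber    int         NOT NULL,
    ProductCode   varchar(20) NOT NULL,
    Quantity      int         NOT NULL,
    LineAmount    money       NOT NULL
);
------------------------------------------------------------------------

At 500-750 orders a day the write volume is trivial, so the main trade-off is likely data quality (update anomalies from the duplication) rather than performance.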

View 5 Replies View Related

OLTP Vs OLAP (Data Warehouse) As A DataSource In Analysis Services 2005

Jul 3, 2007

Hello,

Can I import an OLTP (relational) DB as a Data Source into SQL Server Analysis Services 2005 and then use the Cube Wizard and the new Data Source View feature to create the OLAP model? Or do I have to first design an OLAP data warehouse with a star schema and then import this DW as a Data Source into my Analysis Services project?

With SQL Server 2000, OLAP would be the way to go, but with SQL Server 2005 it seems as though the wizard and Data Source View features do half the work for you. I have an OLTP DB and am not sure which route I should take! Any suggestions / input would be much appreciated.

Thanks in advance,
Russzee

View 1 Replies View Related

SQL Server 2012 :: How To Write Stored Procedures To Load Data Model From OLTP To DWH

Nov 24, 2014

How do I write stored procedures to load the data model from OLTP into the DWH?
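There is no single answer, but a common shape for such a procedure is an incremental upsert per dimension or fact table. A minimal sketch, assuming hypothetical stg.Customer (already extracted from the OLTP source) and dw.DimCustomer tables:

------------------------------------------------------------------------
CREATE PROCEDURE dw.usp_LoadDimCustomer
AS
BEGIN
    SET NOCOUNT ON;

    MERGE dw.DimCustomer AS tgt
    USING stg.Customer   AS src
        ON tgt.CustomerBusinessKey = src.CustomerID
    WHEN MATCHED AND (tgt.CustomerName <> src.CustomerName OR tgt.City <> src.City) THEN
        UPDATE SET tgt.CustomerName = src.CustomerName,
                   tgt.City         = src.City,
                   tgt.UpdatedAt    = GETDATE()
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerBusinessKey, CustomerName, City, UpdatedAt)
        VALUES (src.CustomerID, src.CustomerName, src.City, GETDATE());
END
------------------------------------------------------------------------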

View 9 Replies View Related

Limits In OLTP

Aug 1, 1998

Is it possible / advisable to use SQL Server as the backend database for handling 40 million transactions,
taking around 8 GB of space?

View 2 Replies View Related

OLTP & OLAP

Dec 19, 2007

hello everyone,

Does it make sense to create two databases, an OLTP database for inserts, updates and deletes, and an OLAP database for selects,
and then keep the two databases in sync?

Thanks

View 2 Replies View Related

Please Help With Oltp Solutions

Aug 28, 2007

Hi. I don't understand what is meant by "developing OLTP solutions". Can anybody please explain it to me? Also, does anyone know what ways there are to develop SQL OLTP solutions using SQL Server 2005 Reporting Services, OLTO, Excel Services, as well as any good tutorials for them?

thanks for the help.

View 6 Replies View Related

OLTP Vs Decision Support

Mar 2, 2004

Whilst on the Nth hour (n = many) of my magical journey through MS Sql BOL I've come across OLTP Vs Decision Support. After a couple of searches here could someone shore up the following for me please...

A decision support database is the same as a warehouse database.

This is for static data commonly used for reporting and analysis.

OLTP is a live database (it accommodates inserts, deletes, updates, etc.).

Is that right?

Also would it be fair to assume that a decision support database is generally going to be spawned from the historical data of an OLTP database? Any real world examples of these two terms would be greatly appreciated too.

Cheers

Dan

View 2 Replies View Related

What Are The Differences Between OLTP And OLAP ?

Mar 7, 2008

I want to know the basic differences between these two (OLTP and OLAP).

View 3 Replies View Related

Asp.net Page Is Unable To Retrieve The Right Data Calling The Stored Procedure From The Dataset/Data Adapter

Apr 11, 2007

I'm trying to figure this out.
I have a stored procedure that returns the userId if a user exists in my table, and returns 0 otherwise:
------------------------------------------------------------------------
Create Procedure spUpdatePasswordByUserId
    @userName varchar(20),
    @password varchar(20)
AS
Begin
    Declare @userId int
    Select @userId = (Select userId from userInfo Where userName = @userName and password = @password)
    if (@userId > 0)
        return @userId
    else
        return 0
End
------------------------------------------------------------------
I created a function called UpdatePasswordByUserId in my dataset from the above stored procedure, configured to return a scalar value. When I preview the data from the table adapter in my dataset, it spits out the right value.
But when I call UpdatePasswordByUserId from an ASP.NET page, it returns null/blank/0:
passport.UserInfoTableAdapters oUserInfo = new UserInfoTableAdapters();
Response.Write("userId: " + oUserInfo.UpdatePasswordByUserId(txtUserName.Text, txtPassword.Text));
 Do you guys have any idea why?
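One thing worth checking, offered as a guess rather than a diagnosis: a TableAdapter query configured to return a scalar value reads the first column of the first row of a result set, whereas this procedure hands the value back through the T-SQL RETURN code. Selecting the value instead often behaves more predictably from ADO.NET, e.g.:

------------------------------------------------------------------------
Declare @userId int
Select @userId = userId from userInfo
Where userName = @userName and password = @password

-- return the value as a one-row result set so ExecuteScalar-style calls can see it
Select ISNULL(@userId, 0) As userId
------------------------------------------------------------------------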
 
 

View 6 Replies View Related

OLTP And Reporting Databases Separated?

Jan 12, 2006

We are using an object database for our OLTP, but for reporting we have
some performance issues because the CPU becomes a bottleneck. And
we want to be able to run on low-end computers...

One of our team members suggested replicating the object database to a
SQL table. But just a single one, the most denormalized thing ever (358 columns).

Is this the fastest reporting setup we can get?

    * We don't want the hard disk, RAM or CPU to become a bottleneck (it must run on cheap hardware).

View 2 Replies View Related

Need Info Pls: How To Convert OLTP Db To OLAP

May 21, 2008



Hi all,

I need some steps to create an OLAP DB.

Actually, I have an OLTP DB. I created an SSAS solution and created a cube with the necessary dimensions. I deployed it to an SSAS instance in Management Studio.

My question: is the database created under the SSAS instance OLAP?

Please provide me the steps to get an OLAP database...

Thanks,
Nav

View 2 Replies View Related

DB Design :: Extracting History From OLTP?

Nov 3, 2015

We have a simple, conventional OLTP database, and we need to capture all changes for insert into a DW via staging / ODS etc.

Is there a recommended approach for this? Obviously it has to be close to real time, as there might be multiple updates within a time period. I'm thinking of triggers on the OLTP tables (bad for performance, since they are synchronous), or change data capture or Service Broker as asynchronous methods.
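If the source is SQL Server 2008 or later, change data capture is usually the least intrusive of the options listed, because it reads the transaction log asynchronously instead of firing synchronous triggers. Enabling it looks roughly like this (SourceOltp and dbo.Orders are placeholder names):

------------------------------------------------------------------------
USE SourceOltp;
GO
EXEC sys.sp_cdc_enable_db;
GO
EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'Orders',
     @role_name     = NULL,
     @supports_net_changes = 1;   -- requires a primary key on the table
GO
-- the DW/ODS load then reads cdc.fn_cdc_get_all_changes_dbo_Orders(...) per LSN window
------------------------------------------------------------------------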

View 8 Replies View Related

Writeback To The Source OLTP Database.

Apr 22, 2008

I was wondering if it is possible to use SQL Server Reporting Services 2005 with the 'writeback' feature against the source OLTP database?

I have seen articles that refer to using SQL Server Analysis Services (SSAS) and writing back to the ROLAP/MOLAP database, however this is not desirable in our case.

I have almost come to the conclusion that it is not possible without SSAS.

Cheers,

View 3 Replies View Related

SQL Server 2008 :: Large Tables In OLTP

Jul 14, 2015

How many records does a table need before it is considered a large table?

We are getting more and more deadlocks. We are using the default isolation level. Read and insert statements are blocking each other and causing deadlocks.

I am thinking that purging might reduce the deadlocks.

The table has 15 million records. Is this table considered a large table or not in OLTP systems?

In general, how many records does a table need before we consider it large?
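As a side note, 15 million rows is not unusually large for an OLTP table, and size alone is rarely the direct cause of deadlocks. Before purging, it may be worth testing read committed snapshot, which lets readers see the last committed version instead of blocking behind (and deadlocking with) writers. A sketch, assuming the extra tempdb version-store load is acceptable:

------------------------------------------------------------------------
ALTER DATABASE YourDatabase            -- placeholder name
SET READ_COMMITTED_SNAPSHOT ON
WITH ROLLBACK IMMEDIATE;               -- needs a brief window with no other active sessions
------------------------------------------------------------------------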

View 1 Replies View Related

Analysis :: Creating Tabular Model On OLTP

Nov 4, 2015

I have been looking at implementing a tabular model based on an OLTP database that's not dimensional. I know that this is possible, but during my proof of concept I have encountered numerous problems...

The things that I have run into are: after setting up the relationships, I have found that measure filter context doesn't propagate along the relationships as I would expect. If the measure is coming from a target table and not a source, then an ALL member is returned (as in multidimensional when a dimension isn't related to a measure group). Given the layout of an OLTP database this will be hard to avoid.

One thing I have done to try and mitigate the above problem is to combine the tables used for measures in a view and use that as the source to connect to the rest of the tables. However, because the tables are of different grains, this has created duplication in some of the keys and measures, so the keys can't be used in relationships and the measures aren't accurate.

Are these things other people have come across? Or should I give up the ghost and just recommend using dimensional models for the source? Is tabular just geared towards a DW, the same as multidimensional?

View 2 Replies View Related

Mirroring OLTP DB With Transactional Replication To Staging DB

Mar 12, 2007

I want to create a mirrored DB set for data entry in an extremely busy OLTP DB. I want to add transactional replication between the production server and a staging server outside my quorum that I will use to index the data and prepare it for reporting and warehousing purposes.

If/when failover takes place, what happens to my transactional replication between the former production server (now presumably offline) and my staging DB? Does it switch to the new production server automatically, or do I have to manually set up the replication between the new production server and the staging DB?

Thanks in advance.

View 2 Replies View Related

Best Backup/restore (OLTP/OLAP) Practices In 2005

Aug 14, 2007

We have a live OLTP database for which we create full backups every week and differential backups every day. Recently we added an OLAP database, which we need to update daily with changes from the live database.

This is the process we are planning to use.
1. Restore last full OLTP backup.
2. Apply the last differential OLTP backup.
At this point we should have a replica of the live OLTP database.
3. Update OLAP database based on the OLTP replica database.
4. Delete the OLTP replica database.
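A minimal restore sequence for steps 1 and 2 might look like the following; the backup paths, logical file names and the OltpReplica database name are all placeholders:

------------------------------------------------------------------------
RESTORE DATABASE OltpReplica
FROM DISK = N'D:\Backups\Oltp_Full.bak'
WITH NORECOVERY,
     MOVE N'Oltp_Data' TO N'D:\Data\OltpReplica.mdf',
     MOVE N'Oltp_Log'  TO N'D:\Data\OltpReplica.ldf';

RESTORE DATABASE OltpReplica
FROM DISK = N'D:\Backups\Oltp_Diff.bak'
WITH RECOVERY;
------------------------------------------------------------------------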

Two questions.
1. If different from the process above, how is this OLTP-to-OLAP transformation typically done in the industry?
2. What is the best way to implement this process with SQL Server 2005?

Thanks.

View 3 Replies View Related

OLTP Database Design Help For Bank's Customer Table

Aug 9, 2007


Hello,friends

1) CustomerID
2) FirstName
3) MiddleName
4) SurName
5) Title
6) Marital Status
7) Education
8) Occupation
9) Annual Income
10) Line of Business
11) DOB
12) Father Name
13) Mother Name
14) SpouseName
15) Gender
16) Email
17) MainTel
18) Home Tel
19) Passport Number
20)----------------------
21)- - - - - - - - - - -


100)-------------------
The above list is a snapshot of our customer master table, which contains approximately 100 attributes related to a customer.

We are designing an application for the banking sector (but NOT a core banking solution), for which we may need to capture a variable number of addresses for the bank's customers, i.e. more than the usual three address types Fixed, Temporary and Communication (which is generally the case with all banks).
A single address includes address1/address2/city/country/state/pincode fields.
In the context of an OLTP database, we have the option to put multiple addresses in a child table, but that involves various joins at the time of data retrieval and slows down the query.

As another option, we can create redundant address columns (address1/address2/city/country/state/pincode) in the master table that would accommodate the extra addresses if demand for more than three address types arises (a reasonable number of extra addresses is expected, i.e. around 10).

The database is expected to serve the records of approximately 25 million bank customers, so can someone suggest how to strike the balance between the two approaches?
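For comparison, a sketch of the child-table option (names are illustrative, and dbo.Customer stands in for the existing master table); with an index on CustomerID and AddressType the extra join is usually cheap even at 25 million customers, whereas repeated address columns fix the maximum number of addresses at design time:

------------------------------------------------------------------------
CREATE TABLE dbo.CustomerAddress (
    CustomerAddressID int IDENTITY(1,1) PRIMARY KEY,
    CustomerID        int          NOT NULL REFERENCES dbo.Customer (CustomerID),
    AddressType       varchar(20)  NOT NULL,   -- 'Fixed', 'Temporary', 'Communication', ...
    Address1          varchar(100) NOT NULL,
    Address2          varchar(100) NULL,
    City              varchar(50)  NOT NULL,
    State             varchar(50)  NULL,
    Country           varchar(50)  NOT NULL,
    Pincode           varchar(10)  NULL
);

CREATE INDEX IX_CustomerAddress_Customer
    ON dbo.CustomerAddress (CustomerID, AddressType);
------------------------------------------------------------------------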

View 2 Replies View Related

How Can I Store Data ?(encoding And Data Type)

Oct 20, 2007

Hi my friends, I have a problem; I think you can help me.

a = 'x80x02}qx00(Kx02Kx03Kx04Kx06Kx05Kx07u.'

I want to store this in the database, but I don't know which encoding and data type to use. Please help me. Sorry for my bad English... thanks all.
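If that value is raw binary data (it looks like a pickled byte string) rather than readable text, the usual choice is a varbinary column, which sidesteps the encoding question entirely. A minimal sketch with a made-up table name:

------------------------------------------------------------------------
CREATE TABLE dbo.BlobStore (
    Id      int IDENTITY(1,1) PRIMARY KEY,
    Payload varbinary(max) NOT NULL   -- bytes are stored as-is, no character encoding involved
);
------------------------------------------------------------------------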

View 3 Replies View Related

DB Engine :: Unexpected Update / Delete On OLTP Database In 2005

Jun 17, 2015

We had one of those major issues where a table in a heavily used OLTP database seems to have had records updated that we did not expect to change.

Scenario:

We found more than 12K contracts updated to status Expired even though their expiry dates say they should not be:

For example: the table below has a contract status column whose values overnight seem to have been updated to Expired,

even though the start and expiry dates do not follow the logic for that.

This has been working for the past 3 years via an SP scheduled through a SQL Agent job, which expires active contracts whose expiration date is less than today's 12:00 AM. There has been no change to the SP.

How can I track how it happened and what caused it?

View 28 Replies View Related

DB Engine :: In-Memory OLTP Use With Existing Tables / Index / Procedures

Nov 10, 2015

1. I need to make use of the in-memory engine for my pre-existing procedures, tables and indexes. Do I need any code changes in the application, and how do I store tables / indexes in In-Memory OLTP?

Assume the table may have a primary key index as well.

2. If a table has one primary key index, 2 foreign key constraints and 3 nonclustered indexes, which of these can be loaded into the memory area and how do I do that?

3. In-Memory OLTP is a lock-free zone, whereas locks usually happen in an RDBMS context. How does this work without locks?
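At a minimum, using In-Memory OLTP means adding a memory-optimized filegroup and recreating the table with MEMORY_OPTIMIZED = ON; existing disk-based tables are not converted in place, and in SQL Server 2014 foreign keys and several index types are not supported on memory-optimized tables, so some schema (and possibly code) changes are usually unavoidable. On question 3: readers and writers do not block each other because the engine uses optimistic, row-versioned concurrency instead of locks. A rough sketch with placeholder names:

------------------------------------------------------------------------
ALTER DATABASE YourDb
ADD FILEGROUP imoltp_fg CONTAINS MEMORY_OPTIMIZED_DATA;

ALTER DATABASE YourDb
ADD FILE (NAME = 'imoltp_file', FILENAME = 'D:\Data\imoltp_dir')
TO FILEGROUP imoltp_fg;

CREATE TABLE dbo.OrdersInMem (
    OrderID    int       NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    CustomerID int       NOT NULL INDEX IX_Customer NONCLUSTERED,
    OrderDate  datetime2 NOT NULL
) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
------------------------------------------------------------------------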

View 3 Replies View Related

Integration Services :: Purge Data In Transaction Table Or Delete Some Data And Store In Separate Table

Aug 18, 2015

How do we purge data in a transaction table, or alternatively delete some data and store it in a separate table, in the data warehouse?
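One common pattern is to delete in batches and archive the removed rows in the same statement with an OUTPUT clause. A sketch, assuming a dbo.TransactionsArchive table with the same column layout as dbo.Transactions and a two-year retention rule (all of these names are placeholders):

------------------------------------------------------------------------
WHILE 1 = 1
BEGIN
    DELETE TOP (50000) FROM dbo.Transactions
    OUTPUT deleted.* INTO dbo.TransactionsArchive
    WHERE TransactionDate < DATEADD(year, -2, GETDATE());

    IF @@ROWCOUNT = 0 BREAK;   -- stop once nothing is left to purge
END
------------------------------------------------------------------------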

View 7 Replies View Related

Store All Data In One Coloumn ?

Dec 31, 2006

We need to decide on an architecture for search performance on a web site. I want to use the full-text service of SQL Server 2005, but I am worried about performance... How should I design the system if I want the best performance and scalability?

1. Should I build a separate column in every table, merge all the information into that one column, and full-text index that column?
2. Put a full-text index on all columns in the table, use an OR clause (and reverse-rank it for an AND clause) with the CONTAINSTABLE function?
3. Make a different table with _ID, _TYPE and _VALUE fields and search in that table with fewer columns?
4. Separate the full-text database and search in a separate DB so that I can scale better?

Did anybody have a similar problem? Any books on full-text search?
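For option 2, the ranked OR-style query would look roughly like this, assuming a full-text index already exists on the columns being searched (table and column names are only illustrative):

------------------------------------------------------------------------
SELECT p.ProductID, p.Name, k.[RANK]
FROM dbo.Products AS p
JOIN CONTAINSTABLE(dbo.Products, (Name, Description), N'"widget" OR "gadget"') AS k
    ON p.ProductID = k.[KEY]
ORDER BY k.[RANK] DESC;
------------------------------------------------------------------------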

View 4 Replies View Related

How Can I Store Data In One Row Instead Of 10 Rows?

Jan 1, 2007

Hi
I have this stored procedure using Table Variable.
-------------------------------------------------------------------------------------------------------------------------------
create procedure sp_matching @pid Int
AS
declare @tbl table (MAID int, image varchar(100))

insert @tbl
select MAID, image from products
where pid in (select top 10 mid from matching1 where pid=@pid order by newid())

select * from @tbl
GO
---------------------------------------------------------------------------------------------------------------------------------
When I run this SP, I get a table with 10 rows. But what I really need to do is to store all the data in one row, like:
PID, MAID1, Image1, MAID2, Image2, ..... MAID9, Image9, MAID10, Image10
 
How can I do this? Please help me out for this.
Thanks.
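One way to flatten the ten rows into a single row is conditional aggregation over a ROW_NUMBER; a sketch that could replace the final SELECT in the procedure (shown for the first two pairs, repeat the pattern up to rn = 10):

------------------------------------------------------------------------
;WITH m AS (
    SELECT MAID, [image], ROW_NUMBER() OVER (ORDER BY MAID) AS rn
    FROM @tbl
)
SELECT
    @pid AS PID,
    MAX(CASE WHEN rn = 1 THEN MAID    END) AS MAID1,
    MAX(CASE WHEN rn = 1 THEN [image] END) AS Image1,
    MAX(CASE WHEN rn = 2 THEN MAID    END) AS MAID2,
    MAX(CASE WHEN rn = 2 THEN [image] END) AS Image2
    -- ... continue through MAID10 / Image10
FROM m;
------------------------------------------------------------------------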

View 5 Replies View Related

How To Store HTML Data In SQL

Aug 19, 2007

Hi friends, I want to store HTML in SQL Server as data. Please, can anyone help me: which data type, and how do I store the data? Thanks for all your support. Thanks & regards, ASPFreak
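HTML is just text, so an nvarchar(max) (or varchar(max)) column is the usual choice; a minimal sketch with a made-up table name:

------------------------------------------------------------------------
CREATE TABLE dbo.Articles (
    ArticleID int IDENTITY(1,1) PRIMARY KEY,
    HtmlBody  nvarchar(max) NOT NULL
);

INSERT INTO dbo.Articles (HtmlBody)
VALUES (N'<p>Hello <b>world</b></p>');
------------------------------------------------------------------------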

View 3 Replies View Related

How Data Store On Server??

Feb 15, 2004

Hi all,
Please show me how data is stored on the server.
Thanks for reading.

View 5 Replies View Related

Where Are BLOB Or VAR**(MAX) Data Store?

Apr 7, 2007

Hi,

I learned that BLOB data types are stored out of row by default; I think that means they will not be stored in the row of the table, because the maximum length of a row is 8 KB.

But where will the system store that BLOB data? And is there any way to specify where it should be stored, like a partition scheme?

I am using SQL Express, so I hope it works for Express (partitioning does not work for Express).
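As far as I know, large-object values live on special LOB pages in the same filegroup as the table unless the table is created with a TEXTIMAGE_ON clause pointing elsewhere, and whether the (MAX) types start in-row or out-of-row can be toggled per table. A sketch with made-up names (filegroups work on Express; it is table/index partitioning that does not):

------------------------------------------------------------------------
-- put the LOB pages for this table on a separate filegroup
CREATE TABLE dbo.Documents (
    DocumentID int IDENTITY(1,1) PRIMARY KEY,
    Contents   varbinary(max) NULL
) ON [PRIMARY] TEXTIMAGE_ON [LOB_FG];

-- force the (MAX) types of an existing table out of the data row
EXEC sp_tableoption 'dbo.Documents', 'large value types out of row', 1;
------------------------------------------------------------------------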

View 5 Replies View Related

Inserting Data From A Store Procedure

Oct 3, 2007

Hi,
Here is my stored procedure:

ALTER PROCEDURE dbo.InsertSpecialOrders
(@OrderID int,
@DepotName nvarchar(50),@CustomerName nvarchar(50),
@OrderDate nvarchar(50),@TelephoneNumber nvarchar(50),
@ObtainedFrom nvarchar(50),@CustomerCanCollect nvarchar(50),
@DepositPaid nvarchar(50),@PriceQuoted nvarchar(50),
@OrderTakenBy nvarchar(50),@BalancePaid nvarchar(50),
@Status nvarchar(50),@TransferedBy nvarchar(50),@CarriagePrice nvarchar(50)
)
 
AS
Declare @LatestID Int
 INSERT INTO Specials
(
OrderID,
DepotName,
CustomerName,
OrderDate,
TelephoneNumber,
ObtainedFrom,
CustomerCanCollect,
DepositPaid,
PriceQuoted,
OrderTakenBy,
BalancePaid,
Status,
TransferedBy,
CarriagePrice
 
)
VALUES
(
@OrderID,
@DepotName,
@CustomerName,
@OrderDate,
@TelephoneNumber,
@ObtainedFrom,
@CustomerCanCollect,
@DepositPaid,
@PriceQuoted,
@OrderTakenBy,
@BalancePaid,
@Status,
@TransferedBy,
@CarriagePrice
)
 SELECT @LatestID = Scope_Identity()
 INSERT INTO SpecialParts
(
OrderID,
PartNo,
Quantity
)
VALUES
(
@LatestID,
@DepotName,
@CustomerName
 
)
RETURN
 
Now I want to insert data and create the tables:

SqlDataSource ordersDataSource = new SqlDataSource();
ordersDataSource.ConnectionString = ConfigurationManager.ConnectionStrings["SpecialsConnectionString1"].ToString();
ordersDataSource.InsertCommandType = SqlDataSourceCommandType.StoredProcedure;

So to insert data for the first field I need:

ordersDataSource.InsertParameters.Add("@CustomerName", CustomerName.Text);

Is this right? Any help is appreciated.
 
 

View 2 Replies View Related

To Store Data With Single Quote

Mar 10, 2008

Hi, I am getting an error when I want to store some text which contains a single quote, like this: Hi I am 'santosh'. I am using a text editor which generates XML data (not pure), so I have used varchar(max) to store the data, but it gives an error.
Is there any way to store text with a single quote?
Urgent, please.
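The usual fixes are either to double the quote inside a T-SQL string literal or, better, to pass the text as a parameter so no escaping is needed at all. A sketch (the table name is made up):

------------------------------------------------------------------------
-- inside a literal, '' stands for one single quote
INSERT INTO dbo.Notes (Body) VALUES ('Hi I am ''santosh''.');

-- from application code, a parameter avoids the problem entirely:
-- INSERT INTO dbo.Notes (Body) VALUES (@body)
------------------------------------------------------------------------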

View 1 Replies View Related

Store The Data From A Database Into A File.

Mar 22, 2008

How do I use the IDataReader class's GetBytes(...) function to read binary data?
 
using System;
using System.Data;
using System.Data.Common;
using System.Data.SqlClient;
using System.IO;
using System.Web.UI;
using System.Web.UI.WebControls;

public partial class uploadFileToDatabase : System.Web.UI.Page
{
    string strUser;

    protected void Page_Load(object sender, EventArgs e)
    {
        strUser = Request.QueryString["download"];
        if (strUser != null)
        {
            string strConn = "Data Source=hao-pc;Initial Catalog=simpleBBS;Persist Security Info=True;User ID=sa;Password=123;Pooling=False";
            DbProviderFactory dbProviderFactory = DbProviderFactories.GetFactory("System.Data.SqlClient");
            DbConnection dbConn = dbProviderFactory.CreateConnection();
            dbConn.ConnectionString = strConn;
            dbConn.Open();
            DbCommand dbComm = dbProviderFactory.CreateCommand();
            dbComm.Connection = dbConn;
            dbComm.CommandText = "select * from userFiles2 where fileisn='" + strUser + "'";
            IDataReader myReader = dbComm.ExecuteReader();
            if (myReader.Read())
            {
                string fPath = Server.MapPath(Request.ApplicationPath) + "\\upLoadFiles\\" + myReader.GetString(1);
                FileStream afs = new FileStream(fPath, FileMode.OpenOrCreate, FileAccess.Write);
                BinaryWriter abw = new BinaryWriter(afs);

                // Read the BLOB column (ordinal 4) in chunks with GetBytes.
                // Note: the buffer must be at least iBufferSize bytes long; allocating
                // iBufferSize - 1 bytes is what caused the "Buffer offset '0' plus the
                // bytes '82452' is greater than the length of the passed in buffer" error.
                int itemNum = 4;
                int iBufferSize = 1000;
                Byte[] outbyte = new Byte[iBufferSize];
                long startIndex = 0;
                long retval = myReader.GetBytes(itemNum, startIndex, outbyte, 0, iBufferSize);
                while (retval == iBufferSize)
                {
                    abw.Write(outbyte);
                    startIndex += iBufferSize;
                    retval = myReader.GetBytes(itemNum, startIndex, outbyte, 0, iBufferSize);
                }
                abw.Write(outbyte, 0, (int)retval);   // write the final partial chunk
                abw.Flush();
                abw.Close();
                afs.Close();
            }
            myReader.Close();
            dbComm.Dispose();
            dbConn.Close();
        }
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        Random ran = new Random();
        string strFileISN = DateTime.Now.ToString("yyyyMMddhhmmss") + ran.Next(0, 999);
        if (FileUpload1.PostedFile.FileName != null)
        {
            string strFileName = FileUpload1.PostedFile.FileName;
            string strFileType = strFileName.Substring(strFileName.LastIndexOf(".") + 1);
            strFileName = strFileName.Substring(strFileName.LastIndexOf("\\") + 1);
            int strFileLen = FileUpload1.PostedFile.ContentLength;

            string strConn = "Data Source=hao-pc;Initial Catalog=simpleBBS;Persist Security Info=True;User ID=sa;Password=123;Pooling=False";
            DbProviderFactory dbProviderFactory = DbProviderFactories.GetFactory("System.Data.SqlClient");
            DbConnection dbConn = dbProviderFactory.CreateConnection();
            dbConn.ConnectionString = strConn;
            dbConn.Open();
            DbCommand dbComm = dbProviderFactory.CreateCommand();
            dbComm.Connection = dbConn;

            string dateTime = DateTime.Now.ToString();
            Stream fs = FileUpload1.PostedFile.InputStream;
            Byte[] filedata = new Byte[strFileLen];
            fs.Read(filedata, 0, strFileLen);

            dbComm.CommandText = "insert into userFiles2 (fileisn,filename,filetype,filesize,filecontent,uploadTime) values (@fileisn,@filename,@filetype,@filesize,@filecontent,@uploadTime)";
            dbComm.Parameters.Add(new SqlParameter("@fileisn", (object)strFileISN));
            dbComm.Parameters.Add(new SqlParameter("@filename", (object)strFileName));
            dbComm.Parameters.Add(new SqlParameter("@filetype", (object)strFileType));
            dbComm.Parameters.Add(new SqlParameter("@filesize", (object)strFileLen.ToString()));
            dbComm.Parameters.Add(new SqlParameter("@filecontent", (object)filedata));
            dbComm.Parameters.Add(new SqlParameter("@uploadTime", (object)dateTime));
            dbComm.ExecuteNonQuery();
            fs.Close();
            dbComm.Dispose();
            dbConn.Close();
        }
    }
}
 

View 3 Replies View Related

Need To Store Data From Recursive Query Using CTE

Apr 18, 2008

I have a recursive query, using common table expressions, like this:
 WITH TaskHierarchy (GUID, ParentGUID, Title, Complete, HierarchyLevel)
AS
(
SELECT GUID, ParentGUID, Title, Complete, 1 HierarchyLevel
FROM Task
WHERE ParentGUID = @GUID

UNION ALL

SELECT t.GUID, t.ParentGUID, t.Title, t.Complete, th.HierarchyLevel + 1 HierarchyLevel
FROM Task t
INNER JOIN TaskHierarchy th
ON t.ParentGUID = th.GUID
)
SELECT (COUNT(*) - SUM(CAST(Complete AS INT))) Outstanding FROM TaskHierarchy
The result is a number. I need access to this number. Ideally, I would like to store it in a variable, but anything would work as long as I can access it after the query. Anyone know of a way?
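The statement that follows a CTE can assign directly to a variable, so one way to capture the number (a small sketch built on the query above, assuming @GUID is declared as in the original) is:

------------------------------------------------------------------------
DECLARE @Outstanding int;

WITH TaskHierarchy (GUID, ParentGUID, Title, Complete, HierarchyLevel)
AS
(
    SELECT GUID, ParentGUID, Title, Complete, 1 HierarchyLevel
    FROM Task
    WHERE ParentGUID = @GUID

    UNION ALL

    SELECT t.GUID, t.ParentGUID, t.Title, t.Complete, th.HierarchyLevel + 1 HierarchyLevel
    FROM Task t
    INNER JOIN TaskHierarchy th ON t.ParentGUID = th.GUID
)
SELECT @Outstanding = COUNT(*) - SUM(CAST(Complete AS INT))
FROM TaskHierarchy;

-- @Outstanding now holds the number for later use in the batch
SELECT @Outstanding AS Outstanding;
------------------------------------------------------------------------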

View 1 Replies View Related






