Hourly Rate For Consulting
May 8, 2008
I am working with a company that needs some SQL Reporting developed. What is the going rate for SQL Reporting work?
SOS Technology Corporation has multiple long-term consulting opportunities available for Systems Engineers in Midtown Manhattan. Salary is negotiable, based on experience. The assignment is with a large banking institution.
Positions available:
1 SQL guru to write transaction reports and SQL statements
1 SMS guru to write scripts.
2 individuals with practical work experience with both SQL & SMS
****REFERRAL FEES OFFERED****
Interested applicants, please contact:
Craig Mihok/Staffing Consultant
SOS Technology Corporation * 15 Meridian Rd. Eatontown, NJ 07724
Toll Free # (888)616-7447 ext 4332 * Fax # (888)612-7332
e-mail: cmihok@sostc.com * website: http://www.sostc.com
Hello all!
I want to get into consulting (perhaps a little at first, and more once I can support myself on it solely). I am about to purchase a laptop, and I want to know which components I should focus on the most. I also want to know which version of SQL Server I should purchase.
From my studies and what I have learned (I am about to take the test for my SQL Server 2005 certification), I assume I want the SQL Server 2005 Developer edition. However, here is the part that is not touched on very often: if I am working with a client that has, say, SQL Server 2005 Enterprise x64 edition, and I have the Developer IA64 edition, would I be able to connect and work on their server?
As far as the hardware is concerned, I am thinking I want a laptop with a large enough screen (I currently get a discount through HP, where I work, and am looking at the models that have a 17-inch screen) so I don't go blind trying to work. I am also thinking I don't need an extremely powerful processor, but I am not sure whether to choose an Athlon or a Pentium, since I do not have much experience running SQL Server on either. I know I want at least 2 GB of memory, for the simple fact that SQL Server isn't the only thing I will be running on the laptop and I want some headroom. I also want a large enough hard drive to hold everything, and I definitely want to see if I can get a 7200 rpm drive rather than a 5400 rpm one, since I have noticed a significant difference in read and write speeds between the two. I am not too fussed over the video card, as I am not into graphics design or high-end video games (I think World of Warcraft is the most intense game I have bought in years). And because I won't need a very powerful video card, I am thinking a mega-powerful battery is not super important either.
These are my thoughts so far. Please let me know your opinions on what I should focus on and get.
Hi! How do I get the number of students who attended prep within the last hour, and store the number in a variable?
CREATE TABLE [dbo].[PrepTime] (
[ID] [int] IDENTITY (1, 1) NOT NULL ,
[PrepNo] [int] NOT NULL ,
[Log_Time] [datetime] NOT NULL DEFAULT CURRENT_TIMESTAMP,
[coming] [int] NULL ,
[going] [int] NULL ,
[Student_ID] [bigint] NULL ,
[Study_Type] [int] NULL ,
[State] [smallint] NULL
) ON [PRIMARY]
GO
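A hedged sketch against the table above: count the attendances logged within the last hour and store the result in a variable (counting distinct students rather than rows is an assumption):

DECLARE @AttendedLastHour int

SELECT @AttendedLastHour = COUNT(DISTINCT Student_ID)
FROM dbo.PrepTime
WHERE Log_Time >= DATEADD(HOUR, -1, GETDATE())

SELECT @AttendedLastHour AS AttendedLastHour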
Apologies for the simplicity of the question, but it reflects my capabilities! I have the following sample fields coming from different tables:
Location
TimeDate (timestamp)
Data
I need to return the average of Data per Location per HOUR. Thanks.
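One hedged approach, assuming the joined fields are exposed as a single table or view named dbo.Readings: truncate the timestamp to the hour and group on it.

SELECT Location,
       DATEADD(HOUR, DATEDIFF(HOUR, 0, TimeDate), 0) AS HourBucket,
       AVG(Data) AS AvgData
FROM dbo.Readings
GROUP BY Location, DATEADD(HOUR, DATEDIFF(HOUR, 0, TimeDate), 0)
ORDER BY Location, HourBucket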
I am trying to get hourly data from a table; the query worked fine on MySQL but not on SQL Server. How do I write this in SQL Server Management Studio? I receive "'time' is not a recognized function name", and curdate() is not a recognized function either.
select [pa_number], [pa_surname]
,[pa_forename],
sum(time(datetime) >= '07:00:00' and time(datetime) < '08:00:00') as '7.00-8.00 AM',
sum(time(datetime) >= '08:00:00' and time(datetime) < '09:00:00') as '8.00-9.00 AM',
sum(time(datetime) >= '09:00:00' and time(datetime) < '10:00:00') as '9.00-10.00 AM ',
sum(time(datetime) >= '10:00:00' and time(datetime) < '11:00:00') as '10.00-11.00 AM',
[code]....
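A hedged translation: SQL Server has no time() or CURDATE(); on SQL Server 2008 or later, CAST to the time and date types instead, and turn MySQL's boolean sums into CASE expressions (only the first two buckets are shown; the table name is an assumption):

SELECT pa_number, pa_surname, pa_forename,
       SUM(CASE WHEN CAST([datetime] AS time) >= '07:00:00'
                 AND CAST([datetime] AS time) < '08:00:00'
                THEN 1 ELSE 0 END) AS [7.00-8.00 AM],
       SUM(CASE WHEN CAST([datetime] AS time) >= '08:00:00'
                 AND CAST([datetime] AS time) < '09:00:00'
                THEN 1 ELSE 0 END) AS [8.00-9.00 AM]
FROM dbo.Attendance                                      -- assumed table name
WHERE CAST([datetime] AS date) = CAST(GETDATE() AS date) -- CURDATE() equivalent
GROUP BY pa_number, pa_surname, pa_forename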
hi,
I have set up an hourly schedule to run every hour, but how can I choose a start and end time? This is within subscriptions.
Is this possible?
Hi
How can I create a job in SQL Agent to create a new snapshot every hour? I have, for example, T-SQL that does it manually:
create database Snapshotter_snap_20070418_1821 on
( name = Snapshotter, filename = 'c:\temp\Snapshotter_snap_20070418_1821.ss' )
as snapshot of Snapshotter
Now, what I do NOT want is to have only one copy, but rather to do this every hour or two throughout the day, and keep the old copies for some time. (In that case, a DROP DATABASE and a CREATE DATABASE <generic name> is easy.)
Any help appreciated,
M
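A hedged sketch of the hourly job step: build the timestamped name dynamically and run the CREATE through sp_executesql (SQL Server 2005 syntax; the c:\temp path follows the example above):

DECLARE @stamp varchar(13), @sql nvarchar(max)
-- yyyymmdd_hhmm, e.g. 20070418_1821
SET @stamp = CONVERT(varchar(8), GETDATE(), 112) + '_'
           + REPLACE(CONVERT(varchar(5), GETDATE(), 108), ':', '')
SET @sql = N'create database Snapshotter_snap_' + @stamp + N' on
  ( name = Snapshotter,
    filename = ''c:\temp\Snapshotter_snap_' + @stamp + N'.ss'' )
  as snapshot of Snapshotter'
EXEC sp_executesql @sql

Old snapshots can then be dropped by a second step that walks sys.databases for names matching Snapshotter_snap_% older than the retention window.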
View 4 Replies View RelatedFirst time here so please bear with me.Set up a DTS package to export data to an excel sheet on an hourlybasis. Problem is, it keeps appending to the same excel sheet.Any idea how to prevent that. All I want to accomplish is that everyhour, the latest data is in the excel sheet and the previous data isdeleted.Thanks in advance!
View 5 Replies View RelatedFor a GPS utility project we are planning on extracting certain attributes from a huge "GPS Raw Data" read only database which we have access to containing GPS data from several years from several devices attached to vehicles.The data is time stamped. Where the time gap between pieces of data is more than 10 minutes, a new trip is instance is assumed and in our write access "Trip" database we create a new instance for the data clump with a new trip id along with the time range of the data. The process is to be run hourly to update the "Trip" database with new trips and append to overlapping trips. We've some questions:a) Is it easy to read from one database and write into another in c# hourlyb) How would one go about running a C# program automatically every hour on the server?c) Is there a better way to do this than an hourly update? (dynamically perhaps??)d) When querying the database and comparing the time stamps, how for instance would we go about identifying a 10 minute gap when the time/date is in the format "22/12/2007 11:25:00". I can't get my head around actually writing this - it's probably ridiculously simple
Any thoughts as to why I get this error every hour? How do I fix it?
Be gentle; I'm a SQL novice.
Error reported in the Application Event Log.
Package "Hourly Transaction Log" failed.
OS: Windows Server 2003 SP1
Source: SQLISPackage
Category: None
Type: Error
Event ID: 12291
File Name: dtsmsg.rll
File Version: 2005.90.3042.0
Product Name: Microsoft SQL Server
Product Version: 9.0.3042
Time: 3:00:01 PM
I have two servers running SQL Server 2000. I was asked to implement an hourly backup of 3 databases on one server and a restore of those databases to another server. Could anyone give me the detailed steps to do that? Thanks a lot in advance!
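A hedged sketch of the job steps for one database (repeat per database; log shipping is the built-in alternative worth considering). All names and paths are assumptions. On the first server, an hourly job runs:

BACKUP DATABASE SalesDb
TO DISK = '\\server2\backups\SalesDb.bak'
WITH INIT  -- overwrite last hour's file

On the second server, a matching hourly job restores it:

RESTORE DATABASE SalesDb
FROM DISK = '\\server2\backups\SalesDb.bak'
WITH REPLACE,
     MOVE 'SalesDb_Data' TO 'D:\MSSQL\Data\SalesDb.mdf',
     MOVE 'SalesDb_Log' TO 'D:\MSSQL\Data\SalesDb_log.ldf'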
Hello everyone,
I have around 20 reports in an ASP web application which connects to a SQL Server 2000 DB, executes stored procedures based on input parameters, and returns the data in a nice tabular format.
The data used in these reports actually originates from a 3rd-party accounting application called Exchequer. I have written a VB application (I call it the extractor) which extracts data from Exchequer and dumps it into the SQL Server DB every hour. The running time for the extractor is an average of 10 minutes. During these 10 minutes, while the extractor seems to run happily, my ASP web application, which queries the same DB the extractor is updating, becomes dead slow.
Is there any way I can get the extractor to be nice to SQL Server and not take up all its resources, so that the ASP web application users do not have to contend with a very, very slow application during those times?
I am using a DSN to connect to the DB from the server that runs the web application, as well as from the other server which runs the extractor. Connection pooling has been enabled on both (using the ODBC Administrator). The Detach Database dialog gives me a list of open connections to the DB; I have been monitoring it and have noted 10-15 open connections at most times, even during the execution of the extractor. All connection objects in the ASP and VB applications are closed and then set to Nothing.
This system has been in use since 2002. My data file has grown to 450 MB and my transaction log is close to 2 GB. Can the transaction log be a problem? For some reason, the size of the transaction log does not go down even after a complete DB backup is done. Once a complete DB backup is done, doesn't the transaction log lose its significance, so that it can actually be deleted? Anyway, that is another post I am making to the group today.
In the extractor program:
1) I create a temporary table
2) I create an empty recordset out of the table
3) I loop through the Exchequer records using Exchequer's APIs, adding records into the recordset of the temporary table as I go along
4) I do an UpdateBatch of the recordset intermittently
5) I open a SQL transaction
6) I delete all records from the main table
7) I run INSERT INTO main_table SELECT * FROM #temp_table
8) I commit the transaction
I hope that the information is sufficient.
Thanks
Sam
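On the transaction log question embedded above: a full database backup does not truncate the log, and the log file should never simply be deleted. A hedged SQL 2000 sketch, with assumed database and logical file names:

-- Back up (and thereby truncate) the log, then shrink the file
BACKUP LOG ReportsDb TO DISK = 'D:\Backup\ReportsDb_log.bak'
DBCC SHRINKFILE ('ReportsDb_Log', 100)  -- target size in MB

Scheduling regular log backups (or switching to the Simple recovery model if point-in-time recovery is not needed) keeps the log from growing back to 2 GB.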
I am using the following query in a view to retrieve the latest 24 hourly records for a site; it returns 24 hourly records for the last day of measurements at a site. This works great. However, I now need to retrieve only the hourly records up to the current hour. For example, hours run from 00:00 to 23:00, and if the query is executed at 15:00, I should return only the hourly records for 00:00 to 15:00. I believe I need to filter the result set, or modify the query to exclude records greater than the current hour.
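A hedged filter, assuming a datetime column named ReadingTime and SQL Server 2008+ for the date type: keep today's rows whose hour does not exceed the current hour.

SELECT SiteId, ReadingTime, Value  -- columns are assumptions
FROM dbo.HourlyReadings
WHERE ReadingTime >= CAST(GETDATE() AS date)  -- from midnight today
  AND DATEPART(HOUR, ReadingTime) <= DATEPART(HOUR, GETDATE())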
I am using the script below and getting data at 15-minute intervals. I would like to aggregate the data to hourly, so that instead of readings for 2014-01-01 00:15:00.000 and 2014-01-01 00:30:00.000, all the data is aggregated under 2014-01-01 00:00:00.000, then under 2 o'clock, and so on. How should I tweak this query to sum the interval values and display them?
SELECT r.MeterId, r.ReadingDate, r.Reading
FROM MeterReading r, MeterDetail d, Building b
where r.MeterId = d.MeterId
and d.BuildingId = b.BuildingId
and b.BuildingName like '%182%'
and r.ReadingDate between '2014-01-01' and '2014-01-10'
order by r.MeterId
Current Output
MeterId  ReadingDate      Reading
3969     1/01/2014 0:00   0
3969     1/01/2014 0:15   0
3969     1/01/2014 0:30   0
3969     1/01/2014 0:45   0
3969     1/01/2014 1:00   0
3969     1/01/2014 1:15   1
3969     1/01/2014 1:30   0
3969     1/01/2014 1:45   0
3969     1/01/2014 2:00   0
3969     1/01/2014 2:15   0
3969     1/01/2014 2:30   0
3969     1/01/2014 2:45   0
3969     1/01/2014 3:00   0
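A hedged rewrite: truncate ReadingDate to the hour with DATEADD/DATEDIFF arithmetic and sum the interval readings inside each bucket (swap SUM for AVG if the meter readings should be averaged instead):

SELECT r.MeterId,
       DATEADD(HOUR, DATEDIFF(HOUR, 0, r.ReadingDate), 0) AS ReadingHour,
       SUM(r.Reading) AS HourlyReading
FROM MeterReading r
JOIN MeterDetail d ON r.MeterId = d.MeterId
JOIN Building b ON d.BuildingId = b.BuildingId
WHERE b.BuildingName LIKE '%182%'
  AND r.ReadingDate >= '2014-01-01' AND r.ReadingDate < '2014-01-10'
GROUP BY r.MeterId, DATEADD(HOUR, DATEDIFF(HOUR, 0, r.ReadingDate), 0)
ORDER BY r.MeterId, ReadingHour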
I am using an SSIS package to pull the data (the last 2 months).
Since the data size is huge, I have to split the pull into hourly chunks.
How can I make this dynamic? Right now I am changing the hours manually after each package execution.
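One hedged pattern: put the extraction inside a For Loop container that advances a DateTime package variable (say User::WindowStart) one hour per iteration, and parameterize the OLE DB source with it, so no literals need editing between runs. Table and column names are assumptions:

-- OLE DB Source "SQL command", with both ? parameters mapped to User::WindowStart
SELECT *
FROM dbo.SourceTable
WHERE CreatedAt >= ?
  AND CreatedAt < DATEADD(HOUR, 1, ?)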
I am trying to break the content down on an hourly basis. It works fine when there are values for a specific hour, but if there are no entries for a specific hour it returns NULL instead of 0. How do I get zero instead of NULL?
Here is the code below:
With temp_exp As
(Select c.state, Cast(signeddate As date) As signatureDate, signeddate As DoneTime From contract c with(nolock) where c.signeddate > DateAdd(Day, Datediff(Day, 0, GetDate()), 0)
)
Select
Sum(Case When CONVERT(varchar(8),DoneTime,108) Between '07:00:00' And '07:59:59' Then 1 Else 0 End) '8AM',
[Code] ....
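A hedged fix, on the assumption that the NULLs appear when no rows at all fall in the window: SUM() over an empty set returns NULL, so wrap each bucket in IsNull (only the first bucket is shown; the elided ones follow the same pattern):

Select
    IsNull(Sum(Case When CONVERT(varchar(8), DoneTime, 108)
                    Between '07:00:00' And '07:59:59'
                    Then 1 Else 0 End), 0) As '8AM'
From temp_exp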
I have SharePoint 2010, onto which I have uploaded a PowerPivot model.
Currently it doesn't seem like I can set up the Data Refresh service to refresh my model more frequently than once a day; the Data Refresh configuration page does not show an option for anything more frequent than daily.
I have also tried to refresh the model's database directly on the Tabular SSAS instance (which SharePoint uses to store PowerPivot models) via SSIS or XMLA, but I get an error saying the tabular model is in "ReadOnly" mode. I could potentially bypass that (by detaching and re-attaching the model), but that is starting to sound a bit too hacky.
Is there any way I could refresh my SharePoint uploaded PowerPivot model more than once daily?
If you are familiar with Crystal reports or Visual basic, you may be familiar with the Rate and Pmt functions.
I need to duplicate them in SQL sever 7.
Anybody have code for this already? I hate re-inventing the wheel.
More (unnecessary) details:
I have a client who has handed me the formula that I need to use for calculating interest rates. Unfortunately, the formula was written in Crystal Reports, so now I need to pick it apart and do the work that CR does automatically. Any help?
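Pmt has a closed form, so a hedged T-SQL sketch is short; Rate, by contrast, has no closed form and needs an iterative root-finder such as Newton-Raphson around the same payment formula.

-- A hedged sketch of VB/Crystal's Pmt(): the payment on a loan of @pv
-- at periodic rate @rate over @nper periods, payments at period end
DECLARE @rate float, @nper int, @pv float, @pmt float
SET @rate = 0.05 / 12  -- 5% APR, monthly payments
SET @nper = 360        -- 30 years
SET @pv = 200000
SET @pmt = @rate * @pv / (1.0 - POWER(1.0 + @rate, -@nper))
SELECT @pmt AS MonthlyPayment  -- about 1073.64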
This is the code I have written; I am trying to retrieve the minimum count of PQPageId for every hour over a given date range.
WITH CTE AS (
SELECT PQIM.PQPageID
,PQIM.PageURL as PageDescription
,CONVERT(Date,NCPI.RequestDateTime) AS [Date]
,DATEPART(HOUR,NCPI.RequestDateTIme) AS [HOUR]
,ISNULL (COUNT(NCPI.PQPageID),0)AS HourlyPQPageIdCount
FROM dbo.NewCarPurchaseInquiries AS NCPI WITH (NOLOCK)
[Code] ....
This is the output I get :
PQPageId Date HOUR MINCOUNT
-------- ---------- ---- --------
1 04-11-2015 8 2359
1 05-11-2015 8 2332
1 06-11-2015 8 2008
1 07-11-2015 8 1964
1 08-11-2015 8 2139
1 09-11-2015 8 54
[Code] ....
But I am expecting
PQPageId Date HOUR MINCOUNT
-------- ---------- ---- --------
1 09-11-2015 8 54
1 11-11-2015 9 10
1 11-11-2015 10 4
2 11-11-2015 8 10
2 11-11-2015 9 2
2 11-11-2015 10 1
For example, PQPageId 1 at hour 8 has a count of 2359 on 04-11-2015 and a count of 54 on 09-11-2015, so the query should return date 09-11-2015 with count 54 for hour 8. Likewise, for every page id and hour from 8 to 23, it should return the minimum count over the given date range.
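A hedged sketch: keep the posted CTE (the join that the post elides is assumed here) and then keep, per page and hour, the single row with the smallest count:

WITH CTE AS (
    SELECT PQIM.PQPageID,
           CONVERT(date, NCPI.RequestDateTime) AS [Date],
           DATEPART(HOUR, NCPI.RequestDateTime) AS [HOUR],
           COUNT(NCPI.PQPageID) AS HourlyPQPageIdCount
    FROM dbo.NewCarPurchaseInquiries AS NCPI
    JOIN dbo.PQPageInfoMaster AS PQIM  -- assumed join, elided in the post
      ON PQIM.PQPageID = NCPI.PQPageID
    GROUP BY PQIM.PQPageID, CONVERT(date, NCPI.RequestDateTime),
             DATEPART(HOUR, NCPI.RequestDateTime)
), Ranked AS (
    SELECT PQPageID, [Date], [HOUR], HourlyPQPageIdCount,
           ROW_NUMBER() OVER (PARTITION BY PQPageID, [HOUR]
                              ORDER BY HourlyPQPageIdCount) AS rn
    FROM CTE
)
SELECT PQPageID, [Date], [HOUR], HourlyPQPageIdCount AS MINCOUNT
FROM Ranked
WHERE rn = 1
ORDER BY PQPageID, [HOUR]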
Hello,
Does 7.0 store DB growth rate information?
I am looking for information that tells me how fast a DB is growing, in MB and/or percentages, over a given period of time (weekly, monthly, yearly, etc.), either in real numbers or estimates. Does 7.0 already store something like this, or do I need to write some code for it?
Or does someone have something like this already coded that they would be willing to share?
Thank you in advance.
Troy
We have some reports that run quite slowly because the queries are complicated and the tables are large. So in Oracle we created a materialized view which stores the result of the query and refreshes it occasionally. How can we do this in MSSQL? I tried the indexed view, but it seems to have lots of restrictions: our query has subqueries and cross-database table joins, so an indexed view can't be used. Is there any other object, or a temp table, that can be used to cache the report data and be accessed globally by other procedures?
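A hedged workaround: persist the result into an ordinary table and refresh it from a SQL Agent job on whatever schedule suits; unlike a temp table, a permanent table is visible to every procedure. All names are assumptions:

-- Scheduled refresh step
TRUNCATE TABLE dbo.ReportCache

INSERT INTO dbo.ReportCache (CustomerId, Revenue)  -- assumed columns
SELECT o.CustomerId, SUM(o.Amount)
FROM OtherDb.dbo.Orders AS o  -- cross-database sources are fine here
GROUP BY o.CustomerId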
Any suggestions are welcome,
Cheers
Hi,
I want to store tax rates in my tables. I set the data type to float; I want 4 decimal places, and the data in the table has 4 decimals, but when I run a query in Query Analyzer it returns 4.4999999999999998E-2 instead of 0.045.
How can I fix this?
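The root cause is that float is binary floating point and cannot represent 0.045 exactly. A hedged fix is an exact numeric type (names are assumptions):

CREATE TABLE dbo.TaxRates (
    RateId int IDENTITY (1, 1) NOT NULL,
    TaxRate decimal(9, 4) NOT NULL  -- exact, 4 decimal places
)

INSERT INTO dbo.TaxRates (TaxRate) VALUES (0.0450)
SELECT TaxRate FROM dbo.TaxRates  -- returns 0.0450, not 4.4999...E-2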
I have a Table as below
Month Year Rate
1 2013 100
2 2013 101
8 2014 105
The rate is the value effective from month 1 (January) of year 2013.
I want a query which returns the rate value for each month between month 3 of 2013 and month 5 of 2014.
The Output has to be
3 2013 101
4 2013 101
5 2013 101
to
4 2014 105
5 2014 105
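A hedged sketch, assuming each rate stays in force until a newer row appears (the expected output above suggests the 2014 rate may apply earlier than its month-8 label, so adjust the lookup rule to whatever actually governs): generate the month range with a recursive CTE, then take the latest rate at or before each month. DATEFROMPARTS needs SQL Server 2012+.

WITH Months AS (
    SELECT CAST('2013-03-01' AS date) AS MonthStart
    UNION ALL
    SELECT DATEADD(MONTH, 1, MonthStart)
    FROM Months
    WHERE MonthStart < '2014-05-01'
)
SELECT MONTH(m.MonthStart) AS [Month],
       YEAR(m.MonthStart) AS [Year],
       r.Rate
FROM Months AS m
OUTER APPLY (
    SELECT TOP (1) rt.Rate
    FROM dbo.RateTable AS rt  -- assumed table name
    WHERE DATEFROMPARTS(rt.[Year], rt.[Month], 1) <= m.MonthStart
    ORDER BY rt.[Year] DESC, rt.[Month] DESC
) AS r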
Has anyone written a function to calculate internal rate of return?
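A hedged sketch: IRR has no closed form, so iterate Newton-Raphson on the net present value. It assumes a table dbo.CashFlows (Period int, CashFlow float), with period 0 holding the (negative) initial outlay; convergence is not guaranteed for pathological cash flows.

CREATE FUNCTION dbo.fn_IRR (@guess float)
RETURNS float
AS
BEGIN
    DECLARE @rate float, @npv float, @dnpv float, @i int
    SELECT @rate = @guess, @i = 0
    WHILE @i < 100
    BEGIN
        -- NPV and its derivative with respect to the rate
        SELECT @npv = SUM(CashFlow / POWER(1.0 + @rate, Period)),
               @dnpv = SUM(-Period * CashFlow / POWER(1.0 + @rate, Period + 1))
        FROM dbo.CashFlows
        IF ABS(@npv) < 0.0000001 BREAK
        SET @rate = @rate - @npv / @dnpv  -- Newton step
        SET @i = @i + 1
    END
    RETURN @rate
END

Call it as SELECT dbo.fn_IRR(0.1) with a sensible starting guess.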
I have installed a SQL Server diagnostic tool for evaluation. It warns me that the Procedure Cache hit rate is, for example, 15%. Its help says:
The Procedure Cache Hit Rate alarm is raised when the ratio between the number of times SQL Server looks for a plan in the procedure cache and the number of times it does not find a required plan in the procedure cache falls below a threshold.
A low procedure cache hit rate indicates that SQL Server is finding fewer of the query execution plans it needs already in memory and therefore has to perform more compiles. These extra compilations will degrade SQL Server performance by causing extra CPU load.
What can I do to increase the rate?
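One hedged way to watch this rate directly, rather than through the tool (counter names as documented for SQL Server 2005):

SELECT r.cntr_value * 100.0 / NULLIF(b.cntr_value, 0) AS PlanCacheHitRatioPct
FROM sys.dm_os_performance_counters AS r
JOIN sys.dm_os_performance_counters AS b
  ON b.[object_name] = r.[object_name]
 AND b.instance_name = r.instance_name
 AND b.counter_name = 'Cache Hit Ratio Base'
WHERE r.counter_name = 'Cache Hit Ratio'
  AND r.[object_name] LIKE '%Plan Cache%'
  AND r.instance_name = '_Total'

Beyond that, the usual levers are parameterizing ad hoc queries and adding memory, since the hit rate is mostly a function of plan reuse.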
Canada DBA
I've got a statistics table that I've been writing to for about 2 years now. Every saturday night, a size (in MB) snapshot of each DB file is taken and dumped into this table. I'm then emailed a copy for that week.
Now, I'm trying to figure out what the fastest growers are. Here's the table ddl
CREATE TABLE [dbo].[DBSizeStats] (
[statid] [int] IDENTITY (1, 1) NOT NULL ,
[LogDate] [datetime] NULL ,
[Server] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[DBName] [varchar] (200) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[MDFName] [varchar] (200) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[MDFSize] [decimal](18, 0) NULL ,
[LDFName] [varchar] (200) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[LDFSize] [decimal](18, 0) NULL ,
[TotalSize] [decimal](18, 0) NULL
) ON [PRIMARY]
GO
What I'm trying to figure out is how to query the average monthly and yearly growth percentages per DB on the MDFSize column.
I'm usually pretty good at this sort of thing, but I just can't seem to wrap my head around how to solve this issue. I'm not having a very good math day.
What am I missing here?
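A hedged sketch: collapse each month to its last recorded size per database, then self-join month to previous month for the percentage (yearly works the same with YEAR() and DATEADD(YEAR, -1, ...)):

WITH MonthlyLast AS (
    SELECT DBName,
           DATEADD(MONTH, DATEDIFF(MONTH, 0, LogDate), 0) AS MonthStart,
           MAX(MDFSize) AS MDFSize  -- assumes the file only grows within a month
    FROM dbo.DBSizeStats
    GROUP BY DBName, DATEADD(MONTH, DATEDIFF(MONTH, 0, LogDate), 0)
)
SELECT cur.DBName, cur.MonthStart,
       (cur.MDFSize - prev.MDFSize) * 100.0 / NULLIF(prev.MDFSize, 0) AS GrowthPct
FROM MonthlyLast AS cur
JOIN MonthlyLast AS prev
  ON prev.DBName = cur.DBName
 AND prev.MonthStart = DATEADD(MONTH, -1, cur.MonthStart)
ORDER BY cur.DBName, cur.MonthStart

Averaging GrowthPct per DBName then gives the average monthly growth rate.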
Have the following rate schedule - amount and discount rate
$0-10 = 1% discount
$10-100 = 2%
$100-over = 3%
I created the following rate table with a min range and no max range, so the last entry will handle everything over $100:
$0 1%
$10 2%
$100 3%
I then create a view to translate it as min and max range
$0 10 1%
$10 100 2%
$100 max 3%
Select * from view where $10 between minrange and maxrange
gives 2 rows:
1%
2%
Question: Is there a way to structure the rate table so I can use SELECT BETWEEN? And what is the most common way to set up a rate table?
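A hedged answer to the second question: the most common shape keeps only the lower bound, exactly like the original table, and swaps BETWEEN for a TOP 1 lookup, which makes boundary overlap impossible (names are assumptions):

CREATE TABLE dbo.DiscountRate (
    MinAmount money NOT NULL PRIMARY KEY,
    Discount decimal(4, 2) NOT NULL  -- 0.01 = 1%
)

INSERT INTO dbo.DiscountRate VALUES (0, 0.01)
INSERT INTO dbo.DiscountRate VALUES (10, 0.02)
INSERT INTO dbo.DiscountRate VALUES (100, 0.03)

-- Exactly one row comes back, even on a boundary value like $10
DECLARE @amount money
SET @amount = $10
SELECT TOP 1 Discount
FROM dbo.DiscountRate
WHERE MinAmount <= @amount
ORDER BY MinAmount DESC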
I need to pick up a tax rate that is stored in a one-record table. I would like to avoid using a CROSS JOIN. Is there a way to SELECT the record and set a variable equal to the tax rate, so I can pick up the rate in another SELECT statement on each record?
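Yes; a hedged sketch with assumed names:

DECLARE @TaxRate decimal(9, 4)
SELECT @TaxRate = TaxRate FROM dbo.TaxRateConfig  -- the one-record table

SELECT OrderId, Amount, Amount * @TaxRate AS Tax  -- no CROSS JOIN needed
FROM dbo.Orders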
Hi
We are having problems with our SQL server 2000.
The problem is that on a daily basis we run out of disk space and I always have to run shrinkdatabase on tempdb.
Today we started with 160GB of free space and by the end of the day it was gone!
Yes, we do have many jobs running on our SQL Server pulling data in from many sources, but I don't know how to find out which job is causing this problem. I have a suspicion that it could be a job that runs hourly and pulls data from Oracle (approximately 10,000 rows each time), but that job has been active since the 28th of August 2007, and we only started running out of space in the past 5 days. Any suggestions would be appreciated as to what is causing this or how to diagnose the problem.
Thanks
I have a 32 bit SQL 2005 EE clustered installation with 10GB of physical memory and AWE enabled. Our monitoring tool, Spotlight, is reporting the Procedure Cache to be 384MB and a Hit Rate of 75% on a fairly regular basis. Sometimes the Procedure Cache increases to 495MB and a Hit Rate of 82%.
(1) With 2005 can the Procedure Cache be increased?
(2) What is the max size of Procedure Cache?
(3) How do I increase the Hit Rate to a higher percentage?
I do not encounter the issue on any other SQL Server installation, however this is our only cluster.
DBCC PROCCACHE
num proc buffs = 64889
num proc buffs used = 1135
num proc buffs active = 1135
proc cache size = 2896
proc cache used = 364
proc cache active = 364
Thanks, Dave
Hi all. I found that when I trained my data mining models, the model cover rate was very low (in my case, the training data set has 82 rows, but only 25 cases occur in the models I trained). How can I improve the cover rate, to improve the quality of the models? (If that is possible in SQL Server 2005.) I am using SQL Server 2005.
Cheers.
Hi,
We have asynchronous database mirroring on SQL Server 2005 SP2 Enterprise Edition / Windows 2000 Advanced Server. We noticed that the log send rate is quite low (average 1.3 MB/sec) in most cases, whereas "Log Bytes Flushed/sec" is high (1.4 MB/sec); as a result, the log send queue keeps increasing and eventually takes all the transaction log space. Our disk queue length is always in the range of 0.01, and the principal and mirror servers are on the local LAN.
I tried a low-end server and a high-end server, and in both cases the log send rate is approximately 1.3 MB/sec (maximum 4 MB/sec).
Is there any limitation on the log send rate?
How can we improve the log send rate? Since both servers are on the local LAN, network bandwidth does not seem to be the issue.
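A hedged way to watch the relevant counters from T-SQL while testing (counter names as documented for SQL Server 2005 database mirroring):

SELECT instance_name AS DatabaseName, counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE [object_name] LIKE '%Database Mirroring%'
  AND counter_name IN ('Log Send Queue KB', 'Log Bytes Sent/sec')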
Any help is greatly appreciated.
Thanks,
Ramesh