TABLEONE: SEC
TABLETWO: B_DATE and E_DATE (they are numbers, not dates); RATE (a number)
I first calculate the difference between E_DATE and B_DATE, and between a chosen date and B_DATE, creating the new columns GG_CED_C and GG_MAT_C.
These new columns are calculated correctly (they give the difference between the two dates according to the 30/360 day count convention) and don't give me any problems: they are all filled with numbers.
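To make the day count concrete, here is the 30/360 arithmetic for one pair of dates (the values 20230115 and 20230715 are purely illustrative):

-- 30/360 day count from 20230115 to 20230715:
-- (2023 - 2023)*360 + (7 - 1)*30 + (15 - 15) = 180
SELECT (2023 - 2023)*360 + (7 - 1)*30 + (15 - 15) AS GG_CED_C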
I can use these new columns in other calculations. For example, this works on every record:
RATE/GG_CED_C * GG_MAT_C
but if I try to calculate
RATE/360 * GG_MAT_C AS RATEO
then in some records the new column RATEO is calculated correctly, while in other records there is no value in the column at all.
SELECT
    SEC, B_DATE, E_DATE, RATE,
    (YEAR2 - YEAR1)*360 + (MONTH2 - MONTH1)*30 + (DAY2 - DAY1) AS GG_CED_C,
    (YEARRIL - YEAR1)*360 + (MONTHRIL - MONTH1)*30 + (DAYRIL - DAY1) AS GG_MAT_C
FROM (
    SELECT
        A.SEC, B_DATE, E_DATE, RATE,
        -- B_DATE (YYYYMMDD) split into year, month and day, with the day capped at 30 for 30/360
        TRUNC(B_DATE/10000, 0) AS YEAR1,
        TRUNC(MOD(B_DATE, 10000)/100, 0) AS MONTH1,
        CASE
            WHEN MOD(B_DATE, 10000) = 229 THEN 30
            WHEN MOD(B_DATE, 10000) = 228
                 AND MOD(TRUNC(B_DATE/10000, 0), 4) > 0 THEN 30
            ELSE MIN(MOD(B_DATE, 100), 30)
        END AS DAY1,
        -- E_DATE split the same way
        TRUNC(E_DATE/10000, 0) AS YEAR2,
        TRUNC(MOD(E_DATE, 10000)/100, 0) AS MONTH2,
        CASE
            WHEN MOD(E_DATE, 10000) = 229 THEN 30
            WHEN MOD(E_DATE, 10000) = 228
                 AND MOD(TRUNC(E_DATE/10000, 0), 4) > 0 THEN 30
            ELSE MIN(MOD(E_DATE, 100), 30)
        END AS DAY2,
        -- the chosen reference date &DRIL split the same way
        TRUNC(&DRIL/10000, 0) AS YEARRIL,
        TRUNC(MOD(&DRIL, 10000)/100, 0) AS MONTHRIL,
        CASE
            WHEN MOD(&DRIL, 10000) = 229 THEN 30
            WHEN MOD(&DRIL, 10000) = 228
                 AND MOD(TRUNC(&DRIL/10000, 0), 4) > 0 THEN 30
            ELSE MIN(MOD(&DRIL, 100), 30)
        END AS DAYRIL
    FROM TABLEONE A, TABLETWO C
What's happening?
After what I've seen in this query I'm starting to wonder whether SQL is reliable when it comes to calculations (after all, its main duty is to query data, not to do arithmetic on it).
Our Transactions/sec counter jumped quite a bit when we moved to SQL Server 2005. The move coincided with increased load so we didn't think anything of it until recently. Upon further review, the counter just seems too high.
There was an article in SQL Server magazine a few years ago by Brian Moran where he states, "Transactions/sec doesn't measure activity unless it's inside a transaction. Batch Requests/sec measures all batches you send to the server even if they don't participate in a transaction." He goes on to say that Transactions/sec will be skewed lower because it is a subset of Batch Requests/sec. (http://www.sqlmag.com/Article/ArticleID/26380/sql_server_26380.html)
The article was written for SQL Server 2000. We conducted tests in 2000 and found what he said to be right on the money. SELECT statements increased Batch Requests/sec, but not Transactions/sec. UPDATE/INSERT/DELETE statements increased both in lockstep. Makes perfect sense so far.
We conducted the same tests in 2005 and found a radically different story. While SELECT statements behaved the same, UPDATE/INSERT/DELETE statements showed Transactions/sec skyrocketing 2-10x higher than Batch Requests/sec for the duration of the statement. In other words, a single transaction submitted by our application fires off many more transactions than the one we submitted. I was unable to pinpoint exactly what these "hidden" transactions were actually doing. Is this something that occurred in 2000 but simply wasn't reported? Or is it new behavior in 2005?
While trying to answer these questions we noticed a second strange behavior in 2005. When no queries are being executed the Transactions/sec counter still jumps every six seconds like clockwork. And these phantom transactions number in the thousands. We tried to use profiler to capture what SQL was being executed, but nothing shows up in any SQL Statement or Batch event. However, when we turned on the SQLTransaction event we found it, sort of. An object called GhostCleanupTask runs every six seconds causing thousands of transactions. We don't know exactly what it is doing, but we noticed that it ran consistently on some databases, but never on other databases. Both sets of databases are identical and in use.
So all of this investigation leaves me with three final questions.
1. What is behind all the extra transactions caught by perfmon when I submit a single transaction?
2. What is GhostCleanupTask and why does it take so many transactions? (And why does it only run on certain databases?)
3. If a potential customer asks for our Transactions/sec count, is it accurate to give them the big number, knowing that our application is only actually submitting a fraction of that? On the other hand, the system apparently really is doing that many transactions. (For instance, on our production server during peak, Batch Requests/sec is about 4,000, while Transactions/sec hits 26,000.)
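For reference, both counters can also be sampled from T-SQL instead of perfmon (a sketch assuming SQL Server 2005's sys.dm_os_performance_counters DMV; both are per-second counters, so take two snapshots a known interval apart and divide the difference by the interval to get a rate):

SELECT object_name, counter_name, instance_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('Transactions/sec', 'Batch Requests/sec')
  AND instance_name IN ('_Total', '')   -- '_Total' for the Databases object, blank for SQL Statistics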
SSIS is behaving differently in different environments even though the code is the same.
One thing is not working correctly: I am converting a string column to the float data type in a Data Conversion transformation. In our local environments the package works fine, but in the production environment it cannot convert the data and throws this error:
"The data value cannot be converted for reasons other than sign mismatch or data overflow"
Recently, my SQL Server has started getting: "Unable to connect. The maximum of '100' configured user connections are already connected. System Administrator can configure to a higher value with sp_configure." The server has to be rebooted to recover.
What's odd is that this occurs during the weekend when only about 5-8 connections are used by various system processes. I can't tell yet what is sucking up the connections.
We use SQL 6.5 sp4 on NT4.0 sp3 for DEC Alpha. We are also using SQL Mail, Seagate Backup (with SQL drivers) and some cgis that talk to the database.
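Two things worth running from ISQL/w while the connections are climbing (both standard procedures in 6.5): sp_who to see who is actually holding the connections, and sp_configure to raise the cap. The 'user connections' option is static, so a new value only takes effect after the server is restarted:

EXEC sp_who   -- one row per connection: spid, status, login, host name, database, command

EXEC sp_configure 'user connections', 200
RECONFIGURE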
I'm running a DB using MSDE (2000) that is interfaced by 2 different ADEs running on PCs with Access 2000 Runtime. One of the ADEs is a packaged accounting system that is very solid and stable; the other is a custom application that I wrote (much less solid and stable). The custom app only deals with a select few tables in the database, and the table in question is not one of those.
With alarming regularity (daily), records are getting deleted out of a particular table. I've set up a couple of dummy records in the table and put a delete trigger on the table that creates a record in a 'log' table telling me the user and the time that the records are deleted.
The deletion (all records in the table) always occurs during business hours (never over the weekend or at night) and the user responsible varies among 3 or 4 different users. 2 of those users don't even have rights to that table, so I'm really confused how those logins could cause a delete on a table they don't have access to!
As far as I can tell, this is only happening to this particular table (I hope!). Is there a way that I can get more information on the process or machine or anything else that is behind the deletion?
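For what it's worth, the logging trigger can capture a bit more than the user and the time: SUSER_SNAME(), HOST_NAME() and APP_NAME() all work in MSDE 2000 and usually narrow down which machine and which application issued the delete. A sketch, with made-up names (dbo.MyTable for the affected table, dbo.DeleteLog for the log table):

CREATE TRIGGER trg_MyTable_DeleteAudit ON dbo.MyTable
FOR DELETE
AS
INSERT INTO dbo.DeleteLog (DeletedAt, LoginName, HostName, AppName, RowsDeleted)
SELECT GETDATE(), SUSER_SNAME(), HOST_NAME(), APP_NAME(), COUNT(*)
FROM deleted

APP_NAME() in particular should show whether the delete came from the packaged accounting ADE, the custom ADE, or something else entirely running under that user's login.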
So after 7 months of using a SQL Express DB (on a dedicated server), with SQL Management Studio on two PCs in my office connecting to said DB, and also connecting to the DB with Query Analyzer with no connection problems at all, I came into the office yesterday morning to find that the two computers with Studio could not connect to the DB. Query Analyzer still connects fine to the DB, with the exact same IP/server instance and with Windows authentication. I've tried reinstalling Studio and disabling the firewall both on the client PCs and on the server. The instance on the server is set to accept incoming connections (verified, I guess, by the fact that Query Analyzer can still connect to it). I don't get it. Any suggestions?
~ Mauricio
Here is the error message from studio
TITLE: Connect to Server
------------------------------
Cannot connect to 74.***.***.**\sqlexpress.
------------------------------
ADDITIONAL INFORMATION:
An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified) (Microsoft SQL Server, Error: -1)
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&EvtSrc=MSSQLServer&EvtID=-1&LinkId=20476
I am running a simple merge replication in SQL Server 2000. I have one database that is the publisher, and a second database that is the subscriber. When I add a new row to the subscriber it will replicate to the publisher as expected. However, the new row at the subscriber will then be deleted without explanation. The row will remain at the publisher though.
I'm using the Data Flow Task to load data from a flat file into a SQL table and I'm missing rows, and there doesn't seem to be any consistent or obvious reason why.
When I use the Bulk Insert Task I import the correct number of rows from the flat file. But when I use the Data Flow Task with a Flat File Source connected to an OLE DB Destination I get about 1/3 of the right number of rows. Comparing the two loaded tables side by side, the Data Flow Task method just skips rows sometimes.
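One quick way to see exactly which rows the data flow skipped (assuming a SQL Server 2005 or later target, since EXCEPT is used) is to load the same file with both methods into two work tables and diff them; the table names below are placeholders and the column lists of the two tables must match:

SELECT * FROM dbo.Load_BulkInsert
EXCEPT
SELECT * FROM dbo.Load_DataFlow

Whatever comes back is the set of rows the Data Flow Task dropped, which usually makes the pattern (truncated rows, embedded text qualifiers, stray delimiters) easier to spot.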
I created a custom assembly in C# to transform some binary data into text, and in this assembly I use a Win32 DLL developed by our customer to help me do the transformation. The code I use to call the Win32 DLL looks like this:

[DllImport("tdasuie.dll", EntryPoint = "AlrtLogConditionToText", ExactSpelling = false, CharSet = CharSet.Auto, SetLastError = true)]
private static extern UInt32 AlrtLogConditionToText(Byte[] pbCondition, StringBuilder pszText, UInt32 dwSize);

I defined a C# method that calls the above Win32 function and returns a string, and in the report I call this C# method to get the string I need.
In the report designer, the C# method in the custom assembly returns the correct string in the preview window. But after I deploy the report to the report server, the textbox only displays "#Error" on the Report Manager web page.
I have an Excel file with a column called "Code" whose values are A, B, C, D, E, F, G, and H. I want to create a new column called "Status" based on the values of "Code".
Code:
A B C D E F G H
If the code is A, C, E, or G then "Status" = "Active"; else if it is B, D, F, or H then "Status" = "Inactive". I'd like to do this using a Derived Column transformation.
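The logic itself is a plain conditional; here it is in T-SQL for clarity (dbo.ImportedSheet is a placeholder name for wherever the Excel data lands), and the Derived Column expression mirrors it with its ? : conditional operator instead of CASE:

SELECT Code,
       CASE WHEN Code IN ('A', 'C', 'E', 'G') THEN 'Active'
            WHEN Code IN ('B', 'D', 'F', 'H') THEN 'Inactive'
       END AS Status
FROM dbo.ImportedSheet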
The requirement is to have a table, say 'child_table', with an identity column that references a column in another table, say 'parent_table'.
I cannot implement this constraint; it throws an error when I execute the ALTER query below:
ALTER TABLE child_table ADD CONSTRAINT fk_1_ct FOREIGN KEY (child_id) REFERENCES parent_table (parent_id) ON DELETE CASCADE
The error thrown is:
Failed to execute alter table query: 'ALTER TABLE child_table ADD CONSTRAINT fk_1_ct FOREIGN KEY (child_id) REFERENCES parent_table (parent_id) ON DELETE CASCADE'. Message: java.sql.SQLException: Cascading foreign key 'fk_1_ct' cannot be created where the referencing column 'child_table.child_id' is an identity column.
Hi guys. I have a temporary table called #CTE with the columns [Account], [Name], [RowID Table Level] and [RowID Data Level]. I need to change the type of [RowID Table Level] and [RowID Data Level] to integer, and set [RowID Table Level] as an identity column starting from 1 and incrementing by 1 each time. What is the right syntax for this in SQL Server 2000?
I am trying to solve the question in the link below: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2093921&SiteID=1
Thanks in advance, Aldo.
I have tried the code below, but I get a syntax error...
ALTER TABLE #CTE ALTER COLUMN [RowID Table Level] INT IDENTITY(1,1), [RowID Data Level] INT;
I have also tried:
ALTER TABLE #CTE MODIFY [RowID Table Level] INT IDENTITY(1,1), [RowID Data Level] INT;
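For reference, SQL Server 2000 does not let ALTER COLUMN add the IDENTITY property to an existing column (and has no MODIFY clause). A common workaround, sketched here only, is to rebuild the temp table with SELECT ... INTO and the IDENTITY() function, then drop the original:

SELECT [Account],
       [Name],
       IDENTITY(int, 1, 1) AS [RowID Table Level],
       CAST([RowID Data Level] AS int) AS [RowID Data Level]
INTO #CTE2
FROM #CTE

DROP TABLE #CTE
-- carry on working with #CTE2 from here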
Hi, I'm trying to alter a table and delete a column, and I get the following error:

The object 'DF__Morningst__LastU__19EB91BA' is dependent on column 'LastUpdated'. ALTER TABLE DROP COLUMN LastUpdated failed because one or more objects access this column.

I tried deleting the constraint in question, but the next time I get the same error with a different constraint name. I want to find out if I can dynamically look up the constraint name, delete it, and then drop the column. Can anyone help? This is what I have so far:

IF EXISTS (SELECT 1
           FROM sysobjects, syscolumns
           WHERE sysobjects.id = syscolumns.id
             AND sysobjects.name = 'Tablename'
             AND syscolumns.name = 'column name')
BEGIN
    EXECUTE ('ALTER TABLE tablename DROP CONSTRAINT DF__SecurityM__DsegL__08C105B8')
    EXECUTE ('ALTER TABLE tablename DROP COLUMN columnname')
END
GO
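On the dynamic lookup: in SQL Server 2000 the object id of a column's default constraint is stored in syscolumns.cdefault, so the constraint name can be resolved and dropped without hard-coding it. A sketch (the table and column names are placeholders to adapt):

DECLARE @table sysname, @column sysname, @constraint sysname, @sql nvarchar(4000)
SET @table  = 'Morningstar'      -- placeholder: the real table name
SET @column = 'LastUpdated'

-- syscolumns.cdefault holds the object id of the column's default constraint
SELECT @constraint = o.name
FROM sysobjects o
JOIN syscolumns c ON c.cdefault = o.id
WHERE c.id = OBJECT_ID(@table)
  AND c.name = @column

IF @constraint IS NOT NULL
BEGIN
    SET @sql = N'ALTER TABLE ' + QUOTENAME(@table) + N' DROP CONSTRAINT ' + QUOTENAME(@constraint)
    EXEC (@sql)
END

SET @sql = N'ALTER TABLE ' + QUOTENAME(@table) + N' DROP COLUMN ' + QUOTENAME(@column)
EXEC (@sql)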
I am trying to exclude records from a table where the ID column is the same but the Mail Code column is multi-valued. For example (the table looks like....)
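The shape of such an exclusion query is roughly this (a sketch; ID, MailCode and dbo.MyTable are assumed names based on the description): keep only the IDs that carry a single distinct mail code.

SELECT t.*
FROM dbo.MyTable t
WHERE t.ID NOT IN (SELECT ID
                   FROM dbo.MyTable
                   GROUP BY ID
                   HAVING COUNT(DISTINCT MailCode) > 1)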
Hi, I have dates in "mmddyy" format coming from the sources, and they are older dates from the mid 80s, like 082580 for instance.
When I cast it this way, (DT_DBTIMESTAMP) Source_Date, it validates OK but throws a runtime error.
When I hardcode a date in the same format, (DT_DBTIMESTAMP) "082580", it turns red (an indication of a syntax error). Please note that we use double quotes in expressions in the Derived Column transformation, so anticipating that using double quotes instead of single ones is the syntax problem would be wrong.
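For what it's worth, the underlying string surgery is easy to see in T-SQL: rearrange mmddyy into an unambiguous yyyymmdd string before converting. The '19' century prefix is an assumption that fits mid-80s data, and the same substring/rearrange idea then has to be rewritten as a Derived Column expression:

DECLARE @src char(6)
SET @src = '082580'   -- mmddyy
SELECT CONVERT(datetime, '19' + RIGHT(@src, 2) + LEFT(@src, 2) + SUBSTRING(@src, 3, 2))   -- 1980-08-25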
I am creating a report in SSRS which has the following criteria:
- Row 1 (parent) is 'Product'
- Row 2 (child) is 'Feed'
- Columns are dates. I have 5 dates showing at any one time across the top. The date field is set up as a parameter, so depending on the date the user selects, the report will show that date in the last column and the 4 days prior to it in the other columns.
- Data is the number of records.
I have a subtotal on Product, and the report is collapsed on Product by default.
What I'm stuck on is inserting a column at the very end that shows the variance between the last two dates, i.e. the difference between the date the user selected (the @date parameter) and the day before it.
I have a table with Date, Opening, Addition and Sale values, where the Opening value appears only in the very first row; in every other row it is zero.
In SSRS, how can I produce a report showing a Closing value = Opening + Addition - Sale in the current row (that is simple for the first row)? This Closing value should become the Opening value of the next row, and the same formula should continue down the report.
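For reference, the arithmetic behind the running closing value looks like this in T-SQL (dbo.Stock is an assumed table name): since Opening is non-zero only on the first row, summing Opening + Addition - Sale over all rows up to the current date gives the closing balance that then becomes the next row's opening.

SELECT t.[Date],
       (SELECT SUM(s.Opening + s.Addition - s.Sale)
        FROM dbo.Stock s
        WHERE s.[Date] <= t.[Date]) AS Closing
FROM dbo.Stock t
ORDER BY t.[Date]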
ALTER PROCEDURE [dbo].[MyPro] (@StartRowIndex int, @MaximumRows int)
AS
BEGIN
    DECLARE @Sel nvarchar(2000)
    SET @Sel = N'Select *, Row_number() over (order by myId) as ROWNUM from MyFirstTable Where ROWNUM Between '
               + CONVERT(nvarchar(15), @StartRowIndex)
               + ' and (' + CONVERT(nvarchar(15), @StartRowIndex)
               + ' + ' + CONVERT(nvarchar(15), @MaximumRows) + ') - 1'
    PRINT @Sel
    EXEC sp_executesql @Sel
END
--Execute MyPro 1, 4   --->> this is how I executed it

The printed statement is:

Select *, Row_number() over (order by myId) as ROWNUM from MyFirstTable Where ROWNUM Between 1 and (1+4)-1

and the error is:

Msg 207, Level 16, State 1, Line 1
Invalid column name 'ROWNUM'.
Msg 207, Level 16, State 1, Line 1
Invalid column name 'ROWNUM'.

The procedure is created successfully but gives this error when I execute it. Can anybody please reply? Thanks
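The reason for the error is that ROW_NUMBER() cannot be referenced in the WHERE clause of the same SELECT that defines it; the generated statement has to wrap the numbering in a derived table first. A sketch of the static form of the statement the procedure builds:

SELECT *
FROM (SELECT *, ROW_NUMBER() OVER (ORDER BY myId) AS ROWNUM
      FROM MyFirstTable) AS numbered
WHERE ROWNUM BETWEEN 1 AND (1 + 4) - 1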
I have sales cost and cost amount data for my budget. The sales cost column is getting updated with FOBB items, which makes the total incorrect. The FOBB values need to be moved from the sales cost column to the cost amount column. How can I do this with a SQL script?
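The general shape would be a single UPDATE, but this is only a sketch: the table name, column names and above all the condition that identifies FOBB rows are assumptions that have to be replaced with the real ones.

UPDATE dbo.BudgetSalesCost
SET CostAmount = CostAmount + SalesCost,
    SalesCost  = 0
WHERE ItemCode = 'FOBB'   -- placeholder condition for the FOBB items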
We have a database where many tables have a field that has to be lengthened. In some cases this is a primary key or part of a primary key. The table in question is:-
/****** Object: Table [dbo].[DTb_HWSQueueMonthEnd]   Script Date: 09/25/2014 14:05:09 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[DTb_HWSQueueMonthEnd](
[Code] ....
The script I am using is
DECLARE @Column varchar(100)   -- The name of the column to change
DECLARE @size   varchar(5)     -- The new size of the column
DECLARE @TSQL   varchar(255)   -- Contains the code to be executed
DECLARE @Object varchar(50)    -- Holds the name of the table
DECLARE @dropc  varchar(255)   -- Drop constraint script
[Code] ....
When I run the script I get the error message 'Could not create constraint. See previous errors.'
Looking at the strings I build:
ALTER TABLE [dbo].[DTb_HWSQueueMonthEnd] DROP CONSTRAINT PK_DTb_HWSQueueMonthEnd

ALTER TABLE [dbo].[DTb_HWSQueueMonthEnd] ALTER COLUMN [Patient System Number] varchar(10)

ALTER TABLE [dbo].[DTb_HWSQueueMonthEnd] ADD CONSTRAINT PK_DTb_HWSQueueMonthEnd PRIMARY KEY NONCLUSTERED
    ([Patient System Number] ASC, [Episode Number] ASC, [CensusDate] ASC)
    WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF, IGNORE_DUP_KEY = OFF,
          ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
They all seem fine except the last one which returns the error
Msg 8111, Level 16, State 1, Line 1
Cannot define PRIMARY KEY constraint on nullable column in table 'DTb_HWSQueueMonthEnd'.
Msg 1750, Level 16, State 0, Line 1
Could not create constraint. See previous errors.
None of the fields I try to create the key on are nullable.
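One detail worth checking in the middle statement: ALTER COLUMN re-declares the column's nullability, and when NOT NULL is left off the column generally comes back as nullable (per the ANSI null default), which is enough to break the PRIMARY KEY step that follows even if the column was NOT NULL before. The explicit form would be:

ALTER TABLE [dbo].[DTb_HWSQueueMonthEnd] ALTER COLUMN [Patient System Number] varchar(10) NOT NULL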
Hi everyone,
I encountered the error "Need to run the object to perform this operation. Code execution exception: EXCEPTION_ACCESS_VIOLATION" when I try to import data from Oracle to MS SQL Server with Enterprise Manager (version 8.0) using the DTS Import/Export Wizard. There are 508 rows in the Oracle table and I did get the first 42 rows imported to SQL Server.
Does anyone know what the above error message means and what causes the rest of the rows to fail importing?
Thanks very much in advance!
Rene Z.
Using MDS 2012: I have an entity "XYZ_Entity". In the "XYZ_Entity" entity I have 2 domain-based columns, "DealerGroup" and "Dealer".
While inserting information into the "XYZ_Entity" entity, the user can select the required dealer group from the domain-based DealerGroup values. For the Dealer column, he wants the dealers to be filtered based on the selected dealer group so that he can pick from the filtered list; the reason is that he doesn't want to wade through thousands of dealers and select an incorrect one.
I receive this message when I try to run any report. The ReportServer and ReportServerTempDB databases were upgraded using backup/restore from SQL Server 2000 to SQL Server 2005 on a separate server which is running RS 2005. Please help. Thanks
I want a query that joins all these tables on EmployeeID, PeriodID and LeaveTypeID and returns:
- the sum of LeaveEntitlement.LeaveEntitlementDaysNumber per LeaveTypeID, as EntitleAnnual and EntitleSick;
- the sum of AssignedLeave.AssignedLeaveDaysNumber per LeaveTypeID, as AssignedAnnual and AssignedSick;
- EntitleAnnual subtracted from AssignedAnnual as AnnualBalance, and EntitleSick subtracted from AssignedSick as SickBalance.
The result should look like the table below after executing the query.
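A sketch of such a query, with the table and column names taken from the description and the two LeaveTypeID values (1 = Annual, 2 = Sick) as placeholders for the real codes; each source table is aggregated per employee and period first so the two sums don't multiply each other:

SELECT ent.EmployeeID,
       ent.PeriodID,
       ent.EntitleAnnual, ent.EntitleSick,
       asg.AssignedAnnual, asg.AssignedSick,
       asg.AssignedAnnual - ent.EntitleAnnual AS AnnualBalance,   -- EntitleAnnual subtracted from AssignedAnnual, as described
       asg.AssignedSick   - ent.EntitleSick   AS SickBalance
FROM (SELECT EmployeeID, PeriodID,
             SUM(CASE WHEN LeaveTypeID = 1 THEN LeaveEntitlementDaysNumber ELSE 0 END) AS EntitleAnnual,
             SUM(CASE WHEN LeaveTypeID = 2 THEN LeaveEntitlementDaysNumber ELSE 0 END) AS EntitleSick
      FROM LeaveEntitlement
      GROUP BY EmployeeID, PeriodID) AS ent
JOIN (SELECT EmployeeID, PeriodID,
             SUM(CASE WHEN LeaveTypeID = 1 THEN AssignedLeaveDaysNumber ELSE 0 END) AS AssignedAnnual,
             SUM(CASE WHEN LeaveTypeID = 2 THEN AssignedLeaveDaysNumber ELSE 0 END) AS AssignedSick
      FROM AssignedLeave
      GROUP BY EmployeeID, PeriodID) AS asg
  ON asg.EmployeeID = ent.EmployeeID
 AND asg.PeriodID   = ent.PeriodID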
I am working on a column chart type (stacked column sub-type) report.
Our customer requires that the space (padding) between the columns be a constant, including the space between the Y axis and the first column. I know how to set the width of the columns, but I don't know how to set the width of the space between them. The chart just varies the spacing automatically according to the number of columns, and the number of columns is not fixed.
Hey all - got a problem that seems like it would be simple (and probably is : )
I'm importing a csv file into a SQL 2005 table and would like to add 2 columns that exist in the table but not in the csv file. I need these 2 columns to contain the current month and year (columns are named CM and CY respectively). How do I go about adding this data to each row during the transformation? A derived column task? Script task? None of these seem to be able to do this for me.
Here's a portion of the transformation script I was using to accomplish this when we were using SQL 2000 DTS jobs:
' Copy each source column to the destination column
Function Main()
    DTSDestination("CM") = Month(Now)
    DTSDestination("CY") = Year(Now)
    DTSDestination("Comments") = DTSSource("Col031")
    DTSDestination("Manufacturer") = DTSSource("Col030")
    DTSDestination("Model") = DTSSource("Col029")
    DTSDestination("Last Check-in Date") = DTSSource("Col028")
    Main = DTSTransformStat_OK
End Function

Hopefully this question isn't answered somewhere else, but I did a quick search and came up with nothing. I've actually tried to use the Script Component and the "Row" object, but the only properties I'm given there are the ones from the source data.
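If a Derived Column turns out to be awkward, another option that keeps the data flow untouched is to let the destination table fill those two columns itself with default constraints; this is a sketch with an assumed table name (dbo.ImportTarget), and it relies on CM and CY simply not being mapped in the OLE DB Destination so the defaults fire on insert:

ALTER TABLE dbo.ImportTarget ADD CONSTRAINT DF_ImportTarget_CM DEFAULT (MONTH(GETDATE())) FOR CM
ALTER TABLE dbo.ImportTarget ADD CONSTRAINT DF_ImportTarget_CY DEFAULT (YEAR(GETDATE())) FOR CY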
I am trying to create a whole-number DAX calculated column that is derived from a date column. Basically it takes the date from the source column and outputs it as an integer in YYYYMMDD format, so 01/OCT/2015 becomes 20151001. I've been fiddling with DAX, but my problem is that I keep losing the leading zeroes for months and days: 01/March/2015 becomes 201531, which is not what I want (I need 20150301 in this case).
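One way to avoid the leading-zero problem entirely is to build the number arithmetically instead of by string concatenation; the arithmetic is shown here in T-SQL for illustration, and DAX has the same YEAR/MONTH/DAY functions to plug into the same formula:

DECLARE @d datetime
SET @d = '20150301'
SELECT YEAR(@d) * 10000 + MONTH(@d) * 100 + DAY(@d) AS DateKey   -- 20150301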
I am using a SQL command in ASP.NET to send a query to an OLAP cube that returns a dynamic set of data, which I load into a DataTable and then bind to a GridView. I have made my own ITemplate implementation for displaying and formatting the data, and the following line is causing me problems:

RawValue = DataBinder.Eval(row.DataItem, "[Month].[Month].[MEMBER_CAPTION]")

The error returned is: Month is neither a DataColumn nor a DataRelation for table.

My guess as to what is happening is that it sees the brackets in the field name and stops reading the field name at [Month], when the actual field name is much longer. I know that it knows about the column name because it displays it correctly in the header. Since I am loading this from an OLAP cube, the names of my columns vary based on the criteria, so I cannot alias the column because I don't know exactly which columns will be displayed.

Does anyone know how I might get the DataBinder.Eval function to work with field names that contain square brackets [ and ]? If I use the GridView's "auto-generate fields" option it will show the data (and this is the column name), but then I lose all control over formatting and the other custom code I'm writing in my ITemplate implementation. Thanks
I have a table in which a non-primary-key column has a unique index on it. If I insert a record into this table with a duplicate value in the indexed column, what will be the error number raised in this scenario? Or how could I find this out?
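One way to find out is simply to reproduce the situation on a throwaway table and read the number off the message (the names below are placeholders); the message catalogue can also be searched, for example via sys.messages on SQL Server 2005 and later:

CREATE TABLE dbo.ErrTest (id int PRIMARY KEY, val int)
CREATE UNIQUE INDEX IX_ErrTest_val ON dbo.ErrTest (val)
INSERT dbo.ErrTest VALUES (1, 10)
INSERT dbo.ErrTest VALUES (2, 10)   -- fails; the 'Msg' number in the output is the error number

SELECT message_id, text
FROM sys.messages
WHERE language_id = 1033
  AND text LIKE '%duplicate key%'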