Hi
I'm updating an old Access application to SQL Server and am currently trying to decipher one of the reports in the old application. It appears to be evaluating a derived column from one query (qryStudentSuspGroup.Suspension) in the SELECT statement of another. I have tried putting the query that creates the derived column into the other query as a nested query, but I can't get it to work. This is all a bit beyond my rudimentary SQL skills! Any help would be greatly appreciated!
The original Access SQL appears below:
SELECT [Enter the academic year (4 digits)] AS [input], ResearchStudent.Department, ResearchStudent.DateAwarded,
ResearchStudent.StudentNumber, Person.Forenames AS fore, Person.Surname AS Sur, ResearchStudent.Mode,
ResearchStudent.RegistrationDate, StudentExamination.Decision,
IIf(([Suspension]) Is Null Or [Suspension]=0,([DateAwarded]-[RegistrationDate])/365,(([DateAwarded]-[RegistrationDate])-([Suspension]))/365) AS CompDate,
ResearchStudent.EnrollmentCategory, qryStudentSuspGroup.Suspension
FROM ((ResearchStudent LEFT JOIN Person ON ResearchStudent.ResearchStudentID = Person.PersonID)
LEFT JOIN qryStudentSuspGroup ON ResearchStudent.ResearchStudentID = qryStudentSuspGroup.ResearchStudentID)
LEFT JOIN StudentExamination ON ResearchStudent.ResearchStudentID = StudentExamination.ResearchStudentID
WHERE (((Year([DateAwarded]))>=[Enter the academic year (4 digits)]
And (Year([DateAwarded]))<=([Enter the academic year (4 digits)]+1))
AND ((IIf(Year([DateAwarded])=[Enter the academic year (4 digits)],Month([DateAwarded])>8,Month([DateAwarded])<9))<>False))
ORDER BY ResearchStudent.Department, ResearchStudent.Mode, ([DateAwarded]-[RegistrationDate])/365
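For what it's worth, here is a rough T-SQL sketch of that report query, with the prompt parameter turned into a variable, IIf() turned into CASE, and the saved query qryStudentSuspGroup replaced by a derived table. The body of that derived table is a placeholder (the saved query's definition isn't shown here), so the suspension table and column names are guesses; everything else maps one-to-one from the Access SQL.

DECLARE @AcademicYear int = 2015;  -- replaces the [Enter the academic year (4 digits)] prompt

SELECT @AcademicYear AS [input],
       rs.Department,
       rs.DateAwarded,
       rs.StudentNumber,
       p.Forenames AS fore,
       p.Surname   AS Sur,
       rs.Mode,
       rs.RegistrationDate,
       se.Decision,
       CASE WHEN susp.Suspension IS NULL OR susp.Suspension = 0
            THEN DATEDIFF(DAY, rs.RegistrationDate, rs.DateAwarded) / 365.0
            ELSE (DATEDIFF(DAY, rs.RegistrationDate, rs.DateAwarded) - susp.Suspension) / 365.0
       END AS CompDate,
       rs.EnrollmentCategory,
       susp.Suspension
FROM ResearchStudent rs
LEFT JOIN Person p ON rs.ResearchStudentID = p.PersonID
LEFT JOIN (
        -- placeholder for the saved query qryStudentSuspGroup:
        -- it needs to return one row per ResearchStudentID with a total Suspension value
        SELECT ResearchStudentID, SUM(SuspensionDays) AS Suspension   -- table/column names are guesses
        FROM StudentSuspension
        GROUP BY ResearchStudentID
     ) susp ON rs.ResearchStudentID = susp.ResearchStudentID
LEFT JOIN StudentExamination se ON rs.ResearchStudentID = se.ResearchStudentID
WHERE YEAR(rs.DateAwarded) BETWEEN @AcademicYear AND @AcademicYear + 1
  AND CASE WHEN YEAR(rs.DateAwarded) = @AcademicYear
           THEN CASE WHEN MONTH(rs.DateAwarded) > 8 THEN 1 ELSE 0 END
           ELSE CASE WHEN MONTH(rs.DateAwarded) < 9 THEN 1 ELSE 0 END
      END = 1
ORDER BY rs.Department, rs.Mode, DATEDIFF(DAY, rs.RegistrationDate, rs.DateAwarded) / 365.0;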
Here's another one of my bitchfests about stuff that annoys the *** out of me in SSIS (and no such problems in DTS):
Do you ever wonder how easy it was to set up a text-file-to-database transform in DTS? I had no problems at all. In SSIS I spent half a day trying to figure out how to get proper column data types for a text file. Of course MS was brilliant enough to add a "Suggest Types" feature to the Flat File connection manager - BUT guess what - it samples ONLY 1000 rows. So I tried to change that number to 50000 and clicked OK - BUT MS changed it back to 1000 without me noticing it - SO NO WONDER some of the data types did not match later on. And boy, what fun it is to change the source columns after you have created a few transforms.
This s**t just breaks... So, a word about Derived Columns - pretty useful feature, eh? It's not f***ing useful if it DELETES SOME of the code itself after there have been changes in the data flow. I can't say how pissed off I am that SSIS went ahead and deleted columns from the flow & messed up derived columns just because the lineage IDs don't match.
Metadata - it would be useful if you could change it and refresh it. I'm just sick and tired of it showing warnings and errors when there's nothing wrong - so after a change I need to double-click all my transforms so that those red & yellow boxes will disappear.
Oh, and why I passionately dislike Derived Columns: you create new fields based on some data, you do some stuff - combine multiple columns into one - but you have no way of saying "remove the source columns from the pipeline". Why would you need that? Well, if you have 50K+ rows with 30+ columns, then it's EXTRA useless memory overhead for your package.
Hopefully one day I will understand how SSIS works (not an easy task, I say) - then I might be able to spend more time on development and less time on my bitchfest. UNTIL then --> Another Day - Another Hassle with SSIS
I'm trying to write a query that concatenates multiple records into one derived column. Let's say I have an author (Joe Writer) who has written three books (Book 1, Book 2 and Book 3). The author is in tblAuthors, his books are in tblBooks, and they are joined by the AuthorID field (number). If I use a simple select query to give me the author name and the title, I will get three records, one for each book written. What I want is to have all three books combined into one derived column. So if I do the select statement, I will get one column with the author name, and the second column will put together all three names of the books separated by a comma. So it will look like:
Author       Title
Joe Writer   Book 1, Book 2, Book 3
Rather than having it appear as 3 records:
Joe Writer   Book 1
Joe Writer   Book 2
Joe Writer   Book 3
Could someone help me with the SQL involved in this? Thanks for the help.
Cheers,
Mike
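A minimal sketch of one common answer, assuming tblAuthors has an AuthorName column and tblBooks has a Title column (both column names are guesses; only AuthorID is confirmed above). On SQL Server 2005 and later the FOR XML PATH trick builds the comma-separated list; on SQL Server 2017+ STRING_AGG(Title, ', ') is simpler.

SELECT a.AuthorName AS Author,                           -- column name is a guess
       STUFF((SELECT ', ' + b.Title                      -- Title column is a guess
              FROM tblBooks b
              WHERE b.AuthorID = a.AuthorID
              ORDER BY b.Title
              FOR XML PATH(''), TYPE).value('.', 'nvarchar(max)'),
             1, 2, '') AS Titles
FROM tblAuthors a;

The STUFF(..., 1, 2, '') simply strips the leading ", " from the concatenated string.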
I have seen it happen frequently that I type in a perfectly valid SSIS expression (this is easy for me since I am an old hand at C++/C/C#) in a row in a Derived Column transformation, and it turns red. Or sometimes, I will have an invalid expression that I correct, but it stays red. Finally I have also seen it happen that I make some change in the data flow pipeline and suddenly a Derived Column transform develops an error. I then go into the Derived Column transform and find that the expression has turned red. So, I literally have to go into the expression in these cases, and make a trivial change to them to get the red error to go away. Alternatively, I can cut the derived column expression text, and then paste it back in and it works (this is most telling.)
So, it seems to me the Derived Column is somehow holding onto some metadata about the expression that is getting out of date (rather than re-evaluating the correctness of the expression). One thing I can usually do to repro this is to remove a column (that the Derived Column depends upon) from the pipeline and then re-add it. When I go into the Derived Column it will be red, and then, like I said, I have to tweak the expression to force SSIS to re-evaluate it.
I'm sending a lot of columns through my derived column transform, checking for empty strings from a flat file - I was wondering whether there is a way that I could "script out" all the transforms instead of enduring this click hell that I'm stuck in inside the Derived Column transformation editor. I've got probably 100+ columns to configure with the following sort of transform....
Basically - if the string is empty, then throw NULL into the data stream. I'm about a third of the way through, but it would really be nice if there were a quicker way. Even with the most efficient copying & pasting & keyboard shortcuts, it's still painful.
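One way to cut down the clicking is to generate the expression text outside of BIDS and paste it in. Below is a minimal T-SQL sketch, assuming the flat file lands in (or mirrors) a staging table whose string columns can be read from INFORMATION_SCHEMA; the table name is a placeholder, and the generated text is the usual empty-string-to-NULL SSIS expression. Each expression still has to be pasted into its row of the editor, but nothing has to be typed by hand.

SELECT COLUMN_NAME,
       'TRIM([' + COLUMN_NAME + ']) == "" ? NULL(DT_WSTR,'
         + CAST(ISNULL(CHARACTER_MAXIMUM_LENGTH, 50) AS varchar(10))
         + ') : [' + COLUMN_NAME + ']' AS DerivedColumnExpression
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'dbo'
  AND TABLE_NAME   = 'FlatFileStaging'                    -- placeholder table name
  AND DATA_TYPE IN ('varchar', 'nvarchar', 'char', 'nchar')
ORDER BY ORDINAL_POSITION;

For (n)varchar(max) columns CHARACTER_MAXIMUM_LENGTH comes back as -1, so those lengths would need adjusting by hand.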
Is it possible to do pattern matching against a string using FINDSTRING or similar in a derived column expression without recourse to including 3rd party regexp style plugins?
I want to search for a reference in a string which will have the format ANNNNNAAA, i.e. an alpha with a certain value, followed by 5 digits, followed by three alphas with specific values.
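FINDSTRING only locates a literal substring, so it won't do the pattern part on its own. If the check can be pushed back into the source query instead of the Derived Column (a different approach than asked for, admittedly), a T-SQL LIKE character class covers the ANNNNNAAA shape. This is only a sketch: the table and column names are placeholders, and the letters stand in for the specific values, which aren't given above.

SELECT t.SourceRef                                            -- column name is a placeholder
FROM dbo.SourceTable t                                        -- table name is a placeholder
WHERE t.SourceRef LIKE '%R[0-9][0-9][0-9][0-9][0-9]ABC%';     -- 'R' and 'ABC' stand in for the specific alpha values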
I have a lot of different data flows that need a "Derived Column". There are maybe only 5 distinct "Derived Column" configurations, but they appear many times. Is there a way to eliminate all that duplicated work? It would need to be something that does not take me more time than just duplicating all the "Derived Columns".
In my existing working package I am adding a derived column as a date with data type DT_DBTIMESTAMP, and my destination data type is datetime. When I try to union these records through a "Union All" component it does not allow me to map this column and throws an "outputcolumnlineageid" error, describing it as the metadata for source and destination not matching. I checked the metadata and they are both DT_DBTIMESTAMP. Am I missing something? Please advise.
I'm trying to set up a derived column to use an expression stored in a package variable. But it seems that variables are always evaluated as text... I need it to evaluate as a column name sometimes.
Example: On an ETL of Products I want a new derived column that uses two other columns.
I can hardcode an expression of ProductID + " - " + ProductName and that results in dynamic output. But now I want to use that as an expression stored in a variable so I can change it when needed.
So I make that expression = @[User::Variable] and put the following into @[User::Variable] (a string variable): ProductID + " - " + ProductName. My output is the literal text ProductID + " - " + ProductName, and not the actual IDs and names.
I've tried with/without brackets, quotes and braces but no change.
Any way I can get the pieces in that variable expression to evaluate as column names?
I have 3 different companies that share the same ticket_types (CRMS system). I need to display the Ticket Types and each of the 3 companies' Ticket Counts:
Ticket Type | Company A Count | Company B Count | Company C Count
I can get the information individually for each company, but if a company doesn't have a ticket in one of the ticket_types, then it isn't displayed in a row. So, I tried to write the following, which isn't pulling back any data.
DECLARE @startdate date = '20150306'
DECLARE @enddate date = '20151031'
DECLARE @AcctGrp varchar(20) = '111'
;WITH TType AS (
    SELECT ctp.description as TicketType
[Code] .....
If I run each SELECT individually from above (excluding the last SELECT), it works and I get the following:
TicketType AR Request Credit Availability/Rush Cancel Order Credit Card Payment Expedite Order Freight Quote
[Code] ...
How do I get the query results I'm after? Am I even close to getting it right?
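One way to keep a row for every ticket type even when a company has no tickets is to start from ticket_types, LEFT JOIN the tickets, and use conditional aggregation for the three counts. This is only a sketch: the tickets table, its join column, and the company/date column names are guesses, and the date variables are reused from the post above.

DECLARE @startdate date = '20150306';
DECLARE @enddate   date = '20151031';

SELECT ctp.description AS TicketType,
       SUM(CASE WHEN t.company = 'A' THEN 1 ELSE 0 END) AS CompanyACount,
       SUM(CASE WHEN t.company = 'B' THEN 1 ELSE 0 END) AS CompanyBCount,
       SUM(CASE WHEN t.company = 'C' THEN 1 ELSE 0 END) AS CompanyCCount
FROM dbo.ticket_types ctp
LEFT JOIN dbo.tickets t                                   -- table/column names are guesses
       ON  t.ticket_type_id = ctp.ticket_type_id
       AND t.created_date >= @startdate
       AND t.created_date <  @enddate
GROUP BY ctp.description
ORDER BY ctp.description;

Putting the date filters in the JOIN rather than the WHERE clause is what preserves ticket types with zero tickets.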
I have to Select Order, Order Details and Order Status
Order Status is determined from Order Stage as follows:
If at least one order detail line (from the Order Details and Related Order Details tables) is approved, then that Order Status = Approved.
For example, the Order Status of Order ID = 2 is Approved, based on the order status for order details line 3 (from table 2) and order details IDs 1 and 2 (from table 3).
How do I combine the order stage from table 2 and table 3 and then compute the order status?
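A hedged sketch of that rule as a query, assuming both detail tables carry an OrderID and a Stage column (the real table and column names aren't shown above): an order is Approved as soon as any detail line in either table is approved.

SELECT o.OrderID,
       CASE WHEN EXISTS (SELECT 1 FROM dbo.OrderDetails od
                         WHERE od.OrderID = o.OrderID AND od.Stage = 'Approved')
              OR EXISTS (SELECT 1 FROM dbo.RelatedOrderDetails rod
                         WHERE rod.OrderID = o.OrderID AND rod.Stage = 'Approved')
            THEN 'Approved'
            ELSE 'Pending'                                -- or whatever the non-approved status should be
       END AS OrderStatus
FROM dbo.Orders o;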
Parameter Information cannot be derived from SQL statements with sub-select queries. Set Parameter information before preparing command.
Here's the query:
update GCDE_SEQ set LAST_NO = (select max(FLD_NO) from PONL_FLD) ,UPDT_USER = ? ,UPDT_DT = getdate() where SEQ_NM = 'FLD_NO'
Why can't the Execute SQL Task handle this simple query? I figure I can use two Execute SQL Tasks, one to get the max into a variable and the other to do the updating. However, this is a lot of trouble since I have almost exactly this query in a lot of places. Any way around this?
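One workaround that keeps it in a single Execute SQL Task is to move the sub-select into a local variable inside the batch, so the provider only has to derive the ? parameter from a plain UPDATE. This is a sketch of that idea; I can't promise every provider accepts it, but it avoids mixing the sub-select with the parameterised statement.

DECLARE @MaxFldNo int;

SELECT @MaxFldNo = MAX(FLD_NO) FROM PONL_FLD;

UPDATE GCDE_SEQ
SET    LAST_NO   = @MaxFldNo,
       UPDT_USER = ?,
       UPDT_DT   = GETDATE()
WHERE  SEQ_NM = 'FLD_NO';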
I have a table which has a field called Org. This field can be segmented into one to five segments based on a user-defined delimiter and user-defined segment lengths. Another table contains one row of data with the user-defined delimiter and the start and length of each segment, e.g.
Table 1
Org
aaa:aaa:aa
aaa:aaa:ab
aaa:aab:aa
Table 2
delim  Seg1Start  Seg1Len  Seg2Start  Seg2Len  Seg3Start  Seg3Len
:      1          3        5          3        9          2
My objective is to use SSIS and derive three columns from the one column in Table 1 based on the positions defined in Table 2. Table 2 is a single row table. I thought perhaps I could use the substring function and nest the select statement in place of the parameters in the derived column data flow. I don't seem to be able to get this to work.
Any ideas? Can this be done in SSIS?
I'd really appreciate any insight that anyone might have.
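One alternative, sketched below, is to do the split in the source query rather than in a Derived Column: since Table 2 holds exactly one row, a CROSS JOIN makes its positions available to every row of Table 1 and SUBSTRING does the rest. The table names here are just the "Table 1"/"Table 2" placeholders used above.

SELECT o.Org,
       SUBSTRING(o.Org, p.Seg1Start, p.Seg1Len) AS Seg1,
       SUBSTRING(o.Org, p.Seg2Start, p.Seg2Len) AS Seg2,
       SUBSTRING(o.Org, p.Seg3Start, p.Seg3Len) AS Seg3
FROM dbo.Table1 o            -- the table with the Org column
CROSS JOIN dbo.Table2 p;     -- the single-row table of delimiter/segment positions

The result set can then feed the OLE DB Source directly, so no Derived Column is needed for the split.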
I have a business need to create a report by querying data from a MS SQL 2008 database and displaying the result to the users on a web page. The report initially has 6 columns of data, and 2 of the 6 contain JSON data, so the users have requested that those 2 JSON columns be parsed into 15 additional columns (the first JSON column has 8 key/value pairs and the second JSON column has 7 key/value pairs). Here is what I have done so far:
I found a table-valued function (fnSplitJson2) from this link [URL]. Using this function I can parse a column of JSON data into a table. So when I use the function against the first column (with JSON data) in my query (with CROSS APPLY) I get the right data back, but I get 8 rows for each row in my table. The reason for this side effect is that the function returns a table of 8 rows (8 key/value pairs) for each JSON string it parses.
1. First question: How do I modify my current query (see below) so that for each row in my table I get back one row with 19 columns?
SELECT A.ITEM1,A.ITEM2,A.ITEM3,A.ITEM4, B.* FROM PRODUCT A CROSS APPLY fnSplitJson2(A.ITEM5,NULL) B
If I update my query (see below) and call the function twice within the CROSS APPLY clause, I get this error: "The multi-part identifier "A.ITEM6" could not be bound."
2. My second question: How do I get around this error?
SELECT A.ITEM1,A.ITEM2,A.ITEM3,A.ITEM4, B.*, C.* FROM PRODUCT A CROSS APPLY fnSplitJson2(A.ITEM5,NULL) B, fnSplitJson2(A.ITEM6,NULL) C
I am using Microsoft SQL Server 2008 R2 version. Windows 7 desktop.
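For the second question: inside an APPLY, a comma-joined function call can't see A's columns, which is what produces the "multi-part identifier could not be bound" error; chaining two CROSS APPLY clauses fixes the binding. For the first question, the rows the function returns would still need to be pivoted back into columns (for example MAX(CASE ...) grouped by the product key), which depends on the column names fnSplitJson2 actually returns, so only the binding fix is sketched here.

SELECT A.ITEM1, A.ITEM2, A.ITEM3, A.ITEM4, B.*, C.*
FROM   PRODUCT A
CROSS APPLY fnSplitJson2(A.ITEM5, NULL) B
CROSS APPLY fnSplitJson2(A.ITEM6, NULL) C;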
First, I'd like to figure out the count of how many rows that are not in the Current Edition have the following:
Second I'd like to be able to select the primary key of all the rows involved
Third I'd like to select all the primary keys of just the rows not in the current edition
Not really sure how to describe this without making a dataset
CREATE TABLE [Project].[TestTable1](
    [TestTable1_pk] [int] IDENTITY(1,1) NOT NULL,
    [Source_ID] [int] NOT NULL,
    [Edition_fk] [int] NOT NULL,
    [Key1_fk] [int] NOT NULL,
    [Key2_fk] [int] NOT NULL,
[Code] .....
GROUP BY fails me because I only want the groups where the Edition_fk values don't match...
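A rough sketch of the kind of query I think is being described, assuming the current edition is a known Edition_fk value and that a row counts as "not in the current edition" when its Source_ID has no row carrying that value; the variable value is obviously a placeholder.

DECLARE @CurrentEdition int = 1;                     -- placeholder value

-- primary keys of rows whose Source_ID never appears in the current edition
SELECT t.TestTable1_pk, t.Source_ID, t.Edition_fk
FROM   [Project].[TestTable1] t
WHERE  NOT EXISTS (SELECT 1
                   FROM [Project].[TestTable1] c
                   WHERE c.Source_ID  = t.Source_ID
                     AND c.Edition_fk = @CurrentEdition);

Wrapping that in SELECT COUNT(*) gives the first number; the other two selections look like variations on the same NOT EXISTS pattern.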
I am working on a Statistical Reporting system where:
Data Repository: SQL Server 2005
Business Logic Tier: Views, User Defined Functions, Stored Procedures
Data Access Tier: Stored Procedures
Presentation Tier: Reporting Services
The end user will be able to slice & dice the data for the report by:
- different organizational hierarchies
- different numbers of layers within a hierarchy
- selecting one organization, or selecting All of the organizations within the organizational hierarchy
- combinations of selection criteria, where these selection criteria are independent of each other, and also differe...
Below is an example of 2 Organizational Hierarchies:
Hierarchy 1
Country -> Work Group -> Project Team (Project Team within Work Group within Country)
Hierarchy 2
Client -> Contract -> Project (Project within Contract within Client)
Based on the 2 different Hierarchies above, here are a couple of use cases:
Country = "USA", Work Group = "Network Infrastructure", Project Team = all teams
Country = "USA", Work Group = all work groups
What I need to work out:
- How to implement the data interface (Stored Procs) to the Reports
- How to implement the business logic to handle the different hierarchies & different numbers of levels
I did get help earlier in this forum on how to handle a parameter having a specific value or a NULL value (to select "all"): (WorkGroup = @argWorkGroup OR @argWorkGroup IS NULL)
Any ideas? Should I be doing this in SQL statements, or should I be looking at using Analysis Services?
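As a sketch of how the optional-parameter trick mentioned above extends to a whole hierarchy, here is the pattern inside a stored procedure; the procedure, fact table, and column names are all invented for illustration, and NULL for a parameter means "all".

CREATE PROCEDURE dbo.usp_GetStatistics
    @Country     varchar(50) = NULL,
    @WorkGroup   varchar(50) = NULL,
    @ProjectTeam varchar(50) = NULL
AS
BEGIN
    SELECT f.*
    FROM dbo.FactStatistics f                         -- placeholder fact table
    WHERE (@Country     IS NULL OR f.Country     = @Country)
      AND (@WorkGroup   IS NULL OR f.WorkGroup   = @WorkGroup)
      AND (@ProjectTeam IS NULL OR f.ProjectTeam = @ProjectTeam);
END

The second hierarchy could be another set of optional parameters on the same procedure (or a separate procedure per hierarchy), which may be simpler than moving to Analysis Services unless the slicing gets much richer.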
Hi, I'm creating a dynamic SQL statement in MS SQL Server that is similar to this: EXEC('IF @' + @current_column + ' (SELECT ' + + @current_column etc... I'm basically looping through a large list of parameters that correspond to column names. However, since SQL Server treats EXEC() as its own scope, when it gets to what "@' + @current_column" evaluates to, it says the parameter must be declared. Is there a way to convert "@' + @current_column" into the actual value of the parameter?
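The usual way around the scope problem is sp_executesql: the column name still has to be spliced into the text (ideally validated or wrapped in QUOTENAME), but the value can travel as a real parameter instead of a variable the inner batch can't see. A small sketch with invented names:

DECLARE @current_column sysname        = N'SomeColumn';   -- placeholder column name
DECLARE @current_value  nvarchar(100)  = N'SomeValue';    -- placeholder value to compare against
DECLARE @sql            nvarchar(max);

SET @sql = N'SELECT * FROM dbo.SomeTable WHERE '
         + QUOTENAME(@current_column) + N' = @p;';

EXEC sp_executesql @sql, N'@p nvarchar(100)', @p = @current_value;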
We recently moved from v6.5 to v7.0. Now I have the databases and logs set to "autogrow". How can I monitor the disk space to ensure I do not run out of room (or is that preset as to how large it can grow?)? I can't find anything in Books Online. Do I do this through the NT admin tool or through the SQL Server Enterprise Manager, and more importantly, how??? Thanks so much for any help... Nancy
Hi, I am evaluating Lumigent's Entegra for doing security and business audit of some of the critical database(s) in the company I work for. I would like to know what has been your experience in using this product for doing similar audits in your company, if you have also done such audits. Thanks, Sanjeev
Our business partners have requested that I read the field names from a SQL table dynamically, so that my SSIS package will not be impacted if the Web Service hosts make a change.
Is there a way to evaluate a string at runtime that contains a field name?
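The field names themselves are easy to pick up at run time from the catalog views; an Execute SQL Task can load a result like the one below into a package variable or object. The table name is a placeholder. Bear in mind the data flow's own column metadata is still fixed at design time, so this helps with building SQL or validating the feed rather than reshaping a data flow on the fly.

SELECT COLUMN_NAME, ORDINAL_POSITION, DATA_TYPE
FROM   INFORMATION_SCHEMA.COLUMNS
WHERE  TABLE_SCHEMA = 'dbo'
  AND  TABLE_NAME   = 'WebServiceStaging'   -- placeholder table name
ORDER BY ORDINAL_POSITION;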
I have been looking all over for some info about other people having this problem, but haven't found anything.
I have a package that needs to download a dated file from an FTP site. I am using a couple of script objects to set variables, and one of them is the filename based on the date. I use an expression to get the date:
Everything works really well when I am debugging it locally. However, once it is on the server, or even once I come back to it in a day or two, I am still seeing the old date. I thought it might be because my variable needed EvaluateAsExpression = True; however, once I did this it hung and prevented me from debugging, and I had to end BI Dev Studio. I'm not sure if it's because it is being evaluated in two places (as a global and then in a script), but when I took it out of my script it hung again. Strangely, in order to get it to work when I am debugging it locally, I have to go to each process and evaluate the expressions in there; then it seems to work. Thanks!
Hi all - I'm new to expressions, and am trying to write one where I can evaluate whether a field is null, and handle it by inserting a default value if a null value is found. Given this table definition:
Hello all, I currently have a pain in the butt with a subquery that needs to be evaluated, and if the result gives me a null value, I want to re-evaluate the condition and fill the column with the proper information. I'll try to explain it as best as I can:
The query retrieves information from 2 tables basically, but I do need some inner joins to different tables in order to follow the relations. I have 2 conditions that need to be met: the 'type of' (intId_Tipo) has to be different from 6, and the keycode equal to 3.
For the "general query" I can filter this just fine, but the subquery is where I get stuck: since the evaluation that I do in order to find the initial point is just subtracting 1, there are some cases where the type called intId_Tipo is actually 6 and the subquery returns a null value. In this particular case I want to subtract 2; if not, I just want to subtract 1. I added a SQL CASE in order to evaluate the null value, but it is not evaluating properly. Why? I'm not sure; I need your help and recommendations, guys.
Here's the query that I currently have (the comments were made in order to take the next screenshot):
SELECT
    --h.dt_Lectura,
    --h.int_Velocidad,
    rp.int_VelMaxima,
    (SELECT intpunto
     FROM tblRecorrido_Puntos
     WHERE intPunto = CASE WHEN (intPunto) IS NULL THEN rp.intPunto - 2 ELSE rp.intPunto - 1 END
       AND rp.intId_Ruta_Ramal = intId_Ruta_Ramal
       AND intId_Tipo = rp.intId_Tipo) AS PuntoInicial,
    rp.intPunto AS PuntoFinal
    --rp.strDescripcion_Punto as DescripcionPuntoFinal,
    --1 as Contador
    --a.idRuta_Guid, a.idRamal_Guid, a.idUnidad_Guid, a.idOperador_Guid
FROM tblHistorico h
INNER JOIN tblAsignaciones a ON a.id = h.intIdAsignaciones
INNER JOIN tblRutaRamal_Generado rrg ON rrg.id_Ruta = a.intRuta_Asignada
INNER JOIN tblRecorrido_Puntos rp ON (rp.intId_Ruta_Ramal = rrg.intId_Ruta_Ramal AND rp.intPunto = h.int_PtoDest)
WHERE ( h.int_ClaveTDE =3 and rp.intId_Tipo <> 6 )
ORDER BY rp.intPunto ASC
Here's the screenshot of the query result: http://img218.imageshack.us/my.php?image=consultahd0.jpg
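One hedged way to express "use point minus 1, and fall back to minus 2 when that lookup finds nothing" is COALESCE over two scalar subqueries, instead of a CASE inside the subquery's own WHERE, which can't tell whether the subquery as a whole came back NULL. This is only a sketch built from the query above; I'm guessing the fallback lookup should keep the same ramal and tipo filters.

SELECT rp.int_VelMaxima,
       COALESCE(
           (SELECT p1.intPunto
            FROM tblRecorrido_Puntos p1
            WHERE p1.intPunto         = rp.intPunto - 1
              AND p1.intId_Ruta_Ramal = rp.intId_Ruta_Ramal
              AND p1.intId_Tipo       = rp.intId_Tipo),
           (SELECT p2.intPunto
            FROM tblRecorrido_Puntos p2
            WHERE p2.intPunto         = rp.intPunto - 2
              AND p2.intId_Ruta_Ramal = rp.intId_Ruta_Ramal
              AND p2.intId_Tipo       = rp.intId_Tipo)
       ) AS PuntoInicial,
       rp.intPunto AS PuntoFinal
FROM tblHistorico h
INNER JOIN tblAsignaciones a         ON a.id = h.intIdAsignaciones
INNER JOIN tblRutaRamal_Generado rrg ON rrg.id_Ruta = a.intRuta_Asignada
INNER JOIN tblRecorrido_Puntos rp    ON rp.intId_Ruta_Ramal = rrg.intId_Ruta_Ramal
                                    AND rp.intPunto = h.int_PtoDest
WHERE h.int_ClaveTDE = 3
  AND rp.intId_Tipo <> 6
ORDER BY rp.intPunto ASC;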
I want to do some error checking on the parameters found in a SqlDataSource before I run the insert. The problem is these are ControlParameters and I want to do this dynamically, so I can't just call the Control.Text property and grab its value. So how can I get access to what the ControlParameter evaluates to? Secondly, is there a way to access what the update parameters evaluate to in order to check them before they're inserted - if so, how do I get access to these? Here's an example of one of the data sources I'm using:
<asp:SqlDataSource ID="sqlContact" runat="server" ConnectionString="<%$ ConnectionStrings:strConn %>"
    SelectCommand="SELECT [ContactID], [FirstName], [LastName], [Email], [Address], [Phone], [Grade], [Contacted], [ListServe] FROM [Contact]"
    UpdateCommand="UPDATE Contact SET FirstName = @FirstName, LastName = @LastName, Email = @Email, Address = @Address, Phone = @Phone, Grade = @Grade WHERE ContactID = @ContactID"
    DeleteCommand="DELETE FROM [Contact] WHERE ContactID = @ContactID"
    InsertCommand="INSERT INTO [Contact] ([FirstName],[LastName],[Email],[Address],[Phone],[Grade],[Contacted],[ListServe]) VALUES (@FirstName,@LastName,@Email,@Address,@Phone,@Grade,0,0)">
    <UpdateParameters>
        <asp:Parameter Name="FirstName" Type="String" />
        <asp:Parameter Name="LastName" Type="String" />
        <asp:Parameter Name="Email" Type="String" />
        <asp:Parameter Name="Address" Type="String" />
        <asp:Parameter Name="Phone" Type="String" />
        <asp:Parameter Name="Grade" Type="String" />
    </UpdateParameters>
    <InsertParameters>
        <asp:ControlParameter Name="FirstName" Type="String" ControlID="txtContactFirst" PropertyName="Text" />
        <asp:ControlParameter Name="LastName" Type="String" ControlID="txtContactLast" PropertyName="Text" />
        <asp:ControlParameter Name="Email" Type="String" ControlID="txtContactEmail" PropertyName="Text" />
        <asp:ControlParameter Name="Address" Type="String" ControlID="txtContactAddress" PropertyName="Text" />
        <asp:ControlParameter Name="Phone" Type="String" ControlID="txtContactPhone" PropertyName="Text" />
        <asp:ControlParameter Name="Grade" Type="String" ControlID="ddlContactGrade" PropertyName="SelectedValue" />
    </InsertParameters>
</asp:SqlDataSource>
An event is fired when I click Add on a button, which looks similar to this:
//Add btnAddContact_Click
protected void btnAddContact_Click(object sender, EventArgs e)
{
    InsertRow(sqlContact);
}
The InsertRow() function is then what I'm using to evaluate the values... So how can I get the values those ControlParameters actually hold in order to evaluate them before I actually insert? Or is there a better way to do it?
I have two int fields in my database, CEOAnnualBonus and CEOBonus, and I want to return the value of whichever one has the larger value as CEOBonusCombined. I thought using COALESCE would do the trick, like below, but there are many cases where either CEOAnnualBonus or CEOBonus has a zero value instead of NULL, and then it doesn't work.
SELECT COALESCE(CEOAnnualBonus, CEOBonus) AS CEOBonusCombined FROM tbenchmarktemp WHERE Ticker='F'
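COALESCE only skips NULLs, so a zero in the first column wins even when the second column is larger. If the intent is "take the larger of the two, treating NULL as 0" (my assumption), a CASE comparison does it on any version:

SELECT CASE WHEN ISNULL(CEOAnnualBonus, 0) >= ISNULL(CEOBonus, 0)
            THEN ISNULL(CEOAnnualBonus, 0)
            ELSE ISNULL(CEOBonus, 0)
       END AS CEOBonusCombined
FROM tbenchmarktemp
WHERE Ticker = 'F';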
I have a scenario that reminds me of a pivot table and I am wondering if there is a way to handle this in SQL.
I have four tables. Product Line, Item, Property, and Value.
A Product Line has many Items, an Item can have many Properties, and a Property can have many Values.
I want to select a Product Line and show all the Items with the Properties as column headers and the Values as the data. The thing I am having trouble with is that the Properties for an Item are variable, from a few to a whole bunch.
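This is essentially a dynamic pivot over the Property/Value pairs. The sketch below builds the column list at run time and pivots on it; all table and column names beyond the four table names are guesses at the model described above, and MAX() only keeps one value per cell, so if a property really has several values for one item they would need to be concatenated first.

DECLARE @cols nvarchar(max), @sql nvarchar(max);

-- build the [Prop1],[Prop2],... column list from the Property table
SELECT @cols = STUFF((SELECT DISTINCT ',' + QUOTENAME(p.PropertyName)   -- column name is a guess
                      FROM dbo.Property p
                      FOR XML PATH(''), TYPE).value('.', 'nvarchar(max)'), 1, 1, '');

SET @sql = N'
SELECT ItemName, ' + @cols + N'
FROM (
    SELECT i.ItemName, p.PropertyName, v.Value
    FROM dbo.Item i
    JOIN dbo.Property p ON p.ItemID = i.ItemID            -- join columns are guesses
    JOIN dbo.[Value] v  ON v.PropertyID = p.PropertyID
    WHERE i.ProductLineID = @ProductLineID
) src
PIVOT (MAX([Value]) FOR PropertyName IN (' + @cols + N')) pvt;';

EXEC sp_executesql @sql, N'@ProductLineID int', @ProductLineID = 1;    -- placeholder id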
How can I declare multiple derived columns in the SSIS Derived Column task in one go, as I have around 150 columns coming from a flat file? I have created the required expressions in Excel and now I want to add them to the Derived Column task, but it only allows one expression at a time.