I need to write a stored proc in SQL Server 2005 where I have to count the total work orders based on their wo_type + wo_status, i.e. open, compltd, incomplete. For example I may have wo_types AB, AC, AD for orders that are in open status. I need to count the total AB_Type, AC_Type, AD_Type, then the total of all 3 in another column (as Total Open). I must do this for all wo_status values.
I also have to allow the user to enter the date they want, e.g. if they want to know how many orders were opened on 3/17/08 only. I know I have to use a parameter for this, but how do I do that?
Should I do separate select statements for each status, or multiple tables?
Also, for the 'open' wo_status the wo_status_code can change from O to S but it is still considered 'open'. For this wo_status, how can I get the most recent status_code for the open order? I want to count only a particular open work order's most recent status_code in the work order count for a given wo_type, so the order is counted only once on any given date entered. If the status changed from O to S on the same day at 1pm and 2pm respectively, then only the 2pm status should be counted. Thanks in advance for your assistance.
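For what it's worth, here is a rough sketch of the shape I would expect such a proc to take. The table and column names (work_order_status_history, wo_nbr, status_date) are guesses from the description, not your real schema; the correlated subquery keeps only the latest status row per order for the day passed in as the parameter:

CREATE PROCEDURE dbo.usp_CountWorkOrders
    @ReportDate datetime
AS
BEGIN
    SET NOCOUNT ON

    SELECT w.wo_status,
           w.wo_type,
           COUNT(*) AS wo_count
    FROM work_order_status_history w
    WHERE w.status_date >= @ReportDate
      AND w.status_date <  DATEADD(dd, 1, @ReportDate)
      -- keep only the latest status row per work order for that day,
      -- so an order that went O -> S is counted once, under its 2pm status
      AND w.status_date = (SELECT MAX(h.status_date)
                           FROM work_order_status_history h
                           WHERE h.wo_nbr = w.wo_nbr
                             AND h.status_date >= @ReportDate
                             AND h.status_date <  DATEADD(dd, 1, @ReportDate))
    GROUP BY w.wo_status, w.wo_type WITH ROLLUP   -- ROLLUP rows give "Total Open", etc., plus a grand total
END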
I need the database to give my users an indication that a renewal has been completed. Basically what happens is that every year, once a month, a report is generated from SQL of how many employees need their gaming license renewed; the filter is based on a field called final suit. I need to find a way to let them know, through the database, that an employee has been renewed. Does anyone have any ideas?
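One low-tech option, assuming the employee table is something you can alter, is a renewal-completed date that the monthly report filters on. Table and column names here are made up for illustration:

ALTER TABLE Employee ADD LicenseRenewedDate datetime NULL

-- mark an employee as renewed
UPDATE Employee
SET LicenseRenewedDate = GETDATE()
WHERE EmployeeID = 1234          -- whichever employee was just renewed

-- monthly report: employees still due in the current cycle
SELECT EmployeeID, FinalSuit
FROM Employee
WHERE LicenseRenewedDate IS NULL
   OR LicenseRenewedDate < DATEADD(yy, -1, GETDATE())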
Hi all, I am fishing for ideas on the following scenario and hope someone can point me in the best direction.
I create a new table and insert data from other relevant tables, but I want to set the data in the new table columns to fixed widths, with leading zeros where applicable. E.g. the new table column is varchar(10); the data going into the column comes from another table where the column was varchar(8) but the data was only 2 characters long, so I want to pad it out to the full new varchar(10) with leading zeros, if that makes sense.
I am really just trying to get some ideas on the best possible way to do this, thanks.
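If the padding itself is the question, the usual trick on the way into the new table is REPLICATE + RIGHT; a minimal sketch with placeholder table and column names:

INSERT INTO NewTable (padded_col)
SELECT RIGHT(REPLICATE('0', 10) + RTRIM(source_col), 10)   -- pad to 10 characters with leading zeros
FROM OldTable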
I have a table with 18,000 records with beg_eff_date values going back to 2005. I need to separate the entries based on their daily activity. For example, if beg_eff_date is 01/01/2005 then the day is "1"; if beg_eff_date is 01/27/2005 then the day is "27", repeating the same process until I reach the present time. Any ideas on how I can do this?
Thank You for all your previous help and the current one!!
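If beg_eff_date is stored as a datetime, the day number comes straight from DAY()/DATEPART(); a quick sketch (table name assumed):

SELECT beg_eff_date,
       DAY(beg_eff_date) AS day_of_month   -- equivalent to DATEPART(dd, beg_eff_date)
FROM dbo.MyTable
WHERE beg_eff_date <= GETDATE()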
I created a Temp Table that holds a collection of records by date.
I need to create a calculation that gives me the total fuel used per day:
c.rcpt_nom_vol - c.rcpt_fuel - c.rcpt_act_vol = Total fuel used per day. Then I pull the result of this calculation and assign it to the specific day of the month. For example, if the calculation result is 4.25 on Feb 12, 2007, then the record is inserted into the temp table as 4.25 for day #12.
Does anyone have an idea of how to do this? Thanks, and let me know!
SELECT Distinct a.contract_nbr, a.contract_sub_type, a.contract_type_code, a.current_expirtn_date,
       b.nom_id, b.nom_rev_nbr,
       c.beg_eff_date, c.rcpt_dlvry_ind, c.rcpt_nom_vol, c.rcpt_fuel, c.rcpt_act_vol
from TIES_Gathering.dbo.contract a
Inner Join TIES_Gathering.dbo.NOm b on a.contract_nbr = b.contract_nbr
Inner Join TIES_Gathering.dbo.Nom_vol_detail c on c.Nom_id = b.Nom_id
where (a.contract_sub_type = 'INT')
  and (a.Contract_type_code = 'GTH')
  and (DATEADD(mm, DATEDIFF(mm,0,getdate()), 0) < a.current_expirtn_date)
  and (c.rcpt_dlvry_ind = 'R')
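Building on that SELECT, one way to get the per-day figure is to aggregate the calculation and group by the day, which could then feed the temp table. This is only a sketch on the same joins, assuming beg_eff_date is a datetime:

SELECT DATEADD(dd, DATEDIFF(dd, 0, c.beg_eff_date), 0)     AS fuel_date,       -- the date, time stripped
       DAY(c.beg_eff_date)                                 AS day_nbr,         -- e.g. 12 for Feb 12, 2007
       SUM(c.rcpt_nom_vol - c.rcpt_fuel - c.rcpt_act_vol)  AS total_fuel_used
from TIES_Gathering.dbo.contract a
Inner Join TIES_Gathering.dbo.NOm b on a.contract_nbr = b.contract_nbr
Inner Join TIES_Gathering.dbo.Nom_vol_detail c on c.Nom_id = b.Nom_id
where (a.contract_sub_type = 'INT')
  and (a.Contract_type_code = 'GTH')
  and (DATEADD(mm, DATEDIFF(mm,0,getdate()), 0) < a.current_expirtn_date)
  and (c.rcpt_dlvry_ind = 'R')
GROUP BY DATEADD(dd, DATEDIFF(dd, 0, c.beg_eff_date), 0), DAY(c.beg_eff_date)
ORDER BY fuel_date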
We currently have around 500 DTS 2000 packages in production.
In order to know which ETL processes have Oracle connections, FTP tasks, or whatever, we wrote a VB6 app using dtspkg.dll which loads all the properties of each DTS package into SQL Server tables. That way you can see, for a specific DTS package, how many connections, SQL tasks and so on it has.
How do we accomplish the same with SSIS? I know we could do the same thing using .NET, of course, but is there any other approach? I am a little concerned about the point when we have hundreds of them.
Maybe 2005 offers this feature out of the box, I don't know.
I am working on an SSIS project and I am not sure how to handle this situation. I have four tables on two completely separate networks, {A, B} and {C, D}, and I want to populate table D based on the information from tables A and C. The scenario: I get data from table A and run a lookup transformation to check what has changed; if the data has changed, or if there is a new entry in table A, I get the identity key from table C (they both share a common field) and add that identity key as part of the data inserted into table D.
A ---- lookup
data that has changed --- get the identity key from C ---- insert data from table A + identity key into D
I have some performance issues which I cannot explain. The database is running on a dedicated SQL Server box with 8 CPUs and 4 GB of memory. Users are connecting with a fat client.
There are hardly any users on it, but they are getting long delays. So I kicked off a Profiler trace and sp_blocker. I noticed some queries taking over 30 seconds to complete in Profiler, but when I run them in Query Analyzer they run in under a second.
I checked the output from the blocker and there was no blocking over that time period, and it was the only process actually running.
How can it be so slow? I have run out of ideas; any suggestions would be welcome.
I'm looking for ideas on how to handle a pledge reminder process. For example, a pledge is made to pay $2400 over the next two years. They will pay $100 per month, and each month a reminder will be sent. No real mystery: if there is a balance, you send a reminder. My problem is how to handle things like what if they want to pay quarterly or annually, and how to determine if a payment is really due based on when they paid last, etc. You most likely see what I mean. If anyone has done this in the past and/or has any ideas and is willing to share, I would greatly appreciate any help. Some stuff that may help you help me better:
tblClient (ClientID)
tblPledge (PledgeID, ClientID, PledegedAmt, PledgeDate, Frequency, NumberYears)
tblPledgePayments (PmtID, PledgeID, PmtAmt, PmtDate)
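Not a full solution, but the "is a payment due" check usually boils down to comparing the last payment date plus the pledge frequency against today. A sketch against the tables above, assuming Frequency holds the number of months between payments (1, 3, 12, etc.):

SELECT p.PledgeID,
       p.ClientID,
       MAX(pp.PmtDate) AS LastPmtDate,
       CASE
           WHEN MAX(pp.PmtDate) IS NULL THEN 'Due'                               -- never paid yet
           WHEN DATEADD(mm, p.Frequency, MAX(pp.PmtDate)) <= GETDATE() THEN 'Due'
           ELSE 'Not due'
       END AS PaymentStatus
FROM tblPledge p
LEFT JOIN tblPledgePayments pp ON pp.PledgeID = p.PledgeID
GROUP BY p.PledgeID, p.ClientID, p.Frequency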
I'm new to ASP.NET. Please help me with some ideas about the design of the databases and the interface of the web site. We are a public school district, and I want to create a website for families to go online and update their family and student demographic information. Basically I put the students' information, pulled from our student information system software, on the web; the parents can log in online to update it at the beginning of the school year. After that, I will create some reports and let a secretary manually apply the updates to our student information system. The demographic info includes 3 parts: 1. family street address, city, state, zip; 2. guardian 1 primary phone, second phone, email, and guardian 2 primary phone, second phone; 3. student date of birth and gender (there may be multiple students in one family). But how can I track which fields the parents changed? Should I do it in programming in the web form, or later when I create some kind of report? I only want to pull the students with the fields that were updated and give them to the secretary to update manually; I don't want to generate a full report and make them compare which fields changed and then update, as it would take them too much time. Thanks much in advance.
I've read through all the threads here which say basically the same thing: use a drop table and then a create table to clear the spreadsheet. My problem is that the process works the first time, creating the table in a new Excel file, but fails to clear Excel after that. The drop, create and data pump steps all run without failure, but the spreadsheet data is appended to. I've used the wizard as well, saved the package and ran it again, without success. Has anyone run into this one?
I have tried to make my basic audit log do more, but I haven't gotten very far.
In my basic audit log, I record this information:
table, type of change, field modified, old value, new value, db user, date/time
This audit records everything, which is great, but it cannot relate information when I go back to analyze the changes. For example, when a "directory" record is added, a user's information may be entered into several different tables, such as:
name (one table), addresses (another table), phone numbers (another table)
If I wanted to look up the changes to the addresses of a person in the directory based on the person's name, I could not do it with my existing audit log, because the addresses are in a different table than the name and there is no relating data in the audit log to tie the address changes to a person's name.
What might be a solution? I have tried a few approaches and am at a loss.
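One approach I've seen (not claiming it's the only one): add a couple of correlation columns to the audit table, the primary key value of the changed row and a "root" entity key such as the directory person's ID, and have each trigger write them. Address changes can then be joined back to the person. A sketch with made-up names (AuditLog, TableName, DirectoryName, etc. stand in for your real objects):

ALTER TABLE AuditLog ADD
    RowKey   varchar(50) NULL,   -- PK value of the row that was changed
    PersonID int NULL            -- the directory entry the changed row belongs to

-- later: all address changes for a given person, found by name
SELECT a.*
FROM AuditLog a
JOIN DirectoryName n ON n.PersonID = a.PersonID
WHERE a.TableName = 'Addresses'
  AND n.LastName = 'Smith'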
It is working, but it takes about 3-4 seconds per exec.
CREATE PROCEDURE isp_ap_calc_apt_totals
    @p_comp char(2),
    @p_vend char(6),
    @p_asofdate char(8)
as

if (@p_asofdate <= '00000000')
begin
    set @p_asofdate = '99999999'
end

delete from XAPAPTTOT
where xapt_comp = @p_comp
  and xapt_vend = @p_vend
  and xapt_asof_date = @p_asofdate

insert into XAPAPTTOT
select apph_comp, apph_vend, apph_type, apph_id, @p_asofdate,
       sum(apph_paymnts), sum(apph_discts), sum(apph_adjts), count(apph_paymnts),
       sum(apph_paymnts) + sum(apph_discts) + sum(apph_adjts) + b.apt_gross, 0,
       max(str(yy,4) + replace(str(mm,2),' ','0') + replace(str(dd,2),' ','0'))
from APPHISTF a
join APTRANF b
  on b.apt_comp = a.apph_comp
 and b.apt_vend = a.apph_vend
 and b.apt_type = a.apph_type
 and b.apt_id = a.apph_id
where ((a.apph_comp = @p_comp) and (a.apph_vend = @p_vend) and (a.apph_unpost_dt = 0)
       and (str(a.yy,4) + replace(str(a.mm,2),' ','0') + replace(str(a.dd,2),' ','0') <= @p_asofdate))
   or ((a.apph_unpost_dt > 0 and a.apph_unpost_dt <= @p_asofdate and b.apt_unposted_fg = 1
       and b.apt_comp = @p_comp and b.apt_vend = @p_vend and b.apt_type = a.apph_type and b.apt_id = a.apph_id))
   or (((str(a.yy,4) + replace(str(a.mm,2),' ','0') + replace(str(a.dd,2),' ','0') <= @p_asofdate)
       and a.apph_unpost_dt > @p_asofdate and b.apt_comp = @p_comp and b.apt_vend = @p_vend
       and b.apt_type = a.apph_type and b.apt_id = a.apph_id))
group by apph_comp, apph_vend, apph_type, apph_id

update XAPAPTTOT
set xapt_last_payck = (select max(apph_payck)
                       from APPHISTF
                       where apph_comp = xapt_comp
                         and apph_vend = xapt_vend
                         and apph_type = xapt_type
                         and apph_id = xapt_id
                         and str(yy,4) + replace(str(mm,2),' ','0') + replace(str(dd,2),' ','0') = xapt_last_paydt)
where xapt_comp = @p_comp
  and xapt_vend = @p_vend
  and xapt_asof_date = @p_asofdate
GO
To Whom It Concerns - MS Product Design - SQL Server Reporting Services:
I've been working with an ASP.NET web based project utilizing SQL Server Reporting Services for a while and have some things that I think need to be addressed in the next version:
1) I finally got security working utilizing the form based authentication of my Web based app and passing that to SSRS through the ReportViewer control. Not that it was easy. From what I see, a lot of people are having problems with this and so I didn't feel like I was alone. That said, instructions on how to do it need to be easier. Ultimately, it would be better if on the ReportViewer control, you could add properties as follows:
* Impersonate SSRS User
* Impersonation Password
2) Maybe I'm missing something, but in the web app the system logs in to different production databases based on the user id. The only way that I can figure out how to make the same reports available for each database (all on the same SQL Server) is to copy the reports and change the data source in the copied directory to point to the new DB in SQL Server. Could ReportViewer contain a property that allows you to override the connection string?
3) You have the ability to create PDFs. This is a very nice feature. However, I would like to extend the capabilities that you have. I want to be able to embed a PDF inside the report and have it generated in-line with the report. As an example, I have metadata concerning the PDF kept in the database. When the report prints, I want the metadata printed and the full PDF referenced by the metadata to print. I actually posted a message concerning this, only to find that there is no solution; I've seen that many people have looked at the message, indicating that they are interested in the same feature.
4) It would be nice to have additional controls available: panel, calendar, checkbox, checkboxlist, radiobutton, radiobuttonlist and hyperlink in particular.
5) Dashboard widgets that we can use. You're promoting SSRS as a way to build data decision front ends. It would be nice if, out of the box, SSRS included several widgets that we could use as indicators of metrics being tracked by the system. As it stands, I can do simple bar, line and pie charts and that is about it. No progress bars, no gauges. It would be nice to have some sexier graphing components.
All in all, I do like the product. It was a pain to get set up due to the authentication issues, but I do like it.
David L. Collison
Any day above ground is a good day!
I need to pull data from multiple tables in a DB2 system via ODBC and update or insert as needed into tables in a SQL 2000 DB.
Step 1. The data from the initial parent table will need to be limited to being a set number of days old, which I have in place and working.
Step 2. The next table's data needs to be limited by the data retrieved in step 1 (I'd like to use the parent table retrieved in step 1, which is in SQL now, rather than doing it on the DB2 side).
Step 3. The rows returned here need to be limited to key values returned from step 2.
Additional steps apply, but nearly all will be limited to the results of parent tables from the prior step.
What is the best approach to this? I really want to pull table A to SQL, and limit the next child set from Table A, that was pulled to SQL in the prior step.
I also need to do updates rather than dropping and re-creating the needed tables each time: insert if no key exists, etc., etc.
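For the update-or-insert part on SQL 2000 (no MERGE there), the usual pattern is an UPDATE joined to the staging table followed by an INSERT of the rows that don't exist yet. A sketch with placeholder names:

-- update rows that already exist
UPDATE t
SET    t.Col1 = s.Col1,
       t.Col2 = s.Col2
FROM   TargetTable t
JOIN   StagingTable s ON s.KeyCol = t.KeyCol

-- insert rows that are new
INSERT INTO TargetTable (KeyCol, Col1, Col2)
SELECT s.KeyCol, s.Col1, s.Col2
FROM   StagingTable s
WHERE  NOT EXISTS (SELECT 1 FROM TargetTable t WHERE t.KeyCol = s.KeyCol)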
I've been working on this project, and had it working in MySQL, but it was badly done and couldn't last more than a few hours without growing so large that everything slowed way down. I don't expect anyone to tell me exactly what to do, just please provide an outline of what the best way to approach this in SQL Server 2005 is.
To simplify it, I have one table "Items" and another table "ItemPrices". Items has an id and a name. Each row in ItemPrices has an id for the item, a price and two datestamps (added, last updated).
On average, there's about 15,000 active items, 50% of them have new prices every couple minutes, so I'm looking at what seems like a ton of data being constantly imported. There's probably a good way to do this but I only know the bad way :)
So.. what I want to be able to do is have maybe a stored procedure (?) that takes the item name and price as parameters. (In MySQL I was using "INSERT ... ON DUPLICATE KEY UPDATE".)
A. If it's a new item name, it will add a row to the Items table and a row to ItemPrices.
B. If it's an existing item with the same price as the current price (the most recent price for that item in ItemPrices), it will update the "last updated" date field.
C. If it's a new price, it will insert a row into ItemPrices for that item.
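A rough stored procedure sketch of those three cases, assuming Items has an identity id plus name, and ItemPrices has itemid, price, added, lastupdated (adjust to your real schema):

CREATE PROCEDURE dbo.usp_UpsertItemPrice
    @ItemName varchar(100),
    @Price    money
AS
BEGIN
    SET NOCOUNT ON
    DECLARE @ItemId int

    SELECT @ItemId = id FROM Items WHERE name = @ItemName

    -- A: brand new item
    IF @ItemId IS NULL
    BEGIN
        INSERT INTO Items (name) VALUES (@ItemName)
        SET @ItemId = SCOPE_IDENTITY()
        INSERT INTO ItemPrices (itemid, price, added, lastupdated)
        VALUES (@ItemId, @Price, GETDATE(), GETDATE())
        RETURN
    END

    -- B: same price as the current (latest) row -> just touch lastupdated
    IF EXISTS (SELECT 1
               FROM ItemPrices p
               WHERE p.itemid = @ItemId
                 AND p.price = @Price
                 AND p.lastupdated = (SELECT MAX(lastupdated) FROM ItemPrices WHERE itemid = @ItemId))
        UPDATE ItemPrices
        SET lastupdated = GETDATE()
        WHERE itemid = @ItemId
          AND lastupdated = (SELECT MAX(lastupdated) FROM ItemPrices WHERE itemid = @ItemId)
    ELSE
        -- C: new price -> add a new history row
        INSERT INTO ItemPrices (itemid, price, added, lastupdated)
        VALUES (@ItemId, @Price, GETDATE(), GETDATE())
END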
Also, I want historical pricing data, but if I ever release this, 95% of the users will just be looking at current prices. I need the current prices to be very fast to query. In my MySQL version I was using something like "SELECT ... join on lastupdated = (SELECT Max(lastupdated) FROM ItemPrices ...". After I had 300k price updates, querying a list of items took around 15 seconds. There's got to be a better way? What should I do to make this faster?
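For the current-price reads, one common option is to keep a current price column (or current price row id) on Items itself, maintained by the same proc, so the hot query never touches the history table. Alternatively, the MAX(lastupdated) style query stays reasonable if ItemPrices is indexed on (itemid, lastupdated). A sketch with the same assumed names as above:

-- index that supports the "latest price per item" lookup
CREATE INDEX IX_ItemPrices_Item_Updated ON ItemPrices (itemid, lastupdated)

-- current price via the history table
SELECT i.id, i.name, p.price
FROM Items i
JOIN ItemPrices p ON p.itemid = i.id
WHERE p.lastupdated = (SELECT MAX(p2.lastupdated)
                       FROM ItemPrices p2
                       WHERE p2.itemid = i.id)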
Does this make any sense? Hopefully someone can lead me in the right direction. Thank you very much!
I use MDX on a cube which provides data about animal population.
The cube contains the keyfigure "ANIMALS" that takes the number of animals.
The cube has a dimension "VERSION" which is used to identify the keyfigure as a target or an actual value (possible values: "actual" and "target"). The cube has another dimension "ZONE" for the population zones. Possible values for zones: "A", "B", "C" and "D".
Now I want to create an MDX statement, that gives me a result row like this:
Actual number of animals (as sum of all 4 zones) in column no. 1, Target number of animals (as sum of all 4 zones) in column no. 2, Achieved percentage (as actual number / target number * 100) in column no. 3.
Until here my statement works and it looks like this:
WITH MEMBER [VERSION].[achieved] AS '[VERSION].[actual] / [VERSION].[target] * 100'
SELECT {[VERSION].[actual], [VERSION].[target], [VERSION].[achieved]} ON COLUMNS
FROM [$MYCUBE]
WHERE ([Measures].[ANIMALS])
It is quite possible that the achieved value for all zones together is equal to or greater than 100%, while single zones have an achieved value of less than 100%.
To account for this, I would like column no. 4 to display one of these words: "ok" if none of the single zones has an achieved value smaller than 100%, "warning" if any of the single zones has an achieved value between 96 and 99%, "alert" if any of the single zones has an achieved value smaller than 95%.
That means I want, e.g., the word "warning" if the lowest achieved value of the 4 zones is between 96 and 99, and "alert" if the lowest value is smaller than 95.
I am quite new to MDX and I have struggled with this for quite a long time. I would be grateful for a hint on how I have to modify / enhance my MDX statement.
I have what I feel like is a simple package I am working to create. I am teaching myself SSIS as I go along.
The source server is SQL 2000 and its database allows NULL values in columns.
The destination server is also SQL 2000, but that database requires a value in each column.
So I do a basic source select of what I want. I then need to read the values and, if a value is null, insert a space, do some column matching, and insert the rows into the destination server.
I believe I should use a Derived Column and an ISNULL expression to accomplish what I want.
Maybe there is a better way. Suggestions and comments are appreciated.
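A Derived Column with an ISNULL test per column does work for this. For reference, the same substitution can also be done in the source query itself, which can be simpler when many columns are involved (placeholder names):

-- NULLs replaced with a space before the data flow ever sees them
SELECT ISNULL(Col1, ' ') AS Col1,
       ISNULL(Col2, ' ') AS Col2,
       ISNULL(Col3, ' ') AS Col3
FROM dbo.SourceTable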
I've been banging my head for a while now, and it is sore! :-P
I'm a best practice/Microsoft approach type of person and want to make sure I do things correctly.
I have a database, kind of like a forum.
Obviously executing multiple queries in one "batch" (stored proc) would have an impact on the performance.
Now, I would like to give a more detailed/specific error back to the caller (either by aid of error code or whatever) with such situations like...
"EditReply"
Edit reply takes the threadID, replyID and userID.
Before actually committing the changes, it needs to check:
1) does the user exist in the database? (during the editing of the reply, perhaps the user may have been deleted before running the stored proc, who knows)
2) does the thread exist?
3) does the reply exist?
If the conditions are met, only then will it go ahead and update the database. Now that is 3 queries, and 4 statements overall, to make a change to a field/table.
Obviously if one of the checks returns false, in other words if say "does the thread exist" returns 0 (the thread doesn't exist), it will return an error code to the caller, which they will handle in their application. That's all fine, but the question is:
Am I doing this correctly? (no) - how can I improve this? What do I need to think about?
Of course I would like to give a more detailed error back to the caller (aid of errorcode designed in the application overall) instead of just "no, databases not updated".
In this situation, am I wrongly assuming that the database designers use this type of approach?
Please help, I value your feedback and suggestions. I want to improve and think of the right lines of doing these things.
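For illustration, a minimal sketch of the kind of proc described above, with one EXISTS check per rule and an integer return code the caller can map to a message. Table and column names are invented, not a recommendation of a specific schema:

CREATE PROCEDURE dbo.usp_EditReply
    @ThreadID int,
    @ReplyID  int,
    @UserID   int,
    @NewBody  nvarchar(max)
AS
BEGIN
    SET NOCOUNT ON

    IF NOT EXISTS (SELECT 1 FROM Users   WHERE UserID   = @UserID)   RETURN 1  -- user no longer exists
    IF NOT EXISTS (SELECT 1 FROM Threads WHERE ThreadID = @ThreadID) RETURN 2  -- thread no longer exists
    IF NOT EXISTS (SELECT 1 FROM Replies WHERE ReplyID  = @ReplyID
                                           AND ThreadID = @ThreadID) RETURN 3  -- reply no longer exists

    UPDATE Replies
    SET Body = @NewBody, EditedByUserID = @UserID, EditedDate = GETDATE()
    WHERE ReplyID = @ReplyID

    RETURN 0  -- success
END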
Hi, I want to write a sproc whose main table is Order, and it inner joins another table called OrderShipTo with a one-to-many relationship. The first query I have written is as follows:

SELECT o.OrderId, o.OrderDate, o.PlanId, o.CreatedByUserId, o.Quantity, o.BillToCompany, o.BillToContact, o.BillToAddress, o.BillToCity, o.BillToState, o.BillToZip, o.BillToEmail, o.BillToPhone, o.ShipToIsBillTo, o.BookTypeId, o.RequiredDeliveryDate, o.SpecialInstructions, o.ApprovedBy, o.ApprovedByPhone, o.ApprovedByEmail,
       os.ShipToId, os.ShipTypeId, os.ShipToCompany, os.ShipToContact, os.ShipToAddress, os.ShipToAddress1, os.ShipToCity, os.ShipToState, os.ShipToZip, os.ShipToEmail, os.ShipToPhone, os.Quantity
FROM [Order] o
Inner Join OrderShipTo os on o.OrderId = os.OrderId
WHERE o.OrderId = @OrderId

So suppose I give an OrderId of 15. In the OrderShipTo table there may be 3, 2, or no entries for this order. If there are 3 entries in OrderShipTo I will get 3 rows of data, which is correct... But I just want the o. columns to appear once, and I want n rows of the os. columns. So in order to achieve this I wrote this query:

DECLARE @OrderId int
SET @OrderId = 311

SELECT o.OrderId, o.OrderDate, o.PlanId, o.CreatedByUserId, o.Quantity, o.BillToCompany, o.BillToContact, o.BillToAddress, o.BillToCity, o.BillToState, o.BillToZip, o.BillToEmail, o.BillToPhone, o.ShipToIsBillTo, o.BookTypeId, o.RequiredDeliveryDate, o.SpecialInstructions, o.ApprovedBy, o.ApprovedByPhone, o.ApprovedByEmail
FROM [Order] o
WHERE o.OrderId = @OrderId

EXEC usp_OrderShipToLoadByOrderId @OrderId
But at the end I want all the rows: the o. columns plus the rows from the os. columns. I also tried a UNION of both the tables, but that didn't work because of a type mismatch. How can I achieve that??? Regards, Karen
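One way to get that shape without a UNION is to return two result sets from the sproc, the order header once and then its ship-to rows, and let the caller read both. This is essentially what the two statements above already do if they live in the same proc; a sketch (proc name invented, column lists abbreviated):

CREATE PROCEDURE dbo.usp_OrderLoadWithShipTo
    @OrderId int
AS
BEGIN
    SET NOCOUNT ON

    -- result set 1: the order header (one row)
    SELECT o.OrderId, o.OrderDate, o.PlanId, o.Quantity      -- plus the other o. columns
    FROM [Order] o
    WHERE o.OrderId = @OrderId

    -- result set 2: zero or more ship-to rows
    SELECT os.ShipToId, os.ShipTypeId, os.ShipToCompany, os.Quantity   -- plus the other os. columns
    FROM OrderShipTo os
    WHERE os.OrderId = @OrderId
END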
Up until recently one of our SQL 7 databases has been running quite happily. However, over the last 2 weeks the users have started to complain about performance levels; operations that would normally take 2-3 seconds now run for about 40 seconds. I have rebuilt all indexes and even ran UPDATE STATISTICS, although both 'Auto create statistics' and 'Auto update statistics' are turned on. This failed to help, so I ended up stopping and starting SQL Server, but still no luck.
I have run DBCC SHOWCONTIG and the fragmentation is well within acceptable levels; scan density is at 94%. The only other thing I can think of is to reboot the NT server, as it has been up for 67 days now. Can anyone else think of anything I might have overlooked?
I have about 20 remote databases that I need a query/scheduled task to run that generates a .rpt file. That part I can get.
However, I need to get this file to another company (via email or ftp, etc) and make it automated.
For example:
A scheduled task runs twice a month and generates a file that then needs to be emailed to an individual or FTP'd to an FTP site. Is there a way to do this in SQL 7.0 (or 6.5)?
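On SQL 7.0, if SQL Mail is set up, xp_sendmail can attach a file and can be called from a scheduled job step; a sketch (recipient and path are placeholders):

EXEC master.dbo.xp_sendmail
    @recipients  = 'someone@othercompany.com',
    @subject     = 'Monthly report',
    @message     = 'Report attached.',
    @attachments = 'C:\Reports\monthly.rpt'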
This is a tuffie, but I think I'll learn new techniques in SQL. I wish to pull data from MS Active Directory and put it into a table. Specifically I want user information (first name, last name and so forth) and the groups that they belong to in a SQL table.
LDIFDE is a utility that can create a csv file from an AD server. This is a sample output:

dn: CN=rob camarda,OU=Corporate,OU=Geographic Locations,DC=strayer,DC=edu
changetype: add
objectClass: top
objectClass: person
objectClass: organizationalPerson
objectClass: user
cn: rob camarda
givenName: rob
memberOf: CN=Arlington Admin,OU=Campus Domain Admin,DC=strayer,DC=edu
memberOf: CN=Arlington,OU=Arlington,OU=Region2,OU=Geographic Locations,DC=strayer,DC=edu
memberOf: CN=RN Report Consumers,OU=Cognos ReportNet,DC=strayer,DC=edu
sAMAccountName: rob.camarda

dn: CN=Robert A. Camarda,OU=TechnologyGroup,DC=strayer,DC=edu
changetype: add
objectClass: top
objectClass: person
objectClass: organizationalPerson
objectClass: user
cn: Robert A. Camarda
givenName: Robert
memberOf: CN=Role Regional Director,OU=Roles,DC=strayer,DC=edu
memberOf: CN=Role Campus Director,OU=Roles,DC=strayer,DC=edu
memberOf: CN=TLSAdmin,OU=Talisma-Users,DC=strayer,DC=edu
memberOf: CN=ASPTestReports,OU=Roles,DC=strayer,DC=edu
memberOf: CN=IT Report Authors,OU=Roles,DC=strayer,DC=edu
memberOf: CN=Developers,OU=TechnologyGroup,DC=strayer,DC=edu
memberOf: CN=SQL Backup Admin,OU=TechnologyGroup,DC=strayer,DC=edu
memberOf: CN=RN Report MetaData Modelers,OU=Cognos ReportNet,DC=strayer,DC=edu
memberOf: CN=RN Corporate,OU=Corporate,OU=Region2,OU=Geographic Locations,DC=strayer,DC=edu
memberOf: CN=Arlington,OU=Arlington,OU=Region2,OU=Geographic Locations,DC=strayer,DC=edu
memberOf: CN=RN Administrator System,OU=Cognos ReportNet,DC=strayer,DC=edu
memberOf: CN=RN Administrator Server,OU=Cognos ReportNet,DC=strayer,DC=edu
memberOf: CN=RN Report Authors,OU=Cognos ReportNet,DC=strayer,DC=edu
memberOf: CN=Backup Operators,CN=Builtin,DC=strayer,DC=edu
memberOf: CN=Domain Admins,CN=Users,DC=strayer,DC=edu
memberOf: CN=Administrators,CN=Builtin,DC=strayer,DC=edu
sAMAccountName: robert.camarda

In this output, each user is separated by a blank line. sAMAccountName is the user's login ID to ADS. Lines starting with memberOf: show the path for each group the user belongs to.
My thought is to load the text data into a SQL table with the PK being the line number. This way the data will stay together. The second column would be just text, varchar(100).
I'd like to end up with a table something like USER_ID, GROUP_MEMBERSHIP, GIVENNAME. In the example of robert.camarda, there would be one record for each group the user belongs to. I think once I have this part, I can build my final table with PKs and all the good housekeeping of a SQL table.
Now the part that I have no idea how to solve: how do I convert the data from something database-unfriendly into something I can use?
1. I know I have a new user when I find dn:
2. I know I am done with the user when I get a blank (null) line.
3. I know I want to populate rows with the name and the contents once I find rows starting with memberOf:
4. It appears there is a max line length that LDIFDE will export and it then starts a new line, so it will be necessary to join lines.
I would think this is a combo of cursors, a do/while loop and other assorted magic. If someone can help me get started, I would have something to research or model from. As of now, I'm staring at a blank page and not sure how to start. Maybe someone knows of a similar problem and can share the SQL.
TIA
Rob
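To get started, here is one way it could be staged: load the file into a numbered staging table, then use a correlated subquery to tie each memberOf: line to the sAMAccountName: line that follows it in the same block (in the sample output the sAMAccountName comes last for each user). Purely a sketch; the line-wrap joining in point 4 is left out:

CREATE TABLE #ad_raw (
    line_no   int IDENTITY(1,1) PRIMARY KEY,
    line_text varchar(1000) NULL
)

-- load the LDIFDE output here (BULK INSERT with a format file, DTS, or bcp)

SELECT
    -- the next sAMAccountName line after this memberOf line is the owning user
    (SELECT TOP 1 LTRIM(SUBSTRING(r2.line_text, 16, 200))
     FROM #ad_raw r2
     WHERE r2.line_no > r.line_no
       AND r2.line_text LIKE 'sAMAccountName:%'
     ORDER BY r2.line_no) AS user_id,
    LTRIM(SUBSTRING(r.line_text, 10, 500)) AS group_membership
FROM #ad_raw r
WHERE r.line_text LIKE 'memberOf:%'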
I would like to write a user defined function that takes as an input a varchar and returns an xml. If the string is valid xml the function would just return it. If the string is not valid xml then the function would encode the value before returning it.
The function would look something like this:
Code Snippet
CREATE FUNCTION dbo.EncodeXMLIfNotValid (
@InputString varchar(max)
)
RETURNS xml
AS
BEGIN
DECLARE @xml xml
BEGIN TRY
SET @xml = CAST(@InputString as xml)
END TRY
BEGIN CATCH
SET @xml = '<![CDATA[' + @InputString + ']]>'
END CATCH
RETURN( @xml )
END
However, after writing this I found out that using BEGIN TRY and BEGIN CATCH is not allowed in a UDF.
Any thoughts or ideas how I could write a function with this behavior?
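One workaround, since TRY/CATCH is fine in a stored procedure, is to move the same logic into a proc with an OUTPUT parameter and call that instead of a function; a sketch mirroring the code above:

CREATE PROCEDURE dbo.EncodeXMLIfNotValidProc
    @InputString varchar(max),
    @Result      xml OUTPUT
AS
BEGIN
    SET NOCOUNT ON
    BEGIN TRY
        SET @Result = CAST(@InputString AS xml)
    END TRY
    BEGIN CATCH
        -- mirrors the CATCH branch above; note an input containing ']]>' would still need extra handling
        SET @Result = CAST('<![CDATA[' + @InputString + ']]>' AS xml)
    END CATCH
END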
I have a table containing a list of source table names to be transferred to the destination. I need to pass each table name in as a variable, and based on the value of the variable I need to select data from the source and insert it into the destination. The source and destination are on 2 different servers. The source tables have different metadata (different column names and data types). Any help on how to implement this would be greatly appreciated.
To be more clear, I have a LIST table which has the format below:
TABLE NAME
CUSTOMER
PRODUCT
INVENTORY
Customer, Product and Inventory are table names. So my package should first go and collect a table name from the LIST table and then transfer that table's data (e.g. Customer) from Server A to Server B. This should be repeated until all the tables listed in the LIST table are transferred.
I'm doing my best here, but need some help. I have a client that has a company list that they want searched by key word. This is exported from another program (in excel) that they want used and searched on their website.
Bad news is that each keyword is listed with the company separately. So if a company has 5 different keywords, it will be listed in the Excel file 5 times.
The info I have is Name, Address, City, State, Phone, Keywords:
So an example of the Excel file is:
Company A, 123 Main St, Mycity, Mystate, 123-123-1234, Green
Company A, 123 Main St, Mycity, Mystate, 123-123-1234, Furry
Company A, 123 Main St, Mycity, Mystate, 123-123-1234, Large
Company A, 123 Main St, Mycity, Mystate, 123-123-1234, Circular
Company B, 746 Sparrow Ave, Diffcity, Diffstate, 987-987-9876, Blue
Company B, 746 Sparrow Ave, Diffcity, Diffstate, 987-987-9876, Furry
Company B, 746 Sparrow Ave, Diffcity, Diffstate, 987-987-9876, Small
I am able to import this large (4.2 MB) file into a table called fctable
What I'm trying to do is write SQL scripts or queries that can insert into a Company table and a Keyword table.
I'm trying to do this through ASP.NET 2.0 (so the Excel file is uploaded) and have tried to write my inserts like
INSERT INTO [Company] ([Name], [address], [City], [State], [Phone]) VALUES (SELECT DISTINCT Name, Address, City, State, Phone FROM fctable )
But that doesn't seem to be working.
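If it helps, the usual form is INSERT ... SELECT without the VALUES keyword, plus a second statement for the keywords. This sketch assumes fctable has a Keyword column, that Company has an identity CompanyID, and that name + phone identify a company; the names are guesses:

-- companies, de-duplicated
INSERT INTO [Company] ([Name], [Address], [City], [State], [Phone])
SELECT DISTINCT Name, Address, City, State, Phone
FROM fctable

-- keywords, linked back to the company rows just inserted
INSERT INTO Keyword (CompanyID, Keyword)
SELECT c.CompanyID, f.Keyword
FROM fctable f
JOIN [Company] c ON c.[Name] = f.Name AND c.[Phone] = f.Phone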
This is the only way my client can get the info to me, and it will be changed probably twice per month, so I'd hate to have to try to manipulate an excel file 24 times a year to import.
Hello everyone. I am new to .NET and here is what I have to do. I need to update a SQL table with data coming from an XML file. I have seen some Microsoft documentation on this (the nice SQL statement that updates and inserts in the same stored procedure), but I don't know what the best approach is for passing my XML file to the stored procedure. The XML contains about 12,000 records, kind of phonebook info (name, email, phone). What would be the best approach to do this? What objects should I use? Thanks a million, Ben
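One option on the SQL side is to pass the whole XML document to the stored procedure as a single parameter and shred it with OPENXML, then do the update and insert from that rowset. A sketch assuming an element layout like <contacts><contact name=".." email=".." phone=".."/></contacts> and a Phonebook target table (both assumed, not from the original post):

CREATE PROCEDURE dbo.usp_LoadPhonebook
    @xmlDoc ntext
AS
BEGIN
    DECLARE @hDoc int
    EXEC sp_xml_preparedocument @hDoc OUTPUT, @xmlDoc

    -- shred the XML into a temp rowset
    SELECT name, email, phone
    INTO   #incoming
    FROM   OPENXML(@hDoc, '/contacts/contact', 1)
           WITH (name  varchar(100),
                 email varchar(100),
                 phone varchar(30))

    EXEC sp_xml_removedocument @hDoc

    -- update existing rows, then insert the new ones
    UPDATE p SET p.name = i.name, p.phone = i.phone
    FROM Phonebook p JOIN #incoming i ON i.email = p.email

    INSERT INTO Phonebook (name, email, phone)
    SELECT i.name, i.email, i.phone
    FROM #incoming i
    WHERE NOT EXISTS (SELECT 1 FROM Phonebook p WHERE p.email = i.email)
END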
Environment: NT Server 4.0 w/ SP4, SQL Server 7.0 w/ SP1, Win98 clients w/ Lotus Approach 9.5
I recently added SQL 7.0 to be a back end for my Approach front end. I transferred all the data from a dbase IV in approach to SQL. Most of the conversions worked ok. I have two big problems.
1) One particular repeat panel in Approach loses the children records of the master record. If I delete some of the records, more will appear. It's as if there is an imaginary limit of the number of records it can read in the repeat panel. I don't have this problem with any other records and children in repeat panels. I called Lotus and they don't have an answer. This is important because the children records need to be summed up so I can have a running total.
2) I originally configured the clients to use the TCP/IP Netlib with the default port. I couldn't open enough databases, so I changed to Multiprotocol. This allowed certain clients to open more databases, but others can't open additional databases. Also, after the change, the NT authentication login has had problems; I had to change to SQL logins to get all my clients back online. Sometimes the same client can't open more than 10 databases, while other times it will open 15. There is no consistent pattern to when it can and can't open the additional databases.
If anyone knows how to fix either of these problems, I would greatly appreciate any advice. I'm getting tired of my boss yelling at me.
I have a select statement where I need to test two values that are returned and perform a different calculation if they return null.
Basically, if TESTA is null and TESTB is not null return TESTB
if TESTB is null and TESTA is not null return TESTA
ELSE, if both are not null, the value TESTA + TESTB / 2 is returned.
Is this possible to do in a select statement? Thanks in advance.
See select statement below:
Select S.StudentDimKey,
       (select ISNULL(R.Score, NULL)
        from CLT_StudentAssessmentRawFact R, CLT_LessonPlanDim L
        where L.LessonPlanKey = 1
          and L.LessonPlanDimKey = R.LessonPlanDimKey
          and StudentDimKey = S.StudentDimKey) as TESTA,
       (select ISNULL(R.Score, NULL)
        from CLT_StudentAssessmentRawFact R, CLT_LessonPlanDim L
        where L.LessonPlanKey = 2
          and L.LessonPlanDimKey = R.LessonPlanDimKey
          and StudentDimKey = S.StudentDimKey) as TESTB
from CLT_StudentPlacementFact P, CLT_StudentDim S, CLT_ClassHierarchyDim H, CLT_StudentClassFact C
where P.StudentDimKey = S.StudentDimKey
  and H.ClassHierarchyDimKey = C.ClassHierarchyDimKey
  and S.StudentDimKey = C.StudentDimKey
  and S.StatusCode = 'A'
  and S.CurrentRecord = 1
  and S.SchoolID = 87577
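The usual way to express that logic in a single select is a CASE expression over the two computed columns (COALESCE covers the first two branches on its own). Here is the pattern on its own, with a tiny derived table standing in for the two subqueries above, and assuming the third branch means the average of the two:

SELECT CASE
           WHEN TESTA IS NULL AND TESTB IS NOT NULL THEN TESTB
           WHEN TESTB IS NULL AND TESTA IS NOT NULL THEN TESTA
           WHEN TESTA IS NOT NULL AND TESTB IS NOT NULL THEN (TESTA + TESTB) / 2.0
       END AS Result
FROM (SELECT CAST(80 AS decimal(9,2))   AS TESTA,
             CAST(NULL AS decimal(9,2)) AS TESTB) t   -- stand-in for the two subqueries above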
I'm practicing set-based approaches, and what I'm trying to do is grab each value from a table, scramble it and put it back in the table. I don't want the solution to this, as I'd rather figure it out myself for practice.
The thing I'm stuck on is that I can do this with a cursor, but I want to avoid cursors in future. How would I use a set-based approach to get each value of a table and work with it?