No Records In Data Set
Jan 9, 2014
I need to validate the data before it is imported into the final table, so I created a staging table. Data is imported into this staging table by SharePoint, which calls my stored procedure after the import. I have split that logic and prepared three stored procedures:
1) Update the discrepancy rows.
2) If there are discrepancies, call another SP to show a report.
3) If not, call another SP to insert into the final table.
4) Truncate the staging table.
The issue is this: if I call the validation stored procedure by itself and there are discrepancies, the report SP is called and the data is shown in SSMS. But when the same SP is called from SharePoint, there is no data in the dataset, and according to the SharePoint developer the table is truncated while it is still fetching rows.
USE Demo
GO
DROP PROCEDURE uspValidateTimeSheet
GO
SET ANSI_NULLS ON
GO
[code]...
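One hedged way to avoid the race is to make sure the truncate can never run until after the discrepancy report has been returned to the caller, for example by returning the report and exiting without truncating whenever discrepancies exist. The sketch below only illustrates that ordering; the table and column names (StagingTimeSheet, IsDiscrepancy, EmployeeId, and so on) are assumptions, not the actual schema.

-- Minimal sketch: produce the report before anything removes the staging rows.
CREATE PROCEDURE uspValidateTimeSheet_Sketch
AS
BEGIN
    SET NOCOUNT ON;

    UPDATE StagingTimeSheet
    SET IsDiscrepancy = 1
    WHERE EmployeeId IS NULL OR HoursWorked < 0;   -- example validation rules only

    IF EXISTS (SELECT 1 FROM StagingTimeSheet WHERE IsDiscrepancy = 1)
    BEGIN
        -- Return the report in this same batch and do NOT truncate yet,
        -- so the caller (SharePoint) still sees the rows.
        SELECT * FROM StagingTimeSheet WHERE IsDiscrepancy = 1;
        RETURN;
    END

    INSERT INTO FinalTimeSheet (EmployeeId, WorkDate, HoursWorked)
    SELECT EmployeeId, WorkDate, HoursWorked
    FROM StagingTimeSheet;

    TRUNCATE TABLE StagingTimeSheet;
END

With that shape, the staging table is only emptied on the clean path, and a separate cleanup call can be issued by SharePoint once it has finished reading the report.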
View 0 Replies
Aug 6, 2007
HI,
I have been trying to solve this locking problem for the past couple of days. Please help me!!
Scenario:
--------------
I have an SSIS package with 2 data flow tasks. The 1st data flow task deletes records from 5 tables, and the 2nd data flow task should insert records into 1 of the five tables after the success of the 1st data flow task. This scenario runs inside a transaction.
In this scenario the 2nd data flow task hangs at runtime. It does not complete. With the sp_who2 command I could see that there is an intent shared lock (LCK_M_IS) on the table and the status is SUSPENDED.
I don't know how to get out of this locking. Please help.
Thanks ,
Sunil
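A diagnostic sketch that may help pin down the blocker (assuming SQL Server 2005 or later, which exposes the DMVs used here): it lists every request that is being blocked together with the statement text. The usual cause in this scenario is the package-level transaction keeping the delete task's locks until the whole package commits, so having both data flows share the same connection (for example via the connection manager's RetainSameConnection option) is often the remedy.

-- Diagnostic only: show blocked sessions, who is blocking them, and what they are running.
SELECT r.session_id, r.blocking_session_id, r.wait_type, r.wait_time, t.text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;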
View 7 Replies
View Related
Aug 7, 2015
I have an MVC ASP.NET application that stores many records in a table on SQL Server, in its own system. We used the system for 2 months and it worked fine for accessing and changing data.
Now that other users are logging in, there is cross-coupling going on: one user gets the data from another user's SQL search.
In the MVC app the async get method was used to read the ID record from the DB; I set that to synchronous with no effect. Each user creates their own login ID, but that does not matter either.
View 8 Replies
View Related
Mar 20, 2014
I need help writing a query for the following: I need to collapse the continuity. If the termdate for an ID is one day less than the effdate of the next record (for the same ID), I need to collapse the records. See the example below. How should I write the query that gives the desired output, i.e. get min(effdate) and max(termdate) whenever termdate is one day less than the effdate of the next record? (A query sketch follows the expected output below.)
ID effdate termdate
556868 1999-01-01 1999-06-30
556868 1999-07-01 1999-10-31
556869 2002-10-01 2004-01-31
556872 1999-02-01 2000-08-31
556872 2000-11-01 2004-01-31
556872 2004-02-01 2004-02-29
output should be ......
ID effdate termdate
556868 1999-01-01 1999-10-31
556869 2002-10-01 2004-01-31
556872 1999-02-01 2000-08-31
556872 2000-11-01 2004-02-29
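This is a gaps-and-islands problem. A minimal sketch, assuming SQL Server 2012 or later (for LAG) and a table named dbo.Spans(ID, effdate, termdate) standing in for the real table: rows whose effdate equals the previous row's termdate plus one day are grouped into the same island, and each island is then collapsed with MIN/MAX. On older versions the same start-of-island flag can be computed with a self-join instead of LAG.

;WITH flagged AS (
    SELECT ID, effdate, termdate,
           CASE WHEN DATEADD(DAY, 1,
                     LAG(termdate) OVER (PARTITION BY ID ORDER BY effdate)) = effdate
                THEN 0 ELSE 1 END AS is_start      -- 1 = this row starts a new island
    FROM dbo.Spans
),
islands AS (
    SELECT ID, effdate, termdate,
           SUM(is_start) OVER (PARTITION BY ID ORDER BY effdate
                               ROWS UNBOUNDED PRECEDING) AS island_no
    FROM flagged
)
SELECT ID, MIN(effdate) AS effdate, MAX(termdate) AS termdate
FROM islands
GROUP BY ID, island_no
ORDER BY ID, effdate;

Run against the sample rows above, this produces exactly the expected output shown.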
View 0 Replies
View Related
Mar 23, 2008
Is there a problem with stability when one uses too many text (memo) fields? I'm having a problem with data from one record occasionally ending up in another record, though apparently not through any user interaction. I'm using SQL 7 and accessing the tables through the ODBC driver. There are two tables with a one-to-one relationship -- TableA and TableB. TableB is the one that's having this occasional problem. TableB has about 30 text fields out of a total of about 50 fields in the table. The problem is usually with one or two text fields containing data from a different record, one that was created close to when the problem record was created.
Example: TableB has fields 1-9, say. In one record, field 1 has A, field 2 has B, field 3 has C, and so on. In another record (created shortly after the other one), field 1 has AA, field 2 has BB, field 3 has CC, and so on. The user works with the records, everything's fine. Then one day the user notices that in the second record, field 1 has AA, field 2 has BB, but field 3 has C instead of CC. In other words, all the data's fine except for one, maybe two fields that have data from a previously-created record.
At first glance this seems to be a user-interaction thing: somehow the user inadvertently placed data from the older record into the newer one, either through a shortcut, or by having that data on the clipboard, or whatever. But a recent incident opposes that theory.
I have two forms in the front end for editing records (the forms are bound to the ODBC table links). Form1 is bound only to TableA (the one that doesn't have the problem); Form2 is bound to a query that is TableA joined with TableB. In the recent incident where data shifted, both the record that was affected and the record from which the data came were edited only with Form1. In other words, TableB never came into play; yet its data was somehow affected.
When a record is created, the user completes a few fields in a form, and then a stored procedure creates a record in TableA and then a sister record in TableB (using the TableA record's autonumber primary key as its primary key). A couple of user-entered values are entered into the TableB record. But if the user is using Form1, they never see the TableB record. In this case, the TableB record's two fields got changed to fields from an earlier record (one which was created a little earlier the same day), even though both records were only edited in Form1 (according to the history log), which doesn't touch TableB.
Thus, I'm wondering if there's a possibility that either the SQL database or the ODBC driver somehow shifted the data from one record into another. That seems far-fetched. But, at this point, since a table that the user didn't touch somehow had its data changed to data from a different record, I'm trying to explore all possibilities.
Thanks for any insight!
Neil
View 4 Replies
View Related
Jul 20, 2005
Hi, can the English version of SQL Server (or other DBMSs) store data/records in other languages (e.g. French, Chinese, etc.), or do you have to use the French version of SQL Server to store French data? Where can I go to read more on this?
Thanks.
Eric
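Any language edition of SQL Server can store text in other languages as long as the columns use a Unicode type (nchar/nvarchar/ntext) and string literals are prefixed with N. A minimal sketch (table and column names are just examples); the Books Online topics on collations and Unicode data cover the details:

-- Unicode columns hold French, Chinese, etc. regardless of the server's language edition.
CREATE TABLE dbo.Phrases (
    PhraseID int IDENTITY(1,1) PRIMARY KEY,
    PhraseText nvarchar(200) NOT NULL      -- nvarchar = Unicode
);

INSERT INTO dbo.Phrases (PhraseText) VALUES (N'Bonjour tout le monde');  -- French
INSERT INTO dbo.Phrases (PhraseText) VALUES (N'你好，世界');               -- Chinese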
View 1 Replies
View Related
Mar 29, 2007
I posted this question last night and thought I had an answer. I have a date field. I want to be able to filter records by the month of the date. To do this, I pass the integer of the month. But I also want to be able to return all the records if no month integer is passed. So functionally,
select * from table where (MONTH([AD ENDS]) = @month)
will return all records where [Ad Ends] is in January if @month = 1, and all records if @month is empty. The solution I got last night was to use
select * from table where (MONTH([AD ENDS]) = ISNULL(@month, MONTH([AD ENDS])))
If @month is null, all records are returned. I was focused on another aspect of the page when this was posted. It worked in the designer, so I thought I was set. This morning I realized I don't know how to pass a null variable in a query string. Since a query string is a string, it probably can't be done. Another suggestion was to change this programmatically. But is there a way to do this in the dataset? I'm using SQL Server 2005, a strongly typed dataset and the designer.
Diane
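One hedged option, if moving the query into a stored procedure (or giving the dataset query a default) is acceptable, is to declare @month with a NULL default so an omitted query-string value simply means "no filter"; 0 could equally serve as an "all months" sentinel. A minimal sketch, with the table name dbo.Ads assumed:

-- Optional month filter: omit @month (or pass NULL) to return every record.
CREATE PROCEDURE dbo.GetAdsByMonth
    @month int = NULL
AS
BEGIN
    SET NOCOUNT ON;
    SELECT *
    FROM dbo.Ads
    WHERE @month IS NULL
       OR MONTH([AD ENDS]) = @month;
END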
View 4 Replies
View Related
May 8, 2008
Hi,
I am facing a big problem with data transfer from Access to SQL. The problem is that I want to transfer about 50 lakh (5 million) records from Access to SQL. On my side, where the full SQL Server is installed, I found no problem and it transfers all the data in 5 to 7 minutes. The problem is that at the client side we do not install the full SQL Server; we install the SQL runtime that comes with InstallShield. Whenever I start to transfer data, another exe called DLLHost.exe runs and eats about 70% to 75% of the system's memory. I use the SQL OPENROWSET statement, and as soon as that statement fires on a machine where only the runtime is installed, it starts to eat system memory, so instead of transferring all the data in 5 to 10 minutes it takes more than 6 hours because DLLHost.exe eats the memory. Is there any better way to transfer these records faster?
Please help me resolve this problem.
Thanks for the help.
View 1 Replies
View Related
Dec 20, 2004
I am trying to combine 2 tables, one is an event table, the other is a system table.
I would like to update field system.transaction with the first event.transaction
Where event.account = system.account
and event.effdt > system.effdt
and event.effdt <= system.effdt + 30
It's not really a join so I'm not sure how to write it.
Thanks,
Doug
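A minimal sketch of how this could be written (SQL Server 2000 compatible), taking "the first event.transaction" to mean the earliest event inside the 30-day window; the table and column names follow the post, but treat them as assumptions:

UPDATE [system]
SET [transaction] = (
    SELECT TOP 1 e.[transaction]
    FROM [event] AS e
    WHERE e.account = [system].account
      AND e.effdt >  [system].effdt
      AND e.effdt <= DATEADD(DAY, 30, [system].effdt)
    ORDER BY e.effdt          -- "first" = earliest event in the window
)
WHERE EXISTS (
    SELECT 1 FROM [event] AS e
    WHERE e.account = [system].account
      AND e.effdt >  [system].effdt
      AND e.effdt <= DATEADD(DAY, 30, [system].effdt)
);

The WHERE EXISTS guard keeps rows with no matching event from being set to NULL.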
View 2 Replies
View Related
Sep 20, 2011
I want to update table2.message based on the criteria of table1.name. For example, all records named John will be updated with 'Msg1' in table2.message. I am using MS SQL 2000, and below is the scenario.
table1 columns
ID
Name
table2 columns
ID
Message
Select a.Id, a.name, b.message
from table1 a, table2 b
where a.id =b.id
a.id a.name b.message
1 John Msg1
2 Steve Msg2
3 Scott Msg3
4 John NULL - update b.message to 'Msg1'
5 Steve NULL - update b.message to 'Msg2'
6 Scott NULL - update b.message to 'Msg3'
7 John NULL - update b.message to 'Msg1'
8 Steve NULL - update b.message to 'Msg2'
If I update the records per name, I use the query below, pre-selecting all the existing names.
update table2 b
set b.message=(Select top 1 b.message
from table1 a, table2 b
where a.id =b.id
[Code] ...
How can I update this in bulk without pre-selecting all the names?
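A minimal sketch (SQL Server 2000 compatible): build a derived table that yields one known message per name, then join it back and fill only the NULL rows. It assumes each name maps to a single message (MAX just picks one if the data disagrees).

UPDATE b
SET b.message = m.message
FROM table2 AS b
JOIN table1 AS a ON a.id = b.id
JOIN (
    SELECT a2.name, MAX(b2.message) AS message     -- one known message per name
    FROM table1 AS a2
    JOIN table2 AS b2 ON b2.id = a2.id
    WHERE b2.message IS NOT NULL
    GROUP BY a2.name
) AS m ON m.name = a.name
WHERE b.message IS NULL;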
View 7 Replies
View Related
Jul 7, 2015
We have a requirement to aggregate the data and create LY (last year) and CY (current year) data across 5 metrics.
The volume of data will be 200-250 million records.
Server Configuration:
Memory: 32 GB
Processors: 16
I would like to understand the benchmark configuration needed for the server, plus any hints on better aggregation methods and on building the LY/CY data.
View 5 Replies
View Related
Aug 10, 2015
I wrote a script to archive and delete records from a table back in 2005 and 2009.
I can't seem to get the syntax right. Any sample script to simply archive and delete records?
This is what I have so far.
DECLARE @ArchiveDate Datetime
SET @ArchiveDate = (SELECT TOP 1 DATEPART(yyyy,Call_Date)
FROM tblCall
ORDER BY Call_Date)
--SELECT @ArchiveDate AS ArchiveDate
DECLARE @Active bit
[Code] ....
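A minimal archive-and-delete sketch: copy everything older than a cutoff into an archive table, then delete it, inside one transaction. tblCallArchive is an assumed archive table with the same column layout as tblCall; the cutoff value is only an example.

DECLARE @Cutoff datetime
SET @Cutoff = '2010-01-01'           -- archive everything older than this

BEGIN TRANSACTION

INSERT INTO tblCallArchive
SELECT *
FROM tblCall
WHERE Call_Date < @Cutoff

DELETE FROM tblCall
WHERE Call_Date < @Cutoff

COMMIT TRANSACTION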
View 9 Replies
View Related
Jan 15, 2007
Hi, All
I only have permission to query table data, and I have a local SQL Server installed. I am also able to set up a linked server from my local machine to the live SQL Server (Blue). When I ran a query I got errors such as "Column name or number of supplied values does not match table definition" and "Ambiguous column name". Here is the information before I ran the query.
In-house data:
- Table fields contain (address, city, state, zip, zip4, fips)
- All fields are character data types
- No primary key
Local machine table:
- FipsCodes (table) contains data provided by the customer that I inserted on my machine, with column fields (state, zip, fips); Zip is the primary key.
So, I want to be able to run a query that retrieves data from the live server by joining it with the FipsCodes table located on my local machine.
Query Statement:
SELECT h.lname,h.fname,h.street,h.city,h.state,h.zip,h.zip4,h.carroute,h.gender,
h.keycode,h.purprice,h.mortamt,h.phone,h.lender,h.recorddate,h.fips,h.pubmonth,
h.pubday,h.pubyr,h.trantype,h.condocode,h.transdate,h.ratetype,h.loantype,h.birth,
h.heritage,h.estcurrval,h.estcurreq,h.dpbc,n.state,n.fips
FROM blue5.Homeowner.dbo.homeowners AS h INNER JOIN FipsCodes AS n
ON h.FIPS = n.FIPS
WHERE state ='AL' AND fips='001' AND pubDate >= '12/1/2006' AND pubDate <='1/8/2007'
Error occur:
Msg 209, Level 16, State 1, Line 1
Ambiguous column name 'State'.
Server: Msg 209, Level 16, State 1, Line 1
Ambiguous column name 'FIPS'.
Please help me solve this issue. Thank you very much, everyone.
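The "Ambiguous column name" errors happen because state and fips exist in both tables of the join, so every reference to them must be qualified with a table alias. A minimal sketch of the corrected WHERE clause (it assumes the state and pubDate filters are meant for the homeowners table h and the fips filter for the local lookup n; swap the aliases if the intent is the other way around):

WHERE h.state = 'AL'
  AND n.fips  = '001'
  AND h.pubDate >= '12/1/2006'
  AND h.pubDate <= '1/8/2007'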
View 8 Replies
View Related
May 2, 2007
I need to read a table and extract records that match a criteria (d'oh, not that simple). Then I put the records in a text message and send them out via email. It works OK, but the problem is when I have multiple records for one entry. Since SQL doesn't have arrays per se, I need to load the records into a "temporary array" and then assemble the message at the end to include the records. Any suggestions on how to "emulate" an array in a stored procedure (SQL)? I'm not sure what direction to follow. Thanks.
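A minimal sketch of one common pattern: a table variable plays the role of the "array", and its rows are then concatenated into a single message body (FOR XML PATH requires SQL Server 2005 or later). The table, column, and criteria names here are placeholders, not the real schema.

DECLARE @matches TABLE (LineText varchar(500));

INSERT INTO @matches (LineText)
SELECT CustomerName + ': ' + CONVERT(varchar(10), OrderDate, 101)   -- example columns
FROM dbo.Orders
WHERE Status = 'Overdue';                                            -- example criteria

DECLARE @body varchar(max);
SELECT @body = STUFF((
    SELECT CHAR(13) + CHAR(10) + LineText
    FROM @matches
    FOR XML PATH(''), TYPE).value('.', 'varchar(max)'), 1, 2, '');

-- @body now holds one line per matching record and can be handed to the mail procedure.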
View 2 Replies
View Related
Jan 22, 2008
Hello,
I am using SQL Server 2000. I have a DTS package that generates data and exports it to an MS Excel template file. The file is then FTP'd to a web server.
How can I make sure that if there are more than 65,535 records, it copies the rest of the data to another sheet?
I know we need to use page breaks for this, but I don't know how to implement it in SQL Server 2000.
Any help will be appreciated.
View 6 Replies
View Related
Mar 30, 2006
I'd like to do the following thing with a data flow task
Get all the records from a source (for example customers from a textfile, flat file source)
Then check for each record whether the customer already exists in a table, for example by customerID. If not, insert the record into the table (OLE DB destination); otherwise copy the customer that's already in the table to another table (a history table) and update the record with the customer from the text file.
Is this possible?, and what kind of data flow transformation do I need?
View 1 Replies
View Related
Mar 19, 2007
Good Morning, I need some assistance with SQL Server 2000 Importing Data.
When I import data from a text on a routine basis, three things must happen:
1. New records identified by primary key get appended to table.
2. Exisiting records identified by primary key get overwritten with new/(updated) data.
3. All other existing records are left alone.
Does anyone know how to Import Records with the following the criteria above? It cannot insert duplicate primary keys by nature, so it must overwrite those records!
This is being built into a DTS package, but I need to get over this obstacle! Thanks for any guidance!
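One common pattern for this on SQL Server 2000 (which has no MERGE): let the DTS package load the text file into a staging table, then run an UPDATE for existing keys followed by an INSERT for new keys. A minimal sketch; the table and column names (TargetTable, ImportStaging, KeyCol, Col1, Col2) are placeholders.

-- Overwrite rows whose primary key already exists.
UPDATE t
SET t.Col1 = s.Col1,
    t.Col2 = s.Col2
FROM dbo.TargetTable AS t
JOIN dbo.ImportStaging AS s ON s.KeyCol = t.KeyCol;

-- Append rows whose primary key is new; everything else is left alone.
INSERT INTO dbo.TargetTable (KeyCol, Col1, Col2)
SELECT s.KeyCol, s.Col1, s.Col2
FROM dbo.ImportStaging AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.TargetTable AS t
                  WHERE t.KeyCol = s.KeyCol);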
View 2 Replies
View Related
Oct 23, 2005
We have an asp.net app with about 200 data entry forms. Customers may enter data into any number of forms. Each form's data is persisted in a corresponding sql table. When data entry is complete, it needs to be processed. Here's where the questions start. How can we easily determine in which tables a customer has data, and how best to select that data? We're not opposed to putting all the data in a single table. This table would wind up having ~15 million records and constantly have CRUD operations performed against it by up to 5000 users simultaneously. With sufficient hardware, is this too much to ask of the db?
View 10 Replies
View Related
Jan 24, 2008
Hi all,
Sorry for this question, but I am brand new to SQL Server 2005 Express.
I have found nothing in the documentation.
With which tool can I view the data records in my SQL Server?
I have tried MS SQL Server Management Studio Express but had no success.
thanks for help
hawk
View 5 Replies
View Related
Jun 20, 2006
Problem:
I am working on a price comparison system which matches the best prices for a purchase (or an order) against existing purchase data.
The order is stored in multiple tables including order details (stores major items purchased: e.g., PC) and order sub-details (optional items purchased with the major items: e.g., speakers, backup device, webcam etc.).
There could be a number of major items in an order and each major item could have multiple related sub items. The other variables that affect the price include trade-ins if any, sales going on at the time of order, number of units etc.
Now, for any new configuration (major items/related sub-items), the system should be able to return a list of previous purchases made with similar configurations and similar variables (quantities, trade-ins etc.). Even if the same model is not present, similar PCs by the same vendor should be considered, and so on.
Questions:
Is this possible using Data mining?
If yes, which algorithm is recommended?
Also, can I assign/modify any kind of weights to certain variables (if same model: .6 ; if same model not available but pcs made by same manufacturer available: .3 ; by other manufacturers: .1)?
Any help will be greatly appreciated.
Thanks,
Jojy
View 1 Replies
View Related
Jan 29, 2008
Hi,
I have an SSIS package that runs each day from a live data source to create a data mart, which is then used for various things including SSAS and SSRS.
The problem is that certain records that will eventually go on to form fact tables are deleted from the live system (not a very robust database in the first place, hence the SSIS!), but these deletions are not reflected in the SSIS transformation, creating inflated figures when compared to the live system.
I currently use type 1 slowly changing dimension processes in each data flow (of which there are about 35) but I realise that this only updates records and does not delete.
The solution I have in place is to truncate the fact tables in the mart before the run starts using an Execute SQL task. This solves the problem though to me seems a little heavy-handed and renders the slowly changing dimension processes redundant (as it is currently only run once a day).
My question is, is there a better method of dealing with the above scenario? If there isn't, it would be a nice feature to add to future versions (*nudge nudge*).
Thanks in advance :-)
View 1 Replies
View Related
Apr 3, 2008
Hi, all experts here,
I am desperate to need you help.
Could you please give me any advice on how to filter out records throughout the data flow by a particular condition? E.g., in my case, I want to filter out rows with a null id (to get rid of those rows with a null id which are not matched in the lookup component). I hope it is clear, and I am looking forward to hearing from you. Thank you very much.
With kindest regards,
Yours sincerely,
View 5 Replies
View Related
Apr 21, 2008
Hello, I have 100,000 records in my table. Of those, only 500 records matter to me, and those are the ones I want to pass through and send to the destination.
The data I am trying to transport has a column ID with Not Null values in it.
Please let me know how we can do it.
Can we use a lookup on ID from the other tables to pass a row through only when it finds an ID match?
Am I correct?
View 3 Replies
View Related
Sep 10, 2007
Hi everyone, my company has a website that uses MS SQL Server. One table has more than 1,000,000 records. When users search data from this table (such as searching for records which contain the word "school" in the NewsTitle field), the server often gets deadlock errors. How can I improve it? Thanks.
P.S. The table has these fields: NewsID, NewsTitle, NewsContent, NewsClickTimes, NewsInsertTime
View 14 Replies
View Related
Feb 16, 2004
I receive a file that will have hyphens between data items such as
123-aed-edr-45r-ui9
1-ed3-45r-rrr-98u
I need to split the values to load them into a table that will hold the 5 separate data items. The fields will always have the hyphens but could be different lengths. Any idea on the best approach to split this in T-SQL?
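A minimal sketch that works on SQL Server 2000: peel off the five hyphen-separated pieces with CHARINDEX and SUBSTRING, assuming every value has exactly four hyphens. The staging table dbo.RawInput(RawValue varchar(100)) is an assumption; point it at wherever the file rows land.

SELECT
    LEFT(RawValue, p1 - 1)                           AS Item1,
    SUBSTRING(RawValue, p1 + 1, p2 - p1 - 1)         AS Item2,
    SUBSTRING(RawValue, p2 + 1, p3 - p2 - 1)         AS Item3,
    SUBSTRING(RawValue, p3 + 1, p4 - p3 - 1)         AS Item4,
    SUBSTRING(RawValue, p4 + 1, LEN(RawValue))       AS Item5
FROM (
    SELECT RawValue, p1, p2, p3,
           CHARINDEX('-', RawValue, p3 + 1) AS p4    -- position of the 4th hyphen
    FROM (
        SELECT RawValue, p1, p2,
               CHARINDEX('-', RawValue, p2 + 1) AS p3
        FROM (
            SELECT RawValue, p1,
                   CHARINDEX('-', RawValue, p1 + 1) AS p2
            FROM (
                SELECT RawValue, CHARINDEX('-', RawValue) AS p1
                FROM dbo.RawInput
            ) AS a
        ) AS b
    ) AS c
) AS d;

With '123-aed-edr-45r-ui9' this yields 123, aed, edr, 45r, ui9; the SELECT can feed an INSERT into the five-column target table.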
View 2 Replies
View Related
Mar 24, 2015
I have the table below:
CREATE TABLE [dbo].[DR_Test](
[source_item_id] [int] NOT NULL,
[source_line_no] [int] NULL,
[buyer_id] [int] NOT NULL,
[seller_member_id] [int] NULL,
[code]...
The table contains more than 80 million records, so when I fetch the data using buyer_id and timezone it takes more than an hour or so, and buyer_id is not unique. How can I fetch the data faster, or do I need to change the structure of the table?
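With filters on buyer_id plus a timezone column, the usual first step is a composite nonclustered index on those two columns. A hedged sketch only; the timezone column name is an assumption since the posted definition is truncated, and the INCLUDE list should hold whatever columns the query actually selects so it can avoid key lookups.

CREATE NONCLUSTERED INDEX IX_DR_Test_Buyer_Timezone
ON dbo.DR_Test (buyer_id, timezone)
INCLUDE (source_item_id, source_line_no);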
View 3 Replies
View Related
Apr 14, 2015
I have a query that needs to look for 5 specific records in a table. Basically I need to hardcode the values. Below is my query, which didn't work out.
select BF_ORGN_CD, BF_BDOB_CD, BF_TM_PERD_CD,data
from BF_DATA
WHERE (BF_ORGN_CD,BF_BDOB_CD,BF_TM_PERD_CD) in ***** i guess this is the wrong query****
('A1', 'B1', 'C1')
('A2', 'B2', 'C2')
('A3', 'B3', 'C3')
('A4', 'B4', 'C4')
('A5', 'B5', 'C5')
But if I use the query below, it will generate more records than these 5 (a workaround sketch follows it):
select BF_ORGN_CD, BF_BDOB_CD, BF_TM_PERD_CD,data
from BF_DATA
WHERE (BF_ORGN_CD) in ('A1', 'A2', 'A3', 'A4', 'A5')
and (BF_BDOB_CD) in ('B1', 'B2', 'B3', 'B4', 'B5')
and (BF_TM_PERD_CD) in ('C1', 'C2', 'C3', 'C4', 'C5')
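SQL Server does not accept a multi-column IN list, which is why the first attempt fails, and the second query over-matches because it allows any combination of the three lists. Joining a derived table of the five value triples gives the intended row-by-row match. A minimal sketch (the VALUES constructor needs SQL Server 2008 or later; on older versions build the same derived table with SELECT ... UNION ALL):

SELECT d.BF_ORGN_CD, d.BF_BDOB_CD, d.BF_TM_PERD_CD, d.data
FROM BF_DATA AS d
JOIN (VALUES
    ('A1', 'B1', 'C1'),
    ('A2', 'B2', 'C2'),
    ('A3', 'B3', 'C3'),
    ('A4', 'B4', 'C4'),
    ('A5', 'B5', 'C5')
) AS k (BF_ORGN_CD, BF_BDOB_CD, BF_TM_PERD_CD)
  ON  k.BF_ORGN_CD    = d.BF_ORGN_CD
  AND k.BF_BDOB_CD    = d.BF_BDOB_CD
  AND k.BF_TM_PERD_CD = d.BF_TM_PERD_CD;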
View 4 Replies
View Related
Jul 20, 2005
I have two tables of book information. One has descriptions of the book in it, and the ISBN, and the other has the book title, inventory data, prices, and the ISBN. Because of some technical constraints I won't get into now, I can't combine them both into one table. No problem. Things are going fine as long as there is a description in the one table to correspond to the ISBN and other data in the other table.
However, about half of the products are not yet entered into the description table. I'd like to run a SQL query that pulls up all the ISBNs that don't exist in the other table. In other words, I'd like a query that tells me exactly which ISBNs do not yet have description data. I know there is some SQL that says to search from one table where the number does not exist in the other, but it slips my mind. Can someone help me with this please?
Thank you!
Bill
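The pattern being remembered is NOT EXISTS (a LEFT JOIN ... WHERE d.isbn IS NULL works equally well). A minimal sketch; the table and column names are assumptions based on the description in the post.

SELECT i.isbn, i.title
FROM dbo.BookInventory AS i
WHERE NOT EXISTS (
    SELECT 1
    FROM dbo.BookDescriptions AS d
    WHERE d.isbn = i.isbn
);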
View 2 Replies
View Related
May 5, 2015
I have two tables: users and records. I need to select only the users that have no rows recorded in the other table. How can I do that?
Example:
ID| Name| Access 1|Access 2|
----------------------------
1 | Axel | True |False |
2 | Ivan | False |False |
3 | Bob | True |False |
4 | Sue | False |False |
ID| Points| Month| Year|User_1|User_2|
--------------------------------------
1 | 2 | 5 | 2015| 2 | 1 |
2 | 5 | 5 | 2015| 2 | 1 |
3 | 1 | 5 | 2015| 3 | 1 |
Then I want to run a select on the users table, returning only the users that have no records in the second table.
In the example, the second table has User_1 as the user who receives the points and User_2 as the user who gives the points. That way I would know which users haven't received points yet.
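A minimal sketch using NOT EXISTS, taking "has no records" to mean the user's ID never appears in the records table's User_1 (receiver) column; the table names dbo.Users and dbo.Records are assumptions.

SELECT u.ID, u.Name
FROM dbo.Users AS u
WHERE NOT EXISTS (
    SELECT 1
    FROM dbo.Records AS r
    WHERE r.User_1 = u.ID
);

With the sample data above, this would return Ivan and Sue, since only users 2 and 3 appear as receivers.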
View 3 Replies
View Related
Jun 22, 2015
I have a table that is growing quite a lot each day. By now, I have on average 300 million records over 2.5 years. Before we received our new interface, the data we received was aggregated and thus not that big. The problem is that the table is so huge that I cannot use the Slowly Changing Dimension component. I was thinking about making a temp table where I load the incremental data before I load it into the final data mart table. Based on this temporary table, I use a script to compare the temp data with the data already existing in the data mart. However, this requires a compare of every record (300 million records).
View 3 Replies
View Related
Feb 15, 2008
During the execution of an SSIS package that loads a large amount of data from an OLE DB Source into an OLE DB Destination, some of the records are getting rejected. Yet if we execute it a second or third time, the rejected records get inserted.
I want to know why the records are getting rejected. The target table does not contain any constraints.
Hope to hear from you soon!
View 5 Replies
View Related
Jan 24, 2008
Hello, I currently have a GridView that is populated with data from a SQL Server data source. I have put an output mask in the select statement, so the date and time attributes are displayed in the format I prefer them to be in: SELECT PatientNo, ConsultantName, HospitalName, CONVERT (varchar, Date, 101), CONVERT (varchar, Time, 8) FROM [Appointment];
However, when I click the 'edit' link for a record in the GridView, I am unable to edit the date/time attributes, and when I click update to confirm any changes to the other attributes, the values in the date/time attributes are emptied. How can I solve this update problem? I'm guessing I need to configure my SQL UPDATE statement, but I'm a bit stuck on how to do this. Please help!
Thanks,
James
View 9 Replies
View Related
Nov 5, 2014
I have 1 table that is just a list of feeds. A, B, C, D etc (15 rows in total) and each feed has other information attached to it such as Full name, location etc etc. I am only interested in the Feed column.
Each feed then has a corresponding data table which contains a list of records. E.g. Feed A data is contained in TableA, Feed B data is contained in TableB, and so on.
Basically what I need is a combined table that shows the list of Feeds in the 1st Column ( So A, B, C, D…..) and then a second column which counts the records from each separate data table corresponding to that feed.
So the end result would look something like this:
Feed------No of Records
A----------4 (from TableA)
B----------7 (from TableB)
C----------8 (from TableC)
D----------1 (from TableD)
Possible?
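One hedged way to do it is a UNION ALL of per-table counts joined back to the feed list, so feeds with no rows still show up with 0. A minimal sketch; the feed-list table dbo.Feeds and its Feed column are assumptions, and one branch is needed per data table.

SELECT f.Feed, ISNULL(c.NoOfRecords, 0) AS [No of Records]
FROM dbo.Feeds AS f
LEFT JOIN (
    SELECT 'A' AS Feed, COUNT(*) AS NoOfRecords FROM dbo.TableA
    UNION ALL
    SELECT 'B', COUNT(*) FROM dbo.TableB
    UNION ALL
    SELECT 'C', COUNT(*) FROM dbo.TableC
    UNION ALL
    SELECT 'D', COUNT(*) FROM dbo.TableD
    -- ...one branch per remaining feed table
) AS c ON c.Feed = f.Feed
ORDER BY f.Feed;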
View 2 Replies
View Related