What's The Best Technique ???
Nov 23, 2007
Hi, we currently use data binding to bind the results of an SQL query to a DataGrid. The problem is that we let the user choose which fields he wants to see in the grid.
We currently include ALL optional fields in the query even if they are not used by the grid, which makes a really big query with MANY inner and left outer joins.
What's the best technique to dynamically construct my query with only the fields that the user chooses?
Any suggestions or web page links will be appreciated.
Thanks a lot.
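One common approach (not from the original post; every table and column name below is invented for illustration) is to build only the needed column and join lists in T-SQL and run the statement with sp_executesql. A minimal sketch:
Code Snippet
-- Hypothetical schema: Orders is the base table, Customers is an optional lookup.
DECLARE @IncludeCustomer bit
SET @IncludeCustomer = 1                      -- driven by the user's field selection

DECLARE @cols nvarchar(1000), @joins nvarchar(1000), @sql nvarchar(2000)
SET @cols  = N'o.OrderID, o.OrderDate'
SET @joins = N''

-- Add the join and columns only when the user asked for that field.
IF @IncludeCustomer = 1
BEGIN
    SET @cols  = @cols + N', c.CustomerName'
    SET @joins = @joins + N' LEFT OUTER JOIN Customers c ON c.CustomerID = o.CustomerID'
END

SET @sql = N'SELECT ' + @cols + N' FROM Orders o' + @joins
EXEC sp_executesql @sql
Building the string from a fixed list of known columns and joins (rather than from raw user input) keeps this safe from SQL injection.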
View 1 Replies
Jan 19, 2008
Hello members and contributors and everyone, Assalamualaikum;
I am a university student and I am currently creating my own website in ASP.
I have a little bit of a problem with the Personal Message (Private Message) section.
What I want is for the message box field of a PM to render HTML tags, like the message box on this website does; it also has an option to see the HTML view.
ALSO, in my SQL database the maximum length of my PM field is 4,000, and obviously that field will not store text that goes over the limit. So how do I increase my PM field length? I have added the option of replying to a message, and the old message is shown as a quoted message, so all of the text is very important. Also, what technique should I use to STOP <a href> tags?
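On the length question, a minimal sketch (hypothetical table and column names, assuming the column is currently nvarchar(4000)): on SQL Server 2005 and later you can widen it to nvarchar(max); on SQL Server 2000 the closest equivalent is ntext.
Code Snippet
-- Hypothetical names: PrivateMessages.MessageBody is currently nvarchar(4000).
ALTER TABLE PrivateMessages ALTER COLUMN MessageBody nvarchar(max) NOT NULL
-- On SQL Server 2000, ntext (or text for non-Unicode) is the pre-max option.
Blocking <a href> tags is normally handled in the ASP code before saving, for example by HTML-encoding the input and then re-allowing only the tags you trust.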
View 1 Replies
Jul 20, 2005
I'm a newbie when it comes to the SQL Server side of things. I've been doing Access on and off for years.
I've got a form that is the main point of entry for all data. As a matter of technique, should I use one stored procedure to validate data (testing for uniqueness mostly) and then another one to actually write the data to the tables?
Does it make a difference one way or another to the performance? Do I gain any flexibility in my app doing it that way (Access Data Project)?
Your help is appreciated.
--Jake
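A minimal sketch of the single-procedure variant, with invented names, so the uniqueness check and the write happen in one round trip:
Code Snippet
-- Hypothetical table Customers with a unique CustomerName.
CREATE PROCEDURE dbo.AddCustomer (@Name nvarchar(50)) AS
SET NOCOUNT ON
IF EXISTS (SELECT 1 FROM Customers WHERE CustomerName = @Name)
    RETURN 2               -- duplicate, nothing written
INSERT INTO Customers (CustomerName) VALUES (@Name)
RETURN 1                   -- inserted
GO
Splitting validation and write into two procedures mainly costs an extra round trip; functionally either works, so it tends to come down to how you want to report validation failures back to the form.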
View 2 Replies
Aug 8, 2007
Question
I am using Sql Server 2000.
I have a table named Cities which has more than 2600000 records.
I have to display the records for a specific city, page by page.
I don't want to compromise on performance.
Does anyone have an idea?
Waiting for your fruitful response.
Happy Day And Night For All
Muhammad Zeeshanuddin Khan
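A minimal sketch of keyset ("seek") paging, which works on SQL Server 2000 (no ROW_NUMBER there); the column names are assumptions, and it relies on an index on the paging key:
Code Snippet
-- Hypothetical columns: CityID (indexed key) and CityName.
DECLARE @LastCityID int
SET @LastCityID = 0            -- pass the last CityID of the previous page; 0 = first page

SELECT TOP 50 CityID, CityName
FROM Cities
WHERE CityID > @LastCityID
ORDER BY CityID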
View 1 Replies
Aug 28, 2004
I need to write a query in an SP that returns either yes or no. Is it bad technique to use the return value of the SP for this? Should I use a scalar or output parameter instead? If so which?
Thanks,
Steve
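The return value is conventionally reserved for status/error codes, so a bit OUTPUT parameter is a common fit for a plain yes/no. A minimal sketch with invented names:
Code Snippet
CREATE PROCEDURE dbo.IsOrderShipped
    @OrderID int,
    @Shipped bit OUTPUT
AS
SET NOCOUNT ON
IF EXISTS (SELECT 1 FROM Orders WHERE OrderID = @OrderID AND ShippedDate IS NOT NULL)
    SET @Shipped = 1
ELSE
    SET @Shipped = 0
GO

-- Caller:
DECLARE @Answer bit
EXEC dbo.IsOrderShipped @OrderID = 42, @Shipped = @Answer OUTPUT
SELECT @Answer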
View 7 Replies
Jul 23, 2005
Sorry for my bad English. What optimization technique is used by the SQL Server DBMS?
Thanks
View 1 Replies
Jul 20, 2005
Hi all,
I'm creating some SPs and I've got a query which is inserting data into a table with a unique constraint:

CREATE TABLE [fil_Films] (
    [fil_ID] [int] IDENTITY (1, 1) NOT NULL ,
    [fil_Film] [nvarchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
    CONSTRAINT [PK_Tfil_Film] PRIMARY KEY NONCLUSTERED ([fil_ID]) ON [PRIMARY] ,
    CONSTRAINT [IX_fil_Films] UNIQUE NONCLUSTERED ([fil_Film]) ON [PRIMARY]
) ON [PRIMARY]
GO

When I insert data, should I check in the SP to see if there is an existing record or simply catch the error if it already exists? Which is the better technique? My current SP looks like:

CREATE PROCEDURE spAddFilm (@Type nvarchar(50)) AS
DECLARE @Count int
SET NOCOUNT ON
SELECT @Count = Count(fil_ID) FROM fil_Films WHERE fil_Films.fil_Film = @Type
IF @Count = 0   -- COUNT() returns 0, not NULL, when nothing matches
BEGIN
    INSERT INTO fil_Films (fil_Film) VALUES (@Type)
    RETURN 1 -- OK
END
ELSE
BEGIN
    RETURN 2 -- Exists
END
GO

Thanks
Sam
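For comparison, a minimal sketch of the "just insert and handle the duplicate error" variant, pre-2005 style (no TRY/CATCH), checking @@ERROR for the unique-violation codes:
Code Snippet
CREATE PROCEDURE spAddFilm2 (@Type nvarchar(50)) AS
SET NOCOUNT ON
INSERT INTO fil_Films (fil_Film) VALUES (@Type)
IF @@ERROR IN (2601, 2627)     -- duplicate key on unique index / unique constraint
    RETURN 2                   -- Exists
RETURN 1                       -- OK
GO
Note that the duplicate-key error message is still raised to the client with this approach, so the existence check may be preferable if you want a silent "already exists" result.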
View 2 Replies
Jan 15, 2008
I've been handed a system which inserts millions of records per day into a single table with a composite "wide" index (they tell me it's the only way to achieve uniqueness), and they run large financial reports from this 300,000,000-row table; to say we are I/O bound is an understatement.
I wish to split the data feed (insert) workload away from the reporting tables, but I need a method to transfer the feed data into the report tables and control the volume of traffic. My first choice is replication, but is it sufficiently robust for a commodities system? If not, any ideas?
View 2 Replies
Jul 20, 2005
Looking for some insight from the professionals about how they handle row inserts. Specifically, single-row inserts through a stored procedure versus bulk inserts.
One argument comes from people who say all inserts (and updates and deletions, I guess) should go through stored procedures. The reasoning is that the developers who code the client side have no reason to understand HOW the data is stored, just that it is. Another problem is an insert that deals with multiple tables; it would be very easy for the developer to forget a step. That last point also applies to business logic. In my case, adding a security to our SecurityMaster can touch 1 to 4 tables depending on the type of security. Also, certain fields are required while others are set to null depending on the type. Because a stored procedure cannot be passed datasets but only scalar values, when you need to deal with multiple (i.e. bulk) rows you are stuck using cursors. This post is NOT about the pros and cons of cursors; there are plenty of those on the boards (some of them probably started by me and showing my understanding, or more correctly lack of it, of the way to do things). Stored procedures also give you the ability to abort and/or log inserts that cannot happen because of constraints and/or business rule failures.
Another approach is to write code (not accessible from outside the database) that handles bulk inserts. You would need to write in rules to "extract" or "exclude" rows that do not match constraints or business rules, otherwise ALL the inserts would fail because of one bad row. I guess you could put the "potential" rows into a temp table, apply your rules to the temp table, and delete / move rows that would fail. Any rows left can then be bulk inserted. (You could also use the rows that were moved to another temp table for logging why they failed.)
So that leaves us with two possible ways to get data into the system: a single-row based approach for client apps and a bulk-based one for internal use. But that leaves us with another problem. You now have business logic in TWO separate areas. You have to remember to modify code or fix bugs in multiple locations.
For those that are still reading my post, my question is... How do you handle this? What is the approach you take?
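A minimal sketch of the temp-table staging pattern described above; every name here is invented:
Code Snippet
-- Hypothetical feed, target and reject tables.
SELECT * INTO #staging FROM dbo.SecurityFeed          -- candidate rows

-- Log and remove rows that would violate a rule (here: duplicate SecurityCode).
INSERT INTO dbo.SecurityRejects (SecurityCode, Reason)
SELECT s.SecurityCode, 'Duplicate code'
FROM #staging s
WHERE EXISTS (SELECT 1 FROM dbo.Securities t WHERE t.SecurityCode = s.SecurityCode)

DELETE s
FROM #staging s
WHERE EXISTS (SELECT 1 FROM dbo.Securities t WHERE t.SecurityCode = s.SecurityCode)

-- Whatever survives is safe to insert in one set-based statement.
INSERT INTO dbo.Securities (SecurityCode, SecurityName)
SELECT SecurityCode, SecurityName FROM #staging

DROP TABLE #staging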
View 4 Replies
Jul 20, 2005
I've been asked to turn our single-user database system into a multi-user system. At present, each user has a copy of the MSDE on their desktop machine and uses our program to access it. In future, we would like to centralise our MSDE instance and allow multiple users to access it. In order to facilitate this, we are going to allow only one user write access to the system at a time (I know, it's a kludge, but the system was never designed for multiple users in the first place).
I have a single, simple question this being the case: can I update a single "read-only" bit field in a table of the database in order to flag to other users that the system is in read-only mode, in a way that avoids concurrency issues? I.e. does an "UPDATE" query lock and unlock? (I suspect the answer is yes!) If anyone else has experience of these things, I would also appreciate some tips on how best to proceed.
Thanks
Robin
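A single UPDATE statement is atomic, so one way to make the flag a safe test-and-set is to put the expected old value in the WHERE clause and check @@ROWCOUNT. A minimal sketch with an invented one-row control table:
Code Snippet
-- Hypothetical one-row table SystemState with a bit column InUse.
UPDATE SystemState
SET InUse = 1                 -- try to claim write access
WHERE InUse = 0               -- succeeds only if nobody else holds it

IF @@ROWCOUNT = 1
    PRINT 'Write access granted to this user'
ELSE
    PRINT 'Another user already has write access'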
View 7 Replies
Apr 14, 2007
hello.
How do I implement search the way it is done in the MSDN Library 2005 (local application)?
That is, enabling a search filter where, as you type the criteria, the list gets updated (most incredibly) depending on the characters entered.
For example, you type: loo
and obtain the following display:
Look in dialog box
Look tab
lookaheads
How can I use this fast searching technique in VB.NET 2005, with the items to be searched stored in SQL Server 2005 Express?
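On the SQL Server side, the usual building block is a parameterized prefix search that the VB.NET form re-runs (or further filters in memory) as the user types. A minimal sketch against an invented Topics table:
Code Snippet
-- Hypothetical table of searchable items, with an index on Title.
CREATE PROCEDURE dbo.SearchTopics @Prefix nvarchar(100) AS
SET NOCOUNT ON
SELECT TOP 100 Title
FROM Topics
WHERE Title LIKE @Prefix + N'%'      -- a leading-prefix LIKE can use the index
ORDER BY Title
GO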
View 1 Replies
Apr 11, 2008
Hi folks, I have implemented this technique to simplify SCD loads and also to maintain consistent units of work during update/insert of a single row. Wanted to get your feedback on this technique: performance, transaction issues, etc.
I send all rows to an OLE DB Command that performs both update and insert for a single row in a single command:
Code Snippet
UPDATE PROPERTY SET ORD_TERM_DT = ? WHERE ACCOUNT_NBR = ? AND ORD_TERM_DT = '9999-12-31 23:59:59';
INSERT INTO PROPERTY (
ACCOUNT_NBR
, APPRAISAL_COMPANY_CD
, .....
, ORD_TERM_DT
) VALUES (?, ...,?);
This way I can guarantee that if the termination (update) of an old row (say, row 10) succeeds but the insert of the new row 10 fails, the whole thing will roll back. Otherwise, row 10 would get terminated without being replaced with a current record...
Performance: load of 7,734 changed records into a table of 6.8M existing records was roughly 8 seconds. The data flow task container TransactionOption = Required.
View 4 Replies
Apr 8, 2008
I'm looking to remove hundreds of legacy procs, triggers, functions etc. in a DB. Is there a DMV, sys.<something> command, or technique that will tell me the last time an object was accessed or executed?
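Nothing records this permanently, but for statements still in the plan cache the DMVs below (SQL Server 2005 and later) give a last execution time; they only cover what has run since the last restart and is still cached, so treat it as a starting point rather than an audit. A minimal sketch (run in the database that owns the objects):
Code Snippet
-- Last execution time of cached statements, mapped back to an object name.
SELECT OBJECT_NAME(st.objectid) AS object_name,
       MAX(qs.last_execution_time) AS last_execution_time
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
WHERE st.objectid IS NOT NULL
  AND st.dbid = DB_ID()
GROUP BY OBJECT_NAME(st.objectid)
ORDER BY MAX(qs.last_execution_time)
For table access, sys.dm_db_index_usage_stats tracks last_user_seek/scan/update in the same cached-since-restart way.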
View 2 Replies
May 22, 2008
I have come to know that the table pinning concept (DBCC PINTABLE) was removed in SQL Server 2005.
So is there any alternative to it? Any links that provide information about that would be appreciated.
View 5 Replies
Nov 14, 2006
Colleagues:
I'm working through the DM tutorials and I can't select a data mining technique when creating a new mining structure in BI Studio - BI Studio locks up totally. I have installed SQL Server 2005 SP1 and the components from the feature pack. The Analysis Server service is running, and I have set the properties in the DM project to point to the running instance of Analysis Services. I have read the various posts on the forum and have double-checked my configuration - all seems as it should be.
All suggestions appreciated - very much looking forward to working with the technology.
Michael (michael.dataSense@sympatico.ca)
View 3 Replies
Jul 12, 2007
Hello geniuses
First of all I would like to say that this is my first time posting here; however, I'm pretty sure that I'm in the best place to ask what I want. To cut the story short, I'm querying a SQL database on a remote machine and having the result saved (mapped) to another table in another database on the same remote machine. The destination table was empty before the query was run the first time. I have been searching for some smart way so that when I modify the source tables that my query is based on, only the modified rows are affected. In other words: if the row is already there and unchanged, do nothing; otherwise update the existing record; else it's a new record and it gets inserted. I think what I need will involve some coding for sure, yes? I don't know whether I'm being clear about the requirement, but I know that you are experts and can direct me. Waiting for your valuable replies.
Sherif Magdi
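A minimal sketch of the "update changed rows, insert new ones" pattern in plain T-SQL (MERGE only arrives in SQL Server 2008); all table and column names here are invented:
Code Snippet
-- Hypothetical source and destination tables keyed on ID.
UPDATE d
SET    d.SomeValue = s.SomeValue
FROM   dbo.Destination d
JOIN   dbo.Source s ON s.ID = d.ID
WHERE  d.SomeValue <> s.SomeValue        -- touch only rows that actually changed
                                         -- (add ISNULL handling if SomeValue can be NULL)

INSERT INTO dbo.Destination (ID, SomeValue)
SELECT s.ID, s.SomeValue
FROM   dbo.Source s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Destination d WHERE d.ID = s.ID)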
View 11 Replies
Apr 13, 2007
Hi All,
We have an SSIS package that serves an ETL role, but sometimes the package crashes and we want our support personnel to be aware of it, so we implemented an event handler: the executable is the package and the event handler is on OnTaskFailed. The idea is that if any of the tasks fail, the event handler is triggered.
Recently, while debugging the package, I found that one of the DFTs failed and control went to the event handler, but then the rest of the tasks kept executing and the package finished successfully.
How do I ensure that if any tasks fail, control invokes the tasks in the event handler and then ends the package run?
One more query: what is the ideal and best way of handling events in ETL jobs?
Thanks in Advance.
View 10 Replies
Nov 25, 2007
Hi,
Env: MS SQL Server 2000
DB info (sorry, no DDL nor sample data):
tblA has 147249 rows -- clustered index on the pk (one key of datatype int); it has two columns, both of which are used in joins;
intersecTbl4AB has 207016 rows -- clustered index on the two fks; this intersection table has six columns but only the two fks are used for joins;
tblB has 117597 rows -- clustered index on the pk (one key of datatype bigint); it has four columns but only its key is used for joins.
A complex query involving the above three tables includes inner and outer joins, an aggregate, sorting, a predicate, a math function, a derived table, etc.
On the first run, the query takes about 4200ms to finish. After some research on index optimization I provided some index hints, and then the query runs at about 3000ms. (That was yesterday.)
Just now I realized that MS SQL Server 2000 is quite "intelligent": I think it saves search terms into cache, because the second search for the same term is much, much faster. Now, a couple of questions:
a) If I construct a long list of "common" terms and programmatically let SQL Server cache them, would it speed up the overall query performance in my case? (Or does that depend on the quality of the "common" terms?) And if your answer is yes (supposedly you've been there), do you know where I could find such a "common" term list, for everyday life or the general public?
b) What other techniques are out there to speed up the above-described query? Bringing it down to 1000ms would be most desirable.
Thanks.
View 1 Replies
May 20, 2007
Is there any service or technique to reflect database changes back to a form?
I mean, if two people are updating the same data and one of them performs an update, I am searching for a service in SQL Server or .NET, or a trigger, that can automatically reflect the change to the form control displaying the data, so that before the other person makes the second update he can see the update made by the first person.
Thanks so much to whoever tries to answer.
View 3 Replies
Nov 27, 2007
I have a problem with the ntext datatype. I need to strip the HTML tags out of an ntext column. I have a sample query for that which works fine for a STRING, since STUFF is a string function; what do I do for an ntext field?
=======The Process follows like this =========
--**************************************
--
-- Name: A relational technique to strip the HTML tags out of a string
-- Description: A relational technique to strip the HTML tags out of a
-- string. This solution demonstrates how to use simple tables & search
-- functions effectively in SQL Server to solve procedural / iterative
-- problems.
-- This table contains the tags to be replaced. The % in <head%>
-- will take care of any extra information in the tag that you needn't
-- worry about as a whole. In any case, this table contains all the tags
-- that need to be searched & replaced.
CREATE TABLE #html ( tag varchar(30) )
INSERT #html VALUES ( '<html>' )
INSERT #html VALUES ( '<head%>' )
INSERT #html VALUES ( '<title%>' )
INSERT #html VALUES ( '<link%>' )
INSERT #html VALUES ( '</title>' )
INSERT #html VALUES ( '</head>' )
INSERT #html VALUES ( '<body%>' )
INSERT #html VALUES ( '</html>' )
go
-- A simple table with the HTML strings
CREATE TABLE #t ( id tinyint IDENTITY , string varchar(255) )
INSERT #t VALUES (
'<HTML><HEAD><TITLE>Some Name</TITLE>
<LINK REL="stylesheet" HREF="/style.css" TYPE="text/css" ></HEAD>
<BODY BGCOLOR="FFFFFF" VLINK="#444444">
SOME HTML text after the body</HTML>'
)
INSERT #t VALUES (
'<HTML><HEAD><TITLE>Another Name</TITLE>
<LINK REL="stylesheet" HREF="/style.css"></HEAD>
<BODY BGCOLOR="FFFFFF" VLINK="#444444">Another HTML text after the body</HTML>'
)
go
-- This is the code to strip the tags out.
-- It finds the starting location of each tag in the HTML string, and
-- finds the length of the tag with the extra properties, if any. This is
-- done by locating the end of the tag, namely '>'. The same is done
-- in a loop till all tags are replaced.
BEGIN TRAN
WHILE exists(select * FROM #t JOIN #html on patindex('%' + tag + '%' , string ) > 0 )
UPDATE #t
SET string = stuff( string , patindex('%' + tag + '%' , string ) ,
charindex( '>' , string , patindex('%' + tag + '%' , string ) )
- patindex('%' + tag + '%' , string ) + 1 , '' )
FROM #t JOIN #html
ON patindex('%' + tag + '%' , string ) > 0
SELECT * FROM #t
rollback
View 1 Replies
Mar 9, 2007
Hi, I have a production table in SQL Server 2005 that has approx 500,000 records; every 6 hours this table needs to be truncated and re-filled.
The basic SSIS package uses a script component and imports the data into a staging table which has the same structure as the production table. I have a final Execute SQL Task that truncates the production table and does an INSERT INTO production SELECT * FROM the stage table.
It takes around 30 seconds to run this last Execute SQL Task; the problem is there is a risk that our web service will query this table during the Execute SQL Task and return incorrect results.
Q1: In this last Execute SQL Task, if I used a BEGIN TRANSACTION and COMMIT TRANSACTION, would this be any quicker?
Q2: In this last Execute SQL Task, would it be better to use a RENAME TABLE technique in T-SQL? Any code examples?
Q3: Is there any way in T-SQL that I can compare EVERY FIELD in two tables (i.e. Stage and Production, which have identical data structures) and figure out a way to update only the records that changed? Is SQL Server replication the best way to do this?
thanks kindly
Dave
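For Q2, a minimal sketch of the swap-by-rename idea (invented table names): load a shadow copy completely, then swap the names inside a short transaction so readers only ever see a fully loaded table.
Code Snippet
-- Assumes ProductionTable and StageTable are identical in structure
-- and StageTable has just been fully loaded by the package.
BEGIN TRANSACTION
    EXEC sp_rename 'dbo.ProductionTable', 'ProductionTable_old'
    EXEC sp_rename 'dbo.StageTable', 'ProductionTable'
COMMIT TRANSACTION

-- The old copy becomes the next load's staging table.
EXEC sp_rename 'dbo.ProductionTable_old', 'StageTable'
TRUNCATE TABLE dbo.StageTable
For Q3, one option is to compare per-row checksums between stage and production (e.g. CHECKSUM(*) or BINARY_CHECKSUM(*)) and update only the keys whose checksums differ; checksums can collide, so this trades a small risk of missed changes for speed.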
View 9 Replies