Seeking For The Fast Query
May 11, 2004
Hello, everyone:
There is a big table with several million records. I am developing a query that retrieves the first rowset that meets a WHERE condition. Any suggestions for making the query fast? Thanks a lot.
ZYT
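A minimal sketch of one common approach (table, column, and values are made up, since the actual schema isn't given): if "first rowset" means the first N matching rows, SELECT TOP plus an index that supports the WHERE clause lets SQL Server stop as soon as it has the rows, instead of scanning millions of records.
-- hypothetical names; assumes the index covers the filtered column
CREATE INDEX IX_BigTable_OrderDate ON dbo.BigTable (OrderDate)
SELECT TOP 100 *
FROM dbo.BigTable
WHERE OrderDate >= '2004-01-01'
ORDER BY OrderDate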
View 2 Replies
Apr 28, 2008
Hello
I have this data in an Access DB of ~4500 posts.
Here is a sample of my problem.
The Name column has no ID; it is a simple text field with ~1800 different names in it:
Year|Name|
-------------------
2005|NN|
2005|NN|
2005|YY|
2005|XX|
2006|XX|
2006|XX|
2006|XX|
2006|NN|
2006|NN|
2008|NN|
2008|NN|
2008|NN|
I have tried to make a SQL query to show this:
Count of each Name Grouped by year
Year|Name|Cnt
-------------------
2005|NN|2
2005|YY|1
2005|XX|1
2006|XX|3
2006|NN|2
2008|NN|3
and this:
All Name as total count in the DB
|Name|TotalCnt
------------
|NN|7
|XX|4
|YY|1
without any success.
All ideas are welcome at this point since I am stuck.
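A sketch of the two queries in Access/Jet SQL, assuming the data sits in a single table called MyTable (the real table name is not given above):
SELECT [Year], [Name], COUNT(*) AS Cnt
FROM MyTable
GROUP BY [Year], [Name]
ORDER BY [Year], [Name];

SELECT [Name], COUNT(*) AS TotalCnt
FROM MyTable
GROUP BY [Name]
ORDER BY COUNT(*) DESC;
The brackets around Year and Name just keep Access from tripping over the column names.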
View 2 Replies
View Related
Jul 23, 2005
I am able to run a query which runs fast in QA but slow in the application. It takes about 16 ms in QA but 1000 ms in the application. What I wanted to know is: why would the query take a long time in the application when it runs fast on SQL Server? How should we try debugging it?
Ajay
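One thing worth checking first (a guess, since the application isn't described): the application's connection and Query Analyzer often run with different session settings, and a mismatch such as ARITHABORT can give the same statement two different cached plans. Comparing the output of this from both contexts is a cheap first step:
-- run in Query Analyzer, and also capture the same output from the application's connection
DBCC USEROPTIONS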
View 2 Replies
View Related
Feb 15, 2006
I have a very unusual situation - we converted a client from DB2 7.2 to MS SQL Server 2000, SP3. There is one report that runs very quickly when run on the database server, but it takes a long time to complete when it is run from a client system. This query is run from within the application and not from within Query Analyzer.
Has anyone else here ever encountered this issue? What did it turn out to be? I am leaning away from it being a network issue.
Thanks in advance.
View 2 Replies
View Related
Jan 8, 2008
Ok, I'll admit right off the bat that I never suspected that I'd ever raise this complaint, much less worry about how to fix the "problem" associated with it!
We're preparing to take a large set of changes (projects) to PeopleSoft Financials from development to test. The code is still somewhat rough, but it has been "desk checked" to ensure that it does what the developers think that it ought to do, and they've blessed it at that point. The code is now moving into the test phase, and the QA team is finding locking/blocking issues that we've never seen in this code before... Sort of a "lock avalanche" where no one process locks for very long, but many of them block one another to the point where applications actually "freeze" while almost never hitting a deadlock.
My solution was to create a "blitzkrieg" query / stored procedure that would periodically sample master.dbo.sysprocesses, master.dbo.sysdatabases, and apply one of the dm_ functions to gather information on locking, blocking, and deadlocking. My procedure runs nicely (it never hangs) and gets about 99.3% of the data that I want.
The problem is that the blasted query / stored procedure runs either too fast or too slow, depending on how you look at it. Because the dm_ function takes a few ms to run, there can be a situation where a row appears either as a false positive or as a missing row because of timing... Either the culprit shows up as a blocker, but by the time the victim spid is evaluated the block has cleared, or the row is skipped and by the time the victim is evaluated the block has occurred.
The whole process runs in well under 100 ms when there is nothing to report, and I've never seen it run 200 ms yet under the worst conditions it has faced, so the code is fast... The problem is that I really don't want to try to enforce any kind of locking to resolve the issue, because that locking would impact performance and that is EXACTLY what I do NOT want to do.
Any suggestions?
-PatP
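One idea, sketched only (this is not PatP's actual procedure): copy sysprocesses into a table variable in a single statement first, then run the slower dm_ lookups against that frozen copy, so blocker and victim are at least evaluated as of the same instant.
-- minimal sketch; column list trimmed to what a blocking report needs
DECLARE @snap TABLE (spid smallint, blocked smallint, waittime bigint, lastwaittype nchar(32), dbid smallint)
INSERT INTO @snap (spid, blocked, waittime, lastwaittype, dbid)
SELECT spid, blocked, waittime, lastwaittype, dbid
FROM master.dbo.sysprocesses WITH (NOLOCK)
-- now join @snap to itself on blocked <> 0 and enrich with the dm_ function;
-- the snapshot stays internally consistent even if the live state has moved on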
View 8 Replies
View Related
Oct 17, 2007
Hello,
I was confused about whether stored procedures are really fast. I have a .NET application where I am using stored procedures, but recently I came across this link http://weblogs.asp.net/fbouma/archive/2003/11/18/38178.aspx which argues that stored procedures are bad and won't give any performance difference. What is the truth? Will they give better performance than passing a query from the application?
Please make it clear
View 8 Replies
View Related
Sep 21, 2007
I have this 3rd party query:
SELECT D_P.PostId, D_P.TopicId, D_P.PostingUserId, D_P.BasePostId, D_P.ParentPostId, D_P.PostLevel, D_P.SortOrder, D_P.PostTitle,
D_P.PostDate, D_P.IsAnon, D_P.FileId, D_P.Property, D_P.IsDeleted, D_P.IsHTML, D_P.LastEdittedByUserId, D_P.LastEditDate,
CAST( ISNULL(D_P.FileId,0) AS BIT ) AS HasFile, F.FileName, CAST( ISNULL( U_RP.IsRead, 0 ) AS BIT) AS IsRead,
CAST( ISNULL( U_RP.IsFlagged, 0 ) AS BIT) AS IsFlagged, CASE WHEN IsAnon = 1 THEN CAST('Anonymous'AS VARCHAR(128))
ELSE U.FirstName+' '+U.LastName END AS Poster
FROM DISCUSSION_POSTS D_P
INNER JOIN USERS U ON D_P.PostingUserId = U.UserId
LEFT JOIN FILES F ON D_P.FileId=F.FileId
LEFT JOIN DISCUSSION_READPOSTS U_RP ON U_RP.UserId = 4265 AND D_P.TopicId = U_RP.TopicId
AND D_P.PostId = U_RP.PostId
WHERE D_P.TopicId = 460106
AND BasePostId IS NOT NULL
AND ((PostTitle LIKE '%flood%') OR (PostText LIKE '%flood%'))
AND ((PostTitle LIKE '%flood%') OR (PostText LIKE '%flood%')) -- No idea why they are doing this twice
ORDER BY PostDate DESC, D_P.PostId DESC
On the same server, DatabaseA runs the query in less than 5 seconds. On DatabaseB the query times out. I cannot see the execution plan for the query that times out. I am executing the query using SQL Server Management Studio. I have rebuilt all of the indexes and updated statistics and usage. Still no luck. I have checked all of the database settings and they are the same. If I comment out these 2 lines "AND ((PostTitle LIKE '%flood%') OR (PostText LIKE '%flood%'))" the query runs like it should and uses an execution plan almost the same as when using DatabaseA. Any pointers in the right direction would be greatly appreciated!!!
Oh yeah, SQL Server EE 2005 SP2 CU3.
Thanks,
~Joseph~
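Not a diagnosis, just two cheap things to try on DatabaseB: drop the duplicated predicate and force a fresh plan with OPTION (RECOMPILE), which rules out a stale cached plan being reused against these non-sargable LIKE '%flood%' filters. Sketched against the query above:
WHERE D_P.TopicId = 460106
AND BasePostId IS NOT NULL
AND ((PostTitle LIKE '%flood%') OR (PostText LIKE '%flood%'))
ORDER BY PostDate DESC, D_P.PostId DESC
OPTION (RECOMPILE)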
View 2 Replies
View Related
Apr 23, 2008
Performance issue.
I have a very complex Stored Procedure called by a Job that is Scheduled to run every night.
Its execution sometimes takes 1 or 2 hours and sometimes 7 hours or more.
So, if it has been running for more than 4 hours I stop the job and run the procedure from a query window, and it never takes more than 2 hours.
Can anyone help me identify the problem ? I want to run from the Job and not to worry about it.
Some more information:
- It is SQL 2000 Enterprise with SP4 in a Cluster (It happens the same way in any node).
- The SQL Server and SQL Agent services run using a domain account that has full administrative access.
- When I connect to a Query Window I also use a Windows Account.
- There are no locks and no processes blocking or being blocked while the job is running.
- Using the Task Manager the processor activity is ok, no more than 30 % in any processor.
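A long shot worth ruling out (sketch only; the procedure name is made up): SQL Agent job steps and a query window connect with different default SET options, and on SQL 2000 that alone can give the same procedure two very different plans. Prepending the job step's T-SQL with the query window defaults makes the two runs comparable.
-- job step T-SQL, sketched
SET ARITHABORT ON
SET QUOTED_IDENTIFIER ON
SET ANSI_NULLS ON
EXEC dbo.MyNightlyProcedure   -- hypothetical name of the scheduled procedure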
View 15 Replies
View Related
Oct 17, 2007
1)Which statement shows the maximum salary paid in each job category of each department?_______
A. select dept_id, job_cat,max(salary) from employees where salary > max(salary);
B. select dept_id, job_cat,max(salary) from employees group by dept_id,job_cat;
C. select dept_id, job_cat,max(salary) from employees;
D. select dept_id, job_cat,max(salary) from employees group by dept_id;
E. select dept_id, job_cat,max(salary) from employees group by dept_id,job_cat,salary;
2)description of the students table:
sid_id number
start_date date
end_date date
Which two functions are valid on the start_date column? _________
A. sum(start_date)
B. avg(start_date)
C. count(start_date)
D. avg(start_date,end_date)
E. min(start_date)
F. maximum(start_date)
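For question 1, the rule being tested is that every non-aggregated column in the SELECT list must appear in the GROUP BY, which is the form of option B; a worked version, assuming the employees table from the question:
SELECT dept_id, job_cat, MAX(salary)
FROM employees
GROUP BY dept_id, job_cat;
Adding salary to the GROUP BY (option E) would instead return one row per distinct salary. For question 2, aggregates that only count or compare values - COUNT and MIN here - accept a date column, while SUM and AVG require numeric arguments.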
View 3 Replies
View Related
Apr 8, 2008
Hello all,
I have 2 primary key fields, the ssn and refnum... if the data in the file is duplicated, it will not import into my table, right, even though I am using DTS to do my import? Or do I need to add an extra validator in there?
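Whether DTS rejects just the duplicate rows or fails the whole batch depends on the task's error settings; one pattern that works regardless is to land the file in a staging table first and insert only the keys that aren't already there - a sketch with made-up column names:
INSERT INTO dbo.TargetTable (ssn, refnum, other_col)
SELECT s.ssn, s.refnum, s.other_col
FROM dbo.StagingTable AS s
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.TargetTable AS t
                  WHERE t.ssn = s.ssn AND t.refnum = s.refnum)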
View 2 Replies
View Related
Jul 12, 2004
I would like to pull some data from a SQLServer database, and save it into an Access MDB file (which can be empty to start). I would then zip up the MDB and download it to the user.
I am seeking advice on the most "elegant" or "efficient" way to do this. Here are some ideas I have been considering:
1) Should I start with an empty template MDB and file-copy it before I populate it? Or is there a neat way in ASP.NET to allocate a brand new MDB outright?
2) I could read the SQLServer data into a Dataset object. I could then open a connection to the MDB, create a table object, defining all the columns, etc., and then I could write the data to the new table object. BUT ... I have a hunch there is a nifty ADO.NET way to save the data already in the Dataset object right into the MDB (creating the table and columns as a matter of course) ... all with an instruction or two (or three). Any ideas?
Thanks in advance!
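On idea 1: a template MDB that already contains the destination table is the simpler route, because the Jet provider can then be targeted straight from T-SQL with OPENROWSET - a sketch in which the file, table, and column names are all invented, and which assumes ad hoc OPENROWSET access to the Jet provider is allowed on the server:
INSERT INTO OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'C:\export\template.mdb'; 'Admin'; '', ExportTable)
SELECT CustomerId, CustomerName, OrderTotal
FROM dbo.SourceTable
Note that ExportTable must already exist in the MDB with matching columns; OPENROWSET will not create it for you, which is another point in favor of the file-copied template.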
View 1 Replies
View Related
Apr 29, 2002
A database with 1 mdf and 2 ldf files was detached from SQL Server 7.0. Then the log files were removed (they are gone, unable to recover) and there's no backup at all. Now I want to attach the database with the same mdf, but I get the error 'Device activation error'. It seems like it's looking for one of the log files.
Is there any way to recover the db ?
I guess NOT, right?
I don't understand why it doesn't work with sp_attach_single_file_db and sp_attach_db. I actually tested it with a dummy database with 1 log file, and it worked - a new log file was recreated. So I tried the same on the production server, and I don't understand why it doesn't work there.
thanks in advance.
View 4 Replies
View Related
Aug 10, 2007
Hello,
I am asking a question I have seen many threads on, but I am looking for an idiot's guide on how to convert my SQL 2005 database to SQL 2000 so I can get it to run on my web hosting server. I'm very new to asp.net, but have had years of experience in normal HTML and a year or two in the old ASP.
I was advised to learn ASP.NET 2.0 and have found it nothing but brilliant. The integration with SQL 2005 made it a lot quicker to link up a database than using Access. Unfortunately my hosting company is a little behind and still using SQL 2000. There isn't much database integration (a few application forms) so I don't mind re-writing the whole database, but I don't know how to set Visual Web Developer up with a SQL 2000 database. I have also read on various other forums that you can convert a database to 2000 by doing something with the scripts, but the explanation is too complicated for me to follow.
Is there anyone out there who wouldn't mind going over some old ground and explaining this all in simple terms? I'm using 'SQL Server Management Studio Express' (although I don't know how to use it) and 'Microsoft Visual Web Developer 2005 Express Edition'.
Thanks for reading this
Simon
View 6 Replies
View Related
Feb 8, 2008
Hello all, I'm new to SSIS and this forum, and this is my first post.
We're migrating a 2000 DTS ETL process to 2005 SSIS. We really like the enhanced functionality of SSIS thus far.
One problem we have is that our 2000 process runs at 1:00am each morning. The scheduling is done via a distributed scheduling tool called Maestro. Our process pulls data from a mainframe-based DB2 OLTP and reformats it into SQL Server reporting tables. We have nightly mainframe batch processing that updates the DB2 tables, and we need those updates on a nightly basis.
The mainframe batch process starts at 8:00pm each evening. It finishes normally by 1:00am 90% of the time, but it is 20+ years old, and has its share of problems, especially during month-end. The problems can't be resolved until the next business day in some cases.
We'd like to elegantly connect the two processes somehow so the SSIS ETL process kicks off when the mainframe batch process finishes. I intentionally didn't use the word 'trigger' up until this point.
It would not be a problem to modify the mainframe batch process to insert or update a DB2 table that SSIS has access to, but I don't think we can get the mainframe batch process to update SQL Server 2005 tables...?
Any advice would be greatly appreciated--TIA!
John
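One low-tech option, sketched only (every object name below is invented): have the mainframe job write a completion row to a DB2 control table, and let a SQL Agent job poll that table through a linked server every few minutes after 1:00am, starting the SSIS package's job only when the flag appears. That way the mainframe never has to touch SQL Server 2005 at all; it only updates its own DB2 table.
IF EXISTS (SELECT 1
           FROM DB2LINK.MYDB.MYSCHEMA.BATCH_STATUS
           WHERE RUN_DATE = CONVERT(char(8), GETDATE(), 112)
             AND STATUS = 'COMPLETE')
    EXEC msdb.dbo.sp_start_job @job_name = N'Nightly SSIS ETL'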
View 7 Replies
View Related
Jul 2, 2007
I have two sites. Site A and Site B
Each site has two databases
Site A
Db1
Db2
Site B
Db1
Db2
Site A Db1 has to perform transaction replication to Site A- Db2 and Site B- Db1 and Db2.
I set up Site A as publisher and distributor, and both Site A and Site B as subscribers.
Site B is in a different geographical area (state).
----------
Please suggest the best scenario to save bandwidth and server load for Publisher, and Distributor.
-------
Earlier I thought that I would implement local replication on Site B between Db1 and Db2, but SQL Server does not let me set Db1 as publisher and distributor for its local database Db2.
-------
P.S. All my databases need the same transactions even though they are connected to different hardware in different places, so please don't ask why I need four similar databases.
View 7 Replies
View Related
Nov 30, 2007
Hello everyone, I am upgrading from SQL Server 2000 to SQL Server 2005. Any caveats? Can I just detach the DBs and attach them in 2005, or is there any conversion I should run or import first?
Thanks,
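Detach/attach (or backup/restore) does work for 2000 to 2005; a commonly suggested follow-up after the attach, sketched with a made-up database name:
EXEC sp_dbcmptlevel 'MyDatabase', 90   -- raise the compatibility level if you want 2005 behavior
USE MyDatabase
DBCC UPDATEUSAGE (0)                   -- correct page/row counts carried over from 2000
EXEC sp_updatestats                    -- refresh statistics on the new engine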
View 4 Replies
View Related
Jan 14, 2004
In desperate need of implementing a solution where the customer has purchased a CMS to replace their corporate site and wants to use MSSQL as the DB server type. I have 3 servers allocated to me to complete this and I could use some advice on the best setup.
They're running Windows 2003 Standard Server along with SQL 2000 Standard. The intended plan is for 2 of the servers to become web servers, with the last server becoming the SQL server. The CMS will reside solely on the SQL server and the content for the web site on the web server(s).
What I need to know is whether it's possible to set up an active/passive node to accommodate this using the items mentioned above. From what I've been reading, SQL 2000 Enterprise does clustering, but I'm hoping this version of SQL can be used for something.
Any responses are appreciated.
View 5 Replies
View Related
Jul 20, 2005
Hi, I am seeking the help of volunteers to test some software that I've developed which facilitates distributed two-phase commit transactions, encompassing any resource manager (e.g. SQL/Server or Oracle) controlled by Microsoft's Distributed Transaction Coordinator in a Windows 2000 environment, with any resource manager under the control of DECdtm (e.g. Rdb (or Oracle via the XA Veneer)) in a VMS environment.
[Yes, at some stage, I hope to sell this software and make money out of it, so unless you have a large philanthropic streak or are simply a techie who likes to stay on top of Windows<->VMS connectivity issues, then you may wish to look away now. But if you do choose to participate, then rest assured that I have no interest in your personal or company details. (Just your work-rate :-)]
What differentiates my Transaction Manager software from existing Transaction Monitor packages that are already in the marketplace (and why you should be interested) is that it is based on the Transaction Internet Protocol (TIP) standard (RFC 2372). For those of you who don't know, the beauty of TIP's "Two-Pipe" strategy is its application-pipe (or middleware) neutrality. Whereas most XA implementations mandate homogeneous Transaction Monitor deployments (such as Tuxedo everywhere, Encina everywhere, MQSeries everywhere, ACMSxp everywhere and so on...), hotTIP from TIER3 Software gives you complete freedom to choose the middleware product(s) that best suit your particular application and heterogeneous network needs. Would you like to talk to VMS with TIER3 Sockets, COM or DCE/RPC? BEA MessageQ, IBM MQSeries or HTML? The choice is yours and yours alone. But once you realize that you need to encase your critical transactions within the ACID properties of a true heterogeneous two-phase commit, then you will come to the conclusion that you need a Transaction Manager that looks a lot like this.
Another drawback of traditional "One-Pipe" strategies is that they preclude the run-time determination of transaction participants. (Functionality which may be advantageous in a wide-area or Internet-based application.)
Anyway, this is what I have:
On the Windows side, you need absolutely *NO* additional software! I'll reply to this note with a brief description of the COM+ and DTC functions that you would need to invoke in order to successfully push an MTS/DTC transaction to VMS. NB: These are standard Windows APIs that are fully documented on MSDN.
On the VMS side, I have a VMSINSTAL saveset that (all zipped up) is some 150KB, which I'm happy to e-mail to you along similar lines to the VMS hobbyists (non-commercial use) license. I'll reply to this note with an Internet Daemon (INETd) example of code that uses my software to cede transactional control, over an SQL insert into an Rdb database, to MTS/DTC. It's under 500 lines long and contains all of the DCL, 3GL, and SQL required to produce a working example of a TIP-2PC-capable TCP/IP auxiliary server. This example will insert a row into the MF_PERSONNEL.Employees table on the VMS side in co-operation with a Windows 2000 MTS/DTC client that is inserting a row into the NORTHWIND.Employee table. Commit them all or roll them all back.
So, in summary, if you'd like to volunteer to put hotTIP through its paces then simply reply to this mail.
Regards, Richard Maher
PS. The following are a few functionality restrictions with the current version of my software that may affect your decision to participate:
1) The transaction has to be started/mastered/coordinated by W2K MTS/DTC.
2) Transactions cannot be PULLed from VMS and must be PUSHed from W2K.
3) No cluster-wide recovery. (If a txn falls over after being prepared then you have to wait for that specific node to become contactable again, even though that lovely RDM recovery job is sitting on another node protecting the database until my hotTIP TM tells it to commit or abort.)
4) There is currently no Alpha or Itanium version available. The Alpha port is currently in progress but, for the time being, you'll either need a VAX or a VAX emulator on your PC.
View 7 Replies
View Related
Jul 13, 2006
I read somewhere that market basket analysis finds rules with substitutes as likely as rules with complements due to a consumer behavior called "horizontal variety seeking". This is when customers buy more than one product in the same category even though the products are substitutes. For example, when people go to the grocery store to buy soda, they buy Coke and Sprite at the same time even though they are substitutes for each other. I was wondering if anyone has experience with this anomaly and how they solved it. I found a time series model called the vector autoregressive model which is used to find the elasticity of prices over a time period. Does anyone have experience working with the VAR model? I am having trouble figuring out what some of the variables in the model are.
Below is the paper
http://www.feb.ugent.be/fac/research/WP/Papers/wp_04_262.pdf#search='VAR%20model%20market%20basket%20analysis'
View 1 Replies
View Related
Aug 20, 2007
Hi. We've decided to convert our Crystal Reports to SSRS 2005. We know (thanks to this forum) there are companies that will convert the reports at a cost; however, we'd like to undertake this ourselves. Are there resources you can point us to that might be specific for Crystal Reports users coming over to SSRS, especially for newbies? Thank you.
View 3 Replies
View Related
Jul 20, 2005
I'm a complete newbie. Need to insert a Company logo into a database column to use later on in a check printing application. Read how to insert the pointer instead of the object into the column. Below is what I did:
SET QUOTED_IDENTIFIER OFF
GO
INSERT INTO BankInfo(CoLogo) VALUES(0xFFFFFFFF)
***Then I did this***
DECLARE @Pointer_Value varbinary(16)
SELECT @Pointer_Value = TEXTPTR(CoLogo)
FROM BankInfo
WHERE CMCo = '91'
WRITETEXT BankInfo.CoLogo @Pointer_Value "\\192.31.82.77\Data\CheckImages\WyattLogo.jpg"
This was straight out of a book and it seemed to work: it gave me a message that it was successful, and when I view the data in the column I can see the pointer 0x453A5C436865636B496D616765735C57796174744C6F676F2E6A7067. But when I try to use the column in either a Crystal Report or an Access report, the bank logo does not show up. I also placed the logo on my C drive and tried pointing to it there with "C:\WyattLogo.jpg" with no success. It can't be this difficult to get a company logo into a column. I desperately need assistance. Remember I am the ultimate newbie; I looked at my first SQL database last week. Thanks in advance for any help, it is appreciated.
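Worth noting, in case it is the root of the problem: the WRITETEXT above stores the path string in CoLogo, not the picture itself, so a report that expects image bytes in the column will show nothing. On SQL Server 2005 the actual bytes can be loaded in one statement (a sketch; the path is just an example), while on SQL 2000 you would need a utility such as textcopy.exe or a small client program instead.
UPDATE BankInfo
SET CoLogo = (SELECT BulkColumn
              FROM OPENROWSET(BULK 'C:\WyattLogo.jpg', SINGLE_BLOB) AS img)
WHERE CMCo = '91'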
View 1 Replies
View Related
Jan 13, 2006
Hello, I've been searching the web for quite some time to resolve the problem of "1/1/1900" returning in a datetime field in SQL that resulted from a blank (not NULL) value being passed to it through an ASP page. The solution is that a NULL value needs to be passed to SQL from ASP. That's fine... I understand why the problem is happening and the solution around it. HOWEVER, I can't seem to get the proper syntax to work in the ASP page. It seems no matter what I try, the "1/1/1900" still results. Below are a few variations of the code that I have tried, with the key part being the first section. Does anyone have any suggestions?!?!?
cDateClosed = ""
If (Request.Form("dateClosed") = "") Then
    cDateClosed = (NULL)
End If
sql = "UPDATE rfa SET " & _
    "dateClosed='" & cDateClosed & "', " & _
    "where rfaId='" & Request.Form("RFAID") & "'"

cDateClosed = ""
If (Request.Form("dateClosed") <> "") Then
    cDateClosed = (NULL)
End If
sql = "UPDATE rfa SET " & _
    "dateClosed='" & cDateClosed & "', " & _
    "where rfaId='" & Request.Form("RFAID") & "'"

cDateClosed = ""
If (Request.Form("dateClosed") = "") Then
    cDateClosed = NULL
End If
sql = "UPDATE rfa SET " & _
    "dateClosed='" & cDateClosed & "', " & _
    "where rfaId='" & Request.Form("RFAID") & "'"

Thanks in advance!!!!
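The concatenation is the culprit: a VBScript Null folded into "..." & cDateClosed & "..." collapses to an empty string, which SQL Server then casts to 1900-01-01. Two ways out are to emit the literal word NULL unquoted when the form field is blank, or to let the database do the conversion - a T-SQL-side sketch, assuming a parameterized command (parameter names invented):
UPDATE rfa
SET dateClosed = NULLIF(@dateClosed, '')   -- empty string becomes NULL, not 1/1/1900
WHERE rfaId = @rfaId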
View 7 Replies
View Related
May 16, 2008
I'm trying to ascertain how I can find out more about a particular job.
The information I have from a script I have to identify deadlock root causes gave me back this information:
spid 86 is blocking spid 51... spid 86 info: SQLAgent - TSQL JobStep (Job 0xBAD836E3D331B44BA4CCAC400D244B17 : Step 1)
Well, that's good to know, but I would like to be able to identify the particular job that 'owns' TSQL JobStep (Job 0xBAD836E3D331B44BA4CCAC400D244B17 : Step 1).
I've read the BOL on the sysjob-type tables, and while they tell me about the columns in the tables and what they are, they tell me absolutely nothing about how one goes about figuring out what I want to know.
I suspect one problem I have is that '0xBAD836E3D331B44BA4CCAC400D244B17' needs converting to something else and I have no idea how to go about doing this. I was never that good at converting hex (I assume that is what this is) even when I was doing it rather often, which was years ago, so I really have no idea how to start.
Can anyone throw me a lifeline here?
Tia
randyvol
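The hex string is the job's GUID in binary form, so no manual hex conversion is needed - casting it back to uniqueidentifier lets you look the job up directly in msdb:
SELECT j.name, j.job_id
FROM msdb.dbo.sysjobs AS j
WHERE j.job_id = CAST(0xBAD836E3D331B44BA4CCAC400D244B17 AS uniqueidentifier)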
View 2 Replies
View Related
Jan 17, 2006
My system has 2 DBs - SQL Server 2000 and DB2 at separate locations. I have a select query which needs to pick up consolidated data from both tables. Also, the schema on the DB2 side has minor changes compared with the schema on SQL Server 2000.
While searching on Microsoft's site I came across the technique of creating a linked server. Would this be possible to implement in my scenario? Also, in this case, would it be advisable to create another view on the DB2 server which maps the DB2 schema to the SQL Server schema format?
please hurry..
regards,
sameer
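A linked server should cover this; the provider name and connection details below are placeholders that depend on which OLE DB provider for DB2 you install (for example the one shipped with Host Integration Server), so treat this purely as a shape rather than working setup:
EXEC sp_addlinkedserver
     @server = N'DB2SRV',
     @srvproduct = N'DB2',
     @provider = N'DB2OLEDB',          -- provider name varies by driver
     @datasrc = N'MyDb2DataSource'     -- plus whatever @provstr/@catalog the driver needs

SELECT s.OrderId, d.ORDER_STATUS
FROM dbo.Orders AS s
INNER JOIN OPENQUERY(DB2SRV, 'SELECT ORDER_ID, ORDER_STATUS FROM MYSCHEMA.ORDERS') AS d
        ON d.ORDER_ID = s.OrderId
A view on the DB2 side that already matches the SQL Server column names would indeed keep the OPENQUERY text simple, so that idea seems sound.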
View 2 Replies
View Related
Aug 6, 2007
One or more files listed in the statement could not be found or could not be initialized. (Microsoft SQL Server, Error: 5009)
I accidentally created a log file on my drive E:, but every time I try to delete the log file it keeps returning the same error.
Can someone please help me delete the log file.
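If the extra file still exists on disk and is still listed by sp_helpfile, the usual sequence is to empty it and then remove it from the database - a sketch, substituting your database and the logical file name sp_helpfile reports. If the physical file has already been deleted from E:, error 5009 is what you would expect, since SQL Server can no longer find the file it was asked to work with.
USE MyDatabase
EXEC sp_helpfile                                    -- find the logical name of the unwanted log file
DBCC SHRINKFILE ('MyDatabase_log2')                 -- make sure no active log remains in it
ALTER DATABASE MyDatabase REMOVE FILE MyDatabase_log2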
View 7 Replies
View Related
Aug 17, 2007
Hi, I have over a million records in my DB. What is the best way to get the results fast if I need to get the details of an employee by name, say "Robert"? If I do it normally it will take long. Should I use an index, or is there some other good way? Thanks in advance. Cheers.
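Yes - an index on the searched column is the standard answer, provided the query stays index-friendly (no leading wildcard). A sketch with invented table and column names:
CREATE INDEX IX_Employees_FirstName ON dbo.Employees (FirstName)

SELECT EmployeeId, FirstName, LastName
FROM dbo.Employees
WHERE FirstName = 'Robert'          -- or LIKE 'Rob%'; LIKE '%obert' cannot use the index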
View 1 Replies
View Related
May 28, 2002
Hello everybody .
I have a 40 GB db running mostly transaction processing.
I set up
1. full backup 2 times a day (takes 30-40 min)
2. log backup every 15 min
3. custom log shipping
4. We don't want to use a cluster.
Once in a while, because of network or other problems, log shipping fails,
so I have to restart log shipping all over, starting from restoring in standby mode the last full backup of my db. It takes 2-3 hrs just to do this restore!!!
1. So I am asking for advice: is there any way I can bring down the time for this restore?
2. Should differential backups be taken?
3. We will not use a cluster.
Alex
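On question 2: a differential backup is the usual way to avoid repeating the 2-3 hour full restore when the chain breaks - as long as the standby database is still in standby/norecovery and no newer full backup has reset the differential base, you can restore just the latest differential on top of it and resume log restores from there. A sketch with invented paths:
RESTORE DATABASE MyDb
    FROM DISK = 'E:\backup\MyDb_diff.bak'
    WITH STANDBY = 'E:\backup\MyDb_undo.dat'
-- then apply the log backups taken after that differential as usual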
View 4 Replies
View Related
Oct 23, 2006
Hello all! For MS SQL 2000: I have a table with > 100,000 rows and I must clean it:
DELETE FROM myTable WHERE Name LIKE 'aser%' AND info IS NULL
DELETE FROM myTable WHERE Name LIKE 'tuyi%' AND Info = 'ok'
DELETE FROM myTable WHERE Name LIKE 'hop%' AND info LIKE 'retro%'
... about 20 DELETE commands.
What is the best way to do it? Thank you.
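Two easy options, sketched against the sample predicates: fold the 20 statements into a single pass over the table, and, if the table is live and log growth is a concern, cap each pass with SET ROWCOUNT, which is the SQL 2000 idiom for batched deletes.
DELETE FROM myTable
WHERE (Name LIKE 'aser%' AND info IS NULL)
   OR (Name LIKE 'tuyi%' AND Info = 'ok')
   OR (Name LIKE 'hop%'  AND info LIKE 'retro%')
   -- ... remaining conditions OR'd on

-- optional batched variant for a live table
SET ROWCOUNT 5000
WHILE 1 = 1
BEGIN
    DELETE FROM myTable
    WHERE (Name LIKE 'aser%' AND info IS NULL)
       OR (Name LIKE 'tuyi%' AND Info = 'ok')
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0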
View 14 Replies
View Related
May 3, 2008
Hi everyone, I'm in deep need of help with a very easy query, and I have a few questions I want to ask. I use MSN: boy22202@hotmail.com. Please, I want to contact anyone who uses SQL Server 2005 that can help me with it... thank you.
View 4 Replies
View Related
Dec 18, 2007
I need to insert data into a temp table in SQL.
I have
CREATE TABLE TMP_X (
doc_name varchar(200)
)
--select * from TMP_X
INSERT into TMP_X
values
(
'...,
but it's saying there isn't a match, and I know why - it's trying to insert all the data as one row, but I need them as separate rows since I want only 1 column.
Is there another INSERT-type function?
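On SQL Server 2005 a single INSERT ... VALUES can only take one row, so the usual workarounds are one INSERT per value or a SELECT ... UNION ALL feeding a single INSERT - a sketch with made-up document names:
INSERT INTO TMP_X (doc_name)
SELECT 'report_jan.doc'
UNION ALL SELECT 'report_feb.doc'
UNION ALL SELECT 'report_mar.doc'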
View 2 Replies
View Related
Feb 14, 2008
If I have a table with one column
and I want to insert a few hundred rows of names,
I can't use the INSERT statement as that does one row at a time.
How can I achieve this?
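If the names already exist somewhere queryable, INSERT ... SELECT loads them all in one statement; if they only exist in a text file, BULK INSERT does the same job. Both sketched with invented names and paths:
-- from another table
INSERT INTO dbo.Names (Name)
SELECT DISTINCT FirstName
FROM dbo.Employees

-- from a one-name-per-line text file
BULK INSERT dbo.Names
FROM 'C:\import\names.txt'
WITH (ROWTERMINATOR = '\n')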
View 5 Replies
View Related
Mar 27, 2004
I have stupidly deleted my default DB. That causes Enterprise Manager to be unable to work with my DBs. The default DB I deleted had no function other than being the default DB; I mean it was outdated, and I had other DBs that contained all my important work. They are still running, and I can view a DB-driven site hosted at localhost even though the default DB no longer exists. I am even able to upload new content or add new users, so all my other DBs are fine. I can even see the SQL Server icon in the bottom right corner of my desktop, and it shows the server running.
Now I need to add tables and rework some of my existing tables and stored procedures, but I am not able to do that with Enterprise Manager, due to the lack of a default database.
How do I correct this problem? I have gotten one tip of doing the following: EXEC sp_defaultdb 'User', 'DB' but I am not sure what to do with this... I tried to run it from the command line, and put in my username and the DB I would set as default, but nothing happened.
So I need more details; step-by-step guidance will work, as I don't know a whole lot about Enterprise Manager and SQL.
Btw, this is my error in Enterprise Manager:
A connection could not be established to MyComputer\VSDOTNET2003
Reason: Cannot open default database. Login failed..
Please verify SQL Server is running and check your SQL Server registration properties and try again
Pls tell me there is a way to fix this problem.
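On the sp_defaultdb tip: it has to be run from a tool that actually connects to SQL Server (Query Analyzer or osql), not from a plain command prompt, and since your login's current default database is the one that was dropped, point it at a database that still exists - a sketch, substituting your own login name:
-- run from Query Analyzer, or via: osql -E -d master
-- (the -d master part matters: it avoids the missing default database)
EXEC sp_defaultdb 'YourLoginName', 'master'
After that, Enterprise Manager's registration should connect again; you can then repoint the default to whichever of your working databases you prefer.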
View 6 Replies
View Related
Dec 14, 2001
Hello everybody,
please advise: what is the fastest standard method of user interface access to a SQL database? I am looking for fast display of one master record plus related dependent records, plus fast scrolling through master records with display of dependent records as fast as possible. Perhaps a standard problem with a standard solution? At the current state of matters, I am still much slower than with my old Access97 database.
thanks for any advice,
Otakar Kverka
Prague
View 1 Replies
View Related