I am working with database mirroring and I need to be able to determine which database will be the principal.
For example, if server A is the principal and server B is the mirror and server A goes down, then server B takes over and becomes the principal. After fixing server A, I want it to be the principal again.
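A hedged sketch of how this is commonly handled, assuming SQL Server 2005+ database mirroring: the current role can be read from sys.database_mirroring, and once the repaired server is back and the session is synchronized, a manual failover from the current principal moves the role back (the database name below is a placeholder).

-- A hedged sketch: check the current mirroring role and state.
SELECT DB_NAME(database_id) AS database_name,
       mirroring_role_desc,
       mirroring_state_desc
FROM sys.database_mirroring
WHERE mirroring_guid IS NOT NULL

-- Once server A is repaired and the session shows SYNCHRONIZED, run this on the
-- current principal (server B) to fail back to server A:
ALTER DATABASE YourDatabase SET PARTNER FAILOVER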
How do I determine the values for the bars in the Neural Network Viewer? I know the values are displayed in the tooltips when pointing to a bar in the table, but I don't know how to get them. Where can I get the underlying data or calculations for the score, the probability of Value 1 and Value 2, and the lift for Value 1 and Value 2? Do they come from the Microsoft Neural Network Content Viewer? If so, which column, and how are they calculated? If not, please advise.
Hope my question is clear.
I am looking forward to hearing from you shortly and thanks a lot in advance.
I am trying to determine the next available order id using the method below. It works provided the table has a record in it. If it doesn't, I get the error "Input string was not in a correct format." I am certain that it is because the query is returning a value of NULL. How can I get around that or check for the NULL value?

' Establish data connection...
Dim sqlConn As New SqlConnection(ConfigurationSettings.AppSettings("connectionstring"))

' Determine order id number...
Dim order_id As Integer
Dim strSQL As String
strSQL = "Select MAX(order_id) from mkt_order"
Dim sqlCmd As New SqlCommand(strSQL, sqlConn)
Dim sqlDA As New SqlDataAdapter(sqlCmd)
Dim sqlDS As New DataSet
sqlDA.Fill(sqlDS, "item")
If sqlDS.Tables(0).Rows.Count <> 0 Then
    order_id = Convert.ToInt32(sqlDS.Tables(0).Rows(0)(0).ToString()) + 1
Else
    order_id = 1
End If
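A hedged sketch of one way around the NULL: handle it in the query itself, so the calling code never sees a DBNull value.

-- A hedged T-SQL sketch: MAX() over an empty table returns NULL, so fold the
-- fallback of 1 into the query with ISNULL.
SELECT ISNULL(MAX(order_id), 0) + 1 AS next_order_id
FROM mkt_order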
We have a web application (ASP) running on SQL Server 7.0. Recently, the users are getting quite a lot of timeouts on the database:
Microsoft OLE DB Provider for ODBC Drivers error '80040e31'
[Microsoft][ODBC SQL Server Driver]Timeout expired
The database is not supposed to be doing too much work, so I can't understand why these timeouts are occurring. How can I determine the cause of the timeouts?
The cause could be anything from a trigger that is taking too long, to a query that is taking too long, or simply bad database design.
I've looked at SQL Server Profiler, but have not yet been able to use it successfully to give me any hints of what could be causing the timeouts.
Any ideas of how I can use Profiler, Performance Monitor, or any other tool(s) to see what is happening in the background in the database, i.e. how much processing a trigger is using, etc.?
Thanks very much! --- Gert Lombard OSI Airport Systems South Africa
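A hedged starting point on SQL Server 7.0: look at what the connections are doing when the timeouts occur, since master..sysprocesses shows blocking and accumulated CPU/IO per spid (the CPU threshold below is an arbitrary example).

-- A hedged sketch: spot blocked or expensive connections.
SELECT spid, blocked, waittime, lastwaittype, cpu, physical_io,
       hostname, program_name
FROM master..sysprocesses
WHERE blocked <> 0 OR cpu > 10000
ORDER BY cpu DESC
-- DBCC INPUTBUFFER(<spid>) then shows the last statement a suspect spid ran.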
I'm new to full-text catalogs and we have a vendor whose code utilizes them. The database server is SQL 2005 and I am noticing the following message in the SQL log every minute.
Changing the status to MERGE for full-text catalog "ResearchCatalog" (5) in database "DBA_Test" (11). This is an informational message only. No user action is required.
A SQL job is running the following command every minute.
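For reference, a hedged sketch of how the catalog's merge and population state can be checked on SQL 2005 (catalog name taken from the log message above):

-- A hedged sketch: FULLTEXTCATALOGPROPERTY reports whether a master merge or
-- population is in progress for the named catalog.
SELECT FULLTEXTCATALOGPROPERTY('ResearchCatalog', 'MergeStatus')    AS merge_in_progress,
       FULLTEXTCATALOGPROPERTY('ResearchCatalog', 'PopulateStatus') AS populate_status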
I've been doing a lot of reading on this and my head is starting to hurt! It seems to be quite a feat to work out how much memory is actually being used by our server.

I'm running W2K Advanced Server with SQL 2000 EE and 8GB of RAM; a min of 4GB and a max of 6GB is assigned to SQL Server.

I'm trying to work out whether we've assigned enough, or too much/too little, memory to SQL Server. My first thought was to let SQL dynamically manage its own memory and see how much it uses; of course, when AWE (/3GB /PAE) is enabled it will just use all that is available.

In Perfmon, "target server memory" = 6.1GB, "total server memory" = 6.1GB, "total pages" = 768000 (x 8KB = 6.1GB).

My second thought was to use "total pages" minus the average "free pages" = average memory used, therefore giving me the average amount of memory used by SQL. I found out that SQL uses a min of 4GB (the min we assigned) and a max of all the memory, 6GB.

Is there an easier way of finding out how much memory is actually used in this situation, or is going by the above average the best way? What I'm unsure about is: will SQL just use all memory assigned to it until it has the whole DB in memory? 20GB including indexes etc.

Any help would be greatly appreciated.
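For what it's worth, a hedged sketch of reading the same counters from inside SQL 2000 rather than Perfmon; the engine exposes its own performance counters in master..sysperfinfo, including the Memory Manager object.

-- A hedged sketch: target vs. actual server memory as the engine reports them.
SELECT counter_name, cntr_value
FROM master..sysperfinfo
WHERE object_name LIKE '%Memory Manager%'
  AND counter_name LIKE '%Server Memory%'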
I need to determine when (maybe) and if (definitely) a SQL Agent job will run again. I need to maintain a table of the next pending execution for each job. I need to be able to update this table from within a SQL Agent job, but preferably from within an executing SSIS package in the job. Is this possible and if so, any suggestions on how?
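A hedged sketch of where that information lives: msdb keeps the next scheduled run per job/schedule pair, so an SSIS Execute SQL task (or a plain job step) could refresh your table from it.

-- A hedged sketch: next_run_date is an int in YYYYMMDD form and next_run_time
-- an int in HHMMSS form; zero means no further run is currently scheduled.
SELECT j.name AS job_name, s.next_run_date, s.next_run_time
FROM msdb.dbo.sysjobs AS j
LEFT JOIN msdb.dbo.sysjobschedules AS s ON s.job_id = j.job_id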
"Deterministic functions always return the same result any time they are called with a specific set of input values and given the same state of the database. Nondeterministic functions may return different results each time they are called with a specific set of input values even if the database state that they access remains the same."
I have a report that has a table with detail grouping. This table shows the sales by day for each product. The users only want to see the date field for the first item in the group. After that, they do not want to display this field (to reduce the data on the report). However, when the data wraps to a second page, they want the date to appear on the first row of the new page.
Is there any way to determine if a row is the first row on a page?
I tried using the RowCount, but that continues from the previous page.
Since I am kind of new to SQL, I'd prefer to get advice from you pros: I created an application which accesses a database on a SQL Server. The application will be used by a few different users, each on a different computer. The application calls stored procedures, updates/inserts records in tables on the SQL Server, and deletes rows. What would be the best role to define for the users' activity? How do I limit their activity ONLY to the specified actions?
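A hedged sketch of one common approach (the object, role, and user names below are placeholders): put the users in a database role and grant that role only the specific permissions the application needs.

-- A hedged sketch; dbo.usp_SomeProc, dbo.SomeTable and AppUser1 are placeholders.
EXEC sp_addrole 'app_users'
GRANT EXECUTE ON dbo.usp_SomeProc TO app_users
GRANT SELECT, INSERT, UPDATE, DELETE ON dbo.SomeTable TO app_users
EXEC sp_addrolemember 'app_users', 'AppUser1'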
I'm working on a sproc that determines the next order id for a specified customer. The table has
custid int,
ordernum varchar(10)
Data is:
1000, 1000-001
1000, 1000-002
1001, 1001-001
1000, 1000-003
I need to know the next ordernum for the specified custid. For example, GetNextOrderNum(1000) should return 1000-004. GetNextOrderNum(1002) should return 1002-001 (since there aren't any orders yet).
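A hedged sketch under stated assumptions: the table is called dbo.Orders (the real name isn't given), the suffix is always three digits, and concurrency is ignored for clarity.

-- A hedged sketch; dbo.Orders is a placeholder table name. Not safe under heavy
-- concurrency without additional locking.
CREATE PROCEDURE GetNextOrderNum
    @custid int
AS
BEGIN
    DECLARE @nextseq int
    SELECT @nextseq = ISNULL(MAX(CAST(RIGHT(ordernum, 3) AS int)), 0) + 1
    FROM dbo.Orders
    WHERE custid = @custid

    SELECT CAST(@custid AS varchar(6)) + '-'
           + RIGHT('000' + CAST(@nextseq AS varchar(3)), 3) AS next_ordernum
END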
What is the most reliable way to determine the last LSN of a database? I've looked in sys.database_files to no avail. I've also looked in msdb.dbo.backupset, which is accurate but only reflects backups already performed, not the current state of the database.
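A hedged sketch using the undocumented fn_dblog function; it reads the active log of the current database, and its fixed-width hex LSN strings sort in log order.

-- A hedged sketch (fn_dblog is undocumented): the highest Current LSN in the
-- active log of the current database.
SELECT TOP 1 [Current LSN]
FROM fn_dblog(NULL, NULL)
ORDER BY [Current LSN] DESC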
With an INSERT statement I add a record to a table. Then I want to get the (autonumber) ID of the newly created record. What is the fastest and best way to do this?
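A hedged sketch, assuming SQL Server 2000 or later (table and column names are placeholders): SCOPE_IDENTITY() is usually the safest way to read the identity value you just generated.

-- A hedged sketch; dbo.MyTable / SomeColumn are placeholders. SCOPE_IDENTITY()
-- returns the identity generated in the current scope, unaffected by triggers,
-- whereas @@IDENTITY can pick up values generated by trigger inserts.
INSERT INTO dbo.MyTable (SomeColumn) VALUES ('some value')
SELECT SCOPE_IDENTITY() AS new_id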
I need to determine which service pack we are running on our SQL servers. I run SELECT @@VERSION and it tells me that we are running 7.00.1020. I have a listing from Google that tells me the value for each service pack, but my version doesn't match anything on the list.
Can you tell which service pack I am running based on the results of my query?
I have a system that processes inserts that originate from automatic data collection subsystems on manufacturing cells. The system processes about 2500 records a day. The system is isolated with no ready support or attention. My goal is to automate any and every reasonable admin task. My present activity centers on re-indexing the main table (which receives the data from the inserts and supplies the data for web-based reporting).
The table - tb_production_log - receives inserts that are time-stamped and bear a Machine_id. The table has a clustered index built on Machine_id (int) and Date_time (time of the data's acquisition). The table only receives inserts; the records are never updated. No inserts are out of time sequence (no older records ever have to be 'wedged' in amongst existing records). Ultimately, the table is tested daily for records with age > 365 days. Such records are deleted.
For the past week, I have been running a monitoring stored procedure on my test box to track the fragmentation of the tb_production_log table. It's based on DBCC SHOWCONTIG with some extra tests. After capturing the SHOWCONTIG data, the sp runs a test query against the table to emulate a typical User report. I track the time this query takes. The query covers records over the last 7 days. (approx. 17,500 records involved). In addition, I track the time it takes Inserts to run. Inserts are done in batches from an external app. I get a RecordsPerSecond data point for each batch.
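For reference, a hedged sketch of the kind of capture-and-defragment pair being described (SQL 2000-style commands; index id 1 is assumed to be the clustered index):

-- A hedged sketch: capture fragmentation figures for the table, and defragment
-- the clustered index when scan density falls or logical fragmentation climbs.
DBCC SHOWCONTIG ('tb_production_log') WITH TABLERESULTS, ALL_INDEXES
DBCC INDEXDEFRAG (0, 'tb_production_log', 1)   -- current db, table, index id 1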
I am trying to determine what service pack I am running on a SQL 7 server. Can somebody tell me how to do this?
I have an article from technet showing me how to do it for SQL 6.5 but not 7. I assume it is the same method but I need to know which product version number relates to which service pack.
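For what it's worth, a hedged note: the method is the same as for 6.5, and to my knowledge the documented SQL Server 7.0 builds are 7.00.623 (RTM), 7.00.699 (SP1), 7.00.842 (SP2), 7.00.961 (SP3) and 7.00.1063 (SP4); a build in between usually indicates a post-service-pack hotfix.

-- @@VERSION embeds the build number; SERVERPROPERTY('ProductLevel') reports the
-- service pack directly, but only from SQL Server 2000 onwards.
SELECT @@VERSION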
How do I determine the Login name from the user name, in SQL?
For example, I have a Login called Accounting with users Bob and Sue. How do I know from Bob or Sue's user name that they are members of the Accounting Login?
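A hedged sketch, assuming SQL 2000-era system tables: each database user's SID maps back to a server login, so SUSER_SNAME can resolve the login name for every user in the current database.

-- A hedged sketch: resolve database users to the server logins they map to.
SELECT name AS user_name,
       SUSER_SNAME(sid) AS login_name
FROM sysusers
WHERE islogin = 1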
I would like to know how to determine how big the log and data space is, and how much of this space is free. I would like to create a script to warn me when less than 20% free space is left in the log as well as the data.
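A hedged starting point for such a script:

-- A hedged sketch: log size and percentage used per database, plus space usage
-- for the current database's data.
DBCC SQLPERF (LOGSPACE)
EXEC sp_spaceused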
Hi there, is there any function in MS SQL Server 2000 where, if I pass a date or a year, it could give me the total days of the year for that parameter? In MySQL there is SELECT DAYOFYEAR(date);
Can someone guide me on this please... I need to use it for a leap year function in my SP!
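A hedged sketch: DATEPART with the dayofyear datepart is the closest T-SQL counterpart, and applying it to December 31st of a year gives the year's total days (and hence whether it is a leap year).

-- A hedged sketch for SQL Server 2000.
SELECT DATEPART(dayofyear, '20040615')   -- day of the year for a given date
SELECT DATEPART(dayofyear, '20041231')   -- 366 for a leap year, 365 otherwise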
I have a problem with an SP when passing in parameters. Basically something like this:
-- Pass 1, 2 or 3 as parameter
EXEC SP_mySP 1

-- The SP will do the following SQL statement
SELECT * FROM myTable
WHERE (
    (If @Parameter1 = 1 then myColumn = 'A' or myColumn = 'B')
    (If @Parameter1 = 2 then myColumn = 'C')
    (If @Parameter1 = 3 then myColumn is not null)
)
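A hedged sketch of one way to express that logic in a single WHERE clause:

-- A hedged sketch: each branch only applies for its parameter value.
CREATE PROCEDURE SP_mySP
    @Parameter1 int
AS
SELECT *
FROM myTable
WHERE (@Parameter1 = 1 AND myColumn IN ('A', 'B'))
   OR (@Parameter1 = 2 AND myColumn = 'C')
   OR (@Parameter1 = 3 AND myColumn IS NOT NULL)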
I am writing a select statement to retrieve all data for a particular region. I need to select only the callerids which are valid for that region.
I have two tables:
tblone contains PBX data, e.g. regionalid, callerid, donglearea and regionaldialup; tbltwo contains regionalid plus two columns which, when concatenated, make up a substring of tblone.callerid.
I need to check the callerid in tblone against tbltwo to make sure it is a valid callerid (i.e. it is present in tbltwo). N.B. the concatenation of the two columns in tbltwo is only a substring of tblone.callerid.
It is easy to concatenate the two columns and place them in a temp table; how do I then search through this table and, if the callerid is present, allow my select statement to return the entire row as part of the result set?
If the callerid in tblone is not valid, I have to do something similar against a dongle table using the donglearea field, and if the dongle area is not valid I have to use the regionaldialup field and a predefined value (e.g. 0800003554) to determine if the row is valid and should be selected.
I then have to delete everything which was not valid from the table.
I am a junior sql administrator and your help would be much appreciated!
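A hedged sketch of the first validity check (the two tbltwo column names, colA and colB, are placeholders, and joining on regionalid is an assumption); the dongle and regionaldialup checks would follow the same EXISTS pattern, and the final cleanup could be a DELETE with the combined conditions negated.

-- A hedged sketch: rows in tblone whose callerid contains the concatenation of
-- the two tbltwo columns for the same region.
SELECT t1.*
FROM tblone AS t1
WHERE EXISTS (
          SELECT 1
          FROM tbltwo AS t2
          WHERE t2.regionalid = t1.regionalid
            AND CHARINDEX(t2.colA + t2.colB, t1.callerid) > 0
      )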
Hmm, I'm wondering if some of you have figured out a way to check a large database's DRI programmatically. The term 'a large database' is a loose one; let's just say at least over 200 user tables. Yes, DBCC CHECKCONSTRAINTS is handy. However, it won't be able to tackle some hidden DRI problems. For instance, here you have two tables, totally fictional but possible in the real world, Customer and Account (for demo purposes, I'll name them temp tables here). The original designer probably meant to link them via customer_ID; however, he/she did not do it properly.

-- DDL and DML
-- one day in the data's life
create table #customer (
    customer_ID char(12) not null primary key,
    first_name varchar(20),
    last_name varchar(20),
    sex char(1),
    state char(2),
    ssn char(11),
    check (Left(customer_id, 2) = state AND Right(customer_id, 4) = Right(ssn, 4))
)
-- thanks Joe Celko for a more meaningful PK ...
insert into #customer
values ('md-1234-1234', 'Dan', 'Li', 'M', 'MD', '567-28-1234')

create table #account (
    account_id int identity(10000, 1) primary key,
    account_type varchar(10),
    amount money,
    last_update datetime,
    first_name varchar(20),
    last_name varchar(20)
)
insert into #account (account_type, amount, last_update, first_name, last_name)
values ('Receivable', 3000.0000, getDate(), 'Dan', 'Li')

/* at least two problems here:
   a) DRI is lost here
   b) instead of 'Dan' and 'Li', one could enter 'NosuchFN' and 'NosuchLN' */

-- another day in the data's life; don't ask me why they do that, I would likely use an ACTIVE flag to keep all data
delete
from #customer
where customer_id = 'md-1234-1234'

-- now the boss asks why we have this Dan Li in the #account table while there's no such corresponding record in the #customer table, or who the heck is this Dan Li anyway (give me more info about this guy)?

Well, if we have only a few or a dozen tables, it won't require tons of effort to find the data problem for the given situation (database). But again, let's say this db has over 200 tables; checking them by hand hardly seems like a job for Homo sapiens. I don't mean to be lazy, so how would you systematically, or at least programmatically, identify the DRI problems?

Many thanks in advance to those clearer heads.

DL
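A hedged sketch of one programmatic starting point: tables that neither declare nor are referenced by any foreign key are often the first suspects for missing DRI.

-- A hedged sketch (SQL 2000-era catalogs): user tables that take part in no
-- foreign key relationship at all.
SELECT o.name AS table_name
FROM sysobjects AS o
WHERE o.xtype = 'U'
  AND NOT EXISTS (SELECT 1
                  FROM sysforeignkeys AS fk
                  WHERE fk.fkeyid = o.id OR fk.rkeyid = o.id)
ORDER BY o.name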