I have created a table in SQL Server 2005 named "Departments". This table will store the different departments of a landline telephone company, which deal with complaints registered by its users.
I want to know the names of the different departments that deal with the complaints assigned to them. For example, if a user has a problem with his handset, that complaint would be assigned to the "Maintenance dept."
As I have never worked in the industry, I need help filling the table.
Please just write me the names of the departments and the nature of the complaints they deal with.
SQL Server Express Edition is installed on this computer. I want to remove Express because Configuration Manager shows two SQL Servers running. When I browse for network servers from COMPUTER-2, it shows the server name as HASH\SQLEXPRESS, but not the main SQL Server.
COMPUTER-2
IP: 129.100.100.142
COMPUTER NAME: FEROZ
MEMBER OF WORKGROUP
Can anyone help me with how to connect these two computers and remove this Express edition?
Hi there, I need to randomly fill a database table with five-digit numbers such as 43566, 78578, 92565, ... to get approximately 100,000 rows in the table. Is there a query for this?
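One way to sketch this in T-SQL: cross-join a large system view against itself to get enough rows, and derive a random five-digit number per row from NEWID() (RAND() won't work here, because it is evaluated once per statement). The target table and column names are assumptions:

    -- hypothetical target table: dbo.Numbers(Num int)
    -- sys.all_columns is just a convenient large row source; any big table works
    INSERT INTO dbo.Numbers (Num)
    SELECT TOP (100000)
           10000 + ABS(CHECKSUM(NEWID())) % 90000   -- random value in the range 10000-99999
    FROM sys.all_columns a
    CROSS JOIN sys.all_columns b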
When I run the code below, I get a table with "In order to evaluate an indexed property, the property must be qualified and the arguments must be explicitly supplied by the user." as the contents...

Public Function ReturnTable(ByVal strName As String, ByVal alParameters As ArrayList, ByVal strTable As String) As DataSet
    Dim sqlSP As New SqlCommand
    sqlSP.CommandTimeout = 120
    sqlSP.Connection = sqlConn
    sqlSP.CommandType = CommandType.StoredProcedure
    sqlSP.CommandText = strName

    AddParameters(sqlSP.Parameters, alParameters)

    Dim dsDataSet As New DataSet()
    sqlConn.Open()
    Dim sqlDataAdapter As New SqlDataAdapter(sqlSP)
    sqlDataAdapter.Fill(dsDataSet, strTable)
    sqlConn.Close()

    Return dsDataSet
End Function
AddParameters is a function that adds each of the parameters in the ArrayList to the sqlSP parameter collection. In SQL Profiler, the stored procedure is called, runs, and returns results, but what comes back is a DataSet with one table containing the above message.
Hi, I am using SQL Server 2005. I want to place a lock, insert something into the table, return to the code with the ID of the inserted item, and then call FUNCTION_A. Based on the outcome of the function, I either want to keep the inserted item and release the lock, or delete the inserted item and release the lock. Any suggestion is greatly appreciated. Regards, naimulah
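One way to sketch this is to make an open transaction itself the lock: insert inside a transaction, return the new ID via SCOPE_IDENTITY(), and have the application COMMIT or ROLLBACK after FUNCTION_A returns. Table and column names below are assumptions:

    -- executed from the application on one connection, leaving the transaction open
    BEGIN TRANSACTION
    INSERT INTO dbo.Items (Name) VALUES (@Name)   -- hypothetical table
    SELECT SCOPE_IDENTITY() AS NewItemId          -- return this ID to the code

    -- then, after FUNCTION_A returns:
    --   on success: COMMIT TRANSACTION   (the row stays, the lock is released)
    --   on failure: ROLLBACK TRANSACTION (the insert is undone, the lock is released)

Rolling back avoids a separate DELETE, but note that the new row stays locked to other sessions until the transaction ends either way.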
So, to the above table I have added a new column named "ProdLongDescr" (varchar, NULL).
I need to populate this newly added column with specific values for each row, depending on "ProductCode", which is different for every row. The problem is that I have 25 rows, so instead of writing 25 individual update scripts, is there a way a single query can do the same job? If so, can someone guide me on how to achieve that, or point me to a good resource?
Below are a couple of the individual update scripts I wrote. "ProductCode" is different for all 25 rows.
Update tblValAdPackageElement SET ProdLongDescr = 'Slideshows' WHERE ProductCode = 'SLID' And szElementDescr = 'Slideshow'
if @@error <> 0 begin goto ErrPos end

Update tblValAdPackageElement SET ProdLongDescr = 'CategorySlideshows' WHERE ProductCode = 'SLDC' And szElementDescr = 'CategorySlideshow'
if @@error <> 0 begin goto ErrPos end
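A single UPDATE with a CASE expression can cover all 25 rows in one statement. A sketch using the two codes quoted above (the remaining 23 follow the same pattern; if szElementDescr also has to match per row, keep it in the CASE or the WHERE clause as well):

    UPDATE tblValAdPackageElement
    SET ProdLongDescr = CASE ProductCode
                            WHEN 'SLID' THEN 'Slideshows'
                            WHEN 'SLDC' THEN 'CategorySlideshows'
                            -- ... one WHEN per remaining ProductCode ...
                        END
    WHERE ProductCode IN ('SLID', 'SLDC' /* , ...the other 23 codes */)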
Hello all. I want to fill a drop-down list in ASP.NET using C# from a SQL database table using a stored procedure. I have my sproc, but I have no idea how to do this from the C# side. Can someone give me a good example and, if it's not too much trouble, place comments in the code and give an explanation? I am just learning ASP.NET after moving from Classic ASP; things are a lot different.
I have a ton of data to load into a SQL 2005 database. I just loaded a bunch of data for a number of tables using bcp, and the last table that my script loaded was an 8-million-row table. The next table was a 12-million-row table, and about 1 million rows into the bcp a log-full error occurred. I have the batch size set to 10000 for all bcp commands. Here is the bcp command that failed:
Here is the last part of the output from the bcp command:
...
10000 rows sent to SQL Server. Total sent: 970000
10000 rows sent to SQL Server. Total sent: 980000
10000 rows sent to SQL Server. Total sent: 990000
SQLState = 37000, NativeError = 9002
Error = [Microsoft][ODBC SQL Server Driver][SQL Server]The transaction log for database 'billing_data_repository' is full. To find out why space in the log cannot be reused, see the log_reuse_wait_desc column in sys.databases
BCP copy in failed
I thought that a commit was issued after every 10000 rows and that this would keep the log from filling up.
The log_reuse_wait_desc column in sys.databases is set to 'LOG_BACKUP' for the database being used.
Does a checkpoint need to be done more often?
Besides breaking up the 12 million row data file into something more manageable, does anyone have a solution?
How can I continue to use my same loading script, and keep the log from filling up?
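Since log_reuse_wait_desc shows LOG_BACKUP, the database is in the full (or bulk-logged) recovery model, and committed log records cannot be reused until the log is backed up; the per-batch commits alone don't free the space. A sketch of two options (the database name is from the post; the backup path is an assumption):

    -- option 1: back up the log so space from committed batches can be reused
    --           (repeat during the load if necessary)
    BACKUP LOG billing_data_repository TO DISK = 'E:\backups\billing_log.trn'

    -- option 2: if point-in-time recovery is not required during the load,
    --           switch to a minimally logged model for the bulk load, then switch back
    ALTER DATABASE billing_data_repository SET RECOVERY BULK_LOGGED
    -- ... run the bcp load ...
    ALTER DATABASE billing_data_repository SET RECOVERY FULL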
I have a class that works fine using the SqlDataReader, but when I try to duplicate the process using a DataSet instead of a SqlDataReader, it returns a null value. This is the code for the method, converted from the version that returned a DataReader:
public DataSet GetOrgID()
{
    Singleton s1 = Singleton.Instance();
    Guid uuid = new Guid(s1.User_id);
    SqlConnection con = new SqlConnection(conString);
    string selectString = "Select OrgID From aspnet_OrgNames Where UserID = @UserID";
    SqlCommand cmd = new SqlCommand(selectString, con);
    cmd.Parameters.Add("@UserID", SqlDbType.UniqueIdentifier, 16).Value = uuid;

    SqlDataAdapter adapter = new SqlDataAdapter();
    adapter.SelectCommand = cmd;

    DataSet dataset = new DataSet();   // the DataSet must be instantiated before Fill; Fill does not create it
    adapter.Fill(dataset);             // Fill opens and closes the connection itself
    return dataset;
}
Assume that conString is set to a valid connection string. The singleton passes the user ID in from some code in the code-behind page; this functionality works as well, so assume that the Guid is a valid entry. I should get back a valid DataSet, but it's null. Additionally, if I change the SQL query to just Select * From aspnet_OrgNames, I still get a null value. I am assuming I am doing something wrong when trying to fill the DataSet.
MS SQL Server 6.5, Enterprise Edition, SP5.
I have recently been having a problem with the TempDb database filling up. I originally created the database at 250 MB but recently expanded it to 500 MB.
My last check of the activity on the server during an event such as this produced the following information.
- Approx. 300 connections to primarily 2 databases.
- 4 active connections:
Connection 1 -SELECT on database 1 with 13,000 records and a record size of approx. 300 bytes.
Connection 2 -SELECT on database 1 with 13,000 records and a record size of approx. 300 bytes.
Connection 3 -SELECT on database 2 with 550 records and a record size of approx. 100 bytes.
Connection 4 -Replication subscriber set at 100 transactions.
My questions are:
1. What processes may cause the TempDb database to fill up?
2. What processes prevent the database from purging?
We are having continual problems with our transaction log filling up on one of our major applications. Does anyone know of a way or tool to read the transaction log? We want to determine what is causing this problem.
Recently, we converted an Access database to SQL Server 6.5. One of the processes that runs against the server is missing a commit, causing temporary stored procedures to fill up TEMPDB's sysobjects table. The only way to clear up TEMPDB is to stop and start SQL Server when the database fills up. I wrote a quick-and-dirty stored procedure to delete the offending rows out of the tempdb..sysobjects table; however, the database still registers as full after the deletes. Question: does anyone know of a process/DBCC I can run against the tempdb..sysobjects table to regain the space in TEMPDB without having to stop and restart SQL Server? I need a temporary solution while the programmer is debugging the offending code. Thanks! TC
I've got replication set up as publisher/subscriber, basically to sync a primary server with a backup server. My distribution log keeps filling up; for now I've got a perf alert to truncate it at 75% full, but why does it fill up? Its size is about 1.5 GB. The transaction log for my database will also not release about 500 MB of data. Is there any way to see what is going on?
Hello, I need a little help with filling GridViews. I browsed over about 10 search pages but couldn't find any that would solve my problem. In my AJAX project I made a testing page and pulled a GridView (GridView1) onto it with a few buttons and textboxes. I need to fill the GridView from code, so my website.asp.cs looks like this:

protected void Page_Load(object sender, EventArgs e)
{
    string connstr = "Data Source=.;database=teszt;user id=user;password=pass";
    SqlConnection conn = new SqlConnection(connstr);
    SqlCommand comm = new SqlCommand("select * from users", conn);
    conn.Open();
    SqlDataReader reader;
    reader = comm.ExecuteReader();
    if (reader.HasRows)
    {
        GridView1.DataSource = reader;
        GridView1.DataBind();
    }
    reader.Close();
    conn.Close();
    comm.Dispose();
}

So I load the page and there's no GridView on the page at all, nor an error message; the connection and the database/table are fine. Any suggestions on what I am doing wrong? I would also like to know if there would be any problem with using this on a tab control/tab. Thank you.
I have a table that keeps track of click statistics for each one of my dealers. I am creating graphs based on the number of clicks they received in a month, but if they didn't receive any in a certain month, then that month is left out. I know I have to do some kind of outer join, but I'm having trouble figuring out exactly how. Here is what I have:
select d.name, right(convert(varchar(25), s.stamp, 105), 7), isnull(count(1), 0)
from tblstats s (nolock)
join tblDealer d (nolock) on s.dealerid = d.id
where d.id = 31
group by right(convert(varchar(25), s.stamp, 105), 7), d.name
order by 2 desc, 3, 1
This dealer had no clicks in April, so this is what shows up:

joe blow 10-2004 567
joe blow 09-2004 269
joe blow 08-2004 66
joe blow 07-2004 30
joe blow 06-2004 8
joe blow 05-2004 5
joe blow 03-2004 9
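A sketch of one approach: build the list of months first, then LEFT JOIN the stats onto it so that months with no clicks still appear with a zero count. Here the month list (the month_label alias is mine) is derived from tblstats itself, which works as long as some dealer has clicks in every month; a dedicated calendar table would be more robust:

    select d.name,
           m.month_label,
           count(s.dealerid) as clicks     -- counting a column from the outer side gives 0 for empty months
    from tblDealer d
    cross join (select distinct right(convert(varchar(25), stamp, 105), 7) as month_label
                from tblstats) m
    left join tblstats s
           on s.dealerid = d.id
          and right(convert(varchar(25), s.stamp, 105), 7) = m.month_label
    where d.id = 31
    group by d.name, m.month_label
    order by m.month_label desc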
We have a stored procedure that uses temp tables and must gather a lot of data. When we stress test this stored procedure, the tempdb transaction log fills up.
We tried using SELECT INTO for our tables. That kept us from writing to the transaction log, but it caused problems because it locked up sysobjects.
Is there some other way not to write to the transaction log?
I have a database whose data I am splitting by odd and even account numbers: the odd account numbers go into one database and the even ones into the other.
This database is very large. The problem is that when I run the delete statements, they are going to fill up the log files. Can I turn on the Simple recovery model on the database while I am deleting the data? Will this cause a problem? Then can I turn the Full model back on when I have finished?
Has anyone ever done this, and if so, how did it work? Or better yet, is it possible?
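It is possible; here is a sketch of the sequence (the database name, table, and delete criteria are assumptions). Two caveats: switching out of full recovery breaks the log backup chain, so take a full or differential backup afterwards, and even in simple recovery one huge DELETE is a single transaction, so deleting in batches keeps the log small (DELETE TOP needs SQL Server 2005 or later; on 2000, SET ROWCOUNT does a similar job):

    alter database MyBigDb set recovery simple

    -- delete in batches so each transaction, and therefore the log, stays small
    while 1 = 1
    begin
        delete top (10000) from dbo.Accounts where AccountNumber % 2 = 0
        if @@rowcount = 0 break
    end

    alter database MyBigDb set recovery full
    backup database MyBigDb to disk = 'E:\backups\MyBigDb.bak'  -- restart the backup chain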
I am new to SQL Server and learning lots very quickly! I am experienced at building databases in Access and using VBA in Access and Excel.
I have a time series of 1440 records that may have some gaps in it. I need to check the time series for gaps and then fill these or reject the time series.
The criteria for accepting and rejecting is a user-defined number of time steps, from 1 to 10. For example, if the user sets the maximum gap to 5 time steps and a gap is 5 steps or less, then I simply want to linearly interpolate between the two time steps bounding the gap. If the gap is 6 time steps, then I will reject the time series.
I have searched the BOL and MSDN for SQL Server and think there must be a solution using PredictTimeSeries in DMX, but I'm not quite sure it can do this. I may be better off simply passing the time series through as a recordset and processing it as I would have done in Access (I am reluctant to do this, as I have on the order of 100 * 5 * 365 time series, growing by 100 each day, and fear it would take quite some time).
Can anyone help me by pointing me in the right direction please?
Unless there is a way of using PredictTimeSeries on its own, I think the solution is to: identify whether a record is a valid one or part of a gap (i.e., missing values); find the longest gap and reject or process the data based on that value; identify whether a record precedes or follows a gap; and, for each acceptable gap, fill it using linear interpolation.
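For the interpolation step, a set-based sketch that avoids row-by-row processing. The schema is an assumption: one row per time step with NULL marking a missing value; if gaps are missing rows instead, first outer-join the series against a complete list of the 1440 steps:

    -- hypothetical schema: dbo.TimeSeries(StepNo int primary key, Val float null)
    select g.StepNo,
           p.Val + (n.Val - p.Val) * (g.StepNo - p.StepNo)
                 / cast(n.StepNo - p.StepNo as float) as InterpVal
    from dbo.TimeSeries g
    cross apply (select top 1 StepNo, Val from dbo.TimeSeries
                 where StepNo < g.StepNo and Val is not null
                 order by StepNo desc) p          -- last valid reading before the gap
    cross apply (select top 1 StepNo, Val from dbo.TimeSeries
                 where StepNo > g.StepNo and Val is not null
                 order by StepNo) n               -- first valid reading after the gap
    where g.Val is null
      and n.StepNo - p.StepNo - 1 <= 5            -- the user-defined maximum gap length

The accept/reject test can run first with the same pair of CROSS APPLYs, rejecting the series when the largest value of n.StepNo - p.StepNo - 1 exceeds the threshold.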
In my application I am using identity columns. When rows are deleted from a table, the identity values do not fill the gap. I mean, my current identity is 5; that means rows 1 to 5 were inserted sequentially. If I delete the 3rd and 4th rows, the next identity will still continue with 6. So, is there any method to fill the gaps between rows?
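Identity values are not reused by design, and the usual advice is to leave the gaps alone, since other tables may reference the old values. If a gap really must be reused, a sketch for finding and filling the first one (table and column names are assumptions):

    -- find the first unused value
    declare @gap int
    select @gap = min(t.Id) + 1
    from dbo.MyTable t
    where not exists (select 1 from dbo.MyTable t2 where t2.Id = t.Id + 1)

    -- insert into the gap explicitly
    set identity_insert dbo.MyTable on
    insert into dbo.MyTable (Id, Name) values (@gap, 'refill')   -- hypothetical columns
    set identity_insert dbo.MyTable off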
I have 50 tables. I'm trying to fill the tables in a DataSet using a loop. There is no problem for the first 25 tables, but at the 26th table it gives the error "Input string was not in a correct format. Couldn't store <value> in "column_name" Column. Expected type is UInt32."
The column type is varchar(50). I changed it to TEXT, but nothing happened.
It's accepting numerics but no characters.
I debugged one table at a time: at the 26th table, adapter.Fill(dataset) throws the exception.
I am having an issue with the transaction log growing uncontrollably and filling up the disk. I suspect that transactions are structured incorrectly between the web application that is monitoring a queue and the SQL that is executing the WAITFOR RECEIVE. This method receives large binary objects, so that's the reason for the arguments to the reader. Also, even though it's not the suggested way, we commit every time through to prevent the queue from disabling (which it was doing when we would ROLLBACK; we don't really care if a message is bad, we just want to log it and wait for the next one).
The basic structure is this, executed on a separate thread. Am I missing something that could be causing transactions to get into a state where the log grows uncontrollably? Is there a problem with the loop? Should I be doing a ROLLBACK when there is nothing to receive (this is a low-volume queue, so it may not receive a message for a few minutes or more)? If so, where should I be doing this?
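For comparison, a sketch of a receive loop that never leaves a transaction open across an empty wait (the queue name is an assumption). The key point: the transaction opened before WAITFOR (RECEIVE ...) must be ended even when the timeout fires with no message; otherwise it stays open, and the log cannot truncate past the oldest active transaction:

    while (1 = 1)
    begin
        begin transaction

        waitfor (
            receive top (1) conversation_handle, message_type_name, message_body
            from dbo.MyQueue
        ), timeout 5000

        if @@rowcount = 0
        begin
            commit transaction   -- nothing arrived: close the transaction before looping
            continue
        end

        -- ... process / log the message body here ...
        commit transaction
    end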
Hi friends, I'm looking for an answer that solves my problem. I am trying to populate a DropDownList; here is the code. Previously it was working; suddenly, it is generating an error.
strConnectionString = "Provider = SQLOLEDB;Integrated Security=False; User ID=sa;Password=;Data Source=GIREESH-AC720F7;Initial Catalog=NorthWind"
In the page_load event:

Dim sql As String
sql = "select AthleteNameKey from athletes"
result_adap = DbAccess.ExecuteAdaP(sql)
result_adap.Fill(result_ds, "athletes")
cboAthleteName.DataSource = result_ds.Tables("athletes")  ' bind the filled table; assigning the string "athletes" here is not a valid data source
cboAthleteName.DataTextField = "AthleteNameKey"
cboAthleteName.DataValueField = "AthleteNameKey"
cboAthleteName.DataBind()
Public Function ExecuteAdaP(ByVal sqls As String) As OleDbDataAdapter
    Dim da As New OleDbDataAdapter(sqls, strConnectionString)
    Return da
End Function
Hello all. I have moved my VB/Access database to VB/MySQL, and I have one form in my project in which I want to display all the records in the database in a list box. While doing this I am getting the error "variable uses an automation type not supported in Visual Basic". Moreover, it was working in VB/Access.
Here is the code:
Dim sql As String
intCountSW_ID = 0
sql = "select SW_IDEN, SW_NAME, SW_DELETE, SW_LEFTDATE from SV_SOCIALWORKER order by SW_NAME"
If rs.State = 1 Then rs.Close
I've been having problems with my tempdb filling up and causing all databases on the server to stop functioning properly. I've been removing a lot of data lately (millions of rows), and I think this is why my tempdb log is going through an unusual load.
What's the best way to make sure tempdb doesn't fill up and cause me major problems? I had temporarily turned off backups while I was having a new hard drive put in. Am I right in thinking that when a database is backed up, the tempdb log is reduced in size? Would maintaining a daily backup solution help keep things under control?
I have a table that has an int field that contains unique numbers. If I need to insert 4000 or so new records into this table using a SQL command, how would I be able to fill this field with the next available number? This will be a manual procedure done maybe once a year, so the next number will be known ahead of time, but the insert command needs to increment this number by one every time it puts a new row in.
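A sketch using ROW_NUMBER() (SQL Server 2005 and later); the table, column, and row-source names are assumptions:

    declare @next int
    select @next = isnull(max(UniqueNbr), 0) from dbo.Target   -- or set it by hand, since it is known ahead of time

    insert into dbo.Target (UniqueNbr, SomeCol)
    select @next + row_number() over (order by s.SomeCol),     -- yields @next+1, @next+2, ...
           s.SomeCol
    from dbo.SourceRows s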
I always get a "ConstraintException" error when trying, at the start of the application, to run the following statement (within the "Form1_Load" routine called by the "this->Load" EventHandler):
Also, curiously, if I invoke that same statement again while the application is still running, by means of a pushbutton click event, everything runs smoothly without error. It looks to me like it only fails the first time it runs.
Within the DataSet, I tried to see which constraint could be giving me this trouble. Here is the only constraint I could find:
I then tried to find, within the "TauxTaxe" table, any trace of records where the "Pays" and "Province_Etat" columns showed a null value, as well as any duplicate key values, but there weren't any.
Any place I should start looking? BTW, I'm using a SQL Express database.
Finally, the SQL statement for "TA_TauxTaxe::FillByPays" is the following:
SELECT Pays, Province_Etat, Taxe1_Appl, Taxe1_Dsc, Taxe1_Taux, Taxe2_Appl, Taxe2_Dsc, Taxe2_Taux FROM TauxTaxe WHERE (Pays = @Pays)
In debug mode, I double-checked and verified that @Pays didn't have a null value but a valid string value at the time the Fill routine was invoked. Any clue?
Insert into #Customers values(101,'Aron',23,1,1,12,1,0);
Insert into #Customers values(102,'Cathy',28,1,1,13,1,0);
Insert into #Customers values(103,'Zarog',33,1,1,14,1,0);
Insert into #Customers values(104,'Michale',25,1,2,12,1,0);
Insert into #Customers values(105,'Linda',43,1,2,13,1,0);
Insert into #Customers values(106,'Burt',53,1,2,14,1,0);
If you observe, the rows are unique based on the internalid per st_code, per city_code.
Problem: now the user inserts another row, but this time he passes only the following:
Insert into #Customers values(120,'AronNew',null,1,1,12,null,null) -- note he doesn't pass the age or the type
I want it so that when he passes this row, I match it up with the existing row based on st_code, city_code, and internalid, and then update the new row with the missing values (only the columns that are null) from the existing row.
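A sketch of that update; st_code, city_code, internalid, age, and type come from the post, while the key and column names are otherwise assumptions (the values in the INSERTs above are positional):

    update n
    set n.age       = coalesce(n.age, e.age),
        n.cust_type = coalesce(n.cust_type, e.cust_type)
    from #Customers n
    join #Customers e
      on  e.st_code    = n.st_code
      and e.city_code  = n.city_code
      and e.internalid = n.internalid
      and e.cust_id   <> n.cust_id      -- hypothetical key column holding 101, 102, ... 120
    where n.cust_id = 120               -- the newly inserted row

COALESCE only replaces the columns that arrived as NULL, so values the user did supply are preserved.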
I have implemented a caching strategy using SqlCacheDependency and a SQL Server 2005 backend using the broker service. This works fine when I connect to SQL Server 2005 under a service account that is in the db_owner role. In a production environment I am reluctant to do this, so I created another service account that only has execute permissions on the stored procedures. When I use this limited service account for my ASP.NET web application, the broker service does not send any messages to the web app to invalidate the cache. Checking the event log and SQL Profiler, I get errors all relating to the user not having access to the SqlQueryNotificationService queue. So I did a lot of googling and tried running the grant scripts below, with no luck, using this limited service account. Keep in mind everything works fine if I use an account with db_owner privileges.

These are the grants I have tried, based on numerous articles:

GRANT CREATE PROCEDURE TO three_d_ss_login
GRANT CREATE QUEUE TO three_d_ss_login
GRANT CREATE SERVICE TO three_d_ss_login
GRANT SUBSCRIBE QUERY NOTIFICATIONS TO three_d_ss_login
GRANT RECEIVE ON QueryNotificationErrorsQueue TO three_d_ss_login
GRANT REFERENCES ON CONTRACT::[http://schemas.microsoft.com/SQL/Notifications/PostQueryNotification] TO three_d_ss_login

These are the grants I have tried that do not work:

GRANT SEND ON SERVICE::SqlQueryNotificationService TO three_d_ss_login
GRANT RECEIVE ON SqlQueryNotificationService_DefaultQueue TO three_d_ss_login

Can someone suggest what I need to do to get SqlCacheDependency to work with a SQL 2005 backend under a limited-privilege service account? Thanks, Jim