I have two .aspx pages: "login.aspx" and "test_connection.aspx". "login.aspx" uses the Membership class for my website's security. If I restart my computer and load "login.aspx" first, it works fine and I can create a user. But when I then load "test_connection.aspx", I get this error message:
System.Data.SqlClient.SqlException: Cannot open user default database. Login failed.
Login failed for user 'sqluser1'.
at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
If I restart my computer and load "test_connection.aspx" first, that page works just fine, but when I then load "login.aspx", I get this error message:
Cannot open user default database. Login failed.
Login failed for user 'YECIAASPNET'.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.SqlClient.SqlException: Cannot open user default database. Login failed.
Login failed for user 'YECIAASPNET'.
Source Error:
An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.
the page "test_connection.aspx" is using an sql username i have created. I am only using one database in this project and thats it ASPNET.mdf
what's happening? i cant understand. I dont know what im doing wrong. i am very new to this dotNET and MS-SQL. im not sure if this has relation to the other problem i have on the other thread (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=254568&SiteID=1).
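Both errors are "Cannot open user default database", just for a different login depending on which page connects first, so one thing worth checking is the default database of each login. A minimal diagnostic/repair sketch, assuming you can connect to the instance in SSMS; the login name is taken from the first error message and the target database is a placeholder:

-- See what default database the failing logins point at
SELECT name, default_database_name
FROM sys.server_principals
WHERE name IN ('sqluser1');   -- add the ASP.NET worker account login here too

-- If that default database no longer exists or is detached, repoint the login, e.g.:
ALTER LOGIN [sqluser1] WITH DEFAULT_DATABASE = [master];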
I was working on a production server and stopped the SQL Server service along with the SQL Server Agent, since I had to copy an MDF file. Now I have started the services again, but I find that there are no transactions happening. What could be the reason?
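If the copy involved detaching the database or taking it offline, it may be worth confirming that it came back online and writable after the restart. A small diagnostic sketch, assuming SQL Server 2005; the database name is a placeholder:

SELECT name, state_desc, user_access_desc, is_read_only
FROM sys.databases
WHERE name = 'YourDatabase';   -- replace with the actual database name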
I'm facing a problem: when more than one user tries to insert a record through the application, only one user's data gets inserted. What might be the problem?
This is the stored procedure I'm using:
create proc [dbo].[GSK_insertregion]
(
    @Country varchar(50),
    @Userid int,
    @RegionId varchar(50)
)
as
begin
    insert into UserHierarchy (Country, Userid, RegionId)
    values (@Country, @Userid, @RegionId)
end
I set up my package for logging to SQL Server. I set up a connection manager for the logging, but did not specify a database. Doesn't that mean that the logging should default to msdb.sysdtslog90?
But when I check the table after I run the package, there's nothing in it.
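For reference, this is roughly how I am checking for entries, a sketch assuming the log provider really did default to msdb and that at least some event types are ticked on the Details tab of the logging dialog:

SELECT TOP 20 event, source, starttime, message
FROM msdb.dbo.sysdtslog90
ORDER BY starttime DESC;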
Hi, I have deployed a website on a server running Windows 2000 and IIS 5.0. It uses SQL Server 2000, which is on another remote server. While developing I used the visual tools in VS.NET to make the connection and used the Dynamic Properties of the connection object to map the connection string to the entry in the config file. This works fine on my development machine, which has IIS and SQL Server 2000 on the same box. The entry in web.config for my connection string is:
value= " server=xxx.yyy.com; Trusted_Connection=yes;provider=SQLOLEDB.1;Initial Catalog=events; User id=myuser; Password=password;"
where xxx.yyy.com is the server running SQL Server2000.
I do not get any error, but the connection does not happen and my DataGrid does not get filled. The code for creating the connection is designer-generated. Any clues? -svp
Hello people, I don't expect anyone to know the answer to this but I guess we'll see huh?
I'm using Microsoft SQL Server Management Studio to do all my SQL work, and one of the tools that comes with it is SQL Server Profiler 2005. Anyway, I'm using the Profiler to capture the SQL activity happening in the background while I work with the front end.
Here is the problem I'm facing. I'm looking at a table, and I see that when I do a certain process on the front end, it inserts 3 rows into that table. So I'm thinking, "In the trace, I should be looking for an INSERT, or a stored procedure with an INSERT in it."
So I trace the front-end process, and it shows me all the stored procedures that execute while those 3 rows are inserted into the table (by the way, this process does other things besides inserting into this table, but that is the part I'm currently working on).
What I do here at my job is take that code, tweak it to work with our front end, and voilà, we're good to go. So, in that spirit, I made my own stored procedure that calls all of the stored procedures the trace showed.
My result:
I get only 2 rows inserted into the table: the first row and the last row. The middle row isn't inserted, for some mysterious reason. I tried checking all the stored procedures for unique information pertaining to that specific row's insert, but to no avail.
So my question to you guys is: is there anything I could be overlooking that might be inserting that row into the table? Thanks guys!
I have two problems I need some help with.

First, I've just inherited a system and am delving into a few timeout problems that are causing problems for the users. Now, if I do a simple SELECT * FROM the table (which looks to be the cause of the problem at this stage) in QA, I get the results back in less than a second. If I open the table in EM it takes about 10. Is there a difference in viewing the data this way? I'm used to EM being virtually the same speed. There is only one row. Minor question really, just something I'd like to understand if there is a difference.

CREATE TABLE [QUERY] (
[QUERY_ID] [numeric](18, 0) IDENTITY (1, 1) NOT NULL ,
[CAT_ID] [numeric](18, 0) NOT NULL ,
[QUERY_DESCR] [varchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
[USER_NAME] [varchar] (40) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
[USER_ID] [int] NOT NULL ,
[IND_EURO] [char] (1) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL CONSTRAINT [DF_QUERY_IND_EURO] DEFAULT ('N'),
[IND_DGCOLUMNS] [char] (1) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL CONSTRAINT [DF_QUERY_IND_DGCOLUMNS] DEFAULT ('N'),
[NO_GROUPS] [int] NOT NULL CONSTRAINT [DF_QUERY_NO_GROUPS] DEFAULT (0),
[NO_FIELDS] [int] NOT NULL CONSTRAINT [DF_QUERY_NO_FIELDS] DEFAULT (0),
[NO_LINES] [int] NOT NULL CONSTRAINT [DF_QUERY_NO_LINES] DEFAULT (0),
CONSTRAINT [PK_QUERY] PRIMARY KEY CLUSTERED ([QUERY_ID]) WITH FILLFACTOR = 90 ON [PRIMARY] ,
CONSTRAINT [FK_QUERY_QUERY_CATEGORY] FOREIGN KEY ([CAT_ID]) REFERENCES [QUERY_CATEGORY] ([CAT_ID]) ON DELETE CASCADE ON UPDATE CASCADE
) ON [PRIMARY]
GO

I don't think any re-indexing has been done on this (or the other tables in the db). I was wondering if constant adding/deleting of rows could cause the index to be massive and in need of a good clear out. Any pointers would be appreciated. From what I can tell, there were some problems trying to get replication to work. I need to dig deeper to see if this is now correct.

-------------------------

Secondly, there is another table in the same database.

CREATE TABLE [FIELD_DATA] (
[ID] [numeric](18, 0) IDENTITY (1, 1) NOT NULL ,
[DATA_ID] [numeric](18, 0) NOT NULL ,
[FIELD_ID] [numeric](18, 0) NULL ,
[FIELD_CODE] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL ,
[FIELD_VALUE] [numeric](15, 5) NULL ,
CONSTRAINT [PK_FIELDDATA] PRIMARY KEY CLUSTERED ([ID]) WITH FILLFACTOR = 90 ON [PRIMARY]
) ON [PRIMARY]
GO

It holds approx 4 million rows. The rest of the tables have minimal data and about the same amount (consider them the same if you will). Now, another 'copy' of this database is held elsewhere (different client data) and this holds 40 million rows. The difference is that the first DB is 4.5GB and the second 6.5GB (approx). Does this prove my theory that re-indexing would be a good idea?

Thanks
Ryan
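One way to sanity-check the re-indexing theory on SQL Server 2000 is to measure fragmentation first and rebuild only if it looks bad. A rough sketch using the table names from the post; the fill factor of 90 is just taken from the existing PK definitions:

-- Check fragmentation of the indexes (SQL Server 2000)
DBCC SHOWCONTIG ('FIELD_DATA') WITH TABLERESULTS, ALL_INDEXES;

-- If scan density is low / logical fragmentation is high, rebuild the indexes
DBCC DBREINDEX ('FIELD_DATA', '', 90);
DBCC DBREINDEX ('QUERY', '', 90);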
"The log file for database is full. Back up the transaction log forthe database to free up some log space."Now I only know this way to deal with that manually,Step1. in option , chance Recovery model from FULL to Simple.Step2: go to task to manually shrink the log fileStep3: Change recovery model back from simple to FULL.But by this way, I could get same problem again, the log file is fill,and need free up.Could you give an idea how to prevent this from happening? what andhow should I do???Thanks a lot in advance for your help.
I have a 500-page report, and the report runs per file key. I have referenced some of the fields in the report header section so that they display on every page, and for every file key the report header field should also change. This works for all file keys, but the reference from the body to the report header does not refresh when the file key grouping changes to another group.
Please let me know if there is any possible way to do this.
I have configured an alert (below) to track all blocked events in SQL Server across all databases and then start a SQL Agent job when blocking happens, which inserts data into a table. When there is blocking in SQL Server I get an email, which is working fine, and I am able to track all the queries.
But how do I get notifications ONLY if the blocking has been going on for more than 30 seconds or 1 minute, without using sp_configure?
--- ALERT
USE [msdb]
GO
EXEC msdb.dbo.sp_update_alert
    @name = N'Blocking Process',
    @message_id = 0,
    @severity = 0,
    @enabled = 1,
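One option that avoids sp_configure is to keep the alert firing as it does now, but make the first step of the job check how long the blocking has actually lasted and stop if it is under the threshold. A sketch against the SQL Server 2005 DMVs; the 30-second threshold is a placeholder, and it assumes the job step is set to quit the job on failure:

IF NOT EXISTS (
    SELECT 1
    FROM sys.dm_exec_requests
    WHERE blocking_session_id <> 0
      AND wait_time > 30000        -- wait_time is in milliseconds
)
BEGIN
    -- nothing has been blocked long enough; fail this step so the job (and its e-mail) stops here
    RAISERROR('No blocking over threshold.', 16, 1);
    RETURN;
END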
I am writing a query which selects data for a particular date.
I insert values into the table with the current date (today's date), and the records are stored in a format like 2006-07-14 16:12:09. Now when I run the query 2 or 3 minutes later to select the records inserted today, it returns no results.
I think it is because of the time portion: a couple of minutes later the query is comparing against the current time, so it no longer matches the records that were inserted a few minutes earlier.
Is there a way to ignore the time portion when running the query, so that it fetches all records inserted today, including the ones inserted at that earlier time, no matter what time I run the query today?
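A common approach on SQL Server 2000/2005 is to compare against midnight of the current day instead of the full current datetime. A sketch with made-up table and column names:

SELECT *
FROM MyTable                                                         -- hypothetical table
WHERE InsertedOn >= DATEADD(dd, DATEDIFF(dd, 0, GETDATE()), 0)       -- today at 00:00:00
  AND InsertedOn <  DATEADD(dd, DATEDIFF(dd, 0, GETDATE()) + 1, 0);  -- before tomorrow at 00:00:00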
Hi there, sorry if I have placed this query in the wrong place. I'm getting to grips with ASP.NET 2.0, slowly but surely! When I try to access my site, which uses a SQL Server 2005 Express DB, I am receiving the following error:
Server Error in '/jarebu/site1' Application.
Database 'd:\hosting\member\asanga\App_Data\ASPNETDB.mdf' already exists. Could not attach file 'd:\hosting\member\jarebu\site1\App_Data\ASPNETDB.MDF' as database 'ASPNETDB'.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.SqlClient.SqlException: Database 'd:\hosting\member\asanga\App_Data\ASPNETDB.mdf' already exists. Could not attach file 'd:\hosting\member\jarebu\site1\App_Data\ASPNETDB.MDF' as database 'ASPNETDB'.
Source Error:
An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below. Stack Trace:
Version Information: Microsoft .NET Framework Version:2.0.50727.42; ASP.NET Version:2.0.50727.210
This is the connection string that I am using:
<connectionStrings>
  <add name="ConnectionString"
       connectionString="Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\ASPNETDB.MDF;Integrated Security=True;Initial Catalog=ASPNETDB;User Instance=True"
       providerName="System.Data.SqlClient"/>
</connectionStrings>
The database is definitely in the folder that the error message refers to. What I find confusing is that the connection string seems to be finding "asanga"'s database. Is it something daft?
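Because Initial Catalog=ASPNETDB forces a fixed database name, the attach can fail if another site's MDF is already attached under that name on the shared instance. It may be worth checking what is currently attached as ASPNETDB; a diagnostic sketch for SQL Server 2005, run against the hosting instance, with the name taken from the error message:

SELECT DB_NAME(database_id) AS database_name, physical_name
FROM sys.master_files
WHERE DB_NAME(database_id) = 'ASPNETDB';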
declare @error int, @rowcount int
select @rowcount = COUNT(1) FROM STG_BCDR;
while @rowcount > 0
begin
    BEGIN TRAN Deletion
[code]....
With the code above I am trying to delete records batch by batch to avoid table locking on the BCDR table. The total number of records in this BCDR table is 40,000. However, when I check the execution plan, the BCDR table still gets a clustered index scan, which means the locking still happens.
If I change the DELETE TOP (5000) to DELETE TOP (5), then there is a clustered index seek, which is good. The problem is that deleting only 5 records at a time means it will take a very long time to remove all that data.
How do I handle this situation so that I can delete this large amount of data without table locking? If the table is locked, other users will not be able to access it at the same time.
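One pattern that tends to keep the plan on an index seek and the locks small is to delete by ranges of the clustered key in short transactions instead of TOP over the whole table. A sketch with assumed names: the BCDR table is from the post, but the Id key column, the batch size, and the assumption that the whole table is being purged are mine, so adapt the WHERE clause if only a subset should go:

DECLARE @minId int, @maxId int, @batchSize int;
SET @batchSize = 5000;

SELECT @minId = MIN(Id), @maxId = MAX(Id) FROM BCDR;   -- Id = assumed clustered key column

WHILE @minId <= @maxId
BEGIN
    BEGIN TRAN;
    DELETE FROM BCDR
    WHERE Id >= @minId
      AND Id <  @minId + @batchSize;   -- range predicate on the clustered key gives a seek
    COMMIT TRAN;

    SET @minId = @minId + @batchSize;
END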
Hi everyone, I use Forms authentication for my Report Manager and Report Server, but I have this problem: the logon page runs successfully, but the redirect to Folder.aspx never happens.
To clarify:
>I get the logon page (UILogon.aspx)
>My user has been registered OK (I have checked the entry in the DB to make sure it was created)
>I enter the login and password correctly and the page posts back
The redirect never happens; in the browser it never leaves UILogon.aspx.
Using Win2003, SQL Server 2005, Reporting Services
I am trying to get a grasp of SQL stored procedures, and it seems I don't really understand what DECLARE @Date DateTime means. I think it means that I am just declaring a variable named @Date that will hold a DateTime value. Is that correct, or is there more to it?
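That is essentially what it means: DECLARE creates a local T-SQL variable of the given type, which starts out NULL and can then be assigned and used within the batch or stored procedure. A small illustrative sketch, not tied to any particular procedure:

DECLARE @Date DateTime;                         -- declare a local variable of type DateTime (initially NULL)
SET @Date = GETDATE();                          -- assign the current date and time

SELECT @Date AS CurrentDate;                    -- use it like any other value
SELECT DATEADD(day, 7, @Date) AS OneWeekLater;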
Hi everyone, I am getting that infamous message on an INSERT SQL query. I am doing everything right by the looks of it. All variables are either passed in through a custom form, or else declared and initialised in the body of the script. I post the relevant code below:

sqlInsertEmail = "INSERT INTO CandidateLogins (SiteID, LoginName, CandidateEmail, DateRegistered) " & _
    " VALUES (@SiteID, @LoginName, @CandidateEmail, @DateRegistered); SELECT SCOPE_IDENTITY()"
Try
    sqlSetCandidateEmail.Parameters.Add(New SqlParameter("@SiteID", SqlDbType.Int))
    sqlSetCandidateEmail.Parameters("@SiteID").Value = SiteID
    sqlSetCandidateEmail.Parameters.Add(New SqlParameter("@LoginName", SqlDbType.VarChar))
    sqlSetCandidateEmail.Parameters("@LoginName").Value = userName
    sqlSetCandidateEmail.Parameters.Add(New SqlParameter("@CandidateEmail", SqlDbType.VarChar))
    sqlSetCandidateEmail.Parameters("@CandidateEmail").Value = email
    sqlSetCandidateEmail.Parameters.Add(New SqlParameter("@DateRegistered", SqlDbType.DateTime))
    sqlSetCandidateEmail.Parameters("@DateRegistered").Value = DateRegistered
    sqlSetCandidateEmail = New SqlCommand(sqlInsertEmail, C4LConnection)
    C4LConnection.Open()
    CandidateID = sqlSetCandidateEmail.ExecuteScalar()
Catch Exp As SqlException
    lblResults.Visible = True
    lblResults.Text = "Unable to Register Jobseeker: " & Exp.Message
Finally
    C4LConnection.Close()
End Try

All the variables passed into the SQL statement are initialised, with SiteID being set to '0' rather than Null (none of the fields are nullable in the database table), and I have checked that the SqlDbTypes correspond to the table definition. So far as I can discern, everything is correct and, as can be seen, I am not using a stored procedure in this instance, but the script falls over, producing the error message "Must declare the scalar variable @SiteID", even though SiteID is declared as Int32 further up in the script. Any help would be appreciated.
Need a little help! I am trying to insert a ListItem value from a DropDownList into a database table. However, in the code-behind I continuously get the error "Name 'ddltest' is not declared". As you can see from the code below, ddltest is an object with the ID ddltest. What am I doing wrong?

Protected Sub ddltest_SelectedIndexChanged(ByVal sender As Object, ByVal e As System.EventArgs)
    Dim MyVar As String = ddltest.SelectedItem.Value
    If MyVar = "" Then
        ErrorMessage.Text = "Please select a test"
    Else
        'Insert selection into database
        Dim oConnection As New SqlConnection
        Dim oCommand As SqlCommand
        Dim sConnString As String
        Dim sSQL As String
        sConnString = "Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\xxxxx.mdf;Integrated Security=True;User Instance=false"
        oConnection = New SqlConnection(sConnString)
        sSQL = "INSERT INTO testDB(Myxxxx) Values (@Myxxxx)"
        oConnection.Open()
        oCommand = New SqlCommand(sSQL, oConnection)
        oCommand.Parameters.Add(New SqlParameter("@Myxxxx", MyVar))
        oCommand.ExecuteNonQuery()
        oConnection.Close()
        ErrorMessage.Text = "You selected " & MyVar & " and it has been added to the database."
    End If
End Sub
Is there any way to create a cursor based on a dynamically created select statement? Something like: DECLARE someCRS CURSOR LOCAL FAST_FORWARD FOR @strSelect, where @strSelect has previously been declared as, let's say, varchar. I don't want to create a stored procedure for this.
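You cannot put a variable straight after FOR, but one workaround is to build the whole DECLARE CURSOR statement as dynamic SQL. Note this sketch swaps LOCAL for GLOBAL so that the cursor declared inside EXEC() is still visible to the outer batch; the SELECT inside @strSelect is just an example:

DECLARE @strSelect varchar(1000), @sql varchar(1100), @name sysname;
SET @strSelect = 'SELECT name FROM sys.databases';   -- example dynamic statement

-- declare the cursor inside dynamic SQL; GLOBAL keeps it alive for the connection
SET @sql = 'DECLARE someCRS CURSOR GLOBAL FAST_FORWARD FOR ' + @strSelect;
EXEC (@sql);

OPEN someCRS;
FETCH NEXT FROM someCRS INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT @name;
    FETCH NEXT FROM someCRS INTO @name;
END
CLOSE someCRS;
DEALLOCATE someCRS;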
OMG I'm so stupid, I edited my original post instead of replying!!
I was wondering if there is a way to write a stored procedure where I concatenate several columns to create a phrase, and then use that phrase as a new value to do a second search in another table.
SELECT 'ACRES', '2008', property_char.property_id, ROUND(property_char.value, 0)
FROM property_char
INNER JOIN property ON property_char.property_id = property.id
INNER JOIN property_char AS @pc2 ON property.id = @pc2.property_id
INNER JOIN prop_valuation ON property.id = prop_valuation.property_id
INNER JOIN val_component ON property.id = val_component.property_id
WHERE property_char.property_id < 81695
  AND property_char.property_id = property.id
  AND property_char.prop_char_typ_code = 'SIZE'
  AND property_char.tax_year = '2008'
  AND @pc2.prop_char_typ_code = 'USECD'
  AND (@pc2.value not in ('85','86','87','88','95') AND  --( <=== Review list of Usecodes))
  @pc2.tax_year = '2008'
  AND @pc2.property_id = property.id
  AND property.pact_code = 'REAL'
  AND (property.eff_to_date is null OR property.eff_to_date >= getdate())
  AND prop_valuation.property_id = property.id
  AND prop_valuation.tax_year = '2008'
  AND prop_valuation.local_assed_ind = 'Y'
  AND val_component.value_type = 'MKLND'
  AND val_component.property_id = property.id
  AND val_component.tax_year = '2008'
  AND val_component.modified_value > 0)
  AND NOT EXISTS (Select 'z' from parcel_exclude
                  where property.parcel_number = parcel_exclude.parcel_number
                    AND special_assessment = 'CD')
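For the concatenate-then-search idea, the usual shape is to build the phrase into a variable (or a computed column in a derived table) and then use it in the second lookup. A minimal sketch with hypothetical table and column names, separate from the query above:

DECLARE @Phrase varchar(200);

SELECT @Phrase = RTRIM(col_a) + ' ' + RTRIM(col_b) + ' ' + RTRIM(col_c)
FROM SourceTable                -- hypothetical source table
WHERE id = 12345;               -- hypothetical key

SELECT *
FROM OtherTable                 -- hypothetical table to search
WHERE description = @Phrase;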
I know how to set a variable for the entire query, but how do I set one per row?
I have one query which is growing beyond my ability to understand and maintain it. I factor in the sale price of an item based on cost, our markup, shipping and channel fees. The problem is that each one of these has its own variability. I would like to be able to store inline calculations as a variable so that I can call and add them. Example:
SELECT product
    ,price
    ,Cast(price * 1.1 as decimal(18,2)) as markup
    ,@markup + 5 as 'markup-and-shipping'  -- <-- This factors in all of the previous calculations, plus something new.
FROM #tempPrice
When I try to execute this it does not work; it says: Msg 102, Level 15, State 1, Line 4, Incorrect syntax near '@fdas'. How can I fix that without replacing the @fdas?
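T-SQL will not let a select list reference an alias (or a per-row "variable") defined in that same select list, which is likely why the attempt above fails. One workaround, on SQL Server 2005 or later, is CROSS APPLY: each row computes named intermediate values that later expressions can reuse. A sketch based on the #tempPrice example above:

SELECT p.product,
       p.price,
       calc.markup,
       calc.markup + 5 AS [markup-and-shipping]   -- reuses the per-row value computed below
FROM #tempPrice AS p
CROSS APPLY (SELECT CAST(p.price * 1.1 AS decimal(18,2)) AS markup) AS calc;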