Maintain Session State Throughout Application For Long Time
Mar 31, 2008
Hi,
I need the application's session never to expire. I need to maintain the session even if the user keeps the web application idle for a long time.
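One common way to get this behaviour is to raise the sessionState timeout in web.config and have the browser ping a tiny handler at an interval shorter than that timeout, because any request that touches session state resets the sliding expiration. Below is a minimal C# sketch of such a handler; the file name KeepSessionAlive.ashx and the idea of calling it from a JavaScript timer are assumptions, not something from the original post.

using System;
using System.Web;
using System.Web.SessionState;

// Hypothetical keep-alive handler (KeepSessionAlive.ashx); the page would request it
// periodically, e.g. from a JavaScript setInterval, at an interval shorter than the session timeout.
public class KeepSessionAlive : IHttpHandler, IRequiresSessionState
{
    public void ProcessRequest(HttpContext context)
    {
        // Touching the session is enough to reset its sliding expiration.
        context.Session["LastKeepAlive"] = DateTime.Now;
        context.Response.ContentType = "text/plain";
        context.Response.Write("OK");
    }

    public bool IsReusable
    {
        get { return true; }
    }
}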
The 2.0 version of ASPSTATE is slightly different from the 1.1 version in that one table has one additional column and another table uses a different data type and size for the key. The 2.0 version also has a couple of additional stored procedures.
We'd like to manage just one session state database if possible so we're trying to figure out if Microsoft supports using the new schema for 1.1 session state access (it seems to work, but our testing has been very light).
Is there any official support line on this? If not, can anyone comment on whether or not you'd expect it to work and why?
Hi, can anyone help me with session management in ASP.NET? I have a web application with a login page, and I need to retain the username after the user logs in. This is what I have in the config files.

Global.asax:
<OBJECT RUNAT="SERVER" SCOPE="SESSION" ID="MyInfo" PROGID="Scripting.Dictionary"></OBJECT>

web.config:
<configuration>
  <!-- This section stores your SQL configuration as an appSetting called conn.
       You can name this setting anything you would like. -->
  <appSettings>
    <add key="conn" value="server=ADVAITH;database=tis;uid=sa;pwd=;" />
  </appSettings>
  <!-- This section sets the authentication mode to forms authentication and routes all
       traffic to the specified page. It also specifies a timeout. The authorization section
       below denies all users not authenticated. For testing purposes, custom errors was
       turned off. The section below allows pages to be trace enabled for debugging. -->
  <system.web>
    <sessionState mode="InProc" cookieless="false" timeout="20" />
    <authentication mode="Forms">
      <forms name="MyFormsAuthentication" loginUrl="login.aspx" protection="All" timeout="720" />
    </authentication>
    <authorization>
      <deny users="?" />
    </authorization>
    <customErrors mode="Off" />
    <trace enabled="true" requestLimit="0" pageOutput="true" />
  </system.web>
</configuration>

When I assign the username in the login page it works:
session("username") = txtUsername.Text
But if I use a new session variable, or the username session variable declared in the login page, in the menu page it gives me an error saying "Declaration is expected". Are there any files that I need to import before using sessions? Please help me out. My email: bramanujam@gmail.com
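For what it's worth, "Declaration is expected" in VB usually means an executable statement was placed directly at class level instead of inside a Sub or Function; no extra imports are needed, since Session is a property of the page. A minimal C# sketch of the usual pattern (the VB equivalent is direct), with hypothetical page and control names:

using System;
using System.Web.UI;

// Hypothetical code-behind for login.aspx; txtUsername and btnLogin are assumed control names.
public partial class Login : Page
{
    protected void btnLogin_Click(object sender, EventArgs e)
    {
        // Store the value; it is then available to every page in the same session.
        Session["username"] = txtUsername.Text;
    }
}

// Hypothetical code-behind for the menu page: read the value back inside a method, never at class level.
public partial class Menu : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        string user = (string)Session["username"];
        lblWelcome.Text = "Welcome, " + user;   // lblWelcome is an assumed Label control
    }
}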
I'm currently storing session state in a SQL Server 2000 database for my VS.NET 2002 projects. I'm now developing a new project using VS.NET 2005 and was wondering if I could store the session state in the same SQL Server. Does anyone know if this is possible? If so, can someone point me to an article or code example? All of the examples I have come across are for SQL Express. Thanks in advance, Jason
I have SQL 2005 mirroring working successfully in an ASP.NET 2.0 web application. I also have session state being maintained in SQL Server using the built-in functionality in ASP.NET. The problem is, even with "allowCustomSqlDatabase=True" and specifying a failover partner in the connection string, it appears that ASP.NET does not use the failover partner. It always tries to get the session info from the "data source" server specified in the connection string.
Is there a way to get the session state to fail over to the mirrored server without writing a custom session state provider?
Hey everyone, I'm new to .NET and I've recently inherited a rather large and busy ASP.NET website. I was asked to add a testimonials section on each page that randomly pulls a testimonial out of the DB. This is fine; however, I'm getting random errors about the DB connection either being closed or connecting. Here is the code for the testimonials class:

public SqlDataReader GetTestimonials(ref SqlDataReader reader, int iCatID, string sLanguageType)
{
    SqlCommand cmd = new SqlCommand("sp_DVX_Testimonials_Fetch", Connection);
    cmd.CommandType = CommandType.StoredProcedure;

    cmd.Parameters.Add(new SqlParameter("@Cat_ID", iCatID));
    cmd.Parameters.Add(new SqlParameter("@LanguageType", sLanguageType));

    reader = cmd.ExecuteReader();

    return reader;
}

I know this isn't the best way to do this (especially for each page; this site averages about 1000 hits a day), so I was wondering: is there a way to maintain a single DB connection, set up in Application_Start, so I don't have to worry about this error? If not, does anyone have any ideas as to what would help? Thanks in advance!
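Rather than keeping one shared connection from Application_Start (a single SqlConnection is not thread-safe, and concurrent requests will produce exactly these closed/connecting errors), the usual pattern is to open and close the connection inside the method and let ADO.NET connection pooling make that cheap. A hedged sketch that returns a filled DataTable instead of an open reader so the connection can be disposed inside the method; passing the connection string in as a parameter is just to keep the sketch self-contained:

using System.Data;
using System.Data.SqlClient;

public static DataTable GetTestimonials(string connectionString, int catID, string languageType)
{
    DataTable result = new DataTable();

    // A new SqlConnection per call is fine: the pool reuses the underlying connection.
    using (SqlConnection cn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand("sp_DVX_Testimonials_Fetch", cn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@Cat_ID", catID);
        cmd.Parameters.AddWithValue("@LanguageType", languageType);

        using (SqlDataAdapter da = new SqlDataAdapter(cmd))
        {
            da.Fill(result);   // Fill opens and closes the connection as needed
        }
    }
    return result;
}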
I am working in ASP.NET 2.0 and using SQL Server 2000 as the backend. In my application I need to insert/update an Oracle database table on a different server. Please let me know how I can maintain two different connections to databases on different servers.
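Nothing special is needed to hold two connections at once: each provider has its own connection type and connection string, and they can be open side by side. A hedged C# sketch (ASP.NET 2.0-era, so it uses System.Data.OracleClient, which needs a reference to System.Data.OracleClient.dll); the connection strings, table and column names are placeholders:

using System.Data;
using System.Data.SqlClient;
using System.Data.OracleClient;

public static void CopyValue(string sqlConnStr, string oracleConnStr)
{
    using (SqlConnection sqlCn = new SqlConnection(sqlConnStr))
    using (OracleConnection oraCn = new OracleConnection(oracleConnStr))
    {
        sqlCn.Open();
        oraCn.Open();

        // Read something from SQL Server...
        object value;
        using (SqlCommand read = new SqlCommand("SELECT TOP 1 SomeColumn FROM dbo.SomeTable", sqlCn))
        {
            value = read.ExecuteScalar();
        }

        // ...and insert/update it in Oracle on the other server.
        using (OracleCommand write = new OracleCommand(
            "INSERT INTO SOME_TABLE (SOME_COLUMN) VALUES (:val)", oraCn))
        {
            write.Parameters.Add("val", OracleType.VarChar).Value = value;
            write.ExecuteNonQuery();
        }
    }
}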
I have an ASP.NET (C# 2.0) application that has been using SQL Server 2005 Standard Edition with Service Pack 1 to hold the session state in a testing environment. Currently, the session state is being stored in tempdb rather than the ASPState database. This had worked very well for us until yesterday. We installed SQL Server 2005 Service Pack 2, as well as the Critical Update for Service Pack 2 (KB933508). Once the SQL server was rebooted, I got the following error messages when I tried to access the web application: "The SELECT permission was denied on the object 'ASPStateTempApplications', database 'tempdb', schema 'dbo'. The INSERT permission was denied on the object 'ASPStateTempApplications', database 'tempdb', schema 'dbo'." In the web.config file for the application, I have a SQL username and password defined that can access the ASPState database. To correct this issue, I had to give this user db_datareader and db_datawriter access to tempdb.
Has anyone else run across this problem, and is it related to SQL Server 2005 Service Pack 2?
I am contemplating storing session state data in a SQL Server database (created by running the InstallSqlState.sql script included in the .NET Framework installation) and have established a functioning connection to the database, but I am constantly getting "access denied". I've found that tweaking the permission settings in SQL for the ASP.NET user resolves each specific error that arises, but I was wondering if there is a more "global" resolution. Do I really have to manually check off each individual object and every option, or is that what is needed to resolve the "access denied" error?
We have a Silverlight-based application which currently supports only one production version. The idea is to support three concurrent versions of the same application; users will switch to the newer versions based on their interest, or they can continue with the older version.
We still have to use the existing database for all three versions.
What is the best way to architect this so that we can differentiate the code between the versions, still keep the data in sync, and run all the versions in parallel?
An old website I inherited uses sa to connect to SQL session state and had the details in the web.config. This is bad for security. The session state database is of -sstype "t", which is defined as: "Temporary. Session state data is stored in the SQL Server tempdb database. Stored procedures for managing session state are installed in the SQL Server ASPState database. Data is not persisted if you restart SQL. This is the default." What kind of Windows user, SQL login, role and permissions do I need to create to make session state secure? (Windows Server 2012 and SQL Server 2012, mixed mode authentication, web farm.)
I am using VS2005 (VB) to develop a Pocket PC WM5.0 program, and I am using SQL CE 3.0. My PPC hardware runs at 400 MHz.
The question is: the first insert into the .sdf database after each time the program starts takes a long time. Does anyone know why, and how can I fix it?
I load the whole database into a dataset when the program starts, do all the Insert/Update/Delete work in this dataset, and write it back to the database after each action.
cn.Open()
sda = New SqlCeDataAdapter(SQL, cn)   ' SQL = "Select * From Table"
scb = New SqlCeCommandBuilder(sda)
sda.Update(dataset)
cn.Close()
I checked sda.Update(); it normally takes about 0.08 s to write one record back to the database. But:
1. Start the PPC Program
2. Load DB into dataset
3. Create ONE new record in the dataset
4. Fill back to DB
Every time I go through these four steps, the write takes almost 1 s or even more!
Actually, 0.08 s is just the normal case. Sometimes it still takes over 1 s to write back a dataset with only one inserted record while the program is running (even though all the inserted records contain exactly the same data and differ only in the integer key).
However, when I give up the dataset and use the following code:
cn.Open()
Dim cmd As New SqlCeCommand(SQL, cn)   ' the INSERT statement is built beforehand: Insert Into Table values(... all fields ...)
cmd.ExecuteNonQuery()
cn.Close()
I found the same pattern: the first insert still takes more time, but only about 0.2 s, and the normal insert time is around 0.02 s. It is four times faster!
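For what it's worth, the first-hit cost usually comes from opening the .sdf and compiling the command, so one option is to keep a single SqlCeConnection open for the life of the application and reuse a prepared, parameterized INSERT instead of rebuilding a data adapter and command builder on every save. The thread's code is VB, but this hedged sketch is in C# (the translation is direct); the table and column names are placeholders:

using System.Data;
using System.Data.SqlServerCe;

public class RecordWriter
{
    private SqlCeConnection cn;
    private SqlCeCommand insertCmd;

    public RecordWriter(string connectionString)
    {
        // Opening the .sdf is the expensive step, so do it once and keep the connection open.
        cn = new SqlCeConnection(connectionString);
        cn.Open();

        insertCmd = new SqlCeCommand(
            "INSERT INTO MyTable (ID, Name) VALUES (@ID, @Name)", cn);
        insertCmd.Parameters.Add("@ID", SqlDbType.Int);
        insertCmd.Parameters.Add("@Name", SqlDbType.NVarChar, 100);
        insertCmd.Prepare();   // compile the plan once, reuse it for every insert
    }

    public void Insert(int id, string name)
    {
        insertCmd.Parameters["@ID"].Value = id;
        insertCmd.Parameters["@Name"].Value = name;
        insertCmd.ExecuteNonQuery();
    }
}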
Hello, I have a problem regarding stored procedures and VIEW SERVER STATE.
I have an application with a lot of stored procedures; one of them checks data about the connected users. In SQL Server 2000 I had no problem getting this information, but in SQL Server 2005 I do.
My stored procedure looks like this:
ALTER PROCEDURE [dba].[applsp_GetConnectionInfo]
(
@DBName varchar(100)
)
WITH EXECUTE AS OWNER AS
BEGIN
SET NOCOUNT ON
DECLARE @sCollationMaster VARCHAR(128);
DECLARE @sSqlString VARCHAR(900);
-- Determine collation from master database because collation from master and ultimo database may differ
SELECT @sCollationMaster = CAST(databasepropertyex('master', 'Collation') AS VARCHAR);
SET @sSqlString =
'SELECT max(status) AS Status, max(isnull(SCISUSENAME, ''ULTIMOLOGIN'')) AS Login
, MAX(Rtrim(Rtrim(convert(varchar(255), nt_domain)) + nt_username)) AS NTUser
, max(Rtrim(hostname)) AS Host, MAX(Rtrim(program_name)) AS Program
FROM master.dbo.sysprocesses JOIN dba.SCONNECTIONINFO on SCISPID = CAST(spid AS VARCHAR)
AND ( SCISUSENAME = ISNULL(loginame, '''') COLLATE ' + @sCollationMaster + ' OR ISNULL(loginame, '''') = ''ULTIMOLOGIN'')
WHERE ...... AND DB_NAME(dbid) = ''' + @DBName + '''
GROUP BY hostprocess
ORDER BY Login
';
EXEC(@sSqlString);
END
I've granted VIEW SERVER STATE permission to my user 'dba', which is the db_owner. When I execute the query inside the stored procedure separately as dba I get all the info I need, but when I execute the stored procedure I don't see anything.
I seem to have the same problem with sp_who2: executing it directly gives me information about everyone, but not when I put it in a stored procedure like this:
I have an MS Time Series model using a database of over a thousand products, each of which has hundreds of cases. It amazingly takes only a few minutes to finish processing the model, but when I click Mining Model Viewer to view the models, it takes many hours to show up. Once the window is open, I can choose the model for different products almost instantly. Is this normal?
I am looking at building multiple SSIS packages. There will be some similarities; flexibility is of highest importance. The main packages will need to connect to SQL Server1 as a source and SQL Server2 as a destination to transfer dimension data from multiple databases. (Other SSIS packages may need to use SQL Server2 as a source and SQL Server1 as a destination.)
For a single dimension table containing the column dim_id on the target server (SQLServer2), I need to take the results of the following SQL and insert them into SQLServer2.database.dim_table:
select dim_id from SQLServer1.database08.dim_table
union
select dim_id from SQLServer1.database07.dim_table
union
select dim_id from SQLServer1.database06.dim_table
Next year the names of the databases on SQLServer1 will be database09, database08, database07!
So far my best thought is to create views on my destination SQL Server, so I need some way of dropping and recreating the views. Previously in DTS I would expect to see a SQL Server connection that I could use as both source and destination; now I can see a SQL Server destination but not a source. Also, how do I use SSIS just to run some SQL, i.e. execute a stored procedure, or drop and create views?
Many thanks, Ells. P.S. Flexibility is the key; in the last three months all the IP addresses and server names have changed more than once, so this needs to be as flexible as possible.
After adding the Witness server to the Mirror session, the connection state between the Mirror and the Witness is Disconnected, while the state between the Principal and the Witness is Connected.
The procedures defined in Books Online were used to set up database mirroring; when the Witness server was added to the Mirror session, only the ALTER DATABASE T-SQL statement was executed on the Principal server.
ALTER DATABASE <db_name> SET WITNESS = 'TCP://<servername>:<port>'
After executing the above statement, a few seconds later the state between the Principal and the Witness changed to Connected, while the state between the Mirror and the Witness remained Disconnected.
The Mirror session is not using Certificates, every server is on the same domain, using the same domain login account, and all servers have SP2 installed running Enterprise Edition.
Any ideas why the state between the Mirror and the Witness connection remains Disconnected?
I've got a SQL database that registers people passing in and out via their access cards. Basically, passing in sets a discrete tag to 1 and passing out sets it to 0. SQL logs deltas, so every change is recorded. I now want to query the database for the monthly time they've been in, i.e. how long the tag has been 1. I can pull a monthly list of all the tag's statuses (1 and 0) and times, but how do I put together a query that just gives me the time it has been in status 1? (See the sketch after the query below for one way to do the pairing.)
I use the following to pull a delta list, i.e. changes:
SET NOCOUNT ON
DECLARE @StartDate DateTime
DECLARE @EndDate DateTime
SET @StartDate = DateAdd(wk, -1, GetDate())
SET @EndDate = GetDate()
SET NOCOUNT OFF
SELECT TagName, DateTime = convert(nvarchar, DateTime, 113), vValue, Quality,
       QualityDetail = v_History.QualityDetail, QualityString
FROM v_History
LEFT JOIN QualityMap ON QualityMap.QualityDetail = v_History.QualityDetail
WHERE TagName IN ('Pete_Smith')
  AND vValue <= 1
  AND wwVersion = 'Original'
  AND wwRetrievalMode = 'Delta'
  AND wwRowCount = 1000
  AND DateTime >= @StartDate
  AND DateTime <= @EndDate
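One way to turn those delta rows into "time in status 1" is to do the pairing outside the query (a self-join or windowing query in SQL would achieve the same thing): walk the rows in time order and, for every row where the value is 1, add the gap up to the next logged change (or to the end of the period). A hedged C# sketch, assuming the rows returned by the query above have already been read into (timestamp, value) pairs sorted by time:

using System;
using System.Collections.Generic;

public static TimeSpan TimeInStateOne(IList<KeyValuePair<DateTime, int>> deltas, DateTime periodEnd)
{
    TimeSpan total = TimeSpan.Zero;

    for (int i = 0; i < deltas.Count; i++)
    {
        if (deltas[i].Value == 1)
        {
            // The tag stays 1 until the next logged change, or until the end of the
            // period if no further change was recorded.
            DateTime until = (i + 1 < deltas.Count) ? deltas[i + 1].Key : periodEnd;
            total += until - deltas[i].Key;
        }
    }
    return total;   // does not account for the tag already being 1 before the first row
}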
I have a stored procedure that is called from a VB.NET application and takes an enormously long time to execute. In Query Analyzer it only takes 10 seconds, but from the application it takes ages. The stored procedure is as follows.
The procedure name is SPTOPTWENTYUSERS:
SELECT TOP 20 STRUSERNAME, SUM(INTBYTESRECVD) AS INTDOWNLOAD
FROM TBLISAWEBLOGS
WHERE DTELOGDATE BETWEEN @BEGINDATE AND @ENDDATE
GROUP BY STRUSERNAME
ORDER BY INTDOWNLOAD DESC
The code that runs it is as follows:
sSQLString = "SPTOPTWENTYUSERS"
Using cnn As New SqlConnection(GetPath)
    Try
        Dim cmd As New SqlCommand(sSQLString, cnn)
        Dim dr As SqlDataReader
        With cmd
            .CommandType = CommandType.StoredProcedure
            .CommandTimeout = 0
            .Parameters.Add("@BEGINDATE", SqlDbType.DateTime)
            .Parameters.Add("@ENDDATE", SqlDbType.DateTime)
            .Parameters("@BEGINDATE").Value = dtpStartDate.Value
            .Parameters("@ENDDATE").Value = dtpEndDate.Value
        End With
        cnn.Open()
        dr = cmd.ExecuteReader()
        ' ... the results are read from dr here ...
    Catch ex As Exception
        ' error handling omitted in the original post
    End Try
End Using
Any help on why this happens would be much appreciated.
I made a website in ASP.NET using SQL Server 2005 as the database. There is some data processing that takes a long time (about 20 minutes) and involves a lot of data. It works fine on the dev box, but when I put it on shared hosting and some people access it, it crashes and the website can no longer be accessed. Hosting support told me I may need to rework my code. Does anybody have a solution for this problem? Should I create a new thread?
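One option is to hand the long-running work to a background thread so the request returns immediately and the page polls for completion; note, though, that on shared hosting the worker process can be recycled at any time, so a scheduled task or database job is usually a safer home for 20 minutes of work. A hedged sketch of the thread approach; the class, field and method names are placeholders:

using System.Threading;

public static class LongJob
{
    // Simple status flag the page can poll; volatile so all threads see updates.
    public static volatile bool Running;

    public static void Start(int jobId)
    {
        Running = true;
        ThreadPool.QueueUserWorkItem(delegate(object state)
        {
            try
            {
                ProcessData((int)state);   // the ~20 minute processing goes here
            }
            finally
            {
                Running = false;
            }
        }, jobId);
    }

    private static void ProcessData(int jobId)
    {
        // placeholder for the actual long-running processing
    }
}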
Hi there, we have developed an application in VB connected to SQL Server 6.5, with some stored procedures that bring the data from SQL Server 6.5. This application has been running for some months. When we run it, it usually takes only one minute to generate the report, but for the last couple of days it has been taking 25 minutes. Even when I run the stored procedure directly in Query Analyzer on the server, it takes 15-20 minutes to return the result. Can anyone help me identify the problem? What are all the things I need to check to identify it? Please give me a solution.
Problem: I schedule a job that calls a stored procedure which loads around 1.5 million records. The job takes 19 hours to complete. However, if I run that stored procedure manually in Query Analyzer it takes only 45 minutes.
Has anyone faced this problem? Is this a known problem? Any suggestions/recommendations?
I have a CTE query that is used to fill in NULLs on a history table. The WITH statement executes just fine, under 2 seconds on 974 records; however, the main query is what turns the whole thing into a turtle. I know the looping it does there is causing the slowdown, but I'm just not sure how to fix it. I've tried inserting into a temp table and have refactored the code a hundred times, but nothing seems to work.
The code is below and the execution plan is attached. Server version: 12.0.2342.0, Enterprise, 64-bit.
;WITH BuildTable AS (
    SELECT [GEGTH].[ID]
         , [GEGTH].[Changed By]
         , CAST([dbo].[GetWeekStarting]([GEGTH].[Changed Date], 2) AS DATE) AS WeekOf
         , [GEGT].[Title]
I had a database of electronic resources which had 28,000 records earlier and was working fine. Now we have added a whole bunch more to make it 800K records, which has increased the search time to 14-22 seconds; that is not acceptable. I have all the tables indexed.
Please help me work out how to solve this problem, and let me know what other information I should put up here to make the problem understandable. Thanks in advance, Archana
When I log in using QA to my SQL Server database, it takes 15-20 seconds to establish a connection and open a query window. Drilling into a database via Enterprise Manager is similar. Once the connection is established, the server runs plenty fast, however. Can someone tell me why it could take a long time for a connection to be established? This behavior occurs when I am local on the box. Thanks, John
Hi, I've a strange problem with an INSERT query: it's taking a long time to execute. The format is like this:
INSERT INTO table1
SELECT ..
FROM table2
Executing just the SELECT .. FROM table2 takes 30 seconds. The result is nothing: no records are selected. When I include the INSERT part, it takes 12 hours to complete:
INSERT INTO table1
SELECT ..
FROM table2
There is an index on the table, and when I delete it the problem is still there. Keh? Greetz, Hennie
I have a query which returns approximately 50,000 records, and I am using a linked server to connect to two databases and retrieve data. For some reason it takes a little more than an hour to execute the query; in the SQL Server query window the results come back after a few minutes, but from the application the query runs for a long time.
How can I expedite my query execution?
Environment details
Database: MS SQL Server 2005, 64-bit
MS SQL jar file: sqljdbc_1.2.jar
OS: Windows on both server and client
I'm running a query (see below) on my development server and it's taking around 45 seconds. The server hosts 18 user databases ranging from 3 MB to 400 MB. The production server, which is very similar but has only one 25 MB user database, runs the query in less than 1 second. Both servers have been running on VMware for almost a year with no problems. However, last week I applied SP2 to the development server, and yesterday I applied Critical Update KB934458. The production server is still running SQL Server 2005 Standard SP1. Other than that, both servers are identical and run Windows Server 2003 Standard SP1. I'm not seeing this discrepancy with other queries running against user databases.
use MyDatabase
GO
select db_name(database_id) as 'Database', o.name as 'Table',
s.index_id, index_type_desc, alloc_unit_type_desc, index_level, i.name as 'Index Name',
If I use the following query for a dataset with the literal '99010200101' in place of @ID, the execution takes a few seconds to show results:
SELECT *
FROM dbo.ICParameter
WHERE (PatientID = @ID ) AND (LogTime > DATEADD(day, -1, GETDATE()))
ORDER BY LogTime
If I use @ID instead and enter '99010200101' when prompted for the ID, the execution takes forever. Actually, I have never gotten any results even after waiting for 10 minutes.
It seems that inserting records takes a relatively long time. My guess is that the database needs time to allocate the extra disk space required. Assuming this is true, are there any DB settings that allow space to be allocated automatically in bigger chunks? I am looking for something like a "DB growth factor" or "table growth factor".
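If the pauses really are caused by file growth, the relevant knob is the database files' FILEGROWTH setting (there is no per-table growth factor): pre-size the files and grow them in larger fixed chunks instead of the small defaults, so inserts hit fewer growth events. A hedged sketch that applies the setting from code; the database name and logical file names are placeholders, and the same ALTER DATABASE statements can simply be run in a query window instead:

using System.Data.SqlClient;

public static void SetFileGrowth(string connectionString)
{
    // Grow the data file in 256 MB chunks and the log file in 128 MB chunks
    // (placeholder names: MyDb, MyDb_Data, MyDb_Log).
    const string sql =
        "ALTER DATABASE MyDb MODIFY FILE (NAME = MyDb_Data, FILEGROWTH = 256MB); " +
        "ALTER DATABASE MyDb MODIFY FILE (NAME = MyDb_Log, FILEGROWTH = 128MB);";

    using (SqlConnection cn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand(sql, cn))
    {
        cn.Open();
        cmd.ExecuteNonQuery();
    }
}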
Dear friends, our package normally takes 2 to 2.5 minutes to fetch data over VPN with some SELECT queries, but sometimes its job runs for hours and hours. What could be the exact reason behind this? I guess its queries get stuck somewhere, but not when we run the package from BIDS or run the job manually. Please help. Thanks.