I set up my package for logging to SQL Server. I set up a connection manager for the logging but did not specify a database. Doesn't that mean the logging should default to msdb.sysdtslog90? But when I check that table after running the package, there's nothing in it.
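A couple of things worth checking, as a sketch (assuming SQL Server 2005): the SQL Server log provider writes to dbo.sysdtslog90 in whatever database the logging connection actually lands in, so with no Initial Catalog in the connection string the rows may have gone to the login's default database rather than msdb. Also verify on the Details tab of the logging dialog that events are actually ticked for the package; a provider with no events selected logs nothing.

USE msdb;
SELECT TOP 10 * FROM dbo.sysdtslog90 ORDER BY starttime DESC;
-- An "invalid object name" error here means the table was never created in msdb.

-- If msdb is empty, list the databases and repeat the SELECT in the
-- default database of the login used by the logging connection manager.
SELECT name FROM master.dbo.sysdatabases;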
I am developing a package on my local workstation. I have defined two log providers: one for SQL Server and the other for the Windows Event Log. I am using the Dts.Log method in a Script Task to write log entries.
Logging works properly with the SQL Server provider, and rows are being inserted into the sysdtslog90 table. However, the only events being logged in the Windows Event Log are the package start and end events, which I believe SSIS logs automatically anyway.
Is there something I need to do to enable Windows Event Log logging, other than defining a log provider and making sure it is checked as active? Shouldn't SSIS write to both logs with one Dts.Log call? Any ideas on what might be going wrong with my approach?
Hi, I decided to use the SQL Server log provider to store logging data for all my Integration Services packages, and I created some reports over this data for operations purposes. My problem: the name of the executing package is not always written to the log, only the name of the single task that failed. That is not very useful information for operations, because I see no way to derive the package name from the information logged in the sysdtslog90 table in the database I defined for SSIS logging.
How do I configure the package to always log the package information into the table, too?
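One workaround, sketched below: every row written by one execution shares the same executionid GUID, and the PackageStart/PackageEnd rows carry the package name in their source column. So even when an error row only names the failing task, you can join back through executionid to recover the package name (make sure logging is enabled at the package level so the PackageStart rows exist):

SELECT p.source AS package_name,
       t.source AS task_name,
       t.event,
       t.starttime,
       t.message
FROM dbo.sysdtslog90 AS t
JOIN dbo.sysdtslog90 AS p
  ON p.executionid = t.executionid
 AND p.event = 'PackageStart'
WHERE t.event = 'OnError'
ORDER BY t.starttime DESC;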
I have 2 aspx pages: one is "login.aspx" and the second is "test_connection.aspx". "login.aspx" uses the Membership class for my website's security. If you have restarted your computer and load "login.aspx" first, it works fine and you will see that you can create a user. When you next load (or view) "test_connection.aspx", you will get this error message:
System.Data.SqlClient.SqlException: Cannot open user default database. Login failed.
Login failed for user 'sqluser1'.
at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
If I restart my computer and load "test_connection.aspx" first, that page works just fine, and if I then load "login.aspx", I get this error message:
Cannot open user default database. Login failed.
Login failed for user 'YECIAASPNET'.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.SqlClient.SqlException: Cannot open user default database. Login failed.
Login failed for user 'YECIAASPNET'.
Source Error:
An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.
the page "test_connection.aspx" is using an sql username i have created. I am only using one database in this project and thats it ASPNET.mdf
what's happening? i cant understand. I dont know what im doing wrong. i am very new to this dotNET and MS-SQL. im not sure if this has relation to the other problem i have on the other thread (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=254568&SiteID=1).
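"Cannot open user default database" usually means the login's default database is unavailable at connect time; a common cause with attached .mdf files is that whichever page connects first attaches the database exclusively, so the login used by the second page can no longer open its default. Two hedged fixes: point each connection string at an explicit database (Initial Catalog / Database=...) instead of relying on the default, and/or give the failing logins a default database that is always available. A sketch of the latter ('master' is used purely for illustration):

-- SQL Server 2005 syntax
ALTER LOGIN [sqluser1] WITH DEFAULT_DATABASE = master;

-- SQL Server 2000 / MSDE equivalent
EXEC sp_defaultdb 'sqluser1', 'master';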
I was working on a production server and stopped the SQL Server service along with the SQL Server Agent, since I had to copy an MDF file. Now I have started the service again, but I find that there are no transactions happening. What could be the reason?
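A first diagnostic sketch: check whether the database that owns the copied MDF actually came back online, since a database marked suspect or offline after a service stop would explain the silence ('YourDb' is a placeholder):

EXEC sp_helpdb;            -- look at the status column for suspect/offline databases
-- DBCC CHECKDB ('YourDb');  -- then verify the affected database if it is online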
I'm facing a problem: when more than one user tries to insert a record through the application, only one user's data gets inserted. What might be the problem?
This is the stored proc I'm using:
create proc [dbo].[GSK_insertregion]
    (@Country varchar(50), @Userid int, @RegionId varchar(50))
as
begin
    insert into UserHierarchy (Country, Userid, RegionId)
    values (@Country, @Userid, @RegionId)
end
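For what it's worth, the procedure itself has no logic that could drop a concurrent insert; there is no IF EXISTS check or update that two callers could race on. A quick sanity test, sketched below: run it from two query windows at roughly the same time, and if both rows appear, the loss is more likely in the application layer (for example, a connection or command object shared as a static/singleton across users). Parameter values are placeholders:

EXEC dbo.GSK_insertregion @Country = 'US', @Userid = 1, @RegionId = 'R1';
-- in a second window:
EXEC dbo.GSK_insertregion @Country = 'US', @Userid = 2, @RegionId = 'R2';

SELECT * FROM UserHierarchy WHERE RegionId IN ('R1', 'R2');  -- expect two rows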
Hi, I have deployed a website on a server running Windows 2000 and IIS 5.0. It uses SQL Server 2000, which is on another remote server. While developing, I used the visual tools in VS.NET to make a connection and used the dynamic properties of the connection object to map the connection string to an entry in the config file. This works fine on my development machine, which has IIS and SQL Server 2000 on the same machine. The entry in web.config for my connection string is:
value= " server=xxx.yyy.com; Trusted_Connection=yes;provider=SQLOLEDB.1;Initial Catalog=events; User id=myuser; Password=password;"
where xxx.yyy.com is the server running SQL Server2000.
I do not get any error, but the connection does not happen and my datagrid does not get filled. The code for creating the connection is designer-generated code. Any clues? -svp
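One thing that stands out in that string: Trusted_Connection=yes tells the provider to use Windows authentication, in which case the User id/Password pair is ignored, and across machines the IIS account (e.g. ASPNET) usually has no login on the remote SQL Server, so nothing authorizes. Also, provider=SQLOLEDB.1 is an OleDb keyword that a SqlConnection will not accept. A hedged sketch of the two usual variants (pick the one matching the designer-generated connection type):

value="server=xxx.yyy.com; Initial Catalog=events; User Id=myuser; Password=password;"
(SqlConnection, SQL Server authentication)

value="Provider=SQLOLEDB.1; Data Source=xxx.yyy.com; Initial Catalog=events; Integrated Security=SSPI;"
(OleDbConnection, Windows authentication; requires the IIS account to have a login on the remote server)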
Hello people, I don't expect anyone to know the answer to this but I guess we'll see huh?
I'm using Microsoft SQL Server Management Studio to do all my SQL stuff, and one of the tools that comes with this is a program called Microsoft SQL Profiler 2005. Anyways, so I'm using this profiler to capture SQL processes in the background while I work with the front end.
Here is the problem I'm facing... I am looking at this table, and I see that when I do a certain process on the front end, it inserts 3 rows into the table. So I'm thinking, "I know that in the trace, I should be looking for an insert, or a stored procedure with an insert in it."
So I run the trace on the front-end process, and it shows me all the stored procedures that execute while these 3 rows are inserted into the table (btw, this process does other things besides inserting into this table, but that's what I'm currently working with at the moment).
So what I do here at my job is take this code, tweak it to work with our front end, and voilà... we're good to go. So with that in mind, I made my own stored procedure that calls all of these captured stored procedures.
My result......
I get only 2 rows inserted into the table: the first row and the last row. The middle row isn't inserted, for some mysterious reason. I tried checking all the stored procedures for unique information pertaining to that specific row's insert, but to no avail.
So my question to you guys is.... is there anything I'm overlooking that could possibly be inserting that row in the table? Thanks guys!
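One candidate that typically doesn't show up as a separate statement in a default trace: a trigger on the table, fired by one of the captured procedures, could be inserting the "missing" row. A quick check, sketched for SQL Server 2005 ('YourTable' is a placeholder):

SELECT t.name AS trigger_name,
       OBJECT_NAME(t.parent_id) AS table_name
FROM sys.triggers AS t
WHERE t.parent_id = OBJECT_ID('dbo.YourTable');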
I have two problems I need some help with.

First, I've just inherited a system and am delving into a few timeout problems that are causing problems for the users. Now, if I do a simple select * from the table (which looks to be the cause of the problem at this stage) in QA, I get the results back in less than a second. If I open the table in EM it takes about 10. Is there a difference in viewing the data this way? I'm used to EM being virtually the same speed. There is only one row. Minor question really, just something I'd like to understand if there is a difference.

CREATE TABLE [QUERY] (
    [QUERY_ID] [numeric](18, 0) IDENTITY (1, 1) NOT NULL,
    [CAT_ID] [numeric](18, 0) NOT NULL,
    [QUERY_DESCR] [varchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
    [USER_NAME] [varchar] (40) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
    [USER_ID] [int] NOT NULL,
    [IND_EURO] [char] (1) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL CONSTRAINT [DF_QUERY_IND_EURO] DEFAULT ('N'),
    [IND_DGCOLUMNS] [char] (1) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL CONSTRAINT [DF_QUERY_IND_DGCOLUMNS] DEFAULT ('N'),
    [NO_GROUPS] [int] NOT NULL CONSTRAINT [DF_QUERY_NO_GROUPS] DEFAULT (0),
    [NO_FIELDS] [int] NOT NULL CONSTRAINT [DF_QUERY_NO_FIELDS] DEFAULT (0),
    [NO_LINES] [int] NOT NULL CONSTRAINT [DF_QUERY_NO_LINES] DEFAULT (0),
    CONSTRAINT [PK_QUERY] PRIMARY KEY CLUSTERED ([QUERY_ID]) WITH FILLFACTOR = 90 ON [PRIMARY],
    CONSTRAINT [FK_QUERY_QUERY_CATEGORY] FOREIGN KEY ([CAT_ID]) REFERENCES [QUERY_CATEGORY] ([CAT_ID]) ON DELETE CASCADE ON UPDATE CASCADE
) ON [PRIMARY]
GO

I don't think any re-indexing has been done on this (or the other tables in the db). I was wondering if constant adding/deleting rows could cause the index to be massive and in need of a good clear out. Any pointers would be appreciated. From what I can tell, there were some problems trying to get replication to work. I need to dig deeper to see if this is now correct.

Secondly, there is another table in the same database.

CREATE TABLE [FIELD_DATA] (
    [ID] [numeric](18, 0) IDENTITY (1, 1) NOT NULL,
    [DATA_ID] [numeric](18, 0) NOT NULL,
    [FIELD_ID] [numeric](18, 0) NULL,
    [FIELD_CODE] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NOT NULL,
    [FIELD_VALUE] [numeric](15, 5) NULL,
    CONSTRAINT [PK_FIELDDATA] PRIMARY KEY CLUSTERED ([ID]) WITH FILLFACTOR = 90 ON [PRIMARY]
) ON [PRIMARY]
GO

It holds approx 4 million rows. The rest of the tables have minimal data and about the same amount (consider them the same if you will). Now, another 'copy' of this database is held elsewhere (different client data) and this holds 40 million rows. The difference is that the first DB is 4.5GB and the second 6.5GB (approx). Does this prove my theory that re-indexing would be a good idea?

Thanks
Ryan
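On the re-indexing theory, a sketch of the SQL 2000-era checks that would confirm or refute it before rebuilding anything; heavy insert/delete churn can certainly leave clustered index pages fragmented and half-empty, which inflates the database size:

DBCC SHOWCONTIG ('FIELD_DATA');          -- scan density well below 100% suggests fragmentation
DBCC DBREINDEX ('FIELD_DATA', '', 90);   -- rebuilds all indexes on the table with fill factor 90

As for EM vs QA: Enterprise Manager's "open table" grid retrieves rows through its own cursor-based plumbing rather than a plain SELECT, so a large speed difference between the two is not unusual and doesn't by itself indicate a table problem.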
I recently read the Project REAL ETL design best practices whitepaper. I, too, want to keep doing custom logging as I do today, and also use SSIS logging. The paper recommended using the system variable System::PackageExecutionId to tie the two logging methods together.
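A minimal sketch of how that tie-in can look: write the same GUID into the custom log table that the SSIS log provider writes into sysdtslog90.executionid (the table and column names below are illustrative, not from the whitepaper):

CREATE TABLE dbo.CustomEtlLog (
    ExecutionId uniqueidentifier NOT NULL,  -- populated from the package execution GUID variable
    LogTime     datetime NOT NULL DEFAULT GETDATE(),
    Message     varchar(1000) NULL
);

-- Later, the two logs can be joined on the shared GUID:
SELECT c.LogTime, c.Message, s.event, s.message
FROM dbo.CustomEtlLog AS c
JOIN dbo.sysdtslog90 AS s
  ON s.executionid = c.ExecutionId;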
"The log file for database is full. Back up the transaction log forthe database to free up some log space."Now I only know this way to deal with that manually,Step1. in option , chance Recovery model from FULL to Simple.Step2: go to task to manually shrink the log fileStep3: Change recovery model back from simple to FULL.But by this way, I could get same problem again, the log file is fill,and need free up.Could you give an idea how to prevent this from happening? what andhow should I do???Thanks a lot in advance for your help.
I have a 500-page report that runs per file key. I have referenced some of the body fields in the report's header so that they display on every page; for every file key the report header field should change as well. It works for all file keys, but the reference from the body to the report header does not update when the file-key grouping changes to another group.
Please let me know if there is any possible way to do this.
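One approach worth trying (assuming Reporting Services, which this sounds like): a page header cannot bind to dataset fields directly, but it can reference a textbox in the report body via the ReportItems collection, e.g. =ReportItems!txtFileKey.Value (txtFileKey being a placeholder name for a body textbox that holds the file key). The header then shows the value of that textbox's first instance on the current page, so also set the file-key group to page-break between instances; otherwise a page that spans two groups will keep showing the earlier group's value.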
I have configured an alert like the one below to track all blocked events in SQL Server across all databases, and to kick off a SQL job that inserts data into a table whenever blocking happens. When there is blocking in SQL Server I get an email, which is working fine, and I am able to track all the queries.
But how do I get notified ONLY if the blocking has been going on for more than 30 seconds or 1 minute, without using sp_configure?
---ALERT
USE [msdb]
GO
EXEC msdb.dbo.sp_update_alert
    @name = N'Blocking Process',
    @message_id = 0,
    @severity = 0,
    @enabled = 1,
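One approach that avoids sp_configure entirely, sketched below: instead of reacting to the blocked-event alert, have an Agent job poll every 30-60 seconds and send mail only when a session has already been waiting longer than your threshold. sysprocesses reports the current wait in milliseconds:

SELECT spid,
       blocked        AS blocking_spid,
       waittime       AS wait_ms,
       lastwaittype,
       DB_NAME(dbid)  AS database_name
FROM master.dbo.sysprocesses
WHERE blocked <> 0
  AND waittime > 30000;  -- blocked for more than 30 seconds

-- IF @@ROWCOUNT > 0 ... raise your notification / sp_send_dbmail here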
I am writing a query which selects data for a particular date. I insert values into the table with the current date (today's date), and the record is stored with a full timestamp (2006-07-14 16:12:09). Now when I run the query two or three minutes later to select the records inserted today, it returns no results.
I think it is because of the time portion (16:12 in this case): a couple of minutes later the query is comparing against a different time and so does not match the rows.
Is there a way to ignore the time portion when running the query, so that it fetches all records inserted today, including those inserted at 16:12, no matter what time I run the query today?
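The standard pattern, sketched below: compare the column against a whole-day range instead of an exact value, so the stored time-of-day no longer matters (and an index on the column can still be used). 'YourTable' and 'InsertedOn' are placeholders:

DECLARE @today datetime;
SET @today = DATEADD(day, DATEDIFF(day, 0, GETDATE()), 0);  -- strips the time: midnight today

SELECT *
FROM YourTable
WHERE InsertedOn >= @today
  AND InsertedOn < DATEADD(day, 1, @today);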
declare @error int, @rowcount int
select @rowcount = COUNT(1) FROM STG_BCDR;
while @rowcount > 0
begin
    BEGIN TRAN Deletion
[code]....
With the code above I am trying to delete records batch by batch to avoid locking the BCDR table; the total in BCDR is 40,000 records. However, when I look at the execution plan, the delete against BCDR still does a clustered index scan, which means the locking still happens.
If I change DELETE TOP (5000) to DELETE TOP (5), I get a clustered index seek instead, which is good. The problem is that deleting only 5 records at a time will take a very long time to remove all that data.
How do I handle this situation so I can delete this much data without the table locking? If the table is locked, other users will not be able to access it at the same time.
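The scan usually comes from the WHERE/JOIN predicate rather than the TOP size itself, so the first thing to try is an index on the column(s) the delete filters on; with a supporting index the seek should survive larger batch sizes. A sketch of the batching pattern under that assumption ('SomeKey' is a placeholder for the join column between STG_BCDR and BCDR):

CREATE INDEX IX_BCDR_SomeKey ON BCDR (SomeKey);

DECLARE @rows int;
SET @rows = 1;

WHILE @rows > 0
BEGIN
    BEGIN TRAN Deletion;

    DELETE TOP (5000) b
    FROM BCDR AS b
    JOIN STG_BCDR AS s ON s.SomeKey = b.SomeKey;

    SET @rows = @@ROWCOUNT;

    COMMIT TRAN Deletion;
END

Keeping each batch in its own short transaction also lets lock escalation reset between batches, which is what keeps the table usable by other sessions.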
Hi everyone, I use forms authentication for my Report Manager and Report Server, but I have this problem: the logon page runs successfully, but the redirect to Folder.aspx never happens.
To clarify:
>I get the logon page (UILogon.aspx)
>My user has been registered OK (I have checked the entry in the database to make sure it was created)
>I enter the login and password correctly, and the page posts back
The redirect never happens - in the browser, it never leaves UILogon.aspx.
Using Win2003, SQL Server 2005, Reporting Services
How can I check a username and password against the database? It doesn't need any special authentication, just a lookup in the database; if the user exists, continue to the next page. Thanks
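A minimal sketch, assuming a Users table with UserName and Password columns (names are placeholders; in practice store a salted hash rather than the plain password, and pass the values as parameters to avoid SQL injection):

SELECT COUNT(*)
FROM Users
WHERE UserName = @UserName
  AND Password = @Password;

If the count comes back as 1, the credentials match and you continue to the next page.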
I have a web application accessing a SQL Server database (the usual stuff).
I want to be able to log who did what on which table, and I need to display this information in the web application. Is there an easy way of doing this, rather than duplicating a lot of data?
The best way I have thought of so far is making a new table with the following fields: Table_Changed, Table_Primary_Key, Old_Field_Value, New_Field_Value, User, Date_Changed.
Every time someone changes something, it is logged in this table so that, at any time, I can display who changed what. I have one more question: if I do it this way, is there a way of getting the primary key value of any table? E.g. could I do something like this_table.primary_key.value?
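There is no generic this_table.primary_key.value shortcut; each audit trigger has to name its own table's key. A sketch of what one such trigger could look like over the proposed log table ('Customers', its key 'CustomerId', and the audited column 'Name' are placeholders; the deleted and inserted pseudo-tables hold the old and new values):

CREATE TRIGGER trg_Customers_Audit ON Customers
AFTER UPDATE
AS
BEGIN
    INSERT INTO ChangeLog (Table_Changed, Table_Primary_Key,
                           Old_Field_Value, New_Field_Value, [User], Date_Changed)
    SELECT 'Customers',
           d.CustomerId,
           d.Name,
           i.Name,
           SUSER_SNAME(),
           GETDATE()
    FROM deleted AS d
    JOIN inserted AS i ON i.CustomerId = d.CustomerId
    WHERE ISNULL(d.Name, '') <> ISNULL(i.Name, '');  -- log only rows where the column actually changed
END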
Is there a way to produce a log of all SQL statements hitting a database in a given range of time by a specific SPID? Obviously the SQL Server activity logs do not go into that much detail, except when errors are produced or a change is made to a system table. Is there a setting to add more detail, or to log a specific SPID's actions, or maybe a third party software that will give me what I am looking for?
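If a server-side trace (the engine behind Profiler) is already writing to a file, its output can be filtered after the fact by SPID and time window. A sketch, with the file path, SPID, and times as placeholders:

SELECT TextData, SPID, StartTime
FROM ::fn_trace_gettable('C:\Traces\mytrace.trc', DEFAULT)
WHERE SPID = 53
  AND StartTime BETWEEN '2006-07-14 09:00' AND '2006-07-14 10:00'
ORDER BY StartTime;

If no trace is running, Profiler (or sp_trace_create for a lighter server-side trace) with a filter on SPID is the usual way to capture this level of detail; the built-in activity logs never will.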
Does SQL 7.0 have any built-in logging capabilities to identify row-level actions by operator? For instance, can it tell me that a particular user deleted or inserted a row? How would I tell it who the operator is?
I've been asked to write a trigger that will log changes to certain fields in certain tables, then create a front-end application where the user will be able to review the info. The front-end app is not a problem for me - the trigger is. I have found examples of how to do this on UPDATE when it's a complete row you want to log, but not a specific field. In addition, I also need to know if someone is attempting to read certain data, and who that user is; if the user is not someone who is allowed to read the data, I need to send an email alert. I believe it's possible to do the above (despite my lack of knowledge :) - does anyone know where I can get more information on how to accomplish this, or where to start looking? Thanks to any who can guide me in the right direction.
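For the field-level part, a sketch: inside an UPDATE trigger, the UPDATE(column) function reports whether that column appeared in the statement's SET list, so logging can be restricted to the fields you care about ('Employees', 'Salary', and 'AuditLog' are placeholders):

CREATE TRIGGER trg_Employees_Audit ON Employees
AFTER UPDATE
AS
BEGIN
    IF UPDATE(Salary)
        INSERT INTO AuditLog (TableName, FieldName, ChangedBy, ChangedAt)
        SELECT 'Employees', 'Salary', SUSER_SNAME(), GETDATE();
END

The read-detection half is a different story: SELECTs do not fire triggers, so detecting who attempted to read data has to happen outside them - for example with a trace/Profiler on the relevant objects, or by funnelling reads through a stored procedure that performs the permission check and sends the alert.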
Hi, I'm currently playing around with ASP.NET. Is there a way to log all queries that are sent to the SQL Server, something like MySQL's query.log?
I have a problem accessing MSDE. My server runs as "NT AUTHORITY/NETWORK SERVICE", so it is not allowed to connect through Windows authentication. It seems that the password for the "sa" account was changed during setup; reading the setup logs I can see that it was changed, but I cannot see what it was changed to. How would I set up a new account that I could use to access the server through SQL authentication?
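One hedged route, assuming you can log on to the machine as a local administrator (whose group is a sysadmin login on MSDE by default) and that the instance allows mixed-mode authentication: connect with Windows authentication from the command line (osql -E) and create a fresh SQL login rather than recovering 'sa'. The names and password below are placeholders:

EXEC sp_addlogin 'appuser', 'StrongP@ssw0rd!';
GO
USE YourDb
GO
EXEC sp_grantdbaccess 'appuser';
EXEC sp_addrolemember 'db_owner', 'appuser';

Alternatively, the same admin connection can reset sa with: EXEC sp_password NULL, 'NewPassword', 'sa';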
Let's say I have version 1 of a database - DB1. I am creating the second database, DB2.
What I need is a log of all the SQL statements that were used to change DB1 into DB2. This means recording both what happened in the GUI and in SQL Query Analyzer.
Is there a way I can do this? I know SQL Server has a transaction log somewhere. Is there a way to set it to output all the changes made to a database from a set date into a SQL log file?