My question is: by what percentage does the .mdf file grow when "Automatically grow file" is turned on? My database is 94% used and the file has not grown. I have read several docs and found nothing. Thanks.
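For what it's worth, the growth increment is whatever the file's FILEGROWTH setting says; it is not a fixed percentage. A quick way to check (and change) it, assuming SQL Server 2005 or later and a hypothetical database name:

-- Run in the database in question; growth is a page count unless is_percent_growth = 1,
-- in which case it is a percentage. (On SQL Server 2000, sp_helpfile shows the same information.)
SELECT name, size, growth, is_percent_growth, max_size
FROM sys.database_files;

-- Example: make the data file grow by 10% each time (the logical file name is hypothetical).
ALTER DATABASE MyDatabase
MODIFY FILE (NAME = MyDatabase_Data, FILEGROWTH = 10%);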
We are encountering a problem where reportservertempdb grows extremely large: the sessiondata and snapshotdata tables are about 10 GB each at the moment and keep growing and growing. The CleanupCycleMinutes configuration is set to the default of 10. We are not taking snapshots of our reports. Shouldn't those tables be cleaned up every 10 minutes with this setting?
What can we do to stop this database from growing? Is truncating those tables on a weekly basis a solution? We are using SQL Server Reporting Services 2005 SP2 with Cumulative Hotfix Package 5 (Build 3215).
ASP.NET 2.0 (VB) application with SQL Server 2000, installed on a Windows 2000 Professional server. Fewer than 10 users use this application.
The problem is that sometimes the application doesn't respond; it is very slow to retrieve information, and after a while it stops responding altogether, just showing an hourglass.
I use a helper class (Application Block). Since SQL Server 2000 doesn't have MARS, I manually close the connection and open it again every time.
Any ideas? Thanks in advance.
Helper Class Modified Code:
--------------------------------------
Code:
' If the provided connection is not open, we will open it
If connection.State <> ConnectionState.Open Then
connection.Open()
mustCloseConnection = True
Else
mustCloseConnection = False
End If
Data ObjectBase : ---------------------------
'The transaction will call this method and also check the connection status
Public Sub BeginTransaction()
Try
If Not (myTransaction Is Nothing) Then
'Throw New Exception(ConfigManager.ReadFromXml("msg_DataAccess_Begin_trans"))
Throw New Exception("Transaction Error in Begin")
End If
If ourConnection.State <> ConnectionState.Open Then
I currently have a SQL backup process that backs up my databases over the network to a backup hard drive on a separate system. I recently began having strange issues with my backup process: it continually writes to the backup drive until the drive fills up and then the job fails. I also noticed that when I kill the job on the host server, the backup file drops back to its normal size. The normal file size is 300 GB, but it has grown to over 400 GB. I looked at various logs and even performed several backup tests successfully.
I am trying to figure out whether this is a known SQL Server issue or an issue with the OS.
Hi, the tempdb file on one of our servers grew very large and used all available disk space. This is SQL Server 2000 SP4. I have installed hotfix version 8.00.2187. I opened a Profiler trace but still can't get to the root of the problem. Any help will be appreciated. Egbon
This managed application was written to run on a Symbol 3090 Windows CE 5.0 scanning device. We are using the Symbol-provided classes to access the scanning interface and a SQL Compact database on the device to collect the scanned data, then using merge replication to synchronize the scanned data when the device is docked. The problem we have experienced seems to be related to performance when inserting and updating records in the database.
We have tested inserting/updating 1000 randomly generated records into the database. At first the time to commit a record increases while the database flushes (the flush interval in the connection string property is 10 seconds by default), and then, as the database grows, the time to commit every single record keeps increasing, which causes the application to perform slowly as items are scanned into the database. However, the device's program memory remains consistent while items are scanned. From our tests, I found that executing a single update/insert command on a 2 MB SQL Mobile database (up to 10,000 records, depending on the size of the columns) takes nearly 2 to 2.5 seconds to complete. Below is the only code I am executing,
I have a publisher database set up for merge replication, using a parameterized filter with join filters.
I also have a stored procedure that does deletes and inserts on the table the parameterized filter is applied to, to aid in changing a subscriber's eligibility to receive particular data. I have observed that running the stored procedure takes extraordinarily long, and as a result the log file grows to a size 1.5 to 2.5 times the database size.
At first I reasoned that this might be because I had it set up to use precomputed partitions, and changing the data requires recalculating the partitions. As a test, I turned off precomputed partitions. That didn't work. I turned on "optimize synchronization," a.k.a. "keep_partition_changes," which normally is not available when precomputed partitions are on, and that didn't work either.
At this point I think I can rule out precomputed partitions as the problem, but I'm stumped as to what else I can do to reduce the amount of log writes required. We do need the parameterized filters and join filters, so those can't go.
I am using SQL Server 2005 and trying to create a linked server to Oracle 10. I used the commands below:
EXEC sp_addlinkedserver @server = 'test1', @srvproduct = 'Oracle', @provider = 'MSDAORA', @datasrc = 'testsource'
EXEC sp_addlinkedsrvlogin @rmtsrvname = 'test1', @useself = 'false', @rmtuser = 'sp', @rmtpassword = 'sp'
When I execute SELECT * FROM test1...COUNTRY I get the error: "The OLE DB provider "MSDAORA" for linked server "..." does not contain the table "COUNTRY". The table either does not exist or the current user does not have permissions on that table." The 'sp' user I am connecting as is the owner of the table. What could be the problem? Thanks a lot.
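One thing worth checking (a common cause, though not guaranteed to be the issue here): the MSDAORA provider generally expects the Oracle owner and table name exactly as stored in the Oracle data dictionary, which is usually uppercase, and in four-part rather than three-part form. Two hedged variants to try:

-- Four-part name with the Oracle schema (owner) spelled out in uppercase:
SELECT * FROM test1..SP.COUNTRY

-- Or push the query down to Oracle as-is:
SELECT * FROM OPENQUERY(test1, 'SELECT * FROM SP.COUNTRY')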
I have created a table named Table with name as varchar and id as int. I started inserting rows like: insert into Table values ('arun', 20). Yes, that row inserted into the table. Now I have the values "arun's", 50: insert into Table values ('arun's', 20). SQL Server gives me an error instead of inserting the row. How would you solve this problem?
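A minimal sketch of the two usual fixes (the table and column names follow the question): double the embedded single quote inside a literal, or, better, pass the value as a parameter so no escaping is needed at all.

-- Escape the apostrophe by doubling it inside the string literal:
INSERT INTO [Table] (name, id) VALUES ('arun''s', 20)

-- Or avoid the problem entirely with a parameterized statement:
EXEC sp_executesql
    N'INSERT INTO [Table] (name, id) VALUES (@name, @id)',
    N'@name varchar(50), @id int',
    @name = 'arun''s', @id = 20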
I have a table called Status; one field in that table is currentstatus. I want to move the rows whose currentstatus is "ticket closed" into another table called Repository, which has the same structure as the Status table. I can do this programmatically, but is there any way for the system to check every 3 months and perform this action, i.e. move the rows to the Repository table automatically?
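A minimal sketch of the move itself (assuming the two tables really do have identical structures); to run it automatically every 3 months, this script can be placed in a SQL Server Agent job with a quarterly schedule:

BEGIN TRANSACTION;

-- Copy the closed tickets into the archive table...
INSERT INTO Repository
SELECT * FROM Status WHERE currentstatus = 'ticket closed';

-- ...then remove them from the source table.
DELETE FROM Status WHERE currentstatus = 'ticket closed';

COMMIT TRANSACTION;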
I'm inserting from TempAccrual into VacationAccrual. It works nicely; however, if I run this script again it will insert the same values into VacationAccrual again. How do I block that? If there is a small change in one of the columns in TempAccrual, then the insert should be allowed. Here is my query:
INSERT INTO vacationaccrual (empno, accrued_vacation, accrued_sick_effective_date, accrued_sick, import_date)
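The SELECT part of the query is cut off above, but a common pattern for "insert only what is new or changed" (a sketch, with the TempAccrual column names assumed to match the INSERT list) is to filter with NOT EXISTS against the target table:

INSERT INTO vacationaccrual
    (empno, accrued_vacation, accrued_sick_effective_date, accrued_sick, import_date)
SELECT t.empno, t.accrued_vacation, t.accrued_sick_effective_date, t.accrued_sick, t.import_date
FROM TempAccrual AS t
WHERE NOT EXISTS
    (SELECT 1
     FROM vacationaccrual AS v
     WHERE v.empno = t.empno
       AND v.accrued_vacation = t.accrued_vacation
       AND v.accrued_sick_effective_date = t.accrued_sick_effective_date
       AND v.accrued_sick = t.accrued_sick);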
For reasons that are not relevant (though I explain them below *), I want, for all my users whatever their privilege level, an SP which creates and inserts into a temporary table and then another SP which reads and drops the same temporary table. My users are not able to create dbo tables (e.g. dbo.tblTest), but are permitted to create tables under their own user (e.g. MyUser.tblTest). I have found that I can achieve my aim by using code like this:

SET @SQL = 'CREATE TABLE ' + @MyUserName + '.' + 'tblTest(tstID DATETIME)'
EXEC (@SQL)
SET @SQL = 'INSERT INTO ' + @MyUserName + '.' + 'tblTest(tstID) VALUES(GETDATE())'
EXEC (@SQL)

This becomes exceptionally cumbersome for the complex INSERT & SELECT code. I'm looking for a simpler way. Simplified down, I am looking for something like this:

CREATE PROCEDURE dbo.TestInsert AS
CREATE TABLE tblTest(tstID DATETIME)
INSERT INTO tblTest(tstID) VALUES(GETDATE())
GO
CREATE PROCEDURE dbo.TestSelect AS
SELECT * FROM tblTest
DROP TABLE tblTest

In the above example, if the SPs are owned by dbo (as above), CREATE TABLE & DROP TABLE use MyUser.tblTest while INSERT & SELECT use dbo.tblTest. If the SPs are owned by the user (e.g. MyUser.TestInsert), it works correctly (MyUser.tblTest is used throughout), but I would have to have a pair of SPs for each user.

* I have an MS Access ADP front end linked to a SQL Server database. For reports with complex datasets, it times out. Therefore it suits my purposes to create a temporary table first and then to open the report based on that temporary table.
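One avenue worth sketching (not necessarily the only answer): #temporary tables can be created by any user regardless of CREATE TABLE permissions, and they live in tempdb rather than under a user schema, so the owner-resolution problem above goes away. The catch is that a #table created inside a stored procedure is dropped automatically when that procedure ends, so for the two-procedure pattern it has to be created one scope up, in the calling batch, where it is then visible to both procedures on the same connection:

-- Run by any user, on a single connection:
CREATE TABLE #tblTest (tstID DATETIME)   -- no special permissions needed
EXEC dbo.TestInsert    -- this procedure can INSERT INTO #tblTest
EXEC dbo.TestSelect    -- and this one can SELECT FROM #tblTest
DROP TABLE #tblTest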
The following dbo.Tables of Northwind.mdf in my .SQLEXPRESS (SQL Server Management Studio Express) are missing: dbo.Categories dbo.CustomerCustomerDemo dbo.CustomerDemographics dbo.Customers dbo.Employees dbo.EmployeeTerritories dbo.Order Details dbo.Orders dbo.Products dbo.Regions dbo.Shippers dbo.Suppliers dbo.Territories.
But I have these dbo tables in a different database, "xyzDatabase". How can I copy each of these dbo tables into the corresponding blank dbo table of the Northwind database?
I right-clicked on dbo.Categories and saw the following context menu: New Table..., Modify, Open Table, Script Table as (CREATE To, DROP To, SELECT To, INSERT To, UPDATE To, DELETE To, each going to New Query Editor Window, File..., or Clipboard). From this observation, I think it is possible to copy a dbo table from one database into the Northwind database that needs to be repaired. Please help and advise me how to do this task, or tell me where I can find the Microsoft documentation that gives the details of this copy operation.
Thanks in advance, Scott Chang
P.S. I am using VB 2005 Express to create a project to learn "Calling Stored Procedures with ADO.NET" (see Paul Kimmel's article at http://www.developer.com/db/article.php/3438221), which needs the dbo tables of the Northwind database, and my Northwind database has been messed up for quite a while and needs a big repair.
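For reference, a minimal sketch of copying one table across databases on the same instance (assuming xyzDatabase really contains an intact copy; the column names below are those of the standard Northwind Categories table). Repeat per table, or use "Script Table as > CREATE To" to recreate the schema first and then INSERT...SELECT the data:

-- Create the table in Northwind and copy the data in one step...
SELECT * INTO Northwind.dbo.Categories
FROM xyzDatabase.dbo.Categories;

-- ...or, if an empty Categories table already exists in Northwind:
SET IDENTITY_INSERT Northwind.dbo.Categories ON;
INSERT INTO Northwind.dbo.Categories (CategoryID, CategoryName, Description, Picture)
SELECT CategoryID, CategoryName, Description, Picture
FROM xyzDatabase.dbo.Categories;
SET IDENTITY_INSERT Northwind.dbo.Categories OFF;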
--Table 1 "Employee" CREATE TABLE [MyCompany].[Employee]( [EmployeeGID] [int] IDENTITY(1,1) NOT NULL, [BranchFID] [int] NOT NULL, [FirstName] [varchar](50) NOT NULL, [MiddleName] [varchar](50) NOT NULL, [LastName] [varchar](50) NOT NULL, CONSTRAINT [PK_Employee] PRIMARY KEY CLUSTERED ( [EmployeeGID] ) GO ALTER TABLE [MyCompany].[Employee] WITH CHECK ADD CONSTRAINT [FK_Employee_BranchFID] FOREIGN KEY([BranchFID]) REFERENCES [myCompany].[Branch] ([BranchGID]) GO ALTER TABLE [MyCompany].[Employee] CHECK CONSTRAINT [FK_Employee_BranchFID]
-- Table 2 "Branch" CREATE TABLE [Mycompany].[Branch]( [BranchGID] [int] IDENTITY(1,1) NOT NULL, [BranchName] [varchar](50) NOT NULL, [City] [varchar](50) NOT NULL, [ManagerFID] [int] NOT NULL, CONSTRAINT [PK_Branch] PRIMARY KEY CLUSTERED ( [BranchGID] ) GO ALTER TABLE [MyCompany].[Branch] WITH CHECK ADD CONSTRAINT [FK_Branch_ManagerFID] FOREIGN KEY([ManagerFID]) REFERENCES [MyCompany].[Employee] ([EmployeeGID]) GO ALTER TABLE [MyCompany].[Branch] CHECK CONSTRAINT [FK_Branch_ManagerFID]
--Foreign IDs = FID
--Generated IDs = GID
Then I try a simple single-row DELETE:
DELETE FROM MyCompany.Employee WHERE EmployeeGID= 39
Well, this might look like a very basic error: I get this error after trying to delete something from the table "Employee":
The DELETE statement conflicted with the REFERENCE constraint "FK_Branch_ManagerFID". The conflict occurred in database "MyDatabase", table "myCompany.Branch", column 'ManagerFID'.
Yes, what I've been doing is deactivating the foreign key constraint in both tables when performing these kinds of operations, and the same if I try to delete a "Branch" entry; basically each entry in "Branch" and "Employee" is a child of the other, which makes things more complicated.
My question is: is there a simple way to overcome this obstacle without having to deactivate the foreign key constraints every time, or a good way to prevent this from happening in the first place? Is this where I have to use "ON DELETE CASCADE" or something?
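Not a definitive answer, but ON DELETE CASCADE will not help directly here: SQL Server rejects cascade paths that form a cycle, and this schema has Employee and Branch referencing each other. One common approach is to break the dependency just before the delete, for example (EmployeeGID values 39 and 40 are hypothetical):

-- The employee being deleted may be a branch manager, so repoint the branch first...
UPDATE MyCompany.Branch
SET ManagerFID = 40                -- some other, still-valid employee
WHERE ManagerFID = 39;

-- ...or, if the business rules allow, make ManagerFID nullable and clear it instead:
-- ALTER TABLE MyCompany.Branch ALTER COLUMN ManagerFID int NULL;
-- UPDATE MyCompany.Branch SET ManagerFID = NULL WHERE ManagerFID = 39;

DELETE FROM MyCompany.Employee WHERE EmployeeGID = 39;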
Banti writes "IF i create temporary table by using #table and ##table then what is the difference. i found no difference. pls reply. first: create table ##temp ( name varchar(25), roll int ) insert into ##temp values('banti',1) select * from ##temp second: create table #temp ( name varchar(25), roll int ) insert into #temp values('banti',1) select * from #temp
both works fine , then what is the difference waiting for ur reply Banti"
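The difference is scope rather than syntax: a #table is local to the connection (or stored procedure) that created it, while a ##table is global and visible to every connection until the creating session ends and no one else is using it. A quick sketch to see it, run from two separate connections:

-- Connection 1:
create table #temp  ( name varchar(25), roll int )   -- local: only this connection sees it
create table ##temp ( name varchar(25), roll int )   -- global: every connection sees it

-- Connection 2 (while connection 1 is still open):
select * from ##temp   -- works
select * from #temp    -- fails: Invalid object name '#temp'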
I would like to return the nearest date from Table B for each row in my table, for example:
ID W001 in Table B should return ID A002, CreatedDatetime: 2014-06-03 20:05:48.000
ID W002 in Table B should return ID A004, CreatedDatetime: 2014-06-04 01:05:48.000
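The table definitions are not shown, so the following is only a sketch under assumed names (TableA holding the A-IDs with a CreatedDatetime column, TableB holding the W-IDs with some datetime column, here called SomeDatetime, to match against). The usual pattern for "nearest date" is OUTER APPLY with TOP (1) ordered by the absolute time difference:

SELECT b.ID,
       n.ID AS NearestID,
       n.CreatedDatetime
FROM TableB AS b
OUTER APPLY
    (SELECT TOP (1) a.ID, a.CreatedDatetime
     FROM TableA AS a
     ORDER BY ABS(DATEDIFF(SECOND, a.CreatedDatetime, b.SomeDatetime))) AS n;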
I have a column defined as smalldatetime, default length (4), and "allow NULLs" is checked. In the Enterprise Manager UI, when I enter data into that table row, if I just tab past that column, all is well and the value is represented in the UI as <NULL>. The problem comes once I have ever entered a date into that column. Say I have entered a date (all is well), and now I want to remove that entry and go back to NULL (after the date value has been committed, in a different entry session, say). How is that done? It seems that once a date has ever been entered into that column, if I try to remove it I get the error "The value you entered is not consistent with the data type or length of the column, or over grid buffer limit". I have tried deleting the value, entering spaces, entering the string NULL or the string <NULL>, and maybe some other things as well, but nothing works; I always get that error message and am not allowed to move past that cell until I restore a date value to it. I want to get back to <NULL>. Anybody know? Thank you. Tom
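Two things worth trying: in the table-editing grid the keyboard shortcut Ctrl+0 (if I remember it correctly) sets the current cell to NULL; and regardless of the UI, a plain UPDATE always works (table and column names below are placeholders):

-- Set the column back to NULL directly in T-SQL:
UPDATE dbo.MyTable
SET MySmallDateTimeColumn = NULL
WHERE MyKeyColumn = 42;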
I am stuck on finding a solution to transpose source data from a system, via a metadata look-up table, into a destination table. I need a method to transpose/pivot the source data into columns (which are of various data types). The data type for each column is listed in a metadata table.
Source Data Table:
Table Name: Source
SrcID  AGE  City    Date
01     32   London  01-01-2013
02     35   Lagos   02-01-2013
03     36   NY      03-01-2013
Metadata Table:
Table Name: Metadata
MetaID  Column_Name  Column_type
11      AGE          col_integer
22      City         col_character
33      Date         col_date
Destination table:
The source data is to be loaded into the destination table (as shown below):
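The destination layout itself is not shown above, so the following is only a sketch under assumptions: a destination table named Destination whose columns are named after Metadata.Column_Name, a guessed type mapping for the col_* codes, and the Source table as listed. The idea is to build the column list and CAST expressions from the metadata table with dynamic SQL and then run the load:

DECLARE @colList nvarchar(max), @castList nvarchar(max), @sql nvarchar(max);

-- "[AGE], [City], [Date]" built from the metadata table.
SELECT @colList = STUFF((
        SELECT ', ' + QUOTENAME(Column_Name)
        FROM Metadata ORDER BY MetaID
        FOR XML PATH('')), 1, 2, '');

-- "CAST([AGE] AS int) AS [AGE], ..." (the type mapping here is an assumption).
SELECT @castList = STUFF((
        SELECT ', CAST(' + QUOTENAME(Column_Name) + ' AS '
             + CASE Column_type
                   WHEN 'col_integer'   THEN 'int'
                   WHEN 'col_character' THEN 'varchar(100)'
                   WHEN 'col_date'      THEN 'datetime'
                   ELSE 'varchar(100)'
               END
             + ') AS ' + QUOTENAME(Column_Name)
        FROM Metadata ORDER BY MetaID
        FOR XML PATH('')), 1, 2, '');

SET @sql = N'INSERT INTO Destination (SrcID, ' + @colList + N') '
         + N'SELECT SrcID, ' + @castList + N' FROM Source;';

EXEC sp_executesql @sql;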
I am inserting into a temp table without even creating it first, but this does not give a compilation error. Only when I execute the stored procedure do I get the error message that there is an invalid temp table. Shouldn't this result in a compilation error rather than an error at execution time?
--create the procedure and insert into the temp table without creating it.
--no compilation error.
CREATE PROC testTemp
AS
BEGIN
    INSERT INTO #tmp(dt)
    SELECT GETDATE()
END
Only when calling the proc does this give an execution error.
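This is SQL Server's deferred name resolution: objects referenced in a procedure that do not exist at CREATE time are only resolved when the procedure runs, so the CREATE PROC succeeds and the missing #tmp is only noticed at execution. Creating the temp table inside the procedure before the INSERT (a minimal sketch) removes the runtime error:

CREATE PROC testTemp
AS
BEGIN
    -- Create the temp table in the same scope before using it.
    CREATE TABLE #tmp (dt datetime);

    INSERT INTO #tmp (dt)
    SELECT GETDATE();
END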
I created an inventory table with a few columns, say ServerName, Version, patching details, etc.
I want tracking on the table.
Let's say people are asked to modify the base table, and I want a complete capture of the details modified and the user session (SYSTEM_USER) that is actually modifying the details.
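One common way to do this is an AFTER UPDATE trigger that writes the old and new values plus SYSTEM_USER into a history table. A sketch under assumptions (the inventory table is called dbo.Inventory, ServerName is its key, and the audit table below is hypothetical):

-- Hypothetical audit table.
CREATE TABLE dbo.InventoryAudit (
    AuditID     int IDENTITY(1,1) PRIMARY KEY,
    ServerName  varchar(100),
    OldVersion  varchar(50),
    NewVersion  varchar(50),
    ModifiedBy  sysname  DEFAULT SYSTEM_USER,   -- who made the change
    ModifiedAt  datetime DEFAULT GETDATE()      -- when it happened
);
GO

CREATE TRIGGER trg_Inventory_Audit
ON dbo.Inventory
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- Join the old (deleted) and new (inserted) row images to capture what changed.
    INSERT INTO dbo.InventoryAudit (ServerName, OldVersion, NewVersion)
    SELECT d.ServerName, d.Version, i.Version
    FROM deleted AS d
    JOIN inserted AS i ON i.ServerName = d.ServerName;
END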
I have 2 tables: Order (ID, Quantity) and Product (ID, Name, Price), and I want to add a calculated field to the Order table based on the Price column in the Product table. How do I do that?
This query returns the values I want in the table:
select a.quantity * b.price from tblCustomerPurchases as a join tblProduct as b on a.ID=b.ID
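A computed column in SQL Server cannot reference another table directly, so the usual options are a view like the sketch below (using the table and column names from the query above), or wrapping the price lookup in a scalar user-defined function and using that function in a computed column:

CREATE VIEW dbo.vwOrderTotals
AS
SELECT a.ID,
       a.quantity,
       a.quantity * b.price AS LineTotal   -- the "calculated field"
FROM tblCustomerPurchases AS a
JOIN tblProduct AS b ON a.ID = b.ID;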
SQL Server 2005 SP2 Error: Script failed for Table 'dbo.TableName'
I keep getting this error when trying to generate a snapshot for transactional replication. Here is the publishing table schema:
CREATE TABLE [dbo].[viMediaPlaylist](
    [MemberID] [bigint] NOT NULL,
    [xmlPlaylist] [xml](CONTENT [dbo].[viMediaPlaylistCollectionSchema]) NOT NULL,
    PRIMARY KEY CLUSTERED ( [MemberID] ASC )
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
              ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON, FILLFACTOR = 90) ON [PRIMARY]
) ON [PRIMARY]
I tried the following: #1) Created the table schema at the subscriber manually and changed the article to keep the existing object as is. Got the same error during the snapshot.
#2) Filtered the article to NOT include the XML column. This worked, no error was generated, and replication was up and running.
So my question is: what is the problem with the XML column being replicated? Any ideas how I can make this work? At this point I'm not sure what the problem is with the XML column, but it seems like it tries to run the schema script even though I'm not asking it to create the schema; I already did that myself. Thanks!
Here is a log dump from the distributor:
2007-09-11 16:40:14.56 SQL Command dump
2007-09-11 16:40:14.56 ================
2007-09-11 16:40:14.56 Server: SQL02
2007-09-11 16:40:14.56 Database: Video
2007-09-11 16:40:14.56 Command Text: sys.sp_releaseapplock
2007-09-11 16:40:14.56 Parameters:
2007-09-11 16:40:14.56 @Resource = SQL02-Video_viMediaPlaylis-71
2007-09-11 16:40:14.56 @LockOwner = Session
2007-09-11 16:40:14.56 @DbPrincipal = db_owner
2007-09-11 16:40:15.17 [0%] The replication agent had encountered an exception.
2007-09-11 16:40:15.17 Source: Unknown
2007-09-11 16:40:15.17 Exception Type: Microsoft.SqlServer.Management.Smo.FailedOperationException
2007-09-11 16:40:15.17 Exception Message: Script failed for Table 'dbo.viMediaPlaylist'.
2007-09-11 16:40:15.17 Message Code: Not Applicable
2007-09-11 16:40:15.17
2007-09-11 16:40:15.17 Call Stack:
2007-09-11 16:40:15.17 Microsoft.SqlServer.Management.Smo.FailedOperationException: Script failed for Table 'dbo.viMediaPlaylist'. ---> Microsoft.SqlServer.Management.Smo.UnsupportedVersionException: Either the object or one of its properties is not supported on the target server version.
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Management.Smo.UserDefinedDataType.GetTypeDefinitionScript(ScriptingOptions so, SqlSmoObject oObj, String sTypeNameProperty, Boolean bSquareBraketsForNative)
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Management.Smo.UserDefinedDataType.AppendScriptTypeDefinition(StringBuilder sb, ScriptingOptions so, SqlSmoObject oObj, SqlDataType sqlDataType)
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Management.Smo.Column.ScriptDdlCreateImpl(StringBuilder sb, ScriptingOptions so)
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Management.Smo.Column.ScriptDdl(StringCollection queries, ScriptingOptions so)
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Management.Smo.Table.ScriptTableInternal(ScriptingOptions so, StringBuilder sb, ColumnCollection columns, IndexCollection indexes)
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Management.Smo.Table.GetTableCreationScript(ScriptingOptions so, StringBuilder sb)
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Management.Smo.Table.ScriptCreate(StringCollection queries, ScriptingOptions so)
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Management.Smo.Scripter.ScriptWithListWorker(DependencyCollection depList, SqlSmoObject[] objects)
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Management.Smo.Scripter.ScriptWithList(DependencyCollection depList, SqlSmoObject[] objects)
2007-09-11 16:40:15.17 --- End of inner exception stack trace ---
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Management.Smo.Scripter.ScriptWithList(DependencyCollection depList, SqlSmoObject[] objects)
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Management.Smo.Scripter.ScriptWithList(SqlSmoObject[] objects)
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Replication.Snapshot.TransSmoScriptingManager.GenerateLogBasedArticleSchScript(Scripter scripter, BaseArticleWrapper articleWrapper, Table smoTable)
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Replication.Snapshot.TransSmoScriptingManager.GenerateLogBasedArticleScripts(ArticleScriptingBundle articleScriptingBundle)
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Replication.Snapshot.TransSmoScriptingManager.GenerateArticleScripts(ArticleScriptingBundle articleScriptingBundle)
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Replication.Snapshot.SmoScriptingManager.GenerateObjectScripts(ArticleScriptingBundle articleScriptingBundle)
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Replication.Snapshot.SmoScriptingManager.DoScripting()
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Replication.Snapshot.SqlServerSnapshotProvider.DoScripting()
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Replication.Snapshot.SqlServerSnapshotProvider.GenerateSnapshot()
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Replication.SnapshotGenerationAgent.InternalRun()
2007-09-11 16:40:15.17 at Microsoft.SqlServer.Replication.AgentCore.Run()
Hello, maybe someone has done this before? I have a table where I store SOURCE_TABLE_NAME and DESTINATION_TABLE_NAME; there are about 120+ tables. I need to make an SSIS package which selects SOURCE_TABLE_NAME from the source OLE DB and loads it into DESTINATION_TABLE_NAME in the destination OLE DB.
I made such an SSIS package: I set the OLE DB source data access mode to "table name or view name variable," set the OLE DB destination data access mode to "table name or view name variable," and set default values for the variables (names of existing tables). But when the table names change as I loop, it reports an error that it cannot map the columns, because the new tables have different columns.
Hi... I was hoping someone could share some thoughts on the issue I am having at the moment.
Problem: When I run the package on my local machine and update a local SQL Server DB/table, new records are written to the table fine. BUT when I change my destination, meaning I write records into another physical SQL Server DB/table, no INSERT occurs. AND when I move/copy that same package onto the other server (i.e., the server that did not receive records earlier) and run it locally, IT WORKS fine too.
What I am trying to do is very simple: add new records to a SQL Server table using SSIS. I only care about new rows, not even changed rows. Here is my logic: 1. Create an OLE DB source to RemoteSERVER using a SELECT statement. 2. I have a Lookup component that looks for NEW records and redirects all rows that don't find a match (error output). 3. Since I don't care about any rows that are matched in my lookup, I do nothing with them or trash them. 4. I send the error rows (NEW rows) into the OLE DB destination.
RESULTS: when I run the package locally and the destination table is also local, it WORKS FINE; but when I run the package locally and the destination table is on another server (remote), no rows are written.
The package is run through BIDS manually, so there are no security restrictions attached to it.
I am not sure what I am missing, and I do not see an error in my package either; it is not failing.
How do I use table names stored in variables in stored procedures?
Code Snippet
if (select count(*) from @tablename) = 0 or (select count(*) from @tablename) = 1000000
I receive the error 'must declare table variable '@tablename''
I've looked into table variables, and they are not what I require to accomplish what is needed. After browsing through the forums, I believe I need to use dynamic SQL, particularly sp_executesql. However, I am pretty new at SQL and do not really understand how to use it and receive an output parameter from it (MSDN kind of confuses me too). I am trying to receive an integer count of the records from a certain table, which can change to anything depending on what the user requires.
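A minimal sketch of both pieces under the same assumption the question makes (the table name arrives in a variable): the name has to be spliced into the SQL text (QUOTENAME guards against injection), and sp_executesql can hand the count back through an OUTPUT parameter. The same splice-then-execute idea works for the DROP further down.

DECLARE @tablename sysname, @sql nvarchar(max), @rowcount int;
SET @tablename = N'TableA';   -- example value

-- Count rows of the table named in the variable and return the count via OUTPUT.
SET @sql = N'SELECT @cnt = COUNT(*) FROM ' + QUOTENAME(@tablename);
EXEC sp_executesql @sql, N'@cnt int OUTPUT', @cnt = @rowcount OUTPUT;

IF @rowcount = 0 OR @rowcount = 1000000
    PRINT 'empty or at the limit';

-- Drop a table named in a variable the same way.
SET @sql = N'IF OBJECT_ID(' + QUOTENAME(@tablename, '''') + N') IS NOT NULL DROP TABLE ' + QUOTENAME(@tablename);
EXEC (@sql);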
Code Snippet
if exists(Select * from sysobjects where name = @temptablename) drop table @temptablename
It does not like the 'drop table @temptablename' part. This probably wouldn't be an issue if I could get temporary tables to work; however, when I use temporary tables I get invalid object '#temptable'.
Here's what the stored procedure does: I duplicate a table that is going to be modified by using SELECT ... INTO a temp table, I add the required records using INSERT INTO temptable (columns) SELECT (columns) FROM TableA, then I truncate the original table that is being modified and insert the temporary table back into the original.
Here's the actual SQL that produces the temporary table error.
Code Snippet Select * into #temptableabcd from TableA
Insert into #temptableabcd(ColumnA, ColumnB, Field_01, Field_02) SELECT ColumnA, ColumnB, Sum(ABC_01) as 'Field_01', Sum(ABC_02) as 'Field_02' FROM TableB where ColumnB = 003860 Group By ColumnA, ColumnB
TRUNCATE TABLE TableA
Insert into TableA(ColumnA, ColumnB, Field_01, Field_02) Select ColumnA, ColumnB, Sum(Field_01) as 'Field_01', Sum(Field_02) as 'Field_02' From #temptableabcd Group by ColumnA, ColumnB
The above coding produces
Msg 208, Level 16, State 0, Line 1
Invalid object name '#temptableabcd'.
Why does this seem to work when I use an actual table? With an actual table the SQL runs smoothly, but that brings back the table-name-as-variable problem from above. Is there some limitation on temporary tables in stored procedures? How could I get the temporary table to work in this case, if that's possible?
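Not enough of the procedure is shown to be certain, but one very common cause of exactly this error is creating the #temp table inside dynamic SQL: a temp table created in an EXEC or sp_executesql batch is dropped as soon as that inner batch ends, so later statements in the outer procedure cannot see it. A sketch of the difference (the column types are assumptions):

-- Fails: the temp table only lives for the duration of the inner EXEC batch.
EXEC (N'SELECT * INTO #temptableabcd FROM TableA');
SELECT * FROM #temptableabcd;   -- Msg 208: Invalid object name '#temptableabcd'.

-- Works: create the temp table in the outer scope first; dynamic SQL can then fill it.
CREATE TABLE #temptableabcd (ColumnA int, ColumnB int, Field_01 int, Field_02 int);
EXEC (N'INSERT INTO #temptableabcd SELECT ColumnA, ColumnB, Field_01, Field_02 FROM TableA');
SELECT * FROM #temptableabcd;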