Hi!
I hope someone can help me out. The transaction log file of our PUBLISHED database has unfortunately been moved to a different folder, and after that a new log file was created. After moving the old one back, the database isn't accessible from the clients anymore (ODBC error message).
How can I get that old log file working again?
Is it possible to use sp_detach_db and sp_attach_db on a replicated database?
Please help me with this urgent problem!
Thanks in advance
Gert
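A hedged sketch of the detach/re-attach sequence being asked about (database name and file paths are placeholders, not the poster's real ones). Note that a published database normally cannot be detached while publishing is enabled, so the replication step would have to be dealt with first:

-- Assumption: publishing has to be disabled before the database can be detached;
-- adjust (or re-enable afterwards) according to your replication topology.
EXEC sp_replicationdboption @dbname = 'MyPubDB', @optname = 'publish', @value = 'false'

-- Detach, then re-attach pointing at the original data file AND the original log file.
EXEC sp_detach_db @dbname = 'MyPubDB'
EXEC sp_attach_db @dbname    = 'MyPubDB',
                  @filename1 = 'D:\Data\MyPubDB.mdf',     -- placeholder path
                  @filename2 = 'D:\Data\MyPubDB_log.ldf'  -- original log file, placeholder path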
What happens if the physical location of a box (which has SQL Server 2000 on it) is changed? What happens to replication and distributed queries? Thanks.
I have a SQL Server 2000 database and now want to split the tables onto separate filegroups, as well as some indexes. How do you move an existing table from one filegroup (PRIMARY) to a new filegroup? Thanks. Craig
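A hedged sketch of the usual approach (database, table, and index names are made up): rebuilding a table's clustered index ON the new filegroup moves the table's data along with it. This assumes the clustered index is a plain index; if it is created by a PRIMARY KEY or UNIQUE constraint, the constraint itself has to be dropped and re-added on the new filegroup instead.

-- Add the filegroup and a file for it (names/paths are placeholders)
ALTER DATABASE MyDb ADD FILEGROUP SecondaryFG
ALTER DATABASE MyDb ADD FILE
    (NAME = 'MyDb_Secondary', FILENAME = 'D:\Data\MyDb_Secondary.ndf')
    TO FILEGROUP SecondaryFG

-- Rebuilding the clustered index on the new filegroup moves the table's data there
CREATE CLUSTERED INDEX IX_Orders_OrderID
    ON dbo.Orders (OrderID)
    WITH DROP_EXISTING
    ON SecondaryFG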
Our database has recently been upgraded from MS SQL 2000 to MS SQL 2005. Is there anything I need to be aware of after upgrading to MS SQL 2005? In my experience I got an error when I used a column alias in an ORDER BY clause, which was fine on MS SQL 2000. Thanks
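A hedged sketch of the post-upgrade housekeeping that is commonly recommended (the database name is a placeholder). The compatibility level in particular changes which Transact-SQL constructs are accepted, so it is worth checking what the upgraded database is running at:

-- Check / set the compatibility level (90 = SQL Server 2005 behaviour)
EXEC sp_dbcmptlevel 'MyDb'        -- reports the current level
EXEC sp_dbcmptlevel 'MyDb', 90    -- only if you want 2005 behaviour

-- Refresh space usage counts and statistics after the upgrade
DBCC UPDATEUSAGE ('MyDb')
USE MyDb
EXEC sp_updatestats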
I haven't gotten a response yet, so I moved this from another group. I have been working on this for 2 days, so if anyone has any ideas I would be grateful.

I have a 3rd-party program that creates and populates tables in my SQL Server 2005 database. The program fails on the inserts into "tblB" because the field it creates is too small for the data that it is trying to put in it (stupid). I wrote a DDL trigger that alters the table as soon as it is created, allowing all the data to be loaded. However, something about this trigger causes a prior table, "tblA", to fail. Here is the error message that I get on inserting into tblA with the trigger for tblB in place:

Execution of this SQL statement failed: Create table tblA(STATUS CHAR(1) NOT NULL DEFAULT'', SCHOOLNUM [Microsoft][ODBC SQL Server Driver][SQL Server]SELECT failed because the following SET options have incorrect settings: 'CONCAT_NULL_YIELDS_NULL, ANSI_WARNINGS, ANSI_PADDING'. Verify that SET options are correct for use with indexed views and/or indexes o

(yes, it truncates the error message)

My trigger is basically:

USE [IGPLINK]
GO
/****** Object: DdlTrigger [NO_SOUP_FOR_YOU] Script Date: 03/24/2008 16:04:42 ******/
SET ARITHABORT ON
GO
SET CONCAT_NULL_YIELDS_NULL ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_NULLS ON
GO
SET ANSI_PADDING ON
GO
SET ANSI_WARNINGS ON
GO
SET NUMERIC_ROUNDABORT OFF
GO
CREATE TRIGGER [NO_SOUP_FOR_YOU] ON DATABASE
FOR CREATE_TABLE
AS
    SET NOCOUNT ON
    DECLARE @xmlEventData XML, @tableName VARCHAR(50)
    SET @xmlEventData = eventdata()
    SET @tableName = CONVERT(VARCHAR(25), @xmlEventData.query('data(/EVENT_INSTANCE/ObjectName)'))
    IF @tableName = 'tblB'
    BEGIN
        ALTER TABLE dbo.tblB ALTER COLUMN STULINK Numeric(16,0)
    END

However, when I have Enterprise Manager script my trigger, it looks altered. I think these ON/OFF settings at the end are screwing things up. Any suggestions?

USE [IGPLINK]
GO
/****** Object: DdlTrigger [NO_SOUP_FOR_YOU] Script Date: 03/25/2008 11:10:05 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TRIGGER [NO_SOUP_FOR_YOU] ON DATABASE
FOR CREATE_TABLE
AS
    SET NOCOUNT ON
    DECLARE @xmlEventData XML, @tableName VARCHAR(50)
    SET @xmlEventData = eventdata()
    SET @tableName = CONVERT(VARCHAR(25), @xmlEventData.query('data(/EVENT_INSTANCE/ObjectName)'))
    IF @tableName = 'tblB'
    BEGIN
        ALTER TABLE dbo.tblB ALTER COLUMN STULINK Numeric(16,0)
    END
GO
SET ANSI_NULLS OFF
GO
SET QUOTED_IDENTIFIER OFF
GO
ENABLE TRIGGER [NO_SOUP_FOR_YOU] ON DATABASE
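For anyone answering: a hedged guess at a workaround, on the assumption that the 3rd-party app connects over ODBC with non-ANSI session settings and that those settings are what the eventdata()/XML work in the trigger trips over. The run-time options below can be set inside the trigger body and revert when it exits, so the caller's session is untouched; this is only a sketch, not a verified fix (ANSI_NULLS and QUOTED_IDENTIFIER cannot be changed at run time and stay as they were when the trigger was created):

ALTER TRIGGER [NO_SOUP_FOR_YOU] ON DATABASE
FOR CREATE_TABLE
AS
    SET NOCOUNT ON
    -- Run-time settings only last for the duration of the trigger
    SET CONCAT_NULL_YIELDS_NULL ON
    SET ANSI_WARNINGS ON
    SET ANSI_PADDING ON
    SET ARITHABORT ON

    DECLARE @xmlEventData XML, @tableName VARCHAR(50)
    SET @xmlEventData = eventdata()
    SET @tableName = CONVERT(VARCHAR(25), @xmlEventData.query('data(/EVENT_INSTANCE/ObjectName)'))
    IF @tableName = 'tblB'
    BEGIN
        ALTER TABLE dbo.tblB ALTER COLUMN STULINK Numeric(16,0)
    END
GO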
I've recently moved an ASP.NET website from my PC to a network share because another tech is going to be working on it. I finally got the correct permissions on the network share and the correct .NET Framework settings on my PC to be able to run the app. Now I can't access the SQL Server, which is on a different server. I am getting the following error: Request for the permission of type 'System.Data.SqlClient.SqlClientPermission, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' failed. How do I set up access to my SQL Server for the app from any given PC on my LAN?
I have a table named [Customers] which has about 100 fields and tens of thousands of records. These fields have been created over a couple of years, but now they are so disorganized it's getting crazy. I want to move the fields by clicking and dragging the record selectors in the GUI in design view of SQL Server Management Studio, but I am worried about possible dangers. Is there anything I should be worried about before saving the changes? I don't want to make a view, because the front-end program is pretty big and it would require way too much reprogramming to point all of its parts at a new view on a table that is already used so much.
The question arises because when I try, it also wants to save all of the tables that have FKs to this table, of which there are many (and they have cascading updates on). So I decided to abort and get the information I should know first. Although the data shouldn't change when moving a field, I don't want to just assume that nothing will go wrong.
Also, the system is live, and it isn't easy to shut down to make this change. Is it essential that the system be offline, or at least that no users are connected to the tables?
1.) Can an aspnetdb.mdf database be configured and set up on one server and then be moved to a production server, or is there something machine-specific that keeps this from being possible? 2.) Is putting this file in the App_Data folder something that is used only for SQL 2005 or SQL Express? I had to set up a connection string in my SQL 2000 installation to get the connection to work. Thanks for any input! Colelaus
Hi there, I am not sure if this is the right place to seek help for the problem I have, but as I don't see any other link to discuss my situation I am just posting it here. To explain a little bit about the project I am on: the application (developed in ASP.NET and VB) I am assigned to was originally developed by a different team, and most of the databases were on SQL Server 6.5 servers, so they used OleDb connections wherever they had to connect to SQL Server 6.5 data sources. Lately the client has moved all of the SQL Server 6.5 databases to SQL Server 2000, and now the application is not working as it is supposed to, as it was before. I was hired to fix the problem.

As a first step, I am finding all of the .NET OleDb provider connection code for the SQL Server 6.5 data sources and changing it to the .NET provider for SQL Server. For example, in the GetConnection function below I am commenting out the OleDbConnection and defining a SqlConnection instead:

Private Function GetConnection() As IDbConnection
    'Dim conn As IDbConnection = New System.Data.OleDb.OleDbConnection
    Dim conn As IDbConnection = New System.Data.SqlClient.SqlConnection
    conn.ConnectionString = Helpers.DBHelpers.GetConnectionString(Helpers.DBHelpers.COMMON)
    Return conn
End Function

I am also commenting out the OleDbDataAdapter line of code and defining a SqlDataAdapter instead. You can see it below:

'Dim adapter As OleDb.OleDbDataAdapter = New OleDb.OleDbDataAdapter(CType(cmd, OleDb.OleDbCommand))
Dim adapter As SqlClient.SqlDataAdapter = New SqlClient.SqlDataAdapter(CType(cmd, SqlClient.SqlCommand))

The connection strings are defined in the web.config file, and I am changing those as well. See below.
<!--<add key="Common" value="User ID=Test;pwd=*****;Data Source=ESMALLDB2K;Initial Catalog=cj_common;Auto Translate=True;Persist Security Info=False;Provider="SQLOLEDB.1";" />-->
<add key="Common" value="User ID=Test;pwd=*****;Data Source=ESMALLDB2K;Initial Catalog=cj_common;" />

So I hope I am going in the right direction as far as the first step goes, but please throw in any suggestions on this.
One more thing. I have a search screen, and the T-SQL query built for it searches 4 or 5 tables and brings the data back. When I run a search from the web browser it doesn't return the data the first couple of times, but it brings the data back the third time, and even then it takes as long as 60 seconds. When I close the browser, debug, and paste the SQL query into Query Analyzer, it returns the data there, and when I complete the remaining part of debugging and bind the data to the grid, I also see the data in the browser on the first try.

Question: why don't I get the data the first time when I search from the front end (web browser)? Like I said, the execution time of the query in Query Analyzer itself is considerably long (around 60 seconds just to return 3 or 4 records). When I asked the database guys why the SQL queries are a little slow, they said they have a lot of data out there, around 180 thousand records, and that's why it takes that much time to search across all of the rows.

Question: do you think dropping and recreating the indexes could solve our problem? Maybe it is something to do with the indexes; I am sure they have not dropped and recreated the indexes on all of the table objects since the databases were moved to SQL Server 2000.

Hope I was able to explain what I am looking for and what I am doing. Please help me understand how to solve these problems. Thanks in advance -D
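A hedged sketch of the kind of index and statistics maintenance that is usually suggested after a 6.5-to-2000 migration (the table name is a placeholder for the tables used by the search); whether it actually fixes the 60-second query depends on the plan, so it is worth checking the execution plan in Query Analyzer as well:

-- Refresh statistics for every table in the current database
EXEC sp_updatestats

-- Rebuild all indexes on one of the tables used by the search (repeat per table)
DBCC DBREINDEX ('dbo.Customers')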
I am totally confused; please help me. I am new to this.
I am trying to split one DB (A) into two databases, the original (A) and a new blank one (B), by moving some tables from DB (A) to the new DB (B). However, some of the tables have FKs and stored procedures referencing other tables that are not being moved. The questions are (see the sketch after the list):
1. After scripting these tables while they are in DB (A) and running the script in DB (B) to re-create them, do I delete these tables from DB (A), along with the FKs that reference them?
2. What should I do with the stored procedures? Turn them into triggers or something else? If they are turned into triggers, in which DB should the triggers run (DB A or DB B)? And if they do become triggers, do I delete the stored procedures from DB (A)?
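Possibly useful context for whoever answers, as a hedged sketch (database, table, and procedure names are invented): a foreign key cannot reference a table in another database, so a moved table's FK either has to be dropped or replaced by a trigger, whereas a stored procedure can keep working unchanged by using three-part names to reach the tables that moved.

-- In DB (A): a procedure can still reach a table that moved to DB (B) via a three-part name
USE A
GO
CREATE PROCEDURE dbo.GetMovedOrders
AS
    SELECT o.OrderID, c.CustomerName
    FROM B.dbo.Orders AS o          -- table that moved to DB (B)
    JOIN dbo.Customers AS c         -- table that stayed in DB (A)
        ON c.CustomerID = o.CustomerID
GO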
Hi all, I moved a report to a different solution using the "Add Existing Item" function. Verified that all the data sources were properly connected and that my queries to provide parameter lists were running ok. When I try to run the main data source (a stored proc) from the data tab, I get this message:
"..An error occurred while executing the query. Procedure or function 'ReportSP' expects parameter '@Item, which was not supplied. (Microsoft SQL Server, Error: 201)"
This SP runs fine from the old solution and in Management Studio, so there must be something I need to do in the new report. Can anyone point me in the right direction? I checked the report parameters and it's there. When I try to preview the report, it lets me pick all the parameters, including the one it's griping about, then it throws the message.
I recently moved the db to a new server using the detach and attach sprocs. However, the moved db in its new location does not have any of the tables or sprocs that I created. As I understand it, information on the databases on the server is stored in the master db. Could it be that the tables are not showing up because the master db on the new server knows nothing of the new db? Must I also copy master from the other server, perhaps? Has anybody come across this before? TIA D. Lewis
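A hedged sanity check (names are placeholders, nothing here is specific to the poster's setup): user tables live in the database's own .mdf, not in master, so if they are missing it is worth confirming which physical files the attached database is really using and what objects it actually contains.

USE MyMovedDb        -- placeholder name
GO
EXEC sp_helpfile     -- shows the .mdf/.ldf paths this database is using
EXEC sp_helpdb       -- lists all databases on the instance
SELECT name FROM sysobjects WHERE xtype = 'U' ORDER BY name   -- user tables actually present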
We had our backups going to the server where the databases reside. Now I have modified the backups to back up to a file share. When we try to restore from the file share, the restore fails, so we have to copy the backup to a drive on the server and recover from there. Should I be able to restore directly from the file share (using the GUI)? Do I need to change something else to modify the default backup drive?
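A hedged sketch of how a restore straight from the share can be tested (the UNC path and logical file names are placeholders); the usual catch is that it is the SQL Server service account, not the logged-in user, that needs read permission on the share:

-- Verify the backup can be read over the UNC path before attempting the real restore
RESTORE VERIFYONLY FROM DISK = '\\FileServer\SQLBackups\MyDb.bak'

-- If that works, the restore itself can use the same UNC path
RESTORE DATABASE MyDb FROM DISK = '\\FileServer\SQLBackups\MyDb.bak'
    WITH MOVE 'MyDb_Data' TO 'D:\Data\MyDb.mdf',
         MOVE 'MyDb_Log'  TO 'D:\Data\MyDb_log.ldf'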
We have a valid full backup of a database. We know it is valid, we have restored it twice from the network with no problems, but we do not have access to the network location from our sandbox environment.
The .bak file is sizable at about 9 GB. It resides on our internal network, on a SAN drive; no problems there. When we copy the file from there into the sandbox environment to attempt the restore, it gets corrupted. We've tried three times, and all three times it got corrupted. The first time we copied the file over to the sandbox, the restore went up to 50% and failed. The second time we copied the file and attempted the restore again, it failed at 70%.
The third time it failed at 60%. The error message we get during the restore is "...Read on ... failed: 13(The data is invalid) Msg 3013, Level 16, State 1, Line 1 Restore database is terminating abnormally."
Now some background. To move the file, our network team does this: they have a .vmdk file that they mount in our production environment (which has access to the network location where the .bak file is), copy the file onto that drive, then move the .vmdk file into the sandbox environment (which does not have access to the network location), mount the drive there, and then I have access to the .bak file from within the sandbox environment.
Something in the process of using the .vmdk file to move the .bak file from production into the sandbox is causing the file to get corrupted.
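A hedged way to narrow down where the corruption happens (commands assume SQL Server 2005 or later and placeholder paths): take the backup with a checksum, then verify the copy at each hop. If VERIFYONLY succeeds on the SAN copy but fails only on the sandbox copy, the file really is being damaged in transit through the .vmdk.

-- On the source: include page checksums in the backup
BACKUP DATABASE MyDb TO DISK = 'E:\Backups\MyDb.bak' WITH CHECKSUM, INIT

-- On each copy (production drive, then sandbox): verify the file before restoring
RESTORE VERIFYONLY FROM DISK = 'E:\Backups\MyDb.bak' WITH CHECKSUM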
I encountered a bug while exporting a matrix to Excel: one cell is shifted to the right, so I get wrong numbers and an empty cell in the middle of the matrix.
Instead of:
ABS Recent College Graduate (C)
Male (M) Total Female (F) Male (M) Total
Headcount Headcount Row % Headcount Col % Headcount Headcount Row % Headcount Col % Headcount Headcount Row % Headcount Col % Headcount Headcount Row % Headcount Col % Headcount Headcount Row % Headcount Col %
232 34 54 56 56 5 24 56 56 56 34 23 43 54 56
I get:
ABS Recent College Graduate (C)
Male (M) Total Female (F) Male (M) Total
Headcount Headcount Row % Headcount Col % Headcount Headcount Row % Headcount Col %
Headcount Headcount Row % Headcount Col % Headcount Headcount Row % Headcount Col % Headcount Headcount Row %
232 34 54 56 56 5 24 56 56 56 34 23 43 54 56
Is anyone familiar with a solution to this issue? Thanks. Yuval.
We have a job that calls a SSIS 2005 package that does some processing and executes a BAT file. This job is being called by a web application. The BAT file creates a folder and names it based on the current date (YYYY_MM), e.g. 2015_07.
It was working okay on the SQL Agent 2005 server until we moved to the new SQL Agent 2012 server using the same SSIS 2005 package. Now the issue is that instead of the folder being created as YYYY_MM, it is being created as YYYY_DD. I've checked the Regional Settings of both servers and they have the same "English (United States)" format. I even ran the code below on both and they return the same output:

echo %date:~10,4%_%date:~4,2%

I know the BAT file can be improved by not depending on the current Windows locale, but I just want to understand how this issue occurs and how the regional setting is being overridden.
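As for the override: %date% follows the short-date format of the profile of the account the SQL Agent job step actually runs under, which is not necessarily the Regional Settings you see when logged in interactively, so the same BAT file can parse different substrings on the two servers. A hedged, locale-independent sketch for the BAT file (the output path is a placeholder):

@echo off
rem WMIC emits a fixed yyyymmddhhmmss timestamp regardless of Regional Settings
for /f %%a in ('wmic os get LocalDateTime ^| find "."') do set ldt=%%a

rem First 4 characters = year, next 2 = month
set folder=%ldt:~0,4%_%ldt:~4,2%

rem Placeholder path; replace with the real output root
mkdir "D:\Output\%folder%"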
I am getting the error "Cannot open database "aspnetdb" requested by the login. The login failed." when I browse to my ASP.NET 3.5 LINQ web application on the IIS 6.0 server on Server 2003. I imagine this is because, while I granted the SQL Server 2005 login and permissions to the database the application stores its data in, I did NOT grant any rights on the aspnetdb database (which is where all my roles information is stored) to the service account the IIS application pool uses for its identity. My question is: what are the MINIMUM permissions needed on this database so it can perform its roles-related functions? I'm using Windows Authentication with the SQL Role provider for authorization.
Thank you.
EDIT: I think I only need to open the aspnetdb database and add my login to the aspnet_Roles_FullAccess role. Is that correct?
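A hedged sketch of that idea (the account name is a placeholder for the application pool identity). aspnet_Roles_FullAccess covers the whole Roles API; if the site only reads role membership, the narrower aspnet_Roles_BasicAccess may be enough, so that is worth trying first.

USE aspnetdb
GO
-- Placeholder account name for the application pool identity
CREATE LOGIN [MYDOMAIN\AppPoolSvc] FROM WINDOWS
GO
CREATE USER [MYDOMAIN\AppPoolSvc] FOR LOGIN [MYDOMAIN\AppPoolSvc]
GO
EXEC sp_addrolemember 'aspnet_Roles_FullAccess', 'MYDOMAIN\AppPoolSvc'
GO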
Hello all. I want to make a procedure like this. Example: I have these tables:

item (itemid, itemname, stock)
orderdetail (no_order, itemid, quantity)
itemmoment (itemid, itemname, stock)

item table:
itemid  itemname  stock
c1      coconut   2
p1      peanut    2

orderdetail table:
no_order  itemid  quantity
1         c1      5

itemmoment table:
itemid  itemname  stock
c1      coconut   0
p1      peanut    0

When the customer pays, his quantity in orderdetail decreases the stock in the item table, so the stock in the item table becomes:

itemid  itemname  stock
c1      coconut   -3
p1      peanut    2

That's not good, because stock may not go negative. So I want to move the -3 to the itemmoment table, so the stock in the item table becomes:

itemid  itemname  stock
c1      coconut   0
p1      peanut    2

and the itemmoment table becomes:

itemid  itemname  stock
c1      coconut   3
p1      peanut    0

My stored procedure looks like:

ALTER PROCEDURE [dbo].[orders]
(
    @no_order AS INTEGER,
    @itemid   AS VARCHAR(50),
    @quantity AS INT
)
AS
BEGIN
    BEGIN TRANSACTION
    DECLARE @currentStock AS INT
    SET @currentStock = (SELECT [stock] FROM [item] WHERE [itemid] = @itemid)
    UPDATE [item] SET [stock] = @currentStock - @quantity WHERE [itemid] = @itemid
    COMMIT TRANSACTION
END

It only decreases stock by quantity. I want to move the negative stock from item to itemmoment. Can anyone add code to my stored procedure? Please help. Thanks.
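A hedged sketch of the requested change, keeping the original procedure's shape (no error handling added, and it assumes a matching row already exists in itemmoment, as in the example data): the idea is to cap item.stock at zero and push any shortfall into itemmoment.stock.

ALTER PROCEDURE [dbo].[orders]
(
    @no_order AS INTEGER,
    @itemid   AS VARCHAR(50),
    @quantity AS INT
)
AS
BEGIN
    BEGIN TRANSACTION

    DECLARE @currentStock AS INT, @shortfall AS INT

    SET @currentStock = (SELECT [stock] FROM [item] WHERE [itemid] = @itemid)

    -- How much of the order cannot be covered by the stock on hand
    SET @shortfall = @quantity - @currentStock
    IF @shortfall < 0 SET @shortfall = 0

    -- Never let item.stock go below zero
    UPDATE [item]
    SET    [stock] = CASE WHEN @currentStock - @quantity < 0
                          THEN 0
                          ELSE @currentStock - @quantity
                     END
    WHERE  [itemid] = @itemid

    -- Move the uncovered part into itemmoment (row assumed to exist already)
    IF @shortfall > 0
    BEGIN
        UPDATE [itemmoment]
        SET    [stock] = [stock] + @shortfall
        WHERE  [itemid] = @itemid
    END

    COMMIT TRANSACTION
END

With the example data (stock 2, quantity 5) this leaves item.stock at 0 and adds 3 to itemmoment.stock, which matches the tables shown above.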
Our techs moved SQL Server over to a different server that's solely for SQL, and now the backups won't work right: they check for consistency but won't back up the data. Can anyone give some suggestions, please?