I'm looking for ideas on how to write the SQL scripts for schema updates that are pushed out to clients with product updates. Obviously, we could just keep track of the changes on a pad, or build a database that requires us to input those changes and eventually hand-write the update scripts. I was wondering if anybody has any solutions that may help automate this process.
Is there a way to write an application that will compare a current (updated) database structure against the last release and give us the fields that need to be changed?
As far as creating the scripts for the initial install, that's easy. We
can do that right from the SQL Enterprise Manager.
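For illustration, a rough sketch of the kind of comparison I have in mind, assuming the previous release and the current build are restored side by side on one server as ProductDB_Old and ProductDB_New (placeholder names):

SELECT n.TABLE_NAME, n.COLUMN_NAME, n.DATA_TYPE
FROM ProductDB_New.INFORMATION_SCHEMA.COLUMNS AS n
LEFT JOIN ProductDB_Old.INFORMATION_SCHEMA.COLUMNS AS o
    ON o.TABLE_NAME = n.TABLE_NAME
    AND o.COLUMN_NAME = n.COLUMN_NAME
WHERE o.COLUMN_NAME IS NULL -- columns present in the new build but missing from the last release

Something along those lines, extended to cover dropped columns and type changes, is roughly what I am hoping already exists as a tool or script.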
There is plenty of great info out there about the mechanics behind column-level encryption in SQL 2005, but it all seems to assume I only have one or two database servers. If I am using an X.509 certificate to encrypt my data, it looks as if I can script the administration of this fairly easily.
But what if I have 1000 SQL Servers?
Is there any guidance, best practices or tooling out there that will help me manage the 1000 certificates I would need to deploy in such a scenario? Also, what if I need to 'rotate' the certificates for some reason? Can a PKI for the domain help me automate and manage this?
It seems as if the management of these certificates is purely 'manual' at this point.
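To make the question concrete, this is roughly the per-server script I am picturing having to run 1000 times (certificate name, passwords, file paths and database name are placeholders; the statements are just the documented CREATE CERTIFICATE / BACKUP CERTIFICATE forms):

USE MyDatabase

CREATE CERTIFICATE DataEncryptCert
    ENCRYPTION BY PASSWORD = 'placeholder_private_key_password'
    WITH SUBJECT = 'Column encryption certificate',
    EXPIRY_DATE = '20081231'

-- back the certificate up so it can be archived and restored after a rebuild
BACKUP CERTIFICATE DataEncryptCert
    TO FILE = 'C:\certs\DataEncryptCert.cer'
    WITH PRIVATE KEY ( FILE = 'C:\certs\DataEncryptCert.pvk',
                       ENCRYPTION BY PASSWORD = 'placeholder_file_password',
                       DECRYPTION BY PASSWORD = 'placeholder_private_key_password' )

Scripting that for one server is easy enough; it is tracking, backing up and rotating 1000 of them that I cannot see a managed way to do.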
I am trying to update some information on a linked server based on information on the local server, and I get the following error:
Could not open table '"infoimg"."dbo"."queues2"' from OLE DB provider 'SQLOLEDB'. The provider could not support a row lookup position. The provider indicates that conflicts occurred with other properties or requirements.
The table does exist, and linked-server selects on it work just fine. It's when I try to run this statement:
update imgtr02.infoimg.dbo.queues2
set o_resource = null,
    lock_time = 0
from imgtr02.infoimg.dbo.queues2 a, sysid_stuck b
where a.id = b.id
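One variation I have not ruled out yet is naming the alias as the update target instead of repeating the four-part name, in case the provider treats that differently (I do not know yet whether it helps):

update a
set o_resource = null,
    lock_time = 0
from imgtr02.infoimg.dbo.queues2 as a
join sysid_stuck as b
    on a.id = b.id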
I have a problem in that I have a live SQL 7.0 server acting as the back end to a web server for a large company in Germany, and of course we also have a development server locally, which is more or less the same apart from the data. The ASP developers here want to upload some data to the live server and also download live data from it, so that we have more or less the same data on both systems.
I have thought about doing this with DTS and writing SQL scripts to merge the tables and then upload back to the live server, but before I do, I wondered if anyone had experience of doing this before (I am sure someone has) and could either tell me the best methods or packages to use, or point me to or mail me an article that covers this sort of thing.
Of course extreme paranoia creeps in at the thought of copying down large tables from the live server and so on, so if anyone can help - great!
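To give an idea of the sort of thing I would otherwise hand-write, here is a rough per-table sketch, assuming the live server is reachable from the development box as a linked server called LIVESRV, and using made-up database, table and column names:

-- copy down rows that exist on live but not yet on dev
INSERT INTO dbo.Customers (CustomerId, CustomerName)
SELECT l.CustomerId, l.CustomerName
FROM LIVESRV.WebDB.dbo.Customers AS l
WHERE NOT EXISTS ( SELECT 1 FROM dbo.Customers AS d
                   WHERE d.CustomerId = l.CustomerId )

Doing that by hand for every table, and then the reverse direction back up to live, is exactly what I am hoping DTS or some package can save me from.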
I have two servers, S1 and S2. Immediately after new data on S1 is available, I want to perform some actions on S2.
I can use a trigger on S1, but if S2 is down, the transaction on S1 will be lost. I could use database replication, but I only need one single table in S1 to report changes to S2.
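The workaround I am considering is sketched below: the trigger on S1 only writes to a local queue table, and a scheduled job pushes the queue across to S2 whenever S2 is reachable. All table, column and linked-server names here are placeholders.

-- queue table on S1
CREATE TABLE dbo.PendingChanges
(
    change_id int IDENTITY(1,1) PRIMARY KEY,
    source_id int NOT NULL,
    changed_at datetime NOT NULL DEFAULT (GETDATE())
)
GO

-- the trigger never talks to S2, so S2 being down cannot fail the transaction on S1
CREATE TRIGGER tr_SourceTable_queue ON dbo.SourceTable
FOR INSERT, UPDATE
AS
INSERT INTO dbo.PendingChanges (source_id)
SELECT id FROM inserted
GO

-- scheduled job step on S1: push whatever is queued, then clear only what was pushed
DECLARE @last_id int
SELECT @last_id = MAX(change_id) FROM dbo.PendingChanges

INSERT INTO S2.TargetDB.dbo.IncomingChanges (source_id, changed_at)
SELECT source_id, changed_at FROM dbo.PendingChanges WHERE change_id <= @last_id

DELETE FROM dbo.PendingChanges WHERE change_id <= @last_id

I would still be interested to hear whether there is a cleaner built-in way to do this than a hand-rolled queue.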
I'm trying to work out a database design to make it quicker for my client program to read and display updates to the data set. Currently it reads in the entire data set again after each change, which was acceptable when the data set was small, but now it's large enough to start causing noticeable delays. I've come up with a possible solution but am looking for others' input on its suitability to the problem.

Here is the DDL for one of the tables:

create table epl_packages
(
    customer varchar(8) not null,     -- \
    package_type char not null,       -- primary key
    package_no int not null,          -- /
    dimensions varchar(50) not null default(0),
    weight_kg int not null,
    despatch_id int,                  -- filled in on despatch
    loaded bit not null default(0),
    item_count int not null default(0)
)

alter table epl_packages
add constraint pk_epl_packages
primary key (customer, package_type, package_no)

My first thought was to add a datetime column to each table to record the time of the last change, but that would only work for inserts and updates. So I figured that a separate table for deletions would make this complete. The DDL would be something like:

create table epl_packages
(
    customer varchar(8) not null,
    package_type char not null,
    package_no int not null,
    dimensions varchar(50) not null default(0),
    weight_kg int not null,
    despatch_id int,
    loaded bit not null default(0),
    item_count int not null default(0),
    last_update_time datetime default(getdate()) -- new column
)

alter table epl_packages
add constraint pk_epl_packages
primary key (customer, package_type, package_no)

create table epl_packages_deletions
(
    delete_time datetime,
    customer varchar(8) not null,
    package_type char not null,
    package_no int not null
)

And then these triggers on update and delete (insert is handled automatically by the default constraint on last_update_time):

create trigger tr_upd_epl_packages
on epl_packages
for update
as
-- check for primary key change
if (columns_updated() & 1792) > 0 -- first three columns: 256+512+1024
    insert epl_packages_deletions
    select
        getdate(),
        customer,
        package_type,
        package_no
    from deleted

update A
set last_update_time = getdate()
from epl_packages A
join inserted B
    on A.customer = B.customer and
       A.package_type = B.package_type and
       A.package_no = B.package_no
go

create trigger tr_del_epl_packages
on epl_packages
for delete
as
insert epl_packages_deletions
select
    getdate(),
    customer,
    package_type,
    package_no
from deleted
go

The client program would then do the initial read as follows:

select getdate()

select
    customer,
    package_type,
    package_no,
    dimensions,
    weight_kg,
    despatch_id,
    loaded,
    item_count
from epl_packages
where
    customer = {current customer}
order by
    customer,
    package_type,
    package_no

It would store the output of getdate() to be used in subsequent updates, which would be read from the server as follows:

select getdate()

select
    customer,
    package_type,
    package_no,
    dimensions,
    weight_kg,
    despatch_id,
    loaded,
    item_count
from epl_packages
where
    customer = {current customer} and
    last_update_time > {output of getdate() from previous read}
order by
    customer,
    package_type,
    package_no

select
    customer,
    package_type,
    package_no
from epl_packages_deletions
where
    customer = {current customer} and
    delete_time > {output of getdate() from previous read}

The client program will then apply the deletions and the updated/inserted rows, in that order. This would be done for each table displayed in the client.

Any critical comments on this approach, and any improvements that could be made, would be much appreciated!
I have many SQL 2000 DTS packages that I support from my development workstation running 2000 SP4. Packages are altered on the development machine and then go through the normal release mechanism to production via testing servers, etc.
I have recently installed the client tools for SQL Server 2005 on my desktop to evaluate the product. The 2005 DB instance is running on a separate server.
So, I have the dev edition of SQL 2000 and the 2005 client tools (including BI Dev Studio etc.) on my workstation.
I have recently had to make changes to a 2000 DTS package and used my 2000 Enterprise Manager to do so. No problem - it saved and tested fine on my workstation.
But when I try to release it to another server, or open the package using Enterprise Manager from another machine that does not have SQL 2005 installed, I get the error message 'Unspecified error'. I've seen this before when trying to open packages created in v2000 using v7, or where the service packs differ between machines.
Digging around my workstation and comparing some of the DLLs I know are required to distribute DTS packages (from RDIST.txt), it seems that some of the SQL 2000 DLL files have been updated by my 2005 installation.
E.g. DTSFFILE.DLL on my machine is version 2000.85.1054.0, whilst on any 'clean' 2000 machine it is at version 2000.80.760.0.
Surely it can't be right that SQL 2005 ships newer versions of SQL 2000 components than are available with the latest SP for the actual product! Especially considering that the 2005 installation does not even let you edit 2000 DTS packages through Management Studio without a 'special' download of the feature pack (which, by the way, does not work very well either).
So am I to conclude that you cannot run side-by-side installations of SQL 2000 and 2005 on a single machine and expect 2000 to run as it did previously?!
I have a question regarding the nature of virtual SQL servers, specifically what protocol is used to communicate with the server when a request is made by a client.
For example, if a scheduled job is run on the virtual sql server, what determines the protocol used (e.g. TCP/IP, named pipes etc.) by SQL Server agent? Is it the client network alias set up on the virtual server?
I am asking because currently the client aliases on some of our virtual sql servers are using named pipes and I think this is causing a problem with our backups.
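In case it is relevant, a query along these lines (against the standard sysprocesses view) should show which net library existing connections, including the SQL Server Agent job connections, actually came in on:

SELECT spid, program_name, net_library, hostname
FROM master..sysprocesses
WHERE program_name LIKE 'SQLAgent%'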
I can't map other SQL Servers without creating an alias with the proper port number in the Client Network Utility. Other users are using the same version of the client tools, MMC, SQL DMF etc. I need to map 70 SQL Servers using my client tools. Any help is appreciated.
I have setup a database mirroring session with witness - MachineA is the principal, MachineB is the mirror, and MachineC is the witness. Each SQL Server instance is hosted on its own machine. The mirroring is working correctly. If I submit data to the database on MachineA, and then unplug the network cable on MachineA, MachineB automatically becomes the principal, and I can see the data that I originally submitted to MachineA on MachineB. All the settings are showing correctly in Management Studio.
My issue is with the SQL Native Client and a front-end application that needs to make use of this database. I have set up my front-end application to use the ODBC client and specified the failover server in both the ODBC setup and the connection string. Here is the connection string that I am using:
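(Roughly of this shape; the server names are as above, the database name and other values are placeholders, and the mirror is specified via the Failover_Partner keyword:)

Driver={SQL Native Client};Server=MachineA;Failover_Partner=MachineB;Database=MirrorDB;Trusted_Connection=yes;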
Everything works perfectly on my front-end application when MachineA is the principal. If I unplug the network cable on MachineA, MachineB becomes the principal, and the failover occurs correctly on the database side. The problem is that my front-end application is not able to query the database on MachineB.
BUT - if I plug the network cable back in on MachineA (making the database on MachineA the mirror), the front-end application now works and can access the principal database on MachineB. I wrote a quick tester application to verify what I am seeing, and I am convinced that this is what is happening. The mirroring is working perfectly, and everything is setup correctly. The SQL Native Client is setup correctly. The problem is that the automatic failover to MachineB that is built into the SQL Native Client only works if both servers are plugged in.
In this scenario, when I plug both servers in, I know that the front-end app is definitely pulling from MachineB (since the mirror database on MachineA is in recovery mode, it's unavailable, and the front-end app displays the server that it is pulling data from).
Am I using an outdated SQL Native Client? The version number displayed in the ODBC configuration page is 2005.90.2047.00, and it is dated 4/14/2006. Has anyone experienced this issue? I'm guessing that it's a problem in the SQL Native Client, since the mirroring really seems to be working correctly.
I have a project that consists of a SQL db with an Access front end as the user interface. Here is the structure of the table on which this question is based:
Code Block
create table #IncomeAndExpenseData
(
    recordID nvarchar(5) NOT NULL,
    itemID int NOT NULL,
    itemvalue decimal(18, 2) NULL,
    monthitemvalue decimal(18, 2) NULL
)

The itemvalue field is where the user enters his/her numbers via Access. There is an IncomeAndExpenseCodes table as well, which holds item information, including the itemID and entry unit of measure. Some itemIDs have an entry unit of measure of $/mo, while others are entered in terms of $/yr, and others in %/yr.
For itemvalues of itemIDs with entry units of measure that are not $/mo, a stored procedure performs calculations which convert them into numbers with a unit of measure of $/mo and updates IncomeAndExpenseData, putting these numbers in the monthitemvalue field. This stored procedure is written to only calculate values for monthitemvalue fields which are null, in order to avoid recalculating every single row in the table.
If the user edits the itemvalue field there is a trigger on IncomeAndExpenseData which sets the monthitemvalue to null so the stored procedure recalculates the monthitemvalue for the changed rows. However, it appears this trigger is also setting monthitemvalue to null after the stored procedure updates the IncomeAndExpenseData table with the recalculated monthitemvalues, thus wiping out the answers.
How do I write a trigger that sets the monthitemvalue to null only when the user edits the itemvalue field, not when the stored procedure puts the recalculated monthitemvalue into the IncomeAndExpenseData table?
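The direction I have been experimenting with is sketched below: only null out monthitemvalue when itemvalue itself actually changed, by comparing inserted against deleted. The trigger name is made up, and I am assuming recordID plus itemID identify a row.

CREATE TRIGGER tr_IncomeAndExpenseData_itemvalue
ON IncomeAndExpenseData
FOR UPDATE
AS
IF UPDATE(itemvalue)
BEGIN
    UPDATE d
    SET monthitemvalue = NULL
    FROM IncomeAndExpenseData d
    JOIN inserted i ON d.recordID = i.recordID AND d.itemID = i.itemID
    JOIN deleted x ON d.recordID = x.recordID AND d.itemID = x.itemID
    WHERE ISNULL(i.itemvalue, 0) <> ISNULL(x.itemvalue, 0) -- only rows whose itemvalue changed
END

What I cannot tell is whether this is the right way to distinguish the user's edits from the stored procedure's update, which only touches monthitemvalue.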
Hi... I have data that I am getting from a dbf file, and I am dumping that data into a SQL Server database. After scrubbing it there, I take it from that SQL Server and put it into the production database. Right now my stored procedure handles a single plan only, but now there may be two or more plans together in the same SQL Server database, which I need to scrub and then update if that particular plan already exists, or insert if it doesn't...
this is my sproc...

ALTER PROCEDURE [dbo].[usp_Import_Plan]
    @ClientId int,
    @UserId int = NULL,
    @HistoryId int,
    @ShowStatus bit = 0 -- Indicates whether status messages should be returned during the import.
AS
SET NOCOUNT ON
DECLARE @Count int, @Sproc varchar(50), @Status varchar(200), @TotalCount int
SET @Sproc = OBJECT_NAME(@@ProcId)
SET @Status = 'Updating plan information in Plan table.'

UPDATE Statements..Plan
SET PlanName = PlanName1,
    Description = PlanName2
FROM Statements..Plan cp
JOIN ( SELECT DISTINCT PlanId, PlanName1, PlanName2 FROM Census ) c
    ON cp.CPlanId = c.PlanId
WHERE cp.ClientId = @ClientId
  AND ( IsNull(cp.PlanName,'') <> IsNull(c.PlanName1,'')
        OR IsNull(cp.Description,'') <> IsNull(c.PlanName2,'') )
SET @Count = @@ROWCOUNT

IF @Count > 0
BEGIN
    SET @Status = 'Updated ' + Cast(@Count AS varchar(10)) + ' record(s) in ClientPlan.'
END
ELSE
BEGIN
    SET @Status = 'No records were updated in Plan.'
END
SET @Status = 'Adding plan information to Plan table.'

INSERT INTO Statements..Plan ( ClientId, ClientPlanId, UserId, PlanName, Description )
SELECT DISTINCT @ClientId, CPlanId, @UserId, PlanName1, PlanName2
FROM Census
WHERE PlanId NOT IN ( SELECT DISTINCT CPlanId
                      FROM Statements..Plan
                      WHERE ClientId = @ClientId
                        AND ClientPlanId IS NOT NULL )
SET @Count = @@ROWCOUNT

IF @Count > 0
BEGIN
    SET @Status = 'Added ' + Cast(@Count AS varchar(10)) + ' record(s) to Plan.'
END
ELSE
BEGIN
    SET @Status = 'No information was added Plan.'
END
SET NOCOUNT OFF
So how do I do multiple inserts and updates using this stored procedure?
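One approach I was wondering about, sketched very roughly below, is looping over the distinct plans in Census and running the existing plan-level update/insert logic once per plan (the cursor and variable names are just placeholders):

DECLARE @CurrentPlanId int

DECLARE plan_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT DISTINCT PlanId FROM Census

OPEN plan_cursor
FETCH NEXT FROM plan_cursor INTO @CurrentPlanId

WHILE @@FETCH_STATUS = 0
BEGIN
    -- the UPDATE and INSERT from the proc above would go here,
    -- restricted to the current plan with AND c.PlanId = @CurrentPlanId
    PRINT 'Processing plan ' + CAST(@CurrentPlanId AS varchar(10))

    FETCH NEXT FROM plan_cursor INTO @CurrentPlanId
END

CLOSE plan_cursor
DEALLOCATE plan_cursor

I am not sure whether that is any better than keeping everything set-based, which is really what I am asking.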
Product: Microsoft SQL Server 2005 -- Error 29515. SQL Server Setup could not connect to the database service for server configuration. The error was: [Microsoft][SQL Native Client]Encryption not supported on the client. Refer to server error logs and setup logs for more information. For details on how to view setup logs, see "How to View Setup Log Files" in SQL Server Books Online.
On Windows XP systems I get the following issue when trying to browse the MSDB folder in SSIS
Client unable to establish connection Encryption not supported on SQL Server. (Microsoft SQL Native Client)
I have noticed another post where several others have noticed the same issue. It appears to only occur on Windows XP installations. Is there a workaround or fix for this?
I have SQL2000 installed as the default instance, and now I'm trying to install SQL 2005 standard edition as a named instance.
I receive this error: "SQL Server could not connect to database service for server configuration.. [SQL Native Client] Encryption not supported on the client." However, I am able to install the client tools.
The setup works fine on another box with the same config (SQL 2000 / Windows XP). Is there any workaround for this issue?
In my SQL 2000 Client Network Utility, "Force protocol encryption" is disabled, and I could not find the corresponding setting for SQL 2005!
I have a report that was designed using SQL Reporting Services and sits on a SQL reporting server. It's nothing too exciting; it is essentially a three-page application with legal jumbo on pages 2 and 3 and applicant data in fields on page 1.
We use rectangles to force page breaks to page 2 and to page 3.
When running the report on the report server, it shows and prints fine.
When running the report from the QA website internally, it shows and prints just fine.
When running the report from the production website from a machine internally, it shows and prints just fine.
When running the report from outside of the company network, the report is jacked. It obliterates large chunks of text, crams text together, and creates blank pages.
I need help in determining where I should even begin with troubleshooting this!
Hello,

Is it necessary to upgrade the Client Connectivity Tools on all client machines after the SQL Server database server is upgraded from version 7.0 to 2000?

Thank you in advance!
Eddy
Does anyone know if it's possible to grant a user the ability to manage jobs in SQL Server Agent besides giving them SA rights? None of the server roles besides SA seem to work.
I am currently working on a large project that will be developed using MS SQL. I have been considering two options when it comes to the creation of the database. The first option is to create a separate table for each of my clients. The tables would be labeled with the client's Id number followed by the table's regular name, for example: 781_Users.
The other option is to create a different database for each of the clients.
I just need the data to be separated between clients; it can't mix. Please just give me your opinion on which option you would consider if you were operating a large project.
Hello everyone. I have a little situation here. I am not normally the DBA at my work, but I've been poking around a little with a SQL Server 2005 implementation. I'm pretty new to this, so please pardon my lack of knowledge.
This particular server is running a lightly used SQL Server, or so I'm told. As part of my sys admin duties, I logged into the box today for some routine checks, only to be notified that I was running out of disk space (which is another story). After poking around a while, I see that my tempdb is almost 15 GB in size.
I don't have a lot of experience with SQL Server 2005; I've worked mostly with 2000. I did read a few things suggesting tempdb has changed a bit in 2005, but I wasn't sure.
So I have about 10 GB of space free on my HD (I'm adding more space tomorrow) and I need to figure out what to do.
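For what it is worth, the sort of check I have been looking at to see where the space inside tempdb has gone is below, based on the 2005 DMV sys.dm_db_file_space_usage that I found mentioned; I am not sure I am reading it correctly:

USE tempdb
EXEC sp_spaceused -- overall size and unallocated space

SELECT SUM(unallocated_extent_page_count) * 8 / 1024 AS free_mb,
       SUM(user_object_reserved_page_count) * 8 / 1024 AS user_objects_mb,
       SUM(internal_object_reserved_page_count) * 8 / 1024 AS internal_objects_mb,
       SUM(version_store_reserved_page_count) * 8 / 1024 AS version_store_mb
FROM sys.dm_db_file_space_usage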
Hello,

Has anyone ever come across a reason why someone would manually create a user table incl. permission flags and not use the inbuilt users/roles provided by that database? The only reason that stands out for me is to make the database that bit more portable?

Thanks,
Craig.
Hi everyone,

I'm looking into moving my desktop system at work to Linux from Windows XP, and one tool I've yet to find a suitable replacement for is MS SQL Enterprise Manager. I mainly need to query MS SQL databases from Linux, and management tasks (monitor backups, services, etc.) can be done on the MS SQL servers themselves. There was an application I used to use on OSX which would connect to a variety of databases, including MS SQL, and they had a Linux version of the client, but I can't remember the name. But really any Sybase client that can connect to MS SQL with some ease will work.

Thanks for any suggestions or ideas ...
Alex
I want to centralize my previous standalone application. The previous application used VB.NET and Access XP. Now I want to keep a centralized database (SQL Server 2005) with VB.NET 2005. This time, thousands of concurrent users will connect to the database at the same time.
In my application, when a ticket is being issued to a tourist, an SQL query finds the max(ticketno) for the current month from the main table and reserves this number for the current ticket. When the operator has finished entering information and clicks SAVE button, the record is saved to the main table with this ticket no. Previously there was no issue since the database was standalone.
I want to know how to reserve the new ticket number so that other concurrent users do not use the same number when saving a record. How can I better tune the database for thousands of concurrent users at the same time? I also want the other user not to get an error message when he attempts to save a record; it should be handled automatically by the database.
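The direction I am leaning toward is a small counter table that hands out the next number atomically instead of the max(ticketno) query, something like the sketch below (the TicketCounters table, its columns and the month key are made up for illustration):

DECLARE @NewTicketNo int

BEGIN TRAN
    -- increment and read in a single statement so two operators cannot get the same number
    UPDATE TicketCounters
    SET @NewTicketNo = last_ticket_no = last_ticket_no + 1
    WHERE ticket_month = '200701' -- key for the current month
COMMIT TRAN

SELECT @NewTicketNo AS reserved_ticket_no

I would like confirmation that this is the right pattern before I redesign the application around it.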
Hello,

I have an ASP.NET website which uses SQL Server 2000 as its database. I am tracking the customers who visit the website. We are getting around 10000 users per day. When each page makes a request, the request details are inserted into SQL tables. The tables I am using are indexed. I am confused about when these indexes will be regenerated. Will they be regenerated automatically when new rows are inserted, or do we need to regenerate them manually? And what is the difference between clustered indexing and full-text indexing? Which one should I use for better performance?

Thanking you,
Navaneeth
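In case it helps frame the question, these are the maintenance commands I keep seeing mentioned for SQL Server 2000 (the table name is a placeholder); part of my confusion is whether or when they are actually needed:

DBCC SHOWCONTIG ('PageRequests')        -- report fragmentation for the table
DBCC DBREINDEX ('PageRequests', '', 90) -- rebuild all of its indexes with a 90 percent fill factor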
Hi everyone, when an error occurs in a transaction, we can also create a message for that error ourselves by using @@error in order to print a message. However, I wonder whether it is possible to really catch and handle the error. For example, when an error occurs it terminates the transaction, so how can we prevent the transaction from terminating? I have also examined the RAISERROR statement, but it does not prevent a transaction from terminating either; it only provides an additional message for the error.
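To show what I mean, the pattern I have been trying is below (the table, column and key are placeholders); it reports the error, but as far as I can tell it cannot stop the kinds of errors that terminate the transaction outright:

BEGIN TRAN

UPDATE SomeTable SET SomeColumn = 1 WHERE SomeKey = 42 -- placeholder statement

IF @@ERROR <> 0
BEGIN
    ROLLBACK TRAN
    RAISERROR ('Update failed; transaction rolled back.', 16, 1)
    RETURN
END

COMMIT TRAN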
Can anyone tell me if they know of a way to automate the update process from development servers to the live server, with little interference from an administrator?
I have a development team that is constantly updating their databases along with their ASP code, and they want to publish changes on a weekly basis. They have asked me for a way to take their new structures, tables, procedures etc. and copy them to the live servers, but NOT to interfere with existing customer data.
Funny I know – and I hate the idea btw :(
Any references, contacts, 3rd party tool recommendations welcome,
I am in the middle of a major migration project, moving from x86 SQL 2000 to IA64 SQL 2005. I have a business need to link to several legacy servers. I have a number of problems I am trying to solve:
1) Linking a Kerberos server to a non-Kerberos server.
2) Linking x64 or IA64 servers to x86 servers.
3) Linking SQL 2005 to SQL 2000.
Two of the errors I am encountering are:

TCP Provider: An existing connection was forcibly closed by the remote host.
Login failed for user '(null)'. Reason: Not associated with a trusted SQL Server connection.
OLE DB provider "SQLNCLI" for linked server "SCDC250DB" returned message "Communication link failure". (Microsoft SQL Server, Error: 10054)

And:

The OLE DB provider "SQLNCLI" for the linked server "SCDC250DB" reported an error. Authentication failed.
Cannot initialize the data source object of OLE DB provider "SQLNCLI" for linked server "SCDC250DB".
OLE DB provider "SQLCLI" for linked server "SCDC250DB" returned message "Invalid authorization specification". (Microsoft SQL Server, Error: 7399)
If someone has worked through these problems before, I would appreciate it if you could direct me to the relevant documentation to resolve these issues.
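For reference, this is the sort of linked-server definition I am working with; the login mapping at the end is a placeholder for the workaround of using a SQL login on the legacy box instead of pass-through Windows authentication:

EXEC sp_addlinkedserver
    @server = 'SCDC250DB',
    @srvproduct = '',
    @provider = 'SQLNCLI',
    @datasrc = 'SCDC250DB'

EXEC sp_addlinkedsrvlogin
    @rmtsrvname = 'SCDC250DB',
    @useself = 'FALSE',
    @locallogin = NULL,
    @rmtuser = 'legacy_link_login',      -- placeholder SQL login on the 2000 server
    @rmtpassword = 'placeholder_password'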
Thanks!
Brandon Forest
Database Administrator
Data & Web Services Team
Sutter Connect Information Technology
foresb@sutterhealth.org
I've combed through SQL Help to find the answer to my question, but I think it's telling me it can't be done. I work both from an office with my servers and from home. When I'm at home, I would like to access my SQL Server remotely using a tool such as MS SQL Server Management Studio. But it appears there is no way to access my SQL Server for management purposes using Management Studio over a remote internet connection. I can access the server using Management Studio while I'm on the internal office network, but not from home. Has anyone been able to do this, or can you recommend a third-party tool as robust as Management Studio? Thanks
Hi,

I'm trying to access the properties of my datasource programmatically so I can change the stored procedure it connects to depending on the value of a querystring.

In my page load I have:

Dim myMode = Request.Params("mode")
Select Case myMode
    Case "Category"
        sqldatasource1.SelectCommand = "productsbycategory"
        sqldatasource1.SelectCommandType = SqlDataSourceCommandType.StoredProcedure
        '****** Need to add querystring parameter
    Case "Wishlist"
        sqldatasource1.SelectCommand = "productsbywishlist"
        sqldatasource1.SelectCommandType = SqlDataSourceCommandType.StoredProcedure
        '****** Need to add Session variable parameter
    Case Else
        sqldatasource1.SelectCommand = "allProducts"
        sqldatasource1.SelectCommandType = SqlDataSourceCommandType.StoredProcedure
End Select

The problem that I have is that I need to pass a parameter to the first two procedures above: a querystring parameter to the first and a session variable to the second. How do I add parameters programmatically?

Many thanks
Hi, I have just started using MS SQL Server 2005. I already installed Visual Studio 2005, so SQL Server 2005 was already installed by default (Express Edition).
But unlike SQL Server 2000, I couldn't find any tool similar to SQL Enterprise Manager. I was able to create a new database only from the Server Explorer tools in Visual Studio. But it is pretty strange that this database acts like a Microsoft Access database (creating a database file in my app_data folder). (I am sorry, I am new to SQL Server 2005 Express Edition.)
My questions: 1. Is that (above) really how SQL Server 2005 Express works? 2. How do I create a new user for this database (example - user: sa, password: admin)?
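For question 2, this is what I think the T-SQL would be; 'sa' itself is built in, so the sketch below creates a new SQL login instead, and the login name, password and database name are only examples:

CREATE LOGIN app_user WITH PASSWORD = 'admin', CHECK_POLICY = OFF -- weak password, only for a local test
GO
USE MyDatabase -- the database created from Server Explorer
GO
CREATE USER app_user FOR LOGIN app_user
EXEC sp_addrolemember 'db_owner', 'app_user'

Is that right, and is there a graphical way to do the same thing in the Express Edition?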