SQL Report Works Fine On Internal Servers - Hosed On External Servers - Need Some Help
Nov 21, 2007
I have a report that was designed using SQL Reporting Services that sits on a SQL reporting server. It's nothing too exciting: essentially a three-page application with legal boilerplate on pages 2 and 3 and applicant data in fields on page 1.
We use rectangles to force page breaks to page 2 and to page 3.
When running the report on the report server, it shows and prints fine.
When running the report from the QA website internally, it shows and prints just fine.
When running the report from the production website from a machine internally, it shows and prints just fine.
When running the report from outside of the company network, the report is jacked: it obliterates large chunks of text, crams text together, and creates blank pages.
I need help determining where I should even begin troubleshooting this!
Hi All, I am doing some research about External Linked Servers and am hoping that someone can point me towards some best practices information and let me know about any gotchas that I should look out for when using this capability in applications. Thanks in advance,
I am attempting to use Service Broker to distribute message processing across multiple machines. I have multiple queues set up that all notify a single service with the QUEUE_ACTIVATION event. My external application then sits in a "WAITFOR RECEIVE" on that event queue, and when it receives a message for a particular queue, it begins processing the messages in that queue if that particular server is configured to process messages of that type.
My problem is that once one server starts receiving messages, the others don't always start processing and I end up with one server attempting to process all the messages by itself.
What is the best way to notify multiple servers that there are messages in a particular queue which need processing? I really like the WAITFOR clause since it eliminates the need to poll the server intermittently, but I also don't want each server to sit in a WAITFOR RECEIVE on every queue just to avoid polling, because I could easily end up with around a thousand connections sitting in that state. I also don't know how many servers there will be processing messages, so I'd like to avoid having to set up an event queue for each server.
(Remus, if you read this, I was at the SQL Performance Lab in building 35 about a month ago and we spoke about using Service Broker in our application.)
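(For concreteness, a minimal sketch of the dispatcher loop described above, assuming the QUEUE_ACTIVATION event notifications land in a queue named dbo.QueueActivationEvents; the queue name and XML handling are illustrative:)
Code Snippet
DECLARE @msg xml, @activatedQueue sysname

-- Block up to 5 seconds waiting for a QUEUE_ACTIVATION event notification
WAITFOR (
    RECEIVE TOP (1) @msg = CAST(message_body AS xml)
    FROM dbo.QueueActivationEvents
), TIMEOUT 5000

-- The EVENT_INSTANCE XML names the queue that has messages waiting
SELECT @activatedQueue = @msg.value('(/EVENT_INSTANCE/ObjectName)[1]', 'nvarchar(128)')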
On some servers the stored procedure below runs fine. On others we get error -2147217900, "Incorrect syntax near '___ ___'", which comes from the line that starts with AND (ISNUMERIC(SUBSTRING etc.:
-----------------------------------------------------------------------------
IF @Specialty = 'None'
BEGIN
    -- No agent specialty selected.
    EXEC('
        TRUNCATE TABLE tblAgents
        INSERT INTO tblAgents
        SELECT DISTINCT c1.accountno, c1.recid, c1.company, c1.contact, c1.secr,
            c1.address1, c1.address2, c1.city, c1.state,
            LEFT(UPPER(c1.zip),5) AS zip, c1.phone1, c1.fax,
            '''' AS email, ''0'' AS dist, NULL AS lat, NULL AS long
        FROM contact1 c1
        WHERE c1.u_key1 LIKE ''' + @DsgntrAgnt + '''
            AND (c1.u_key4 IN (' + @PrefDsgntrList + ')
            AND (ISNUMERIC(SUBSTRING(c1.zip,1,5)) = 1)
            OR (c1.zip LIKE ''___ ___''))
    ')
END
--------------------------------------------------------------------------
Thanks
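(A debugging aside, offered as a guess: since the statement is built by concatenation, capturing the final string and printing it on a failing server shows exactly what the parser sees. An empty or NULL @PrefDsgntrList, for instance, would malform the IN (...) list and can shift the reported error to a nearby token:)
Code Snippet
DECLARE @sql varchar(8000)
SET @sql = 'SELECT 1 WHERE 1 IN (' + ISNULL(@PrefDsgntrList, '') + ')'
PRINT @sql  -- inspect the statement the failing server would execute
-- EXEC(@sql)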
I have a report which I have tested and which works fine. Now I'm trying to use it as a subreport. The "outer" or main report is very simple: it just has a company-standard banner and some header/footer information, and then a single subreport. There is no passing of parameters between the main report and the subreport. The subreport does have its own parameter to govern its dataset, and provides its own default for that.
The error that I'm getting is this:
[rsErrorExecutingSubreport] An error occurred while executing the subreport 'subreport1': An error has occurred during report processing.
[rsMissingFieldInDataSet] The data set 'WarrantMasterCube' contains a definition for the Field 'Year'. This field is missing from the returned result set from the data source.
[rsErrorReadingDataSetField] The data set 'WarrantMasterCube' contains a definition for the Field 'Year'. The data extension returned an error during reading the field.
[rsMissingFieldInDataSet] The data set 'WarrantMasterCube' contains a definition for the Field 'Month'. This field is missing from the returned result set from the data source.
[rsErrorReadingDataSetField] The data set 'WarrantMasterCube' contains a definition for the Field 'Month'. The data extension returned an error during reading the field.
[rsMissingFieldInDataSet] The data set 'WarrantMasterCube' contains a definition for the Field 'Date'. This field is missing from the returned result set from the data source.
[rsErrorReadingDataSetField] The data set 'WarrantMasterCube' contains a definition for the Field 'Date'. The data extension returned an error during reading the field.
[rsMissingFieldInDataSet] The data set 'WarrantMasterCube' contains a definition for the Field 'Wt_TO_MTD'. This field is missing from the returned result set from the data source.
[rsErrorReadingDataSetField] The data set 'WarrantMasterCube' contains a definition for the Field 'Wt_TO_MTD'. The data extension returned an error during reading the field.
[rsNone] An error has occurred during report processing.
Of course, this doesn't happen when I execute the subreport by itself. What kinds of things should I be looking at to get to the bottom of this? Thanks!
Works fine in Designer, but when I load it in Report Services I get the following error. Anybody know what to do? There is one subreport with this report; maybe it's the passed value, but what could be wrong?
Item has already been added. Key in dictionary: '9' Key being added: '9'
I have setup a database mirroring session with witness - MachineA is the principal, MachineB is the mirror, and MachineC is the witness. Each SQL Server instance is hosted on its own machine. The mirroring is working correctly. If I submit data to the database on MachineA, and then unplug the network cable on MachineA, MachineB automatically becomes the principal, and I can see the data that I originally submitted to MachineA on MachineB. All the settings are showing correctly in Management Studio.
My issue is with the SQL Native Client and a front-end application that needs to make use of this database. I have setup my front-end application to use the ODBC client and specified the failover server in both the ODBC setup and the connection string. Here is the connection string that I am using :
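(The string itself didn't survive the post; for reference, a mirroring-aware connection string generally has this shape, with every value below illustrative. Note the keyword spelling differs by provider: Failover_Partner for the SQL Native Client ODBC driver, FailoverPartner for OLE DB.)
Code Snippet
Driver={SQL Native Client};Server=MachineA;Failover_Partner=MachineB;Database=MyMirroredDB;Trusted_Connection=yes;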
Everything works perfectly on my front-end application when MachineA is the principal. If I unplug the network cable on MachineA, MachineB becomes the principal, and the failover occurs correctly on the database side. The problem is that my front-end application is not able to query the database on MachineB.
BUT - if I plug the network cable back in on MachineA (making the database on MachineA the mirror), the front-end application now works and can access the principal database on MachineB. I wrote a quick tester application to verify what I am seeing, and I am convinced that this is what is happening. The mirroring is working perfectly, and everything is setup correctly. The SQL Native Client is setup correctly. The problem is that the automatic failover to MachineB that is built into the SQL Native Client only works if both servers are plugged in.
In this scenario, when I plug both servers in, I know that the front-end app is definitely pulling from MachineB (since the mirror database on MachineA is in recovery mode, it's unavailable, and the front-end app displays the server that it is pulling data from).
Am I using an out-dated SQL Native Client? The version number displayed in the ODBC configuration page is 2005.90.2047.00, and is dated 4/14/2006. Has anyone experienced this issue? I'm guessing that it's a problem in the SQL Native client, since the mirroring really seems to be working correctly.
Can anyone tell me of a way to automate the update process from development servers to the live server, with little interference from an administrator?
I have a development team that are constantly updating their databases along with their ASP code, and they want to publish changes on a weekly basis. They have asked me for a way to take their new structures, tables, procedures etc., and copy them to the live servers, but NOT to interfere with existing customer data.
Funny I know – and I hate the idea btw :(
Any references, contacts, 3rd party tool recommendations welcome,
I am in the middle of a major migration project, moving from x86 SQL 2000 to IA64 SQL 2005. I have a business need to link to several legacy servers. I have a number of problems I am trying to solve.
1) Linking a Kerberos server to a non-Kerberos server.
2) Linking x64 or IA64 servers to x86 servers.
3) Linking SQL 2005 to SQL 2000.
Two of the errors I am encountering are:
------------------------------
TCP Provider: An existing connection was forcibly closed by the remote host.
Login failed for user '(null)'. Reason: Not associated with a trusted SQL Server connection.
OLE DB provider "SQLNCLI" for linked server "SCDC250DB" returned message "Communication link failure". (Microsoft SQL Server, Error: 10054)
------------------------------
And
------------------------------
The OLE DB provider "SQLNCLI" for the linked server "SCDC250DB" reported an error. Authentication failed.
Cannot initialize the data source object of OLE DB provider "SQLNCLI" for linked server "SCDC250DB".
OLE DB provider "SQLNCLI" for linked server "SCDC250DB" returned message "Invalid authorization specification". (Microsoft SQL Server, Error: 7399)
------------------------------
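(Not a definitive fix, but the Login failed for user '(null)' message usually means the Windows credentials aren't surviving the hop to the legacy box, i.e. the double-hop/delegation problem that Kerberos is supposed to solve. One workaround sketch is mapping an explicit SQL login for the linked server; the remote login name and password below are placeholders:)
Code Snippet
EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'SCDC250DB',
    @useself     = 'FALSE',
    @locallogin  = NULL,           -- apply to all local logins
    @rmtuser     = N'linked_login',
    @rmtpassword = N'********'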
If someone has worked through these problems before, I would appreciate it if you could direct me to the relevant documentation to resolve these issues.
Thanks!
Brandon Forest
Database Administrator
Data & Web Services Team
Sutter Connect Information Technology
foresb@sutterhealth.org
I'm working with the SQL Report Viewer in VS2K5. In the Data tab, where the Dataset drop-down list is, I click the "..." to edit the dataset. I am currently using the following for the data source: Data Source=ServerName;Initial Catalog=DatabaseName. In one database, I store an employee ID number. I need to access a different database on a different server to reference the employee ID and pull the employee's name. Is there a way to specify two different databases on two different servers in the connection string above? Thanks!
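(A connection string can only point at one server, so the usual route is a linked server defined on ServerName plus a four-part name in the dataset query; the linked server, database, and table names below are illustrative:)
Code Snippet
SELECT e.EmployeeID, n.EmployeeName
FROM dbo.Employees AS e
JOIN HRServer.HRDatabase.dbo.EmployeeNames AS n
    ON n.EmployeeID = e.EmployeeID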
I am trying to configure MOSS and SSRS, and when I go to Grant Database Access via Central Admin, I get this error: No report servers were found on the specified machine.
I have created the SharePoint integrated database via the Reporting Services Configuration tool. Any ideas?
Same RDL, 2 different servers. I run the report on my computer and export to PDF, it prints properly. When the customer runs the report on their server (SSRS 2K5 SP1, same as mine), they get it displayed differently. The columns on the report extend to the next page and the lines are thicker.
Is this a formatting issue on the customer's PC? It uses standard fonts (Tahoma, Sans-serif).
Error: "no report servers were found on the specified machine"
I get this error when launching the Reporting Services Configuration Manager. I've never seen this error before; usually it can talk to my local server name, but unfortunately it's not. This is a new test server and the first time I've come across this message after setting up Reporting Services on 2 machines successfully in the past.
I'm looking to deploy some SQL Server reports and I want to restrict the access that the users have. Currently when connecting to the reports site they have access to a lot of functionality through the header bar, for example - Properties - New Folder - New Data Source - My Subscriptions - Site Settings - Search etc.
How can I disable or hide all these options so that all the user sees is the list of reports?
Our report servers are constantly getting the error below.
What causes this? I know how to fix it (in fact, I've automated the fix), but why does it constantly happen on some servers? I'd like to know what causes it so I can address it at the source, proactively, instead of having to keep fixing it after the fact.
Reporting Services Error
The report server cannot decrypt the symmetric key used to access sensitive or encrypted data in a report server database. You must either restore a backup key or delete all encrypted content. Check the documentation for more information. (rsReportServerDisabled) (rsRPCError) Get Online Help
The report server cannot decrypt the symmetric key used to access sensitive or encrypted data in a report server database. You must either restore a backup key or delete all encrypted content. Check the documentation for more information. (rsReportServerDisabled)
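(For anyone hitting this, the two documented ways out are restoring the key backup or deleting the encrypted content; a sketch with the rskeymgmt utility, where the file path and password are placeholders:)
Code Snippet
rem Restore the previously backed-up symmetric key
rskeymgmt -a -f C:\backup\rskey.snk -p StrongPassword

rem Or, with no backup, delete all encrypted content and re-enter credentials
rskeymgmt -d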
I have a report designed in RB3 that uses a data source from a SQL database that is on the report server. I want to add data to the report from an Access database on a network drive.
I can add the second data source and create a dataset to add data to the report. The dataset query returns data from the Access database, but when I run the report I get the following error.
An error has occurred during report processing. (rsProcessingAborted)
Cannot create a connection to data source 'Feasibility'. (rsErrorOpeningConnection)
Also, when I test the connection to the Access database, I get ERROR [IM002] [Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified.
I noticed that Report Builder is connecting to the report server. If I disconnect from the report server I can connect to the Access database but not the SQL database.
How can I get the report to run against both data sources?
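(A guess at the mechanism: when connected, the query executes on the report server, so the ODBC DSN and Access driver have to exist there, not on your local machine; that would explain IM002 flipping depending on whether you're connected. A DSN-less string, assuming the Access driver matching the server's bitness is installed there and with an illustrative UNC path:)
Code Snippet
Driver={Microsoft Access Driver (*.mdb, *.accdb)};Dbq=\\fileserver\share\Feasibility.accdb;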
I am a webdesigner who at the moment does not know SQL (although I plan on remedying that), so I am developing a page with a DB designer: he is doing the DB work, I am doing the look/feel. But he asked me the following questions, which I cannot seem to find an answer to, and the guys who tend our server are useless, so I am hoping someone here can help me. In general, what he needs to know is:
"What is the external address (either in domain format or IP) and the equivalent internal address, so that we can access the MS SQL running on your server? The internal address is needed for the webpages to talk to it, while the external address is needed for development of the pages in Visual Studio and the database tools." Also, he later sent me an email asking that when I get this info (from someone), it would be useful to get a sample string/query to access the DB. I am running SQL 2000 and have Enterprise Manager. Where can I find this information, or how do I figure it out?
Thank you for all the help -- Please let me know if you need any more info.
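(Since he asked for a sample, here is the generic shape of a SQL Server 2000 connection string plus a smoke-test query; the address, database, and login are placeholders to be replaced with the real values you get from whoever manages the server:)
Code Snippet
-- Connection string (ADO/OLE DB style), all values illustrative:
-- Provider=SQLOLEDB;Data Source=192.0.2.10;Initial Catalog=MyDatabase;User ID=webuser;Password=secret;

-- Smoke test once connected:
SELECT @@SERVERNAME AS server_name, DB_NAME() AS database_name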
I am stuck at a very awkward place. I have created one package which uses an Oracle view as its source for data transfer. The problem is that when I run the package through dtexec it works fine, but when I try to schedule it I get the following error:
Error: 2008-03-24 13:52:40.22 Code: 0xC0202009 Source: pk_BMR_FEED_oracle Connection manager "Conn_BMR" Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft OLE DB Provider for Oracle" Hresult: 0x80004005 Description: "Oracle client and networking components were not found. These components are supplied by Oracle Corporation and are part of the Oracle Version 7.3.3 or later client software installation.
Provider is unable to function until these components are installed.".
I am able to run the package outside the SQL job and also to connect to Oracle. I have the Oracle 9i client installed on the server, and SQL Server is 2005.
So I get the basics: I have a script that can deploy a single report to a single server, and other simple scripts that work as well. Now I want a script that can deploy a single report to multiple SharePoint servers; I have about 30. I know we are probably doing this wrong, as in, we shouldn't be deploying the same report to 30 sites, since they are all technically on the same server. However, they are all independent site collections, so as far as I know there is no way to deploy to a central location and then link to them from each site.
So I tried to copy elements of my single-report deploy script and duplicate the Sub Main() to Sub Main1() and Sub Main2() with multiple "ParentPath1" Dims; yes, I know I'm stupid. That obviously didn't work. Looking at some other scripts, including my own subscription updater, I'm sure I need to do some kind of "For Each item in X", but where to start? The other lists I've seen all come from rs.Something. My list is just 30 variables in the script; how can I say, for each site in this list, publish this report?
Here's the simple script that works fine. I just don't know how to modify it to make it deploy this to multiple sites. After I figure that out, I was going to accept the challenge of deploying multiple reports to multiple sites in a single script...
Dim warnings As Warning() = Nothing
Dim parentFolder As String = "Reports"
Dim filePath As String = "filepath"
Dim reportName As String = "Report file name"
Dim parentPath As String = "Sharepoint site URL"

Public Sub Main()
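(One way to avoid cloning Sub Main, sketched against the rs.exe scripting model: hold the site URLs in an array and loop. PublishReport below is a hypothetical helper standing in for whatever CreateCatalogItem/CreateReport call the working script already makes, and the URLs are placeholders.)
Code Snippet
' The 30 site collections, listed once (placeholder URLs shown)
Dim sites As String() = New String() { _
    "http://portal/sites/siteA", _
    "http://portal/sites/siteB" _
}

Public Sub Main()
    For Each site As String In sites
        ' Reuse one publish routine per site instead of Main1(), Main2(), ...
        PublishReport(filePath, reportName, site & "/" & parentFolder)
    Next
End Sub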
I have many jobs on SQL 05 and all work but one. This one writes to an Access DB on the same server as SQL. The package works fine, but when executed in the context of the SQL Agent job, it fails.
Jobs that write to a text file work fine. The Access DB has no password required. By the way, that job worked fine in SQL 2000.
Hi, I am executing an SSIS package using dtexec. The 64-bit version of dtexec works fine, but when I use the 32-bit version of dtexec, it fails. I have local admin rights. The error description follows. Please help.
Microsoft (R) SQL Server Execute Package Utility Version 9.00.1399.06 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
Started: 9:24:30 AM
Error: 2008-03-18 09:24:32.54 Code: 0xC0202009 Source: IMALCRM Connection manager "IMAL SRC" Description: An OLE DB error has occurred. Error code: 0x800703E6. An OLE DB record is available. Source: "Microsoft OLE DB Service Components" Hresult: 0x800703E6 Description: "Invalid access to memory location.". End Error
Error: 2008-03-18 09:24:32.54 Code: 0xC020801C Source: Load Fund Detail V_FUND_DETAIL [16] Description: The AcquireConnection method call to the connection manager "IMAL SRC" failed with error code 0xC0202009. End Error
Error: 2008-03-18 09:24:32.54 Code: 0xC0047017 Source: Load Fund Detail DTS.Pipeline Description: component "V_FUND_DETAIL" (16) failed validation and returned error code 0xC020801C. End Error
Error: 2008-03-18 09:24:32.54 Code: 0xC004700C Source: Load Fund Detail DTS.Pipeline Description: One or more component failed validation. End Error
Error: 2008-03-18 09:24:32.54 Code: 0xC0024107 Source: Load Fund Detail Description: There were errors during task validation. End Error
DTExec: The package execution returned DTSER_FAILURE (1).
Started: 9:24:30 AM Finished: 9:24:32 AM Elapsed: 2.078 seconds
Hello! I'm currently building a site that uses an external database to store all the product details, and an internal database that will act as a cache so that we don't have to keep hitting the external database to retrieve the products every time a customer requests a list.

What I need to do is retrieve all these products from External and insert them into Internal if they don't exist; if they do already exist, then I have to update Internal with new prices, number in stock, etc.

I was wondering if there was a way to insert/update these products en masse without looping through and building a new insert/update query for every product; there could be thousands at a time! Does anyone have any ideas, or could you point me in the right direction? I'm thinking that because I need to check whether the products exist in a different data store than the original source, I don't have a choice but to loop through them all.

Cheers,
G.
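(If the external rows can first be staged into a table on the internal server, the whole sync collapses to two set-based statements rather than a per-product loop; all table and column names below are illustrative:)
Code Snippet
-- 1) Update products that already exist in the cache
UPDATE c
SET    c.Price = s.Price,
       c.NumberInStock = s.NumberInStock
FROM   dbo.ProductsCache AS c
JOIN   dbo.ProductsStaging AS s
       ON s.ProductID = c.ProductID

-- 2) Insert products the cache has never seen
INSERT INTO dbo.ProductsCache (ProductID, Price, NumberInStock)
SELECT s.ProductID, s.Price, s.NumberInStock
FROM   dbo.ProductsStaging AS s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.ProductsCache AS c
                   WHERE c.ProductID = s.ProductID)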
If this post belongs somewhere else, I apologize. I have spent several days trying to solve this problem with no luck. My site is online, hosted at NeikoHosting. I can connect to the database remotely when adding a data control, and it tests fine. But when running the page it won't connect. Even if I go in and change the Web.Config connection string to a local Data Source provided to me by Neiko, it still won't work. It just won't connect. Here are the two connection strings in the Web.Config, minus my login info (only the remote string passes testing; neither works on the site):

<add name="yourchurchmychurchDBConnectionString" connectionString="Data Source=MSSQL2K-A;Initial Catalog=yourchurchmychurchDB;Persist Security Info=True;User ID=me;Password=pwd" providerName="System.Data.SqlClient" />
<add name="yourchurchmychurchDBConnectionString2" connectionString="Data Source=66.103.238.206;Initial Catalog=yourchurchmychurchDB;Persist Security Info=True;User ID=me;Password=pwd" providerName="System.Data.SqlClient" />

Here is the stack trace, if that helps:

[DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access denied.
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.OleDb.OleDbException: [DBNETLIB][ConnectionOpen (Connect()).]SQL Server does not exist or access denied.
Source Error:
An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below. Stack Trace:
My DTS package works fine if I execute it manually, but I need it to run automatically just after midnight. I defined my schedule and made sure the job was present in SQL Server Agent > Jobs, but it fails, and the Job History shows the following error:
DTSRun: Loading...
DTSRun: Executing...
DTSRun OnStart: DTSStep_DTSDataPumpTask_1
DTSRun OnError: DTSStep_DTSDataPumpTask_1, Error = -2147467259 (80004005)
Error string: [Microsoft][ODBC Microsoft Access Driver] Cannot start your application. The workgroup information file is missing or opened exclusively by another user.
Error source: Microsoft OLE DB Provider for ODBC Drivers
Help file:
Help context: 0
Error Detail Records:
Error: -2147467259 (80004005); Provider Error: 1901 (76D)
Error string: [Microsoft][ODBC Microsoft Access Driver] Cannot start your application. The workgroup information file is missing or opened exclusively by another user.
Error source: Microsoft OLE DB Provider for ODBC Drivers
Help file:
Help context: 0
DTSRun OnFinish: DTSStep_DTSDataPumpTask_1
DTSRun: Package execution complete.
Process Exit Code 1. The step failed.
I have the following query which works fine when it's executed as a single query, but when I UNION its result with other queries, it returns a different set of data.
Anyone know why that might be the case?
select top 100
    max(contact._id) "_id",
    max(old_trans.date) "callback_date",
    7 "priority",
    max(old_trans.date) "recency",
    count(*) "frequency"
    --contact._id, contact.callback_date
from topcat.class_contact contact
inner join topcat.MMTRANS$ old_trans on contact.phone_num = old_trans.phone
where contact.phone_num is not null
    and contact.status = 'New Contact'
group by contact._id
order by "recency" desc, "frequency" desc
I've included the union query here for completeness of the question.
begin
declare @current_date datetime
set @current_date = GETDATE()

select top 100 _id, callback_date, priority, recency, frequency
from (
    (
        select top 10 _id, callback_date, 10 priority, @current_date recency, 1 frequency
            --, DATEPART(hour, callback_date) "hour", DATEPART(minute, callback_date) "min"
        from topcat.class_contact
        where status = 'callback'
            and (DATEPART(year, callback_date) <= DATEPART(year, @current_date))
            and (DATEPART(dayofyear, callback_date) <= DATEPART(dayofyear, @current_date))
            -- all call backs within that hour will be returned
            and (DATEPART(hour, callback_date) <= DATEPART(hour, @current_date))
            and (DATEPART(hour, callback_date) <> 0)
        order by callback_date asc
        --order by priority desc, DATEPART(hour, callback_date) asc, DATEPART(minute, callback_date) asc, callback_date asc
    )
    union
    (
        select top 10 _id, callback_date, 9 priority, @current_date recency, 1 frequency
        from topcat.class_contact
        where status = 'callback' and callback_date is not null
            and (DATEPART(year, callback_date) <= DATEPART(year, @current_date))
            and (DATEPART(dayofyear, callback_date) <= DATEPART(dayofyear, @current_date))
            and (DATEPART(hour, callback_date) <= DATEPART(hour, @current_date))
            and (DATEPART(hour, callback_date) = 0)
        order by callback_date asc
    )
    union
    (
        select top 10 _id, callback_date, 8 priority, @current_date recency, 1 frequency
        from topcat.class_contact
        where status = 'No Connect' and callback_date is not null
            and (DATEPART(year, callback_date) <= DATEPART(year, @current_date))
            and (DATEPART(dayofyear, callback_date) <= DATEPART(dayofyear, @current_date))
            and (DATEPART(hour, callback_date) <= DATEPART(hour, @current_date))
        order by callback_date asc
    )
    union
    (
        select top 100 max(contact._id) "_id", max(old_trans.date) "callback_date",
            7 "priority", max(old_trans.date) "recency", count(*) "frequency"
            --contact._id, contact.callback_date
        from topcat.class_contact contact
        inner join topcat.MMTRANS$ old_trans on contact.phone_num = old_trans.phone
        where contact.phone_num is not null and contact.status = 'New Contact'
        group by contact._id
        order by "recency" desc, "frequency" desc
    )
) contact_queue
order by priority desc, recency desc, callback_date asc, frequency desc
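(A hedged observation on why the UNION might change things: UNION, unlike UNION ALL, removes duplicate rows across branches, and the ORDER BY inside each parenthesized branch only decides which TOP rows that branch keeps; it has no effect on the final ordering. A two-line repro of the de-duplication:)
Code Snippet
SELECT 1 AS v UNION SELECT 1      -- one row
SELECT 1 AS v UNION ALL SELECT 1  -- two rows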
I have a report which has multivalue parameters enabled. If I give NULL it displays everything correctly, but if I give a specific ClientId the report doesn't reflect it. Yet if I run my sproc in VS2005 and in SSMS it works the way I want it to. This is my sproc:
Code Snippet
set ANSI_NULLS ON
set QUOTED_IDENTIFIER ON
go
ALTER Procedure [dbo].[usp_GetOrdersByOrderDate]
    @StartDate datetime,
    @EndDate datetime,
    @ClientId nvarchar(max) = NULL
AS
Declare @SQLTEXT nvarchar(max)
if @ClientId is NULL
BEGIN
    SELECT o.OrderId, o.OrderDate, o.CreatedByUserId, c.LoginId, o.Quantity,
        o.RequiredDeliveryDate, cp.PlanId, cp.ClientPlanId
        --cp.ClientId
    FROM [Order] o
    Inner Join ClientPlan cp on o.PlanId = cp.PlanId
        -- and o.CreatedByUserId = cp.UserId
    Inner Join ClientUser c on o.CreatedByUserId = c.UserId
    WHERE
        --cp.ClientId = @ClientId
        --AND
        o.OrderDate BETWEEN @StartDate AND @EndDate
    ORDER BY o.OrderId DESC
END
ELSE
BEGIN
    SELECT @SQLTEXT = 'Select o.OrderId, o.OrderDate, o.CreatedByUserId, c.LoginId, o.Quantity,
        o.RequiredDeliveryDate, cp.PlanId, cp.ClientPlanId
        --cp.ClientId
        FROM [Order] o
        Inner Join ClientPlan cp on o.PlanId = cp.PlanId
        --AND cp.ClientId in (' + convert(Varchar, @ClientId) + ' )
        Inner Join ClientUser c on o.CreatedByUserId = c.UserId
        WHERE cp.ClientId in (' + convert(Varchar, @ClientId) + ')
        AND o.OrderDate BETWEEN ''' + Convert(varchar, @StartDate) + ''' AND ''' + convert(varchar, @EndDate) + '''
        ORDER BY o.OrderId DESC'
    exec(@SQLTEXT)
END
--return (@SQLTEXT)
I have 2 datasets in this report: one for the above sproc, and another dataset that gives me the client name, as follows.
Code Snippet
ALTER Procedure [dbo].[usp_GetClientsAll]
@ClientId nvarchar(max) = NULL
AS
--Declare @ClientId nvarchar(max)
SELECT
NULL ClientId,
'<All Clients >' ClientName
FROM
Client
Union
SELECT
ClientId,
ClientName
FROM
Client
Where
ClientId = @ClientId
OR
(
ClientId = ClientId
OR
@ClientId IS NULL
)
In the first dataset's parameter list I have omitted ClientId but kept it in the report parameters. So when I choose Select All it works, but when I select a particular client it gives me the same result as Select All.
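(Two hedged guesses from the code as posted: first, SSRS hands a multivalue parameter to the dataset as an array, so it has to be flattened into the CSV string the dynamic SQL expects, e.g. with a dataset parameter expression like the one below; second, convert(Varchar, @ClientId) without a length defaults to 30 characters, which silently truncates longer ID lists. Separately, in usp_GetClientsAll the branch ClientId = ClientId is true for every row, which by itself would make any single selection behave like Select All.)
Code Snippet
=Join(Parameters!ClientId.Value, ",")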
We have one LEDGER table, where all the daily activities are stored. The LEDGER table has 4 indexes (1 clustered and 3 non-clustered). To get AR we use this table.
Well, the problem is that every 1-2 months or so, any simple AR query takes a long time, and every other client gets slow responses (queries are very slow or sometimes block).
If we DROP any index on the LEDGER table and then put it back (RECREATE), all our queries work fine and faster. This holds for another 1-2 months, until we see the same issue again.
A classic case happened today. Queries were running fine until 8 AM this morning. We uploaded some 50 thousand records to the LEDGER table (data conversion). Well, after 30 minutes, all simple AR queries started taking a long time. We DROPPED an index on the LEDGER table and everything got faster. Just to be safe, we added the same index back again, and everything is still faster.
What is this? Is it our query, an index, huge transactions, or no free space?
We are scheduled to run SP4 next week, but is there any solution in the meantime, or at least an explanation of what this is?
Also, is there any way to KILL all SQL Server processes that take more than a minute? We just don't want all our clients to slow down because of one query.
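(Given that dropping and recreating an index fixes it, stale statistics after the bulk load are a plausible suspect; on SQL 2000 you can check fragmentation and refresh stats without the drop/recreate dance. Table name taken from the post, options illustrative:)
Code Snippet
-- Check fragmentation on the LEDGER table's indexes
DBCC SHOWCONTIG ('LEDGER')

-- Refresh statistics after large loads (cheaper than rebuilding the index)
UPDATE STATISTICS LEDGER WITH FULLSCAN

-- Or rebuild all indexes on the table
DBCC DBREINDEX ('LEDGER')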
Hey, I had a table with a column of data encrypted in another format. I was able to decrypt it and then re-encrypt it using symmetric keys, updating the table column with the new data. Now there is a user sp which needs to encrypt the password for a new user and put it in the table. I'm not able to make it work. I have this so far; something somewhere is wrong, and I don't know where. Please help. Thanks. I used the same script to do the encryption initially, but that was for the whole column. I need to see the encrypted version of the @inTargetPassword variable, but it's not working. It doesn't give me an error but gives me wrong data...
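(The sp body didn't make it into the post, so for comparison, here is the general in-procedure shape; the key, certificate, table, column, and parameter names are placeholders. One classic cause of wrong data rather than an error is a type mismatch: DecryptByKey returns varbinary, and it must be converted back to exactly the character type, varchar vs nvarchar, that was passed to EncryptByKey.)
Code Snippet
-- Open the key, encrypt the new password, store it (all names illustrative)
OPEN SYMMETRIC KEY MySymKey DECRYPTION BY CERTIFICATE MyCert;

UPDATE dbo.Users
SET PasswordEnc = EncryptByKey(Key_GUID('MySymKey'), @inTargetPassword)
WHERE UserId = @inUserId;

CLOSE SYMMETRIC KEY MySymKey;

-- Reading it back: convert to the SAME type that was encrypted
SELECT CONVERT(nvarchar(128), DecryptByKey(PasswordEnc))
FROM dbo.Users
WHERE UserId = @inUserId;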
People, I'm trying to publish my first website and am having a few problems.

I've got Visual Web Developer 2005 Express and am trying to use the Personal Website Starter Kit. (My SQL server is SQL Server 2005 Express Edition, which is also running on my local machine.) It seems to work fine when I run it on my localhost; as soon as I FTP it up to my web hosting company, I get an error message (see below):

An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)

My hypothesis is: it would appear that when running locally, the starter kit website uses my installation of SQL 2005 Express Edition, but when I upload all the files, the application is still trying to point at a local instance of SQL on my local PC, which it now cannot see. I'm guessing I need to somehow upload the SQL database onto my web host (I've purchased 100M of SQL Server 2005 space) and point the application at that SQL instance instead. But I don't know if I'm right about all this, or indeed how to do it if I am. Can anyone help?

Much thanks in advance,
Will
I have a simple update statement that runs forever in SQL 2005 but works fine in SQL 2000. We have a new server where we installed SQL 2005 and restored the db. The table in question is WEEKLYSALESHISTORY. I even re-indexed all the indexes and rebuilt the stats as well, but still no luck; it's still running extremely long: 1 hour 20 minutes.
I'll try to give you some background on this table. WeeklySalesHistory has approx 30 fields. I have 11 indexes set up, weekending date being one of them, and ReplicationControl has an index on LastTransDateTime as well. So I think my indexes are fine.
/* Update WeekEnding Date for current weeks WeeklySales Records */
Update WeeklySalesHistory
set weekendingdate =
    (SELECT LastTransDateTime from ReplicationControl
     where TableName = 'WEEKHST')
where weekendingdate is null
WeeklySalesHistory has approx 100,000,000 rows; ReplicationControl has 631,000 (I think I can delete some from there to bring it down to 100 or 200 records), although I don't think this is the issue, since 2000 has the same thing and works fine.
I was trying to do this within SSIS and thought that was the issue. I am new to SSIS, but it runs long even if I just run it as a job with this simple update statement, so I think it's something with the tables, etc. that is wrong.
One thing I noticed: if I look at the statistics in SQL Server Management Studio, there are a ton of stats. Some are statistics on indexes, which makes sense, but then I have a ton named like hind_113_9_6 and similar; I must have 90 or so named like this. I'm not sure how to check all the stats on SQL 2000 to see if they moved over from there or what. I checked a few other tables and they don't have all these extra stats. Could this be causing the issue? Do I need to delete all these extras? Any help would be greatly appreciated.
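(Those extra objects look like auto-created or migrated column statistics; before deleting anything, it may be worth listing and refreshing them. A sketch, using the table name from the post and one of the statistic names you observed:)
Code Snippet
-- List all statistics on the table (auto-created ones are flagged)
EXEC sp_helpstats 'WeeklySalesHistory', 'ALL';

-- Refresh every statistic on the table after the restore/upgrade
UPDATE STATISTICS WeeklySalesHistory WITH FULLSCAN;

-- Drop one of the leftover statistics if it proves redundant
-- DROP STATISTICS WeeklySalesHistory.hind_113_9_6;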