I am getting a warning message in my SQL Server 6.5 error log file.
The warning is:
Lazywriter: WARNING, LRU LIST IS EMPTY (177 FREE BUFS,358 TOTAL BUFS)
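As far as I recall, in SQL Server 6.5 this warning relates to the 'free buffers' configuration option, which controls how many buffers the lazywriter tries to keep on the free (LRU) list. A sketch for inspecting and raising it (assumes sysadmin rights; the new value below is purely illustrative):

-- Check the current 'free buffers' setting, then raise it.
EXEC sp_configure 'free buffers'
EXEC sp_configure 'free buffers', 818   -- hypothetical value; size it to your memory configuration
RECONFIGURE WITH OVERRIDE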
I need to do some testing on capturing connections coming from outside using Profiler. Basically, I need to connect from other SQL Servers to this SQL Server and test whether the connection is captured by Profiler.
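One simple way to generate an outside connection to trace (a sketch): set up a Profiler trace with the Audit Login and SQL:BatchCompleted events, filtered on HostName or ApplicationName, then run a query against this instance from another SQL Server, for example over a linked server. The linked server name below is hypothetical:

-- Run this on the OTHER server; assumes a linked server named RemoteTest
-- pointing at the instance being traced.
SELECT TOP 1 name
FROM RemoteTest.master.dbo.sysdatabases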
The column prod_Master.M2_Pct is defined as float. Yet for some of the rows, the value in [% of Total Target] comes up as NULL even though there is a value in the prod.Measure column. There is valid non-NULL data in prod_Master.M2_Pct.
I tried LTRIM(RTRIM(prod.Measure)), but no change.
Use ProdDB
SELECT TOP (100) PERCENT
    dbo.prod.ProdNo AS [Prod No],
    dbo.prod.ProdName AS [Prod Name],
    CASE WHEN dbo.prod.Measure = 'P1' THEN dbo.prod_Master.P1_Pct
         WHEN dbo.prod.Measure = 'P2' THEN dbo.prod_Master.P2_Pct
         WHEN dbo.prod.Measure = 'P3' THEN dbo.prod_Master.P3_Pct
         WHEN dbo.prod.Measure = 'P4' THEN dbo.prod_Master.P4_Pct
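Worth noting: a CASE expression with no ELSE returns NULL whenever none of the WHEN branches match, so any Measure value other than 'P1' through 'P4', including values carrying hidden characters that LTRIM/RTRIM won't strip (tabs, CR/LF), produces NULL. A quick diagnostic sketch:

-- List Measure values that won't match any WHEN branch; comparing LEN to
-- DATALENGTH can expose hidden characters.
SELECT DISTINCT Measure, LEN(Measure) AS CharCount, DATALENGTH(Measure) AS ByteCount
FROM dbo.prod
WHERE Measure NOT IN ('P1', 'P2', 'P3', 'P4')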
I am trying to add a WHERE condition on an ID column (type INT) with values coming from a variable (type STRING). I am using CAST to cast the ID as varchar and then apply the condition, but I am not getting any results back. Following is an example of what I am trying to do, using a temp table so you can copy the T-SQL and run it as-is.
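Since the example itself didn't survive the post, here is a sketch of the most common cause, assuming the variable holds a comma-separated list of IDs (hypothetical data; runs as-is):

-- CAST(ID AS varchar) = @IDs compares each ID against the WHOLE string,
-- so a multi-value variable never matches.
CREATE TABLE #t (ID int)
INSERT INTO #t VALUES (1)
INSERT INTO #t VALUES (2)

DECLARE @IDs varchar(100)
SET @IDs = '1,2'

SELECT * FROM #t WHERE CAST(ID AS varchar(10)) = @IDs    -- returns no rows

-- One workaround: wrap both sides in delimiters and test membership with LIKE.
SELECT *
FROM #t
WHERE ',' + @IDs + ',' LIKE '%,' + CAST(ID AS varchar(10)) + ',%'   -- returns both rows

DROP TABLE #t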
I have configured Windows failover clustering 2012 on 4 of my test nodes.
I am trying to add another node into this cluster, but it's not happening. I am not even able to start the cluster service in services.msc.
After installing Windows failover clustering, when I go to the C:\Windows\Cluster folder, I am unable to find the CLUSDB, CLUSDB.1.container, CLUSDB.2.container, and CLUSDB.blf files.
These files are very much present on the other nodes where the cluster service is running.
I tried copying these files manually to the server where they are missing, but still no luck.
Coming from a Sybase environment, I would like to know the equivalent of sp_showplan. What it does is show the showplan for the query being run by a spid, which is passed as a parameter to sp_showplan.
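There is no built-in sp_showplan in SQL Server, but on SQL Server 2005 and later the dynamic management views get close (a sketch; requires VIEW SERVER STATE permission, and the spid value is hypothetical):

-- Show the XML plan for whatever the given session is currently running.
DECLARE @spid int
SET @spid = 57   -- session of interest

SELECT qp.query_plan
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_query_plan(r.plan_handle) AS qp
WHERE r.session_id = @spid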
I've got an employee table with a date-of-birth field in it. I need a query that will allow me to list all employees whose birthdays are coming up in the next 30 days (or 1 month, if easier). I've tried several approaches and am getting nowhere... Any help would be greatly appreciated.
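One approach (a minimal sketch, assuming a hypothetical Employees table with a DateOfBirth column): compute each employee's next birthday anniversary and keep the ones that fall within 30 days of today:

SELECT EmployeeID, DateOfBirth
FROM Employees
WHERE DATEADD(YEAR,
              -- years to add: this year's anniversary, plus 1 if it has already passed
              DATEDIFF(YEAR, DateOfBirth, GETDATE())
              + CASE WHEN DATEADD(YEAR, DATEDIFF(YEAR, DateOfBirth, GETDATE()), DateOfBirth) >= GETDATE()
                     THEN 0 ELSE 1 END,
              DateOfBirth) <= DATEADD(DAY, 30, GETDATE())

Note that GETDATE() carries a time component, so a birthday earlier today counts as already passed; strip the time portion first if that edge case matters.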
I just read the release notes for the SQL 2005 SP2 CTP, and the 'Select All' option on multi-select parameters is coming back EXACTLY as it was in the first release. If this is not the case then I apologise for the rant and request that someone clarify the situation, but otherwise... I had to change loads of reports because it was removed in SP1; now I'll have to change them again for SP2.
Please Microsoft, stop doing this to us!
At least make it a check-box option somewhere. The reasons for its removal in SP1 still exist, but it is a nice feature to have, so just make it configurable.
I will probably have to delay installing SP2 at various customer sites because it will make my parameter lists look stupid. This is a shame, because there are some useful updates in it.
Hi, I've downloaded a membership and roles provider, but I am very new to ADO.NET; I have come from an ASP background. Here's my code:

public override void AddUsersToRoles(string[] usernames, string[] rolenames)
{
    // Validate arguments
    foreach (string rolename in rolenames)
        if (!this.RoleExists(rolename))
            throw new ProviderException("Role name not found");

    foreach (string username in usernames)
    {
        if (username.IndexOf(',') > 0)
            throw new ArgumentException("User names cannot contain commas.");

        foreach (string rolename in rolenames)
        {
            if (IsUserInRole(username, rolename))
                throw new ProviderException("User is already in role.");
        }
    }

    SqlConnection db = this.OpenDatabase();
    SqlCommand cmd = new SqlCommand(
        "INSERT INTO UsersInRoles (UserName, RoleName) VALUES (@UserName, @RoleName)", db);
    cmd.Parameters.Add("@UserName", SqlDbType.VarChar, 100);
    cmd.Parameters.Add("@RoleName", SqlDbType.VarChar, 100);

    SqlTransaction tran = null;
    try
    {
        tran = db.BeginTransaction();
        cmd.Transaction = tran;
        foreach (string username in usernames)
        {
            foreach (string rolename in rolenames)
            {
                cmd.Parameters["@UserName"].Value = username;
                cmd.Parameters["@RoleName"].Value = rolename;
                cmd.ExecuteNonQuery();
            }
        }
        tran.Commit();
    }
    catch
    {
        tran.Rollback();
        throw;
    }
    finally
    {
        db.Close();
    }
}

private SqlConnection OpenDatabase()
{
    SqlConnection DB = new SqlConnection(this.connectionString);
    DB.Open();
    return DB;
}
I'm baffled by the results of the following query:
________________________________________________________

select events.[report number], [microfilm number], [crash date]
into lowspeed
from events
inner join vehicles on events.[report number] = vehicles.[report number]
where [dummy record] = 'N'
  and [estimated mph] between 1 and 10
  and ([1st harmful event] = 01 or [2nd harmful event] = 01)
  and [type of vehicle] in ('01','02','03','04','05','06','07','08','10','11','12')
  and [crash injury severity] = 5

select events.[report number], [microfilm number], [crash date]
into notlowspeed
from events
inner join vehicles on events.[report number] = vehicles.[report number]
where [dummy record] = 'N'
  and [estimated mph] > 10
  and ([1st harmful event] = 01 or [2nd harmful event] = 01)
  and [type of vehicle] in ('01','02','03','04','05','06','07','08','10','11','12')
  and [crash injury severity] = 5

select distinct [report number], [microfilm number], [crash date]
into truelowspeed
from lowspeed
where [report number] not in (select [report number] from notlowspeed)

select [report number]
into pedtable
from pedestrians

select [report number], [microfilm number], [crash date] as [Lowspeed Fatal Crashes - 1994]
from truelowspeed
where [report number] not in (select [report number] from pedtable)

drop table pedtable
drop table truelowspeed
drop table lowspeed
drop table notlowspeed
________________________________________________________
Let me explain what I'm trying to do.
In the first temp table, "lowspeed," I'm trying to lump together all crashes where a vehicle was doing between 1 and 10 mph. In "notlowspeed," I'm setting up a table of crashes in which at least one vehicle was going over 10 mph. By the third gyration, I'm trying to generate "truelowspeed," which I thought would contain only crashes in which all of the criteria in the first two queries were satisfied and all of the vehicles involved were doing between 1 and 10 mph (NO vehicle exceeding the 1 to 10 mph range). The "pedtable" maneuver simply pulls out any of these crashes in which a pedestrian was involved. What I'm getting is a lot of crashes with dummy records, several crashes in which one of the vehicles was going over 10 mph, and vehicle types involved in the crash that should have been excluded by the criteria I specified.
Thoroughly stumped. Would be most grateful for any kind of insight/advice.
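For comparison, the "every vehicle in the crash was 1-10 mph" condition can be expressed directly with NOT EXISTS instead of building and diffing temp tables (a sketch; it assumes [estimated mph], [type of vehicle], and [dummy record] live on the vehicles table):

select distinct events.[report number], [microfilm number], [crash date]
from events
inner join vehicles on events.[report number] = vehicles.[report number]
where [dummy record] = 'N'
  and ([1st harmful event] = 01 or [2nd harmful event] = 01)
  and [crash injury severity] = 5
  -- a qualifying crash may not contain any vehicle outside 1-10 mph
  -- or outside the allowed vehicle types
  and not exists (select 1
                  from vehicles v2
                  where v2.[report number] = events.[report number]
                    and (v2.[estimated mph] not between 1 and 10
                         or v2.[type of vehicle] not in ('01','02','03','04','05','06','07','08','10','11','12')))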
I am currently reading Itzik Ben-Gan's "Microsoft SQL Server 2012 High-Performance T-SQL Using Window Functions" and am attempting to test SUM() OVER() in SQL 2008, because that's what I've got. I do not currently have sample data (trying to generate it has become a major PITA), but I have some pseudocode.
My current code (actual production code) pulls a bunch of ITD (inception-to-date) contracts, then calculates a certain dollar amount based on monthly changes. Not all contracts have values during a given month, so here's what I cobbled together a few months ago. (Per our finance team, these numbers ARE accurate.)
WITH MonthlyVals AS (
    SELECT ContractID,
           SUM(Col1 - (Col2 + Col3 + Col4 + Col5)) AS MyTotal
    FROM MyTable
    WHERE MyDate >= @ThisMonthStartDate
      AND MyDate <= @ThisMonthEndDate
      AND StatementType IN (8,4,2)
[code]....
To test the totals, I also added a COMPUTE SUM(MyTotal) to the end of each query. (Yes, I know COMPUTE is deprecated; I just wanted a quick check.) The difference between the two bits of code was over 68k, with the SUM OVER() code coming up with a total higher than the CTE code. I know the CTE code is correct, for a fact; it went through extensive testing before going into production. Is it the way I joined the table for the SUM OVER()? Or is it the use of PARTITION BY?
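The join is the usual suspect here: SUM(...) OVER (PARTITION BY ...) is evaluated over the joined rowset, so if the join multiplies rows, each amount is counted once per matching row. A toy illustration with hypothetical data:

-- Contract 1 matches two detail rows, so the window total reports 200
-- even though the contract amount is only 100.
DECLARE @Contracts TABLE (ContractID int, Amount money)
DECLARE @Detail    TABLE (ContractID int)

INSERT INTO @Contracts VALUES (1, 100), (2, 50)
INSERT INTO @Detail    VALUES (1), (1), (2)

SELECT c.ContractID,
       SUM(c.Amount) OVER (PARTITION BY c.ContractID) AS WindowTotal
FROM @Contracts AS c
INNER JOIN @Detail AS d ON d.ContractID = c.ContractID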
How's everyone doing with 2005? I still feel like my heels are dragging a bit. The "knowledge" out there still seems a bit sparse. Am I right?
I'm about 1/5 of the way through a BI book and decided the course will probably be necessary as well: http://www.kimballgroup.com/html/kucourseMDWD.html There seems to be a lot to come to grips with. So that's all paid for.
It's been years since I've been on a course of any kind. Too busy working! What's the etiquette? Trousers and T-shirt? Tie? Sandals & three-quarter lengths?
This particular course seems to cover the "general overall approach", which I am pretty sure is a necessary part of what you need to be able to do. The book related to the course (which I have already bought) seems really helpful and valuable.
But I don't think it will cover the nitty-gritty of SSIS, which is where most of the grunt work happens. They have another course for that, but I can't wait that long. You can spend your whole life just reading books!
Are all the new little things in 2005 SSIS just a little too specialised? A little too clever? Do they limit us or empower us? Can we use them to get the job done? Or will it again be a case of "clever workarounds" when we travel far down a road and find out, once it's too late and we are committed, that it can't provide what we want? Is it really saving us time? Are we better off writing the scripts ourselves?
------------------------
Me: What do you want to know from your data warehouse?
Client: Err...Emm...Everything
Me: OK, that's great. That's all I need to know. I'll see you when it's done.
I have two date fields, start_date and end_date. I'd like to subtract the two dates and come up with a number (the number of days between the two dates). What function is there to do this? I haven't been able to find anything in BOL.

Start_date = 6/1/03
End_date = 6/8/03
End_date - start_date = 7
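DATEDIFF does exactly this; the first argument is the date part whose boundaries are counted:

SELECT DATEDIFF(day, '20030601', '20030608') AS DaysBetween   -- returns 7

-- Or against the columns (the table name is hypothetical):
SELECT DATEDIFF(day, start_date, end_date) AS DaysBetween
FROM YourTable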
I've tried setting the "Process Binary as Character" setting to true on the data source. I've tried CAST and CONVERT statements in the queries. I've tried conversion steps... Am I missing something simple, or is it really this difficult to get text values pulled from DB2 as text? I'm using Microsoft's OLE DB Provider for DB2.
Does anyone have experience with conversion problems pulling text from DB2 databases? Thanks in advance... Laura
Hello: I see some packages in the MSDB folder in Management Studio that I cannot figure out the origin of. The SSIS project in BIDS does not contain them, I do not see them in the deployment folder, nor do I see them in the destination folder to which the manifest file deploys the packages. Which folder location am I missing? Each time I delete them, they reappear when I deploy.
I'm not sure if this is the right forum, but I'm having an issue where the SSIS Variables window will not come up. The most I get is a gray bar that looks like it is supposed to hold the variables.
I've tried getting it to show by:
1) Menu --> SSIS --> Variables
2) Right-click --> Variables
3) Menu --> View --> Other Windows --> Variables
The same thing seems to happen for "log events" in Menu --> View --> Other Windows --> Log Events.
Does anyone know how to fix this? Is anyone else having this problem?
When I brought up the BI Dev Studio the first time, the Connection Managers area showed up, but after I closed it by mistake, from the next time onwards it has not come up.
Can someone on this forum help me get the Connection Managers area (pane) back?
I would like to attend a good SQL Server training conference here in the USA. What would you recommend? My areas of interest are OLTP, SSIS, SSAS, data warehousing, etc.
I googled for excellent training or conference events but could not find one.
Program Description: Our product allows our customers to enter their product catalogs into our ASP.NET pages, and we save the data using SQL Server 2005. Customers use GridViews to edit their data, and the data is broken into manageable groupings (called Field Groups) stored as metadata in SQL Server tables. When a user wants to edit a set of data, they choose the Field Group and we dynamically generate a data source and a GridView, bind them together, and our users update their data. It's a bit more complicated than that, but that's the gist. Another thing they can do, and where our problem occurs, is print reports or export to Excel using this data. Since there are so many fields, they use the Field Groups to select which fields they want included, so everything needs to be built dynamically, at runtime, using the Field Group metadata. Essentially: I want my grid to have these 10 fields.

Problem: For the export we generate an XML data source using FOR XML in SQL sprocs. The XML is pulled into an XmlDataDocument and transformed with an XSL file into an Excel XML Spreadsheet. Our problem is that FOR XML is generating our real numbers in scientific format. The XSL "format-number" function does not recognize scientific format and returns NaN (not a number) instead of the value in my spreadsheet. If I leave it unformatted I get the scientific number, but Excel doesn't format it correctly; the cell gets an error tag that wants me to choose "Number stored as text" or convert to a number. I need it to show up on the Excel form already formatted, without that message. I can't change the data type of the field in SQL; too many other things depend on it, and it needs to be a real. I can't use CONVERT(decimal, fieldName) in SQL because the SQL string is dynamically generated using dynamic SQL and most of the fields are not real; when we build the field list we just have field names, and we can't check anything to add CONVERT functions to only the real fields. Is there any way to force the XML output to not be scientific for the entire document? Or is there another function in XSL I'm unaware of (I'm pretty new to XSL)? Or perhaps something I can do in the XML Spreadsheet tags to force the conversion?
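For reference, a small repro sketch of the serialization behavior (hypothetical table): sufficiently small float/real values come out of FOR XML in scientific notation, while a decimal renders as a plain number, which is why per-column conversion fixes it wherever it is possible:

DECLARE @t TABLE (Price real)
INSERT INTO @t VALUES (0.000123)

-- The real column serializes in scientific notation; the decimal does not.
SELECT Price,
       CONVERT(decimal(18, 6), Price) AS PriceDecimal
FROM @t
FOR XML RAW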
I have a form and a connection string to a SQL database. If the textbox on the form is empty, I want to store a NULL value, but when I pass the value as a parameter it raises the following error: "Failed to convert parameter value from a String to a Int32."

cmd.Parameters("@Segundo_nombre").Value = txtSecondName.Text.ToString ' supposing it is null, this raises the error
cmd.Parameters.Add(Apellido)

How can I manage this? I want to store the value whether it is null or not. Also, I don't know how to assign a null value to a variable. I tried:

v_flag = check_selection.check_string(v_idioma)
If v_flag = 1 Then ' the variable holds the text "Seleccione"
    v_idioma = DBNull.Value
End If

but it's not working. Thanks!!
Our front-end GUI person set the system date back one month to avoid the expiration of the ServletExec software. After rebooting the system, MSSQL refuses to come up, and the errorlog has the following message: "The evaluation period has expired". The version we had was never an evaluation version!
What's making it think that, and is there a way to correct the situation?
Any help/suggestions would be greatly appreciated.
I apologize ahead of time if this has been covered. I tried searching but found only the OS-specific response to my question (http://www.microsoft.com/windows/timezone/dst2007.mspx).
With the coming changes to DST in 2007, is there a patch for either SQL Server 2005 or 2000 to account for those changes, or is there even a need for one?
We have gone with the approach of always storing DateTime values in UTC. We are adding validation code to throw exceptions if any client code passes in a date that is not DateTimeKind.Utc. However, our validation is being tripped when populating our entity objects from the database, as the DateTimeKind on the DateTime value from the data reader comes in as Unspecified.
Any suggestions? Should we just convert the data reader value to UTC?
I feel like the SSIS encryption model has a serious flaw, especially when linked to SQL Agent jobs.
I have posted, and others have posted, messages about this. Something is plain wrong with the SSIS encryption keys and password protection. Also, you do not have the choice not to protect the packages; in my case, protecting packages is completely useless.
I created config files for all my packages' connection passwords.
Now, per our IT policy, I had to change my password again, and of course all packages now return multiple errors when I open them.
Thankfully, the config files did their job and the packages are run anyway by SQL Agent. However, having to manually retype and resave all packages just to clear the errors is a plain hassle, not to mention the people who don't use config files and the correct "Run As" SQL Agent account.
I stress the fact that in a real-world production environment, all packages are driven by SQL Agent jobs and MUST run automatically.
Here is the error I get after opening a package after changing my password:
Error 1 Error loading Constants05.dtsx: Failed to decrypt protected XML node "DTS:Password" with error 0x8009000B "Key not valid for use in specified state.". You may not be authorized to access this information. This error occurs when there is a cryptographic error. Verify that the correct key is available. c:\projects\ssis packages\ssis constants\Constants05.dtsx 1 1
So why isn't this key automatically adjusted after a Windows NT domain password change?
How can I refresh the key so I don't have to retype all the packages' connection passwords and rebuild and check in everything again?
I do not think the solution is "use an application account whose password never changes when you create your SSIS packages", but at this time that is the only solution I can think of.
How do you guys deal with this problem?
I still do not understand the SSIS security model; I feel it is disconnected from reality and impractical in a production environment like mine.
I'm having a very frustrating problem that no amount of formatting via properties and/or expressions seems to clear up.
I have a column in the dataset that is typed decimal(6,4) (from the SQL query). The numeric value is, for example, returned to the dataset as 0.6500.
The problem is that the report just shows zero. Formatting efforts to date have only yielded '0', '0.00', and '0.0000'.
I have a merge pull subscription set up on a laptop running SQL 2000, synchronizing with a SQL 2005 database. For the most part everything works fine. However, when I take the laptop off the network for more than 10 minutes and then reconnect, I see the following status message for the synchronization agent:
The process could not connect to Distributor 'OurServerInstanceName'. Specified SQL server not found. NOTE: The step was retried the requested number of times (10) without succeeding.
It seems that the agent never picks up and starts again once this occurs.
Is there a "best practice" for handling this situation? For example, is there some way to tell the agent to auto-start once a network connection has been re-established or do I need to have some other process monitor the status and restart it when it detects that it has stopped with this error?
I created a report from a SharePoint list. The columns display on the report. It is a grouped report, and I use an out-of-group SUM function. One group total shows the correct value, but the other groups show wrong totals, far too large, even when the entire column is empty. Where a column has no value, I display "-" via an IIF function. The columns are calculated with a date-difference function between two date columns; the values can come out negative, so the ABS function is also used.
Column calculation expression:
=IIF(ISNOTHING(Fields!DateCAPackage.Value) OR (ISNOTHING(Fields!Date_CA_Application.Value)) , "-" , CINT(Abs(DateDiff("d",Fields!Date_CA_Application.Value,Fields!DateCAPackage.Value))))
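If the group total sums this same expression, note that the "-" branch returns a string, which can throw the aggregate off. One way to write the group total so that empty rows contribute zero instead (a sketch):

=SUM(IIF(ISNOTHING(Fields!DateCAPackage.Value) OR ISNOTHING(Fields!Date_CA_Application.Value),
    0,
    CINT(Abs(DateDiff("d", Fields!Date_CA_Application.Value, Fields!DateCAPackage.Value)))))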
We are facing an issue while executing a stored procedure that joins (INNER JOIN) a table in the current database with a table in another database on the same instance.
Per our requirement, we insert the SELECT statement's output into a table variable, then apply business logic, and finally return the data from the table variable.
This scenario works exactly as expected in the Dev environment, but after we deployed the code to the Quality environment, the stored procedure returns no output (no column names) from the table variable.
During our initial investigation we found that the collations of the two databases are different, so we added COLLATE DATABASE_DEFAULT to the JOIN.
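For reference, the pattern in question (a sketch with hypothetical object names): applying COLLATE DATABASE_DEFAULT to both sides of the join so the string comparison no longer depends on either database's collation:

SELECT a.Col1, b.Col2
FROM dbo.LocalTable AS a
INNER JOIN OtherDb.dbo.RemoteTable AS b
    ON a.KeyCol COLLATE DATABASE_DEFAULT = b.KeyCol COLLATE DATABASE_DEFAULT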