-----------------------------
DECLARE @String AS varchar(1)
SELECT @String = 'z'
IF @String LIKE '[' + CHAR(32) + '-' + CHAR(255) + ']'
PRINT 'Expected'
ELSE
PRINT 'Unexpected'
-----------------------------
If the @String variable is set to 'y' (or in fact any ANSI character other
than 'z'), the result is 'Expected'. The comparison also evaluates as
expected if CHAR(255) is replaced with CHAR(254). The server collation, if
that matters, is SQL_Latin1_General_CP1_CI_AS.
It would be helpful to find an explanation for this behavior. Thanks.
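In case it helps anyone reproduce this, the ordering itself can be checked directly. This is only a sketch that shows where 'z' sorts relative to CHAR(255) under the stated collation; it does not by itself explain the range behavior.
-----------------------------
-- Compare 'z' with CHAR(255) under SQL_Latin1_General_CP1_CI_AS
SELECT CASE WHEN 'z' > CHAR(255) COLLATE SQL_Latin1_General_CP1_CI_AS
            THEN '''z'' sorts after CHAR(255)'
            ELSE '''z'' does not sort after CHAR(255)'
       END AS Ordering
-----------------------------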
I've recently been struggling with moving a bigint data result from a query into an SSIS variable through the Execute SQL Task. Now, I could just be doing this incorrectly, but I couldn't get it to work at all if I made the variable int64 in SSIS. So here's what I found:
1. bigint from the query into int64 SSIS variable doesn't work at all. It fails.
2. bigint from the query into a string SSIS variable works, BUT it truncates the value, and in a surprising way: the number 840550000000000 suddenly becomes 8405500000. That is a really unexpected result to me.
3. If I cast the bigint to varchar in the query and put it into a string SSIS variable, it works perfectly (sketched below).
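A minimal sketch of the workaround in item 3; the table and column names are hypothetical, the point is only the CAST in the source query:

-- Cast the bigint to varchar on the SQL side so the SSIS string variable
-- receives the full value (hypothetical table/column names).
SELECT CAST(MyBigIntColumn AS varchar(20)) AS MyBigIntAsString
FROM dbo.MyTable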
Is this a bug? If not, how is this expected behavior? Or maybe I'm just nuts after hours of futzing with this.
I ran into a variety of problems trying to set a script task breakpoint in a package containing multiple script tasks. The debugger apparently treats the breakpoint as if it were set in ALL tasks in the package, not just the one in which it is actually set.
At best this results in hitting breakpoints in scripts where they are not set and at worst the debugger brings up the "Send error report" dialog and quits (while the package continues to run). The latter seems to happen most often when the script with the breakpoint has more lines than an earlier script and the breakpoint is set at a line number that exceeds the number of lines in the earlier script--it bombs when the earlier, shorter script starts.
To get the debugger to work under these circumstances I had to add some nonsense code like

While False
    Dim i As Integer = 0
End While

to every script, at the same line number near the beginning of the script (line 40, for example). I then set a breakpoint on the middle statement in one of the scripts (doesn't matter which) to cause the debugger to open at runtime. It doesn't hit the breakpoint because the line is never executed. If the breakpoint is set on a line that can be executed in any script in the package, bad things tend to happen. I then add a "stop" statement to the script that I want to debug. This only works if the debugger is already open, hence the dummy breakpoint above.

This workaround is usable, but I am debugging a package that has quite a few scripts and having to insert dummy code in all of them at a fixed line number is rather inconvenient. I would really like to see breakpoints work the way one would expect: only in the scripts where they are set. Is there some other, easier way around this problem? Is there at least an easier way to get the debugger to open so that "stop" will work?
I am working with SQL Server 6.5. DB-Library displays the error "unexpected EOF from SQL" when I execute a batch. Can somebody tell me what can cause this error?
We are trying to configure and run SSRS on a server installed at the customer data centre. The ReportServer web service is unable to start up and leaves behind this odd error.
w3wp!library!1!11/19/2007-19:21:29:: i INFO: Initializing RunningRequestsDbCycle to '60' second(s) as specified in Configuration file.
w3wp!library!1!11/19/2007-19:21:29:: i INFO: Initializing RunningRequestsAge to '30' second(s) as specified in Configuration file.
w3wp!library!1!11/19/2007-19:21:29:: i INFO: Initializing CleanupCycleMinutes to '10' minute(s) as specified in Configuration file.
w3wp!library!1!11/19/2007-19:21:29:: i INFO: Initializing DailyCleanupMinuteOfDay to default value of '120' minutes since midnight because it was not specified in Configuration file.
w3wp!library!1!11/19/2007-19:21:29:: i INFO: Initializing WatsonFlags to '1064' as specified in Configuration file.
w3wp!library!1!11/19/2007-19:21:29:: i INFO: Initializing WatsonDumpOnExceptions to 'Microsoft.ReportingServices.Diagnostics.Utilities.InternalCatalogException,Microsoft.ReportingServices.Modeling.InternalModelingException' as specified in Configuration file.
w3wp!library!1!11/19/2007-19:21:29:: i INFO: Initializing WatsonDumpExcludeIfContainsExceptions to 'System.Data.SqlClient.SqlException,System.Threading.ThreadAbortException' as specified in Configuration file.
w3wp!library!1!11/19/2007-19:21:29:: i INFO: Initializing SecureConnectionLevel to '0' as specified in Configuration file.
w3wp!library!1!11/19/2007-19:21:29:: i INFO: Initializing DisplayErrorLink to 'True' as specified in Configuration file.
w3wp!library!1!11/19/2007-19:21:29:: i INFO: Initializing WebServiceUseFileShareStorage to 'False' as specified in Configuration file.
w3wp!library!1!11/19/2007-19:21:29:: e ERROR: Throwing Microsoft.ReportingServices.Diagnostics.Utilities.ServerConfigurationErrorException: The report server has encountered a configuration error. See the report server log files for more information., unexpected SKU value; Info: Microsoft.ReportingServices.Diagnostics.Utilities.ServerConfigurationErrorException: The report server has encountered a configuration error. See the report server log files for more information.

We suspect that when they first installed SQL Server, the optical media they reported to be "faulty" was replaced with another disc, and they ended up installing the Enterprise Edition of SQL Server when they were supposed to install the Standard Edition. However, we'd like to confirm whether the error really means that SSRS expects the Standard Edition SKU but is finding the Enterprise Edition instead.
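As a side check on the edition question, the installed edition can be read directly from the database engine. A small sketch using only the standard SERVERPROPERTY function:

-- Reports the installed edition (e.g. Standard Edition, Enterprise Edition),
-- version, and service-pack level of the instance.
SELECT SERVERPROPERTY('Edition') AS Edition,
       SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel') AS ProductLevel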
I'm having issues serializing an XmlChoiceIdentifier in SQL Server. I'm attempting to serialize a type within a larger XML message. The parent object contains an array of sub-objects. The XML I'm trying to serialize is:
public struct DocumentIdentifier : IBinarySerialize, ICloneable, IComparable, INullable
{
private bool _isNull;
private int _idNumber;
[XmlAttribute("id")]
public int ID
{
get { return _idNumber; }
set
{
_idNumber = value;
_isNull = false;
}
}
.... }
The error I keep receiving in SQL Server is:
A .NET Framework error occurred during execution of user-defined routine or aggregate "DocumentRequest":
System.InvalidOperationException: There is an error in XML document (2, 2). ---> System.InvalidOperationException: <Documents xmlns=''> was not expected.
System.InvalidOperationException:
at Microsoft.Xml.Serialization.GeneratedAssembly.XmlSerializationReader1.Read156_DocumentRequest()
System.InvalidOperationException:
at System.Xml.Serialization.XmlSerializer.Deserialize(XmlReader xmlReader, String encodingStyle, XmlDeserializationEvents events)
at System.Xml.Serialization.XmlSerializer.Deserialize(XmlReader xmlReader, String encodingStyle)
at System.Xml.Serialization.XmlSerializer.Deserialize(TextReader textReader)
at Mitochon.Global.Messaging.Internal.DocumentRequest.Parse(SqlString s)
at Mitochon.Global.Messaging.Internal.DocumentRequest.Read(BinaryReader r)
at DocumentRequest::.Deserialize(IntPtr , DocumentRequest& )
Does anyone have any advice on how to get rid of this error?
I'm installing SQL Server 2005 on Windows Server 2003 R2. When installing the support files for the SQL installation I receive the following error message:
"There was an unexpected failure during the setup wizard. You may review the setup logs and/or click the help button for more information"
The information I receive when clicking the help button is as follows: LinkID: 20476, ProductName: Microsoft SQL Server, Product Version: 9.00.1399.06, Message Source: setup.rll, Message ID: 50000, EvtType: setupsqlsetupactions.cpp@invokeS
I tried looking on the net for probable causes, but could not find any.
When I run a BULK INSERT from code-behind, I get the following error:
"Unexpected end-of-file (EOF) encountered in data file. OLE DB provider 'STREAM' reported an error. The provider did not give any information about the error. OLE DB error trace [OLE/DB Provider 'STREAM' IRowset::GetNextRows returned 0x80004005: The provider did not give any information about the error.]. The statement has been terminated."
Has anyone ever experienced the SQL Server unexpectedly rebooting?
Yesterday my SQL Server 7 installation rebooted itself, and the error log looked fine, as did the restart. The only evidence I can find is a message in the event viewer's application log stating that the MSSQLServer service terminated unexpectedly. Any ideas are appreciated.
I created a few DTS packages on our test server, saving them as .dts files. When I try to open them from the production server I get an "unexpected error" and the package does not open. Do you have any fresh ideas for me, please? Thanks!! :)
/*********** Script 1 **************/
declare @nr_1 as decimal(10,2)
declare @nr_2 as decimal(10,2)
set @nr_1 = 5
set @nr_2 = 3
select round(@nr_1/@nr_2, 0)
I've got a DTS package that runs an ActiveX script. The script is simple: it runs a stored procedure and saves the results to a CSV file. I kept getting an error message when trying to run it saying that the recordset object I was using could not be used when closed. It didn't make a whole lot of sense to me why that was happening, and it doesn't relate to my question except to give you a sense of what I'm trying to do. After spending an inordinate amount of time on that, I decided to just create a SQL Server connection object and an Excel object and then use a transformation to load the query results. Simple enough, or so I thought. So in the transformation object under the Source tab, I typed in the query to run the stored procedure:

Declare @S nvarchar(30)
Declare @E nvarchar(30)
SET @S = Convert(nvarchar(30), GetDate()-1, 101) + ' 12 AM'
SET @E = Convert(nvarchar(30), GetDate()-1, 101) + ' 11:59:59 PM'
exec CTI_REPORT_Q_ACTIVITY_DETAIL @S, @E

And then I clicked on the preview button. I got the message that no rowset was returned. In a way, that explains the issue with the ActiveX script. BUT, I know darn well it returns data. It returns 438 rows of data when I run this in Query Analyzer.

So, here's my question: how could that be? Is there some issue that DTS packages have with temporary tables? I do use a couple in the stored procedure. Without having to post the stored procedure and tables, etc., could someone let me know if they've run into something like this before?

Thanks,
Jennifer
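One thing that might be worth trying in the Source tab, assuming the temporary tables are what confuses DTS when it inspects the query for metadata, is to prepend two standard options to the same batch. Whether this helps in this particular package is an assumption, not a certainty:

-- SET FMTONLY OFF asks the server to execute the batch for real even when the
-- client requested metadata only; SET NOCOUNT ON suppresses the 'N rows
-- affected' messages that can sit in front of the real result set.
SET FMTONLY OFF
SET NOCOUNT ON
Declare @S nvarchar(30)
Declare @E nvarchar(30)
SET @S = Convert(nvarchar(30), GetDate()-1, 101) + ' 12 AM'
SET @E = Convert(nvarchar(30), GetDate()-1, 101) + ' 11:59:59 PM'
exec CTI_REPORT_Q_ACTIVITY_DETAIL @S, @E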
If I have a SQL query which returns an aggregate of several decimal fields like so:

sum(COALESCE(myDecimal1, 0)) + sum(COALESCE(myDecimal2, 0)) + sum(COALESCE(myDecimal3, 0)) as MyTotal

I get a rounded integer in MyTotal. However, if I do the following:

sum(COALESCE(myDecimal1, 0) + COALESCE(myDecimal2, 0) + COALESCE(myDecimal3, 0)) as MyTotal

I get a (proper) decimal value. Does anyone know why the first case returns an integer?

- Don
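A small self-contained repro of the two forms being compared, using a hypothetical table variable so it can be run as-is:

-- Hypothetical data; the query returns both totals side by side so the
-- resulting values and data types can be compared directly.
DECLARE @t TABLE (myDecimal1 decimal(10,2), myDecimal2 decimal(10,2), myDecimal3 decimal(10,2))
INSERT INTO @t VALUES (1.25, 2.50, 3.75)

SELECT SUM(COALESCE(myDecimal1, 0)) + SUM(COALESCE(myDecimal2, 0)) + SUM(COALESCE(myDecimal3, 0)) AS SumOfSums,
       SUM(COALESCE(myDecimal1, 0) + COALESCE(myDecimal2, 0) + COALESCE(myDecimal3, 0)) AS SumOfExpression
FROM @t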
I posted this to microsoft.public.sqlserver.programming, but no one could answer my question, so I think this is a good place to re-post it.

My question: I found that if you do not include any effective SQL statement between BEGIN and END, SQL Server 2000 Query Analyzer treats it as an error:

Server: Msg 156, Level 15, State 1, Line 6
Incorrect syntax near the keyword 'end'.

if 1=1
select getdate()
else
begin
--select 'ok'
end

Is this behavior part of the SQL standard or simply a Microsoft glitch?
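For reference, the same batch parses once the BEGIN/END block contains at least one executable statement; here that just means un-commenting the placeholder (a sketch, not a recommendation):

if 1=1
select getdate()
else
begin
-- the block now contains a statement, so the Msg 156 syntax error goes away
select 'ok'
end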
I'm trying to run what I thought was a relatively straightforward query to find all entries from one table that don't appear in another table:
select * from Search_Suggestion where Suggestion not in (select distinct C106 from Search_Log)
The Search_Suggestion table contains 4060 entries and the Search_Log table contains 142,000+ distinct entries.
From running a similar query using 'in' instead of 'not in' I find that there are 3778 matches between the two tables, which logically should leave 282 entries that don't exist in the Search_Log table.
However....when I run the above query it returns no records.
I've tried changing it around and have also tried using temp tables but each time I still get no records.
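For comparison, here is a sketch of an equivalent query written with NOT EXISTS. The assumption is that Search_Log.C106 contains NULL values; NOT IN returns no rows when the subquery produces a NULL, while NOT EXISTS is unaffected:

-- Same intent as the NOT IN query, but NULLs in Search_Log.C106 cannot empty
-- the result set.
SELECT s.*
FROM Search_Suggestion AS s
WHERE NOT EXISTS (SELECT 1
                  FROM Search_Log AS l
                  WHERE l.C106 = s.Suggestion)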
We have encountered a problem where reportservertempdb grows extremely large. The SessionData and SnapshotData tables are about 10 GB each at the moment and keep growing and growing. The CleanupCycleMinutes configuration is set to the default of 10. We are not taking snapshots of our reports. Shouldn't those tables be cleaned up every 10 minutes with this setting?
What can we do to stop this database from growing? Is truncating those tables on a weekly basis a solution? We are using SQL Server Reporting Services 2005 SP2 with Cumulative Hotfix Package 5 (Build 3215).
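For monitoring, a small sketch that reports how big those two tables actually are over time (assuming the default database name; sp_spaceused is a standard system procedure):

USE ReportServerTempDB
-- Row count and reserved space for the two tables that keep growing.
EXEC sp_spaceused 'SessionData'
EXEC sp_spaceused 'SnapshotData'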
Anyone know why SQLNumResultCols would return a different value for ColumnCountPtr in ODBC 3.0 than in ODBC 2.0?
The only difference in our test program is that we set a different value for the SQL_ATTR_ODBC_VERSION attribute via SQLSetEnvAttr. When we set it to SQL_OV_ODBC2, SQLNumResultCols returns the expected value (10). When we set it to SQL_OV_ODBC3, SQLNumResultCols returns an unexpected value (0).
Hi, my code worked fine before I placed the SqlTransaction in it. Now it is showing "Unexpected existing transaction." Please tell me where I am going wrong. My code:
protected void Page_Load(object sender, EventArgs e)
{
    using (SqlConnection con = new SqlConnection(ConfigurationManager.ConnectionStrings["Mfund_String"].ConnectionString))
    {
        con.Open();
        using (SqlConnection destinationConnection = new SqlConnection(ConfigurationManager.ConnectionStrings["Mfund_String"].ConnectionString))
        {
            destinationConnection.Open();
            using (SqlTransaction transaction = destinationConnection.BeginTransaction())
            {
                SqlCommand DelCmd = new SqlCommand("delete from mfund_data", con);
                DelCmd.ExecuteNonQuery();

                using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection))
                {
                    bulkCopy.DestinationTableName = "mfund_data";
                    try
                    {
                        bulkCopy.WriteToServer(CreateDataTableFromFile());
                        transaction.Commit();
                    }
                    catch (Exception ex)
                    {
                        transaction.Rollback();
                        Response.Write(ex.Message);
                    }
                }
            }
            destinationConnection.Close();
        }
        con.Close();
    }
}

Regards,
Jagadeesh
I have programmatically created a SqlConnection that begins a SqlTransaction. During the first part of this SqlTransaction, the contents of a table are deleted. The next part uses the SqlBulkCopy object to copy data from another database (in the form of a DataTable). The delete goes through fine, but the SqlBulkCopy always generates a SqlException with the message "Unexpected existing transaction." I cannot think of anything I am doing wrong. The code looks at an XML file for instructions on each transaction. Each transaction is composed of tasks. Each task will pull data from a different type of database (MVR.Command is a Factory Database object). Please view the code below and tell me if you can spot what I am doing wrong:

using (SqlConnection destinationConnection = new SqlConnection(MVR.ConnectionSource.GetConnectionString(destinationServiceName)))
{
    destinationConnection.Open();

    using (SqlTransaction transaction = destinationConnection.BeginTransaction(IsolationLevel.Snapshot, "Transport"))
    {
        transaction.Save("Beginning");

        int totalTasks = 0;
        int successfulTasks = 0;

        foreach (XmlNode taskNode in transactionNode.SelectNodes("Tasks/Task"))
        {
            totalTasks += 1;

            try
            {
                // Prepare the destination table (delete everything)
                int rowsDeleted = new SqlCommand("delete from " + destinationTablename, destinationConnection, transaction).ExecuteNonQuery();

                using (SqlBulkCopy bulkCopy = new SqlBulkCopy(destinationConnection))
                {
                    bulkCopy.DestinationTableName = destinationTablename;

        // Based on the success of all tasks, either commit or rollback
        if (successfulTasks == totalTasks)
        {
            transaction.Commit();
        }
        else
        {
            transaction.Rollback();
        }
    }
I have W2K Adv, SQL 7 Enterprise in a workgroup with mixed security. The SQL installation is an almost entirely default install.
At the console, in SQL Enterprise Manager, when I open a database, open Tables, select a table, choose Open Table, and select all rows, the system displays "an unexpected error happened during this operation". As far as I know this has never worked on this installation.
The error occurs with every logon account (including sa and local administrator accounts), with both the "all rows" and "top row" options, on every table in every database (including the customer database and the Northwind database).
I have tried adding administrators and accounts as users of the databases etc. and given the accounts all the permissions I can dream up.
There are no interesting messages in the event viewer. The SQL agent is running.
A TechNet search found two documents, but they were not related to the problem.
I can open Query Analyzer and run "select * from table_name"; that works on the Northwind and customer database tables every time.
Colleagues with other installations do not get the error, and their systems return the rows correctly.
DECLARE @num int
SET @num = 0
DECLARE @tableVariable table (ColA int, ColB decimal(18,4))
WHILE @num < 1000
BEGIN
    INSERT INTO @tableVariable VALUES (2, 10.56)
    SET @num = @num + 1
END

When this code is run in SQL Server 2000 Query Analyzer it commits in less than 1 second. The same code run as a SQLServerAgent job takes 16 seconds.
Similar behaviour appears if the INSERT statement is substituted with an UPDATE.
A SELECT statement runs equally well in both environments.
Has anybody got an idea what might be the reason for slow execution of INSERT and UPDATE in a job?
I'm getting this message on a replication distribution task after a successful SYNC task. The exact error message is: "Unexpected EOF encountered in BCP data-file. Failed while bulk copying into '<table name>'". I'm running SQL 6.5 SP4. I tried recreating the article, resyncing, and redistributing, with no luck. Is there anything I can do to fix this? Any help would be appreciated.
I have lost the reference, but I read somewhere that when running DBCC DBREINDEX against a clustered index, all the secondary indexes on the table are automatically re-indexed as well. I did a test of this on a small table and it seemed to confirm this. However, now that I've put this into practice, I am finding that it doesn't seem to work this way. I noticed that having run DBCC DBREINDEX against a table's clustered index (DBCC DBREINDEX ('tablename', 'clusteredIndexName', fill_factor)), the secondary indexes were not automatically re-indexed, as borne out by the fact that they remained badly fragmented.
First of all, do the DBAs who read this believe it is correct that DBCC DBREINDEX run against a clustered index will automatically rebuild the secondary indexes too? If so, why wouldn't it work in all cases?
Normally, after I use DBCC DBREINDEX, I can be sure that Scan Density on a clustered or non-clustered index is very good, e.g. 99% or 100%. However, I have one database where a number of indexes are not showing any improvement in Scan Density after running DBCC DBREINDEX. In one case, a clustered index, I ran it on two days in succession and Scan Density actually got worse! Can anyone give me a reason for this? Can anyone suggest how to fix it?
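For what it's worth, here is a sketch of how this could be measured and tested in one pass, assuming DBCC SHOWCONTIG is where the Scan Density figures come from; 'tablename' is the same placeholder used above and 90 is an arbitrary fill factor:

-- Fragmentation (including Scan Density) for every index on the table.
DBCC SHOWCONTIG ('tablename') WITH ALL_INDEXES

-- Passing an empty index name rebuilds all indexes on the table, which sidesteps
-- the question of whether rebuilding only the clustered index covers the rest.
DBCC DBREINDEX ('tablename', '', 90)

DBCC SHOWCONTIG ('tablename') WITH ALL_INDEXES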
Something weird happened tonight.. here's the deal. I have two databases exactly the same.. one is for dev, one's for live data.
Anyway, the live DB had an outdated view.. so I exported from the dev DB to the live.. just copying that one view over (using "copy objects" in the export DTS wizard). Very oddly, it actually copied over the data in the tables referenced by the view! Not good, cuz I told my coworkers I'd leave the data in the tables alone :( How do I copy a view over, but just the view definition, and NOT the actual table data??
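One low-tech alternative that copies only the definition: script the view from the dev database and run the script on the live one. The view and table names below are hypothetical placeholders:

-- Drop and recreate only the view; no table data is read or written.
IF EXISTS (SELECT * FROM sysobjects WHERE name = 'MyView' AND type = 'V')
    DROP VIEW dbo.MyView
GO
CREATE VIEW dbo.MyView
AS
    SELECT ColA, ColB
    FROM dbo.SourceTable
GO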
Hi, when I right-click a table and click Design Table, I get an error ("an unexpected error occurred during this operation"). If anyone knows what causes this, please let me know; your help would be appreciated. Thanks, pardhi
I'm trying to install Microsoft SQL Server (en_SQL2005_DEV_Servers_Sept2005). But when I do so I get:
TITLE: Microsoft SQL Server 2005 CTP Setup ------------------------------
There was an unexpected failure during the setup wizard. You may review the setup logs and/or click the help button for more information.
For help, click: http://go.microsoft.com/fwlink?LinkID=20476&ProdName=Microsoft+SQL+Server&ProdVer=9.00.1314.06&EvtSrc=setup.rll&EvtID=50000&EvtType=packageengine%5cinstallpackageaction.cpp%40InstallToolsAction.10%40sqls%3a%3aInstallPackageAction%3a%3aperform%400x643
I keep getting this error message at the same point every time I try to install SQL 2005. I have tried everything I can find on the website and nothing works! Any ideas?
I'm using a SQL Server Analysis Services data source.
I have existing reports that I have to modify.
The report preview and layout functions work until I switch to the data tab.
When I switch to the Data tab, the dataset gets blown away!
This does not happen with reports using a SQL Server data source.
I have:
1. copied a report
2. loaded the original report and switched to the data tab
3. saved the report
4. compared the copy to the ruined version

The only difference is the datasets section.
Some version info:
Microsoft Visual Studio 2005, Version 8.0.50727.762 (SP.050727-7600)
SQL Server Reporting Services; Microsoft SQL Server Reporting Services Designers, Version 9.00.1399.00
Why is this happening?
What can I do to prevent SSRS from ruining these reports?
We are running a COM+ DLL that handles transactions and security with a SQL Server back-end. This project was started in VS 2003, but we just converted it to VS 2005. Since the conversion, creating a number of transactions in a row generates errors at consistent, but moving, locations in the code (i.e. the location where the failure occurs changes when I do things such as add additional try/catch statements, so it is not random but also clearly not associated with any particular line of code).