A Way To Force SQL Server To Ignore Errors On DTS Import?
Nov 11, 2004
Hello - I know the very nature of this question seems to make no sense, but we received a huge volume of data (29 tables) in flat file format. I first imported them into MS Access because of its portability and because it seemed to be more forgiving on imports. Now I have a complete MS Access DB with all tables, so I figured importing to SQL Server should be a snap. However, on the import, 14 tables imported successfully and 15 failed!
Here is an example of one of the error messages I received:
Insert Error, Column 3 - status 6; Data Overflow... this was on a date/time field in Access, and here is the data contained in the referenced row/column: "8/19/4999"
The year "4999" is obviously the problem (at least I think), and I have no idea why this imported successfully into MS Access but not into SQL Server...
What I'd like to be able to do (not the best practice, I know) for now is ignore these types of errors and just force SQL Server to take the data straight from MS Access and replicate it. We received this data from a 3rd party, and there's no telling how many data entry errors like this could be in each table - many of the tables have over 500,000 rows, and I don't want to have to go through fixing each of these errors by hand... anyone have any ideas?
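A hedged sketch of one common workaround: land the data in staging tables whose columns are all varchar, then move across only the rows that convert cleanly. Table and column names below are placeholders, and the year range assumes the failing target column is smalldatetime (which tops out at 2079):

CREATE TABLE dbo.Orders_Staging
(
    OrderID   varchar(20),
    OrderDate varchar(20),   -- holds raw values such as '8/19/4999'
    Amount    varchar(20)
)

-- after loading the staging table from Access / the flat files:
INSERT INTO dbo.Orders (OrderID, OrderDate, Amount)
SELECT OrderID, CONVERT(smalldatetime, OrderDate), CONVERT(money, Amount)
FROM dbo.Orders_Staging
WHERE ISDATE(OrderDate) = 1
  AND CASE WHEN ISDATE(OrderDate) = 1
           THEN YEAR(CONVERT(datetime, OrderDate)) ELSE 0 END BETWEEN 1900 AND 2079

The rows that fail the WHERE clause can then be reviewed or cleaned separately instead of aborting the whole import.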
View 1 Replies
Nov 9, 2006
Hi,
I have a big table and want to make a plausibility check of its data.
The problem is that my query stops if there is an unexpected datatype in one of the rows. That is exactly what I want to filter out of the table with this query, saving the result as a new, correct table.
How can I write my query so that if an error occurs, the query resumes and the offending row is simply left out of the final result?
I haven't found anything on this topic in my books or on the net ;-(.
Thx in advance.
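A minimal sketch of that kind of filter, assuming the raw values are stored as text and using placeholder table/column names: keep only the rows whose values actually convert, and write them into a new table.

SELECT *
INTO dbo.BigTable_Clean
FROM dbo.BigTable
WHERE ISDATE(DateColumn) = 1
  AND ISNUMERIC(AmountColumn) = 1

Rows that fail the checks never reach the conversion, so the query does not stop on them; inverting the WHERE clause gives the rejected rows for review.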
View 9 Replies
View Related
Jun 7, 2006
I have a simply SSIS package with following data flow structure:
Flat File Source > Data Conversion > Aggregate > Derived Column > OLE DB Destination
Basically, this package is executed on a daily basis to import sales from a text file into SQL Server. Now there's a possibility that the text file may occasionally contain previous sales.
My SQL table structure enforces data integrity through a primary key, and therefore my package errors out if there's a duplicate in the text file which already exists in SQL Server.
I'm basically looking for a way to ignore these duplicates and continue importing the rest of the file. I need a way to force execution (suppress errors if possible) and finish importing the whole text file.
I've tried setting MaximumErrorCount to more than 50,000 and FailParentOnFailure / FailPackageOnFailure = False.
Any help is greatly appreciated...thanks
Here's the errors I receive:
SSIS package "Package2.dtsx" starting.
Information: 0x4004300A at Data Flow Task, DTS.Pipeline: Validation phase is beginning.
Information: 0x40043006 at Data Flow Task, DTS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Data Flow Task, DTS.Pipeline: Pre-Execute phase is beginning.
Information: 0x402090DC at Data Flow Task, Source - SALES_TXT [1]: The processing of file "Z:SALES.TXT" has started.
Information: 0x4004300C at Data Flow Task, DTS.Pipeline: Execute phase is beginning.
Information: 0x402090DE at Data Flow Task, Source - SALES_TXT [1]: The total number of data rows processed for file "Z:SALES.TXT" is 20450.
Information: 0x402090DF at Data Flow Task, Destination - SALES [37]: The final commit for the data insertion has started.
Error: 0xC0202009 at Data Flow Task, Destination - SALES [37]: An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "The statement has been terminated.".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Violation of PRIMARY KEY constraint 'PK_Sale'. Cannot insert duplicate key in object 'dbo.Sale'.".
Information: 0x402090E0 at Data Flow Task, Destination - SALES [37]: The final commit for the data insertion has ended.
Error: 0xC0047022 at Data Flow Task, DTS.Pipeline: The ProcessInput method on component "Destination - SALES" (37) failed with error code 0xC0202009. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: Thread "WorkThread1" has exited with error code 0xC0202009.
Information: 0x40043008 at Data Flow Task, DTS.Pipeline: Post Execute phase is beginning.
Information: 0x402090DD at Data Flow Task, Source - SALES_TXT [1]: The processing of file "Z:SALES.TXT" has ended.
Information: 0x40043009 at Data Flow Task, DTS.Pipeline: Cleanup phase is beginning.
Information: 0x4004300B at Data Flow Task, DTS.Pipeline: "component "Destination - SALES" (37)" wrote 18750 rows.
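One hedged alternative to suppressing the error: point the data flow at a keyless staging table and let T-SQL skip the rows that already exist (table and key names below are assumptions). Another option is to redirect failing rows through the OLE DB Destination's error output instead of failing the component.

INSERT INTO dbo.Sale (SaleID, SaleDate, Amount)
SELECT s.SaleID, s.SaleDate, s.Amount
FROM dbo.Sale_Staging AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Sale AS t WHERE t.SaleID = s.SaleID)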
View 6 Replies
View Related
Jan 11, 2008
I have added an email task to the OnError event of my SSIS package, so that I will always know when there are errors.
However I would like the SQL Server job executing the package to succeed even if the package fails.
What setting do I change in the SSIS package to achieve this? MaximumErrorCount?
View 1 Replies
View Related
Feb 12, 2008
When I create/alter a stored procedure in SQL Server 2005, SQL Server always checks for syntax errors first and won't let me save the change if it detects any error. Is there a way to force SQL Server to save a stored procedure that fails the syntax check?
I know SQL Server will allow such invalid stored procedures if you detach and re-attach the entire database from one SQL Server to another. However, if I try to manually create the same stored procedure on a different server with a script, it won't let you save the stored procedure if the linked server (or the table) can't be accessed from the new SQL Server.
How do you get around this?
Thanks
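One hedged workaround for the linked-server case: keep the remote reference inside dynamic SQL so it is only resolved when the procedure runs, not when it is created. All names below are hypothetical.

CREATE PROCEDURE dbo.usp_GetRemoteData
AS
BEGIN
    EXEC sp_executesql N'SELECT * FROM LinkedSrv.RemoteDB.dbo.SomeTable'
END

The trade-off is that the remote query is no longer checked at creation time at all, so typos only surface at runtime.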
View 13 Replies
View Related
Jan 25, 2006
In my stored procedure I'm calling a buggy and flaky stored procedure that comes from a third party. When I run my stored procedure from QA, I'm getting a whole bunch of errors raised inside the third-party one. Is there any way I could just ignore them, so that if I run my SP from QA, only errors from my code, if any, show up? TIA
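If the server is SQL Server 2005 or later, a minimal sketch of swallowing the vendor procedure's errors with TRY/CATCH (the procedure name is hypothetical; errors severe enough to abort the batch or connection still cannot be suppressed this way):

DECLARE @ignored nvarchar(4000)
BEGIN TRY
    EXEC dbo.usp_ThirdPartyProc
END TRY
BEGIN CATCH
    SET @ignored = ERROR_MESSAGE()   -- captured and deliberately discarded
END CATCH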
View 2 Replies
View Related
Dec 12, 2006
If there is an error in the trigger then the update to the table does not happen. Is there a way to make SQL Server ignore errors in a trigger and still update the table?
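A hedged sketch (SQL Server 2005 or later, hypothetical object names): wrap the trigger's own logic in TRY/CATCH so its failures don't roll back the triggering statement. Errors that doom the transaction (XACT_STATE() = -1) still cannot be ignored this way.

CREATE TRIGGER trg_MyTable_Update ON dbo.MyTable
AFTER UPDATE
AS
BEGIN
    BEGIN TRY
        EXEC dbo.usp_AuditChange   -- the trigger's original, error-prone work
    END TRY
    BEGIN CATCH
        DECLARE @err nvarchar(4000)
        SET @err = ERROR_MESSAGE()   -- swallow the error so the UPDATE succeeds
    END CATCH
END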
View 4 Replies
View Related
Mar 5, 2004
Hi,
I am trying to import data from a text file into a database table using the SQL Server BCP utility. I am able to do that when all the records in my text file are new, but I get a primary key violation error when I try to import a record that already exists in the table. That is correct behaviour, but I want my program to ignore these errors and import only those records which are fine.
I tried the [-m maxerrors] option, but it is not working. My BCP program gets interrupted at the first error, even if I give the [-m100] option.
my command looks something like this,
bcp pub..employee in C:data.txt -b1 -m100 -c -t, -Sdatabase -Uuser -Ppassword
here -b1 is, processing 1 row per batch transaction
-m100 is, ignoring first 100 errors
please help.
thanks
madhu
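If -m doesn't behave as expected for the key violations, a hedged fallback is to bcp into an empty staging copy of the table (with no primary key) and then insert only the rows that are not already present; emp_id below is an assumed key column.

INSERT INTO pub..employee
SELECT s.*
FROM pub..employee_staging AS s
WHERE NOT EXISTS (SELECT 1 FROM pub..employee AS e WHERE e.emp_id = s.emp_id)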
View 1 Replies
View Related
Apr 10, 2008
I've started using TRY/CATCH in my T-SQL code now and I rather like it, since I'm a C# developer. I read that errors with a severity of 10 or lower will not cause the CATCH block to be entered. What kind of errors does this include?
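A quick illustration of the boundary: severity 10 and below is informational and never transfers control to CATCH, while severity 11-19 does.

BEGIN TRY
    RAISERROR('informational message, severity 10', 10, 1)  -- CATCH is NOT entered
    RAISERROR('real error, severity 16', 16, 1)             -- CATCH IS entered
END TRY
BEGIN CATCH
    PRINT 'Caught: ' + ERROR_MESSAGE()
END CATCH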
View 2 Replies
View Related
Jul 7, 2006
Is there any way to emulate the TRY/CATCH mechanism that SQL Server 2005 provides using SQL Server 2000?
Or more simply,
Is there any kind of IGNORE_ERROR or CONTINUE_ON_ERROR setting for SQL Server 2000?
Thanks
- John
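There is no direct equivalent, but the usual SQL Server 2000 pattern is to test @@ERROR after each statement and decide whether to continue; batch-aborting errors still stop execution regardless. A minimal sketch with a hypothetical statement:

DECLARE @err int

UPDATE dbo.SomeTable SET Col1 = 1 WHERE ID = 42
SET @err = @@ERROR

IF @err <> 0
BEGIN
    PRINT 'Statement failed with error ' + CAST(@err AS varchar(10)) + ', continuing anyway'
    -- following statements still run; add ROLLBACK/GOTO handling here if needed
END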
View 2 Replies
View Related
Feb 6, 2008
I have a package that uses a For Loop to iterate through an unknown number of Excel files and pull their data into a table. However, there will be cases when a file is corrupted or has some sort of problem, so that either the transformation or the Excel source will fail.
I have it so that for each iteration if the transform was successful the file is moved to an archive directory, and if it fails the file is moved to a different directory.
But I don't want the package to be marked as failed. For the control flow tasks I have set the individual components to FailPackageonFailure = False, and for the Data Flow tasks I have set ValidateExternalMetadata = False.
Setting MaxErrorCount higher is no use because I can't guarantee how many files will be processed and how many might fail.
Could anyone suggest a clean way to trap these errors? Specifically the "Cannot acquire connection from connection manager" error, which comes from the Excel connection.
Thanks
View 6 Replies
View Related
Apr 2, 2008
I have an application that is moving from a home-made full text search engine to the full text indexing engine of SQL 2005. I have a stored procedure that I want to behave as follows:
check documents table to determine whether a full text index for SQL's full text engine has been created.
If it has not, query the documentText table (which is the table for my in-house full text search)
If it has, use the full text indexing engine
My problem is that compilation of the T-SQL to create the stored procedure fails when the full text index has not already been created, with the following error:
Msg 7601, Level 16, State 2, Procedure My_FullTextSearch, Line 0
Cannot use a CONTAINS or FREETEXT predicate on table or indexed view 'Documents' because it is not full-text indexed.
In my test lab, I tried:
1. creating the full text index
2. creating the stored procedure
3. deleting the full text index
which gets me to the desired end result of having a stored procedure that can determine whether or not the full text index has been created yet (the procedure works in this state). But creating this index as part of the stored procedure creation in production is not an option.
My question - Can I somehow tell SQL to ignore the compilation errors it encounters while creating this stored procedure? If not, is there some other way to create this "smart" stored procedure?
Here's a code snippet stripped down to the bare minimum to generate the error:
CREATE PROCEDURE [My_FullTextSearch]
    @Term VarChar(1000)
AS
BEGIN
    SET NOCOUNT ON;

    IF NOT OBJECTPROPERTY(OBJECT_ID('Documents'), 'TableHasActiveFulltextIndex') = 1
    BEGIN
        Select [DocumentID]
        from [DocumentText]
        where [Term] like '%' + LTRIM(@Term) + '%'
    END
    ELSE
    BEGIN
        Select [key] from FREETEXTTABLE(Documents, Contents, @Term)
    END
END
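One hedged alternative to pre-creating and dropping the index: move the FREETEXTTABLE reference into dynamic SQL so it is not compiled when the procedure is created, only when that branch actually runs. The ELSE branch would then look roughly like this:

ELSE
BEGIN
    EXEC sp_executesql
        N'SELECT [key] FROM FREETEXTTABLE(Documents, Contents, @Term)',
        N'@Term varchar(1000)',
        @Term = @Term
END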
View 5 Replies
View Related
Mar 23, 2006
I need a way to programmatically (via JDBC) find out which triggers for a table may not compile properly, so that I can disable the bad triggers.
I can do this fine in Oracle but cannot figure out if there's a way to do this in SqlServer. (In Oracle I'd just "alter trigger... compile" and select from user_errors.)
I know how to find the triggers that exist on a table, and I know how to enable/disable individual triggers. I know about sp_recompile, but all that does is flag the trigger for recompile at the next execution.
I need to verify whether the trigger is valid without having to actually invoke it. For example, if there's a bad Update trigger, I don't want to actually execute an update on the table.
One example of what I'm dealing with is this... We have Table A and Table B. There is an update trigger on Table B that references column A.col1. Then we alter Table A to drop col1. Later we have to update Table B. At this point the update will fail because of the bad trigger. I want to find and disable the trigger before executing the update on Table B. If there are other triggers on Table B that are valid, I want to leave them alone.
View 2 Replies
View Related
Jul 3, 2007
I have an SSIS package that imports from an Excel file with data beginning in the 7th row.
Unlike the same operation with a csv file ('Header Rows to Skip' in Connection Manager Editor), I can't seem to find a way to ignore the first 6 rows of an Excel file connection.
I'm guessing the answer might be in one of the Data Flow Transformation objects, but I'm not very familiar with them.
Any pointers would be greatly appreciated.
Eric
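One hedged approach: switch the Excel Source's data access mode to "SQL command" and select an explicit range that starts at row 7 (the sheet name and range below are assumptions). If row 7 holds the column headers, leave "First row has column names" checked on the connection manager.

SELECT * FROM [Sheet1$A7:K65536]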
View 12 Replies
View Related
Oct 12, 2007
I'm new to SSIS, and trying to automate data imports from text files. The text files I'm importing always contain a fixed set of columns, or a subset of those columns. If I include a subset of columns in the import file (and exclude others), the data doesn't import...I assume because the actual file doesn't include every column defined in the flat file source object?
Is it possible to accomplish this without dynamically selecting the columns, as indicated here: http://msdn2.microsoft.com/en-us/library/ms136020.aspx
View 12 Replies
View Related
Feb 3, 2006
I know that this almost never happens... but I'm dealing with an AIX flatfile-database conversion that brings 80 tables into SQL. I'm not allowed to touch this stuff or to massage how it's brought over to SQL, as it deals with medical records...
What I need to do, though, when creating my own table imports, is make my "account" fields match the existing SQL conversion tables' "account" fields, so I have to match their requirements...
Existing account numbers are a total of 6 characters. Account numbers with fewer than six characters contain leading whitespace character equivalents to fill out the six character spaces of "account number".
When I import records, I need to enforce the same requirement: a min and max length of 6 characters for "account number", and any account number with fewer than 6 characters must have the necessary whitespace characters added to it.
How would I do this? It needs to be automated, as this is a process that will run nightly and cannot have a human sitting on it every day, 7 days a week... I cannot accurately join unless I can meet this requirement and my hands are tied because I can't change the way the formatting is done on the imported tables =(
Any help would be greatly appreciated... I'm quite stuck
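A hedged sketch of the padding step, run against the imported rows before joining (table and column names are placeholders): left-pad with spaces to a fixed width of 6.

UPDATE dbo.ImportedAccounts
SET AccountNumber = RIGHT(SPACE(6) + LTRIM(RTRIM(AccountNumber)), 6)
WHERE LEN(LTRIM(RTRIM(AccountNumber))) < 6

Scheduled as a step after the nightly load, this needs no manual intervention.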
View 5 Replies
View Related
Apr 17, 2002
I have a text file with 10 columns that need to be imported into a table with 15 columns. The first 10 columns of the table correspond to the 10 columns of the text file.
When I run bcp, it fails. If I remove the last 5 columns from the table, it works fine. Any ideas on how I can leave the 5 columns in place?
I know that I can bcp, and then copy the data to another table, or use a DTS package to copy the data, but with the amount of data, I prefer bcp.
any help is appreciated.
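One hedged workaround that keeps bcp in the picture: create a view over just the first 10 columns and bcp into the view; the remaining 5 table columns must allow NULLs or have defaults, and all names below are placeholders. A bcp format file is the other usual route.

CREATE VIEW dbo.vMyTable_Import
AS
SELECT Col1, Col2, Col3, Col4, Col5, Col6, Col7, Col8, Col9, Col10
FROM dbo.MyTable

-- then, for example:  bcp MyDb.dbo.vMyTable_Import in C:\data.txt -c -Sserver -Uuser -Ppassword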
View 2 Replies
View Related
Jul 25, 2006
I have a client who we moved to our new web server and SQL 2005 server from a SQL 2000 server. I detached the database from SQL 2000 and attached it to 2005. I also just set the compatibility level to 2005.
My client used Enterprise Manager to import data into the tables on SQL 2000 just fine. Now, using SQL Server Management Studio, importing the same table produces all kinds of errors: truncation errors, etc.
I have played with a bunch of the settings and tried the "Suggest Types" option, but I still just get a bunch of errors. It seems that to get it to work I have to go into the columns of the flat file I am importing and change EVERY column to match the table I'm importing to. That is just too much work.
I basically have a 2-record text file I easily imported into SQL 2000, but importing it into SQL 2005 proves to be a *** load of work! Aren't products supposed to get better with future releases? What am I doing wrong?
I've tried the sql native client and the oledb sqlserver client and get the same results.
Any ideas?
View 1 Replies
View Related
Mar 18, 2008
Hi Guys, I'm trying to do a Bulk Insert but I am receiving the following error:
conversion error (type mismatch or invalid character for the specified codepage) for row 2, column 8 (Phone).Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 8 (Phone).".
Task failed: dbo.tblNoName
The data is comma delimited, which I stipulated in my Import Connection section. But the data also has quotes around it (i.e. the field Lname nvarchar(17) = "Brown" and the field Phone nvarchar(10) = "12345678921"). Is there a way to ignore the quotes or do I have to remove them before I import?
Or is my problem something else all together?
The connection is solid;
Format = "Specify"
RowDelimiter = {CR}{LF}
columnDelimiter = Comma {,}
No other options are set.
The data looks like:
"tstLName","tstFname","000 N Tst DR","IDAHO sp","ID","00000000",
Thank you,
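If stripping the quotes after the fact is acceptable, a hedged sketch: bulk insert into a staging copy of dbo.tblNoName first, then remove the surrounding quotes before moving the rows across (column names follow the ones mentioned above, the staging table name is an assumption).

UPDATE dbo.tblNoName_Staging
SET Lname = REPLACE(Lname, '"', ''),
    Fname = REPLACE(Fname, '"', ''),
    Phone = REPLACE(Phone, '"', '')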
View 2 Replies
View Related
Feb 7, 2008
Hi,
I am trying to import from an Excel file, so in between the Excel file source and the OLE DB destination I am using one Data Conversion transformation to convert the Excel Unicode characters to SQL Server varchar.
I am getting the following errors:
1)
Error: 0xC020901C at Data Flow Task, OLE DB Destination [382]: There was an error with input column "Copy of Zip" (615) on input "OLE DB Destination Input" (395). The column status returned was: "The value could not be converted because of a potential loss of data.".
Copy of Zip is the Data Conversion output column mapped to the SQL Server varchar(200) Zipcode column.
In excel file the Zip codes are like this:
78712-2344
78123
12345
87651-1234
2)
The column "State" needs to be updated in the external metadata column collection.
This is warning. This type of warnings are for all columns in excel file.
3) Initially I declared the SQL Server table columns as varchar(100); then SSIS showed a warning about truncation of the 255-character State column, so I changed the column datatype from varchar(100) to varchar(500). Why do we need to change it like this?
Thanks in advance
View 1 Replies
View Related
Apr 19, 2007
I'm trying to copy data from production to my local machine using the SQL Server 2005 Import and Export Wizard. It works fine if I select a small number of tables but throws errors when there are a lot of tables. Have you ever experienced problems using it? Is there a better way to transfer the data?
The data source is SQL Server 2000 and the target is 2005. I have the optimize-for-many-tables and transaction options selected.
Here are the errors I get:
Execute the transfer with the TransferProvider. (Error)
Messages
· ERROR : errorCode=-1073451000 description=The package contains two objects with the duplicate name of "output column "ErrorCode" (49)" and "output column "ErrorCode" (14)".
helpFile=dtsmsg.rll helpContext=0 idofInterfaceWithError={8BDFE893-E9D8-4D23-9739-DA807BCDC2AC}
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&LinkId=20476
· ERROR : errorCode=-1073451000 description=The package contains two objects with the duplicate name of "output column "ErrorCode" (31)" and "input column "ErrorCode" (52)".
helpFile=dtsmsg.rll helpContext=0 idofInterfaceWithError={8BDFE893-E9D8-4D23-9739-DA807BCDC2AC}
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&LinkId=20476
· ERROR : errorCode=-1073451000 description=The package contains two objects with the duplicate name of "output column "ErrorCode" (49)" and "output column "ErrorCode" (14)".
helpFile=dtsmsg.rll helpContext=0 idofInterfaceWithError={8BDFE893-E9D8-4D23-9739-DA807BCDC2AC}
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&LinkId=20476
· ERROR : errorCode=-1073451000 description=The package contains two objects with the duplicate name of "output column "ErrorCode" (31)" and "input column "ErrorCode" (52)".
helpFile=dtsmsg.rll helpContext=0 idofInterfaceWithError={8BDFE893-E9D8-4D23-9739-DA807BCDC2AC}
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&LinkId=20476
· ERROR : errorCode=-1073451000 description=The package contains two objects with the duplicate name of "output column "ErrorCode" (49)" and "output column "ErrorCode" (14)".
helpFile=dtsmsg.rll helpContext=0 idofInterfaceWithError={8BDFE893-E9D8-4D23-9739-DA807BCDC2AC}
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&LinkId=20476
· ERROR : errorCode=-1073451000 description=The package contains two objects with the duplicate name of "output column "ErrorCode" (31)" and "input column "ErrorCode" (52)".
helpFile=dtsmsg.rll helpContext=0 idofInterfaceWithError={8BDFE893-E9D8-4D23-9739-DA807BCDC2AC}
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&LinkId=20476
· ERROR : errorCode=-1073451000 description=The package contains two objects with the duplicate name of "output column "ErrorCode" (49)" and "output column "ErrorCode" (14)".
helpFile=dtsmsg.rll helpContext=0 idofInterfaceWithError={8BDFE893-E9D8-4D23-9739-DA807BCDC2AC} (Microsoft.SqlServer.DtsTransferProvider)
For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&LinkId=20476
View 7 Replies
View Related
Aug 28, 2006
Recently installed the SQL Server 2005 client and am now attempting to import data from a spreadsheet into an existing table. This works fine with SQL Server 2000, but I am getting data conversion truncation errors that stop the process when it runs using the import utility in SQL Server 2005.
Any help would be appreciated.
View 1 Replies
View Related
Jul 1, 2015
I recently updated the datatype of a sproc parameter from bit to tinyint. When I executed the sproc with the updated parameters the sproc appeared to succeed and returned "1 row(s) affected" in the console. However, the update triggered by the sproc did not actually work.
The table column was a bit which only allows 0 or 1 and the sproc was passing a value of 2 so the table was rejecting this value. However, the sproc did not return an error and appeared to return success. So is there a way to configure the database or sproc to return an error message when this type of error occurs?
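A hedged sketch of making the failure explicit inside the procedure (the parameter name @Flag is hypothetical); the silent behaviour is likely because converting an integer to bit maps any non-zero value to 1 rather than raising an error.

IF @Flag NOT IN (0, 1)
BEGIN
    RAISERROR('Value %d is not valid for a bit column.', 16, 1, @Flag)
    RETURN
END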
View 1 Replies
View Related
Feb 11, 2008
I'm using Microsoft SQL Server Management Studio. I've created a view (a very basic two-field select with an inner join) -- I expect the query to take some time as there are about 1.7 million records in one table and about 3 million in the join table. The problem is, I always get a timeout error when I try to run the query.
I've checked the server settings and "Execution time-out" = 0 (no time-out). Checked this at the server and in Microsoft SQL Server Management Studio. So my question is why am I getting a time-out when I have it set to not time-out?
What is going on?
Thanks, Rob.
View 4 Replies
View Related
May 1, 2007
Hello,
I am trying to insert rows into a table with a unique index that has the ignore-duplicate-key property.
My program runs fine locally, but the entire transaction fails when it runs over a linked server.
The following is an example:
create table t1 (c1 int,c2 int)
create unique clustered index i1 on t1 (c1) with IGNORE_DUP_KEY
if running locally :
select count(*) from t1
go
insert into t1 values (1,2)
insert into t1 values (1,2)
insert into t1 values (2,2)
insert into t1 values (1,2)
go
select count(*) from t1
go
The output is:
(1 row(s) affected)
(1 row(s) affected)
Server: Msg 3604, Level 16, State 1, Line 3
Duplicate key was ignored.
(1 row(s) affected)
Server: Msg 3604, Level 16, State 1, Line 5
Duplicate key was ignored.
(1 row(s) affected)
and the count(*) returns 2 at the end.
If running over a linked server:
select count(*) from linkserver.db.dbo.t1
go
insert into linkserver.dbarchive.dbo.t1 values (1,2)
insert into linkserver.db.dbo.t1 values (1,2)
insert into linkserver.db.dbo.t1 values (2,2)
insert into linkserver.db.dbo.t1 values (1,2)
go
select count(*) from linkserver.db.dbo.t1
go
The output is:
(1 row(s) affected)
(1 row(s) affected)
Server: Msg 3604, Level 16, State 1, Line 3
Duplicate key was ignored.
and there is only one row in the table !
Any clue how to avoid that?
Eyal
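A hedged workaround, if both servers are SQL Server 2005 or later and the linked server allows RPC OUT: run the inserts entirely on the remote server with EXEC ... AT, so the "Duplicate key was ignored" warning is handled there instead of failing the distributed statement.

EXEC ('insert into db.dbo.t1 values (1,2)
       insert into db.dbo.t1 values (1,2)
       insert into db.dbo.t1 values (2,2)
       insert into db.dbo.t1 values (1,2)') AT linkserver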
View 2 Replies
View Related
Jul 6, 2007
If I do the query below, SQL Server does a table scan (thousands of rows) for fn_TestCol(), then evaluates the CONTAINS clause:
SELECT col1, col2
FROM myTable
WHERE CONTAINS((col1, col2), 'foo and bar')
AND fn_TestCol(col1) = 0
How can I force it to evaluate CONTAINS clause, which returns only a few rows, first? The best I've come up with is this:
SELECT sub.col1, sub.col2
FROM (
SELECT col1, col2
FROM myTable
WHERE CONTAINS((col1, col2), 'foo and bar')
) sub
WHERE fn_TestCol(sub.col1) = 0
It's much faster, but still not as fast as it would be if I could just use the first query and force SQL Server to evaluate CONTAINS first.
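Another hedged option: join to CONTAINSTABLE, which materializes only the full-text matches, and apply the scalar function to that small row set. The join assumes a unique key column on myTable (pk_col below is a placeholder for the table's full-text key column).

SELECT t.col1, t.col2
FROM CONTAINSTABLE(myTable, (col1, col2), 'foo and bar') AS ft
INNER JOIN myTable AS t ON t.pk_col = ft.[KEY]
WHERE dbo.fn_TestCol(t.col1) = 0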
View 8 Replies
View Related
Apr 27, 2005
This is a solution for a very specific problem, and it's one that you'll hardly ever use, but it's important to know about that one scenario where it can save your neck.
Ordinarily, stored procedures are only recompiled if they're no longer in the procedure cache. But if a stored procedure's execution plan is still in the cache, then SQL Server reuses the compiled stored procedure and its existing execution plan. This is almost always the best course of action. Almost always, but not always.
Sometimes, however, reusing an existing plan doesn't offer the most efficient performance. Imagine, for example, that your stored procedure accepts a parameter that determines the nature of a JOIN operation. The results can vary in a big way, so you wouldn't want your procedure to be locked into an execution plan that might be completely inappropriate for that JOIN. In a highly specialized case like this, you might want to force SQL Server to recompile the procedure every time the procedure runs. Doing so comes at a performance cost, but this might be offset by the savings you gain in not executing the procedure with an awful compiled execution plan. Consider carefully whether to use this approach (or whether to re-engineer the over-design of your application to avoid this situation in the first place).
Should you need to instruct SQL Server to recompile each time, add the WITH RECOMPILE directive to the procedure, like this:
CREATE PROCEDURE ProcName
    @Param int /* ... other parameters */
WITH RECOMPILE
AS
/* ... procedure code follows */
If we omit "WITH RECOMPILE", what will be the consequence? Thanks
View 3 Replies
View Related
Jul 20, 2005
I have a problem with an instance of SQL Server that refuses to respond to a shutdown request. I've managed to shut down the SQL Manager and DTC services, but the sqlservr.exe process is permanently in a "Stopping" state.
I cannot log on to the instance to issue a SHUTDOWN WITH NOWAIT command. Short of rebooting the entire server, is there a way I can force the process to end?
Tony
View 3 Replies
View Related
May 8, 2007
Hi,
I am at a loss here, unless I misunderstand the whole point of server encryption. My 2005 SQL Server has a certificate from a trusted CA, and I have turned on the 'force encryption' flag on the server. My understanding is that any client will be forced to connect with encryption. I found out that unless I turn on encryption on my clients, the server will allow connections without requiring encryption. Am I missing something here? Thanks for any help you can provide.
View 6 Replies
View Related
Aug 7, 2001
My SQL 7.0 server is currently querying the SAM database on the PDC for Windows NT authentication. How can I force it to use the SAM database on the server (BDC) that I specify?
View 1 Replies
View Related
Jul 13, 2015
I need to investigate what happened to our production server last weekend. I restored it to another server, which is a development server, under the name "old_master_2015_07_10".
But if I run a query
SELECT *
FROM [old_master_2015_07_10].[sys].[servers]
it actually selects from master.sys.servers, not from my old_master... To prove it, I created a linked server on this development server, and when I run SELECT * FROM [old_master_2015_07_10].[sys].[servers], it selects it. In the database selection drop-down box I also selected old_master_2015_07_10. What I think is that it apparently recognizes familiar names like sys.servers and redirects the query to master.
What can I do to really select from the old_master_2015_07_10 database? I already thought about renaming sys.servers to something different, but did not do it, so as not to break something in master in case SQL Server runs it there as well.
View 9 Replies
View Related
Oct 30, 2015
Select A.* from A inner join B on ( A.ID= B.ID )
I know there is some keyword that you use to force SQL Server to generate a new query plan. What can that be?
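If the keyword being remembered is the query-level RECOMPILE hint, it is appended to the statement like this:

SELECT A.*
FROM A
INNER JOIN B ON (A.ID = B.ID)
OPTION (RECOMPILE)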
View 7 Replies
View Related
Feb 25, 2008
A view named "Viw_Labour_Cost_By_Service_Order_No" has been created and can be run successfully on the server.
I want to import the data which draws from the view to a table using SQL Server Import and Export Wizard.
However, when I run the wizard on the server, it gives me the following error messages and stops at the step Setting Source Connection:
Operation stopped...
- Initializing Data Flow Task (Success)
- Initializing Connections (Success)
- Setting SQL Command (Success)
- Setting Source Connection (Error)
Messages
Error 0xc020801c: Source - Viw_Labour_Cost_By_Service_Order_No [1]: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "SourceConnectionOLEDB" failed with error code 0xC0014019. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
(SQL Server Import and Export Wizard)
Exception from HRESULT: 0xC020801C (Microsoft.SqlServer.DTSPipelineWrap)
- Setting Destination Connection (Stopped)
- Validating (Stopped)
- Prepare for Execute (Stopped)
- Pre-execute (Stopped)
- Executing (Stopped)
- Copying to [NAV_CSG].[dbo].[Report_Labour_Cost_By_Service_Order_No] (Stopped)
- Post-execute (Stopped)
Does anyone encounter this problem before and know what is happening?
Thanks for your kind reply.
Best regards,
Calvin Lam
View 6 Replies
View Related