I have a simple invoice/inventory/billing program that is doing something strange, but I can't track it down.
I can't figure out what triggers it, but occasionally, when I add an invoice to a customer, all the other invoices are added to the last customer updated.
My UPDATE statement on the invoice table is:
UPDATE Invoices
SET Date = @Date, InvoiceTotal = @InvoiceTotal, SubTotal = @SubTotal, Tax = @Tax, CustomerID = @CustomerID
WHERE (CustomerID = @CustomerID)
The only triggers on the table are one that updates the qty in Inventory and another that fires when a payment is entered.
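For what it's worth, the WHERE clause as posted matches every invoice row belonging to @CustomerID, so a single save would overwrite all of that customer's invoices with the same totals. A minimal sketch of a statement keyed on a hypothetical InvoiceID primary key instead:

UPDATE Invoices
SET [Date] = @Date, InvoiceTotal = @InvoiceTotal, SubTotal = @SubTotal,
    Tax = @Tax, CustomerID = @CustomerID
WHERE InvoiceID = @InvoiceID  -- hypothetical key column; use the table's actual primary key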
Hi All,

I'm trying to track down a mysterious problem we're experiencing in which updates and inserts to tables in our mssql2k server appear to be 'disappearing.'

To explain our situation: we have a web page (written in ASP, if that's relevant) on which we accept enrollment information. When that page is submitted, the form data is passed to a stored procedure on our mssql2k server, which performs several operations, all of which are wrapped in a transaction. In particular, the stored procedure performs an update operation on a record in one table (I'll call it TableA) and an insert into another table (TableB).

If the procedure encounters a problem (i.e. after each update/insert operation in the procedure we test IF @@Error <> 0), it performs a rollback, performs a select similar to the one immediately below, and then RETURNs.

SELECT '1' as error, 'Unable to update TableA' as errormsg

If the procedure doesn't fail any of the @@Error tests, the transaction is committed, and a membership number is SELECTed to be returned.

SELECT '0' as error, @memnum as membershipnumber

The @memnum variable is populated within the transaction.

Back in the ASP page we test both for the proc returning an empty recordset and for it passing an explicit value in the error field, and push the page to an error page if either of these conditions is met. If, on the other hand, neither condition is met, and the membershipnumber field in the recordset is populated with a valid membership number, we push to a confirmation page.

This confirmation page receives the membership number in a session variable, performs a SELECT against TableB (the table that received the insert during the proc) using that membership number in the WHERE clause, and the resultant recordset is used to populate the confirmation details on that page. That recordset is also then used to populate the details of a confirmation email, which is automatically sent by the confirmation page.

And now here's our problem: we've become aware of a handful of people who have gone through the enrollment process and received the confirmation email containing the information they supplied as expected, but whose data appears to be entirely missing from our tables. By that I mean that the record in TableA does not appear to have been updated (under normal circumstances that record should have had several flags set, and several other fields updated with information supplied by the person enrolling), and the record in TableB does not appear to have been inserted.

In essence, looking at our tables, it *feels* like the transaction in the stored procedure for that particular enrollment hit a problem and was rolled back. However, the evidence that we have in the form of the confirmation email argues strongly that the data must have existed in our tables (particularly in TableB), if only for an unknown period of time.

We're kind of at our wit's end to work out what is going wrong with these enrollments. From my understanding of transactions (and I could well be wrong), any changes to data (i.e. updates, inserts etc.) contained within are essentially 'invisible' to any other operation (i.e. the SELECT that happens on the confirmation page) until the transaction is committed, implying that the effect of the update and insert should have been 'permanently' successful if no error code was received and a valid membership number was returned.

I ask because someone on our team has suggested that maybe the operations in the transaction 'lasted long enough' in the tables to have been visible for the SELECT on the confirmation page to have worked, but were then subsequently rolled back, explaining why the confirmation email is appropriately populated and why the data then appears to be missing. However, as I said, this doesn't match my understanding of how transactions behave.

Sorry for the length of this post, but I felt it was best to explain this as well as I could. Does anyone have any advice they can give us on this situation? I.e., are there any known problems with operations in transactions 'bleeding over' into tables, but then being rolled back at some later point? Does anyone have any thoughts or suggestions on how we can further diagnose this issue?

Truly, any help will be immensely appreciated...

Thanks in advance,
M Wells
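For reference, the procedure structure described above boils down to this skeleton -- a minimal sketch, with hypothetical column and parameter names standing in for the real ones:

BEGIN TRAN

UPDATE TableA
SET EnrollFlag = 1, EnrollDate = GETDATE()   -- hypothetical columns
WHERE MemberID = @MemberID                   -- hypothetical key
IF @@ERROR <> 0
BEGIN
    ROLLBACK TRAN
    SELECT '1' AS error, 'Unable to update TableA' AS errormsg
    RETURN
END

INSERT INTO TableB (MemberID, MemNum)        -- hypothetical columns
VALUES (@MemberID, @memnum)
IF @@ERROR <> 0
BEGIN
    ROLLBACK TRAN
    SELECT '1' AS error, 'Unable to insert into TableB' AS errormsg
    RETURN
END

COMMIT TRAN
SELECT '0' AS error, @memnum AS membershipnumber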
Hi All,

Further to my previous long-winded question about a situation in which we appear to be mysteriously losing data from our mssql2k server: we discovered an update statement, in the stored procedure we believe is at fault, after which no error check was being performed. Under certain conditions, this update is fired against the same record in the same table as the immediately preceding update statement within the transaction. We are now suspecting that under some circumstances, these two updates get into a locking conflict that eventually forces the transaction to be rolled back.

However, I'm still left with three questions.

1) Where an update in a transaction gets locked, and an error isn't tested immediately afterwards (i.e. no 'IF @@Error <> 0' test is made), would the transaction proceed as normal?

2) Most critically, would statements in the stored procedure that appear after the COMMIT TRAN statement also be executed, even if an unresolved lock existed within the transaction?

3) Assuming that (2) does happen, would a SELECT made on another connection with a WITH (NOLOCK) locking hint be able to see the changes made in the locked transaction even if the server is set to READ COMMITTED, and the SELECT takes place some time after the COMMIT TRAN is issued? More to the point, given (2), how long would the locked transaction survive before being rolled back after the COMMIT TRAN has been issued? Is it possible that the COMMIT TRAN takes place, the transaction is flagged for potential rollback while a lock resolution is attempted, the stored procedure exits as though everything was fine, a subsequent SELECT (i.e. performed as one of the next operations in the same application) using WITH (NOLOCK) 'sees' the changes made by the transaction, reinforcing the impression that the transaction succeeded, and then at some point thereafter the lock is determined to be unresolvable and the transaction is rolled back, making it seem as though the data disappeared, even though it had been SELECTable via a different connection to the server?

Thanks, by the way, to Simon and Erland for your advice on my previous questions about this problem.

Much warmth,
M Wells
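Regarding question 3, a dirty read under WITH (NOLOCK) is easy to reproduce with two connections; a minimal sketch against a hypothetical table:

-- Connection 1: leave a transaction open (uncommitted)
BEGIN TRAN
UPDATE TableB SET MemNum = 'X123' WHERE MemberID = 42   -- hypothetical row

-- Connection 2: under READ COMMITTED this SELECT would block,
-- but the NOLOCK hint reads the uncommitted value
SELECT MemNum FROM TableB WITH (NOLOCK) WHERE MemberID = 42

-- Connection 1: roll back; the value connection 2 saw never became durable
ROLLBACK TRAN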
I hope this is the right forum for this question; my apologies in advance if it isn't....
We have a web-based CGI product (written in C++, VS 6) that uses ODBC and takes text from a submitted web page and stores it in a SQL Server table in a field of type "ntext". The user in question is copying and pasting this text from an MS Word 2003 document. After the initial save, our app errors out trying to access the table it just wrote to, and when we look in the table we see that up to **200 carriage returns** have been mysteriously inserted into the ntext field!! (Our product has been out in the field with no such problem for several years, so we are thinking it's related to something specific the customer is doing - perhaps using MS Word as the source of the text.) We have tried but cannot duplicate the problem, yet the customer sees it with each attempt to modify the table in question. The only thing that I see out of the ordinary is that the field in question is of type "ntext" (which supports Unicode) instead of nvarchar. Does any of this ring a bell for anybody? I'm thinking of changing the field type to nvarchar to see if that solves the problem. Thanks, Steve Bradbery
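If it helps, the type change being considered can be done in place on SQL Server 2005 (on SQL Server 2000, ALTER COLUMN cannot modify an ntext column, so a new column plus a copy would be needed). A sketch with hypothetical table and column names:

-- nvarchar(max) avoids the 4,000-character cap of nvarchar(n)
ALTER TABLE dbo.PageText ALTER COLUMN BodyText nvarchar(max)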
Hi. I am stuck with the error: An error occurred while receiving data: '64(error not found)'. My Service Broker configuration: Server A is the initiator, Server B the target. Server A sends a message to Server B, and Server B sends back a reply. At this stage the problem occurs: Server B's message never arrives at Server A; it stays in sys.transmission_queue on Server B. This problem occurs only during initial setup. During initial setup, messages, queues, services, contracts, routes, and bindings are created on Server B; on Server A: routes and bindings. As the last step, a message is sent to Server B. The reply to this message never comes. Initial setup is run using a procedure with EXECUTE AS OWNER, where the owner is 'dbo'. But whenever I send messages after that, everything works fine.
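One diagnostic that may help: messages stuck in sys.transmission_queue carry a status column that usually names the routing, security, or permission problem holding them there. A sketch to run on Server B in the Service Broker database:

SELECT to_service_name, enqueue_time, transmission_status
FROM sys.transmission_queue
-- transmission_status is empty while delivery is still being attempted;
-- otherwise it contains the last delivery error for that message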
1. Created an INSTEAD OF trigger for INSERT.
2. In the trigger, I have written a SQL statement which inserts data into the same table.
3. When I do the INSERT on the table, the data in the trigger gets inserted.
4. I was sort of expecting a recursive loop.
Is this the normal behavior?
------------------------ I think, therefore I am - Rene Descartes
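A minimal sketch of the setup described, with hypothetical table and trigger names. An INSERT issued from inside an INSTEAD OF INSERT trigger against the same table does not re-fire that trigger, which is why no recursive loop appears:

CREATE TABLE dbo.Demo (ID int, Val varchar(20))  -- hypothetical table
GO
CREATE TRIGGER trg_Demo_Insert ON dbo.Demo
INSTEAD OF INSERT
AS
BEGIN
    -- this INSERT targets the same table, but SQL Server does not call an
    -- INSTEAD OF trigger recursively for the same action, so it runs once
    INSERT INTO dbo.Demo (ID, Val)
    SELECT ID, Val FROM inserted
END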
I have encountered a very frustrating situation when trying to use SqlBulkCopy. I have two Excel files that I am trying to import into two tables in an MSSQL Server 2005 Express DB. One Excel file has 5,000 rows, while the other file has 500,000 rows. I was able to import the smaller file successfully using this VB.NET code:

Protected Sub L26ExcelToSQL()
    'Declare variables
    Dim sSQLTable As String = "Local26Members"
    Dim sExcelFileName As String = "Full Local 26 List Formatted.xls"
    Dim sWorkbook As String = "[Sheet1$]"
    Dim sSqlConnectionString As String = ConfigurationManager.ConnectionStrings("SiteSqlServer").ConnectionString.ToString

    'Excel connection string (assumed; the original post uses sExcelConnectionString without declaring it)
    Dim sExcelConnectionString As String = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & sExcelFileName & ";Extended Properties=""Excel 8.0;HDR=Yes"""

    'Execute a query to erase any previous data from our destination table
    Dim sClearSQL = "DELETE FROM " & sSQLTable
    Dim SqlConn As SqlConnection = New SqlConnection(sSqlConnectionString)
    Dim SqlCmd As SqlCommand = New SqlCommand(sClearSQL, SqlConn)
    SqlConn.Open()
    SqlCmd.ExecuteNonQuery()
    SqlConn.Close()

    'Series of commands to bulk copy data from the Excel file into our SQL table
    Dim OleDbConn As OleDbConnection = New OleDbConnection(sExcelConnectionString)
    Dim OleDbCmd As OleDbCommand = New OleDbCommand(("SELECT * FROM " & sWorkbook), OleDbConn)
    OleDbConn.Open()
    Dim dr As OleDbDataReader = OleDbCmd.ExecuteReader()
    Dim bulkCopy As SqlBulkCopy = New SqlBulkCopy(sSqlConnectionString)
    bulkCopy.DestinationTableName = sSQLTable
    bulkCopy.WriteToServer(dr)
    OleDbConn.Close()
End Sub

However, when I tried to import the 500,000 row Excel file, I got the following error:

Server Error in '/L26' Application.
A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The specified network name is no longer available.)
Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.Data.SqlClient.SqlException: A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The specified network name is no longer available.)
[SqlException (0x80131904): A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The specified network name is no longer available.)]
   System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection) +925466
   System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection) +800118
   System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj) +186
   System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, UInt32 error) +556
   System.Data.SqlClient.TdsParserStateObject.ReadSni(DbAsyncResult asyncResult, TdsParserStateObject stateObj) +164
   System.Data.SqlClient.TdsParserStateObject.ReadPacket(Int32 bytesExpected) +34
   System.Data.SqlClient.TdsParserStateObject.ReadBuffer() +44
   System.Data.SqlClient.TdsParserStateObject.ReadByte() +17
   System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj) +79
   System.Data.SqlClient.SqlBulkCopy.WriteToServerInternal() +1336
   System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServer(Int32 columnCount) +916
   System.Data.SqlClient.SqlBulkCopy.WriteToServer(IDataReader reader) +151
   _Default.CSVToSQL() in d:hostingmemberwolsite1L26DuesDefault2.aspx.vb:440
   _Default.ButtonTest3_Click(Object sender, EventArgs e) in d:hostingmemberwolsite1L26DuesDefault2.aspx.vb:905
   System.Web.UI.WebControls.Button.OnClick(EventArgs e) +105
   System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument) +107
   System.Web.UI.WebControls.Button.System.Web.UI.IPostBackEventHandler.RaisePostBackEvent(String eventArgument) +7
   System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument) +11
   System.Web.UI.Page.RaisePostBackEvent(NameValueCollection postData) +33
   System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +1746
Version Information: Microsoft .NET Framework Version:2.0.50727.1433; ASP.NET Version:2.0.50727.1433

After I received this error message, I tried viewing my database through the MSSQL control panel utilized by my hosting provider (WebHost4Life). However, I was unable to connect to the database and received this error:

___________________
Microsoft OLE DB Provider for SQL Server error '80040e14'
Database 1496 cannot be autostarted during server shutdown or startup.
/getDBinfo.asp, line 29
_____________________

Now here is the most frustrating/mysterious part. I figured that maybe the error messages were a result of the large size of the second Excel file, so just for testing purposes I created a new table in my MSSQL database. The table just has two fields, both set to varchar(50). I then created a test Excel file that had one row, with the word "test" in the first and second columns. When I tried using the code above to import the test Excel data into the test table, I got the same exact error as I did with the 500,000 row file! Please help, I'm really stumped, and I am not sure why I am having so much trouble replicating the success I had with the 5,000 row file. Any suggestions are much appreciated. -Bryan
Is there a way to set either SQL Server 2000 or ASP.NET datetime fields to a standard format? The problem is that I am passing correct datetime values using stored procedures and keep getting "Cannot convert datetime into string". It seems to me that many other developers are having the same problem. I have tried a lot of different methods and still have the same problem. I'm using C#, and I never had a problem with datetime fields when I was using VB.NET. The problem is that SQL Server is returning datetime formats that are not compatible with C#. I have code that works in other projects, but when I try to use that same code I get that conversion error. How do I set the datetime in SQL Server and ASP.NET when I run queries so that the datetime output is in mm/dd/yyyy?
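If the goal is simply mm/dd/yyyy text coming out of the query, one option is to format on the SQL Server side with CONVERT style 101; a sketch against a hypothetical table and column:

SELECT CONVERT(varchar(10), GETDATE(), 101) AS Today  -- style 101 = mm/dd/yyyy

SELECT CONVERT(varchar(10), OrderDate, 101) AS OrderDateText
FROM Orders  -- hypothetical table/column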
I get this mysterious error while running an update: SQL Server tells me that my updated row is too big to fit in a row. But when I check the size of the row, it is about 2,000 bytes, so it should fit without any problem at all. The really strange thing is that when I comment out one field, it works just fine. That field is always NULL; could that be the problem? Has anyone ever had a similar problem?
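One way to check where the bytes are actually going is to sum DATALENGTH per column and compare against the 8,060-byte in-row limit; a sketch with hypothetical table and column names (DATALENGTH returns NULL for NULL values, hence the ISNULL wrappers):

SELECT TOP 10 ID,
       ISNULL(DATALENGTH(Col1), 0)
     + ISNULL(DATALENGTH(Col2), 0)
     + ISNULL(DATALENGTH(Col3), 0) AS RowBytes  -- add the remaining columns
FROM dbo.MyTable
ORDER BY RowBytes DESC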
Second post for the day. I've just noticed a few indexes in our tables that are named like this: hind_c_207_1. They seem to have been put in automatically, and are messing with our index structure. Does anyone know where these come from and how I can stop them? Any help would be great.
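Names of the hind_... pattern are typically hypothetical indexes left behind by the Index Tuning Wizard. A sketch (SQL Server 2000 system tables) to list them so they can be dropped:

SELECT OBJECT_NAME(id) AS TableName, name AS IndexName
FROM sysindexes
WHERE INDEXPROPERTY(id, name, 'IsHypothetical') = 1
-- each one can then be removed with DROP INDEX TableName.IndexName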
Hey all, strange problem here... Query #1 displays 357 records correctly and all is well. However, when placed within query #2 as a subquery, it updates every single record in the lta table. What's going on here? Any thoughts?

1.)
select *
from LTA INNER JOIN new_list
    ON lta.voy = new_list.voy
    AND lta.poe = new_list.poe

2.)
update lta
set lta.LL_RCVD = 'N'
where exists (select *
              from LTA INNER JOIN new_list
                  ON lta.voy = new_list.voy
                  AND lta.poe = new_list.poe)
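Worth noting: the EXISTS subquery re-declares LTA internally and never references the outer row, so it evaluates true for every row whenever the join returns anything at all. A sketch of a correlated form, assuming the intent is to flag only the matching rows:

update LTA
set LL_RCVD = 'N'
where exists (select 1
              from new_list
              where new_list.voy = LTA.voy
                AND new_list.poe = LTA.poe)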
I am facing a pretty strange issue running a query in SQL Server through an ODBC client. Here is what is happening -
1. I am able to connect to SQL server. The connect entry is present in ODBC log as well as SQL Server trace. Running sp_who2 active indicates the connection to be active. No problem here.
2. At this point, running a simple query (something that returns in less than 10 minutes or so) comes back with the result set. No problem here.
3. However, when a complex query (one that does not return anything even after 10 minutes) is run, a run of sp_who2 active or dbcc inputbuffer indicates that the connection created initially is no longer there. The SQL Server trace indicates a logout event for the connection. However, the ODBC log indicates that the connection is still active in SQL Server, and the ODBC client continues to wait for the results of the query for hours until I kill it.
I use ODBC 4.2 client on AIX 5.2 and connect to SQL Server 2005.
One of the theories I have is that the AIX server does not respond to more than a certain number of keep-alive messages, resulting in SQL Server closing the connection gracefully, thinking that the client is no longer available. However, even after changing the TCP/IP keep-alive setting in the Network Configuration Menu to 4 hrs, the same behaviour continues.
Can someone throw some light as to what might be happening?
Hi everyone,

It looks like a mystery, but I hope there is some explanation for the issue I experience. Once in a blue moon, a random stored procedure stops working the way it was designed. The stored procedure code looks unchanged. Recompiling and altering the code do not help. It looks like it simply does not execute some part of itself, or does not run at all. However, it returns no errors. One time a procedure entered an infinite loop and almost hung the whole server.

When I copy the procedure code and save it under a different name, it works as designed. But nothing helps with the existing procedure. The only way to fix it is to completely drop and recreate it. The problem is that you usually have to do this in the middle of the business day, after you have spent a few hours trying to work out what went wrong, before you realize that you've got another mysterious corruption. Of course, I have no clue how to detect such things in advance or prevent them from occurring in the future.

I can guarantee that the SQL code in those procedures was absolutely bug free, fully tested, and had been working fine for a long time. At first I thought that the internal compiled code might be corrupt; in that case altering or recompiling should help. I also thought about the execution plan, but that should also be fixed by doing the things above. DBCC CHECKDB does not find any errors. The issue never goes away until the stored procedure is manually dropped and recreated with the same SQL code.

So I'm asking: if someone has experienced something similar and can explain how to prevent it, please share the knowledge. I would appreciate any type of help.

Thank you.
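For reference, the only workaround the post reports as effective, sketched with a hypothetical procedure name:

IF OBJECT_ID('dbo.usp_Mystery') IS NOT NULL
    DROP PROCEDURE dbo.usp_Mystery
GO
CREATE PROCEDURE dbo.usp_Mystery
AS
BEGIN
    SELECT 1 AS placeholder  -- the original, unchanged procedure body goes here
END
GO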
Yesterday afternoon I lost access to my local MSDE 2000 SP3(a?) instance. Using NT or SQL authentication I get an 'access denied' error when I try to connect. This happened suddenly, with no changes to the server on my part. I've tinkered with the Client Network Utility settings, from Multiprotocol only to every combination of Multiprotocol, TCP/IP, and named pipes. I've tried connecting with my network machine name, using (local), and using my IP address (both 127.0.0.1 and my network IP). I've tried connecting from Query Analyzer on other computers. Of course I've rebooted as well, all to no avail. I know that there were network changes being made, but my domain login (which I normally use to connect to my SQL Server as a sysadmin) is still in the local administrators group on my machine, and I haven't noticed any problems accessing anything else. Any ideas?
When executing a "big" update on a SQL Server 2005 Standard Edition SP1 instance (not clustered), it sometimes stops in the middle of the work with this message:
The specified network name is no longer available. [SQLSTATE 08S01] (Error 64) Communication link failure [SQLSTATE 08S01] (Error 64)
I was reading some articles, but they were all about clustered servers. Somewhere I read that it could be a memory problem, so we added 3 GB of RAM and a separate 20 GB paging partition beside the other data and backup partitions. The problem remains...
The funny thing is that it sometimes happens and sometimes doesn't, under the same conditions!?
The server is Windows 2003 Server standard edition with SP1.
Hello, I have a report that looks fine in preview, but when output to PDF or printed it contains gaps at the right and bottom.
Here is a picture of the problem: http://northeasttigers.webng.com/pdfproblem.jpg
My tables are the same width as the body, 15.5cm, and the report width is 21cm. Also, adding the bottom table's top location attribute to its height gives the same height as the body.
I get the following error messages in the SQL Server error log:
Source Logon
Message Error: 18456, Severity: 14, State: 11.
and
Source Logon
Message Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON'. [CLIENT: 185.23.11.33]
The scenario is: we set up log shipping (LS) between a clustered SQL Server system (source server) and a stand-alone SQL Server box (target server), both SQL Server 2K5 EE + SP1. LS itself goes very well, but on the target server we found the above-mentioned error messages. BTW: the two servers are in the same domain.
We have a highly transactional database. It was owned by a third party before, but now both the database and the application are on our site and we are trying to improve this project. We have a big (902,919 rows) heap table which is getting bigger every day, and sometimes deadlocks occur. The table has only 4 columns: "token", "type", "value" and "cacheTime", and a unique index cannot be created. It has one index, on "token" (char(36)) and "type" (varchar(50)); "value" should also be included, but it is nvarchar(max).
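On SQL Server 2005 and later, an nvarchar(max) column cannot be an index key but can ride along as an included column; a sketch with a hypothetical table name:

CREATE NONCLUSTERED INDEX IX_Cache_Token_Type
ON dbo.CacheTable ([token], [type])  -- hypothetical table name
INCLUDE ([value])                    -- nvarchar(max) allowed here, not as a key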
I have a DTS package which creates a table that is utilized on my local intranet. The DTS package runs without error and the table is created/populated/transferred to the appropriate db. Then it appears that there is an action on this table which truncates it. I have been unable to determine the culprit. Can I create a trigger that will capture truncation? I have tried to create a trigger to capture this information, but none that I attempt seem to work for capturing a truncation or a drop-table-and-recreate. Any help would be greatly appreciated. MT.
A report is picking up some values from the body and displaying them in text boxes within the Page Header, via the ReportItems collection. The text boxes within the body have their format specified as #,###; (#,###) - so displaying negative values within brackets. If the following value is set for the Page Header text box:
="My Value" & " " & ReportItems!variance.Value
the value displayed is, for example:
(My Value (1,123
Hence the requested trailing bracket has been swapped to become a leading bracket. Whatever I've tried, I cannot get the bracket in the correct place. Am I missing something obvious, or is this a bug?
I have a function that returns a table of information about properties. The data comes from three different tables -- addresses (called PropertyID), property characteristics, and events concerning those properties (sales, appraisals, etc.), plus a table that maps one representation of property types into another. The records are selected on the basis of location (longitude & latitude), property type, event type, and a range of event dates (upper and lower date specified). There are tens of millions of records of all types, and almost any location, property type, event type and date range will yield records.

The heart of it is a cursor that selects records from joins on this basis:

SELECT <a bunch of fields>
FROM Property d
JOIN PropTypeMap ptm ON ptm.PropertyTypeID = d.PropertyTypeID
JOIN PropertyID a ON a.PropID = d.PropID
JOIN Event e1 ON e1.PropID = d.PropID
LEFT OUTER JOIN Event e2 ON e2.PropID = d.PropID
WHERE
    d.LastSaleDate >= @LoDate
    AND a.GeoLongitude BETWEEN @LowerLon AND @UpperLon
    AND a.GeoLatitude BETWEEN @LowerLat AND @UpperLat
    AND ptm.PropCategory = @PropType
    AND a.GeoMatch <= @MinGeoQuality
    AND e1.EventTypeID = @SaleEventType
    AND e1.TransactionType = 'R'
    AND e1.EventDt BETWEEN @LoDate AND @HiDate
    AND e1.EventAmt > 0
    AND e2.EventTypeID = @AssessmentEventType
    AND e2.EventDt <= @HiDate
    AND e2.EventAmt > 0

Each property has one PropertyID record, one Property record, and N Event records (average perhaps five).

What is the mystery? If @HiDate, which is the upper end of the time window, is 2002-11-08 or earlier, nothing is returned. If it's 2002-11-09 or later, oodles of records are found. I get the same query plan for either one, and based on the content of the data, they should return almost exactly the same set of records -- exactly the same set in almost all cases, in fact. Is 2002-11-08/09 some sort of magic dividing point? I have replicated this on the large database and on a smaller test version on another SQL Server (SQL Server 2000). I dropped the indexes and tried it, and the same thing happened. This is driving me crazy!
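One structural point worth checking: the WHERE clause filters on e2's columns, which quietly turns the LEFT OUTER JOIN into an inner join (rows with no qualifying e2 event are discarded). If the outer join is intended, those predicates belong in the ON clause; a sketch of just that join:

LEFT OUTER JOIN Event e2
    ON  e2.PropID = d.PropID
    AND e2.EventTypeID = @AssessmentEventType  -- moved out of WHERE
    AND e2.EventDt <= @HiDate                  -- so unmatched rows survive
    AND e2.EventAmt > 0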
I am wondering if it is possible to use SSIS to split a data set into a training set and a test set and feed them directly to my data mining models, without saving them somewhere (they would occupy too much space). I would really appreciate guidance on this.
I have used both data readers and data adapters (with datasets) in the projects that I have worked on. I am trying to get some clarification on when I should be using which one. I think I am doing this correctly, but I want to be sure I am developing good habits.
As the name might suggest, it seems like a DataReader is only for reading data. I have read that the DataAdapter and DataSet are for a disconnected architecture, or at least that they can be used for that type of setup. I have been using the DataAdapter and DataSets when writing to a database and the DataReader when reading from a database.
Is this how these should be used? Is the data reader the best choice for reading data? Am I doing this the optimal way from a performance stand point?
Thanks in advance
We have already integrated various client data into MDS with the MS Excel plugin; now we want to push updated or newly added records back to the source database. Is it possible to do this using MDS? Is there any background sync process which automatically syncs data to and from the subscriber and MDS?
When I enter over 4,000 chars in any ntext field in my SQL Server 2005 database (directly in the database and through the application) I get an error saying that the data could not be updated because string or binary data would be truncated. Has anyone ever seen this? I cannot figure out what is causing it; ntext should be able to hold a lot more data than this...
I have a requirement to implement CDC for 50+ tables, to feed incremental data changes into warehouse/reporting rather than exporting whole tables. The largest table has more than half a billion records.

The warehouse uses a daily copy of the OLTP db (daily DB refresh). How can I accomplish this? Is there a downside to implementing CDC just for the sake of taking incremental changes on the tables?

Is there any performance impact if we enable CDC on the OLTP db?

Can we make use of the CDC tables in the environment where we do the daily db refresh, so that the queries don't hit the OLTP database?

What is the best way to implement CDC to take incremental changes for reporting?
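For reference, enabling CDC is two calls -- once per database, then once per table; a sketch with hypothetical names:

USE MyOLTPDb
GO
EXEC sys.sp_cdc_enable_db
GO
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'BigTable',   -- hypothetical table
    @role_name     = NULL           -- NULL = no gating role for change data access
GO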
Hi,

This is driving me nuts. I have a table that stores notes regarding an operation in an IMAGE data type field in MS SQL Server 2000. I can read and write with no problem from Access using the StrConv function, and I can update the field correctly in T-SQL using:

DECLARE @ptrval varbinary(16)
SELECT @ptrval = TEXTPTR(BITS_data)
FROM mytable_BINARY WHERE ID = 'RB215'
WRITETEXT OPERATION_BINARY.BITS @ptrval 'My notes for this operation'

However, I just cannot seem to be able to convert the information back to text once it is stored, using T-SQL. My selects keep returning binary data. How do I do this? Thanks for your help. SD
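Since IMAGE cannot be converted straight to character data, one route back is to go via varbinary; a sketch using the table and column names as they appear in the post (first 8,000 bytes only):

SELECT CAST(CAST(BITS AS varbinary(8000)) AS varchar(8000)) AS NotesText
FROM OPERATION_BINARY
WHERE ID = 'RB215'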
I'm using a Script Component to load data into an Oracle DB due to a poor-performance issue. Now I have found that it is missing some data during the transmission. Please see the errors below (originally a screenshot):
[DTS.Pipeline] Error: "component "Excel Source" (1)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
and also this:
[Excel Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "Fiscal Week" needs to be updated in the external metadata column collection. The column "Fiscal Year" needs to be updated in the external metadata column collection. The column "1st level" needs to be added to the external metadata column collection. The column "2nd level" needs to be added to the external metadata column collection. The column "3rd level" needs to be added to the external metadata column collection. The "external metadata column "1st Level" (16745)" needs to be removed from the external metadata column collection. The "external metadata column "3rd Level" (16609)" needs to be removed from the external metadata column collection. The "external metadata column "2nd Level" (16272)" needs to be removed from the external metadata column collection.
I tried going to data flow -> Excel connection -> advanced editor for Excel source -> input and output properties, and tried to refresh the affected columns. It seems that somehow the 3 columns are not read in from the source file, and also that Fiscal Year and Fiscal Week are not set up properly in my data destination? Has anyone faced such errors before?