Inconsistent Metadata From Linked 6.5 Server
Aug 16, 1999
I have a problem retrieving data from a linked SQL Server 6.5 server in version 7. I have set up the server as a linked server, and most operations work with no problem. With some tables, though, I get the following error when I attempt to query any column of data:
Server: Msg 7356, Level 16, State 1, Line 1
OLE DB provider 'SQLOLEDB' supplied inconsistent metadata for a column. Metadata information was changed at execution time.
This occurs consistently for a subset of tables. I cannot see any pattern in the configuration of these tables that would make them any different from those where a query succeeds.
Any ideas?
Regards
Philippe
View 1 Replies
Dec 20, 2007
I am running a SQL2K on a Server2K box. I have linked servers to SQL2005 Express on WinXP. I am trying to get data off of the SQL2005 DB to the central DB on SQL2K. This has been running for about a year. I am now getting an error message as follows:
Server: Msg 7356, Level 16, State 1, Line 1
OLE DB provider 'SQLNCLI' supplied inconsistent metadata for a column. Metadata information was changed at execution time.
OLE DB error trace [Non-interface error: Column 'TestVarChar' (compile-time ordinal 2) of object '"WKS"."DBo"."testtable"' was reported to have changed. The exact nature of the change is unknown].
This happens when I run either of the following queries on the SQL2K box:
SELECT * from WKSA400.WKS.DBo.testtable
or
SELECT * from OPENQUERY(WKSA400, 'SELECT * from WKS.DBO.testtable')
I did notice that if my testtable has only numeric fields, it works fine. It only has a problem with alphanumeric fields.
If I run 'exec sp_tables_ex WKSA400' on the SQL2K box, it works fine: I get a listing of all the tables on the remote server, which tells me that my security is fine. It seems to be a conversion issue between SQL 2005 and SQL 2000. Had it not been working for over a year, the dent in the wall the size of my head would be much smaller.
View 7 Replies
Jul 11, 2006
I have a Source Query with this sql set as a property expression:
"SELECT Category, Server_Name, Entitle_UserID,User_SubID,Start_Time,End_Time,Entitle_User_Name,Stat_Name,Stat_Count,Stat_Type,pk,create_date,run_num,Average,Median,Maximim FROM tbl_ws_stats WHERE pk > " + (DT_STR, 100, 1252)@[pk_var]
I get the message: 'The component has inconsistent metadata.'
Then, when I click on the Source Query: 'The component is not in a valid state. Do you want the component to fix itself automatically?'
I also notice that there are no columns on the Column Mappings tab and no way to add columns.
How can I correct this?
Thanks
View 9 Replies
Feb 4, 2008
I have SQL Server 2000 SP4 and I am using a linked server which is also 2000 SP4. I am facing an interesting error:
OLE DB provider 'SQLOLEDB' supplied inconsistent metadata while extracting the data.
FYI.
I am executing a dynamic query which uses another server to fetch the data.
Let me know if anyone can help me out to resolve this issue.
Thanks
Vishal
View 3 Replies
May 4, 2008
help...
I successfully linked a Sybase database to SQL Server. At first I can select a record, but when I execute the same statement again the result is different. I looked for an answer on the web but couldn't find one. Can somebody help me?
I suspect that SQL Server caches the result, so I tried creating a view over the Sybase table. When I select from the view it gives me the complete result, but when I rerun the statement it becomes inconsistent again.
Is there a cache or something in SQL Server? If there is, how can I clear it, so that I always get a fresh copy of the result?
View 6 Replies
Feb 26, 2007
Hello,
I have a linked server named 'Charlie_File' to an Excel workbook that I set up in SQL Server 2005 Management Studio. The workbook is on my local C drive. Sometimes I get the results back that I expect when I run the following query:
SELECT * FROM OPENQUERY(Charlie_file, 'SELECT * FROM [Feb$]')
Sometimes, on subsequent runs of the above query, I get the following message:
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "MSDASQL" for linked server "Charlie_file" reported an error. The provider did not give any information about the error.
Msg 7303, Level 16, State 1, Line 1
Cannot initialize the data source object of OLE DB provider "MSDASQL" for linked server "Charlie_file".
There seems to be about a minute or so of a delay before the query will run correctly on subsequent attempts. Is there a connection issue here where a connection blocks subsequent attempts to select the data within a specific time span?
Thank you for your help!
cdun2
View 1 Replies
Jun 5, 2014
I have a lot of rows of hours, set up like this: 0745, 0800, 2200, 1145 and so on (varchar(5), for some reason).
These are converted into a smalldatetime like this:
CONVERT(smalldatetime, STUFF(timestarted, 3, 0, ':')) [this would give output like this - 1900-01-01 11:45:00]
This code has been in place for years...and we stick the date on later from another column.
But recently, it's started to fail for some rows, with "The conversion of a varchar data type to a smalldatetime data type resulted in an out-of-range value".
My assumption is that the new data being added is junk. If I query for these values and just list them (rather than also adding a converted column), that's fine, of course. I've checked all the stuffed (but not yet converted, so 11:45 rather than 1145) output with ISDATE(), and it all passes. There are no times with hours > 23 or minutes > 59 either.
If I add the CONVERT in, we see the error message. But here's the oddity, if I place all of the rows into a holding table, and retry the conversion, there is no error. It's this last bit that is puzzling me. Plus I can't see any errors in the hours data that would cause a conversion problem.
I've put the whole thing into a cursor to try to trap the error rows too, but then everything processes fine. Why would it fail only when NOT in a cursor?
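One way to corner the bad rows without raising the error, as a minimal sketch: TRY_CONVERT (SQL Server 2012 and later) returns NULL instead of failing, so only the unconvertible values come back. The table name below is a placeholder; timestarted is the column from the post.
SELECT timestarted
FROM dbo.TimesTable -- hypothetical table name
WHERE timestarted IS NOT NULL
AND TRY_CONVERT(smalldatetime, STUFF(timestarted, 3, 0, ':')) IS NULL
-- TRY_CONVERT swallows the out-of-range error, so this lists exactly the rows that CONVERT would choke on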
View 9 Replies
Aug 27, 2014
I have the following data stored in a column in a table:
&#163;10,000.00 &amp; &apos; &quot; &#37; &lt; &gt; Gu&#240;mundsd&#243;ttir Bj&#246;rk L&#229;rs Marqu&#233;s Mar&#237;a-Jose Carre&#241;o Qui&#241;ones
I query this in a stored procedure using the FOR XML clause and universal table, and store the result in an XML data type variable:
DECLARE @Result XML
EXECUTE [someStoredProcedure] @Result OUTPUT
This data (as XML) is then written to a file; in order to do this, I CONVERT the data to NVARCHAR since there are unicode characters in the source:
DECLARE @strResult NVARCHAR(MAX)
SET @strResult = CONVERT( NVARCHAR( MAX ), @Result, 1 )
Now this works fine, except that on inspection, SQL Server has decided to render the data thus:
&#163;10,000.00 &amp; ' " &#37; &lt; &gt; Gu&#240;mundsd&#243;ttir Bj&#246;rk L&#229;rs Marqu&#233;s Mar&#237;a-Jose Carre&#241;o Qui&#241;ones
Why it has changed the apos and quot entities to the corresponding characters but not the other entities is beyond me.
How can I preserve the XML entities?
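For what it's worth, a minimal repro sketch of what I believe is happening (my reading, not confirmed in the post): the xml type stores a parsed form, so entity references are never kept as entities; on serialization only the characters that are unsafe in element content (&, <, >) are re-escaped, which would explain why apostrophes and quotes come back as plain characters.
DECLARE @x XML
SET @x = N'<r>&amp; &lt; &gt; &apos; &quot;</r>'
SELECT CONVERT(NVARCHAR(MAX), @x, 1)
-- returns <r>&amp; &lt; &gt; ' "</r> : all five entities were parsed to characters, and only & < > were escaped again on the way out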
View 0 Replies
Jun 1, 2015
I have an issue in my production environment, SQL Server 2008 R2. The error log says: 'An error occurred in dialog transmission. Error: 9655, State: 3' and 'Error: 9736, Severity: 16, State: 0'.
I checked sysmessages, which says: 'The transmission queue table structure in database is inconsistent. Possible database corruption.'
So do you think I need to run DBCC CHECKDB? Should I run it after working hours, or is it fine within working hours?
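For reference, a minimal sketch of a first pass (the database name is a placeholder). CHECKDB runs online against an internal snapshot, but it is I/O- and CPU-heavy, so off-hours is the safer choice on a busy system.
DBCC CHECKDB (N'YourDatabase') WITH NO_INFOMSGS, ALL_ERRORMSGS
-- if the window is short, PHYSICAL_ONLY is a lighter-weight variant:
-- DBCC CHECKDB (N'YourDatabase') WITH PHYSICAL_ONLY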
View 9 Replies
Apr 21, 2015
USE <database>
select * from sys.database_files
and
select * from sys.master_files where database_id= <db id>
give me different sizes for the memory-optimized file in <database>.
Microsoft SQL Server 2014 - 12.0.2456.0 (X64)
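A small sketch for lining the two views up side by side (run it inside the database in question; both views report size in 8 KB pages, so any mismatch shows up directly):
SELECT df.file_id, df.name, df.type_desc,
       df.size AS pages_per_database_view,
       mf.size AS pages_per_master_view
FROM sys.database_files AS df
JOIN sys.master_files AS mf
  ON mf.database_id = DB_ID()
 AND mf.file_id = df.file_id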
View 1 Replies
May 6, 2015
I am getting inconsistent results when BULK INSERTing data from a tab-delimited text file. As part of my testing, I run the same code on the same file again and again, and I get different results every time! I see this on both SQL 2005 and SQL 2008 R2.
We have an application that imports data from a spreadsheet. The sheet contains section headers with account numbers and detail rows with transactions by date:
AAAA.1234 /* (account number)*/
1/1/2015 $150 First Transaction
1/3/2015 $24.233 Second Transaction
BBBB.5678
1/1/2015 $350 Third Transaction
1/3/2015 $24.233 Fourth Transaction
My import program saves this spreadsheet as tab-delimited text, then I use BULK INSERT to bring the data into a generic table full of varchar(255) fields. There are about 90,000 rows in each day's data; after the BULK INSERT, about half of them are removed for various reasons.
Next I add a RowID column to the table with the IDENTITY (1,1) property. This gives my raw data unique row numbers.
I then run a routine that converts and copies those records into another holding table that's a copy of the final destination table. That routine parses though the data, assigning the account number in the section header to each detail row. It ends up looking like this:
AAAA.1234 1/1/2015 $150 First Purchase
AAAA.1234 1/3/2015 $24.233 Second Purchase
BBBB.5678 1/1/2015 $350 Third Purchase
BBBB.5678 1/3/2015 $24.233 Fourth Purchase
My technique: I use a cursor to get the starting RowID for each account number, then use the upper and lower RowIDs to do an INSERT into the final table. The query looks like this:
SELECT RowID, SUBSTRING(RowHeader, 6,4) + '.UBC1' AS AccountNumber
FROM GenericTable
WHERE RowHeader LIKE '____.____%'
The results (not reproduced here) look plausible, but every time I run the routine, I get different numbers!
Needless to say, my results are not accurate. I get inconsistent results EVERY TIME. Here is my code, with table, field and account names changed for business confidentiality:
TRUNCATE TABLE GenericImportTable;
ALTER TABLE GenericImportTable DROP COLUMN RowID;
BULK INSERT GenericImportTable FROM '\\SERVER\General\Appname\DataFile.2015.05.04.tab.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', FIRSTROW = 6)
ALTER TABLE GenericImportTable ADD RowID int IDENTITY(1,1) NOT NULL
SELECT RowID, SUBSTRING(RowHeader, 6,4) + '.UBC1' AS AccountNumber
FROM GenericImportTable
WHERE RowHeader LIKE '____.____%'
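Two thoughts, offered as a sketch rather than a fix. First, adding the IDENTITY column with ALTER TABLE after the load assigns RowID values in no guaranteed order, which by itself could explain the run-to-run differences. Second, assuming RowID really does reflect file order, the header-to-detail assignment can be done set-based instead of with a cursor (table and column names are the post's own placeholders):
SELECT d.RowID, a.AccountNumber, d.RowHeader
FROM GenericImportTable AS d
CROSS APPLY (SELECT TOP (1)
                    SUBSTRING(h.RowHeader, 6, 4) + '.UBC1' AS AccountNumber
             FROM GenericImportTable AS h
             WHERE h.RowHeader LIKE '____.____%'
               AND h.RowID < d.RowID
             ORDER BY h.RowID DESC) AS a
WHERE d.RowHeader NOT LIKE '____.____%'
-- each detail row picks up the nearest preceding section header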
View 3 Replies
Mar 25, 2002
Hi ,
On my desktop I registered the production server in Enterprise Manager.
On that server, if I go to Security\Linked Servers,
another server is already mapped. When I try to see the tables under that
linked server, I get the error message:
"Error 17: SQL Server does not exist or access denied"
If I go to the production server itself and try to see the tables, I can see them properly, no problems.
Why am I not able to see them from my desktop?
I am using the sa user while mapping the production server on my desktop in Enterprise Manager.
I also checked the Client Network Utility: the alias was using Named Pipes only, so I changed it to TCP/IP, but the problem remains.
What might the problem be, and how can I see the tables in the linked server from my desktop?
Thanks
View 5 Replies
Apr 24, 2015
I am using SQL Server 2008 R2, connecting to a couple of linked servers.
I was able to connect the linked servers, but I cannot point one at a specific database, and I cannot rename a linked server.
How to point the linked server to a specific database? How to rename the Linked Server?
The following is the code that I am using right now:
USE [master]
GO
EXEC master.dbo.sp_addlinkedserver
@server = N'Machine123\Instance456',
@srvproduct=N'SQL Server' ;
GO
EXEC sp_addlinkedsrvlogin 'Machine123\Instance456', 'false', NULL, 'username', 'password'
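A hedged sketch of one way to get both (names are placeholders, and this is a suggestion, not from the thread): skip @srvproduct = 'SQL Server' in favor of an explicit provider, so @server can be any friendly name while @datasrc holds the real instance; @catalog sets the default database for the connection.
EXEC master.dbo.sp_addlinkedserver
     @server     = N'FriendlyName',         -- the name you query by
     @srvproduct = N'',
     @provider   = N'SQLNCLI10',
     @datasrc    = N'Machine123\Instance456',
     @catalog    = N'TargetDatabase'        -- default database
EXEC sp_addlinkedsrvlogin 'FriendlyName', 'false', NULL, 'username', 'password'
-- four-part names still spell out the database: FriendlyName.TargetDatabase.dbo.SomeTable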
View 6 Replies
Jul 20, 2005
Be careful when implementing views (on SQL Server 7/2K). SQL Server stores the metadata for the view at creation (or the last time it was saved). This means if you have:
SELECT * FROM table1
it will put all the fields of table1 in the view's metadata. If you then change table1 and add (for example) another field, this field will not be visible in the view until you open it in design view and click save (to update it).
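As an aside, a one-line sketch that refreshes a view's stored metadata without opening the designer (sp_refreshview ships with SQL Server 2000 and later; the view name is a placeholder):
EXEC sp_refreshview N'dbo.MyView'
-- re-derives the view's column metadata from the current base tables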
View 1 Replies
Mar 30, 2007
Hi
I added extended properties to tables in SQL Server 2005; for example, I added a Caption property to every column.
When I generate a DataSource from a table in Visual Studio 2005, is there a way for the Caption property of the DataTable to be filled from that metadata, or is the metadata useless for a DataSet?
Every time, I have to fill in the Caption property of the generated DataTable manually.
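One hedged building block (not a full answer): the captions can at least be read back in a single query, which client code could then copy onto each DataColumn's Caption (the table name is a placeholder):
SELECT objname AS column_name, value AS caption
FROM sys.fn_listextendedproperty(N'Caption', N'SCHEMA', N'dbo',
                                 N'TABLE', N'MyTable', N'COLUMN', NULL)
-- returns one row per column that carries a Caption extended property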
View 4 Replies
Jul 18, 2006
Is there a way to bypass the syntax checking when adding a stored procedure via a script?
I have a script that has a linked server reference (see below).
INSERT
INTO ACTDMSLINKED.ACTDMS.DBO.COILS ..etc.
ACTDMSLINKED does not exist at the time I need to add the stored procedure that references it.
PLEASE do not tell me to add the LINK and then run the script. That is not an option in this scenario.
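A hedged workaround sketch (my suggestion, not from the thread): deferred name resolution does not cover linked server names, but wrapping the statement in dynamic SQL postpones all checking until the procedure actually runs. The procedure name and source table are placeholders:
CREATE PROCEDURE dbo.LoadCoils
AS
-- the four-part name is only parsed when the EXEC fires, so the procedure can be created before the linked server exists
EXEC (N'INSERT INTO ACTDMSLINKED.ACTDMS.DBO.COILS SELECT * FROM dbo.CoilsStaging')
GO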
Thanks,
Terry
View 4 Replies
Oct 21, 2015
I'm curious whether there is a way to gather metadata about a query that was just run, or about a dynamic query passed in as a parameter, such as being able to "dynamically" return the list of columns in the query, along the lines of:
SELECT * FROM [myTable];
SELECT @@COLUMNS --Expecting this to return a recordset of columns for the previously run query
I'm not talking about querying metadata about the tables themselves, I'm really looking for what "metadata" might be available about the query itself.
This would be useful for dynamically generating code for ad-hoc queries from a stored procedure call, or for simply returning a list of column names without the data.
Is this at all possible in SQL Server?
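Partly, yes, on SQL Server 2012 and later; a minimal sketch using the post's own example table:
EXEC sp_describe_first_result_set @tsql = N'SELECT * FROM [myTable]'
-- returns one row per column of the query's first result set (name, system type, nullability, and so on) without executing the query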
View 1 Replies
May 26, 2015
I can find many examples of loading DBCC results into tables. They all begin with a CREATE TABLE statement defining the results. My question is: other than trial and error, is there a way to determine what data types will be returned? Sure, you can say that the first element looks like an integer, but is it really a bigint? And that text string can be varchar(max), but would char(2) work?
I'm not looking for an answer for a specific DBCC function, but rather a generic way I can determine the characteristics of any DBCC result set.
I tried
SELECT *
INTO #tmp
FROM OPENROWSET('SQLOLEDB',
'Server=ray;Trusted_Connection=Yes;Database=Ed_sandbox',
'Set FmtOnly OFF; DBCC loginfo WITH tableresults ')
but I got back
Msg 11527, Level 16, State 1, Procedure sp_describe_first_result_set, Line 1
The metadata could not be determined because statement 'DBCC loginfo WITH tableresults' does not support metadata discovery.
View 1 Replies
Jan 15, 2006
I have my SQL Server installation on drive C: and the databases on another drive. My C drive got corrupted and I am planning to restore it from an old backup. This means drive C will not be up to date.
Is there anything that SQL Server saves in the registry or in its home folder that changes frequently? I am hoping that nothing does, which means the old backup will be good enough.
Although I think the default databases are on the C drive, I will try to get them back from a recent backup.
John Dalberg
View 1 Replies
Mar 17, 2008
Hi,
Can someone tell me where I can download the SQL Server 2005 BI Metadata Samples Toolkit? It seems the old URL doesn't work right now.
Thanks in advance.
Mike
View 5 Replies
Aug 22, 2014
When I start querying SQL Server with an MDX query, I get the error 'error loading metadata: no cubes found'.
View 6 Replies
Oct 1, 2014
I am tasked with truncating and reloading tables from one server to another. Company policy prevents cross-server queries but allows SSIS packages with cross-server connections. I am doing this for about 25 tables. I have the table names in a single table and have created a Foreach Loop to execute tasks against each table one by one. It works fine to truncate all the tables. I run into issues, though, with the Data Flow Task. I'm able to tell it which server and table to dynamically connect from and to, but it doesn't know how to map the metadata. They're the exact same columns and field names in both source and destination.
View 9 Replies
Aug 18, 2015
SSIS package metadata is needed to convert an SSIS package into a stored proc.
Hundreds of table mappings reside within the SSIS package; what tool/method would extract that information so it can be copied and pasted into the SP to speed up the effort?
View 2 Replies
Jan 17, 2006
Today I downloaded the SQL Server 2005 Business Intelligence Metadata Samples Toolkit from the Microsoft download center and, per the readme.txt instructions, built the project, but there was an error as below:
Error 1 The project could not be deployed to the 'localhost' server because of the following connectivity problems : A connection cannot be made. Ensure that the server is running. To verify or update the name of the target server, right-click on the project in Solution Explorer, select Project Properties, click on the Deployment tab, and then enter the name of the server. 0 0
After that, I just built the project but didn't deploy it; then I tried to launch the
DependencyAnalyzer console application, and there was also an error there:
Error occurred: A connection cannot be made. Ensure that the server is running.
Have you had success with this Toolkit?
Thank you in advance.
View 4 Replies
Feb 17, 2011
I have multiple XML data files in a directory, say C:\XMLData: abc1.xml, abc2.xml, abc3.xml, etc.
I need to loop through each file in SSIS with a Foreach Loop container, get the file name (say abc1), and load the data of abc1.xml into the abc1 table in the SQL Server DB.
The next iteration will pick up abc2.xml, find the abc2 table in the SQL Server DB, and insert the data into the abc2 table.
On each iteration, the XML source should also point to the corresponding XSD file.
The tables are already created in the DB.
I have solved my problem up to getting the file name from each iteration and assigning it to a variable; in the OLE DB destination, for data access mode I select 'Table name or view name variable', and the corresponding table is selected for data insertion.
I just wanted to know how I can read the corresponding XSD file for each XML data file during iteration.
View 12 Replies
Feb 15, 2011
I have a scenario, need to create SQL server Tables dynamically.
I have multiple XML data files in a particular location and want to load that XML data into SQL Server tables, but the metadata of the XML data files is not all the same.
Hence the approach is:
1. Pick the first file from that location.
2. Create a table according to that XML data file's metadata.
3. Load the data into the newly created table.
4. Pick up the next XML data file.
5. Loop through until no XML data files remain at that location.
View 4 Replies
Feb 11, 1999
I need some advice on how to check the consistency of a DateTime attribute.
Some of my DateTime fields won't read as NULL and some do; for example:
Select FileNo,DateReceived,DateEntered,DateFU from RECEIVING
where DateReceived is not null and DateEntered is not null and DateFU is not null
I should get several records, but the result is zero.
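A quick diagnostic sketch (my suggestion): COUNT(column) counts only non-NULL values, so this shows at a glance how many rows survive each condition, and whether requiring all three dates to be non-NULL is what empties the result.
SELECT COUNT(*) AS total_rows,
       COUNT(DateReceived) AS nonnull_received,
       COUNT(DateEntered)  AS nonnull_entered,
       COUNT(DateFU)       AS nonnull_fu
FROM RECEIVING
-- if no single row has all three dates populated, the original AND query correctly returns nothing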
Are there any articles out there about DateTime?
I appreciate any help.
Thanks
Rey Caunca
View 1 Replies
Apr 11, 2006
Hi
I have an oddity. If I run a piece of SQL:
SELECT EmployeeNo, MailTo
FROM ST_PPS.dbo.Employee
WHERE AddedOn BETWEEN '01-jan-2006' AND '01-feb-2006'
AND MailTo NOT IN ( '3', 'x')
ORDER BY MailTo
I get the results:
EmployeeNo MailTo
----------- ------
608384 1
606135 1
608689 1
609095 1
607163 1
606165 1
606472 1
608758 1
.....for 2594 rows
If I create a stored procedure with the same SQL:
CREATE PROCEDURE dbo.PPS_test
AS
SELECT EmployeeNo, MailTo
FROM ST_PPS.dbo.Employee
WHERE AddedOn BETWEEN '01-jan-2006' AND '01-feb-2006'
AND MailTo NOT IN ( '3', 'x')
ORDER BY MailTo
GO
and run it:
EXEC PPS_test
I get three extra rows:
EmployeeNo MailTo
----------- ------
607922 NULL
606481 NULL
605599 NULL
606316 1
608871 1
607427 1
608795 1
.....for 2597 rows
Does anyone know what is happening here? It appears that the clause MailTo NOT IN ( '3', 'x') excludes NULL in raw SQL, but includes NULL (correctly I think) in a stored procedure.
Chloe Crowder
The British Library
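A hedged guess at the cause (not confirmed in the thread): the ANSI_NULLS setting is captured when a procedure is created, and with ANSI_NULLS OFF the comparisons behind NOT IN treat NULL like an ordinary value, so NULL rows pass the filter. A minimal repro sketch:
SET ANSI_NULLS OFF
GO
CREATE PROCEDURE dbo.NullDemo
AS
SELECT CASE WHEN NULL NOT IN ('3', 'x')
            THEN 'NULL passes the filter'
            ELSE 'NULL is filtered out' END
GO
EXEC dbo.NullDemo -- expected under ANSI_NULLS OFF: 'NULL passes the filter'
-- recreating the procedure after SET ANSI_NULLS ON restores the standard behavior, where NULL never satisfies NOT IN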
View 5 Replies
Jul 20, 2005
I get an error every so often with a DTS package on SQL 7. The error is as follows:
The connection is currently being used by a task. The connection cannot be closed or re-used.
This doesn't happen all the time, and I can sometimes (more often than not) get the DTS package to complete in its entirety.
To explain what the DTS package does:
Truncate tables in reporting environment (several in a batch)
Clear transaction logs
Copies data from live environment into CSV (for speed)
Copies data from CSV files into tables previously truncated
Builds up a table based on the data copied (for reporting)
Clear transaction logs
I'm using a pretty basic set up: Connection (1st DB) -> Transformation to CSV -> Transformation to Connection (2nd DB). It seems to fail on either the first or second transformation at random (?).
I've checked the transformations so that they close the connection afterwards, so in theory they should be releasing the CSV files for the next step. I suspect that there is a timing issue with this. I can copy the CSV files over, but this is a little sloppy and I would prefer not to do it.
Any ideas how to find a tidy way to ensure these are closed both before and afterwards?
Thanks
Ryan
View 2 Replies
May 5, 2008
Hi,
I am building a report with a recursive hierarchy for drill-down purposes. The hierarchy is built by querying a SSAS OLAP cube and defining a details grouping for the table/matrix.
Every time I run the report, one or more of the leaf members in the recursive hierarchy "jumps" up to the highest level. At first I thought this might be because the leaves' parents are not part of the returned dataset. However, the queries make sense, and the "offending" members never contain any data (while the query should return only non-empty members), which is why this is very strange behavior. Furthermore, the "offending" member differs between executions of the report, despite the parameters being exactly the same and the cube being untouched between executions.
I am actually pressing "View Report", waiting for the report to execute and when I press "View Report" again, the returned datasets seem to differ, yielding different "offending" members in the report.
When I run the queries individually in the Data-tab in BIDS, the returned datasets are always the same. Execution caching is turned off for the report.
Checking SSRS's ExecutionLog, the RowCount for consecutive executions with the exact same parameters differs. For example, RowCount:
3094
3080
3079
3088
3087
Why does SSRS behave so inconsistently? Any tips or tricks?
View 3 Replies
Apr 10, 2001
I am running SQL Server 7.0 on NT 4.0. I have created a simple query:
SELECT SUM(month1) As total_month1
FROM eac_manload
WHERE project_number = 8800
and dept IN (50,51,52,55,57,60,61,62,63,64,65,68,69)
The first time I run the query I get the correct result. Subsequent times that I run the query, the result is one record with a NULL value. The data has not changed. If I stop the MSSQLSERVER service and restart it, I get the correct result the first time and the NULL value each time thereafter. Anybody out there with any idea of what is going on here? Any help will be appreciated!!
View 1 Replies
Mar 2, 1999
I apologize for the length of this message, but I think I need to include all this info so that the problem is understood. I am having what appears to be a problem capturing the return code from a failed BCP.
I create a stored proc to use BCP to load a table:
create procedure sp_bcp_load as
declare @RC int
execute @RC = master..xp_cmdshell "bcp JON..W4KPV in e:\inetpub\ftproot\finres\law4kpv.g4000.data /Sdbmtss1 /m 0 /f d:\mssql\userdata\finres\W4KPV.fmt /Usa /P /e d:\mssql\userdata\finres\cp1.err /t""|"" /r""\n"""
select 'Return code from bcp = ', @RC
if @RC <> 0
BEGIN
print 'BCP Error.'
return (8)
END
GO
If I execute the SP, and encounter a transaction log full error, the return code is still zero:
1000 rows sent to SQL Server. 45000 total
1000 rows sent to SQL Server. 46000 total
Msg 1105, Level 17, State 2:
Server 'DBMTSS1', Line 1:
Can't allocate space for object 'Syslogs' in database 'Jon' because
the 'logsegment' segment is full. If you ran out of space in Syslogs,
dump the transaction log. Otherwise, use ALTER DATABASE or sp_extendsegment to increase the size of the segment.
(54 row(s) affected)
----------------------- -----------
Return code from bcp = 0
If I execute the SP again, it correctly returns a non-zero value:
Msg 1105, Level 17, State 2:
Server 'DBMTSS1', Line 1:
Can't allocate space for object 'Syslogs' in database 'Jon' because
the 'logsegment' segment is full. If you ran out of space in Syslogs, dump the transaction log. Otherwise, use ALTER DATABASE or sp_extendsegment to increase the size of the segment.
(6 row(s) affected)
----------------------- -----------
Return code from bcp = 1
(1 row(s) affected)
BCP Error.
Does anybody have an idea why this behaves this way? Any suggestions on how to trap an error on the first call?
Thanks,
Jon Carter
View 2 Replies
Jun 8, 2006
One of my developers recently installed a backup of the production database onto his test site. His test server has the same configuration as the production server.
One of the Stored Procedures that is called takes 1:45 to run on his machine, but only 2 seconds on the production server. This same SP takes only 2 seconds on my development database.
The SP is called iteratively, up to 10 times, to run against 10 separate fields. Depending on the value of a parameter called @CriteriaClassID, a different portion of the SP runs.
The significant difference in processing time in itself is baffling (since the servers are same specs / configuration, as far as I can tell, and the data is identical, since he has a backup of the most recent production data).
But more baffling: if, in his data, I switch the values from field 1 to field 2, and vice versa, his results take 2 seconds (switching the values in field 1 to field 2 switches the value in @CriteriaClassID which is passed through to this SP).
It's exactly the same SP; the only difference is that field 1 is processed first, field 2 second, field 3 third etc. On the production site and my development site, it doesn't make a difference in the order they are processed. On his machine it does.
Any ideas? I thought perhaps his indexes were corrupted in the rebuild, but we ran a SQL Server maintenance plan to clean them up, and saw no improvement.
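A hedged guess, not a diagnosis: this smells like parameter sniffing, where the plan cached for the first @CriteriaClassID path is reused for all the others. A minimal sketch of how to test that theory:
EXEC sp_recompile N'dbo.st_pull_model_data'
-- invalidates the cached plan; if the next call is fast regardless of which field goes first, plan reuse was the culprit
-- creating the procedure WITH RECOMPILE makes a fresh plan per call permanent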
This is the SP, if it is of any help:
CREATE procedure [dbo].[st_pull_model_data] @ModelID as integer, @CriteriaID as integer
as
declare @ClientID as integer, @CriteriaClassId as char(1)
/*Procedure to pull data from org_model_data and postalcode_model_data for modeling and media analysis */
/*Need to have table #temp_data created outside of SP with fields org_id and zip_code */
/*This procedure is used by SP st_model_data */
If @CriteriaID is not null
begin
set @CriteriaClassId = (Select model_criteria_type from model_criteria where model_criteria_id = @CriteriaID)
if @CriteriaClassID = 'G' -- changes client_id from specific to general, if General is required.
begin
set @ClientID = 0
end
else
begin
set @ClientID = (Select client_id from model where model_id = @ModelID)
end
If @CriteriaClassId in ('G','P')
Begin
update #temp_data
set data1 = postal_criteria_value
from #temp_data t
left outer join
(select postalcode, postal_criteria_value
from postalcode_model_data pmd
join model_org_trade_area mota on mota.zip_code = pmd.postalcode
join model_org mo on mo.model_org_id = mota.model_org_id
where model_criteria_id = @CriteriaID
and client_id = @ClientID
and mo.model_id = @ModelID) as PMD
on PMD.postalcode = t.zip_code
end
else
Begin
update #temp_data
set data1 = org_criteria_value
from #temp_data t
left outer join
(select distinct postalcode, org_criteria_value, omd.org_id
from org_model_data omd
join org o on o.org_id = omd.org_id
join model_org_trade_area mota on mota.zip_code = omd.postalcode
join model_org mo on mo.model_org_id = mota.model_org_id and mo.org_id = o.org_id
where model_criteria_id = @CriteriaID and o.client_id = @ClientID and mo.model_id = @ModelID) as OMD
on OMD.postalcode = t.zip_code and omd.org_id = t.org_id
end
end
View 5 Replies