Update An Analysis Services Cube From Stored Proc?
Jul 1, 2004
I'd like to be able to update an Analysis Services cube through a stored proc.
Currently I can:
- Make a DTS package that updates the cube
- Run xp_cmdshell, which runs dtsrun, which runs the DTS package.
That is messy, easily broken, and hard to get good error info when an error occurs. Is there a better route?
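For reference, the route described above boils down to something like this (only a sketch; the server name and package name are placeholders, not from the original post):

DECLARE @rc int;
-- dtsrun executes the saved DTS package that reprocesses the cube;
-- /S = server, /E = trusted connection, /N = package name
EXEC @rc = master.dbo.xp_cmdshell 'dtsrun /S MyServer /E /N "ProcessCubePackage"', no_output;
IF @rc <> 0
    RAISERROR('dtsrun failed with return code %d', 16, 1, @rc);

xp_cmdshell only returns a 0/1 success flag, which is part of why good error information is hard to get out of this chain.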
View 1 Replies
May 13, 2014
I have a cube that we are processing nightly via an Analysis Services Processing Task in SSIS. In order to improve processing time, we elected to use a lot of rigid dimension attributes and to do a full process of everything in the SSIS task. The issue I am having is that after that task completes, I need to go into Visual Studio to deploy the cube because we are unable to browse or use the cube. This issue seemed to start once we changed the SSIS Analysis Services Processing Task to do a full process on the dimensions, rather than an incremental one.
I would expect that once development is done, and it is processed and deployed, that is it. My thinking is that the SSIS task should just update the already deployed cube.
View 2 Replies
View Related
Apr 24, 2015
I am trying to configure the reporting for TFS using SQL Server, but I get the following error when viewing any report:
So I tried to manually process the cube to check whether it works. I am following this article: [URL] ....
When I click on GetProcessingStatus and invoke it (with the last field set to TRUE) I get the following error:
How can I resolve this issue and be able to see the reports?
View 5 Replies
View Related
Oct 5, 2015
I have been trying to get the ValueR column of the following query through MDX, but instead I am getting ValueW as the output of the MDX:
select
exp(Log(sum(MTMROR)+ 1 ))-1 as ValueW,
exp(sum(Log(MTMROR + 1)))-1 as ValueR
from
Temp_Performance
where Rundate in ('2015-03-01','2015-03-02')
The MDX written for the above query is:
With
Member [Measures].[LogValuePre]
as ([Measures].[MTMROR] + 1)
Member [Measures].[LogValuePre1]
as VBA![LOG]([Measures].[LogValuePre])
[Code] ...
The [MTMROR] measure has the aggregate function Sum. What I gather from this behavior is that MDX is aggregating the result first, and the default aggregation function is Sum. When I try to see the value at a more granular level by putting the date dimension on the rows (un-commenting the date dimension), I get the correct log and exp-log values. It shows the correct value because the date dimension is the most granular level in the fact table. When trying to get the data at a less granular level (fund level), the sum function gets applied automatically.
If I set AggregateFunction to None in the cube structure, I get null as the output.
How can I apply the log function before the sum function in the [MTMROR] measure?
View 4 Replies
View Related
Nov 2, 2015
Scenario: [**tableA**] plus [**tableDim1**] plus [**tableDim2**]. I create a DSV and a cube, and I deploy the [@@CUBE@@]... then I connect to the data using an Excel file that shows the data as a [¬¬dashboard¬¬].
It works.
My question is: after three days the [**tableA**] is populated with new rows. In order to allow my colleagues to see the new rows I deploy the [@@CUBE@@] again. My colleagues can see the new rows in the [¬¬dashboard¬¬].
It works, OK. But do I really need to deploy the [@@CUBE@@] every time, or should it be updated automatically when you, for example, refresh the data in Excel? Am I missing something?
View 3 Replies
View Related
Jun 19, 2015
We have an application that takes an existing cube, clones it and then updates it in C#.
Database dbTarget = dbSource.Clone();
dbTarget.Name = databaseName_Target;
dbTarget.ID = databaseName_Target;
dbTarget.DataSourceImpersonationInfo = new ImpersonationInfo(ImpersonationMode.ImpersonateServiceAccount);
sSAS_CalculationServer.Databases.Add(dbTarget);
dbTarget.Update(UpdateOptions.ExpandFull);
We are receiving the following error when trying to Update the cube (the last line of code): "Cannot update the 'Database' object 'DB Cube_Temp', it needs to be part of a connected Server object."
View 2 Replies
View Related
Jun 21, 2007
I'm making my first attempt at creating a cube using Analysis Services based on my existing datamart. The data source, views, and dimensions have been defined. But when it comes to deploying the cube, it gives the error "A connection cannot be made. Ensure that the server is running." The deploy target server and database are the same ones where my datamart is. Or maybe I don't know what I'm doing.
Would appreciate any suggestion for my enlightenment. Thanks
View 1 Replies
View Related
Jun 30, 2007
I set up a hierarchy in the dimensions of my cube, however when I go and look at it via ProClarity, I can't see the hierarchy there.
I have one table that has:
Category
Subcategory
Partnumber
And I have the hierarchy set from Subcategory to Partnumber.
Then I go into ProClarity and limit the category by "my catname", and I still see the 8000 part numbers in the list.
Any ideas?
View 1 Replies
View Related
Jul 20, 2005
I have a (hopefully typical) problem when it comes to cube design. We store millions of product records every year, broken down by month/quarter. Each product can be assigned to various hierarchical classification groups etc. The data in an OLTP DB occupies roughly 100G for a typical year.
We're looking at breaking this out into OLAP to provide faster access to the data in various configurations and groupings. This is not a problem, as this is the intended use for Analysis Services.
The problem is that we apply projection factors on the product prices and quantities. This would be ok if it only happened once, however, this happens every quarter (don't ask why). The projection factors change 4 times a year, and they affect all historical product records. This presents a challenge because to aggregate the data into a useful configuration in the cubes, you throw out the detail data, but this means throwing out the prices and quantities which are needed to apply the projection.
So if you have Product A at $10 and Product B at $20, and roll both up into Category X, you'll have $30, but you'll lose the ability to apply a projection factor of .5 to Product A and .78 to Product B. They're rolled up.
I don't want to regenerate the cubes every 3 months. That's absurd. But we can't live without the ability of projecting the prices/quantities on a product level (detail level). So how can this be achieved when the other cubes are created at a higher level with less detail and sums of the detail data?
My initial guess is that we have to update the product data, and then reaggregate all the other data that is built upon that product data. Is there any other way to apply math to the data on the way out?
Thanks in advance!
Regards,
Zach
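One possible approach (only a sketch, with made-up object names, not something from the original post) is to keep the projection in the relational layer: apply the current quarter's factors in a view at product level and build the cube's fact partition on that view, so a reprocess picks up each quarter's new factors without redesigning the cube.

-- Hypothetical source objects: ProductSales holds the detail rows,
-- ProjectionFactors holds the current per-product factors.
CREATE VIEW dbo.vFactProductProjected
AS
SELECT  s.ProductKey,
        s.PeriodKey,
        s.Quantity * f.QuantityFactor AS ProjectedQuantity,
        s.Price    * f.PriceFactor    AS ProjectedPrice
FROM    dbo.ProductSales      AS s
JOIN    dbo.ProjectionFactors AS f
        ON f.ProductKey = s.ProductKey;   -- factors vary per product, per the post
GO

The trade-off is that the cube still has to be reprocessed whenever the factors change, so this only helps if a quarterly reprocess is acceptable; it does not let you re-project already-aggregated cube data on the way out.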
View 2 Replies
View Related
Apr 21, 2004
Hi
This may not be possible, but still i would like to know if any of you have suggestions:
Is it possible to implement the functionality of a cube in a stored proc?
I have a table called:
Scores
Columns:
StudentID,YearID,ClassID,TestID,SubjectID,Score
Primary Key:
StudentID,YearID,ClassID,TestID,SubjectID
My requirements are:
i). For a given StudentID, YearID, ClassID, TestID, SubjectID I need to get the top 5 scorers, top 10, top 15... and so on.
ii). For a given StudentID, YearID, ClassID, TestID I need to get the top 5 scorers, top 10, top 15... and so on. In this case I need to get all SubjectIDs for each (StudentID, YearID, ClassID, TestID), sum all of them up, and then get the top 5 or 10... etc.
iii). Also imagine we need to get similar info given just the StudentID, YearID, ClassID or
StudentID, YearID.
If we write a stored procedure, we think it will affect performance severely; this is why we are thinking we need a cube... but I don't think we can get Analysis Services within our budget... If you have any suggestions on how to achieve/simplify these scenarios, please let us know. It will be really appreciated.
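For what it's worth, a hedged sketch (not from the thread) of requirement ii in plain T-SQL, using the table and key columns from the post; the literal filter values are made up for illustration:

-- Top 5 total scores for one Year/Class/Test, summed across all subjects.
SELECT TOP 5
       StudentID,
       SUM(Score) AS TotalScore
FROM   Scores
WHERE  YearID  = 2004
  AND  ClassID = 10
  AND  TestID  = 3
GROUP BY StudentID
ORDER BY SUM(Score) DESC;
-- For requirement i, also filter on SubjectID; for requirement iii, drop the
-- corresponding predicates from the WHERE clause. Change TOP 5 to TOP 10, TOP 15, etc.

With the primary key listed in the post, an index already covers these filters, so this style of query may well perform acceptably without a cube.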
Thanks.
View 1 Replies
View Related
Oct 7, 2015
I am very new to SSAS. I have two questions:
1) As per my project requirement, if the changes in the SSAS cube are approved, they should be committed back to the actual SQL Server 2012 tables. Is that possible, and if so, how?
2) For rolling back to the original data, I truncate the relevant writeback table and process the cube.
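For question 1, a hedged sketch of one way to "commit" approved writeback rows back to the relational tables. Table and column names are placeholders; SSAS creates the writeback table itself (typically named WriteTable_<partition>) and stores the approved changes as delta rows:

BEGIN TRAN;

INSERT INTO dbo.FactSales (DimKey1, DimKey2, Amount)
SELECT DimKey1, DimKey2, Amount              -- writeback rows are deltas to the measures
FROM   dbo.WriteTable_FactSales;

TRUNCATE TABLE dbo.WriteTable_FactSales;     -- same step the post already uses for rollback

COMMIT TRAN;

After the deltas are merged into the fact table and the writeback table is emptied, reprocessing the partition makes the cube read the committed values from the fact table instead of the (now empty) writeback partition.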
View 4 Replies
View Related
Dec 8, 2005
If there are any AS gurus out there, I could use some help. I've been having some problems with particular AS dimension and cubes and it's driving me crazy! It doesn't matter what I do, nothing seems to work.
Anyway, here's what I'm trying to do. I've got a fairly simple dimension. There is a date stored in the dimension that is formatted as an int. The dimension needs to only display AGE, so I cast the int as a date and do a datediff to get AGE. The dimension builds just fine and I get the results I want. My problem is when I add the dimension to the cube. The cube fails to build and I get an error message - "Data source provider error: The column prefix 'MY TABLE' does not match with a table name or alias used in the query." Basically, AS is telling me that the table that is used for the dimension doesn't exist, even though the fact and the dimension are joined properly and I've validated the structure.
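A minimal sketch of the AGE calculation being described, assuming the int is stored as yyyymmdd (the post does not say which format, table, or column names are used):

SELECT DATEDIFF(year,
                CONVERT(datetime, CAST(BirthDateInt AS char(8)), 112),  -- style 112 = yyyymmdd
                GETDATE()) AS AGE
FROM   dbo.MyDimensionTable;   -- hypothetical table/column names
-- Note: DATEDIFF(year, ...) counts year boundaries crossed, not exact birthdays.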
I've run a number of queries in QA on the two tables and everything works fine. No funky data issues. I've run the service packs a few times, but that didn't work either. I've tried making a cube that has just the fact table and the one dimension table, and it still fails.
Basically, I'm out of ideas. Any help that anyone has is greatly appreciated.
View 5 Replies
View Related
Feb 22, 2006
Hi,
I have a requirement where I have to use strings as measures. Is this possible in Analysis Services 2000?
Any help on this will be greatly appreciated
View 23 Replies
View Related
Apr 10, 2008
Hello:
How do I connect to an Analysis Services 2000 cube while in Integration Services? My idea is to execute an MDX query from within a package, and then stage the data to a SS 2005 table for a RS report to query.
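One hedged workaround sketch, rather than a native SSIS answer: run the MDX through an ad hoc MSOLAP connection from the SQL side and stage the flattened rowset into a table. Server, catalog, cube, measure, and dimension names below are placeholders, and 'Ad Hoc Distributed Queries' must be enabled on the SQL Server 2005 instance for OPENROWSET to work:

SELECT *
INTO   dbo.StagedCubeData        -- creates the staging table from the flattened MDX rowset
FROM   OPENROWSET(
           'MSOLAP',
           'DATA SOURCE=MyAS2000Server; Initial Catalog=MyOlapDb;',
           'SELECT {[Measures].[Sales Amount]} ON COLUMNS,
                   NON EMPTY [Product].[Category].Members ON ROWS
            FROM [MyCube]');
-- The RS report can then query dbo.StagedCubeData like any other relational table.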
Thanks in advance for your suggestions!
Tim
View 3 Replies
View Related
Nov 10, 2015
I am trying to incrementally update a cube to get near-real-time data for the end users. Currently we have a SQL Server Agent job that does a FullProcess on the cube. The cube consists of a single measure group, which is simply one named query containing inner joins of all the dimensions and fact tables in the underlying relational database. The end users have a lot to upload during the day and they would like us to refresh the cube (near real time) to ensure the adjustments are loaded so that they can reconcile their daily PnLs. We have a MeasureId column, which is an auto-increment column in the Measures table.
I am trying to schedule the below XMLA query in a SQL Server Agent job and schedule it to run every 15 minutes or even less (if possible). However, it does not seem to work and keeps throwing all sorts of errors.
DECLARE @LastMeasureId AS INT, @myXMLA nvarchar(max)
SELECT @LastMeasureId = "[Measures].[Maximum Measures Id]" FROM
OpenRowset(
'MSOLAP',
'DATA SOURCE=L68F728326574; Initial Catalog=GMDR;',
'SELECT NON EMPTY {[Measures].[Maximum Measures Id]} ON COLUMNS FROM [GMDR]');
[Code] ....
View 2 Replies
View Related
Aug 11, 2004
Can I create a cube file (.cub) without Microsoft Analysis Services?
Please guide me, as I am new to this field.
Thanks
Loyd
View 3 Replies
View Related
Jan 9, 2004
Hi,
I am processing one cube using the Full Process option and it's giving the following error:
Analysis Server Error: Internal error [Object does not exist] '11948'; Time: 1/8/2004 6:11:11 PM
Error(-2147221421): Internal error (Internal error [Object does not exist] '11948'); Time: 1/8/2004 6:11:11 PM
Can anyone help me with this?
View 1 Replies
View Related
Sep 5, 2006
hi,
I am trying to generate an ad hoc reporting model for an Analysis Services cube in Reporting Services.
In Management Studio I connected to Reporting Services and defined a data source pointing to the Analysis Services database.
But if I try to generate the model I get the error:
"An error occurred while connecting with the data source, or the query is invalid for the data source. (rsCannotPrepareQuery) (Report Services SOAP Proxy Source)"
Any ideas as to what may be the problem?
Thanks!
karaman
View 3 Replies
View Related
Jan 23, 2006
Hi,
I've been trying to use Analysis Services 2005 Cube as a data source, query it via MDX and then use the data returned elsewhere in SSIS.
However, I've been unable to get this working and can't find any information regarding how this can be done. Surely it should be possible when I can get this working even in Excel?
I've looked in the December edition of BOL with no luck - I have also sent feedback to BOL regarding this and have been told that "it should be possible, since there is a way to send SQL queries to AS." However, the person I was speaking with knew of no one who had actually tried this scenario and suggested posting here.
Any help as to how to get this done would be greatly appreciated.
NR
View 18 Replies
View Related
Nov 23, 2015
I know where the data files are etc, but what is the physical structure of a cube like on disk?
Logically we see the data as a star. Is the physical file akin to a star also? Or is a single file a measure group containing all the required member and measure data - thereby eliminating the need for physical join operators?
View 2 Replies
View Related
Feb 7, 2008
Mr. Pearson,
I am currently reading your article titled, "MDX in Analysis Services: Create a Cube-Based Hierarchical Picklist". This article is directly applicable to a problem we are currently trying to solve regarding ragged hierarchies as input parameters.
I have not read the entire article through but am in the process of doing so. Also, I will be trying to implement your solution.
I have one question: will SSRS support multi-selection when using a hierarchical picklist? Thanks in advance for any assistance.
View 1 Replies
View Related
Oct 19, 2007
We have set up an IS package to process an AS 2005 database (comprising cube & dimensions, etc) daily, via a SQL Server Agent job on both development and production systems. This has been working fine for months.
A new dimension was added to the cube on the development system - automatic processing via the IS package continued without issue. However, when the new dimension was added to the production system the IS package no longer processes the cube correctly. Although all appears ok (and all is present and correct in the logs), no data updates to the cube are made. Only when the cube is manually processed does the cube get updated.
Anyone got any ideas about how to get around this issue? We have created a new IS package, with a single Analysis Services Processing Task, and tried this but get the same outcome.
View 4 Replies
View Related
Jul 29, 2015
I am new to SSAS. I have a requirement to build a cube based on a SQL stored procedure. This stored procedure contains a lot of temp tables, which are aggregated as measure columns.
Initially I created views on each temp table; finally I created a view which calls something like 15 views. When I try to execute the view, it takes a long time to execute.
I tried building the cube on this view; when I try to deploy, it also takes a long time. I have waited for 2 hours and the deployment process is still going.
What I wonder is: is there any other way I can build a cube based on a SQL stored procedure?
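One commonly used workaround (a sketch with hypothetical object names, not from the thread) is to materialize the procedure's result set into a real table that the DSV and cube can read, instead of layering views over temp tables:

IF OBJECT_ID('dbo.FactFromProc', 'U') IS NOT NULL
    DROP TABLE dbo.FactFromProc;

CREATE TABLE dbo.FactFromProc
(
    DimKey1  int   NOT NULL,
    DimKey2  int   NOT NULL,
    Measure1 money NULL,
    Measure2 money NULL
);

-- The column list must match the procedure's result set.
INSERT INTO dbo.FactFromProc (DimKey1, DimKey2, Measure1, Measure2)
EXEC dbo.MyAggregationProc;   -- hypothetical procedure name

The cube then reads from dbo.FactFromProc, and the expensive procedure runs once per load rather than every time the view stack is queried or the cube is processed.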
View 2 Replies
View Related
Oct 19, 2006
Hello--
Is it possible to install/configure SQL-Server 2005 on a multi-processor machine so that the relational DB utilizes a given subset of processors while Analysis Services utilizes another subset?
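On the relational side this can be done with the affinity mask; a hedged sketch (the mask value is just an example for an 8-CPU machine):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- 15 = binary 00001111, i.e. pin the relational engine to CPUs 0-3
EXEC sp_configure 'affinity mask', 15;
RECONFIGURE;

Whether and how the Analysis Services instance can be pinned to the remaining CPUs is configured on the AS side (not through sp_configure) and depends on the version, so it is outside this sketch.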
Thanks,
- Paul
View 4 Replies
View Related
May 19, 2015
I have problems creating a cube with AMO.
I can add the cube to the database object and fill it with dimensions and a measuregroup (see code below).
If I call cube.Update() it says something like "Error in metadata manager. Cube has no measure groups." (I am getting the message in German.)
The error in Microsoft.AnalysisServices.OperationException.Results.Messages is -1055653629
I can't find any documentation about this (or any other) error code in Microsoft documentation.
Here's my Code:
Cube newCube = database.Cubes.Add("MyCube","MyCube");
newCube.Language = 1031;
newCube.Collation = "Latin1_General_CI_AS";
CubeDimension dim = newCube.Dimensions.Add("dim1","dim1","dim1");
CubeAttribute attrib = dim.Attributes.Find("dim1Attr1");
[code]....
View 2 Replies
View Related
Jul 23, 2006
I'm having a strange problem that I can't figure out. I have an SQL stored procedure that updates a small database table. When testing the Stored Procedure from the Server Explorer, it works fine. However, when I run the C# code that's supposed to use it, the data doesn't get saved. The C# code seems to run correctly and the parameters that are passed to the SP seem to be okay. No exceptions are thrown.
The C# code:
SqlConnection conn = new SqlConnection(ConfigurationManager.ConnectionStrings["touristsConnectionString"].ConnectionString);
SqlCommand cmd = new SqlCommand("fort_SaveRedirectURL", conn);
cmd.CommandType = CommandType.StoredProcedure;
Label accomIdLabel = (Label)DetailsView1.FindControl("lblID");
int accomId = Convert.ToInt32(accomIdLabel.Text);
cmd.Parameters.Add("@accomId", SqlDbType.Int).Value = accomId;
cmd.Parameters.Add("@path", SqlDbType.VarChar, 250).Value = GeneratePath();
try
{
conn.Open();
cmd.ExecuteNonQuery();
}
catch(Exception ex)
{
throw ex;
}
finally
{
conn.Close();
}
The Stored Procedure:
ALTER PROCEDURE developers.fort_SaveRedirectURL
(
@accomId int,
@path varchar(250)
)
AS
DECLARE
@enabled bit,
@oldpath varchar(250)
/* Ensure that the accommodation has been enabled */
SELECT @enabled = enabled FROM Experimental_Accommodation
WHERE Experimental_Accommodation.id = @accomId
IF (@enabled = 1)
BEGIN
/* Now check if a path already exists */
SELECT @oldpath = oldpath FROM Experimental_Adpages_Redirect
WHERE Experimental_Adpages_Redirect.accom_id = @accomId
IF @oldpath IS NULL
BEGIN
/* If Path already exists then we should keep the existing URL */
/* Otherwise, we need to insert a new one */
INSERT INTO Experimental_Adpages_Redirect
(oldpath, accom_id)
VALUES (@path,@accomId)
END
END
RETURN
View 2 Replies
View Related
Jan 12, 2005
I'm trying to run an UPDATE stored proc to allow my users to update records that are in a datagrid. What the update proc looks like is as follows:
Proc sp_UpdateRecords
@fname nvarchar(30), @lname nvarchar(30), @address1 nvarchar(50), @address2 nvarchar(50), @CITY nvarchar(33), @ST nvarchar(10), @ZIP_OUT nvarchar(5), @ZIP4_OUT nvarchar(4), @home_phone nvarchar(22), @autonumber int
AS
UPDATE NCOA20040603 SET fname=@fname, lname=@lname, address1=@address1, address2=@address2, CITY=@CITY, ST=@ST, ZIP_OUT=@ZIP_OUT, ZIP4_OUT=@ZIP4_OUT, home_phone=@home_phone
WHERE autonumber=@autonumber
The message I'm getting is as follows:
Index was out of range. Must be non-negative and less than the size of the collection. Parameter name: index
What could be the problem?
Any ideas are appreciated -- Thanks in advance
RB
View 5 Replies
View Related
Nov 6, 1998
Referencing the sample stored procedure
below that updates table 'this_table' containing
3 INT columns: 'key', 'col2', 'col3'.
How do I revise the stored procedure so that *** I do not
have to specify ALL the columns *** to update just one column?
e.g. if I want to update @col3 to 9 (exec sp_update_table, @col3 = 9),
how can I set col2 to its existing value?
create proc sp_update_table
@key int,
@col2 int,
@col3 int
as
update this_table
set col2 = @col2,
col3 = @col3
where key = @key
go
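A common pattern for this (a sketch, not taken from the thread): give the optional parameters NULL defaults and fall back to the existing column value with COALESCE, so a caller only passes the columns it wants to change.

create proc sp_update_table_optional
@key int,
@col2 int = NULL,
@col3 int = NULL
as
update this_table
set col2 = coalesce(@col2, col2),
    col3 = coalesce(@col3, col3)
where [key] = @key    -- bracketed because KEY is a reserved word
go
-- exec sp_update_table_optional @key = 1, @col3 = 9   -- col2 keeps its current value

The limitation is that a column can no longer be deliberately set to NULL, since NULL now means "leave it alone".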
View 1 Replies
View Related
Mar 22, 2008
Hello... I need to update a table, but not all fields in the stored proc will have a value (some of the fields will be NULL), so I need to avoid updating any field whose parameter has a null value. I need the syntax for an (IF) statement to optionally update that field. Something like this:
CREATE PROCEDURE [dbo].[MyUpdate]
(
@ID int,
@Field1 nvarchar(50),
@Field2 nvarchar(50)=null,
@Field3 nvarchar(50)=null,
@Field4 nvarchar(50)
)
AS
UPDATE MyTable
SET Field1 = @Field1,
if (@Field2<>null) Field2 = @Field2,
if (@Field3<>null) Field3 = @Field3,
Field4 = @Field4
WHERE ID = @ID
I saw an example before but I can't remember where.
Thank you in advance
View 4 Replies
View Related
Dec 3, 2003
I have a stored procedure that updates about a dozen columns.
I have some overloaded functions that should update different combinations of those columns - one function might update 3 columns, another 7.
Do I have to write a stored procedure for each function, or can I handle it in one stored procedure? I realise I can have default values, but the default values could overwrite actual data if the values are not supplied but have previously been written.
Many thanks for any guidance.
Simon
View 5 Replies
View Related
Jan 31, 2005
I wrote a stored proc to be implemented in a datagrid. I went and used it on a record to update some information and it updated THE WHOLE DATABASE WITH THAT ONE RECORD.
If anyone could shed some light on what I'm doing wrong, that would be great.
Here is the syntax for my stored proc.
CREATE PROC updateResults2
@id int, @address1 nvarchar (50), @address2 nvarchar (50), @CITY nvarchar (33), @ST nvarchar (10), @ZIP_OUT nvarchar (5), @ZIP4_OUT nvarchar (4), @home_phone nvarchar (22), @NEWPhone nvarchar (20)
AS
UPDATE Results
SET address1 = @address1,
address2 = @address2,
CITY = @CITY,
ST = @ST,
ZIP_OUT = @ZIP_OUT,
ZIP4_OUT = @ZIP4_OUT,
home_phone = @home_phone,
NEWPhone = @NEWPhone
GO
As said previously, it ran but it updated the WHOLE DATABASE with the same change (WHICH I DIDN'T WANT IT TO DO)!!
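For reference, the UPDATE in the proc above has no WHERE clause, so it touches every row in Results. A hedged sketch of the fix, reusing the @id parameter that is declared but never used (assuming the key column is called id):

UPDATE Results
SET address1 = @address1,
    address2 = @address2,
    CITY = @CITY,
    ST = @ST,
    ZIP_OUT = @ZIP_OUT,
    ZIP4_OUT = @ZIP4_OUT,
    home_phone = @home_phone,
    NEWPhone = @NEWPhone
WHERE id = @id   -- without this filter the update applies to the whole table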
Thanks in advance.
RB
View 3 Replies
View Related
Jul 20, 2005
Hi, I've been reading all sorts of info on the ntext field. I need this to store XML documents in SQL Server via a stored proc. Because of its size, I apparently cannot use SET (as in UPDATE), therefore I'm trying to do an INSERT of the row with this field (after deleting the old row).
CREATE PROCEDURE dbo.UpdateXmlWF
(@varWO varchar(50)
@strWF ntext
@varCust varchar(50)
@varAssy varchar(50))
AS
INSERT INTO tblWorkOrders (WorkOrder, Customer, Assy, xmlWF) VALUES
(@varWO, @varCust, @varAssy, @strWF)
I'm using MSDE so I can't tell what's wrong... it just won't save the proc.
PLEASE HELP!
Thanks, Kathy
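If the parameter list really is missing its separators (it may just be formatting lost in the archive), that alone would stop the procedure from being created. A sketch with the commas restored, otherwise unchanged:

CREATE PROCEDURE dbo.UpdateXmlWF
(
    @varWO   varchar(50),
    @strWF   ntext,
    @varCust varchar(50),
    @varAssy varchar(50)
)
AS
INSERT INTO tblWorkOrders (WorkOrder, Customer, Assy, xmlWF)
VALUES (@varWO, @varCust, @varAssy, @strWF)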
View 2 Replies
View Related
Nov 5, 2006
We just moved a database from a shared SQL 2000 server to SQL Express on a VPS server. Everything seems to work great, really no performance loss at all, except for one stored proc that is giving us problems. Its function is to update a history table and then update the production table. On the old server it would take less than a second to run this; now it takes anywhere from 45 seconds to 1 minute, or it times out. This database is used by a classic ASP web app. ANY help at all on this would be appreciated, as I am pulling my hair out trying to figure out what's wrong here. Here is the proc.
------------------------------------------ code ----------------------------------------------
set ANSI_NULLS ON
set QUOTED_IDENTIFIER ON
SET NOCOUNT ON
go
ALTER PROCEDURE [dbo].[sp_UpdatePersonalization]
@p_id nvarchar(50),
@cust_num varchar(50),
--@publication varchar(25),
@full_name varchar(500),
@email varchar(200),
@web_address varchar(300),
@include_web bit,
@ReturnAddressYN varchar(1),
@include_email bit,
@job_title varchar(500),
@company_name varchar(500),
@tagline varchar(1000),
@phone1 varchar(30),
@phone2 varchar(30),
@phone3 varchar(30),
@phone4 varchar(30),
@phoneext1 varchar(10),
@phoneext2 varchar(10),
@phoneext3 varchar(10),
@phoneext4 varchar(10),
@phonedesc1 varchar(20),
@phonedesc2 varchar(20),
@phonedesc3 varchar(20),
@phonedesc4 varchar(20),
@company_mail_address varchar(500),
@masthead varchar(250),
@verbiage_page1 varchar(1400),
@verbiage_page4 varchar(2650),
@inc_bbb bit,
@inc_ehl bit,
@inc_eho bit,
@inc_rl bit,
@p_comments varchar(500),
@p_update_flag varchar(10)
--@frequency varchar(50),
--@mail varchar(50),
--@fold varchar(50),
--@contact varchar(50),
-- do not add this on the update! @date_added,
AS
declare @last_updated datetime
select @last_updated=GETDATE()
declare @set_update bit
--if @p_update_flag='False'
--select @set_update = 1
--else
--select @set_update = 0
if @p_update_flag='False'
INSERT INTO pers_main_arch
(
p_id,
cust_num,
publication,
full_name,
email,
web_address,
include_web,
include_email,
job_title,
company_name,
tagline,
phone1,
phone2,
phone3,
phone4,
phoneext1,
phoneext2,
phoneext3,
phoneext4,
phonedesc1,
phonedesc2,
phonedesc3,
phonedesc4,
company_mail_address,
masthead,
verbiage_page1,
verbiage_page4,
inc_bbb,
inc_ehl,
inc_eho,
inc_rl,
p_comments,
frequency,
mail,
fold,
contact,
date_added,
last_updated,
start_month,
ReturnAddressYN
)
SELECT
pm.p_id,
pm.cust_num,
pm.publication,
pm.full_name,
pm.email,
pm.web_address,
pm.include_web,
pm.include_email,
pm.job_title,
pm.company_name,
pm.tagline,
pm.phone1,
pm.phone2,
pm.phone3,
pm.phone4,
pm.phoneext1,
pm.phoneext2,
pm.phoneext3,
pm.phoneext4,
pm.phonedesc1,
pm.phonedesc2,
pm.phonedesc3,
pm.phonedesc4,
pm.company_mail_address,
pm.masthead,
pm.verbiage_page1,
pm.verbiage_page4,
pm.inc_bbb,
pm.inc_ehl,
pm.inc_eho,
pm.inc_rl,
pm.p_comments,
pm.frequency,
pm.mail,
pm.fold,
pm.contact,
pm.date_added,
pm.last_updated,
pm.start_month,
pm.ReturnAddressYN
FROM pers_main pm
WHERE pm.cust_num = @cust_num
if @p_update_flag='True' OR @p_update_flag='False' OR @p_update_flag IS NULL OR @p_update_flag=''
UPDATE pers_main SET
--cust_num=@cust_num,
--publication=@publication,
full_name=@full_name,
email=@email,
web_address=@web_address,
include_web=@include_web,
include_email=@include_email,
job_title=@job_title,
company_name=@company_name,
tagline=@tagline,
phone1=@phone1,
phone2=@phone2,
phone3=@phone3,
phone4=@phone4,
phoneext1=@phoneext1,
phoneext2=@phoneext2,
phoneext3=@phoneext3,
phoneext4=@phoneext4,
phonedesc1=@phonedesc1,
phonedesc2=@phonedesc2,
phonedesc3=@phonedesc3,
phonedesc4=@phonedesc4,
company_mail_address=@company_mail_address,
masthead=@masthead,
verbiage_page1=@verbiage_page1,
verbiage_page4=@verbiage_page4,
inc_bbb=@inc_bbb,
inc_ehl=@inc_ehl,
inc_eho=@inc_eho,
inc_rl=@inc_rl,
p_comments=@p_comments,
--frequency=@frequency,
--mail=@mail,
--fold=@fold,
--contact=@contact,
--date_added,
last_updated=@last_updated,
updated_flag=1,
ReturnAddressYN=@ReturnAddressYN
WHERE cust_num=@cust_num
------------------------------------------ code ----------------------------------------------
View 1 Replies
View Related