Does SP1 Change The Way Strings Or Data Are Processed?

Nov 1, 2006

We've just patched our Dev server, and, of our 3 servers (Dev, Test, Prod), we see major changes in the output of a raw data import process that runs nightly. Each night we import tables from a Remedy helpdesk system running on Oracle and place each ticket into a row on a table, tracking the changes and history of the ticket, etc. This includes tracking when supervisor groups are changed during the course of a ticket (ie Helpdesk to Data Comms to Billing etc). Now, after SP1, the results on Dev are skewed with partial strings showing in the From and To fields, broken in odd places (like the middle of words).

Has anyone noticed any changes in the way post-SP1 SQL Server 2005 processes strings? Does it automatically trim spaces or convert NULLs, etc.?

The data import should be identical between Dev and the other servers.

View 3 Replies


Highlight First Row From The Processed Data

Nov 28, 2007



Hi,
I have a report like this:


Is there a way I can highlight only the first row and not the other rows?

It should look like this:

Texas
XYZ ABC Kingsville
PSD ATY city2
RTE RET city3


Is there a way I can do this?
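One possible approach (just a sketch, assuming the rows are detail rows in a table and using the BackgroundColor property of the row's textboxes) would be an expression like:

Code:

=IIf(RowNumber(Nothing) = 1, "Khaki", "Transparent")

If the highlight should reset for each group, RowNumber("YourGroupName") could be used instead (the group name here is just a placeholder).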

View 4 Replies View Related

Possible To Use ReportViewer To Change Connection Strings Of A Report's Datasources?

Nov 9, 2007

Is it possible to use a ReportViewer control (or something in .net) to change connection strings of a report's datasources?
The connection string comes from a shared data source, so using an expression for the connection string in the actual report is out.


Essentially, I'm looking to tack on Analysis Services cube role(s) to the connection string to change data permissions. If there's another way to do this with a RS report, I'm all ears.

View 6 Replies View Related

Reporting Services :: Refresh SSRS Report As Soon As Its SSAS Cube Data Source Is Processed?

Oct 12, 2015

How do I trigger a report refresh as soon as its underlying SSAS cube has been processed?

I want to keep the report data updated at all times, especially when this happens while a user is already browsing the report.

I don't want to set an auto refresh for the report every 5 minutes, as my cube is processed only one time during the day...

View 8 Replies View Related

Transform Large Volume Of Data. Should It Be Processed Chunk By Chunk?

Dec 16, 2007



Hi,
I have to transform about 60 million rows of data and it runs so slowly that it never finishes in my testing. Should I process it chunk by chunk? Or are there any other techniques I can use (I am using a Data Flow task)? Thanks for any advice.


View 12 Replies View Related

Concatenate Strings After Assigning Text In Place Of Bit Strings

Feb 19, 2007

I have a whole bunch of bit fields in an SQL data base, which makes it a little messy to report on.

I thought a nice idea would be to assign a text string/null value to each bit field and concatenate all of them into a result.

The basic logic goes something like this:


select case when new_accountant = 1 then 'acct/' end +

case when new_advisor = 1 then 'adv/' end +

case when new_attorney = 1 then 'atty/' end as String

from new_database

The output would be

Null, acct/, adv/, atty, acct/adv/, acct/atty/... acct/adv/atty/

So far, nothing I have tried has worked.
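For what it's worth, a sketch of a statement that should produce exactly that output, assuming the same column names: the ELSE '' keeps a single NULL branch from nulling out the whole concatenation, and the outer NULLIF turns an all-zero row back into NULL.

Code:

select nullif(
         case when new_accountant = 1 then 'acct/' else '' end +
         case when new_advisor    = 1 then 'adv/'  else '' end +
         case when new_attorney   = 1 then 'atty/' else '' end, '') as String
from new_database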

Any ideas?

View 2 Replies View Related

Connection Strings/Data Sources

Jan 3, 2005

Hey all,

I'm coming from ASP and I used to have a global connection string accessible to all of my ASP pages that I'd use for all of my data access. This was really convenient because I could easily switch to a backup or local data source for testing/debug by changing the connection string in one place.

Now I'm using ASP.NET with Web Matrix, and I love the drag and drop functionality, but it's dropping my connection string all over the place. How can I do this and keep my connection string in one spot, so I have the same convenience of switching data sources?

Thanks in advance!
Larry

View 3 Replies View Related

How To Select Data From Text Strings

Nov 10, 2007

Hi everyone,

You have helped me resolved my previous problem with the LIKE statement, and now I'm running into this TEXT STRING problem that I desperately need your help and guidance again.

The following is the set of various descriptions in a PRODUCT_DESC field. I need to be able to calculate the square meters of these products. As you can see, I need to be able to extract the part in the middle (like 2X60YD, etc.) for each record and perform some sort of calculation and conversion. The problem is that I can't find a way to do this effectively. Can someone please help me? Thanks. (A rough extraction sketch follows the sample data below.)

PG21..181 MASKING TAPE 2X60YD 7.3MIL IPG PREMIUM
CH PG21..181 MASKING TAPE 2X60YD 7.3MIL IPG PREMIUM
PG5...130 MASKING TAPE 2X60YD 6.4MIL IPG PREMIUM
PG21..179 MASKING TAPE 24MMX55 7.3MIL IPG PREMIUM
PG21..179 MASKING TAPE 24MMX55 7.3MIL IPG PREMIUM
PG21..181 MASKING TAPE 2X60YD 7.3MIL IPG PREMIUM
PG5...130 MASKING TAPE 2X60YD 6.4MIL IPG PREMIUM
PG21..179 MASKING TAPE 24MMX55 7.3MIL IPG PREMIUM
PG21..181 MASKING TAPE 2X60YD 7.3MIL IPG PREMIUM
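A rough extraction sketch, assuming the size token is the one word containing an 'X' followed by a digit (2X60YD, 24MMX55, ...), that it is always preceded by a space, and a made-up table name:

Code:

SELECT d.PRODUCT_DESC,
       SUBSTRING(d.PRODUCT_DESC, w.word_start,
                 CHARINDEX(' ', d.PRODUCT_DESC + ' ', x.x_pos) - w.word_start) AS size_token
FROM dbo.PRODUCTS AS d                                                    -- hypothetical table name
CROSS APPLY (SELECT PATINDEX('%X[0-9]%', d.PRODUCT_DESC) AS x_pos) AS x   -- position of the X in 2X60YD / 24MMX55
CROSS APPLY (SELECT x.x_pos - CHARINDEX(' ', REVERSE(LEFT(d.PRODUCT_DESC, x.x_pos))) + 2 AS word_start) AS w
WHERE x.x_pos > 0

From there the token can be split on the 'X' and converted to square meters with whatever unit logic applies (YD vs MM, etc.).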

View 10 Replies View Related

Data Flow Task SQL Strings

Jan 11, 2006

Hi,

I just wanna ask:

I'm creating an SSIS package, a Data Flow Task. I have used OLEDB Source connected to a SQL Server Destination. Now in my OLEDB Source, I have this SQL statement

SELECT FirstName, LastName, Age FROM Employees WHERE (Age > 10) AND (Age < 95)

But what I want is to have the last name and first name concatenated and in proper case (capitalize the first letter of the first name and surname). I also want to TRIM, i.e. remove, the blank spaces from the fields in my SQL statement. How would I be able to do this?

I tried using proper(), trim() and ucase() like in MS Access, but with no success.
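In case it's useful, a rough T-SQL equivalent (a sketch only; there is no built-in PROPER() in T-SQL, and this assumes single-word first and last names):

Code:

SELECT UPPER(LEFT(LTRIM(RTRIM(FirstName)), 1))
         + LOWER(SUBSTRING(LTRIM(RTRIM(FirstName)), 2, LEN(FirstName)))
       + ' '
       + UPPER(LEFT(LTRIM(RTRIM(LastName)), 1))
         + LOWER(SUBSTRING(LTRIM(RTRIM(LastName)), 2, LEN(LastName))) AS FullName,
       Age
FROM Employees
WHERE (Age > 10) AND (Age < 95)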

Please help. Thanks in advance.

 

View 2 Replies View Related

SQL Data Source Not Showing Any Connection Strings

Mar 21, 2007

Everything is set up the way it should be, as far as we can tell.
However, when we drop in a SQL data source and try to configure it, we
get a blank box for choosing the connection string, and it won't let us
make a new one.

Has anyone encountered this before? If so, how can we fix it?

View 1 Replies View Related

Database Data Type Design (Strings Vs. 1's And 0's)

Mar 17, 2008

Hello guys, I am a hobbyist programmer and now that I have started ASP.NET, I was wondering about the correct way to enter data into a table. For example, I have a table called players with a field called status. What should I code the status as? "Active" and "Retired", or 1 and 0? I thought using 1 and 0 would be better as far as database size goes, but then it is pretty difficult to understand how to modify the GridView. I come from PHP and everything is a lot more accessible there as far as modification of output. Thanks in advance!
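For what it's worth, one common compromise (a sketch; the column names below are just for illustration) is to store a compact bit and translate it to text in the query that feeds the GridView:

Code:

CREATE TABLE players
(
    player_id   int IDENTITY(1,1) PRIMARY KEY,
    player_name nvarchar(100) NOT NULL,
    is_active   bit NOT NULL DEFAULT (1)   -- 1 = Active, 0 = Retired
)

SELECT player_name,
       CASE WHEN is_active = 1 THEN 'Active' ELSE 'Retired' END AS status
FROM players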

View 3 Replies View Related

Altering Strings Depending On Data In A Table.

Sep 10, 2007

Hi Guys,

I'm wondering if an idea I'm playing with is feasible and if so, how you would recommend implementing it.

Let's say I have a Dictionary table, 2 columns:

Word | Definition

And I have a string - "The cat sat on the dog"

If there's a definition for "cat" in the dictionary table, I want to alter the string so it becomes "The >>cat<< sat on the dog"

At the same time, if there's also a definition for "dog" then my string now becomes "The >>Cat<< sat on the >>Dog<<"

The idea being that when I manipulate the data in my ASP I can replace() the >><< with specific HTML code. (I'm trying to recreate the "in text" advertising thing that lots of people seem to be using - but not doing adverts, just information for our users - someone hovers over a highlighted word, and with a little bit of Ajax, I can pull the definition out.)
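A rough sketch of the store-time approach, using the Dictionary table from above (note that multi-row variable assignment like this is an undocumented-but-common trick; a WHILE loop or cursor over Dictionary is the belt-and-braces version):

Code:

DECLARE @text nvarchar(max)
SET @text = 'The cat sat on the dog'

-- wrap every dictionary word found in the text in >> <<
SELECT @text = REPLACE(@text, d.Word, '>>' + d.Word + '<<')
FROM Dictionary AS d

SELECT @text   -- The >>cat<< sat on the >>dog<<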

I'm not sure (but I suspect) that it would make more sense to do this as I'm storing the string in a table, rather than as I'm pulling it out ready for use (don't want to be slowing my end users down).

Any ideas?

Thanks in advance
-Craig

View 4 Replies View Related

AS 400 Data Being Interpreted As Unicode Strings (DT_WSTR) Datatype

Jun 23, 2006

I am building a data warehouse. Some of the data comes from an AS 400 ERP system. I used the OLEDB connector when first pulling the data into SQL Server, doing the simple import-data-from-table option. That worked great for getting the initial data load into SQL Server and creating the base SQL Server tables, although it was excruciatingly slow (that was probably due to the transport from the AS 400).

Now, I need to get new records that are added to the AS400 side of things on a daily basis. For that, I was trying to use the OLEDB AS400 connector. However, I found that the OLEDB connector wouldn't work when I was trying to specify an SQL Statement for what to get; i.e., a simple query like Select * from TWLDAT.STKT where BYSDAT >= '2005-01-27' would simply not work. Found articles here explaining that it is probably a problem on the AS400 side of things and where people recommended using an ADO ODBC data reader source for this type of thing. So, I'm trying to implement that. However, I have a huge problem with it.

The original tables that got created were mapped to use NVARCHAR fields for character data. When the ADO ODBC data reader source accesses the AS400 data, it insists on interpreting the string type fields as being unicode strings and giving it a data type of DT_WSTR when what I need it to have is a plain old DT_STR data type. When the strings are interpreted as unicode strings, they cannot be converted in a way that allows the NVARCHAR fields to be filled with the data. The exact error message I get for all the fields that should wind up being nvarchar fields is as follows:

Column "BYStOK" cannot convert between unicode and non-unicode string data types.

Okay, so I try to change the data types in the ADO ODBC data reader to be plain DT_STR data types and I cannot do so.

Does anyone have any idea why the ADO ODBC data reader source insists on interpreting the string data coming from the AS 400 as unicode string data or why it refuses to allow that to be changed to DT_STR data type?
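For what it's worth, one workaround (a sketch only, using the column from the error above; the length and code page are assumptions) is to leave the DataReader output as DT_WSTR and add a Derived Column or Data Conversion transformation before the destination that casts each column explicitly, e.g.:

Code:

(DT_STR, 30, 1252)BYStOK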

Thanks in advance for any info. By the way, if there is a better way than the ADO ODBC data source to get at this data when I need to specify an SQL command, I would love to hear about it. Not wild about using ODBC in the OLEDB age.



Steve Wells

View 9 Replies View Related

Foxpro .net Odbc Data Provider Misidentifying Strings As Unicode

Nov 23, 2006

Since my foxpro OLE driver has been rendered useless by service pack 1 for sql server 2005 I am forced to use the .net data provider for odbc.

I am importing a number of tables. Each time I add the DataReader Source to the data flow and connect it to the OLE DB Destination, I get a load of the good old "cannot convert between unicode and non-unicode string data types" errors...

So I'm having to do derived column transforms for each and every column that it coughs up on,

using expressions like (DT_STR,30,1252)receivedby to convert the "recievedby" column to a DT_STR.


Some of these tables have 100 string columns.. so I'm getting a bit sick of the drudgery of adding all these derivations...

Is there any way to tell this provider to stop deciding that the strings in the foxpro tables are unicode?

Thanks
PJ

View 3 Replies View Related

SCD Type 2 Problem With Data Having Empty Strings In Business Keys

Aug 24, 2007

I have data where there are empty strings in the business keys that should be used for a slowly changing dimension type 2. How do I overcome this? Due to the empty strings I am getting new rows even though the rows haven't really changed.


Example of the data (name and salary are the business keys):

name salary age address
dev 23 klddldldlk
sdfg 24 34 kdlddlkd



When the same data is given as input, the row
dev 23 klddldldlk
comes in as a new row where it already exists. How do I overcome this?
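One possible workaround (a sketch, assuming the business key columns are character columns and normalizing them in the source query that feeds the SCD component, so the incoming row and the dimension lookup compare equal):

Code:

SELECT CASE WHEN LTRIM(RTRIM(name))   = '' THEN '(none)' ELSE name   END AS name,
       CASE WHEN LTRIM(RTRIM(salary)) = '' THEN '(none)' ELSE salary END AS salary,
       age,
       address
FROM dbo.SourceTable   -- hypothetical source table name

The same normalization would need to be applied when the dimension is first loaded, so existing rows carry the same placeholder.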

View 4 Replies View Related

Date Parameters Show As Strings When Using Analysis Services As A Data Source

Jun 7, 2007

We have been a Crystal shop for ages; we are currently doing a proof-of-concept for a conversion to MS Reporting Services. As such, we are developing some Analysis Services 2005 cubes to drive some new SSRS reports, which our users will access through Report Manager. Unfortunately, we are all MDX noobs here, so we are making heavy use of the Wizards until we can come up to speed.



The problem we are running into is when we develop a report with Date Parameters. When we deploy this report, the date parameter box is a dropdown box instead of a date picker. I've seen a couple of other posts on this topic, but when I try to apply the fixes mentioned in them, I throw errors.



I have two quick questions:

Why does this happen? Is it a limitation in the MDX language, in SSAS, or SSRS? Are there any planned fixes?
Can someone please show me how to fix this on my actual query string for one of our basic reports? I've highlighted the date parameters.




Code Snippet

SELECT
    NON EMPTY { [Measures].[Lead] } ON COLUMNS,
    NON EMPTY { ( [Store].[Store ID].[Store ID].ALLMEMBERS ) }
        DIMENSION PROPERTIES MEMBER_CAPTION, MEMBER_UNIQUE_NAME ON ROWS
FROM (
    SELECT ( STRTOSET(@LeadSourceTypeLeadSourceType, CONSTRAINED) ) ON COLUMNS
    FROM (
        SELECT ( STRTOSET(@StoreStoreID, CONSTRAINED) ) ON COLUMNS
        FROM (
            SELECT ( STRTOMEMBER(@FromLeadCreationDateCalendarDate, CONSTRAINED)
                     : STRTOMEMBER(@ToLeadCreationDateCalendarDate, CONSTRAINED) ) ON COLUMNS
            FROM [Referral Leads]
        )
    )
)
WHERE (
    IIF( STRTOSET(@LeadSourceTypeLeadSourceType, CONSTRAINED).Count = 1,
         STRTOSET(@LeadSourceTypeLeadSourceType, CONSTRAINED),
         [Lead Source Type].[Lead Source Type].currentmember )
)
CELL PROPERTIES VALUE, BACK_COLOR, FORE_COLOR, FORMATTED_VALUE, FORMAT_STRING, FONT_NAME, FONT_SIZE, FONT_FLAGS
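For reference, the workaround usually suggested (a sketch only; the hierarchy name and member key format below are assumptions and must match the cube's date attribute) is to change the report parameter's data type to DateTime, mark it as non-queried, and build the member unique name in the dataset parameter expression, e.g.:

Code:

="[Lead Creation Date].[Calendar Date].&[" & Format(Parameters!FromLeadCreationDateCalendarDate.Value, "yyyy-MM-ddT00:00:00") & "]"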



I'm afraid, given my user community, that if I can't get the date picker to work properly, it could be a deal breaker.



Thanks very much in advance for your help.



Regards,



Steve

View 7 Replies View Related

Processed Files

Jul 4, 2006

Hi ppl

I have a confusing situation that I CAN resolve with cursors, BUT cursors are far too performance hungry to be run.

Tables: LogFileIn, LogFileOut,

I want to look in LogFileIn (LFI) for files that have a NULL LogFileOut (the response to the LFI) AND don't have any other value.

A file can have multiple LFO's depending on how many times that file gets processed.

It is confusing but bear with me.


Code:


select * from LogFileIn where IdentityFileChar = 'BPRTFILE'
and responseLogFileOutId is null



This does pull out a list of files that haven't been processed, BUT that same file might have been processed again and succeeded. Therefore the results I get aren't accurate.

With the cursor I would have taken the filename from the LFI and used that to check whether that filename comes up again in the LFO as processed.

Did you get that? And can you help?? I'm confusing myself!

This table records every time a file is run, so there might be 20 identical LFI's all with different LFO's... I'm looking for LFI's that have NULL, and only NULL.
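A set-based sketch of what the cursor would do (the FileName column is a guess at whatever column identifies the file): keep only the LFI rows whose file never shows up with a non-NULL LFO.

Code:

select lfi.*
from LogFileIn as lfi
where lfi.IdentityFileChar = 'BPRTFILE'
  and lfi.responseLogFileOutId is null
  and not exists (select 1
                  from LogFileIn as other
                  where other.FileName = lfi.FileName          -- hypothetical file-name column
                    and other.responseLogFileOutId is not null)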

View 1 Replies View Related

SQL Query Not Being Processed By SQL Server

Oct 24, 2001

Hello all,
I have a query that keeps getting errors from our SQL Server. Can anyone see why?

select job.key,job.job,job.item,job.job-date
From job
where job.job-date > (dateadd (year,-1,job-date))
order by job.job, job.item, job.key



I have tried bracketing the job-date but I still get errors saying it is
unable to understand the query. Thank you for your help in advance.
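For what it's worth, a sketch of the same query with the awkward identifiers bracketed everywhere they appear (key is a reserved word, and an unbracketed job-date gets parsed as job minus date):

Code:

select job.[key], job.job, job.item, job.[job-date]
From job
where job.[job-date] > (dateadd (year,-1,job.[job-date]))
order by job.job, job.item, job.[key]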

View 1 Replies View Related

Order For Conditions To Be Processed

Mar 3, 2004

What is the order in which conditions are processed for a SQL query? I.e., for

select * from table1, table2 where cond1 and cond2 and cond3

which condition will be processed first? (For optimization purposes, should the condition cutting down the maximum number of rows be placed first or last?)

View 3 Replies View Related

Done: 1 Processed Of 1 Total; 1 Errors.

Apr 18, 2008

I'm working with SQL Server Reporting Services (SQL Server 2005)
I tried to configure a Data-Driven Subscription but the run failed with the error:
Done: 1 processed of 1 total; 1 errors.

I cannot find anything in the error log ReportServerService__<datetime>.log located at <Program Files>\Microsoft SQL Server\MSSQL.#\Reporting Services\LogFiles.

How can I find useful information about the error message?

View 1 Replies View Related

Integration Services :: Replace Blank Strings Values To NULL And Convert To Integer Data Type

Oct 5, 2015

I need to convert a string column to integer. Before converting, I need to check if it has blank values and, if so, convert them to NULL. Someone told me that it's easier to convert to NULL before converting to integer.
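A minimal sketch of that order of operations in T-SQL (the column and table names are placeholders): trim, turn blanks into NULL with NULLIF, then cast.

Code:

SELECT CAST(NULLIF(LTRIM(RTRIM(MyStringColumn)), '') AS int) AS MyIntColumn
FROM dbo.MySourceTable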

View 5 Replies View Related

How To Get Last Processed Date Of Stored Procedure

Jul 30, 2007

Hello everybody

Help me to get the last processed date of a stored procedure.

View 6 Replies View Related

Query Could Not Be Processed. Access Is Forbidden

Jul 20, 2005

I am using http to access my cube (MS Analysis Server) and there are users who are getting "Query could not be processed. Access is forbidden" intermittently. Can someone help out and let me know what is causing this issue? Some users are not getting this error.

View 3 Replies View Related

Received Message Constantly Processed.

May 21, 2007

Hello,

Just when it seemed that everything was working, some weird behaviour has come out.


I'll try to summarize the problem without posting the complete code.

Service Broker is set to have a dialog between two databases on the same SQL Server instance.



The Initiator queue has retention=on and there is an activation SP to handle errors and the Target's end dialog message.

The Target queue has retention=off, MAX_QUEUE_READERS = 1, and there is an activation SP to receive the message (WAITFOR (RECEIVE TOP(1) ...), TIMEOUT 30000) and do something with it (e.g. insert into a DB).



The conversation has a Timeout Dialog to end the dialog after a while.


The problem is that the message is constantly processed. The process doesn't stop even if I end the dialog after the processing.

N.B. the RECEIVE is within a transaction that I commit at the end.



Some other information that I found out in the meanwhile:

This was my complete WAITFOR (RECEIVE:

WAITFOR ( RECEIVE top(1) -- just handle one message at a time
@message_type=message_type_id, --the type of message received
@messagetypename=message_type_name,
@message_body=message_body, -- the message contents
@dialog = conversation_handle -- the identifier of the dialog this message was received on
FROM [TargetQueue]
), timeout 1000;

if (@@ROWCOUNT = 0)
BEGIN
COMMIT;
BREAK;
END

If I delete TIMEOUT 1000, everything works as expected...
Inside the if (@@ROWCOUNT = 0) BEGIN..END I also wrote an insert into a table to see whether the end of the queue was reached, but this insert never occurs (neither with nor without the timeout).
I'm happy that it works, but if this is the solution, it makes no sense to me!

Any ideas?
Thank you!


M.B.






View 9 Replies View Related

Need To Get The Last Processed Cube Date In SSIS

Apr 15, 2008



Hi All,

I have a scenario where I need to run the scheduled package every 10 minutes. Let me give some examples. Once my ETL process is done, my SS_Batch table gets a row inserted. Here I have two columns, SS_Batch_CD and SS_Create_TS. Once the ETL process runs successfully, the SS_Batch_CD column will have the value 'C', which means 'Completed'. Similarly, when the ETL process fails, the SS_Batch_CD column will have the value 'F', which means 'Failed'. And when the ETL process is in progress, the SS_Batch_CD column will have the value 'P', which means 'in Progress'. The SS_Create_TS column will have a date like '2008-04-15'.

Note: this SS_Batch table gets a row inserted only once, when the ETL job is over (whether the job ran successfully, failed, or is in progress), along with the date.

Actually I need to run the package every 10 minutes because I don't know when the cube was last processed. If I can get this last processed cube date, then I can check it against the SS_Create_TS column and SS_Batch_CD in the SS_Batch table. Say, if the last processed cube date is greater than SS_Create_TS, then I can refresh or process the cube. This is the validation I need to do to achieve my goal.
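One option (a sketch; it assumes a connection to the Analysis Services instance and that the schema rowset DMV is available on your version) is to read the cube's last processed time directly, e.g. from an Execute SQL task or data flow source running:

Code:

SELECT CUBE_NAME, LAST_DATA_UPDATE
FROM $SYSTEM.MDSCHEMA_CUBES
WHERE CUBE_NAME = 'YourCube'   -- hypothetical cube name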

What control flow tasks do we need in SSIS to achieve this scenario? Please explain briefly how to solve this problem.

Please advise on this.

Thanks in advance.
Anand Rajagopal.

View 4 Replies View Related

Display Number Of Records Processed

Aug 20, 2007

I've got a stored procedure that processes a TON of records...

What I would like to do is write a row to a "progress" table which shows how many rows have actually been processed.

I have a simple counter defined in the procedure:

SET @COUNTER = 0

Each time the procedure loops through, it increments by 1:

SET @COUNTER = @COUNTER + 1

What I would like to do is write rows to a PROCESS table which would read:

PROCESSED 1000 rows
PROCESSED 2000 rows
PROCESSED ...... rows
etc.

I have a slight idea how to pull this off, but I'm not sure about the whole even-number-by-1000 thing.
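A rough sketch of the even-thousand part (the progress table name and column are made up): use the modulo operator so the insert only fires on every 1000th loop.

Code:

SET @COUNTER = @COUNTER + 1

IF @COUNTER % 1000 = 0
BEGIN
    INSERT INTO PROGRESS_LOG (message)   -- hypothetical progress table
    VALUES ('PROCESSED ' + CAST(@COUNTER AS varchar(20)) + ' rows')
END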

If anyone has any insight it would be greatly appreciated!!

Thanks in advance!

View 3 Replies View Related

How To Stop The Dts Package In SSIS Once The Cube Is Processed

Apr 17, 2008

Hi All,
I have some questions about stopping my package once the cube is refreshed or processed.

Below are the steps/data flow transformations in my package:

1. Data Flow Task

2. Script Task 1- I have written code for getting the last processed cube (global variable has been declared for Last processed cube date - lastProcessedCube)

3. Script Task 2 - I have written code for SS_Batch table where i can get Create_Ts date that is assigned to another global variable - create_ts.

4. Analysis Processing Task.

Between Script Task 2 and the Analysis Services Processing Task I have set @lastProcessedCube > @create_ts as the Expression and Constraint in the Precedence Constraint Editor.

Actually I need to run the package every 10 minutes, which I can do with a job schedule, and I need to refresh or process the cube daily. Is there any way to stop the package once my cube has been processed for the day, and then start the package again the next day? Is it possible to do this? Please let me know.

Thanks in Advance,
Anand Rajagopal

View 14 Replies View Related

SQL Server Admin 2014 :: Change Data Capture(CDC) For Data Warehouse / Reporting?

Aug 12, 2015

I have a requirement to implement CDC for 50+ tables to capture incremental data changes for warehousing/reporting rather than exporting the whole table data. The largest table has more than half a billion records.

The warehouse uses a daily copy of the OLTP db (daily DB refresh). How can I accomplish this? Is there a downside to implementing CDC just for the sake of taking incremental changes on the tables?

Is there any performance impact if we enable CDC on the OLTP db?

Can we make use of the CDC tables in the environment where we do the daily db refresh, so that the queries don't hit the OLTP database?

What is the best way to implement CDC to take incremental changes for reporting?
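For reference, enabling CDC itself is a per-database, per-table call; a minimal sketch (the database, schema and table names are placeholders):

Code:

USE MyOltpDb;
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
     @source_schema        = N'dbo',
     @source_name          = N'MyLargeTable',   -- hypothetical table name
     @role_name            = NULL,
     @supports_net_changes = 1;                 -- requires a primary key or unique index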

View 0 Replies View Related

SQL Security :: Making Data Change In Read Only Database Without Letting Other Users Update Data

Aug 6, 2015

I want to make data changes in a read_only database; that's why I must set the database to read_write. While the database is in read_write mode, I want to be sure that no one else makes changes in the database.

For this aim, I wrote the code below, but I suspect that between setting the database to read_write and setting it to single_user, it may still be possible for another user to run DML. Is the code below enough for this operation, or is there another way?

Reminder: a read_only database cannot be set to single_user mode. That's why you must first set the database to read_write.

The code;

use master
alter database xxx set read_write
with rollback immediate
alter database xxx set single_user
with rollback immediate

use xxx
update  tablexxx set columnxxx=yyy
use master
alter database xxx set read_only
with rollback immediate
alter database xxx set multi_user
with rollback immediate

View 5 Replies View Related

SQL 2012 :: CDC (Change Data Capture) Is Not Capturing Data Correctly

Apr 21, 2014

I am using SQL Server 2012 and to me a part of data captured by CDC is not making sense.

I have a table called 'Schema.Table1', and I enabled CDC on it by running 'sys.sp_cdc_enable_table'. I see that a table called 'cdc.Schema_Table1_CT' got created, which now gets an entry whenever I insert, update or delete a record in the original table.

Till this point everything works fine.

My original table has a NOT NULL INT column called 'AuditTrackerUserID' with a default value of 1996. My application does not provide a value for this column, but because the column itself has a default value, records get inserted without error.

When I try to execute the following Query I see multiple records with __$operation of 3 and 1.

SELECT * from cdc.Schema_Table1_CT where AuditTrackerUserID IS NULL

My expectation is that I should not ever see any record returned by this query because AuditTrackerUserID is a not null column, but I do.

View 2 Replies View Related

SQL Server 2008 :: Error - Database Page Could Not Be Processed

Oct 16, 2015

I have moved some DBs from a physical server to a virtual server while the physical server is being upgraded. This morning when I checked to make sure all the jobs are running fine, I saw that a backup job had failed for one of the databases. The error was:

Executed as user: NT AUTHORITY\SYSTEM. BACKUP 'database 1' detected an error on page (1:1400911) in file 'D:\Data\database 1.mdf'. [SQLSTATE 42000] (Error 3043) BACKUP DATABASE is terminating abnormally. [SQLSTATE 42000] (Error 3013). The step failed.

Then I ran a DBCC check db command and found the following errors:

Msg 8928, Level 16, State 1, Line 1
Msg 8939, Level 16, State 98, Line 1
Msg 8976, Level 16, State 1, Line 1
Msg 8978, Level 16, State 1, Line 1
Msg 8939, Level 16, State 98, Line 1
Msg 8978, Level 16, State 1, Line 1

The messages vary from 'data from one page cannot be referenced on another' and 'data page cannot be processed' to 'page was not seen in the scan although its parent was', etc.

View 9 Replies View Related

Query Analyzer : Losing The ___ Rows Processed Message

Jul 20, 2005

Hi;

I have been writing a lot of quick and dirty T-SQL scripts to correct database issues. I've been saving the output of the scripts for future records. I have been executing them through Query Analyzer.

I find the messages Query Analyzer puts out obscure my output, i.e. "6 rows processed". Is there any way to shut this off?

Steve
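If it's the "N rows affected" messages, the usual switch (a sketch) is to turn them off at the top of each script:

Code:

SET NOCOUNT ON;

-- ... quick and dirty fix-up statements ...

SET NOCOUNT OFF;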

View 3 Replies View Related

Error With Analysis Service 2005: The Query Could Not Be Processed

May 22, 2008




I have created a cube with Analysis Services 2005. I then publish the pivot table (generated in Microsoft Excel 2003) as a web page. When I view the web page on the domain/LAN everything works properly. But when I view the web page over the Internet I get the error:



The query could not be processed:

* An error was encountered in the transport layer

* The peer prematurely closed the connection.



The web page is published on Internet Information Server. I changed the security settings of the website directory, but the error persists.

The firewall of the machine is disabled.

Ports 2382 and 2883 are allowed.

The components needed to query the cube are installed on the machine from which the cube is being viewed.

All users are allowed on the Analysis Services role.


What other type of restriction might there be?


Thanks...!!!

View 4 Replies View Related






