SQL Server For Data Processing

Jul 23, 2005

I'm working on a system right now where I have a database (two,
actually, but one is discarded halfway through), but it's created
and used as part of a process (reporting), rather than as the
actual production data repository. I may be keeping the database
permanently, but it would be completely read-only; once the
process is complete, the database will not change again. This has
me wanting to do a few things that are rather foreign to my usual
experience, and I don't know how many of them are supported.

In several cases, I'm summarizing one table into another by several
fields, and then updating the original table with an ID for the
summary row each source row was summarized into (e.g., I summarize
PlaceAndProductSummary into PlaceSummary, and then populate
PlaceSummaryID in PlaceAndProductSummary). The update of the
source table is much faster if the summary table has a clustered
index on the summarized fields, but all later access will be faster
if the clustered index is on the identity column. I've been
including an ORDER BY the summarized fields in the original insert,
so the identity column is in the same order as the summarized fields,
but I don't know of any way to take advantage of that in the
indexing declarations.
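A minimal T-SQL sketch of the summarize-then-backfill pattern described above, using the PlaceSummary / PlaceAndProductSummary names from the post (the column names are only guesses):

-- Build the summary rows; the ORDER BY makes the identity values follow the summarized field.
INSERT INTO dbo.PlaceSummary (PlaceID, TotalAmount)
SELECT PlaceID, SUM(Amount)
FROM dbo.PlaceAndProductSummary
GROUP BY PlaceID
ORDER BY PlaceID

-- Backfill each source row with the ID of the summary row it rolled up into;
-- this join is what benefits from a clustered index on PlaceSummary(PlaceID).
UPDATE src
SET src.PlaceSummaryID = s.PlaceSummaryID
FROM dbo.PlaceAndProductSummary AS src
JOIN dbo.PlaceSummary AS s ON s.PlaceID = src.PlaceID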

As another approach to the above situation, if I change the
clustered index on a table, and the rows happen to be in the
same order by both indexes, will the table still get rebuilt?
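For reference, the index change described would typically be done roughly like this (the index names are placeholders); I don't know of a way to tell the engine that the data is already in the new order:

-- Drop the clustered index that was built on the summarized fields...
DROP INDEX PlaceSummary.IX_PlaceSummary_Fields

-- ...and recreate the clustered index on the identity column instead.
CREATE UNIQUE CLUSTERED INDEX IX_PlaceSummary_Identity
ON dbo.PlaceSummary (PlaceSummaryID)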

I will never do a roll-back in the process; if an action fails, I
want to raise an error and halt (and I haven't lost any data).
Is there any way to completely turn off logging?

Will I gain anything by marking the database as single-user?

Any indexes that I am not using while I populate the tables, I'm
adding at the end with FillFactor 100, to keep any slack out.
Is there a way to remove all the slack from everything else, at
the end of the process? During a backup operation would be fine.
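To make those last questions concrete, here is a hedged sketch of the kinds of statements involved (the database name is a placeholder). As far as I know, logging cannot be switched off entirely, so the SIMPLE recovery model is shown as the closest option:

-- Closest thing to "no logging": keep the log from growing with the SIMPLE recovery model.
ALTER DATABASE ReportingDb SET RECOVERY SIMPLE

-- Keep other connections out while the build process runs.
ALTER DATABASE ReportingDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE

-- After the process: rebuild an index completely full, then reclaim the unused space.
DBCC DBREINDEX ('dbo.PlaceSummary', '', 100)   -- fill factor 100
DBCC SHRINKDATABASE (ReportingDb)

-- Reopen the database once everything is done.
ALTER DATABASE ReportingDb SET MULTI_USER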

Thanks,
Bill

View 2 Replies



SQL Server 2014 :: One To Many Relationship In Data Processing

Jun 24, 2014

I have a scenario where one partner can have multiple competencies, such as Grade A, Grade B, or Grade C, as well as Grade AB, Grade BC, etc.

I need to pull the records for the highest competency level. There is also a chance that the competency level will change in the future, for example from Grade A to Grade AA.

What would be the best way to select this kind of record?
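With a hypothetical schema like the one sketched below, pulling each partner's single highest competency is commonly done with ROW_NUMBER(); the GradeRank column is an assumption, since the post doesn't say how the grades are ordered:

-- Hypothetical table: PartnerCompetency(PartnerID, Grade, GradeRank),
-- where a lower GradeRank means a higher competency level.
SELECT PartnerID, Grade
FROM (
    SELECT pc.PartnerID,
           pc.Grade,
           ROW_NUMBER() OVER (PARTITION BY pc.PartnerID ORDER BY pc.GradeRank) AS rn
    FROM dbo.PartnerCompetency AS pc
) AS ranked
WHERE rn = 1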

View 9 Replies View Related

Data Processing Extensions On SQL Server Express Edition?

Jan 3, 2007

I developed nice reports using custom data processing extensions.  When I deployed the reports on my report server (I am using the express edition of SQL Server 2005) I was surprised to see that my reports were not rendering successfully.
 
After searching the web, I found this page listing the supported/unsupported features of SQL Server 2005 Express Edition: http://msdn2.microsoft.com/en-us/library/ms365166.aspx
 
On this page it clearly says "The Reporting Services API extensible platform for delivery, data processing, rendering, and security is not supported."
 
Is there a way to get my reports to work on the express edition?
 
If not, which minimal version of SQL Server should I buy to get it to work (Workgroup, Standard, or Enterprise)?
 
Thanks for your help.

View 1 Replies View Related

How To Install And Register The Office Business Scorecard Custom Data Processing Extension With Report Server?

Apr 25, 2007

Hi , all Microsoft BI experts here,

Thanks for your kind attention.

My question is as stated in the subject title: when we want to deploy scorecards to Reporting Services, how, as a prerequisite, can we install and register the scorecard custom data processing extension with the Microsoft Reporting Services server?

I am looking forward to hearing from you shortly and thank you again.

With best regards,

Yours sincerely,

View 1 Replies View Related

Data Access :: What Is Correct Usage For Processing Data Adapter Rows

Sep 9, 2015

I have a query that returns rows from a table. It seems I have done this before, but I cannot seem to get the right procedure to obtain the values. I will paste the code below, in which you will see my bad attempts at accomplishing what I need.

Dim uid As String
Dim pw As String
Dim em As String, fn, ln, mi As String
Dim par As String
Dim Field, n, j As Integer
Dim JJ As Integer

[code]...

View 3 Replies View Related

XML Data Processing

Dec 30, 2005

Hello everyone. I need help regarding the following. Given the following table:

CREATE TABLE T1 (C1 nvarchar(10), C2 money)
INSERT INTO T1 VALUES ('A',1)
INSERT INTO T1 VALUES ('B',2)
INSERT INTO T1 VALUES ('C',3)

Let's say that I have this table on a local server and I want to upload it to a remote server, and on the remote server load it into a database that contains the same table. The uploading part can be done by another application on the remote server, but what I need is a way to transfer the data in the fastest possible way. What steps do I need to follow?

TIA,
Rey Guerrero
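One possible approach, sketched below in T-SQL with placeholder server and database names, is to register the remote server as a linked server and push the rows across in one set-based insert; bcp or SSIS would be alternatives for larger volumes:

-- Register the remote server (one-time setup; RemoteSrv is a placeholder name).
EXEC sp_addlinkedserver @server = N'RemoteSrv', @srvproduct = N'SQL Server'

-- Push all local rows into the matching table on the remote server.
INSERT INTO RemoteSrv.RemoteDb.dbo.T1 (C1, C2)
SELECT C1, C2
FROM dbo.T1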

View 10 Replies View Related

Processing XML Data From A Table

Sep 8, 2006

We have an XML column in a SQL Server 2005 table. Each row of this table contains one XML document.

I want to shred values from the XML documents and process these within a Data Flow. I want the Data Flow to execute once across a record set comprised of all of the XML documents.

I can shred the XML using a For-Each loop and XML Task. I'm kinda stuck on how I then get the data from variables into a Recordset or similar so that I can process this within single iteration of a Data Flow.

Or - is my approach incorrect? I seem to be building a verbose and clunky solution to this problem. I know I could accomplish the same in a pretty simple SQL statement using .value on the XML column... am I missing something? Is a SQL query just better suited to this problem?
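For comparison, the single-statement alternative mentioned above would look roughly like this; the element and column names are made up for illustration:

-- Hypothetical shape: each XmlDoc holds <Order><Item Id="..." Qty="..."/></Order> elements.
SELECT  t.RowId,
        i.x.value('@Id',  'int') AS ItemId,
        i.x.value('@Qty', 'int') AS Qty
FROM    dbo.XmlTable AS t
CROSS APPLY t.XmlDoc.nodes('/Order/Item') AS i(x)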

Any help much appreciated.

James

View 1 Replies View Related

Processing Data From Stored Procedure

Aug 22, 2001

I want to capture and process data inside of a stored procedure by executing another stored procedure. If proc1 calls proc2 and proc2 returns 1 or more rows, consisting of 2 or more columns, what is the best way to do this?

Currently, I know I'm returning 1 row of 3 columns, so I concatenate the data and either return it with an OUTPUT parameter or using the RETURN stmt.
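If proc2 can return several rows and columns, one common pattern (sketched here with hypothetical names and types) is to capture the result set with INSERT ... EXEC into a table variable or temp table inside proc1:

-- Inside proc1: capture proc2's result set, then work with it as ordinary rows.
DECLARE @results TABLE (Col1 int, Col2 varchar(50), Col3 datetime)

INSERT INTO @results (Col1, Col2, Col3)
EXEC dbo.proc2 @SomeParam = 1

SELECT Col1, Col2, Col3 FROM @results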

TIA,
mike

View 1 Replies View Related

Log Errors, But Continue Processing Data

Jun 26, 2007

Do you have to set the Error Output on DF components to "Fail Component" in order to get the errors?



What I would LIKE to do is a combination of "Ignore Failure" and "Fail Component". You see, I am using the Logging feature in my package that creates the sysdtslog90 table in the SQL database. The errors that I am logging make sense and have enough information for my purposes.



The problem is that I would like to continue processing the data and not have it stop when a data error occurs. I REALLY do not want to Redirect Rows unless it is necessary for me to do what I am asking.



Using Ignore Failure on both the source text file and destination SQL table allows the "good" data to be inserted, but I cannot get any info on the columns in error. Conversely, if I choose to Fail component, I get the info on the columns in error, but only the data that was inserted before the error was encountered is inserted into the table.



Suggestions?

View 14 Replies View Related

Errors Processing Data From Oracle To SQL

Aug 7, 2006

Hi, here's a new one...
I've created a DTS package that uses an ODBC source to connect to an Oracle server. The connection works just fine and I have no problems with it. When I run this package from my computer, the data transfer ends successfully, but when I upload it to my server in SQL 2005 and set it up as a job, I get errors like these:

Event Name: OnError
Message: Thread "WorkThread0" has exited with error code 0xC0047039.

Event Name: OnError
Message: Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.

Event Name: OnError
Message: The PrimeOutput method on component "table" (1) returned error code 0xC02090F5. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

Event Name: OnError
Message: The component "faccomitecedidos" (1) was unable to process the data.

Event Name: OnError
Message: The "component "table" (1)" failed because error code 0x80131541 occurred, and the error row disposition on "output column "diasperm" (1707)" specifies failure on error. An error occurred on the specified object of the specified component.


Why is this happening? Any solutions or ideas? The data I want to extract comes from an SQL command, not a table.
This is the query:

SELECT
FCNSS ,
FITIPOSOLICITUD ,
FIFOLIO ,
RTRIM(FCNOMBRE)||' '||RTRIM(FCAPPATERNO)||' '||RTRIM(FCAPMATERNO) AS NOMBREAFILIADO,
fdfinicta as FAFIL ,
fdFecCedido as FCED ,
(fdFecCedido-fdfinicta) AS diasperm ,
FNSALARIOACTUAL AS SALAFIL ,
(SELECT DISTINCT FISDI FROM gentec_own.AFILCEDIDOS WHERE
FIFOLIO = gentec_own.faccomite.FIFOLIO AND FITIPOSOLICITUD = gentec_own.faccomite.FITIPOSOLICITUD AND FCNSS = gentec_own.faccomite.FCNSS AND ROWNUM = 1) AS salCED ,
(SELECT DISTINCT (FISDI/48.60) FROM gentec_own.AFILCEDIDOS WHERE
FIFOLIO = gentec_own.faccomite.FIFOLIO AND FITIPOSOLICITUD = gentec_own.faccomite.FITIPOSOLICITUD AND FCNSS = gentec_own.faccomite.FCNSS AND ROWNUM = 1) AS CalSalCED ,
FCNUMPROMOTOR AS cod_promotor,
(SELECT RTRIM(FCNOMBRES)||' '||RTRIM(FCAPEPATERNO)||' '||RTRIM(FCAPEMATERNO)
FROM gentec_own.prommaestro where FCNUMPROMOTOR = gentec_own.faccomite.FCNUMPROMOTOR AND ROWNUM = 1) AS NOMPROMOTOR,
FICVEENTCED as aforeorig ,
FCAFORECEDIDO as aforeced ,
FNINGCOMT ,
FNCTOPROMOCION ,
FNCTOADMON ,
FNCONTRIBUCION ,
fcCanal as Canal ,
FCDIVISION as Division ,
FCREGION as Gerencia
FROM gentec_own.FACCOMITE
WHERE FCCEDIDO = 1
and (to_char(fdFecCedido,'yyyymmdd')>=to_char(sysdate-8,'yyyymmdd') and to_char(fdFecCedido,'yyyymmdd')< to_char(sysdate,'yyyymmdd'))
order by fdFecCedido;




Please, somebody!

View 4 Replies View Related

Data Processing Extension Not Visible

May 3, 2007

hello,



I have written a Custom Data Processing Extension for SSRS 2005.

The Report Server is in SharePoint integrated mode, this works perfect except for the extension.

When I want to set a DataSource in SharePoint (WSS 3.0), the extension is not in the ComboBox.

This works well on another server with the same extension.

The DLL file of that extension is copied in the reportserver bin folder.

I have added the references in the rsreportserver.config and rssrvpolicy.config (see below).


Code Snippetrsreportserver.config


<Data>

<Extension Name="SPSLISTS" Type="ICom.ReportingServices.SharepointListsExtension.Connection,

ICom.ReportingServices.SharepointListsExtension" />

</Data>




Code Snippet
rssrvpolicy.config
This code is located inside the CodeGroup with Name="SharePoint_Server_Strong_Name".


<CodeGroup class="UnionCodeGroup"

version="1"

PermissionSetName="FullTrust"

Name="SPSLISTS_CodeGroup"

Description="Code group for my SPS LISTS data processing extension">

<IMembershipCondition class="UrlMembershipCondition"

version="1"

Url="D:Program FilesMicrosoft SQL ServerMSSQL.3Reporting ServicesReportServerinICom.ReportingServices.SharepointListsExtension.dll"

/>

</CodeGroup>



I already tried re-installing Reporting Services and the add-in for SharePoint but the problem remains.



Does anyone have an idea of what's wrong here?



Thanks in advance,

Tom

View 3 Replies View Related

WSS 3.0 Lists Data Processing Extension

Mar 12, 2007

Hello,



I'm writing a Data Processing Extension for SSRS 2005 that allows you to use a WSS 3.0 List as a dataset.
This works for one list per dataset. Now I am trying to extend the extension to allow multiple lists in a dataset because I want to join lists on a common field (INNER JOIN). The problem is that I can only return one DataTable to the Report Server with the DataReader. Does anyone know if it is possible to return multiple DataTables (without the join)?

I'm a student, working on this project. I know Visual Basic.net but only started last month with Data.

Thanks in advance

Tom

View 2 Replies View Related

An Error Has Occurred During Report Processing. A Data Source Instance Has Not Been Supplied For The Data Source DetailDS_get_o

Mar 13, 2008

hi ,

I am trying to build a drill-through report (.rdlc).

I have written the following code in the Drillthrough event of the ReportViewer. Whenever I click on the first report, I get an error like:

An error has occurred during report processing.


A data source instance has not been supplied for the data source "DetailDS_get_orderdetail".







The code is:



using System;
using System.Data;
using System.Data.SqlClient;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
//using Microsoft.ApplicationBlocks.Data;
using Microsoft.Reporting.WebForms;
using DAC;

public partial class _Default : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        ReportViewer1.Visible = false;
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        DAC.clsReportsWoman obj = new clsReportsWoman();
        DataSet ds = new DataSet();
        ds = obj.get_order();
        ReportViewer1.LocalReport.DataSources.Clear();
        ReportDataSource reds = new ReportDataSource("DataSet1_get_order", ds.Tables[0]);
        ReportViewer1.LocalReport.DataSources.Add(reds);
        ReportViewer1.LocalReport.ReportPath = "C:/Documents and Settings/km63096/My Documents/Visual Studio 2005/WebSites/drillthrurep/Report.rdlc";
        ReportViewer1.LocalReport.Refresh();
        ReportViewer1.Visible = true;
    }

    protected void ReportViewer1_Drillthrough(object sender, DrillthroughEventArgs e)
    {
        DAC.clsReportsWoman obj = new clsReportsWoman();
        ReportParameterInfoCollection DrillThroughValues = e.Report.GetParameters();

        foreach (ReportParameterInfo d in DrillThroughValues)
        {
            Label1.Text = d.Values[0].ToString().Trim();
        }

        LocalReport localreport = (LocalReport)e.Report;
        string order_id = Label1.Text;
        DataSet ds = new DataSet();
        ds = obj.get_orderdetail(order_id);

        ReportViewer1.LocalReport.DataSources.Clear();
        ReportDataSource reds = new ReportDataSource("DetailDS_get_orderdetail", ds.Tables[0]);
        ReportViewer1.LocalReport.DataSources.Add(reds);
        ReportViewer1.LocalReport.ReportPath = Server.MapPath(@"Reportlevel1.rdlc");
        ReportViewer1.LocalReport.Refresh();
    }
}

The code in method get_orderdetail(order_id) is:

public DataSet get_orderdetail(string order_id)
{
    SqlCommand cmd = new SqlCommand();
    DataSet ds = new DataSet();
    cmd.Parameters.Add("@order_id", SqlDbType.VarChar, 50);
    cmd.Parameters["@order_id"].Value = order_id;
    ds = SQLHelper.ExecuteAdapter(cmd, CommandType.StoredProcedure, "dbo.get_orderdetail");
    return (ds);
}

Please help me.

View 1 Replies View Related

Processing Data From Source To Destination Table

Sep 22, 2013

I want to process data from a source table to a destination table without using a cursor.

At this point, we are creating a temp table and inserting data from the source into the temp table. Once the data is in the temp table, we process the records one by one into the destination table using a while loop.

While executing the stored procedure, we noticed that there are a few invalid records in the source table, and the stored procedure terminates on such records.

Is there a better approach to log the INVALID data and resume processing with the next record instead of terminating?
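One set-based alternative, sketched below with hypothetical table and column names, is to write the invalid rows to a logging table first and then insert only the valid rows, so no row-by-row loop is needed and a bad record never stops the batch:

-- Hypothetical validation rule: Amount must be present and non-negative.
INSERT INTO dbo.ErrorLog (SourceId, Reason, LoggedAt)
SELECT s.Id, 'Invalid Amount', GETDATE()
FROM dbo.SourceTable AS s
WHERE s.Amount IS NULL OR s.Amount < 0

-- Move only the valid rows in a single set-based insert.
INSERT INTO dbo.DestTable (Id, Amount)
SELECT s.Id, s.Amount
FROM dbo.SourceTable AS s
WHERE s.Amount IS NOT NULL AND s.Amount >= 0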

View 7 Replies View Related

Data Processing Script Justs Stops

Jul 20, 2005

I have a problem with a stored procedure I wrote. This proc will process 80 to 100 million rows every night; it aggregates, inserts into a different table, and then deletes the original data. The process will hang after processing 20 to 40 million records. I must stop the proc and then restart it before I can complete the process.

Any ideas why it would hang?

Current specs:
Hardware: Compaq Proliant 370 - Dual 933MHz - 1GB of RAM - 35 GB RAID 5
OS: Windows 2k SP4
DB: MS SQL 2000 Enterprise SP3
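Not an answer to the hang itself, but one common way to keep a job of this size from building one enormous transaction is to delete the source rows in limited batches after the aggregation; a minimal SQL 2000-style sketch, with table names and batch size made up:

-- Aggregate once into the summary table.
INSERT INTO dbo.DailySummary (GroupKey, Total)
SELECT GroupKey, SUM(Amount)
FROM dbo.RawData
GROUP BY GroupKey

-- Then delete the processed rows at most 100,000 at a time.
SET ROWCOUNT 100000
WHILE 1 = 1
BEGIN
    DELETE FROM dbo.RawData
    IF @@ROWCOUNT = 0 BREAK
END
SET ROWCOUNT 0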

View 1 Replies View Related

Custom Data Processing Extension With Dataset

Feb 8, 2007

I have been getting to grips with writing a custom data processing extension for using a supplied dataset. I have got it working OK so far, but I don't see how to make use of relational data. There's plenty of code for walking through the fields, but nothing to walk through the tables. Since this clearly has to be possible can someone tell me where I'm going wrong?

Thanks

John Williams.

View 3 Replies View Related

Simple Data Flow? - Processing After Using A Destination

Jan 19, 2006

I have a data flow that reads multiple rows from a table and then inserts into another table for each row. I use an OLE DB destination for my inserts. However, after that insert I need to do other table inserts, and I can't figure out how to continue the data flow with the fields in the pipeline. Out of the destination there is only the Error flow - is there a way to do this?



thx

View 5 Replies View Related

Post Processing Data Flow Destinations

Apr 2, 2007

What I am trying to do is move data from a staging table into a live environment and then update the staging table AFTER the row has moved (and not errored). There does not seem to be a reliable method for doing this.
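If the move itself can be done in T-SQL (SQL Server 2005 or later), one hedged sketch with hypothetical table names uses the OUTPUT clause of the INSERT to capture exactly which staging rows made it into the live table, and flags only those afterwards:

-- Hypothetical tables: Staging(Id, Payload, Moved) and Live(Id, Payload).
DECLARE @movedIds TABLE (Id int)

-- Copy rows to the live table and record the keys that were actually inserted.
INSERT INTO dbo.Live (Id, Payload)
OUTPUT inserted.Id INTO @movedIds (Id)
SELECT s.Id, s.Payload
FROM dbo.Staging AS s
WHERE s.Moved = 0

-- Flag only the rows that really made it across.
UPDATE s
SET s.Moved = 1
FROM dbo.Staging AS s
JOIN @movedIds AS m ON m.Id = s.Id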



Any assistance would be appreciated.

View 5 Replies View Related

TSQL + VBA Excel 2003 - Importing Data From MS Excel 2003 To SQL SERVER 2000 Using Multi - Batch Processing

Sep 11, 2007

Hi,
I need to import an SQL string from MS Excel 2003 to SQL SERVER 2000.
The string I need to import is composed of 5 separate blocks and looks like:



Code Snippet

CommandLine01 = "USE mydb"
CommandLine02 = "SELECT Block ..."
CommandLine03 = "GO
ALTER TABLE Block...
GO"
CommandLine04 = "UPDATE Block..."
CommandLine05 = "SELECT Block..."

The detail of the SQL string is at:
http://forums.microsoft.com/msdn/showpost.aspx?postid=2093921&siteid=1&sb=0&d=1&at=7&ft=11&tf=0&pageid=1



I am trying to implement OJ's suggestion:
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2117223&SiteID=1
to use multi - batch processing to import the string to SQL SERVER, something like:




Code Snippet
Dim SqlCnt, cmd1, cmd2, cmd3
'set the properties and open a connection

cmd1="use my_db"
cmd2="create table mytb"
cmd3="insert into mytb"

SqlCnt.execute cmd1
SqlCnt.Execute cmd2
SqlCnt.Execute cmd3

Below is the code (just partial) I have, and I need help to complete it.
Thanks in advance,
Aldo.




Code Snippet
Function TestConnection()
Dim ConnectionString As New ADODB.Connection
Dim RecordSet As New ADODB.RecordSet

ConnectionString = "Driver={SQL Server};Server=myServer;Database=myDBName;Uid=UserName;Pwd=Password"
ConnectionString.Open

CmdLine01 = " USE " & myDB
CmdLine02 = " SELECT ACCOUNTS.FULLNAME FROM ACCOUNTS" ...

CmdLine03 = "GO
ALTER TABLE Block...
GO"

CmdLine04 = "UPDATE Block..."
CmdLine05 = "SELECT Block..."

RecordSet.Open CmdLine01, ConnectionString
RecordSet.Open CmdLine02, ConnectionString

ConnectionString.Execute CmdLine01
ConnectionString.Execute CmdLine02

'Retrieve Field titles
For ColNr = 1 To RecordSet.Fields.Count
ActiveSheet.Cells(1, ColNr).Value = RecordSet.Fields(ColNr - 1).Name
Next

ActiveSheet.Cells(2, 1).CopyFromRecordset RecordSet

'Close ADO objects
RecordSet.Close
ConnectionString.Close
Set RecordSet = Nothing
Set ConnectionString = Nothing

End Function






View 7 Replies View Related

SSIS Partition Processing Data Flow Item

Feb 12, 2007

Does anyone have a helpful link for using the partition processing data
flow task in SSIS? I am trying to process a monthly partition
from within my package and am getting the following error:



Error: 0xC113000A Errors in the high-level relational engine. Pipeline
processing can only reference a single table in the data source view.



If anyone has used this before and could point me in the right direction, I would appreciate it.



Thanks,

Nick

View 3 Replies View Related

Importing Data And Script-processing Errors In SSIS

Mar 18, 2008

I'm currently trying to pull data from a ProvideX database and replicate it in a collection of SQL Server tables. However, I'm having a heck of a time trying to convert some strange decimals stored by the ProvideX database. As an example of the data I'm trying to retrieve, I'll see something like [. 1] or [. 1] ([]'s are to show the bounds of the field). After analyzing the data, it seems the decimal in the field represents a 1,000 placeholder. Thus [. 1] really means 1, and [. 1] really means 10. Something like .100 would be 100. 6.500 would be 6500.

As you can imagine, the spaces are causing errors when trying to pull the data, and I can't for the life of me figure out how to just pull it as a string, run a script to convert it to a correct number, and then save the transformed data into SQL Server. When running the import wizard, it seems I'm being forced to pull these columns as decimals. Currently I'm trying to just pull the data out "as is" and throw it in a raw file, to be processed outside of SSIS. Obviously doing it all within SSIS would be ideal, but if that can't be done, I'll do whatever it takes. I should also say I'm new to SSIS packages, but not necessarily new to SQL Server or SQL in general.

1) How can I pull these columns as strings? If I try to change the Export columns in the source query data flow step, it gives me an error saying that I can't do that.

2) If I have to pull as decimals, how can I capture the row on error, process it, and send it back to the export? So far, when I get an error, I lose all information in the row to the right of and including the error field.

I appreciate any responses, as I'm kind of going in circles at this point. If this sort of thing has been discussed here prior, I apologize...I didn't find it in any searches I did. Please just point me in the right direction if you've dealt with this sort of problem before. It seems to me that it should be an easy thing to do. I'm just not finding any tutorials on it.
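Once the column does arrive as a raw string, the conversion step described above could look something like the T-SQL below. This assumes, based only on the examples given, that the embedded spaces stand for zeros and that the stored value is one thousandth of the real one, so treat it as a sketch:

-- Hypothetical staging column RawValue nvarchar(20) holding values such as '.  1', '.100', '6.500'.
SELECT RawValue,
       CAST(REPLACE(RawValue, ' ', '0') AS decimal(18,3)) * 1000 AS RealValue
FROM dbo.StagingRows
-- '.  1' -> .001 -> 1,   '.100' -> 100,   '6.500' -> 6500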

View 14 Replies View Related

Data Warehousing :: ETL Package Execution Error In Processing

Sep 25, 2015

ETL packages are failing sometimes (Package Execution Error). Even when I execute the ETL package again from the start, I get the same error. But after restarting the SQL service on the BI server, it works fine. Is the issue on the developer code side or the server side?

View 7 Replies View Related

The Data Processing Extension Used For This Report Is Not Available. It Has Either Been Uninstalled, Or It Is Not Configured Correctly

May 22, 2008

I am using an OLEDB data source connecting to an Access database. Using SQL Server 2005 SP2 Workgroup Edition.

I can build the report fine and view it in Visual Studio. When I deploy the report and try to run it on the Report Manager site I get this error


An error has occurred during report processing. (rsProcessingAborted)
An attempt has been made to use a data extension 'OLEDB' that is not registered for this report server. (rsDataExtensionNotFound).









When I go to the Shared Data Source, this message appears beside the Connection type field


The data processing extension used for this report is not available. It has either been uninstalled, or it is not configured correctly.
What am I doing wrong??

View 6 Replies View Related

C# Stored Procedure Processing Binary Data To Sql Fields

Jul 25, 2007





In the code sample below, case eLABEL, eENGUNITS works ok. The target SQL field is defined as varchar(50).

The second section is not so happy. It is attempting to write to an SQL field defined as binary(2)



Executing an SQL script to exercise this line results in the error:


System.Data.SqlClient.SqlException: Incorrect syntax near the keyword 'foreign'

where 'foreign' is the name (FieldDef.sFldName) of the SQL field being written.



The C# code composes the following command:

"UPDATE " + acPointType + " SET " + FieldDef.sFldName +
" = @data_params WHERE VEC = '" + acVECName +
"' and name = '" + acPointName + "';";
What is the proper syntax for the second case set?
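One guess, not a confirmed diagnosis: FOREIGN is a reserved word in T-SQL, so a column with that name has to be bracket-quoted when the UPDATE is composed. The statement the code would then need to produce looks roughly like this (placeholder table and point names):

UPDATE SomePointType
SET [foreign] = @data_params
WHERE VEC = 'SomeVEC' and name = 'SomePoint'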






Code Snippet

case vcidatatype.eLABEL:
case vcidatatype.eENGUNITS:
{
    byte[] bbuff = new byte[512];
    bbuff = rdr.ReadBytes(FieldDef.iLen);
    vciSqlCommand.Parameters.Add(new SqlParameter("@data_params", SqlDbType.VarBinary));
    vciSqlCommand.Parameters["@data_params"].Value = bbuff;
    break;
}
case vcidatatype.eFIDADR:
case vcidatatype.eLANADR:
{
    byte[] bbuf = new byte[512];
    bbuf = rdr.ReadBytes(2);
    vciSqlCommand.Parameters.Add(new SqlParameter("@data_params", SqlDbType.Binary));
    vciSqlCommand.Parameters["@data_params"].Value = bbuf;
    break;
}





















View 1 Replies View Related

Hello , How To Handle Error Inside A Reporting Data Processing Extension..

May 1, 2007

I need help with this. What I've done so far is a data processing extension that gets its data from a web service. I use it as a data source for a report, and that report is displayed within an ASPX page containing a ReportViewer control. The problem is catching errors that occur within the data processing extension.

When I debugged the extension, I saw that after an exception is thrown the code ends up inside the Cancel() method of the IDbCommand implementation...

In all the examples of a data processing extension I could find online, there is always a comment saying Cancel is not implemented, and it just throws an exception.

What happens is I get an uninformative, non-user-friendly error from Reporting Services, something like "an error occurred during the report processing (rsSomething)".

How should I go about handling exceptions that happen inside the data processing extension?

I've tried catching the error inside the ReportViewer control's ReportError event, but it seems that event doesn't fire when those errors are generated.

Any help on how to approach this would be highly welcome.

View 7 Replies View Related

Extracting Parameter Values From A Custom Data Processing Extension

Jun 23, 2006

I've created a custom Data Processing Extension and I've implemented the IDBCommandAnalysis interface so that my reports can enter parameters and pass them to my Data Processing Extension.

My question is, how do I extract the value from the Parameters coming from the report? Where do the parameters get passed off from the report? I can query the Parameters collection and my report gets prompted in Preview mode to enter something for the parameter but I can't find the spot where it gets passed for processing.

View 5 Replies View Related

Analysis :: Processing Database Failed - Cannot Connect To Data Source

Apr 23, 2015

I have deployed my SSAS project to an Analysis services database on a SQL 2014 server instance using the "Development" build configration in Visual Studio. In order to avoid users accessing the development instance of the cube, I've created an additional "Release" build configuration and deployed my project to a different database on the same server. I've also created a production copy of the data source and changed the data source configuration of the production cube to point to this one.

I've provided the same domain service account on the "Impersonation information" tab of the data source for both the development as well as the production instance of the cube. This account has also been granted identical permissions for both data sources. While everything works fine with the development database, processing the production database fails with an error message saying that it cannot connect to its respective data source.

I'm using VS Ultimate 2013.4, SSDT 12.0.50318.0.

View 3 Replies View Related

Any Way To Control Execution Processing (parallel Or Serially) When Using Data Driven Subscriptions?

Apr 12, 2007



I've got an SSRS report that is set up using a data-driven subscription to supply input parameters to the stored procedure that is called to generate the report results.



I was wondering if there is any way to specify the execution processing method (running the reports in parallel or serially). The subscription that we have set up appears to be running all of the reports in parallel which is causing massive load on our servers.



Thanks.

View 3 Replies View Related

Excel Data Mining Add-in - Analyze Key Influencers - Remote Processing Issue

Nov 12, 2007

I have installed the excel DM addin and am trying to work through the tutorials -

When I run the 'Analyze Key Influencers' tool against the sample data through a remote AS server I get:
The task was not able to detect any key influencers for the 'Purchased Bike' column. The values of 'Purchased Bike' seem unrelated to values of other columns.

however when I run it against a local AS server I get the expected results.

I can see no differences in settings or setup between the AS instances I am trying to use - perhaps a permissions issue?
Thank you

View 4 Replies View Related

Reporting Services :: SSRS Custom Data Processing Extension Error

May 5, 2015

I am using a Custom Data Processing Extension to call a stored procedure. I am getting the following error when creating a dataset in Report Designer using the extension. I wrote the code in C#.

Could not update a list of fields for the query. Verify that you can connect to the data source and that your query syntax is correct. (Details: Object reference not set to an instance of an object.)

Here is my code

using System;
using System.Collections.Generic;
using System.Text;
using System.IO;
using System.Data;

[Code] .....

View 2 Replies View Related

An Error Has Occurred During Report Processing. (rsProcessingAborted).. Query Execution Failed For Data Set

Sep 11, 2007

Hi All,

I'm facing a strange problem..
I've developed a few reports. They work fine in the development environment. After successful testing they were published to the web.
In the web version, all reports execute fine the first time. But if I change any of the parameter values (or even without changing them),
when I press "View Report" the following error occurs:




An error has occurred during report processing. (rsProcessingAborted)

Query execution failed for data set 'dsMLGDB2Odbc'. (rsErrorExecutingCommand)

For more information about this error navigate to the report server on the local server machine, or enable remote errors

Please suggest any alternative ways to overcome this issue.
Thanks in advance.

View 11 Replies View Related

Data Mining :: How To Reduce Data Mining Processing Time

Aug 4, 2015

With an SSAS database I have created a data mining structure using the Time Series algorithm. While processing the SSAS database, data mining takes a long time to process. How can we reduce the processing time?

View 2 Replies View Related

Client/server Processing

Nov 1, 2006

I'm going through a set of MS e-learning classes and am confused about where the processing occurs. In general, I thought that all the processing occurred on the Server once the client made a request. This e-learning confuses me as it says that the processing is spread out between the client and the server. What is the correct answer?

thx,

Kat

View 5 Replies View Related






