Processing XML Data From A Table
Sep 8, 2006
We have an XML column in a SQL Server 2005 table. Each row of this table contains one XML document.
I want to shred values from the XML documents and process these within a Data Flow. I want the Data Flow to execute once across a record set comprised of all of the XML documents.
I can shred the XML using a For-Each loop and an XML Task. I'm kinda stuck on how I then get the data from variables into a Recordset or similar so that I can process it within a single iteration of a Data Flow.
Or - is my approach incorrect? I seem to be building a verbose and clunky solution to this problem. I know I could accomplish the same in a pretty simple SQL statement using .value on the XML column... am I missing something? Is a SQL query just better suited to this problem?
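A minimal T-SQL sketch of that simpler route, assuming a table dbo.XmlDocs with an int key Id and an XML column Doc holding <Order><Item Qty="..."/></Order>-style documents (all names here are illustrative, not from the original post):

SELECT d.Id,
       x.Item.value('@Qty', 'int') AS Qty   -- shred one attribute per Item node
FROM dbo.XmlDocs AS d
CROSS APPLY d.Doc.nodes('/Order/Item') AS x(Item);

A query like this can feed a Data Flow directly as an OLE DB source, so the whole record set is processed in a single Data Flow execution.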
Any help much appreciated.
James
View 1 Replies
Sep 22, 2013
I want to process data from a source table to a destination table without using a cursor.
At this point, we create a temp table and insert data from the source table into it. Once the data is in the temp table, we use a WHILE loop to process the records one by one into the destination table.
While executing the stored procedure, we noticed that there are a few invalid records in the source table, and the stored procedure terminates when it hits them.
Is there a better approach to log the INVALID data and resume processing with the next record instead of terminating?
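A hedged sketch of that approach, assuming SQL Server 2005+ and illustrative table/column names: wrapping the per-record insert in TRY/CATCH logs the bad row and lets the existing WHILE loop continue instead of aborting.

DECLARE @CurrentRowId int;
-- ... the existing WHILE loop positions @CurrentRowId on the next temp-table row ...
BEGIN TRY
    INSERT INTO dbo.Destination (Col1, Col2)
    SELECT Col1, Col2 FROM #Temp WHERE RowId = @CurrentRowId;
END TRY
BEGIN CATCH
    -- log the invalid record and move on
    INSERT INTO dbo.ErrorLog (RowId, ErrorMessage, LoggedAt)
    VALUES (@CurrentRowId, ERROR_MESSAGE(), GETDATE());
END CATCH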
View 7 Replies
Sep 9, 2015
I have a query that is returning rows from a table. It seems I have done this before, but I cannot figure out the right procedure to obtain the values. I will paste the code below, in which you will see my unsuccessful attempts at accomplishing what I need.
Dim uid As String
Dim pw As String
Dim em As String, fn, ln, mi As String
Dim par As String
Dim Field, n, j As Integer
Dim JJ As Integer
[code]...
View 3 Replies
Dec 30, 2005
Hello everyone. I need help regarding the following. Given the table:

CREATE TABLE T1 (C1 nvarchar(10), C2 money)
INSERT INTO T1 VALUES ('A',1)
INSERT INTO T1 VALUES ('B',2)
INSERT INTO T1 VALUES ('C',3)

Let's say that I have this table on a local server and I want to upload it to a remote server, and on the remote server load it into a database that contains the same table. The uploading part can be done by another application on the remote server; what I need is a way to transfer the data in the fastest possible way. What steps do I need to follow?
TIA,
Rey Guerrero
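One hedged sketch of a fast path, using the bcp utility with native format (server and database names are illustrative): export on the local server, ship the file, and bulk-load it on the remote one.

bcp LocalDb.dbo.T1 out T1.dat -n -S LocalServer -T
bcp RemoteDb.dbo.T1 in T1.dat -n -S RemoteServer -T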
View 10 Replies
Jul 20, 2005
Hello. I have an application that will be logging user activity from several Windows 2003 terminal servers to a SQL Server 2000 database. This information will be retrieved by monitoring the Security logs of these servers (this part I know how to accomplish already).
A table in the database, tblLogEntries, will contain the following fields:
- ID = autoincrementing int
- LogTime = Date/Time the user activity was recorded in the security log
- Username = User's login ID that the activity was recorded with
- Type = int, referencing a lookup table with the values of Logon, Logoff, and possible other future items
- Server = The name of the server the activity was recorded on
The only question I have is: can you offer a way to compute the total user login time during a given range using T-SQL?
For example, given the table data:
ID LogTime Username Type Server
1 10-10-2003 8:30:00 Tom Logon SERVER-A
2 10-10-2003 8:45:00 Sarah Logon SERVER-A
3 10-10-2003 16:45:00 Tom Logoff SERVER-A
4 10-10-2003 17:00:00 Sarah Logoff SERVER-A
5 10-11-2003 8:30:00 Tom Logon SERVER-A
6 10-11-2003 8:45:00 Sarah Logon SERVER-A
7 10-11-2003 16:30:00 Sarah Logoff SERVER-A
8 10-11-2003 17:15:00 Tom Logoff SERVER-A
How would you produce the output:
User Logon Total Time for SERVER-A
Tom 17.0 hrs
Sarah 16.0 hrs
I know I can handle this type of processing on my ASP.NET front-end, but I'm curious as to how easily it can be done by the database itself. Thanks in advance for your assistance.
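A sketch in T-SQL, assuming the Type lookup uses 1 = Logon and 2 = Logoff (an assumption; the post doesn't give the lookup values): each logon row is paired with the first subsequent logoff for the same user and server, and the durations are summed.

SELECT l.Username,
       SUM(DATEDIFF(minute, l.LogTime, o.LogTime)) / 60.0 AS TotalHours
FROM tblLogEntries AS l
JOIN tblLogEntries AS o
  ON  o.Username = l.Username
  AND o.Server   = l.Server
  AND o.Type     = 2   -- assumed lookup value for Logoff
  AND o.LogTime  = (SELECT MIN(x.LogTime)   -- first logoff after this logon
                    FROM tblLogEntries AS x
                    WHERE x.Username = l.Username
                      AND x.Server   = l.Server
                      AND x.Type     = 2
                      AND x.LogTime  > l.LogTime)
WHERE l.Type = 1       -- assumed lookup value for Logon
  AND l.Server = 'SERVER-A'
GROUP BY l.Username;

On the sample data above this yields 17.0 hours for Tom and 16.0 for Sarah.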
View 4 Replies
Jul 23, 2005
I'm working on a system right now where I have a database (two, actually, but one is discarded halfway through), but it's created and used as part of a process (reporting), rather than as the actual production data repository. I may be keeping the database permanently, but it would be completely read-only; once the process is complete, the database will not change again. This has me wanting to do a few things that are rather foreign to my usual experience, and I don't know how many of them are supported.
In several cases, I'm summarizing one table into another by several fields, and then updating the original table with an ID for the summary row each source row was summarized into (e.g., I summarize PlaceAndProductSummary into PlaceSummary, and then populate PlaceSummaryID in PlaceAndProductSummary). The update of the source table is much faster if the summary table has a clustered index on the summarized fields, but all later access will be faster if the clustered index is on the identity column. I've been including an ORDER BY on the summarized fields in the original insert, so the identity column is in the same order as the summarized fields, but I don't know of any way to take advantage of that in the indexing declarations.
As another approach to the above situation: if I change the clustered index on a table, and the rows happen to be in the same order by both indexes, will the table still get rebuilt?
I will never do a roll-back in the process; if an action fails, I want to raise an error and halt (and I haven't lost any data). Is there any way to completely turn off logging?
Will I gain anything by marking the database as single-user?
Any indexes that I am not using while I populate the tables, I'm adding at the end with FillFactor 100, to keep any slack out. Is there a way to remove all the slack from everything else, at the end of the process? During a backup operation would be fine.
Thanks,
Bill
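A hedged sketch of the relevant knobs on SQL Server 2000 (the database name is illustrative). Logging cannot be turned off entirely, but SIMPLE recovery keeps it minimal for bulk operations, and DBCC DBREINDEX with fillfactor 100 removes slack from all of a table's existing indexes at the end.

ALTER DATABASE ReportDb SET RECOVERY SIMPLE;
ALTER DATABASE ReportDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
DBCC DBREINDEX ('PlaceAndProductSummary', '', 100);  -- rebuild all indexes on the table, no slack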
View 2 Replies
Sep 5, 2007
I want to do conditional processing depending on values in the rows of a CTE. For example, is the following kind of thing possible with a CTE?:
WITH Orders_CTE (TerritoryId, ContactId)
AS
(
SELECT TerritoryId, ContactId
FROM Sales.SalesOrderHeader
WHERE (ContactId < 200)
)
IF Orders_CTE.TerritoryId > 3
BEGIN
/* Do some processing here */
END
ELSE
BEGIN
/* Do something else here */
END
When I try this, I get a syntax error near the keyword 'IF'.
Any ideas? I know this kind of thing can be done with a cursor, but I wanted to keep with the times and avoid using one!
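For what it's worth, a minimal sketch of the usual cursor-free alternative: per-row branching goes inside the query as a CASE expression, since IF is a control-of-flow statement and cannot be applied row by row after a CTE.

WITH Orders_CTE (TerritoryId, ContactId)
AS
(
SELECT TerritoryId, ContactId
FROM Sales.SalesOrderHeader
WHERE (ContactId < 200)
)
SELECT TerritoryId,
       ContactId,
       CASE WHEN TerritoryId > 3
            THEN 'branch A'   -- replace with the real per-row expression
            ELSE 'branch B'
       END AS Action
FROM Orders_CTE;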
View 3 Replies
Aug 22, 2001
I want to capture and process data inside of a stored procedure by executing another stored procedure. If proc1 calls proc2 and proc2 returns 1 or more rows, consisting of 2 or more columns, what is the best way to do this?
Currently, I know I'm returning one row of three columns, so I concatenate the data and return it either with an OUTPUT parameter or via the RETURN statement.
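A sketch of the usual alternative to concatenating, assuming proc2 returns three columns (the column types here are illustrative): INSERT ... EXEC captures the whole result set, any number of rows.

CREATE TABLE #Results (Col1 int, Col2 varchar(50), Col3 datetime);

INSERT INTO #Results (Col1, Col2, Col3)
EXEC dbo.proc2;

-- proc1 can now work with the captured rows directly
SELECT Col1, Col2, Col3 FROM #Results;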
TIA,
mike
View 1 Replies
Jun 26, 2007
Do you have to set the Error Output on DF components to "Fail Component" in order to get the errors?
What I would LIKE to do is a combination of "Ignore Failure" and "Fail Component". You see, I am using the Logging feature in my package that creates the sysdtslog90 table in the SQL database. The errors that I am logging make sense and have enough information for my purposes.
The problem is that I would like to continue processing the data and not have it stop when a data error occurs. I REALLY do not want to Redirect Rows unless it is necessary for me to do what I am asking.
Using Ignore Failure on both the source text file and destination SQL table allows the "good" data to be inserted, but I cannot get any info on the columns in error. Conversely, if I choose to Fail component, I get the info on the columns in error, but only the data that was inserted before the error was encountered is inserted into the table.
Suggestions?
View 14 Replies
Aug 7, 2006
Hi, here's a new one...
I've created a DTS package that uses an ODBC source to connect to an Oracle server. The connection works just fine and I have no problems with it. When I run this package from my computer, the data transfer ends successfully, but when I upload it to my SQL 2005 server and set it up as a job, I get errors like these:
Event Name: OnError
Message: Thread "WorkThread0" has exited with error code 0xC0047039.
Event Name: OnError
Message: Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
Event Name: OnError
Message: The PrimeOutput method on component "table" (1) returned error code 0xC02090F5. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Event Name: OnError
Message: The component "faccomitecedidos" (1) was unable to process the data.
Event Name: OnError
Message: The "component "table" (1)" failed because error code 0x80131541 occurred, and the error row disposition on "output column "diasperm" (1707)" specifies failure on error. An error occurred on the specified object of the specified component.
Why is this happening? Any solutions or ideas? The data I want to extract comes from an SQL command, not a table.
This is the query:
SELECT
FCNSS ,
FITIPOSOLICITUD ,
FIFOLIO ,
RTRIM(FCNOMBRE)||' '||RTRIM(FCAPPATERNO)||' '||RTRIM(FCAPMATERNO) AS NOMBREAFILIADO,
fdfinicta as FAFIL ,
fdFecCedido as FCED ,
(fdFecCedido-fdfinicta) AS diasperm ,
FNSALARIOACTUAL AS SALAFIL ,
(SELECT DISTINCT FISDI FROM gentec_own.AFILCEDIDOS WHERE
FIFOLIO = gentec_own.faccomite.FIFOLIO AND FITIPOSOLICITUD = gentec_own.faccomite.FITIPOSOLICITUD AND FCNSS = gentec_own.faccomite.FCNSS AND ROWNUM = 1) AS salCED ,
(SELECT DISTINCT (FISDI/48.60) FROM gentec_own.AFILCEDIDOS WHERE
FIFOLIO = gentec_own.faccomite.FIFOLIO AND FITIPOSOLICITUD = gentec_own.faccomite.FITIPOSOLICITUD AND FCNSS = gentec_own.faccomite.FCNSS AND ROWNUM = 1) AS CalSalCED ,
FCNUMPROMOTOR AS cod_promotor,
(SELECT RTRIM(FCNOMBRES)||' '||RTRIM(FCAPEPATERNO)||' '||RTRIM(FCAPEMATERNO)
FROM gentec_own.prommaestro where FCNUMPROMOTOR = gentec_own.faccomite.FCNUMPROMOTOR AND ROWNUM = 1) AS NOMPROMOTOR,
FICVEENTCED as aforeorig ,
FCAFORECEDIDO as aforeced ,
FNINGCOMT ,
FNCTOPROMOCION ,
FNCTOADMON ,
FNCONTRIBUCION ,
fcCanal as Canal ,
FCDIVISION as Division ,
FCREGION as Gerencia
FROM gentec_own.FACCOMITE
WHERE FCCEDIDO = 1
and (to_char(fdFecCedido,'yyyymmdd')>=to_char(sysdate-8,'yyyymmdd') and to_char(fdFecCedido,'yyyymmdd')< to_char(sysdate,'yyyymmdd'))
order by fdFecCedido;
Please, somebody!
View 4 Replies
May 3, 2007
hello,
I have written a Custom Data Processing Extension for SSRS 2005.
The Report Server is in SharePoint integrated mode, this works perfect except for the extension.
When I want to set a DataSource in SharePoint (WSS 3.0), the extension is not in the ComboBox.
This works well on another server with the same extension.
The DLL file of that extension is copied in the reportserver bin folder.
I have added the references in the rsreportserver.config and rssrvpolicy.config (see below).
Code Snippet: rsreportserver.config
<Data>
<Extension Name="SPSLISTS" Type="ICom.ReportingServices.SharepointListsExtension.Connection,
ICom.ReportingServices.SharepointListsExtension" />
</Data>
Code Snippet: rssrvpolicy.config
This code is located inside the CodeGroup with Name="SharePoint_Server_Strong_Name".
<CodeGroup class="UnionCodeGroup"
version="1"
PermissionSetName="FullTrust"
Name="SPSLISTS_CodeGroup"
Description="Code group for my SPS LISTS data processing extension">
<IMembershipCondition class="UrlMembershipCondition"
version="1"
Url="D:Program FilesMicrosoft SQL ServerMSSQL.3Reporting ServicesReportServerinICom.ReportingServices.SharepointListsExtension.dll"
/>
</CodeGroup>
I already tried re-installing Reporting Services and the add-in for SharePoint but the problem remains.
Does anyone have an idea of what's wrong here?
Thanks in advance,
Tom
View 3 Replies
Mar 12, 2007
Hello,
I'm writing a Data Processing Extension for SSRS 2005 that allows you to use a WSS 3.0 List as a dataset.
This works for one list per dataset. Now I am trying to extend the extension to allow multiple lists in a dataset because I want to join lists on a common field (INNER JOIN). The problem is that I can only return one DataTable to the Report Server with the DataReader. Does anyone know if it is possible to return multiple DataTables (without the join)?
I'm a student working on this project. I know Visual Basic .NET but only started working with data access last month.
Thanks in advance
Tom
View 2 Replies
Mar 13, 2008
Hi,
I am trying to build a drill-through report (RDLC). I have written the following code in the Drillthrough event of the ReportViewer. Whenever I click on the first report, I get an error like:
An error has occurred during report processing.
A data source instance has not been supplied for the data source "DetailDS_get_orderdetail".
The code is:
using System;
using System.Data;
using System.Data.SqlClient;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
//using Microsoft.ApplicationBlocks.Data;
using Microsoft.Reporting.WebForms;
using DAC;
public partial class _Default : System.Web.UI.Page
{
protected void Page_Load(object sender, EventArgs e)
{
ReportViewer1.Visible = false;
}
protected void Button1_Click(object sender, EventArgs e)
{
DAC.clsReportsWoman obj = new clsReportsWoman();
DataSet ds = new DataSet();
ds = obj.get_order();
ReportViewer1.LocalReport.DataSources.Clear();
ReportDataSource reds = new ReportDataSource("DataSet1_get_order", ds.Tables[0]);
ReportViewer1.LocalReport.DataSources.Add(reds);
ReportViewer1.LocalReport.ReportPath = "C:/Documents and Settings/km63096/My Documents/Visual Studio 2005/WebSites/drillthrurep/Report.rdlc";
ReportViewer1.LocalReport.Refresh();
ReportViewer1.Visible = true;
}
protected void ReportViewer1_Drillthrough(object sender, DrillthroughEventArgs e)
{
DAC.clsReportsWoman obj = new clsReportsWoman();
ReportParameterInfoCollection DrillThroughValues =
e.Report.GetParameters();
foreach (ReportParameterInfo d in DrillThroughValues)
{
Label1.Text = d.Values[0].ToString().Trim();
}
LocalReport localreport = (LocalReport)e.Report;
string order_id = Label1.Text;
DataSet ds = new DataSet();
ds = obj.get_orderdetail(order_id);
ReportViewer1.LocalReport.DataSources.Clear();
ReportDataSource reds = new ReportDataSource("DetailDS_get_orderdetail", ds.Tables[0]);
ReportViewer1.LocalReport.DataSources.Add(reds);
ReportViewer1.LocalReport.ReportPath = Server.MapPath(@"Reportlevel1.rdlc");
ReportViewer1.LocalReport.Refresh();
}
}
The code in the get_orderdetail(order_id) method is:
public DataSet get_orderdetail(string order_id)
{
SqlCommand cmd = new SqlCommand();
DataSet ds = new DataSet();
cmd.Parameters.Add("@order_id", SqlDbType.VarChar, 50);
cmd.Parameters["@order_id"].Value = order_id;
ds = SQLHelper.ExecuteAdapter(cmd, CommandType.StoredProcedure, "dbo.get_orderdetail");
return (ds);
}
Please help me.
View 1 Replies
May 23, 2007
I have a 13 GB MySQL table with the following definition:
Code:
CREATE TABLE `WikiParagraphs` (
`ID` int(10) unsigned NOT NULL auto_increment,
`Paragraph` text NOT NULL,
PRIMARY KEY (`ID`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1 AUTO_INCREMENT=73035281 ;
I query it like this:
Code:
select Paragraph from WikiParagraphs where ID in (1,2,3,4)
The 1,2,3,4 bit comes from the Sphinx full-text engine, which gives me the IDs that match my query within about 500 milliseconds.
But retrieving the data itself takes approximately 3-6 seconds.
Obviously, I'd like to speed this query up.
Any ideas?
G-Man
View 1 Replies
Jun 24, 2014
I have a scenario where one partner can have multiple competency grades, like Grade A, Grade B, or Grade C, as well as combined grades like Grade AB, Grade BC, etc.
I need to pull the records for the higher competency level; there is also a chance of the competency level changing in the future, for example from Grade A to Grade AA.
What would be the best way to select this kind of record?
View 9 Replies
Jul 20, 2005
I have a problem with a stored procedure I wrote. This proc will process 80 to 100 million records every night; it aggregates, inserts into a different table, and then deletes the original data. The process will hang after processing 20 to 40 million records. I must stop the proc and then restart it before I can complete the process. Any ideas why it would hang?
Current specs:
Hardware: Compaq Proliant 370 - Dual 933MHz - 1GB of RAM - 35 gig RAID 5
OS: Windows 2k SP4
DB: MS SQL 2000 Enterprise SP3
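A hedged sketch of one common fix, assuming a Processed flag column (illustrative, not from the post) and SQL 2000-era syntax: doing the delete in capped batches keeps each transaction small, so the log doesn't balloon and locks are released between batches instead of accumulating across tens of millions of rows.

DECLARE @rows int;
SET ROWCOUNT 100000;   -- cap each batch (SQL 2000 style)
SET @rows = 1;
WHILE @rows > 0
BEGIN
    DELETE FROM dbo.SourceTable WHERE Processed = 1;
    SET @rows = @@ROWCOUNT;
END
SET ROWCOUNT 0;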
View 1 Replies
Feb 8, 2007
I have been getting to grips with writing a custom data processing extension for using a supplied dataset. I have got it working OK so far, but I don't see how to make use of relational data. There's plenty of code for walking through the fields, but nothing for walking through the tables. Since this clearly has to be possible, can someone tell me where I'm going wrong?
Thanks
John Williams.
View 3 Replies
Jan 19, 2006
I have a data flow that reads multiple rows from a table and then inserts into another table for each row. I use an OLE DB destination for my inserts. However, after that insert I need to do other table inserts, and I can't figure out how to continue the data flow with the fields still in the pipeline. The only output from the destination is the error flow - is there a way to do this?
thx
View 5 Replies
Apr 2, 2007
What I am trying to do is move data from a staging table into a live environment and then update the staging table AFTER the row has moved (and not errored). There does not seem to be a reliable method for doing this.
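A sketch of one reliable pattern, assuming SQL Server 2005+ and illustrative table/column names: the OUTPUT clause captures exactly the rows that were inserted, so only those get flagged back in staging.

DECLARE @moved TABLE (Id int);

INSERT INTO dbo.LiveTable (Id, Col1)
OUTPUT inserted.Id INTO @moved (Id)
SELECT Id, Col1 FROM dbo.StagingTable WHERE Moved = 0;

UPDATE s
SET s.Moved = 1
FROM dbo.StagingTable AS s
JOIN @moved AS m ON m.Id = s.Id;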
Any assistance would be appreciated.
View 5 Replies
Feb 12, 2007
Does anyone have a helpful link for using the Partition Processing data flow task in SSIS? I am trying to process a monthly partition from within my package and am getting the following error:
Error: 0xC113000A Errors in the high-level relational engine. Pipeline processing can only reference a single table in the data source view.
If anyone has used this before and could point me in the right direction, I would appreciate it.
Thanks,
Nick
View 3 Replies
Mar 18, 2008
I'm currently trying to pull data from a ProvideX database and replicate it in a collection of SQL Server tables. However, I'm having a heck of a time trying to convert some strange decimals stored by the ProvideX database. As an example of the data I'm trying to retrieve, I'll see something like [. 1] or [. 1] ([]'s are to show the bounds of the field). After analyzing the data, it seems the decimal in the field represents a 1,000 placeholder. Thus [. 1] really means 1, and [. 1] really means 10. Something like .100 would be 100. 6.500 would be 6500.
As you can imagine, the spaces are causing errors when trying to pull the data, and I can't for the life of me figure out how to just pull it as a string, run a script to convert it to a correct number, and then save the transformed data into SQL Server. When running the import wizard, it seems I'm being forced to pull these columns as decimals. Currently I'm trying to just pull the data out "as is" and throw it in a raw file, to be processed outside of SSIS. Obviously doing it all within SSIS would be ideal, but if that can't be done, I'll do whatever it takes. I should also say I'm new to SSIS packages, but not necessarily new to SQL Server or SQL in general.
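A guess at the cleanup step, assuming the value can be pulled as a string column (RawValue and the table name here are hypothetical) and assuming the embedded spaces stand in for zeros, so '. 1' becomes '.01' and scaling by 1,000 yields 10; this matches the .100 = 100 and 6.500 = 6500 examples above:

SELECT CAST(REPLACE(RawValue, ' ', '0') AS decimal(18,3)) * 1000 AS CleanValue
FROM dbo.StagingImport;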
1) How can I pull these columns as strings? If I try to change the Export columns in the source query data flow step, it gives me an error saying that I can't do that.
2) If I have to pull as decimals, how can I capture the row on error, process it, and send it back to the export? So far, when I get an error, I lose all information in the row to the right of and including the error field.
I appreciate any responses, as I'm kind of going in circles at this point. If this sort of thing has been discussed here prior, I apologize...I didn't find it in any searches I did. Please just point me in the right direction if you've dealt with this sort of problem before. It seems to me that it should be an easy thing to do. I'm just not finding any tutorials on it.
View 14 Replies
Jan 3, 2007
I developed nice reports using custom data processing extensions. When I deployed the reports on my report server (I am using the express edition of SQL Server 2005) I was surprised to see that my reports were not rendering successfully.
After searching the web, I found this page listing the supported/unsupported features of SQL Server 2005 Express Edition: http://msdn2.microsoft.com/en-us/library/ms365166.aspx
On this page it clearly says "The Reporting Services API extensible platform for delivery, data processing, rendering, and security is not supported."
Is there a way to get my reports to work on the express edition?
If not, which is the minimal edition of SQL Server I should buy to get it to work (Workgroup, Standard, or Enterprise)?
Thanks for your help.
View 1 Replies
Sep 25, 2015
ETL packages are sometimes failing with a package execution error. Even when I execute the ETL package again from the start, I get the same error, but after restarting the SQL service on the BI server it works fine. Is this an issue on the developer code side or on the server side?
View 7 Replies
May 22, 2008
I am using an OLEDB data source connecting to an Access database. Using SQL Server 2005 SP2 Workgroup Edition.
I can build the report fine and view it in Visual Studio. When I deploy the report and try to run it on the Report Manager site I get this error
An error has occurred during report processing. (rsProcessingAborted)
An attempt has been made to use a data extension 'OLEDB' that is not registered for this report server. (rsDataExtensionNotFound).
When I go to the Shared Data Source, this message appears beside the Connection type field
The data processing extension used for this report is not available. It has either been uninstalled, or it is not configured correctly.
What am I doing wrong??
View 6 Replies
Jul 25, 2007
In the code sample below, case eLABEL, eENGUNITS works ok. The target SQL field is defined as varchar(50).
The second section is not so happy. It is attempting to write to an SQL field defined as binary(2)
Executing an SQL script to exercise this line results in the error:
System.Data.SqlClient.SqlException: Incorrect syntax near the keyword 'foreign'
where 'foreign' is the name (FieldDef.sFldName) of the SQL field being written.
The C# code composes the following command:
"UPDATE " + acPointType + " SET " + FieldDef.sFldName +
" = @data_params WHERE VEC = '" + acVECName +
"' and name = '" + acPointName + "';";
What is the proper syntax for the second case set?
Code Snippet
case vcidatatype.eLABEL:
case vcidatatype.eENGUNITS:
{
byte[] bbuff = new byte[512];
bbuff = rdr.ReadBytes(FieldDef.iLen);
vciSqlCommand.Parameters.Add(new SqlParameter("@data_params", SqlDbType.VarBinary));
vciSqlCommand.Parameters["@data_params"].Value = bbuff;
break;
}
case vcidatatype.eFIDADR:
case vcidatatype.eLANADR:
{
byte[] bbuf = new byte[512] ;
bbuf = rdr.ReadBytes(2);
vciSqlCommand.Parameters.Add(new SqlParameter("@data_params", SqlDbType.Binary));
vciSqlCommand.Parameters["@data_params"].Value = bbuf;
break;
}
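A hedged guess at the cause, separate from the binary parameter itself: FOREIGN is a reserved word in T-SQL, so when FieldDef.sFldName is 'foreign' the composed UPDATE needs the column name bracket-quoted, along the lines of (names illustrative):

UPDATE SomePointType SET [foreign] = @data_params WHERE VEC = 'someVec' AND name = 'somePoint';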
View 1 Replies
Oct 22, 2015
I have a job which executes hourly.
Essentially:
Begin
Truncate table A
Insert into A
(Col1,
Col2,
Col3...
)
Select Value1,
Value2,
Value3...
From Table B
End
The insert query takes approximately 3.5 minutes to execute. What's occurring is that the table is immediately truncated, and there are no rows in it for those 3.5 minutes.
How can I avoid having this gap - where there are no rows in the table for that period of time during the job execution ? The table could be locked, but that doesn't seem like the best solution.
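One hedged sketch: load a shadow table, then swap the names inside a short transaction, so readers always see a fully populated table (the staging table name is illustrative).

BEGIN TRAN;
TRUNCATE TABLE dbo.A_Staging;
INSERT INTO dbo.A_Staging (Col1, Col2, Col3)
SELECT Value1, Value2, Value3 FROM dbo.B;
COMMIT;

BEGIN TRAN;
EXEC sp_rename 'dbo.A', 'A_Old';
EXEC sp_rename 'dbo.A_Staging', 'A';
EXEC sp_rename 'dbo.A_Old', 'A_Staging';
COMMIT;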
View 8 Replies
May 1, 2007
I need help with this. What I've done so far is a data processing extension that gets its data from a web service. I use it as a data source for a report, and that report is displayed within an ASPX page containing a ReportViewer control that renders it. The problem is catching errors that occur within the data processing extension.
When I debugged the extension, I saw that after an exception is thrown the code ends up inside the Cancel() method of the IDbCommand implementation.
In all the examples of a data processing extension I could find online, Cancel() has a comment saying it is not implemented and throws a new NotSupportedException().
What happens is I get an uninformative, non-user-friendly error from Reporting Services,
something like "an error occurred during the report processing (rs-something)".
How should I go about handling exceptions that happen inside the data processing extension?
I've tried catching the error inside the ReportViewer control's ReportError event, but it seems that event doesn't fire when these errors are generated.
Any help on how to approach this would be highly welcome
View 7 Replies
Jun 23, 2006
I've created a custom Data Processing Extension and I've implemented the IDBCommandAnalysis interface so that my reports can enter parameters and pass them to my Data Processing Extension.
My question is: how do I extract the values from the parameters coming from the report? Where do the parameters get handed off from the report? I can query the Parameters collection, and my report gets prompted in Preview mode to enter something for the parameter, but I can't find the spot where it gets passed for processing.
View 5 Replies
Apr 23, 2015
I have deployed my SSAS project to an Analysis Services database on a SQL 2014 server instance using the "Development" build configuration in Visual Studio. In order to avoid users accessing the development instance of the cube, I've created an additional "Release" build configuration and deployed my project to a different database on the same server. I've also created a production copy of the data source and changed the data source configuration of the production cube to point to this one.
I've provided the same domain service account on the "Impersonation information" tab of the data source for both the development as well as the production instance of the cube. This account has also been granted identical permissions for both data sources. While everything works fine with the development database, processing the production database fails with an error message saying that it cannot connect to its respective data source.
I'm using VS Ultimate 2013.4, SSDT 12.0.50318.0.
View 3 Replies
Apr 12, 2007
I've got an SSRS report that is set up using a data-driven subscription to supply input parameters to the stored procedure that is called to generate the report results.
I was wondering if there is any way to specify the execution processing method (running the reports in parallel or serially). The subscription that we have set up appears to be running all of the reports in parallel which is causing massive load on our servers.
Thanks.
View 3 Replies
Nov 12, 2007
I have installed the Excel data mining add-in and am trying to work through the tutorials.
When I run the 'Analyze Key Influencers' tool against the sample data through a remote AS server, I get:
The task was not able to detect any key influencers for the 'Purchased Bike' column. The values of 'Purchased Bike' seem unrelated to values of other columns.
However, when I run it against a local AS server, I get the expected results.
I can see no differences in settings or setup between the AS instances I am trying to use - perhaps a permissions issue?
Thank you
View 4 Replies
May 5, 2015
I am using a custom data processing extension to call a stored procedure. I am getting the following error when creating a dataset in Report Designer using the extension. I wrote the code in C#.
Could not update a list of fields for the query. Verify that you can connect to the data source and that your query syntax is correct. (Details: Object reference not set to an instance of an object.)
Here is my code
using System;
using System.Collections.Generic;
using System.Text;
using System.IO;
using System.Data;
[Code] .....
View 2 Replies
Sep 11, 2007
Hi All,
I'm facing a strange problem.
I've developed a few reports. They work fine in the development environment, and after successful testing they were published to the web.
In the web version, all reports execute fine the first time. But if I change any of the parameter values, or even without changing them,
when I press "View Report" the following error occurs:
An error has occurred during report processing. (rsProcessingAborted)
Query execution failed for data set 'dsMLGDB2Odbc'. (rsErrorExecutingCommand)
For more information about this error navigate to the report server on the local server machine, or enable remote errors
Please suggest any alternative ways to overcome this issue.
Thanks in advance.
View 11 Replies