Verify Data Source Viable Before Proceeding

Feb 12, 2004

I've completed my first SQL project, for which I've built a DTS Package. The first thing it does is drop all records in the destination table, before importing new records from a txt file and then massaging them.

After I got done, I realized that if the data source is not available for some reason, the records will still be dropped, the process will fail, and the destination table will be left empty. In this case, leaving the existing records intact would be preferable to not having any.

How can I test that the txt file exists before dropping the records?

Thanks,

Randy

ps: Users will maintain a link to the table. I plan to update the table after business hours. If someone happens to have their linked application open while I'm trying to update the table, will it fail?
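One way to guard the delete step (a sketch of mine, not from the thread; it assumes the package can run a T-SQL step first, and the file path shown is a placeholder) is to test for the file with the xp_fileexist extended stored procedure and only clear the table when the file is there:

DECLARE @exists int;
EXEC master.dbo.xp_fileexist 'C:\Imports\source.txt', @exists OUTPUT;

IF @exists = 1
BEGIN
    -- source file is present: safe to clear and reload
    DELETE FROM dbo.DestinationTable;
    -- ... continue with the import and massaging steps ...
END
ELSE
BEGIN
    RAISERROR ('Source file not found; existing records left intact.', 16, 1);
END

Inside a DTS package the same check is often done in an ActiveX Script task using the FileSystemObject's FileExists method, failing that step so the workflow never reaches the delete.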

View 5 Replies


An Error Has Occurred During Report Processing. A Data Source Instance Has Not Been Supplied For The Data Source DetailDS_get_orderdetail

Mar 13, 2008

Hi,

I am trying to build a drill-through report (.rdlc). I have written the following code in the Drillthrough event of the ReportViewer. Whenever I click on the first report I get the following error:

An error has occurred during report processing.
A data source instance has not been supplied for the data source "DetailDS_get_orderdetail".

The code is:



using System;
using System.Data;
using System.Data.SqlClient;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
//using Microsoft.ApplicationBlocks.Data;
using Microsoft.Reporting.WebForms;
using DAC;

public partial class _Default : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        ReportViewer1.Visible = false;
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        DAC.clsReportsWoman obj = new clsReportsWoman();
        DataSet ds = new DataSet();
        ds = obj.get_order();
        ReportViewer1.LocalReport.DataSources.Clear();
        ReportDataSource reds = new ReportDataSource("DataSet1_get_order", ds.Tables[0]);
        ReportViewer1.LocalReport.DataSources.Add(reds);
        ReportViewer1.LocalReport.ReportPath = "C:/Documents and Settings/km63096/My Documents/Visual Studio 2005/WebSites/drillthrurep/Report.rdlc";
        ReportViewer1.LocalReport.Refresh();
        ReportViewer1.Visible = true;
    }

    protected void ReportViewer1_Drillthrough(object sender, DrillthroughEventArgs e)
    {
        DAC.clsReportsWoman obj = new clsReportsWoman();
        ReportParameterInfoCollection DrillThroughValues = e.Report.GetParameters();

        foreach (ReportParameterInfo d in DrillThroughValues)
        {
            Label1.Text = d.Values[0].ToString().Trim();
        }

        LocalReport localreport = (LocalReport)e.Report;
        string order_id = Label1.Text;
        DataSet ds = new DataSet();
        ds = obj.get_orderdetail(order_id);

        ReportViewer1.LocalReport.DataSources.Clear();
        ReportDataSource reds = new ReportDataSource("DetailDS_get_orderdetail", ds.Tables[0]);
        ReportViewer1.LocalReport.DataSources.Add(reds);
        ReportViewer1.LocalReport.ReportPath = Server.MapPath(@"Reportlevel1.rdlc");
        ReportViewer1.LocalReport.Refresh();
    }
}

The code in the method get_orderdetail(order_id) is:

public DataSet get_orderdetail(string order_id)
{
    SqlCommand cmd = new SqlCommand();
    DataSet ds = new DataSet();
    cmd.Parameters.Add("@order_id", SqlDbType.VarChar, 50);
    cmd.Parameters["@order_id"].Value = order_id;
    ds = SQLHelper.ExecuteAdapter(cmd, CommandType.StoredProcedure, "dbo.get_orderdetail");
    return (ds);
}

Please help me.

View 1 Replies View Related

Viable Question

Oct 14, 2006

Why does this always return 'AAA'?

declare @ss char(20)
set @ss = 'AAA'
set @ss = @ss + 'BBB'
print @ss
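A likely explanation (mine, not from the thread): char(20) pads 'AAA' with trailing spaces out to 20 characters, so the variable is already full and the concatenated 'BBB' is cut off when the result is assigned back into it. A minimal sketch of the usual fixes, using varchar or trimming the padding:

-- varchar does not pad, so the concatenation survives the assignment
declare @ss varchar(20)
set @ss = 'AAA'
set @ss = @ss + 'BBB'
print @ss                  -- AAABBB

-- or trim the char variable before concatenating
declare @cc char(20)
set @cc = 'AAA'
print rtrim(@cc) + 'BBB'   -- AAABBB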

View 2 Replies View Related

Verify If The Data Exist. If Not Insert If It Does Update

Jun 21, 2005

I'm using a stored procedure in SQL Server. I use one table to store the hockey statistics of each player in the league. When I submit my form I would like to check whether a statistics sheet for that player already exists. If so, I do an update; if it does not exist, I do the insert. Here is the way I figured it out: one stored procedure to do the insert or update, and one to check whether the player exists in the statistics table. But I just can't find a way to return a good value from my second stored procedure. Does my logic make sense? Any ideas? Thank you.

<code>create procedure Hockey_Player_UpdateForwardStatistics
    @LeagueID int, @GamePlayerID int, @Games int
as
declare @PlayerID int
@PlayerID = Hockey_CheckPlayerStatistics(@LeagueID, @GamePlayerID)
if @PlayerID = 0
begin
    insert into Hockey_PlayerStatistics (Games) values (@Games)
end
ELSE
begin
    update Hockey_PlayerStatistics
    set Games = @Games
    where LeagueID = @LeagueID and PlayerID = @PlayerID
end</code>
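One common pattern (a sketch of mine, assuming Hockey_CheckPlayerStatistics is a stored procedure and that Hockey_PlayerStatistics carries LeagueID and PlayerID columns) is to return the lookup result through an OUTPUT parameter and call the check procedure with EXEC:

create procedure Hockey_CheckPlayerStatistics
    @LeagueID int, @GamePlayerID int, @PlayerID int OUTPUT
as
    -- return 0 when no statistics row exists yet for this player
    select @PlayerID = isnull(
        (select PlayerID
         from Hockey_PlayerStatistics
         where LeagueID = @LeagueID and PlayerID = @GamePlayerID), 0)
go

-- inside Hockey_Player_UpdateForwardStatistics:
declare @PlayerID int
exec Hockey_CheckPlayerStatistics @LeagueID, @GamePlayerID, @PlayerID OUTPUT
if @PlayerID = 0
    insert into Hockey_PlayerStatistics (LeagueID, PlayerID, Games)
    values (@LeagueID, @GamePlayerID, @Games)
else
    update Hockey_PlayerStatistics
    set Games = @Games
    where LeagueID = @LeagueID and PlayerID = @PlayerID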

View 1 Replies View Related

Transact SQL :: How To Update Subsequent Columns If Proceeding Column Is Not Null

Jul 7, 2015

I've got a table with 6 fields :

EmployeeAccess
(MasterID, LoginID, AccessID, Storage1, Storage2, Storage3)
that needs to be updated using the data in the following spreadsheet
NewEmployeeAccessData
(ID, MasterID, AccessID1, LoginID1)

There is a 1:1 relationship between the two tables. I'm trying to code a pair of update statements on the EmployeeAccess table (one for LoginID, one for AccessID) with the following logic:

If LoginID is NULL, then Update LoginID with new LoginID1 value,
If LoginID is not null, and Storage1 is NULL then Update Storage1 with New LoginID1 values
If LoginID is not null, and Storage1 is not NULL and Storage2 is NULL then Update Storage2 with New LoginID1 values
etc etc...

The same applies when trying to populate the AccessID column

If AccessID is NULL, then Update AccessID with new AccessID1 value,
If AccessID is not null, and Storage1 is NULL then Update Storage1 with New AccessID1 values
If AccessID is not null, and Storage1 is not NULL and Storage2 is NULL then Update Storage2 with New AccessID1 values
etc etc.

I have no control over the schema of this table, so I'm trying to work out the logic to update each column in my table only if the corresponding column is NULL, and otherwise to update the first Storage column that is still NULL (a sketch of one approach follows).
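One way to express that fall-through in a single UPDATE per ID column (a sketch of mine; the MasterID join and the three Storage columns are taken from the description above, the rest is assumed) is to fold the conditions into CASE expressions:

UPDATE ea
SET LoginID  = CASE WHEN ea.LoginID IS NULL THEN n.LoginID1 ELSE ea.LoginID END,
    Storage1 = CASE WHEN ea.LoginID IS NOT NULL AND ea.Storage1 IS NULL
                    THEN n.LoginID1 ELSE ea.Storage1 END,
    Storage2 = CASE WHEN ea.LoginID IS NOT NULL AND ea.Storage1 IS NOT NULL AND ea.Storage2 IS NULL
                    THEN n.LoginID1 ELSE ea.Storage2 END,
    Storage3 = CASE WHEN ea.LoginID IS NOT NULL AND ea.Storage1 IS NOT NULL AND ea.Storage2 IS NOT NULL AND ea.Storage3 IS NULL
                    THEN n.LoginID1 ELSE ea.Storage3 END
FROM dbo.EmployeeAccess AS ea
JOIN dbo.NewEmployeeAccessData AS n ON n.MasterID = ea.MasterID;

-- a second, analogous UPDATE applies the same pattern with AccessID and AccessID1

All of the CASE expressions see the pre-update values, so only the first column that qualifies actually changes.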

View 7 Replies View Related

Can I Tell The SQL Server Agent To Wait For A Batch File To Finish Before Proceeding?

May 29, 2008

I'm setting up a Maintenance Plan to do a backup for me in the night of my SQL Databases but before the backup starts I have created a SQL Server Agent Job that runs a DOS batch (.bat) file using CmdExec. The problem is I need this process to finish processing before the rest of the tasks run in the Maintenance Plan. It seems like SQL Server Agent Job Task tells the batch file to start running (and it runs fine) but it just continues to the next task and does not wait for it to finish.

Is there any way I can configure this to wait before proceeding?

Thanks, Mark

View 7 Replies View Related

SQL Server 2008 :: Verify Partition Data Load Syntax

Feb 28, 2015

What is the syntax to verify that the partition data is loaded into the correct partition.
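There is no single built-in statement for this, but a common check (a sketch; the partition function and table names below are placeholders) is to group the loaded rows by the $PARTITION function and compare the counts with sys.partitions:

-- rows per partition as seen through the partition function
SELECT $PARTITION.pf_MyPartitionFunction(PartitionKeyColumn) AS partition_number,
       COUNT(*) AS row_count
FROM dbo.MyPartitionedTable
GROUP BY $PARTITION.pf_MyPartitionFunction(PartitionKeyColumn)
ORDER BY partition_number;

-- the same information from the catalog views
SELECT partition_number, rows
FROM sys.partitions
WHERE object_id = OBJECT_ID('dbo.MyPartitionedTable')
  AND index_id IN (0, 1)   -- heap or clustered index
ORDER BY partition_number;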

View 0 Replies View Related

SQL Server 2008 :: Verify Partition Data Load Syntax?

Mar 2, 2015

What is the syntax to verify Partition data load.

View 1 Replies View Related

Report Footer - Need Viable Workaround

Oct 18, 2007

After digging for some time now into the "guts" of SSRS, I am wondering if anyone out there has any ideas which might help me at this point.

I am trying to write an Invoice report.

Each report can have 1 to n invoices on it.
Each invoice can have 1 to n line items (spanning several pages for the larger ones)
Each page must have a fixed header and footer with account and payment information on it (the page header and page footer work OK for this).

And here is the problem: each invoice must also include 1 to n images at the end of the report, two to a page, filling an entire 8.5 by 11 inch page (spanning many pages when many line items exist).

Since the report already has a page header and footer with the report detail stuffed in a table in the middle of the page (report body), I am stuck.

I have read several posts that talk about having a can-grow container with a subreport in the existing footer, but I can't even come close to getting this to work. My footer would have to take up the entire page and contain nothing but a subreport.

I can not provide a link to the images in the report, as each report must print in its entirety without user involvement (no drilling down).

I am thinking that my report is too complex for SSRS at this time. I would love to be proven wrong by someone on this forum.

Thanks for any and all replies.

John

View 2 Replies View Related

Is My Database Connection Alive And Viable?

Jan 23, 2007



Currently, our application issues a 'select count(*) from systypes' command to check whether the database connection is live (has not been reaped by the firewall). This is somewhat inefficient. Perhaps a less expensive way to implement this functionality is to issue a ping statement to the instance? At the least, a simpler SQL query is needed.

I'm sure this function is typical in application architecture. What recommendations do you have for confirming that a database connection is alive and viable from the application?
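For comparison, a minimal liveness probe (a sketch of mine, not from the thread) avoids the catalog entirely and just sends the lightest possible batch, treating any connection-level exception as a dead connection:

-- cheapest round trip: no table access, a single constant result
SELECT 1 AS alive;

Connection-pool "validation query" settings commonly use exactly this statement.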



Stephanie

View 1 Replies View Related

The SSIS Data Flow Task Could Not Be Created. Verify That DTSPipeline.dll Is Available And Registered.

Jul 19, 2007

Hi all,



We are running SQL Server 2005 Standard edition on a Vista Business machine. Whenever we try to use DTS to import or export data we get this message:



The SSIS Data Flow Task could not be created. Verify that DTSPipeline.dll is available and registered.



Of course we have explicitly registered this dll and others.



Also we reinstalled SQL Server (and did it again. And did it again).

And we did the same for Visual Studio Standard (and again).



Any ideas in this elaborate group?



Thanks,

Tjerk

View 5 Replies View Related

ODBC Connect From MS Access To SQL Server 2000-Viable?

Jul 20, 2005

How viable is it to use MS Access as a front end (via ODBC) to a SQL Server 2000 database? The users would access it via the internet using a Netgear VPN setup.

Thanks,
Paul S

View 1 Replies View Related

Amo And Creating Data Source And Data Source View Code

Feb 2, 2008

Hi,
In this code, how can I create a new data source, a new data source view, a mining model, and a mining structure so that it runs dynamically? I am getting a lot of errors because the server and database are not defined in the current code. Should I define the server first?

Database dbNew = new Database (databaseName,
Utils.GetSyntacticallyValidID(databaseName, typeof(Database)));
srv.Databases.Add(dbNew);
dbNew.Update(true);
***********************************************************

How can I create the data source, data source view, mining model, and mining structure?
Please show the code for that and guide me.
databaseName and srv are unknown.
Do I need to add another reference for Analysis Services?
Please explain these code fragments:
************************************************************************
1)
RelationalDataSource dsNew = new RelationalDataSource(
datasourceName,
Utils.GetSyntacticallyValidID(
datasourceName,
typeof(RelationalDataSource)));

db.DataSources.Add(dsNew);
dsNew.ConnectionString = connectionString;

dsNew.Update();
2)

RelationalDataSourceView rdsv;

rdsv = db.DataSourceViews.Add(
datasourceviewName,
Utils.GetSyntacticallyValidID(
datasourceviewName,
typeof(RelationalDataSourceView)));

rdsv.DataSourceID = ds.ID;
***************************************************************
3)
OleDbConnection cn = new OleDbConnection(ds.ConnectionString);
OleDbCommand cmd = new OleDbCommand(
"SELECT * FROM [" + tableName + "] WHERE 0=1", cn);
OleDbDataAdapter ad = new OleDbDataAdapter(cmd);

DataSet dss = new DataSet();
ad.FillSchema(dss, SchemaType.Source);

*************************************************************
4)

// Make sure we have the name we thought

dss.Tables[0].TableName = tableName;

// Clone here - the original DataTable already belongs to a DataSet
rdsv.Schema.Tables.Add(dss.Tables[tableName].Clone());
rdsv.Update();


5)

MiningStructure ms = db.MiningStructures.Add(miningstructureName, Utils.GetSyntacticallyValidID(miningstructureName,
typeof(MiningStructure)));
ms.Source = new DataSourceViewBinding(dsv.ID);
ms.CaseTableName = "Customer";

Add columns:
ScalarMiningStructureColumn smsc;

// From table "Customer" we will add a couple of columns
// CustomerID - key
smsc = new ScalarMiningStructureColumn("Customer ID",
Utils.GetSyntacticallyValidID("Customer ID", typeof(ScalarMiningStructureColumn)));
smsc.IsKey = true;
smsc.Content = "Key";
smsc.KeyColumns.Add("Customer", "customer_id", OleDbType.Integer);
ms.Columns.Add(smsc);

*******************************************
6)

MiningModel mm = ms.MiningModels.Add(miningmodelName,
Utils.GetSyntacticallyValidID(miningmodelName,
typeof(MiningModel)));


mm.Algorithm = "Microsoft_Decision_Trees";
mm.Parameters.Add("COMPLEXITY_PENALTY", 0.3);


MiningModelColumn mc = new MiningModelColumn("Customer ID",
Utils.GetSyntacticallyValidID("CustomerID",
typeof(MiningModelColumn)));
mc.SourceColumnID = ms.Columns["Customer ID"].ID;
mc.Usage = "Key";
mm.Columns.Add(mc);


mm.Update();

Please tell me exactly what I need.
Thanks a lot for your answer.
Please don't move this question, because I don't know where else I should post it.

View 1 Replies View Related

How Do I Add An ODBC Connection Data Source As A Data Flow Source

Mar 2, 2007

I have set up a new connection as a connection from data source, but I cannot see how to use this connection to create my Data Flow Source. I have tried using an OLE DB connection, but this is painfully slow! The process of loading 10,000 rows takes 14 - 15 minutes. The same process in Access using SQL on a linked table via DSN takes 45 seconds.

Have I missed something in my set up of the OLE DB source / connection? Will a DSN source be faster?

Thanks in advance

ADG

View 2 Replies View Related

Create Data Source, Data Source View From XML?

May 8, 2007

hi everybody,
I want to create a data source and a data source view for data mining using C#.
I have created a data source and a data source view and exported them to XML files, but when I move to another computer and run those XML files, I get an error when I run the statement to create and build the mining model. What do I need to change in the XML, or how can I run the XML successfully on another computer? And if I have to rebuild the data source and data source view from scratch, how do I do that?

Thanks for your help.

View 1 Replies View Related

Cannot Create A Connection To Data Source 'data Source Name'

Dec 11, 2007

Today I was making a few reports.
When I tested the reports in Visual Studio, they worked great: I got the expected result.
But when I deployed the reports to our reportserver the problem started.
When I click on the directory in which my reports are deployed, I got my 4 reports.
Up to this point everything worked correctly.
But when I click on a report to view the results it went wrong.
I got an error:
"Cannot create a connection to data source 'Live'"
(Live is the name of our data source).

We are using Windows logins and I am sure that I have all the rights on the server; I gave myself 'sysadmin' rights, so it should work.
I also have tried it with all the roles assigned on my account, but then it still won't work.

When I modify the data source and set it to another server and database, it works.
The data source 'Live' is on an x64 MS SQL server, and the other data source is on an x86 MS SQL server.
Maybe that is the problem?

Can someone tell me what is wrong?

View 1 Replies View Related

HELP: A Data Source Instance Has Not Been Supplied For The Data Source

May 20, 2007

Hi there,



I'm trying to build a report for Windows Forms (C#) using .rdlc files.

But every time I run it, I get this error message. I've followed the step-by-step instructions from MSDN... but it doesn't work!



Could anyone help me?!





Tks a lot!



Luis Antonio - Brazil

View 1 Replies View Related

Loading Data Using Ole Db Source With Input Source Being A View

Dec 13, 2007

I was trying to load data using SSIS, Data Flow Task, OLE DB Source (the source was a view) to an OLE DB Destination (SQL Server). This view returns 420,591 rows from Query Analyzer in 21 seconds. Row length is 925. When I tried to execute the Data Flow Task from SSIS, I had to stop the process after 30 minutes, because only 2,000 rows had been retrieved. I modified the view to return TOP 440,000 and reran. This time all 420,591 rows were retrieved and written in 22 seconds. Next, I tried to use TOP 100 PERCENT. Again, only 2,000 rows were returned after 30 minutes. TempDB is on a separate SAN RAID group with 200 gig free, the databases are on a separate drive with 200 gig free. The server has 13 gig of memory and no other processes were executing.

The only way I could populate the table was by using an Execute SQL Task from SSIS and hard-coding an INSERT into the table selecting data from the view (35 seconds), along the lines of the sketch below.
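For reference, the Execute SQL Task workaround described above amounts to a plain set-based insert (the table and view names here are placeholders, not from the thread):

INSERT INTO dbo.DestinationTable (Col1, Col2, Col3)
SELECT Col1, Col2, Col3
FROM dbo.SourceView;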

Has anyone else experienced this or a similar issue? Does anyone have a solution or explanation?

View 13 Replies View Related

Pipeline Error-excel Source-data Reader Does Not Read In Meta Data

Apr 16, 2008

Hi all, i got this error:


[DTS.Pipeline] Error: "component "Excel Source" (1)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".

and also this:

[Excel Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "Fiscal Week" needs to be updated in the external metadata column collection. The column "Fiscal Year" needs to be updated in the external metadata column collection. The column "1st level" needs to be added to the external metadata column collection. The column "2nd level" needs to be added to the external metadata column collection. The column "3rd level" needs to be added to the external metadata column collection. The "external metadata column "1st Level" (16745)" needs to be removed from the external metadata column collection. The "external metadata column "3rd Level" (16609)" needs to be removed from the external metadata column collection. The "external metadata column "2nd Level" (16272)" needs to be removed from the external metadata column collection.


I tried going to the data flow -> Excel connection -> Advanced Editor for the Excel source -> Input and Output Properties and tried to refresh the affected columns.
It seems that somehow the three columns are not read in from the source file, and Fiscal Year and Fiscal Week are not set up properly in my data destination.
Has anyone faced such errors before?

Thanks

View 13 Replies View Related

XML Data Source .. Expression? Variable? Connection? Error: Unable To Read The XML Data.

Feb 23, 2008

RE: XML Data source .. Expression? Variable? Connection? Error: unable to read the XML data.

I want my XML Data source to be an expression as i will be looping through a directory of xml files.

I don't see the expression property or the connection property??

I tried setting the XMLData property to @[User::filename], but that results in:

Information: 0x40043006 at Load XML Files, DTS.Pipeline: Prepare for Execute phase is beginning.
Error: 0xC02090D0 at Load XML Files, XML Source [108]: The component "XML Source" (108) was unable to read the XML data.
Error: 0xC0047019 at Load XML Files, DTS.Pipeline: component "XML Source" (108) failed the prepare phase and returned error code 0xC02090D0.
Information: 0x4004300B at Load XML Files, DTS.Pipeline: "component "OLE DB Destination" (341)" wrote 0 rows.
Task failed: Load XML Files
Information: 0xC002F30E at Bad, File System Task: File or directory "d:jcpxmlLoadjcp2.xml.bad" was deleted.
Warning: 0x80019002 at Package: The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "Package.dtsx" finished: Failure.
The program '[3312] Package.dtsx: DTS' has exited with code 0 (0x0).


Thanks for any help or information.

View 3 Replies View Related

Power Pivot :: Structural Data Model Changes In Data Source Leads To Errors

Oct 12, 2015

I have a question about how to handle structural data model changes in a data source of PowerPivot. Suppose I'm developing a star model in SQL Server and sometimes a data type changes or the name of a field changes in a table. It seems to me that PowerPivot does not handle this as gracefully as Analysis Services MD does (mostly). I received an error because of a wrong field name, or even no error when a data type changes, in PowerPivot. Is this common, or am I doing something wrong here? Does this mean that every time the data model changes the PowerPivot model should be recreated? Or am I missing the clue here?

View 6 Replies View Related

SQL 2012 :: SSRS Data Shared Data Source Connection Login

Jan 12, 2015

I was wondering whether there is a best-practice minimum set of permissions for a SQL login to use when setting up a new shared data source for SSRS Report Manager?

Something along the lines of the login being a data reader on the DB plus permission to update tempdb?

Would have thought it not advisable to have the login be able to update the main db...

View 1 Replies View Related

Reporting Services :: Data Not Available For Selected Values To Be Set Based On Data Source

Jul 10, 2015

I have 4 tablixes; 2 of them get data from Server 1 and the other 2 get data from Server 2. I have set NoRowsMessage to "=Data Not Available for the Selected Values" for all 4 tablixes. Now, if data is not available from Server 1, I must show "Data Not Available for the Selected Values" only once in the output, but it currently appears twice because of the 2 tablixes that had no rows. Similarly, if data is not available from Server 2, it should show "Data Not Available for the Selected Values" only once in my output. If data is not available from all the tablixes, it should likewise show "Data Not Available for the Selected Values" only once in the report output.

View 3 Replies View Related

Truncation Of String Data With Data Reader Source Connecting To ODBC DSN

Mar 18, 2007

A data reader is using a connection manager to connect to an ODBC System DSN. A query in the SqlCommand property is provided. Data is being truncated in the only string column. The data type in the data reader output --> external columns shows as Unicode string [DT_WSTR], length 7.

The truncated output in a text file is the first 3 characters from left to right . Changing the column order has no effect.



A linked server was created in SQL Server Management Studio to test the ODBC System DSN using the following:

EXEC sp_addlinkedserver
@server = 'server_name',
@srvproduct = '',
@provider = 'MSDASQL',
@datasrc = 'odbc_dsn_name'

Data returned using OPENQUERY does not truncate the string column, indicating that the ODBC driver returns data as expected with SQL 2005, but not with the Data Reader.
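For reference, the OPENQUERY test against the linked server created above would look something like this (the table and column names are placeholders):

SELECT string_column
FROM OPENQUERY(server_name, 'SELECT string_column FROM some_table');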

Any assistance would be appreciated.

Thanks,

View 3 Replies View Related

SQL 2012 :: SSIS Data Flow Items Tab Missing For Adding Data Source / Destination

Apr 3, 2014

Inside an SSIS 2012 project I need to use a newly installed SSIS component, but in SSDT 2010 I cannot see the SSIS Data Flow Items tab for adding a data source/data destination in the Choose Toolbox Items pane.

View 4 Replies View Related

Problem Loading Data From FlatFile Source Data For Column Overflowed The Disk I/O Buffer

Sep 10, 2007



Hi, I am trying to do a straightforward load from a flat file source. I have defined the columns according to the lengths defined in the data dictionary provided, but when I try to run the task I encounter this error:

The column data for column "Column 20" overflowed the disk I/O buffer.

I tried to add another column 21 at the end and truncate it or leave that column unmapped to the destination, but the same problem occurs for column 21. What should I do to overcome this?

In the case of bad data, how do I clean up the source? Please help me with this.








View 5 Replies View Related

Missing Data With SSAS Cube As A Report Data Source

May 9, 2006

I've got a report that is using a cube as a data source and I can't get the report to show all the data. Only data at the lowest level of the cube is displayed. The problem is that most of the data I'm concerned with is at higher levels. There's no problem with the MDX. I get the correct results when I run the query.

I'm using a table to show the results. I've also tried a matrix, but I get the same results. I'm using SSRS 2005 and SSAS 2000.

Anyone have experience with this? Am I missing something simple?

View 7 Replies View Related

Selecting A Single Data Source From Multiple Data Sources

Aug 4, 2006

Hi,

I am pretty new to SSIS. I am trying to create a package which can accept data in any of several formats, i.e. CSV, Excel, or a SQL Server database/table, and import the data into my destination database.

So far i've managed to get this working OK. However I am now TOTALLY stuck. I'm currently trying to just concentrate on the data sources being a CSV (using a Flat File Data Source) and/or an Excel Spreadsheet.

I can get the data in and to my destination using a UNION ALL component and mapping the data sources to it so long as both the CSV file and the Excel spreadsheet exist.

My problem is that I need my package to handle the possibility that only the CSV file exists and there is no Excel spreadsheet. In that case I'd like the package to ignore the Excel data source completely. Currently, if either of my data sources does not exist I get errors and the package terminates.

Is there any way in SSIS that I can check all my data sources to see which ones exist (i.e. are valid)? If they exist I want to use them; if one doesn't exist I'd like to discard it (without error - as long as there is a single data source the package should run).

I've tried using the AcquireConnection method in a script task on each of my connections, hoping that it would error if the file/data source did not exist. It doesn't, though (in the case of an Excel data source it just creates an empty Excel file for me).

The only other option I can come up with are to have seperate packages depending on the type of data we want to import and then run a particular package depending on the format of the source data. This seems a bit long winded. I am pretty sure I must be able to do what I want to achieve but I can't work out how.

I'll be grateful to anyone who can send me any tips/hints/links on how I can achieve this.

Many thanks

Rob Gibson

View 5 Replies View Related

T-SQL (SS2K8) :: Load Data From Flat File Source Into OleDB Destination By Changing Data Types In SSIS

Apr 16, 2014

I have a source file and I have to load it into the database, changing the data types of the columns, in SSIS.

View 1 Replies View Related

Set The Data Source Of Data Flow From External Application (C#)

Jan 11, 2006

I am new to SSIS programming, so bear with me if my question seems naive to you gurus. I have a situation that needs to set the data source for a data flow from an external .NET application ('external' meaning that the application will run in a different process than the SSIS package). I am trying to set the data source on which the data flow works from my C# application, in DataSet form. The ideal solution is not to save the DataSet to any file on disk (I know that would work, but it has the overhead of writing, reading and managing the temp file). What I want to achieve is that the business logic of picking data for the SSIS data flow to process is controlled inside my C# application; the data flow just does what it does best - transformation. Has any of you successfully done this before? Thanks!

View 1 Replies View Related

Data Source Problem With Data-driven Subscription

Aug 24, 2007

I have a stored procedure that receives an input parameter of ReportID and then sends back the appropriate dataset to the data-driven subscription. In the stored procedure, some concatenated SQL is used to produce the final select statement to send to the subscription, but it is generating the following error:










The dataset cannot be generated. An error occurred while connecting to a data source, or the query is not valid for the data source. (rsCannotPrepareQuery) Get Online Help




Insert Error: Column name or number of supplied values does not match table definition.
Because there are some optional columns that need to conditionally appear based on the report, I can't avoid concatenation. My only other option is to have hundreds of individual tables to source all our reports, which I don't have time to manage.

Please let me know how I can allow the data-driven subscription setup process to discover the columns for my concatenated SQL, so I can set up my schedules based on the single stored procedure with all my logic nested in my table structure.
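One hedged workaround (my suggestion, not from the thread) is to give the procedure a fixed result shape regardless of which optional columns apply, so the subscription wizard always discovers the same column list: always select every possible column and pad the ones that don't apply with typed NULLs. All names below are placeholders:

-- inside the stored procedure, after deciding which optional columns apply
SET @sql = N'SELECT ReportID, RecipientEmail, ' +
           CASE WHEN @IncludeRegion = 1
                THEN N'Region'
                ELSE N'CAST(NULL AS varchar(50)) AS Region'
           END +
           N' FROM dbo.SubscriptionData WHERE ReportID = @ReportID';

EXEC sp_executesql @sql, N'@ReportID int', @ReportID = @ReportID;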

Thanks.

View 7 Replies View Related

Data Access :: Initialization Of Data Source Failed

Sep 23, 2015

I'm new to SQL Server. Last week I made my first cube. Everything went fine. But when I try to explore the cube in Excel (from the explore panel in Visual Studio 2012), I get the error "Initialization of the data source failed".

Check your database server or contact your database administrator. Make sure the external database is available, and then try the operation again. If you see this message again, create a new datasource to connect to the database.

I'm using Sql Server 2012. Visual studio 2012. Excel 2013 on Windows 10

View 3 Replies View Related

Best Practice - Data Sources And Data Views Vs OLE DB Source

Feb 26, 2008

Hi, I'm wondering which is the best way to read data from SQL Server.
I can reach the data using Data Sources and Data Views, and also with an OLE DB Source with a data access mode of Named Query.
I have to write the data to a flat file. So, does anyone know which is the best practice for this? Or are both of them good choices?
Thanks for your help.

Beli

View 6 Replies View Related






