XML Source Drops Data - Can I Fix In XSD?

Jul 19, 2006

XML looks like this:

<?xml version="1.0" standalone="yes" ?>
<hist key="ABC">
     <r date="2006/04/21" time="08:53:04" seq="1029">123</r>
     <r date="2006/04/21" time="09:21:40" seq="1613">123.25</r>
     <r date="2006/04/21" time="09:37:22" seq="  89">194.21</r>
     <r date="2006/04/21" time="09:37:22" seq="  91">194.21</r>
     <r date="2006/04/21" time="09:37:22" seq="  93">194.22</r>
     <r date="2006/04/21" time="09:37:22" seq="  95">194.22</r>
</hist>

In SSIS it reads all the date, time, and seq values, but I lose that super-critical "key" attribute in the output data flow. If I look at the XSD in Visual Studio, it shows two tables, "hist" and "r", related via a "nested hierarchy", with "key" as an attribute of the hist table and all the rest as attributes of the r table. However, SSIS only shows one table, so I can't get at that key :(

Any idea how to remedy this? I keep messing around with the XSD file, but that hasn't gotten me anywhere thus far...
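One workaround I'm considering in case the XSD route stays a dead end: flatten the file before the XML Source ever sees it, stamping the hist-level key onto each row element. A minimal C# sketch (file paths are placeholders, not my real feed):

// Copy the hist-level "key" attribute down onto each <r> element so the
// single table SSIS exposes carries it as an ordinary column.
using System.Xml;

class FlattenKey
{
    static void Main()
    {
        XmlDocument doc = new XmlDocument();
        doc.Load(@"C:\feeds\hist.xml");

        string key = doc.DocumentElement.GetAttribute("key");
        foreach (XmlElement r in doc.DocumentElement.SelectNodes("r"))
            r.SetAttribute("key", key);

        doc.Save(@"C:\feeds\hist_flat.xml");
    }
}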

Thanks

View 9 Replies



DTS Import Of MDB In SQL Server 2000 Drops Memo Field Data

May 22, 2006

I have used DTS in SQL Server 2000 to import a table from an MDB file (MS Access). When the table is imported, the primary key is lost and the memo field data is completely gone.

I use the transformation option in the DTS wizard to add the primary key and make sure the data type for the memo field is varchar with a size of 8000. I need that large size since I am storing lots of HTML code.

When I preview the data, I see the HTML code that is supposed to be imported. However, when I return all rows from the table in Enterprise Manager, the field is empty.

So I tried to manually copy the data from the Access database into SQL Server. I could not figure out whether SQL Server has an interface like Access for simply copying data into a table, so I linked the SQL Server tables from within Access.

When I opened the linked table, I saw the data in the description field. However, if I return the rows from within SQL Server, no data is present.

I have some ASP code trying to read the data in the SQL Server table. The SQL statement returns all rows, and all the other data is present, but nothing comes back in the description field.

What am I doing wrong? Any suggestions anyone, please!
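For reference, a rough sketch of the manual copy I was attempting, with placeholder table, column, and server names; the memo column is read through the Jet OLE DB provider and written to SQL Server with a parameterized insert:

// Sketch only: pull the memo column out of the MDB and push it into the
// SQL Server table row by row. All names here are placeholders.
using System.Data;
using System.Data.OleDb;
using System.Data.SqlClient;

class CopyMemo
{
    static void Main()
    {
        using (OleDbConnection src = new OleDbConnection(
            @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\site.mdb"))
        using (SqlConnection dest = new SqlConnection(
            "Server=myServer;Database=myDb;Integrated Security=SSPI"))
        {
            src.Open();
            dest.Open();

            OleDbCommand read = new OleDbCommand(
                "SELECT Id, Description FROM Content", src);

            SqlCommand write = new SqlCommand(
                "INSERT INTO Content (Id, Description) VALUES (@id, @desc)", dest);
            write.Parameters.Add("@id", SqlDbType.Int);
            write.Parameters.Add("@desc", SqlDbType.VarChar, 8000);

            using (OleDbDataReader r = read.ExecuteReader())
            {
                while (r.Read())
                {
                    write.Parameters["@id"].Value = r.GetValue(0);
                    write.Parameters["@desc"].Value = r.GetValue(1);
                    write.ExecuteNonQuery();
                }
            }
        }
    }
}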

TIA

View 1 Replies View Related

An Error Has Occurred During Report Processing. A Data Source Instance Has Not Been Supplied For The Data Source DetailDS_get_orderdetail

Mar 13, 2008

Hi,

I am trying to build a drill-through report (.rdlc). I have written the following code in the Drillthrough event of the ReportViewer, and whenever I click on the first report I get an error like:

An error has occurred during report processing.
A data source instance has not been supplied for the data source "DetailDS_get_orderdetail".

The code is:

using System;
using System.Data;
using System.Data.SqlClient;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
//using Microsoft.ApplicationBlocks.Data;
using Microsoft.Reporting.WebForms;
using DAC;

public partial class _Default : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        ReportViewer1.Visible = false;
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        DAC.clsReportsWoman obj = new clsReportsWoman();
        DataSet ds = obj.get_order();

        ReportViewer1.LocalReport.DataSources.Clear();
        ReportDataSource reds = new ReportDataSource("DataSet1_get_order", ds.Tables[0]);
        ReportViewer1.LocalReport.DataSources.Add(reds);
        ReportViewer1.LocalReport.ReportPath =
            "C:/Documents and Settings/km63096/My Documents/Visual Studio 2005/WebSites/drillthrurep/Report.rdlc";
        ReportViewer1.LocalReport.Refresh();
        ReportViewer1.Visible = true;
    }

    protected void ReportViewer1_Drillthrough(object sender, DrillthroughEventArgs e)
    {
        DAC.clsReportsWoman obj = new clsReportsWoman();
        ReportParameterInfoCollection DrillThroughValues = e.Report.GetParameters();

        foreach (ReportParameterInfo d in DrillThroughValues)
        {
            Label1.Text = d.Values[0].ToString().Trim();
        }

        LocalReport localreport = (LocalReport)e.Report;
        string order_id = Label1.Text;
        DataSet ds = obj.get_orderdetail(order_id);

        ReportViewer1.LocalReport.DataSources.Clear();
        ReportDataSource reds = new ReportDataSource("DetailDS_get_orderdetail", ds.Tables[0]);
        ReportViewer1.LocalReport.DataSources.Add(reds);
        ReportViewer1.LocalReport.ReportPath = Server.MapPath(@"Reportlevel1.rdlc");
        ReportViewer1.LocalReport.Refresh();
    }
}

The code in the get_orderdetail(order_id) method is:

public DataSet get_orderdetail(string order_id)
{
    SqlCommand cmd = new SqlCommand();
    cmd.Parameters.Add("@order_id", SqlDbType.VarChar, 50);
    cmd.Parameters["@order_id"].Value = order_id;
    DataSet ds = SQLHelper.ExecuteAdapter(cmd, CommandType.StoredProcedure, "dbo.get_orderdetail");
    return ds;
}

Please help me.
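One thing I have not tried yet, in case it is the missing piece: supplying the data through the drill-through report instance the event raises (e.Report), rather than through ReportViewer1.LocalReport. A minimal sketch of that variant:

// Sketch only: bind the data source to the drill-through report's own
// LocalReport object instead of the viewer's top-level report.
protected void ReportViewer1_Drillthrough(object sender, DrillthroughEventArgs e)
{
    LocalReport drillReport = (LocalReport)e.Report;

    DAC.clsReportsWoman obj = new clsReportsWoman();
    // Label1.Text holds the drill-through parameter, as in the code above.
    DataSet ds = obj.get_orderdetail(Label1.Text);

    drillReport.DataSources.Clear();
    drillReport.DataSources.Add(
        new ReportDataSource("DetailDS_get_orderdetail", ds.Tables[0]));
    drillReport.Refresh();
}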

View 1 Replies View Related

Amo And Creating Data Source And Data Source View Code

Feb 2, 2008

Hi,
In this code, how can I create a new data source, data source view, mining model, and mining structure so that it all runs dynamically?
In this code I get a lot of errors, because the server and database are not defined in the current code.
Should I define the server first in this code, or not?
Database dbNew = new Database (databaseName,
Utils.GetSyntacticallyValidID(databaseName, typeof(Database)));
srv.Databases.Add(dbNew);
dbNew.Update(true);
***********************************************************

How can I create the data source, data source view, mining model, and mining structure?
Please show the code for that and guide me; databaseName and srv are undefined.
Do I need to add another reference besides Analysis Services?
Please explain these code fragments:
************************************************************************
1)
RelationalDataSource dsNew = new RelationalDataSource(
datasourceName,
Utils.GetSyntacticallyValidID(
datasourceName,
typeof(RelationalDataSource)));

db.DataSources.Add(dsNew);
dsNew.ConnectionString = connectionString;

dsNew.Update();
2)

RelationalDataSourceView rdsv;

rdsv = db.DataSourceViews.Add(
datasourceviewName,
Utils.GetSyntacticallyValidID(
datasourceviewName,
typeof(RelationalDataSourceView)));

rdsv.DataSourceID = ds.ID;
***************************************************************
3)
OleDbConnection cn = new OleDbConnection(ds.ConnectionString);
OleDbCommand cmd = new OleDbCommand(
"SELECT * FROM [" + tableName + "] WHERE 0=1", cn);
OleDbDataAdapter ad = new OleDbDataAdapter(cmd);

DataSet dss = new DataSet();
ad.FillSchema(dss, SchemaType.Source);

*************************************************************
4)

// Make sure we have the name we thought

dss.Tables[0].TableName = tableName;

// Clone here - the original DataTable already belongs to a DataSet
rdsv.Schema.Tables.Add(dss.Tables[tableName].Clone());
rdsv.Update();


5)

MiningStructure ms = db.MiningStructures.Add(miningstructureName, Utils.GetSyntacticallyValidID(miningstructureName,
typeof(MiningStructure)));
ms.Source = new DataSourceViewBinding(dsv.ID);
ms.CaseTableName = "Customer";

Add columns:
ScalarMiningStructureColumn smsc;

// From table "Customer" we will add a couple of columns
// CustomerID - key
smsc = new ScalarMiningStructureColumn("Customer ID",
Utils.GetSyntacticallyValidID("Customer ID", typeof(ScalarMiningStructureColumn)));
smsc.IsKey = true;
smsc.Content = "Key";
smsc.KeyColumns.Add("Customer", "customer_id", OleDbType.Integer);
ms.Columns.Add(smsc);

*******************************************
6)

MiningModel mm = ms.MiningModels.Add(miningmodelName,
Utils.GetSyntacticallyValidID(miningmodelName,
typeof(MiningModel)));


mm.Algorithm = "Microsoft_Decision_Trees";
mm.Parameters.Add("COMPLEXITY_PENALTY", 0.3);


MiningModelColumn mc = new MiningModelColumn("Customer ID",
Utils.GetSyntacticallyValidID("CustomerID",
typeof(MiningModelColumn)));
mc.SourceColumnID = ms.Columns["Customer ID"].ID;
mc.Usage = "Key";
mm.Columns.Add(mc);


mm.Update();

Please say exactly what I need.
Thanks a lot for your answer.
Please don't move this question, because I don't know where else I should write it.
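For the undefined srv and databaseName in the fragments above, a minimal connection sketch; the server name is a placeholder, and the only extra reference this should need is Microsoft.AnalysisServices.dll (AMO):

// Sketch: connect to Analysis Services first, then create the database
// that the data source, data source view, and mining objects hang off.
using Microsoft.AnalysisServices;

class AmoSetup
{
    static void Main()
    {
        Server srv = new Server();
        srv.Connect("Data Source=localhost");    // placeholder server name

        string databaseName = "MyMiningDb";      // placeholder database name
        Database dbNew = new Database(databaseName,
            Utils.GetSyntacticallyValidID(databaseName, typeof(Database)));
        srv.Databases.Add(dbNew);
        dbNew.Update();

        // ...create the data source, DSV, structure, and model here...

        srv.Disconnect();
    }
}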

View 1 Replies View Related

How Do I Add An ODBC Connection Data Source As A Data Flow Source

Mar 2, 2007

I have set up a new connection as a connection from a data source, but I cannot see how to use this connection to create my data flow source. I have tried using an OLE DB connection, but this is painfully slow! The process of loading 10,000 rows takes 14-15 minutes. The same process in Access, using SQL on a linked table via a DSN, takes 45 seconds.

Have I missed something in my setup of the OLE DB source/connection? Would a DSN source be faster?

Thanks in advance

ADG

View 2 Replies View Related

Create Data Source, Data Source View From XML?

May 8, 2007

Hi everybody,
I want to create a data source and data source view for data mining, using C#. I have created the data source and data source view and exported them to an XML file, but when I move to another computer and run those XML files, an error is returned when I run the statements to create and build the mining model. What can I change in the XML, or how can I run the XML successfully on another computer? And if I have to rebuild the data source and data source view there, how do I do that?
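For reference, this is roughly how I save an object to XML and read it back; I am assuming Utils.Serialize and Utils.Deserialize from AMO are the right round-trip calls, and I suspect the connection string embedded in the serialized data source has to be edited for the other machine:

// Sketch, under the assumption that AMO's Utils.Serialize/Deserialize are
// the correct round-trip calls; the saved XML carries machine-specific
// settings such as the connection string.
using System.Xml;
using Microsoft.AnalysisServices;

class DataSourceXml
{
    static void Save(RelationalDataSource ds, string path)
    {
        using (XmlWriter w = XmlWriter.Create(path))
        {
            Utils.Serialize(w, ds, false);
        }
    }

    static RelationalDataSource Load(string path)
    {
        using (XmlReader r = XmlReader.Create(path))
        {
            return (RelationalDataSource)Utils.Deserialize(r, new RelationalDataSource());
        }
    }
}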

thanks for you helps

View 1 Replies View Related

Cannot Create A Connection To Data Source 'data Source Name'

Dec 11, 2007

Today I was making a few reports.
When I tested the reports in Visual Studio, they worked great: I got the expected results.
But when I deployed the reports to our report server, the problems started.
When I click on the directory in which my reports are deployed, I get my 4 reports.
Up to that point everything worked correctly.
But when I click on a report to view the results, it goes wrong. I get an error:
"Cannot create a connection to data source 'Live'"
(Live is the name of our data source).

We are using Windows logins, and I am sure that I have all the rights on the server; I gave myself the 'sysadmin' role, so it should work.
I have also tried it with all the roles assigned to my account, but it still won't work.

When I modify the data source and point it to another server and database, it works.
The data source 'Live' is on an x64 MSSQL server, and the other data source is on an x86 MSSQL server.
Maybe that is the problem?

Can someone tell me what is wrong?

View 1 Replies View Related

HELP: A Data Source Instance Has Not Been Supplied For The Data Source

May 20, 2007

Hi there,

I'm trying to build a report for Windows Forms (C#) using .rdlc files.

But every time I run it, I get this error message. I've followed the step-by-step instructions from MSDN, but it doesn't work!

Could anyone help me?!
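For reference, the minimal wiring I understand the MSDN walkthrough to intend; the report, dataset, and table names below are from my own project and may differ:

// Sketch: local-mode ReportViewer on a WinForms form. The data source name
// passed to ReportDataSource must match the DataSet name inside the .rdlc.
using System.Data;
using System.Windows.Forms;
using Microsoft.Reporting.WinForms;

public partial class ReportForm : Form
{
    public ReportForm()
    {
        InitializeComponent();

        DataTable orders = LoadOrders();

        reportViewer1.ProcessingMode = ProcessingMode.Local;
        reportViewer1.LocalReport.ReportEmbeddedResource = "MyApp.Report1.rdlc";
        reportViewer1.LocalReport.DataSources.Clear();
        reportViewer1.LocalReport.DataSources.Add(
            new ReportDataSource("DataSet1_Orders", orders));
        reportViewer1.RefreshReport();
    }

    private DataTable LoadOrders()
    {
        // Placeholder so the sketch compiles; real code queries the database.
        return new DataTable("Orders");
    }
}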





Tks a lot!



Luis Antonio - Brazil

View 1 Replies View Related

Loading Data Using Ole Db Source With Input Source Being A View

Dec 13, 2007

I was trying to load data using an SSIS Data Flow task, with an OLE DB source (a view) feeding an OLE DB destination (SQL Server). The view returns 420,591 rows in Query Analyzer in 21 seconds; the row length is 925. When I executed the Data Flow task from SSIS, I had to stop the process after 30 minutes because only 2,000 rows had been retrieved. I modified the view to return TOP 440,000 and reran; this time all 420,591 rows were retrieved and written in 22 seconds. Next, I tried TOP 100 PERCENT; again, only 2,000 rows were returned after 30 minutes. TempDB is on a separate SAN RAID group with 200 GB free, the databases are on a separate drive with 200 GB free, the server has 13 GB of memory, and no other processes were executing.

The only way I could populate the table from SSIS was with an Execute SQL Task and a hard-coded INSERT into the table selecting from the view (35 seconds).

Has anyone else experienced this or a similar issue? Does anyone have a solution or explanation?

View 13 Replies View Related

Schema Changes When DROPS Are Necessary

Oct 9, 2007

We have a database that we are preparing to set up merge replication on. We often make schema changes via T-SQL, and many of these changes are made to tables for which an ALTER TABLE statement will not do (instead we create a temporary table, copy the data, delete the original table, then rename the temp table).


My question is how this will affect merge replication. I have not been able to find anything very clear on this. From what I gather, if a table that participates in merge replication needs to be dropped, I need to go through the manual process (manual, as in calling the necessary system stored procedures) of removing the article from any publications (and the subsequent filtering), make the modifications, then re-add it to the publications and filtering.

Is this correct? If so, a new snapshot needs to be created, correct? If so, I have a follow-up question regarding that snapshot.

If a new snapshot needs to be created, what happens during replication/synchronization? Meaning, since it is a new snapshot, does the client (subscriber) see the whole thing as new, or is it smart enough to recognize that only the one table I have changed needs to be synced?

I am quite new to replication, as you can tell, so please forgive the rambling. I ask these questions because I have heard different answers on both questions...so I would like to get the correct answers.

Greatly appreciated...

View 4 Replies View Related

File Drops

Jan 12, 2006

Is there a way to have SSIS monitor a folder for file drops? I have been unable to determine which object/task to use for this. We need it to watch for files being dumped by other systems, pick those up, and then process them. Thanks for your assistance.
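In the meantime, the closest built-in piece I have found is the WMI Event Watcher Task. As an alternative, a hedged sketch of what a small watcher could do with plain .NET file APIs (the folder path and filter are assumptions), e.g. inside a Script Task or a helper service:

// Sketch: block until a file appears in the drop folder, then hand the
// file name to whatever processing kicks off the package.
using System;
using System.IO;

class DropWatcher
{
    static void Main()
    {
        using (FileSystemWatcher watcher = new FileSystemWatcher(@"C:\drops", "*.txt"))
        {
            WaitForChangedResult result =
                watcher.WaitForChanged(WatcherChangeTypes.Created);

            Console.WriteLine("Picked up: " + result.Name);
            // ...launch the processing for this file here...
        }
    }
}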

- DeKlown

View 6 Replies View Related

Mirroring Session Drops

Jul 16, 2007

We've implemented mirroring between two identical servers. Sporadically, the mirroring session will drop, and the ERRORLOG reflects the errors below at the exact time the mirroring session becomes suspended. We do not manage our back-end network, since we use a dedicated hosting environment at a remote location. Is this issue solely caused by network connectivity problems, or are there other factors at work?

2007-07-16 04:24:37.24 spid23s Error: 1453, Severity: 16, State: 1.
2007-07-16 04:24:37.24 spid23s 'TCP://192.168.215.92:5022', the remote mirroring partner for database 'evestment', encountered error 1204, status 4, severity 19. Database mirroring has been suspended. Resolve the error on the remote server and resume mirroring, or remove mirroring and re-establish the mirror server instance.
2007-07-16 04:24:48.46 spid23s Error: 1479, Severity: 16, State: 1.
2007-07-16 04:24:48.46 spid23s The mirroring connection to "TCP://192.168.215.92:5022" has timed out for database "evestment" after 10 seconds without a response. Check the service and network connections.

View 1 Replies View Related

SSIS Drops Rows

Mar 7, 2007

I've written an SSIS package with two Data Flow tasks running simultaneously, importing the data from two separate ASCII files into two staging/prep tables: one task imports order-level records into an ORDER table while the other imports the item-level records into an ITEM table.

Starting several days ago, the ITEM DF Task began importing a partial set of records even though the ASCII file clearly showed more records were available.

For example, in today's processing, here is the message from SSIS: << [ITEM Ascii File [1695]] Information: The total number of data rows processed for file "\webserv01globalscapeusr vr logstandarditem.txt" is 217361. >> Yet I only found 10,401 rows, and they all appeared to be the bottom 10,401 rows of the incoming ASCII file.

I cannot see any other untoward messages about SSIS encountering a problem.

There are no Error Output restrictions on the ASCII file.

The ASCII row preceding the 10,401 rows I DID get looks OK.

Can anyone suggest a possible place to look that might explain this "jumping" over nearly 207,000 rows?

Thanks very much.

Seth J Hersh

View 7 Replies View Related

Insert That Drops Duplicate Records

Mar 23, 2007

I ought to know how to do this, but it escapes me at the moment. I need to write an INSERT statement for a table that will be based on a complex SELECT query. The SELECT may return rows that are already in the target table. In that case I don't want duplicates created, but I don't want the query to error either. I can't remember how to set that up.
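The pattern I am trying to remember, sketched here in ADO.NET with placeholder table and key names; the NOT EXISTS guard skips rows already present in the target without raising an error:

// Sketch: insert only the rows from the complex SELECT that are not
// already in the target table. All object names are placeholders.
using System.Data.SqlClient;

class InsertNewRowsOnly
{
    static void Main()
    {
        const string sql = @"
            INSERT INTO dbo.Target (Id, Name, Amount)
            SELECT s.Id, s.Name, s.Amount
            FROM dbo.ComplexSource s           -- stands in for the complex query
            WHERE NOT EXISTS (SELECT 1
                              FROM dbo.Target t
                              WHERE t.Id = s.Id);";

        using (SqlConnection cn = new SqlConnection(
            "Server=myServer;Database=myDb;Integrated Security=SSPI"))
        using (SqlCommand cmd = new SqlCommand(sql, cn))
        {
            cn.Open();
            int inserted = cmd.ExecuteNonQuery();   // rows actually added
        }
    }
}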

View 3 Replies View Related

Procedure Cache Usage Drops... Why?

Apr 28, 2008

My server (SQL 2005 SP2) typically runs with procedure cache usage of about 92% or higher. Lately, at some point during the day, it just drops to anywhere between 50% and 65%, and with this comes horrible server performance and many snowball effects. If I clear the procedure cache, usage goes up only about 10% for a minute or two. The only way I can get it to recover completely seems to be restarting the SQL service; then it will be fine until the next incident. The database is read-only (not set to read-only, but no updates other than replication), and the same stored procedures are run over and over throughout the day. I also noticed that compilations of the stored procedures go up drastically at this point; I am not sure if this is part of the cause or part of the effect.

CPU is normal, but the response from anything (even sp_who) is slow.

I do not completely understand the way the procedure cache works, so I thought I would ask for some direction.

Any ideas where to look or where to start?
Anything I can do to catch this when it happens would be great.

Thanks ahead of time.

View 34 Replies View Related

SQL Identity Attribute Drops When Importing A Table

Dec 31, 2003

Hi all,

I am importing SQL 2000 tables from a developer install of SQL 2000. There are no destination tables to append to, just new tables being brought in.

I use the All Tasks > Import (DTS) service to do this. I have noticed that the identity columns do not keep their IDENTITY attributes, and I have to reset them after I import new tables. Is this normal, or is there a bug here?

I checked the MS KB with no success. Does anyone have some info on this?

Thanks.

View 8 Replies View Related

How To Prevent Database Drops In SQL Server 2000

Aug 28, 2007



Does anyone have a good strategy or technique for preventing database drops in SQL Server 2000? I know that in 2005 DDL triggers rock, but in 2000 what can you do to audit who drops a database while keeping the same permissions intact?

Jason

View 3 Replies View Related

Initial Connection To SQL 2005 Drops After Several Commands

Mar 12, 2007

We are having some unusual problems with an upgrade to SQL Server 2005. I've come across two other people who have had this type of issue, but I haven't found a resolution. We are running SQL Server 2000 in production, and we have installed SQL Server 2005 on new hardware. We have been working for about a week to verify that all of our existing applications will work properly with SQL 2005.

We have several old VB applications that open a connection to the database and reuse it for days on end. We can test these apps against the production 2000 server and they run fine; when they connect to the 2005 server, the connections work fine for 5-15 minutes, then we get an error trying to execute a command and are told that the connection doesn't exist. (One interesting note: there is only one line of code in the whole program that connects to the database, but we have seen 3+ connections from the app when we check in Management Studio for SQL 2005.)

Just to get though the tests we put in some error handling to reconnect if the connection is lost. When we do this the application ends up re-connecting only once, the new connection never has a failure even after hours of testing and we see a 300% performance improvement.

We can reliably cause this problem by stopping the SQL Server, starting it, waiting a few minutes, then running our app. After one run of the app the problem seems to go away for a while, and we can't re-create it nearly as consistently. Only occasionally will the initial connection fail, and every time it does, it is after 5-10 minutes of activity.

Both machines are connected to the same switch, and we have ruled out networking issues. We are only seeing this problem with VB6 apps; the VS2005 apps run great. We are looking for the cause of this connection behavior and a fix for it, because we don't have the time to fix code in dozens of VB apps just because the connections are not stable. Granted, the practice of holding a connection open isn't good to begin with, but with legacy code you have to work with what you're dealt.

Any thoughts as to why ADO connections from VB6 to SQL 2005 would not be stable?

View 4 Replies View Related

TempDb Drops User Account On Server Restart?

Sep 11, 2007

Please forgive my ignorance; I am by no means a SQL expert, but I have encountered a strange issue.

I have 6 SQL Servers, primarily SQL 2005 (one older SQL 2000), all running on Windows Server 2003 SP1.

We use the servers for a proprietary database that we created, which is the back end of a software package we sell.

The issue I have is: we have added a security account to the servers, and in one case we have granted rights for this account to the tempdb system database. However, whenever we restart this server, SQL drops this user account, severing connectivity for the app that relies on that account.

I have set the account as db_owner, etc., but nothing seems to keep it after a restart.

Any input would be greatly appreciated.

View 4 Replies View Related

Analysis :: Deploying Tabular Model Drops Records

Oct 22, 2015

I'm using SQL Server 2012 Bi edition.

I created a model in Visual Studio 2012 and when I process the model, most tables (the ones that should) process 77,546 records and that is reflected in VS.

However, when I deploy the model to the server, it only deploys 76,500 records for those tables.

Where did those 1,046 records go? Is there a setting that limits the records to be deployed? My base tables in the data mart have 77,546 records, just as in my Visual Studio model.

View 3 Replies View Related

SQL Server Agent SSIS Job: Set Values Drops Leading Quote

Jul 19, 2006

We are using BCP in a Process task. The value for the path to BCP's error log requires double quotes around it. We initially put this value in a configuration file, and that worked fine. Yesterday, we eliminated the config file and tried to use the "Set Values" tab of the SSIS SQL Server Agent job step to pass in this value.

The package variable was assigned the value of the path, all except for the leading double quote. The closing quotation mark was included in the value.

We tried adding a second double quote at the beginning of the value, but that caused the job to fail immediately.

For now we will work around this problem by putting the quotes around the value in a Script task inside the package.

Has anyone else encountered this problem?

Thanks,

Ron

View 2 Replies View Related

SQL Server 2008 :: Replicating Merge Range And File Drops For Partitioned Table?

Jul 28, 2015

I have a few tables which are replicated and partitioned. They also have an archival process, and I want to avoid having to run that same process on the subscriber.

Replication of partition switching is easy. However, I am not sure how to replicate MERGE RANGE operations and empty filegroup/file drops.

There are the following article options:

Copy file group associations
Copy table partitioning schemes
Copy index partitioning schemes

I am not sure if these are enough to implement replication of MERGE RANGE operations and empty filegroup/file drops.

I could not find an option to copy partition functions.

View 0 Replies View Related

Pipeline Error-excel Source-data Reader Does Not Read In Meta Data

Apr 16, 2008

Hi all, I got this error:


[DTS.Pipeline] Error: "component "Excel Source" (1)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".

and also this:

[Excel Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "Fiscal Week" needs to be updated in the external metadata column collection. The column "Fiscal Year" needs to be updated in the external metadata column collection. The column "1st level" needs to be added to the external metadata column collection. The column "2nd level" needs to be added to the external metadata column collection. The column "3rd level" needs to be added to the external metadata column collection. The "external metadata column "1st Level" (16745)" needs to be removed from the external metadata column collection. The "external metadata column "3rd Level" (16609)" needs to be removed from the external metadata column collection. The "external metadata column "2nd Level" (16272)" needs to be removed from the external metadata column collection.


I tried going to Data Flow -> Excel Connection -> Advanced Editor for Excel Source -> Input and Output Properties and tried to refresh the affected columns.
It seems that somehow the 3 columns are not read in from the source file?
And also, are Fiscal Year and Fiscal Week not set up properly in my data destination?
Has anyone faced such errors before?

Thanks

View 13 Replies View Related

XML Data Source .. Expression? Variable? Connection? Error: Unable To Read The XML Data.

Feb 23, 2008

RE: XML Data source .. Expression? Variable? Connection? Error: unable to read the XML data.

I want my XML data source to be an expression, as I will be looping through a directory of XML files.

I don't see the expression property or the connection property.

I tried setting the XMLData property to @[User::filename], but that results in:

Information: 0x40043006 at Load XML Files, DTS.Pipeline: Prepare for Execute phase is beginning.
Error: 0xC02090D0 at Load XML Files, XML Source [108]: The component "XML Source" (108) was unable to read the XML data.
Error: 0xC0047019 at Load XML Files, DTS.Pipeline: component "XML Source" (108) failed the prepare phase and returned error code 0xC02090D0.
Information: 0x4004300B at Load XML Files, DTS.Pipeline: "component "OLE DB Destination" (341)" wrote 0 rows.
Task failed: Load XML Files
Information: 0xC002F30E at Bad, File System Task: File or directory "d:jcpxmlLoadjcp2.xml.bad" was deleted.
Warning: 0x80019002 at Package: The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "Package.dtsx" finished: Failure.
The program '[3312] Package.dtsx: DTS' has exited with code 0 (0x0).


Thanks for any help or information.

View 3 Replies View Related

Power Pivot :: Structural Data Model Changes In Data Source Leads To Errors

Oct 12, 2015

I have a question about how to handle structural data model changes in a data source for PowerPivot. Suppose I'm developing a star model in SQL Server, and sometimes a data type changes or the name of a field changes in a table. It seems to me that PowerPivot does not handle this as gracefully as Analysis Services MD does (mostly): I received an error because of a wrong field name, and even no error when a data type changed in PowerPivot. Is this common, or am I doing something wrong here? Does this mean that every time the data model changes, the PowerPivot model should be recreated? Or am I missing the clue here?

View 6 Replies View Related

SQL 2012 :: SSRS Data Shared Data Source Connection Login

Jan 12, 2015

I was wondering if there is a best-practice minimum set of permissions for a SQL login used when setting up a new shared data source for SSRS Report Manager.

Something along the lines of making it a data reader for the DB, plus permissions to update tempdb?

I would have thought it inadvisable for the login to be able to update the main DB...

View 1 Replies View Related

Reporting Services :: Data Not Available For Selected Values To Be Set Based On Data Source

Jul 10, 2015

I have 4 tablixes; 2 of them get data from Server 1 and the other 2 get their data from Server 2. I have set the NoRowsMessage "=Data Not Available for the Selected Values" for all 4 tablixes. Now, if data is not available from Server 1, "Data Not Available for the Selected Values" must be shown only once in the output, but it currently appears twice because of the 2 tablixes that had no rows. Similarly, if data is not available from Server 2, the message should show only once in my output. If data is not available for any of the tablixes, it should likewise appear only once as "Data Not Available for the Selected Values" in the report output.

View 3 Replies View Related

Truncation Of String Data With Data Reader Source Connecting To ODBC DSN

Mar 18, 2007

A DataReader source is using a connection manager to connect to an ODBC system DSN. A query is provided in the SqlCommand property. Data is being truncated in the only string column; the data type in the DataReader's Output -> External Columns shows as Unicode string [DT_WSTR], length 7.

The truncated output in a text file is the first 3 characters, reading left to right. Changing the column order has no effect.



A linked server was created in SQL Server Management Studio to test the ODBC System DSN using the following:

EXEC sp_addlinkedserver
@server = 'server_name',
@srvproduct = '',
@provider = 'MSDASQL',
@datasrc = 'odbc_dsn_name'

Data returned using OPENQUERY does not truncate the string column, indicating that the ODBC driver returns data as expected with SQL 2005, but not with the DataReader source.
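As one more check outside SSIS, a small ADO.NET sketch against the same DSN (the DSN, table, and column names are placeholders) to see what the driver itself hands back:

// Sketch: read the suspect column straight through System.Data.Odbc and
// print its length, separating driver behavior from DataReader-source behavior.
using System;
using System.Data.Odbc;

class DsnCheck
{
    static void Main()
    {
        using (OdbcConnection cn = new OdbcConnection("DSN=odbc_dsn_name"))
        using (OdbcCommand cmd = new OdbcCommand(
            "SELECT string_col FROM some_table", cn))
        {
            cn.Open();
            using (OdbcDataReader r = cmd.ExecuteReader())
            {
                while (r.Read())
                {
                    string s = r.GetString(0);
                    Console.WriteLine("{0} (length {1})", s, s.Length);
                }
            }
        }
    }
}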

Any assistance would be appreciated.

Thanks,

View 3 Replies View Related

SQL 2012 :: SSIS Data Flow Items Tab Missing For Adding Data Source / Destination

Apr 3, 2014

Inside an SSIS 2012 project I need to use a newly installed SSIS component, but in SSDT 2010 I cannot see the SSIS Data Flow Items tab for adding a data source/destination in the Choose Toolbox Items pane.

View 4 Replies View Related

Problem Loading Data From FlatFile Source Data For Column Overflowed The Disk I/O Buffer

Sep 10, 2007



Hi, I am trying to do a straightforward load from a flat file source. I have defined the columns according to the lengths defined in the data dictionary provided, but when I try to run the task I encounter this error:

The column data for column "Column 20" overflowed the disk I/O buffer.

I tried to add another column, 21, at the end and truncate it or leave that column unmapped to the destination, but the same problem occurs for column 21. What should I do to overcome this?

In case of bad data, how do I clean up the source? Please help me with this.
View 5 Replies View Related

Missing Data With SSAS Cube As A Report Data Source

May 9, 2006

I've got a report that is using a cube as a data source, and I can't get the report to show all the data. Only data at the lowest level of the cube is displayed. The problem is that most of the data I'm concerned with is at higher levels. There's no problem with the MDX; I get the correct results when I run the query.

I'm using a table to show the results. I've also tried a matrix, but I get the same results. I'm using SSRS 2005 and SSAS 2000.

Does anyone have experience with this? Am I missing something simple?

View 7 Replies View Related

Selecting A Single Data Source From Multiple Data Sources

Aug 4, 2006

Hi,

I am pretty new to SSIS. I am trying to create a package that can accept data in any of several formats (CSV, Excel, or a SQL Server database/table) and import the data into my destination database.

So far I've managed to get this working OK. However, I am now totally stuck. I'm currently trying to concentrate on just the CSV data source (using a Flat File source) and/or an Excel spreadsheet.

I can get the data in and to my destination using a UNION ALL component, mapping both data sources to it, as long as both the CSV file and the Excel spreadsheet exist.

My problem is that I need my package to handle the possibility that only the CSV file exists and there is no Excel spreadsheet. In that case I'd like the package to ignore the Excel data source completely. Currently, if either of my data sources does not exist, I get errors and the package terminates.

Is there any way in SSIS that I can check all my data sources to see which ones exist (i.e., are valid)? If they exist, I want to use them; if one doesn't exist, I'd like to discard it without error (as long as there is a single valid data source, the package should run).

I've tried using the AcquireConnection method in a Script Task on each of my connections, hoping that it would error if the file/data source did not exist. It doesn't, though (in the case of an Excel data source, it just creates an empty Excel file for me).

The only other option I can come up with is to have separate packages depending on the type of data we want to import, and then run a particular package depending on the format of the source data. This seems a bit long-winded. I am pretty sure I must be able to do what I want to achieve, but I can't work out how.
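One approach I am experimenting with, sketched below for a Script Task (SSIS 2008-style C#; 2005 script tasks use VB.NET, and the variable names are mine): test each candidate file up front and record which ones exist in package variables that precedence constraints can then evaluate:

// Sketch for a Script Task's Main(): the Dts object comes from the Script
// Task host, and the four User:: variables are assumptions that would need
// to be declared on the task as read-only/read-write.
using System.IO;
using Microsoft.SqlServer.Dts.Runtime;

public void Main()
{
    Dts.Variables["User::CsvExists"].Value =
        File.Exists(Dts.Variables["User::CsvPath"].Value.ToString());

    Dts.Variables["User::ExcelExists"].Value =
        File.Exists(Dts.Variables["User::ExcelPath"].Value.ToString());

    Dts.TaskResult = (int)DTSExecResult.Success;
}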

I'll be grateful to anyone who can send me any tips/hints/links on how I can achieve this.

Many thanks

Rob Gibson

View 5 Replies View Related

T-SQL (SS2K8) :: Load Data From Flat File Source Into OleDB Destination By Changing Data Types In SSIS

Apr 16, 2014

I have a source file, and I have to load it into the database while changing the data types of the columns in SSIS.

View 1 Replies View Related






