Creating And Accessing Temp Tables In SSIS Package
Sep 26, 2007
Hi,
I want to create a local temporary table in an Execute SQL Task and then use it in a Data Flow Task as the source table.
I followed these steps to achieve this:
01. Created a new SSIS package
02. Created a connection to the "(local)/." server, "tempdb" database
03. Set the "RetainSameConnection" property value to "TRUE"
04. Set the "DelayValidation" to "TRUE", where ever I found this property
04. In Control Flow I added to items
a. Execute SQL Task
b. Data Flow Task
05. For "Execute SQL task" I set the connection to "tempdb"
06. I written the following query
Create table #transfer_CompaniesToProcess_tbl
(
companyID int not null
)
GO
08. In the Data Flow Task I added an "OLE DB Source" and an "OLE DB Destination"
09. In the "OLE DB Source" I changed the "Data access mode:" to "SQL command"
10. In "SQL command text:" I entered "select * from #transfer_CompaniesToProcess_tbl"
11. When I clicked the "OK" button, I got the following error:
TITLE: Microsoft Visual Studio
------------------------------
Error at Data Flow Task [OLE DB Source [1]]: An OLE DB error has occurred. Error code: 0x80040E14.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "Statement(s) could not be prepared.".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "Invalid object name '#transfer_CompaniesToProcess_tbl'.".
------------------------------
ADDITIONAL INFORMATION:
Exception from HRESULT: 0xC0202009 (Microsoft.SqlServer.DTSPipelineWrap)
------------------------------
BUTTONS:
OK
------------------------------
I went through the following article and it seems I missed something.
http://blogs.conchango.com/jamiethomson/archive/2006/11/19/SSIS_3A00_-Using-temporary-tables.aspx
Does anyone have any idea what I am doing wrong?
Thanks
Sreekanth
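One workaround often suggested for this design-time validation error (not confirmed in this thread) is to switch to a global temporary table, which the OLE DB Source can see while the package is being designed. A minimal sketch of the Execute SQL Task statement, assuming the same RetainSameConnection = TRUE setup on the tempdb connection:

-- Sketch only: a global temp table (##) is visible to other sessions, which lets the
-- OLE DB Source validate "select * from ##transfer_CompaniesToProcess_tbl" at design time.
IF OBJECT_ID('tempdb..##transfer_CompaniesToProcess_tbl') IS NOT NULL
    DROP TABLE ##transfer_CompaniesToProcess_tbl
CREATE TABLE ##transfer_CompaniesToProcess_tbl
(
    companyID int NOT NULL
)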
View 9 Replies
Nov 17, 2004
Frenz,
I need to create tables that are session specific, i.e., when I log in to the database I have to create a table that can be accessed across all the procedures and triggers, and when I log out the tables should be dropped. I understand that local temporary (#) tables can be created in an anonymous block, but I need the tables to be created during login.
Your response is highly appreciated.
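Not part of the original post, but one possible sketch: have the application call a small "session start" batch right after it logs in, creating a global temp table whose name includes the session id; every procedure can then rebuild the name from @@SPID, and the table is dropped automatically when that session disconnects. All names below are hypothetical:

-- Sketch only; table name and column are illustrative.
DECLARE @tbl sysname, @sql nvarchar(500)
SET @tbl = N'##Session_' + CAST(@@SPID AS nvarchar(10))
SET @sql = N'CREATE TABLE ' + @tbl + N' (ItemID int NOT NULL)'
EXEC (@sql)
-- The global temp table disappears when the creating session logs out.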
-Cheeku
View 2 Replies
May 17, 2004
Hi
I have developed an application in ASP/SQL server 7.
The system is single user. One of the tables is updated by user actions (say table A).
To make it multi-user, I want to create tables such as A_Username.
How can the query be written for this? Also, there are many stored procedures which will access this table. In these stored procedures I will send the username as an input, and the query in the stored procedure should then access the table as A_Username.
How can such dynamic table name referencing be done? Is creating a string for the query and then executing it (e.g. with EXEC or sp_executesql) the only option?
pls suggest
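Dynamic table-name referencing does generally come down to building the statement as a string. A minimal sketch (procedure, table and column names are made up for illustration), using QUOTENAME to keep the generated name safe:

CREATE PROCEDURE GetUserRows
    @Username sysname
AS
BEGIN
    DECLARE @sql nvarchar(1000)
    -- Builds e.g. SELECT * FROM [A_JSmith]
    SET @sql = N'SELECT * FROM ' + QUOTENAME(N'A_' + @Username)
    EXEC sp_executesql @sql
END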
View 6 Replies
Jul 18, 2002
Sql 2000
Hi All,
I am creating a DTS package which copies data via a Transform Data task from a SQL Server connection to an Excel spreadsheet connection.
The Transform Data task uses a query that goes like this:
create table #temp1(CName varchar(10))
Insert #temp1
select CName from tbl1
--tbl1 is in a user DB called SALES
create table #temp2(VName varchar(10))
Insert #temp2
select CName from #temp1
select * from #temp2
--I need the resultset of the last select statement into an excel spreadsheet.
The problem is I can't even create the DTS package; it shows up with an error saying that #temp1 is an invalid object. I am guessing that since the default database for the Transform Data task has been selected as SALES, it is trying to locate the #temp1 or #temp2 table in that DB instead of looking in TEMPDB. My question is: is there any workaround for this problem?
Sample code welcome!!!!
Any help appreciated !
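One workaround sometimes used here (an assumption, not something confirmed in the thread) is to rewrite the query with table variables, which SQL 2000 supports; the designer then has no #temp objects to resolve. A sketch of the same query:

-- Sketch only: same logic as above, using table variables instead of temp tables.
SET NOCOUNT ON
DECLARE @temp1 TABLE (CName varchar(10))
INSERT @temp1 SELECT CName FROM tbl1

DECLARE @temp2 TABLE (VName varchar(10))
INSERT @temp2 SELECT CName FROM @temp1

SELECT * FROM @temp2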
TIA
Kinnu
View 1 Replies
May 2, 2008
Hi All,
I am in the process of moving from a 32-bit SQL Server 2005 Enterprise (9.0.3054) to a 64-bit SQL Server 2005 Enterprise (9.0.3054 with 4 CPUs and 8GB of memory on Win 2003 SP2) and the process has been very frustrating, to say the least. I am having a problem with packages that I created on my 64-bit SQL Server. I am importing a few tables from the 32-bit SQL Server into the 64-bit SQL Server using Task --> Import to create the package.
Sometimes when I am creating a package I get the following error in a message box:
SQL Server Import and Export Wizard
The SSIS Runtime object could not be created. Verify that DTS.dll is available and registered. The wizard cannot continue and it will terminate.
Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt. (System.Windows.Forms)
Other times when I run a package that has run successfully before I get the following error:
Faulting application dtexecui.exe, version 9.0.3042.0, stamp 45cd726d, faulting module unknown, version 0.0.0.0, stamp 00000000, debug? 0, fault address 0x025d23f0.
Other times I get this error message:
.NET Runtime version 2.0.50727.1433 - Fatal Execution Engine Error (79FFEE24) (80131506)
And still other times
The package appears to hang when running. By this I mean that the Package Execution Progress shows progress up to a point then it just stops. (The package takes about 17 seconds to run normally) CPU usage is at 1% and the package cannot be stopped.
I have deleted and re-created the package several times and I have also re-installed the service pack on the SQL Server (9.0.3054) but that did not help.
Does anyone have any other suggestions to try?
Thanks.
View 4 Replies
Nov 27, 2006
Hi,
I have a database with several tables, for example 'customer', which I want to update with an SSIS package. However, to ensure we don't have issues if the update fails, I've put in an intermediate stage:
Using an Execute SQL Task I create temporary tables, for example 'customer_tmp'. Data is then imported into these tables. When all the data is imported successfully, the original tables are dropped and the temporary tables are renamed, removing the '_tmp'.
This works fine and I'm happy with it. However, if someone adds a column to one of the tables in SQL server it is lost on the next upload.
Similarly I have to hard code creating the indexes into the package as well.
Does anyone know how I could copy the original table definitions and create the temporary tables dynamically, so that any new columns would be picked up?
And indeed is it possible to copy the indexes from one table to another before the drop and rename trick?
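One possible approach (a sketch, not a tested solution): SELECT ... INTO with TOP 0 copies the current column definitions of the live table, so columns added later are picked up automatically; indexes and constraints are not copied and would still need to be scripted separately from the source table.

-- Sketch only; table names are examples.
IF OBJECT_ID('dbo.customer_tmp') IS NOT NULL
    DROP TABLE dbo.customer_tmp

-- Copies column names and data types (but not indexes or constraints).
SELECT TOP 0 *
INTO dbo.customer_tmp
FROM dbo.customer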
Thanks in advance.
Iain
View 4 Replies
Mar 5, 2007
HI All,
Can anybody give the steps to import data from one SQL database to another through an SSIS package? I was comfortable with DTS, but SSIS is a little bit confusing.
thnx
regards
View 1 Replies
Feb 18, 2008
Can anyone give a newbie a general script for creating an SSIS package to import data nightly from an Access database to SQL 2005? They are not on the same server. The Access database is 366,968 KB and grows every day from input from web content. It has 5 tables and hundreds of queries, which is how it is updated.
Please give me a starting point; I do not understand the info in Books Online.
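Not a full package, but one possible starting point, assuming the .mdb can be reached from the SQL Server machine over a share and ad hoc queries are allowed; the path and table names below are invented for illustration:

-- Sketch only: pull one Access table into a staging table with the Jet provider.
SELECT *
INTO dbo.StagingTable
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                '\\fileserver\share\WebContent.mdb';
                'Admin';'',
                'SELECT * FROM SomeAccessTable') AS src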
Thank you
Dee
View 6 Replies
Mar 12, 2006
Following is what I would like to do so I can keep updating my central SQL Server database with the latest updates from the field. I would like to use SSIS 2005 to create a package that could do this; a rough sketch follows the steps below. Any help to get me started would be appreciated. Thanks.
Open connection and read client location table on the local SQL Server database called PODO
For each location id in the table do the following:
Store locationid/clientid in a variable called CLLOC_ID
Construct file name with mdb extension and store in a variable MDB_FILE
Establish connection to the data import folder
Search for that MDB_FILE in the folder on the file system
If there is a file where match = true then do this:
1) Open the access database
2) Read and import the data from the customer experience table
3) Write that data to the SQL Server tables where location = CLLOC_ID
4) Exit process
IF there is no match, exit process
Keep looping until all the client ids/loc ids are read from the SQL Server client location table.
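A rough sketch of the loop (an assumption of how it might look in T-SQL; in practice a Foreach Loop container in SSIS would likely be used instead). Table, column and path names are hypothetical, and xp_fileexist is an undocumented procedure:

-- Sketch only.
DECLARE @loc int, @file varchar(260), @exists int

DECLARE loc_cur CURSOR FOR
    SELECT LocationID FROM dbo.ClientLocation      -- client location table in PODO
OPEN loc_cur
FETCH NEXT FROM loc_cur INTO @loc

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @file = '\\importserver\data\' + CAST(@loc AS varchar(10)) + '.mdb'
    EXEC master.dbo.xp_fileexist @file, @exists OUTPUT

    IF @exists = 1
    BEGIN
        PRINT 'Import ' + @file    -- here: read the customer experience table
                                   -- (e.g. via OPENROWSET/Jet) and write it to the
                                   -- SQL Server tables for this location id
    END

    FETCH NEXT FROM loc_cur INTO @loc
END
CLOSE loc_cur
DEALLOCATE loc_cur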
MA
View 4 Replies
Jan 4, 2007
Hi guys,
I intended to write a program that will create an SSIS package which will import data from a CSV file into SQL Server 2005, but I did not find any good example of this on the internet. I found some examples which export data from SQL Server 2005 to CSV files, and following those examples I have tried to write my own, but I am facing a problem with it. What I am doing is creating two connection manager objects, one for the flat file and another for OLE DB, and creating a data flow task that has two data flow components, one for reading the source and another for writing to the destination. While debugging I can see that after invoking ReinitializeMetaData() on the flat file source data flow component there are no output columns. Why is it not fetching the output columns from the CSV file? And after that, when it invokes ReinitializeMetaData() on the destination data flow component, it simply throws an exception.
Can anybody help me get around this problem? Can anyone give me a link where I can find a useful article to accomplish this goal?
I am giving my code here too.
I will appreciate any kind of suggestion on this.
Code snippet:
public void CreatePackage()
{
string executeSqlTask = typeof(ExecuteSQLTask).AssemblyQualifiedName;
Package pkg = new Package();
pkg.PackageType = DTSPackageType.DTSDesigner90;
ConnectionManager oledbConnectionManager = CreateOLEDBConnection(pkg);
ConnectionManager flatfileConnectionManager =
CreateFileConnection(pkg);
// creating the SQL Task for table creation
Executable sqlTaskExecutable = pkg.Executables.Add(executeSqlTask);
ExecuteSQLTask execSqlTask = (sqlTaskExecutable as Microsoft.SqlServer.Dts.Runtime.TaskHost).InnerObject as ExecuteSQLTask;
execSqlTask.Connection = oledbConnectionManager.Name;
execSqlTask.SqlStatementSource =
"CREATE TABLE [MYDATABASE].[dbo].[MYTABLE] ([NAME] NVARCHAR(50),[AGE] NVARCHAR(50),[GENDER] NVARCHAR(50)) GO";
// creating the Data flow task
Executable dataFlowExecutable = pkg.Executables.Add("DTS.Pipeline.1");
TaskHost pipeLineTaskHost = (TaskHost)dataFlowExecutable;
MainPipe dataFlowTask = (MainPipe)pipeLineTaskHost.InnerObject;
// Put a precedence constraint between the tasks.
PrecedenceConstraint pcTasks = pkg.PrecedenceConstraints.Add(sqlTaskExecutable, dataFlowExecutable);
pcTasks.Value = DTSExecResult.Success;
pcTasks.EvalOp = DTSPrecedenceEvalOp.Constraint;
// Now adding the data flow components
IDTSComponentMetaData90 sourceDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();
sourceDataFlowComponent.Name = "Source Data from Flat file";
// Here is the component class id for flat file source data
sourceDataFlowComponent.ComponentClassID = "{90C7770B-DE7C-435E-880E-E718C92C0573}";
CManagedComponentWrapper managedInstance = sourceDataFlowComponent.Instantiate();
managedInstance.ProvideComponentProperties();
sourceDataFlowComponent.
RuntimeConnectionCollection[0].ConnectionManagerID = flatfileConnectionManager.ID;
sourceDataFlowComponent.
RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(flatfileConnectionManager);
managedInstance.AcquireConnections(null);
managedInstance.ReinitializeMetaData();
managedInstance.ReleaseConnections();
// Get the destination's default input and virtual input.
IDTSOutput90 output = sourceDataFlowComponent.OutputCollection[0];
// Here I dont find any columns at all..why??
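// Possible cause (an assumption, not from the original post): the flat file connection
// manager created above has no column definitions, and unlike the OLE DB source the
// flat file source cannot discover them on its own. The connection manager's
// InnerObject (IDTSConnectionManagerFlatFile90) would need its Columns collection
// populated before ReinitializeMetaData() can build output columns.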
// Now adding the data flow components
IDTSComponentMetaData90 destinationDataFlowComponent = dataFlowTask.ComponentMetaDataCollection.New();
destinationDataFlowComponent.Name =
"Destination Oledb compoenent";
// Here is the component class id for Oledvb data
destinationDataFlowComponent.ComponentClassID = "{E2568105-9550-4F71-A638-B7FE42E66922}";
CManagedComponentWrapper managedOleInstance = destinationDataFlowComponent.Instantiate();
managedOleInstance.ProvideComponentProperties();
destinationDataFlowComponent.
RuntimeConnectionCollection[0].ConnectionManagerID = oledbConnectionManager.ID;
destinationDataFlowComponent.
RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.ToConnectionManager90(oledbConnectionManager);
// Set the custom properties.
managedOleInstance.SetComponentProperty("AccessMode", 2);
managedOleInstance.SetComponentProperty("OpenRowset", "[MYDATABASE].[dbo].[MYTABLE]");
managedOleInstance.AcquireConnections(null);
managedOleInstance.ReinitializeMetaData(); // Throws exception
managedOleInstance.ReleaseConnections();
// Create the path.
IDTSPath90 path = dataFlowTask.PathCollection.New(); path.AttachPathAndPropagateNotifications(sourceDataFlowComponent.OutputCollection[0],
destinationDataFlowComponent.InputCollection[0]);
// Get the destination's default input and virtual input.
IDTSInput90 input = destinationDataFlowComponent.InputCollection[0];
IDTSVirtualInput90 vInput = input.GetVirtualInput();
// Iterate through the virtual input column collection.
foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)
{
managedOleInstance.SetUsageType(
input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
}
DTSExecResult res = pkg.Execute();
}
public ConnectionManager CreateOLEDBConnection(Package p)
{
ConnectionManager ConMgr;
ConMgr = p.Connections.Add("OLEDB");
ConMgr.ConnectionString =
"Data Source=VSTS;Initial Catalog=MYDATABASE;Provider=SQLNCLI;Integrated Security=SSPI;Auto Translate=false;";
ConMgr.Name = "SSIS Connection Manager for Oledb";
ConMgr.Description = "OLE DB connection to the Test database.";
return ConMgr;
}
public ConnectionManager CreateFileConnection(Package p)
{
ConnectionManager connMgr;
connMgr = p.Connections.Add("FLATFILE");
connMgr.ConnectionString = @"D:\MyCSVFile.csv";
connMgr.Name = "SSIS Connection Manager for Files";
connMgr.Description = "Flat File connection";
connMgr.Properties["Format"].SetValue(connMgr, "Delimited");
connMgr.Properties["HeaderRowDelimiter"].SetValue(connMgr, Environment.NewLine);
return connMgr;
}
And my CSV file is as follows:
NAME, AGE, GENDER
Jon,52,MALE
Linda, 26, FEMALE
Thats all. Thanks.
View 4 Replies
Aug 6, 2007
Hi,
I'm trying to create an SSIS package. I have used an OLEDBSource adapter to get the source table's data and am transferring the data to an OLEDBDestination adapter. I tried, but I'm facing a problem in mapping the metadata contents.
My code:
//////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
static void Main(string[] args)
{
// Create a new package
Package package = new Package();
package.Name = "OLE DB Transfer";
// Add a Data Flow task
TaskHost taskHost = package.Executables.Add("DTS.Pipeline") as TaskHost;
taskHost.Name = "Transfer Table";
IDTSPipeline90 pipeline = taskHost.InnerObject as MainPipe;
// Get the pipeline's component metadata collection
IDTSComponentMetaDataCollection90 componentMetadataCollection = pipeline.ComponentMetaDataCollection;
// Add a new component metadata object to the data flow
IDTSComponentMetaData90 oledbSourceMetadata = componentMetadataCollection.New();
// Associate the component metadata object with the OLE DB Source Adapter
oledbSourceMetadata.ComponentClassID = "DTSAdapter.OLEDBSource";
// Instantiate the OLE DB Source adapter
IDTSDesigntimeComponent90 oledbSourceComponent = oledbSourceMetadata.Instantiate();
// Ask the component to set up its component metadata object
oledbSourceComponent.ProvideComponentProperties();
// Add an OLE DB connection manager
ConnectionManager connectionManagerSource = package.Connections.Add("OLEDB");
connectionManagerSource.Name = "OLEDBSource";
// Set the connection string
connectionManagerSource.ConnectionString = "provider=sqlncli;server=HSCHBSCGN25008;integrated security=sspi;database=Muthu_SSIS_Testing";
// Set the connection manager as the OLE DB Source adapter's runtime connection
IDTSRuntimeConnection90 runtimeConnectionSource = oledbSourceMetadata.RuntimeConnectionCollection["OleDbConnection"];
runtimeConnectionSource.ConnectionManagerID = connectionManagerSource.ID;
// Tell the OLE DB Source adapter to use the SQL Command access mode.
oledbSourceComponent.SetComponentProperty("AccessMode", 2);
// Set up the SQL command
oledbSourceComponent.SetComponentProperty("SqlCommand", "select * from EmployeeTable");
// Set up the connection manager object
runtimeConnectionSource.ConnectionManager = DtsConvert.ToConnectionManager90(connectionManagerSource);
// Establish the database connection
oledbSourceComponent.AcquireConnections(null);
// Set up the column metadata
oledbSourceComponent.ReinitializeMetaData();
// Release the database connection
oledbSourceComponent.ReleaseConnections();
// Release the connection manager
runtimeConnectionSource.ReleaseConnectionManager();
// Add a new component metadata object to the data flow
IDTSComponentMetaData90 oledbDestinationMetadata = componentMetadataCollection.New();
//Associate the component metadata object with the OLE DB Destination Adapter
oledbDestinationMetadata.ComponentClassID = "DTSAdapter.OLEDBDestination";
// Instantiate the OLE DB Destination adapter
IDTSDesigntimeComponent90 oledbDestinationComponent = oledbDestinationMetadata.Instantiate();
// Ask the component to set up its component metadata object
oledbDestinationComponent.ProvideComponentProperties();
// Add an OLE DB connection manager
ConnectionManager connectionManagerDestination = package.Connections.Add("OLEDB");
connectionManagerDestination.Name = "OLEDBDestination";
// Set the connection string
connectionManagerDestination.ConnectionString = "provider=sqlncli;server=HSCHBSCGN25008;integrated security=sspi;database=Muthu_SSIS_Testing";
// Set the connection manager as the OLE DB Destination adapter's runtime connection
IDTSRuntimeConnection90 runtimeConnectionDestination = oledbDestinationMetadata.RuntimeConnectionCollection["OleDbConnection"];
runtimeConnectionDestination.ConnectionManagerID = connectionManagerDestination.ID;
// Tell the OLE DB Destination adapter to use the SQL Command access mode.
oledbDestinationComponent.SetComponentProperty("AccessMode", 2);
// Set up the SQL command
oledbDestinationComponent.SetComponentProperty("SqlCommand", "select from EmplTable");
// Set up the connection manager object
runtimeConnectionDestination.ConnectionManager = DtsConvert.ToConnectionManager90(connectionManagerDestination);
// Get the standard output of the OLE DB Source adapter
IDTSOutput90 oledbSourceOutput = oledbSourceMetadata.OutputCollection["OLE DB Source Output"];
// Get the input of the OLE DB Destination adapter
IDTSInput90 oledbDestinationInput = oledbDestinationMetadata.InputCollection["OLE DB Destination Input"];
// Create a new path object
IDTSPath90 path = pipeline.PathCollection.New();
// Connect the source and destination adapters
path.AttachPathAndPropagateNotifications(oledbSourceOutput, oledbDestinationInput);
IDTSInput90 input = oledbDestinationInput;
IDTSVirtualInput90 vInput = input.GetVirtualInput();
foreach (IDTSVirtualInputColumn90 vColumn in vInput.VirtualInputColumnCollection)
{
// Call the SetUsageType method of the destination
// to add each available virtual input column as an input column.
oledbDestinationComponent.SetUsageType(
input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
}
foreach (IDTSInputColumn90 col in oledbDestinationInput.InputColumnCollection)
{
IDTSExternalMetadataColumn90 exCol = oledbDestinationInput.ExternalMetadataColumnCollection[col.Name];
oledbDestinationComponent.MapInputColumn(oledbDestinationInput.ID, col.ID, exCol.ID);
}
Console.WriteLine("done");
Console.ReadKey();
// Save the package
//Application application = new Application();
//application.SaveToXml(@"c:OLEDBTransfer.dtsx", package, null);
}
While debugging, I'm getting an exception (ELEMENTNOTFOUND) at this line:
IDTSExternalMetadataColumn90 exCol = oledbDestinationInput.ExternalMetadataColumnCollection[col.Name];
because the external metadata has no appropriate column to map to the destination component.
Can anyone help me resolve this issue?
Regards,
kris
View 1 Replies
Jan 7, 2014
I have an SSIS package in the DW environment. How can I run this package in the production environment by creating a job and running it remotely?
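A minimal sketch of the remote-start side, assuming the package has already been deployed on the production server and wrapped in an Agent job there (the job name is made up):

-- Sketch only: start the production job from another server/session.
EXEC msdb.dbo.sp_start_job @job_name = N'Load DW nightly'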
View 7 Replies
May 21, 2007
I'm very new to SSIS, so I hope I don't sound too ignorant. I was told there might be a way to create user interfaces for SSIS projects, i.e., I am trying to create an import package that will let a user import an Excel file to a SQL table. What I would like is some sort of web interface that lets the user select the source file and destination table.
The more I search for a way to do this, the more it seems impossible without knowing a ton about coding. If anyone can help I'd appreciate it.
View 4 Replies
Sep 11, 2007
I would like to create an event handler that would catch any errors that result from a sys.<table> not existing. The package is designed to run on both SQL Server 2000 and SQL Server 2005, and when I query sys.tables the query fails on SQL Server 2000. I just need a good starting point: I would like something so that when the server isn't 2005 it just skips that server, doesn't fail the package, and doesn't count towards the maximum error count. Thanks for any help.
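One possible starting point (a sketch, not a tested pattern) is to branch on the server version inside the query itself, so SQL Server 2000 instances return an empty result instead of raising an error:

-- Sketch only: sys.tables exists only on 2005+, so the reference is wrapped in EXEC
-- and only compiled on servers where the version check passes.
IF CAST(SERVERPROPERTY('ProductVersion') AS varchar(20)) LIKE '8.%'
    SELECT CAST(NULL AS sysname) AS name WHERE 1 = 0   -- SQL 2000: empty result set
ELSE
    EXEC ('SELECT name FROM sys.tables')               -- SQL 2005: real query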
-Kyle
View 1 Replies
Sep 29, 2006
I need to run a make-table query against an Access database from an SSIS package. I tried to do this with an OLE DB Command task, but it fails to create the table even though the task execution comes back successful. Any thoughts?
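For reference, the make-table statement itself is plain Jet/Access SQL along these lines (names are examples); an Execute SQL Task pointed at the Access (Jet) connection is often a better fit for DDL like this than the OLE DB Command transform, which is meant to run per row in the data flow:

-- Jet (Access) make-table query; table and column names are illustrative.
SELECT CustomerID, CustomerName
INTO NewCustomerTable
FROM Customers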
View 1 Replies
Apr 18, 2007
Trying to figure out the best method of reading in a number of flat files, all with different numbers of columns and data types, and outputting them to a database.
Here's the problem: They are EBCDIC encoded and some of the columns are packed decimal. I've set up one package that takes the flat file, unpacks the decimal (Using UnpackDecimal component) and then sending the rest through a second component to go from EBCDIC -> ASCII.
What I need is a way to do this for every flat file based on the schema for that flat file. One current solution is to write a script/app to create the .dtsx XML file and then execute that for each flat file. It appears this may be possible, but I haven't gotten far enough to know for sure. So my questions are these:
1) Is there an easier way to do this (i.e., somehow feed the schema to the package and use it to dynamically set up the column markers and determine which columns get fed to the unpack decimal component)?
2) If there isn't a better way, will dynamically creating the .dtsx XML file based on the necessary input/output columns for each flat file work? If so, what is a good source of information on this (information about how the .dtsx XML file is set up, what needs to be changed, what doesn't, etc.)?
Thanks,
Travis
View 1 Replies
Jun 13, 2006
In the tutorial Creating a Basic Package Using a Wizard > Lesson 1: Creating the Basic Package, it says to use the following SQL statement on the query page:
SELECT * FROM [Customers$] WHERE NumberCarsOwned > 0
When I paste the query in, I get the message:
This SQL statement is not a query.
Does anyone have any suggestions? The input and output are set up correctly and I have the sample excel file Customers.xls.
I am new to all this; is there some setting I need to change for the tutorial to work?
FYI, I have installed SQL 2005 SP1.
View 4 Replies
Nov 17, 2004
Hi all,
Looking at BOL for temp table help, I discovered that a local temp table (which I want to only have a lifetime within my stored proc) SHOULD be visible to all (child) stored procs called by the parent stored proc.
However, the following code works just peachy when I use a GLOBAL temp table (i.e., ##MyTempTbl) but fails when I use a local temp table (i.e., #MyTempTbl). Through trial and error, and careful weeding efforts, I know that the error I get on the local version is coming from the xp_sendmail call. The error I get is: ODBC error 208 (42S02) Invalid object name '#MyTempTbl'.
Here is the code that works:
SET NOCOUNT ON
CREATE TABLE ##MyTempTbl (SeqNo int identity, MyWords varchar(1000))
INSERT ##MyTempTbl values ('Put your long message here.')
INSERT ##MyTempTbl values ('Put your second long message here.')
INSERT ##MyTempTbl values ('put your really, really LONG message (yeah, every guy says his message is the longest...whatever!')
DECLARE @cmd varchar(256)
DECLARE @LargestEventSize int
DECLARE @Width int, @Msg varchar(128)
SELECT @LargestEventSize = Max(Len(MyWords))
FROM ##MyTempTbl
SET @cmd = 'SELECT Cast(MyWords AS varchar(' +
CONVERT(varchar(5), @LargestEventSize) +
')) FROM ##MyTempTbl order by SeqNo'
SET @Width = @LargestEventSize + 1
SET @Msg = 'Here is the junk you asked about' + CHAR(13) + '----------------------------'
EXECUTE Master.dbo.xp_sendmail
'YoMama@WhoKnows.com',
@query = @cmd,
@no_header= 'TRUE',
@width = @Width,
@dbuse = 'MyDB',
@subject='none of your darn business',
@message= @Msg
DROP TABLE ##MyTempTbl
The only thing I change to make it fail is the table name: change it from ##MyTempTbl to #MyTempTbl, and it dashes the email hopes of the stored procedure upon the jagged rocks of electronic despair.
Any insight, anyone? Or is BOL just full of...well..."stuff"?
View 2 Replies
Apr 8, 2008
Hi All,
I have created fact tables and dimension tables in the data warehouse database, and I created an OLAP cube from those tables.
I want to run an SSIS package which populates these fact and dimension tables from the data sources.
Thanks in advance,
Archana
View 5 Replies
Mar 12, 2008
Hi
I have a problem: I receive the following error message when I try to add a new step to a SQL Server Agent job:
Failed to retrieve data for this request. (Microsoft.SqlServer.SmoEnum)
Additional information:
An exception occurred while executing a Transact-SQL statement or batch.
(Microsoft.SqlServer.ConnectionInfo)
The current transaction cannot be committed and cannot support operations that write to the log file. Roll back the transaction.
(The same message is repeated several times.)
(Microsoft SQL Server, Error: 3930)
This error pops up right after I change the type of the step to "SQL Server Integration Services Package".
I have made the following configurations:
The user group (Windows group) that the user belongs to has the following roles in msdb:
db_dtsadmin
db_dtsltuser
db_dtsoperator
SQLAgentOperatorRole
SQLAgentReaderRole
SQLAgentUserRole
I have made a proxy for SQL Server Agent which has the following subsystem:
"SQL Server Integration Services Provider". The proxy is tied to the same login which has those SQL Agent and DTS roles in the msdb database.
I'm using Windows authentication, and the user that logs into the SQL Server is in the same group for which I have set all of the rights.
P.S. Clearly I'm missing some role or right somewhere, because as soon as I give the group the sysadmin role, all the users in that group can create SSIS steps in the Agent.
P.P.S. I have been living under the impression that I don't have to give sysadmin rights to people that create SSIS packages and schedule them with the Agent.
View 1 Replies
Feb 21, 2008
Hi
I'm new to SSIS. I'm trying to create an SSIS package that can copy tables, permissions, indexes, constraints, etc. to SQL Server 2000 from SQL Server 2005. Can anyone help, step by step?
View 1 Replies
Apr 26, 2007
Hi guys,
I'm a newbie DBA and I'm trying to create a package that would extract data from MySQL and insert it into a SQL 2005 server. I'm quite new to SSIS and would like to ask for your help in going through with this.
I hope you guys can help me with this.
Hoping to hear from you soon.
Thank you so much.
Kind regards,
Neil
View 7 Replies
May 25, 2015
If I have an XML file without an XSD, what is the best way to create tables and import the data into SQL Server? I know I can use xsd.exe to create an XSD from my XML.
But if I want my structure to be somewhat different in SQL Server, how would I go about creating a reliable and repeatable import system for my data so I can easily manage the data updates?
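One repeatable pattern (a sketch under assumed element names, file path and target table) is to bulk-load the raw XML and shred only the nodes that map onto the SQL Server structure:

-- Sketch only; file path, element names and target table are hypothetical.
DECLARE @x xml

SELECT @x = BulkColumn
FROM OPENROWSET(BULK 'C:\Imports\customers.xml', SINGLE_BLOB) AS src

INSERT dbo.Customer (CustomerID, CustomerName)
SELECT c.value('(Id)[1]',   'int'),
       c.value('(Name)[1]', 'nvarchar(100)')
FROM @x.nodes('/Customers/Customer') AS t(c)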
View 3 Replies
Oct 22, 2007
I am transferring data from Oracle and getting the error message below.
I am using 4 data flow tasks within a single control flow, and all 4 tasks query the same table but populate data into different SQL tables based on the WHERE condition.
[OLE DB Source 1 [853]] Error: An OLE DB error has occurred. Error code: 0x80004005. An OLE DB record is available. Source: "Microsoft OLE DB Provider for Oracle" Hresult: 0x80004005 Description: "ORA-01652: unable to extend temp segment by 64 in tablespace TEMP ".
View 4 Replies
Nov 3, 2006
I need to create a fairly simple package. And almost because of the simplicity, I'm stumped.
I need to copy all non-system tables from server1.database1 to server2.database2. Additionally, four of the 30+ tables need to be renamed on the fly, i.e. their names will reflect the year and month that the copy takes place.
I've tried using the Transfer SQL Server Objects task to simply copy the tables, but I get flaky results at best with it. Sometimes it tells me the source table doesn't exist, when I can clearly see it (and I've selected it from the list). And even though I have turned on the Include Indexes option, they don't always come through.
I'm wondering if I need to do a For Each loop looking at an ADO object?
Any suggestions?
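For the rename-on-the-fly part, the suffix can be built and applied with a couple of lines of T-SQL (a sketch; the table name is invented):

-- Sketch only: yyyymm suffix from the copy date, then rename the copied table.
DECLARE @suffix char(6), @newname sysname
SET @suffix  = CONVERT(char(6), GETDATE(), 112)        -- e.g. '200611'
SET @newname = 'SalesSnapshot_' + @suffix
EXEC sp_rename 'dbo.SalesSnapshot', @newname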
Stephanie
View 5 Replies
Feb 15, 2011
I have a scenario where I need to create SQL Server tables dynamically.
I have multiple XML data files in a particular location and want to load the XML data into SQL Server tables, but the metadata of the XML data files is not the same.
Hence the approach is:
1. Pick first file from that location
2. Create a table according to that XML data file's metadata
3. Load the data into the newly created table.
4. Pick up the next XML data file.
5. Loop through until no more XML data files exist in that location.
View 4 Replies
Jul 17, 2007
Hi Guys,
Yet another question on SSIS issues. I have a package now which is working fine.
The package consists of a control flow, and I have 2 data flow sources which go through a Union All first and are then saved into a SQL Server destination.
It's fine up to this point, but I've just been notified that I need to generate 2 files based on different values after I combine the data from the 2 SQL Server data flow sources.
My question is: how can I know which rows are being saved to this SQL Server destination?
I have a primary key which is an autoincrement column.
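Since the destination table has an autoincrement key, one simple way to identify the rows written by a given run (a sketch; table and column names are examples) is to capture the highest key before the load and read everything above it afterwards:

-- Sketch only.
DECLARE @before int
SELECT @before = ISNULL(MAX(RowID), 0) FROM dbo.CombinedDestination

-- ... run the data flow here ...

SELECT *
FROM dbo.CombinedDestination
WHERE RowID > @before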
Thank you
Gemma
View 45 Replies
Jul 20, 2005
I have an application that I am working on that uses some small temp tables. I am considering moving them to table variables - would this be a performance enhancement?
Some background information: the system I am working on has numerous tables, but for this exercise there are only three that really matter: Claim, Transaction and Parties.
A Claim can have 0 or more transactions.
A Claim can have 1 or more parties.
A Transaction can have 1 or more parties.
A party can have 1 or more claims.
A party can have 1 or more transactions.
Parties are really many-to-many back to the Claim and Transaction tables.
I have three stored procs: insertClaim, insertTransaction, insertParties.
From an XML point of view the data looks like this: <claim><parties><info />
insertClaim takes 3 sets of parameters - all the claim-level information (as individual parameters), all the parties on a claim (as one XML parameter), and all the transactions on a claim (as one XML parameter with parties as part of the XML).
insertClaim calls insertParties and passes in the parties XML - insertParties returns a recordset of the newly inserted records. insertClaim then uses that table to join the claim to the parties. It then calls insertTransaction and passes the transaction XML into that sproc. insertTransaction then inserts the transactions in the XML, and also calls insertParties, passing in the XML snippet.
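For reference, converting one of the small temp tables to a table variable is mostly a syntax change (a sketch with made-up columns); for a handful of rows this can help by avoiding recompilations, but table variables have no statistics, so larger sets can actually get slower:

-- Sketch only: a table variable standing in for a small temp table.
DECLARE @Parties TABLE
(
    PartyID   int IDENTITY(1,1) PRIMARY KEY,
    PartyName varchar(100) NOT NULL
)

INSERT @Parties (PartyName)
SELECT PartyName
FROM dbo.Parties        -- hypothetical source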
View 2 Replies
Jun 26, 2014
I have three databases with 40 tables in each database on my server. For each of the tables I want to know which SSIS package generates that table. Is there a script that can do this?
If no SSIS package creates or populates a table on the server, I then want to check whether there is a stored procedure that populates it.
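The stored procedure side can be covered with a catalog query (a sketch; the table name is an example). Packages are harder: if they are stored in msdb, the package XML can be cast to text and searched in a similar LIKE fashion, but that depends on where the packages are deployed.

-- Sketch only: finds stored procedures whose definition mentions a given table.
DECLARE @table sysname
SET @table = N'FactSales'      -- hypothetical table name

SELECT OBJECT_NAME(m.object_id) AS procedure_name
FROM sys.sql_modules AS m
JOIN sys.objects     AS o ON o.object_id = m.object_id
WHERE o.type = 'P'
  AND m.definition LIKE N'%' + @table + N'%'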
View 6 Replies
Oct 22, 2015
I need to export multiple tables from a database to multiple csv files (one for each table).
Rather than use SSIS and have multiple OLEDB sources and destinations (one for each table), is there a way to have a generic package that will export all the tables in the database?
One way I can see is to use BCP in a loop, with the loop driven by a select statement against something like sys.tables (or another table that I prepped with just the tables I want, if I don't want them all).
I.e., I would use a stored procedure that uses BCP (called via xp_cmdshell), so not via SSIS, although I could wrap the whole thing up in SSIS; there is no real need, though.
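A rough sketch of that BCP-in-a-loop idea (database name, output folder and bcp switches are examples, and xp_cmdshell has to be enabled):

-- Sketch only.
DECLARE @tbl sysname, @cmd varchar(500)

DECLARE tbl_cur CURSOR FOR
    SELECT name FROM sys.tables        -- or a prepped list of tables
OPEN tbl_cur
FETCH NEXT FROM tbl_cur INTO @tbl

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = 'bcp MyDatabase.dbo.' + @tbl
             + ' out "C:\Exports\' + @tbl + '.csv" -c -t, -T -S ' + @@SERVERNAME
    EXEC master.dbo.xp_cmdshell @cmd
    FETCH NEXT FROM tbl_cur INTO @tbl
END
CLOSE tbl_cur
DEALLOCATE tbl_cur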
View 10 Replies
Jan 4, 2007
Hi All,
"Happy New Year!"
I need to append data to the existing Table1 from a .txt file stored like this: "http://xx0opt02.corpuk.net/ABC_REPORTS/DATA/Table1.csv"
(Columns are identical)
This .csv contains a rolling 7 days of stats, which means that if it is added every morning, 6 of those days will have been added before and must be deleted.
I would like to schedule an automatic procedure to create a temp table on the server every day, plus a script removing duplicates.
I have a few questions:
*) Can I schedule from the Enterprise console to read/create a table from the above link to do this?
**) I heard I need SQL Agent. If so, why, and what is it?
***) Is it better to append the data and then run a script removing duplicates, or is it better to import into a temp table, run a comparison between the two, delete from the temp table what is common, and then append what is left of the temp table to Table1? (A sketch of this follows the questions below.)
****) Could someone please paste a sample of creating and loading a table from a link like the one above?
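A sketch of the approach described in question *** (table, column and path names are invented; note that SQL Server cannot bulk-load straight from an http:// URL, so the file would need to be reachable as a local or UNC path):

-- Sketch only: stage the rolling 7-day file, then append only the new rows.
CREATE TABLE #Staging (StatDate datetime, MetricValue int)

-- Load the file here, e.g.:
-- BULK INSERT #Staging FROM '\\someserver\ABC_REPORTS\DATA\Table1.csv'
--     WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

INSERT dbo.Table1 (StatDate, MetricValue)
SELECT s.StatDate, s.MetricValue
FROM #Staging AS s
WHERE NOT EXISTS (SELECT 1 FROM dbo.Table1 AS t WHERE t.StatDate = s.StatDate)

DROP TABLE #Staging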
Thanks for your help,
Gezza
View 1 Replies
Jan 27, 2008
I am getting frustrated with the accounting database table structure. I am really struggling to get the 'par hours'. This is the amount of time that someone is assigned to work, based on their workgroup, in the workhour table.
Table 1 - Workhour table
Hours Resourceid workhourgroupcode dayid
NULL TJOHNSO TJOHNSO 1
8.0 TJOHNSO TJOHNSO 2
NULL TJOHNSO TJOHNSO 3
8.0 TJOHNSO TJOHNSO 4
NULL TJOHNSO TJOHNSO 5
8.0 TJOHNSO TJOHNSO 6
NULL TJOHNSO TJOHNSO 7
Table 2 - Time
WorkDayHours WorkGroup Resourceid DayID NonTime DayWorked HoursWorked
NULL SMB ABOOKWALTER 1 NULL 2007-12-26 00:00:00.000 7.50
NULL SMB ABOOKWALTER 1 NULL 2007-12-27 00:00:00.000 7.50
NULL SMB ABOOKWALTER 1 NULL 2007-12-28 00:00:00.000 7.50
8.30 SMB ABOOKWALTER 2 1.00 2007-12-26 00:00:00.000 7.50
8.30 SMB ABOOKWALTER 2 1.00 2007-12-27 00:00:00.000 7.50
8.30 SMB ABOOKWALTER 2 1.00 2007-12-28 00:00:00.000 7.50
8.30 SMB ABOOKWALTER 3 1.00 2007-12-26 00:00:00.000 7.50
8.30 SMB ABOOKWALTER 3 1.00 2007-12-27 00:00:00.000 7.50
8.30 SMB ABOOKWALTER 3 1.00 2007-12-28 00:00:00.000 7.50
8.30 SMB ABOOKWALTER 4 1.00 2007-12-26 00:00:00.000 7.50
8.30 SMB ABOOKWALTER 4 1.00 2007-12-27 00:00:00.000 7.50
8.30 SMB ABOOKWALTER 4 1.00 2007-12-28 00:00:00.000 7.50
8.30 SMB ABOOKWALTER 5 1.00 2007-12-26 00:00:00.000 7.50
8.30 SMB ABOOKWALTER 5 1.00 2007-12-27 00:00:00.000 7.50
8.30 SMB ABOOKWALTER 5 1.00 2007-12-28 00:00:00.000 7.50
8.30 SMB ABOOKWALTER 6 1.00 2007-12-26 00:00:00.000 7.50
8.30 SMB ABOOKWALTER 6 1.00 2007-12-27 00:00:00.000 7.50
8.30 SMB ABOOKWALTER 6 1.00 2007-12-28 00:00:00.000 7.50
NULL SMB ABOOKWALTER 7 NULL 2007-12-26 00:00:00.000 7.50
NULL SMB ABOOKWALTER 7 NULL 2007-12-27 00:00:00.000 7.50
NULL SMB ABOOKWALTER 7 NULL 2007-12-28 00:00:00.000 7.50
I link these together by the resourceid and the dates worked, but since there is no date in the workhour table I cannot even do an outer join. The only thing I could think of was creating a temp table (or a view) to return a line item for every day during the timeframe given. I created a dayID for the second table, but I am not sure how to proceed. I need a line item to return for every day of the month, so the query can look at the workhours table to see how many hours someone is supposed to work; that will allow me to bring it into Crystal, group and total. A sample of the desired output is below, and a rough date-table sketch follows it.
8.30 SMB ABOOKW 6 1.00 2007-12-26 00:00:00.000 7.50
8.30 SMB ABOOKW 6 1.00 2007-12-27 00:00:00.000 7.50
8.30 SMB ABOOKW 6 1.00 2007-12-28 00:00:00.000 7.50
NULL SMB ABOOK 7 NULL 2007-12-29 00:00:00.000 0
NULL SMB ABOOKW 7 NULL 2007-12-30 00:00:00.000 7.50
NULL SMB ABOOKW 7 NULL 2007-12-31 00:00:00.000 0
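A rough sketch of the per-day table idea (the workhour table and column names are assumed from the post): build one row per calendar day of the reporting period with its day-of-week id, then join the workhour table to it to get par hours per actual date:

-- Sketch only.
DECLARE @start datetime, @end datetime
SET @start = '2007-12-01'
SET @end   = '2007-12-31'

CREATE TABLE #Days (CalendarDate datetime NOT NULL, DayID int NOT NULL)

WHILE @start <= @end
BEGIN
    INSERT #Days (CalendarDate, DayID) VALUES (@start, DATEPART(weekday, @start))
    SET @start = DATEADD(day, 1, @start)
END

SELECT d.CalendarDate, w.Resourceid, w.Hours
FROM #Days AS d
LEFT JOIN workhour AS w ON w.dayid = d.DayID       -- par hours for each calendar day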
View 2 Replies
Feb 22, 2008
I have 3 checkbox list panels that query the DB for their items. Panels 2 and 3 need to know the selection on panel 1. The panels allow multiple item selection. Multiple users may use this at the same time, and I wanted to have a full separation between the application and the DB. The ASP.NET application always uses stored procedures to access the DB. What's the best course of action? Using a permanent 'temp' table on the SQL Server? Accomplishing everything on the client side?
[Web application being built on ASP.NET 3.5 (IIS7) connected to SQL Server 2005]
View 1 Replies