Should You Save Results Of Intermediate Data Flow Steps To Temporary Tables Or Raw Files?
Jun 2, 2006
Hi,
I'm just starting off in SSIS and have a question that I can't find an answer to...
I'm loading in a number of files (in separate Data Flows) and performing some transformations on them before merging them back together. What I'm not sure about is what I should be doing with the data at the end of each of my "Import Data From XXXX Flat File" Data Flows. Am I better off using OLE DB Destinations (or SQL Server Destinations) and saving this intermediate data to temporary tables, or am I better off using Raw File Destinations and saving this intermediate data to files? Or is there, perhaps, a better option that I'm currently unaware of?
If the Raw File Destination is the way to go, isn't there a maintenance issue with cleaning up all the files created? And won't there be a management issue in ensuring that there is sufficient disk space available on the drive you are saving to?
I'm a bit confused and overwhelmed by SSIS at the moment, so any help would be much appreciated!
Thanks in advance,
Lawrie.
View 3 Replies
Jun 15, 2006
Hi, I have one data flow with 10 sources and destinations in the flow. For the sources I'm using the DataReader source over an ODBC connection, and for the destinations I'm using the OLE DB destination. By default these run in parallel when executed. Is there a way in the data flow to run them step by step, instead of creating 10 different Data Flow tasks in the control flow? Or is it better to have 10 different Data Flow tasks?
View 1 Replies
View Related
Jul 14, 2006
Hello.
I am using the "SSIS Log Provider for SQL Server" to log events to a table for "OnError" and "OnPostExecute" events of a package. This works as expected and provides a nice clean output on the execution steps of the package.
I am curious as to why I do not see any detail for any/all tasks that fall under the "Data Flow" section of the package, though. For instance, on my "Control Flow" tab I added a "Data Flow" task that simply loads a few tables from a source to a destination server. However, there is nothing shown in the logging output, just that a Data Flow task was initiated. And when configuring this logging under SSIS --> Logging, in the checkbox area on the left you cannot "drill into" data flow steps.
Is there a reason why there is no detailed logging for Data Flow tasks? Would getting to that require me to create a custom log provider?
Thanks for the help.
Greg
View 1 Replies
View Related
Nov 16, 2014
In the T-SQL below, I retrieved data from two queries and I've tried to join them to create a report in SSRS 2008 R2. The SQL runs, but I can't create a report from it. (I also couldn't get this query to run in an Excel file that connects to my SQL Server database. I've used other T-SQL queries in this Excel file and they run fine.) I think that's because I am creating temporary tables. How do I modify my SQL so that I can get the same result without creating temporary tables?
/*This T-SQL gets the services for the EPN download from WITS*/
-- Select services entered in the last 20 days along with the MPI number and program code.
SELECT DISTINCT dbo.group_session_client.note, dbo.group_session_client.error_note, dbo.group_session_client.group_session_id,
dbo.group_session_client.group_session_client_id, dbo.group_session.signed_note, dbo.group_session.unsigned_note
into #temp_group_sessions
FROM dbo.group_session_client, dbo.group_session
WHERE dbo.group_session_client.group_session_id = dbo.group_session.group_session_id
-- Select group notes
SELECT DISTINCT
dbo.client_ssrs.state_client_number, dbo.delivered_service_detail.program_name, dbo.delivered_service_detail.start_date,
dbo.delivered_service_detail.start_time,
dbo.delivered_service_detail.service_name, dbo.delivered_service_detail.cpt_code, dbo.delivered_service_detail.icd9_code_primary,
[code]....
-- Form an outer join selecting all services with any group notes attached to them.
select * from #temp_services
LEFT OUTER JOIN #temp_group_sessions
on #temp_services.group_session_client_id = #temp_group_sessions.group_session_client_id
;
-- Drop temporary tables
DROP TABLE #temp_group_sessions;
DROP TABLE #temp_services;
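For what it's worth, the same shape can be had without temp tables by folding each staging query into a CTE (supported from SQL Server 2005 onward, so fine on 2008 R2). A hedged sketch follows, where the services CTE is only a stand-in for the elided #temp_services query above:
WITH group_sessions AS (
    SELECT DISTINCT gsc.note, gsc.error_note, gsc.group_session_id,
           gsc.group_session_client_id, gs.signed_note, gs.unsigned_note
    FROM dbo.group_session_client gsc
    INNER JOIN dbo.group_session gs
            ON gsc.group_session_id = gs.group_session_id
),
services AS (
    -- stand-in for the elided #temp_services SELECT; this assumes
    -- delivered_service_detail carries group_session_client_id, as the
    -- original join implies -- substitute the real query here
    SELECT dsd.*
    FROM dbo.delivered_service_detail dsd
)
SELECT *
FROM services s
LEFT OUTER JOIN group_sessions g
  ON s.group_session_client_id = g.group_session_client_id;
Because no temp tables are created or dropped, this form should also run from an Excel connection.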
View 9 Replies
View Related
Mar 20, 2007
Good morning, all,
I am working on importing an Excel workbook, saved as multiple CSV flat files, that has both group level data and related detail row on the same sheet. I have been able to import the group data into a table. As part of the Data Flow task, I want to be able to save the key value for the group, which I will use when I insert the detail rows.
My Data Flow has the following components: a Flat File source with the data, which goes to a Derived Column transformation to strip out extraneous dashes, which leads to the OLE DB Destination component.
I want to save the value as a package level variable, so that I can reference it in another dataflow.
Is this possible, and if so, at what point do I save the value?
Thanks,
Kathryn
View 1 Replies
View Related
Jul 25, 2007
I have Data Flow task that contains 50 components.
My computer configuration: 1 GB RAM, Microsoft Windows Server 2003.
Periodically, when I try to save the package after making some changes, an "Out of memory ..." exception message box appears, and soon after that a "Not fatal error occurs ..." message box shows. If I close the solution and open it again, all my 50 components disappear -- instead I see an empty list, and all my work is lost.
Such "not fatal errors" make a hell out of the job -- every time I need to change the package I must add the package to an archive!!!
View 4 Replies
View Related
Feb 14, 2007
Hello
Kindly, I need support on this issue: I created a Data Flow task that imports from a flat file and stores the data in a database, but I need to save the results of each task run into a specific table, like:
Record count transferred
Destination table name
Time ..........etc
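A hedged sketch of what such an audit table might look like (all names are illustrative): a Row Count transformation in the data flow can fill a package variable, and an Execute SQL task after the data flow can then log a row:
CREATE TABLE dbo.PackageAuditLog (
    AuditID            int IDENTITY(1,1) PRIMARY KEY,
    PackageName        varchar(100),
    DestinationTable   varchar(100),
    RecordsTransferred int,
    RunTime            datetime DEFAULT GETDATE()
)

-- run from an Execute SQL task, with the ? markers mapped to
-- System::PackageName and user variables for the table name and row count
INSERT INTO dbo.PackageAuditLog (PackageName, DestinationTable, RecordsTransferred)
VALUES (?, ?, ?)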
thanks
View 3 Replies
View Related
Feb 9, 2007
Hi,
My scenario:
I have a master securities table which has 7 fields. As a part of the daily process I am uploading flat files into database tables. The flat files contains the master(static) security data as well as the analytics(transaction) data. I need to
1) separate the master (static) data from the flat files,
2) check whether that data is present in the master table, if not then insert that data into the master table
3) If the data is present, then move the existing record to a history table and then update the main master table.
All the 7 fields need to be checked to uniquely identify a single record in the master table.
How can this be done? Can we use a combination of data flow items, or should we write a SQL procedure to do all this?
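Either approach can work; for the stored-procedure route, a hedged sketch of the three steps (table names are illustrative, f1/f2 stand in for the identifying fields and f3 for the rest -- extend the join and EXISTS conditions to all seven fields):
-- 1) archive the master rows that are about to be replaced
INSERT INTO dbo.security_history (f1, f2, f3, archived_on)
SELECT m.f1, m.f2, m.f3, GETDATE()
FROM dbo.security_master m
INNER JOIN dbo.staging_master s ON s.f1 = m.f1 AND s.f2 = m.f2

-- 2) update the existing master rows from the file
UPDATE m
SET m.f3 = s.f3
FROM dbo.security_master m
INNER JOIN dbo.staging_master s ON s.f1 = m.f1 AND s.f2 = m.f2

-- 3) insert the rows not yet present in the master
INSERT INTO dbo.security_master (f1, f2, f3)
SELECT s.f1, s.f2, s.f3
FROM dbo.staging_master s
WHERE NOT EXISTS (SELECT 1 FROM dbo.security_master m
                  WHERE m.f1 = s.f1 AND m.f2 = s.f2)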
Thanks in advance for your help.
Regards,
$wapnil
View 4 Replies
View Related
Sep 24, 2007
Greetings,
I need some help determining the best way to accomplish my task. The workflow starts by generating a list of unique ID's from a local table. Then take that list of unique ID's and query an Oracle table for all matching records.
My thought was to first use an Execute SQL task with the following SQL:
select projectid from projectlist group by projectid
with Result Set configured as follows:
Result Name = projectid
Variable Name = varProjectIDList
Then in the Data Flow Task add a DataReader Source to pull the matching data. Here's where I'm getting hung up. I'd like to pass the result set from the Execute SQL task. I tried the following SQL but it doesn't work.
select * from masterlist where projectid = @[User::varProjectIDList]
I'm open to any suggestion on the best way to take my unique list and use it as input for a query against my Oracle DB.
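One workaround (a hedged sketch, with names taken from the post): have the Execute SQL task return the IDs as a single comma-separated string into a String variable, then build the DataReader source's query from that variable with an expression:
DECLARE @idList varchar(8000)

SELECT @idList = COALESCE(@idList + ',', '') + CAST(projectid AS varchar(20))
FROM (SELECT projectid FROM projectlist GROUP BY projectid) p

SELECT @idList AS projectid  -- single-row result set mapped to User::varProjectIDList
The source query is then constructed with an expression along the lines of "select * from masterlist where projectid in (" + @[User::varProjectIDList] + ")", since variables cannot be referenced inline the way the failing query attempts.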
Thank you for your ideas.
Rob
View 4 Replies
View Related
Apr 24, 2007
The SQL computed is complex enough that I can't see a way to make it a parameterized query. The obvious approach seems to be to compute the SQL in a CONTROL FLOW SCRIPT TASK and then use it to load a variable to set the VARIABLE SOURCE of a CONTROL FLOW EXECUTE SQL TASK.
I see that I can return a resultset to a variable.
But getting the rows of the results into a data flow is not obvious. I have heard it mentioned that a Derived Column can do this. I can see using a dummy SCRIPT COMPONENT as DATA SOURCE, with nothing in it, to then drop into DERIVED COLUMN. But when setting up DERIVED COLUMN, I don't see how to pull the columns out of the RESULTSET variable.
If it makes a difference I think the columns of the resultset will always be the same in this scenario.
Maybe this is totally the wrong approach? Any clues would be appreciated.
View 1 Replies
View Related
Jun 1, 2006
Hi,
Quick question on how SSIS handles queries from a Data Source in a Data Flow. I noticed that when I run a particular query from Query Analyzer it takes forever, but when I run the same query in an SSIS data source in a data flow, the results are immediate.
The query plan is already cached in SQL.
Is this just something I am seeing incorrectly, or is there some bit of optimization in SSIS? As per my understanding, SSIS does not optimize the source query.
Thanks,
Gaurav
View 3 Replies
View Related
Sep 7, 2007
Hi all,
I'm trying to find some way to implement a move-file task in my Data Flow. I parse through a flat file, retrieve a value, then use a lookup table to check if that value exists in a table... if the value doesn't exist, I need to move the file to a new directory and insert the value into a different table. Does anyone have any ideas about how to move the file while using the Data Flow pane?
Or is there some way to pass information from the lookup table to the Control Flow pane and use the Move File Task there? I'm trying to stay away from a Script Task or a Script Component unless it's unavoidable. I appreciate any ideas!
View 6 Replies
View Related
Aug 22, 2005
We're trying to read dBASE IV files as a source, but can't find any providers for that format. Will these be included in the final release? Is there another way? dBASE has always been supported, so it's kind of strange.
View 19 Replies
View Related
Nov 2, 2006
I'm importing a large csv file two different ways - one with Bulk Import Task and the other way with the Data Flow Task (flat file source -> OLE DB destination).
With the Bulk Import Task I'm putting all the csv rows in one column. With the Data Flow Task I'm mapping each csv value to its own column in the SQL table.
I used two different flat file sources and got the following:
Flat file 1: Bulk Import Task = 12,649,499 rows; Data Flow Task = 4,215,817 rows
Flat file 2: Bulk Import Task = 3,403,254 rows; Data Flow Task = 1,134,359 rows
Anyone have any guess as to why this is happening?
View 9 Replies
View Related
May 8, 2008
Hi All,
I'm developing a credit card application, and would like to store some sensitive data in a table temporarily, but be absolutely positive it either never goes into the log file in the first place, or is securely deleted from the log file when I delete the data. There's millions of other bits of data in the log, so truncating or otherwise disposing of the whole log is not an option.
In particular, I'm capturing the credit card mag stripe at the beginning of a transaction, and saving it while the work is done to complete the transaction. (ie, a pilot walks in, says "Top off the G4 with JetA, here's my card, have it ready when I get back.") We want to save the swipe info for maybe a couple of hours (it is encrypted during this time), use it for the authorization, then throw it away as per credit card industry rules.
Is there any way I can save the encrypted text string in a normal MSSQL field (Varchar(200)), have it stay around for a few hours as the user logs in and out of the database, then delete the record and be sure it's nowhere in the log?
TIA,
George Lehmann
Horizon Business Concepts, Inc.
Broken Arrow, OK
View 9 Replies
View Related
Feb 17, 2006
Hi,
I have an indexing problem. I have a sequence that needs an index number. I want to use a table data type, and I have a working sample, BUT I cannot reseed the table when needed. How do I do this?
This works only for the first ExitCoilID; then I need to RESEED.
Here is my code:
DECLARE
    @EntryCoilCnt AS INT,
    @ExitCoilID AS INT,
    @SubtractedFromEntyCoilCnt AS INT

DECLARE @ExitCoilID_NotProcessed TABLE (ExitCoilID int)

INSERT INTO @ExitCoilID_NotProcessed
SELECT DISTINCT ExitCoilID
FROM dbo.TrendEXIT
WHERE ExitCoilID IS NOT NULL
  AND ExitCnt IS NULL
ORDER BY ExitCoilID

DECLARE @ExitCoilID_Cnt_Index TABLE (ExitCoilID int, ExitCnt int IDENTITY (1,1))

IF @@ROWCOUNT > 0
BEGIN
    DECLARE ExitCoilID_cursor CURSOR FOR
        SELECT ExitCoilID FROM @ExitCoilID_NotProcessed ORDER BY ExitCoilID

    OPEN ExitCoilID_cursor
    FETCH NEXT FROM ExitCoilID_cursor INTO @ExitCoilID

    WHILE @@FETCH_STATUS = 0
    BEGIN
        INSERT INTO @ExitCoilID_Cnt_Index
        SELECT ExitCoilID
        FROM dbo.TrendEXIT
        WHERE ExitCoilID = @ExitCoilID
        ORDER BY EntryCoilID, Cnt

        SELECT * FROM @ExitCoilID_Cnt_Index
        --truncate @ExitCoilID_Cnt_Index
        --DBCC CHECKIDENT ('@ExitCoilID_Cnt_Index', RESEED, 1)

        FETCH NEXT FROM ExitCoilID_cursor INTO @ExitCoilID
    END

    CLOSE ExitCoilID_cursor
    DEALLOCATE ExitCoilID_cursor

    SELECT * FROM @ExitCoilID_Cnt_Index
END --IF @@ROWCOUNT <> 0
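For what it's worth, table variables cannot be truncated, and DBCC CHECKIDENT does not accept them, which is why the two commented lines fail. A hedged alternative on SQL Server 2005 and later that avoids both the cursor and the reseed is ROW_NUMBER(), which restarts the numbering for each ExitCoilID:
SELECT ExitCoilID,
       ROW_NUMBER() OVER (PARTITION BY ExitCoilID
                          ORDER BY EntryCoilID, Cnt) AS ExitCnt
FROM dbo.TrendEXIT
WHERE ExitCoilID IS NOT NULL
  AND ExitCnt IS NULL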
View 8 Replies
View Related
Jul 14, 2003
Over the weekend, one of our out-of-house programmers ran an update to our three main tables. I know these are kind of broad strokes, but basically he compared the data and updated certain fields when certain conditions were met (lots of rules, basically). The three tables are one-to-one and contain a little over a million records. The comparison file contained around 400k records.
The scripts made it through 250k records from the comparison file before he had to stop it for the weekend.
When I came in to test the data yesterday, I was met with problems in my front-end application: it would lock up on the write back to the database. I went into EP and experienced the same thing; after making any changes to a record, it would just lock up. This only appears to be a problem on the two bigger tables of the three. I currently have 12 gigs or so free on that box, and I have already shrunk the log and data files.
I tried removing and re-adding the indexes, but I am freezing up every time I try to either change or delete the clustered index on the primary key. I don't know why, but I thought maybe that was my issue.
I know this is pretty broad, but even if someone could give me ideas as to why SQL would lock up like that when trying to just save the data, it would be most helpful.
NOTE: There were NO structure changes in the update process and my restored data from Friday works perfect.
If you need more info, just ask. Thanks in advance for the help.
Don
elitecobra2000@yahoo.com
View 14 Replies
View Related
Mar 7, 2007
Microsoft could not have made SQL Server 2005 Reporting Services any harder, given the wacky way you have to add web references in Visual Studio 2005.
Here are the steps for you poor souls like me:
When you create your C# program (maybe it's the same thing for VB), right-click on the project and click Add Web Reference.
In the box to the right of the Go button you must put in a path to:
http://<server>/reportserver/ReportExecution2005.asmx?wsdl
If you used the default ReportServer settings this should work (you will need to make sure Reporting Services is working first). Entering this exact URL and pressing the Go button gives you access to the ReportExecutionService object; of course, you need to rename it properly in the text box in the lower right labeled "Web Reference Name:". I renamed it to ReportExecution2005.
Note: you must add "using <myclassname>.ReportExecution2005;", which allows access to the components and allows compiling.
Once you have the ReportExecutionService object set up correctly and want to use it in a C# program, Microsoft makes you suffer further with problems getting the path right in the rs.LoadReport(ReportPath, HistoryID) function.
I believe I found the correct path string through trial and error. When I go to my Reporting Services site, the browser shows a URL from which I guessed the path might be derived:
http://myserver.mydomain.com/Reports/Pages/Report.aspx?ItemPath=%2fReport+Project2%2fKSConcordanceErrorReport&SelectedTabId=PropertiesTab&SelectedSubTabId=GenericPropertiesTab
displaying:
SQL Server Reporting Services
Home > Report Project2 > KSConcordanceErrorReport
I first tried /home/Report Project2/KSConcordanceErrorReport, but that failed to work; the RichTextBox stored:
The item '/home/Report Project2/KSConcordanceErrorReport' cannot be found. ---> The item '/home/Report Project2/KSConcordanceErrorReport' cannot be found.
Next I tried /Report Project2/KSConcordanceErrorReport; "??t" was all that was returned with this path, so I believe (but am not sure) that this is the report path to be used in the rs.LoadReport function. So the catalog-based report path should be:
/Report Project2/KSConcordanceErrorReport
I set up a form with 3 fields:
TextBox: ReportPath, so I could play with the path I provide.
RichTextBox: to store the output from rs.Render.
RunButton1: to call up the render code (see below).
Please note that when I start the program in debug mode, the following error message precedes the execution:
The Project cannot be deployed because no target server
is specified. Provide a value for the TargetServerURL
property in the property pages for this project
Any help in getting output from this render function would be appreciated.
Perhaps some of my work will help others struggling with this poorly documented service. Even the three poor books I had to work with did not help much. Does Microsoft really expect people to use this?
Here is the relevant code. I will assume knowledge of Forms development. I provide two pieces: (1) the calling code from the button press event, and (2) the method called:
richTextBox1.Text = EmbededReportx.Program.GetReportXML2005(
    "http://myserver.mydomain.com/ReportServer/ReportExecution2005.asmx",
    ReportPathTextBox.Text);

public static string GetReportXML2005(string ReportingServicesURL, string ReportPath)
{
    ReportExecutionService rs = new ReportExecutionService();

    // windows authentication
    rs.Credentials = System.Net.CredentialCache.DefaultCredentials;
    rs.Url = ReportingServicesURL;

    byte[] result = null;
    ParameterValue[] parameters = new ParameterValue[1];
    parameters[0] = new ParameterValue();
    parameters[0].Name = "user2report";
    parameters[0].Value = "%";

    string encoding, mimetype, extension;
    Warning[] warnings = null;
    string[] streamIDs = null;

    try
    {
        rs.LoadReport(ReportPath, null);
        rs.SetExecutionParameters(parameters, "en-us");
        result = rs.Render("XML", null, out extension, out encoding,
                           out mimetype, out warnings, out streamIDs);
    }
    catch (SoapException e)
    {
        return e.Message;
    }

    return System.Text.Encoding.ASCII.GetString(result);
}
Any help anyone could give me regarding this would be most appreciated:
View 1 Replies
View Related
Jun 13, 2006
Is it possible to skip all steps following the Script Task results (step 1) in a Foreach container? I am iterating through all the files in a Foreach container and parsing a few lines of each file, and based on the result I want to force the Foreach loop to move on to the next file instead of executing the next steps. Is it possible to force the loop to go to the next file if the test criteria in the very first step (the Script Task) fail? Any inputs will be much appreciated.
THanks,
MShah
View 1 Replies
View Related
May 18, 2015
I have one stored procedure in which I open a transaction and, on failure, roll the transaction back.
Is there any way I can preserve the data in a temp table up to the failing record?
I mean to say: if record no. 21 fails, I need to see the first 20 records in some temp table.
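One detail worth knowing: table variables are not affected by a transaction rollback. A hedged sketch of the idea (table and column names are illustrative) is to copy each processed row into a table variable inside the transaction and read it back in the CATCH block:
DECLARE @processed TABLE (id int, payload varchar(100))

BEGIN TRAN
BEGIN TRY
    -- illustrative work: rows handled so far are copied aside
    INSERT INTO @processed (id, payload)
    SELECT id, payload FROM dbo.SourceRows WHERE id <= 20

    RAISERROR('simulated failure at record 21', 16, 1)
    COMMIT TRAN
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0 ROLLBACK TRAN
    -- @processed still holds the 20 rows after the rollback
    SELECT * FROM @processed
END CATCH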
View 11 Replies
View Related
Aug 29, 2007
Hello,
Is it possible to use existing data flow components (Merge Join, aggregation,...) in a custom data flow component?
Thanks,
Yoann
View 15 Replies
View Related
Apr 4, 2004
Hi all
I want to put the fetch results of a cursor into a temporary table for manipulation. I'm selecting all columns from the table in the cursor, and the total number of columns is unknown.
Please guide me on how this could be done...
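A hedged sketch of the usual trick: SELECT ... INTO creates the temporary table on the fly with whatever columns the query returns, so the column list never needs to be known in advance (table names are illustrative):
SELECT *
INTO #work
FROM dbo.SourceTable

-- manipulate #work here, then
DROP TABLE #work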
Thanks in advance
Regards
Benny
View 1 Replies
View Related
Dec 10, 2007
*Groan* Sorry. One more.
The ERD above is what my database looks like (ignore that it's in Access; the database is in SQL 2005).
I have this code:
SELECT orderID, orderAmount = SUM(customerID)/customerID FROM orders
GROUP BY orderID
-- When Sum of the customer ID (and then) divided by the customer ID > 1
HAVING orderAmount > 1
which doesn't work, because I never did find out how to create a column in the results to output my maths in.
orderAmount doesn't exist as a column in the database, but for this query it should show only those customers who have ordered more than once with the company.
Thanks again for any replies.
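For comparison, a hedged sketch of a query that returns only the customers with more than one order (assuming the orders table holds one row per order):
SELECT customerID,
       COUNT(*) AS orderAmount
FROM orders
GROUP BY customerID
HAVING COUNT(*) > 1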
View 4 Replies
View Related
Oct 5, 2007
Firstly, I consider myself quite an experienced SQL Server user, and am now using SQL Server 2005 Express for the main backend of my software.
My problem is thus: the boss needs to run reports. I have designed these reports as SQL procedures, to be executed through an ASP application. Basic, and even medium-sized (10,000+ records), reporting runs at an acceptable speed, but for anything larger, IIS timeouts and query timeouts often cause problems.
I subsequently came up with the idea that I could reduce processing times by up to two-thirds by writing information from each calculation stage to a number of tables as the reporting procedure runs, i.e.:
stage 1 writes to table xxx1,
stage 2 reads table xxx1 and writes to table xxx2,
stage 3 reads table xxx2 and writes to table xxx3,
etc., and the procedure reads the final table and outputs the information.
This works wonderfully, EXCEPT that two people can't run the same report at the same time, because as one procedure creates and writes to table xxx2, the other procedure tries to drop the table, or reads a table that has already been dropped...
Does anyone have any suggestions about how to get around this problem? I have thought about generating the table names dynamically using 'sp_execute', but the statement I need to run is far too long (apparently there is a maximum length you can pass to it), and even breaking it down into sub-procedures is soooooooooooooooo time consuming and inefficient, having to format statements as strings (replacing quotes and so on).
How can I use multiple tables, or indeed process HUGE procedures, with dynamic table names, or temporary tables? All answers/suggestions/questions gratefully received. Thanks
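A hedged sketch of one common fix: local temporary tables (names starting with #) are private to the session that creates them, so two users running the same procedure each get independent copies and can never drop each other's work, and no dynamic table names are needed (table and column names below are illustrative):
CREATE PROCEDURE dbo.BigReportSketch
AS
BEGIN
    -- stage 1: this session's private copy of the staging data
    SELECT CustomerID, SUM(Amount) AS Total
    INTO #stage1
    FROM dbo.Sales
    GROUP BY CustomerID

    -- stage 2: reads this session's #stage1 only
    SELECT CustomerID, Total
    INTO #stage2
    FROM #stage1
    WHERE Total > 1000

    -- final output; both # tables vanish when the procedure's session ends
    SELECT * FROM #stage2
END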
View 2 Replies
View Related
Apr 27, 2006
I have created a package within SQL Server SSIS which includes an FTP Task, deployed it to our SQL Server (2005 SP1) msdb database and am running this job under SQL Agent on Windows Server 2003. Due to company security requirements this job has to be run under a service account within SQL Agent. The problem with this is that even though a directory is specified within the FTP Task to place any downloaded files into, the files are first written to the TIF (Temporary Internet Files) directory of "Default User" which is on the system drive. Based on corporate standards the system drive (C:) on our servers are only configured with enough space for the OS and other system files. All of the files being transferred are compressed, but some are still well over 1GB in size. The result is that many of our downloads are failing due to the system drive running out of space.
I have attempted to run IE by using "Run As" with the service account credentials, and have changed the location of the TIFs to a different drive, rebooted and verified the settings. When the SQL Agent job was run again, the files were still being written to the "Default User" directory on the system drive. I also created a new template account with the TIFs pointing to a non-system drive and used the User Profiles functionality of System Properties to copy the new template account to "Default User", but still the files are being written to the system drive.
My questions are:
1) Is there a way to stop the FTP Task from using Temporary Internet Files (i.e., just directly write the file to the location specified)?
2) Is there a best practice for setting up a service account and having it create a proper user profile that can be managed separately from "Default User"?
3) Short of specifying it during the OS install, is there a way to move the "Default User" profile directory to a different drive?
View 3 Replies
View Related
Feb 1, 2007
Why are some SSIS files generated by the Import/Export Data wizard put into the local user's temp folder? Why are these not compiled with the package when the solution is built?
Is there some setting I am missing?
This architecture is kind of silly, as the server always needs access to the temp folder on the local machine to run.
How can I get these temp files packaged with the rest of the package and deployed to the server, so the server can run independently of the machine I develop the package on?
Thanks,
Jeff
View 8 Replies
View Related
May 1, 2006
Hi !
I am creating a website with a form that users can fill in. The form collects information about the user's school. After filling in the form, the user clicks the submit button, which submits the form to be saved in a database.
I have created the database for that form. My question is: how can I save the result of this form to my database? Let's say I have 5 textboxes in that form: name, school's name, school's address, major, and comments.
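A hedged sketch of the database side (table, procedure, and column names are illustrative): the page passes the five textbox values as parameters to a stored procedure that does the INSERT.
CREATE TABLE dbo.SchoolInfo (
    SchoolInfoID  int IDENTITY(1,1) PRIMARY KEY,
    Name          varchar(100),
    SchoolName    varchar(100),
    SchoolAddress varchar(200),
    Major         varchar(100),
    Comments      varchar(1000)
)
GO
CREATE PROCEDURE dbo.SaveSchoolInfo
    @Name          varchar(100),
    @SchoolName    varchar(100),
    @SchoolAddress varchar(200),
    @Major         varchar(100),
    @Comments      varchar(1000)
AS
INSERT INTO dbo.SchoolInfo (Name, SchoolName, SchoolAddress, Major, Comments)
VALUES (@Name, @SchoolName, @SchoolAddress, @Major, @Comments)
Calling it with parameters (e.g., a SqlCommand with CommandType.StoredProcedure) also protects against SQL injection.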
thanks a lot in advance!
View 3 Replies
View Related
Aug 20, 1999
I want to write a small program (maybe just a stored procedure) that will save the results and column names of a simple query to a CSV file. This will then be copied onto disk and distributed to users, who'll use Excel or a similar package to open the CSV file.
Any suggestions as to the best way to achieve this? I was considering using a view and then BCP, but I then have problems when trying to open the CSV file in Excel (or another spreadsheet package): it doesn't like the datatypes and so on.
Is there a simple way to do this? And is it possible to save the results in such a way that Excel will choose the right datatypes for the columns (and not convert varchars like '000122' to numbers)?
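A hedged sketch using bcp's queryout mode (server, database, and query are illustrative; this can be run from a command prompt, or via xp_cmdshell as shown if it must live in a procedure):
EXEC master..xp_cmdshell
    'bcp "SELECT name, code FROM MyDB.dbo.MyView" queryout C:\export\results.csv -c -t, -S MYSERVER -T'
Note that bcp does not emit column headers; a common trick is to UNION a header row onto the query. The leading-zero problem is on Excel's side: opening the file through Excel's text import wizard with that column set to Text keeps values like '000122' as text.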
View 1 Replies
View Related
Apr 29, 2008
How do I save my query results into a new table? The original column, of course, comes before parsing, but the only data I want is in the three no-name columns: (No Column Name), (No Column Name), (No Column Name). I don't want the original column saved back, but I think its presence in the final query is blocking my INSERT INTO.
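A hedged sketch of the usual fix (names are illustrative): give each computed column an alias so the "no name" columns become legal targets, leave the original column out of the select list, and let SELECT ... INTO build the new table:
SELECT LEFT(RawValue, 3)         AS Part1,
       SUBSTRING(RawValue, 4, 3) AS Part2,
       RIGHT(RawValue, 4)        AS Part3
INTO dbo.ParsedResults
FROM dbo.OriginalTable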
View 2 Replies
View Related
Nov 8, 2006
I need help in writing a query.
The query should get the top 10 items and their values from the current-year table, along with the values for the same items from the previous-year table.
I was able to write the code for the first part, which gets values from the first table, but I don't know how to get the values from the second table.
The two tables do not have any primary/foreign key relations. Both tables have the same structure and the same columns.
I am attaching some images below to give more information: an image of the results from my query, and an image of what the final output should look like.
The Store Procedure code is:
ALTER Procedure [dbo].[free_customsHS4](
@TblName1 varchar(20),
@TblType varchar(20),
@District varchar(6),
@Month varchar(3)
)
AS
Begin
SET NOCOUNT ON;
Declare @SQuery nvarchar(3000)
set @TblName1 = '[' + @TblName1 + ']'
set @TblType = '[' + @TblType + ']'
SELECT @SQuery = 'select top 10 a.commodity1 as HS4, b.descrip_1 as Description,
sum(a.all_val_mo) as [Amount],
(sum(a.all_val_mo)/(select Sum(a.all_val_mo) FROM ' + @TblName1 + 'a
where a.stat_month <=' + @Month + ' and a.district=' + @District +'))*100 as [% Share]
FROM ' + @TblName1 + ' a left outer join ' + @TblType + ' b on a.commodity1=b.commodity1
where a.stat_month <=' + @Month + ' and a.district=' + @District +'
Group by a.commodity1, b.descrip_1
order by [Amount] desc'
EXEC sp_executesql @SQuery
END
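A hedged, static-SQL sketch of the prior-year lookup (the two table names are illustrative stand-ins for the current-year and previous-year tables): take the top 10 from the current year, then LEFT JOIN a per-commodity aggregate of the previous year so items with no prior-year data still appear:
SELECT TOP 10
       cur.commodity1                 AS HS4,
       SUM(cur.all_val_mo)            AS [Amount],
       ISNULL(prev.AmountLastYear, 0) AS [Prior Year Amount]
FROM [imports2014] cur
LEFT OUTER JOIN (
    SELECT commodity1, SUM(all_val_mo) AS AmountLastYear
    FROM [imports2013]
    GROUP BY commodity1
) prev ON prev.commodity1 = cur.commodity1
GROUP BY cur.commodity1, prev.AmountLastYear
ORDER BY [Amount] DESC
The same join can be spliced into the dynamic string in the procedure by passing the second table name as another parameter.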
View 2 Replies
View Related
Apr 19, 2013
I found this topic from this link: Save MySQL query results into a text or CSV file | a Tech-Recipes Tutorial
I am trying to create the text file from query results, but it didn't work and I got this error: "Incorrect syntax near the keyword 'INTO'."
SELECT sale, del
FROM order
INTO OUTFILE 'C:/tmp/orders.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
View 10 Replies
View Related
Feb 7, 2007
Can anyone show me how to run a prediction query and save the results to a SQL table without using the T-SQL OPENQUERY tip here: http://www.sqlserverdatamining.com/DMCommunity/TipsNTricks/3914.aspx? I am looking for an example in VB.NET that I can use in an SSIS Script Task.
Thanks
View 5 Replies
View Related
Sep 14, 2007
In SQL Management Studio, I can save the query results to a file. How would I do this in an SSIS package?
View 2 Replies
View Related