Specifying Null Character For Loading Tables Using Bulk Loader

Jun 26, 2007

Hi All,

My DBA folks are coming back with an answer to a question that sounds strange, and I thought I would check on here.



Situation.

We are using an ETL tool I developed to move data around and build a DW on SQL Server 2005. In some cases we are using the bulk loader to load and in some cases the tool itself.

One of the defaults of the tool is to truncate trailing blanks, and so fields that contain only blanks get truncated to a zero-length character string.

When reloading the data with the tool, it carries a null indicator next to the field value, so the zero-length character string is loaded as not null.

When loading with the bulk loader, the DBAs are telling me that the field is translated to a null. Note that they want some fields translated to null, so they are using the KEEPNULLS parameter.

On other databases (and built into the tool because this is so common), the bulk loaders usually allow the specification of the load statement at column level, including a 'null character string' that is translated to null if this string is found. I put an example below.

I seem to recall that SQL Server 7 had some sort of bulk loader that allowed specification of columns at column level, for example offsets or whatever, and the specification of fields to be interpreted as nulls. (Though that was a long time ago.)

I have searched through the manual and I don't see an option there any more to specify a character that will be interpreted as a null by the bulk loader.

Is it possible in 2005 to specify a character such that when the bulk loader sees it, the field will be set to null? (And not just set fields to null that are not present in the load file.)

(Just by the way, we are going to make the truncate-trailing-blanks behaviour optional and that's easy; it's just that I thought this kind of null-if option was available in 2005 and I am keen to know if it is not there, either gone or never was.)



Thanks in Advance and Best Regards



Peter





Example of how Oracle does it, from this page:

http://www.csee.umbc.edu/help/oracle8/server.815/a67792/ch05.htm#5754
NULLIF Keyword
Use the NULLIF keyword after the datatype and optional delimiter specification, followed by a condition. The condition has the same format as that specified for a WHEN clause. The column's value is set to null if the condition is true. Otherwise, the value remains unchanged.

NULLIF field_condition


The NULLIF clause may refer to the column that contains it, as in the following example: COLUMN1 POSITION(11:17) CHAR NULLIF (COLUMN1 = "unknown")
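For reference, the nearest equivalent I can see on SQL Server 2005 is a two-step workaround: land the file in a staging table with BULK INSERT, then translate the 'null character string' with NULLIF on the way into the real table. A rough sketch (table, file path and column names are all made up):

-- stage the raw file; KEEPNULLS loads empty fields as NULL instead of column defaults
BULK INSERT dbo.Stage_Customer
FROM 'C:\loads\customer.txt'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', KEEPNULLS);

-- translate the chosen null string (here an empty string) to a real NULL
INSERT INTO dbo.Customer (CustomerName, Region)
SELECT NULLIF(CustomerName, ''), NULLIF(Region, '')
FROM dbo.Stage_Customer;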

View 2 Replies



Disabling Constraint For Bulk Loading

Dec 7, 2005

Greetings, I want to bulk load data into user-defined SQL Server tables. For this I want to disable all the constraints on all the user-defined tables. I got a solution in one of the threads and did the following:

declare @tablename varchar(30)
declare c1 cursor for select name from sysobjects where type = 'U'
open c1
fetch next from c1 into @tablename
while ( @@fetch_status <> -1 )
begin
exec ( 'alter table ' + @tablename + ' check constraint all ')
fetch next from c1 into @tablename
end
deallocate c1
go

Now when I try to truncate one of the tables (say titles) it gives me the following error:

Cannot truncate table 'titles' because it is being referenced by a FOREIGN KEY constraint.

Can anyone show me the right path? I am working on ASE 12.5. TIA

View 4 Replies View Related

Loading Bulk Data Frorm Textfile

Sep 17, 2004

I have bulk data that I don't want to have to enter manually. How can I achieve this in SQL Server? I want to be able to load from a text file (or any text format) with my data separated by delimiters. I know Oracle has something that does this called SQL*Loader.
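For reference, the closest SQL Server equivalents are the bcp command-line utility and the BULK INSERT statement. A minimal sketch, assuming a comma-delimited file and an existing table (names made up):

-- load a comma-delimited text file into an existing table
BULK INSERT dbo.MyTable
FROM 'C:\data\mydata.csv'
WITH (
FIELDTERMINATOR = ',',  -- column delimiter in the file
ROWTERMINATOR = '\n',   -- end of each record
FIRSTROW = 2            -- skip a header row if the file has one
);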

View 6 Replies View Related

Loading Datetime Data Using Bcp/bulk Insert

Dec 11, 1998

Short version:
What is the best/fastest way to load large amounts of data from a comma-delimited text file into a SQL Server table, where the text file contains date fields in ccyy/mm/dd format and the SQL Server table defines those fields as datetime data types?

Details:
When I attempt to load files (using either bcp or BULK INSERT) containing datetime data the load process errors because the datetime fields in my text file are in ccyy/mm/dd format and the default format for SQL Server is mm/dd/yy. I have been unable to change the default format by using the SET DATEFORMAT statement (apparently the SET DATEFORMAT statement will not work for bcp because bcp runs outside of the SQL Server session???).
The only alternatives that I have come up with are: 1) Change the format of date fields in the text file from ccyy/mm/dd to mm/dd/ccyy. 2) Create a temporary table that defines the date fields as a char(n) datatype. Then load the data into the temp table. Then SET the DATEFORMAT to ccyy/mm/dd. Then copy the temp table into the permanent table (the permanent table using datetime data types).
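For what it's worth, a rough sketch of the second alternative (staging table and column names are made up); CONVERT style 111 matches the ccyy/mm/dd layout:

-- 1) land the dates as character data
CREATE TABLE dbo.Stage_Load (SomeKey int, LoadDate char(10));

BULK INSERT dbo.Stage_Load
FROM 'C:\loads\monthly.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- 2) convert ccyy/mm/dd text to datetime on the way to the permanent table
INSERT INTO dbo.Permanent_Load (SomeKey, LoadDate)
SELECT SomeKey, CONVERT(datetime, LoadDate, 111)
FROM dbo.Stage_Load;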

Both of these alternatives would require additional processing time. Since this is a process that loads large amounts of data on a monthly (soon to be weekly) basis, speed is of the essence.

I would appreciate any suggestions.

Thanks!

View 6 Replies View Related

SQL 2012 :: Bulk Loading In Availability Groups

Aug 14, 2014

I have several 2012 availability groups running on a cluster. I have one database that is bulk loaded every 30 minutes. The DB is about 1 GB in size. To be on the availability group it has to be set to full recovery mode, but simple or even bulk would obviously be better. Is there a better way to handle the transaction log size other than to run a backup after each bulk load causing extra overhead? With mirrors you could use simple, but since those are going away . . .
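For reference, the only pattern I know of here is to leave the database in full recovery and schedule transaction log backups frequently enough that the log space gets reused, rather than tying the backup to each load; a sketch (database name and path made up):

-- run this on a schedule (e.g. every 15-30 minutes via SQL Agent)
BACKUP LOG MyBulkLoadedDb
TO DISK = N'G:\LogBackups\MyBulkLoadedDb_log.trn';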

View 2 Replies View Related

Error Bulk Loading CSV File With " Quotes

Jun 16, 2008

Hi,

I have a CSV file with 1.8 million records. A few of the text columns in each row have commas (,) in them, and hence those columns are enclosed by " ".

An example record would look like:
123,abc,"abc, city, state",222,...

Now, the 3rd column should be read as: abc, city, state
But, it is reading ("abc) into the 3rd column, (city) into the 4th column and (state") into the 5th column, resulting in data errors.

Is there a way to specify that fields are optionally enclosed by " as we do in Oracle?
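As far as I can tell, BULK INSERT in 2005 has no direct 'optionally enclosed by' setting. If the quoted column is always quoted, one workaround is a format file whose field terminators include the quote characters; a rough sketch for a 4-column row like the example above (column names and lengths are made up, and it only works when that column is quoted on every row):

9.0
4
1   SQLCHAR   0   12    ","      1   id     ""
2   SQLCHAR   0   50    ",\""    2   col2   ""
3   SQLCHAR   0   200   "\","    3   col3   ""
4   SQLCHAR   0   12    "\r\n"   4   col4   ""

BULK INSERT dbo.MyTable FROM 'C:\data\file.csv' WITH (FORMATFILE = 'C:\data\file.fmt');

Otherwise the usual fallbacks are cleaning the file first, or using an SSIS flat file source with " set as the text qualifier.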

Thanks,
Anil

View 2 Replies View Related

Loading Null Value From Excel

Mar 24, 2008

All, I am trying to load Excel data into SQL Server through SSIS.
I have only one field in Excel, which has 6 rows as below.

The first row (<1) is loading into SQL Server as a null value. The SQL Server destination table field is VARCHAR(10). Please help me with resolving this issue.

<1
1
2
3
4
5
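For reference, this looks like the usual Excel driver behaviour: it samples the first rows, guesses the column is numeric, and any value it cannot convert (such as <1) arrives as NULL. The common workaround is to force the column to be read as text by adding IMEX=1 to the Excel connection string's extended properties, along these lines (path made up):

Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\input.xls;Extended Properties="Excel 8.0;HDR=YES;IMEX=1";

Failing that, saving the sheet as CSV and using a flat file source avoids the type guessing altogether.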

View 1 Replies View Related

DRP Via Dumper -> Loader

Jun 29, 2006

We want to implement a cold DRP strategy here. Is there any info on:

1. How to transfer syslogins to the DR server (is BCP in/out sufficient)?
2. SQL to extract DTS/Agent jobs from msdb.
3. I am aware of how to fix broken logins.

thanks all,
Paul

View 5 Replies View Related

Strange Character In Col Instead Of NULL

Jun 16, 2007

I have a database column like this: ColName (CHAR(1), null). I want to set the column to Null if no value is passed to the database. When I set my parameter to " " or Nothing, the column is not Null as I would expect. Instead there is a square character, the type of square you get when Word or Excel cannot resolve a character. This 'square' then doesn't appear as Null. Anyone know what is causing this?
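One way I've seen this handled is to fold blank or empty parameter values to NULL in the INSERT itself rather than relying on what the client sends; a small sketch (table and parameter names made up):

DECLARE @ColValue char(1);
SET @ColValue = ' ';  -- what the application is effectively sending

-- a blank value trims down to '' and NULLIF turns that into a real NULL
INSERT INTO dbo.MyTable (ColName)
VALUES (NULLIF(LTRIM(RTRIM(@ColValue)), ''));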

View 4 Replies View Related

Convert Character To NULL

Jun 7, 2007



I am loading a flat file to a table but I also need to scrub the data a bit before the data hits the table. The main update required is converting a dot (.) character to a null value; the source file is using this character to indicate a blank. I know I can use the Derived Column transformation, but I have quite a few columns, which will take a while to configure manually. Is there another transformation option that anyone can point me to?
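If configuring a Derived Column per field is too tedious, one alternative (assuming the volumes allow it) is to land the file in a staging table and scrub the dots in one set-based pass; a sketch with made-up names:

-- after the flat file has been loaded into dbo.Stage_Import
UPDATE dbo.Stage_Import
SET Col1 = NULLIF(Col1, '.'),
    Col2 = NULLIF(Col2, '.'),
    Col3 = NULLIF(Col3, '.');

-- or do it on the way into the final table:
-- INSERT INTO dbo.Final SELECT NULLIF(Col1, '.'), NULLIF(Col2, '.'), NULLIF(Col3, '.') FROM dbo.Stage_Import;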



Thanks

View 10 Replies View Related

Can't Access / Bootstrap Loader

Jan 26, 2013

I'm running Windows 7 64-bit (yes I'm sure) and when trying to install any version of SQL Server Express 2012 I get the following Error messages:

1) The following error has occurred: SQL Server Setup has encountered an error when running a Windows Installer file. Windows Installer error message: Error opening installation log file. Verify that the specified log file location exists and that you can write to it. Windows Installer file: C:\Users\Student\AppData\Local\Temp\SQL Server 2012\Setup\1033_ENU_LP\x64\setup\sqlsupport_msi\SqlSupport.msi. Windows Installer log file: C:\Program Files\Microsoft SQL Server\110\Setup Bootstrap\Log\20130125_224835\SqlSupport_Cpu64_1_ComponentUpdate.log. Click 'Retry' to retry the failed action, or click 'Cancel' to cancel this action and continue setup. URL... LinkID=20476&ProdName=Microsoft%20SQL%20Server&EvtSrc=setup.rll&EvtID=50000&ProdVer=11.0.2100.60&EvtType=0xDC80C325

2) SQL Server Setup failure.

SQL Server Setup has encountered the following error: Could not find a part of the path 'C:\Program Files\Microsoft SQL Server\110\Setup Bootstrap\Log\20130125_131045'.

I have all .NET framework 3.5 and 4.0 installed, Visual Studio 2012, Visio 2012... everything else is running fine, but I can't get this to work!

View 7 Replies View Related

Null - Bulk Insert Task

Jun 5, 2007

Dear All,

For ex:
In SQL Server 2005, when we create a table with 3 columns that all have the allow-nulls option disabled, and we then try to insert a record with a blank in any one of the columns, SQL Server throws an error because the column is NOT NULL, and the record is not saved.
But when we import a CSV data file that has 3 columns and contains null data, the BULK INSERT task inserts the data easily, without throwing an error.

My question is: how can we restrict null data from being inserted into the table when using the BULK INSERT task in SQL Server 2005?
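One approach (not necessarily the only one) is to bulk insert into a staging table that allows NULLs, and only move complete rows into the real table, leaving the rejects behind for reporting; a rough sketch with made-up names:

-- staging table mirrors the file and allows NULLs
BULK INSERT dbo.Stage_Rows
FROM 'C:\loads\file.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', KEEPNULLS);

-- move only complete rows; whatever is left in staging is the reject list
INSERT INTO dbo.Real_Table (Col1, Col2, Col3)
SELECT Col1, Col2, Col3
FROM dbo.Stage_Rows
WHERE Col1 IS NOT NULL AND Col2 IS NOT NULL AND Col3 IS NOT NULL;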

Please suggest me to solve this issue.

Thanks in advance
Karna

View 5 Replies View Related

Loading Of Tables

Mar 19, 2001

We are trying to load flat text files with upwards of 7 million records into a table on SQL. The table has a clustered index on 3 fields. We are sometimes able to complete smaller tables (500,000-750,000 records) and build the indexes prior to importing the data, however when we try the larger tables an error occurs :

Error at Destination for row number 6785496. Errors encountered so far in this task: 1

Location: somerge.c:1573
Expression: mrP->mrStatus!=MERGERUN::NONE
SPID: 11
Process ID: 173

None of the records end up importing. The row number it gives is always the total number of records that was in the text file I was trying to import. I tried to import the text files first and then build the clustered indexes, but a table with only 300,000 records ran for nearly 4 days without completing before we killed it.
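Not an explanation of the merge error itself, but for reference, the options that usually make very large loads behave better are a table lock, a batch size, and (only if the file really is sorted on the clustered key) an ORDER hint; a sketch with made-up names:

BULK INSERT dbo.BigTable
FROM 'D:\loads\bigfile.txt'
WITH (
TABLOCK,                             -- table-level lock; helps minimal logging in many cases
BATCHSIZE = 100000,                  -- commit in chunks instead of one huge transaction
ORDER (KeyCol1, KeyCol2, KeyCol3)    -- only valid if the file is pre-sorted on the clustered key
);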

Please help me on where to look......:(

Thanks,
Cheri

View 3 Replies View Related

Loading Data From XML To Tables

Dec 1, 2007

Hi,

I have got an XML file that is more than 2 GB in size. I have to load this file into tables. On a 32-bit platform, I am unable to load this file using SSIS. RAM is 8 GB, but it is still bombing out. As I understand it, SSIS uses an XML DOM parser and tries to shred the file in memory, and because of the memory limitation, it fails. I have already written code in C# using the XmlTextReader object (an implementation of a SAX-style parser) to load the data into tables, but I want to keep this loading process within the remit of the DBAs.

I am stuck. Can someone guide me through the situation?

Thanks for your help!

Navnish

View 8 Replies View Related

Loading Data Into Columns 75 - N In Tables?

Jun 23, 2004

I'm having a very irritating time trying to migrate data from a COBOL system to SQL Server.

One of the A/R Master files has approx. 200 columns.

I can export this file any number of ways that will (sometimes) load partially into my database, but always when the load succeeds, columns 75 through N simply contain NULL, even though there is data in the file. When the load fails in DTS, the error is always missing column delimiter. Using BULK INSERT the error is always something like data too long at column 75.

Putting all this together, I have deduced that something isn't working if I try
to load a staging table with more than 74 columns of data. This seems like a way-too-low threshold for a robust database, especially since SQL Server is supposed to be able to handle up to 1,024 columns per table.

Has anyone ever encountered this problem?

Thanks in advance for any help

View 2 Replies View Related

Loading Tables With Foreign Key Constraints

Aug 31, 2006

Hi,

I am having trouble loading tables (within the same data flow) that have a foreign key relationship defined between them. For instance:

Table A is a parent (one side of the relationship) to Table B (many side of the relationship).

I am trying to load Table A first within the data flow and then Table B after, but I get the following error:

[OCMD EntityRole Insert [2666]] Error: An OLE DB error has occurred. Error code: 0x80040E2F. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E2F Description: "The statement has been terminated.". An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E2F Description: "The INSERT statement conflicted with the FOREIGN KEY constraint "FK_EntityRole_Entity". The conflict occurred in database "ODS", table "dbo.Entity", column 'EntityGuid'.".

I am currently using OLE DB Commands to perform the inserts. I load Table A and then move on to load Table B; I can see the records in Table A before trying to load Table B, but for some reason the Table B load still fails.

I was thinking maybe this has something to do with the transaction setting or isolation level, but I have played with this to no avail (currently everything is the default - supported/serializable). Also, I am thinking that maybe because the OLE DB Commands are creating two separate connections (they are using the same connection manager), the second one is unable to see the transactions from the other (first) connection (Table A)?

Is there a way around this without dropping (disabling) foreign keys before the load and adding them back in afterwards? I would like to avoid that.
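In case the disable/re-enable route does end up being the fallback, it is just two Execute SQL tasks around the load; a sketch using the constraint and table names from the error message above (so treat them as assumptions):

-- before the load
ALTER TABLE dbo.EntityRole NOCHECK CONSTRAINT FK_EntityRole_Entity;

-- ... run the data flow ...

-- after the load; WITH CHECK re-validates existing rows so the constraint stays trusted
ALTER TABLE dbo.EntityRole WITH CHECK CHECK CONSTRAINT FK_EntityRole_Entity;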

I would also like to avoid reading the data source multiple times. Everything I need is in the one source so I would like to populate multiple tables from the one source data stream instead of reading the same data 2,3 or 4 times etc.

Seems to me there must be a simple explanation/solution for this but I'm stuck at this point?

P.S.

I was initially using OLE DB destinations (because they are much faster) and was having the same issue, which made sense because the OLE DB destinations do not let you pass the data stream on, so I had to multicast to the destinations so they were loading at the same time. I would rather use the OLE DB destinations, so if you have any ideas around how I could do this using those components that would be appreciated too!

Thanks!

View 3 Replies View Related

Help Needed On Loading Different Files To Different Tables

Sep 16, 2007

What I am trying to do here is load different flat files into different tables:
For example, if the file name starts with "enrollment", then it goes into the "enrollment" table;
if the file name starts with "student", then it goes into the "student" table.

For now, I created a foreach loop container for each of the different files, so it ended up using several foreach loop containers. I am wondering if there is a way to use just one foreach loop container to process the loading.

Can anyone shed some light on this? Thank you very much for your help!

View 1 Replies View Related

Loading Data In SQL Server To Related Tables (how)

Jul 14, 2004

Hi, I have a question about how to load data into tables linked by foreign keys in MSDE/SQL Server. Example:
If I have 2 tables linked (by Foreign Key):
One table:

ITEM ID   ITEM NAME   ITEM CATEGORY (FK)
1         cheese      2

And another:

CATEGORY ID   CATEGORY NAME
1             household
2             food
3             general

How do I enter the data to be loaded?
Do I have to enter it as
1   cheese   2
or is there some way of entering it as
1   cheese   food

The DTS wizard does not allow me to transfer to views/queries, which I thought would be the normal way, as I would enter data into a view (the equivalent of an Access form) and it would update the related tables. When I wrote SQL to do it, it said I cannot update my view because too many tables would be affected (though I had empty lookup tables at the time).
I am doing it by number, using the DTS wizard to transfer it directly to the main table, but there must be a better way.
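For reference, the usual pattern is to load a staging table that still holds the category name, then resolve the foreign key with a join when inserting into the item table; a minimal sketch (the staging table is made up):

-- staging holds the human-readable values, e.g. (1, 'cheese', 'food')
INSERT INTO dbo.Item (ItemId, ItemName, ItemCategory)
SELECT s.ItemId, s.ItemName, c.CategoryId
FROM dbo.Staging_Item AS s
JOIN dbo.Category AS c
  ON c.CategoryName = s.CategoryName;   -- look up the FK value by name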

View 1 Replies View Related

Loading Tables With An Identity Field Using DTS Packages

Aug 2, 2001

Is there a way (and if so, what is the syntax?) to set up a DTS package
that loads a table that has an identity column? I am trying to load the data from another table, leaving the identity field unmapped and de-selecting "enable identity insert" on the advanced tab of the Data Transformation Properties window. I keep getting errors due to the table not allowing null values. I tried using the SET IDENTITY_INSERT command, but this still did not work.
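In case it helps, the T-SQL equivalent of that checkbox is SET IDENTITY_INSERT, and preserving source values requires an explicit column list; a sketch with made-up names:

-- to let the destination generate new identity values, simply leave the identity column out:
INSERT INTO dbo.TargetTable (Col1, Col2)
SELECT Col1, Col2 FROM dbo.SourceTable;

-- to preserve the source values instead:
SET IDENTITY_INSERT dbo.TargetTable ON;
INSERT INTO dbo.TargetTable (Id, Col1, Col2)   -- the identity column must be listed explicitly
SELECT Id, Col1, Col2 FROM dbo.SourceTable;
SET IDENTITY_INSERT dbo.TargetTable OFF;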
Any help would be appreciated.
TB

View 1 Replies View Related

SQL 2012 :: Loading XML Files Into Tables By Using SSIS?

Apr 6, 2015

Currently we are trying to load XML files into SQL Server tables by using SSIS 2012. We are getting the XML files as a column in the source table, so we have to push these XML files into destination tables.

I'm following the below way to perform this activity

[URL]

But we have a standard XSD structure for all the XML files, and an XML file should be loaded only if it matches the XSD structure; otherwise it should be skipped and the next XML file processed.

View 1 Replies View Related

LOADING TEMP TABLES AND MULTIPLE SEARCHES

May 6, 2008

Hi Guys,

I have to work with a poorly designed table :(, that has columns

ID INT,
thisID varchar(50) null,
parentID varchar(50) null,
Title varchar(255) null,
Description varchar(8000) null,
ProductType varchar(255) null,

The reason it is poorly designed is the table is used to hold questions and answers, all with a 1:1 relationship. Instead of having ID, ProductType, Question, Answer they have unfortunately adopted the approach of the above i.e:

id 1
thisID 3
parentid NULL
DESCRIPTION: this is a question

id 20
thisID 3_1
parentID 3
DESCRIPTION: this is the answer to the question above

So I am writing a sproc that does this using a temp table. I got this far:

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Author:Spencer
-- =============================================
ALTER PROCEDURE [dbo].[GetFAQs]
-- Add the parameters for the stored procedure here
@ProductType varchar(255)
AS
BEGIN
SET NOCOUNT ON;
-- Insert statements for procedure here
CREATE TABLE TEMP
(
ID INT,
thisID varchar(50) null,
parentID varchar(50) null,
Title varchar(255) null,
DescriptionQ varchar(8000) null,
DescriptionA varchar(8000) null,
ProductType varchar(255) null
)

SELECT
ID,
thisID,
parentID,
Title,
DescriptionQ,
DescriptionA,
ProductType
FROM
A2Z
WHERE
ProductType = @ProductType AND parentID IS NULL
END
GO

This gets all my questions for that product type.

What I need to do is load the questions into my temp table and then run through the a2z table again gaining the answers to the questions (the parentid holds the question ID). The answers then will also get loaded into the temp table.
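For reference, a rough sketch of how the TEMP table could be populated: questions first, then the answers via a self-join on parentID = thisID (this assumes the single Description column in A2Z feeds DescriptionQ for question rows and DescriptionA for answer rows):

-- questions for the requested product type
INSERT INTO TEMP (ID, thisID, parentID, Title, DescriptionQ, ProductType)
SELECT ID, thisID, parentID, Title, Description, ProductType
FROM A2Z
WHERE ProductType = @ProductType AND parentID IS NULL;

-- answers: an answer row's parentID points at its question row's thisID
UPDATE t
SET t.DescriptionA = a.Description
FROM TEMP AS t
JOIN A2Z AS a
  ON a.parentID = t.thisID;

SELECT * FROM TEMP;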

Any bright sparks out there that can help me?

Cheers

View 9 Replies View Related

Loading Multiple Excel Sheets In To Different Tables In A DB

Nov 21, 2007



Hi All,

I have a problem loading data from multiple Excel sheets into their corresponding tables in a DB.
All the Excel sheets are in a folder, and from that folder I have to access all the Excel sheets.
For this I am using a Script Task and one Data Flow Task.
But the error is coming in the Script Task; I am not able to put the path in the script.

Is this the correct way to do it, or is there another way?

Can you please tell me the solution for this? Thanks in advance to whoever responds.

Maruthi..

View 9 Replies View Related

Error Loading Tables Of Different File Groups.

Apr 6, 2006

Hi All,

I am facing a peculiar problem. The problem definition goes like this:

I have one staging DB in which all the tables reside in the primary file, and one production DB in which the tables reside in 2 secondary files.

Now when I am trying to load the data from table A in staging (which is on the primary file) to table A1 in the production DB (which is in a secondary file), all the data is going to the error log instead of table A1.

Can you please tell me where I am going wrong?

Regards,

Chetan

View 2 Replies View Related

Loading Data Into Dimensions & Fact Tables

Jul 10, 2007

Hello Everybody,



Can anybody please let me know the procedure for loading data into Dimensions & Facts? And what is the sequence of loading?



Thanks in Advance

Rajesh

View 1 Replies View Related

Loading Records In One Table To Multiple Tables

Aug 18, 2007

Data_Staging:
Unique_id
Gender
Ethnicity
Race
MCP_key
Admission_Dt
Discharge_Date
Enrollment_key
Reason
Disability
Income
Employment


I need to load the data from this table into three different tables that all have foreign key relationships:

Registration Table:
Registration_key (Identity) - PK
Unique_id
Gender
Ethnicity
Race

Episode:
Episode_Key (Identity) - PK
Registration_key (FK)
MCP_key
Admission_Dt
Discharge_Date

Assessment Table:
Assessment_Key (Identity) - PK
Registration_Key (FK)
Episode_Key(FK)
Enrollment_key
Reason
Disability
Income
Employment
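For reference, one way to approach it (sketch only; it assumes Unique_id identifies a registration and that MCP_key plus Admission_Dt identify an episode within it) is to load parent to child, joining back on the natural keys to pick up the generated identity values:

-- 1) parents
INSERT INTO Registration (Unique_id, Gender, Ethnicity, Race)
SELECT DISTINCT Unique_id, Gender, Ethnicity, Race
FROM Data_Staging;

-- 2) episodes, resolving Registration_key via Unique_id
INSERT INTO Episode (Registration_key, MCP_key, Admission_Dt, Discharge_Date)
SELECT r.Registration_key, s.MCP_key, s.Admission_Dt, s.Discharge_Date
FROM Data_Staging AS s
JOIN Registration AS r ON r.Unique_id = s.Unique_id;

-- 3) assessments, resolving both foreign keys
INSERT INTO Assessment (Registration_Key, Episode_Key, Enrollment_key, Reason, Disability, Income, Employment)
SELECT r.Registration_key, e.Episode_Key, s.Enrollment_key, s.Reason, s.Disability, s.Income, s.Employment
FROM Data_Staging AS s
JOIN Registration AS r ON r.Unique_id = s.Unique_id
JOIN Episode AS e ON e.Registration_key = r.Registration_key
                 AND e.MCP_key = s.MCP_key
                 AND e.Admission_Dt = s.Admission_Dt;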

View 1 Replies View Related

Query Designer Very Slow Loading Tables

Aug 17, 2007

When I use the query designer and add tables, it takes 15-30 minutes to load a table. So if I need 6 tables in my query, that's like 6 times a 30-minute load. I'm using Oracle as my data source.

When I switch the source to OLE DB it adds tables within seconds; the only thing is I need to use named parameters.

any ideas?

View 14 Replies View Related

Loading Images Into SQL Server Express 2005 Tables

Jul 19, 2006

I'm new to SQL Server Express 2005 and I'm having issues loading images into my tables. Can anyone show me how to achieve this?
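For reference, if the goal is to get image files from disk into a varbinary(max) column, one way on 2005 is OPENROWSET with the BULK ... SINGLE_BLOB option; a small sketch with made-up names:

-- table with a varbinary(max) column to hold the image bytes
CREATE TABLE dbo.Pictures (
PictureId int IDENTITY PRIMARY KEY,
FileName  nvarchar(260),
ImageData varbinary(max)
);

-- pull one file off disk into the table
INSERT INTO dbo.Pictures (FileName, ImageData)
SELECT N'logo.jpg', BulkColumn
FROM OPENROWSET(BULK N'C:\images\logo.jpg', SINGLE_BLOB) AS img;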

Thanks

Matt

View 4 Replies View Related

Guidance Needed: Loading Hierarchical XML Into Relational Tables

Aug 8, 2007

I've got a lot of XML like this (simplified):




Code Snippet

(XML sample lost in posting: nested ELEMENT1 / ELEMENT2 / ELEMENT3 elements, with an inner block repeated "... 8 MORE" times and an outer block repeated "... 9 MORE TIMES".)
I need to get this into three existing SQL Server 2005 tables, each with identity columns for their primary keys:




Code Snippet
CREATE TABLE ELEMENT1 (

[ID] INT IDENTITY
)

CREATE TABLE ELEMENT2 (

[ID] INT IDENTITY,
[ELEMENT1_ID] INT
)

CREATE TABLE ELEMENT3 (

[ID] INT IDENTITY,
[ELEMENT2_ID] INT
)





With primary and foreign keys as you'd expect, and, of course, many more columns!

How would I get this into tables through SSIS, preferably in a high-performance manner (there may be several gigabytes of XML to load).

The issue, of course, is that in order to insert an ELEMENT2 row, I need the ID from the corresponding ELEMENT1, etc.
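For reference, one pattern (outside a single SSIS data flow) is to shred the XML into staging tables that carry some natural/business key, then resolve the identity values with joins when inserting the children; a sketch where the staging tables and BusinessKey columns are purely hypothetical:

-- parents first
INSERT INTO ELEMENT1 (BusinessKey /*, other columns */)
SELECT BusinessKey /*, ... */
FROM Stage_Element1;

-- children look up their parent's identity by the natural key
INSERT INTO ELEMENT2 (ELEMENT1_ID, BusinessKey /*, other columns */)
SELECT e1.ID, s.BusinessKey /*, ... */
FROM Stage_Element2 AS s
JOIN ELEMENT1 AS e1 ON e1.BusinessKey = s.Parent1BusinessKey;

-- and the same pattern again for ELEMENT3 against ELEMENT2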

Any ideas or pointers to articles would be welcome.

View 14 Replies View Related

SSIS - Data Loading Job -- Update Col B With Col A If Col B Is NULL In The Data File?

May 10, 2007

How do you achieve this during the SSIS data load execution itself?

Update Col B with Col A value if Col B is NULL in the Data File?
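For reference, the usual in-flow answer seems to be a Derived Column transformation that replaces Col B with an expression along these lines (column names as in the question):

ISNULL(ColB) ? ColA : ColB

with "Retain null values from the source as null values in the data flow" checked on the Flat File Source, since otherwise empty file columns arrive as empty strings rather than NULLs.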

View 1 Replies View Related

Problem Loading Data From Foxpro Tables Under Multiple Folders

Oct 17, 2007



Hi,

I am trying to load data from multiple FoxPro tables which are under a folder. I can have multiple folders, each with 17 FoxPro tables. I was able to do it in DTS using an ActiveX script. Here is the ActiveX script.

'**********************************************************************
' Visual Basic ActiveX Script
'************************************************************************
Option Explicit
Dim conObj,DSNGosfbill,comObj,objRs,HostServer
Dim sFolder,sFileFolder, Details,subFolderoccur,sFileFolderDBF,sFileFolderFPT,CheckFile,dFiles,Fil
Dim fso, folderObj,subFolderList,dFolderObj
Dim objPackage,oStep,objPackage_1,oStep_1,ConnObj_001,ConnObj_004,ConnObj_031,ConnObj_032,ConnObj_033
Dim ConnObj_Hclaimb, ConnObj_HProv, ConnObj_Hids, ConnObj_HCodes, ConnObj_HSpan, ConnObj_002, ConnObj_HCHGB
Set conObj = CreateObject("ADODB.Connection")
HostServer =DTSGlobalVariables("gvServer").Value

'DSNGosfbill ="DRIVER={SQL Server};SERVER=" & HostServer & ";DATABASE=GOSFBILL"
DSNGosfbill ="DRIVER={SQL Server};SERVER=" & HostServer & ";UID=syntelhum;PWD=syntel123;DATABASE=GOSFBILL"
conObj.open DSNGosfbill

sFolder =DTSGlobalVariables("gvSSIUnzipPath").Value
CheckFile =DTSGlobalVariables("gvSSIBatchPath").Value

set comObj=CreateObject ("adodb.command")
set comObj.ActiveConnection =conObj
Function Main()
Dim Dir_Name,DirFlag
Dir_Name = ""
DirFlag = "N"
Set fso = CreateObject("Scripting.FileSystemObject")

If fso.FileExists(checkFile) Then
Else
Details = "***** Success.Lst file is missing in Batch folder. BATCH job may not be successfull or there are no folders in UNZIP directory to process. Check the batch run.*****"
Call Write_Log
Main = DTSTaskExecResult_Failure
Exit Function
End If
Set folderObj = fso.GetFolder(sFolder)
Set subFolderList = folderObj.SubFolders
For Each subFolderOccur in subFolderList
DirFlag = "Y"
Dir_Name = subFolderOccur.Name
Call Process_Dir(1,subFolderOccur.Name)
Next
If DirFlag = "N" Then
Details = "***** No directories to process in SSI UNZIP folder*****"
Call Write_Log
End If
If DirFlag = "Y" Then
Call Process_Dir(2,Dir_Name)
If objRs.Eof Then
Details = "***** No directories to process in SSI UNZIP folder*****"
Call Write_Log
End If
While not objRs.EOF
set sFileFolder = fso.GetFolder(sFolder & objRs("zip_file_name"))
Details = "***** Start-Time " & sFileFolder & " " & Date & " " & Time & "*****"
Call Write_Log
Call Update_Process_Flag("L",objRs("zip_file_name"))
'*******Execute the package for each directory****************'
'********* Call the Package**************'
Set objPackage = CreateObject("DTS.Package")
Set objPackage_1 = CreateObject("DTS.Package")



'objPackage.LoadFromSQLServer HostServer, , ,256, , , , "GOSFBILL_SSI_STAGING_LOAD"


objPackage.LoadFromSQLServer HostServer,"syntelhum","syntel123", , , , , "GOSFBILL_SSI_STAGING_LOAD"


'objPackage_1.LoadFromSQLServer HostServer, , ,256, , , , "GOSFBILL_SSI_Update_FileSource"

objPackage_1.LoadFromSQLServer HostServer,"syntelhum","syntel123", , , , , "GOSFBILL_SSI_Update_FileSource"

Set ConnObj_001 = objPackage.Connections("SSIPATH001")
ConnObj_001.DataSource = sFileFolder

Set ConnObj_002 = objPackage.Connections("SSIPATH002")
ConnObj_002.DataSource = sFileFolder

Set ConnObj_004 = objPackage.Connections("SSIPATH004")
ConnObj_004.DataSource = sFileFolder
Set ConnObj_031 = objPackage.Connections("SSIPATH031")
ConnObj_031.DataSource = sFileFolder
Set ConnObj_032 = objPackage.Connections("SSIPATH032")
ConnObj_032.DataSource = sFileFolder
Set ConnObj_033 = objPackage.Connections("SSIPATH033")
ConnObj_033.DataSource = sFileFolder

Set ConnObj_Hclaimb = objPackage.Connections("SSIPATHCLAIMB")
ConnObj_Hclaimb.DataSource = sFileFolder

Set ConnObj_HProv = objPackage.Connections("SSIPATHPROV")
ConnObj_HProv.DataSource = sFileFolder
Set ConnObj_Hids = objPackage.Connections("SSIPATHHIDS")
ConnObj_Hids.DataSource = sFileFolder
Set ConnObj_HCodes = objPackage.Connections("SSIPATHCODES")
ConnObj_HCodes.DataSource = sFileFolder
Set ConnObj_HSpan = objPackage.Connections("SSIPATHSPAN")
ConnObj_HSpan.DataSource = sFileFolder

Set ConnObj_HCHGB = objPackage.Connections("SSIPATHCHGB")
ConnObj_HCHGB.DataSource = sFileFolder

objPackage.Execute
For Each oStep In objPackage.Steps
If oStep.ExecutionResult = DTSStepExecResult_Failure Then
Details = "***** GOSFBILL_SSI_Staging_Load failed. " & Date & " " & Time & "*****"
Call Write_Log
Main = DTSTaskExecResult_Failure
Exit Function
End If
Next

Call Update_Process_Flag("X",objRs("zip_file_name"))

objPackage_1.GlobalVariables("gFileSource").Value = objRs("zip_file_name")


objPackage_1.Execute

For Each oStep_1 In objPackage_1.Steps
If oStep_1.ExecutionResult = DTSStepExecResult_Failure Then

Details = "***** GOSFBILL_SSI_Update_FileSource failed. " & Date & " " & Time & "*****"
Call Write_Log
Main = DTSTaskExecResult_Failure
Exit Function
End If
Next

'********************************************'
Details = "***** End-Time " & sFileFolder & " " & Date & " " & Time & "*****"
Call Write_Log
objPackage.Uninitialize
objPackage_1.Uninitialize
Set objPackage = Nothing
Set objPackage_1 = Nothing
sFileFolder = ""
sFileFolderDBF = ""
sFileFolderFPT = ""
objRs.MoveNext
Wend
objRs.Close
End If
Call Close_Conn
Main = DTSTaskExecResult_Success
End Function
Sub Process_Dir (Para_cntl,Dir_Name)
comObj.CommandText ="dbo.Usp_Process_Dir"
comObj.commandtype = 4
comobj.parameters.Refresh
comobj.parameters("@Para_Cntl")= para_cntl
comobj.parameters("@Dir_Nm")= Dir_Name
comobj.parameters("@File_Type")= "SSI"
If (Para_Cntl = 1)Then
comObj.Execute()
Else If Para_Cntl = 2 Then
Set objRs = comObj.Execute()
End If
End If

End Sub
Sub Update_Process_Flag(P_Flag,Dir_Name)
comObj.CommandText ="dbo.Usp_Process_Flag"
comObj.commandtype = 4
comObj.parameters.Refresh
comObj.parameters("@Process_Flag")= P_Flag
comobj.parameters("@Dir_Nm")= Dir_Name
comObj.Execute()
End Sub
Sub Write_Log
comObj.CommandText ="dbo.usp_etl_write_log"
comObj.commandtype = 4
comobj.parameters.Refresh
comobj.parameters("@Text")= Details
Comobj.parameters("@NDC_SSI_IND")= "SSI"
Comobj.parameters("@Process_Stage")= "Staging"
comObj.Execute()
End Sub

Sub Close_Conn
Set comObj = Nothing
Set objRs = Nothing
conObj.Close
Set conObj = Nothing
Set fso = Nothing
Set folderObj = Nothing
Set subFolderList = Nothing
End Sub

When I migrated this code to SSIS, it's not working. How can I achieve this functionality in SSIS? Can anyone please help me?

Thanks in advance
Gijo

View 1 Replies View Related

Integration Services :: Loading Multiple Flat Files Into Different Tables Using SSIS?

Oct 25, 2015

I have been tasked to do the following using SSIS.

We receive two CSV files each week, and we would like to load these files into two different SQL Server tables using SSIS.

These files should be archived into a folder after each load.  

How can I achieve this?

View 6 Replies View Related

Loading Data From Same Table (oracle) In To Diff SQL Tables With In Same Package(same Query)

Nov 14, 2007



Hi,

I have Table A. We already have 80 columns, and we have to add 65 more columns.

We are populating this table from Oracle, and we need to populate those 65 new columns from the same source table.

Is it a better idea to add those new 65 columns to the existing table or to a new table?

If we go with the same table then the loading time will double. If I go with a new table, and if I am able to run both packages (which load data from the same Oracle server into different SQL tables) at the same time, then we should be good. But if we run into temp space issues on the Oracle server, then I have to load the two tables separately, which consumes the same time as the first option.

I was wondering if there is a way in SSIS where I can pull data from the same Oracle table into two different SQL tables at the same time?

View 1 Replies View Related

Bulk Insertion In Child Tables???

Oct 23, 2007

Hi guys, I am working on a Sales Force Automation application in which I have a scenario where I create a team (base table) and team users (child table); one team can have many users, and a user can be in one or more teams.
My question is: what is the best practice (performance-wise) for adding, let's suppose, 100 users to a team in one go (the user can pick any number of users from the user list)? In other words, how do I insert a bulk of records into the child table?
I am using SQL Server 2000 + ASP.NET 2005 (C#).
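For reference, on SQL Server 2000 the usual trick for inserting a whole batch of child rows in one round trip is to pass them as XML and shred it with OPENXML inside a stored procedure; a rough sketch (table and column names made up):

CREATE PROCEDURE dbo.AddTeamUsers
    @TeamId int,
    @Users  ntext   -- e.g. N'<Users><User UserId="1"/><User UserId="2"/></Users>'
AS
BEGIN
    DECLARE @hDoc int;
    EXEC sp_xml_preparedocument @hDoc OUTPUT, @Users;

    INSERT INTO dbo.TeamUser (TeamId, UserId)
    SELECT @TeamId, UserId
    FROM OPENXML(@hDoc, '/Users/User', 1)   -- 1 = attribute-centric mapping
         WITH (UserId int);

    EXEC sp_xml_removedocument @hDoc;
END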
 

View 1 Replies View Related






