Longarse Txt Source Files

Oct 24, 2006

I thought I'd share this little thought/tip with you all:

I have .txt import files with a huge number of columns (this time about 150 - 200), and I need to import 3 of those files. I do have the column definitions, but filling out 200 column definitions in the SSIS editor takes you a while. (And Suggest Types is fairly useless due to the 1000-sample-row limitation; not to mention that if it identifies fields containing Y/N values as Boolean, the import throws errors, so most of the time you are better off defining your own columns.)

Fortunately for me, the majority of the fields are similar (over 100 fields), so I can copy and paste the flat file connection and make changes to the copied connection.

I guess my wish for a future version is that the flat file connection manager could create column definitions from a pre-defined format document, which would look like this:





1  Acres      float    8
2  AgentList  varchar  20
3  Area       int      4
4  Fee        int      4
5  FeePaid    varchar  10
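For what it's worth, a rough sketch of what that could look like today with the SSIS 2005 object model is below: read an "ordinal name type length" spec file and add one column per line to a flat file connection manager. The spec and package paths, the tab-delimited layout of the spec and the type mapping are all assumptions, and the interface and property names should be checked against the Microsoft.SqlServer.Dts.Runtime.Wrapper documentation before relying on this.

Imports System
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime
Imports Wrapper = Microsoft.SqlServer.Dts.Runtime.Wrapper

Module FlatFileColumnBuilder
    Sub Main()
        Dim pkg As New Package()
        Dim cm As ConnectionManager = pkg.Connections.Add("FLATFILE")
        cm.Name = "ImportFile"
        cm.ConnectionString = "C:\import\listings.txt"
        cm.Properties("Format").SetValue(cm, "Delimited")
        cm.Properties("ColumnNamesInFirstDataRow").SetValue(cm, False)

        Dim ff As Wrapper.IDTSConnectionManagerFlatFile90 = _
            CType(cm.InnerObject, Wrapper.IDTSConnectionManagerFlatFile90)

        ' Spec file: one "ordinal<TAB>name<TAB>type<TAB>length" line per column.
        Dim specLines As String() = File.ReadAllLines("C:\import\format.txt")

        For i As Integer = 0 To specLines.Length - 1
            Dim parts As String() = specLines(i).Split(CChar(vbTab))
            Dim col As Wrapper.IDTSConnectionManagerFlatFileColumn90 = _
                CType(ff.Columns.Add(), Wrapper.IDTSConnectionManagerFlatFileColumn90)

            CType(col, Wrapper.IDTSName90).Name = parts(1)
            col.ColumnType = "Delimited"

            ' The last column must be terminated by the row delimiter.
            If i = specLines.Length - 1 Then
                col.ColumnDelimiter = vbCrLf
            Else
                col.ColumnDelimiter = vbTab
            End If

            Select Case parts(2).ToLower()
                Case "int"
                    col.DataType = Wrapper.DataType.DT_I4
                Case "float"
                    col.DataType = Wrapper.DataType.DT_R8
                Case Else
                    col.DataType = Wrapper.DataType.DT_STR
                    col.ColumnWidth = Integer.Parse(parts(3))
            End Select
        Next

        ' Save the generated package so the designer can pick it up.
        Dim app As New Application()
        app.SaveToXml("C:\import\GeneratedColumns.dtsx", pkg, Nothing)
    End Sub
End Module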

View 2 Replies



Using XML Source In SSIS Vs Script Source To Parse Xml Files.

Sep 14, 2007



Hi

I have a huge XML file with nested sections.
I also have an XSD file for that XML.
I have a destination table into which the data from the XML should be loaded.
I am using the XML source transformation, but to get all the data I need to use multiple Merge Joins to get the data into a single row which I can insert into the destination. I was not quite convinced by using so many joins.

So I tried using the script source transformation, where I use XML objects to get the nodes and dynamically construct the data rows, and the output is then inserted into the destination.
Comparing the two approaches, the one using the script source works much faster than the XML source transformation.

I wanted to know: is there any limitation to using the script source to parse XML files?
I would also like to know of any other, better way of getting the data from the XML source without using the joins.
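For reference, a minimal sketch of the script-source pattern described above, assuming a hypothetical Orders/Order/Line layout and a script component output that already has OrderId, ItemCode and Quantity columns defined (the Output0Buffer properties are generated from whatever output columns you define):

Imports System
Imports System.Xml

Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub CreateNewOutputRows()
        Dim doc As New XmlDocument()
        doc.Load("C:\data\orders.xml")   ' placeholder path

        ' Emit one flattened row per leaf node, repeating the parent values,
        ' so no Merge Join is needed downstream.
        For Each order As XmlNode In doc.SelectNodes("/Orders/Order")
            For Each line As XmlNode In order.SelectNodes("Line")
                Output0Buffer.AddRow()
                Output0Buffer.OrderId = order.Attributes("Id").Value
                Output0Buffer.ItemCode = line.Attributes("ItemCode").Value
                Output0Buffer.Quantity = CInt(line.Attributes("Qty").Value)
            Next
        Next
    End Sub
End Class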


Hari

View 7 Replies View Related

Source Files In A Zip File

Aug 26, 2007

I am designing an SSIS package where my source is flat files inside a zip file. I am not sure how to work with flat files that are inside a zip file...
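One common approach is to unzip the archive in a script task (or an Execute Process Task) before the data flow reads the extracted flat files. A minimal script task sketch, assuming 7-Zip is installed at the path shown and with the archive and staging paths as placeholders:

Imports System
Imports System.Diagnostics
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        ' Extract the archive into a staging folder before the data flow runs.
        Dim psi As New ProcessStartInfo()
        psi.FileName = "C:\Program Files\7-Zip\7z.exe"
        psi.Arguments = "e ""C:\inbox\source.zip"" -o""C:\staging"" -y"
        psi.UseShellExecute = False
        psi.CreateNoWindow = True

        Dim unzip As Process = Process.Start(psi)
        unzip.WaitForExit()

        If unzip.ExitCode = 0 Then
            Dts.TaskResult = Dts.Results.Success
        Else
            Dts.TaskResult = Dts.Results.Failure
        End If
    End Sub
End Class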

View 5 Replies View Related

Using SSIS Package To Delete Source Files

Jan 31, 2013

Is it possible to delete a source file (*.txt) when an SSIS package is done with it?
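A File System Task set to the Delete File operation, pointed at the same file connection the data flow used, does this without any code. If a script task is preferred, a minimal sketch could look like the following; "SourceFilePath" is a hypothetical package variable and must be listed in the task's ReadOnlyVariables:

Imports System
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        ' "SourceFilePath" holds the full path of the file the data flow
        ' has just finished loading.
        Dim sourcePath As String = Dts.Variables("SourceFilePath").Value.ToString()

        If File.Exists(sourcePath) Then
            File.Delete(sourcePath)
        End If

        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class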

View 3 Replies View Related

SSIS - Dealing With Missing Source Files

Jul 9, 2007



I want to skip running the SSIS data flow task when the source file is missing. We have a scheduler that copies the source file to the staging area. This SSIS package runs as a SQL Server job, so when the SSIS package fails due to the missing file, the remaining steps in the SQL schedule won't execute. I want to handle the missing source file condition gracefully. Please advise.
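One pattern that is often suggested: a script task sets a Boolean variable, and an expression on the precedence constraint into the data flow decides whether it runs at all. A minimal sketch, with "SourceFilePath" and "FileExists" as hypothetical variables (listed in ReadOnlyVariables and ReadWriteVariables respectively):

Imports System
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        ' A precedence constraint with the expression @FileExists == True
        ' then decides whether the data flow task runs; the package itself
        ' succeeds either way.
        Dim sourcePath As String = Dts.Variables("SourceFilePath").Value.ToString()
        Dts.Variables("FileExists").Value = File.Exists(sourcePath)
        Dts.TaskResult = Dts.Results.Success
    End Sub
End Class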



Thanks in advance.

View 1 Replies View Related

Data Flow Source For Dbf (DBASE IV) Files

Aug 22, 2005

We're trying to read dBASE IV files as a source, but can't find any providers for that format. Will these be included in the final release? Is there another way? dBASE has always been supported, so it's kind of strange.
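For what it's worth, .dbf files can usually be reached through the Jet 4.0 OLE DB provider (32-bit only). A connection string sketch, with the folder path as a placeholder:

' "Data Source" points at the folder containing the .dbf files;
' each .dbf file then shows up as a table.
Dim dbfConnectionString As String = _
    "Provider=Microsoft.Jet.OLEDB.4.0;" & _
    "Data Source=C:\data\dbase;" & _
    "Extended Properties=dBASE IV;"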

View 19 Replies View Related

XML Source File Change Issue (same XSD Different XML Files)

May 26, 2008

Hi all,
I'm facing a small problem with the XML file source.
Let me explain the scenario.

We have 4 source XML files with the same format; each file has around 25k rows of records.

XML Source
Derived Columns
Data Conversion
Conditional Split
OLEDB Destination

It executes fine.
Once we change the XML file source (to the second XML file with the same XSD),
all the other components show errors. The error is:

Error 21 Validation error. Staging: DTS.Pipeline: input column "LateRsnCd" (17917) has lineage ID 19380 that was not previously used in the Data Flow task. OnTimeOrderEntry.dtsx 0 0

It seems the lineage IDs have changed when we point the XML source at the new file. Of course the XSD is the same, but it is not matching by name.
To resolve this, if we open the errored task, select all, map using column name, and apply, it gets mapped with the columns as they are in the input.
Still, the conditional split shows the error.
If we open the conditional split and click OK, the error goes away; I think some metadata change occurs when opening the task and clicking OK.

What could be the possible problem and how do we fix it?
One more doubt: how can we include the XML source in the configuration file? Since it doesn't have a connection manager, I'm struggling with dynamic file selection for the XML source.
I don't want to hard-code the source file path (it resides on the server) in the properties of the XML source task.
Can I have some suggestions please?

View 4 Replies View Related

How To Handle Double Quotes In The Source CSV Files

Apr 27, 2007



I am running into an issue with SSIS when I try to load a CSV file that contains double quotes wrapped around a field (CSV files use double quotes when a field contains a comma; for example: "Streams, Inc").



Has anyone worked around this issue?
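For reference, the flat file connection manager's Text qualifier property can be set to a double quote for exactly this layout. If a code-based fallback is ever needed, a minimal sketch using TextFieldParser (the file path is a placeholder) might look like this:

Imports System
Imports Microsoft.VisualBasic.FileIO

Module QuotedCsvSketch
    Sub Main()
        ' Parses lines such as:  123,"Streams, Inc",NY
        ' treating quoted fields as single values even when they contain commas.
        Using parser As New TextFieldParser("C:\data\customers.csv")
            parser.TextFieldType = FieldType.Delimited
            parser.SetDelimiters(",")
            parser.HasFieldsEnclosedInQuotes = True

            While Not parser.EndOfData
                Dim fields As String() = parser.ReadFields()
                Console.WriteLine(String.Join("|", fields))
            End While
        End Using
    End Sub
End Module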

View 1 Replies View Related

Data Source Passwords In Config Files

Aug 31, 2005

Has anybody found a sound approach to storing data source connection information in a configuration while still being able to access a data source that requires a password for login?

View 6 Replies View Related

Move The Source Excel Files To Archive Folder

Apr 15, 2008

Using the script task below, I check for the Excel file's existence, and upon the file existing, a data flow task loads the Excel data into a SQL table. After the data is loaded from one file (or however many Excel files are present), I want to move those files into an archive folder with a datetime stamp added to the filenames. Please let me know how I can move those files with a datetime stamp appended to the filenames (a sketch is added after the script below); any help is greatly appreciated. Thanks!!

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports System.IO
Public Class ScriptMain
Public Sub Main()
If File.Exists(ReadVariable("FileNameVariable").ToString()) Then
Dts.TaskResult = Dts.Results.Success
Else
Dts.TaskResult = Dts.Results.Failure
End If
End Sub

'From Daniel Read's Blog - http://www.developerdotstar.com/community/node/512/
Private Function ReadVariable(ByVal varName As String) As Object
Dim result As Object
Try
Dim vars As Variables
Dts.VariableDispenser.LockForRead(varName)
Dts.VariableDispenser.GetVariables(vars)
Try
result = vars(varName).Value
Catch ex As Exception
Throw ex
Finally
vars.Unlock()
End Try
Catch ex As Exception
Throw ex
End Try
Return result
End Function
End Class
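A sketch of the archiving step asked about above: move the file that was just loaded into an archive folder, appending a timestamp to its name. This method could be added to the ScriptMain class; "FileNameVariable" matches the variable used above, while the archive path and timestamp format are assumptions.

' Call ArchiveFile() from Main after the data flow has finished with the file.
Private Sub ArchiveFile()
    Dim sourcePath As String = ReadVariable("FileNameVariable").ToString()
    Dim archiveFolder As String = "C:\archive"
    Dim stamp As String = DateTime.Now.ToString("yyyyMMdd_HHmmss")

    Dim fileName As String = Path.GetFileNameWithoutExtension(sourcePath)
    Dim extension As String = Path.GetExtension(sourcePath)
    Dim targetPath As String = Path.Combine(archiveFolder, fileName & "_" & stamp & extension)

    File.Move(sourcePath, targetPath)
End Sub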

View 8 Replies View Related

Integration Services :: Load Different Source Files Into Tables

Oct 12, 2015

I have one small requirement: I want to load different types of files (.txt, .csv, .tsv, .xlsx).

Using one Foreach Loop container, how can I load the files to the database without using a script task to split the filenames? Is there any other way to load all the files using a Foreach Loop container, an Execute SQL task, etc.?

View 2 Replies View Related


How To View AS400 Library Files From OLEDB Source (SSIS)

Jan 3, 2007

HI,

I'm trying to get data from an AS400, using an OLEDB source as my connection. I'm using the IBM OLE DB provider for iSeries, and working on Standard Edition of SQL Server 2005.

While using the OLEDB source task, when I set my access mode to 'Table or view' and try to see the list of available libraryname.tablenames, I do not get any tables.

Whereas when I use the data access mode 'SQL Command', I can get data from the AS400 (I can only see a preview of the data) but am not able to insert it into my destination table. At run time the task fails with the error mentioned below.

I have also configured the Data Links tab inside the OLEDB connection manager, but when I tried to set a default library it gives me the error "Error code: CWBZZ5042" (catalog is invalid), even though it does exist.

Are there some settings that need to be made on the AS400 side or the SQL Server side to view the available libraries and their tables?

Can someone help me with this?

thanks in advance

Shah

Error Message received when executed with SQL command:

Error: 0xC0202009 at Data Flow Task, OLE DB Source [1]: An OLE DB error has occurred. Error code: 0x80040E00.

Error: 0xC0047038 at Data Flow Task, DTS.Pipeline: The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC0202009. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: Thread "SourceThread0" has exited with error code 0xC0047038.

Error: 0xC0047039 at Data Flow Task, DTS.Pipeline: Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.

Error: 0xC0047021 at Data Flow Task, DTS.Pipeline: Thread "WorkThread0" has exited with error code 0xC0047039.

Task failed: Data Flow Task

View 4 Replies View Related

T-SQL (SS2K8) :: Extracting Source / Dest Column Details From SSIS Package Files?

Jun 30, 2015

We have several hundred very simple ETL SSIS 2K8 package files (*.dtsx).

I'd like to be able to interrogate them to determine source and destination fields.

There's no great need to map source to dest or to extract data types.

So far, the most promising candidate is to load them using OPENROWSET into an XML field in a SQL Server table. No problem there, but querying using OPENXML has me stumped.

The package files will change a couple of times per year, so the process needs to be repeatable with minimal manual intervention.
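A rough alternative sketch, in case the OPENXML route stays stubborn: walk the .dtsx files from a small client program and print the column names of each pipeline component. The element names used here (component, outputColumn, externalMetadataColumn) follow the pre-2012 package layout and should be verified against one of the actual packages; the folder path is a placeholder.

Imports System
Imports System.IO
Imports System.Xml

Module DtsxColumnLister
    Sub Main()
        For Each packagePath As String In Directory.GetFiles("C:\packages", "*.dtsx")
            Dim doc As New XmlDocument()
            doc.Load(packagePath)

            ' One line per pipeline column: package, component, column name.
            For Each component As XmlNode In doc.SelectNodes("//component")
                Dim componentName As String = component.Attributes("name").Value
                For Each col As XmlNode In component.SelectNodes(".//externalMetadataColumn | .//outputColumn")
                    Console.WriteLine("{0}" & vbTab & "{1}" & vbTab & "{2}", _
                        Path.GetFileName(packagePath), componentName, col.Attributes("name").Value)
                Next
            Next
        Next
    End Sub
End Module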

View 3 Replies View Related

Flat File Source Error Output Conversion Error With UNICODE Files

May 14, 2008

I have a weird situation here. I tried to load a Unicode file with a flat file source component. One of the file's lines has data like any other line, but it also contains the character "ÿ", which I can't see, or find and replace with an empty string. The source component parses the line correctly, but if there is a data type error in this line, the error output for that line gives me this character "ÿ" instead of the original line.

Simply put, the error output of the flat file source component fails to return the original line when the line contains the hidden "ÿ".

I hope you can help me with this issue.

Thanks in advance.

View 5 Replies View Related

DB Engine :: Copy Files From Source Server On Destination Server

Sep 25, 2015

I want to schedule a job which pulls files from a non-SQL server (Sybase), and which then needs a step 2 kicking off the SSIS package. The problem is that on the source a batch file runs every 4 hours and outputs a total of 10 text files (it takes 5 minutes to complete). Now, on the destination, I want to pull these files via a SQL job, but while scheduling:

1. I don't see any option for something like every 4 hours 10 minutes or so.
2. If it is out there, then I believe this might be a problem, as the offset would keep incrementing, e.g. the next run would be at 4 hours 20 minutes in that case.

How should I go about pulling these files, given that we have an SSIS package on the destination that needs those text files as soon as they arrive on the SQL Server (destination)?

View 2 Replies View Related

Integration Services :: Metadata Change In Source Disturbs All Configuration Of Source

Jun 18, 2015

I have got a package whose source is a SQL table with 50 columns. We are using only 10 columns out of these. Recently one column name changed, and thus it throws an "invalid mapping" error. When I open the source to make the changes, I notice that all the columns are now pre-selected and the data types have reverted to the defaults (I had changed the data types as per my requirements while I developed). So now I have to select the required columns from the source and redo the data type changes in the advanced editor. Is there any option which doesn't disturb these settings, so that we just need to correct the mapping alone?

View 4 Replies View Related

XML TASK XSL Variable: Exception 'Property New Source Has No Source Xml Text'

Dec 20, 2005

Hi,
I want to transform an XML flow into an HTML flow. For this, I create an XmlDataDocument, add to it everything I want, and store it in a file (in a data flow task).
After that, in an XML task, I set the input properties like this:
Operation type: xslt
SourceType: File connection
Source: ras.xml
 
All works fine. But now I want to use a variable to store my XML data. So, instead of storing my data in ras.xml, I store it in a variable like this:
 
Dim v As Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSVariables90
Me.VariableDispenser.LockOneForWrite("RAS", v)
v("RAS").Value = doc.OuterXml
v.Unlock()
 
and in the XML task, I set the configuration like this:
 
Operation type: xslt
SourceType: Variable
Source: user::RAS
 
But when I execute, I get the following exception:
 
Error: 0xC002F304 at Format HTML Mail, XML Task: An error occurred with the following error message: "Root element is missing.".
Error: 0xC002928F at Format HTML Mail, XML Task: Property "New Source" has no source Xml text; Xml Text is either invalid, null or empty string.
 
Furthermore, when I debug it by replacing the XML task with a script task, I can see that my RAS variable contains the XML data, which is well formed:
 
<?xml version="1.0" encoding="UTF-8"?>
<root> … </root>
 
(When I store it in a file, I can see it in IE)
 
Have you got an idea ?
 
 
 
 

View 3 Replies View Related

Loading Data Using Ole Db Source With Input Source Being A View

Dec 13, 2007

I was trying to load data using SSIS: a Data Flow Task with an OLE DB Source (the source being a view) going to an OLE DB Destination (SQL Server). This view returns 420,591 rows from Query Analyzer in 21 seconds; the row length is 925. When I executed the Data Flow Task from SSIS, I had to stop the process after 30 minutes, because only 2,000 rows had been retrieved. I modified the view to return TOP 440,000 and reran. This time all 420,591 rows were retrieved and written in 22 seconds. Next, I tried to use TOP 100 PERCENT. Again, only 2,000 rows were returned after 30 minutes. TempDB is on a separate SAN RAID group with 200 GB free, the databases are on a separate drive with 200 GB free. The server has 13 GB of memory and no other processes were executing.

The only way I could populate the table was by using an Execute SQL Task from SSIS and hard-coding an INSERT into the table selecting data from the view (35 seconds).

Has anyone else experienced this or a similar issue? Does anyone have a solution or explanation?

View 13 Replies View Related

SQL Server Admin 2014 :: Separate Data Files / Log Files / TempDB / Backups

Jan 9, 2015

I proposed on a new server that we separate data files, log files, tempdb, backups, etc. onto separate LUNs on a SAN with high-speed solid state drives. I was told that with the new solid state SAN technology this would decrease performance, and that it does not work the same way as it did when you had RAID 5 arrays, etc. I thought that if things were carried out correctly by a SAN administrator, they would know how to configure for optimal performance.

View 2 Replies View Related

For Loop - Iterate From Older Files To Newer Files Based On File's Timestamp

Mar 13, 2008

In the For Loop, how do I iterate from older flat files to newer flat files based on the file's timestamp? If there are some older files in that folder, they should be processed first, and then continue with the newer ones.

Any Suggestions?
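One possible pattern, sketched below: a script task builds a list of file paths sorted oldest-first by last-write time and stores it in an object variable, and a Foreach Loop with the Foreach From Variable enumerator then walks that list. The folder, file mask and variable name are placeholders.

Imports System
Imports System.Collections
Imports System.IO
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain
    Public Sub Main()
        Dim files As FileInfo() = New DirectoryInfo("C:\inbox").GetFiles("*.txt")

        ' Oldest files first.
        Array.Sort(files, New Comparison(Of FileInfo)(AddressOf CompareByLastWrite))

        Dim orderedPaths As New ArrayList()
        For Each fi As FileInfo In files
            orderedPaths.Add(fi.FullName)
        Next

        ' "OrderedFileList" is an Object-typed package variable listed in ReadWriteVariables.
        Dts.Variables("OrderedFileList").Value = orderedPaths
        Dts.TaskResult = Dts.Results.Success
    End Sub

    Private Function CompareByLastWrite(ByVal x As FileInfo, ByVal y As FileInfo) As Integer
        Return x.LastWriteTime.CompareTo(y.LastWriteTime)
    End Function
End Class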

View 3 Replies View Related

Script Task: How To Compare Files On FTP With Existing Files In Local Folder Before Transfer!

Apr 24, 2008

In the first step of my SSIS package I need to get files from FTP and dump them in a local directory, but it's more than that; the logic is like this:
1. If no file(s) are found, stop executing and send an email saying no file(s) were found;
2. If file(s) are found, then compare them with the existing files in our archive folder; if the file(s) already exist in the archive folder, stop executing and send an email saying the file(s) already existed; if the file(s) are not in the archive folder yet, then transfer them to the local directory for processing.

I know I have to use a script task to do this, and I did some research and found examples for each of the above two steps but not both combined, so that's why I need some help here to get the logic incorporated correctly.

Thanks for the help in advance, and I apologize for the long lines of code!

example for step 1:
----------------------------------------------------------------------------------------------------------

' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
Imports Microsoft.VisualBasic.FileIO.FileSystem
Imports System.IO.FileSystemInfo

Public Class ScriptMain

' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.

Public Sub Main()

Dim cDataFileName As String
Dim cFileType As String
Dim cFileFlgVar As String
WriteVariable("SCFileFlg", False)
WriteVariable("OOFileFlg", False)
WriteVariable("INFileFlg", False)
WriteVariable("IAFileFlg", False)
WriteVariable("RCFileFlg", False)
cDataFileName = ReadVariable("DataFileName").ToString
cFileType = Left(Right(cDataFileName, 4), 2)
cFileFlgVar = cFileType.ToUpper + "FileFlg"
WriteVariable(cFileFlgVar, True)
Dts.TaskResult = Dts.Results.Success
End Sub
Private Sub WriteVariable(ByVal varName As String, ByVal varValue As Object)
Try
Dim vars As Variables
Dts.VariableDispenser.LockForWrite(varName)
Dts.VariableDispenser.GetVariables(vars)
Try
vars(varName).Value = varValue
Catch ex As Exception
Throw ex
Finally
vars.Unlock()
End Try
Catch ex As Exception
Throw ex
End Try
End Sub
Private Function ReadVariable(ByVal varName As String) As Object
Dim result As Object
Try
Dim vars As Variables
Dts.VariableDispenser.LockForRead(varName)
Dts.VariableDispenser.GetVariables(vars)
Try
result = vars(varName).Value
Catch ex As Exception
Throw ex
Finally
vars.Unlock()
End Try
Catch ex As Exception
Throw ex
End Try
Return result
End Function
End Class

example for step 2:
-------------------------------------------------------------------------------------------------------

' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain

    ' The execution engine calls this method when the task executes.
    ' To access the object model, use the Dts object. Connections, variables, events,
    ' and logging features are available as static members of the Dts class.
    ' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
    '
    ' To open Code and Text Editor Help, press F1.
    ' To open Object Browser, press Ctrl+Alt+J.

    Public Sub Main()

        Try
            'Create the connection to the ftp server
            Dim cm As ConnectionManager = Dts.Connections.Add("FTP")

            'Set the properties like username & password
            cm.Properties("ServerName").SetValue(cm, "ftp.name.com")
            cm.Properties("ServerUserName").SetValue(cm, "username")
            cm.Properties("ServerPassword").SetValue(cm, "password")
            cm.Properties("ServerPort").SetValue(cm, "21")
            cm.Properties("Timeout").SetValue(cm, "0") 'The 0 setting will make it not timeout
            cm.Properties("ChunkSize").SetValue(cm, "1000") '1000 kb
            cm.Properties("Retries").SetValue(cm, "1")

            'Create the FTP object that sends the files and pass it the connection created above.
            Dim ftp As FtpClientConnection = New FtpClientConnection(cm.AcquireConnection(Nothing))

            'Connect to the ftp server
            ftp.Connect()

            'ftp.SetWorkingDirectory("..")
            ftp.SetWorkingDirectory("directoryname")

            Dim folderNames() As String
            Dim fileNames() As String
            ftp.GetListing(folderNames, fileNames)

            Dim maxname As String = ""
            For Each filename As String In fileNames
                ' whatever operation you need to do to find the correct file...
            Next

            Dim files(0) As String
            files(0) = maxname
            ftp.ReceiveFiles(files, "C:\temp", True, True)

            ' Close the ftp connection
            ftp.Close()

            'Set the filename you retrieve for use in data flow
            Dts.Variables.Item("FILENAME").Value = maxname

        Catch ex As Exception
            Dts.TaskResult = Dts.Results.Failure
            ' Exit here so the Success assignment below does not overwrite the failure.
            Exit Sub
        End Try

        Dts.TaskResult = Dts.Results.Success

    End Sub

End Class
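A sketch of the comparison step that is missing between the two examples: given the fileNames() array returned by ftp.GetListing above, keep only the names that are not already present in the archive folder. The archive path is a placeholder; the function could live in the same ScriptMain class, with Imports System.IO added at the top.

Private Function FilterNewFiles(ByVal remoteNames() As String, ByVal archiveFolder As String) As String()
    Dim newFiles As New System.Collections.ArrayList()

    For Each remoteName As String In remoteNames
        ' Anything already sitting in the archive folder has been processed before.
        If Not File.Exists(Path.Combine(archiveFolder, remoteName)) Then
            newFiles.Add(remoteName)
        End If
    Next

    Return CType(newFiles.ToArray(GetType(String)), String())
End Function

' Usage inside Main, right after ftp.GetListing(folderNames, fileNames):
'   Dim newFiles() As String = FilterNewFiles(fileNames, "C:\archive")
'   If newFiles.Length = 0 Then ... set a flag so a Send Mail task reports "nothing new" ...
'   Else ftp.ReceiveFiles(newFiles, "C:\temp", True, True)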

View 16 Replies View Related

SQL 2012 :: FOR FILES Command To Delete Old Backup Files On Remote Server?

Feb 24, 2015

I need to delete old backup files via a TSQL job. I found this solution online:

PushD "\\remoteserver\share\DIFF" && (
forfiles -m *DIFF*.sqb -d -1 -c "cmd /c del /q @path"
) & PopD

It works remotely if I run it via the command prompt. But when I add this to a TSQL job on my remote SQL instance, it runs without deleting anything. What am I missing?

View 6 Replies View Related

An Error Has Occurred During Report Processing. A Data Source Instance Has Not Been Supplied For The Data Source DetailDS_get_o

Mar 13, 2008

Hi,

I am trying to build a drill-through report (rdlc).

I have written the following code in the Drillthrough event of the ReportViewer. Whenever I click on the first report, I get an error like:

An error has occurred during report processing.

A data source instance has not been supplied for the data source "DetailDS_get_orderdetail".







the code is



using System;
using System.Data;
using System.Data.SqlClient;
using System.Configuration;
using System.Collections;
using System.Web;
using System.Web.Security;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;
using System.Web.UI.HtmlControls;
//using Microsoft.ApplicationBlocks.Data;
using Microsoft.Reporting.WebForms;
using DAC;

public partial class _Default : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        ReportViewer1.Visible = false;
    }

    protected void Button1_Click(object sender, EventArgs e)
    {
        DAC.clsReportsWoman obj = new clsReportsWoman();
        DataSet ds = new DataSet();
        ds = obj.get_order();
        ReportViewer1.LocalReport.DataSources.Clear();
        ReportDataSource reds = new ReportDataSource("DataSet1_get_order", ds.Tables[0]);
        ReportViewer1.LocalReport.DataSources.Add(reds);
        ReportViewer1.LocalReport.ReportPath = "C:/Documents and Settings/km63096/My Documents/Visual Studio 2005/WebSites/drillthrurep/Report.rdlc";
        ReportViewer1.LocalReport.Refresh();
        ReportViewer1.Visible = true;
    }

    protected void ReportViewer1_Drillthrough(object sender, DrillthroughEventArgs e)
    {
        DAC.clsReportsWoman obj = new clsReportsWoman();
        ReportParameterInfoCollection DrillThroughValues = e.Report.GetParameters();

        foreach (ReportParameterInfo d in DrillThroughValues)
        {
            Label1.Text = d.Values[0].ToString().Trim();
        }

        LocalReport localreport = (LocalReport)e.Report;
        string order_id = Label1.Text;
        DataSet ds = new DataSet();
        ds = obj.get_orderdetail(order_id);

        ReportViewer1.LocalReport.DataSources.Clear();
        ReportDataSource reds = new ReportDataSource("DetailDS_get_orderdetail", ds.Tables[0]);
        ReportViewer1.LocalReport.DataSources.Add(reds);
        ReportViewer1.LocalReport.ReportPath = Server.MapPath(@"Reportlevel1.rdlc");
        ReportViewer1.LocalReport.Refresh();
    }
}

The code in the method get_orderdetail(order_id) is:

public DataSet get_orderdetail(string order_id)
{
    SqlCommand cmd = new SqlCommand();
    DataSet ds = new DataSet();
    cmd.Parameters.Add("@order_id", SqlDbType.VarChar, 50);
    cmd.Parameters["@order_id"].Value = order_id;
    ds = SQLHelper.ExecuteAdapter(cmd, CommandType.StoredProcedure, "dbo.get_orderdetail");
    return (ds);
}

Please help me.

View 1 Replies View Related

OLE DB Source To Flat File Destintation Using Fixed Width Columns - Determining Source Column Width

Feb 13, 2007

Hi,

I am trying to create a program that transfers tables to flat files.
At this point in time, I have succeeded in creating one that produces delimited files.

However, I am now trying to create fixed-width files, as you can do with the SSIS designer, but programmatically.

Is there a way to programmatically determine the width of a column from the source table? I cannot seem to find any kind of function or member that stores this information or allows me to retrieve it.

I know what I need to change in order to set a width for a column, but I just don't know how to find the width without asking the user to provide one.
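One possible way to get a width per source column, sketched below, is to read it from the table's metadata rather than from the SSIS object model. The connection string and table name are placeholders, and CHARACTER_MAXIMUM_LENGTH only applies to character columns, so a fallback width is assumed for everything else.

Imports System
Imports System.Data
Imports System.Data.SqlClient

Module ColumnWidthSketch
    Sub Main()
        Dim sql As String = _
            "SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH " & _
            "FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = @table"

        Using conn As New SqlConnection("Data Source=.;Initial Catalog=SourceDb;Integrated Security=SSPI")
            Using cmd As New SqlCommand(sql, conn)
                cmd.Parameters.AddWithValue("@table", "Customer")
                conn.Open()

                Using reader As SqlDataReader = cmd.ExecuteReader()
                    While reader.Read()
                        ' NULL for non-character types; fall back to a default width.
                        Dim width As Integer = 50
                        If Not reader.IsDBNull(2) Then
                            width = reader.GetInt32(2)
                        End If
                        Console.WriteLine("{0} ({1}): {2}", reader.GetString(0), reader.GetString(1), width)
                    End While
                End Using
            End Using
        End Using
    End Sub
End Module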

View 5 Replies View Related

HELP!!! Cannot Import SSIS Package Files From .dtsx Files

Oct 8, 2007





Brief overview... Running Windows Server 2003 Enterprise 64-bit, with all service packs and patches current.
SQL Server 2005 Enterprise Edition 64-bit, build: Microsoft SQL Server 2005 - 9.00.3054.00 (X64) Mar 23 2007 18:41:50 Copyright (c) 1988-2005 Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2)

I cannot import any SSIS packages nor create any new folders under Stored Packages. I have googled the newsgroups and looked at BOL to no avail. HELP!!!!

View 20 Replies View Related

Is It A Good Idea To Replace SQL Script Files With XML Files?

Jul 23, 2005

I am thinking about replacing the INSERT data script files that I have with XML files. This way I can open the XML file using an XML editor, see the values in a grid, and make changes more easily. Do you see any problem with this approach?

I managed to put together some code that exports a SQL table with its data to an XML file, and also code that reads the XML file's data and inserts it into a table. Now I am researching XSD, dt:datatype, DTD... (I am new to XML) in order to figure out how I can use a single XML file that holds the SQL Server fields, their data types, and their values.

If you have links to some sample code that has anything to do with the datatype export and import I am working on, can you please share them with me? Most importantly, what do you think about the idea of using XML files vs. SQL scripts? Thank you.
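For what it's worth, a small sketch of the single-file idea: DataSet.WriteXml with WriteSchema embeds the inline XSD (column names and data types) alongside the data, and ReadXml restores both. The connection string and table name are placeholders.

Imports System
Imports System.Data
Imports System.Data.SqlClient

Module TableToXmlSketch
    Sub Main()
        Dim ds As New DataSet()

        Using conn As New SqlConnection("Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI")
            Dim adapter As New SqlDataAdapter("SELECT * FROM dbo.Customer", conn)
            adapter.FillSchema(ds, SchemaType.Source, "Customer")
            adapter.Fill(ds, "Customer")
        End Using

        ' Schema + data in one editable file.
        ds.WriteXml("Customer.xml", XmlWriteMode.WriteSchema)

        ' Later: load it back (for example before re-inserting the rows).
        Dim restored As New DataSet()
        restored.ReadXml("Customer.xml", XmlReadMode.ReadSchema)
        Console.WriteLine("{0} rows restored.", restored.Tables("Customer").Rows.Count)
    End Sub
End Module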

View 4 Replies View Related

Integration Services :: Converting RTF Files To PDF Files?

Jun 25, 2015

I have a scenario where I need to convert RTF files to PDF files. May I know if this is achievable in SSIS, by writing C# code?

View 6 Replies View Related

Transaction Log Files/Virtual Log Files

Oct 1, 2004

I want to reduce the number of virtual log files I have. In reading through the Books Online documentation, I realized that I forgot to move the transaction log files to a different drive. Now that the server is in production, I wanted to get some input about the best way to make this change.

Can I just change the directory the log files are written to in the DB properties without any adverse problems occurring?

View 2 Replies View Related

Empty Log Files And Data Files

Sep 17, 2004

Hi!
I'm using replication with two databases on SQL 2000. In the beginning, the log file size was 50 MB and the data file size was 150 MB. But now the log file size is 2 GB and the data file size is 4 GB. I would like to decrease the log files and the data files. How do I do this?
(Using truncate and shrink doesn't change anything.)
Thanks!!!

View 2 Replies View Related

Process Many Image Files And Wav Files

Dec 11, 2005

I need to write code to access many image files and wave files and display them on the webpage. Both kinds of files are stored on the server (as many experts in this forum advised) and their locations are in the SQL Server database. As for the wav files, I think I will have to convert them to MP3 first for fast delivery over the internet. Since I am a newbie, any help would be appreciated.
Thanks,
Thanks,

View 14 Replies View Related

Bcp Unicode Files Using Format Files

Jul 20, 2005

I have a problem with bcp and format files. We changed our databases from varchar to nvarchar to support Unicode. No problems so far with that; it is working fine. But now I need a format file for the customer table and it is not working. It works fine with the old DB with varchar, but with nvarchar I'm not able to copy the data. The biggest problem is that I get no error message: bcp starts copying to the table and finishes without an error message.

This is my table:

CREATE TABLE [dbo].[Customer] (
[ID] [int] NOT NULL ,
[CreationTime] [datetime] NULL ,
[ModificationTime] [datetime] NULL ,
[DiscoveryTime] [datetime] NULL ,
[Name_] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AI NULL ,
[Class] [int] NULL ,
[Subclass] [int] NULL ,
[Capabilities] [int] NULL ,
[SnapshotID] [int] NOT NULL ,
[CompanyName] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AI NOT NULL ,
[TargetRCCountry] [nvarchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AI NOT NULL ,
[LocationID] [int] NULL ,
[MirrorID] [binary] (16) NULL ,
[DeleteFlag] [bit] NULL ,
[AdminStatus] [bit] NULL
) ON [PRIMARY]
GO

and this is the format file:

8.0
13
1 SQLINT 1 12 "#~@~#" 1 ID ""
2 SQLDATETIME 1 24 "#~@~#" 2 CreationTime ""
3 SQLDATETIME 1 24 "#~@~#" 3 ModificationTime ""
4 SQLDATETIME 1 24 "#~@~#" 4 DiscoveryTime ""
5 SQLNCHAR 2 510 "#~@~#" 5 Name_ SQL_Latin1_General_CP1_CI_AS
6 SQLINT 1 12 "#~@~#" 6 Class ""
7 SQLINT 1 12 "#~@~#" 7 Subclass ""
8 SQLINT 1 12 "#~@~#" 8 Capabilities ""
9 SQLINT 1 12 "#~@~#" 9 SnapshotID ""
10 SQLNCHAR 2 510 "#~@~#" 10 CompanyName SQL_Latin1_General_CP1_CI_AS
11 SQLNCHAR 2 510 "#~@~#" 11 TargetRCCountry SQL_Latin1_General_CP1_CI_AS
12 SQLINT 1 12 "#~@~#" 12 LocationID ""
13 SQLBINARY 1 33 "#~@~#\r\n" 13 MirrorID ""

"#~@~#" is the field terminator. We have a lot of text files with all kinds of characters in them, so we think this is a sequence that will never occur in our files.

Thanks for your help!

View 2 Replies View Related

MDS Descriptor Files Associated With Databases Files

Jun 14, 2007

Hi All,



Are there any MDS descriptor files associated with database files? Are they associated with transaction logs? As per my information there are .mdf, .ndf and .ldf files. Could anyone please guide me ASAP, as there is an urgent requirement from one of the clients?



Thanks in Advance

Gaurav Matkar

View 5 Replies View Related






