Compare Fields-two Flat Files-load Data

Jul 13, 2007

Hi All,



I am totally new to SSIS and I'm in the learning phase. I have a requirement as below:

I have two flat files (mainframe files); their structure is given below.

File1:



070113

12345johnk

23456james



The 1st row is a header record containing a date in YYMMDD format; the remaining rows contain emp no and emp name.



File2:

070113

070113

070113

070113



File2 contains 4 records, each of which is a date.



The requirement is to compare the header date in file1 with the 4 dates in file2. If they are equal, all the records in file1 except the header should be loaded into a table; if they do not match, an error message should be logged. Could someone please provide a lead on this?



The files have the same record length and fixed-width fields.
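For reference, a rough T-SQL sketch of the comparison and conditional load, assuming both files have first been staged into work tables (for example with two Flat File Sources or BULK INSERT); every table and column name below is illustrative, not taken from the actual files:

-- Staging tables (illustrative): stg_File1 holds every raw line of file1,
-- stg_File2 holds the four date records of file2.
DECLARE @HeaderDate char(6), @MatchingDates int

-- The header is the only 6-character row in file1 (YYMMDD).
SELECT @HeaderDate = RawLine
FROM dbo.stg_File1
WHERE LEN(RTRIM(RawLine)) = 6

-- Count how many of the four rows in file2 equal that header date.
SELECT @MatchingDates = COUNT(*)
FROM dbo.stg_File2
WHERE RTRIM(DateValue) = @HeaderDate

IF @MatchingDates = 4
BEGIN
    -- Dates agree: load the detail rows (emp no + emp name), skipping the header.
    INSERT INTO dbo.Employee (EmpNo, EmpName)
    SELECT SUBSTRING(RawLine, 1, 5), RTRIM(SUBSTRING(RawLine, 6, 20))
    FROM dbo.stg_File1
    WHERE LEN(RTRIM(RawLine)) > 6
END
ELSE
BEGIN
    -- Dates disagree: log an error message instead of loading.
    INSERT INTO dbo.LoadErrorLog (LoggedAt, ErrorMessage)
    VALUES (GETDATE(), 'Header date in file1 does not match the dates in file2.')
END

In an SSIS-only design the same test could sit in an Execute SQL task whose result drives two precedence constraints: one to the data flow that loads file1, one to a task that logs the error.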



Thanks in advance

raj

View 1 Replies



Integration Services :: Load Data From Flat Files Having Variable Number Of Columns

Jun 23, 2015

I want to load flat files into a single table, but the flat files can have a variable number of columns, up to a maximum of 10. The table in my database has 10 columns, so if I load a flat file having 6 columns, the rest of the columns in the table will be null. I don't want to use a Script task for this as I am not good at writing C# code.

View 5 Replies View Related

How To Load Dynamic Multiple Flat Files

Apr 23, 2008



Hi everyone,

I have multiple flat files in a source folder (they have a naming convention with today's date, e.g. Flatfile_20082204_1, Flatfile_20082204_2, Flatfile_20082204_3).
I need to extract each file dynamically, transform the flat file, then load it into the destination folder with a standard prefix, today's date, and a sequence letter, e.g. Flatfile_20082304_A, Flatfile_20082304_B, Flatfile_20082304_C.


Please HELP Me
Thanks In Advance.

View 20 Replies View Related

Integration Services :: Load Flat Files From S3 Into 2008 R2?

May 12, 2014

We have a few customers dropping files in Amazon S3. How do we load this data into a SQL Server 2008 R2 database using SSIS? We are in a 2008 R2 BIDS environment.

View 5 Replies View Related

Integration Services :: Load Multiple Flat Files Into Destination Dynamically?

May 29, 2015

How do you load multiple flat files into a destination dynamically?

View 9 Replies View Related

How To Load SSAS Cube Partitions With Dynamic Name Directly From Flat Files

Feb 1, 2008

HI,

The source data of the cube comes from MySQL, and the source data volume is more than 80M rows per month, so I have to build multiple partitions in the cube, each containing only one month of data. Even so, the time to load data directly from MySQL is still too long, and because the MySQL .NET provider is not mature enough, the connection often breaks while loading, so I have to try loading from flat files exported from MySQL.
The Partition Processing Destination seems to support this approach, but even though it shows the partition name in the component editor, it actually processes the partition by partition ID, and there is no way to change the destination partition name for this component.
Unless I change the SSIS package every month, it looks like it is impossible to make a smart ETL program that can dynamically create a new partition with the ID and name as YYYYMM every month and load data directly from the flat CSV file exported from the big MySQL table.


Does anyone know how to load data from flat text files while also supporting a dynamic destination partition name?

I also found a bug in the Partition Processing Destination, even with 2005 SP2: if there is more than one cube in an SSAS database and two partitions in different cubes have the same name, you cannot map to the second one in the component editor. Even if you point to the second one and click OK, the next time you open the editor it highlights the first one. And although it shows the name in the editor, it actually processes by partition ID instead of partition name, which makes it impossible to force the component to process a given partition by renaming the target partition to a constant name, say CurrentMonth.

Thanks.

Jun

View 4 Replies View Related

Processing CSV Flat Files With Records Of Varying Numbers Of Fields

Mar 10, 2008

Hello, I have some flat files that contain CSV records with different numbers of fields, but the first 4 fields of each record type are the same. For example, one record might have eight fields and another 6 fields. Which of the items in the toolbox can I use to filter the records based on the entries in the first 4 fields, so I can process the filtered records?

Thank you
Kenalex

View 5 Replies View Related

T-SQL (SS2K8) :: Load Data From Flat File Source Into OleDB Destination By Changing Data Types In SSIS

Apr 16, 2014

I have a source file and I have to load it into the database, changing the data types of the columns in SSIS.

View 1 Replies View Related

Integration Services :: Data Load From Flat File To Database

Sep 22, 2015

I am having difficulties loading data from a flat file to a SQL Database. I am able to load some data but the rest gets kicked out for the following reasons:

1 – The field is varchar 50 and I would like to convert it to a date field
2 – The field contains periods (.) (only 1 period in each row)
3 – The field contains blanks (NULLs)

How do I create a derived column that will bypass blanks (NULLs) and remove periods (.) in each row, then convert the column to a date field in SSIS? I am looking for steps to create a derived date column using SSIS (Derived Column task), convert it to a date column (e.g. 09-19-2015), and use functions to redirect the NULLs and remove the periods (.).

Sample Data
Column 3 (Varchar 50) - needs to be converted to date; remove periods and bypass nulls (blanks)
Blank
.
Blank
.
Blank
Blank
.
01-19-2015
01-19-2015
Blank
.
Blank
.
Blank
01-19-2015
.
Blank
.
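If the cleanup can be done in T-SQL after the file is staged (rather than in a Derived Column), one way to express the same rules - treat blanks and lone periods as NULL, convert everything else from mm-dd-yyyy - is sketched below; it assumes SQL Server 2012 or later for TRY_CONVERT, and the table name is a placeholder:

-- Blank values, and values that collapse to nothing once periods are removed, become NULL;
-- everything else is converted with style 110 (mm-dd-yyyy).
SELECT TRY_CONVERT(date,
                   NULLIF(REPLACE(LTRIM(RTRIM(Column3)), '.', ''), ''),
                   110) AS Column3_AsDate
FROM dbo.StagingTable

A Derived Column expression can follow the same pattern: test for the blank/period cases with the conditional (? :) operator and cast the remaining values to DT_DBDATE.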

View 4 Replies View Related

Using SSIS Package To Dynamically Load Data From Database Into Three Separate Flat File

Jul 24, 2015

I have three tables in data base:

customer
product
sales

And I want to use an SSIS package to dynamically load data from the database into three separate flat files, one table per file.

I know I have to use a Foreach Loop task with the ADO.NET schema rowset enumerator and an OLE DB connection manager, selecting "table name or view name variable" from the data access mode list. But here is the problem: because the table name is dynamic, the flat file connection must also be dynamic. I am using Visual Studio 2013...

View 5 Replies View Related

Integration Services :: Working On SSIS To Load Data From Flat File To Server

Jun 17, 2015

I'm working on SSIS to load data from a flat file to SQL Server. I'm getting the date in the format below, but in SQL Server I have given the column the data type datetime. How do I convert the format below to 16-01-15 12.05.19.1234 AM?

View 4 Replies View Related

SSIS Data Flow Task Fails To Load All Text From Flat File

Jan 2, 2007

Hi Guys,

I have a flat file which is loaded into the database on a daily basis. The file contains rows of strings which I load into a table, specifically to a column of length 8000.

The string has a length of 690, but the format is like 'xxxxxx xx xx..' and so on, where 'xxxx' represents data. So there are spaces, etc. present in the middle.

Previously I used SQL 2000 DTS to load the files in, and it was just a Column Transformation with Col001 from the text file loading straight to my table column. After the load, if I select len(col) it gives me 750 for all rows.

Once I started to migrate this to SSIS, I created the Data Flow Task, specified the flat file source and the OLE DB destination, and gave the output column a type of String and an output column width of 8000. But when I run the data flow task it copies only 181 or 231 characters out of the 750 required. I feel it stops where it finds the spaces and skips the rest.

I specified row delimiters of CR and LF. I checked the file under UltraEdit and there were no special characters in the file that would cause the problem.

Any suggestions how I can get it to load the full data?

Thanks

View 26 Replies View Related

Best Way To Load Data From Text Files

Jan 22, 2006

Hi,
I have a problem I'm hoping someone can give me some pointers on.

I need to load data from several text files into one table. The format of the files is simple - each line is comma separated, with double quotes around each element, e.g.

"parameter 1","value 1","parameter 2","value 2"....
"parameter 12","value 12","parameter 13","value 13"...

However, the files themselves will have different numbers of columns, e.g. file 1 may have 8 columns, file 2 may have 16 columns.

I'm going to load the data into a table that has at least as many columns as the longest file. The table columns are all varchar, and are named simply as [Col001] [Col002] [Col003] etc...

The first two columns of this table must be left empty during the load (I use these later on), so the data entry will start at [Col003].

My question is what is the best way to do this? I thought perhaps using a BULK INSERT in a stored procedure might do the trick, but I haven't used it before and haven't got very far. I gather another approach might be to use the bcp utility. Someone has also suggested a DTS package, but the filenames will be suffixed with a current date/time stamp, so I don't think that will work.

My preferred approach would be BULK INSERT, but I'm open to any pointers.
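For what it's worth, a minimal BULK INSERT sketch of that approach; the file path, table, and format file names are hypothetical, and a format file is what lets you strip the surrounding double quotes and map the file's fields to Col003 onward while leaving Col001 and Col002 alone:

-- Load one quoted, comma-separated file into the staging table via a format file.
BULK INSERT dbo.ImportStage
FROM 'C:\imports\params_20060122.txt'
WITH
(
    FORMATFILE = 'C:\imports\params.fmt'  -- declares the quoted-field terminators and skips Col001/Col002
)

Because the file name changes with each date/time stamp, the statement would normally be built as a string and run with EXEC, once per file.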

Many Thanks
Greg

View 2 Replies View Related

SQL Server 2012 :: Unable To Load Data From Flat File To Table Using Bulk Insert Statement

Apr 3, 2015

I am unable to load data from a flat file to a SQL table using a BULK INSERT statement.

My code:-

DECLARE @filePath VARCHAR(200)
DECLARE @sql VARCHAR(8000)
Declare @filename varchar(100)
set @filename='CCNVZ_150401054418'
SET @filePath = 'I:\IncomingFiles\' + @FileName + '.txt'

[Code] .....
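The rest of the script is not shown above, but a dynamic BULK INSERT of this shape is usually finished along these lines (the staging table name here is made up):

-- Build the statement as a string so the file name can vary, then execute it.
SET @sql = 'BULK INSERT dbo.CCNVZ_Staging FROM ''' + @filePath + ''' WITH (
                FIELDTERMINATOR = '','',
                ROWTERMINATOR   = ''\n''
            )'
EXEC (@sql)

If this still fails, the usual suspects are the SQL Server service account lacking rights to the I: drive, or the terminators not matching the file.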

View 1 Replies View Related

Store Procedure To Load Data From Flat File To Staging Table Dynamically - Column Metadata

Apr 9, 2015

I have a stored procedure which is used to load data from a flat file into a staging table dynamically.

Everything is working fine. The staging_temp table has a single column, and all the data is stored in that single column. Below are sample rows.

AB¯ALBERTA ¯93¯AI
AI¯ALBERTA INDIRECT ¯94¯AI
AL¯ALABAMA ¯30¯

After that, the staging_temp data gets inserted into the main table. My problem is handling a file where the number of columns is greater than in the actual table.

If you look at the sample rows, there are 4 columns separated by "¯", but I actually have only 3 columns in my main table. So how can I get only the first 3 columns from the staging_temp table?

Output should be like below.

AB¯ALBERTA ¯93
AI¯ALBERTA INDIRECT ¯94
AL¯ALABAMA ¯30

How can I achieve the above scenario?
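One way to do it in T-SQL, assuming the single column in staging_temp is called RawData (the real column name may differ): find the position of the third "¯" and keep everything before it.

SELECT CASE
           WHEN RawData LIKE '%¯%¯%¯%'      -- at least 3 delimiters, i.e. a 4th column exists
           THEN LEFT(RawData,
                     CHARINDEX('¯', RawData,
                         CHARINDEX('¯', RawData,
                             CHARINDEX('¯', RawData) + 1) + 1) - 1)
           ELSE RawData                      -- already 3 columns or fewer: keep the row as is
       END AS FirstThreeColumns
FROM dbo.staging_temp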

View 1 Replies View Related

How To Move The Bad Data Into Another Directory While Looping Through A Set Of FLAT FILES ?

Sep 1, 2007

I am currently looping through a set of flat files like CHK0604, CHK0611, CHK0618, and CHK0625 from the source folder C:\SOURCE.



OBJECTIVE: if any records/rows within a flat file cause an error, I have to move the bad data into a separate folder, C:\ERROR.



STEPS TAKEN

1) In the FOREACH LOOP component I specified the variable User::sourceFilePath for my source files (CHK0604 etc.) in C:\SOURCE. The loop walks through each file in C:\SOURCE and, if there is no error, moves the flat file into another folder, C:\ARCHIVED. This task is working perfectly.



2) Within the data flow I am diverting the bad rows from the conditional split component into a Flat File Destination component.



3) "Flat File Destination" Connection manager i set the expressions as @[User:: sourceFilePath] +"_Error.TXT".



ISSUE

Because of point (3), the error file is created in the SOURCE flat file location, C:\SOURCE.



QUESTION



1) My error file names should be CHK0604_Error, CHK0611_Error, CHK0618_Error, CHK0625_Error, created in another folder, C:\ERROR.



2) How do I move the bad data into another directory while looping through a set of flat files?

3) If I have to create another variable like @[User::ErrorFilePath], where do I create it? How do I use the source file name as the name of the error file?

Thanks for the help

View 4 Replies View Related

Problem In Maintaining Data Precision During Import From Flat Files.

Nov 19, 2007



In my scenario I have about a dozen flat files (text files) that I have to import into SQL Server 2005.
I am using the Flat File connection manager to carry out the task. The flat files contain data generated from Oracle.
When I import data from these text files into SQL Server, the main problem lies in converting the number(p,s) data type columns of Oracle (in the text file) to the numeric(p,s) data type of SQL Server 2005.

The number(p,s) data type loses all its digits after the decimal point (they become zero) during the import process.
For example

1.2434234390 (from the text file, number(p,s) type of Oracle) converts to 12.0000000000 (numeric(p,s)) in SQL Server 2005.

Is there any workaround to this problem? I urgently need help.



View 7 Replies View Related

Doing A Data Import Using DTS Wizard In SQL Server 2005 - Being Efficient With 5 Flat Files

Apr 13, 2006

Hi,

I'm a new user of SQL Server 2005. I have the full version installed. I also have SQL Server Business Integration Dev Studio installed. My OS is Windows XP.

I'm importing a series of 5 flat files into a database on one of the SQL Servers we have. My goal is to get 5 different tables (though perhaps I should do one and add an extra field to distinguish each import) into the database for further analysis.

I tried doing an import via the DTS Wizard. There are no column names in the flat file so I defined them during the import process (all 58 of them). When I got to the end, I had an option to save the import process as an SSIS (SQL Server Integration Services) package on:

SQL SERVER (I don't have permission for this)

or

FILE SYSTEM (did this one)

I saved the Package locally in hopes of being able to go back in, change the source file and destination table of the package and quickly get the other 4 flat files imported.

My problems are:

1) I couldn't find how to run the *.DTSX package file in SQL Server Management Studio (basically reusing the package with minor changes, saving me from having to redefine the same 58 columns on each flat file import)

2) Tried but didn't understand how to run it in SQL Server Bus Intel Dev Studio (i.e. understanding the mapping and getting the data types right so it wouldn't error out)

3) Don't know how to make the necessary changes so that the Package handles the next source file and puts in a new destination table (do I need to do 5 CREATE TABLES so this Package has a place to run to?)

4) Does the Package need to be part of a Project to run (I haven't found how to take an existing Package and make it part of a Project/Solution)?

5) Is there a good book or online resource for just getting the basics of using SQL Server 2005 and SQL Server Business Intelligence Development Studio?

I'm really at a loss after spending a day fruitlessly on it scouring the help files, forums and experimenting around.

Hope somebody can point me in the right direction.

Regards,

Patrick Briggs,
Pasadena, CA


View 7 Replies View Related

Doing A Data Import Using DTS Wizard In SQL Server 2005 - Being Efficient With 5 Flat Files

Apr 18, 2006

I just spent some time working out how to do a seemingly simple task. I'm sharing the steps I took to do this in hopes it saves other SQL Server 2005 users (especially newbies like myself) time.

My original question posed on several SQL newsgroups was based on this goal:


I'm importing a series of 5 flat files (all with same file layout) into a database on one of the SQL Servers we have using SQL Server 2005 (SQL Server Management Studio) . My goal is to get 5 different tables. I want to do this without having to redo all the layout criteria 4 additional times.

Below are the steps I followed to get a solution (all done in Microsoft SQL Server Management Studio):

Create the Package (data import)

1) Use the SQL Server Import Export Wizard (equivalent to SQL Server 2000 Data Transfer Wizard) to import your first flat file. At the CHOOSE DATA SOURCE window browse for your file.
2) Under the Advanced tab, you can set your Column attributes ("output column width" or "data type" to name a few). I highlighted all the columns and selected "string [DT_STR]" for data type. To avoid truncation errors, I selected 255 for output column width. You can name the columns whose data you are most concerned with (I did import all the available fields).
3) After choosing a server destination you will have a "SELECT SOURCE TABLES AND VIEWS" window pop up. Under the "Mapping" column you can choose to tweak your mapping further by editing the SQL (see the Edit SQL button). I didn't.
4) The "SAVE AND EXECUTE PACKAGE" window will pop up. The "Execute Immediately" box should be checked and you should check "Save SSIS Package" (SQL Server Integration Services). When you do, select "File System" for where to save this import-file-package to.
5) Click OKAY for the Package Protection Level and the "SAVE SSIS PACKAGE" window will appear. Browse for a path on your local computer to save to.

Modify Package (data import) for Next Use

6) In SQL Server Management Studio, browse for the Package and open it.

Preparation for SQL Task – box

7) You should see a screen that shows two boxes ("Preparation for SQL Task") and ("Data Flow Task").
8) Right click on the former and select "Edit".
9) On the "SQL Statement" row, click into the right column and select the "…" box
10) Change the destination table (the table you will create with this package) to a meaningful name and click OK.
11) Click OK for the "SQL Task Editor"

Data Flow Task - box

12) Right click on the "Data Flow Task" box and select "Edit".
13) Three boxes will appear: "SourceConnectionFlatFile", "Data Conversion 1", and "Destination - <whatever table name your original data import went to>". Below them is a section that displays "Connection Managers".

SourceConnectionFlatFile - editing

14) The first thing you will want to do is change the import source to a new flat file. You do this by going below the boxes under the "Connection Managers" window, right clicking on "SourceConnectionFlatFile", and then selecting "Edit".
15) Browse for the new "File Name" and select it.
16) A "Microsoft SQL Server Management Studio" window will pop up asking you if you want to "keep or reset the existing metadata". The metadata is just your column definitions, and choosing "YES" to keep this makes sense if you are doing data imports on files with the same file layout.
17) Still in the "Flat File Connection Manager Editor" window, change the "Connection Manager Name" to something meaningful (I add <_> at the end and then the name of the table the flat file is going to) and click OK.

SourceConnectionFlatFile – box (editing)

18) Right click on the "SourceConnectionFlatFile" box and select "Edit".
19) Your newly named "Flat File Connection Manager" should appear in the select box.
20) Click OK, right click again on the "SourceConnectionFlatFile" box and select "Show Advanced Editor".
21) Under the "Connections Manager" tab, your newly named "Flat File Connection" should appear (the prior step is necessary for the advanced editor to recognize your change).
22) Under the "Component Properties" tab, on the "Name" row, click into the right column and rename to something meaningful (notice the "Identification String" row description changes too once you click out of the "Name" row).
23) Under the "Column Mappings" tab, just confirm you are mapping your flat file fields ("Available External Columns") to the destination table's fields ("Available Output Columns").
24) Under the "Input and Output Properties" tab you can check in "Flat File Source Output" to make modifications to either your "External Columns" or your "Output Columns" – you shouldn't need to for a simple import.
(NOTE: any changes you make here would likely need to be consistent with the column properties found under the "Connection Manager Window" for the "SourceConnectionFlatFile" as well as the "Data Conversion 1" box under the "Data Flow Tasks" window, so exercise caution.)
25) NOTE: This process has worked for me by making my source columns all "string [DT_STR]" data type and the output columns all "Unicode String [DT_WSTR]" data type.

Data Conversion 1 – box (editing)

26) There is nothing you need to do here. By right clicking on the "Data Conversion 1" box and selecting "Edit", you can see and change the data type of the output columns (the ones in the table you're importing the flat file to). There are probably more edits one can do but they're beyond what I've learned.

Destination - <whatever table name your original data import went to> – box (editing)

27) Right click on the "Destination - <whatever table name your original data import went to>" box and select "Show Advanced Editor".
28) Select the "Component Properties" tab.
29) Select the right column at the "Name" row and change the name to something meaningful (ie. related to the source file name or the table name you're importing to).
30) Select the right column at the "Identification String" row and it will update to this change.
31) Select the right column at "OpenRowSet" and change it to the name of the table you are importing your flat file to (this should be consistent with the table name under step 10).
32) Click OK
33) Select FILE and select "Save As…" and then give your package a new name that's meaningful (this will be helpful if you have to rerun the import of the flat file later).

Run (execute) the Revised Package (data import)

34) Go back to SQL Server Management Studio and open the Object Explorer
35) Connect to an "Integration Services" component. This should essentially be a local instance (not sure where it is on the local computer or in SQL Server Management Studio on the local computer).
36) In "Object Explorer" go down to your "Integration Services" object and expand it.
37) Expand "Stored Packages"
38) Right click on "File System" and select "Import Package" and an "IMPORT PACKAGE" window will appear
39) For "Package Location" choose "File System" and then browse for the "Package Path"
40) Click into the "Package Name" and it defaults to your Package's file name.
41) Click OK and the Package is imported.
42) Right click on the newly imported Package and select "Run Package"
43) An "Execute Package Utility" window appears
44) Select "Execute" and the package runs.

View 1 Replies View Related

Script Task: How To Compare Files On FTP With Existing Files In Local Folder Before Transfer!

Apr 24, 2008

In the first step of my SSIS package I need to get files from FTP and dump it/them in a local directory, but it's more than that, the logic is like this:
1. If no file(s) found, stop executing and send email saying no file(s) found;
2. If file(s) found, then compare it/them with existing files in our archive folder; if file(s) already exist in archive folder, stop executing and send email saying file(s) already existed, if file(s) not in archive folder yet, then transfer it/them to the local directory for processing.

I know I have to use a Script task to do this, and I did some research and found examples for each of the above 2 steps but not both combined, so that's why I need some help here to get the logic incorporated right.

Thanks for the help in advance and i apologize for the long lines of code!

example for step 1:
----------------------------------------------------------------------------------------------------------

' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
Imports Microsoft.VisualBasic.FileIO.FileSystem
Imports System.IO.FileSystemInfo

Public Class ScriptMain

' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.

Public Sub Main()

Dim cDataFileName As String
Dim cFileType As String
Dim cFileFlgVar As String
WriteVariable("SCFileFlg", False)
WriteVariable("OOFileFlg", False)
WriteVariable("INFileFlg", False)
WriteVariable("IAFileFlg", False)
WriteVariable("RCFileFlg", False)
cDataFileName = ReadVariable("DataFileName").ToString
cFileType = Left(Right(cDataFileName, 4), 2)
cFileFlgVar = cFileType.ToUpper + "FileFlg"
WriteVariable(cFileFlgVar, True)
Dts.TaskResult = Dts.Results.Success
End Sub
Private Sub WriteVariable(ByVal varName As String, ByVal varValue As Object)
Try
Dim vars As Variables
Dts.VariableDispenser.LockForWrite(varName)
Dts.VariableDispenser.GetVariables(vars)
Try
vars(varName).Value = varValue
Catch ex As Exception
Throw ex
Finally
vars.Unlock()
End Try
Catch ex As Exception
Throw ex
End Try
End Sub
Private Function ReadVariable(ByVal varName As String) As Object
Dim result As Object
Try
Dim vars As Variables
Dts.VariableDispenser.LockForRead(varName)
Dts.VariableDispenser.GetVariables(vars)
Try
result = vars(varName).Value
Catch ex As Exception
Throw ex
Finally
vars.Unlock()
End Try
Catch ex As Exception
Throw ex
End Try
Return result
End Function
End Class

example for step 2:
-------------------------------------------------------------------------------------------------------

' Microsoft SQL Server Integration Services Script Task

' Write scripts using Microsoft Visual Basic

' The ScriptMain class is the entry point of the Script Task.

Imports System

Imports System.Data

Imports System.Math

Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain

' The execution engine calls this method when the task executes.

' To access the object model, use the Dts object. Connections, variables, events,

' and logging features are available as static members of the Dts class.

' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.

'

' To open Code and Text Editor Help, press F1.

' To open Object Browser, press Ctrl+Alt+J.

Public Sub Main()

Try

'Create the connection to the ftp server

Dim cm As ConnectionManager = Dts.Connections.Add("FTP")

'Set the properties like username & password

cm.Properties("ServerName").SetValue(cm, "ftp.name.com")

cm.Properties("ServerUserName").SetValue(cm, "username")

cm.Properties("ServerPassword").SetValue(cm, "password")

cm.Properties("ServerPort").SetValue(cm, "21")

cm.Properties("Timeout").SetValue(cm, "0") 'The 0 setting will make it not timeout

cm.Properties("ChunkSize").SetValue(cm, "1000") '1000 kb

cm.Properties("Retries").SetValue(cm, "1")

'create the FTP object that sends the files and pass it the connection created above.

Dim ftp As FtpClientConnection = New FtpClientConnection(cm.AcquireConnection(Nothing))

'Connects to the ftp server

ftp.Connect()

'ftp.SetWorkingDirectory("..")

ftp.SetWorkingDirectory("directoryname")

Dim folderNames() As String

Dim fileNames() As String

ftp.GetListing(folderNames, fileNames)

Dim maxname As String = ""

For Each filename As String In fileNames

' whatever operation you need to do to find the correct file...

Next

Dim files(0) As String

files(0) = maxname

ftp.ReceiveFiles(files, "C: emp", True, True)

' Close the ftp connection

ftp.Close()





'Set the filename you retrieve for use in data flow

Dts.Variables.Item("FILENAME").Value = maxname

Catch ex As Exception

Dts.TaskResult = Dts.Results.Failure

End Try

Dts.TaskResult = Dts.Results.Success

End Sub

End Class

View 16 Replies View Related

SQL- How To Compare Max Date Across Fields

Jan 4, 2006

Hello, I am wondering how to select the max date from a database view containing multiple date fields for a particular record. I am rusty at SQL and can't seem to get started. Here's an example of the data:

PRCL_ID   PROJ_STRT_DT   ORD_DT ...
124       2/3/2006       3/5/2006
520       6/3/2006       8/2/2006
64        12/31/2005
187       2/14/2006      3/16/2006

I need to be able to compare the max date for each PRCL_ID record ("where prcl_id="). In the example above, prcl_id 124 has a max date of 3/5/2006, and prcl_id 520 has a max date of 8/2/2006, etc. The results of the SQL query should return the PRCL_ID and the max date.

Example query results:

PRCL_ID ORD_DT
124 3/5/2006
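One version-independent way to take the greater of the two dates row by row is a CASE expression; the view name below is a placeholder:

-- Returns the PRCL_ID and whichever of the two dates is later (ORD_DT may be NULL).
SELECT PRCL_ID,
       CASE WHEN ORD_DT IS NULL OR PROJ_STRT_DT > ORD_DT
            THEN PROJ_STRT_DT
            ELSE ORD_DT
       END AS MaxDate
FROM dbo.ParcelView
WHERE PRCL_ID = 124

For PRCL_ID 124 this returns 3/5/2006, and for PRCL_ID 64 (which has no ORD_DT) it falls back to 12/31/2005.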

Please let me know if you can help or if I can provide more info.

Thanks,
Alexkav

View 6 Replies View Related

Compare Fields In 2 Tables

Jan 30, 2008

For example, I have 2 tables that have very similar fields

Table 1 Table 2
ShipRpt ShipRpt
invoice invoice
total sold total sold

Is it possible to compare each of these fields and return any record that has non matching data? I would join the tables on the shipping report or invoice and only compare the total sold field. If the invoices match and the total sold is different in each table, can I just return the 1 record?
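An inner join on the invoice plus a filter on the differing column is one way to get exactly that one record. A sketch, using the column names from the layout above (adjust to the real schema):

-- One row per invoice that exists in both tables but has a different total sold.
SELECT t1.ShipRpt,
       t1.invoice,
       t1.[total sold] AS TotalSold_Table1,
       t2.[total sold] AS TotalSold_Table2
FROM dbo.Table1 AS t1
INNER JOIN dbo.Table2 AS t2
        ON t1.invoice = t2.invoice
WHERE t1.[total sold] <> t2.[total sold]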

View 2 Replies View Related

To Compare Fields Between Two Table

Jul 23, 2005

Hello, I would like to create a stored procedure that would compare the fields of two tables and their types. If they are different, the user is warned. How can I do that? Thanks.
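One possible shape for such a procedure is a FULL OUTER JOIN over INFORMATION_SCHEMA.COLUMNS: any row returned is either a column missing from one table or a column whose type differs, and the procedure can raise a warning when the result is non-empty. The table names here are placeholders:

DECLARE @t1 sysname, @t2 sysname
SET @t1 = 'Table1'
SET @t2 = 'Table2'

-- Columns present in only one table, or present in both with different data types.
SELECT COALESCE(a.COLUMN_NAME, b.COLUMN_NAME) AS ColumnName,
       a.DATA_TYPE AS TypeInTable1,
       b.DATA_TYPE AS TypeInTable2
FROM (SELECT COLUMN_NAME, DATA_TYPE
      FROM INFORMATION_SCHEMA.COLUMNS
      WHERE TABLE_NAME = @t1) AS a
FULL OUTER JOIN
     (SELECT COLUMN_NAME, DATA_TYPE
      FROM INFORMATION_SCHEMA.COLUMNS
      WHERE TABLE_NAME = @t2) AS b
  ON a.COLUMN_NAME = b.COLUMN_NAME
WHERE a.COLUMN_NAME IS NULL
   OR b.COLUMN_NAME IS NULL
   OR a.DATA_TYPE <> b.DATA_TYPE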

View 3 Replies View Related

Compare Fields From Two Datasets

Jan 9, 2007

Hello,
How can I solve this problem?
For example, I have two databases, db1 and db2.
I want to compare two float fields from tables in db1 and db2 and then present the difference between these fields in a column in Reporting Services.

ex. loop1
value from db1.table_db1
4

value from db2.table_db2
6

diff
2

ex. loop2
value from db1.table_db1
2

value from db2.table_db2
2

diff
0

..continues...



Should I create 2 datasets and 2 datasources? Must I write custom code for this?
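If both databases sit on the same server (or are reachable through a linked server), a single cross-database query can return the difference directly, so the report needs only one data source, one dataset, and no custom code. All names below are placeholders:

-- Joins the two tables on a shared key and computes the difference per row.
SELECT a.KeyColumn,
       a.FloatValue                AS ValueDb1,
       b.FloatValue                AS ValueDb2,
       b.FloatValue - a.FloatValue AS Diff
FROM db1.dbo.table_db1 AS a
INNER JOIN db2.dbo.table_db2 AS b
        ON a.KeyColumn = b.KeyColumn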

Regards






View 2 Replies View Related

Extract And Load In Flat File

Nov 6, 2007



Hi,

I have a data flow task which has 3 OLE DB source objects, each connected to a Data Conversion object, and these are connected to a Union All and finally to a Flat File Destination.

The purpose of this one is to extract data and pump them to the flat file.

If I run this in production while users are doing transactional processing (typical add, edit, delete), will it have an impact?

cherriesh

View 1 Replies View Related

Cannot Compare NVarChar(Max) Fields Over 4000 Chars

Oct 24, 2006

I had a post a week or so ago with this issue. I found the cause, but cannot figure out how to fix it... The problem is the comparison on the NVarChar(Max) fields when one of them exceeds 4000 chars. If I comment out the following lines of the stored proc, everything works. I tested without using COALESCE and it still does not work.

COALESCE(Comments, '') = COALESCE(@o_Comments, '') AND
COALESCE(SpecialNotes, '') = COALESCE(@o_SpecialNotes, '') AND
COALESCE(IAppComments, '') = COALESCE(@o_IAppComments, '') AND
COALESCE(MgmtNotes, '') = COALESCE(@o_MgmtNotes, '') AND

set ANSI_NULLS ON
set QUOTED_IDENTIFIER ON
go

ALTER PROCEDURE [dbo].[pe_updateAppraisal]
-- Add the parameters for the stored procedure here
@OrderId INT,
@FileNumber NVarChar(25),
@OrderDate DateTime,
@ClientID NVarChar(25),
@ClientFileNumber NVarChar(25),
@PropertyTypeID INT,
@EstimatedValue money,
@PurchaseValue money,
@LoanOfficer NVarChar(50),
@ReportID INT,
@ReportFee money,
@FeeBillInd bit,
@FeeCollectInd bit,
@CollectAmt money,
@Borrower NVarChar(50),
@StreetAddrA NVarChar(50),
@StreetAddrB NVarChar(50),
@City NVarChar(50),
@CountyID INT,
@StateID INT,
@Zip NVarChar(50),
@ContactName NVarChar(50),
@PhoneA NVarChar(50),
@PhoneB NVarChar(50),
@PhoneC NVarChar(50),
@ApptDate DateTime,
@ApptTime NVarChar(25),
@AppraiserID INT,
@InspectionDate DateTime,
@DateMailed DateTime,
@TrackingInfo NVarChar(50),
@ReviewedBy INT,
@PreNotesID INT,
@PostNotesID INT,
@StatusID INT,
@Comments NVarChar(MAX),
@SpecialNotes NVarChar(MAX),
@MgmtName NVarChar(50),
@MgmtContactName NVarChar(50),
@MgmtAddress NVarChar(50),
@MgmtPhone NVarChar(50),
@MgmtFax NVarChar(50),
@MgmtFee money,
@CheckNumber NVarChar(25),
@MgmtNotes NVarChar(MAX),
@SentAppraiser DateTime,
@InfoReceived DateTime,
@CheckReceived DateTime,
@CheckMailed DateTime,
@INumFamilies NVarChar(10),
@IStyle NVarChar(15),
@IUnit NVarChar(15),
@IConstruction NVarChar(15),
@IBasement NVarChar(10),
@IBFinished NVarChar(10),
@IGarage NVarChar(10),
@INumCars NVarChar(2),
@IGarageType NVarChar(10),
@IContactHas NVarChar(10),
@IAvailable NVarChar(10),
@IInformedAmt NVarChar(5),
@IRealtorContract NVarChar(10),
@IContractContact NVarChar(50),
@IPermitCO NVarChar(10),
@ICORenewal NVarChar(10),
@IRenewalInt NVarChar(15),
@IAppComments NVarChar(MAX),
@IKitchen1 NVarChar(5),
@IKitchen2 NVarChar(5),
@IKitchen3 NVarChar(5),
@IKitchen4 NVarChar(5),
@IKitchenB NVarChar(5),
@IBedroom1 NVarChar(5),
@IBedroom2 NVarChar(5),
@IBedroom3 NVarChar(5),
@IBedroom4 NVarChar(5),
@IBedroomB NVarChar(5),
@IBathroom1 NVarChar(5),
@IBathroom2 NVarChar(5),
@IBathroom3 NVarChar(5),
@IBathroom4 NVarChar(5),
@IBathroomB NVarChar(5),
@AppraiserPerc NVarChar(6),
@AppraiserFee money,

@o_OrderId INT,
@o_FileNumber NVarChar(25),
@o_OrderDate DateTime,
@o_ClientID NVarChar(25),
@o_ClientFileNumber NVarChar(25),
@o_PropertyTypeID INT,
@o_EstimatedValue money,
@o_PurchaseValue money,
@o_LoanOfficer NVarChar(50),
@o_ReportID INT,
@o_ReportFee money,
@o_FeeBillInd bit,
@o_FeeCollectInd bit,
@o_CollectAmt money,
@o_Borrower NVarChar(50),
@o_StreetAddrA NVarChar(50),
@o_StreetAddrB NVarChar(50),
@o_City NVarChar(50),
@o_CountyID INT,
@o_StateID INT,
@o_Zip NVarChar(50),
@o_ContactName NVarChar(50),
@o_PhoneA NVarChar(50),
@o_PhoneB NVarChar(50),
@o_PhoneC NVarChar(50),
@o_ApptDate DateTime,
@o_ApptTime NVarChar(25),
@o_AppraiserID INT,
@o_InspectionDate DateTime,
@o_DateMailed DateTime,
@o_TrackingInfo NVarChar(50),
@o_ReviewedBy INT,
@o_PreNotesID INT,
@o_PostNotesID INT,
@o_StatusID INT,
@o_Comments NVarChar(MAX),
@o_SpecialNotes NVarChar(MAX),
@o_MgmtName NVarChar(50),
@o_MgmtContactName NVarChar(50),
@o_MgmtAddress NVarChar(50),
@o_MgmtPhone NVarChar(50),
@o_MgmtFax NVarChar(50),
@o_MgmtFee money,
@o_CheckNumber NVarChar(25),
@o_MgmtNotes NVarChar(MAX),
@o_SentAppraiser DateTime,
@o_InfoReceived DateTime,
@o_CheckReceived DateTime,
@o_CheckMailed DateTime,
@o_INumFamilies NVarChar(10),
@o_IStyle NVarChar(15),
@o_IUnit NVarChar(15),
@o_IConstruction NVarChar(15),
@o_IBasement NVarChar(10),
@o_IBFinished NVarChar(10),
@o_IGarage NVarChar(10),
@o_INumCars NVarChar(2),
@o_IGarageType NVarChar(10),
@o_IContactHas NVarChar(10),
@o_IAvailable NVarChar(10),
@o_IInformedAmt NVarChar(5),
@o_IRealtorContract NVarChar(10),
@o_IContractContact NVarChar(50),
@o_IPermitCO NVarChar(10),
@o_ICORenewal NVarChar(10),
@o_IRenewalInt NVarChar(15),
@o_IAppComments NVarChar(MAX),
@o_IKitchen1 NVarChar(5),
@o_IKitchen2 NVarChar(5),
@o_IKitchen3 NVarChar(5),
@o_IKitchen4 NVarChar(5),
@o_IKitchenB NVarChar(5),
@o_IBedroom1 NVarChar(5),
@o_IBedroom2 NVarChar(5),
@o_IBedroom3 NVarChar(5),
@o_IBedroom4 NVarChar(5),
@o_IBedroomB NVarChar(5),
@o_IBathroom1 NVarChar(5),
@o_IBathroom2 NVarChar(5),
@o_IBathroom3 NVarChar(5),
@o_IBathroom4 NVarChar(5),
@o_IBathroomB NVarChar(5),
@o_AppraiserPerc NVarChar(6),
@o_AppraiserFee money
AS
BEGIN

UPDATE Orders
SET FileNumber = @FileNumber,
OrderDate = @OrderDate, ClientID = @ClientID,
ClientFileNumber = @ClientFileNumber, PropertyTypeID = @PropertyTypeID,
EstimatedValue = @EstimatedValue, PurchaseValue = @PurchaseValue,
LoanOfficer = @LoanOfficer, ReportFee = @ReportFee,
FeeBillInd = @FeeBillInd, FeeCollectInd = @FeeCollectInd,
CollectAmt = @CollectAmt, Borrower = @Borrower,
StreetAddrA = @StreetAddrA, StreetAddrB = @StreetAddrB,
City = @City, CountyID = @CountyID, StateID = @StateID, Zip = @Zip,
ContactName = @ContactName, PhoneA = @PhoneA, PhoneB = @PhoneB,
PhoneC = @PhoneC, ApptDate = @ApptDate, ReportID = @ReportID,
ApptTime = @ApptTime, AppraiserID = @AppraiserID,
InspectionDate = @InspectionDate, DateMailed = @DateMailed,
TrackingInfo = @TrackingInfo, ReviewedBy = @ReviewedBy,
StatusID = @StatusID, Comments = @Comments,
SpecialNotes = @SpecialNotes, CheckNumber = @CheckNumber,
MgmtName = @MgmtName, MgmtContactName = @MgmtContactName,
MgmtAddress = @MgmtAddress, MgmtPhone = @MgmtPhone,
MgmtFax = @MgmtFax, MgmtFee = @MgmtFee, MgmtNotes = @MgmtNotes,
CheckMailed = @CheckMailed, CheckReceived = @CheckReceived,
InfoReceived = @InfoReceived, SentAppraiser = @SentAppraiser,
PreNotesID = @PreNotesID, PostNotesID = @PostNotesID,
INumFamilies = @INumFamilies,
IStyle = @IStyle, IUnit = @IUnit, IConstruction = @IConstruction,
IBasement = @IBasement, IBFinished = @IBFinished,
IGarage = @IGarage, INumCars = @INumCars,
IGarageType = @IGarageType, IContactHas = @IContactHas,
IAvailable = @IAvailable, IInformedAmt = @IInformedAmt,
IRealtorContract = @IRealtorContract, IContractContact = @IContractContact,
IPermitCO = @IPermitCO, ICORenewal = @ICORenewal,
IRenewalInt = @IRenewalInt, IAppComments = @IAppComments,
IBedroomB = @IBedroomB, IBedroom1 = @IBedroom1, IBedroom2 = @IBedroom2,
IBedroom3 = @IBedroom3, IBedroom4 = @IBedroom4, IKitchenB = @IKitchenB,
IKitchen1 = @IKitchen1, IKitchen2 = @IKitchen2, IKitchen3 = @IKitchen3,
IKitchen4 = @IKitchen4, IBathroomB = @IBathroomB, IBathroom1 = @IBathroom1,
IBathroom2 = @IBathroom2, IBathroom3 = @IBathroom3, IBathroom4 = @IBathroom4,
AppraiserPerc = @AppraiserPerc, AppraiserFee = @AppraiserFee
WHERE OrderID = @o_OrderId AND
COALESCE(FileNumber, '') = COALESCE(@o_FileNumber, '') AND
COALESCE(OrderDate, 01/01/1900) = COALESCE(@o_OrderDate, 01/01/1900) AND
COALESCE(ClientID, 0) = COALESCE(@o_ClientID, 0) AND
COALESCE(ClientFileNumber, '') = COALESCE(@o_ClientFileNumber, '') AND
COALESCE(PropertyTypeID, 0) = COALESCE(@o_PropertyTypeID, 0) AND
COALESCE(EstimatedValue, 0) = COALESCE(@o_EstimatedValue, 0) AND
COALESCE(PurchaseValue, 0) = COALESCE(@o_PurchaseValue, 0) AND
COALESCE(LoanOfficer, '') = COALESCE(@o_LoanOfficer, '') AND
COALESCE(ReportID, 0) = COALESCE(@o_ReportID, 0) AND
COALESCE(ReportFee, 0) = COALESCE(@o_ReportFee, 0) AND
COALESCE(FeeBillInd, 0) = COALESCE(@o_FeeBillInd, 0) AND
COALESCE(FeeCollectInd, 0) = COALESCE(@o_FeeCollectInd, 0) AND
COALESCE(CollectAmt, 0) = COALESCE(@o_CollectAmt, 0) AND
COALESCE(Borrower, '') = COALESCE(@o_Borrower, '') AND
COALESCE(StreetAddrA, '') = COALESCE(@o_StreetAddrA, '') AND
COALESCE(StreetAddrB, '') = COALESCE(@o_StreetAddrB, '') AND
COALESCE(City, '') = COALESCE(@o_City, '') AND
COALESCE(CountyID, 0) = COALESCE(@o_CountyID, 0) AND
COALESCE(StateID, 0) = COALESCE(@o_StateID, 0) AND
COALESCE(Zip, '') = COALESCE(@o_Zip, '') AND
COALESCE(ContactName, '') = COALESCE(@o_ContactName, '') AND
COALESCE(PhoneA, '') = COALESCE(@o_PhoneA, '') AND
COALESCE(PhoneB, '') = COALESCE(@o_PhoneB, '') AND
COALESCE(PhoneC, '') = COALESCE(@o_PhoneC, '') AND
COALESCE(ApptDate, 01/01/1900) = COALESCE(@o_ApptDate, 01/01/1900) AND
COALESCE(ApptTime, '') = COALESCE(@o_ApptTime, '') AND
COALESCE(AppraiserID, 0) = COALESCE(@o_AppraiserID, 0) AND
COALESCE(InspectionDate, 01/01/1900) = COALESCE(@o_InspectionDate, 01/01/1900) AND
COALESCE(DateMailed, 01/01/1900) = COALESCE(@o_DateMailed, 01/01/1900) AND
COALESCE(TrackingInfo, '') = COALESCE(@o_TrackingInfo, '') AND
COALESCE(ReviewedBy, 0) = COALESCE(@o_ReviewedBy, 0) AND
COALESCE(PreNotesID , 0) = COALESCE(@o_PreNotesID, 0) AND
COALESCE(PostNotesID, 0) = COALESCE(@o_PostNotesID, 0) AND
COALESCE(StatusID, 0) = COALESCE(@o_StatusID, 0) AND
/*COALESCE(Comments, '') = COALESCE(@o_Comments, '') AND
COALESCE(SpecialNotes, '') = COALESCE(@o_SpecialNotes, '') AND*/
COALESCE(CheckNumber, '') = COALESCE(@o_CheckNumber, '') AND
COALESCE(MgmtName, '') = COALESCE(@o_MgmtName, '') AND
COALESCE(MgmtContactName, '') = COALESCE(@o_MgmtContactName, '') AND
COALESCE(MgmtAddress, '') = COALESCE(@o_MgmtAddress, '') AND
COALESCE(MgmtPhone, '') = COALESCE(@o_MgmtPhone, '') AND
COALESCE(MgmtFax, '') = COALESCE(@o_MgmtFax, '') AND
COALESCE(MgmtFee, '') = COALESCE(@o_MgmtFee, '') AND
/*COALESCE(MgmtNotes, '') = COALESCE(@o_MgmtNotes, '') AND*/
COALESCE(SentAppraiser, 01/01/1900) = COALESCE(@o_SentAppraiser, 01/01/1900) AND
COALESCE(InfoReceived, 01/01/1900) = COALESCE(@o_InfoReceived, 01/01/1900) AND
COALESCE(CheckReceived, 01/01/1900) = COALESCE(@o_CheckReceived, 01/01/1900) AND
COALESCE(CheckMailed, 01/01/1900) = COALESCE(@o_CheckMailed, 01/01/1900) AND
COALESCE(INumFamilies, '') = COALESCE(@o_INumFamilies, '') AND
COALESCE(IStyle, '') = COALESCE(@o_IStyle, '') AND
COALESCE(IUnit, '') = COALESCE(@o_IUnit, '') AND
COALESCE(IConstruction, '') = COALESCE(@o_IConstruction, '') AND
COALESCE(IBasement, '') = COALESCE(@o_IBasement, '') AND
COALESCE(IBFinished, '') = COALESCE(@o_IBFinished, '') AND
COALESCE(IGarage, '') = COALESCE(@o_IGarage, '') AND
COALESCE(INumCars, '') = COALESCE(@o_INumCars, '') AND
COALESCE(IGarageType, '') = COALESCE(@o_IGarageType, '') AND
COALESCE(IContactHas, '') = COALESCE(@o_IContactHas, '') AND
COALESCE(IAvailable, '') = COALESCE(@o_IAvailable, '') AND
COALESCE(IInformedAmt, '') = COALESCE(@o_IInformedAmt, '') AND
COALESCE(IRealtorContract, '') = COALESCE(@o_IRealtorContract, '') AND
COALESCE(IContractContact, '') = COALESCE(@o_IContractContact, '') AND
COALESCE(IPermitCO, '') = COALESCE(@o_IPermitCO, '') AND
COALESCE(ICORenewal, '') = COALESCE(@o_ICORenewal, '') AND
COALESCE(IRenewalInt, '') = COALESCE(@o_IRenewalInt, '') AND
/*COALESCE(IAppComments, '') = COALESCE(@o_IAppComments, '') AND*/
COALESCE(IKitchen1, '') = COALESCE(@o_IKitchen1, '') AND
COALESCE(IKitchen2, '') = COALESCE(@o_IKitchen2, '') AND
COALESCE(IKitchen3, '') = COALESCE(@o_IKitchen3, '') AND
COALESCE(IKitchen4, '') = COALESCE(@o_IKitchen4, '') AND
COALESCE(IKitchenB, '') = COALESCE(@o_IKitchenB, '') AND
COALESCE(IBedroom1, '') = COALESCE(@o_IBedroom1, '') AND
COALESCE(IBedroom2, '') = COALESCE(@o_IBedroom2, '') AND
COALESCE(IBedroom3, '') = COALESCE(@o_IBedroom3, '') AND
COALESCE(IBedroom4, '') = COALESCE(@o_IBedroom4, '') AND
COALESCE(IBedroomB, '') = COALESCE(@o_IBedroomB, '') AND
COALESCE(IBathroom1, '') = COALESCE(@o_IBathroom1, '') AND
COALESCE(IBathroom2, '') = COALESCE(@o_IBathroom2, '') AND
COALESCE(IBathroom3, '') = COALESCE(@o_IBathroom3, '') AND
COALESCE(IBathroom4, '') = COALESCE(@o_IBathroom4, '') AND
COALESCE(IBathroomB, '') = COALESCE(@o_IBathroomB, '') AND
COALESCE(AppraiserPerc, 0) = COALESCE(@o_AppraiserPerc, 0) AND
COALESCE(AppraiserFee, 0) = COALESCE(@o_AppraiserFee, 0)
END 

View 13 Replies View Related

Cursor To Compare/report On The Same Fields In 2 Tables

Sep 20, 2006

I have the following cursor comparing 2 tables, the production table and a copy of the production table. I want results of all addresses that are like the address1 field. The problem is my results are giving me every field, and if there is more than one match, it is putting it in a grid. I only want to see results if they are 1 for the same address field. This is what I have so far:

declare @address1 char(61), @city char(61)
declare address_cursor CURSOR FOR
SELECT address1, city FROM test.dbo.testadd
OPEN address_cursor
fetch next from address_cursor into @address1, @city
while @@fetch_status = 0
BEGIN
select * from testadd where @address1 like '%' + address1 + '%' and @city = city
Fetch next from address_cursor into @address1, @city
Print
END
CLOSE address_cursor
DEallocate address_cursor

View 4 Replies View Related

Can't Load Into Numeric Fields

Feb 1, 2005

Hi,

This may seem like a simple question to be asking but I’m not the most experienced working with DTS loads and can't understand (don't know) why my load is failing.

I am trying to load a comma-separated text file into a table I have created.

It consists of only 5 entries

01 - varchar (20)
02 - smalldatetime
03 - smalldatetime
04 - numeric Scale (1)
05 - numeric Scale (1)

Here are the first two lines from my file,

ABCDEFGHIJKLMNOPQRS1,02/02/2005,05/02/2005,0,1
ABCDEFGHIJKLMNOPQRS2,01/02/2005/06/02/2005,1,1

As far as I can see, it should be working, but it gives me the error:

The number of failing rows exceeds the maximum specified.
TransformCopy 'DTSTransformation_4' conversion error: General conversion failure on column pair (source column 'Col004' (DBTYPE_STR), destination column 'Result1' (DBTYPE_NUMERIC)).

Help.

Can't figure out what’s going on so any ideas would be useful.

Cheers.

View 1 Replies View Related

Rank Query - Compare 2 Total Income Fields?

Apr 24, 2015

I have a query that ranks. Once I get the ranked fields, is there a way to compare the 2 total_income fields?

Here's the query:

select * from
(
select cardholderid, appcnum, total_income ,Rank() over (PARTITION BY A.APPCNUM
ORDER BY A.EFFSTARTDATE DESC) as Rank
from
TBL_EPIC_BILLSTATUS A
) tmp
where Rank in (2,3) and CARDHOLDERID = '704355'
--------------and rank ((2),total_income) <> rank (3),total_income))

Looking to do something like this to see if the income is different
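One way, sticking to the columns used in the query above, is to put the ranked rows in a CTE and self-join rank 2 to rank 3 so the two incomes sit in the same row and can be compared:

WITH ranked AS
(
    SELECT cardholderid, appcnum, total_income,
           RANK() OVER (PARTITION BY appcnum ORDER BY effstartdate DESC) AS rnk
    FROM TBL_EPIC_BILLSTATUS
)
SELECT r2.cardholderid,
       r2.total_income AS income_rank2,
       r3.total_income AS income_rank3
FROM ranked AS r2
INNER JOIN ranked AS r3
        ON r3.appcnum = r2.appcnum AND r3.rnk = 3
WHERE r2.rnk = 2
  AND r2.cardholderid = '704355'
  AND r2.total_income <> r3.total_income   -- only rows where the two incomes differ

The column spellings (effstartdate etc.) follow the original query; adjust if the real table differs.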

View 17 Replies View Related

DB Engine :: Compare Two Tables And Log Difference In New Table With Fields?

Apr 21, 2015

I want to compare two tables and log the differences in a new table with the fields (old value, new value, column name). The column name should identify which column's value changed.
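A sketch of one approach, assuming the two tables share a key column; CROSS APPLY (VALUES ...) unpivots the columns of interest so each differing column becomes one audit row (all table and column names are placeholders):

-- One audit row per column whose value changed between the old and new table.
INSERT INTO dbo.ChangeLog (KeyValue, ColumnName, OldValue, NewValue)
SELECT o.KeyColumn, d.ColumnName, d.OldValue, d.NewValue
FROM dbo.OldTable AS o
INNER JOIN dbo.NewTable AS n
        ON n.KeyColumn = o.KeyColumn
CROSS APPLY (VALUES
        ('ColumnA', CAST(o.ColumnA AS nvarchar(4000)), CAST(n.ColumnA AS nvarchar(4000))),
        ('ColumnB', CAST(o.ColumnB AS nvarchar(4000)), CAST(n.ColumnB AS nvarchar(4000)))
    ) AS d(ColumnName, OldValue, NewValue)
WHERE ISNULL(d.OldValue, '') <> ISNULL(d.NewValue, '')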

View 10 Replies View Related

How To Load A Flat Text File Into MSDE / SQL Server

Mar 27, 2001

Hi!

How do I load a comma-delimited text file into MSDE? How does it work for SQL Server 7?

Thanks,
Helmut

View 1 Replies View Related

Not Able To Load The Application In Case Web Farm Garden When We Load Data Through Background Thread.

Dec 14, 2007

Hi,

Here I will describe my problem.
1. We are loading a large amount of data from the database on a background thread which is started in the Application_Start event in global.asax.cs. The data is later cached for subsequent requests to improve performance.
2. Now when we put the application on a web farm/garden, it is not able to load the application.
3. We are sending the requests to the servers through a router kind of application.
4. This application is working fine in a single-server environment.

Please help us.

Ajay Kumar Dwivedi

View 1 Replies View Related

Outlook 2003 Filter SQL Tab To Compare The Values Of Two Fields In A Given Contact.

Sep 28, 2007

Hello and Thank You in advance:


I would like to Use the SQL Tab in the Filter Selection of the View-Modify Current View Option in Outlook 2003.
I am specifically trying to create a statement that only displays those Contacts that have a Modified Date that is different than the Created date. In other words I want to view all the records that have been modified at any time after the date of their creation.

I have been unable to create a SQL statement that compares the Created and Modified fields in the Contacts. Would you please explain and show what the correct syntax is in order to compare the values of two fields within a given Contact?


These are all examples of what I have tried that have not worked. I cannot find documentation for this anywhere, only small articles here and there that are not enough for me to develop a complete answer from:

(("urnchemas:calendar:created" >= '1/1/2001 12:00 AM' AND
"urnchemas:calendar:created" <= '12/31/2001 12:00 AM')
AND ("urnchemas:calendar:created" <> "urnchemas:calendar:lastmodified" ))
("urnchemas:calendar:created" <> "urnchemas:calendar:lastmodified" )

(("urnchemas:calendar:created" >= '1/1/2001 12:00 AM' AND
"urnchemas:calendar:created" <= '12/31/2001 12:00 AM') AND
("DAV:getlastmodified" >= '1/1/2001 12:00 AM' AND
"DAV:getlastmodified" <= '12/31/2007 12:00 AM'))

("urnchemas:calendar:created" <> "DAV:getlastmodified")
("DAV:getlastmodified" <> "urnchemas:calendar:created")
("DAV:getlastmodified" <> 'urnchemas:calendar:created')
("urnchemas:calendar:created" <> 'DAV:getlastmodified')

View 2 Replies View Related






