Hi, the SSIS package loops through a folder using a Foreach Loop container and imports the data inside each .csv file into the database. Now I would like to add the functionality to first check the modified date of each .csv file. If it is NOT today's date, then the package should fail or raise an error. Any thoughts on how to do this, please? Thanks
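One way to enforce that (a rough sketch, not the only option): read the file's modified date into a package variable first, for example with a Script Task, then compare it to today's date in an Execute SQL Task and let a failure there fail the package or route it to an error path. The variable name and message below are purely illustrative.

-- Assumes @FileModifiedDate has been populated from the package variable
-- holding the current file's modified date.
DECLARE @FileModifiedDate datetime;
SET @FileModifiedDate = '20150825 06:00';   -- placeholder value

IF DATEDIFF(day, @FileModifiedDate, GETDATE()) <> 0
BEGIN
    -- Severity 16 fails the Execute SQL Task, which in turn can fail the package.
    RAISERROR('File was not modified today; aborting import.', 16, 1);
END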
I have been tasked with a situation where a package scheduled to run once a day imports data from a flat file. The data does not contain any distinguishing time that can be used to see whether it is beyond the last imported time.
I was wondering, is there a way to access the file's modified date? If so, I could simply append it to a table that keeps track of all of the files and the modified dates I have ever imported for this task.
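If the modified date can be captured in the package (for example by a Script Task writing it into a variable), the tracking table itself is straightforward; the table and column names below are only illustrative:

-- Illustrative tracking table for files already imported.
CREATE TABLE dbo.ImportedFileLog (
    FileName     nvarchar(260) NOT NULL,
    ModifiedDate datetime      NOT NULL,
    ImportedAt   datetime      NOT NULL DEFAULT (GETDATE()),
    CONSTRAINT PK_ImportedFileLog PRIMARY KEY (FileName, ModifiedDate)
);

-- Run once per file; only records files/modified dates not seen before.
-- @FileName and @ModifiedDate would be mapped from package variables.
DECLARE @FileName nvarchar(260), @ModifiedDate datetime;
SET @FileName = 'daily_extract.csv';        -- placeholder values
SET @ModifiedDate = '20150825 06:00';

IF NOT EXISTS (SELECT 1 FROM dbo.ImportedFileLog
               WHERE FileName = @FileName AND ModifiedDate = @ModifiedDate)
    INSERT dbo.ImportedFileLog (FileName, ModifiedDate)
    VALUES (@FileName, @ModifiedDate);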
I am looking to create an SSIS package to import text files into an SQL table. I'd like to import the 'date modified' attribute from the text file into one of the columns as well as the data within the text file.
I'm currently thinking of an ActiveX task and a variable; I just wondered if anyone had any other thoughts on how this could be achieved.
Basically, what we need is a query that will allow me to provide a directory and a variable for the number of days (for instance, 1 day old), and I want to be able to delete all files older than that date. Of course, I also want to be able to exclude files of a particular type, where I would give it a wildcard statement; for example, if I wanted to keep all .csv files, the wildcard would say <> '%.csv'.
I have a database that is actively being used and updated. When I look at the last modified date of the .mdf file, it shows a date from a couple of weeks ago, and the transaction log (.ldf) file shows a couple of days ago. I wonder, if the database is constantly updated and changed, shouldn't it show the current date instead? Also, the attribute of the .mdf file shows A, as in archive. What does that mean? Thanks
I am taking a "complete backup" of my production db every day using the Backup Agent of Arcserver 2000. But the modified date of the .mdf and .ldf files shows an older date. Is this normal? Thanks Wilson
Does anyone know how to get a table's 'last modified date' in SQL 7? Sysobjects contains the 'create date', but I can't find a 'last modified date' anywhere...
I'm an Oracle DBA just getting used to MS SQL Server. I noticed that the Windows Explorer "date modified" field for my database files (.MDF files) doesn't change much even though there is activity going on. Sometimes it doesn't change for a week.
Is this the expected behavior? Could it be that no data is changing in my database? (I find that hard to believe.)
Hi, I am using SQL Server 2000 and have the following questions:
1. How do I know the last updated (data) date, using system objects or any other method?
2. How do I know the last modified date of a table, using system objects or any other method?
3. How do I know when a table was last accessed?
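For what it's worth, SQL Server 2000 does not track this, but from 2005 onward the index usage DMV exposes last read/write times per table (the counters reset when the service restarts); a rough sketch:

-- SQL Server 2005+: last write and last reads per user table in the current database.
SELECT OBJECT_NAME(s.[object_id]) AS TableName,
       MAX(s.last_user_update)    AS LastDataUpdate,
       MAX(s.last_user_seek)      AS LastSeek,
       MAX(s.last_user_scan)      AS LastScan
FROM sys.dm_db_index_usage_stats AS s
WHERE s.database_id = DB_ID()
GROUP BY s.[object_id];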
If your tables contain created and modified/updated dates, what is the best practice for these?
1. Should you use UTC dates?
2. Do you use a default for the creation date (I assume yes)?
3. Should you create a trigger to handle the last-updated date, or do you update the column directly in the stored procedures that modify data?
Also, as an aside: if you store the user who created/updated the record, do you store a foreign key reference to the user table, or do you store the username as a varchar? Of course I know you'd normally store the FK, but I wasn't sure if the "logging" nature of the column suggests storing a string value.
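Not an authoritative answer, but one common pattern is UTC dates, defaults for the creation columns, and an FK for the user; all names below are illustrative, and a dbo.Users table with a UserID key is assumed:

-- Illustrative table: creation values are defaulted; modified values are
-- maintained by whatever writes the row (a trigger or the stored procedures).
CREATE TABLE dbo.Widget (
    WidgetID    int IDENTITY(1,1) PRIMARY KEY,
    Name        nvarchar(100) NOT NULL,
    CreatedUtc  datetime NOT NULL DEFAULT (GETUTCDATE()),
    CreatedBy   int NOT NULL REFERENCES dbo.Users (UserID),
    ModifiedUtc datetime NULL,
    ModifiedBy  int NULL REFERENCES dbo.Users (UserID)
);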
Source table:

ID  AppName  DepCode  DepName    Group     ModifiedDate  YearlyAmount  MonthlyAmount  Funded  AppCategory  Research
1   Nestle   NS       Foods      Products  01/12/14      451           37.5623        Yes     NE           NA
1   Nestle   NS       Foods      Products  01/17/14      495           41.2365        No      N            NA
2   Oracle   OR       Software   Info      01/24/14      279           23.2568        Yes     OR           InProgress
2   Oracle   OR       Soft & IT  Info      01/26/14      310           25.8333        Yes     ORL          NA
2   Oracle   ORL      Software   Info      01/25/14      219           18.2189        Yes     SOF          Approved
2   Oracle   ORL      Soft       IT        01/28/14      600           50.0000        No      IT           Rejected

Expected output:

ID  AppName  DepCode  DepName    Group     ModifiedDate  YearlyAmount  MonthlyAmount  Funded  AppCategory  Research
1   Nestle   NS       Foods      Products  01/17/14      946           78.7988        No      N            NA
2   Oracle   OR       Soft & IT  Info      01/26/14      589           49.0901        Yes     ORL          NA
2   Oracle   ORL      Soft       IT        01/28/14      819           68.2189        No      IT           Rejected
I want to pick the most recent ModifiedDate for each DepCode and sum the yearly and monthly amounts. I have tried this query but am not able to get the expected output. It is all a single table.
select B1.[ID], B1.[AppName], B1.[DepCode], B1.[DepName], B1.[Group], B2.ModifiedDate, B2.YearlyAmount, B2.MonthlyAmount, B1.[Funded], B1.[AppCategory], B1.[Research]
FROM Business B1
INNER JOIN (select [ID], MAX(ModifiedDate) as ModifiedDate, SUM(YearlyAmount) as YearlyAmount, SUM(MonthlyAmount) as MonthlyAmount
            from Business
            Group by ID) B2
ON B1.ID = B2.ID AND B1.ModifiedDate = B2.ModifiedDate
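Since the expected output keeps one row per ID and DepCode, the derived table probably needs to group on both columns, and the join then needs to include DepCode as well; a sketch along those lines (assuming ID + DepCode + ModifiedDate uniquely identifies a row):

SELECT B1.ID, B1.AppName, B1.DepCode, B1.DepName, B1.[Group],
       B2.ModifiedDate, B2.YearlyAmount, B2.MonthlyAmount,
       B1.Funded, B1.AppCategory, B1.Research
FROM Business AS B1
INNER JOIN (SELECT ID, DepCode,
                   MAX(ModifiedDate)  AS ModifiedDate,
                   SUM(YearlyAmount)  AS YearlyAmount,
                   SUM(MonthlyAmount) AS MonthlyAmount
            FROM Business
            GROUP BY ID, DepCode) AS B2
       ON  B1.ID = B2.ID
       AND B1.DepCode = B2.DepCode
       AND B1.ModifiedDate = B2.ModifiedDate;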
Hello, I am using SQL Server 2005 and ASP.NET 2.0. We have a very simple content management system where we have to keep track of date last modified for each row in all of our content tables. I know there's a "timestamp" datatype that is used for replication scenarios, but is there anything similar that I can use to set up a date_modified column for each of my content tables that will automatically update with GETDATE() whenever anything in a given row is updated? Or do I have to create a date_modified column of smalldatetime datatype and write a trigger on update for EVERY single table of content that I have in the database? It seems there should be an easier way to do this than to write 20 triggers for my 20 content tables. Thanks!
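As far as I know there is no column type that self-updates with the modification date (timestamp/rowversion only gives an opaque version number), so a trigger per table is the usual approach; a minimal sketch for one table, with illustrative table/column names:

-- Keeps date_modified current for rows touched by an UPDATE.
CREATE TRIGGER trg_Article_SetModified ON dbo.Article
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE a
    SET date_modified = GETDATE()
    FROM dbo.Article AS a
    INNER JOIN inserted AS i ON i.ArticleID = a.ArticleID;
END

The trigger body is identical for every table apart from the name and key column, so the twenty CREATE TRIGGER statements could also be generated with a bit of dynamic SQL over the system tables rather than written by hand.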
Can anyone please tell me whether there is any possible way to identify a table's modified date?
I have checked the table created date from sysobjects, or by right-click > Properties. My requirement is to identify the exact date of table modification and of column creation/alter dates. Is there any such provision in SQL Server 2000 or 2005? My application is on SQL Server 2000.
I need to confirm this because some database structure modification has affected my application and is causing data loss. I need to compare the date of the structural change of the table with the date of the lost data. Can anyone help?
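I don't believe SQL Server 2000 keeps anything beyond the creation date in sysobjects, but on a 2005 instance the catalog views expose a modify_date that is bumped by ALTER statements (only the latest change, not a history):

-- SQL Server 2005+: latest schema-change date per user table.
SELECT name, create_date, modify_date
FROM sys.objects
WHERE type = 'U'
ORDER BY modify_date DESC;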
Kazim writes: "Generally, when we take on a new customer who wants to change software, they say 'we must be able to see and use the old data in the new program.' That is where the problem starts for us, because I want to know, for any given action in the old software, which data is stored in which table in SQL Server; for this purpose you could call it spying on the application. Do you know anything about this type of question, and how can we check which tables and columns are affected by the software? Thanks."
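One crude, purely illustrative way to narrow it down without tracing tools: snapshot the approximate row counts from sysindexes before and after exercising a feature in the old application and see which tables moved. This only catches inserts and deletes, not updates; SQL Profiler gives far more detail, including the columns, if it is available.

-- Snapshot before using the application.
SELECT o.name AS TableName, i.rows
INTO #before
FROM sysobjects AS o
INNER JOIN sysindexes AS i ON i.id = o.id
WHERE o.xtype = 'U' AND i.indid < 2;

-- ... exercise the feature in the old application here ...

SELECT o.name AS TableName, i.rows
INTO #after
FROM sysobjects AS o
INNER JOIN sysindexes AS i ON i.id = o.id
WHERE o.xtype = 'U' AND i.indid < 2;

-- Tables whose row counts changed are the ones the feature inserted into or deleted from.
SELECT a.TableName, b.rows AS RowsBefore, a.rows AS RowsAfter
FROM #after AS a
INNER JOIN #before AS b ON b.TableName = a.TableName
WHERE a.rows <> b.rows;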
Hi, basically the above is a very common requirement. Please comment on my solution, which I've arrived at by searching the web:
In summary, I have used three SSIS components: "Flat File Source", "Derived Column" and "SQL Server Destination".
1) File Connections Manager Editor
1.1) Within the File Connections Manager Editor, name the column, e.g. "INTERCHANGE_NET_APP_DATE_SRC", and assign it a data type, e.g. string [DT_STR].
1.2) Click on the Preview button to ensure the expected text is assigned to the expected data type.
2.4) Select "database timestamp [DT_DBTIMESTAMP] " as Data Type.
2.5) Within the Mappings tab of the SQL Destination Editor, set the Input Column to INTERCHANGE_NET_APP_DATE and the Destination Column to INTERCHANGE_NET_APP_DATE.
Please comment on the above, I will then pass on my suggestion to Microsoft.
I asked this question below, and the answer was that the conversion would take place automatically, but I can't get that to happen. I have a flat file with an 8-position field, which I identify as string (I also tried date), in yyyymmdd format, and it needs to go into a database field that is datetime. Is there something wrong with my definition of it, or do I need to add some kind of conversion, and if so, what would that be and how would it be done? I'm a DTS SQL 2000 expert, but this SSIS thing is driving me crazy. I have a ton of DTS packages to convert, and the migration tool doesn't work because there are a lot of ActiveX scripts in them. Thanks for your help. Boston Rose
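For what it's worth, an explicit conversion usually has to be added somewhere. Inside the data flow that would be a Data Conversion or Derived Column transformation casting the string column to DT_DBTIMESTAMP; if it is easier to do on the database side, an unseparated yyyymmdd string converts cleanly with style 112:

-- yyyymmdd string to datetime (style 112 = ISO, unseparated).
SELECT CONVERT(datetime, '20150825', 112) AS LoadDate;   -- 2015-08-25 00:00:00.000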
I have an SSIS package where I need an Excel destination. In the Excel file, I need to have a few rows with some text and then populate the data below that text. One of the text rows is like this:
Data as of: 08/25/2015
If the report ran today, then "Data as of" should show yesterday's date. And if the user opens that Excel file a week later, they should still see the same "Data as of: 08/25/2015", not today()-day(1).
I was planning to handle it on the Excel side with today()-day(1), but that only works on the day the report is run. If the Excel file is opened a few days later, it might show "Data as of: 08/30/2015", which is not true. It should still say "Data as of: 08/25/2015" no matter what date the Excel file is opened. The SSIS package runs only once.
How do I handle this so that whenever the user opens the file, they will see "Data as of: 08/25/2015"? This is not a column in Excel; it is more like a description of the data in the sheet.
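One option (just a sketch) is to compute the literal label text at run time, when the package executes, and write that string into the sheet instead of leaving a volatile formula in the workbook; for example, the value could come from a query such as:

-- Evaluated once, when the package runs; the resulting string is then written
-- into the Excel sheet as plain text (style 101 = mm/dd/yyyy).
SELECT 'Data as of: ' + CONVERT(varchar(10), DATEADD(day, -1, GETDATE()), 101) AS DataAsOfLabel;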
I need to create a package that checks the date of the files posted in a folder, e.g. H:\source. If no file created within the last day exists, then check again one hour later. If such files do exist, then copy them to C:\dest and unzip them. Once this is done, send a notification email to user@mydomain.com.
Is there any way for me to return the row that was last entered? If so, how do I go about accomplishing this?
For example,
Plant Name    Creation Date      Comments
Bayport 123   05/24/01 2:51 AM   Airflow became unstable
Bayport 123   05/24/01 4:00 AM
Bayport 123   05/24/01 5:36 AM   No events
Bayport 123   05/24/01 1:00 PM
Bayport 123   05/26/01 2:45 PM   No events
Bayport 123   05/26/01 3:12 PM   Started liquifier 25% LIN
Bayport 4     05/24/01 2:51 AM   Swung liquifier 0% to LIN
Bayport 4     05/26/01 5:45 AM
Bayport 4     05/26/01 5:15 PM   Liquifier @ 25% LIN
Coatesville   05/24/01 9:32 AM
Coatesville   05/26/01 4:25 PM   No events
If I were to query against 5/26, I would want my result to return as:
Bayport 123   5/26/01 3:12 PM   Started liquifier 25% LIN   (because this was the last row entered on the 26th)
Bayport 4     5/26/01 5:15 PM   Liquifier @ 35% LIN         (because this was the last row entered on the 26th)
Coatesville   5/26/01 4:25 PM   No events
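A sketch of one way to get that with a correlated subquery (table and column names are illustrative): for each plant, return the row with the latest creation date falling on the requested day.

SELECT e.PlantName, e.CreationDate, e.Comments
FROM Events AS e
WHERE e.CreationDate = (SELECT MAX(e2.CreationDate)
                        FROM Events AS e2
                        WHERE e2.PlantName = e.PlantName
                          AND e2.CreationDate >= '20010526'
                          AND e2.CreationDate <  '20010527');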
Here's an updated version of bigtables.sql that also displays the ratio of index size to data size and the percentage of unused space per table. I've found the index to data ratio particularly helpful for finding and fixing over-indexing.
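The excerpt below assumes the loop variables and the #spt_space work table have already been declared earlier in the script; something along these lines (the column list mirrors the table used by sp_spaceused, and is an assumption, not part of the excerpt):

-- Assumed declarations (not shown in the excerpt below).
create table #spt_space
(
    objid    int,
    rows     int     null,
    reserved dec(15) null,
    data     dec(15) null,
    indexp   dec(15) null,
    unused   dec(15) null
)

declare @id    int,
        @pages int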
-- Create a cursor to loop through the user tables
declare c_tables cursor for
    select id from sysobjects where xtype = 'U'

open c_tables

fetch next from c_tables into @id

while @@fetch_status = 0
begin

    /* Code from sp_spaceused */
    insert into #spt_space (objid, reserved)
        select objid = @id, sum(reserved)
        from sysindexes
        where indid in (0, 1, 255) and id = @id

    select @pages = sum(dpages)
    from sysindexes
    where indid < 2 and id = @id

    select @pages = @pages + isnull(sum(used), 0)
    from sysindexes
    where indid = 255 and id = @id

    update #spt_space
    set data = @pages
    where objid = @id

    /* index: sum(used) where indid in (0, 1, 255) - data */
    update #spt_space
    set indexp = (select sum(used) from sysindexes
                  where indid in (0, 1, 255) and id = @id) - data
    where objid = @id

    /* unused: sum(reserved) - sum(used) where indid in (0, 1, 255) */
    update #spt_space
    set unused = reserved - (select sum(used) from sysindexes
                             where indid in (0, 1, 255) and id = @id)
    where objid = @id

    update #spt_space
    set rows = i.rows
    from sysindexes i
    where i.indid < 2 and i.id = @id and objid = @id