How Can You Tell If A File Has Finished Downloading
May 3, 2006
Currently we have a process where 4 files are FTP'd down to our server. We have a DTS package which tests for the existence of these files, and once they exist, the package loads the data from them into a SQL database. Sometimes, however, the files exist but haven't finished downloading, so when the DTS package tries to process them it reports that the files are empty (though they're not).
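One approach worth sketching: a file that is still being written is locked by the FTP server, so a cheap completeness test is to try renaming it before loading. A minimal sketch, assuming xp_cmdshell is enabled; the path and file name are illustrative:

    DECLARE @rc INT;
    -- Renaming fails while the FTP server still has the file open for writing.
    EXEC @rc = master.dbo.xp_cmdshell 'ren "C:\ftp\file1.dat" "file1.tmp"', no_output;
    IF @rc = 0
    BEGIN
        -- Rename back and let the DTS package proceed with the load.
        EXEC master.dbo.xp_cmdshell 'ren "C:\ftp\file1.tmp" "file1.dat"', no_output;
        PRINT 'Download complete - safe to load.';
    END
    ELSE
        PRINT 'File still locked - try again later.';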
I have this on my page:

    Dim backUpDB2 As SqlClient.SqlCommand
    backUpDB2 = New SqlClient.SqlCommand
    backUpDB2.CommandType = CommandType.StoredProcedure
    backUpDB2.CommandText = "msdb.dbo.SP_RESUMENFAC"
    backUpDB2.Connection = SqlConnection1
    backUpDB2.ExecuteNonQuery()
I am trying to create a SQL job which will report on another job that hasn't finished within its normal completion time. I schedule the new job late enough after the first job that the first should have finished by then.
I would have liked to just query msdb..sysjobhistory.run_status. However, this seems to report only on job STEP status - and only after the step has finished! It always shows run_status = 1 (Complete).
Does anyone know the base metadata table and column I could query? Enterprise Manager shows the current status as Running, and I want to know where it gets that.
If it is still running, I will raise an error to notify our support group, etc.
PS: My job steps include a combination of DTSRun commands and TSQL commands. I don't think the flavour of commands should matter...
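For what it's worth, the current status Enterprise Manager shows appears to come from polling the Agent rather than from a user table; sp_help_job exposes the same information regardless of what the steps contain. A minimal sketch:

    -- Lists only jobs that are currently executing (execution status 1 = executing).
    EXEC msdb.dbo.sp_help_job @execution_status = 1;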
I'm an MS Access developer who needs to help someone migrate/convert from a 'finished' SQL application to Access. How do I gain access to the SQL back end so I can examine/export the raw data? For example, in Access I would hold down the Shift key while opening the program, and that would give me editing rights to the database.
Please be very specific because I have zero experience with SQL.
My SSIS package errors out because one of the database connections failed. I successfully logged the error, but the package also indicated that it finished successfully. My confusion is: if scheduling software schedules this package, what would be the return code sent by dtexec? Would it be success or failure? In this scenario I want it to return failure so that the appropriate team can be contacted.
I have an SSIS package which processes 12699 files in a folder. After about 20 minutes it looks like the loading is finished (the record count of the database table doesn't change any more) and I believe it's done, but the Progress tab still shows file names being processed and still moving. So my first question is: is it still loading files, or is the progress message just behind? Then I clicked Stop Debugging; after a while it stopped but then froze, and nothing responds no matter what I click. I haven't saved the package yet, so I don't want to close it out. Should I just wait? And what is the problem?
Thanks a lot for your help in advance!!
p.s. I'm exploring SSIS and will convert our current DTS packages to SSIS so you'll see me post more and more questions down the road.
I have a job I want to run every day, but before it starts I want to check whether another job has completed. I would like to do this in the job steps in SSMS: step 1 - is job 'xxxxxxx' running? If no, go to step 2; if yes, exit.
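A minimal sketch of what step 1 could look like ('xxxxxxx' is the placeholder job name from above; strictly speaking you would also join to the most recent agent session in msdb.dbo.syssessions):

    -- Fails the step (and, via the on-failure action, the job) if the other job is still running.
    IF EXISTS (SELECT 1
               FROM msdb.dbo.sysjobactivity a
               JOIN msdb.dbo.sysjobs j ON j.job_id = a.job_id
               WHERE j.name = 'xxxxxxx'
                 AND a.start_execution_date IS NOT NULL
                 AND a.stop_execution_date IS NULL)
        RAISERROR('Job xxxxxxx is still running - exiting.', 16, 1);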
Hi All, I have a solution which is synchronised with Visual SourceSafe. There are some reports in the solution; I am able to view the preview of the reports, but when I go to view a dataset definition it does not retain its definition and becomes blank. This happens only to datasets that were developed from a cube; datasets developed from a database retain their definitions.
I want to generate a new snapshot using stored procedures, wait for the snapshot files to be created, and then execute a stored procedure. What's the best way to determine that the snapshot has completed successfully? I thought of doing something like:
however, I can't put the results of that proc into a temp table because I get this error: Msg 8164, Level 16, State 1, Procedure sp_get_composite_job_info, Line 72 - An INSERT EXEC statement cannot be nested.
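One workaround often suggested for that nesting error is to wrap the job-status proc in OPENQUERY instead of INSERT ... EXEC. A sketch, assuming a loopback linked server named LOOPBACK pointing at the local instance (SET FMTONLY OFF is needed on older versions so the proc actually executes):

    SELECT *
    INTO #job_info
    FROM OPENQUERY(LOOPBACK, 'SET FMTONLY OFF; EXEC msdb.dbo.sp_help_job');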
I'm not really sure whether my question should fall under Service Broker or T-SQL, but I hope someone can help me with this... After activating the stored procedure assigned to the queue, is there any way for me to find out whether the stored procedure has finished executing?
I have successfully sent messages to my queue but I have no way to know if all the processing is already done.
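Two things can be checked here (a minimal sketch; the queue name is a placeholder):

    -- Activation procs that are currently running show up here:
    SELECT * FROM sys.dm_broker_activated_tasks;
    -- And messages still waiting to be processed sit in the queue itself:
    SELECT COUNT(*) AS pending_messages FROM dbo.TargetQueue;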
I am working with a stored procedure that needs to roll up a week-number column once a week. The weeks are numbered 1-10: 1 being this week, 2 being last week, and so forth.
Once a week the rows with week 10 are deleted, 9 becomes 10, 8 becomes 9, and so forth, and week 1 is recalculated. The week numbers are getting all screwed up, and we think it's because one statement starts before the statement before it completes. The statements go like this:
    delete theTable where week_num=10;
    update theTable set weeknum=10 where weeknum=9;
    update theTable set weeknum=9 where weeknum=8;

and so forth.
Is that the reason? Is there any way to not start one statement until the one before it finishes?
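For what it's worth, statements in a single batch or stored procedure do run strictly one after another, so overlap within the proc is unlikely; concurrent sessions or statement order are the usual culprits. A sketch of a more defensive shape (using the week_num spelling from the delete; note the original statements mix week_num and weeknum, which alone could explain the scrambling):

    BEGIN TRANSACTION;
        DELETE theTable WHERE week_num = 10;
        -- A single set-based UPDATE evaluates all rows as of the same instant,
        -- so rows cannot cascade through several buckets in one pass, and the
        -- transaction hides the half-finished rollup from other sessions.
        UPDATE theTable SET week_num = week_num + 1 WHERE week_num BETWEEN 1 AND 9;
    COMMIT;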
In the control flow, I have several Sequence Containers, and I have a script component that I want to be executed only when the last 2 sequences finish with success... these 2 sequences have no relation to each other...
We are running Windows Server 2003 SP 1 and trying to upgrade SQL 2000 SP 4 to SQL 2005 using the command line.
The process finishes in under ten minutes. In the Summary.txt file we have this information:
Log File: C:\Program Files\Microsoft SQL Server\90\Setup Bootstrap\LOG\Files\SQLSetup_<ServerName>_SQL.log
Last Action: ValidateUpgrade
Error String: The installer has encountered an unexpected error. The error code is 2259. Database: Table(s) Update failed
Error Number: 2259
In the log file named SQLSetup_ServerName_Core.log I found the following:
Error: Action "LaunchLocalBootstrapAction" threw an exception during execution. Error information reported during run:
"C:\Program Files\Microsoft SQL Server\90\Setup Bootstrap\setup.exe" finished and returned: 1627
Aborting queue processing as nested installer has completed
Message pump returning: 1627
After receiving this info, I can navigate to the setup.bat for the SQL 2005 upgrade and complete the upgrade without error. We are planning on 500 of these, so manual upgrades are a very ugly prospect.
I'd appreciate any and all ideas on where to go from here.
We've recently upgraded to SQL Server 2014 and are now using SSIS integrated with Visual Studio. We have an SSIS project which contains about 20 packages which are nested in Sequence Containers and executed concurrently. These packages have been set up as project references.
The problem is that when I press the start button to run the packages, they all light up green reporting completion before the data has finished loading into the SQL database. If I press the stop button without waiting a sufficient length of time, then not all of the data gets loaded. i.e. a certain number of rows will be missing from some of the SQL tables.
If I click through to the individual package items and check the data flow progress while running, some of the data flows appear to hang at a certain number of rows without ever reaching completion. The number of rows indicated in the data flow is incorrect - i.e. it will count up to ~150,000 and stay there indefinitely in the running state, when in actual fact there are ~500,000 rows to load.
To clarify, the main package will show all items green and display the "Finished: Success" message in the log window, however when I drill through to certain packages in the set, they'll be stuck in the yellow running state, with no way of knowing whether they've actually completed or not.
My current workaround is to just wait a certain length of time before pressing the stop button. This bug doesn't seem to inhibit rows being loaded - it just incorrectly identifies the point when the load finishes, causing people to terminate the load prematurely.
This issue only occurs if I run the project from the main package container. If I execute the child packages individually, they correctly report the number of rows being loaded and light up green once complete.
I have a BOM table with all finished-item recipes and semi-finished-item recipes. I need to create a query where the materials of semi-finished items are also listed in the finished item's recipe.
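This is the classic BOM-explosion shape for a recursive CTE. A minimal sketch, assuming a table like dbo.BOM(parent_item, component_item, qty) where finished items never appear as components (all names are placeholders):

    WITH Explode AS (
        -- Anchor: direct components of finished items (items never used as components).
        SELECT b.parent_item AS finished_item, b.component_item, b.qty
        FROM dbo.BOM b
        WHERE NOT EXISTS (SELECT 1 FROM dbo.BOM x WHERE x.component_item = b.parent_item)
        UNION ALL
        -- Recurse: expand each semi-finished component into its own materials.
        SELECT e.finished_item, b.component_item, e.qty * b.qty
        FROM Explode e
        JOIN dbo.BOM b ON b.parent_item = e.component_item
    )
    SELECT finished_item, component_item, SUM(qty) AS total_qty
    FROM Explode
    GROUP BY finished_item, component_item;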
Currently I have a single hard-coded file path to the SSRS config file; the query parses the file and returns the Reporting Services web service URL. My question is: how would I run this same query against hundreds of servers that may or may not share the hard-coded file path?
Is there a way to query the registry to find the location of the config file on any server? It could be on D, E, F, H, etc.
I know I can string together the address followed by "reports" and the named instance if needed, but some instances may not have used the default virtual directory name (Reports).
Am I going about this the hard way? Is there a location where the web service URL exists in a table? I could not locate anything in the Reporting Services database. Basically, I need to inventory all of my Reporting Services URLs.
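On the registry idea, a heavily hedged sketch using the undocumented xp_regread: the exact Reporting Services instance key (MSRS10_50.INSTANCE, MSRS13.MSSQLSERVER, etc.) varies by version and instance name, so both the key and the SQLPath value name below are assumptions to verify on one server first:

    DECLARE @rs_path NVARCHAR(512);
    EXEC master.dbo.xp_regread
         @rootkey    = N'HKEY_LOCAL_MACHINE',
         @key        = N'SOFTWARE\Microsoft\Microsoft SQL Server\MSRS13.MSSQLSERVER\Setup',
         @value_name = N'SQLPath',
         @value      = @rs_path OUTPUT;
    -- rsreportserver.config normally sits under <SQLPath>\ReportServer
    SELECT @rs_path AS rs_install_path;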
I have a customer running RAID 5 on a Windows 2000 server, and one of the drives went bad. The customer replaced the drive and the RAID rebuilt it; everything seemed to be fine, but there is one database file that cannot be attached to SQL. The file is 15 GB, so I know there is information in it, but the error states that the file is not a primary file. Any clue on how to fix this?
Greetings, I have just arrived back in the country (NZ) and back into ASP.NET. I am having trouble with the following: "An attempt to attach an auto-named database for file (file location).../Database.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share." It has only happened since I decided I wanted to use IIS. I realise VWD comes with its own localhost, but since that is only temporary, I wanted a permanent shortcut on my desktop linking to my intranet page. Anyone have any ideas why I am getting the above error? I have searched many places on the internet and am not getting any closer. Cheers ~ J
I am testing some maintenance tasks sql commands such as index rebuild, index reorg, update statistics and db integrity check on a SQL Server 2014 Database. This is a new non-production vendor database (DB Size 500 GBs, Log Size 25 GBs) which eventually will be created in production. Currently, it is in full recovery model and without log backups. The database has a whole lot of indexes. I am just trying to rebuild and reorganize all the indexes (that need it), in addition to trying to get an idea of how long these maintenance task will take and the space needed in the log file to complete these tasks/commands. I would like to execute these tasks manually (the first time) to gather the duration and space required information. Eventually, I would probably schedule a weekly job to perform this maintenance.
I ran the index rebuild task on the database and noticed that the log file grew by over 50 GBs. I killed the process and truncated and shrunk the log file back down.
1. Do the index rebuild, index reorg, update statistics and db integrity check commands all use the log file?
2. Does an index reorg have less impact on the log file than an index rebuild?
3. Should a truncate log and shrink log file be performed after these maintenance commands?
4. Should a full database backup be performed after these maintenance commands? Or before the maintenance commands?
I have read and understand that shrinking is not good for the database (could lead to more fragmentation and more data file growth when data is added) and I know about rebuilding indexes when fragmentation is GT 30% and reorganizing indexes when fragmentation is GT 5% and LE 30%.
Since this is a non-production database maybe I should set the recovery model to simple, run the maintenance commands and leave the database in simple recovery model unless the vendor needs it in full recovery model for some unknown reason.
5. With the simple recovery model the log file should be reused in a circular manner and not grow during these maintenance tasks. Is this correct?
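A minimal sketch of that simple-recovery approach (the database and table names are placeholders). Note that even in SIMPLE, one large index rebuild is a single transaction and still needs log space for itself; the log just truncates at checkpoints between commands:

    ALTER DATABASE VendorDb SET RECOVERY SIMPLE;

    ALTER INDEX ALL ON dbo.BigTable REBUILD;   -- repeat per table, or drive from sys.dm_db_index_physical_stats

    ALTER DATABASE VendorDb SET RECOVERY FULL; -- only if the vendor needs FULL
    BACKUP DATABASE VendorDb TO DISK = N'E:\Backup\VendorDb.bak';  -- a full backup restarts the log chain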
I need to write a process to get the file size in KB and the record count of a file. I was planning on writing a C# console app that takes the file path and name as parameters, but should I use a CLR assembly instead?
I can't put a script in the SSIS package when it's bringing the file down, because it has been deemed that we only use SSIS for file consumption.
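If xp_cmdshell is an option, neither a console app nor CLR is strictly required. A minimal sketch (the path is a placeholder):

    -- Record count: find /c /v "" counts every line in the file.
    EXEC master.dbo.xp_cmdshell 'find /c /v "" "D:\feeds\myfile.txt"';
    -- File size: parse it out of a dir listing.
    EXEC master.dbo.xp_cmdshell 'dir "D:\feeds\myfile.txt"';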
For a database, we have 4 data files in a particular file group and the file sizes are almost 70 GB each.
Would I run into any performance issues if I create/pre-allocate an additional data file in the same filegroup, so that the existing files don't grow too much?
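Pre-allocating up front is generally the cheaper moment to pay the cost, since it avoids autogrow pauses later. A minimal sketch (names, path, and sizes are placeholders):

    ALTER DATABASE MyDb
    ADD FILE (
        NAME = N'MyDb_Data5',
        FILENAME = N'E:\Data\MyDb_Data5.ndf',
        SIZE = 70GB,
        FILEGROWTH = 1GB
    ) TO FILEGROUP [DataFG];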
I have a package in which there is only one Data Flow Task, and it has only three components: 1) a source, which is a SQL DB; 2) a destination; and 3) an OLE DB Destination error output going to a flat file. I want the error file to be created ONLY if there is an error while loading the data into the destination DB. But the issue is that the error flat file is being created in spite of there being no error while loading the data from source to destination.
I'm copying files to a folder with the naming convention as follows in the source folder:
CM_ABC_MY_TEST.txt
In the destination folder, this filename needs to appear as:
CM_XYZ_MY_TEST.txt
In my File System Task, I'm pretty sure I'm going to need an expression with a replace, substring, etc., but I'm having a hard time nailing down the exact syntax.
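For what it's worth, a sketch of the shape such an expression usually takes (the variable name @[User::SourceFileName] is an assumption, holding just the file name):

    REPLACE(@[User::SourceFileName], "CM_ABC", "CM_XYZ")

evaluated on a variable that feeds the File System Task's destination path.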
I need to know how I can get the dynamically created filename from the Flat File destination, for insert into a package audit table.
Scenario: I have created a package that successfully outputs dynamically named flat files { Format: C:\Test\'Comms_File_' + 'User::FileNumber' + '_' + Date + '.txt',
e.g. Comms_File_1_20150724.txt, Comms_File_2_20150724.txt, etc. } using a Foreach Loop Container:
* Enumerator set to "Foreach ADO Enumerator", with the ADO object source variable selected to identify how many total loop iterations there are, i.e. let's say 4, thus 4 files to be created.
* Variable Mappings: added User::FileNumber - indicates which file number the current loop iteration is, i.e. 1, 2, 3, 4.
For the Data Flow Task I have an OLE DB Source and a Flat File Destination, where the Flat File ConnectionString is set up as:
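Whatever the exact expression, one common way to land the generated name in an audit table is an Execute SQL Task after the data flow, with the same expression-driven variable (or the connection manager's ConnectionString property) mapped to a parameter. A sketch of the insert it could run (table and column names are placeholders):

    INSERT INTO dbo.PackageAudit (package_name, file_name, created_at)
    VALUES (?, ?, GETDATE());   -- ? parameters mapped to System::PackageName and the file-name variable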
In my script task I have the following code. The task I'm trying to accomplish is: if the filename on the FTP server can be found in the local archive folder on the e: drive, then show the message "FileAlreadyThere" (I will ultimately change it to do nothing); if the filename on the FTP server cannot be found in the local archive folder on the e: drive, then transfer the file to the local package folder on the d: drive.
While the script task was executing I watched it closely, and the problem I saw is this: if some files on the FTP server are already in the local archive folder and some are not, then the files which are already in the archive folder are dumped to the package folder, and after that the files which are not in the archive folder are also dumped to the package folder. But I only want the new files on the FTP server to be transferred to the package folder for further processing.
Then after this finished, I saw all the files in the package folder being refreshed one after another; after the first round of refreshes a second round started, and after the second round finished it stopped. I could tell it was refreshing because the 'Date Modified' of each file changed. And I saw the script task turn green.
I don't see how the code below produced this result. Is something wrong in the logic of the loop? Does anyone have any idea why it's behaving the way it is now, and how to change the code to accomplish what I want? Thanks a lot!!
    Imports System
    Imports System.IO
    Imports System.Data
    Imports System.Math
    Imports Microsoft.SqlServer.Dts.Runtime

    Public Class ScriptMain

        Public Sub Main()

            ' Set up an FTP connection manager at run time.
            Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
            cm.Properties("ServerName").SetValue(cm, "ftp2.name.com")
            cm.Properties("ServerUserName").SetValue(cm, "username")
            cm.Properties("ServerPassword").SetValue(cm, "password")
            cm.Properties("ServerPort").SetValue(cm, "21")
            cm.Properties("Timeout").SetValue(cm, "0")
            cm.Properties("ChunkSize").SetValue(cm, "1000") ' 1000 KB
            cm.Properties("Retries").SetValue(cm, "1")

            Dim ftp As FtpClientConnection = New FtpClientConnection(cm.AcquireConnection(Nothing))
            ftp.Connect()
            ftp.SetWorkingDirectory("/directory")

            Dim fileNames() As String
            Dim folderNames() As String
            ftp.GetListing(folderNames, fileNames)

            If fileNames Is Nothing Then
                MsgBox("NoFileOnFTP")
            Else
                ' NOTE: the snippet tests against C:\temp; the post describes an e:\archive
                ' and a d:\package folder - adjust the paths accordingly.
                For Each fileName As String In fileNames
                    If File.Exists("C:\temp\" + fileName) Then
                        MsgBox("FileAlreadyThere")
                    Else
                        ' The original passed the WHOLE fileNames array here, which is why
                        ' every file was transferred (and re-transferred) on each iteration.
                        ' A one-element array downloads only the current file.
                        Dim oneFile() As String = {fileName}
                        ftp.ReceiveFiles(oneFile, "C:\temp", True, True)
                    End If
                Next
            End If

            ftp.Close()
            Dts.TaskResult = Dts.Results.Success
        End Sub

    End Class
Similar to a previous post (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=244646&SiteID=1), I am trying to import data into a SQL Table.
I am trying to program a small application that will import product data obtained from suppliers via CD-ROM. One supplier in particular uses fixed-width columns, and the data looks like this:
Example of Data
0124015Apple Crate 32.12
0124016Bananna Box 12.56
0124017Mango Carton 15.98
0124018Seedless Watermelon 42.98

My table would then have:
ProductID as int
Name as text
Cost as money
How would I go about extracting the data with an XML format file? I am stumbling over how to tell it where to start picking up the data for a specific column. Is there any way to trim the Name column (i.e.: "Mango Carton " --> "Mango Carton")?
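If the format file proves stubborn, one sidestep is to bulk-load each fixed-width line into a one-column staging table and carve the columns out in T-SQL, which also handles the trimming. A minimal sketch (the path and the offsets are guesses from the sample rows; adjust to the real layout):

    CREATE TABLE #raw (line VARCHAR(200));

    BULK INSERT #raw FROM 'C:\data\products.txt' WITH (ROWTERMINATOR = '\n');

    INSERT INTO dbo.Products (ProductID, Name, Cost)
    SELECT CAST(SUBSTRING(line, 1, 7) AS INT),       -- e.g. 0124017
           RTRIM(SUBSTRING(line, 8, 20)),            -- e.g. 'Mango Carton', trailing spaces trimmed
           CAST(SUBSTRING(line, 28, 8) AS MONEY)     -- e.g. 15.98
    FROM #raw;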
I don't know if it makes any difference, but I've been calling SQL from my code by doing this:
Code in C# Form
    SqlConnection SqlConnection = new SqlConnection(global::SQLClients.Properties.Settings.Default.ClientPhonebookConnectionString);
    SqlCommand cmd = new SqlCommand();

    SqlConnection.Open();
    cmd.ExecuteNonQuery();
    SqlConnection.Close();
    RefreshData();

I am running Visual Studio C# Express 2005 and SQL Server Express 2005.
I need to find out the number of columns in a flat file before I process that particular file. I have the file name in the @filename variable and the file path in the @filepath variable, but I do not know how to check the columns before I process the file.
@filePath = C:DatabaseSourceFilesCAHCVSSourceFiles, and I am using a Foreach Loop Container to read the files one by one and put the file name in the @filename variable; my file name is like
Now what I have to do is make sure that ID, Name, City, County, Phone are there in the flat file; if they are not, then I have to send mail to the client saying that the file is not valid. I also need to calculate the size of the flat file.
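A minimal sketch of the header check, assuming the first row of the file carries the column names (the path and file name are placeholders):

    CREATE TABLE #hdr (line VARCHAR(4000));

    -- LASTROW = 1 pulls only the header line into the staging table.
    BULK INSERT #hdr FROM 'C:\DatabaseSourceFiles\somefile.csv'
    WITH (LASTROW = 1, ROWTERMINATOR = '\n');

    IF NOT EXISTS (SELECT 1 FROM #hdr WHERE line LIKE 'ID,Name,City,County,Phone%')
        RAISERROR('File is not valid - header mismatch.', 16, 1);  -- hand off to the mail-the-client step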
The TEMPDB transaction log file keeps growing. The database server is new, and the transaction log was presized to 1 GB on installation. After installing a number of databases, the log file grew over a day to 38 GB. Issuing a manual checkpoint was the only way to free some space and allow it to be shrunk back to a usable size. The usage of the file is still going up.
I am struggling to find what process is causing the log to be used so heavily. Looking at log_reuse_wait_desc for tempdb returns "NOTHING", and tempdb itself isn't being used very much or growing in size.
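A sketch of the usual diagnostic: find which open transactions are generating tempdb log, ordered by log bytes used:

    SELECT st.session_id,
           at.transaction_begin_time,
           dt.database_transaction_log_bytes_used
    FROM sys.dm_tran_database_transactions dt
    JOIN sys.dm_tran_session_transactions st ON st.transaction_id = dt.transaction_id
    JOIN sys.dm_tran_active_transactions  at ON at.transaction_id = dt.transaction_id
    WHERE dt.database_id = 2   -- tempdb
    ORDER BY dt.database_transaction_log_bytes_used DESC;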
On one server we had file growth, and we had to add a new hard drive and a new data file on it. Now we have a new server with a huge hard drive, but all the files remain. Can I consolidate these files into one data file, or not?
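For secondary data files this is usually possible. A minimal sketch (database and logical file names are placeholders); note the primary file itself cannot be removed:

    USE MyDb;
    -- Migrate the data out of the extra file into the remaining files of the filegroup...
    DBCC SHRINKFILE (N'MyDb_Data2', EMPTYFILE);
    -- ...then drop the now-empty file.
    ALTER DATABASE MyDb REMOVE FILE MyDb_Data2;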