I have a package that reads a table containing a list of files, arriving every night, that I need to load into a table. These files range from 50 MB to 1.5 GB. The entire process to load and transform takes about 50 minutes, which is about a 300% increase over our current production environment. I am now looking at ways to improve performance by loading all of the data into the staging table at the same time.
Is this possible, and do you guys think it would improve performance significantly?
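To give a concrete picture of the parallel-load idea, here is a minimal sketch (table, column, and file names are made up): several BULK INSERT statements, each run from its own connection or data flow at the same time, loading into a staging heap with TABLOCK so the loads take bulk-update locks and can proceed in parallel.

-- Hypothetical staging heap: no clustered or nonclustered indexes,
-- so concurrent bulk loads with TABLOCK can run side by side.
CREATE TABLE dbo.StagingSales
(
    SaleID   int           NOT NULL,
    SaleDate datetime      NOT NULL,
    Amount   decimal(18,2) NOT NULL
);

-- Each statement could run from a separate connection/data flow concurrently.
BULK INSERT dbo.StagingSales
FROM '\\fileserver\drop\sales_file_01.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);

BULK INSERT dbo.StagingSales
FROM '\\fileserver\drop\sales_file_02.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);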
I am trying to load a text file into a temporary table in MS-SQL using DTS.
The file is coming from a unix machine and contains comma-delimited entries.
The DTS script runs every minute and the transformation maps values in the text file into corresponding columns in the temp table.
The problem is that the sequence of the entries in the text file does not match the sequence of the values imported.
I have several instances of this script (all identical) but only one file seems to be out of whack. Does anyone know what could be causing this? This is for a real-time trading system so the sequence has to be perfect.
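One detail that may matter here: SQL Server tables have no inherent row order, so rows only come back in a guaranteed sequence if something orders them explicitly. A minimal sketch (column names are made up) of giving the temp table an IDENTITY column so the arrival order can be reproduced on read:

-- Hypothetical temp table: LoadSeq records the order in which rows were inserted.
CREATE TABLE dbo.TradeFeedTemp
(
    LoadSeq  int IDENTITY(1,1) NOT NULL,
    Symbol   varchar(20)       NOT NULL,
    Price    decimal(18,4)     NOT NULL,
    Quantity int               NOT NULL
);

-- Reading back in insertion order requires an explicit ORDER BY.
SELECT Symbol, Price, Quantity
FROM dbo.TradeFeedTemp
ORDER BY LoadSeq;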
In a data flow there is a Flat File Source that loads a file with 500,000 rows, but when I execute the package it only loads 249,999. I have verified the file and everything looks fine; it is a CSV and MS Excel opens it perfectly.
I'm wondering if there is any technical limitation regarding the number of rows per file?
And what else can I check to track down the problem?
I am at a turning point with MS and SQL v6.5. I really need to make a decision about large-scale growth of a project. I was hoping people could respond with their experiences with SQL 7, with some rough load numbers (concurrent connections, DB size, transactions/sec or per day). Also, is anyone using this product with MTS?
Production numbers only please. So many people have worked with the Beta and are doing testing, but it doesn't really count until it is in production.
Is it a requirement that the OLE DB provider implement the IRowsetChange interface so that it can be used to configure an OLE DB destination?
Is there any way to configure a destination for bulk and faster loads? The normal OLE DB destination via IRowsetChange loads one row at a time (InsertRow()).
I recently got a new computer and reinstalled Visual Studio 2005, SQL, and all the goodies. When I build an SSIS project with multiple packages and run it, VS loads all of the packages in the project into the IDE before execution. None of the packages are related and I only want the current package to be loaded. I don't remember it working this way before. I've looked through the package, project, solution, and VS options and can't find anything that might control this behavior.
Hello, I have a FormView, and when I load it I would like to check a variable "Cover" from the database to see whether or not it is empty. But how on earth do I get the variables from the SqlDataSource in my function FormView1_Load(...)?
I have a database that's 2.5 GB but only has about 17 MB of actual data. I've set up a standby server that I load my dumps into. The load takes about 10 minutes. The dump takes about a minute and a half (which also seems slow to me for that small amount of data). I don't expect that it should take that long to load 8,800 pages into a database. The standby server is the same hardware as the production server (single 500 MHz Xeon, 2 GB RAM, RAID 5). The server has only a single RAID 5 array to store the OS and all the SQL data; however, I still don't think it should take that long to load. Let me know what you think.
Hi guys, I've created a web application using ASP together with SQL Server as our db source, running through IIS 6 on a Windows Server 2003 platform. This application retrieves a list of customer codes from our db, so records returned could be as many as 2000+ for any single transaction. The application runs fine for users from the same state. However, our interstate colleagues have noticed that it takes more than 3-4 minutes for the page to load, while it only takes me < 2 secs to load. Our intranet server is located in the same state as I am, so anyone from within this state has no problems loading the page. All other states are finding it unbearable. I've done some debugging, and it appears to be a server factor. I saved the page with the longest list to a local drive and opened it locally in IE, and it loads quickly. Does anyone have any suggestions as to how to speed this application up for our interstate users? Any ideas would be appreciated. Thanks, Shawn
When I attempt to load a database from dump format across a network (100 Mb Ethernet) it takes forever (15 hours for 16 GB!). Can anyone help me find a starting point to troubleshoot this?
Thanks!
-Chris
P.S. File copies of the same size move along at a rapid pace, and I cannot find any bottlenecks in the network.
I have a table that's about 3 GB; using this table and a few others I'm building another table. The problem is that while building the new table my transaction log inflates so much that I'm running out of disk space. What can I do to prevent this, or to keep the transaction log size under control?
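If the new table is always rebuilt from scratch, one option worth considering (a sketch only, assuming the database can temporarily leave full recovery; table and column names are placeholders) is to switch to bulk-logged recovery and build it with SELECT ... INTO, which is minimally logged, then switch back and take a log backup:

-- Bulk-logged recovery makes SELECT ... INTO minimally logged
-- (point-in-time recovery is limited until the next log backup).
ALTER DATABASE MyDB SET RECOVERY BULK_LOGGED;

-- Build the new table with a minimally logged operation instead of INSERT ... SELECT.
SELECT s.CustomerID, s.OrderDate, s.Amount, c.Region
INTO   dbo.NewSummaryTable
FROM   dbo.BigSourceTable AS s
JOIN   dbo.Customers      AS c ON c.CustomerID = s.CustomerID;

-- Switch back and back up the log to re-establish the log chain.
ALTER DATABASE MyDB SET RECOVERY FULL;
BACKUP LOG MyDB TO DISK = N'D:\Backups\MyDB_log.trn';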
We have some tables that are bulk-loaded every day and they do not have RI to the other tables in the database.
To ease pressure on the logs, I had the idea of spinning them off to another database on the same AG in simple or bulk-logged recovery model, and using synonyms to point to them so the code base would not need changing.
I know an earlier bug existed in 2005 that basically made the query optimizer ignore indexes if a table was accessed via a synonym.
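For reference, the synonym idea looks roughly like this (database and table names are made up):

-- The staging database in simple or bulk-logged recovery holds the bulk-loaded table.
-- In the main database, a synonym keeps the existing code base working unchanged.
USE MainDB;
GO
CREATE SYNONYM dbo.DailyFeed FOR StagingDB.dbo.DailyFeed;
GO
-- Existing queries keep compiling and running exactly as before.
SELECT COUNT(*) FROM dbo.DailyFeed;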
I inherited an SSIS package that is rather simple. It grabs data from a SQL Query and then loads it into a SQL table. The first step of this process TRUNCATES the destination table and then reloads for the current year. This table has over a million rows and the DB SOURCE that we are pulling from is not in our domain, so one can imagine how long this takes.
This process is working fine (given the 45 minutes it takes to repopulate data in the DESTINATION table), but what I really need is a way to load only the rows that are NEW and UPDATED. I would also need functionality to DELETE the rows that have been removed (sounds like a MERGE, right?). I tried using the MERGE and MERGE JOIN transformations, but these transformations seem to be different from the T-SQL MERGE statement. MERGE seems like a slow UNION, and MERGE JOIN only seems to work with SELECTs.
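For comparison, this is roughly what the T-SQL MERGE statement would express (table and column names are placeholders), run against a staging copy of the source rows instead of truncating the destination:

MERGE dbo.DestinationTable AS tgt
USING dbo.StagingTable     AS src
      ON tgt.BusinessKey = src.BusinessKey
WHEN MATCHED AND (tgt.Amount <> src.Amount OR tgt.Status <> src.Status) THEN
    UPDATE SET tgt.Amount = src.Amount,
               tgt.Status = src.Status
WHEN NOT MATCHED BY TARGET THEN
    INSERT (BusinessKey, Amount, Status)
    VALUES (src.BusinessKey, src.Amount, src.Status)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;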
Dim cnn As ADODB.Connection
Dim rst As ADODB.Recordset

Private Sub Form_Load()
    ' Open the connection to the Library database on the SCHS-SQL server.
    Set cnn = New ADODB.Connection
    cnn.ConnectionString = "driver={SQL Server};" & _
                           "server=SCHS-SQL;uid=sa;pwd=sa;database=Library"
    cnn.Open
    Call loadrst
End Sub

Public Sub loadrst()
    ' Load all books, ordered by serial number, into the recordset.
    Set rst = New ADODB.Recordset
    Dim sql1 As String
    sql1 = "select * from Books order by srno"
    rst.Open sql1, cnn, adOpenDynamic, adLockOptimistic, adCmdText

    If rst.EOF = True Then
        MsgBox ("No records are present")
        Command1.Enabled = False
    Else
        Call display
        Command1.Enabled = True
    End If
End Sub
This is the code I use to connect my VB6 application to SQL Server 2005. I had started out lately trying to use SQL Server instead of Access. So far none of the programs have given any problems, as their databases have at most 120 records. But the one this code connects to has about 5,200 records. I had imported the tables from Access into SQL Server. The size of the database was around 17.67 MB, so I shrank it and it became 4 MB. But it still takes roughly 2 minutes for the user to see the records in the grid. Could you tell me what to do?
I have a scenario where a process loads data into a SQL Server 2012 database, doing some manipulation on the data like sorting, aggregation, etc. Once this process is completed it does not free up the tempdb space. If I restart the database, it does.
Is there any way (apart from shrinking) to release space in tempdb, like writing some post-load SQL queries to delete/truncate the data and logs from tempdb?
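Before resorting to a shrink, it may help to see what is actually holding the space; a small sketch using the standard tempdb space-usage DMVs:

-- Free vs. used pages across the tempdb data files.
SELECT SUM(unallocated_extent_page_count)       AS free_pages,
       SUM(user_object_reserved_page_count)     AS user_object_pages,
       SUM(internal_object_reserved_page_count) AS internal_object_pages,
       SUM(version_store_reserved_page_count)   AS version_store_pages
FROM tempdb.sys.dm_db_file_space_usage;

-- Sessions still holding tempdb space (open temp tables, sorts, spools, etc.).
SELECT session_id,
       user_objects_alloc_page_count,
       internal_objects_alloc_page_count
FROM tempdb.sys.dm_db_session_space_usage
ORDER BY internal_objects_alloc_page_count DESC;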
We have a massive database with an almost massive amount of traffic to and from it.
I've been requested to implement sliding window partitioning with 2 partitions, an active and a passive one. I managed to test this on a very small testbed last month.
I have currently moved 97k onto the partition function, leaving me another 26k to go.
I'm using the following stored procedure to implement the sliding window
CREATE PROCEDURE [dbo].[ManageFactSlidingWindow]
(
    @pFunction nvarchar(max),
    @pSchema   nvarchar(max),
    @FG        nvarchar(max),
    @moveDays  int
)
/*****************************************************************************
 PROCEDURE NAME: [ManageFactSlidingWindow]
 AUTHOR:         Arshad Ali
 CREATED:        02/24/2013
 DESCRIPTION:    This stored procedure manages the sliding window for the partitioned table

 VERSION HISTORY:
 DATE        EMAIL        Company        DESCRIPTION
[Code] .....
When I try to move the partition even a single day I get loads of locks.
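For context, the general sliding-window pattern being followed is switch-out, merge, then split (a simplified sketch with made-up object names; the real procedure takes the function, schema and filegroup as parameters):

-- 1. Switch the oldest partition out to an empty table with the same structure.
ALTER TABLE dbo.FactSales
    SWITCH PARTITION 1 TO dbo.FactSales_Archive;

-- 2. Merge away the now-empty boundary at the old end.
ALTER PARTITION FUNCTION pfFactSales()
    MERGE RANGE ('2013-01-01');

-- 3. Tell the scheme which filegroup the next split should use, then add the new boundary.
ALTER PARTITION SCHEME psFactSales
    NEXT USED [FG_Active];
ALTER PARTITION FUNCTION pfFactSales()
    SPLIT RANGE ('2013-03-01');

These steps are metadata-only, but each one needs a brief schema-modification (Sch-M) lock on the table, so they queue behind any long-running readers or writers, which may be where the blocking comes from.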
Hello. I have a 32-bit SQL 2005 (SP1) server that is my current Production server. I also have a 64-bit SQL 2005 (SP1) server that will become my Production server. I have several SSIS packages that load/refresh data on a nightly basis to a few of my databases from DB2 (MVS). I have the packages set up on the current Production server (32-bit) and all is working well through the Microsoft OLE DB Provider for DB2. However, on the 64-bit server I am experiencing some issues with the SSIS packages failing on large loads. Loads of tables with 500K of data, or less, seem to run without issue (through SQL Agent jobs), but larger table loads are failing.
I do have a linked server set up on the 64-bit server to the 32-bit server, for other processes. And because of this I have lightweight pooling turned off on the 64-bit server (because of distributed querying). Lightweight pooling is turned on on the 32-bit server. Could this be what is causing some of my issue? Since I don't have the lightweight pooling option turned on (on the 64-bit server), am I not getting the proper amount of through-put for my 8 dual core CPU server?
I'm having trouble getting off the ground with the Web application walkthrough "Walkthrough: Creating a Web Application Using Visual C# or Visual Basic" in VS.NET Pro 2002 [Academic] documentation. After a bit of fishing around, and consulting the MS Knowledge Base, I got the pubs database installed. I also got the connection to work well enough that the dataset would fill in the IDE.
The problem is that when I try to run the web form, either from the IDE debug menu or by accessing the .aspx file on localhost using Firefox, I get the error "SELECT permission denied on object 'titles', database 'pubs', owner 'dbo'" showing in the browser. My understanding is that this page is running as ASPNET, and I did already carry out the recommended commands to enable access:
C:\>osql -E -S MY-MACHINE-NAME\VSDOTNET -Q "sp_grantlogin 'MY-MACHINE-NAME\ASPNET'"
C:\>osql -E -S MY-MACHINE-NAME\VSDOTNET -d Pubs -Q "sp_grantdbaccess 'MY-MACHINE-NAME\ASPNET'"
both of which returned successfully. Any suggestions as to what else I should do to get the necessary permissions to actually display the data in my browser? Does the IIS user account need permission also?
Thanks for any insight into this vexing problem. I must say that along the way I have had some fun exploring the osql command-line tool. Using the -E switch, I have been able to run select and update queries, but this is all pretty much fishing in the dark. I would like to get back to actually working through the walkthroughs in the Visual Studio documentation.
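For anyone spotting what is missing: once the login and database user exist, something still has to grant SELECT inside pubs, along these lines (a sketch; the machine name is a placeholder):

USE pubs
GO
-- Either grant SELECT on the specific table the page reads...
GRANT SELECT ON dbo.titles TO [MY-MACHINE-NAME\ASPNET]
GO
-- ...or put the ASPNET user in the built-in read-only role for the whole database.
EXEC sp_addrolemember 'db_datareader', 'MY-MACHINE-NAME\ASPNET'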
We're experiencing the following error regularly: "I/O error 1450(Insufficient system resources exist to complete the requested service.)"
We are running SQL2000 with 2GB memory, Windows 2000 AS, 4 CPUs. We are using a SAN storage connected by 2 Fibre cards. The databases range from 10GB to 400GB in size for decision support applications.
Although it seems clear that the disk subsystem is causing this error, our hosting party is blaming the application layer for this behaviour.
From the SQL Server log:
I/O error 1450 (Insufficient system resources exist to complete the requested service.) detected during read at offset 0x00000350860000 in file 'O:\DATA\SA_Data.MDF'. Error: 823, Severity: 24, State: 2
I/O error 1450 (Insufficient system resources exist to complete the requested service.) detected during read at offset 0x00000350862000 in file 'O:\DATA\SA_Data.MDF'. Error: 823, Severity: 24, State: 2
From the event log: [..] dmio: Harddisk37 read error at block 23192007: status 0xc000009a dmio: Harddisk35 read error at block 23192135: status 0xc000009a dmio: Harddisk36 read error at block 23192127: status 0xc000009a dmio: Harddisk36 read error at block 23192263: status 0xc000009a [etc] dmio: Disk Harddisk31 block 23193791 (mountpoint O:): Uncorrectable read error [..]
I currently have a single hard-coded file path to the SSRS config file, which is parsed to provide the Reporting Services web service URL. My question is: how would I run this same query against hundreds of servers that may or may not share the same file path as the one hard-coded?
Is there a way to query the registry to find the location of the config file on any server? It could be on D, E, F, H, etc.
I know I can string together the address followed by "reports" and named instance if needed, but some instances may not have used the default virtual directory name (Reports).
Am I going about this the hard way? Is there a location where the web service URL exists in a table? I could not locate anything in the Reporting Services database. Basically I need to inventory all of my Reporting Services URLs.
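The registry route under consideration looks roughly like this (a sketch only: xp_regread is undocumented, and the exact key and value names below are assumptions that may differ between SSRS versions and instances):

DECLARE @instanceId  nvarchar(200);
DECLARE @setupKey    nvarchar(512);
DECLARE @installPath nvarchar(512);

-- Map the Reporting Services instance name to its instance ID
-- ('MSSQLSERVER' assumed here; named instances have their own value).
EXEC master.dbo.xp_regread
     @rootkey    = N'HKEY_LOCAL_MACHINE',
     @key        = N'SOFTWARE\Microsoft\Microsoft SQL Server\Instance Names\RS',
     @value_name = N'MSSQLSERVER',
     @value      = @instanceId OUTPUT;

-- The instance ID points at the setup key that should hold the install path,
-- under which rsreportserver.config normally lives.
SET @setupKey = N'SOFTWARE\Microsoft\Microsoft SQL Server\' + @instanceId + N'\Setup';
EXEC master.dbo.xp_regread
     @rootkey    = N'HKEY_LOCAL_MACHINE',
     @key        = @setupKey,
     @value_name = N'SQLPath',
     @value      = @installPath OUTPUT;

SELECT @instanceId AS InstanceId, @installPath AS InstallPath;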
I have a customer running RAID 5 on a Windows 2000 server, and one of the drives went bad. The customer replaced the drive and the RAID rebuilt it; everything seemed to be fine, but there is one database file that cannot be attached to SQL. The file is 15 GB, so I know there is information in it; the error states that the file is not a primary file. Any clue on how to fix this?
Greetings, I have just arrived back into the country (NZ) and back into ASP.NET. I am having trouble with the following: "An attempt to attach an auto-named database for file (file location).../Database.mdf failed. A database with the same name exists, or specified file cannot be opened, or it is located on UNC share." It has only begun since I decided I wanted to use IIS. I realise VWD comes with its own localhost, but since it is only temporary, I wanted a permanent shortcut on my desktop to link to my intranet page. Anyone have any ideas why I am getting the above error? I have searched many places on the internet and am not getting any closer. Cheers ~ J
I am testing some maintenance task SQL commands, such as index rebuild, index reorg, update statistics, and DB integrity check, on a SQL Server 2014 database. This is a new non-production vendor database (DB size 500 GB, log size 25 GB) which will eventually be created in production. Currently it is in the full recovery model and without log backups. The database has a whole lot of indexes. I am just trying to rebuild and reorganize all the indexes (that need it), in addition to trying to get an idea of how long these maintenance tasks will take and the space needed in the log file to complete them. I would like to execute these tasks manually (the first time) to gather the duration and space-required information. Eventually, I would probably schedule a weekly job to perform this maintenance.
I ran the index rebuild task on the database and noticed that the log file grew by over 50 GB. I killed the process and truncated and shrank the log file back down.
1. Do the index rebuild, index reorg, update statistics, and DB integrity check commands all use the log file?
2. Does an index reorg have less impact on the log file than an index rebuild?
3. Should a log truncate and log file shrink be performed after these maintenance commands?
4. Should a full database backup be performed after these maintenance commands? Or before the maintenance commands?
I have read and understand that shrinking is not good for the database (it could lead to more fragmentation and more data file growth when data is added), and I know about rebuilding indexes when fragmentation is greater than 30% and reorganizing indexes when fragmentation is greater than 5% and at most 30%.
Since this is a non-production database maybe I should set the recovery model to simple, run the maintenance commands and leave the database in simple recovery model unless the vendor needs it in full recovery model for some unknown reason.
5. With the simple recovery model the log file should be reused in a circular manner and not grow during these maintenance tasks. Is this correct?
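For what it's worth, the rough sequence for the manual run might look like this (a sketch, assuming simple recovery is acceptable to the vendor; object and file names are placeholders, and in practice the index commands would be driven by sys.dm_db_index_physical_stats rather than a single hard-coded table):

-- Full backup first, so there is a clean restore point before the maintenance.
BACKUP DATABASE VendorDB TO DISK = N'D:\Backups\VendorDB_pre_maint.bak';

-- Simple recovery: the log truncates on checkpoint, so it should not keep growing
-- across the whole run (each individual rebuild still needs log space while it runs).
ALTER DATABASE VendorDB SET RECOVERY SIMPLE;

-- Example maintenance commands for one table.
ALTER INDEX ALL ON dbo.SomeLargeTable REBUILD;                        -- fragmentation > 30%
ALTER INDEX IX_SomeLargeTable_Col1 ON dbo.SomeLargeTable REORGANIZE;  -- fragmentation 5-30%
UPDATE STATISTICS dbo.SomeLargeTable;

DBCC CHECKDB (VendorDB) WITH NO_INFOMSGS;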
I need to write a process to get the file size in KB and the record count of a file. I was planning on writing a C# console app that takes the file path and name as parameters; however, should I use a CLR?
I can't put a script in the SSIS package when it's bringing the file down, because it has been deemed that we only use SSIS for file consumption.
For a database, we have 4 data files in a particular file group and the file sizes are almost 70 GB each.
Will I run into any performance issues if I create/pre-allocate an additional data file in the same filegroup so that the existing files don't grow too much?
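For reference, the plan is along these lines (file name, path, and sizes are placeholders); because SQL Server uses proportional fill within a filegroup, a pre-sized empty file will take a larger share of new allocations until the files even out:

ALTER DATABASE MyDB
ADD FILE
(
    NAME       = N'MyDB_Data5',
    FILENAME   = N'E:\SQLData\MyDB_Data5.ndf',
    SIZE       = 50GB,
    FILEGROWTH = 1GB
)
TO FILEGROUP [DataFG];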
I have a package in which there is only one Data Flow Task, and it has only three components: 1) a source, which is a SQL DB, 2) an OLE DB destination, and 3) a flat file destination for the error output. I want the error file to be created ONLY if there is an error while dumping the data into the destination DB. However, the error flat file is being created in spite of there being no error while dumping the data from source to destination.
I'm copying files to a folder with the naming convention as follows in the source folder:
CM_ABC_MY_TEST.txt
In the destination folder, this filename needs to appear as:
CM_XYZ_MY_TEST.txt
In my File System Task, I'm pretty sure I'm going to need an expression with a replace, substring, etc., but I am having a hard time nailing down the exact syntax.
I need to know how I can get the dynamically created filename from the Flat File destination for insert into a package audit table.
Scenario: I have created a package that successfully outputs dynamically named flat files { Format: C:\Test\'Comms_File_' + 'User::FileNumber' + '_' + Date + '.txt'
E.g. Comms_File_1_20150724.txt, Comms_File_2_20150724.txt, etc. } using a Foreach Loop Container:
* Enumerator set to "Foreach ADO Enumerator" with the ADO object source variable selected to identify how many total loop iterations there are, i.e. let's say 4, thus 4 files to be created
* Variable Mappings: added the User::FileNumber variable, which indicates which file number the current loop iteration is, i.e. 1, 2, 3, 4
For the Data Flow task I have an OLE DB Source and a Flat File Destination where the Flat File ConnectionString is set up as:
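For the audit step, what I had in mind is roughly this (a sketch: the audit table and the User::CurrentFileName variable are made-up names, with the variable assumed to hold the evaluated file name so an Execute SQL Task can pass it as a parameter):

-- Hypothetical audit table.
CREATE TABLE dbo.PackageFileAudit
(
    AuditID   int IDENTITY(1,1) PRIMARY KEY,
    FileName  nvarchar(260) NOT NULL,
    CreatedAt datetime      NOT NULL DEFAULT (GETDATE())
);

-- SQLStatement of the Execute SQL Task; the ? maps to User::CurrentFileName
-- on the Parameter Mapping tab (OLE DB connection).
INSERT INTO dbo.PackageFileAudit (FileName) VALUES (?);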
In my Script Task I have the following code. The task I'm trying to accomplish is: if the filename on FTP can be found in the local archive folder on the e: drive, then show the message "FileAlreadyThere" (I will ultimately change it to do nothing); if the filename on FTP cannot be found in the local archive folder on the e: drive, then transfer the file to the local package folder on the d: drive.
While the script task was executing I watched it closely, and the problem I saw is this: if some files on FTP are already in the local archive folder and some are not, then the files which are already in the archive folder are dumped to the package folder; after that, the files which are not in the archive folder are then dumped to the package folder. But I only want the new files on FTP to be transferred to the package folder for further processing.
Then after this finished, I saw all the files in the package folder being refreshed one after another; after the first round of refreshes a second round started, and after the second round finished it stopped. I could tell they were being refreshed because the 'Date Modified' of each file changed. And I saw the script task turn green.
I don't see how the code below produced this result. Is something wrong in the logic of the loop? Does anyone have any idea why it's behaving the way it is now? And how can I change the code to accomplish what I want? Thanks a lot!!
Imports System
Imports System.IO
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain

    Public Sub Main()
        ' Build the FTP connection manager at run time.
        Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
        cm.Properties("ServerName").SetValue(cm, "ftp2.name.com")
        cm.Properties("ServerUserName").SetValue(cm, "username")
        cm.Properties("ServerPassword").SetValue(cm, "password")
        cm.Properties("ServerPort").SetValue(cm, "21")
        cm.Properties("Timeout").SetValue(cm, "0")
        cm.Properties("ChunkSize").SetValue(cm, "1000") '1000 kb
        cm.Properties("Retries").SetValue(cm, "1")

        Dim ftp As FtpClientConnection = New FtpClientConnection(cm.AcquireConnection(Nothing))
        ftp.Connect()
        ftp.SetWorkingDirectory("/directory")

        Dim fileNames() As String
        Dim folderNames() As String
        ftp.GetListing(folderNames, fileNames)

        If fileNames Is Nothing Then
            MsgBox("NoFileOnFTP")
        Else
            Dim fileName As String
            For Each fileName In fileNames
                If File.Exists("c:\temp\" + fileName) Then
                    ' Already present locally; skip it.
                    MsgBox("FileAlreadyThere")
                Else
                    ' Transfer only the current file, not the whole listing
                    ' (passing fileNames here re-downloaded every file on each pass,
                    ' which is what caused the repeated refreshes).
                    ftp.ReceiveFiles(New String() {fileName}, "c:\temp", True, True)
                End If
            Next
        End If

        ftp.Close()
        Dts.TaskResult = Dts.Results.Success
    End Sub

End Class