I am trying to read a 36-byte file that contains compressed data. I create my Flat File data source and SSIS reads it fine UNTIL it hits an x00 in the file. Then it stops reading and I can't get any data after it. There is data after the x00. Here is the entire hex string: C7 C7 CF 6A 00 00 05 02 3D 03 21 01 E0 02 00 00 00 00 00 00 00 00 3D 3C 1E FD 02 C8 00 00 00 AE 41 E3 28 7C
To test, I changed the two x00 bytes at positions 5 and 6 to x01, and SSIS read until the next x00.
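One possible workaround (a minimal sketch, not a confirmed fix; the file path is a placeholder): since the Flat File source treats the embedded x00 as a terminator, read the whole file as a single binary value with OPENROWSET in T-SQL and slice the bytes from there. SINGLE_BLOB does no row or column parsing, so the null bytes survive.

-- Reads the entire 36-byte file into one VARBINARY(MAX) value,
-- embedded x00 bytes included; 'C:\data\input.bin' is hypothetical.
SELECT BulkColumn AS FileBytes
FROM OPENROWSET(BULK 'C:\data\input.bin', SINGLE_BLOB) AS f;

SUBSTRING works on varbinary, so individual fields can then be cut out of FileBytes by position.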
I already tried this in other SQL forums, but maybe I'll have some luck here.
I mainly need to restore database backups from customers. They arrive in all kinds of formats (zip, rar, gz). I'd like to be able to restore those directly from the compressed file, because I'm talking about rar files of up to 7GB, which take a while to uncompress in a separate step.
I've been working for 6 years in R&D environments, but mostly on Linux/Oracle, where this is an easy task using pipes, but I haven't found a single web page, post or even script that does this with MSSQL. The VDI is not really what I'm looking for, and neither are backup tools like SQLBackup, LiteSpeed etc., because I can't force the customers to use those.
Does anybody have any idea, or maybe even the same problem with a solution?
Hi, I am trying to learn ASP.NET. So I was trying to install SQL Server too. But I could not install it. An alert box popped up:
"SQL Server set up cannot install files to the compressed or encrypted folder: C:Program FilesMicrosoft SQL Server"
Before this, I also got a warning that my computer's hardware does not meet the hardware requirements.
I would like to know how I can install SQL Server on my computer. Or shouldn't I do that? To learn ASP.NET I want to have a database ready. If I can't use SQL Server, I may try Access or something smaller...
When your snapshot is set to be delivered via FTP and compressed in a cab file, and you add a new article to your publication and re-run the snapshot, the agent will be unable to pull the snapshot down because, for some reason, it doesn't think it's compressed. It fails to find the scripts it needs inside the cab file despite the cab file existing in the correct location.
Here is the error.
2007-07-19 09:57:29.855 Snapshot files will be downloaded via ftp
2007-07-19 09:57:29.886 Connecting to ftp site 'SQL3'
2007-07-19 09:57:29.933 The schema script 'empActive_127.sch' could not be propagated to the subscriber.
2007-07-19 09:57:29.933 Category:NULL Source: Merge Replication Provider Number: -2147201001 Message: The schema script 'empActive_127.sch' could not be propagated to the subscriber.
2007-07-19 09:57:29.933 Category:AGENT Source: SQL2SQL2005 Number: 20033 Message: The process could not retrieve file 'SQL3_CCUSA_ATLAS_SYSTEM TABLES/20070719055712/empActive_127.sch' from the FTP site 'SQL3'.
2007-07-19 09:57:29.949 Category:OS Source: Number: 12003 Message: 200 Type set to I. 200 PORT command successful. 550 SQL3_CCUSA_ATLAS_SYSTEM TABLES/20070719055712/empActive_127.sch: The system cannot find the file specified. 550 SQL3_CCUSA_ATLAS_SYSTEM TABLES/20070719055712/empActive_127.sch: The system cannot find
I think we don't have an option to read the transaction log file in SQL Server other than using Log Explorer. Does Log Explorer work well for auditing SQL Server? We are planning to buy Log Explorer. Is it a good product to buy?
I need to read a text file into a SQL Server 6.5 table. The file has variable-length fields and the fields are separated by plus signs ("+"). Any ideas? Thanks for your time.
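One sketch of a possible approach (the server, database, table, and file names are all made up): bcp's -t switch sets the field terminator, so the plus sign can be passed directly from the command line:

bcp mydb..mytable in C:\data\input.txt -c -t "+" -S myserver -U sa -P mypassword

The -c switch selects character mode; with it, the row terminator defaults to the newline.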
I have created a C2 audit .trc file in SQL Server 2000 on the Windows platform. I would like to be able to read it. I have tried the following, but I get a message that the file does not exist, is not a recognizable trace file format, or there was an error opening the file.
SELECT * FROM ::fn_trace_gettable('D:\Program Files\Microsoft SQL Server\MSSQL\Data\audittrace_20070207073931.trc', default)
GO
I know the file and the path are valid. What else could be the problem?
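One thing worth checking first (a sketch, not a confirmed diagnosis): a trace that is still running keeps its current .trc file open, and fn_trace_gettable can fail against it until the trace stops or rolls over to a new file. fn_trace_getinfo lists the active traces and their file paths:

-- Lists all running traces; the rows with property = 2 hold each
-- trace's current file path, so you can see if the audit file is in use.
SELECT * FROM ::fn_trace_getinfo(default)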
I have 2 tables, EmployeeA (English) and EmployeeB (Spanish), kept in separate mdb's. I want to add the records into a SQL Server 2005 table called StateEmployee.
Procedure:
1. Loop through 2 folders: one containing table EmployeeA in an mdb, the other containing table EmployeeB in a different mdb.
2. Pick a file from EmployeeA and EmployeeB, both at the same time.
3. Count the total number of rows in both files. If equal, proceed.
4. Compare the 'employeeid' of one row of EmployeeA to the employeeid of EmployeeB.
5. If the employee id matches, load both rows into SQL Server; otherwise write them to the error table.
6. Loop through all rows simultaneously till the end of the rows.
7. Go to the next mdb.
How do I go about this step by step? I am fairly new to SSIS. I asked my friends too, but they gave complex answers which I couldn't follow. I hope someone can give an 'easy to understand' solution with a sample.
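One pattern that keeps the SSIS side simple (a hedged sketch; every name except StateEmployee and employeeid is invented): land each mdb pair into two staging tables with ordinary data flows inside the loop, then let an Execute SQL task do the matching:

-- Matched employeeids: load the rows from both staging tables into the target.
INSERT INTO dbo.StateEmployee (employeeid, sourcetable, empdata)
SELECT a.employeeid, 'EmployeeA', a.empdata
FROM dbo.StageEmployeeA AS a
WHERE EXISTS (SELECT 1 FROM dbo.StageEmployeeB b WHERE b.employeeid = a.employeeid)
UNION ALL
SELECT b.employeeid, 'EmployeeB', b.empdata
FROM dbo.StageEmployeeB AS b
WHERE EXISTS (SELECT 1 FROM dbo.StageEmployeeA a WHERE a.employeeid = b.employeeid);

-- Unmatched rows from either side go to the error table instead.
INSERT INTO dbo.StateEmployeeErrors (employeeid, sourcetable)
SELECT a.employeeid, 'EmployeeA'
FROM dbo.StageEmployeeA AS a
WHERE NOT EXISTS (SELECT 1 FROM dbo.StageEmployeeB b WHERE b.employeeid = a.employeeid)
UNION ALL
SELECT b.employeeid, 'EmployeeB'
FROM dbo.StageEmployeeB AS b
WHERE NOT EXISTS (SELECT 1 FROM dbo.StageEmployeeA a WHERE a.employeeid = b.employeeid);

Doing the match set-based in T-SQL avoids the row-by-row comparison of steps 4-6 entirely.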
I have a challenge I am trying to overcome, so hopefully someone has come across this issue before.
I am creating a DTS package that will be scheduled to run at a certain time every day. A source folder exists that gets a set of new files every day. The DTS package will then read each file and copy the data into a load table in my database. The challenge is this:
I am trying to load files from a source folder into my load table. Within each file, the entries are in a specific format, using pipes to separate the data that goes into each column, e.g.
example of a file entry:
column1 | column2 | column3
data1 | data2 | data3
data1 | data2 | data3
data1 | data2 | data3
And now I am using DTS to specify the file format and map the columns as appropriate to my table. All this is well and good, but my problem is that each file has a different name as well as being timestamped. How do I use DTS to specify the source folder, open each file sequentially, and copy the entries into my table, inserting new data from each file into my load table as well as overwriting old data in the load table from the files in the folder? Is there a way of specifying the source folder in DTS rather than specifying the file in the menu options (in the Transform Data Task properties), or do I need to write a script for this (reading the files)?
Can someone please give me a solution and an approach for this?
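For the per-file load itself (a sketch; the table name and path are placeholders, and it assumes the first line of each file is the header row shown above), the looping script can substitute the current file name into a statement like:

-- Loads one pipe-delimited file; FIRSTROW = 2 skips the header line.
BULK INSERT dbo.LoadTable
FROM 'C:\source\somefile_20070101.txt'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', FIRSTROW = 2)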
Hi... During my web search looking for a solution I ran across SQL CE 3.5 articles. My questions about SQL CE 3.5 are: 1) Can SQL CE 3.5 handle a 4-6 GB file - read and parse it (SQL)? 2) Can SQL CE 3.5 act as a standalone client that a user can use to view a large (4-6 GB) text file, or will I need a small .NET client to read the large (4-6 GB) text file? More info: the text file will reside on the machine where SQL CE 3.5 is installed. There is no pull to get the data.
We have a package that is using a ForEach loop container to access files on a network drive. For some reason I am getting a message that the ForEach enumerator is empty and did not find any files that matched the pattern. For the pattern I left the default *.* for testing purposes. I have specified the file folder as \\remoteserver\fileshare\subfolder and also as \\remoteserver\c$\fileshare\subfolder and have gotten the same message. However, when I map a network drive and set the file folder to the network drive, it finds the files. Is this a permissions issue?
After I finish processing the file I want to move it to a new directory. Once this is deployed in production, the package will not be running under a domain account and probably won't have access to the network folder. Is there any way to specify in the connection manager itself that it should use a specific account to access the folder?
Has anyone had any trouble moving a package using an OLE DB connection manager reading dBASE IV files? While developing I never had a problem; the configuration string described in this thread (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=76237&SiteID=1) worked just fine. Since I have enabled Configurations, my package always fails when trying to read the dbf file. I've gone through just about every setting in the config file that I can think of.
Information: 0x4004300A at MyDataFlow, DTS.Pipeline: Validation phase is beginning.
Error: 0xC0202009 at MyPackage, Connection manager "MyDBASEIVConnManager": An OLE DB error has occurred. Error code: 0x80040E21.
An OLE DB record is available. Source: "Microsoft OLE DB Service Components" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
Error: 0xC020801C at MyDataFlow, OLEDBFileSource [15]: The AcquireConnection method call to the connection manager "MyDBASEIVConnManager" failed with error code 0xC0202009.
Error: 0xC0047017 at MyDataFlow, DTS.Pipeline: component "OLEDBFileSource" (15) failed validation and returned error code 0xC020801C.
Error: 0xC004700C at MyDataFlow, DTS.Pipeline: One or more component failed validation.
Error: 0xC0024107 at MyDataFlow: There were errors during task validation.
My requirement is that I have to read 2 sets of files from a folder. For example, I have to read all files starting with either 'a' or 'b' only. In the 'Foreach Loop', if I say 'a*,b*', it does not work. Instead of the comma (,), I tried colon, semicolon and pipe characters as well; none of them work. So I am using 2 loops now. But I would like to know: is there any way to do it using a single loop?
Hello, I have a table of compressed data and am looking for an efficient way to expand the data for reporting purposes. The table stores the number of hours a given contractor works, in the following fashion:
cnt | Hours
5 | 8
2 | 0
5 | 8
2 | 0
Each row represents a number of sequential days where the employee worked the same number of hours per day; once the number of hours changes, a new record is created. In this simple example, the first row shows an employee working Monday-Friday (5 days) at 8 hours each day. The second row represents the weekend (2 days), where the employee worked 0 hours.
What I need to do is explode this out to show 1 record per day. Ideally I'd like to write a function to do this as I would be linking to another table which has the start and end date for the contractor and would allow me to apply individual dates to each record based on the contractor start date through to the end date.
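A common way to explode run-length rows like these (a sketch; it assumes a runID column that records the order of the runs, and a tally table dbo.Numbers holding integers 1..N, neither of which appears in the post): join each row to the tally table so a row with cnt = 5 produces 5 rows:

-- Each (cnt, Hours) run becomes cnt one-day rows; dayNumber counts days
-- from the start, so dayNumber - 1 can be added to the contractor's start date.
SELECT ROW_NUMBER() OVER (ORDER BY h.runID, n.n) AS dayNumber,
       h.Hours
FROM dbo.ContractorHours AS h
JOIN dbo.Numbers AS n
  ON n.n <= h.cnt;

Wrapped in an inline table-valued function that takes the contractor's start date, this gives the one-record-per-day shape described above.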
I am using SQL 2005 merge replication and have pull subscribers on a low-bandwidth link.
I am compressing the snapshot into an alternate folder; files are not put into the default folder.
When I start a synchronization, I would expect the cab file to be copied to the subscriber and then the files to be extracted locally at the subscriber in order to apply the snapshot.
However, what appears to be happening is that the files are being extracted from the cab file on the publisher (in a UNC specified directory) and then copied in their uncompressed form to the subscriber - resulting in an extremely slow snapshot application.
Any ideas what I am doing wrong? I have read about the options for using FTP to transfer snapshot files, but I am not clear whether I have to use FTP in order to transmit a compressed snapshot. I don't want to use FTP unless I need to.
I just completed a copy-only compressed backup of a DB (with the FULL recovery model) on SQL Server 2012, and the resulting backup (the .bak file) is 1/100th the size of the data & log files. Is the compression in SQL Server 2012 just that good, or did something else happen that I did not catch? Below is the T-SQL to re-create the backup. The size of the data file is 750MB and the log file is 75GB, which is 95% used according to the SQLPERF command.
Is the compression in SQL Server 2012 simply that good?
BACKUP DATABASE [MYBIGOLEDB] TO DISK = N'Z:\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Backup\MYBIGOLEDB_20150611.bak' WITH COPY_ONLY, NOFORMAT, INIT, NAME = N'MYBIGOLEDB-Full Database Backup', SKIP, NOREWIND, NOUNLOAD, COMPRESSION, STATS = 10
GO
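One way to see what the backup actually contained (a sketch using msdb's backup history): compare the logical size with the on-disk size. A full backup, COPY_ONLY included, copies only the used data pages plus the small portion of log needed to make the restore consistent; it does not copy the whole 75GB log file, which on its own would explain most of the size difference.

-- backup_size is the uncompressed size of what the backup contained;
-- compressed_backup_size is what was actually written to disk.
SELECT TOP (5)
       database_name,
       backup_start_date,
       backup_size / 1048576.0            AS backup_mb,
       compressed_backup_size / 1048576.0 AS compressed_mb
FROM msdb.dbo.backupset
WHERE database_name = N'MYBIGOLEDB'
ORDER BY backup_start_date DESC;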
I have a server in our central location which is a compressed snapshot publisher. I have 2 push subscribers in remote locations on very slow WAN links. I would like the snapshot cabinet file to be uncompressed at the subscribers location rather than the publisher location. Is this possible with push subscribers? I want to manage the pushing of data to the remote subscribers from the publisher location.
I understand the default with push subscriptions is to uncompress the cabinet file at the publisher location.
I am trying to use bcp to output data to a compressed (zipped) folder. The bcp command is called from a step in a scheduled job in SQL 2005 (T-SQL), similar to:
.... where Cdata is a compressed (zipped) folder. The scheduled job seems to run without errors, but afterwards there is nothing in the compressed folder. If Cdata is a regular folder, everything works fine.
My source data is present in an XML file, which is stored in a CLOB column of Oracle. The CLOB column is compressed. I need to migrate the data to SQL 2012, uncompressing the XML.
Do I need to define an XML column in SQL Server 2012 for storing the uncompressed CLOB values?
How do I uncompress the CLOB and extract the required data from the XML using SSIS?
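On the SQL Server 2012 side (a sketch; the table and the /Order/CustomerName path are invented for illustration), an XML column can hold the uncompressed documents, and the xml value() method extracts individual fields, either in T-SQL after SSIS lands the data or in a downstream view:

-- Hypothetical landing table for the uncompressed CLOB documents.
CREATE TABLE dbo.MigratedDocs (
    DocID  INT IDENTITY(1,1) PRIMARY KEY,
    DocXml XML NOT NULL
);

-- Pull a single field out of each document; the XPath is a placeholder.
SELECT DocID,
       DocXml.value('(/Order/CustomerName)[1]', 'nvarchar(100)') AS CustomerName
FROM dbo.MigratedDocs;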
I have created a new report. What I do in the application is compress the image and save it in the database. Now I need to retrieve the compressed image and display it in the report. I have used the following code to decompress the binary data saved for the image.
I don't know what I should do after that. How do I show the picture in an SSRS report? I need to show the picture in many reports. One of my doubts is how to call this function in an SSRS report. The function accepts its input as a byte array, but in the database the column is varbinary.
Should I convert the input type of the function to varbinary instead of a byte array?
Public Function Decompress(ByVal arr As Byte()) As Byte()
    ' Assumes the application compressed the image with GZip; if it used
    ' Deflate, replace GZipStream with DeflateStream. Types are fully
    ' qualified because SSRS report code has no Imports section.
    Using gz As New System.IO.Compression.GZipStream(New System.IO.MemoryStream(arr), System.IO.Compression.CompressionMode.Decompress)
        Using outMS As New System.IO.MemoryStream()
            Dim buffer(4095) As Byte
            Dim bytesRead As Integer = gz.Read(buffer, 0, buffer.Length)
            Do While bytesRead > 0
                outMS.Write(buffer, 0, bytesRead)
                bytesRead = gz.Read(buffer, 0, buffer.Length)
            Loop
            Return outMS.ToArray()
        End Using
    End Using
End Function
I do a full backup on a daily schedule, and it is taking up a lot of space on the drive because each file is more than 25GB. I am using SQL Server 2008 R2, so I'm looking to take the backup with compression instead of an uncompressed backup. What are the impacts of compressed backups? Are there any problems with compressed backups when restoring the backup file?
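For reference (a sketch; the database name and path are placeholders): compression is a single option on the BACKUP statement, and restoring needs no extra option because RESTORE detects a compressed backup automatically. The usual trade-off is extra CPU during the backup in exchange for a smaller file that is written faster; one documented restriction is that compressed and uncompressed backups cannot share the same media set.

-- Daily compressed full backup; INIT overwrites the previous backup in the file.
BACKUP DATABASE [MyDB]
TO DISK = N'Z:\Backup\MyDB_full.bak'
WITH COMPRESSION, INIT, STATS = 10;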
I proposed on a new server that we separate data files, log files, tempdb, backups, etc. onto separate LUNs on a SAN with high-speed solid state drives. I was told that with the new solid-state SAN technology this would decrease performance, and that it does not work the same way as it did when you had RAID 5 etc. I thought that if things were carried out correctly by a SAN administrator, they would know how to configure it for optimal performance.
In the For Loop, how do I iterate from older flat files to newer flat files based on the files' timestamps? If there are older files in the folder, they should be processed first before continuing with the newer ones.
In the first step of my SSIS package I need to get files from FTP and dump them in a local directory, but it's more than that; the logic is like this: 1. If no file(s) are found, stop executing and send an email saying no file(s) were found. 2. If file(s) are found, compare them with the existing files in our archive folder; if the file(s) already exist in the archive folder, stop executing and send an email saying the file(s) already existed; if the file(s) are not in the archive folder yet, transfer them to the local directory for processing.
I know I have to use a script task to do this. I did some research and found examples for each of the above 2 steps but not both combined, so that's why I need some help here getting the logic incorporated correctly.
Thanks in advance for the help, and I apologize for the long lines of code!
example for step 1: ----------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime

Public Class ScriptMain

    ' The execution engine calls this method when the task executes.
    ' To access the object model, use the Dts object. Connections, variables, events,
    ' and logging features are available as static members of the Dts class.
    ' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
    '
    ' To open Code and Text Editor Help, press F1.
    ' To open Object Browser, press Ctrl+Alt+J.
    Public Sub Main()
        Dim cDataFileName As String
        Dim cFileType As String
        Dim cFileFlgVar As String

        ' Reset all file-type flags before inspecting the current file.
        WriteVariable("SCFileFlg", False)
        WriteVariable("OOFileFlg", False)
        WriteVariable("INFileFlg", False)
        WriteVariable("IAFileFlg", False)
        WriteVariable("RCFileFlg", False)

        ' Derive the file-type code from the 4th- and 3rd-from-last
        ' characters of the file name, then set the matching flag.
        cDataFileName = ReadVariable("DataFileName").ToString
        cFileType = Left(Right(cDataFileName, 4), 2)
        cFileFlgVar = cFileType.ToUpper + "FileFlg"
        WriteVariable(cFileFlgVar, True)

        Dts.TaskResult = Dts.Results.Success
    End Sub

    ' Locks an SSIS variable for write and assigns it a new value.
    Private Sub WriteVariable(ByVal varName As String, ByVal varValue As Object)
        Try
            Dim vars As Variables
            Dts.VariableDispenser.LockForWrite(varName)
            Dts.VariableDispenser.GetVariables(vars)
            Try
                vars(varName).Value = varValue
            Catch ex As Exception
                Throw ex
            Finally
                vars.Unlock()
            End Try
        Catch ex As Exception
            Throw ex
        End Try
    End Sub

    ' Locks an SSIS variable for read and returns its current value.
    Private Function ReadVariable(ByVal varName As String) As Object
        Dim result As Object
        Try
            Dim vars As Variables
            Dts.VariableDispenser.LockForRead(varName)
            Dts.VariableDispenser.GetVariables(vars)
            Try
                result = vars(varName).Value
            Catch ex As Exception
                Throw ex
            Finally
                vars.Unlock()
            End Try
        Catch ex As Exception
            Throw ex
        End Try
        Return result
    End Function

End Class
example for step 2: -------------------------------------------------------------------------------------------------------
' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Public Class ScriptMain
' The execution engine calls this method when the task executes.
' To access the object model, use the Dts object. Connections, variables, events,
' and logging features are available as static members of the Dts class.
' Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.
'
' To open Code and Text Editor Help, press F1.
' To open Object Browser, press Ctrl+Alt+J.
Public Sub Main()
Try
'Create the connection to the ftp server
Dim cm As ConnectionManager = Dts.Connections.Add("FTP")
It works remotely if I run it via the command prompt. But when I add this to a T-SQL job on my remote SQL instance, it runs without deleting anything. What am I missing?
Brief overview... Running Windows Server 2003 Enterprise 64-bit (all service packs and patches current) and SQL Server 2005 Enterprise Edition 64-bit, build: Microsoft SQL Server 2005 - 9.00.3054.00 (X64) Mar 23 2007 18:41:50 Copyright (c) 1988-2005 Microsoft Corporation Enterprise Edition (64-bit) on Windows NT 5.2 (Build 3790: Service Pack 2)
I cannot import any SSIS packages nor create any new folders under Stored Packages. I have googled the newsgroups and looked at BOL to no avail. HELP!!!!
I am thinking about replacing the INSERT data script files that I have with XML files. This way I can open the XML file using an XML editor, see the values in a grid, and make changes more easily. Do you see any problem with this approach? I managed to put together some code that exports a SQL table with its data to an XML file, and also code that reads the XML file's data and inserts it into a table. Now I am researching XSD, dt:datatype, DTD... (I am new to XML) in order to figure out how I can use a single XML file that will hold the SQL Server fields, the datatypes and their values. If you have links to some sample code that has anything to do with the datatype export and import I am working on, can you please share them with me? Most importantly, what do you think about the idea of using XML files vs SQL scripts? Thank you
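If the server involved is SQL Server 2005 or later (an assumption; the table name is a placeholder), FOR XML can already emit a single document holding the fields, their datatypes, and the values by generating an inline XSD:

-- XMLSCHEMA prepends an inline XSD describing the columns and their
-- datatypes, followed by the rows themselves as elements.
SELECT *
FROM dbo.MyTable
FOR XML RAW('row'), XMLSCHEMA('urn:MyTableSchema'), ELEMENTS;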