How Can I Process Multiple Binary Files Into My SQL Database?
Aug 1, 2006
I have a table with rows of file names and paths. What I'm trying to do is process each file and store it in my SQL database. I want to store the files as binary data (they are Word, Excel, and PDF files). Does anyone know a way to do this? It would be especially useful if I could do it with a console application so I can schedule it.
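A hedged T-SQL sketch of the per-file load such a console app or scheduled sqlcmd job could issue, assuming SQL Server 2005 or later; the table and path here are hypothetical:

-- Read one file from disk into a varbinary(max) column; the console app
-- would loop over the file-names table and run this once per row.
INSERT INTO dbo.FileStore (FileName, FileData)
SELECT 'C:\docs\report.pdf', BulkColumn
FROM OPENROWSET(BULK 'C:\docs\report.pdf', SINGLE_BLOB) AS f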
View 2 Replies
Feb 7, 2002
In DTS designer I need to solve the following problem:
Problem:
- Import x-number of flat files into SQL Server one at a time.
Issues:
- I can't call the package recursively to process
the files because I don't want to eat up computer resources.
(there is a possibility that there could be hundreds of files
that need to be processed and I wouldn't want to have hundreds
of package instances running).
- The file names cannot be read into a list and then processed,
because while the list is being processed more files could
be "dumped" into the folder by the mainframe. And all files
must be processed.
I thought about scheduling a job at the end of the package
to run the package again, however, I haven't figured out
how to do that and am still looking. DTS is new to me.
Thanks,
Martin
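A hedged T-SQL sketch of one workaround: a single loop that re-enumerates the folder on every pass, so files dumped in mid-run still get picked up, and only one process is ever running. It assumes xp_cmdshell is available; the folder and staging table are hypothetical.

DECLARE @f varchar(255), @cmd varchar(500)
CREATE TABLE #files (name varchar(255))
WHILE 1 = 1
BEGIN
    TRUNCATE TABLE #files
    INSERT INTO #files EXEC master..xp_cmdshell 'dir /b C:\inbox\*.txt'
    DELETE FROM #files WHERE name IS NULL OR name LIKE '%File Not Found%'
    IF NOT EXISTS (SELECT * FROM #files) BREAK      -- folder empty: done
    SELECT TOP 1 @f = name FROM #files              -- one file per pass
    SET @cmd = 'BULK INSERT dbo.Staging FROM ''C:\inbox\' + @f + ''''
    EXEC (@cmd)
    SET @cmd = 'move C:\inbox\' + @f + ' C:\done\'  -- archive so it is not re-read
    EXEC master..xp_cmdshell @cmd
END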
View 3 Replies
Sep 26, 2005
I work for a company that makes heat transfers for the imprinted apparel market. We're developing a database of merchandise images for all of our non-design inventory. Using Access, we're going to be inserting thumbnails of .psd (Photoshop) files. We're wondering if there is any way to import multiple .psd files into the SQL Server database into matching records - matching a column named "filename" against the actual filename of the file - without having to upload each file individually. We also want to be able to dump the files back out of the database for the matching records. This way, once our catalog designer has found which designs they need to put into the new catalog, it will dump the .psd files for us. The same goes for our staffer who does color separations.
Any suggestions out there? If you need me to post further details of what we're trying here, I will. This is for the bossman.
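On the SQL Server side, a hedged sketch of an import keyed on the filename column (needs SQL Server 2005+ for OPENROWSET(BULK ...); the table, column, and folder names are hypothetical):

DECLARE @fn varchar(255), @sql nvarchar(1000)
DECLARE c CURSOR FOR SELECT filename FROM dbo.Designs WHERE psd_data IS NULL
OPEN c
FETCH NEXT FROM c INTO @fn
WHILE @@FETCH_STATUS = 0
BEGIN
    -- pull the matching file off disk into the matching record
    SET @sql = N'UPDATE dbo.Designs SET psd_data = (SELECT BulkColumn '
             + N'FROM OPENROWSET(BULK ''C:\psd\' + @fn + N''', SINGLE_BLOB) AS b) '
             + N'WHERE filename = ''' + @fn + N''''
    EXEC sp_executesql @sql
    FETCH NEXT FROM c INTO @fn
END
CLOSE c
DEALLOCATE c

The export ("dump") direction would be the same loop, shelling out to bcp ... queryout for each matching row.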
View 4 Replies
Sep 13, 2000
Usually, our in-house ERP software has one database and one database file. After an upgrade from MS SQL 6.5 to MS SQL 7.0, I have a database whose properties show that it is made up of multiple data files. What is the easiest and safest method to return this database to only one data file?
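A hedged sketch of the usual route: empty the extra file into the remaining one, then drop it. The logical file name here is hypothetical (sp_helpfile shows the real one), and the primary file cannot be removed this way:

USE MyErpDb
DBCC SHRINKFILE ('MyErpDb_Data2', EMPTYFILE)    -- migrate its pages to the other file
ALTER DATABASE MyErpDb REMOVE FILE MyErpDb_Data2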
View 11 Replies
May 16, 2007
Hi there
It is obvious that putting multiple database files on different physical disks is better for performance, but what about splitting the data into different files on the same disk?
I have a database of about 20GB and only a single data file. Will I benefit from splitting this file into multiple files on the same disk?
View 10 Replies
Nov 2, 2007
Hello, I have a table that stores binary files. When I serve them up to the user, I call the following page (serveDocument.aspx?DocumentID=xxx), which holds only the following code, and pass the document ID:

Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
    Dim DocumentID As Integer = Convert.ToInt32(Request.QueryString("DocumentID"))
    'Connect to the database and bring back the image contents & MIME type for the specified picture
    Using myConnection As New SqlConnection(ConfigurationManager.ConnectionStrings("kelly_lmConnectionString1").ConnectionString)
        Const SQL As String = "SELECT [Content_Type], [Document_Data] FROM [lm_Service_Detail_Documents] WHERE [Document_id] = @Document_id"
        Dim myCommand As New SqlCommand(SQL, myConnection)
        myCommand.Parameters.AddWithValue("@Document_id", DocumentID)
        myConnection.Open()
        Dim myReader As SqlDataReader = myCommand.ExecuteReader()
        If myReader.Read() Then
            Response.ContentType = myReader("Content_Type").ToString()
            Response.BinaryWrite(myReader("Document_Data"))
        End If
        myReader.Close()
        myConnection.Close()
    End Using
End Sub

It works perfectly. But here's the thing: I want users to be able to right-click on the link and choose 'Save Target As'. In IE 7 this works fine; in Firefox it saves the document as 'serveDocument.aspx'. Is there any way to inject the filename that I want the document saved as? Thank you.
View 5 Replies
Apr 16, 2006
Is there any way to store a binary or image file into a database record, or save a binary or image record back into a file, using Transact-SQL? If yes, would the "image" data type be most suitable?
(As background, I am working on a process that deals with incoming emails, and I am using SQL Mail for that purpose. The above is needed to deal with attachments that may come with emails, which are saved in a physical file on the HDD.)
Thanks a lot in advance...
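A hedged sketch of both directions in T-SQL. The read side needs SQL Server 2005+ for OPENROWSET(BULK ...); the write side shells out to bcp, which strictly needs a format file to emit the raw bytes without a length prefix. The table and paths are hypothetical:

-- file -> table (an image or varbinary(max) column)
INSERT INTO dbo.Attachments (name, body)
SELECT 'invoice.pdf', BulkColumn
FROM OPENROWSET(BULK 'C:\mail\invoice.pdf', SINGLE_BLOB) AS f

-- table -> file, via bcp queryout
EXEC master..xp_cmdshell 'bcp "SELECT body FROM MyDb.dbo.Attachments WHERE name = ''invoice.pdf''" queryout C:\out\invoice.pdf -T -n'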
View 3 Replies
Sep 12, 2007
I have inherited some responsibilities for which I'm not really qualified, so I'll push on through and maybe not totally fall down.
Assume 10 50GB databases, each in a single MDF file. All these MDF files reside on the C drive (the only drive on the system), running SQL 2005 in a 32-bit Windows 2003 or later, 8GB RAM.
The C drive is 6 physical disks in RAID 5, say about 1.0 TB or so. We have 4 dual-core processors on the box.
We have limited simultaneous users, initially about 8 users doing very heavy writes on all tables in any one database. Later, we have about 15 users connecting via a Web interface and doing very heavy reads and light writes. Each of the 10 or so databases has this lifecycle: heavy write for about 2 weeks (load data), then heavy read for about 1 month (research and search data), then nothing ever again (the DB is taken offline).
Of course, this is not enough information to go on, but let's just go on it anyway.
My TempDB, log (simple recovery), indexes, etc. are all on the same RAID 5 drive (C).
I have two basic questions I'd love to hear feedback on:
1. Is there any real advantage to creating 8 Data files for my database (one per processor core)?
2. Given that the hardware people here REALLY don't want to change anything, what should I fight for first:
a. Separate drive for LOG files?
b. Separate drive for TempDB?
c. Something else
Thanks in advance.
View 1 Replies
Jul 23, 2005
Hi folks,
Is it possible to store binary files in MS SQL 2000? Say I have a 100K PDF or a 150K Word document. Is it possible to store this in a field in MS SQL and pull it out somehow? We're using ColdFusion on the server, on Apache.
Thanks,
Ringo
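On SQL 2000 the image data type is the usual answer (up to 2GB per value). A hedged sketch with a hypothetical table, plus the textcopy.exe utility shipped in MSSQL\Binn for moving the blobs in and out from the command line (switches quoted from memory):

CREATE TABLE dbo.Docs (
    doc_id int IDENTITY PRIMARY KEY,
    name   varchar(255) NOT NULL,
    body   image NULL    -- on SQL Server 2005+ you would use varbinary(max)
)

textcopy /S myserver /D mydb /T Docs /C body /W "WHERE doc_id = 1" /F C:\files\doc.pdf /I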
View 3 Replies
Nov 29, 2006
Hi There,
I am new to VC# and SQL Express. I am currently trying to find a way to store user-selected files in SQL Express. I am using a column with the varbinary(MAX) data type. My current thought is to give the user an Open File dialog box and let them select the file they want to upload. On closing the dialog box, I convert the selected file into a file stream and insert it into SQL Express.
Is this method a good one? If anyone else can give me pointers or hints to better methods, please do!
Thanks,
Ke
View 4 Replies
Dec 11, 2007
Hey everyone,
I'm deploying a desktop application with SQL CE 3.5. I have a collection of files that I would like to save in a binary format in SQL CE. These files range from 1KB to 5MB. I know I cannot use varbinary(max); instead I am limited to varbinary(8000), but this obviously comes up short of 5MB. There are articles suggesting the image data type as a way around this. Is that just bad practice? Should I keep the files on the file system, or take advantage of this workaround? Do you have any suggestions?
Thanks in advance,
-- Chris
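A hedged sketch of the image-column workaround (the table name is hypothetical); in SQL CE, image holds up to 2GB, well past the varbinary(8000) cap, and is a documented type there rather than a hack:

CREATE TABLE Files (
    Id INT IDENTITY PRIMARY KEY,
    Name NVARCHAR(260),
    Data IMAGE    -- SQL CE's large-binary type; varbinary stops at 8000 bytes
)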
View 10 Replies
Dec 11, 2005
I need to write code to access many image files and wave files and display them on the webpage. Both kinds of files are stored on the server (as many experts in this forum advised) and their locations are kept in the SQL Server database. As for the .wav files, I think I will have to convert them to MP3 first for fast delivery over the internet. Since I am a newbie, any help would be appreciated.
Thanks,
View 14 Replies
Feb 16, 2004
I need to store a BMP or zipped file in a field of an MS SQL DB. I read the file using VB6 or .NET, and my idea is to store it in binary mode. The files could be more than 12 MB. Which kind of field could I use?
View 1 Replies
Sep 27, 2006
First of all, I do not know whether this is the right forum to ask the question. Let me describe the scenario I am using. I am generating XML files at a particular place and sending them to a server:
xml1 -> dataset1 -> adapter1.Update(dataset1)
xml2 -> dataset2 -> adapter2.Update(dataset2)
xml3 -> dataset3 -> adapter3.Update(dataset3)
All three updates should happen in one transaction; if any one of the updates fails, the whole transaction should roll back. Can anyone tell me a way to do it? I am desperately in search of any way to do this. Can anybody help, please?
View 2 Replies
Nov 19, 2006
Hi, I am new to SQL Express and am trying to work around the 4GB size limitation. Is there a possibility to create a new database file every time I reach the limit? How can I do that with C#? How can I create a new database file every time it gets full? Can I be connected to two database files at the same time (the full DB and the new DB)?
Thanks in advance,
Oren
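A hedged note: the 4GB cap in SQL Server 2005 Express is per database, not per instance, so the usual workaround is an overflow database rather than a second file. A minimal sketch with hypothetical names; one connection can query across both:

CREATE DATABASE MyData2

SELECT * FROM MyData1.dbo.Orders
UNION ALL
SELECT * FROM MyData2.dbo.Orders

From C#, both the CREATE DATABASE and the cross-database query are just SqlCommand text run against the same instance.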
View 1 Replies
Aug 29, 2007
I have several databases that have grown to 300 GB and would like to distribute the data into multiple files across multiple drives. Can I create a new database that is spread across the new drives and use a full backup to restore or am I stuck with unloading the data table by table?
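A full backup restores to the same file layout it was taken with, so a restore alone won't respread the data. A hedged alternative sketch: add files on the new drives and push pages onto them (logical and physical names are hypothetical; the primary file can never be fully emptied, and rebuilding large indexes afterwards helps rebalance):

ALTER DATABASE BigDb ADD FILE (NAME = BigDb_d2, FILENAME = 'E:\data\BigDb_d2.ndf')
ALTER DATABASE BigDb ADD FILE (NAME = BigDb_d3, FILENAME = 'F:\data\BigDb_d3.ndf')
DBCC SHRINKFILE ('BigDb_d1', EMPTYFILE)    -- redistributes pages onto the new files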
View 3 Replies
Jul 11, 2006
Hi
When images are uploaded and stored directly in a SQL database as binary data (e.g. in the Club starter kit), how can those images be accessed and displayed?
When I open the images table in VWD and select display data, the cells holding the image data show a <binary data> tag. What I want to be able to do is get at that data, or actually get at the image so that it is displayed. My reason is this: at the moment, the only way to access the images in the SQL database after they have been uploaded is to log into the website and view them as an administrator of the site. It would be much simpler if I could access the database directly and view the contents of the images table.
Any ideas?
Thanks
View 2 Replies
Apr 29, 2015
We run Std 2008 R2. I need to recreate flat files from their varbinary(max) equivalents in our DB. I have a mix of Excel, PDF, Word etc. to recreate. Will SSIS be a good tool for doing this? I'm wondering what transform(s) would be involved.
Perhaps I need to cast to varchar first and then land the data, but if I recall correctly there is a maximum record length in SSIS destination flat file rows. And I'm thinking I would have to map the varbinary (or cast equivalent) to a row in the destination once for each file created.
View 4 Replies
Feb 9, 2015
I need to import multiple CSV files and load them into tables, and every time a new database has to be created.
I was able to create the new databases using a stored proc.
How do I do a bulk insert for all the files at once, into their tables?
I want to load all the files at once.
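BULK INSERT takes one file at a time, so "all at once" in practice means generating one statement per file. A hedged sketch using xp_dirtree (undocumented but widely used) to list the folder; the path, table, and CSV layout are hypothetical:

CREATE TABLE #dir (subdirectory nvarchar(260), depth int, isfile bit)
INSERT INTO #dir EXEC master..xp_dirtree 'C:\csv', 1, 1    -- third arg: include files

DECLARE @f nvarchar(260), @sql nvarchar(max)
DECLARE c CURSOR FOR
    SELECT subdirectory FROM #dir WHERE isfile = 1 AND subdirectory LIKE '%.csv'
OPEN c
FETCH NEXT FROM c INTO @f
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'BULK INSERT dbo.Staging FROM ''C:\csv\' + @f
             + N''' WITH (FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'', FIRSTROW = 2)'
    EXEC sp_executesql @sql
    FETCH NEXT FROM c INTO @f
END
CLOSE c
DEALLOCATE c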
View 6 Replies
Mar 17, 2014
I am testing out a blank database created over two physical files on two separate disks, with one table called data which has one column called values nvarchar(max).
I filled the table up with a whole load of data and ran a select * against it. If I run Perfmon at the same time I can see that the read load is spread over multiple disks, as each of these disks is read from in parallel. If I create the same database on a single file and run the same select * again, it takes much longer, confirming that in the two-file case the read load really was distributed across multiple disks.
Now moving on to writes; this is where the confusion lies. I understand that SQL Server fills files evenly until they need growing, after which it fills files individually until they are full, in a round-robin fashion, unless you have trace flag 1117 turned on. What I don't understand is why the writes aren't distributed while it is filling these files.
I ran a continual insert into my table with GO 1000000 to monitor how the files are filled up. I monitored where SQL Server was physically placing the rows as they were inserted by running the following query:
;WITH CTE AS
(SELECT
sys.fn_PhysLocFormatter (%%physloc%%) col1,
RIGHT(LEFT(sys.fn_PhysLocFormatter (%%physloc%%),2),1) AS [Physical RID],
DATAID
[Code] ....
I could see that it would write a thousand or so records into file 1, then a thousand or so into file 2, then a thousand or so into file 1, etc. In other words it would hit one disk, then the other, then go back to disk one to fill the files evenly. Is there any way to make SQL Server distribute the writes out in parallel so that both disks are writing in tandem?
By the looks of it, multiple disks only scale reads; with writes, only one disk is ever written to at once, which is annoying. Is there any way to harness the write power of multiple disks?
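On the growth point above, a hedged aside: trace flag 1117 only makes all files in a filegroup grow together so that proportional fill stays balanced; it does not change how individual inserts are distributed. Enabling it server-wide:

DBCC TRACEON (1117, -1)    -- or add -T1117 to the SQL Server startup parameters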
View 6 Replies
Feb 18, 2007
I'm using EncryptByKey to encrypt data in my SS2005 database. Since our server is really slow to access from home to work on, I used the Database Publishing Wizard and installed the db to work on at home. Then I created the certificate and symmetric key in my home db.
When I pull info using the DecryptByKey on our database at work on Windows 2003 Server, no problem, the data is decrypted. However, the same data does not decrypt at home on my Windows XP computer. I'm using TripleDes on both machines for the symmetric key (AES won't work on XP).
--To create my cert and key:
USE My_DB;
CREATE CERTIFICATE MyCert
ENCRYPTION BY PASSWORD = 'some password'
WITH SUBJECT = 'My Data',
START_DATE = '01/01/2007',
EXPIRY_DATE = '01/01/2099';
GO
CREATE SYMMETRIC KEY MyKey WITH ALGORITHM = TRIPLE_DES
ENCRYPTION BY CERTIFICATE MyCert;
GO
To encrypt:
OPEN SYMMETRIC KEY MyKey
DECRYPTION BY CERTIFICATE MyCert
WITH PASSWORD = 'same password as above';
Insert my record, use scope_identity to return primary key into @CustomerID.
UPDATE [Customers] SET EncryptedField = EncryptByKey(Key_GUID('MyKey'), @DataToEncrypt, 1, CONVERT(varbinary, @CustomerID)) WHERE CustomerID = @CustomerID
CLOSE SYMMETRIC KEY MyKey
To decrypt:
SELECT CONVERT(varchar(3925), DecryptByKey(EncryptedField, 1, CONVERT( varbinary, @CustomerID))) as PlainTextData
FROM Customers
WHERE (CustomerID= @CustomerID)
Everything works fine when I run the decrypt query on the database on our work server. But I'm not getting decrypted data at home. Is the symmetric key or certificate machine specific? If so, that will cause a huge problem when we deploy to a production server.
Thanks in advance for your help!
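A hedged guess at the cause: CREATE SYMMETRIC KEY without KEY_SOURCE and IDENTITY_VALUE generates random key material, so a key recreated by name on a second machine cannot decrypt the first machine's data. Recreating it like this on both machines makes the key material reproducible:

CREATE SYMMETRIC KEY MyKey
    WITH ALGORITHM = TRIPLE_DES,
         KEY_SOURCE = 'same secret phrase on every machine',
         IDENTITY_VALUE = 'same identity phrase on every machine'
    ENCRYPTION BY CERTIFICATE MyCert;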
View 5 Replies
Mar 15, 2007
Can multiple instances of SQL 2005 Express attach to the same database files on a network share? I have seen this done before with MSDE, where the database files are stored on the server, but instead of connecting to a SQL Server running on the network, only the database files exist on the network share and the users connect through MSDE running on the local machine. Is this possible with SQL 2005 Express? I do not have the ability to share an SQL instance from one workstation to another, nor do I have the ability to install an instance on the corporate server. Is it as simple as creating the database, storing the files on the share, and then attaching the database to the SQL instance on each workstation?
View 3 Replies
Feb 27, 2007
I would like to have an SSIS package which loops through each XML file (.xml) in a folder on the network, and then for each file pulls out the data and inserts it into a SQL Server table.
Please kindly guide me through this i.e. What task(s) are required, etc.
Thanks
View 1 Replies
Apr 24, 2008
Hi.
There is a "text" file generated by mainframe and it has to be uploaded to SQL Server. I've reproduced the situation with smaller sample. Let the file look like following:
A17 123.17 first row
BB29 493.19 second
ZZ3 18947.1 third row is longer
And in hex format:
00: 41 31 37 20 20 20 20 20 | 31 32 33 2E 31 37 20 20   A17     123.17
10: 66 69 72 73 74 20 72 6F | 77 0D 0A 42 42 32 39 20   first row..BB29
20: 20 00 20 34 39 33 2E 31 | 39 20 20 73 65 63 6F 6E    . 493.19  secon
30: 64 0D 0A 5A 5A 33 20 20 | 20 20 20 31 38 39 34 37   d..ZZ3     18947
40: 2E 31 20 74 68 69 72 64 | 20 72 6F 77 00 69 73 20   .1 third row.is
50: 6C 6F 6E 67 65 72       |                           longer
(in the right-hand column, '.' stands for a non-printable byte)
I wrote "text" in quotes because sctrictly it is not pure text file - non-text binary zeros (0x00) happen sometimes instead of spaces (0x20).
The table is:
CREATE TABLE eng (
src varchar (512)
)
When I upload this file into SQL 2000 using DTS or the Import wizard, the table contains:
select src, substring(src,9,8), len(src) from eng
< src ><substr> <len>
A17 123.17 first row 123.17 25
BB29 493.19 22
ZZ3 18947.1 third row 18947.1 35
As one can see, everything was imported, including the binary zeros. And though SELECT * in SSMS truncates strings upon reaching the 0x00s, all the information is still stored in the tables - the SUBSTRINGs show that.
When I upload this file into SQL 2005 using SSIS or the Import wizard, the result is the following:
< src ><substr> <len>
A17 123.17 first row 123.17 25
BB29 4
ZZ3 18947.1 third row 18947.1 25
This time the table is half-empty - all characters behind the binary zeros in the respective rows are lost.
I stumbled upon this problem while migrating my DTSes to SSIS packages. Do you think there is some workaround, or do I need to turn on some checkbox, or could something else help? Please...
View 8 Replies
Oct 21, 2005
Here's the scenario:
View 8 Replies
Oct 5, 2005
I'm having a problem getting the For Loop container to process all Excel files in a folder. I set the collection folder to where my .xls files are, and I set a variable in the For Loop container to the FileName. I then changed my source connection and added expressions for
View 12 Replies
Jun 30, 2006
What I'm trying to achieve is an SSIS package that will pick up one or
more Excel files, process the data in them via the conditional
splitter, and push the good rows into a table and all other rows into
an error table.
I'm having some issues using the ForEach container to process multiple
Excel spreadsheets into tables. The Excel import into the tables is
more or less working (it imports data for the good cases, but the
Excel Source substitutes a null when it gets an unexpected value -
that's a separate problem, though).
I found something related to this when searching, but it related to
the CTPs (June and September), and trying to reuse the connection strings
built up there (using my own variable names, naturally) causes a
'Property Value failure':
--------------------------------------------------------------------------------
The connection string format is not valid. It must consist of one or
more components of the form X=Y, separated by semicolons. This error
occurs when a connection string with zero components is set on database
connection manager.
--------------------------------------------------------------------------------
I attempted to use this:
"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" +
@[User::RankingFileFullPath] + ";Extended Properties="Excel
8.0;HDR=YES";"
The Excel importer works fine as a stand-alone component. Trying to use
the process defined in 'Professional SQL Server Integration Services',
pp. 140, I tried to use an expression to assign the variable value to the
connection string. I get a validation error:
--------------------------------------------------------------------------------
Error at Import TPNB Ranking Excel spreadsheets [Excel Source [1]]: The
AcquireConnection method call to the connection manager "Excel
Connection Manager" failed with error code 0xC0202009.
Error at Import TPNB Ranking Excel spreadsheets [DTS.Pipeline]:
component "Excel Source" (1) failed validation and returned error code
0xC020801C.
Error at Import TPNB Ranking Excel spreadsheets [DTS.Pipeline]: One or more component failed validation.
Error at Import TPNB Ranking Excel spreadsheets: There were errors during task validation.
Error at Excel Importer [Connection manager "Excel Connection Manager"]: An OLE DB error has occurred. Error code: 0x80040E4D.
--------------------------------------------------------------------------------
Any advice?
....
.... in addition ....
I attempted to change the package - I set the Data Flow validation to
Delay Validation, and changed the expression to change from:
ConnectionString @[User::RankingFileFullPath]
to
ExcelFilePath @[User::RankingFileFullPath]
This allowed the package to start debugging, and gave more information in the failure:
--------------------------------------------------------------------------------------------
SSIS package "Excel Importer.dtsx" starting.
SSIS breakpoint 'Break when the container receives the OnPreExecute event' at executable 'Excel Importer' has been hit
SSIS breakpoint 'Break when the container receives the OnPreExecute event' at executable 'Foreach Loop Container' has been hit
SSIS breakpoint 'Break when the container receives the OnQueryCancel event' at executable 'Excel Importer' has been hit
Information: 0x4004300A at Import TPNB Ranking Excel spreadsheets, DTS.Pipeline: Validation phase is beginning.
Warning: 0x802092A7 at Import TPNB Ranking Excel spreadsheets,
ProductSalesRank Table [278]: Truncation may occur due to inserting
data from data flow column "Rank" with a length of 1000 to database
column "SalesRank" with a length of 50.
Error: 0xC0202009 at Excel Importer, Connection manager "Excel
Connection Manager": An OLE DB error has occurred. Error code:
0x80004005.
An OLE DB record is available. Source: "Microsoft JET Database
Engine" Hresult: 0x80004005 Description: "Unrecognized
database format 'D:TestingTestRanking.xls'.".
Error: 0xC020801C at Import TPNB Ranking Excel spreadsheets, Excel
Source [1]: The AcquireConnection method call to the connection manager
"Excel Connection Manager" failed with error code 0xC0202009.
Error: 0xC0047017 at Import TPNB Ranking Excel spreadsheets,
DTS.Pipeline: component "Excel Source" (1) failed validation and
returned error code 0xC020801C.
Error: 0xC004700C at Import TPNB Ranking Excel spreadsheets, DTS.Pipeline: One or more component failed validation.
Error: 0xC0024107 at Import TPNB Ranking Excel spreadsheets: There were errors during task validation.
SSIS breakpoint 'Break when the container receives the OnQueryCancel event' at executable 'Excel Importer' has been hit
Warning: 0x80019002 at Foreach Loop Container: The Execution method
succeeded, but the number of errors raised (5) reached the maximum
allowed (1); resulting in failure. This occurs when the number of
errors reaches the number specified in MaximumErrorCount. Change the
MaximumErrorCount or fix the errors.
SSIS breakpoint 'Break when the container receives the OnQueryCancel event' at executable 'Excel Importer' has been hit
SSIS breakpoint 'Break when the container receives the OnQueryCancel event' at executable 'Excel Importer' has been hit
SSIS breakpoint 'Break when the container receives the OnWarning event' at executable 'Excel Importer' has been hit
Warning: 0x80019002 at Excel Importer: The Execution method succeeded,
but the number of errors raised (5) reached the maximum allowed (1);
resulting in failure. This occurs when the number of errors reaches the
number specified in MaximumErrorCount. Change the MaximumErrorCount or
fix the errors.
SSIS breakpoint 'Break when the container receives the OnPostExecute event' at executable 'Excel Importer' has been hit
SSIS package "Excel Importer.dtsx" finished: Failure.
The program '[2460] Excel Importer.dtsx: DTS' has exited with code 0
(0x0).--------------------------------------------------------------------------------------------
View 10 Replies
Jun 16, 2015
I have a requirement where I have around 15 different flat files; the file names are fixed but the folder path can change (I think I should use a variable for the folder path). These 15 files' data should go to their respective tables in the database.
Do I need to create a separate Data Flow task for each file, or a separate package? In addition to this, for example: while importing product data into the product table, if a product ID already exists, we need to ignore it and upload only the new records.
View 4 Replies
Jun 27, 2006
I have a couple of hundred flat files to import into database tables using SSIS.
The files can be divided into groups by the format they use. I understand that I could import each group of files that have a common format at the same time using a Foreach Loop Container.
However, the example for the Foreach Loop Container has multiple files all being imported into the same database table. In my case, each file needs to be imported into a different database table.
Is it possible to import each set of files with the same format into different tables in a simple loop? I can't see a way to make a Data Flow Destination item accept its table name dynamically, which seems to prevent me doing this.
I suppose I could make a different Data Flow Destination item for each file, in the Data Flow. Would that be a reasonable solution, or is there a simpler solution, or should I just resign myself to making a separate Data Flow for every single file?
View 9 Replies
Jul 6, 2015
While I am trying to unzip files using the Execute Process task, I get the error below:
[Execute Process Task] Error: In Executing "C:\Program Files\7-Zip\7z.exe" "a -tzip D:\excel.zip D:\unzipfile\excel.xls" at "", The process exit code was "1" while the expected was "0".
Warning: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (1) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
I want to know more about zipping and unzipping files and folders using the Execute Process task.
7-Zip location: C:\Program Files\7-Zip\7z.exe
SQL version: SQL Server 2008 R2
I do not have WinRAR, so please explain using 7z. It's quite interesting to work with, but I don't know how to get the desired result.
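For reference, a hedged sketch of the 7-Zip command lines involved (in the Execute Process task, Executable is the 7z.exe path and Arguments is the rest). Note that the failing command above uses a, which adds files to an archive; extracting needs x (or e):

7z.exe a -tzip D:\excel.zip D:\unzipfile\excel.xls     (adds excel.xls to excel.zip)
7z.exe x D:\excel.zip -oD:\unzipfile -y                (extracts excel.zip into D:\unzipfile)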
View 6 Replies
Aug 14, 2012
I am trying to restore multiple .bak backup SQL database files onto a new server. However, I have found that it will not allow me to restore multiple databases at once. Is there a way to do this so that I do not have to manually restore one at a time? I tried adding all the .bak files at once to the backup device window, but it only did the first one listed. It would be so much easier to restore them all at once so that I do not have to continue this manual process. I am restoring them via device.
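Scripting the restores runs them back to back without the UI. A hedged sketch with hypothetical database, logical-file, and path names (RESTORE FILELISTONLY FROM DISK = '...' shows the logical names inside each .bak):

RESTORE DATABASE Db1 FROM DISK = 'D:\backups\Db1.bak'
WITH MOVE 'Db1_Data' TO 'E:\data\Db1.mdf',
     MOVE 'Db1_Log'  TO 'F:\log\Db1_log.ldf',
     STATS = 10

-- repeat per .bak, or generate these statements from a folder listing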
View 13 Replies
Feb 15, 2008
I need to be able to bulk insert a bunch of tables from their corresponding flat files. I have created an XML file (see below) which has a file name/table name pair at each node. I then created a ForEachLoop task, used the Node enumeration type, and set the following OuterXpathString: ReferenceFiles/File. At this point I get lost. How do I pass the two inner node values (file name and table name) to variables which I can then use as expressions for the Bulk Insert task inside the Foreach?
Here is XML file:
Code Snippet
<ReferenceFiles>
  <File>
    <FileName>Ref_Categories.txt</FileName>
    <TableName>Ref_Categories</TableName>
  </File>
  <File>
    <FileName>Ref_Configs.txt</FileName>
    <TableName>Ref_Configs</TableName>
  </File>
</ReferenceFiles>
Thanks.
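If the variable mapping proves stubborn, a hedged T-SQL fallback that shreds the same XML (SQL Server 2005+) and could feed dynamically built BULK INSERT statements:

DECLARE @x xml
SET @x = N'<ReferenceFiles><File><FileName>Ref_Categories.txt</FileName><TableName>Ref_Categories</TableName></File></ReferenceFiles>'

SELECT f.value('(FileName/text())[1]',  'varchar(260)')  AS FileName,
       f.value('(TableName/text())[1]', 'nvarchar(128)') AS TableName
FROM @x.nodes('/ReferenceFiles/File') AS t(f)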
View 1 Replies
Nov 29, 2007
I used the Data Export wizard to export a single table to a single flat file (multiple weren't allowed). I saved the package as a *.dtsx file, which I'm attempting to edit to add the additional tables.
Creating additional sources is fairly easy: copy the first source and change the table name.
I've tried copying the destination connection and changing it to a new text file, but I can't get past having to add each column manually to the new destination.
How can I duplicate the mapping that must be taking place in the wizard in the *.dtsx editing environment?
This seems like a simple / common task, but I've been unable to find a solution.
Thanks, Richard
View 1 Replies