I have been trying to write data from a database to XML files, and although I have been successful in doing that, the files that are produced are not in the form I want.
I am using:
SQLDataSet.WriteXml(strFileName, XmlWriteMode.WriteSchema);
to generate the files, where strFileName is the name of the file. When viewed in an XML editor, the files appear as one long line of data instead of normally formatted XML with each row below the previous one. When viewed in WordPad/Notepad I see elements (/><) run together between the data, and I suspect this has something to do with the problem.
Can someone help me with this? Any help or suggestion would be greatly appreciated.
Thanks.
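In case it helps a later reader: if WriteXml(fileName, ...) is producing a single line, writing through an XmlTextWriter with Formatting.Indented forces line breaks and indentation. A minimal sketch, using the SQLDataSet and strFileName names from the post above:

using System.Data;
using System.Text;
using System.Xml;

using (XmlTextWriter writer = new XmlTextWriter(strFileName, Encoding.UTF8))
{
    writer.Formatting = Formatting.Indented;   // one element per line, indented
    writer.Indentation = 2;
    SQLDataSet.WriteXml(writer, XmlWriteMode.WriteSchema);
}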
Hi, I'm relatively new to SQL and ASP.NET and I'm after some assistance with what I hope is an easily achievable goal. I need to upload some files (.pdf, .doc, .jpg, whatever) to a directory on my web server. Once this has been done, I'd like to write some data about these files to a SQL database table and show it in a DataGrid (or something similar) along with a link to download them.

E.g. I upload car.jpg, which is located at /images/secure/car.jpg and is a photo of a car. When a user logs in, they will be shown a DataGrid which pulls information from said SQL database, with rows something like:

Filename: car.jpg | Description: A pic of a car | Link: Download

If anyone is able to help me out with regards to linking to a file from a SQL DB it would be greatly appreciated, and if you require any further information just let me know :) Ben.
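A minimal sketch of the display side, assuming the table stores an app-relative path; the Files table, its columns, connStr, and FilesGrid are all placeholder names. A DataGrid with a HyperLinkColumn whose DataNavigateUrlField points at the stored path renders the Download link for free:

using System.Data;
using System.Data.SqlClient;

// Bind the grid from a hypothetical Files(Filename, Description, FilePath)
// table; FilePath holds an app-relative path like "/images/secure/car.jpg".
DataTable files = new DataTable();
using (SqlDataAdapter da = new SqlDataAdapter(
    "SELECT Filename, Description, FilePath FROM Files", connStr))
{
    da.Fill(files);
}
FilesGrid.DataSource = files;   // DataGrid with a HyperLinkColumn whose
FilesGrid.DataBind();           // DataNavigateUrlField is FilePath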
Hi guys, I have a stored procedure that I will post below this message. What the stored procedure does is read some specific tables in a database and create XML off them. The problem comes in when the data is bigger than a few hundred/thousand characters: the data is truncated and the XML file is not fully made. Now I have been reading some posts by some people and tried using the TEXT/NTEXT/IMAGE types to write the files, but they give me errors which, again, I am adding to this email.

STORED PROC:
****************************
CREATE PROCEDURE usrp_sp_makexmlfile
@str varchar(50),
@templatefile varchar(255),
@ReturnValue as image OUT
AS
SET NOCOUNT ON
DECLARE @strVar as varchar(8000)
/* DECLARE @ReturnValue as varchar(255) */
Set @strVar = 'Select * from ' + @str + ' FOR XML AUTO'
Set @templatefile = 'c:\template.tpl'
EXEC (@strVar)
SELECT @ReturnValue
GO

THIS IS WHAT I GET WHEN I USE IMAGE/NTEXT/TEXT TYPE:
*********************************************************
An unhandled exception of type 'System.InvalidOperationException' occurred in system.data.dll
Additional information: Parameter 2: '@ReturnValue' of type: Byte[], the property Size has an invalid size: 0

Also, if I try to add an integer value to the image/text/ntext type like 8000 or something less than that, it tells me that the result has to be discarded because the current process has had some error. Can someone give me a suggestion on how to resolve this, or an example that would illustrate the solution? Thank you all in advance. Pi.
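One common source of exactly this truncation: a FOR XML result returned through a normal resultset arrives split into roughly 2,033-character chunks. Reading it on the client with ExecuteXmlReader, instead of pushing it back through an output parameter, avoids the problem. A minimal sketch; connStr, the table name, and the output path are placeholders:

using System.Data;
using System.Data.SqlClient;
using System.Xml;

using (SqlConnection conn = new SqlConnection(connStr))
using (SqlCommand cmd = new SqlCommand("SELECT * FROM MyTable FOR XML AUTO", conn))
{
    conn.Open();
    using (XmlReader reader = cmd.ExecuteXmlReader())
    using (XmlTextWriter writer = new XmlTextWriter(@"c:\output.xml", null))
    {
        writer.Formatting = Formatting.Indented;
        writer.WriteStartElement("root");   // FOR XML AUTO returns a fragment,
        while (!reader.EOF)                 // so wrap it in a single root element
            writer.WriteNode(reader, true);
        writer.WriteEndElement();
    }
}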
I need to write text files of a size greater than 1 GB! My first thought was to do this with BCP, but fetching the data from the stored proc was taking too long (more than one hour), and BCP did not seem to be the right option as it gives less flexibility in terms of handling errors, knowing to what point the writing has completed, etc. Most importantly, getting such a huge amount of data did not seem to be a one-time job!

So I am now thinking about writing a CLR procedure which fetches a smaller amount of data at a time and writes it to the file. Something like: open connection, get data, close connection, write to file, and continue in cycles of this until the whole file is written. Does this make sense considering the amount of data to be fetched?
Any suggestion/comment on this would be very helpful!
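A minimal sketch of the cycle described above, assuming a hypothetical table with a bigint key so each batch can pick up where the last one left off; the table, column names, batch size, and file path are all made up:

using System.Data.SqlClient;
using System.IO;

long lastKey = 0;
bool more = true;
using (StreamWriter file = new StreamWriter(@"C:\export\bigfile.txt"))
{
    while (more)
    {
        more = false;
        // New connection per batch, as described in the post.
        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT TOP 50000 Id, Line FROM dbo.ExportRows WHERE Id > @last ORDER BY Id", conn))
        {
            cmd.Parameters.AddWithValue("@last", lastKey);
            conn.Open();
            using (SqlDataReader r = cmd.ExecuteReader())
            {
                while (r.Read())
                {
                    file.WriteLine(r.GetString(1));
                    lastKey = r.GetInt64(0);   // Id assumed bigint
                    more = true;               // got rows, try another batch
                }
            }
        }
    }
}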
Hi; before I start, let me say I know this is a silly way to go about things, but it is one of those "my company made me do it" things. My company is in the process of (finishing) migrating from FoxPro to SQL Server. They have a FoxPro program that does a lot of various updates. I would like to write a T-SQL script to replace it. The trick is that the T-SQL script would have to have some procedural programming artifacts that I don't think it has.

I/my company would like the script to generate output messages that the user can see while the script is running. Every time I have run a script in Query Analyzer, I usually don't see the output ("print" statements, etc.) until all of the work is done. Is there a way to run things so I can see the output statements as the work is being done? Can print statements work from within a T-SQL loop?

The other thing is that I/my company want the T-SQL script to write output messages to a log file while it is operating (for error checking and other logging). Is this possible with T-SQL?

Thanks in advance for the info. Steve
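For reference, a minimal sketch of one way to get both behaviors from a small C# driver instead of Query Analyzer: PRINT and RAISERROR messages arrive through the SqlConnection.InfoMessage event as the batch runs (RAISERROR('...', 0, 1) WITH NOWAIT flushes them immediately), and the handler can append them to a log file. The script and log paths are placeholders:

using System;
using System.Data.SqlClient;
using System.IO;

using (SqlConnection conn = new SqlConnection(connStr))
{
    conn.InfoMessage += (sender, e) =>
    {
        Console.WriteLine(e.Message);   // show progress live
        File.AppendAllText(@"C:\logs\migrate.log",
                           DateTime.Now + " " + e.Message + Environment.NewLine);
    };
    conn.Open();
    // Assumes the script has no GO separators; GO is a client-side batch
    // separator, not T-SQL, so a multi-batch script must be split first.
    using (SqlCommand cmd = new SqlCommand(File.ReadAllText(@"C:\scripts\update.sql"), conn))
    {
        cmd.CommandTimeout = 0;   // long-running migration script
        cmd.ExecuteNonQuery();
    }
}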
I have a scenario where all the stored procedures are stored in a folder (one .sql file per procedure). The scripts do not have 'IF EXISTS ... DROP PROCEDURE' in them, so before creating the procedures we have to drop them all manually.
Can anyone help me write a script / SSIS process to loop through each file in the folder and write "IF EXISTS ... DROP PROCEDURE" with the procedure name into it?
I can create a package that loops through each file with a Foreach Loop task, but I don't know how to write to the file using .NET.
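A minimal sketch of a Script Task body that does this, assuming each file is named after the procedure it contains (e.g. usp_Foo.sql creates usp_Foo); the folder path is a placeholder:

using System.IO;

string folder = @"C:\sprocs";   // placeholder path
foreach (string path in Directory.GetFiles(folder, "*.sql"))
{
    // Assumption: file name = procedure name, schema dbo.
    string procName = Path.GetFileNameWithoutExtension(path);
    string header =
        "IF OBJECT_ID(N'dbo." + procName + "', N'P') IS NOT NULL\r\n" +
        "    DROP PROCEDURE dbo." + procName + ";\r\nGO\r\n";
    string body = File.ReadAllText(path);
    if (!body.StartsWith("IF OBJECT_ID"))   // don't prepend twice on re-runs
        File.WriteAllText(path, header + body);
}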
We have a package that is using a Foreach Loop container to access files on a network drive. For some reason I am getting a message that the ForEach enumerator is empty and did not find any files that matched the pattern. For the pattern I left the default *.* for testing purposes. I have specified the file folder as \\remoteserver\fileshare\subfolder and also as \\remoteserver\c$\fileshare\subfolder and have gotten the same message. However, when I map a network drive and set the file folder to the network drive, it finds the files. Is this a permissions issue?
After I finish processing the file I want to move it to a new directory. Once this is deployed in production, the package will not be running under a domain account and probably won't have access to the network folder. Is there any way to specify in the connection manager itself that it should use a specific account to access the folder?
Is there a quick way to extract a full dump of 50 tables to 50 corresponding text files?
i.e.
table_a has to be extracted to table_a.txt
table_b has to be extracted to table_b.txt
table_c has to be extracted to table_c.txt etc.
I don't want to have to add each one separately by hand in the DTSX package designer. I can't see any way to do it in a loop (because you have to do the field mapping). I can't seem to get the DTS Wizard to help; it only seems to be able to handle one table-to-text extract at a time. And I've tried editing the DTSX file directly (in XML), but it looks like that's going to be rather complex, even if I only do it to define the connection managers. Feel free to suggest any better way to do this, though the specification has already been agreed, so I'm unlikely to be able to change it. Thanks
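Since every destination is just "all columns as text", one workaround is to skip the designer mapping entirely and write the dump from a small program. A minimal sketch; connStr and the tab delimiter are assumptions, and the table list would be filled out to all 50:

using System.Data.SqlClient;
using System.IO;

string[] tables = { "table_a", "table_b", "table_c" };   // ... all 50
foreach (string table in tables)
{
    using (SqlConnection conn = new SqlConnection(connStr))
    using (SqlCommand cmd = new SqlCommand("SELECT * FROM [" + table + "]", conn))
    {
        conn.Open();
        using (SqlDataReader r = cmd.ExecuteReader())
        using (StreamWriter w = new StreamWriter(table + ".txt"))
        {
            while (r.Read())
            {
                string[] fields = new string[r.FieldCount];
                for (int i = 0; i < r.FieldCount; i++)
                    fields[i] = r.IsDBNull(i) ? "" : r.GetValue(i).ToString();
                w.WriteLine(string.Join("\t", fields));   // tab-delimited; pick your own
            }
        }
    }
}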
I want to write a batch file that will do just that. The problem is that bcp or bulk copy never works with my code; the command isn't recognized. Maybe I'm going the wrong way about this, but I could use some help. Ideally I'll make the batch do the importing of a text file into a table via bcp or something like it, and use Windows Scheduler to automate it. Be as specific as you can please; I'm very new to SQL Server.
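For reference, a minimal sketch of such a batch file. "bcp is not recognized" is usually a PATH problem, so the full path to bcp.exe is spelled out (this is the SQL Server 2000 location); the server, database, table, and file names are placeholders:

@echo off
rem Import C:\data\input.txt into MyDb.dbo.MyTable: character mode (-c),
rem comma-delimited (-t,), trusted connection (-T), output appended to a log.
"C:\Program Files\Microsoft SQL Server\80\Tools\Binn\bcp.exe" MyDb.dbo.MyTable in "C:\data\input.txt" -c -t, -S MYSERVER -T >> C:\data\bcp.log 2>&1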
I am getting the following error running a data flow that splits the input data into multiple streams and writes the results of each stream to the same destination table:
"This operation conflicts with another pending operation on this transaction. The operation failed."
The flow starts with a single source table with one row per student and multiple scores for that student. It does a few lookups and then splits the stream (using Multicast) in several layers, ultimately generating 25 destinations (one for each score to be recorded), all going to the same table (like a fact table). This all is running under a transaction at the package level, which is distributed to a separate machine.
Apparently, I cannot have all of these streams inserting data into the same table at one time. I don't understand why not. In an OLTP system, many transactions are inserting records into the same table at once. Why can't I do that within the same transaction?
I suppose I can use a Union All to join them back together before writing to a single destination, but that seems like an unnecessary waste and clutters the flow. Can anyone offer a different solution, or a reason why this fails in the first place?
I am trying to use a recordable CD as the storage medium for a comma delimited flat file destination in an SSIS package. I am running the developer edition of SQL Server 2005 on Windows XP SP2.
I discovered that in the file connection manager for a flat file, the path to the file can point to a directory on the CD-R. When I set the file name for the connection manager and point it to a file on the CD, the path looks something like this:
C:\Documents and Settings\juser\Local Settings\Application Data\Microsoft\CD Burning\Archive\Monthly\MCTxnDetail_Archive.txt
When I run the package, the package executes without a problem, but no 'burning' occurs. When I check the directory specified in 'file name', there is a header in the directory that states 'Files Ready to Be Written to the CD'. The file I want to write is listed there.
Is there some executable I can kick off in another task to complete the process of burning the file to the CD?
I proposed on a new server that we separate data files, log files, tempdb, backups, etc. onto separate LUNs on a SAN with high-speed solid state drives. I was told that with the new solid-state SAN technology this would decrease performance, and that it does not work the same way as it did when you had RAID 5 arrays, etc. I thought that if things were carried out correctly by a SAN administrator, they would know how to configure for optimal performance.
Hello, historically it has been frowned upon to store textual data in SQL Server. In v6.5 and before, 2K pages were allocated regardless of actual data size. Is this still the case in v7.0? I have a situation where an intranet application wants to store documents/images/etc. Currently, we need to grant security to each user who connects to the web server to allow them to write files. However, if we write the data via SQL Server, then only the application needs permission. Is this the recommended method? What other options can I use? TIA, Lee
I am working on a project where data is stored remotely in a Postgres database. I need to download some of the Postgres data and store it in SQL Server. The data in PG is in UTF-8. I use another application to write the data to the PG database. To talk to the PG database I am using the npgsql data provider (http://gborg.postgresql.org/project/npgsql/projdisplay.php). The data I am trying to download is Arabic.
Everything seems to work fine except when I get the data from the PG DB and write it to SQL Server. I've done lots of debugging and can see that the data is correctly in Arabic right up until I do the update on the local dataset, which saves it in SQL Server. For some strange reason it turns the data into gibberish (just question marks).
I am using SQL Server express 2005. If anyone can help me with this I'd be extremely grateful as this has become a big problem and I've tried to find a solution without any success.
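Question marks after a save usually mean the Unicode text went through a non-Unicode (varchar) conversion somewhere on the SQL Server side. A minimal sketch of the rule of thumb, with made-up table and column names (arabicText and docId are placeholders): keep the column nvarchar and the parameter NVarChar so the text never leaves Unicode.

using System.Data;
using System.Data.SqlClient;

// The column must be nvarchar/ntext, not varchar/text, and the parameter
// must be typed NVarChar; otherwise the Arabic collapses to '?'.
using (SqlConnection conn = new SqlConnection(connStr))
using (SqlCommand cmd = new SqlCommand(
    "UPDATE Docs SET Title = @title WHERE Id = @id", conn))
{
    cmd.Parameters.Add("@title", SqlDbType.NVarChar, 200).Value = arabicText;
    cmd.Parameters.Add("@id", SqlDbType.Int).Value = docId;
    conn.Open();
    cmd.ExecuteNonQuery();
}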
I am using OLE DB Destination to write data to a SQL server database. However, nothing is written to the database though there is no error reported. See the following output:
SSIS package "Tbl_Dim_Dates.dtsx" starting.
Information: 0x4004300A at Tbl_Dim_Dates, DTS.Pipeline: Validation phase is beginning.
Information: 0x4004300A at Tbl_Dim_Dates, DTS.Pipeline: Validation phase is beginning.
Information: 0x40043006 at Tbl_Dim_Dates, DTS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Tbl_Dim_Dates, DTS.Pipeline: Pre-Execute phase is beginning.
Information: 0x4004300C at Tbl_Dim_Dates, DTS.Pipeline: Execute phase is beginning.
Information: 0x402090DF at Tbl_Dim_Dates, OLE DB Destination [2396]: The final commit for the data insertion has started.
Information: 0x402090E0 at Tbl_Dim_Dates, OLE DB Destination [2396]: The final commit for the data insertion has ended.
Information: 0x40043008 at Tbl_Dim_Dates, DTS.Pipeline: Post Execute phase is beginning.
Information: 0x40043009 at Tbl_Dim_Dates, DTS.Pipeline: Cleanup phase is beginning.
Information: 0x4004300B at Tbl_Dim_Dates, DTS.Pipeline: "component "Date extract to file" (924)" wrote 3652 rows.
Information: 0x4004300B at Tbl_Dim_Dates, DTS.Pipeline: "component "Raw File Destination" (2518)" wrote 3652 rows.
Information: 0x4004300B at Tbl_Dim_Dates, DTS.Pipeline: "component "OLE DB Destination" (2396)" wrote 3652 rows.
I'm trying to do this:

1) Have SQL Server return some data from this simple stored procedure. It returns XML:

SELECT * FROM eAccessTypes FOR XML AUTO

2) In my C# I capture the return value in an XmlReader like so:

XmlReader xmlr = mySqlCommand.ExecuteXmlReader();

After this I want to write the xmlr data to a file. Any ideas?
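A minimal sketch of one way to finish the job, continuing from the xmlr line above; the output path is a placeholder, and the wrapper element is needed because FOR XML AUTO returns a fragment with no single root:

using System.IO;
using System.Xml;

using (XmlReader xmlr = mySqlCommand.ExecuteXmlReader())
using (StreamWriter sw = new StreamWriter(@"C:\out\eAccessTypes.xml"))
{
    sw.Write("<eAccessTypes>");        // wrapper root for the fragment
    xmlr.MoveToContent();
    while (!xmlr.EOF)
        sw.Write(xmlr.ReadOuterXml()); // each top-level element in turn
    sw.Write("</eAccessTypes>");
}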
Hi guys, in my project I need to write the data from an XML file to the database through a stored procedure. Online, a lot of help is available for writing data from the database to XML, but I didn't see much info about writing XML to the database. If anyone could give me some code I would really appreciate it (I am a newbie). This application is developed using ASP.NET 1.1 and SQL Server 2005. Thanks for your help in advance.
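A minimal sketch of the client side of one common pattern: pass the whole file to the procedure as a single text parameter and shred it inside the procedure with sp_xml_preparedocument/OPENXML. The procedure name, file name, and connStr are placeholders; the parameter is ntext rather than xml because .NET 1.1 has no SqlDbType.Xml, and File.ReadAllText is avoided for the same reason:

using System.Data;
using System.Data.SqlClient;
using System.IO;

string xml;
using (StreamReader sr = new StreamReader(Server.MapPath("quotes.xml")))
    xml = sr.ReadToEnd();

using (SqlConnection conn = new SqlConnection(connStr))
using (SqlCommand cmd = new SqlCommand("usp_ImportXml", conn))   // hypothetical proc
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.Add("@doc", SqlDbType.NText).Value = xml;
    conn.Open();
    cmd.ExecuteNonQuery();
}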
I'm trying to read a byte array of an image datatype from SQL Server, and then put it into another field in the database. I get a byte array, but somehow the image doesn't get into the DB properly with the SQL parameters. Does anyone have an idea how to tackle this problem?
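A minimal sketch of a roundtrip that usually works, with made-up table and column names; a common pitfall is leaving the Size of an Image parameter at 0 instead of the array length:

using System.Data;
using System.Data.SqlClient;

byte[] img;
using (SqlConnection conn = new SqlConnection(connStr))
{
    conn.Open();
    using (SqlCommand get = new SqlCommand(
        "SELECT Photo FROM Pics WHERE Id = @id", conn))
    {
        get.Parameters.AddWithValue("@id", sourceId);
        img = (byte[])get.ExecuteScalar();
    }
    using (SqlCommand put = new SqlCommand(
        "UPDATE Pics SET PhotoCopy = @p WHERE Id = @id", conn))
    {
        put.Parameters.Add("@p", SqlDbType.Image, img.Length).Value = img; // Size must be set
        put.Parameters.AddWithValue("@id", sourceId);
        put.ExecuteNonQuery();
    }
}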
The database I am using is SQL Server 6.5. In a trigger I have written code to transfer updated records from one table to another table. These updated records need to be written into a text file. I have used xp_cmdshell, but it is taking too long. Is there another way to write data to a flat file?
Hi, I have a data structure called 'Quote' which contains a number of different variables and controls, ranging from text boxes to check boxes and radio buttons. I need to be able to read and write this from a database.
First I think a description of my overall project is needed:
Project Description: I have been given a brief that basically says I have to create a programmed solution in VB to solve a problem. This problem can be anything we like, and I personally have chosen to create a program that manages quotes for building log cabins (this is very contrived and far from anything someone would do in the real world).
My solution will allow a generic user to create a quote (using a form with controls such as text boxes, check boxes and radio buttons), and then save this to file. These users may then wish to load/edit this quote at a later date, from another form.
Whilst completing this project, I'll only have up to about 5 records (quotes) within the system, so I don't need the ability to store hundreds of records. And each record will be relatively short, with only about 10-15 data items within the data structure.
Also the admin (or business owner in this case) needs to be able to view all saved quotes in a presentable format, and edit them if need be, from within this same program.
This solution does not need to be absolutely perfect and 100% efficiently coded, or have all the bells and whistles a real-world program would have. This is for an A-level computing project, by the way.
So basically, I need to be able to read from the database (to populate a DataGrid; I imagine this is the best way?) so the admin can access any quote and edit it (editing is not vital, but viewing/printing is; maybe I should stop at just viewing any quote?). Also I need generic users to be able to fill in the Edit Quote form and then save this data into the database.
And is a data structure really required for me to use a database?
I've never used databases in VB before (but have used them elsewhere, mainly Access) and so am completely new to this. Any help will be much appreciated. Thanks
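Since there are only a handful of short records, a plain table plus a DataAdapter covers the read side. A minimal sketch with made-up names (Quotes table, connStr, dataGrid); it is shown in C# for brevity, but the same ADO.NET classes are available from VB:

using System.Data;
using System.Data.SqlClient;

// Fill a DataTable from a hypothetical Quotes table and bind it to a grid.
DataTable quotes = new DataTable();
using (SqlDataAdapter da = new SqlDataAdapter(
    "SELECT QuoteId, CustomerName, CabinSize, TotalPrice FROM Quotes", connStr))
{
    da.Fill(quotes);
}
dataGrid.DataSource = quotes;   // DataGrid/DataGridView on the admin form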
I am having a problem with an ASP program that inserts data into a table on SQL Server 2000.
No error msg is returned upon submission, and the confirmation msg that displays after the commit command is sent to the server does display, but when we go to the DB, the data sent isn't there. This is an occasional occurrence; usually the data is there, just sometimes it isn't. Other forms function just fine using the *exact* same file to perform the submit function (all the forms "include" the same submit page).

The only difference we can find is a trigger on the problem table, which executes upon update, capturing information about who updated the record and when. From what we can see, this is the only programmatic difference. The other thought tickling our minds was the possibility of a simultaneous submission, since all the users submit with the same DB user name via the form: if user 1's data gets written but not yet committed, user 2's data is submitted, and then the commit transaction is submitted by user 1 as the program steps run in sequence, would the commit by user 1 cause either of the records inserted but not committed to be lost? If so, why wouldn't that be causing problems on other forms?
For years I was using an application that had an MSDE database.
Recently, I upgraded my computer to Windows Vista, so I had to upgrade MSDE to SQL Express.
I've managed to attach the old MSDE database of the application to SQL Express, and I am able to retrieve my data from within the application.
My problem is that I cannot write or update data from within the application anymore. However, if I open a table in SQL Management Studio Express, I can write/update data with no problem.
The company that built the application doesn't exist any more, so I can't get any support from them.
The only thing I know for a fact is that the application is connecting to the database using the sa user.
Can someone show me the command string required to write data out to 2 or 3 tables at once? How about writing 2 entries at once? Right now my solution writes 1 record to a table and then somewhere around 40-200 records to a second, child table. Right now I'm writing like this:

Open --> Add 1 record in table #1 --> Close
Open --> Add 1 record in table #2 --> Close
Open --> Add 1 record in table #2 --> Close
Open --> Add 1 record in table #2 --> Close
Open --> Add 1 record in table #2 --> Close

and I'm just wondering if there is a better approach, such as writing all the data at once. Something like:

Open --> Add 1 record in table #1 --> Add 5 records in table #2 --> Close
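A minimal sketch of the second shape: one connection, one transaction, the parent row and then all the child rows (Orders/OrderLines and the other names are made up). Reusing the open connection avoids the per-row open/close cost, and the transaction makes the parent and its children commit or fail together:

using System;
using System.Data;
using System.Data.SqlClient;

using (SqlConnection conn = new SqlConnection(connStr))
{
    conn.Open();
    using (SqlTransaction tx = conn.BeginTransaction())
    {
        int orderId;
        using (SqlCommand parent = new SqlCommand(
            "INSERT INTO Orders (OrderDate) VALUES (@d); SELECT SCOPE_IDENTITY();", conn, tx))
        {
            parent.Parameters.AddWithValue("@d", DateTime.Now);
            orderId = Convert.ToInt32(parent.ExecuteScalar());
        }
        using (SqlCommand child = new SqlCommand(
            "INSERT INTO OrderLines (OrderId, Item) VALUES (@id, @item)", conn, tx))
        {
            child.Parameters.AddWithValue("@id", orderId);
            SqlParameter item = child.Parameters.Add("@item", SqlDbType.NVarChar, 100);
            foreach (string line in childLines)   // the 40-200 child records
            {
                item.Value = line;
                child.ExecuteNonQuery();
            }
        }
        tx.Commit();
    }
}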
I have a project where users can sign in and change information on a SQL server. The catch is that this site will be on a different domain name and with a different hosting company than where the SQL database is located. Sorry if this is a dumb question, but how can I utilize ASP.NET to change and view a SQL database that is located elsewhere? For example: a user logs into www.something.com and he/she can view and edit SQL tables from www.somethingelse.com's database. Thanks in advance.
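As long as the database host allows remote connections, this is mostly a connection string that points at the other host instead of localhost. A minimal sketch with placeholder names; the DB host must open its SQL Server port to your web server, and SQL authentication is the usual arrangement across hosting companies:

using System.Data.SqlClient;

// Hypothetical remote connection string: the server name, database name,
// and credentials are placeholders for whatever your DB host gives you.
string connStr = "Server=sql.somethingelse.com;Database=MyDb;User ID=webuser;Password=...;";
using (SqlConnection conn = new SqlConnection(connStr))
{
    conn.Open();   // from here on it works exactly as if the database were local
}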
First of all, let me say that ASP.NET is a new programming environment for me, so please forgive my ignorance. Can someone please tell me how to write data to a SQL table column that has a binary data type? I have a stored procedure on the SQL server that I am calling to insert data into a table. I build a parameter list and set the values. It worked just fine before I added a binary field to the SQL table. My problem is that I don't know how to set the binary data type to pass it to the stored procedure. Here is part of the code:

GetCMD = Myconnection.CreateCommand
GetCMD.CommandType = CommandType.StoredProcedure
GetCMD.CommandText = "SCHEMANAME.InsertLineItem"
GetCMD.Parameters.Add("HEADER_ID", SqlDbType.VarChar, 150)
GetCMD.Parameters("HEADER_ID").Value = "some value"
GetCMD.Parameters.Add("@OPTIONS", SqlDbType.Binary)
GetCMD.Parameters("@OPTIONS").Value = HOW DO I SET THIS VALUE????
rowsaffected = GetCMD.ExecuteNonQuery()

I assume serialization but have not figured out how. Anyone's help is greatly appreciated!!
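A binary parameter is set from a byte array, and the Binary type wants an explicit size in the Add call. A minimal completion of the snippet above, in the same VB as the post; the BinaryFormatter serialization and optionsObject are assumptions, as any Byte() works:

' Assumption: the options come from serializing some object; any Byte() works.
Dim ms As New System.IO.MemoryStream()
Dim bf As New System.Runtime.Serialization.Formatters.Binary.BinaryFormatter()
bf.Serialize(ms, optionsObject)   ' optionsObject is hypothetical
Dim bytes As Byte() = ms.ToArray()

GetCMD.Parameters.Add("@OPTIONS", SqlDbType.Binary, bytes.Length)
GetCMD.Parameters("@OPTIONS").Value = bytes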
I am inserting data into a spreadsheet from a user interface (PowerBuilder). But at the same time, someone can open that Excel spreadsheet to read the data, and then the process fails (it isn't able to write the data into the spreadsheet). How can I avoid this situation? I'd really appreciate it if anyone could shed some light.
I currently have the problem that I have to write some data into a Sun Directory Server 5.2 LDAP directory. Does anyone know how I can do this? I have already found some articles in this forum that provide solutions for accessing Active Directory, but how can I access a non-Microsoft LDAP server?
Is there any way to use the OLE DB Destination, or do I have to implement my own LDAP adapter in VB.NET?
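One route that doesn't depend on Active Directory: System.DirectoryServices.Protocols speaks plain LDAP v3, so it can talk to a Sun/iPlanet directory as well. A minimal sketch of adding one entry (shown in C#; the server, credentials, DN, and attributes are all placeholders):

using System.DirectoryServices.Protocols;   // generic LDAP, not AD-specific
using System.Net;

using (LdapConnection conn = new LdapConnection(
    new LdapDirectoryIdentifier("sunserver.example.com", 389)))
{
    conn.SessionOptions.ProtocolVersion = 3;
    conn.AuthType = AuthType.Basic;   // simple bind with a directory account
    conn.Credential = new NetworkCredential("cn=Directory Manager", "secret");
    conn.Bind();

    AddRequest add = new AddRequest(
        "uid=jdoe,ou=people,dc=example,dc=com",   // placeholder DN
        new DirectoryAttribute("objectClass", "inetOrgPerson"),
        new DirectoryAttribute("cn", "John Doe"),
        new DirectoryAttribute("sn", "Doe"));
    conn.SendRequest(add);
}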
I am trying to write a stored procedure which inserts data from a flat file into a table, but I want to align the rows for data transformation, i.e. specify which column of the file should be transferred to which column of the existing table. Can anyone help me with this? I know how to do it through DTS or SSIS, but I just want it in script...
Hey everyone, I've got this question that has me stuck for the last few days, but it's an important part of my website. What I am trying to do is basically have a user be able to upload a file, have that uploaded file plus some other info automatically display on other parts of my site, and have a different user eventually be able to download that file. I have thought about allowing the file upload as a BLOB, but still cannot find a proper way to execute this using VB, plus I have heard that this way of doing it is not recommended because databases were not designed to store large files like this; lots of articles recommend having the file uploaded to a folder on your server, then placing data in the database to reference that particular file. Well, this also proves to be a lot harder than said. Here is what I've got so far (written in C#):

protected void UploadBtn_Click(object sender, EventArgs e)
{
    if (FileUpLoad1.HasFile)
    {
        FileUpLoad1.SaveAs(@"C:\Documents and Settings\Adam\My Documents\Visual Studio 2005\Projects\Website\Files\" + FileUpLoad1.FileName);
        Label1.Text = "File Uploaded: " + FileUpLoad1.FileName;
    }
    else
    {
        Label1.Text = "No File Uploaded.";
    }
}

and here is the ASP part of the code that goes with it:

<asp:FileUpLoad id="FileUpLoad1" runat="server" />
<asp:Label ID="Label1" runat="server" Text=""></asp:Label>

Now, from what I know, I need to get the binary of the file, which I have read you can do with the Page.Request.Files statement, but again I'm not sure how I would implement this. Does anyone have any suggestions on which way I should take when dealing with this? Should I try to just use the BLOB method or use the reference method? And if so, how would I implement it? Heck, even some good tutorials on the subject would be great... Thanks. Adam
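A minimal sketch of the reference method the articles describe, continuing from the click handler above: save to disk as already done, then record the name and a download path in the database. The UploadedFiles table, its columns, and connStr are hypothetical, and the sketch assumes the save folder lives under the web root so the stored path is servable:

using System.Data.SqlClient;
using System.IO;

// After FileUpLoad1.SaveAs(...): store a reference row, not the bytes.
string fileName = Path.GetFileName(FileUpLoad1.FileName);
using (SqlConnection conn = new SqlConnection(connStr))
using (SqlCommand cmd = new SqlCommand(
    "INSERT INTO UploadedFiles (FileName, StoredPath) VALUES (@n, @p)", conn))
{
    cmd.Parameters.AddWithValue("@n", fileName);
    cmd.Parameters.AddWithValue("@p", "~/Files/" + fileName); // app-relative download path
    conn.Open();
    cmd.ExecuteNonQuery();
}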
let's say I have a table named "myTable" in a SQL database:
UniqueID FirstName FamilyName
1 Elizabeth Moore
2 Chris Lee
3 Robert McDonald's
I want to create a SQL query that contains a parameter, for example: SELECT * FROM myTable WHERE UniqueID = @TextBox OR FirstName = @TextBox OR FamilyName = @TextBox,
and when I type the ' * ' character in my TextBox, it should retrieve the whole table...
Hope somebody understood; I will be happy to explain more.
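A minimal sketch of one way to handle the ' * ' case inside the query itself (conn and TextBox1 are placeholders):

using System.Data.SqlClient;

// When @TextBox is '*', the first predicate matches every row, so the whole
// table comes back. UniqueID is cast to varchar because comparing an int
// column to '*' directly would fail with a conversion error.
SqlCommand cmd = new SqlCommand(
    "SELECT * FROM myTable " +
    "WHERE @TextBox = '*' " +
    "OR CAST(UniqueID AS varchar(20)) = @TextBox " +
    "OR FirstName = @TextBox " +
    "OR FamilyName = @TextBox", conn);
cmd.Parameters.AddWithValue("@TextBox", TextBox1.Text);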
I have a package that reads three database tables (header, detail and trailer records); each table is connected via an OLE DB source in SSIS. Each table varies in the number of columns it holds, and none of the tables share the same field names. I need to transfer all data, from each table, in order, to a flat file destination.
I am using a foreach loop, with data from an ADO recordset, which contains the name of the table that I wish to write to an OLE DB destination. The table names are retrieved by an Execute SQL task into an object variable. Within the foreach loop, for each table name, I then use a DataReader on an ADO.NET source to pull data from that table, via an expression construct in a variable, i.e. "select * from " + @[User::table_name]. This works fine for the first table, for which the mappings are set up in the SSIS design environment; the data is retrieved. I then use a variable and set the data access mode for the OLE DB destination to "Table name or view name variable". This also saves data fine for the first table in the loop. When the next table name is retrieved from the ADO provider in the foreach loop, the DataReader fails, as it still thinks the metadata mappings are those of the first table, which was used for the mapping in the design environment. I.e. FIN_CLASS is a column from the first table in the loop.
Error: 0xC0202005 at Data Flow Task, DataReader Source [7181]: Column "FIN_CLASS" cannot be found at the datasource.
I have set the following properties that I thought (in my feeble mind) are supposed to avoid that behavior: for the DataReader, I set ValidateExternalMetadata to false, and for the Data Flow task (the container for the DataReader), I set DelayValidation to true. These settings, according to the docs, are supposed to evaluate metadata for the DataReader source at runtime (not design time), so that the column metadata is dynamic and the subsequent OLE DB destination can use the "Table name or view name variable" data access mode.
If I cannot get this to work, I have 2 options: use OPENQUERY via dynamic T-SQL statements, or create 30 separate flows in SSIS, one for each table. Not looking forward to that one.