For years I used an application that had an MSDE database.
Recently I upgraded my computer to Windows Vista, so I had to upgrade MSDE to SQL Express.
I've managed to attach the old MSDE database of the application to SQL Express, and I am able to retrieve my data from within the application.
My problem is that I cannot write or update data from within the application anymore. However, if I open a table in SQL Server Management Studio Express I can write/update data with no problem.
The company that built the application no longer exists, so I can't get any support from them.
The only thing I know for a fact is that the application is connecting to the database using the sa user.
I am getting the following error running a data flow that splits the input data into multiple streams and writes the results of each stream to the same destination table:
"This operation conflicts with another pending operation on this transaction. The operation failed."
The flow starts with a single source table with one row per student and multiple scores for that student. It does a few lookups and then splits the stream (using Multicast) in several layers, ultimately generating 25 destinations (one for each score to be recorded), all going to the same table (like a fact table). This all is running under a transaction at the package level, which is distributed to a separate machine.
Apparently, I cannot have all of these streams inserting data into the same table at one time. I don't understand why not. In an OLTP system, many transactions are inserting records into the same table at once. Why can't I do that within the same transaction?
I suppose I can use a Union All to join the streams back together before writing to a single destination, but that seems like an unnecessary waste and clutters the flow. Can anyone offer a different solution, or a reason why this fails in the first place?
We are upgrading our servers from SQL Server 6.5 to 7.0. I used the same configuration on both the 6.5 server and the 7.0 server (ANSI settings and the others), but the 7.0 server turns out to have different data in some tables. Here is an example row from one table:
WT_PAY_PERIOD_DATE:   2002-04-20 00:00:00.000
WT_RDC_HOURS:         23.50
WT_RDC_EXPENSE:       135.2050
WT_MS_SUB_HOURS:      23.50
WT_MS_APPR_IND:       Y
WT_NEW_EMPL_FLAG:     X
WT_ASSOC_OTH_EXPENSE: 3.7600
WT_ASSOC_TOT_EXP:     135.2050
WT_MS_TOT_EXP:        135.2050
WT_MS_MILE_LIM:       373

I have checked the datatypes that were transferred from SQL 6.5; they are the same types as in 6.5, and the configuration is the same on both servers. Why are the values different, and what is the way to fix it?
I am trying to use a recordable CD as the storage medium for a comma delimited flat file destination in an SSIS package. I am running the developer edition of SQL Server 2005 on Windows XP SP2.
I discovered that in the file connection manager for a flat file, the path to the file can point to a directory on the CD-R. When I set the file name for the connection manager and point it to a file on the CD, the path looks something like this:
C:\Documents and Settings\juser\Local Settings\Application Data\Microsoft\CD Burning\Archive\Monthly\MCTxnDetail_Archive.txt
When I run the package, it executes without a problem, but no 'burning' occurs. When I check the directory specified in 'file name', there is a header in the directory that states 'Files Ready to Be Written to the CD', and the file I want to write is listed there.
Is there some executable I can kick off in another task to complete the process of burning the file to the CD?
Hello, historically it has been frowned upon to store textual data in SQL Server. In v6.5 and before, 2K pages were allocated regardless of the actual data size. Is this still the case in v7.0? I have a situation where an intranet application wants to store documents/images/etc. Currently, we need to grant security to each user who connects to the web server to allow them to write files. However, if we write the data via SQL Server, then only the application needs permission. Is this the recommended method? What other options can I use? tia, Lee
Hi guys, I have been trying to write data from a database to XML files, and although I have been successful in doing that, the files that are made are not of the form I would want. I am using:

SQLDataSet.WriteXml(strFileName, XmlWriteMode.WriteSchema);

to generate the files, where strFileName is the name of the file. When viewed with an XML editor, the files are seen as lines of data instead of the regularly generated XML where each row is one below the other. When viewed with WordPad/Notepad I see elements '/><' run together between the data, and somehow I think this has something to do with the problem. Can someone help me with this? Any help or suggestion would be greatly appreciated. Thanks.
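A minimal sketch of one likely fix, assuming the DataSet and file name are the ones from the post: the run-together '/><' output happens because WriteXml emits no whitespace by default, so wrapping it in an XmlWriter with Indent turned on puts each element on its own line.

using System.Data;
using System.Xml;

static void WriteIndentedXml(DataSet sqlDataSet, string strFileName)
{
    // Indent = true makes the writer insert line breaks and indentation
    // between elements instead of producing one continuous run of markup.
    var settings = new XmlWriterSettings { Indent = true };
    using (XmlWriter writer = XmlWriter.Create(strFileName, settings))
    {
        sqlDataSet.WriteXml(writer, XmlWriteMode.WriteSchema);
    }
}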
I am working on a project where data is stored remotely in a Postgres database. I need to download some of the postgres data and store it in the SQL Server. The data in PG is in UTF-8. I use another application to write the data to the PG database. To talk to the PG database I am using the npgsql data provider (http://gborg.postgresql.org/project/npgsql/projdisplay.php). The data I am trying to download is arabic.
Everything seems to work fine except when I get the data from the PG DB and write it to SQL Server. I've done lots of debugging and can see that the data is correctly in Arabic right up until I do the update on the local dataset, which saves it to SQL Server. For some strange reason it turns the data into gibberish (just question marks).
I am using SQL Server 2005 Express. If anyone can help me with this I'd be extremely grateful, as this has become a big problem and I've tried to find a solution without any success.
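A minimal sketch of the usual cause, using a hypothetical Students/StudentName table: question marks typically mean the text passed through a non-Unicode type somewhere, so the SQL Server column needs to be NVARCHAR (not VARCHAR) and the value needs to be sent as an NVarChar parameter.

using System.Data;
using System.Data.SqlClient;

static void SaveName(string connectionString, int id, string arabicName)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "UPDATE Students SET StudentName = @name WHERE StudentId = @id", conn))
    {
        // NVarChar keeps the Arabic text as Unicode end to end; SqlDbType.VarChar
        // (or a varchar column) would force it through a code page and produce '?'.
        cmd.Parameters.Add("@name", SqlDbType.NVarChar, 200).Value = arabicName;
        cmd.Parameters.Add("@id", SqlDbType.Int).Value = id;
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}

The same applies when a data adapter saves a local DataSet: if the destination column in SQL Server is not a Unicode type, the conversion to '?' happens server-side regardless of what the application sends.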
I am using OLE DB Destination to write data to a SQL server database. However, nothing is written to the database though there is no error reported. See the following output:
SSIS package "Tbl_Dim_Dates.dtsx" starting.
Information: 0x4004300A at Tbl_Dim_Dates, DTS.Pipeline: Validation phase is beginning.
Information: 0x4004300A at Tbl_Dim_Dates, DTS.Pipeline: Validation phase is beginning.
Information: 0x40043006 at Tbl_Dim_Dates, DTS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Tbl_Dim_Dates, DTS.Pipeline: Pre-Execute phase is beginning.
Information: 0x4004300C at Tbl_Dim_Dates, DTS.Pipeline: Execute phase is beginning.
Information: 0x402090DF at Tbl_Dim_Dates, OLE DB Destination [2396]: The final commit for the data insertion has started.
Information: 0x402090E0 at Tbl_Dim_Dates, OLE DB Destination [2396]: The final commit for the data insertion has ended.
Information: 0x40043008 at Tbl_Dim_Dates, DTS.Pipeline: Post Execute phase is beginning.
Information: 0x40043009 at Tbl_Dim_Dates, DTS.Pipeline: Cleanup phase is beginning.
Information: 0x4004300B at Tbl_Dim_Dates, DTS.Pipeline: "component "Date extract to file" (924)" wrote 3652 rows.
Information: 0x4004300B at Tbl_Dim_Dates, DTS.Pipeline: "component "Raw File Destination" (2518)" wrote 3652 rows.
Information: 0x4004300B at Tbl_Dim_Dates, DTS.Pipeline: "component "OLE DB Destination" (2396)" wrote 3652 rows.
I'm trying to do this:

1) Have SQL Server return some data from this simple stored procedure. It returns XML:

SELECT * FROM eAccessTypes FOR XML AUTO

2) In my C# I capture the return value in an XmlReader like so:

XmlReader xmlr = mySqlCommand.ExecuteXmlReader();

After this I want to write the xmlr data to a file. Any ideas?
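A minimal sketch of one way to do it, reusing mySqlCommand from the post and writing to a placeholder path: FOR XML AUTO returns an XML fragment with no single root element, so the writer is set to ConformanceLevel.Fragment before streaming the reader's nodes into the file.

using System.Xml;

var settings = new XmlWriterSettings
{
    Indent = true,
    ConformanceLevel = ConformanceLevel.Fragment   // FOR XML AUTO has no root element
};

using (XmlReader xmlr = mySqlCommand.ExecuteXmlReader())
using (XmlWriter writer = XmlWriter.Create(@"C:\temp\eAccessTypes.xml", settings))
{
    xmlr.Read();                        // position on the first element
    while (!xmlr.EOF)
    {
        writer.WriteNode(xmlr, true);   // writes the current element and advances the reader
    }
}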
Hi guys, in my project I need to write the data from an XML file to the database through a stored procedure. Online, a lot of help is available for writing data from the database to XML, but I didn't see much about writing XML to the database. If anyone could give me some code I would really appreciate it (I am a newbie). This application is developed using ASP.NET 1.1 and SQL Server 2005. Thanks for your help in advance.
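A rough sketch of one client-side approach, with a hypothetical stored procedure name (dbo.InsertOrder) and element/column names: read the XML file into a DataSet and call the procedure once per row. (The alternative is to pass the whole XML string to the procedure and shred it server-side with OPENXML or, on SQL Server 2005, the xml data type's nodes() method.)

using System.Data;
using System.Data.SqlClient;

static void LoadXmlIntoDb(string xmlPath, string connectionString)
{
    var ds = new DataSet();
    ds.ReadXml(xmlPath);   // builds one DataTable per repeating element in the file

    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();
        foreach (DataRow row in ds.Tables[0].Rows)
        {
            using (var cmd = new SqlCommand("dbo.InsertOrder", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.Add("@OrderId", SqlDbType.Int).Value = row["OrderId"];
                cmd.Parameters.Add("@Amount", SqlDbType.Decimal).Value = row["Amount"];
                cmd.ExecuteNonQuery();
            }
        }
    }
}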
I'm trying to read a byte array of an image datatype from SQL Server, and then put it into another field in the database. I get a byte array, but somehow the image doesn't get written to the DB properly with the SQL parameters. Does anyone have an idea how to tackle this problem?
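A minimal sketch with hypothetical table/column names (Pictures/Photo): the key point is that the parameter is typed as SqlDbType.Image and given the byte array directly, so the provider doesn't try to convert it on the way in.

using System.Data;
using System.Data.SqlClient;

static void CopyImage(string connectionString, int sourceId, int targetId)
{
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();

        // Read the image column as a byte array.
        byte[] imageBytes;
        using (var select = new SqlCommand(
            "SELECT Photo FROM Pictures WHERE Id = @id", conn))
        {
            select.Parameters.Add("@id", SqlDbType.Int).Value = sourceId;
            imageBytes = (byte[])select.ExecuteScalar();
        }

        // Write it to another row through an image-typed parameter.
        using (var update = new SqlCommand(
            "UPDATE Pictures SET Photo = @photo WHERE Id = @id", conn))
        {
            update.Parameters.Add("@photo", SqlDbType.Image, imageBytes.Length).Value = imageBytes;
            update.Parameters.Add("@id", SqlDbType.Int).Value = targetId;
            update.ExecuteNonQuery();
        }
    }
}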
The database I am using is SQL Server 6.5. In a trigger I have written code to transfer updated records from one table to another table. These updated records need to be written to a text file. I have used xp_cmdshell, but it is taking too long. Is there another way to write the data to a flat file?
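One hedged alternative, sketched with hypothetical names: keep the trigger doing nothing but a fast INSERT into an audit table, and let a separate scheduled process (or a scheduled bcp command) drain that table to the text file, so the slow file I/O happens outside the transaction that fired the trigger. A rough sketch of the export side:

using System.Data.OleDb;
using System.IO;

static void ExportAuditRows(string filePath)
{
    // SQLOLEDB can connect to older servers; a scheduled "bcp ... out" of the
    // audit table would do the same job with no code at all.
    string connectionString =
        "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=MyDb;User ID=sa;Password=...;";

    using (var conn = new OleDbConnection(connectionString))
    using (var cmd = new OleDbCommand("SELECT Id, Payload FROM UpdatedRecordsAudit", conn))
    using (var writer = new StreamWriter(filePath, true))   // append to the flat file
    {
        conn.Open();
        using (OleDbDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                writer.WriteLine("{0},{1}", reader["Id"], reader["Payload"]);
            }
        }
    }
}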
Hi, I have a data structure called 'Quote' which contains a number of different variables populated from controls ranging from text boxes to check boxes and radio buttons. I need to be able to read and write this from a database.
First I think a description of my overall project is needed:
Project Description: I have been given a brief that basically says I have to create a programmed solution in VB to solve a problem. This problem can be anything we like, and I personally have chosen to create a program that manages quotes for building log cabins (this is very contrived and far from anything someone would do in the real world).
My solution will allow a generic user to create a quote (using a form with controls such as text boxes, check boxes, radio buttons) , and then save this to file. These users may then wish to load/edit this quote at a later date, from another form.
Whilst completing this project, I'll only have up to about 5 records (quotes) within the system, so I don't need the ability to store hundreds of records, and each record will be relatively short, with only about 10-15 data items within the data structure.
Also, the Admin (or business owner in this case) needs to be able to view all saved quotes in a presentable format, and edit them if need be, from within this same program.
This solution does not need to be absolutely perfect and 100% efficiently coded, or have all the bells and whistles a real-world program would have. This is for an A level computing project by the way.
So basically, I need to be able to read from the database (to populate a Data Grid, which I imagine is the best way?) so the Admin can access any quote and edit it (editing is not vital, but viewing/printing is; maybe I should stop at just viewing any quote?). I also need generic users to be able to fill in the Edit Quote form and then save this data into the database.
And is a data structure really required for me to use a database?
I've never used databases in VB before (but have used them elsewhere, mainly Access) and so am completely new to this. Any help will be much appreciated. Thanks
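A rough sketch of the two database operations the project needs, using hypothetical table/column names (Quotes, CustomerName, TotalPrice); the same ADO.NET calls exist in VB.NET, only the syntax differs.

using System.Data;
using System.Data.SqlClient;

// Load every quote into a DataTable that can be bound straight to a grid for the Admin view.
static DataTable LoadAllQuotes(string connectionString)
{
    var table = new DataTable();
    using (var adapter = new SqlDataAdapter("SELECT * FROM Quotes", connectionString))
    {
        adapter.Fill(table);
    }
    return table;
}

// Save one quote, taking its values from the form's controls.
static void SaveQuote(string connectionString, string customerName, decimal totalPrice)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "INSERT INTO Quotes (CustomerName, TotalPrice) VALUES (@name, @price)", conn))
    {
        cmd.Parameters.Add("@name", SqlDbType.NVarChar, 100).Value = customerName;
        cmd.Parameters.Add("@price", SqlDbType.Decimal).Value = totalPrice;
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}

As for whether a separate data structure is required: it isn't strictly necessary, since a DataTable row (or the form's controls) can hold the 10-15 data items directly, but keeping the Quote structure can make the load/save code easier to follow.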
I am having a problem with an ASP program that inserts data into a table on SQL Server 2000.
No error message is returned upon submission, and the confirmation message that displays after the commit command is sent to the server appears, but when we go to the DB the data isn't there. This is an occasional occurrence; usually the data is there, but sometimes it isn't. Other forms function just fine using the *exact* same file to perform the submit (all the forms "include" the same submit page). The only difference we can find is a trigger on the problem table that executes on update, capturing who updated the record and when. From what we can see, this is the only programmatic difference. The other thought tickling our minds was the possibility of a simultaneous submission: since all the users submit under the same DB user name via the form, if user 1's data gets written but not yet committed, user 2's data is submitted, and then user 1's commit runs as the program steps through in sequence, would user 1's commit cause either of the inserted-but-uncommitted records to be lost? If so, why wouldn't that be causing problems on other forms ...
I just installed SQL 2005 on a new box. I want to move a database from a SQL 2000 server to the new server. Can I detach the database, copy it and attach it in the new server without having problems?
I'm concerned that since the data file is in SQL 2000 format, when I attach it to the SQL 2005 server, will it still be in the old format or will it be upgraded? Or is this something you don't have to worry about, and why?
I wanted to upgrade pre-release version of SQL Express to the latest version, and wanted to make sure that none of the database information would be lost. The thing that concerns me is that it states to uninstall previous versions of SQL Server 2005 Express before installing the latest version. I threw the database tables into a backup directory, and I think this should be fine, but I want to be sure. Let me know. Thanks!
Can someone show me the command string required to write data out to 2 or 3 tables at once? How about writing 2 entries at once? Right now my solution writes 1 record to a table and then writes somewhere around 40-200 records to a second, child table. Right now I'm writing like this:

Open --> Add 1 record to table #1 --> Close
Open --> Add 1 record to table #2 --> Close
Open --> Add 1 record to table #2 --> Close
Open --> Add 1 record to table #2 --> Close
Open --> Add 1 record to table #2 --> Close

and I'm just wondering if there is a better approach, such as writing all the data at once. Something like:

Open --> Add 1 record to table #1 --> Add 5 records to table #2 --> Close
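A minimal sketch of the "open once, write everything, close once" pattern, with hypothetical Parent/Child table names: one connection and one transaction, the parent row first, then all the child rows reusing a prepared command.

using System;
using System.Data;
using System.Data.SqlClient;

static void SaveParentAndChildren(string connectionString, string parentName, string[] childValues)
{
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();
        using (SqlTransaction tx = conn.BeginTransaction())
        {
            // Insert the parent record and get its identity value back.
            var parentCmd = new SqlCommand(
                "INSERT INTO Parent (Name) VALUES (@name); SELECT SCOPE_IDENTITY();", conn, tx);
            parentCmd.Parameters.Add("@name", SqlDbType.NVarChar, 100).Value = parentName;
            int parentId = Convert.ToInt32(parentCmd.ExecuteScalar());

            // Insert all the child records over the same open connection.
            var childCmd = new SqlCommand(
                "INSERT INTO Child (ParentId, Value) VALUES (@parentId, @value)", conn, tx);
            childCmd.Parameters.Add("@parentId", SqlDbType.Int).Value = parentId;
            SqlParameter valueParam = childCmd.Parameters.Add("@value", SqlDbType.NVarChar, 200);

            foreach (string value in childValues)
            {
                valueParam.Value = value;
                childCmd.ExecuteNonQuery();
            }

            tx.Commit();   // parent and children become visible together, or not at all
        }
    }
}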
I have a project where users can sign in and change information in a SQL Server database. The catch is that this site will be on a different domain name and with a different hosting company than where the SQL database is located. Sorry if this is a dumb question, but how can I use ASP.NET to view and change a SQL database that is located elsewhere? For example: a user logs into www.something.com and he/she can view and edit SQL tables from www.somethingelse.com's database. Thanks in advance.
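A minimal sketch (server name, credentials and query are placeholders): nothing in ADO.NET ties a page to a database on its own domain; the connection string simply names the remote host, and this works as long as the hosting company for the database allows remote SQL Server connections (typically TCP port 1433) from your web server.

using System.Data.SqlClient;

static object CountUsersOnRemoteDb()
{
    string connectionString =
        "Server=sql.somethingelse.com;Database=SiteDb;User ID=webuser;Password=...;";

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Users", conn))
    {
        conn.Open();
        return cmd.ExecuteScalar();   // viewing and editing work the same way, just with different SQL
    }
}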
First of all, let me say that ASP.NET is a new programming environment for me, so please forgive my ignorance. Can someone please tell me how to write data to a SQL table column that has a Binary data type? I have a stored procedure on the SQL Server that I am calling to insert data into a table. I build a parameter list and set the values. It worked just fine before I added a binary field to the SQL table. My problem is that I don't know how to set the Binary data type to pass it to the stored procedure. Here is part of the code:

GetCMD = Myconnection.CreateCommand
GetCMD.CommandType = CommandType.StoredProcedure
GetCMD.CommandText = "SCHEMANAME.InsertLineItem"
GetCMD.Parameters.Add("HEADER_ID", SqlDbType.VarChar, 150)
GetCMD.Parameters("HEADER_ID").Value = "some value"
GetCMD.Parameters.Add("@OPTIONS", SqlDbType.Binary)
GetCMD.Parameters("@OPTIONS").Value = HOW DO I SET THIS VALUE????
rowsaffected = GetCMD.ExecuteNonQuery()

I assume serialization but have not figured out how. Anyone's help is greatly appreciated!!
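A sketch of the missing step only: a Binary parameter just takes a byte array. If OPTIONS is already a byte array, pass it straight in; if it is an object, it has to be turned into bytes first (BinaryFormatter shown here as one assumed way, in C# for illustration).

using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

static byte[] ToBytes(object options)
{
    // The type being serialized must be marked [Serializable].
    var formatter = new BinaryFormatter();
    using (var stream = new MemoryStream())
    {
        formatter.Serialize(stream, options);
        return stream.ToArray();
    }
}

// Back in the parameter list from the post (VB):
// GetCMD.Parameters.Add("@OPTIONS", SqlDbType.Binary, bytes.Length)
// GetCMD.Parameters("@OPTIONS").Value = bytes   ' where bytes = ToBytes(myOptions)

If the serialized data can exceed 8,000 bytes, the column and the parameter would need to be image (or varbinary(max) on SQL Server 2005) rather than plain binary.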
I am inserting data into a spreadsheet from a user interface (PowerBuilder). But at the same time, someone can open that Excel spreadsheet to read the data, and then the process fails (it isn't able to write the data to the spreadsheet). How do I avoid this situation? I would really appreciate it if anyone can shed some light on this.
I currently have the problem that I have to write some data into a Sun Directory Server 5.2 LDAP directory. Does anyone know how I can do this? I have already found some articles in this forum that provide solutions for accessing Active Directory, but how can I access a non-Microsoft LDAP server?
Is there any way to use the OLE DB Destination, or do I have to implement my own LDAP adapter in VB.NET?
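A rough sketch of one option (host, credentials, DN and attributes are all placeholders): System.DirectoryServices.Protocols speaks plain LDAP v3 rather than anything Active Directory specific, so it can write to Sun Directory Server; in SSIS this could run from a Script Component destination instead of the OLE DB Destination.

using System.DirectoryServices.Protocols;   // reference the System.DirectoryServices.Protocols assembly
using System.Net;

static void AddLdapEntry()
{
    var connection = new LdapConnection(new LdapDirectoryIdentifier("sunldap.example.com", 389))
    {
        AuthType = AuthType.Basic,
        Credential = new NetworkCredential("cn=Directory Manager", "password")
    };
    connection.SessionOptions.ProtocolVersion = 3;   // plain LDAP v3, not AD-specific

    // Build and send one add request for a new person entry.
    var request = new AddRequest(
        "uid=jsmith,ou=People,dc=example,dc=com",
        new DirectoryAttribute("objectClass", "inetOrgPerson"),
        new DirectoryAttribute("cn", "John Smith"),
        new DirectoryAttribute("sn", "Smith"),
        new DirectoryAttribute("uid", "jsmith"));

    connection.SendRequest(request);
}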
I am trying to write a stored procedure that inserts data from a flat file into a table, but I want to control how the columns are aligned for the data transformation, i.e. which source column should be transferred to which column in the existing table. Can anyone help me with this? I know how to do it through DTS or SSIS, but I just want it in a script...
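Two hedged options. In pure T-SQL, the usual pattern is BULK INSERT into a staging table whose columns match the file, followed by an INSERT ... SELECT that lists the destination columns in whatever order is needed. If a small amount of client code is acceptable instead, SqlBulkCopy lets you state the column alignment explicitly; a sketch with hypothetical file and table names:

using System.Data;
using System.Data.SqlClient;
using System.IO;

static void LoadFlatFile(string filePath, string connectionString)
{
    // Stage the comma-delimited file in a DataTable, one column per field.
    var staging = new DataTable();
    staging.Columns.Add("Col0");
    staging.Columns.Add("Col1");
    staging.Columns.Add("Col2");

    foreach (string line in File.ReadAllLines(filePath))
    {
        staging.Rows.Add(line.Split(','));
    }

    using (var bulk = new SqlBulkCopy(connectionString))
    {
        bulk.DestinationTableName = "dbo.ExistingTable";
        // This is the "which column goes where" part.
        bulk.ColumnMappings.Add("Col0", "CustomerName");
        bulk.ColumnMappings.Add("Col1", "City");
        bulk.ColumnMappings.Add("Col2", "Country");
        bulk.WriteToServer(staging);
    }
}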
Hey everyone, I've got a question that has had me stuck for the last few days, but it's an important part of my website. What I am trying to do is basically have a user be able to upload a file, have that uploaded file plus some other info automatically display on other parts of my site, and have a different user eventually be able to download that file. I have thought about storing the upload as a BLOB but still cannot find a proper way to do this in VB; plus I have heard that this way of doing it is not recommended because databases were not designed to store large files like this. Lots of articles recommend uploading the file to a folder on your server and then putting data in the database that references that particular file. Well, this also proves to be a lot harder than it sounds. Here is what I have so far (written in C#):

protected void UploadBtn_Click(object sender, EventArgs e)
{
    if (FileUpLoad1.HasFile)
    {
FileUpLoad1.SaveAs(@"C:Documents and SettingsAdamMy DocumentsVisual Studio 2005ProjectsWebsiteFiles" + FileUpLoad1.FileName); Label1.Text = "File Uploaded: " + FileUpLoad1.FileName; } else { Label1.Text = "No File Uploaded."; } } and here is the asp part of the code that goes with it<asp:FileUpLoad id="FileUpLoad1" runat="server" />
<asp:Label ID="Label1" runat="server" Text=""></asp:Label> Now from what I know is I need to get the binary of the file which I have read you can do with the Page.Request.Files statement but again not sure how I would impliment this. Does anyone have any suggestions on which way I should take when dealing with this should I try and just use the BLOB method or use the binary refrence method? and if so how would I impliment this, heck even some good tutorials on the subject would be great... Thanks.....Adam
let's say I have a table named "myTable" in a SQL database:
UniqueID FirstName FamilyName
1 Elizabeth Moore
2 Chris Lee
2 Robert McDonald's
I want to create a SQL query that contains a parameter, for example: SELECT * FROM myTable WHERE UniqueID = @TextBox OR FirstName = @TextBox OR FamilyName = @TextBox,
and when I type the ' * ' character in my TextBox, it should retrieve the whole table...
I hope somebody understood; I will be happy to explain more.
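A sketch of one way to get the "* means everything" behaviour while keeping a single parameterized query: an extra "@TextBox = '*'" branch makes the WHERE clause true for every row when the wildcard is typed, and the CAST keeps the UniqueID comparison from failing when the text isn't numeric.

using System.Data;
using System.Data.SqlClient;

static DataTable SearchMyTable(string connectionString, string textBoxValue)
{
    const string sql =
        "SELECT * FROM myTable " +
        "WHERE @TextBox = '*' " +
        "   OR CAST(UniqueID AS NVARCHAR(20)) = @TextBox " +
        "   OR FirstName = @TextBox " +
        "   OR FamilyName = @TextBox";

    var results = new DataTable();
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.Add("@TextBox", SqlDbType.NVarChar, 100).Value = textBoxValue;
        using (var adapter = new SqlDataAdapter(cmd))
        {
            adapter.Fill(results);   // Fill opens and closes the connection itself
        }
    }
    return results;
}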
I have a package that reads three database tables (header, detail and trailer records); each table is connected via an OLE DB source in SSIS. Each table varies in the number of columns it holds, and none of the tables have the same field names. I need to transfer all data, from each table, in order, to a flat file destination.
I am using a Foreach Loop with the data from an ADO recordset, which contains the table names that I wish to write to an OLE DB destination. The table names are retrieved by an Execute SQL task into an object variable. Within the Foreach Loop, for each table name, I then use a DataReader source against an ADO.NET connection to pull data from that table, via an expression-built variable, i.e. "select * from " + @[User::table_name]. This works fine for the first table, for which the mappings were set up in the SSIS design environment, and the data is retrieved. I then use a variable and set the data access mode for the OLE DB destination to "Table name or view name variable". This also saves the data fine for the first table in the loop. When the next table name is retrieved from the ADO provider in the Foreach Loop, the DataReader fails, because it still thinks the metadata mappings are those of the first table, which was used for the mapping at design time (FIN_CLASS is a column from the first table in the loop):
Error: 0xC0202005 at Data Flow Task, DataReader Source [7181]: Column "FIN_CLASS" cannot be found at the datasource.
I have set the following properties which I thought (in my feeble mind) were supposed to avoid that behavior: for the DataReader source I set ValidateExternalMetadata to false, and for the Data Flow task (the container for the DataReader) I set DelayValidation to true. These settings, according to the documentation, are supposed to make the metadata for the DataReader source be evaluated at runtime (not design time), so that the column metadata is dynamic and the subsequent OLE DB destination can use the "Table name or view name variable" data access mode.
If I cannot get this to work, I have two options: use OPENQUERY via dynamic T-SQL statements, or create 30 separate data flows in SSIS, one for each table. Not looking forward to that one.
I have a program where I have to do a weekly data upload of approximately 1,500 records. I've written the SQL, but I need to know how I can add static header text to the output. The text I need in the header is: report name (static string) + #rows (row count of the SQL, int) + department (static string). I'm planning on writing it and saving it as a package, then scheduling it to run every Friday at 5:00. But I have been reading a little on Reporting Services and wonder if people think it might be the way to go? Is it a whole new area of security issues, or is it worth installing and learning? I will have more and more reports to write that will increase in complexity, and I want to know whether I should keep writing queries by hand or use Reporting Services. Thanks in advance! Dan
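A sketch of handling the header outside the SQL itself (the query, file path and header values are placeholders): buffer the result rows first so the row count is known, then write "report name, row count, department" as the first line of the weekly file.

using System.Collections.Generic;
using System.Data.SqlClient;
using System.IO;

static void WriteWeeklyExtract(string connectionString, string outputPath)
{
    var lines = new List<string>();

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand("SELECT EmployeeId, HoursWorked FROM WeeklyUpload", conn))
    {
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                lines.Add(reader["EmployeeId"] + "," + reader["HoursWorked"]);
            }
        }
    }

    using (var writer = new StreamWriter(outputPath))
    {
        // Static report name + row count + static department, then the data rows.
        writer.WriteLine("Weekly Upload Report,{0},Accounting", lines.Count);
        foreach (string line in lines)
        {
            writer.WriteLine(line);
        }
    }
}

Something like this could run as a small step in the scheduled package; whether to move the whole report to Reporting Services is a separate decision, since a fixed-format weekly extract like this doesn't require it.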
We're going to be upgrading our Server from 6.5 to 7.0. In addition we're getting a new SQL Server Box.
Which of the following is better: a) Install SQL 6.5 on the new box, restore the DB from old box to the new box, then install 7.0 on the new box and use the upgrade wizard to upsize the DB. (We'll have room to leave the 6.5 devices on the box after upgrading.) or b) Install SQL 7.0 on the new box and upsize the db across the network (100megabit network cards).
Thanks,
Kevin M. Tupper EZLinks Golf, Inc. http://www.ezlinksgolf.com
I'm following a white paper and have run chkupg65, and I am stuck at resolving keyword conflicts. I see many TechNet references about fixing them... but none about how. A little help? Below are the conflicts:

Keyword conflicts
Column name: SyncModification.ACTION [SQL-92 keyword]
Column name: Jobs.EXPIREDATE [SQL-92 keyword]
Column name: SqlStoreTable.SIZE [SQL-92 keyword]
Column name: JobDetails.TIME [SQL-92 keyword]
Column name: SchedulerSafeDelete.TIME [SQL-92 keyword]
Column name: SMSEvent.TIME [SQL-92 keyword]
Column name: SyncModification.TIME [SQL-92 keyword]
Column name: vJobDetails.TIME [SQL-92 keyword]
Column name: vSMSEvent.TIME [SQL-92 keyword]
Column name: Sites.TIMESTAMP [SQL-92 keyword]
Column name: QueryExpressions.VALUE [SQL-92 keyword]
Does anyone know of a good book or resource on the best way to upgrade 6.5 to 2000? Or generally, how would you do that considering you need to do it on more than 50 servers?
So I'm using Visual Studio's one-click deployment to install/update software for users. SQL Express 2005 is the default, and I need to get all current and new users to update to SQL Express 2005 SP2. On top of that, I need to change the sa password for all current users (I should be able to set it manually for the new ones).
The problem I'm having is that instead of just updating the current SQL Express engine installed, it creates a new instance of the SQL Express engine when installing SQL Express SP2.
Is there any way to just update SQL Express to SP2? And is there an easy way to change the sa password when doing so?
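For the password half, a sketch (the instance name and new password are placeholders): connect to the user's SQL Express instance with an admin login and run ALTER LOGIN. The password cannot be passed as a query parameter in ALTER LOGIN, so single quotes in it are doubled before the statement is built.

using System.Data.SqlClient;

static void ChangeSaPassword(string newPassword)
{
    string connectionString = @"Server=.\SQLEXPRESS;Integrated Security=true;";
    string escaped = newPassword.Replace("'", "''");   // escape quotes for the T-SQL literal

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "ALTER LOGIN [sa] WITH PASSWORD = N'" + escaped + "'", conn))
    {
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}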