Hello, I am trying to run this SQL statement but it's bombing out at comm.ExecuteNonQuery().
Could someone help me figure this out? For example:
System.Int32 bum = System.Convert.ToInt32(Request.QueryString["dum"]);
SqlConnection conn = new SqlConnection("Data Source=**********************");
SqlCommand comm = new SqlCommand();
comm.Connection = conn;
SqlDataAdapter myadapter = new SqlDataAdapter(comm);
DataSet myset = new DataSet();
conn.Open();
comm.CommandText = "Select * from Order_Forms where (Order_Num = " + Session["dum1"] + " ) ";
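For reference, here is my rough attempt at a parameterized version of the same statement, in case that matters (treat it as a sketch only; I'm assuming Order_Num is an integer column and that the value really comes from Session["dum1"]):

SqlConnection conn = new SqlConnection("Data Source=**********************");
SqlCommand comm = new SqlCommand("SELECT * FROM Order_Forms WHERE Order_Num = @OrderNum", conn);
// Session values come back as Object, so convert explicitly before binding the parameter.
comm.Parameters.Add("@OrderNum", SqlDbType.Int).Value = System.Convert.ToInt32(Session["dum1"]);
SqlDataAdapter myadapter = new SqlDataAdapter(comm);
DataSet myset = new DataSet();
conn.Open();
myadapter.Fill(myset);   // Fill runs the SELECT; ExecuteNonQuery is meant for INSERT/UPDATE/DELETE
conn.Close();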
I'm new at this so I apologize in advance for my ignorance.
I'm creating a website that collects dates in a calendar control (from Peter Blum). When the page containing the control loads it populates the calendar with dates from the database (that have previously been selected by the user). The user then can delete existing dates and/or add new dates.
I create a dataset when the page loads and use it to populate the calendar. When the user finishes adding and deleting dates in the calendar control, I delete the original dates from the dataset and then write the new dates to it. I then execute the data adapter's Update command to write the contents of the dataset back to the database. This command uses parameterized queries. For example, the Insert command is:
Dim cmdInsert As SqlCommand = New SqlCommand("INSERT INTO Requests VALUES(@fkPlayerIDNumber, @RequestDate, @PostDate, @fkGroupID)", conn)
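For context, this is roughly how I understand the command gets wired to the adapter (sketched in C# rather than VB, with ds being the dataset described above; I'm guessing at the column types, so it's an illustration only):

// Map each parameter of the insert to the matching column of the dataset table.
SqlCommand cmdInsert = new SqlCommand(
    "INSERT INTO Requests VALUES(@fkPlayerIDNumber, @RequestDate, @PostDate, @fkGroupID)", conn);
cmdInsert.Parameters.Add("@fkPlayerIDNumber", SqlDbType.Int, 4, "fkPlayerIDNumber");
cmdInsert.Parameters.Add("@RequestDate", SqlDbType.DateTime, 8, "RequestDate");
cmdInsert.Parameters.Add("@PostDate", SqlDbType.DateTime, 8, "PostDate");
cmdInsert.Parameters.Add("@fkGroupID", SqlDbType.Int, 4, "fkGroupID");

SqlDataAdapter adapter = new SqlDataAdapter("SELECT * FROM Requests", conn);
adapter.InsertCommand = cmdInsert;
// A DeleteCommand is wired up the same way so the rows removed from the dataset are deleted too.
adapter.Update(ds, "Requests");   // pushes the added and deleted rows back to the database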
OK, the problem: some tables are empty, so I can't be sure why some are updating at the DB and some aren't. I have manually picked through every line of the XML that I'm reading into the dataset here, and it is fine; the data is all valid. The tables I'm most worried about are bulletins and surveys, but they all have to be imported from my upload. The code steps through just fine and I've been through it a million times. Did I miss something in my data adapter configuration?

'daBulletin
Me.daBulletin.ContinueUpdateOnError = True
Me.daBulletin.DeleteCommand = Me.SqlDeleteCommand17
Me.daBulletin.InsertCommand = Me.SqlInsertCommand17
Me.daBulletin.SelectCommand = Me.SqlSelectCommand25
Me.daBulletin.TableMappings.AddRange(New System.Data.Common.DataTableMapping() {New System.Data.Common.DataTableMapping("Table", "tblBulletin", New System.Data.Common.DataColumnMapping() {New System.Data.Common.DataColumnMapping("BulletinID", "BulletinID"), New System.Data.Common.DataColumnMapping("ContractID", "ContractID"), New System.Data.Common.DataColumnMapping("Msg_Type", "Msg_Type"), New System.Data.Common.DataColumnMapping("DatePosted", "DatePosted"), New System.Data.Common.DataColumnMapping("Subject", "Subject"), New System.Data.Common.DataColumnMapping("B_Body", "B_Body"), New System.Data.Common.DataColumnMapping("I_Read_It", "I_Read_It"), New System.Data.Common.DataColumnMapping("DateRead", "DateRead")})})
Me.daBulletin.UpdateCommand = Me.SqlUpdateCommand16

Here is my merge function:

Private Function Merge(ByVal sFilename As String, ByVal User As String) As String
    Dim connMerge As New SqlConnection(ConnectionString)
    Dim dsNew As New dsBeetleTracks
    Dim dsExisting As New dsBeetleTracks
    Dim strResult As String
    SetConnections(connMerge)
    Dim idc As New System.Security.Principal.GenericIdentity(User)
    Dim currentUser As BeetleUser = bu
dsNew.ReadXml(sFilename)
If currentUser.IsInRole("Admin") Or currentUser.IsInRole("QA") Then If dsNew.tblBulletin.Count > 0 Then daBulletin.Fill(dsExisting.tblBulletin) dsExisting.Merge(dsNew.tblBulletin) strResult += daHelipads.Update(dsExisting.tblBulletin).ToString + " Bulletins updated<br>" End If End If
If dsNew.tblHours.Count > 0 And (currentUser.IsInRole("Survey") Or currentUser.IsInRole("Admin")) Then
    daHours.Fill(dsExisting.tblHours)
    dsExisting.Merge(dsNew.tblHours)
    strResult += daHours.Update(dsExisting.tblHours).ToString + " hours updated<br>"
End If
If dsNew.tblHeliPads.Count > 0 Then
    daHelipads.Fill(dsExisting.tblHeliPads)
    dsExisting.Merge(dsNew.tblHeliPads)
    strResult += daHelipads.Update(dsExisting.tblHeliPads).ToString & " helipads updated "
End If
If dsNew.tblExpenses.Count > 0 Then
    daExpenses.Fill(dsExisting.tblExpenses)
    dsExisting.Merge(dsNew.tblExpenses)
    strResult += daExpenses.Update(dsExisting.tblExpenses).ToString + " expenses updated<br>"
End If
If dsNew.tblPersons.Count > 0 And (currentUser.IsInRole("Survey") Or currentUser.IsInRole("FB") Or currentUser.IsInRole("Heli-burn")) Then
    daPersons.Fill(dsExisting.tblPersons)
    dsExisting.Merge(dsNew.tblPersons)
    strResult += daPersons.Update(dsExisting.tblPersons).ToString + " persons updated<br>"
End If
If currentUser.IsInRole("Field") Then daSurveys.SelectCommand.CommandText = "exec Surveys_Field_Select" daSurveys.InsertCommand.CommandText = "exec Surveys_Field_Insert" daSurveys.UpdateCommand.CommandText = "exec Surveys_Field_Update" End If
If dsNew.tblSurveys.Count > 0 And (currentUser.IsInRole("Survey") Or currentUser.IsInRole("Field")) Then ' Or CurrentUser.IsInRole("Admin")) Then
    strResult += daSurveys.Update(dsExisting.tblSurveys).ToString + " surveys updated<br>"
End If
If dsNew.tblSurveyChecks.Count > 0 And (currentUser.IsInRole("QA") Or currentUser.IsInRole("Admin")) Then
    daSurveyChecks.Fill(dsExisting.tblSurveyChecks)
    dsExisting.Merge(dsNew.tblSurveyChecks)
    strResult += daSurveyChecks.Update(dsExisting.tblSurveyChecks).ToString + " survey checks updated<br>"
End If
If dsNew.tblTreatments.Count > 0 And (currentUser.IsInRole("FB") Or currentUser.IsInRole("Heli-burn")) Then ' Or CurrentUser.IsInRole("Admin")) Then
    daTreatments.Fill(dsExisting.tblTreatments)
    dsExisting.Merge(dsNew.tblTreatments)
    strResult += daTreatments.Update(dsExisting.tblTreatments).ToString + " treatments updated<br>"
End If
If dsNew.tblInternalQC.Count > 0 And (currentUser.IsInRole("FB") Or currentUser.IsInRole("Heli-burn") Or currentUser.IsInRole("Survey")) Then ' Or CurrentUser.IsInRole("Admin")) Then
    daInternalQC.Fill(dsExisting.tblInternalQC)
    dsExisting.Merge(dsNew.tblInternalQC)
    strResult += daInternalQC.Update(dsExisting.tblInternalQC).ToString + " internalqc updated<br>"
End If
If dsNew.tblTreatmentChecks.Count > 0 And (currentUser.IsInRole("QA") Or currentUser.IsInRole("Admin")) Then
    Try
        daTreatmentChecks.Fill(dsExisting.tblTreatmentChecks)
        dsExisting.Merge(dsNew.tblTreatmentChecks)
        strResult += daTreatmentChecks.Update(dsExisting.tblTreatmentChecks).ToString + " treatment checks updated<br>"
    Catch dbex As DBConcurrencyException
        strResult += vbCrLf & dbex.Message
        For x As Integer = 0 To dbex.Row.Table.Columns.Count - 1
            strResult += vbCrLf & dbex.Row.GetColumnError(x)
        Next
    End Try
End If
If dsNew.tblHeliPiles.Count > 0 And (currentUser.IsInRole("Heli-burn")) Then ' Or CurrentUser.IsInRole("Planner")CurrentUser.IsInRole("QA") Or CurrentUser.IsInRole("Admin") Or
    daHeliPiles.Fill(dsExisting.tblHeliPiles)
    dsExisting.Merge(dsNew.tblHeliPiles)
    strResult += daHeliPiles.Update(dsExisting.tblHeliPiles).ToString + " piles updated<br>"
End If
If dsNew.tblHeliCycles.Count > 0 And (currentUser.IsInRole("Heli-burn")) Then ' CurrentUser.IsInRole("Planner") Or Or CurrentUser.IsInRole("Admin")) Then
    daHeliCycles.Fill(dsExisting.tblHeliCycles)
    dsExisting.Merge(dsNew.tblHeliCycles)
    strResult += daHeliCycles.Update(dsExisting.tblHeliCycles).ToString + " cycles updated<br>"
End If
If dsNew.tblHeliTurns.Count > 0 And (currentUser.IsInRole("Heli-burn")) Then 'CurrentUser.IsInRole("Admin") Or CurrentUser.IsInRole("Planner") Or
    daHeliTurns.Fill(dsExisting.tblHeliTurns)
    dsExisting.Merge(dsNew.tblHeliTurns)
    strResult += daHeliTurns.Update(dsExisting.tblHeliTurns).ToString + " turns updated<br>"
End If
If dsExisting.HasChanges Then
    dsExisting.Merge(dsNew)
End If
dsExisting.AcceptChanges()
'duh.
'If dsNew.HasChanges Then
'    dsNew.AcceptChanges()
'End If
If dsExisting.HasErrors Then
    Dim bolError As Boolean
    Dim tempDataTable As DataTable
    bolError = True
    strResult += "<br>"
    For Each tempDataTable In dsExisting.Tables
        If (tempDataTable.HasErrors) Then
            strResult += PrintRowErrs(tempDataTable)
        End If
    Next
End If
dsNew.Dispose()
dsExisting.Dispose()
connMerge.Close()
'edebugging will only track strresult
Dim fsError As New FileStream(Server.MapPath("./incoming/error.txt"), FileMode.Create, FileAccess.Write)
Dim swError As New StreamWriter(fsError)
swError.WriteLine("--==ERROR LOG==--")
swError.WriteLine(Now.Date.ToShortDateString)
swError.WriteLine("-----------------")
swError.WriteLine(strResult)
swError.Close()
fsError.Close()
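For comparison while debugging, this is the bare pattern I understand each block above is supposed to follow, with the return value of Update checked (a C# sketch with simplified names, not my actual code; with ContinueUpdateOnError = True a failed row doesn't throw, it just gets a RowError):

// dsNew has already been loaded with ReadXml; dsExisting is a fresh dataset of the same schema.
daBulletin.Fill(dsExisting, "tblBulletin");                 // load the current rows from the database
dsExisting.Merge(dsNew.Tables["tblBulletin"]);              // bring the uploaded rows in, matched on the primary key
int applied = daBulletin.Update(dsExisting, "tblBulletin"); // number of rows the adapter actually applied
strResult += applied + " Bulletins updated<br>";

// Because ContinueUpdateOnError is true, a failed row is skipped and its error is stored on the row.
foreach (DataRow row in dsExisting.Tables["tblBulletin"].Rows)
{
    if (row.HasErrors)
        strResult += row.RowError + "<br>";
}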
I'm trying to insert records into a "holding" table and write the identity column value (Entry_Key) back to the original table. My setup is that I have two tables: tblEWPBulk and tbleFormsUploadEWP. Users will enter records into tblEWPBulk and use BatchID to group records; once batch entry has been completed (usually less than 30 records) the user will click an UploadAll button and insert the records (not all fields) into tbleFormsUploadEWP. One record in tblEWPBulk can be sent multiple times to the holding table, but tblEWPBulk will need to have the latest Entry_Key captured. Records are sent from the holding table to DB2 z/VSE using a SQL stored procedure, and based on certain logic records are marked as uploaded or an error is captured... that part works fine.
So for example I want to send BatchID, AccountNumber, Period, ReceiveDate, AccountType, ReturnType, NetProfitOrLoss, TaxCredit FROM tblEWPBulk to the holding table and write the Entry_Key (identity column) back to the record in tblEWPBulk (a field called UploadEntryKey). As I said, one record could be sent to the holding table multiple times until it is uploaded or deleted, and UploadEntryKey always needs to be updated so that when results are processed the response from DB2 can be inserted into the table and presented to the user.
No foreign key relationship exists, since records in the holding table get sent to the archive table, the holding table is truncated, and the Entry_Key starting value is reset back to 2000... just some DB2 restrictions.
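To make the write-back part concrete, here is the kind of per-row step I have in mind for the UploadAll button (a C# sketch only: the column list is abbreviated, the connection string is supplied by the caller, and EWPBulkID is a hypothetical key column on tblEWPBulk):

using System;
using System.Data.SqlClient;

class UploadSketch
{
    static void SendRowToHolding(string connectionString, int ewpBulkId, int batchId,
                                 string accountNumber, string period, DateTime receiveDate)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Insert one record into the holding table and capture the identity value it was given.
            SqlCommand insert = new SqlCommand(
                "INSERT INTO tbleFormsUploadEWP (BatchID, AccountNumber, Period, ReceiveDate) " +
                "VALUES (@BatchID, @AccountNumber, @Period, @ReceiveDate); " +
                "SELECT CAST(SCOPE_IDENTITY() AS int);", conn);
            insert.Parameters.AddWithValue("@BatchID", batchId);
            insert.Parameters.AddWithValue("@AccountNumber", accountNumber);
            insert.Parameters.AddWithValue("@Period", period);
            insert.Parameters.AddWithValue("@ReceiveDate", receiveDate);
            int entryKey = (int)insert.ExecuteScalar();          // Entry_Key of the row just inserted

            // Write the latest Entry_Key back to the originating record, overwriting any earlier value.
            SqlCommand writeBack = new SqlCommand(
                "UPDATE tblEWPBulk SET UploadEntryKey = @EntryKey WHERE EWPBulkID = @EWPBulkID", conn);
            writeBack.Parameters.AddWithValue("@EntryKey", entryKey);
            writeBack.Parameters.AddWithValue("@EWPBulkID", ewpBulkId);   // hypothetical key column
            writeBack.ExecuteNonQuery();
        }
    }
}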
I have a cube with a partition configured for writeback.
Users in Excel need to see the totals of the line. Unfortunately, they sometimes have the bad idea of writing in this total cell instead of in the leaf cells.
As there is some MDX code behind the weight expression field, we get some weird values: one is negative and others are 10 times the initial value in the total. So it's very dangerous.
How can we block writing to these total cells?
Hi, I have an application where I'm filling a dataset with values from a table. This table has no primary key. I iterate through each row of the dataset, compute the value of one of the columns, and then update that value in the dataset row. The problem I'm having is that when the database gets updated by the SqlDataAdapter.Update() method, the same value shows up under that column for all rows. I think my update command is not correct, since I'm not specifying a where clause and hence it is using just the value last computed in the dataset to update the entire table. But I do not know how to specify a where clause for an update statement when I'm actually updating every row in the dataset. Basically I do not have an update parameter, since all rows are meant to be updated. Any suggestions?

SqlCommand snUpdate = conn.CreateCommand();
snUpdate.CommandType = CommandType.Text;
snUpdate.CommandText = "Update TestTable set shipdate = @shipdate";
snUpdate.Parameters.Add("@shipdate", SqlDbType.Char, 10, "shipdate");
string jdate = "";
for (int i = 0; i < ds.Tables[0].Rows.Count - 1; i++)
{
    jdate = ds.Tables[0].Rows[i]["shipdate"].ToString();
    ds.Tables[0].Rows[i]["shipdate"] = convertToNormalDate(jdate);
}
da.Update(ds, "Table1");
conn.Close();
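For what it's worth, this is the shape of update command I think the adapter actually needs: one that pins each UPDATE to a single row. Since my table has no primary key, I'm showing a hypothetical RowId column here; some unique column (or combination of original values) would have to play that role:

// Sketch: an UpdateCommand that updates only the row it came from.
SqlCommand snUpdate = conn.CreateCommand();
snUpdate.CommandType = CommandType.Text;
snUpdate.CommandText = "UPDATE TestTable SET shipdate = @shipdate WHERE RowId = @RowId";
snUpdate.Parameters.Add("@shipdate", SqlDbType.Char, 10, "shipdate");
// Bind the key parameter to the value the row had when it was loaded, not the edited value.
SqlParameter key = snUpdate.Parameters.Add("@RowId", SqlDbType.Int, 4, "RowId");
key.SourceVersion = DataRowVersion.Original;
da.UpdateCommand = snUpdate;
da.Update(ds, "Table1");   // now each changed dataset row updates just its own database row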
Hello, I'm trying to create a simple backup in the SQL Maintenance Plan that will make a single backup copy of all databases every night at 10 pm. I'd like the previous night's file to be overwritten, so there will be only a single backup file for each database (a tape backup runs every night, so each day's backup will be saved on tape). Every night the maintenance plan makes a backup of all the databases to a new file with a datetime stamp, meaning the previous night's file still exists. Even when I check "Remove files older than 22 hours", the previous night's file still exists. Is there any way to create a backup file without the datetime stamp so it overwrites the previous night's file? Thanks! Rick
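For what it's worth, I gather the same result can be had outside the maintenance plan by backing up to a fixed file name with the INIT option, which overwrites the existing backup set in that file. Here is a small C# sketch of that statement (server name, database name, and path are made up; normally the statement would just run as T-SQL in a job step):

using System.Data.SqlClient;

class OverwriteBackupSketch
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection("Server=myServer;Database=master;Integrated Security=SSPI;"))
        {
            conn.Open();
            // WITH INIT overwrites whatever is already in MyDb.bak, so only one file is kept per database.
            SqlCommand cmd = new SqlCommand(
                "BACKUP DATABASE MyDb TO DISK = 'D:\\Backups\\MyDb.bak' WITH INIT", conn);
            cmd.CommandTimeout = 0;   // backups can run longer than the default 30-second timeout
            cmd.ExecuteNonQuery();
        }
    }
}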
New to Database Mirroring and I have a question about the Principal database server. I have a Database Mirroring setup configured for High-safety with automatic fail over mode using a witness.
When a failover occurs because of a loss of communication between the principal and the mirror, the mirror server takes on the role of principal. When communication with the original principal server is restored, does the database that was previously the principal automatically go back to being the principal at some point?
I need to run two reports, each of A5 size, back to back and print them on a single A4 sheet: the Sale Bill will be printed in the first half and the Gate Pass in the second half. Both reports will be on the same page, and their size and shape should be maintained. How do I do it?
Hello, I am hoping you can help me with the following problem. I need to process the following steps every couple of hours in order to keep our SQL 2000 database as small as possible (the transaction log is 5x bigger than the db):
1. back up the entire database
2. truncate the log
3. shrink the log
4. back up once again
As you may have determined, I am relatively new to managing a SQL Server database, and while I have found multiple articles online about the topics I need to accomplish, I cannot find any actual examples that explain where I input the code used to accomplish the above-mentioned steps. I do understand the theory behind the steps, I just do not know how to accomplish them! If you know of a well-documented tutorial, please point me in the right direction. Regards.
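In case it helps to show what I mean, these are the four steps written out as plain T-SQL statements, executed here from a small C# sketch (database name, log file logical name, and paths are made up; I understand the statements themselves are normally just pasted into Query Analyzer or a SQL Server Agent job step):

using System.Data.SqlClient;

class LogMaintenanceSketch
{
    static void Main()
    {
        string[] steps =
        {
            // 1. Full backup of the database.
            "BACKUP DATABASE MyDb TO DISK = 'D:\\Backups\\MyDb.bak' WITH INIT",
            // 2. Truncate the transaction log (SQL Server 2000 syntax).
            "BACKUP LOG MyDb WITH TRUNCATE_ONLY",
            // 3. Shrink the log file to about 100 MB ('MyDb_Log' is the log file's logical name).
            "DBCC SHRINKFILE (MyDb_Log, 100)",
            // 4. Back up once again now that the log has been truncated and shrunk.
            "BACKUP DATABASE MyDb TO DISK = 'D:\\Backups\\MyDb.bak' WITH INIT"
        };

        // DBCC SHRINKFILE has to run in the context of the database being shrunk, hence Database=MyDb.
        using (SqlConnection conn = new SqlConnection("Server=myServer;Database=MyDb;Integrated Security=SSPI;"))
        {
            conn.Open();
            foreach (string sql in steps)
            {
                SqlCommand cmd = new SqlCommand(sql, conn);
                cmd.CommandTimeout = 0;   // backups can take longer than the default 30 seconds
                cmd.ExecuteNonQuery();
            }
        }
    }
}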
I have a small number of rows in a dataset, Table 1. There is a CLOB on a large dataset, Table 2. They join on a PK. I would like to retrieve this CLOB and add it to the data flow for Table1. In short I want to emulate the following:
Table 1: Small table without CLOB, 10 rows. Table 2: Large table with CLOB, 10,000,000 rows
select CLOB from table2 where pk in (select pk from table1)
I want this to return the CLOBs for the small number of rows in Table 1. The PK is indexed obviously so it should be a fast look up.
Table 1 and Table 2 live on different Oracle databases. How do I perform this operation efficiently in SSIS? It seems the Lookup and Merge Join won't do this.
I have a report with multiple datasets, the first of which pulls in data based on user entered parameters (sales date range and property use codes). Dataset1 pulls property id's and other sales data from a table (2014_COST) based on the user's parameters.
I have set up another table (AUDITS) that I would like to use in dataset6. This table has 3 columns (Property ID's, Sales Price and Sales Date). I would like for dataset6 to pull the Property ID's that are NOT contained in the results from dataset1. In other words, I'd like the results of dataset6 to show me the property id's that are contained in the AUDITS table but which are not being pulled into dataset1. Both tables are in the same database.
I found out the data I need for my SQL Report is already defined in a dynamic dataset on another web service. Is there a way to use web services to call another web service to get the dataset I need to generate a report? Examples would help if you have any, thanks for looking
Hi, I have a stored procedure, attached below. It returns 2 rows in SQL Management Studio when I execute MyStorProc 0,28. But in my program, which uses ADOHelper, it returns a dataset with Tables.Count = 0. If I comment out the line If @Status = 0, then it returns the rows. Obviously it does not go into the If @Status = 0 branch even if I pass @Status = 0. What am I doing wrong? Any help is appreciated.
ALTER PROCEDURE [dbo].[MyStorProc]
(
@Status smallint,
@RowCount int = NULL,
@FacilityId numeric(10,0) = NULL,
@QueueID numeric (10,0)= NULL,
@VendorId numeric(10, 0) = NULL
)
AS
SET NOCOUNT ON
SET CONCAT_NULL_YIELDS_NULL OFF
If @Status = 0
BEGIN
SELECT ......
END
If @Status = 1
BEGIN
SELECT......
END
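For reference, this is the kind of plain ADO.NET call I would expect to behave the same as Management Studio (just a sketch that bypasses ADOHelper, our wrapper; the connection string is supplied by the caller):

using System.Data;
using System.Data.SqlClient;

class MyStorProcSketch
{
    static DataSet RunMyStorProc(string connectionString)
    {
        DataSet ds = new DataSet();
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            SqlCommand cmd = new SqlCommand("dbo.MyStorProc", conn);
            cmd.CommandType = CommandType.StoredProcedure;          // parameters are passed by name
            cmd.Parameters.Add("@Status", SqlDbType.SmallInt).Value = 0;
            cmd.Parameters.Add("@RowCount", SqlDbType.Int).Value = 28;
            SqlDataAdapter da = new SqlDataAdapter(cmd);
            da.Fill(ds);   // each SELECT the procedure returns becomes a table in ds.Tables
        }
        return ds;
    }
}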
Does anybody know of a way to roll back SQL Server 2005 databases to SQL Server 2000? Is there a way of doing it without resorting to the Copy Database Wizard? I'd love to find a way of attaching a SS 2005 database to a SS 2000 instance without any issues.
I recently upgraded to SS 2005 and I am very unhappy with it and want to roll back to SS 2000, which was a lot more stable. I am having several major issues that are affecting my whole company's day-to-day operations, and the managers are not happy. Some of the issues include the night time batch running very sluggishly for no apparent reason. This is the biggest problem because it only occurs once or so a week and causes a disturbance with the daily activities when the night time processing isn't completed on time. The rest of the time, the batch processing runs great, even a little better than on SS 2000. I don't believe it is a matter of my application needing to be retuned, because if that were the case, why isn't it running sluggishly every night? Also, it's never the same day that the sluggish behavior occurs. If it were occurring on the same night, then I would have something to investigate within our application, but it isn't.

Another issue I am having involves a night time job that restores a copy of the production database to the Data Warehouse server to be used for updating the data warehouse. Again, most of the time it runs great (~2 1/2 hours), but once or twice a week it goes stupid and takes 6 1/2 hours for no apparent reason. Again, it does not happen on the same day either, which could give me something to investigate. On SS 2000, this same job ran flawlessly. Never did I run into a situation where the database restoration took that long to run.

Yet another issue involves a SQL Server Agent job that was put into a suspended state. What's a suspended state and how can I get it out of suspended state? I can find no information about suspended state in BOL. I did a Google search and nothing came up. If this suspended state was put in for security reasons, great, but then tell me how I can remove the suspended state.

I am also not happy with the fact that I can't get accurate information about the queries that are actively running at a particular moment. In SS 2000, when I noticed high CPU usage on the server, I would run the sp_who2 active stored proc and it would show me all the active threads and how much CPU each was consuming. I would then find the running threads with the highest CPU numbers, investigate the query, and see if we could improve it. Now in SS 2005, I get into the same situation, run the sp_who2 stored proc, and there is no smoking gun. All of the active threads are showing very little CPU usage, which I am very suspicious of. What the heck happened to sp_who2? I looked at some of the other ways of looking at running processes (i.e., sys.sysprocesses) and they don't appear to give the information that I need.
I am very unhappy and I just want to roll back to SS 2000 and wait a couple of years before I upgrade to SS 2005.
I have two datasets. One dataset has old data from some other database; the second dataset has the original data from a SQL Server 2005 database. Both databases have the same field, with id as a primary key. I want to transfer all the data from the first dataset to the new dataset, retaining the previous data, but if the old dataset has the same id (primary key) as the new one, then that row will not be transferred. However, if a row with the same id (primary key) has changed values, then the fields should be updated with that data. How can I do that?
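To show what I mean, here is a rough sketch of the row-by-row transfer I'm after (C#; the table name "MyTable", the dataset names dsOld/dsNew, and the single "id" key column are just assumptions for illustration):

// dsOld holds the old data, dsNew holds the SQL Server 2005 data; both are assumed already filled,
// and newTable must have its PrimaryKey set to the id column for Rows.Find to work.
DataTable oldTable = dsOld.Tables["MyTable"];
DataTable newTable = dsNew.Tables["MyTable"];

foreach (DataRow oldRow in oldTable.Rows)
{
    DataRow existing = newTable.Rows.Find(oldRow["id"]);
    if (existing == null)
    {
        newTable.ImportRow(oldRow);        // id not present in the new dataset: transfer the whole row
        continue;
    }
    foreach (DataColumn col in oldTable.Columns)
    {
        // id already present: only overwrite fields whose values have changed
        if (!object.Equals(existing[col.ColumnName], oldRow[col]))
            existing[col.ColumnName] = oldRow[col];
    }
}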
D2 is a list of data. Each row in D2 has a classid. D2 may or may not have all the classids in D1, but all classids in D2 must be in D1.
I want to show fields from D2, group the data by the classids in D1, and show every group as a separate table. If no data in D2 is available for a classid, it should show an empty table.
=CountDistinct(IIF(Fields!Released_DT.Value = Fields!Date2.Value, Fields!Name.Value, Nothing))
Released_DT = a date - 09/03/2015 or 09/02/2015
Date2 = returns another date value, in this case 09/03/2015
What I'm trying to do is count the distinct number of people (Fields!Name.Value) where Released_DT = Date2. My IIF statement is returning a zero value.
Hi everybody... I have a problem. I have a web service which contains a method getValue(IDEq (int), idIndicator (int), startTime (dateTime), endTime (dateTime)). I need to call this method, but my problem is how to pass the parameters. I see the Param tab, but it doesn't work the way I expect... maybe I am making a mistake...
I want startTime and endTime to be selected by the user, via a calendar for example... while idIndicator and idEq come from another dataset based on an XML data source...
But I don't know how to integrate this dynamically... I tried to enter a parameter via the Param tab and create an expression: =First(Fields!idEq.Value, "EquipmentDataSet"), but when I execute the query, the prompt displays <NULL>... So I don't know how to do it, or whether it is possible! I hope someone can help me! Thank you!
Has anyone tried to convert a 7.0 database back to 6.5? Is there a way to move the data back, or does the 6.5 to 7.0 upgrade change field identifiers or anything else prohibiting the move?
Please help. We need to back up a database to a backup device on a remote SQL Server, because the first server doesn't have space for the backup file. Give me a hint what to do.
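In case it helps to make the question concrete, this is the kind of thing I have in mind: backing up to a UNC path on the remote machine (a sketch only; server, share, database, and file names are made up, and I assume the SQL Server service account would need write access to that share):

using System.Data.SqlClient;

class RemoteBackupSketch
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection("Server=FirstServer;Database=master;Integrated Security=SSPI;"))
        {
            conn.Open();
            // Back up to a share on the remote server instead of a local drive.
            SqlCommand cmd = new SqlCommand(
                @"BACKUP DATABASE MyDb TO DISK = '\\RemoteServer\Backups\MyDb.bak' WITH INIT", conn);
            cmd.CommandTimeout = 0;   // backups can exceed the default 30-second timeout
            cmd.ExecuteNonQuery();
        }
    }
}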