I have a chart I am producing in .NET and I need two values. The problem is the values (sums) I need are on two different servers. Is there any way to combine the query across the two databases? Does anyone have any suggestions?
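To make it concrete, the kind of combined query I'm imagining would go through a linked server, something like this (the server, database, table and column names are all placeholders; I don't know if a linked server is even the right approach here):

-- assumes a linked server named REMOTESERVER has already been set up on the local server
SELECT (SELECT SUM(Amount) FROM LocalDb.dbo.Sales) AS LocalSum,
       (SELECT SUM(Amount) FROM REMOTESERVER.RemoteDb.dbo.Sales) AS RemoteSum;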
Hi, I am very new to the .NET framework, but I have a lot of experience with PHP/MySQL applications, so this is probably going to seem like a newbie question. I am trying to pull a single item out of the database programmatically, take the value of that item, and set it to a variable so I can work with it in other areas.

Here is basically what I have. This is on the VB page, in a sub; the variable MonthlyPrice is defined earlier in the page (this assumes Imports System.Data.SqlClient and System.Configuration at the top of the file):

Dim connString As String = _
    ConfigurationManager.ConnectionStrings("ConnectionString").ConnectionString

'Create a SqlConnection instance
Using myConnection As New SqlConnection(connString)
    'Specify the SQL query (parameterized instead of concatenating PlanId into the string)
    Const sql As String = "SELECT Price FROM Plans WHERE PlanID = @PlanID"

    'Create a SqlCommand instance
    Dim myCommand As New SqlCommand(sql, myConnection)
    myCommand.Parameters.AddWithValue("@PlanID", PlanId)

    'Open the connection and read the single value back
    myConnection.Open()
    MonthlyPrice = Convert.ToDecimal(myCommand.ExecuteScalar())
End Using

I just want to get the price of the plan where PlanID is equal to the value of my variable PlanId. I want to set that equal to MonthlyPrice, which is a decimal, so I can add it into some calculations and return a value. I don't really want to bind the data to anything, but every time I search Google I can't find anything except for binding data to a GridView and things like that. I have tried several different approaches that I have found out there, but have gotten errors and things I am not understanding.

I originally was trying to access a data source I already have defined on the aspx page:

<asp:SqlDataSource ID="SQLSelectPlan" runat="server"
    ConnectionString="<%$ ConnectionStrings:ConnectionString %>"
    SelectCommand="SELECT * FROM [Plans] WHERE ([PlanID] = @PlanID2)">
    <SelectParameters>
        <asp:FormParameter FormField="grSelectPlan" Name="PlanID2" Type="Int32" />
    </SelectParameters>
</asp:SqlDataSource>

Is there a way for me to grab this data source, which works and is tied to a DetailsView, take the value in the Price column, and stick it in the MonthlyPrice variable on my VB page? That is the approach that seems the most logical to me. If someone has a better way I would really like to know.
Here is the setup.
Text Box: user enters a customer transaction number.
Button: user clicks the button to display information about that customer.
Now, the database has a lot of unique customer numbers. What I am trying to do is take what the user enters and use it to search the database and pull out that customer's information. I am having a hard time getting that information from the textbox. Any suggestions? Here is what I have so far.
Private Sub btnViewFlow_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles btnViewFlow.Click
Try
Me.SqlConnection1 = New System.Data.SqlClient.SqlConnection
Me.SqlDataAdapter1 = New System.Data.SqlClient.SqlDataAdapter
Me.SqlSelectCommand1 = New System.Data.SqlClient.SqlCommand
Me.SqlDataAdapter1.TableMappings.AddRange(New System.Data.Common.DataTableMapping() { _
    New System.Data.Common.DataTableMapping("Table", "DTR_Document_Summary", New System.Data.Common.DataColumnMapping() { _
        New System.Data.Common.DataColumnMapping("DocumentId", "DocumentId"), _
        New System.Data.Common.DataColumnMapping("PartnerId", "PartnerId"), _
        New System.Data.Common.DataColumnMapping("PartnerName", "PartnerName"), _
        New System.Data.Common.DataColumnMapping("Direction", "Direction"), _
        New System.Data.Common.DataColumnMapping("TranSet", "TranSet")})})
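The select I still need to wire up against the textbox value would be something like this (the table and output columns come from the mapping above; the customer-number column name is a placeholder, since I don't know the real one):

SELECT DocumentId, PartnerId, PartnerName, Direction, TranSet
FROM DTR_Document_Summary
WHERE CustomerTransactionNumber = @TransactionNumber;  -- parameter filled from the text box

The idea would be to set this as SqlSelectCommand1.CommandText and add @TransactionNumber as a parameter whose value comes from the textbox's Text property.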
I want to include the name of the user, whoever it was that changed the record. I want to insert the name of that user into a column. How do I grab the name of the user from the action...?
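To show what I mean, this is roughly the shape of trigger I'm picturing (table and column names are made up, and SUSER_SNAME() returns the login of the connection doing the update, which may or may not be the "user" I actually need):

-- hypothetical audit trigger: dbo.MyTable, Id and ModifiedBy are placeholder names
CREATE TRIGGER trg_MyTable_AuditUser
ON dbo.MyTable
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE t
    SET    t.ModifiedBy = SUSER_SNAME()   -- login that performed the change
    FROM   dbo.MyTable AS t
    INNER JOIN inserted AS i ON i.Id = t.Id;
END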
My DB saves its data into a table at the end of each day, like:
'e4_event_20140530' where the last bit changes according to the date. So 30th May 2014 in this case.
What I am trying to do is query the last 24 hours. I know I can grab from two tables and do a BETWEEN with times, but that means having to change the table names and times in the query every time I run it. I'd just like to run it and have it fetch the last 24 hours at any point in time.
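What I have in mind (assuming SQL Server here) is building the two table names from the current date and UNION-ing them, roughly like this; the e4_event_ prefix is from my tables, but event_time is a placeholder for whatever the timestamp column is really called:

DECLARE @sql nvarchar(max);

-- yesterday's and today's tables, named e4_event_YYYYMMDD
SET @sql =
      N'SELECT * FROM (SELECT * FROM '
    + QUOTENAME('e4_event_' + CONVERT(char(8), DATEADD(day, -1, GETDATE()), 112))
    + N' UNION ALL SELECT * FROM '
    + QUOTENAME('e4_event_' + CONVERT(char(8), GETDATE(), 112))
    + N') AS x WHERE x.event_time >= DATEADD(hour, -24, GETDATE());';

EXEC sp_executesql @sql;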
The notes table contains note_id and notetext, so example rows would be:

note_id  notetext
1        this is the note for the first row
2        this is the note for computer parts
etc..
The po table has item info, description, po_id, and note_id. Here's an example row:

po_id  item    desc            note_id
1      215-33  computer parts  2
The problem is: is there a way where, say, I insert the note row into the notes table first and then get the note_id of that row, so that I can then insert the row into the po table including the note_id from the notes table?

Would I have to use two separate inserts, or can I use one? I was going to just insert the notes row first and then run a query using LIKE() on the note text to get the note_id, but some notes may be the same as others, so the note_id wouldn't necessarily be the correct one :( and I think just getting the last row of the notes table would be bad too, in case someone happens to insert a row into that table at the same time.
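To make it concrete, this is the kind of two-insert sequence I'm hoping exists (assuming SQL Server and that note_id is an identity column; on MySQL I believe LAST_INSERT_ID() plays the same role as SCOPE_IDENTITY()):

-- insert the note first, capture the id it generated, then use it for the po row
DECLARE @new_note_id int;

INSERT INTO notes (notetext)
VALUES ('this is the note for computer parts');

SET @new_note_id = SCOPE_IDENTITY();   -- id generated in this scope, so concurrent inserts don't interfere

INSERT INTO po (item, [desc], note_id)
VALUES ('215-33', 'computer parts', @new_note_id);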
Is there a handy-dandy way for me to be able, from a stored procedure, to capture the notification email address for a user associated with a job?
I need to send an email out from a stored procedure, and I want it to be set up to send to the same user that is set up in the job that calls the stored procedure.
The same notification stored proc will be used in multiple jobs, but the jobs are already set up with a notification email to be sent to the defined users "on completion" (error or not). I want a separate email to be sent, but I want to use xp_sendmail because I have to perform a select whose output I want to appear in the body of the email (and unless I am missing something, I have no control over the body of the standard "job completion" email).
I would prefer NOT to hard-code the user's email address in the stored proc that calls xp_sendmail, or to put it in some user table in the database, when the user I want to send it to will always be the same one defined at the JOB level in Enterprise Manager.
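Something along these lines is the lookup I have in mind; the msdb tables are what I believe hold this information, and passing the job name in is just for illustration, since I'm not sure how the proc would know which job called it:

-- find the 'notify on completion' e-mail operator configured for a given job
SELECT o.email_address
FROM msdb.dbo.sysjobs AS j
INNER JOIN msdb.dbo.sysoperators AS o
        ON o.id = j.notify_email_operator_id
WHERE j.name = @job_name;   -- @job_name would somehow have to be supplied to the proc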
Any thoughts? Thanks in advance for your instantaneous and informative answer-packed responses! ;)
I have MSDE running on my system and I want to get a list of databases for that server programmatically. I'm using VB.NET and I want to view, add, delete and modify databases on a server from within a class. I'm aware I can use the "Server Explorer" feature in Visual Studio, but that isn't what I'm looking for. Any references or suggestions on where to start are appreciated. TIA, Ralf
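For the listing part, the query I'd expect to execute from that class is just this (MSDE is essentially SQL Server 2000, so master..sysdatabases should be available; I'd treat this as a starting point rather than the whole answer):

-- list the databases on the server
SELECT name
FROM master.dbo.sysdatabases
ORDER BY name;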
The Status Code column in the source table will have a single three-character alphanumeric code, and sometimes a longer string. I need code that will automatically grab the three-character code before the dash.
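A minimal sketch of what I mean, assuming SQL Server and a column literally named StatusCode in dbo.SourceTable (the real names may differ):

-- take everything before the dash when there is one, otherwise the value as-is
SELECT CASE
           WHEN CHARINDEX('-', StatusCode) > 0
               THEN LEFT(StatusCode, CHARINDEX('-', StatusCode) - 1)
           ELSE StatusCode
       END AS StatusCode3
FROM dbo.SourceTable;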
I have run into an issue that seems very simple, but I am new to SSIS and DBs in general and therefore cannot solve it. I would like to take data from a SQL Server table, look up a key from a dimension table, and update the same SQL Server table with this data. Is there any way to do an update using SSIS?
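If it helps clarify what I'm after, the set-based equivalent I could run outside SSIS would be something like this (table and column names are placeholders):

-- look up the surrogate key from the dimension and write it back to the source table
UPDATE s
SET    s.DimKey = d.DimKey
FROM   dbo.SourceTable  AS s
INNER JOIN dbo.DimTable AS d
        ON d.BusinessKey = s.BusinessKey;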
I have a sub that passes values from my form to my stored procedure. The stored procedure passes back an @@IDENTITY, but I'm not sure how to grab that in my ASP page and then pass it to my next called procedure from my aspx page. Here's where I'm stuck:

Public Sub InsertOrder()
    Conn.Open()
    cmd = New SqlCommand("Add_NewOrder", Conn)
    cmd.CommandType = CommandType.StoredProcedure

    ' pass customer info to stored proc
    cmd.Parameters.Add("@FirstName", txtFName.Text)
    cmd.Parameters.Add("@LastName", txtLName.Text)
    cmd.Parameters.Add("@AddressLine1", txtStreet.Text)
    cmd.Parameters.Add("@CityID", dropdown_city.SelectedValue)
    cmd.Parameters.Add("@Zip", intZip.Text)
    cmd.Parameters.Add("@EmailPrefix", txtEmailPre.Text)
    cmd.Parameters.Add("@EmailSuffix", txtEmailSuf.Text)
    cmd.Parameters.Add("@PhoneAreaCode", txtPhoneArea.Text)
    cmd.Parameters.Add("@PhonePrefix", txtPhonePre.Text)
    cmd.Parameters.Add("@PhoneSuffix", txtPhoneSuf.Text)

    ' pass order info to stored proc
    cmd.Parameters.Add("@NumberOfPeopleID", dropdown_people.SelectedValue)
    cmd.Parameters.Add("@BeanOptionID", dropdown_beans.SelectedValue)
    cmd.Parameters.Add("@TortillaOptionID", dropdown_tortilla.SelectedValue)
    'Session.Add("FirstName", txtFName.Text)
    cmd.ExecuteNonQuery()

    cmd = New SqlCommand("Add_EntreeItems", Conn)
    cmd.CommandType = CommandType.StoredProcedure
    cmd.Parameters.Add("@CateringOrderID", get identity from previous stored proc) <-------------------------

    Dim li As ListItem
    Dim p As SqlParameter = cmd.Parameters.Add("@EntreeID", Data.SqlDbType.VarChar)
    For Each li In chbxl_entrees.Items
        If li.Selected Then
            p.Value = li.Value
            cmd.ExecuteNonQuery()
        End If
    Next
    Conn.Close()
End Sub

I want to somehow grab the @CateringOrderID that was created as an end product of my first called stored procedure (Add_NewOrder) and pass that to my second stored procedure (Add_EntreeItems).
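On the database side, what I think I need is for Add_NewOrder to hand the new id back through an OUTPUT parameter rather than just selecting @@IDENTITY; only the tail end of the proc is sketched here, and SCOPE_IDENTITY() is my substitution, since I've read it is safer than @@IDENTITY when triggers are involved:

ALTER PROCEDURE Add_NewOrder
    -- ...existing input parameters...
    @CateringOrderID int OUTPUT
AS
BEGIN
    -- ...existing INSERT that creates the order...
    SET @CateringOrderID = SCOPE_IDENTITY();   -- identity generated by the insert above
END

Then in the VB code the @CateringOrderID parameter would be added with Direction = ParameterDirection.Output, read back after the first ExecuteNonQuery(), and passed as a normal input parameter to Add_EntreeItems.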
I am wondering if it is possible to use SSIS to sample a data set into a training set and a test set and feed them directly to my data mining models, without saving them somewhere, as they occupy too much space. I really need guidance on that.
I have used both data readers and data adapters (with DataSets) in the projects that I have worked on. I am trying to get some clarification on when I should be using which one. I think I am doing this correctly, but I want to be sure I am developing good habits.

As the name might suggest, it seems like a DataReader is only for reading data. I have read that the data adapter and DataSet are for a disconnected architecture, or at least that they can be used for that type of setup. I have been using the data adapter and DataSets when writing to a database and the DataReader when reading from a database.

Is this how these should be used? Is the DataReader the best choice for reading data? Am I doing this the optimal way from a performance standpoint?
Thanks in advance.
We have already integrated different clients' data into MDS with the MS Excel plugin; now we want to push updated or newly added records back to the source database. Is it possible to do this using MDS? Is there any background sync process that automatically syncs data to and from the subscriber and MDS?
When I enter over 4000 characters in any ntext field in my SQL Server 2005 database (directly in the database and through the application) I get an error saying that the data could not be updated because string or binary data would be truncated. Has anyone ever seen this? I cannot figure out what is causing it; ntext should be able to hold a lot more data than this...
I have a requirement to implement CDC for 50+ tables, to feed incremental data changes to a warehouse/reporting database rather than exporting the whole table data. The largest table has more than half a billion records.

The warehouse uses a daily copy of the OLTP db (daily DB refresh). How can I accomplish this? Is there a downside to implementing CDC just for the sake of taking incremental changes on the tables?

Is there any performance impact if we enable CDC on the OLTP db?

Can we make use of the CDC tables in the environment where we do the daily db refresh, so that the queries don't hit the OLTP database?

What is the best way to implement CDC to take incremental changes for reporting?
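For reference, the enabling steps as I understand them are just these two calls per database and table (MyOltpDb and dbo.MyLargeTable are placeholder names standing in for one of the 50+ tables):

-- enable CDC at the database level, then per table
USE MyOltpDb;
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'MyLargeTable',
     @role_name     = NULL;   -- no gating role in this sketch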
Hi, this is driving me nuts. I have a table that stores notes regarding an operation in an IMAGE data type field in MS SQL Server 2000. I can read and write no problem from Access using the StrConv function, and I can update the field correctly in T-SQL using:

DECLARE @ptrval varbinary(16)
SELECT @ptrval = TEXTPTR(BITS_data)
FROM mytable_BINARY WHERE ID = 'RB215'
WRITETEXT OPERATION_BINARY.BITS @ptrval 'My notes for this operation'

However, I just cannot seem to convert the information back to text once it is stored, using T-SQL. My selects keep returning binary data. How do I do this? Thanks for your help. SD
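The kind of select I've been trying is a double cast through varbinary, along these lines (same table and column as above; 8000 is just an arbitrary cap for the sketch, and if Access wrote the text as Unicode the outer cast would presumably need to be nvarchar(4000) instead of varchar(8000)):

-- cast the IMAGE column through varbinary to get character data back
SELECT CONVERT(varchar(8000), CONVERT(varbinary(8000), BITS_data)) AS note_text
FROM mytable_BINARY
WHERE ID = 'RB215';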
I'm using a Script Component to load data into an Oracle DB because of a poor performance issue. Now I have found that it misses some data during the transmission. Please see the screenshot below:
[DTS.Pipeline] Error: "component "Excel Source" (1)" failed validation and returned validation status "VS_NEEDSNEWMETADATA".
and also this:
[Excel Source [1]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "Fiscal Week" needs to be updated in the external metadata column collection. The column "Fiscal Year" needs to be updated in the external metadata column collection. The column "1st level" needs to be added to the external metadata column collection. The column "2nd level" needs to be added to the external metadata column collection. The column "3rd level" needs to be added to the external metadata column collection. The "external metadata column "1st Level" (16745)" needs to be removed from the external metadata column collection. The "external metadata column "3rd Level" (16609)" needs to be removed from the external metadata column collection. The "external metadata column "2nd Level" (16272)" needs to be removed from the external metadata column collection.
I tried going to the data flow -> excel connection -> advanced editor for the excel source -> input and output properties, and tried to refresh the affected columns. It seems that somehow the three columns are not read in from the source file, and also that Fiscal Year and Fiscal Week are not set up properly in my data destination? Has anyone faced such errors before?
When I execute the stored procedure below, I get the error "Arithmetic overflow error converting expression to data type int".
USE [FileSharing]
GO
/****** Object: StoredProcedure [dbo].[xlaAFSsp_reports] Script Date: 24.07.2015 17:04:10 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
[Code] .....
Msg 8115, Level 16, State 2, Procedure xlaAFSsp_reports, Line 25 Arithmetic overflow error converting expression to data type int. The statement has been terminated. (1 row(s) affected)
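For what it's worth, a minimal repro of that exact message, just to show the class of fix I'm wondering about (the values here are made up; my real logic is in the [Code] block above):

-- fails with 'Arithmetic overflow error converting expression to data type int'
SELECT 2147483647 + 1;

-- works: force the arithmetic into a wider type
SELECT CAST(2147483647 AS bigint) + 1;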
Is there a step-by-step paper on how to get there? Here is what I need to consider: I will have many customers that will each need their own set of records and access pages "branded for their company", and each customer will have many clients. I am hosting this application on a Windows 2003 server with SQL Server 2005 Enterprise.

I am using Windows authentication. I created a username in Windows, then added the Windows user in SQL Server Management Studio under Security, granted "DB read" and "DB write", and did the same again under the database's security tab. Still, authentication from the web fails. I must be missing a step or two?

I expect to set up a username for each database as I set up new customers.
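In T-SQL terms, the grants I believe I've made through the UI amount to roughly this (MYSERVER\WebCustomer1 and CustomerDb1 are placeholder names). Part of what I'm unsure about is whether the web request actually runs under that Windows account or under the application pool identity.

-- server-level login for the Windows account
CREATE LOGIN [MYSERVER\WebCustomer1] FROM WINDOWS;

-- database user mapped to that login, with read/write roles
USE CustomerDb1;
CREATE USER [MYSERVER\WebCustomer1] FOR LOGIN [MYSERVER\WebCustomer1];
EXEC sp_addrolemember N'db_datareader', N'MYSERVER\WebCustomer1';
EXEC sp_addrolemember N'db_datawriter', N'MYSERVER\WebCustomer1';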
RE: XML Data source .. Expression? Variable? Connection? Error: unable to read the XML data.
I want my XML data source to be an expression, as I will be looping through a directory of XML files.

I don't see an Expression property or a Connection property on the XML Source, though?
I tried setting the XMLData property to @[User::filename], but that results in:
Information: 0x40043006 at Load XML Files, DTS.Pipeline: Prepare for Execute phase is beginning.
Error: 0xC02090D0 at Load XML Files, XML Source [108]: The component "XML Source" (108) was unable to read the XML data.
Error: 0xC0047019 at Load XML Files, DTS.Pipeline: component "XML Source" (108) failed the prepare phase and returned error code 0xC02090D0.
Information: 0x4004300B at Load XML Files, DTS.Pipeline: "component "OLE DB Destination" (341)" wrote 0 rows.
Task failed: Load XML Files
Information: 0xC002F30E at Bad, File System Task: File or directory "d:jcpxmlLoadjcp2.xml.bad" was deleted.
Warning: 0x80019002 at Package: The Execution method succeeded, but the number of errors raised (2) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "Package.dtsx" finished: Failure.
The program '[3312] Package.dtsx: DTS' has exited with code 0 (0x0).
I set up this package to import data from a SharePoint list into a SQL Server table. The primary key of my SQL table is mapped to the Title column of my SharePoint list. There is a possibility that duplicate values will be entered in the Title field of the SharePoint list, so when importing data into my table via SSIS, my package always errors out when it comes across duplicate values. How have others managed data integrity when importing from a SharePoint list with the Title column mapped to the primary key of a table?
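The workaround I've been sketching is to land the list in a keyless staging table and de-duplicate on the way into the real table, something like this (table and column names are placeholders):

-- keep one row per Title, then insert only keys that aren't already in the target
INSERT INTO dbo.TargetTable (Title, OtherColumn)
SELECT s.Title, s.OtherColumn
FROM (
    SELECT Title, OtherColumn,
           ROW_NUMBER() OVER (PARTITION BY Title ORDER BY Title) AS rn
    FROM dbo.Staging_SharePointList
) AS s
WHERE s.rn = 1
  AND NOT EXISTS (SELECT 1 FROM dbo.TargetTable AS t WHERE t.Title = s.Title);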
"pRecordSet" is an ADO recordset. The database column "MyColumn" is of type "decimal(19,10)".
The most important question for me is whether the regional settings of the database server or the regional settings of the client PC are considered during the conversion from the string to the decimal value. For example, with standard French regional settings the "." would not be recognized as the decimal separator.

I am also wondering whether the language of the database instance in which this data is saved, or any other settings of that instance, are considered during this conversion.

So my general question is: does anybody know exactly what rules apply during the above-mentioned conversion?
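For comparison, the purely server-side form of the conversion I'm asking about looks like this; as far as I can tell T-SQL itself only accepts the period as the separator here, regardless of language settings, which is part of why I'm unsure where the client locale would come into play:

-- server-side string-to-decimal conversion into the same precision/scale as MyColumn
SELECT CONVERT(decimal(19,10), '1234.5678901234');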
I have a question about how to handle structural data model changes in a data source for PowerPivot. Suppose I'm developing a star model in SQL Server and sometimes a data type changes or a field in a table is renamed. It seems to me that PowerPivot does not handle this as gracefully as Analysis Services MD (mostly) does. I received an error because of a wrong field name, and no error at all when a data type changed in PowerPivot. Is this common, or am I doing something wrong here? Does this mean that every time the data model changes the PowerPivot model should be recreated? Or am I missing the clue here?