Non-cached Lookup Method

May 28, 2008

Phil, great links, really helpful and appreciated.

I just need to verify one thing on the lookup method:
--One of the lookup methods people were discussing is the non-cached lookup -- which seems to have been evaluated as the fastest. Is non-cached the default of the Lookup transformation? And when I want the lookup method to be cached, do I need to go into the Advanced tab and set the cache percentage there, right? Thanks.

View 3 Replies



Keeping A Cached Lookup In Memory

Dec 10, 2007

Is it possible to keep a Cached Lookup in memory when executing multiple Data Flows? Executing DFT's in parallel will cache and use the same LOOKUP statement. But what if I'm executing the DFT sequentially, can I keep the LOOKUP from the first DFT in memory for the second DFT? For example, in my case, I'm caching a lookup against the Customer dimension for invoices. The second DFT then processes credits and again does a lookup against the Customer dimension. I want to use the cached Customer records from the first DFT.

View 1 Replies View Related

Full Cached Lookup With Parameters

Jul 8, 2006

Parameterized queries are only allowed on partial-cache or no-cache lookup transforms, not 'full' ones. Is there some "trick" to parameterizing a full-cache lookup, or should the join simply be done at the source, obviating the need for a full-cache lookup at all (other suggestions certainly welcome)?

More particularly, I'd like to use the lookup transform in a surrogate key pipeline. However, the dimension is large (900 million rows), so it would be useful to restrict the lookup transform's cache by a join to the source.

For example:

Source query is: select a,b,c from t where z=@filter (20,000 rows)

Lookup transform query: select surrogate_key, business_key from dimension (900M rows, not tenable)

Ideal lookup transform query:

select distinct surrogate_key, business_key
from dimension d
inner join t on d.business_key = t.c
where t.z = @filter
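
One workaround, sketched below under the assumption that the filter value can be resolved before the data flow starts (for example via an SSIS expression that builds the lookup's reference query from a package variable), is to bake the join/filter into the full-cache lookup's reference query itself, so no runtime parameter is needed and only the qualifying dimension rows get cached. The literal value is purely illustrative and stands in for @filter:

-- Hypothetical full-cache reference query: same join as the "ideal" query above,
-- but with the filter value injected as a literal before execution, so the
-- full-cache lookup accepts it without parameters.
select distinct d.surrogate_key, d.business_key
from dimension d
inner join t on d.business_key = t.c
where t.z = 20   -- placeholder literal substituted for @filter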

View 7 Replies View Related

Lookup Tables Or Another Method?

Jul 11, 2007

I'm creating an application that will allow users to contribute "content". The content can be tagged, saved as "my content", etc... very web 2.0'ish. The site will rely very heavily on SQL Server 2005. By default, there is a Content table that simply stores the content with an identity column PK. There is also a Tag table and User table.
What is the most effective schema for speed and reliable scalability? Eventually there could be hundreds to thousands of people contributing and tagging content.
Idea 1: Lookup Table
Simply make a new table that holds the TagID and UserID. This table will be very important, as it will be queried very regularly to show users which tags they have stored.
Pros: This effectively allows me to store and query which tags the user has selected. It's simple to set up and the application won't have to do much work with the data returned from the query.
Cons: What happens when there are thousands of users tagging content? At what point does it become very inefficient to query a table that has a huge number of rows?
Idea 2: Comma-Delimited List
Simply have a field in the User table that holds a comma-delimited list of TagIDs the user has selected.
Pros: Keeps table size low, fairly easy to implement.
Cons: The application has to perform more work. It has to split the TagIDs on commas and re-query the database to get each tag's data by TagID.
Those are basically the only two methods I've really got experience using. The reason for this post is to see if there are methods that I'm not aware of that are better suited for what I'm trying to do.
Any assistance is greatly appreciated, thank you in advance!
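
For reference, a minimal sketch of the junction-table approach from Idea 1, with hypothetical table and column names; with the clustered key leading on UserID, the per-user tag query stays cheap even as the row count grows:

-- Hypothetical junction table linking users to the tags they have stored (Idea 1).
CREATE TABLE dbo.UserTag
(
    UserID int NOT NULL,
    TagID  int NOT NULL,
    CONSTRAINT PK_UserTag PRIMARY KEY CLUSTERED (UserID, TagID)
);

-- All tags a given user has stored.
DECLARE @UserID int;
SET @UserID = 42;   -- the signed-in user, supplied by the application

SELECT t.TagID, t.Name
FROM   dbo.UserTag AS ut
       INNER JOIN dbo.Tag AS t ON t.TagID = ut.TagID
WHERE  ut.UserID = @UserID;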
 

View 5 Replies View Related

Send Request To Stored Procedure From A Method And Receive The Resposne Back To The Method

May 10, 2007

Hi, I am trying to write a method which needs to call a stored procedure and then get the stored procedure's response back into a variable declared in the method.

private string GetFromCode(string strWebVersionFromCode, string strWebVersionString)
{
    // call stored procedure
}

strWebVersionFromCode = GetFromCode(strFromCode, "web_version"); // the variable that will store the response

How should I do this? Please assist.
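
On the database side, a minimal sketch (hypothetical procedure, table, and column names) of a procedure that hands a single value back; the method above could then read it with SqlCommand.ExecuteScalar, or via an OUTPUT parameter if preferred:

-- Hypothetical procedure returning one value for the calling method to read.
CREATE PROCEDURE dbo.GetWebVersion
    @SettingKey nvarchar(50)
AS
BEGIN
    SET NOCOUNT ON;
    SELECT SettingValue
    FROM   dbo.AppSettings
    WHERE  SettingKey = @SettingKey;
END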

View 3 Replies View Related

Update Method Is Not Finding A Nongeneric Method!!! Please Help

Jan 29, 2008

Hi,
I just have a DataSet with my tables and that's it.
I have a GridView with several data items on it.
No problem getting the data or inserting, but as soon as I try to delete or update some records the local machine throws the same error:
Unable to find nongeneric method...
I've tried to create an Update query in my table adapters but it's still not working.
Also tried removing the original_{0} and got the same error.
Please help if anyone has a solution.
 
Thanks

View 7 Replies View Related

Performance Expectations For Fuzzy Lookup Against 25mill Row Lookup Table

Oct 31, 2007

We did some "at scale" fuzzy lookup tests today and were rather disappointed with the performance. I'm wanting to know your experience so I can set my performance expectations appropriately.

We were doing a fuzzy lookup against a lookup table with 25 million rows. Each row has 11 columns used in the fuzzy lookup, each between 10-100 chars. We set CopyReferenceTable=0 and MatchIndexOptions=GenerateAndPersistNewIndex and WarmCaches=true. It took about 60 minutes to build that index table, during which, dtexec got up to 4.5GB memory usage. (Is there a way to tell what % of the index table got cached in memory? Memory kept rising as each "Finished building X% of fuzzy index" progress event scrolled by all the way up to 100% progress when it peaked at 4.5GB.) The MaxMemoryUsage setting we left blank so it would use as much as possible on this 64-bit box with 16GB of memory (but only about 4GB was available for SSIS).

After it got done building the index table, it started flowing data through the pipeline. We saw the first buffer of ~9,000 rows get passed from the source to the fuzzy lookup transform. Six hours later it had not finished doing the fuzzy lookup on that first buffer!!! Running Profiler showed us it was firing off lots of singleton SQL queries doing lookups as expected. So it was making progress, just very, very slowly.

We had set MinSimilarity=0.45 and Exhaustive=False. Those seemed to be reasonable settings for smaller datasets.

Does that performance seem in line with expectations? Any thoughts to improve performance?

View 4 Replies View Related

Fuzzy Lookup Error When Adding Additional Lookup Columns

Sep 26, 2007

I'm working with an existing package that uses the fuzzy lookup transform. The package is currently working; however, I need to add some columns to the lookup columns from the reference table that is being used.

It seems that I am hitting a memory threshold of some sort, as when I add 3 or 4 columns, the package works, but when I add 5 columns, the fuzzy lookup transform fails pre-execute:

Pre-Execute
Taking a snapshot of the reference table
Taking a snapshot of the reference table
Building Fuzzy Match Index
component "Fuzzy Lookup Existing Member" (8351) failed the pre-execute phase and returned error code 0x8007007A.

These errors occur regardless of what columns I am attempting to add to the lookup list.

I have tried setting the MaxMemoryUsage custom property of the transform to 0, and to explicit values that should be much more than enough to hold the fuzzy match index (the reference table is only about 3000 rows, and the entire table is stored in less than 2MB of disk space).

Any ideas on what else could be causing this?

View 4 Replies View Related

Reporting Services :: SSRS Lookup - Can Use More Than One Field When Doing Lookup

Sep 23, 2015

Say I want to lookup a value in another dataset, but there is a grouping that requires you to know what the values for each level is in order to get to the correct detail record.   Can you still use the lookup function with more than one field to compare against? So for example

Department
\___SalesPerson
     \___Measure

I want to be able to add a new row at the Measure level, but look up each field from another dataset. In order to do that I will need both the Department AND SalesPerson values to do the lookup, but I don't think the Lookup function will let us do that, will it?
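
Two hedged possibilities: the usual report-side trick is to concatenate the key fields in both the source and destination expressions of Lookup so a single expression carries Department and SalesPerson together; alternatively, when both datasets can come from the same server, the multi-key join can be done in the dataset query itself, sketched below with hypothetical table and column names, so each Measure row already carries the fields from the other dataset:

-- Hypothetical source-side join on both key columns.
SELECT m.Department, m.SalesPerson, m.Measure,
       o.SomeField                     -- value that would otherwise come from the other dataset
FROM   dbo.MeasureSource AS m
       INNER JOIN dbo.OtherSource AS o
           ON  o.Department  = m.Department
           AND o.SalesPerson = m.SalesPerson;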

View 2 Replies View Related

Cached Datasets?

Aug 8, 2007

Hello all,
I have a report with a table and a chart. It uses dataset1 as the data source.
All works fine.
I create a new dataset called dataset2.
The queries are exactly the same. The only differences between the 2 datasets are the database server and the fact that one of the columns is a smallint (in dataset2) and an int (in Dataset1).
I change the datasetName property of both the table and the chart to use dataset2.
When I run the report I get a conversion error stating that there was an overflow of int2 while using dataset1. I have verified the report is not using dataset1 anywhere. If I delete dataset1 and run the report the error goes away. If I add it back, I get the error again. Why is the report looking at dataset1 if it is not referenced at all in the report? Does SQL RS cache the datasets and verify each when it compiles?

regards,
Bill

View 9 Replies View Related

View Result Cached

Aug 1, 2007

I am using SQL Server 2005. I have a VIEW that joins several tables. Columns can be added dynamically to one of the tables by the user from a GUI interface. However, after a column is added, it does not show up in the VIEW immediately. It will take a while (I haven't figured out exactly how long) before the extra column shows up in the execution result of the VIEW.
So it seems like SQL Server is caching that VIEW's schema. Is there any way I can make this view always come back with the latest schema?
Thanks a lot!
Penn
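
If the view is defined with SELECT *, SQL Server persists the view's column metadata as of the time the view was created; a sketch of forcing that metadata to refresh after the underlying table changes (sp_refreshview is a standard system procedure, the view name here is hypothetical):

-- Refresh the view's cached metadata so newly added columns appear immediately.
EXEC sp_refreshview N'dbo.MyView';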

View 1 Replies View Related

Cached Search Of Top Terms

Mar 11, 2008

Hi, I have a search and I want to create a hyperlinked list of the top 5 search terms below it. What's the most efficient way to go about this?
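
One simple sketch, assuming a hypothetical table that logs each search as it happens; the aggregate below is cheap enough to run on each page load, or its result can be cached in the application for a few minutes:

-- Hypothetical search log: one row per search performed.
SELECT TOP (5) SearchTerm, COUNT(*) AS Searches
FROM   dbo.SearchLog
GROUP BY SearchTerm
ORDER BY COUNT(*) DESC;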

View 20 Replies View Related

Cached Query Result

Sep 20, 2007

I want to check the performance of my query and I just want to remove cached query results. Is there any suggestion how I can do this?
I just want to check, after each modification, how much the performance improves.
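
A common sketch for clearing caches between test runs (these are standard DBCC commands, but they affect the whole instance, so use them only on a test server):

CHECKPOINT;               -- write dirty pages to disk first
DBCC DROPCLEANBUFFERS;    -- clear cached data pages from the buffer pool
DBCC FREEPROCCACHE;       -- clear compiled query plans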

View 1 Replies View Related

Snapshot And Cached Instances

Oct 23, 2007

Hi,

I'm trying to understand the cases where it's better to use snapshots and when it's better to use cached instances.

If I have 100 users trying to reach a report, is it better to use a snapshot or a cached instance? In both cases, the 100 users will have the same report result. And what about performance, is it similar?

Thanks for your time and response,

See you,


Have a nice day!

regards,
sandy

View 1 Replies View Related

Snapshots Vs Cached Instances

Jun 19, 2007

Hello,

I would like to know what is the difference between a snapshot and a cached instance in SSRS?

Which one has the best performance, and which one is best for multiple users and for reports containing parameters (the parameters are then passed in the WHERE clause of the SQL code, e.g. WHERE ... IN (@param1))?

Thanks for your answers.
Zoz

View 4 Replies View Related

Is It Possible To Lookup Value Based On Two Tables Using Lookup Task

Jun 27, 2007

Hi All,

Actually this is in regard to an SCD Type 2 dimension. The scenario is that I am moving a fact table from an old source, and the fact contains a DimensionA description value which I want to replace with the appropriate id from the dimension table. That dimension table is SCD Type 2, based on StartDate and EndDate, and the fact table doesn't contain a direct date value, only a TimeId. So to update the value in the fact table I have to join the time dimension table and the other dimension table to replace the fact's Description with the proper Id.



Let's assume the following structures:

DimensionA: id, Description, StartDate, EndDate

Fact table: id, measure1, measure2, TimeId, Description

Time dimension: TimeId, Date, Day, Hour, ...
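
For reference, a set-based sketch of that resolution, with hypothetical column names and assuming the fact table gets a DimensionAId column to populate; the join runs the fact's TimeId through the time dimension and into DimensionA's validity window:

UPDATE f
SET    f.DimensionAId = d.id
FROM   FactTable  AS f
       INNER JOIN TimeDimension AS t ON t.TimeId = f.TimeId
       INNER JOIN DimensionA    AS d ON d.Description = f.Description
                                     AND t.Date >= d.StartDate
                                     AND t.Date <  ISNULL(d.EndDate, '99991231');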

View 1 Replies View Related

Row Yielded No Match During Lookup When Using 2 Columns In Lookup

Jul 24, 2007

I am doing a lookup that requires mapping 2 columns in the column mapping section. When I do this, I get the error "Row yielded no match during lookup" . The SQL that I captured in SQL profiler does find the record when I run it in Management Studio. I have already tried trimming everything to no avail.

Why is this happening?



I tried enabling memory restrictions but then I my package hangs and I get a SQLDUMPER_ERRORLOG.log file with the following logged:



07/24/07 13:35:48, ERROR , SQLDUMPER_UNKNOWN_APP.EXE, AdjustTokenPrivileges () failed (00000514)
07/24/07 13:35:48, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, Input parameters: 4 supplied
07/24/07 13:35:48, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ProcessID = 5952
07/24/07 13:35:48, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ThreadId = 0
07/24/07 13:35:48, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, Flags = 0x0
07/24/07 13:35:48, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, MiniDumpFlags = 0x0
07/24/07 13:35:48, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, SqlInfoPtr = 0x0100C5D0
07/24/07 13:35:48, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, DumpDir = <NULL>
07/24/07 13:35:48, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ExceptionRecordPtr = 0x00000000
07/24/07 13:35:48, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ContextPtr = 0x00000000
07/24/07 13:35:48, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ExtraFile = <NULL>
07/24/07 13:35:48, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, InstanceName = <NULL>
07/24/07 13:35:48, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, ServiceName = <NULL>
07/24/07 13:35:48, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, Callback type 11 not used
07/24/07 13:35:48, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, Callback type 15 not used
07/24/07 13:35:49, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, Callback type 7 not used
07/24/07 13:35:49, ACTION, SQLDUMPER_UNKNOWN_APP.EXE, MiniDump completed: C:\Program Files\Microsoft SQL Server\90\Shared\ErrorDumps\SQLDmpr0033.mdmp
07/24/07 13:35:49, ACTION, DtsDebugHost.exe, Watson Invoke: No


Why am I getting this error with "Enable Memory Restriction"?

View 12 Replies View Related

Too Many Stored Procedures Or Cached Plans?

Aug 16, 2005

I was wondering if anyone had any concrete information about whether there is a problem with having too many stored procedures or plans in the cache. Obviously there is an impact on memory, but if we can ignore that for the time being, does SQL perform just as well with 100 query plans as it does with tens of millions of plans?

View 9 Replies View Related

Generating A Cached Copy Of The Report

May 27, 2005

Is there a way to programatically determine if a report should be generated from the cache or run against real-time data?

View 7 Replies View Related

Linked Servers And Cached Execution Path

May 1, 2006

I have a procedure that generates dynamic sql and then executes via the execute(strSQL) syntax. BOL states that if I use sp_executesql with hard-typed parameters passed in variables, the query optimizer will 'probably' match the sql statement with the cached execution path, thus avoiding recompilation and speeding up the results for heavily run procedures.

Can anyone tell me if this is also true if the sql references an object on a linked sql server 2000 database? Technically, the sql is exactly the same, but I'm unsure if there is some exception due to the way linked objects are processed.

Thanks!
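
For reference, a minimal sketch (hypothetical linked server and object names) of the parameterized sp_executesql form the question refers to, with the remote object referenced by its four-part name:

DECLARE @sql nvarchar(4000);
SET @sql = N'SELECT c.CustomerID, c.Name
             FROM LinkedSrv.RemoteDb.dbo.Customer AS c
             WHERE c.Region = @Region';

-- Hard-typed parameter, so the plan can be reused across different @Region values.
EXEC sp_executesql @sql, N'@Region nvarchar(10)', @Region = N'West';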

View 1 Replies View Related

SQL 2012 :: Way To Invalidate Cached Query Plans?

Apr 29, 2014

Is there any way to invalidate cached query plans? I would rather target a specific query instead of invalidating all of them. Also, is there any SQL Server setting that will cause cached query plans to be invalidated even though only one character in the query has changed?

exec sp_executesql N'select
cast(5 as int) as DisplaySequence,
mt.Description + '' '' + ct.Description as Source,
c.FirstName + '' '' + c.LastName as Name,
cus.CustomerNumber Code,
c.companyname as "Company Name",
a.Address1,
a.Address2,

[code]....

In this query we have seen (on some databases) that simply changing '@CustomerId int',@CustomerId=1065 to '@customerId int',@customerId=1065 fixed a speed problem... we just changed the case on the Customer bind parameter. On other servers this has no effect. I'm thinking the server is using an old cached query plan, but I don't know for sure.
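
A sketch of one way to target a single plan instead of flushing the whole cache (sys.dm_exec_query_stats, sys.dm_exec_sql_text, and the plan_handle form of DBCC FREEPROCCACHE exist in SQL 2008 and later; sp_recompile covers the case where the statement lives in a procedure). Changing the case of the parameter name alters the statement text, so it hashes to a new cache entry and gets a fresh plan, which is consistent with a stale plan being the culprit:

-- Find the plan handle for the suspect statement, then evict just that plan.
SELECT qs.plan_handle, st.text
FROM   sys.dm_exec_query_stats AS qs
       CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE  st.text LIKE N'%DisplaySequence%';

DBCC FREEPROCCACHE (0x060006001ECA...);   -- placeholder: paste the plan_handle returned above

-- Or, if the statement is wrapped in a stored procedure:
EXEC sp_recompile N'dbo.SomeProcedure';   -- hypothetical procedure name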

View 9 Replies View Related

Images Not Displaying From Snapshot Or Cached Report

Apr 11, 2008

Hello,
This is my first post, and I'm hoping you all can help.

Using Reporting Services 2005, I have several reports that use embedded images.
All images render fine when the report execution is set to:
1) Always run this report with the most recent data
1a) Do not cache temporary copies of this report

However, when I change the execution to either a Cache or a Snapshot, the images and some charts render as red "X" placeholders. This is sometimes remedied when the user clicks the page refresh button, but not always.

Of course, I could just have all the concurrent users use the uncached report that hits the OLAP server, but that would be highly inefficient, and just plain slow.

Thanks for any help on this subject.

-michael

View 2 Replies View Related

Clear Cached Reports In Visual Studio

Jun 12, 2007

I am frequently going back and forth between the making changes to my reports and previewing those changes, all from within visual studio without publishing the reports each time.



The problem is that frequently changes that I make are not reflected in the preview window unless I either close out of visual studio completely or I wait some length of time, (I'm not sure how long exactly, but 1/2 an hour seems to always do the trick.)



Is there a way to clear any cache and force visual studio to completely reprocess a report?



At least when it is formatting changes I can identify whether the change has stuck, but when I'm fixing bugs in the code, I can't tell if I didn't fix it or if the change just hasn't taken effect.

View 1 Replies View Related

Storing Daily Cached Query Results In SQL Server

Mar 7, 2005

I'm working on a reporting tool that could bring back hundreds of thousands of results at once. I need some way to run the actual query only once a day, and then the reporting tool would just pull back these cached results. To be short, I need to figure out how to do this using a minimum amount of resources. Would a DataView work with something like this? How would I have it update only once a day? I appreciate any advice!
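
One server-side sketch, with hypothetical table and column names: materialize the expensive query into a small cache table on a nightly SQL Server Agent schedule, and point the reporting tool at that table instead of the raw query:

-- Nightly refresh job body: rebuild the cached result set once per day.
TRUNCATE TABLE dbo.DailyReportCache;

INSERT INTO dbo.DailyReportCache (CustomerID, OrderTotal, RefreshedAt)
SELECT o.CustomerID, SUM(o.Total), GETDATE()
FROM   dbo.Orders AS o
GROUP BY o.CustomerID;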

View 2 Replies View Related

SQL 2012 :: Entity Framework Returning Cached Data

Mar 23, 2014

I have a datagridview bound to a table that is part of an Entity Framework model. A user can edit data in the datagridview and save the changes back to SQL. But there is a stored procedure that can also change the data, in SQL, not in the datagridview. When I try to "refresh" the datagridview, the LINQ query always returns the older cached data. Here's the code that I have tried using to force EF to retrieve new data:

// now refresh the maintenance datagridview data source
using (var context = new spdwEntities())
{
    // Query the newly created context ("context"), not the long-lived spdwContext;
    // the original queried spdwContext, which keeps serving its already-tracked (stale) entities.
    var maintData =
        (from o in context.MR_EquipmentCheck
         where o.ProdDate == editDate
         orderby o.Caster, o.Strand
         select o).ToList();   // materialize before the context is disposed
    mnt_DGV.DataSource = maintData;
}

When I debug, I can see that the SQL table has the updated data in it, but when this snippet of code runs, maintData has the old data in it.

View 0 Replies View Related

SQL Server 2012 :: Invalidate Cached Query Plans?

Apr 30, 2014

Is there a way to invalidate cached query plans? I would rather target a specific query instead of invalidating all of them.

Also, do you know of any SQL Server setting that will cause cached query plans to be invalidated even though only one character in the queries has changed?

exec sp_executesql N'select
cast(5 as int) as DisplaySequence,
mt.Description + '' '' + ct.Description as Source,

[Code].....

In this query we have seen (on some databases) that simply changing '@CustomerId int',@CustomerId=1065 to '@customerId int',@customerId=1065 fixed a speed problem... we just changed the case on the Customer bind parameter. On other servers this has no effect. I'm thinking the server is using an old cached query plan, but I don't know for sure.

View 3 Replies View Related

Auto Stats Adversly Affect Cached Plans?

May 31, 2007

With SQL2005 SP2, we are seeing that when auto stats run on one or more indexes of a large table (1.5M rows), then immediately the stored proc using that table starts acting as if the query plan is no longer any good. This causes a drastic slowdown in response time and a corresponding increase of table reads to get the data. E.g, the next execution of the procedure after the auto stats kick in goes from 355 reads to 755000 reads (as depicted by Profiler). Generally, there are about 25 people using the DB at any one time. They connect through a mid-tier VB component.



I tried adding WITH RECOMPILE to the stored proc in question, but that caused almost all executions to run at the higher number. I thought that the WITH RECOMPILE hint would create a new query plan for each execution of the procedure and that plan would be the latest and greatest. Perhaps it did, but most users got stuck with the higher number of reads anyway. After taking the hint out, everyone went back to getting the 335 number and quick response times.



What we are wrestling with is that when those auto stats hit, it really messes up everyone until we manually recompile the procedure. Daily we delete all records in the table that are over 45 days old, so the table stays pretty much the same size. We also set the recompile flag to cause a new plan to be generated that will reflect the smaller amount of data. Should we also run a stats update before recompiling the procedure? Profiler has been very helpful in capturing what is going on, so I think I have a good handle on that. However, I don't understand why WITH RECOMPILE produced a messed up plan for everyone. The compile itself seems to take only 1 ms when done from the query screen.
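
One sequence worth testing, sketched with hypothetical object names (UPDATE STATISTICS and sp_recompile are standard commands): refresh the statistics right after the daily purge and invalidate the procedure's plan once, rather than paying for WITH RECOMPILE on every execution:

-- Run immediately after the daily delete of rows older than 45 days.
UPDATE STATISTICS dbo.BigTable WITH FULLSCAN;

-- Invalidate the cached plan once, so the next call compiles against fresh statistics.
EXEC sp_recompile N'dbo.TheProcedure';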

View 11 Replies View Related

Row Yielded No Match During Lookup While There Is No Row Going Through The Lookup

Sep 29, 2006

Hi all,

I don't understand what's happening here.

I have a Conditional Split with 3 outputs. On the first output I have a lookup, when I execute the package I have 56 rows going through the Conditional Split, all rows are then going to the 2nd and 3rd output but the lookup on the first output generates an error "Row yielded no match during lookup".

I don't understand why the lookup is generating an error while there is no row going through it.

Any idea ?

Sébastien.

View 6 Replies View Related

Reporting Services :: SSRS 2008 R2 BIS Deploy Does Not Always Clear Cached Reports

Jul 19, 2011

Is there a known problem with BIS Deploy not clearing the cache? I repeatedly (but not always) have an issue where I will do a BIS Deploy of an RDL but when I run it, the old report runs. I can't clear this out until I explicitly log into the Report Manager, explicitly delete the RDL, and re-deploy from the BIS.

Is there a workaround, fix, process change, configuration change I can do to keep this from happening?

View 7 Replies View Related

How To Pick Nearby Text Of Lookup Terms With Help Of Term Extraction/Term Lookup

Oct 4, 2007

I am designing an SSIS package intended to mine text data (data extracted from websites).
Term Lookup/Term Extraction have been used as the mining tools.
I have lookup terms defined for the reference table, but the main problem lies in extracting the text/numbers/characters near these lookup terms during mining.
For example:
I found the noun "Email" 200 times (frequency score) in my text. Now I want to extract the nearby email address (this also applies to PhoneNumber and Address attributes). So how can I achieve this with SSIS?
If you have some idea/suggestion to carry out this challenge, with or without Term Extraction/Term Lookup, please do write here.
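
Outside of the Term Extraction/Term Lookup components, a crude T-SQL sketch (hypothetical table and column names) that pulls a fixed window of characters around each occurrence of a term, as a possible starting point:

-- Grab up to 60 characters on either side of the first occurrence of the term.
SELECT d.DocumentID,
       SUBSTRING(d.DocText,
                 CASE WHEN CHARINDEX('Email', d.DocText) > 60
                      THEN CHARINDEX('Email', d.DocText) - 60 ELSE 1 END,
                 120 + LEN('Email')) AS NearbyText
FROM   dbo.Documents AS d
WHERE  CHARINDEX('Email', d.DocText) > 0;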

View 1 Replies View Related

SQL 2012 :: Select Statements And Ended Up Seeing Multiple Cached Instances Of Same Stored Procedure

Nov 24, 2014

I ran the below 2 select statements and ended up seeing multiple cached instances of the same stored procedure. The majority have only one cached instance but more than a handful have multiple cached instances. When there are multiple cached instances of the same sproc, which one will sql server reuse when the sproc is called?

SELECT o.name, o.object_id,
ps.last_execution_time ,
ps.last_elapsed_time * 0.000001 as last_elapsed_timeINSeconds,
ps.min_elapsed_time * 0.000001 as min_elapsed_timeINSeconds,
ps.max_elapsed_time * 0.000001 as max_elapsed_timeINSeconds

[code]...
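
To see why the same procedure has several cached entries, a follow-up sketch using standard plan-cache DMVs; the entries most often differ in their set_options attribute (different SET options at compile time), and SQL Server reuses whichever cached entry matches the calling session's cache-key attributes:

SELECT cp.plan_handle, cp.usecounts, pa.attribute, pa.value
FROM   sys.dm_exec_cached_plans AS cp
       CROSS APPLY sys.dm_exec_plan_attributes(cp.plan_handle) AS pa
WHERE  cp.objtype = 'Proc'
  AND  pa.attribute IN ('objectid', 'dbid', 'set_options')
ORDER BY cp.plan_handle, pa.attribute;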

View 4 Replies View Related

Cached Reports REQUIRE That SQL Server Used SQL Server Authentication (or Mixed)?

May 16, 2007

I am trying to setup cached reports so that one of my larger reports doesn't have to be re-run every time someone wants to view it (the data source only updates every 24 hours).



Anyway I made a new data source, set the report to use that, and in that data source I said to use "SQLexampleUserName" as the stored credentials.



Now when I go to run the report it says: Login failed for user 'username'. The user is not associated with a trusted SQL Server connection.



This is referenced in the MSKB here:

http://support.microsoft.com/default.aspx/kb/555332



Which makes sense, but now my question is: if I want to use a cached report, do I HAVE to allow SQL Server credentials?



I was using Windows Authentication only up to this point.

View 7 Replies View Related

User-defined Stored Procedures InsertCustomerin Northwind Database That Is Cached In Database Explorer:No New Values Inserted?

Jan 14, 2008

Hi all,

I put "Northwind" Database in the Database Explorer of my VB 2005 Express and I have created the following stored procedure in the Database Exploror:

--User-defined stored procedure 'InsertCustomer'--

ALTER PROCEDURE dbo.InsertCustomer
(
    @CustomerID nchar(5),
    @CompanyName nvarchar(40),
    @ContactName nvarchar(30),
    @ContactTitle nvarchar(30),
    @Address nvarchar(60),
    @City nvarchar(15),
    @Region nvarchar(15),
    @PostalCode nvarchar(10),
    @Country nvarchar(15),
    @Phone nvarchar(24),
    @Fax nvarchar(24)
)
AS
INSERT INTO Customers
(
    CustomerID, CompanyName, ContactName, ContactTitle, Address, City,
    Region, PostalCode, Country, Phone, Fax
)
VALUES
(
    @CustomerID, @CompanyName, @ContactName, @ContactTitle, @Address, @City,
    @Region, @PostalCode, @Country, @Phone, @Fax
)
=================================================
In my VB 2005 Express, I created a project "KimmelCallNWspWithAdoNet" that had the following code:
--Form_Kimmel.vb--
Imports System.Data
Imports System.Data.SqlClient
Imports System.Data.SqlTypes

Public Class Form_Kimmel

    Public Sub InsertCustomer()

        ' Note: the keyword/value pair must be "Integrated Security=SSPI" (the original had a dash).
        Dim connectionString As String = "Integrated Security=SSPI;Persist Security Info=False;" + _
            "Initial Catalog=northwind;Data Source=NAB-WK-EN12345"

        Dim connection As SqlConnection = New SqlConnection(connectionString)
        connection.Open()

        Try
            Dim command As SqlCommand = New SqlCommand("InsertCustomer", connection)
            command.CommandType = CommandType.StoredProcedure

            command.Parameters.Add("@CustomerID", "PAULK")
            command.Parameters.Add("@CompanyName", "Pauly's Bar")
            command.Parameters.Add("@ContactName", "Paul Kimmel")
            command.Parameters.Add("@ContactTitle", "The Fat Man")
            command.Parameters.Add("@Address", "31025 La Jolla")
            command.Parameters.Add("@City", "Inglewoog")
            command.Parameters.Add("@Region", "CA")
            ' Note: the parameter name must match the procedure exactly ("@Country"; the original passed "@Counrty").
            command.Parameters.Add("@Country", "USA")
            command.Parameters.Add("@PostalCode", "90425")
            command.Parameters.Add("@Phone", "(415) 555-1234")
            command.Parameters.Add("@Fax", "(415 555-1235")

            Console.WriteLine("Row inserted: " + _
                command.ExecuteNonQuery().ToString)

        Catch ex As Exception
            Console.WriteLine(ex.Message)
            Throw

        Finally
            connection.Close()
        End Try

    End Sub

End Class
==============================================


I executed Form_Kimmel.vb and I got no errors. But I did not get the new values inserted into the "Customers" table of the Northwind database. Please help and tell me what I did wrong and how to correct this problem.

Thanks in advance,
Scott Chang

View 10 Replies View Related






