Bulk Inserts Between Two Different DBMS

Sep 21, 2004

Hi folks,
I have a table located in DB2 and I need to have a mirror image of this table on a SQL 2000 database to avoid some server downtime problems.
Right now I have a solution using ADO.NET with Windows Services.

This Windows service invokes itself every morning and pulls all the records from this table in DB2 into a DataSet. Then I loop through the DataSet and insert every record into the SQL 2000 table. This method is working fine (it takes approximately 2 minutes to insert 5000 records). I am just wondering whether there is any way to achieve bulk insertion in this case. Considering the future growth of the table, I don't think the existing solution is either elegant or efficient.

Please let me know if I can achieve the same thing using XML, BULK INSERT or any other mechanism in ADO.NET, and please remember that we are talking about data migration between different DBMSs (DB2 to SQL 2000).
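One set-based option, sketched below: it assumes a linked server (here called DB2LINK) has already been set up against the DB2 database through the IBM OLE DB or ODBC provider, and the table and column names are placeholders. The whole refresh then becomes a single INSERT ... SELECT instead of a row-by-row loop, and it can be scheduled as a SQL Server Agent job or DTS package instead of a Windows service. (If you can later move to .NET 2.0, the SqlBulkCopy class in ADO.NET can also stream a DataSet or data reader into SQL Server in bulk.)

-- Sketch only: DB2LINK, dbo.MirrorTable and the DB2 schema/table/column names
-- are placeholders for your actual objects.
BEGIN TRAN

-- refresh the mirror copy in one set-based statement
TRUNCATE TABLE dbo.MirrorTable

INSERT INTO dbo.MirrorTable (Col1, Col2, Col3)
SELECT Col1, Col2, Col3
FROM OPENQUERY(DB2LINK, 'SELECT Col1, Col2, Col3 FROM MYSCHEMA.SOURCETABLE')

COMMIT TRAN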

Thanks,
Sai

View 1 Replies



Bulk Inserts

Mar 3, 2006

I'm trying to perform a bulk insert as shown below. It's problematic because it's not updating the identity fields correctly and we're getting duplicates. I think, but I'm not sure, that ON UPDATE CASCADE would solve all this, because we wouldn't have to concern ourselves with even touching the identity fields; they would be autogenerated. Can someone shed some light? I'm pretty confused.


CREATE PROCEDURE AddMiamirecords AS

BEGIN TRANSACTION

--USERS
INSERT INTO [Undex_Production].[dbo].[USERS]([LastName], [UserName], [EmailAddress], [Address1], [WorkPhone], [Company], [CompanyWebsite], [pword], [IsAdmin], [IsRestricted],[AdvertiserAccountID])
SELECT dbo.fn_ReplaceTags (convert (varchar (8000),Advertisername)), [AdvertiserEmail], [AdvertiserEmail],[AdvertiserAddress], [AdvertiserPhone], [AdvertiserCompany], [AdvertiserURL], [AccountNumber],'3',0, [AccountNumber]
FROM Miami
WHERE not exists (select * from users Where users.Username = miami.AdvertiserEmail)
AND validAD=1


--PROPERTY
INSERT INTO [Undex_Production].[dbo].[Property]([ListDate],[CommunityName],[TowerName],[PhaseName],[Unit], [Address1], [City], [State], [Zip],[IsActive],[AdPrintId])
SELECT [FirstInsertDate],[PropertyBuilding],[PropertyStreetAddress],PropertyCity + ' ' + PropertyState + ' ' + PropertyZipCode as PhaseName,[PropertyUnitNumber],[PropertyStreetAddress],[PropertyCity], [PropertyState], [PropertyZipCode],'0',[AdPrintId]
FROM [Undex_Production].[dbo].[miami]
WHERE miami.AdvertiserEmail IS NOT NULL
AND validAD=1


--ITEM
INSERT INTO [Undex_Production].[dbo].[ITEM] ([SellerID],[Price],[StartDate],[EndDate], [HomePageFeatured],[Classified],[IsClosed])
SELECT USERS.UserID, miami.PropertyPrice, convert(datetime,miami.FirstInsertDate), dateadd(day, 30, miami.FirstInsertDate)as EndDate, 1, convert (int,AdNumber) as Classified, 0
FROM USERS RIGHT OUTER JOIN
miami ON USERS.UserName = miami.AdvertiserEmail
WHERE validAD=1


--PROPERTYITEM
INSERT INTO [Undex_Production].[dbo].[propertyItem]( [propertyId], [ItemId])
SELECT Property.propertyId, ITEM.ItemID
FROM ITEM RIGHT OUTER JOIN
miami ON ITEM.StartDate = miami.FirstInsertDate AND ITEM.Price = miami.PropertyPrice AND ITEM.Classified = convert(int,miami.AdNumber) LEFT OUTER JOIN
Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND
miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1
WHERE validAD=1

--CONDOFEATURES
INSERT INTO [Undex_Production].[dbo].[CondoFeatures](PropertyId,[Bedrooms], [Area], [PropertyDescription], [Bathrooms], [NumOfFloors])
SELECT Property.propertyId, [PropertyBedrooms], [PropertySquareFeet], dbo.fn_ReplaceTags (convert (varchar (8000),PropertyDescription)),
[PropertyBathrooms], [PropertyTotalFloors]
FROM miami LEFT OUTER JOIN
Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND
miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1
WHERE validAD=1

--COMMUNITY FEATURES
INSERT INTO [Undex_Production].[dbo].[CommunityFeatures](PropertyId,[totalFloors],isComplete1)
SELECT Property.propertyId, miami.propertyTotalFloors,'0' as IsComplete
FROM miami LEFT OUTER JOIN
Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND
miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1
WHERE validAD=1

--UNITDISCLOSURES
INSERT INTO [Undex_Production].[dbo].[UnitDisclosures]([propertyId],[monthcondoasso])
SELECT Property.propertyId, [propertyassocfee]
FROM miami LEFT OUTER JOIN
Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND
miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1
WHERE validAD=1


--BROKERDEVELOPER
INSERT INTO [Undex_Production].[dbo].[BrokerDeveloper]([IsFSBO],[FSBOName],
[FSBOEmail],[FSBOWebsite],[IsDeveloper],[DeveloperName],[DeveloperWebsite],[IsBroker],[BrokerName],[BrokerageWebsite],
[propertyId],[brokercommission],[isComplete])SELECT
CASE AdvertiserType when 'FSBO' THEN 1 else 0 end,
CASE AdvertiserType when 'FSBO' THEN [AdvertiserName] else NULL end,
CASE AdvertiserType when 'FSBO' THEN [AdvertiserEmail] else NULL end,
CASE AdvertiserType when 'FSBO' THEN [AdvertiserURL] else NULL end,
CASE AdvertiserType when 'Developer' THEN 1 else 0 end,
CASE AdvertiserType when 'Developer' THEN [AdvertiserName] else NULL end,
CASE AdvertiserType when 'Developer' THEN [AdvertiserURL] else NULL end,
CASE AdvertiserType when 'Realtor' THEN 1 when 'Broker' THEN 1 else 0 end,
CASE AdvertiserType when 'Realtor' THEN [AdvertiserName] when 'Broker' THEN [AdvertiserName] else NULL end,
CASE AdvertiserType when 'Realtor' THEN [AdvertiserURL] when 'Broker' THEN [AdvertiserName] else NULL end,
Property.propertyId,[PropertyCommBroker],'0' as IsComplete
FROM miami LEFT OUTER JOIN
Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND
miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1
WHERE validAD=1
IF @@ERROR <> 0
BEGIN
ROLLBACK TRAN
RETURN
END
COMMIT TRANSACTION
GO
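Two things worth noting about the procedure, sketched below: @@ERROR is reset by every statement, so checking it only once at the end means a failure in one of the earlier INSERTs may never trigger the ROLLBACK; and only the USERS insert is protected by a NOT EXISTS guard, so re-running the procedure will insert the same Property (and dependent) rows again, which is one common source of duplicates. A sketch of the same guard applied to the Property insert, assuming Unit/Zip/City/Address1 identify a property the way the later joins already assume:

--PROPERTY (guarded against re-insertion; column list as in the procedure above)
INSERT INTO [Undex_Production].[dbo].[Property]([ListDate],[CommunityName],[TowerName],[PhaseName],[Unit],[Address1],[City],[State],[Zip],[IsActive],[AdPrintId])
SELECT [FirstInsertDate],[PropertyBuilding],[PropertyStreetAddress],PropertyCity + ' ' + PropertyState + ' ' + PropertyZipCode,[PropertyUnitNumber],[PropertyStreetAddress],[PropertyCity],[PropertyState],[PropertyZipCode],'0',[AdPrintId]
FROM [Undex_Production].[dbo].[miami]
WHERE miami.AdvertiserEmail IS NOT NULL
AND validAD = 1
AND NOT EXISTS (SELECT * FROM [Undex_Production].[dbo].[Property] p
                WHERE p.Unit = miami.PropertyUnitNumber
                  AND p.Zip = miami.PropertyZipCode
                  AND p.City = miami.PropertyCity
                  AND p.Address1 = miami.PropertyStreetAddress)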

View 2 Replies View Related

Using Triggers With Bulk Inserts

May 4, 2005

Hello All.

I am attempting an

insert into <tablename> select * from...

This is inserting thousands of rows into a table that has a trigger on it. The trigger updates a separate table.

Now my question is: will the trigger fire for every record inserted? I need it to fire only once, because the table the trigger is updating is becoming locked when the insert statement is executed. I am assuming it is because of the number of times the trigger is being fired that the table is becoming locked. Any help would be greatly appreciated.
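For reference, an AFTER trigger in SQL Server fires once per INSERT statement, not once per row; the inserted pseudo-table holds all of the new rows. If the trigger was written assuming one row at a time, it can behave badly on a multi-row insert. A minimal set-based sketch (table and column names are placeholders, not your actual objects):

CREATE TRIGGER trg_SourceTable_Insert ON dbo.SourceTable
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON

    -- one set-based update covering the whole batch of inserted rows
    UPDATE s
    SET s.TotalRows = s.TotalRows + i.Cnt
    FROM dbo.SummaryTable AS s
    JOIN (SELECT CategoryID, COUNT(*) AS Cnt
          FROM inserted
          GROUP BY CategoryID) AS i
        ON i.CategoryID = s.CategoryID
END

If the separate table only needs to be touched once per statement regardless of how many rows arrived, the trigger body can ignore inserted entirely and run a single update.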

View 4 Replies View Related

SQL 2012 :: Blocking During Bulk Inserts?

Dec 9, 2014

We are noticing a lot of blocking when performing bulk inserts.

Sometimes the lead blocker(s) are blocking inserts into the same tables, but most of the time it's inserts into other tables. So, while a bulk insert into table X is running, a bulk insert into table Y is blocked by the insert into X.

- We turned off transactions from the application performing the bulk inserts - no change.

- We enabled table locking from the application - no change

- We enabled lock_on_bulk_load on the tables - this seems to have worked; however still see some blocking.

According to the lock_on_bulk article, when you specify table locking for a bulk import operation, a bulk update (BU) lock is taken on the table for the duration of the bulk-import operation. Shouldn't we be seeing this BU lock? Instead, we see nothing but LCK_M_X (exclusive table locks).

Just found out that in order to get a BU lock, no indexing can exist on the table ... sure would be nice if that was in the article!

Also, we saw the exclusive locks happening before we made these changes, meaning it wasn't using the row locks that it should be by default. Assuming we're missing something here with lock escalation. I am profiling just for lock escalation, and it never happens ...

So I guess my pending question at this time is: why, when inserting into table X, do inserts into table Y get blocked?
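A diagnostic sketch that can be run while the loads are executing, to see exactly which session is the lead blocker and which resource the blocked bulk insert is waiting on (DMV and column names as in SQL Server 2012):

SELECT r.session_id,
       r.blocking_session_id,
       r.command,
       r.wait_type,
       r.wait_resource,
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0

If the wait_resource points at the same object even for inserts into different tables, the contention can be on something the loads share (a common trigger target, foreign-key lookups, or the same destination database/filegroup) rather than on the destination tables themselves.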

View 3 Replies View Related

T-SQL (SS2K8) :: Bulk Inserts In Batch?

Jun 16, 2015

I have a table with 370 million rows and 50+ columns. I need to change the data type of a column from character to numeric. Here's what it contains:

40 million rows with numbers I want to keep; the rest I just want to set to null:

4k with alpha characters
55 million other numbers
275 million empty strings

An alter column statement fails not just on the alpha characters but on the empty strings. So I tried a couple things on a test database to get an idea of the time it would take:

An update statement to clear out the non-numeric data is too slow (~1.5 days, batched 10000 at a time). I think I probably should create a new column anyway though, so I'm going to copy the data to a new table since it would be faster than adding a new column to the original table.

An insert ... select ... takes about 12 hours; adding WITH (TABLOCK) didn't seem to have any effect, and I'm not sure how to batch it. Recovery model is simple.

A select ... into ... only takes about 1 hour, but can't be batched.

Using a 3rd party ETL tool takes about 5 hours, batched.

I wanted to batch it to minimize the impact on other queries, and primarily on the logs. Is there any way to do a fast batched bulk transfer from within SSMS?
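A sketch of one way to batch it from a query window, assuming the source table has a sequential integer key named ID; all object and column names are placeholders. Each loop iteration commits on its own, so the log can be reused between batches under the simple recovery model, and TABLOCK on an empty heap target can allow minimally logged inserts:

DECLARE @BatchSize int = 1000000
DECLARE @From bigint, @MaxID bigint

SELECT @From = MIN(ID), @MaxID = MAX(ID) FROM dbo.SourceTable

WHILE @From <= @MaxID
BEGIN
    INSERT INTO dbo.TargetTable WITH (TABLOCK) (ID, NewNumericCol)
    SELECT ID,
           CASE WHEN OldCharCol NOT LIKE '%[^0-9]%' AND OldCharCol <> ''
                THEN CAST(OldCharCol AS numeric(18, 0))   -- keep the digit-only values
                ELSE NULL                                  -- alphas and empty strings become NULL
           END
    FROM dbo.SourceTable
    WHERE ID >= @From AND ID < @From + @BatchSize

    SET @From = @From + @BatchSize
END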

View 9 Replies View Related

BULK INSERT Inserts 2 Rows

May 15, 2008

Greetings

Got the bulk insert going really nicely thanks to your feedback! But now I notice that the BULK INSERT creates 2 identical rows, while the flat file has the column headers and then the corresponding row - just one row.
Why would it do that? Interestingly, if the data file and format file are residing on a shared folder on SQL 2005 and I run the BULK INSERT from Management Studio, it only inserts one row. But when I do the BULK INSERT via the user interface it creates 2 rows. What is going on here?
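If the header line is being read as data, BULK INSERT can be told to start at the second row; a sketch with placeholder names below (it is also worth confirming that the user-interface path is not simply executing the statement twice):

BULK INSERT dbo.TargetTable
FROM '\\server\share\datafile.txt'
WITH (
    FIRSTROW = 2,            -- skip the column-header line
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)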

View 10 Replies View Related

Bulk Inserts To Data Warehouse - Best Practices?

Jul 20, 2005

Hello all,

I just started a new job this week and they complain about the length of time it takes to load data into their data warehouse, which they do once a month.

From what I can gather, they rebuild the indexes before the insert with an 80% fill factor, then insert the data (with the indexes enabled), then rebuild the indexes with a 100% fill factor.

Most of my RDBMS experience is with a different product. We would have disabled the indexes and foreign keys, loaded the data, then re-enabled them, moving any records that violated the constraints into an appropriate audit table to be checked after.

Can someone share with me what the accepted "best practices" are for loading data efficiently into a data warehouse? Any thoughts would be deeply appreciated.

Steve
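The disable-load-rebuild pattern translates to SQL Server roughly as below (2005+ syntax; table, index, and constraint names are placeholders). Only nonclustered indexes should be disabled; disabling the clustered index makes the table inaccessible. On SQL Server 2000 the equivalent is to drop and re-create the nonclustered indexes around the load.

-- before the load
ALTER INDEX IX_FactSales_Date ON dbo.FactSales DISABLE
ALTER TABLE dbo.FactSales NOCHECK CONSTRAINT FK_FactSales_DimDate

-- ... bulk load here (BULK INSERT, bcp, SSIS, ...) ...

-- after the load
ALTER INDEX ALL ON dbo.FactSales REBUILD WITH (FILLFACTOR = 100)
ALTER TABLE dbo.FactSales WITH CHECK CHECK CONSTRAINT FK_FactSales_DimDate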

View 2 Replies View Related

Client Workstation Initiated Bulk Inserts Fail

Sep 24, 2007

I'm setting up a new 2005 server and bulk insert from a client workstation (using windows authentication) is failing with:

Msg 4861, Level 16, State 1, Line 1
Cannot bulk load because the file "\\FILESERVERNAME\sharedfolder\filename.txt" could not be opened. Operating system error code 5 (Access is denied.).


Here's my BULK INSERT statement (though I'm pretty sure there's nothing wrong with it):


BULK INSERT #FIRSTROW FROM '\\FILESERVERNAME\sharedfolder\filename.txt'
WITH (
DATAFILETYPE = 'char',
ROWTERMINATOR = '\n',
LASTROW = 1
)

If I run the same transact SQL when remote desktopped into the new server (under the same login as that used in the client workstation), it imports the file without errors.

If I use the sa client login from the client workstation (sql server authentication) the bulk insert succeeds.

My old SQL 2000 server lets me bulk insert the file without errors even from my client workstations using windows authentication.

I have followed the instructions on this site: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=928173&SiteID=1, but still no luck and the same error.

I'm pretty sure it is being caused by the increased constraints on bulk insert in 2005. Hoping someone can help. The more specific the better. If you need more info, let me know.

Oh and I've also made sure that the SQL service uses a domain logon account rather than the local system account.

Note that the file server (source file resides there) is a DIFFERENT machine than the 2005 SQL server. If I move the source file to the sql server machine the error goes away (not a preferred solution though).

Thanks!

View 9 Replies View Related

Row Level Security And Integration Services Bulk Inserts Don't Work

Jan 24, 2007

Hi, I followed Microsoft's "Implementing Row-and-Cell-Level Security in Classified Databases Using SQL Server 2005".

This works fine when I insert or delete data in a normal script (Management Studio).

My project runs in an SSIS package under different users. I cannot do a bulk insert using the OLE DB Data Destination; I get the following error:

An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Conflicting locking hints are specified for table "dbo.tblUniqueLabelMarking". This may be caused by a conflicting hint specified for a view.".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Conflicting locking hints are specified for table "dbo.tblUniqueLabelMarking". This may be caused by a conflicting hint specified for a view.".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Conflicting locking hints are specified for table "dbo.tblUniqueLabel". This may be caused by a conflicting hint specified for a view.".
Error: 0xC0209029 at Data Flow Task, OLE DB Destination 1 [1741]: The "input "OLE DB Destination Input" (1754)" failed because error code 0xC020907B occurred, and the error row disposition on "input "OLE DB Destination Input" (1754)" specifies failure on error. An error occurred on the specified object of the specified component.

View 6 Replies View Related

Importing Automatically From Another DBMS

Jun 7, 2008

Hello,

I want to ask if there is any option in SQL Server 2000 that can be used to import automatically every day. I have a FoxPro database which is updated every day, and I want my main table in SQL Server to be updated by importing everything again from FoxPro every night. Is there any option for that?
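On SQL Server 2000 the usual answer is a DTS package (or a plain T-SQL job step) scheduled nightly with SQL Server Agent. A sketch of the T-SQL route, assuming the Visual FoxPro OLE DB provider (VFPOLEDB) is installed; the linked server name, path, table, and column names are placeholders:

-- one-time setup: linked server over the FoxPro database container
EXEC sp_addlinkedserver
     @server = 'FOXPRO',
     @srvproduct = 'Visual FoxPro',
     @provider = 'VFPOLEDB',
     @datasrc = 'C:\FoxData\mydatabase.dbc'

-- nightly job step: refresh the main table from the FoxPro source
TRUNCATE TABLE dbo.MainTable

INSERT INTO dbo.MainTable (Col1, Col2)
SELECT Col1, Col2
FROM OPENQUERY(FOXPRO, 'SELECT Col1, Col2 FROM sourcetable')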

Thanks,


Diana

View 1 Replies View Related

Microsoft SQL Server And Other DBMS

Sep 3, 2007

Hello:

Many companies use a DBMS that is not Microsoft SQL Server. Those companies use Oracle, PostgreSQL, MySQL and Ingres, for example. I'm looking for a paper or website which explains the advantages (strengths) and disadvantages (weaknesses) of those DBMSs. Why? Our customers would like a comparison between different systems, and we must show them Microsoft SQL Server's possibilities.

Can you help me with this? In terms of performance, speed, security, maintenance, etc.

Thanks.

View 1 Replies View Related

How Is A One To One Relationship Established In SQL Express DBMS?

Apr 28, 2008

Table 1
vEmail_Id
vFirst_Name
vLast_Name

Table 2
vEmail_Id
iCell_No

How do I relate Table1 to Table2 with a one-to-one relationship in SQL Express?
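A common way to enforce a strict one-to-one relationship is to make Table2's primary key also a foreign key referencing Table1, so each Table1 row can have at most one matching Table2 row. A sketch using the columns above (the data types are assumptions):

CREATE TABLE Table1 (
    vEmail_Id   varchar(100) NOT NULL PRIMARY KEY,
    vFirst_Name varchar(50),
    vLast_Name  varchar(50)
)

CREATE TABLE Table2 (
    vEmail_Id varchar(100) NOT NULL PRIMARY KEY
        REFERENCES Table1 (vEmail_Id),
    iCell_No  varchar(20)
)

Because the foreign key column is also the primary key, the database itself guarantees at most one Table2 row per Table1 row.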
 

View 4 Replies View Related

Choosing DBMS And Architecture For Ecommerce Website

Jul 20, 2005

Hi to all,

I have to choose a DBMS and a database architecture for an eBay-like website about to be launched. The company wants to use a web hosting service and not host the database on dedicated servers at the office.

The database will contain web-only information and lots of back-end information that is not really needed on the web host. I'm wondering how to design that part: should I store all information on the web host only? Mirror that DB every evening onto some local DB server to be able to use the data without eating up lots of bandwidth? Separate the database into 2 parts? How to sync and assure integrity then? Having a local DB will also mean the company will have to pay a licence for the DBMS ...

What DBMS should I pick, considering that the database will have to hold at least 1 million products for sale (eBay-like) and all the information that goes with it? I thought any DBMS weaker than SQL Server or Sybase or Oracle will not be enough. What do you think?

Thanks a lot, hope I have made myself clear enough.

P.S. I would really like to get lots of different points of view. I think I'll use Sybase after all, so I wonder if that's a good choice, and I still want to know your thoughts about the 2-database or 1-database design (separate web and billing information, for example, or leave all the info in the hosted database; what techniques to use to keep the integrity and to have the latest information in-house) ... Thanks a lot

View 2 Replies View Related

Pulling Data From MySQL DBMS In France To Sql Server 2005 Here In US

Jan 22, 2008



Hi All,
I want to pull data from a MySQL DBMS to SQL Server 2005. I have written the following code, but it takes more than an hour and a half, which is not viable. Is there anything I should consider to reduce the time it takes? For your information, I am going to use SSIS packages; there is no transformation, it is a direct dump.
Here is the code I am using:


SELECT *
FROM OPENQUERY (Server_1, '
    SELECT
        t3.Column11 as Column1,
        Column12 as Column2,
        Column13 as Column3,
        Column14 as Column4,
        Column15 as Column5,
        Column16 as Column6,
        Column17 as Column7,
        Column18/1000 as Column8
    FROM table1 t1
    INNER JOIN table2 t2 ON t1.ColumnId = t2.columnID
    INNER JOIN Table3 t3 ON t2.columnId = t3.columnID
    WHERE t1.Column4 > Sometime ')
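Two things that usually shrink the load time: land the rows straight into a SQL Server table in one set-based statement, and only pull the rows that changed since the last load instead of the whole result every time. A sketch below; the staging table, the t1/t3 column prefixes, and the cutoff literal are assumptions, and the filter stays inside the OPENQUERY string so MySQL does the filtering:

INSERT INTO dbo.Staging (Column1, Column2, Column8)
SELECT Column1, Column2, Column8
FROM OPENQUERY(Server_1, '
    SELECT t3.Column11 AS Column1,
           t1.Column12 AS Column2,
           t1.Column18 / 1000 AS Column8
    FROM table1 t1
    INNER JOIN table2 t2 ON t1.ColumnId = t2.columnID
    INNER JOIN table3 t3 ON t2.columnId = t3.columnID
    WHERE t1.Column4 > ''2008-01-01 00:00:00'' ')

Inside SSIS the same idea applies: a DataReader/ADO.NET source that queries MySQL directly (with the date filter in the query) avoids pushing every historical row across the wire on each run.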


View 11 Replies View Related

Double Space Replaced With Single Space By Dbms ??!

Jul 20, 2005

This is driving me bananas. Can't find any info on this anywhere ...

SQL 2000 seems to replace a double space with a single space: when I set a varchar field to "  " (2 spaces), it only stores " " (1 space). Why on earth would Microsoft do this? If I save 2 spaces - I WANT TO SEE 2 SPACES!!!!

Can anyone help? Is this a database setting? Is this due to using varchar?

Any help appreciated.

Colin Hale
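A quick check suggests the value itself is usually stored intact: with the default ANSI_PADDING setting, varchar keeps the spaces you insert, and the "loss" tends to happen wherever the value is displayed (HTML, for instance, collapses runs of whitespace). A minimal sketch:

DECLARE @t TABLE (TestCol varchar(10))

INSERT INTO @t (TestCol) VALUES ('  ')           -- two spaces

SELECT DATALENGTH(TestCol) AS BytesStored        -- returns 2: both spaces are stored
FROM @t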

View 2 Replies View Related

Bulk Insert - Bulk Load Data Conversion Error

Jan 17, 2008

I'm having some issues with bulk insert.

This is the table:

CREATE TABLE [dbo].[tmp_GA_status](
    [GA_recno] [int] NOT NULL,
    [GA_desc] [varchar](40) NULL
)


This is the file (unicode):
1|"test1"
2|"test2"
3|"test3"
4|"test4"
5|"test5"
6|"test6"
7|"test7"
8|"test8"


and this is the sql:

bulk insert tmp_GA_status from 'C:\temp\TextDump\GA_status.dta'
with (CODEPAGE='RAW', FIELDTERMINATOR='|', ROWTERMINATOR='\n', DATAFILETYPE='widechar')



so yeah, pretty simple. But whatever I do I get this;

Msg 4864, Level 16, State 1, Line 1

Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 2 (GA_desc).



So what am I doing wrong ?

View 13 Replies View Related

I Don't Suppose BULK UPDATE Exists?... Like BULK INSERT?

Sep 27, 2007

I have to update a field within a table of 60 records or so. Each record has a different field value; its type is varchar. I was given an Excel file with the field values and was thinking of a bulk update like BULK INSERT, but I don't recall that being possible that way.

Is the only way to create a table, bulk insert, then merge the two tables together with UPDATE?

Just wanted to see if there was an easier way to do it; otherwise I'll take the latter route. Thanks!
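Right, there is no BULK UPDATE; the staging-table route is the standard one, and for ~60 rows it is quick. A sketch with placeholder table, file, and column names (the Excel sheet can be saved as CSV for the load):

CREATE TABLE dbo.StagingValues (
    KeyCol   varchar(50)  NOT NULL,
    NewValue varchar(100) NOT NULL
)

BULK INSERT dbo.StagingValues
FROM 'C:\import\values.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

UPDATE t
SET    t.TargetCol = s.NewValue
FROM   dbo.TargetTable AS t
JOIN   dbo.StagingValues AS s ON s.KeyCol = t.KeyCol

DROP TABLE dbo.StagingValues

Another low-tech option at this row count is to build the 60 UPDATE statements with an Excel formula and paste them into a query window.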

View 1 Replies View Related

Cannot Fetch A Row From OLE DB Provider BULK With Bulk Insert Task

Nov 23, 2005

Hi, folks:

View 18 Replies View Related

Pros: How To Bulk Delete And Bulk Insert?

Oct 11, 2000

I have a table containing 8 million records.
I need to replace 2 million of these records with
a scaled down query that goes something like:
SELECT 1, ShareholderID, Assets1
FROM MyTable (yields appx. 200,000 records)
SELECT 2, ShareholderID, Assets2
FROM MyTable (yields appx. 200,000 records)
.
.
.
SELECT 10, ShareholderID, Assets1 + Assets2 + Assets3 + ... + Assets9
FROM MyTable (yields appx. 200,000 records)

Updates and cursors just seem to be too slow.

So far I have done the following, but was wondering if anyone could think of a better way.
SELECT 6 million records that don't need to be deleted into a #TempTable
Use statements above to select into same #TempTable
DROP and recreate Original Table
SELECT 6 + 2 million records INTO original table.

This seems rather convoluted. Is there a better approach? Would it be worthwhile to dump the data to a file and use bcp / BULK INSERT?
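One alternative that avoids dropping and recreating the original table by name is to build the replacement alongside it and swap with sp_rename. A sketch, where KeepFlag and SliceNo are placeholders (the column lists are abbreviated the same way as in the question):

-- build the replacement
SELECT *
INTO dbo.MyTable_New
FROM dbo.MyTable
WHERE KeepFlag = 1                     -- placeholder predicate for the ~6 million rows that stay

INSERT INTO dbo.MyTable_New (SliceNo, ShareholderID, Assets1)
SELECT 1, ShareholderID, Assets1
FROM dbo.MyTable
-- ...repeat for slices 2 through 10...

-- swap the tables
BEGIN TRAN
EXEC sp_rename 'dbo.MyTable', 'MyTable_Old'
EXEC sp_rename 'dbo.MyTable_New', 'MyTable'
COMMIT TRAN

-- recreate indexes/constraints on the new table, then drop MyTable_Old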


Any comments are appreciated,

-Marc

View 3 Replies View Related

BULK INSERT ERROR Using Format File - Bulk Load Data Conversion Error

Jun 29, 2015

I'm trying to use BULK INSERT for the first time and getting the following error. I think it might have something to do with my format file, and from the error message there's a conversion error for the first column. In my database the field is nvarchar(6), so my best guess is to use SQLNChar for the first column. I've checked that the end of each line is CR LF, therefore the \r\n terminator is correct for line 7, right?

Msg 4863, Level 16, State 1, Line 1
Bulk load data conversion error (truncation) for row 1, column 1 (ASXCode).
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

BULK INSERT tbl_ASX_Data_temp
FROM 'M:DataASXImportTest.txt'
WITH (FORMATFILE = 'M:DataASXSQLFormatImport.Fmt')

[code]...

View 5 Replies View Related

Questions About Bulk Copy Insert Using 'Memory Based Bulk Copy Operations'

Feb 1, 2007

Hi~,

Before implementing a memory-based bulk copy insert with the IRowsetFastLoad interface of the SQL Server 2005 OLE DB provider, I want to know about some considerations:

- performance: compared with T-SQL's "BULK INSERT ..." and the bcp utility

- SQL Server's resource usage: when running a memory-based bulk copy, how does it affect the server's resources?

- server-side behavior: when the server is busy, does delayed update mean the IRowsetFastLoad::Commit(true) method can insert right away?

- row count: how many rows can be inserted with the IRowsetFastLoad::InsertRow() method before IRowsetFastLoad::Commit?

- any other guidelines

View 1 Replies View Related

How Do You Use An Identity Column When Doing A Bulk Insert Using The Bulk Insert Task Editor

Apr 18, 2008



Hello,

I'm just learning SSIS and I've hit my first bump. I am doing a bulk import from a tab-delimited text file into an empty SQL table that has an identity column defined. How do I tell the Bulk Insert task to skip that column when inserting from the text file? If I remove the identity column it imports the data fine, but I want to keep the identity column in the table too.
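One workaround, sketched below with placeholder names, is to point the bulk load at a view that leaves the identity column out; since the identity column is absent from the view, the engine generates its values for you. (The other common route is a format file that skips the column.)

CREATE TABLE dbo.Target (
    TargetID int IDENTITY(1, 1) PRIMARY KEY,
    Col1 varchar(50),
    Col2 varchar(50)
)
GO

-- the view exposes everything except the identity column
CREATE VIEW dbo.Target_Load
AS
SELECT Col1, Col2 FROM dbo.Target
GO

BULK INSERT dbo.Target_Load
FROM 'C:\import\data.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')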

Thanks.

View 8 Replies View Related

Inserts Per Second

Mar 3, 2006

Just curious,

I have four 72 GB Drive on a RAID 5

Disk Specs
IO/Second = 130 per disk
Speed RPM = 15 K

When I did a load test of inserting data into a table
with four Columns

Col1 INT
Col2 VARCHAR(32)
Col3 VARCHAR(4000)
Col4 DATETIME

I could do around 1044 inserts per second, whereas I thought I could do a max of 520 inserts (130 * 4), because each disk can only take 130 IOs; multiplied by 4 disks that gives me a theoretical limit of 520.

Also, how does Query Analyzer connect to the database server? Does it use ODBC?

Thx
Venu

View 1 Replies View Related

No Of Inserts

Mar 21, 2006

I am doing a simple IO test with the below script ...

Just wanted to keep things simple and to check how many inserts I can do on a given SQL Server.
I am running the below script from QA for 1 minute and then dividing the number of rows inserted by 60.

Will it give me approximate results by doing this?

Actually the data files (.MDF files) are sitting on a single drive where the manufacturer's specs show it will handle 130 IOs per disk. With the below script I am getting around 147 inserts per second.

But my boss says that he is getting 2000 inserts per second on his laptop from a ... Am I missing something?

DECLARE @lnRowCnt INT
SELECT @lnRowCnt = 100000

WHILE @lnRowCnt > 0
BEGIN
SET NOCOUNT ON
INSERT INTO CTMessages..Iotest
SELECT @lnRowCnt , 'VENU' , REPLICATE ( 'V' , 4000 ) , 1000000

SELECT @lnRowCnt = @lnRowCnt - 1
END

Thx
Venu

View 3 Replies View Related

Two Inserts In One InsertCommand

Apr 11, 2007

I'm using a SQLDataSource and trying to do two inserts into two different tables with one InsertCommand, but it's not working. Here's the code I'm trying to use. Do you see anything wrong with the syntax? I keep getting an error that says error near ','  but I can't figure out why. Thanks
 
InsertCommand="INSERT INTO [OurProjects] ([Title], [Description], [Location], [Anchors], [Size], [Developer], [DesignBuilder], [Architect], [ImageName], [MapName], [ProjectTypeAbbrev], [Deleted]) VALUES (@Title, @Description, @Location, @Anchors, @Size, @Developer, @DesignBuilder, @Architect, @ImageName, @MapName, @ProjectTypeAbbrev, @Deleted),
INSERT INTO [OurProjectsImages] ([OurProjectsID], [ImageMonthName], [SwfName]) VALUES (@OurProjectsID, @ImageMonthName, @SwfName)"
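Two statements in one command text need to be separated by a semicolon, not a comma. And if OurProjectsID is meant to be the identity of the row just inserted into OurProjects (an assumption), SCOPE_IDENTITY() can supply it in place of the @OurProjectsID parameter. A sketch of the command text alone, with the parameter names from the markup above:

INSERT INTO [OurProjects] ([Title], [Description], [Location], [Anchors], [Size],
    [Developer], [DesignBuilder], [Architect], [ImageName], [MapName],
    [ProjectTypeAbbrev], [Deleted])
VALUES (@Title, @Description, @Location, @Anchors, @Size, @Developer,
    @DesignBuilder, @Architect, @ImageName, @MapName, @ProjectTypeAbbrev, @Deleted);

INSERT INTO [OurProjectsImages] ([OurProjectsID], [ImageMonthName], [SwfName])
VALUES (SCOPE_IDENTITY(), @ImageMonthName, @SwfName)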

View 2 Replies View Related

SQL TimeSpan And Inserts

May 27, 2008

I need to add a check in the database to ensure that a user can only enter up to 20 entries in a period of 10 minutes. Basically, to guard against people using scripts to add data to the database (instead of using a CAPTCHA on the front end), we want to restrict a user to entering at most 20 transactions in 10 minutes. How do I handle this in SQL Server 2005?

What I figure I'll do is, right after I INSERT into the table, select the last 20 entries in that same table, then calculate the total time it took to add those 20 transactions and set the right flag.
1) How do I select the last 20 entries in a table?
2) How do I calculate the total time that elapsed between adding the first of those 20 records and the last?
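A sketch answering both questions, assuming the table has an identity key EntryID, a datetime column CreatedDate, and a UserID column (all placeholders):

DECLARE @UserID int
SET @UserID = 42                        -- placeholder

-- 1) the last 20 entries for this user
SELECT TOP 20 EntryID, CreatedDate
FROM dbo.Entries
WHERE UserID = @UserID
ORDER BY EntryID DESC

-- 2) minutes elapsed between the oldest and newest of those 20
SELECT DATEDIFF(minute, MIN(CreatedDate), MAX(CreatedDate)) AS ElapsedMinutes
FROM (SELECT TOP 20 CreatedDate
      FROM dbo.Entries
      WHERE UserID = @UserID
      ORDER BY EntryID DESC) AS last20

If ElapsedMinutes is under 10 (and 20 rows actually exist), the new insert can be rejected or the flag set.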
Thanks in Advance

View 6 Replies View Related

Next Inserts ID Number, Someone Please Help

Sep 21, 2004

G'day,
I have a table with a primary key that is a bigint, and it's set to auto-increment (or identity, or whatever MS calls it). Is there any way I can get the ID number that will be assigned to the next insert, before I insert it? I want to use that ID number within another field when inserting.
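You generally can't know the identity value before the insert, because another session may take it first. The usual pattern is to insert, capture the value with SCOPE_IDENTITY(), and then fill in the dependent column; a sketch with placeholder names:

DECLARE @NewID bigint

INSERT INTO dbo.MyTable (SomeColumn) VALUES ('example')
SET @NewID = SCOPE_IDENTITY()

UPDATE dbo.MyTable
SET OtherColumn = 'REF-' + CAST(@NewID AS varchar(20))
WHERE MyTableID = @NewID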

I hope that makes sense.

Thanks for any help.

Robbo

View 3 Replies View Related

Multiple Inserts

Jul 14, 2005

Hi, I'm trying to create a form where new names can be added to a database. The webform looks like this:

<body MS_POSITIONING="GridLayout">
    <form id="Form1" method="post" runat="server">
        Name: <asp:TextBox ID="newName" runat="server" />
        <INPUT id="NewUserBtn" type="button" value="Create New User" name="NewUserBtn" runat="server" onServerClick="NewBtn_Click">
    </form>

And the code-behind looks like this:

Public Sub NewBtn_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles NewUserBtn.ServerClick
    Dim DS As DataSet
    Dim MyConnection As SqlConnection
    Dim MyCommand As SqlDataAdapter

    MyConnection = New SqlConnection("server=databaseserver;database=db;uid=uid;pwd=pwd")
    MyCommand = New SqlDataAdapter("insert into certifications (name) values ('" & newName.Text & "'); select * from certifications", MyConnection)

    DS = New DataSet
    MyCommand.Fill(DS, "Titles")

    Response.Redirect("WebForm1.aspx", True)
End Sub

When I try to insert one name it works. When I try to insert a second name, it overwrites the old one. Why is that?

Thanks.
James

View 3 Replies View Related

Identity Inserts

Apr 15, 2006

Hey All,
I was trying to use a typed dataset to create a very simple DAL. I found that the code generated for the INSERT statement includes an identity field the table has. That can obviously never work (unless IDENTITY_INSERT is set, which it is not). My question is whether it is possible to control this INSERT statement generation. Is there a property I am missing somewhere? My solution was to change the INSERT statement on the DataTableAdapter, but it seems awkward to have to do that.
Thanks,
Yuval

View 3 Replies View Related

Updates, Inserts

Jan 22, 2001

I have a number of columns with a predefined character length, but the user can input more from the GUI. I want to truncate automatically to the desired length and then insert or update the database. Right now it does not allow me to update or insert the values. Can I do it, and how? This is urgent.
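The simplest fix is to truncate in the INSERT/UPDATE itself so over-length input never reaches the column; a sketch with placeholder names and lengths:

DECLARE @CustomerName varchar(500)      -- raw input from the GUI
SET @CustomerName = REPLICATE('x', 300)

-- target column is varchar(50): cut the value down before it gets there
INSERT INTO dbo.Customers (CustomerName)
VALUES (LEFT(@CustomerName, 50))

UPDATE dbo.Customers
SET CustomerName = LEFT(@CustomerName, 50)
WHERE CustomerID = 1                    -- placeholder key

Alternatively, CAST(@CustomerName AS varchar(50)) truncates to the same effect.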

View 2 Replies View Related

Volume Inserts

Jan 24, 2000

We have a 4-processor 350 MHz NT 4.0 SQL server. Currently we have an application
that is inserting rows one at a time; each row insert is a separate transaction.
Currently we are averaging 2500 rows a second, with each row 56 bytes wide.
The data and the log are on one string of RAID disks. We plan to get another controller
and RAID string to separate the data and the log onto separate controllers.
The developer is modifying the application to insert the data in blocks. What is the
impact on the transaction log? He seems to think that by inserting rows in page-sized blocks
there would be less data going into the transaction log. Why would this be so?
Does anyone have any information on practical limits for inserts and log truncation
with similar machine configurations? He would like to get to around 150,000 rows a second.
Has anyone accomplished inserts at this rate? What type of machine configuration?
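The gain from inserting in blocks comes mostly from committing less often rather than from writing much less log data: every row still gets logged, but each commit forces a log flush to disk, so one commit per row pays that cost thousands of times a second. A sketch of batched commits (table, column, and batch sizes are placeholders):

DECLARE @i int
SET @i = 0

BEGIN TRAN
WHILE @i < 100000
BEGIN
    INSERT INTO dbo.LoadTarget (Payload) VALUES (REPLICATE('x', 56))
    SET @i = @i + 1

    IF @i % 1000 = 0        -- commit in blocks of 1,000 rows
    BEGIN
        COMMIT TRAN
        BEGIN TRAN
    END
END
COMMIT TRAN

For rates in the 150,000 rows/second range, the bulk interfaces (bcp, BULK INSERT, or the bulk copy APIs) are usually needed rather than individual INSERT statements, since they can also be minimally logged.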

View 1 Replies View Related

Inserts Overwriting

Nov 21, 2001

Hi,
I have a small web application managing complaints. During multi-user testing we noticed that when complaints were added at "exactly" the same time, one complaint's text seemed to be overwriting the other, and the current max value of the table ID was returned as the current complaint number.

I tested in my development environment and was able to recreate it reasonably easily (1 go out of 3 recreated the issue). The ID column itself is an auto-increment (primary key), so I can't think of a conceivable reason why one record should overwrite another. I should say that I am assuming the record is overwritten; perhaps there is a clash and one complaint is ignored by the database.

Has anyone encountered this in the past?


Thanks

View 1 Replies View Related

Inserts Across Databases

Mar 6, 2002

Hi,
I have a procedure that I call in one database, and one of its steps is to write to a table in another database on the same server. The user exists in both databases, but I keep getting errors when I try to write to this second database. I know I can fix this by giving the user insert permissions on the table in the second database, but I do not want to do that for security reasons. Any other ideas on how to accomplish this?

View 1 Replies View Related






