Hi,
I have a small web application managing complaints. During multiuser testing we noticed that when complaints were added at "exactly" the same time, one complaint's text seemed to be overwriting the other's, and the current max value of the table's id was being returned as the current complaint number.
I tested in my development environment and was able to recreate it reasonably easily (1 attempt out of 3 reproduced the issue). The Id column itself is an auto-increment (primary key), so I can't think of a conceivable reason why one record should overwrite another. I should say that I am assuming the record is overwritten; perhaps there is a clash and one complaint is ignored by the database.
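For what it's worth, fetching the new complaint number with something like SELECT MAX(Id) after the insert is a classic race condition under concurrency: two sessions insert at the same time, both read the same MAX(Id), and the application then works against the wrong row, which looks exactly like one complaint overwriting the other. A minimal sketch of the safer pattern, assuming a hypothetical Complaints table with an identity column Id:

INSERT INTO Complaints (ComplaintText, CreatedDate)
VALUES (@ComplaintText, GETDATE());

-- SCOPE_IDENTITY() returns the identity value generated by this session and scope,
-- so concurrent inserts cannot hand back each other's Id the way MAX(Id) can.
SELECT SCOPE_IDENTITY() AS NewComplaintId;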
I am very new to SQL, but I am now receiving updates to a database from an online HD. My idea was that I could download the updated database and paste it over the existing database in my SQL folder. For some reason that is not working. How should this database be updated? I would prefer not to have to drop and re-add it every day.
Possibly I could make a database that is not in any folder and then update it from the newly received database, but I am not sure how that would be done.
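Copying the .mdf file over the existing one won't work while SQL Server has the database attached, because the file is locked and the server knows nothing about the swap. If what arrives from the online HD is (or can be) a backup file, restoring it is the usual route. A minimal sketch, assuming a hypothetical database name and file path:

RESTORE DATABASE [MyDatabase]
FROM DISK = N'C:\Updates\MyDatabase.bak'
WITH REPLACE,   -- overwrite the existing copy of the database
     STATS = 10;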
I need to insert entries into a table. The table has a primary key based on a field 'ID'. When inserting into the destination table, I want to make sure that the new entry will overwrite the old entry.
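A plain INSERT will raise a primary key violation rather than overwrite the existing row, so the overwrite has to be done explicitly. One common pattern, sketched against a hypothetical Destination table with columns ID and Value:

IF EXISTS (SELECT 1 FROM Destination WHERE ID = @ID)
    UPDATE Destination SET Value = @Value WHERE ID = @ID;
ELSE
    INSERT INTO Destination (ID, Value) VALUES (@ID, @Value);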
When I create a db backup on our network using BACKUP DATABASE...
BACKUP DATABASE [TKKCommonData] TO DISK = N'G:\SQL_BACKUPS\TKKCommonData\TKKCommonData_DATA.bak' WITH NOFORMAT, NOINIT, NAME = N'TKKCommonData_DATA-Full Database Backup', SKIP, NOREWIND, NOUNLOAD, STATS = 10
I've specified NOINIT so that the backup appends to the backup set rather than overwriting it; however, the existing backup is still overwritten. Any idea how to get the backup to append to the set rather than overwrite it?
I'm trying to create an installer that installs a database from an empty backed-up database. The SQL script executed by the installer is:
RESTORE DATABASE [DU] FROM DISK = N'$TARGETDIR$\DuTemp\DU.bak' WITH FILE = 1, MOVE N'DU' TO N'$DATABASE_DIR$\DU.mdf', MOVE N'DU_log' TO N'$DATABASE_DIR$\DU_log.ldf', NOUNLOAD, STATS = 10 GO
Here $TARGETDIR$ and $DATABASE_DIR$ are MSI variables set at runtime. The REPLACE flag is not set, but the DU database (non-empty) still gets overwritten with the empty database. Do I need to check manually whether the database exists before trying to restore it?
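The REPLACE option only skips RESTORE's safety checks; when those checks pass (same database name, no tail-log to lose), the restore will overwrite the existing database even without REPLACE. So yes, guarding in the script is the simplest protection. A sketch of the check, reusing the paths from the script above:

-- Only restore when the DU database is not already present on the instance.
IF DB_ID(N'DU') IS NULL
BEGIN
    RESTORE DATABASE [DU]
    FROM DISK = N'$TARGETDIR$\DuTemp\DU.bak'
    WITH FILE = 1,
         MOVE N'DU' TO N'$DATABASE_DIR$\DU.mdf',
         MOVE N'DU_log' TO N'$DATABASE_DIR$\DU_log.ldf',
         STATS = 10;
END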
I created a package which runs every day and dumps the data into an Excel file. The problem I am facing is that today the package runs and fills in the Excel file; tomorrow it runs again and fills in the data without deleting the previous records. But I want it to delete the records already present and fill the Excel file only with the new records.
I have a data dictionary report that is driven by measure and dimension description attributes in the SSAS project. Is there a way to generate some XMLA or something that allows me to update this "metadata" only without overwriting the database or requiring a full reprocess?
I'm using RS2000 SP2 and am getting an issue when exporting to PDF. If I have a table that spans more than one page and I set the RepeatHeaderOnNewPage to True, then on occasion the table header will be displayed on top of the first few rows of data. It does not happen on all the pages or all the time and I can not find any information on this issue. Has anyone come across this before and solved it?
I am trying to figure out how to simplify the process of populating a database created by an application with the same database, only with data already in it. So far I have created a backup of the database and used that backup file with SQL Server Management Studio Express to overwrite the existing database on a new computer, so the program has data when initially installed for demonstration purposes. I was hoping there was an executable script I could use, so that when someone wants a demonstration of our product, they can see its options and functionality with data available. Maybe I am going about this the wrong way; I need to know if there is a way that, when our program is installed, an executable can simply be run to populate our database with a backup of our sample database. Any input would be helpful. Thanks. Isaias
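One low-tech option is a batch file that calls sqlcmd (or osql on older installs) to restore the sample backup; whoever runs the demonstration just double-clicks it after the product is installed. A sketch, with a hypothetical instance name, database name, and backup path:

sqlcmd -S .\SQLEXPRESS -E -Q "RESTORE DATABASE [OurProduct] FROM DISK = N'C:\Demo\OurProduct_Sample.bak' WITH REPLACE"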
We have one package in production. The variable var_date already has an expression defined on it. How can we overwrite this variable's value from a config file or from a cmd file? We don't want to make changes to the package and redeploy it.
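If the package is launched with dtexec, the /SET switch can push a value into the variable at run time without touching the deployed package; the package path and date below are placeholders. One caveat: if the variable's EvaluateAsExpression property is true, the expression can still win when the variable is evaluated, so this is a sketch to test rather than a guaranteed fix.

dtexec /F "C:\Packages\MyPackage.dtsx" /SET \Package.Variables[User::var_date].Properties[Value];"2008-01-31"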
Here is the situation. I have a DB on one system. I back it up and then restore it to a second system. I run reports off this second system, and I want to create custom views on it that do not exist on the original system. Can I restore the backup DB from the remote system without wiping out the custom views on the local system?
I have to do it this way because they won't let us create the views we want on the remote system, so the only way we have access to run the reports is by restoring the backup locally.
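A restore replaces the whole database, including any views created locally, so they would be wiped every time. One workaround is to keep the custom views in a separate local database and have them reference the restored database with three-part names; the reporting database then survives every restore. A sketch, with hypothetical names:

-- One-time setup: a local database that only holds the reporting views.
CREATE DATABASE ReportViews;
GO
USE ReportViews;
GO
-- The view points at the restored copy via a three-part name,
-- so restoring RestoredDb again does not touch this view.
CREATE VIEW dbo.vw_OpenOrders
AS
SELECT o.OrderID, o.CustomerID, o.OrderDate
FROM RestoredDb.dbo.Orders AS o
WHERE o.Status = 'Open';
GO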
I have 2 tables (both containing the same column names/datatypes), say table1 and table2. table1 is the most recent, but some rows were deleted by accident. table2 is a backup that has all the data we need, but some of it is old. What I want to do is overwrite the rows in table2 that also exist in table1 with the table1 rows, but leave as-is the rows in table2 that do not exist in table1. Both tables have a primary key, user_id.
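A joined UPDATE does exactly that: only rows whose user_id exists in both tables are touched, and everything else in table2 stays as it is. A sketch, with col1 standing in for whatever columns the tables share:

UPDATE t2
SET t2.col1 = t1.col1   -- repeat for each shared column
FROM table2 AS t2
INNER JOIN table1 AS t1
    ON t1.user_id = t2.user_id;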
Disk specs: IO/second = 130 per disk, speed = 15K RPM.
When I did a load test of inserting data into a table with four columns:
Col1 INT, Col2 VARCHAR(32), Col3 VARCHAR(4000), Col4 DATETIME
I could insert around 1044 rows per second, whereas I thought I could do a maximum of 520 inserts (130 * 4), because each disk can only take 130 inserts, and multiplied by 4 disks that gives me a theoretical limit of 520.
Also, how does Query Analyzer connect to the database server? Does it use ODBC?
I am doing a simple IO Test with the below script ...
Just wanted to keep things simple and check how many inserts I can do on a given SQL Server. I am running the below script from QA for 1 minute and then dividing the number of rows inserted by 60.
Will doing it this way give me approximate results?
Actually the data files (.MDF) are sitting on a single drive where the manufacturer's specs show it will handle 130 IOs per disk. With the below script I am getting around 147 inserts per second.
But my boss says that he is getting 2000 inserts per second on his laptop from a ... Am I missing something?
SET NOCOUNT ON
DECLARE @lnRowCnt INT
SELECT @lnRowCnt = 100000

WHILE @lnRowCnt > 0
BEGIN
    INSERT INTO CTMessages..Iotest
    SELECT @lnRowCnt, 'VENU', REPLICATE('V', 4000), 1000000   -- the int is implicitly converted to a DATETIME for Col4

    SELECT @lnRowCnt = @lnRowCnt - 1   -- decrement so the loop terminates
END
I'm using a SQLDataSource and trying to do two inserts into two different tables with one InsertCommand, but it's not working. Here's the code I'm trying to use. Do you see anything wrong with the syntax? I keep getting an error that says error near ',' but I can't figure out why. Thanks
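Without seeing the markup it's hard to be certain, but an error near ',' often means the two INSERT statements were joined with a comma. SQL Server will run two statements in one command text when they are separated by a semicolon, for example (hypothetical tables and parameters):

INSERT INTO Orders (CustomerId, OrderDate) VALUES (@CustomerId, @OrderDate);
INSERT INTO OrderAudit (CustomerId, Action) VALUES (@CustomerId, 'Created');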
I need to add a check in the database to ensure that a user can only enter up to 20 entries in a period of 10 minutes. Basically, to guard against people using scripts to add data to the database (instead of using a CAPTCHA on the front end), we want to restrict a user to entering at most 20 transactions in 10 minutes. How do I handle this in SQL Server 2005?
What I figure to do is, right after I do an INSERT into the table, select the last 20 entries in that same table, then calculate the total time it took to add those 20 transactions and set the right flag. 1) How do I select the last 20 entries in a table? 2) How do I calculate the total time that elapsed between adding the first of those 20 records and the last? Thanks in advance.
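Both questions need a column that records when each row was inserted (say CreatedDate) and ideally who inserted it. A sketch with hypothetical table and column names:

-- 1) The last 20 entries for a user, newest first
SELECT TOP 20 TransactionId, CreatedDate
FROM Transactions
WHERE UserId = @UserId
ORDER BY CreatedDate DESC;

-- 2) Seconds elapsed between the oldest and newest of those 20 rows
SELECT DATEDIFF(second, MIN(CreatedDate), MAX(CreatedDate)) AS ElapsedSeconds
FROM (SELECT TOP 20 CreatedDate
      FROM Transactions
      WHERE UserId = @UserId
      ORDER BY CreatedDate DESC) AS Last20;

A simpler equivalent check is to count the rows from the last 10 minutes (CreatedDate >= DATEADD(minute, -10, GETDATE())) and reject the insert when the count is already 20 or more.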
G'day, I have a table with a primary key that is a bigint and is set to auto-increment (or identity, or whatever MS calls it). Is there any way I can get the ID number that will be assigned by the next insert before I insert it? I want to use that ID number within another field when inserting.
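Reading the "next" identity ahead of time (for example IDENT_CURRENT plus the increment) is unreliable with concurrent inserts, because another session can grab that value first. A safer pattern is to insert, read SCOPE_IDENTITY(), and fill in the dependent column in the same transaction. A sketch with hypothetical names:

BEGIN TRANSACTION;

INSERT INTO MyTable (SomeColumn) VALUES (@SomeValue);

DECLARE @NewId BIGINT;
SET @NewId = SCOPE_IDENTITY();

-- Use the generated key inside another column of the same row.
UPDATE MyTable
SET OtherField = 'REF-' + CAST(@NewId AS VARCHAR(20))
WHERE Id = @NewId;

COMMIT TRANSACTION;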
Hi, I'm trying to create a form where new names can be added to a database. The webform looks like this:

<body MS_POSITIONING="GridLayout">
<form id="Form1" method="post" runat="server">
Name: <asp:TextBox ID="newName" runat="server" />
<INPUT id="NewUserBtn" type="button" value="Create New User" name="NewUserBtn" runat="server" onServerClick="NewBtn_Click">
</form>

And the code behind looks like this:

Public Sub NewBtn_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles NewUserBtn.ServerClick
    Dim DS As DataSet
    Dim MyConnection As SqlConnection
    Dim MyCommand As SqlDataAdapter
    MyConnection = New SqlConnection("server=databaseserver;database=db;uid=uid;pwd=pwd")
    MyCommand = New SqlDataAdapter("insert into certifications (name) values ('" & newName.Text & "'); select * from certifications", MyConnection)
    DS = New DataSet
    MyCommand.Fill(DS, "Titles")
    Response.Redirect("WebForm1.aspx", True)
End Sub

When I try to insert one name it works. When I try to insert a second name, it overwrites the old one. Why is that? Thanks. James
Hey all, I was trying to use a typed dataset to create a very simple DAL. I found that the code generated for the INSERT statement includes an identity field the table has. That can obviously never work (unless IDENTITY_INSERT is set, which it is not). My question is whether it is possible to control this INSERT statement generation. Is there a property I am missing somewhere? My solution was to change the INSERT statement on the DataTableAdapter, but it seems awkward to have to do that. Thanks, Yuval
I have a number of columns with predefined character lengths, but the user can input more from the GUI. I want to truncate the input automatically to the desired length and insert or update the database; right now it does not allow me to insert or update the values. Can I do this, and how? This is urgent.
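The failure is SQL Server refusing to silently shorten the data ("string or binary data would be truncated"); trimming explicitly before the statement runs avoids it. A sketch, assuming a hypothetical Notes column defined as VARCHAR(50):

UPDATE Customers
SET Notes = LEFT(@Notes, 50)   -- keep only the first 50 characters
WHERE CustomerId = @CustomerId;

INSERT INTO Customers (CustomerId, Notes)
VALUES (@CustomerId, LEFT(@Notes, 50));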
We have a 4-processor 350 MHz NT 4.0 SQL Server. Currently we have an application that is inserting rows one at a time; each row insert is a separate transaction. Currently we are averaging 2500 rows a second, each row 56 bytes wide. The data and the log are on one string of RAID disks. We plan to get another controller and RAID string to separate the data and the log onto separate controllers. The developer is modifying the application to insert the data in blocks. What is the impact on the transaction log? He seems to think that by inserting blocks of rows there would be less data going into the transaction log. Why would this be so? Does anyone have any information on practical limits for inserts and log truncation with similar machine configurations? He would like to try to get around 150,000 rows a second. Has anyone accomplished inserts at this rate? What type of machine configuration?
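Batching does not shrink the row data that gets logged, but committing many rows per transaction means far fewer synchronous log flushes (one per commit instead of one per row), which is usually where the throughput goes. A sketch of the idea, with a hypothetical table:

-- One commit per batch instead of one commit per row:
-- the log records are the same, but the log is flushed to disk far less often.
BEGIN TRANSACTION;

INSERT INTO Readings (SensorId, ReadingValue) VALUES (1, 10.5);
INSERT INTO Readings (SensorId, ReadingValue) VALUES (2, 11.2);
-- ... up to ~1000 inserts ...

COMMIT TRANSACTION;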
Hi, I have a procedure that I call in one database, and one of its steps is to write to a table in another database on the same server. The user exists in both databases, but I keep getting errors when I try to write to this second database. I know I can fix this by giving the user INSERT permission on the table in the second database, but I do not want to do that for security reasons. Any other ideas on how to accomplish this?
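One option on SQL Server 2005 is to let the procedure run under a different security context with EXECUTE AS, so the caller never needs direct INSERT permission on the other database. Crossing a database boundary this way needs extra trust (marking the calling database TRUSTWORTHY, or signing the procedure with a certificate), so treat this as a rough sketch with hypothetical names rather than a finished design:

-- In the calling database; OtherDb.dbo.AuditLog is a hypothetical target table.
CREATE PROCEDURE dbo.usp_WriteAudit
    @Message VARCHAR(512)
WITH EXECUTE AS OWNER   -- runs as the procedure owner, not as the caller
AS
BEGIN
    INSERT INTO OtherDb.dbo.AuditLog (LoggedAt, Message)
    VALUES (GETDATE(), @Message);
END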
I have a {date value} and a {frequency value}: 1 = yearly, 4 = quarterly, 12 = monthly.
I need to select an item, then check the frequency,
then do a loop insert based on the frequency.
if frequency = 1
insert item, date into table where date = {date value}
elseif frequency = 4
Per item -- insert 4 new entries:
insert item, date into table where date = {date value}
insert item, date into table where date = {date value + quarter}
insert item, date into table where date = {date value + 2 quarters}
insert item, date into table where date = {date value + 3 quarters}
Below is how I calculate quarterly values from a date in VB.NET; I just need to do the same within SQL:

Dim QuarterLoop As Integer
For QuarterLoop = 0 To 3
    FormatDateTime(DateAdd("q", QuarterLoop, MyDate), DateFormat.LongDate)
Next
elseif frequency = 12 -- per item, insert 12 new entries: insert 12 rows into the table, looping from the date in 12 increments of 1 month.
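DATEADD covers what DateAdd("q", ...) does in VB.NET, and a small WHILE loop can generate the yearly, quarterly or monthly rows. A sketch, with a hypothetical target table and the {date value} and {frequency value} treated as variables:

DECLARE @StartDate DATETIME, @Frequency INT, @ItemId INT
DECLARE @i INT, @MonthStep INT

SET @StartDate = '20080101'    -- {date value}
SET @Frequency = 4             -- 1 = yearly, 4 = quarterly, 12 = monthly
SET @ItemId    = 42            -- the selected item

SET @MonthStep = 12 / @Frequency   -- 12, 3 or 1 month between entries
SET @i = 0

WHILE @i < @Frequency
BEGIN
    INSERT INTO ItemSchedule (ItemId, ScheduleDate)
    VALUES (@ItemId, DATEADD(month, @i * @MonthStep, @StartDate))

    SET @i = @i + 1
END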
Currently I have huge amounts of data going into a table. I'm sending an XML doc and using OPENXML with a cursor to seed them.
The question I have is whether to let duplicate-keyed rows bounce and then check @@ERROR and do an update on the non-keyed fields, or to do a select on the keyed field and then do an insert or update based on the select's results.
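A third option avoids both the error bounce and the extra SELECT: try the UPDATE first and only INSERT when nothing matched. A sketch, against a hypothetical table keyed on KeyCol:

UPDATE TargetTable
SET NonKeyCol = @NonKeyVal
WHERE KeyCol = @KeyVal;

-- If no row was updated, the key is new, so insert it.
IF @@ROWCOUNT = 0
    INSERT INTO TargetTable (KeyCol, NonKeyCol)
    VALUES (@KeyVal, @NonKeyVal);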
I have to create dynamic insert statements for a table. For example, there are DevTableA and ProdTableA tables. I wrote a SQL query to get the new records added in DevTableA that are not in ProdTableA. The result gives me a list of rows. These tables have the columns 'LanguageID' and 'LText'.
The compare result has records only for LanguageID = 0. Once I see the compare result, I am supposed to create insert statements for LanguageID = 1, 2, 5 and 6 and update the LText for those languages. The LText for the other languages is in a spreadsheet.
Can anyone advise me how to create the insert statements from the compare result and add 4 more insert statements for LanguageID = 1, 2, 5 and 6 with their respective LText?
So far I thought I could create a #table. Looks like I need more than one #table.
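One way to avoid hand-writing the extra statements is to load the spreadsheet translations into a second temp table and join it to the compare result, so each missing row is generated once per LanguageID. A rough sketch, assuming the compare result is in #CompareResult and the spreadsheet has been imported into #Translations(SomeKey, LanguageID, LText); SomeKey stands in for whatever key joins the two:

INSERT INTO ProdTableA (SomeKey, LanguageID, LText)
SELECT c.SomeKey, t.LanguageID, t.LText
FROM #CompareResult AS c          -- rows found in Dev but not in Prod (LanguageID = 0)
JOIN #Translations AS t           -- spreadsheet text for LanguageID 1, 2, 5 and 6
    ON t.SomeKey = c.SomeKey
WHERE t.LanguageID IN (1, 2, 5, 6);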
I'm trying to perform a bulk insert as shown below. It's problematic b/c it's not updating the identity fields correctly and we're getting dups. I think, but I'm not sure, that On Update Cascade would solve all this, b/c we wouldn't have to concern ourselves with even touching the identity fields, b/c they would be autogenerated. Can someone shed some light?? I'm pretty confused.
CREATE PROCEDURE AddMiamirecords AS
BEGIN TRANSACTION
--USERS INSERT INTO [Undex_Production].[dbo].[USERS]([LastName], [UserName], [EmailAddress], [Address1], [WorkPhone], [Company], [CompanyWebsite], [pword], [IsAdmin], [IsRestricted],[AdvertiserAccountID]) SELECT dbo.fn_ReplaceTags (convert (varchar (8000),Advertisername)), [AdvertiserEmail], [AdvertiserEmail],[AdvertiserAddress], [AdvertiserPhone], [AdvertiserCompany], [AdvertiserURL], [AccountNumber],'3',0, [AccountNumber] FROM Miami WHERE not exists (select * from users Where users.Username = miami.AdvertiserEmail) AND validAD=1
--PROPERTY INSERT INTO [Undex_Production].[dbo].[Property]([ListDate],[CommunityName],[TowerName],[PhaseName],[Unit], [Address1], [City], [State], [Zip],[IsActive],[AdPrintId]) SELECT [FirstInsertDate],[PropertyBuilding],[PropertyStreetAddress],PropertyCity + ' ' + PropertyState + ' ' + PropertyZipCode as PhaseName,[PropertyUnitNumber],[PropertyStreetAddress],[PropertyCity], [PropertyState], [PropertyZipCode],'0',[AdPrintId] FROM [Undex_Production].[dbo].[miami] WHERE miami.AdvertiserEmail IS NOT NULL AND validAD=1
--ITEM INSERT INTO [Undex_Production].[dbo].[ITEM] ([SellerID],[Price],[StartDate],[EndDate], [HomePageFeatured],[Classified],[IsClosed]) SELECT USERS.UserID, miami.PropertyPrice, convert(datetime,miami.FirstInsertDate), dateadd(day, 30, miami.FirstInsertDate)as EndDate, 1, convert (int,AdNumber) as Classified, 0 FROM USERS RIGHT OUTER JOIN miami ON USERS.UserName = miami.AdvertiserEmail WHERE validAD=1
--PROPERTYITEM INSERT INTO [Undex_Production].[dbo].[propertyItem]( [propertyId], [ItemId]) SELECT Property.propertyId, ITEM.ItemID FROM ITEM RIGHT OUTER JOIN miami ON ITEM.StartDate = miami.FirstInsertDate AND ITEM.Price = miami.PropertyPrice AND ITEM.Classified = convert(int,miami.AdNumber) LEFT OUTER JOIN Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1 WHERE validAD=1
--CONDOFEATURES INSERT INTO [Undex_Production].[dbo].[CondoFeatures](PropertyId,[Bedrooms], [Area], [PropertyDescription], [Bathrooms], [NumOfFloors]) SELECT Property.propertyId, [PropertyBedrooms], [PropertySquareFeet], dbo.fn_ReplaceTags (convert (varchar (8000),PropertyDescription)), [PropertyBathrooms], [PropertyTotalFloors] FROM miami LEFT OUTER JOIN Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1 WHERE validAD=1
--COMMUNITY FEATURES INSERT INTO [Undex_Production].[dbo].[CommunityFeatures](PropertyId,[totalFloors],isComplete1) SELECT Property.propertyId, miami.propertyTotalFloors,'0' as IsComplete FROM miami LEFT OUTER JOIN Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1 WHERE validAD=1
--UNITDISCLOSURES INSERT INTO [Undex_Production].[dbo].[UnitDisclosures]([propertyId],[monthcondoasso]) SELECT Property.propertyId, [propertyassocfee] FROM miami LEFT OUTER JOIN Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1 WHERE validAD=1
--BROKERDEVELOPER INSERT INTO [Undex_Production].[dbo].[BrokerDeveloper]([IsFSBO],[FSBOName], [FSBOEmail],[FSBOWebsite],[IsDeveloper],[DeveloperName],[DeveloperWebsite],[IsBroker],[BrokerName],[BrokerageWebsite], [propertyId],[brokercommission],[isComplete])SELECT CASE AdvertiserType when 'FSBO' THEN 1 else 0 end, CASE AdvertiserType when 'FSBO' THEN [AdvertiserName] else NULL end, CASE AdvertiserType when 'FSBO' THEN [AdvertiserEmail] else NULL end, CASE AdvertiserType when 'FSBO' THEN [AdvertiserURL] else NULL end, CASE AdvertiserType when 'Developer' THEN 1 else 0 end, CASE AdvertiserType when 'Developer' THEN [AdvertiserName] else NULL end, CASE AdvertiserType when 'Developer' THEN [AdvertiserURL] else NULL end, CASE AdvertiserType when 'Realtor' THEN 1 when 'Broker' THEN 1 else 0 end, CASE AdvertiserType when 'Realtor' THEN [AdvertiserName] when 'Broker' THEN [AdvertiserName] else NULL end, CASE AdvertiserType when 'Realtor' THEN [AdvertiserURL] when 'Broker' THEN [AdvertiserName] else NULL end, Property.propertyId,[PropertyCommBroker],'0' as IsComplete FROM miami LEFT OUTER JOIN Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1 WHERE validAD=1 IF @@ERROR <> 0 BEGIN ROLLBACK TRAN RETURN END COMMIT TRANSACTION GO
Is there any easy way I can take a SELECT statement (such as select from payments where datetime > '20071122') and output SQL INSERT statements for those records?
I basically need to move a specific set of records from one SQL server to another (both SQL Server 2005). Any suggestions for the best way to do this?
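If a linked server isn't available, one quick-and-dirty approach is to let a SELECT build the INSERT statements as text and then run the output on the other server. A sketch, assuming the payments table has just an id, an amount and the datetime column from the question:

SELECT 'INSERT INTO payments (id, amount, [datetime]) VALUES ('
       + CAST(id AS VARCHAR(20)) + ', '
       + CAST(amount AS VARCHAR(20)) + ', '''
       + CONVERT(VARCHAR(23), [datetime], 121) + ''');'
FROM payments
WHERE [datetime] > '20071122';

Run it with results-to-text and copy the generated statements to the target server.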
Hello everybody,
Just a short question: I have tables which are only log tables (very rarely used for selects), but there is a lot of writing. I would like to have as much speed as possible when writing data into these tables.

create table [tbl] (
[IDX] [numeric](18, 0) IDENTITY (1, 1) NOT NULL ,
[Time_Stamp] [datetime] NOT NULL ,
[Source] [varchar] (64) COLLATE Latin1_General_CI_AS NULL ,
[Type] [varchar] (16) COLLATE Latin1_General_CI_AS NULL ,
[MsgText] [varchar] (512) COLLATE Latin1_General_CI_AS NULL ,
CONSTRAINT [tbl] PRIMARY KEY NONCLUSTERED ([IDX]) ON [PRIMARY]
) ON [PRIMARY]
GO

Question: Is it better for inserts to remove the PK but leave the identity? How can I make this table optimized for writing? If I set the fill factor of the table to 0%, will I gain much? One more piece of information: old data will be deleted from this table depending on row count (the oldest IDs will be deleted each night).
Thank you in advance,
Mateusz
According to the MySQL manual, multiple inserts can be sped up by locking the table before doing them and unlocking the table afterwards. Is the same true of multiple inserts in SQL Server? If so, how would the table be locked? Any insights would be appreciated - thanks!
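SQL Server has no LOCK TABLES statement, but a comparable effect comes from wrapping the inserts in one transaction and asking for a table-level lock on the insert with a hint. A sketch (table and columns are hypothetical):

BEGIN TRANSACTION;

-- TABLOCK takes a table-level lock for the insert, held until the commit.
INSERT INTO LogEntries WITH (TABLOCK) (Source, MsgText)
VALUES ('loader', 'first row');

INSERT INTO LogEntries WITH (TABLOCK) (Source, MsgText)
VALUES ('loader', 'second row');

COMMIT TRANSACTION;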
Dear Experts,
What is the best way to do a large insert WITHOUT having direct access to the machine SQL Server is running on? For example, imagine I want to insert something like 20,000 records. If I were to have access to the server, I could BULK INSERT into a temp table and then insert into the destination table. But if I can't create a file on the server to use for BULK INSERT, what is the next best alternative to doing lots of 1-record insert statements?
Thanks,
-Emin
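One alternative that works purely over the client connection is to send the rows as a single XML parameter and shred it on the server, so the 20,000 rows arrive in one round trip instead of 20,000. A rough sketch (procedure name, target table, and XML shape are all hypothetical):

CREATE PROCEDURE dbo.usp_BulkLoadRows
    @RowsXml NTEXT
AS
BEGIN
    DECLARE @hDoc INT;
    EXEC sp_xml_preparedocument @hDoc OUTPUT, @RowsXml;

    -- Shred <row id="..." name="..."/> elements straight into the target table.
    INSERT INTO TargetTable (Id, Name)
    SELECT Id, Name
    FROM OPENXML(@hDoc, '/rows/row', 1)
         WITH (Id INT '@id', Name VARCHAR(100) '@name');

    EXEC sp_xml_removedocument @hDoc;
END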
Is there any known SQL Server bug whereby a record can be successfully inserted and committed, but then later be found not to be in the database? For example, if there was a server crash just after the commit, could committed data be lost? I'm sure the answer must be "no", but a client is telling me this is happening, and I said I'd enquire.