I've got a program that has a form. On the form there are several optional fields which I would prefer to remain NULL in the database (rather than taking a default value) if the user doesn't fill them out. I'm using stored procedures for my inserts (SQL 2000 + C#).
How do most people handle these situations? Right now I have a separate function which receives the parameter values as params and then handles creating the parameters and executing the stored procedure. I'm not sure how to indicate that I want one of those to be NULL, at least not without having multiple functions and possibly various stored procedures for the different possibilities.
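One common pattern (a sketch only, since your DAL layout isn't shown; the procedure and column names are hypothetical) is to give the optional stored procedure parameters a default of NULL, so a single procedure covers every combination of filled and empty fields:

-- Optional parameters default to NULL, so one procedure handles
-- every combination of filled/empty form fields.
CREATE PROCEDURE dbo.InsertPerson
    @FirstName  varchar(50),
    @MiddleName varchar(50) = NULL,  -- optional
    @Nickname   varchar(50) = NULL   -- optional
AS
INSERT INTO Person (FirstName, MiddleName, Nickname)
VALUES (@FirstName, @MiddleName, @Nickname)

On the C# side, your single parameter-building function can set a parameter's Value to DBNull.Value for an empty field (or simply omit the parameter and let the default apply), so NULL reaches the table without multiple functions or procedures.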
I'm working on a new project using SSIS in SQL Server 2005 and have an issue that I need resolved.
I'm loading an XML file into SSIS using the XML Source Adapter. This maps to an OLE DB destination which writes the data into a SQL table (please see below for the table structure). Between these stages is a Data Conversion transformation to convert the Unicode characters from the XML into non-Unicode characters.
I am in the process of converting DB2 mainframe data to SQL2005. During the conversion I ran into an issue with the DB2OLEDB provider not handling strings with embedded nulls. With the help of the Microsoft Tech support folks I was able to get a fix for the x64 DB2OLEDB provider to handle strings with embedded nulls.
The problem now, however, is that when the data is copied to the pipeline buffer it is truncated at the first null regardless of the DT_STR length. I have read that .NET is supposed to handle embedded nulls in strings, but I am not sure what I need to do to get SSIS packages to handle this situation.
I know the data is correct when I preview the query in the OLE DB provider within SSIS, but as soon as it is passed to the SQL connector, a Script Component, or a Data Conversion component, the string is truncated at the first occurrence of the embedded null.
I also tried doing a straight copy from the Data provider to a flat file, but the strings are once again truncated.
Has anyone else experienced similar problems or found a resolution to this type of problem? I am getting down to crunch time on this conversion project, and any help would be most appreciated.
When I do a select on my employee table for rows with a NULL idCompany, I don't get any records.

I then try to modify the table to not allow a NULL idCompany, and I get this error message:
'Employee (aMgmt)' table - Unable to modify table. Cannot insert the value NULL into column 'idCompany', table 'D2.aMgmt.Tmp_Employee'; column does not allow nulls. INSERT fails. The statement has been terminated.
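A likely culprit, assuming the select was written with an equality test (the original query isn't shown): under ANSI_NULLS, idCompany = NULL never matches anything; NULL rows have to be found with IS NULL. Those are also the rows blocking the NOT NULL change:

-- = NULL matches nothing under ANSI_NULLS; use IS NULL instead.
SELECT *
FROM Employee
WHERE idCompany IS NULL

-- These rows must be given a value (or removed) before the column
-- can be altered to disallow NULLs, e.g.:
UPDATE Employee SET idCompany = @SomeCompanyId WHERE idCompany IS NULL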
Is there an elegant way to use one equals sign in a WHERE clause that returns true when both arguments are NULL, returns true when neither is NULL but both are equal, and returns false when only one is NULL?
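T-SQL has no NULL-safe equality operator, so a single equals sign can't do it; the standard workaround is a compound predicate. A minimal sketch, with hypothetical table and column names:

-- True when both are NULL, or both non-NULL and equal;
-- false when exactly one is NULL.
SELECT *
FROM SomeTable
WHERE (a = b)
   OR (a IS NULL AND b IS NULL)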
I have two SSIS packages that import from the same flat file into the same SQL 2005 table. I have one flat file connection (to a comma delimited file) and one OLE DB connection (to a SQL 2005 Database). Both packages use these same two Connection Managers. The SQL table allows NULL values for all fields. The flat file has "empty values" (i.e., ,"", ) for certain columns.
The first package uses the Data Flow Task with the "Keep nulls" property of the OLE DB Destination Editor unchecked. The columns in the source and destination are identically named thus the mapping is automatically assigned and is mapped based on ordinal position (which is equivalent to the mapping using Bulk Insert). When this task is executed no null values are inserted into the SQL table for the "empty values" from the flat file. Empty string values are inserted instead of NULL.
The second package uses the Bulk Insert Task with the "KeepNulls" property for the task (shown in the Properties pane when the task is selected in the Control Flow window) set to "False". When the task is executed, NULL values are inserted into the SQL table for the "empty values" from the flat file.
So using the Data Flow Task, "" (i.e., an empty string) is inserted; using the Bulk Insert Task, NULL is inserted (i.e., nothing is stored, the value for the field is omitted).
I want to have the exact same behavior on my data in the Bulk Insert Task as I do with the Data Flow Task.
Using the Bulk Insert Task, what must I do to have the Empty String values inserted into the SQL table where there is an "empty value" in the flat file? Why & how does this occur automatically in the Data Flow Task?
From a SQL Profiler trace comparison of the two methods, I do not see where the syntax of the insert command or the statements for the preceding captured steps dictates this change in behavior for the inserted "" value. Please help me understand what is going on here and how to accomplish this using the Bulk Insert Task.
Disk specs: 130 IOs/second per disk, 15K RPM.
When I did a load test of inserting data into a table with four Columns
Col1 INT
Col2 VARCHAR(32)
Col3 VARCHAR(4000)
Col4 DATETIME
I could insert around 1044 rows per second, whereas I thought I could do a maximum of 520 inserts per second (130 * 4), because each disk can only take 130 inserts, and multiplied by 4 disks that gives a theoretical limit of 520.

Also, how does Query Analyzer connect to the database server? Does it use ODBC?
I am doing a simple IO Test with the below script ...
Just wanted to keep things simple and check how many inserts I can do on a given SQL Server. I am running the below script from QA for 1 minute and then dividing the number of rows inserted by 60.

Will this give me approximate results?
Actually the data files (.MDF) are sitting on a single drive, where the manufacturer's specs say it will handle 130 IOs per disk. With the below script I am getting around 147 inserts per second.

But my boss says that he is getting 2000 inserts per second on his laptop from a ... Am I missing something?
SET NOCOUNT ON
DECLARE @lnRowCnt INT
SELECT @lnRowCnt = 100000

WHILE @lnRowCnt > 0
BEGIN
    -- one single-row insert per loop iteration
    INSERT INTO CTMessages..Iotest
    SELECT @lnRowCnt, 'VENU', REPLICATE('V', 4000), 1000000

    SELECT @lnRowCnt = @lnRowCnt - 1   -- decrement the loop counter
END
I'm using a SqlDataSource and trying to do two inserts into two different tables with one InsertCommand, but it's not working. Here's the code I'm trying to use. Do you see anything wrong with the syntax? I keep getting an error that says "error near ','", but I can't figure out why. Thanks.
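Since the command itself didn't come through, this is only a guess: an error near ',' often means the two statements were joined with a comma. T-SQL statements in one InsertCommand batch should be separated by a semicolon. A minimal sketch with hypothetical tables and parameters:

-- Two inserts in one InsertCommand, separated by a semicolon (not a comma)
INSERT INTO TableA (Name) VALUES (@Name);
INSERT INTO TableB (Name, CreatedBy) VALUES (@Name, @CreatedBy);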
I need to add a check in the database to ensure that a user can only enter up to 20 entries to the database in a period of 10 minutes. Basically, to guard against people using scripts to add data to the database (instead of using a CAPTCHA on the front end), we want to restrict a user to entering at most 20 transactions in 10 minutes. How do I do this in SQL Server 2005?

What I figure to do is, right after the INSERT into the table, select the last 20 entries in that same table, calculate the total time it took to add those 20 transactions, and set the right flag. 1) How do I select the last 20 entries in a table? 2) How do I calculate the total time that elapsed between adding the first of those 20 records and the last? Thanks in advance.
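A sketch of both steps, assuming a table with an identity column Id, a CreatedAt datetime column, and a UserId column (all hypothetical names):

-- 1) Last 20 entries, newest first:
SELECT TOP 20 Id, CreatedAt
FROM Transactions
WHERE UserId = @UserId
ORDER BY Id DESC

-- 2) Seconds elapsed between the oldest and newest of those 20 rows:
SELECT DATEDIFF(second, MIN(CreatedAt), MAX(CreatedAt)) AS ElapsedSeconds
FROM (SELECT TOP 20 CreatedAt
      FROM Transactions
      WHERE UserId = @UserId
      ORDER BY Id DESC) AS Last20

If ElapsedSeconds is under 600 (and 20 rows actually exist), the user has hit 20 inserts within 10 minutes and you can set your flag.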
G'day, I have a table with a primary key that is a bigint set to auto-increment (or identity, or whatever MS calls it). Is there any way I can get the ID number that will be assigned by the next insert, before I insert it? I want to use that ID number within another field when the row is inserted.
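There's no reliable way to read the next identity value before the insert; another connection could take it first. The usual pattern is to insert, capture SCOPE_IDENTITY(), and then fill the other field. A minimal sketch (table and column names are hypothetical):

DECLARE @NewId bigint

INSERT INTO MyTable (SomeColumn) VALUES ('value')
SET @NewId = SCOPE_IDENTITY()   -- identity just assigned in this scope

-- Use the captured ID in the other field:
UPDATE MyTable SET OtherField = @NewId WHERE Id = @NewId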
Hi, I'm trying to create a form where new names can be added to a database. The webform looks like this:

<body MS_POSITIONING="GridLayout">
<form id="Form1" method="post" runat="server">
Name: <asp:TextBox ID="newName" runat="server" />
<INPUT id="NewUserBtn" type="button" value="Create New User" name="NewUserBtn" runat="server" onServerClick="NewBtn_Click">
</form>

And the code behind looks like this:

Public Sub NewBtn_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles NewUserBtn.ServerClick
    Dim DS As DataSet
    Dim MyConnection As SqlConnection
    Dim MyCommand As SqlDataAdapter
    MyConnection = New SqlConnection("server=databaseserver;database=db;uid=uid;pwd=pwd")
    MyCommand = New SqlDataAdapter("insert into certifications (name) values ('" & newName.Text & "'); select * from certifications", MyConnection)
    DS = New DataSet
    MyCommand.Fill(DS, "Titles")
    Response.Redirect("WebForm1.aspx", True)
End Sub

When I try to insert one name it works. When I try to insert a second name, it overwrites the old one. Why is that? Thanks. James
Hey all, I was trying to use a typed dataset to create a very simple DAL. I found that the code generated for the INSERT statement includes the identity field the table has. That can obviously never work (unless IDENTITY_INSERT is set, which it is not). My question is whether it is possible to control this INSERT statement generation. Is there a property I am missing somewhere? My workaround was to change the INSERT statement on the DataTableAdapter, but it seems awkward to have to do that. Thanks, Yuval
I have a number of columns with a predefined character length, but a user can input more from the GUI. I want to truncate the input automatically to the desired length and then insert or update the database; right now it does not allow me to update or insert the values. Can I do this, and how? This is urgent.
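One option, sketched with hypothetical names, is to truncate in the statement itself with LEFT(), cutting the input down to the column's defined length:

-- Truncate the incoming value to the column's length (e.g. 50) on the way in:
INSERT INTO MyTable (ShortCol) VALUES (LEFT(@UserInput, 50))

UPDATE MyTable
SET ShortCol = LEFT(@UserInput, 50)
WHERE Id = @Id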
We have a 4-processor 350 MHz NT 4.0 SQL Server. Currently we have an application that is inserting rows one at a time; each row insert is a separate transaction. We are averaging 2,500 rows a second, each row 56 bytes wide. The data and the log are on one string of RAID disks. We plan to get another controller and RAID string to separate the data and the log onto separate controllers. The developer is modifying the application to insert the data in blocks. What is the impact on the transaction log? He seems to think that by inserting rows in blocks there would be less data going into the transaction log. Why would this be so? Does anyone have any information on practical limits for inserts and log truncation with similar machine configurations? He would like to get to around 150,000 rows a second. Has anyone accomplished inserts at this rate? With what type of machine configuration?
Hi, I have a small web application managing complaints. During multiuser testing we noticed that when complaints were added at "exactly" the same time, one complaint text seemed to be overwriting the other, and the current max value of the table ID was returned as the current complaint number.

I tested in my development environment and was able to recreate it reasonably easily (1 go out of 3 recreated the issue). The ID column itself is an auto-increment (primary key), so I can't think of a conceivable reason why one record should overwrite another. I should say that I am assuming the record is overwritten; perhaps there is a clash and one complaint is ignored by the database.
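A hedged guess, since the data-access code isn't shown: if the application reads the new complaint number back with SELECT MAX(id) or @@IDENTITY after the insert, two simultaneous inserts can each read the same value, and the second save then lands on the first row. Reading the identity from the insert's own scope avoids the race:

-- Read the identity generated by this session/scope only;
-- MAX(id) and @@IDENTITY are unsafe under concurrent inserts.
INSERT INTO Complaints (ComplaintText) VALUES (@Text)   -- hypothetical table
SELECT SCOPE_IDENTITY() AS NewComplaintId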
Hi, I have a procedure that I call on one database, and one of its steps is to write to a table on another database on the same server. The user exists in both databases, but I keep getting errors when I try to write to this second database. I know I can fix this by giving the user insert permissions on the table in the second database, but I do not want that for security reasons. Any other ideas on how to accomplish this?
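If this is SQL Server 2005 or later (an assumption), one option is to have the procedure impersonate a more privileged context with EXECUTE AS, so callers only need EXECUTE on the procedure and never direct table permissions. A sketch with hypothetical names:

-- Runs as the procedure's owner instead of the caller:
CREATE PROCEDURE dbo.WriteToOtherDb
WITH EXECUTE AS OWNER
AS
BEGIN
    INSERT INTO OtherDb.dbo.AuditLog (Msg) VALUES ('logged')
END

Note that for the impersonated context to reach across databases, the source database generally has to be marked TRUSTWORTHY (or the procedure signed with a certificate); cross-database ownership chaining is the other route, with its own configuration requirements.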
I have a {date value} and a {frequency value}: 1 = yearly, 4 = quarterly, 12 = monthly.

I need to select an item, then check the frequency,

then do a loop insert based on the frequency:
if frequency = 1
insert item, date into table where date = {date value}
elseif frequency = 4
per item -- insert 4 new entries:
insert item, date into table where date = {date value}
insert item, date into table where date = {date value + 1 quarter}
insert item, date into table where date = {date value + 2 quarters}
insert item, date into table where date = {date value + 3 quarters}
' Below is how I calculate quarterly values from a date in VB.NET; I just need to do the same within SQL:

Dim QuarterLoop As Integer
For QuarterLoop = 0 To 3
    FormatDateTime(DateAdd("q", QuarterLoop, MyDate), DateFormat.LongDate)
Next
elseif frequency = 12 -- per item, insert 12 new entries: insert 12 items into the table, looping from the date in 12 increments of 1 month (see the T-SQL sketch below).
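A T-SQL sketch of the whole branch, using DATEADD the way the VB.NET snippet uses DateAdd; the table, column, and variable names are assumptions:

-- Inserts 1, 4, or 12 rows depending on @Frequency (1, 4, or 12):
DECLARE @i int
SET @i = 0

WHILE @i < @Frequency
BEGIN
    INSERT INTO Schedule (Item, ScheduleDate)
    SELECT @Item,
           CASE @Frequency
               WHEN 1  THEN @DateValue
               WHEN 4  THEN DATEADD(quarter, @i, @DateValue)
               WHEN 12 THEN DATEADD(month, @i, @DateValue)
           END
    SET @i = @i + 1
END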
Currently I have huge amounts of data going into a table. I'm sending an XML doc and using OPENXML with a cursor to seed the rows.
The question I have is whether to let duplicate-keyed rows fail the insert, check @@ERROR, and then do an update on the non-key fields, or to do a select on the key first and then do an insert or update based on the select's results.
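Both work; on SQL Server 2000 (no MERGE yet) the select-first idiom is the more common of the two. A minimal sketch with hypothetical names:

-- Upsert: update when the key exists, otherwise insert.
IF EXISTS (SELECT 1 FROM Target WHERE KeyCol = @Key)
    UPDATE Target SET DataCol = @Data WHERE KeyCol = @Key
ELSE
    INSERT INTO Target (KeyCol, DataCol) VALUES (@Key, @Data)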
I have to create dynamic insert statements for a table. For example, there are DevTableA and ProdTableA tables. I wrote a SQL query to get the new records added to DevTableA that are not in ProdTableA. The result gives me a list of rows. These tables have the columns 'LanguageID' and 'LText'.

The compare result has records only for LanguageID = 0. Once I see the compare result, I am supposed to create insert statements for LanguageID = 1, 2, 5 and 6 and update the LText for those languages. The LText for the other languages is in a spreadsheet.

Can anyone advise me how to create the insert statements from the compare result and add 4 more insert statements for LanguageID = 1, 2, 5 and 6 with their respective LText?
So far I figure I can create a #table. It looks like I need more than one #table.
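One hedged approach, since the exact schema isn't shown: load the spreadsheet text into a second temp table keyed by row and language, then join it to the compare result to generate the extra rows:

-- #Compare holds the new DevTableA rows (LanguageID = 0);
-- #Translations holds the spreadsheet text (both layouts are assumptions).
CREATE TABLE #Translations (KeyCol int, LanguageID int, LText nvarchar(4000))
-- ...load the spreadsheet rows for LanguageID 1, 2, 5 and 6 here...

INSERT INTO ProdTableA (KeyCol, LanguageID, LText)
SELECT t.KeyCol, t.LanguageID, t.LText
FROM #Compare AS c
JOIN #Translations AS t ON t.KeyCol = c.KeyCol
WHERE t.LanguageID IN (1, 2, 5, 6)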
I'm trying to perform a bulk insert as shown below. It's problematic b/c it's not updating the identity fields correctly and we're getting dups. I think, but I'm not sure, that On Update Cascade would solve all this, b/c we wouldn't have to concern ourselves with even touching the identity fields, b/c they would be autogenerated. Can someone shed some light?? I'm pretty confused.
CREATE PROCEDURE AddMiamirecords AS
BEGIN TRANSACTION
--USERS INSERT INTO [Undex_Production].[dbo].[USERS]([LastName], [UserName], [EmailAddress], [Address1], [WorkPhone], [Company], [CompanyWebsite], [pword], [IsAdmin], [IsRestricted],[AdvertiserAccountID]) SELECT dbo.fn_ReplaceTags (convert (varchar (8000),Advertisername)), [AdvertiserEmail], [AdvertiserEmail],[AdvertiserAddress], [AdvertiserPhone], [AdvertiserCompany], [AdvertiserURL], [AccountNumber],'3',0, [AccountNumber] FROM Miami WHERE not exists (select * from users Where users.Username = miami.AdvertiserEmail) AND validAD=1
--PROPERTY INSERT INTO [Undex_Production].[dbo].[Property]([ListDate],[CommunityName],[TowerName],[PhaseName],[Unit], [Address1], [City], [State], [Zip],[IsActive],[AdPrintId]) SELECT [FirstInsertDate],[PropertyBuilding],[PropertyStreetAddress],PropertyCity + ' ' + PropertyState + ' ' + PropertyZipCode as PhaseName,[PropertyUnitNumber],[PropertyStreetAddress],[PropertyCity], [PropertyState], [PropertyZipCode],'0',[AdPrintId] FROM [Undex_Production].[dbo].[miami] WHERE miami.AdvertiserEmail IS NOT NULL AND validAD=1
--ITEM INSERT INTO [Undex_Production].[dbo].[ITEM] ([SellerID],[Price],[StartDate],[EndDate], [HomePageFeatured],[Classified],[IsClosed]) SELECT USERS.UserID, miami.PropertyPrice, convert(datetime,miami.FirstInsertDate), dateadd(day, 30, miami.FirstInsertDate)as EndDate, 1, convert (int,AdNumber) as Classified, 0 FROM USERS RIGHT OUTER JOIN miami ON USERS.UserName = miami.AdvertiserEmail WHERE validAD=1
--PROPERTYITEM INSERT INTO [Undex_Production].[dbo].[propertyItem]( [propertyId], [ItemId]) SELECT Property.propertyId, ITEM.ItemID FROM ITEM RIGHT OUTER JOIN miami ON ITEM.StartDate = miami.FirstInsertDate AND ITEM.Price = miami.PropertyPrice AND ITEM.Classified = convert(int,miami.AdNumber) LEFT OUTER JOIN Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1 WHERE validAD=1
--CONDOFEATURES INSERT INTO [Undex_Production].[dbo].[CondoFeatures](PropertyId,[Bedrooms], [Area], [PropertyDescription], [Bathrooms], [NumOfFloors]) SELECT Property.propertyId, [PropertyBedrooms], [PropertySquareFeet], dbo.fn_ReplaceTags (convert (varchar (8000),PropertyDescription)), [PropertyBathrooms], [PropertyTotalFloors] FROM miami LEFT OUTER JOIN Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1 WHERE validAD=1
--COMMUNITY FEATURES INSERT INTO [Undex_Production].[dbo].[CommunityFeatures](PropertyId,[totalFloors],isComplete1) SELECT Property.propertyId, miami.propertyTotalFloors,'0' as IsComplete FROM miami LEFT OUTER JOIN Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1 WHERE validAD=1
--UNITDISCLOSURES INSERT INTO [Undex_Production].[dbo].[UnitDisclosures]([propertyId],[monthcondoasso]) SELECT Property.propertyId, [propertyassocfee] FROM miami LEFT OUTER JOIN Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1 WHERE validAD=1
--BROKERDEVELOPER INSERT INTO [Undex_Production].[dbo].[BrokerDeveloper]([IsFSBO],[FSBOName], [FSBOEmail],[FSBOWebsite],[IsDeveloper],[DeveloperName],[DeveloperWebsite],[IsBroker],[BrokerName],[BrokerageWebsite], [propertyId],[brokercommission],[isComplete])SELECT CASE AdvertiserType when 'FSBO' THEN 1 else 0 end, CASE AdvertiserType when 'FSBO' THEN [AdvertiserName] else NULL end, CASE AdvertiserType when 'FSBO' THEN [AdvertiserEmail] else NULL end, CASE AdvertiserType when 'FSBO' THEN [AdvertiserURL] else NULL end, CASE AdvertiserType when 'Developer' THEN 1 else 0 end, CASE AdvertiserType when 'Developer' THEN [AdvertiserName] else NULL end, CASE AdvertiserType when 'Developer' THEN [AdvertiserURL] else NULL end, CASE AdvertiserType when 'Realtor' THEN 1 when 'Broker' THEN 1 else 0 end, CASE AdvertiserType when 'Realtor' THEN [AdvertiserName] when 'Broker' THEN [AdvertiserName] else NULL end, CASE AdvertiserType when 'Realtor' THEN [AdvertiserURL] when 'Broker' THEN [AdvertiserName] else NULL end, Property.propertyId,[PropertyCommBroker],'0' as IsComplete FROM miami LEFT OUTER JOIN Property ON miami.PropertyUnitNumber = Property.Unit AND miami.PropertyZipCode = Property.Zip AND miami.PropertyCity = Property.City AND miami.PropertyStreetAddress = Property.Address1 WHERE validAD=1 IF @@ERROR <> 0 BEGIN ROLLBACK TRAN RETURN END COMMIT TRANSACTION GO
Is there any easy way I can take a select statement (such as select * from payments where datetime > '20071122') and output a SQL insert statement for those records?

I basically need to move a specific set of records from one SQL Server to another (both SQL Server 2005). Any suggestions for the best way to do this?
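One approach is to generate the INSERT statements with string concatenation and run the output on the target server; a linked server would let you insert across servers directly instead. A sketch of the string-building route, assuming payments has columns id, amount, and datetime (an assumed schema):

-- Emits one INSERT statement per qualifying row:
SELECT 'INSERT INTO payments (id, amount, [datetime]) VALUES ('
     + CONVERT(varchar(20), id) + ', '
     + CONVERT(varchar(20), amount) + ', '''
     + CONVERT(varchar(23), [datetime], 121) + ''');'
FROM payments
WHERE [datetime] > '20071122'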
Hello everybody,
Just a short question: I have tables which are used as log tables only (very rarely used for selects), but there is a lot of writing. I would like to have as much speed as possible when writing data into these tables.

create table [tbl] (
[IDX] [numeric](18, 0) IDENTITY (1, 1) NOT NULL,
[Time_Stamp] [datetime] NOT NULL,
[Source] [varchar] (64) COLLATE Latin1_General_CI_AS NULL,
[Type] [varchar] (16) COLLATE Latin1_General_CI_AS NULL,
[MsgText] [varchar] (512) COLLATE Latin1_General_CI_AS NULL,
CONSTRAINT [tbl] PRIMARY KEY NONCLUSTERED ([IDX]) ON [PRIMARY]
) ON [PRIMARY]
GO

Question: Is it better for inserts to remove the PK but keep the identity? How can I make this table optimized for writing? If I set the fill factor of the table to 0%, will I win much?

One more piece of information: old data will be deleted from this table each night, depending on row count (the oldest IDs will be deleted each night).

Thank you in advance,
Mateusz
According to the MySQL manual, multiple inserts can be sped up by locking the table before doing them and unlocking it afterwards. Is the same true of multiple inserts in MS SQL? If so, how would the table be locked? Any insights would be appreciated - thanks!
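SQL Server has no LOCK TABLES statement; the nearest equivalent (a hedged suggestion, not a universal recommendation) is a TABLOCK hint inside an explicit transaction, which holds a table-level lock for the statement and cuts locking overhead on big inserts:

-- Table names are hypothetical.
BEGIN TRAN
INSERT INTO LogTable WITH (TABLOCK) (Msg)
SELECT Msg FROM StagingTable
COMMIT TRAN

Batching many inserts into one transaction usually helps more than the lock itself, since it avoids a log flush per row.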
Dear Experts,
What is the best way to do a large insert WITHOUT having direct access to the machine SQL Server is running on? For example, imagine I want to insert something like 20,000 records. If I were to have access to the server, I could BULK INSERT into a temp table and then insert into the destination table. But if I can't create a file on the server to use for BULK INSERT, what is the next best alternative to doing lots of 1-record insert statements?
Thanks,
-Emin
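A couple of commonly suggested alternatives, hedged since your client stack isn't stated: batch the rows into explicit transactions so you pay one commit per batch instead of per row; or, if the client is .NET 2.0, SqlBulkCopy streams rows from the client without needing a file on the server. The transaction-batching sketch (hypothetical table):

SET NOCOUNT ON
BEGIN TRAN
INSERT INTO Target (Col1, Col2) VALUES (1, 'a')
INSERT INTO Target (Col1, Col2) VALUES (2, 'b')
-- ...a few hundred to a few thousand rows per batch...
COMMIT TRAN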
Is there any known SQL Server bug whereby a record can be successfully inserted and committed, but then later be found not to be in the database? For example, if there was a server crash just after the commit, could committed data be lost? I'm sure the answer must be "no", but a client is telling me this is happening, and I said I'd enquire.
We have installed a package developed by another company, and sometimes (when the server is under a high rate of transactions) we see the following messages in the package debug file:

2004-04-22 14:23:27 3056: ------------------------------------------------------------
2004-04-22 14:23:27 3056: rlm03000: ls_PutOneRecord: Failed inserting [04113SO07236054]
2004-04-22 14:23:27 3056: ODBCSetCursorName[3056]: Failed specifying cursor concurrency to statement handle 1 for 'ls_rep_add'
2004-04-22 14:23:27 3056: dbc01003: ODBCSetCursorName[3056]: ls_rep_add generated SQL error state: 24000
2004-04-22 14:23:27 3056: dbc01002: [Microsoft][ODBC Driver Manager] Invalid cursor state
2004-04-22 14:23:27 3056: odbcFilterConnectErrors[3056]: set the iState to ODBC_DISCONNECT for <24000>
2004-04-22 14:23:27 3056: Function Return Code [00001] ODBC_ERROR
2004-04-22 14:23:27 3056: Operation [00000] ls_PutOneRecord
2004-04-22 14:23:27 3056: Primary Return Code [00000] 00000
2004-04-22 14:23:27 3056: Secondary Return Code [0000000001] 00000
2004-04-22 14:23:27 3056: ------------------------------------------------------------

When this message appears in the app debug file, we are losing some of the inserts that the application does in the database.

The MS SQL Server configuration is:
4 Pentium 760 MHz, 4 GB RAM
SQL 2000 Standard Edition 8.00.760 SP 4
MDAC 2.7, ODBC driver SQLSRV32.dll 2000.81.9031.14 15/11/2002

Any tip will be welcome.
Thanks in advance,
Reis
I've got a package that needs to do various single-row inserts/updates throughout the flow of the package. Each insert/update will use values from variables.
What is the best practice for doing this? I was going to use a DFT with my source being a derived column transformation, but it expects a true data source.
I've also thought about just using the obvious -- a SQL command task -- but my experience with running commands using single values is that it's very quirky and hard to get the data types and mappings to cooperate.
I'm betting the SQL command task is my only option, but I wanted to check here first to see if others had other ideas.
This is the error I get, but I did the same thing in a select statement and it works fine. Do I need to add something to the string? I'm kind of confused, and any help would be great.
Operator '+' is not defined for types 'String' and 'System.Windows.Forms.TextBox'.
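A hedged reading of the message alone: somewhere a TextBox control itself is being concatenated into the SQL string instead of its Text property. In VB.NET, "..." & someTextBox raises exactly this error, while "..." & someTextBox.Text works; the select statement that succeeds presumably already uses .Text. (The control name here is hypothetical.)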
Can someone tell me what to do if several NULLs are showing up within tables? Is there an easy way to locate all the NULLs and delete them within the table?

Can someone provide detailed information on how to delete NULLs within tables? I'm new to the SQL side, so I hope I have explained what it is that I'm looking to do. Our in-house programmer said that the NULLs may be causing a problem with the software communicating with the SQL Server software.
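A NULL can't be deleted as such; you either replace it with a value or delete the rows containing it. A sketch with hypothetical table and column names:

-- Locate rows where a column is NULL:
SELECT * FROM MyTable WHERE SomeCol IS NULL

-- Replace the NULLs with a default value:
UPDATE MyTable SET SomeCol = '' WHERE SomeCol IS NULL

-- Or mask NULLs at query time without changing the stored data:
SELECT ISNULL(SomeCol, '') AS SomeCol FROM MyTable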