Inserting Columns To A Table At A Particular Position
Jun 26, 2001
Hi,
Does anyone know if it is possible to add a column to a table at a particular position? The ALTER TABLE statement specifies that ADD COLUMN puts the new column at the end of the table. Is there a way to insert one into the middle of the table?
Alternatively, can it be done through SQL-DMO?
Obviously the fallback would be to drop and re-create the table with the correct column ordering, however this is not ideal for tables with large amounts of data.
It's possible to do this through Enterprise Manager (using Insert Column), so how does it do it - DMO, or by dropping and re-creating?
We're building a database schema upgrade tool and are trying to cover all possibilities.
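For reference, the drop-and-recreate route (which is essentially what Enterprise Manager's Insert Column does behind the scenes: build a new table, copy the data, drop the old one, rename) can be scripted. A minimal sketch with hypothetical table and column names, ignoring the indexes, constraints and permissions that would also need re-creating:

BEGIN TRANSACTION;

-- New table with NewCol in the desired middle position (names are hypothetical)
CREATE TABLE dbo.MyTable_New
(
    Id     int         NOT NULL,
    NewCol int         NULL,
    Name   varchar(50) NOT NULL
);

INSERT INTO dbo.MyTable_New (Id, Name)
SELECT Id, Name FROM dbo.MyTable;        -- copy existing data; NewCol stays NULL

DROP TABLE dbo.MyTable;
EXEC sp_rename 'dbo.MyTable_New', 'MyTable';

COMMIT TRANSACTION;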
Dear All, I want to return columns from a table as a single combined column. For example:
Table name = employee
Columns = empid, empname, address1, address2
Here, I want to append the address2 column to address1 as shown below:
empid  empname  address
1      aaa      First Layout, CA, US.
Thanks in advance. Thanks and Regards, Ganapathi sundaram.G
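If the goal is simply to return address1 and address2 as one column, string concatenation in the SELECT does it. A short sketch against the employee table described above; ISNULL guards against a NULL address making the whole result NULL:

SELECT empid,
       empname,
       ISNULL(address1, '') + ', ' + ISNULL(address2, '') AS address
FROM employee;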
Using SS2K, I'm getting the following error while bulk inserting:
Column 'warranty_expiration_date' cannot be modified because it is a computed column.
Here is my bulk insert statement:
BULK INSERT dbo.TestData
FROM 'TestData.dat'
WITH (CHECK_CONSTRAINTS, FIELDTERMINATOR='|', MAXERRORS = 1, FORMATFILE='TestData.fmt')
The computed column is not referenced in the format file and the data file does not contain the computed data. Thanks
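One workaround that is often used in this situation, offered as a hedged sketch rather than a guaranteed fix: bulk insert into a view that exposes only the non-computed columns, so the computed column is never a target of the load. The column names below are hypothetical, and the format file would then describe the view's columns:

CREATE VIEW dbo.TestData_Load
AS
SELECT item_id, purchase_date, warranty_months   -- every column except the computed one
FROM dbo.TestData;
GO

BULK INSERT dbo.TestData_Load
FROM 'TestData.dat'
WITH (CHECK_CONSTRAINTS, FIELDTERMINATOR = '|', MAXERRORS = 1, FORMATFILE = 'TestData.fmt');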
I have a table with a string value, where all values are separated by a space/blank. I now want to use SQL to split the values and insert them into a different table; the old table will then be deleted as soon as I have all the values out of it.
Old Table:
Code: ID, StringValue
New Table:
Code: ID, Value1, Value2 Do note: Value1 is INT, Value2 is of nvarchar, hence Value2 can contain spaces... I just need to split on the FIRST space, then convert index[0] to int, and store index[1] as it is.
I can split on all spaces and just SELECT them all and concatenate them like so: SELECT t.val1 + ' ' + t.val2... that is, if I can't find the first space. I mean, the first 2-10 characters in the string can be an integer, but don't have to be. Should I do it in code instead of SQL? Now I want to run a query that selects the StringValue from OldTable, splits the string by ' ' (a blank) and then inserts the parts into the New Table.
Code: SELECT CASE CHARINDEX(' ', OldTable.stringvalue, 1) WHEN 0 THEN OldTable.stringvalue ELSE SUBSTRING(OldTable.stringvalue, 1, CHARINDEX(' ', OldTable.stringvalue, 1) - 1) END AS FirstWord FROM OldTable
Found an example using things like CHARINDEX... but the issue still remains, because the first word may be an integer, or it may not be. If it isn't, there is no "first value", and the whole string should be passed into "value2". How do I detect whether the very first character is a digit?
Code: DECLARE @firstDigit int IF ISNUMERIC(SUBSTRING(@postal,2,1)) = 1 SET @firstDigit = CAST(SUBSTRING(@postal,2,1) AS int) ELSE SET @firstDigit = -1
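A minimal sketch of the whole split-and-insert as one set-based statement, assuming the names above (OldTable.ID, OldTable.StringValue, NewTable(ID, Value1, Value2)), that NewTable.ID is not an identity column, and that Value1 allows NULL. A digits-only LIKE check is used instead of ISNUMERIC, since ISNUMERIC accepts strings such as '$' or '1,000' that still fail a CAST to int:

INSERT INTO NewTable (ID, Value1, Value2)
SELECT o.ID,
       -- Value1: the first word cast to int, but only when a space exists and the word is all digits
       CASE WHEN CHARINDEX(' ', o.StringValue) > 0
                 AND x.FirstWord <> '' AND x.FirstWord NOT LIKE '%[^0-9]%'
            THEN CAST(x.FirstWord AS int) END,
       -- Value2: the remainder after the first space, or the whole string when there is no numeric prefix
       CASE WHEN CHARINDEX(' ', o.StringValue) > 0
                 AND x.FirstWord <> '' AND x.FirstWord NOT LIKE '%[^0-9]%'
            THEN SUBSTRING(o.StringValue, CHARINDEX(' ', o.StringValue) + 1, LEN(o.StringValue))
            ELSE o.StringValue END
FROM OldTable o
-- FirstWord = everything before the first space; the trailing ' ' keeps CHARINDEX safe when no space exists
CROSS APPLY (SELECT LEFT(o.StringValue, CHARINDEX(' ', o.StringValue + ' ') - 1) AS FirstWord) x;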
There are about 500 tables in one particular database. There are foreign keys set on most of the tables. I want to change the position of the primary key column in all those tables. How can I do that programmatically?
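There is no ALTER TABLE option to move a column, so each table has to be rebuilt (create a new table with the desired column order, copy the data, re-create the foreign keys, drop and rename). As a starting point for generating those scripts, the primary key columns and their current ordinal positions can be listed from the catalog views on SQL Server 2005 or later; a sketch:

SELECT t.name      AS table_name,
       c.name      AS pk_column,
       c.column_id AS current_position
FROM sys.tables t
JOIN sys.indexes i        ON i.object_id = t.object_id AND i.is_primary_key = 1
JOIN sys.index_columns ic ON ic.object_id = i.object_id AND ic.index_id = i.index_id
JOIN sys.columns c        ON c.object_id = ic.object_id AND c.column_id = ic.column_id
ORDER BY t.name;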
I have a report that prints cards for customers. The body of the report contains an address box, letter body, followed by a table that contains all the people to be printed on the cards. If the number exceeds 6 people, another page is to be printed, with no address or letter body, but the table with the remainder of the people on it.
I have been able to get everything to work except for the location of the table on the subsequent pages. They do not appear at the bottom as they do on the first page, but at the top of the page. I've tried using a rectangle as a container for the table, with no luck.
I seem to remember doing this in the past, but as projects get shelved for an extended period of time, the technical knowledge tends to fade.
Any suggestions would be greatly appreciated. Thanks
Hi, I am having a problem auditing column data in tables. My requirement is to write a trigger which is capable of auditing columns that will be added in the future as well, without using dynamic SQL. Is there any way to do so? I feel that if I can get the column data based on ordinal position then it is possible. Can anybody suggest? My setup is like this: I have a base_table to be audited. I have an Audit_spec table which contains the name of the table and the columns to be audited. And an Audit table which actually captures the table name, column name, old value and new value. I have to audit only those columns listed in Audit_spec. If a schema change (like a new column being added) happens to base_table and I want that column to be audited, I should be able to handle the newly added column without any changes to my trigger code.
I am trying to insert data into two different tables. I will insert into Table2 based on an id I get from the SELECT statement on Table1.
Insert Table1(Title,Description,Link,Whatever) Values(@title,@description,@link,@Whatever)
Select WhateverID from Table1 Where Description = @Description
Insert into Table2(CategoryID,WhateverID) Values(@CategoryID,@WhateverID)
This statement is not working. What should I do? Should I use a stored procedure? I am writing in C#. Can someone please help!
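A minimal sketch of a stored procedure that does both inserts, assuming Table1.WhateverID is an IDENTITY column (an assumption; the parameter types are also illustrative). SCOPE_IDENTITY() returns the identity value generated by the first insert, so there is no need to look it up again by Description:

CREATE PROCEDURE dbo.InsertWhatever
    @title       nvarchar(200),
    @description nvarchar(max),
    @link        nvarchar(400),
    @Whatever    nvarchar(200),
    @CategoryID  int
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO Table1 (Title, Description, Link, Whatever)
    VALUES (@title, @description, @link, @Whatever);

    DECLARE @WhateverID int;
    SET @WhateverID = SCOPE_IDENTITY();   -- id generated by the insert above

    INSERT INTO Table2 (CategoryID, WhateverID)
    VALUES (@CategoryID, @WhateverID);
END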
Hi, let's say I have 1000 registered users in a database table and each of them has a numeric ranking value. How can I get the position of each user in comparison to the other users' ranking values?
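A short sketch using the ranking functions available since SQL Server 2005, assuming a hypothetical Users table with UserID and RankingValue columns; RANK() gives tied users the same position, while ROW_NUMBER() breaks ties arbitrarily:

SELECT UserID,
       RankingValue,
       RANK()       OVER (ORDER BY RankingValue DESC) AS Position,
       ROW_NUMBER() OVER (ORDER BY RankingValue DESC) AS RowPosition
FROM Users;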
So I have been trying to get my SQL query to work for a large database that I have. I have (let's say) two tables, Table_One and Table_Two. Table_One has three columns: Type, Animal and TestID, and Table_Two has 2 columns: Test_Name and Test_ID. An example with values is below:
In Table_One all types come under one column and the values of all Types (Mammal, Fish, Bird, Reptile) come under another column (Animals). Table_One and Two can be linked by Test_ID
I am trying to create a table such as shown below:
This should be my final table. The approach I am currently using is to make multiple instances of Table_One and using joins to form this final table. So the column Bird, Reptile, Mammal and Fish all come from a different copy of Table_one.
For e.g
Select Test_Name AS 'Test_Name', Table_Bird.Animal AS 'Birds', Table_Mammal.Animal AS 'Mammal', Table_Reptile.Animal AS 'Reptile', Table_Fish.Animal AS 'Fish' From Table_One
[Code] .....
The problem with this query is that it only works when all entries for Birds, Mammals, Reptiles and Fish have some value. If one field is empty, as for Test_Two or Test_Three, it doesn't return that record. I used OR instead of AND in the WHERE clause but that didn't work either.
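One approach that avoids the multiple copies of Table_One entirely is conditional aggregation: group by test and pick out each animal type with a CASE expression, starting from Table_Two so tests with missing types still appear. A sketch using the column names described above (Type, Animal, TestID in Table_One; Test_Name, Test_ID in Table_Two); MAX() assumes at most one animal of each type per test:

SELECT t2.Test_Name,
       MAX(CASE WHEN t1.Type = 'Bird'    THEN t1.Animal END) AS Bird,
       MAX(CASE WHEN t1.Type = 'Mammal'  THEN t1.Animal END) AS Mammal,
       MAX(CASE WHEN t1.Type = 'Reptile' THEN t1.Animal END) AS Reptile,
       MAX(CASE WHEN t1.Type = 'Fish'    THEN t1.Animal END) AS Fish
FROM Table_Two t2
LEFT JOIN Table_One t1 ON t1.TestID = t2.Test_ID
GROUP BY t2.Test_Name;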
RO1   BOOK      RL1    PDF/ECOPY
RO2   PAPER     RL2    CONFERENCE
RO5   JOURNAL   RL11   OTHER
Now, on the above query I want to add three extra columns named status, location and contact, where the results in those three columns would be based on conditions I define in the query against the above results…
Something for example (I am not trying to write a condition: my question is how to write it),
if column_code1 = RO1 and column_description2 = PDF/ECOPY on status column it should return a value ‘ONLINE’ & on location column it should return ‘WEB’ and on contact column it should write ‘BOB’.
Also, if column_code1 = RO5 and column_description1 = JOURNAL on status column it should return a value ‘ON PRESS FOR PRINT’ & on location column it should return ‘S.R STREET, LONDON’ and on contact column it should write ‘SMITH’ like below result…so the final output should be the top four columns and the extra three columns…
See the attachment for better formatting...
---------------------------------------------------------------------------------------------
status               location             contact
---------------------------------------------------------------------------------------------
ONLINE               WEB                  BOB
ON PRESS FOR PRINT   S.R STREET, LONDON   SMITH
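The usual way to express this kind of rule is a CASE expression per new column added to the existing SELECT list. A sketch, assuming the four existing columns are named column_code1, column_description1, column_code2 and column_description2 as used above (adjust to the real names, and extend each CASE with further WHEN branches as needed):

SELECT column_code1,
       column_description1,
       column_code2,
       column_description2,
       CASE WHEN column_code1 = 'RO1' AND column_description2 = 'PDF/ECOPY' THEN 'ONLINE'
            WHEN column_code1 = 'RO5' AND column_description1 = 'JOURNAL'   THEN 'ON PRESS FOR PRINT'
       END AS status,
       CASE WHEN column_code1 = 'RO1' AND column_description2 = 'PDF/ECOPY' THEN 'WEB'
            WHEN column_code1 = 'RO5' AND column_description1 = 'JOURNAL'   THEN 'S.R STREET, LONDON'
       END AS location,
       CASE WHEN column_code1 = 'RO1' AND column_description2 = 'PDF/ECOPY' THEN 'BOB'
            WHEN column_code1 = 'RO5' AND column_description1 = 'JOURNAL'   THEN 'SMITH'
       END AS contact
FROM existing_results;   -- placeholder: substitute the existing query here, or wrap it as a derived table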
We are writing a C application that is using ODBC to insert records into a database. We have a NOT NULL column that can legitimately have an empty value, i.e. we know the value and it is empty (a zero length string). We are using SQLBindParameter() to bind a variable to the parameterized insert statement <<in the form: INSERT INTO table VALUES (?, ?, ?)>>. We are using SQLExecDirect() to process the SQL. We are running into the problem where ODBC converts the empty (zero length) string into a NULL value, and this errors due to the fact that the column is defined as NOT NULL. We do not want to redefine the column as NULL, because my understanding of the correct usage of a NULL column is to indicate that a value is unknown or meaningless... in our case we know the value (it is empty) and an empty value has meaning within our application. I'm sure that this issue has been seen and addressed thousands (millions?) of times... any guidance would be appreciated.
Hi, g'nite everyone. I once again need help with T-SQL syntax for auto-correction on insert and update when the client enters the wrong format, i.e. creating an SSN data type and a user-defined procedure that auto-corrects the input format. For example, the user inserts 525-477845 into the au_id column of the authors table in the pubs database and the auto-correction runs, so the user doesn't have to put in the dashes for the SSN format. Please help me with this syntax.
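A minimal sketch of one way to normalize the input: a scalar function that strips any dashes the user typed and re-inserts them in the standard xxx-xx-xxxx positions. The function name fn_FormatSSN is hypothetical; it could be called in the INSERT/UPDATE statements or from a trigger on the authors table, e.g. INSERT INTO authors (au_id, ...) VALUES (dbo.fn_FormatSSN(@au_id), ...):

CREATE FUNCTION dbo.fn_FormatSSN (@raw varchar(20))
RETURNS char(11)
AS
BEGIN
    DECLARE @digits varchar(20)
    SET @digits = REPLACE(@raw, '-', '')                  -- strip any dashes the user typed
    RETURN STUFF(STUFF(@digits, 4, 0, '-'), 7, 0, '-')    -- 123456789 -> 123-45-6789
END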
I have created a table named Table with name as varchar and id as int. Now I have started inserting rows like: insert into Table values ('arun',20). Yes, I have inserted a row in the table. Now I have got the values "arun's", 50. insert into Table values('arun's',20) My SQL Server is giving me an error instead of inserting the row. How would you solve this problem?
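The single quote inside the string terminates the literal early. In T-SQL an embedded quote is escaped by doubling it; better still, pass the value as a parameter so no escaping is needed at all. A sketch of both, keeping the table and values from the question (brackets around [Table] because TABLE is a reserved word):

-- Escape the quote by doubling it
insert into [Table] values ('arun''s', 20)

-- Or parameterize (here via sp_executesql; from C# you would use SqlParameter instead)
exec sp_executesql
     N'insert into [Table] values (@name, @id)',
     N'@name varchar(50), @id int',
     @name = 'arun''s', @id = 20;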
I have created a sample database for a school project.
After executing the query below, the date column is supposed to contain the dates I entered;
however, the dates shown are in 1900.
Any idea why this is happening?
I appreciate your help.
Thank you. Query:
Drop table AccountReceivable
GO
--BEGIN TRANSACTION
Create table AccountReceivable
(
    AccountRecID int identity (1,1) not null,
    PatientID int not null,
    PresentCharges int default 0 not null,
    PaymentMade money default 0 not null,
    PreviousBalance money default 0 not null,
    BalanceDue money default 0 not null,
    LastPaymentDate datetime not null,
    PresentDate datetime default GetDate() not null
)
GO
ALTER TABLE AccountReceivable ADD CONSTRAINT PK_AccountRecID Primary Key (AccountRecID)
GO

ALTER TABLE AccountReceivable ADD CONSTRAINT FK_PatientID_PatientID FOREIGN KEY (PatientID) REFERENCES PATIENT (PatientID)
GO
--COMMIT
--query to find delinquent accounts
--DATEDIFF (d, LastPaymentDate, PresentDate)

--Populate the Accounts Table
DELETE AccountReceivable
GO
INSERT AccountReceivable (PatientID,PresentCharges,PaymentMade,PreviousBalance,BalanceDue,LastPaymentDate,PresentDate)
VALUES (913235,451.34,50,0,401.34,4/7/2006,DEFAULT)
GO
INSERT AccountReceivable (PatientID,PresentCharges,PaymentMade,PreviousBalance,BalanceDue,LastPaymentDate,PresentDate)
VALUES (918035,109,109,0,0,3/6/2006,DEFAULT)
GO
INSERT AccountReceivable (PatientID,PresentCharges,PaymentMade,PreviousBalance,BalanceDue,LastPaymentDate,PresentDate)
VALUES (914235,279,89,0,190,5/9/2005,5/9/2005)
GO
INSERT AccountReceivable (PatientID,PresentCharges,PaymentMade,PreviousBalance,BalanceDue,LastPaymentDate,PresentDate)
VALUES (914235,0,90,190,100,5/9/2005,DEFAULT)
GO
INSERT AccountReceivable (PatientID,PresentCharges,PaymentMade,PreviousBalance,BalanceDue,LastPaymentDate,PresentDate)
VALUES (912224,67.90,67.90,0,0,2/2/2006,DEFAULT)
GO
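The 1900 dates come from the unquoted literals: 4/7/2006 is evaluated as integer division (4 / 7 / 2006 = 0), and 0 implicitly converts to the datetime base date 1900-01-01. Quoting the dates as string literals fixes it; the first insert rewritten as a sketch:

INSERT AccountReceivable (PatientID,PresentCharges,PaymentMade,PreviousBalance,BalanceDue,LastPaymentDate,PresentDate)
VALUES (913235, 451.34, 50, 0, 401.34, '20060407', DEFAULT)   -- 'yyyymmdd' avoids any regional ambiguity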
Please note that the number of columns is different in each table. I want to dump the data of the Source table into the Destination table; that is, the data of 2 columns in the Source table into the last 2 columns of the Destination table. Also, the order of the columns in the Destination table will vary, so I need a way to insert the data in bulk dynamically, but I will know the column names for sure before inserting.
Is there any way to bulk insert into these columns?
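As long as the column names are known at the time of the insert, an INSERT ... SELECT with an explicit column list handles differing column counts and ordering; any Destination columns not listed get their defaults or NULL. A sketch with hypothetical column names (SrcColA, SrcColB, DestColX, DestColY):

INSERT INTO Destination (DestColX, DestColY)      -- only the two target columns are named
SELECT s.SrcColA, s.SrcColB                       -- matched to the list above by position, not by table order
FROM Source s;

If the names are only known at run time, the same statement can be built as a string and executed with sp_executesql.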
I have a business need to create a report by querying data from a MS SQL 2008 database and displaying the result to the users on a web page. The report initially has 6 columns of data, and 2 out of the 6 have JSON data, so the users request to have those 2 JSON columns parsed into 15 additional columns (the first JSON column has 8 key/value pairs and the second JSON column has 7 key/value pairs). Here is what I have done so far:
I found a table-valued function (fnSplitJson2) from this link [URL]. Using this function I can parse a column of JSON data into a table. So when I use the function against the first column (with JSON data) in my query (with CROSS APPLY) I got the right data back, but I got 8 rows for each row in my table. The reason for this side effect is that the function returns a table of 8 rows (8 key/value pairs) for each JSON string that it parses.
1. First question: How do I modify my current query (see below) so that for each row in my table I get back one row with 19 columns?
SELECT A.ITEM1,A.ITEM2,A.ITEM3,A.ITEM4, B.* FROM PRODUCT A CROSS APPLY fnSplitJson2(A.ITEM5,NULL) B
I updated my query (see below) to call the function twice within the CROSS APPLY clause, and I got this error: "The multi-part identifier "A.ITEM6" could not be bound."
2. My second question: How do I get around this error?
SELECT A.ITEM1,A.ITEM2,A.ITEM3,A.ITEM4, B.*, C.* FROM PRODUCT A CROSS APPLY fnSplitJson2(A.ITEM5,NULL) B, fnSplitJson2(A.ITEM6,NULL) C
I am using Microsoft SQL Server 2008 R2 version. Windows 7 desktop.
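For the second question: in a FROM clause, items separated by a comma cannot reference each other's columns, which is why A.ITEM6 "could not be bound" inside the comma-joined function call. Chaining a second CROSS APPLY instead keeps A in scope for both calls; a sketch using the same names as above:

SELECT A.ITEM1, A.ITEM2, A.ITEM3, A.ITEM4, B.*, C.*
FROM PRODUCT A
CROSS APPLY fnSplitJson2(A.ITEM5, NULL) B
CROSS APPLY fnSplitJson2(A.ITEM6, NULL) C

For the first question, the 8 rows the function returns per product would still need to be pivoted (or fnSplitJson2 changed to return a single row of columns) to end up with one 19-column row per product.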
Here is my requirement; I'm not sure if this is possible. I am creating a table called master with col1, col2, col3, col4, col5... where col1 and col2 are updatable - this can be done easily.
Col3 and col4 are columns in another table, but these should be read only - is this possible? It is possible with a view, but a view is not friendly with SharePoint CRUD... Col5 is a computed column of col2 and col5(?). If the above step can be done then I guess this can be done too.
I am trying to insert bulk data into a main table from a staging table in SQL Server 2012. If any error occurs, the whole activity is rolled back. I don't want that to happen. I want to know the records where the problem occurs, and the rest have to be inserted.
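One hedged approach, assuming the failures come from rules that can be checked up front (the table and column names below are hypothetical): insert the rows that pass validation and route the rest to an error table for review, so one bad row no longer rolls back the whole batch.

-- Rows that satisfy the destination's rules go straight in
INSERT INTO dbo.MainTable (Id, Amount)
SELECT s.Id, s.Amount
FROM dbo.StagingTable s
WHERE s.Amount IS NOT NULL;            -- example rule: NOT NULL constraint on MainTable.Amount

-- Rows that would fail are captured instead of aborting the batch
INSERT INTO dbo.ErrorRows (Id, Amount, ErrorReason)
SELECT s.Id, s.Amount, 'Amount is NULL'
FROM dbo.StagingTable s
WHERE s.Amount IS NULL;

Where a failing rule cannot be expressed as a filter, inserting in smaller batches inside TRY/CATCH and logging the failed batches is another option.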
Hello,
Using SQL Server 2000, I'm trying to put together a query that will tell me the following information about a view:
The view name
The names of the view's columns
The names of the source tables used in the view
The names of the columns that are used from the source tables
Borrowing code from the VIEW_COLUMN_USAGE view, I've got the code below, which gives me the view name, source table name, and source column name. And I can easily enough get the view columns from the syscolumns table. The problem is that I haven't figured out how to link a source column name to a view column name. Any help would be appreciated.
Gary

select
    v_obj.name as ViewName,
    t_obj.name as SourceTable,
    t_col.name as SourceColumn
from
    sysobjects t_obj,
    sysobjects v_obj,
    sysdepends dep,
    syscolumns t_col
where
    v_obj.xtype = 'V'
    and dep.id = v_obj.id
    and dep.depid = t_obj.id
    and t_obj.id = t_col.id
    and dep.depnumber = t_col.colid
order by
    v_obj.name,
    t_obj.name,
    t_col.name
But now I am getting the error "General Network Error. Check your network documentation" after specifying Use existing stored procedure in the TableAdapter Configuration Wizard.
ALTER PROCEDURE dbo.Insert_MailSignature( @Singnature image )
AS
SET NOCOUNT OFF;
INSERT INTO MailsSignature (Singnature) VALUES (@Singnature);
SELECT Id, Singnature FROM MailsSignature WHERE (Id = SCOPE_IDENTITY())
For testing I created a desktop application and found that the same code, the same option (Use existing stored procedure in the TableAdapter Configuration Wizard) and the same stored procedure work fine for inserting an image into the table.
Hi, I have a table into which I will insert several redundant rows. Don't ask why; it is Integration Services, it only reads data and inserts it into a SQL table. This way, I end up with a SQL table with several lines repeating themselves. What I want to do is create a procedure that reads the distinct data and inserts it into another table, but my problem is that I am not able to select data line by line from the original table, save it in local variables and insert it into the other table; I can only select the last line. I've tried a while cycle with no success. Here is my code:

create proc insertLocalization
AS
declare @idAp int, @macAp varchar(20), @floorAp varchar(2), @building varchar(30), @department varchar(30)
select @idAp = idAp from OLTPLocalization where idAp not in (select idAp from dimLocalization)
select @macAp=macAp, @floorAp=floorAp, @building=building, @department=department from OLTPLocalization
if (@idAp <> null)
begin
    Insert into dimLocalization VALUES(@idAp,@macAp,@floorAp,@building,@department)
end
GO

This only inserts the last line of the OLTPLocalization table. On the other hand, like this:

create proc aaaa
as
declare @idAp as int, @macAp as varchar(50), @floorAp as int, @building as varchar(50), @department as varchar(50)
while exists (select distinct(idAp) from OLTPLocalization)
begin
    select @idAp = idAp from OLTPLocalization where idAp not in (select idAp from dimLocalization)
    select @macAp = macAp from OLTPLocalization where idAp = @idAp
    select @building = building from OLTPLocalization where idAp = @idAp
    select @department = department from OLTPLocalization where idAP = @idAp
    if (@idAp <> null)
    begin
        insert into dimLocalization values(@idAp,@macAp,@floorAp,@building,@department)
    end
end
go

this retrieves every distinct idAp on each increment of the while statement. The point of the while loop is really to select each different line in the OLTPLocalization table. I did not find any foreach or for each statement; is there any way to select distinct lines one by one in a SQL table and save each column result in variables, to then insert them into another table? I've also thought about a web service that reads the distinct data from OLTPLocalization into a dataset and then inserts it into the dimLocalization table. Is there anything I can do? Any guess? Really needing a hand here! Thanks a lot!
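A set-based alternative that avoids the row-by-row loop entirely, sketched under the assumption that dimLocalization has the same five columns: insert every distinct combination that is not already present in one statement.

create proc insertLocalization
as
begin
    set nocount on;

    insert into dimLocalization (idAp, macAp, floorAp, building, department)
    select distinct o.idAp, o.macAp, o.floorAp, o.building, o.department
    from OLTPLocalization o
    where o.idAp not in (select d.idAp from dimLocalization d);
end
go

Note also that the checks if (@idAp <> null) in the original never evaluate to true, because comparisons with NULL yield UNKNOWN; IS NOT NULL is needed instead.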
I wrote an insert trigger on my LeaveRegister table (1000 rows) that inserts rows into an audit table, but when I insert a single row into the LeaveRegister table, 1000 + 1 rows are inserted into the audit table every time.
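That symptom usually means the trigger selects from the base table itself instead of from the inserted pseudo-table, so every existing row gets audited on each insert. A minimal sketch, with hypothetical audit column names, that only touches the rows of the current statement:

CREATE TRIGGER trg_LeaveRegister_Insert
ON LeaveRegister
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- "inserted" holds only the rows affected by the current INSERT statement
    INSERT INTO LeaveRegister_Audit (EmployeeId, LeaveDate, AuditDate)
    SELECT i.EmployeeId, i.LeaveDate, GETDATE()
    FROM inserted i;
END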
I have a 2010 SSIS package where I am reading csv files with different fields and formatted data. I have created a separate package for each file and I am formatting the data to fit the final destination table's data elements. I've been instructed to create 7 separate packages to read each file and format the data from the 7 csv files, then insert into their tables tbl1, tbl2, tbl3... etc. Then I'm using an Execute SQL Task to insert tbl1, tbl2, tbl3... etc. into the destination table that will be the final table for all reports and other uses.
1- should I create an ID? 2- these files will be read once a month 3- I want to append the data, not drop and recreate each run, 4- It's 2012 SQL and 2010 SSIS
Each csv file is in a different format; some have 15 columns, others have 8. I have to parse the data in an SP to align with the fields in the destination table.
5- Can I force RowID to be the next auto-generated number from tbl1 for the start of the insert for tbl2, then the last row of tbl2 for the insert of tbl3?
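If RowID on the destination table is an IDENTITY column, numbering continues automatically across the separate inserts from tbl1, tbl2, tbl3, so nothing has to be forced. A sketch with hypothetical table and column names:

CREATE TABLE dbo.FinalReportTable
(
    RowID      int IDENTITY(1,1) PRIMARY KEY,  -- keeps counting across all inserts
    SourceFile varchar(50) NOT NULL,           -- optional: which csv/table the row came from
    Col1       varchar(100) NULL,
    Col2       varchar(100) NULL
);

INSERT INTO dbo.FinalReportTable (SourceFile, Col1, Col2)
SELECT 'tbl1', a, b FROM dbo.tbl1;

INSERT INTO dbo.FinalReportTable (SourceFile, Col1, Col2)
SELECT 'tbl2', a, b FROM dbo.tbl2;   -- RowID continues from where tbl1's insert ended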
I'm working on a database for a financial client, and part of what I need to do is calculate a value from two separate rows in the same table and insert the result into the same table as a new row. I have a way of doing so, but I consider it extremely inelegant and I'm hoping there's a better way. A description of the existing database schema (which I have control over) will help in explaining the problem:
id  metric_id  metric_type_id  metric_name
1   80         2               Fiscal Enterprise Value Historic Year 1
2   81         2               Fiscal Enterprise Value Current Fiscal Year
3   82         2               Fiscal Enterprise value Forward Fiscal year 1
4   83         2               Fiscal Enterprise Value Forward Fiscal Year 2
5   101        3               Calendar Enterprise value Historic Year 1
6   102        3               Calendar Enterprise Value Current Fiscal Year
5   103        3               Calendar Enterprise value Forward Year 1
6   104        3               Calendar Enterprise Value Forward Year 2
Table Name: metric_type_details
id  metric_type_id  metric_type_name
1   1               Raw
2   2               Fiscal
3   3               Calendar
4   4               Calculated
The problem scenario is the following: because a certain number of the securities have a fiscal year end that is different from the calendar end, in addition to having fiscal data (such as fiscal enterprise value and fiscal earnings etc...) for each security, I also need to store calendarised data. What this means is that if the security with security_id = 3 has a fiscal year end of October, then using rows with ids = 1, 2, 3 and 4 from the metrics_ladder table I need to calculate metrics with metric_id = 83, 84, 85 and 86 (as described in the metric_details table) and insert the following 4 new records into metrics_ladder:
Metric with metric_id = 101 (Calendar Enterprise value Historic Year 1) will be calculated by taking 10/12 of the value for metric_id 80 plus 2/12 of the value for metric_id 81.
Similarly, metric_id 102 will be equal to 10/12 of the value for metric_id 81 plus 2/12 of the value for metric_id 82,
metric_id 103 will be equal to 10/12 of the value for metric_id 82 plus 2/12 of the value for metric_id 83 and finally
metric_id 104 will be NULL (determined by business requirements as there is no data for forward year 3 to use).
As I could think of no better way of doing this (and hence the reason for this thread), I am currently achieving this by pivoting the relevant data from metrics_ladder so that the required data for each security is in one row, storing the results in new columns, then unpivoting again to store the results in the metrics_ladder table. So the above data in metrics_ladder becomes:
-- Dummy year variable to make it easier to use MONTH() function
-- to convert 3 letter month to number. i.e. JAN -> 1, DEC -> 12 etc...
DECLARE @DUMMY_YEAR VARCHAR(4)
SET @DUMMY_YEAR = 1900;

with temp(security_id, metric_id, value) as
(
    select ml.security_id, ml.metric_id, ml.value
    from metrics_ladder ml
    where ml.metric_id in (80,81,82,83,84,85,86,87,88,etc...)
    -- only consider securities with fiscal year end not equal to december
    and ml.security_id in (select security_id from company_details where fiscal_year_end <> 'dec')
)
insert into @calendar_averages
select temppivot.security_id
-- Net Income
,(CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))/12*[80]) +((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)))/12*[81]) as [101]
,(CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))/12*[81]) +((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)))/12*[82]) as [102]
,(CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))/12*[82]) +((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)))/12*[83]) as [103]
,NULL as [104]
-- Share Holders Equity
,(CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))/12*[84]) +((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)))/12*[85]) as [105]
,(CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))/12*[85]) +((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)))/12*[86]) as [106]
,(CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))/12*[86]) +((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)))/12*[87]) as [107]
,NULL as [108]
-- Capex
-- Sales
-- Accounts payable etc...
-- ..
-- ..
from temp
pivot
(
    sum(value)
    for metric_id in ([80],[81],[82],[83],[84],[85],[86],[87],[88],etc...)
) as temppivot
inner join company_details cd on temppivot.security_id = cd.security_id
********* END SQL *********
The result then needs to be unpivoted and stored in metrics_ladder.
And FINALLY, the question! Is there a more elegant way of achieving this??? I have complete control over the database schema so if creating mapping tables or anything along those lines would help it is possible. Also, is SQL not really suited for such operations and would it therefore be better done in C#/VB.NET.
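One possibly more elegant route, since creating mapping tables is an option: store the calendar-to-fiscal relationships in a small mapping table and compute the weighted values with a self-join on metrics_ladder, which avoids the pivot/unpivot round trip. A sketch; the metric_mapping table and its columns are hypothetical, the @DUMMY_YEAR trick from above is reused, and the insert assumes the remaining metrics_ladder columns are identity or have defaults:

-- Hypothetical mapping: which two fiscal metrics feed each calendar metric
-- e.g. (80, 81, 101), (81, 82, 102), (82, 83, 103), (84, 85, 105), ...
CREATE TABLE metric_mapping
(
    fiscal_metric_id      int NOT NULL,
    next_fiscal_metric_id int NOT NULL,
    calendar_metric_id    int NOT NULL
);

INSERT INTO metrics_ladder (security_id, metric_id, value)
SELECT ml1.security_id,
       mm.calendar_metric_id,
       (CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR)) / 12 * ml1.value)
     + ((12 - CONVERT(DECIMAL, MONTH(cd.fiscal_year_end + @DUMMY_YEAR))) / 12 * ml2.value)
FROM metric_mapping mm
JOIN metrics_ladder ml1 ON ml1.metric_id = mm.fiscal_metric_id
JOIN metrics_ladder ml2 ON ml2.metric_id = mm.next_fiscal_metric_id
                       AND ml2.security_id = ml1.security_id
JOIN company_details cd ON cd.security_id = ml1.security_id
WHERE cd.fiscal_year_end <> 'dec';

The NULL forward-year metrics (104, 108, ...) would simply have no row in metric_mapping, or could be inserted separately. Whether this beats the pivot approach in practice depends on indexing on (security_id, metric_id), but it keeps everything as one set-based insert without reshaping the data twice.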