How can I create a table identical to another one, in a stored procedure? I need to copy the indexes and constraints too. Example: I have a table "employee" and I want another table "employee2" with the same indexes and primary key and references.
I need to do the work in a stored procedure because there are many, many tables, and this process belongs to a conversion program.
I can't script the tables by hand because this process must be automatic, not manual.
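A minimal sketch of one way to approach this, assuming SQL Server 2005 or later. The procedure name usp_CloneTable and the decision to rebuild only the primary key are my own illustration; nonclustered indexes and foreign keys would need similar dynamic SQL built from sys.indexes and sys.foreign_keys:

CREATE PROCEDURE dbo.usp_CloneTable
    @SourceTable sysname,   -- e.g. 'employee'
    @TargetTable sysname    -- e.g. 'employee2'
AS
BEGIN
    DECLARE @sql nvarchar(max), @cols nvarchar(max);

    -- copy the column structure only (no rows, no constraints, no indexes)
    SET @sql = N'SELECT * INTO ' + QUOTENAME(@TargetTable)
             + N' FROM ' + QUOTENAME(@SourceTable) + N' WHERE 1 = 0;';
    EXEC (@sql);

    -- rebuild the primary key from the source table's metadata
    SELECT @cols = ISNULL(@cols + N', ', N'') + QUOTENAME(c.name)
    FROM sys.indexes i
    JOIN sys.index_columns ic ON ic.object_id = i.object_id AND ic.index_id = i.index_id
    JOIN sys.columns c ON c.object_id = ic.object_id AND c.column_id = ic.column_id
    WHERE i.object_id = OBJECT_ID(@SourceTable) AND i.is_primary_key = 1
    ORDER BY ic.key_ordinal;

    IF @cols IS NOT NULL
    BEGIN
        SET @sql = N'ALTER TABLE ' + QUOTENAME(@TargetTable)
                 + N' ADD CONSTRAINT ' + QUOTENAME('PK_' + @TargetTable)
                 + N' PRIMARY KEY (' + @cols + N');';
        EXEC (@sql);
    END
END

The conversion program would then loop this over the list of tables. SELECT INTO on its own copies only the columns, which is why the primary key (and, by extension, other indexes and references) has to be re-created from the catalog views.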
If I add a new column on the source, the script generated by SqlPackage.exe recreates the table in the background, moving the data through temporary storage. If the table is big, this approach can cause issues.
An example of the script is below; in the source project I added the columns [MyColumn_LINE_1] and [MyColumn_LINE_5].
Is there any way I can make it generate an ALTER statement instead?
BEGIN TRANSACTION;
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE;
SET XACT_ABORT ON;
CREATE TABLE [dbo].[tmp_ms_xx_MyTable] (
    [MyColumn_TYPE_CODE] CHAR (3) NOT NULL,
[Code] ....
The same script is generated regardless of whether the table has data or not, or whether its PK is clustered or nonclustered.
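Not a definitive answer, but since the new columns sit in the middle of the table definition ([MyColumn_LINE_1] and [MyColumn_LINE_5] suggest positions 1 and 5), one thing worth trying, assuming your DacFx version supports it, is the IgnoreColumnOrder publish property, for example:

SqlPackage.exe /Action:Publish /SourceFile:MyDatabase.dacpac /TargetConnectionString:"..." /p:IgnoreColumnOrder=True

With column order ignored, a column that merely changes position is no longer treated as a difference, and new columns should come through as plain ALTER TABLE ... ADD statements instead of the tmp_ms_xx_ table rebuild. (The .dacpac name and connection string above are placeholders.)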
I have 2 tables (both containing the same column names/datatypes), say table1 and table2. table1 is the most recent, but some rows were deleted by accident. table2 is a backup that has all the data we need, but some of it is old. What I want to do is overwrite the rows in table2 that also exist in table1 with the table1 rows, but leave the rows in table2 that do not exist in table1 as they are. Both tables have a primary key, user_id.
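A minimal sketch of the overwrite step, assuming the non-key columns are called col_a and col_b (substitute your real column list):

UPDATE t2
SET    t2.col_a = t1.col_a,
       t2.col_b = t1.col_b
FROM   table2 AS t2
INNER JOIN table1 AS t1 ON t1.user_id = t2.user_id;

Rows in table2 whose user_id has no match in table1 are untouched by the inner join. If you also need rows that only exist in table1 copied over, a second INSERT ... SELECT with a NOT EXISTS check on user_id would handle that.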
Probably a silly question, but as a .NET developer I'm trying to simulate DTS when, for example, I don't have permission to run DTS on a production server.
In particular, I'm interested in how rows are cached before the service decides to flush the buffer and write to the target table. Is it safe to assume DTS is cursor based?
This table stores common information used in resolving technical problems, based on Subject and Topic. However, I've now created a Subject/Topic combination into which I want to copy all the data that corresponds to another Subject/Topic.
Example:
There are 35 rows that correspond to Subject = 'Publisher01' and Topic = 'Subcategory03'. I want to create 35 new rows that contain the same RD and RR data, but have Subject = 'Publisher02' and Topic = 'Subcategory07'. Highest current RRID = 5008
I cannot figure out how to write that query. I apologize in advance for the fact that this is, no doubt, a seriously beginner question.
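A sketch of the copy, assuming the table is called dbo.Resolutions (a made-up name, substitute yours) and that RRID is an IDENTITY column so new keys above 5008 are assigned automatically:

INSERT INTO dbo.Resolutions (RD, RR, Subject, Topic)
SELECT RD, RR, 'Publisher02', 'Subcategory07'
FROM   dbo.Resolutions
WHERE  Subject = 'Publisher01'
  AND  Topic = 'Subcategory03';

If RRID is not an IDENTITY, you would add it to both column lists and generate the 35 new values yourself, starting from MAX(RRID) plus a running number.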
Hello, I have a Data Flow Source that uses a SQL command to pull data. In the SQL statement, I used CAST to change all varchar types to nvarchar to suit MS Access. I can preview the data from the source. In testing, the SQL statement only pulls about ten records.
I have a Microsoft Access 2000 database table as a destination. Data in each column in the table is required, and all columns have defaults.
I also have a grid data viewer set up. I have DefaultBufferMaxRows set to 2 so that I can see data going across. When I execute this data flow, no data is transferred to the Access database table. No data shows up in the data viewer. There are no errors. The 'Execution Results' tab does not show errors, but indicates that zero rows were transferred. There are no warnings.
How do I begin to isolate the problem? The following is the SQL Statement in the Data Flow Source. Thank you for your help! - cdun2
DECLARE @CategoryTable TABLE (ColID Int, ColCategory varchar(60), ColValue varchar(500))

--and fill it
INSERT INTO @CategoryTable (ColID, ColCategory, ColValue)
SELECT 0,
       LEFT(RawCollectionData, CHARINDEX(':', RawCollectionData)),
       LTRIM(SUBSTRING(RawCollectionData, CHARINDEX(':', RawCollectionData) + 1, 255))
FROM Collections_Staging

--Assign an ID to each block of data for each occurance of 'Reason:'
DECLARE @ID int
SET @ID = 1
UPDATE @CategoryTable
SET [ColID] = CASE WHEN ColCategory = 'Reason:' THEN @ID - 1 ELSE @ID END,
    @ID = CASE WHEN ColCategory = 'Reason:' THEN @ID + 1 ELSE @ID END

--Then put the data together
SELECT --cast to Nvarchar for MSAccess
       a.ColID,
       CAST(a.ColValue as Nvarchar(30)) AS OrderID,
       COALESCE(CAST(b.ColValue as Nvarchar(30)), '') AS SellerUserID,
       COALESCE(CAST(c.ColValue as Nvarchar(100)), '') AS BusinessName,
       COALESCE(CAST(d.ColValue as Nvarchar(15)), '') AS BankID,
       COALESCE(CAST(e.ColValue as Nvarchar(15)), '') AS AccountID,
       COALESCE(CAST(SUBSTRING(f.ColValue, CHARINDEX('$', f.ColValue) + 1, 500) AS DECIMAL(18,2)), 0) AS CollectionAmount,
       COALESCE(CAST(g.ColValue as Nvarchar(10)), '') AS TransactionType,
       CASE WHEN h.ColValue LIKE '%Matching Disbursement%' THEN NULL
            ELSE CAST(h.ColValue AS SmallDateTime) END AS DisbursementDate,
       --COALESCE(h.ColValue,'') AS DisbursementDate,
       CASE WHEN i.ColValue LIKE '%Matching Disbursements%' THEN NULL
            WHEN CAST(LEFT(REVERSE(i.ColValue), 4) AS INT) > 1000 THEN CAST(i.ColValue AS SmallDateTime)
            WHEN LEFT(REVERSE(i.ColValue), 4) = '1000' THEN NULL
       END AS ReturnDate,
       --COALESCE(i.ColValue,'') AS ReturnDate,
       COALESCE(CAST(j.ColValue as Nvarchar(4)), '') AS Code,
       COALESCE(CAST(k.ColValue as Nvarchar(255)), '') AS CollectionReason
FROM @CategoryTable a
LEFT JOIN @CategoryTable b ON b.ColID = a.ColID AND b.ColCategory = 'Seller UserId:'
LEFT JOIN @CategoryTable c ON c.ColID = a.ColID AND c.ColCategory = 'Business Name:'
LEFT JOIN @CategoryTable d ON d.ColID = a.ColID AND d.ColCategory = 'Bank ID:'
LEFT JOIN @CategoryTable e ON e.ColID = a.ColID AND e.ColCategory = 'Account ID:'
LEFT JOIN @CategoryTable f ON f.ColID = a.ColID AND f.ColCategory = 'Amount:'
LEFT JOIN @CategoryTable g ON g.ColID = a.ColID AND g.ColCategory = 'Transaction Type:'
LEFT JOIN @CategoryTable h ON h.ColID = a.ColID AND h.ColCategory = 'Disbursement Date:'
LEFT JOIN @CategoryTable i ON i.ColID = a.ColID AND i.ColCategory = 'Return Date:'
LEFT JOIN @CategoryTable j ON j.ColID = a.ColID AND j.ColCategory = 'Code:'
LEFT JOIN @CategoryTable k ON k.ColID = a.ColID AND k.ColCategory = 'Reason:'
WHERE a.ColCategory = 'Order ID:'
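One thing worth trying when an OLE DB source runs a multi-statement batch like this and reports zero rows: the INSERT and UPDATE statements each return their own "rows affected" counts, which can confuse the source component about which result set is the real output. Adding this as the very first line of the SQL command suppresses those messages (a common workaround, not a guaranteed fix):

SET NOCOUNT ON

If that doesn't change anything, temporarily point the data flow at a flat file or SQL Server destination to determine whether the problem is on the source side or on the Access destination side.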
I need to copy the following columns from my Employee table in my Performance DB to my Employee table in my VacationRequest DB: CompanyID, FacilityID, EmployeeID, FirstName, LastName, [Password] = 'nippert', Role = 'Employee'. I tried the advice on this website but to no avail: http://www.w3schools.com/sql/sql_select_into.asp
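SELECT INTO creates a brand-new table, which is probably why the w3schools example didn't help here. Since the destination table already exists, INSERT ... SELECT is closer to what's needed. A sketch, assuming both databases sit on the same instance and the column names are as listed above:

INSERT INTO VacationRequest.dbo.Employee
       (CompanyID, FacilityID, EmployeeID, FirstName, LastName, [Password], [Role])
SELECT CompanyID, FacilityID, EmployeeID, FirstName, LastName, 'nippert', 'Employee'
FROM   Performance.dbo.Employee;

If the databases are on different servers, the same statement works with a linked server and a four-part name for the source table.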
select * into dbo.ashutosh from attribute where 1=2
"USE WHERE 1=2 TO AVOID COPYING OF DATA"
//HERE "ASHUTOSH" IS THE NEW TABLE NAME AND "ATTRIBUTE" IS THE TABLE WHOSE REFERENCE IS USED// //the logic is to use where clause with 1=2 which will never be true and hence it will not return any row//
I have "inherited" a project working on a SQL 2000 database. The project calculates commissions based on data from an invoice header table and an invoice details table. The goal is to extract data from the primary database tables, perform limited manipulations, and store the resultant data into a table in a different database for reference and reporting. I have the query complete that extracts and manipulates the data, but I am stuck in trying to add this data to the final storage/reporting table. I would also like to do error checking to be certain that a record is not "re-inserted" from the source data to the destination table. Thanks in advance, Barry
I have a table, dbo.Table1(Id,Col1,Col2,Col3,Col4,Col5,Col6,Col7,Col8), that I need to split between two tables, dbo.Table2(Id, Col1, Col3, Col4) and dbo.Table3(Id, Table2_Id, Col5, Col6, Col7, Col8). But in dbo.Table3 I need to have the Id column from dbo.Table2 populated, since it's a foreign key in dbo.Table3. How do I go about doing this?
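A sketch of one way to do the split, assuming Table2 can keep the same Id values as Table1 (so the mapping is implicit) and that Table3.Id is an IDENTITY column. If Table2.Id is an IDENTITY of its own, you would instead capture the generated values with an OUTPUT clause or by temporarily carrying the old Id in a spare column:

-- step 1: populate Table2, reusing Table1's Id so the mapping is known
INSERT INTO dbo.Table2 (Id, Col1, Col3, Col4)
SELECT Id, Col1, Col3, Col4
FROM   dbo.Table1;

-- step 2: populate Table3, joining back to Table2 on that shared Id
INSERT INTO dbo.Table3 (Table2_Id, Col5, Col6, Col7, Col8)
SELECT t2.Id, t1.Col5, t1.Col6, t1.Col7, t1.Col8
FROM   dbo.Table1 AS t1
INNER JOIN dbo.Table2 AS t2 ON t2.Id = t1.Id;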
I have 3 tables, all with the following schema:

Table <Category> { UniqueID, LastDate DateTime }

Assume the following tables with data, following the above schema:

Table Cat1 { 1, D1   2, D2   3, D3 }
Table Cat2 { 2, D4   3, D5   4, D6 }
Table Cat3 { 1, D7   3, D8   5, D9 }

I also have a Master table, and its schema is as follows:

Table Master {
UniqueId,
Cat1 DateTime, -- This is the same as the table name
Cat2 DateTime, -- This is the same as the table name
Cat3 DateTime -- This is the same as the table name
}

After inserting the data from all these 3 tables, I want my Master table to look like this: Table Master {
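The end of the example looks truncated, but assuming the goal is one Master row per UniqueId, with each category's LastDate landing in the column named after its table, here is a sketch (table and column names taken from the post; the UNION subquery simply collects every UniqueId that appears in any category table):

INSERT INTO Master (UniqueId, Cat1, Cat2, Cat3)
SELECT u.UniqueID,
       c1.LastDate, c2.LastDate, c3.LastDate
FROM ( SELECT UniqueID FROM Cat1
       UNION
       SELECT UniqueID FROM Cat2
       UNION
       SELECT UniqueID FROM Cat3 ) AS u
LEFT JOIN Cat1 AS c1 ON c1.UniqueID = u.UniqueID
LEFT JOIN Cat2 AS c2 ON c2.UniqueID = u.UniqueID
LEFT JOIN Cat3 AS c3 ON c3.UniqueID = u.UniqueID;

With the sample data above, UniqueId 1 would end up with D1, NULL, D7; UniqueId 2 with D2, D4, NULL; and so on.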
Is it possible to easily copy data from one table to another if the data types don't match? I know you can do INSERT INTO table1(col1,col2) SELECT (col2,col7) FROM table2 if the data types match, but is there a way to do this if they don't? I'm not trying to copy datetimes into bit fields or anything. I just have an old table that I built when I really didn't know what I was doing; now I at least think I have a better understanding of what data types to use, so I was wanting to move the data in the original table to my new one. Most of the fields in the old database are text datatypes and the new database uses nvarchar(50) data types. Thanks for any suggestions.
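A sketch of the copy with explicit conversions, assuming the old and new tables are dbo.OldTable and dbo.NewTable (placeholder names). text columns convert to nvarchar with an explicit CAST, but anything longer than 50 characters will be truncated, so it is worth checking lengths first:

INSERT INTO dbo.NewTable (Col1, Col2)
SELECT CAST(OldCol1 AS nvarchar(50)),
       CAST(OldCol2 AS nvarchar(50))
FROM   dbo.OldTable;

A quick check such as SELECT MAX(DATALENGTH(OldCol1)) FROM dbo.OldTable tells you whether 50 is wide enough for each column.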
From a source table (in SERVER1) I get ids of candidates and from another source (in SERVER2) I get their CVs (text files stored in various Folders). My destination table (in SERVER3) has two fields, CandidateId & CandidateCV.
I have to transfer the data in the above fashion for nearly 1 million records. How can I write a DTS package which picks up the text file from SERVER2 based on the CandidateId which comes from SERVER1? Probably I need some kind of looping mechanism which changes the candidate id and his CV file.
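Not a DTS answer, but if SERVER3 runs SQL Server 2005 or later, a plain T-SQL loop can do the same job. A sketch, assuming a linked server to SERVER1, that each CV lives at \\SERVER2\CVs\<CandidateId>.txt (the share layout and all object names here are made up), and that ad hoc BULK operations are permitted:

DECLARE @id int, @sql nvarchar(max);

DECLARE cand CURSOR LOCAL FAST_FORWARD FOR
    SELECT CandidateId FROM SERVER1.SourceDb.dbo.Candidates;

OPEN cand;
FETCH NEXT FROM cand INTO @id;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- read the candidate's CV file and insert it alongside the id
    SET @sql = N'INSERT INTO dbo.CandidateCVs (CandidateId, CandidateCV) '
             + N'SELECT ' + CAST(@id AS nvarchar(20)) + N', BulkColumn '
             + N'FROM OPENROWSET(BULK ''\\SERVER2\CVs\' + CAST(@id AS nvarchar(20)) + N'.txt'', SINGLE_CLOB) AS f;';
    EXEC (@sql);
    FETCH NEXT FROM cand INTO @id;
END
CLOSE cand;
DEALLOCATE cand;

For a million rows this row-by-row loop will be slow; in DTS the equivalent would be an ActiveX Script task looping over the ids, and in SSIS a Foreach Loop container with an Import Column transformation, but the basic shape (the id drives the file path, which drives the insert) is the same.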
I am attempting to create a stored procedure that will launch at report runtime to summarize data in a table into a table that will reflect period data using an array type field. I know how to execute one line but I am not sure how to run the script so that it not only summarizes the data below but also creates and drops the table.
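It is hard to be specific without the table layout, but here is a minimal sketch of the drop/recreate/summarize pattern inside a stored procedure, with entirely made-up table and column names (the CASE-per-period columns stand in for the "array type" layout):

CREATE PROCEDURE dbo.usp_BuildPeriodSummary
AS
BEGIN
    -- drop the old summary table if it exists
    IF OBJECT_ID('dbo.PeriodSummary', 'U') IS NOT NULL
        DROP TABLE dbo.PeriodSummary;

    -- recreate it from the detail data, one column per period
    SELECT AccountId,
           SUM(CASE WHEN PeriodNo = 1 THEN Amount ELSE 0 END) AS Period1,
           SUM(CASE WHEN PeriodNo = 2 THEN Amount ELSE 0 END) AS Period2,
           SUM(CASE WHEN PeriodNo = 3 THEN Amount ELSE 0 END) AS Period3
    INTO   dbo.PeriodSummary
    FROM   dbo.DetailData
    GROUP BY AccountId;
END

The report would then EXEC dbo.usp_BuildPeriodSummary at runtime before querying dbo.PeriodSummary.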
An SSIS package to transfer data from a DB instance on SQL Server 2005 to SQL Server 2000 is extremely slow. The package uses an OLEDB Source to OLEDB Destination for data transfer which is basically one table from sql server 2005 to sql server 2000. The job takes 5 minutes to transfer about 400 rows at night when there is very little activity on the server. During the day the job almost always times out.
On SQL Server 2000 instances, the job ran in minutes as the old 2000 DTS package.
Is there an alternative to this? The Transfer Objects task does not work, as there is apparently a defect according to Microsoft. Please let me know if there is any option other than using an Execute 2000 Package task or an ActiveX Script to read records from one source and insert them into the destination, which I am not certain how long it might take or how viable it would be.
Hi all, I copied the following code from the MSDN2 bcp Utility page: bcp AdventureWorks.Sales.Currency2 in Currency.dat -T -c and executed it in my C:\Program Files\Microsoft SQL Server\90\Tools\Binn> prompt. I got the following errors:
SQLState = 08001, NativeError = 2
Error = [Microsoft][SQL Native Client]Named Pipes Provider: Could not open a connection to SQL Server [2].
SQLState = HYT00, NativeError = 0
Error = [Microsoft][SQL Native Client]Login timeout expired
SQLState = 08001, NativeError = 2
Error = [Microsoft][SQL Native Client]An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections.
Please help and tell me what is wrong with this BCP execution or with the database, and how to make this BCP work.
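The errors are connection failures rather than bcp problems: without a -S switch, bcp tries to reach the default, unnamed instance on the local machine. If your SQL Server 2005 install is a named instance (SQLEXPRESS is the usual suspect, though that is an assumption), naming it explicitly often clears this up:

bcp AdventureWorks.Sales.Currency2 in Currency.dat -S .\SQLEXPRESS -T -c

Also check in the Surface Area Configuration tool that Named Pipes and/or TCP/IP connections are enabled, and remember that the MSDN example assumes the Sales.Currency2 table has already been created to load into.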
I am trying to make a copy of my live MSSQL 2005 Database to my local Developer Edition install.
To copy the whole database, I am using MS Database Publishing Wizard. I am able to copy the tables and data without problem, but after generating the sql file and running the query against the local db to copy everything, I get errors for the stored procedures.
The stored procedure's name is preceded by a login, for instance: EXEC dbo.sp_executesql @statement = N'CREATE PROCEDURE [matthews].[proc_allitemswithtaxologykeylist] ( @list varchar(7777) ) AS (sql code below)
I have created the user "matthews" on the local instance and given the account DBO role membership in the database. What else do I need to do?
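One likely culprit: in SQL Server 2005 the [matthews] prefix in CREATE PROCEDURE [matthews].[proc_allitemswithtaxologykeylist] refers to a schema, not directly to the login, so creating the user alone is not enough if that schema does not exist in the local database. A guess at the missing piece (verify against the actual error text):

CREATE SCHEMA matthews AUTHORIZATION matthews;

Alternatively, edit the generated script to create the procedures in the dbo schema if preserving the original ownership does not matter.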
Is there a better way to synch live databases with a local install, perhaps from within management studio itself? I am more experienced with enterprise manager, but 2005 won't let you connect with it.
When I transfer/copy tables from one database to another (on the same SQL Server), everything is fine. When I try to transfer/copy stored procedures, I get the error message: "Failed to copy objects from Microsoft SQL Server to Microsoft SQL Server."
I am using the DTS import/export wizard to do this.
I have been able to copy tables from one db to another, but am getting a "Failed to copy objects" error when trying to import stored procs. Can anyone please tell me what I'm doing wrong?
Obviously, in a select statement, I can SELECT * INTO #temptable. But can I do the equivalent from a stored procedure? I do not want to edit the stored procedure, just take the records it returns and put them into a temporary table. I do not know what records are returned by the stored procedure until it is executed. Is there a way I can do this? Thanks J
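Two common approaches, sketched below: if you can describe the result set up front, INSERT ... EXEC into a temp table works on any version; if you genuinely can't, the OPENROWSET trick lets SELECT ... INTO infer the columns for you (it needs ad hoc distributed queries enabled, and the database, procedure, and column names here are placeholders):

-- option 1: layout known in advance
CREATE TABLE #Results (Col1 int, Col2 varchar(50));
INSERT INTO #Results
EXEC dbo.MyProcedure;

-- option 2: layout not known in advance
SELECT *
INTO   #Results2
FROM   OPENROWSET('SQLOLEDB', 'SERVER=.;Trusted_Connection=yes',
                  'SET FMTONLY OFF EXEC MyDatabase.dbo.MyProcedure');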
I have several stored procedures, created in a development environment, that I need to move to a 'QA' environment, and then in turn to various production environments.

When I move these stored procedures, I would like to encrypt them, using the 'WITH ENCRYPTION' clause.

My question is, how do I copy these stored procedures from development to their target SQL Server environment in an encrypted state?

Up until now, we have been moving them by generating an SQL script and then executing that script on the target server. I have tried this using a script with 'WITH ENCRYPTION' specified within it, but it doesn't appear to work when I try and execute that script on the target server.

Any advice would be greatly appreciated. Nick.
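For what it's worth, scripting with WITH ENCRYPTION normally does work; the clause has to sit between the parameter list and AS, and it is the plain-text script you protect during transport (the server only stores the encrypted form once the script runs). A minimal sketch of the shape the generated script needs, with a made-up procedure name:

CREATE PROCEDURE dbo.usp_Example
    @SomeParam int
WITH ENCRYPTION
AS
BEGIN
    SELECT @SomeParam;
END

After executing this on the QA or production server, sp_helptext 'dbo.usp_Example' reports that the text is encrypted, which is a quick way to confirm it took effect.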
Here is the scenario, I have 2 stored procedures, SP1 and SP2
SP1 has the following code:
declare @tmp as varchar(300)
set @tmp = 'SELECT * FROM OPENROWSET ( ''SQLOLEDB'', ''SERVER=.;Trusted_Connection=yes'', ''SET FMTONLY OFF EXEC ' + db_name() + '..StoredProcedure'' )'
EXEC (@tmp)
SP2 has the following code:
SELECT * FROM SP1 (which won't work because SP1 is a stored procedure. A view, a table valued function, or a temporary table must be used for this)
Views - can't use a view because views don't allow dynamic SQL, and the db_name() in the OPENROWSET call must be used.
Temp Tables - can't use these because of the performance hit they would cause, given how frequently SP2 and others like it will be used.
Functions - my last resort is to use a table-valued function as shown:
CREATE FUNCTION MyFunction ( )
RETURNS @retTable TABLE ( Field1 int, Field2 varchar(50) )
AS
BEGIN
    -- the problem here is that I need to call SP1 and assign its resulting data into the
    -- @retTable variable

    -- this statement is incorrect, but its meaning is my goal
    INSERT @retTable
    SELECT * FROM SP1