From ASP.NET, how can I create a temporary table in SQL Server from a SELECT statement that is built when a button is clicked?
For example, if the user clicks a button, a table should be created based on the statement below:
Select * from Order where [order by] = 'Mike Lee'
I am inserting something into a temp table without creating it first, but this does not give any compilation error. Only when I execute the stored procedure do I get the error message that the temp table is invalid. Shouldn't this result in a compilation error rather than an error at execution time?
-- Create the procedure and insert into the temp table without creating it.
-- No compilation error.
CREATE PROC testTemp
AS
BEGIN
    INSERT INTO #tmp(dt)
    SELECT GETDATE()
END
Only when the proc is called does this give an execution error.
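A minimal sketch showing the behavior: deferred name resolution means the missing #tmp is only detected at run time, so creating the temp table first in the same session lets the same proc succeed (the column name dt is taken from the proc above).

-- Calling the proc without the temp table fails at run time:
-- EXEC testTemp;   -- Msg 208: Invalid object name '#tmp'

-- Creating the temp table first in the same session lets the call succeed.
CREATE TABLE #tmp (dt datetime);
EXEC testTemp;
SELECT * FROM #tmp;
DROP TABLE #tmp;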
I have a text file that needs to be loaded into a table (let's call it the DataFile table). For now I'm just doing a manual DTS import of the txt file into SQL Server to create the table, which works. But here's my problem...
I need to extract data from the DataFile table; here's my query:
select * from dbo.DataFile where DF_SC_Case_Nbr not like '0000%';
Then I need to create a new table for the extracted data; let's call it ExtractedDataFile. But I don't know how to create a new table and insert the data I selected above into the new one.
Also, can the extraction and the creation of the new table be done in just one stored procedure? Or is there any other way of doing all this (including the import of the text file)?
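A minimal sketch of one approach, using the table and column names above: SELECT ... INTO creates the new table and copies the filtered rows in a single statement, and it can live inside a stored procedure.

CREATE PROCEDURE dbo.usp_ExtractDataFile
AS
BEGIN
    -- SELECT ... INTO creates dbo.ExtractedDataFile with the same columns as dbo.DataFile
    SELECT *
    INTO dbo.ExtractedDataFile
    FROM dbo.DataFile
    WHERE DF_SC_Case_Nbr NOT LIKE '0000%';
END

The text-file import itself could stay in DTS (or be done with BULK INSERT), and the procedure then handles the extraction step.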
I would like to join two tables: one on a local server to which I have admin access, and another on a server to which I only have read access. The local table is very small, but the remote table is very large.
If I look at Query Analyzer's execution plan, it appears that the join will be done locally (i.e. the entire table is transferred from the remote server and then joined to my local table). Is there a way to create a temp table using linked servers, transfer my small local table to the remote server and then perform the join on the remote server? In the past, I have been able to use openquery to restrict the data to a small subset that is transferred but the local table is a little too large for that.
I appreciate any advice / guidance anyone can offer me!
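A sketch of one possible approach, assuming a linked server named REMOTESRV with RPC OUT enabled and a remote login that may create objects in tempdb (all other names are placeholders): a global temp table is created on the remote side, the small local table is pushed across, and the join runs remotely via OPENQUERY so only the final result comes back. Caveat: the ## table only lives as long as the remote connection that created it, so connection pooling can make this fragile; a permanent staging table on the remote server is more reliable if write access is ever granted.

-- 1. Create a global temp table on the remote server.
EXEC ('CREATE TABLE ##SmallLocal (ID int PRIMARY KEY, Val varchar(50));') AT REMOTESRV;

-- 2. Push the small local table across the link.
INSERT INTO REMOTESRV.tempdb.dbo.[##SmallLocal] (ID, Val)
SELECT ID, Val FROM dbo.SmallLocalTable;

-- 3. Run the join remotely so only the final result set comes back.
SELECT *
FROM OPENQUERY(REMOTESRV,
    'SELECT r.*
     FROM RemoteDb.dbo.BigRemoteTable AS r
     JOIN ##SmallLocal AS s ON s.ID = r.ID;');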
I need to pass some SQL to someone else who will run it on their database. I have the SQL for SQL Server 2000, but they are running SQL Server 7. Apparently the MSSQL 2000 script below doesn't work:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[tableName](
    [id] [int] IDENTITY(1,1) NOT NULL,
    [ArticleID] [int] NULL,
    [Heading] [text] COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [BodyContent] [text] COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    [WrittenDate] [datetime] NULL,
    CONSTRAINT [tableName] PRIMARY KEY CLUSTERED
    (
        [id] ASC
    )
)
What is the equivalent of the above for SQL Server 7? I don't have access to it via SQL Server Manager, so I have to run the script.
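A sketch of a SQL Server 7.0-friendly version, under two assumptions: the COLLATE clauses and the ASC keyword, which were introduced in SQL Server 2000, are simply removed (the columns then use the server's default collation), and the constraint is renamed to PK_tableName so it does not clash with the table's own object name.

SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[tableName](
    [id] [int] IDENTITY(1,1) NOT NULL,
    [ArticleID] [int] NULL,
    [Heading] [text] NULL,
    [BodyContent] [text] NULL,
    [WrittenDate] [datetime] NULL,
    CONSTRAINT [PK_tableName] PRIMARY KEY CLUSTERED ([id])
)
GO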
I'm trying to create a proc for granting permissions to a developer, but I have tried many times and still can't get it to work. Can someone help me? The original statement is:
I have a local SQL2005 server with a linked SQL2000 server. I would like to know how to create a temporary table on the remote server in such a way that I can do an inner join as follows; my idea is to optimize a distributed query this way:
create table #myRemoteTempTable
insert into #myRemoteTempTable select * from myLocalTable
update myRemoteTable
set Value = #myRemoteTempTable.Value
from myRemoteTable
inner join #myRemoteTempTable on #myRemoteTempTable.ID = myRemoteTable.ID
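A sketch of one way to get this effect, assuming a linked server named REMOTE2000 with RPC OUT enabled and permission to create a staging table in the remote database (all names are placeholders): a #temp table created locally cannot be addressed by name on the other server, so a real staging table on the remote side is used instead, and the UPDATE is executed remotely so the join never crosses the link.

-- 1. Create a staging table on the remote server.
EXEC ('CREATE TABLE dbo.Staging_myLocalTable (ID int PRIMARY KEY, Value int);')
    AT REMOTE2000;

-- 2. Copy the local rows across the link.
INSERT INTO REMOTE2000.RemoteDb.dbo.Staging_myLocalTable (ID, Value)
SELECT ID, Value FROM dbo.myLocalTable;

-- 3. Run the update entirely on the remote server, then clean up.
EXEC ('UPDATE r
       SET r.Value = s.Value
       FROM dbo.myRemoteTable AS r
       JOIN dbo.Staging_myLocalTable AS s ON s.ID = r.ID;

       DROP TABLE dbo.Staging_myLocalTable;') AT REMOTE2000;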
I used SQL Server 2012 Management Studio to create a new table on a SQL Server 2014 instance and got this message: 'This backend version is not supported to design database diagrams or tables'. Does this mean that I have to have SQL Server 2014 Management Studio to create a table on a SQL Server 2014 instance?
I was trying out the temporal table feature introduced in SQL Server 2016, but I get this error while trying to create the table: Cannot enable compression for object 'Department_History'. Only SQL Server Enterprise Edition supports compression.
I have installed SQL Server 2016 CTP 2.0 on my system: Microsoft SQL Server 2016 (CTP2.0) - 13.0.200.172 (Intel X86) May 21 2015 11:16:44 Copyright (c) Microsoft Corporation Express Edition on Windows NT 6.2 <X86> (Build 9200: )
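A minimal sketch of a possible workaround, assuming the Department table has the columns shown below: by default SQL Server creates the history table with PAGE compression, which this edition rejects, so the history table is created explicitly (uncompressed) and then referenced in SYSTEM_VERSIONING.

-- History table created by hand, no compression requested.
CREATE TABLE dbo.Department_History
(
    DeptID       int          NOT NULL,
    DeptName     varchar(50)  NOT NULL,
    SysStartTime datetime2    NOT NULL,
    SysEndTime   datetime2    NOT NULL
);

-- Temporal table pointing at the pre-created history table.
CREATE TABLE dbo.Department
(
    DeptID       int          NOT NULL PRIMARY KEY CLUSTERED,
    DeptName     varchar(50)  NOT NULL,
    SysStartTime datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
    SysEndTime   datetime2 GENERATED ALWAYS AS ROW END   NOT NULL,
    PERIOD FOR SYSTEM_TIME (SysStartTime, SysEndTime)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.Department_History));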
Can I dynamically (from a stored procedure) generate: a CREATE TABLE script of all tables in a given database (with defaults etc.); a CREATE VIEW script of all views; a CREATE FUNCTION script of all functions; a CREATE INDEX script of all indexes? (The result will be 4 scripts.) Arno de Jong, The Netherlands.
Hi, I want to create a table in an existing database in SQL, move data into the table, and after my process delete it. This should be done through code. Please help me. Thanks in advance, Viswa
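A minimal sketch of the three steps in T-SQL, using placeholder table and column names; the same statements can be sent from application code.

-- 1. Create the working table.
CREATE TABLE dbo.WorkTable (ID int, Amount money);

-- 2. Move the data in.
INSERT INTO dbo.WorkTable (ID, Amount)
SELECT ID, Amount FROM dbo.SourceTable;

-- 3. ... run the processing here ...

-- 4. Remove the table when done.
DROP TABLE dbo.WorkTable;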
I am developing an ASP.NET app and I have a long, very long list of data fields. The values of the data fields have to be saved to a SQL database. The list is so long that it is a pain to write all the code even to declare an object and all its properties by hand. So I made a list of all the field names and types and loaded it into an array. Then I wrote a bunch of macros to write all the properties one by one, with the correct type, for the object I needed to create, instead of doing it all by hand.
I want to do the same thing with the database. I don't want to hand-define all the fields one by one in the IDE. I want to write a simple macro: a loop that runs through my array and creates a field with the field name and type it says in the array. Simple enough. Now, I have the table defined already, since that was simple enough; it is the 200+ fields that I don't want to do. So, please, what is the code I need (in VB) to create one simple field in an existing table? Say the table is called "Table1" in the database "Database1", and the field I want is to be called "Field1" and to be a 250-character string. Thank you, Marc
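A minimal sketch of the statement such a loop would need to emit for each entry in the array, using the names from the question; the VB code would simply build one of these per field and execute it against the database (for example with SqlCommand.ExecuteNonQuery).

-- Adds one 250-character string field to the existing table.
USE Database1;
GO
ALTER TABLE dbo.Table1
    ADD Field1 varchar(250) NULL;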
Can we create a partition on an existing table? E.g.:
create table t ( col1 number(10,0), Col2 Varchar(10) );
After the table creation, can we alter the table to partition it?
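A sketch of one common way to partition an existing table in SQL Server, assuming an integer-like column to partition on (all names and boundary values below are placeholders): create a partition function and a partition scheme, then build the clustered index on the scheme, which physically moves the rows onto the partitions.

-- 1. Partition function: defines the boundary values.
CREATE PARTITION FUNCTION pfByYear (int)
AS RANGE RIGHT FOR VALUES (2021, 2022, 2023);

-- 2. Partition scheme: maps partitions to filegroups (all to PRIMARY here).
CREATE PARTITION SCHEME psByYear
AS PARTITION pfByYear ALL TO ([PRIMARY]);

-- 3. Create the clustered index on the scheme to partition the existing rows.
--    If a clustered index already exists, rebuild it WITH (DROP_EXISTING = ON) instead.
CREATE CLUSTERED INDEX CIX_t_col1
ON dbo.t (col1)
ON psByYear (col1);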
I'm using SQL Server 2008 R2. I am developing a database which requires access to data from other servers. So far I have been creating views using OPENQUERY (where there's a performance benefit) to select specifically the columns I want. Generally, for my purposes, I find these OPENQUERY-based views perform better (sometimes significantly so) than views in the simple SELECT <COLUMNS> FROM <SERVER>.<DATABASE>.<SCHEMA>.<TABLE> WHERE <where clause> format. My understanding is that this is because OPENQUERY "pushes" the query processing to the remote server and simply returns the final result set to the local server, i.e. there's no cross-server join/synchronization going on.
My question is, if I were to create a synonym for a table object on the remote server, where does the processing happen if I query through this synonym, or join this synonym to a table in my local database? Essentially, I am trying to understand whether there are any "hidden gotchas", primarily from a performance perspective, in using synonyms.
We have two databases with the same schema and tables (same table names; basically a main DB and a copy of the main DB). The following is an example of table names from the 2 DBs.
CREATE TABLE #SourceDatabase (SourceColumn1 VARCHAR(50))
INSERT INTO #SourceDatabase VALUES ('TABLE1'), ('TABLE2'), ('TABLE3'), ('TABLE4'), ('TABLE5'), ('TABLE6')
SELECT * FROM #SourceDatabase
DROP TABLE #SourceDatabase

CREATE TABLE #ArchiveDatabase (SourceColumn2 VARCHAR(50))
INSERT INTO #ArchiveDatabase VALUES ('TABLE1'), ('TABLE2'), ('TABLE3'), ('TABLE4'), ('TABLE5'), ('TABLE6')
SELECT * FROM #ArchiveDatabase
DROP TABLE #ArchiveDatabase
We need a T-SQL statement that can create one view for each table from both databases (assuming both databases have the same number of tables and the same table names), so that we can run the T-SQL on a third database and the third DB ends up with all the views (one view per table, covering both DBs). The name of each view should be the same as the table name, and all 3 DBs are on the same server.
The 2 temp tables are just examples; the DBs have around 1700 tables each. So we need something like the following for each table.
CREATE VIEW DBO.TABLE1 AS
    SELECT * FROM [SourceDatabase].[dbo].[TABLE1]
    UNION ALL
    SELECT * FROM [ArchiveDatabase].[dbo].[TABLE1]
GO
CREATE VIEW DBO.TABLE2 AS
    SELECT * FROM [SourceDatabase].[dbo].[TABLE2]
    UNION ALL
    SELECT * FROM [ArchiveDatabase].[dbo].[TABLE2]
GO
CREATE VIEW DBO.TABLE3 AS
    SELECT * FROM [SourceDatabase].[dbo].[TABLE3]
    UNION ALL
    SELECT * FROM [ArchiveDatabase].[dbo].[TABLE3]
GO
CREATE VIEW DBO.TABLE4 AS
    SELECT * FROM [SourceDatabase].[dbo].[TABLE4]
    UNION ALL
    SELECT * FROM [ArchiveDatabase].[dbo].[TABLE4]
GO
CREATE VIEW DBO.TABLE5 AS
    SELECT * FROM [SourceDatabase].[dbo].[TABLE5]
    UNION ALL
    SELECT * FROM [ArchiveDatabase].[dbo].[TABLE5]
GO
CREATE VIEW DBO.TABLE6 AS
    SELECT * FROM [SourceDatabase].[dbo].[TABLE6]
    UNION ALL
    SELECT * FROM [ArchiveDatabase].[dbo].[TABLE6]
GO
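A sketch of one way to generate those ~1700 statements instead of writing them by hand, assuming all three databases are on the same server and a view is only wanted where the table exists in both databases; the script only prints the generated statements so they can be reviewed and then run in the third database.

DECLARE @sql nvarchar(max) = N'';

SELECT @sql = @sql +
    N'CREATE VIEW dbo.' + QUOTENAME(s.name) + N' AS' + CHAR(13) +
    N'    SELECT * FROM [SourceDatabase].[dbo].' + QUOTENAME(s.name) + CHAR(13) +
    N'    UNION ALL' + CHAR(13) +
    N'    SELECT * FROM [ArchiveDatabase].[dbo].' + QUOTENAME(s.name) + CHAR(13) +
    N'GO' + CHAR(13)
FROM [SourceDatabase].sys.tables AS s
JOIN [ArchiveDatabase].sys.tables AS a
    ON a.name = s.name;

-- Review the generated script, then run it in the third database.
PRINT @sql;

Note that PRINT truncates long strings, so for the full set of tables the script would need to be selected or printed in chunks.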
I need to create a new table on a SQL Server 2005 Database, but the data needed for the table is kept in different databases on different servers.
Half the data is on the SQL 2005 database, so that shouldn't be too much of a problem, but the other half is on a SQL Express database on a different server. Is it possible to write a T-SQL query to retrieve this data from the other server?
Normally I use a CREATE TABLE + INSERT INTO kind of query to create custom tables, but I don't know how to connect to the SQL Express server/database using T-SQL (if it's possible at all) to do this.
Any tips, hints, ideas very welcome. Please use small words and short sentences.
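A minimal sketch of one way to do it, assuming sufficient rights on the SQL 2005 box and placeholder names for the Express instance, database, and tables: register the Express instance as a linked server, then reference it by its four-part name in an ordinary INSERT/SELECT.

-- 1. Register the remote SQL Express instance as a linked server (one-time setup).
EXEC sp_addlinkedserver
    @server = N'EXPRESSBOX\SQLEXPRESS',
    @srvproduct = N'SQL Server';

-- 2. Create the new table from the local half of the data...
SELECT col1, col2
INTO dbo.CombinedTable
FROM dbo.LocalHalf;

-- 3. ...and pull the other half across the link.
INSERT INTO dbo.CombinedTable (col1, col2)
SELECT col1, col2
FROM [EXPRESSBOX\SQLEXPRESS].RemoteDb.dbo.RemoteHalf;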
public partial class Trigger
{
    [Microsoft.SqlServer.Server.SqlTrigger(Name = "myPM10000", Target = "[dbo].[PM10000]", Event = "FOR UPDATE, INSERT")]
    public static void myPM10000()
    etc.
When I try to build the project it fails:
Error 1 SQL71561: Trigger: [dbo].[myPM10000] has an unresolved reference to object [dbo].[PM10000]. D:\Projects\VS2013\myproject_CLR\obj\Debug\myproject_CLR.generated.sql 8 32 myproject_CLR
I have tried with and without adding a database reference. With the reference, all I accomplish is lost time waiting for that monster link to the database to generate. There has to be a way to make this happen, right? I don't want to have to rewrite this in Transact-SQL.
I have some code that dynamically creates a database (the name is @FullName) and then creates a table within that database. Is it possible to wrap these things into a transaction such that if any one of the following fails, the database "creation" is rolled back? Otherwise, I would try deleting on error detection, but it could get messy.

IF @Error = 0
BEGIN
    SET @ExecString = 'CREATE DATABASE ' + @FullName
    EXEC sp_executesql @ExecString
    SET @Error = @@Error
END

IF @Error = 0
BEGIN
    SET @ExecString = 'CREATE TABLE ' + @FullName + '.[dbo].[Image] (
        [ID] [int] IDENTITY (1, 1) NOT NULL,
        [Blob] [image] NULL,
        [DateAdded] [datetime] NULL ) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]'
    EXEC sp_executesql @ExecString
    SET @Error = @@Error
END

IF @Error = 0
BEGIN
    SET @ExecString = 'ALTER TABLE ' + @FullName + '.[dbo].[Image] WITH NOCHECK
        ADD CONSTRAINT [PK_Image] PRIMARY KEY CLUSTERED ( [ID] ) ON [PRIMARY]'
    EXEC sp_executesql @ExecString
    SET @Error = @@Error
END
I'm trying to write a function that will retrieve the last backup date/time of a particular database on a remote server (i.e. by querying msdb.dbo.backupset). Unfortunately, you can't use sp_executesql in a function, so I can't figure out a way to pass the server name to the query and still be able to return the datetime value back to the calling T-SQL code (so that rules out using EXEC()).
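A sketch of a common workaround, assuming the remote server is set up as a linked server whose name is passed in as a parameter: do the dynamic part in a stored procedure (where sp_executesql is allowed) and hand the value back through an OUTPUT parameter instead of a function return value. All object names below are placeholders.

CREATE PROCEDURE dbo.usp_GetLastBackupDate
    @ServerName   sysname,
    @DatabaseName sysname,
    @LastBackup   datetime OUTPUT
AS
BEGIN
    DECLARE @sql nvarchar(max);

    -- Build a four-part query against the linked server's msdb.
    SET @sql = N'SELECT @LastBackup = MAX(backup_finish_date)
                 FROM ' + QUOTENAME(@ServerName) + N'.msdb.dbo.backupset
                 WHERE database_name = @DatabaseName;';

    EXEC sp_executesql @sql,
         N'@DatabaseName sysname, @LastBackup datetime OUTPUT',
         @DatabaseName = @DatabaseName,
         @LastBackup   = @LastBackup OUTPUT;
END

-- Usage: DECLARE @dt datetime; EXEC dbo.usp_GetLastBackupDate 'REMOTESRV', 'MyDb', @dt OUTPUT; SELECT @dt;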
I am running a script at the end of the day. What I need is for the rows in my temp table to get saved to a permanent table.
The name of the table should end with the current date.
Declare @tab varchar(100)
set @tab = 'MPOG_Research..ACRC_427_' + CONVERT(CHAR(10), GETDATE(), 112)
IF object_id(@tab) IS NOT NULL DROP TABLE '@tab';
Select * INTO @tab from #acrc427;
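One caveat with the snippet above: a table name cannot be supplied through a variable in DROP TABLE or SELECT ... INTO, so both statements have to be built as dynamic SQL. A minimal sketch, keeping the naming convention from the question:

DECLARE @tab sysname, @sql nvarchar(max);
SET @tab = 'ACRC_427_' + CONVERT(char(8), GETDATE(), 112);   -- e.g. ACRC_427_20240131

SET @sql = N'IF OBJECT_ID(''MPOG_Research..' + @tab + N''') IS NOT NULL
                 DROP TABLE MPOG_Research..' + @tab + N';
             SELECT * INTO MPOG_Research..' + @tab + N' FROM #acrc427;';

EXEC sp_executesql @sql;

The temp table #acrc427 created by the outer script is still visible inside the dynamic SQL because it runs in the same session.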
I am trying to create a table that would represent a workload for each shop. In order to do that I need a WorkLoad table and a ShopWorkLoad table, which is really just an aggregation of WorkLoad.
WorkLoad contains a list of the following items:
current orders that are in the process (one select statement)
scheduled orders (another select statement)
expected orders (a third select statement) that come through a third-party system
All of this needs to be live. So, for example, as soon as an order is added to the Order table it should be included in WorkLoad if certain conditions are met. The same goes for scheduled orders (which come from another table). Expected orders will be loaded on a daily basis (based on historical data).
The ShopWorkLoad table is an aggregation of the WorkLoad table.
Currently I did it this way:
Added an AFTER INSERT/UPDATE trigger on the Order table: when an order is created/updated, if it meets certain conditions it is inserted into WorkLoad; otherwise it is removed from WorkLoad if it is in there and no longer meets the conditions
Added an AFTER INSERT/UPDATE trigger on the Schedule table: when an order is scheduled, if it meets certain conditions it is inserted into WorkLoad; otherwise it is removed from WorkLoad if it is in there and no longer meets the conditions
Running a daily job that populates the WorkLoad table with expected orders based on historical values
The final step is to create an indexed view, vShopWorkLoad.
My biggest concern is the use of triggers that call fairly complex logic to determine whether an item should be added to the workload or not.
Another option was to create a vWorkLoad view and somehow make it an indexed view, but currently I don't see a way of doing that because the query consists of 4 UNION SELECT statements (a pseudo example is below). But even if I did it that way, how would I build an aggregated indexed view on top of the vWorkLoad indexed view?
A third option is to use a SQL Agent job that would run every x seconds (maybe 20) and execute all of these queries to populate the WorkLoad table with a delay of 10-20 seconds, but I am still not sure whether this is acceptable to the client.
A fourth option is to create 3 or 4 indexed views whose union makes up the workload. The ShopWorkLoad view would then be built on top of these 3 or 4 indexed views, but in this case I don't know how this would affect performance, since ShopWorkLoad would be queried often.
Example of workload pseudo query:
select WorkLoadType = 'Order in process', OrderId, ShopId, ... from Order
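For the aggregation step, a minimal sketch of what an indexed view over a physical WorkLoad table could look like (the column names are assumptions): indexed views require SCHEMABINDING and a COUNT_BIG(*) column when they use GROUP BY, and they cannot contain UNION, which is why the union has to happen before the data lands in WorkLoad.

CREATE VIEW dbo.vShopWorkLoad
WITH SCHEMABINDING
AS
SELECT ShopId,
       COUNT_BIG(*) AS OrderCount,
       SUM(EstimatedHours) AS TotalHours   -- assumed measure column; must be non-nullable for SUM in an indexed view
FROM dbo.WorkLoad
GROUP BY ShopId;
GO

-- The unique clustered index is what materializes the view.
CREATE UNIQUE CLUSTERED INDEX IX_vShopWorkLoad
ON dbo.vShopWorkLoad (ShopId);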
I have a table that has company id, attachment file name, and folderexists columns.
First, I need to create a series of folders or directories on a networked server, using the company id as the folder name, where the folder does not already exist.
Second, I need to move files to the proper folder based on attachment file name and company id.
For those who want to know, this is a remediation project because of a bug in our application.
The application is supposed to create the folder based on company id and then put the attachment in that folder.
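A sketch of one possible T-SQL-only approach, assuming xp_cmdshell is enabled, a table named dbo.CompanyAttachments with columns CompanyId, AttachmentFileName and FolderExists, and placeholder UNC paths; a cursor builds the MKDIR and MOVE commands row by row.

DECLARE @CompanyId varchar(20), @FileName varchar(260), @cmd varchar(1000);

DECLARE c CURSOR LOCAL FAST_FORWARD FOR
    SELECT CompanyId, AttachmentFileName
    FROM dbo.CompanyAttachments
    WHERE FolderExists = 0;

OPEN c;
FETCH NEXT FROM c INTO @CompanyId, @FileName;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- 1. Create the folder named after the company id (skipped if it already exists).
    SET @cmd = 'IF NOT EXIST "\\FileServer\Attachments\' + @CompanyId + '" '
             + 'MKDIR "\\FileServer\Attachments\' + @CompanyId + '"';
    EXEC master..xp_cmdshell @cmd, no_output;

    -- 2. Move the misplaced attachment into that folder.
    SET @cmd = 'MOVE "\\FileServer\Attachments\' + @FileName + '" '
             + '"\\FileServer\Attachments\' + @CompanyId + '\"';
    EXEC master..xp_cmdshell @cmd, no_output;

    -- 3. Record that the folder now exists.
    UPDATE dbo.CompanyAttachments
    SET FolderExists = 1
    WHERE CompanyId = @CompanyId AND AttachmentFileName = @FileName;

    FETCH NEXT FROM c INTO @CompanyId, @FileName;
END

CLOSE c;
DEALLOCATE c;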
I have around 100 flat files that need to be loaded into their respective staging tables. I want to create the table at run time based on the filename input: if the input filename is ABC then the table name should be Staging_ABC, and if the file name is XYZ then it should be Staging_XYZ. The table structure below needs to be created at run time.
CREATE TABLE Staging_'Filename'(
    [COL001] [varchar](4000) NULL,
    [Id] [int] IDENTITY(1,1) NOT NULL,
    [LoadDate] [datetime] NOT NULL default getdate()
)
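A minimal sketch of building that statement at run time with dynamic SQL, assuming the file name arrives in a variable or parameter (for example from an SSIS variable or a stored procedure parameter):

DECLARE @FileName sysname = N'ABC';      -- supplied at run time
DECLARE @sql nvarchar(max);

SET @sql = N'CREATE TABLE dbo.' + QUOTENAME(N'Staging_' + @FileName) + N' (
    [COL001]   [varchar](4000) NULL,
    [Id]       [int] IDENTITY(1,1) NOT NULL,
    [LoadDate] [datetime] NOT NULL DEFAULT GETDATE()
);';

EXEC sp_executesql @sql;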
I have created a fact table which has a unique clustered index, as below:

CREATE UNIQUE CLUSTERED INDEX [FactSales_SalesID]
ON [dbo].[FactSales] (salesid ASC)
WITH (DATA_COMPRESSION = PAGE)
GO

However, later when I add a clustered columnstore index:

CREATE CLUSTERED COLUMNSTORE INDEX CSI_FactSales
ON dbo.FactSales
WITH (DATA_COMPRESSION = COLUMNSTORE)
GO

it raises this error:
Msg 35372, Level 16, State 3, Line 167 You cannot create more than one clustered index on table 'dbo.FactSales'. Consider creating a new clustered index using 'with (drop_existing = on)' option.
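A table can have only one clustered index, whether rowstore or columnstore, so the columnstore index has to replace the existing clustered index rather than sit beside it. A minimal sketch, assuming the uniqueness of salesid does not have to be enforced by the clustered index itself (a clustered columnstore index cannot be unique; on SQL Server 2016 or later, a separate nonclustered unique index on salesid could enforce it instead):

-- Replace the existing clustered rowstore index with a clustered columnstore index.
-- DROP_EXISTING requires the name to match the existing clustered index.
CREATE CLUSTERED COLUMNSTORE INDEX [FactSales_SalesID]
ON dbo.FactSales
WITH (DROP_EXISTING = ON, DATA_COMPRESSION = COLUMNSTORE)
GO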