SQL Server 2014 :: View To Physical Table Options?
Jun 18, 2014
I have a view that selects a few columns from multiple tables with WHERE clauses and has 5 UNIONs covering different categories of data.
For the best performance I need to change this to a physical table, or use any other option that can improve performance.
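One option, sketched below with hypothetical object names, is to materialise the view's result into a real table with SELECT INTO and then index it; the trade-off is that the table has to be refreshed whenever the underlying data changes.

SELECT *
INTO dbo.CategoryData                -- new physical table (hypothetical name)
FROM dbo.vw_CategoryData;            -- the existing 5-union view (hypothetical name)

CREATE CLUSTERED INDEX IX_CategoryData_Id ON dbo.CategoryData (Id);   -- assumes an Id column

An indexed view is the other common route, but the UNIONs rule that out here, since indexed views cannot contain UNION.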
View 2 Replies
May 8, 2015
I have a performance issue with one of the views when I join the view with a temp table
I have 2 Views - View1 and View2.
There is a third view, view_UNION, defined as:
SELECT * FROM View1
UNION ALL
SELECT * FROM View2
If I have a query like -
Select view_UNION.* FROM
view_UNION INNER JOIN #TMP ON #TMP.ID = view_UNION.ID
the execution is too slow.
But if I execute the views separately, I get good performance.
How can I improve the performance of the query against view_UNION?
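One workaround that sometimes helps (just a sketch, not a guaranteed fix) is to push the temp-table join into each branch, so the optimizer can filter each view before the results are concatenated instead of materialising the whole union first:

SELECT v.*
FROM View1 AS v
INNER JOIN #TMP AS t ON t.ID = v.ID
UNION ALL
SELECT v.*
FROM View2 AS v
INNER JOIN #TMP AS t ON t.ID = v.ID;

For an inner join over a UNION ALL this returns the same rows as joining view_UNION, but gives the optimizer smaller units to work with.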
View 7 Replies
Aug 11, 2014
I am trying to replicate data from a view in the publisher to a table in the subscriber (transactional replication). I do not need the view's base table, or the view itself, replicated to the subscriber. I only want the data from the view to feed a table in the subscriber.
Is this possible?
Running SQL Server 2008 R2 Enterprise.
View 1 Replies
Jan 21, 2015
Can we add both physical nodes and VMs to a Windows cluster?
View 7 Replies
Dec 5, 2014
I've recently started working with a public sector organisation that has 4 clustered SQL instances, with 80% of the databases mirrored.
Looking at the transaction log, a transaction log backup seems like a good idea, as the log is 4x larger than the data file. But I'm not allowed access to the physical server to check which drive I could write the .trn files to - no RDP, no VMware; let's be honest, I'm not even allowed to launch a command line. The server manager also informs me: "We will need to carefully look at database backups if you guys want to start doing these backups on box, as that will break our off-box backup routine (it will screw the transaction chain)."
I don't understand how backing up the transaction log could break the "transaction chain"?
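For what it's worth, the "transaction chain" here is almost certainly the log backup chain: each regular log backup starts where the previous one ended, so taking ad-hoc log backups on the box would interleave with the off-box routine's sequence and leave gaps in its restore chain. If an ad-hoc copy of the log is needed without disturbing that sequence, a COPY_ONLY log backup is one option - a sketch with a hypothetical path and database name:

BACKUP LOG MirroredDb
TO DISK = N'D:\Backups\MirroredDb_log.trn'   -- hypothetical path
WITH COPY_ONLY;                              -- leaves the existing log backup sequence intact

Note that a copy-only log backup does not truncate the log, so it will not rein in the log growth; only regular log backups (coordinated with whoever owns the off-box routine) do that.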
View 9 Replies
Feb 2, 2015
I've been trying to get a definitive answer to this question, but so far I have only conflicting and patchy answers from other sources. I have an index that, let's say, requires 10GB of space to rebuild. This index resides on a filegroup that spans two files on two separate drives (i.e. an mdf and an ndf).
When I rebuild this index, how will each of these data files grow as the rebuild proceeds to completion? For the time being, let's set aside the caveat of any other activity hitting the example index/database. My tests seem to show that only the mdf grows (or the file with the lowest id in that filegroup), provided there is enough space available in that particular file to complete the operation. The secondary ndf data file doesn't grow at all if the mdf has enough space.
Is this expected behaviour? I.e. is the index rebuilt in a contiguous manner relative to the files contained within the filegroup - file id 1 grows until it hits its limit, then the next file id grows, and so on?
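Not an authoritative answer, but one way to observe the behaviour directly is to snapshot the size and used space of each file in the filegroup before, during and after the rebuild (run in the database in question):

SELECT name,
       physical_name,
       size * 8 / 1024                              AS size_mb,
       FILEPROPERTY(name, 'SpaceUsed') * 8 / 1024   AS used_mb
FROM sys.database_files
WHERE type_desc = 'ROWS';

Comparing used_mb per file over the course of the rebuild shows exactly which file the new index pages are being allocated in.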
View 0 Replies
Apr 21, 2015
I have a situation where I have Table A, Table B.
View C is created by joining table A and table B.
I have written an INSTEAD OF trigger D on view C.
I do not insert/update/delete on the view directly.
For every insert/update in table A or B, the values should appear in the view accordingly, and I expected this insert/update on the view to invoke the trigger.
But I am unable to see this trigger fire on the view when an insert/update occurs at the base table level.
The trigger only works if the operation is done directly on the view.
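For reference, that is expected behaviour: an INSTEAD OF trigger on a view fires only for DML issued against the view itself, never for changes made directly to the base tables. If the goal is to react to base-table changes, one sketch (the ChangeLog table and Id column below are hypothetical) is an AFTER trigger on each base table:

CREATE TRIGGER trg_TableA_Changes ON dbo.TableA
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- view C already reflects the new base-table values; put the logic
    -- that trigger D was meant to run here instead
    INSERT INTO dbo.ChangeLog (SourceTable, KeyValue, ChangedOn)
    SELECT 'TableA', i.Id, GETDATE()
    FROM inserted AS i;
END;

A matching trigger on Table B would cover the other half of the view.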
View 2 Replies
Aug 11, 2015
Is there a way to (automatically) remove/disable the leading statements such as SET ANSI_NULLS ON and SET QUOTED_IDENTIFIER ON that are generated when you script a stored procedure for modification via the SSMS 2014 interface?
--
SET ANSI_NULLS ON
--
SET QUOTED_IDENTIFIER ON
ALTER PROCEDURE [dbo].[sp_SendMail] @test INT = 0
AS
begin
--blabla
end
View 1 Replies
May 4, 2015
What will be the best way to write this code? I have the following code:
BEGIN
declare
status_sent as nvarchar(30)
SET NOCOUNT ON;
case status_sent =( select * from TableSignal
[Code] ...
Then, if the signal was sent, run the following query:
Select * from package.....
If the signal was not sent, don't do anything.
How can I evaluate the first query against the two options and, based on the result, run the next query?
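A minimal sketch of one way to express that, assuming TableSignal has a Status column holding something like 'Sent' / 'NotSent' (both the column and the values are assumptions):

IF EXISTS (SELECT 1 FROM dbo.TableSignal WHERE Status = 'Sent')
BEGIN
    SELECT *                 -- run the follow-up query only when the signal was sent
    FROM dbo.Package;
END;
-- if the signal was not sent, nothing else runs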
View 2 Replies
Apr 23, 2008
Hi... I was hoping someone could share some thoughts on the issue I am having at the moment.
Problem: When I run the package on my local machine and update a local SQL Server DB/table, new records are written to the table OK. BUT when I change my destination to write records into another physical SQL Server DB/table, no INSERT occurs. AND when I move/copy that same package onto the other server (i.e. the server that did not receive records earlier) and run it locally, IT WORKS fine there too.
What I am trying to do is very simple - add new records to a SQL Server table using SSIS. I only care about new rows, not even changed rows.
Here is my logic -
1. Create Ole DB source to RemoteSERVER - using SELECT stmt
2. I have a Lookup component that identifies NEW records - it redirects all rows that don't find a match to the error output.
3. Since I don't care about any rows that are matched by the lookup, I do nothing with them / trash them.
4. I send the error rows (the NEW rows) to an OLE DB destination.
RESULTS when I run the package locally and destination table is also local - WORKS FINE;
But when I run the package locally and the destination table is on another server (remote), no rows are written.
The package is run through BIDS manually, so there are no security restrictions attached to it.
I am not sure what I am missing, and I do not see any error in my package either. It is not failing.
Thanks in advance!
View 6 Replies
Mar 31, 2015
I have a table with two columns
id | filepath
--------------------------------------------------
1 | D:\Doc files\The Best\HHT.JPG
2 | D:\Doc files\The Best\sealed_pack.txt
3 | D:\Doc files\The Best\lsbom.JPG
4 | D:\Doc files\The Best\moc.png
5 | D:\Doc files\The Best\stock.txt
6 | D:\Doc files\The Best\depot.JPG
And on the physical file system there are more files than in the table.
D:\Doc files\The Best\HHT.JPG
D:\Doc files\The Best\sealed_pack.txt
D:\Doc files\The Best\JKSlsbom.JPG
D:\Doc files\The Best\moc.png
D:\Doc files\The Best\stock.txt
D:\Doc files\The Best\GDNdepot.JPG
D:\Doc files\The Best\CASA.JPG
D:\Doc files\The Best\SO.txt
D:\Doc files\The Best\BA.JPG
I want to compare the filepath column in the table with the files on the physical drive, and get the details of files that are in the table but not on disk, and vice versa...
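One sketch of doing the comparison entirely in T-SQL, assuming xp_cmdshell is enabled and the table is called dbo.FileList (both the table name and the enabled xp_cmdshell are assumptions):

IF OBJECT_ID('tempdb..#DiskFiles') IS NOT NULL DROP TABLE #DiskFiles;
CREATE TABLE #DiskFiles (filepath NVARCHAR(260));

INSERT INTO #DiskFiles (filepath)
EXEC master.dbo.xp_cmdshell 'dir /b /s "D:\Doc files\The Best"';

DELETE FROM #DiskFiles WHERE filepath IS NULL;   -- dir output ends with a NULL row

-- in the table but missing on disk
SELECT filepath FROM dbo.FileList
EXCEPT
SELECT filepath FROM #DiskFiles;

-- on disk but missing from the table
SELECT filepath FROM #DiskFiles
EXCEPT
SELECT filepath FROM dbo.FileList;

If xp_cmdshell can't be enabled, the same comparison works with a directory listing produced outside SQL Server (e.g. with PowerShell) loaded into a staging table.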
View 3 Replies
Mar 18, 2015
I have created row-level security on two views and added these two views to a particular role. Today I got a requirement that middle-level managers shouldn't see all of the columns. So I created another role for the middle-level managers, assigned those two views as securables with only selected columns granted, and mapped all the middle-level managers to this role. I thought my job was done. But these managers use the views from SSAS (tabular model) and Excel, and in those applications they are not able to load the data.
Later I came to know that we can't use select * from ViewA (in ViewA I have restricted a few columns at the role level). The workaround is creating another view and assigning it to the role. But how can we achieve column-level security to implement this in SSAS/SSRS/Excel?
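That matches the behaviour of column-level GRANTs: tools such as SSAS tabular and Excel effectively issue SELECT *, which fails as soon as any column of the view is not granted. A sketch of the wrapper-view workaround mentioned above (the column names and role name are hypothetical):

CREATE VIEW dbo.ViewA_Restricted
AS
SELECT CustomerId, Region, SalesAmount    -- only the columns the managers may see
FROM dbo.ViewA;
GO
GRANT SELECT ON dbo.ViewA_Restricted TO MiddleManagerRole;

Pointing the tabular model and the Excel connections at dbo.ViewA_Restricted instead of dbo.ViewA keeps the column restriction without relying on column-level permissions.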
View 6 Replies
May 26, 2015
How can I select data from a table and row counts from multiple tables in a single view? For example:
Select * from Settings -- it gets 1 row only
Select count(*) from NewApps where Status = 'False'
Select count(*) from myUsers where Status = 'Pending'
I just want to get them all in 1 view...
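A sketch of one way to do it, using scalar subqueries for the counts (the view name vw_Dashboard is hypothetical; the table and column names are taken from the example above):

CREATE VIEW dbo.vw_Dashboard
AS
SELECT s.*,
       (SELECT COUNT(*) FROM dbo.NewApps WHERE Status = 'False')   AS NewAppCount,
       (SELECT COUNT(*) FROM dbo.myUsers WHERE Status = 'Pending') AS PendingUserCount
FROM dbo.Settings AS s;

Because Settings returns a single row, each count simply appears as an extra column of that row.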
View 1 Replies
Aug 10, 2015
I need to enable a trace flag via
OPTION (QUERYTRACEON 9481)
in one of my views, but I am having trouble finding where to put it in my existing statement:
USE [pec_prod]
GO
/****** Object: View [dbo].[PEC_Claim_Export_All] Script Date: 8/10/2015 9:18:26 AM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER VIEW [dbo].[PEC_Claim_Export_All]
[Code] ....
Msg 156, Level 15, State 1, Procedure PEC_Claim_Export_All, Line 56
Incorrect syntax near the keyword 'OPTION'.
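As far as I know that error is expected: an OPTION clause is not allowed inside a view definition, so the hint has to go on the query that references the view rather than in the view body (and QUERYTRACEON normally requires sysadmin, or a plan guide wrapping the query). A sketch:

SELECT *
FROM dbo.PEC_Claim_Export_All
OPTION (QUERYTRACEON 9481);

If every consumer of the view should get the legacy cardinality estimator, lowering the database compatibility level or enabling trace flag 9481 instance-wide are alternatives to hinting per query.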
View 3 Replies
Oct 16, 2015
I was able to view the MDS entities through the web interface. But now, when I open the web interface, I am not able to view them from Internet Explorer or Mozilla Firefox. However, when I try it from a different laptop with my login, I am able to see the MDS entities. I tried reinstalling Microsoft Silverlight but am still facing the same issue. I also have all the required access for viewing the entities. Is there any browser setting I need to change so that I will be able to view the entities?
View 0 Replies
Nov 23, 2006
I recently migrated my database to SQL2005 from SQL7. It was simply a restore from a backup and was cake.
But now when I go to run a new maintenance plan I do not see my database.
I found that the database compatibility level is still set to SQL Server 7.0. If I change it to SQL Server 2005, I see the database.
What impact will this have on my data?
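Assuming the setting in question is the database compatibility level, it can be raised to the SQL Server 2005 level (90) with sp_dbcmptlevel (ALTER DATABASE ... SET COMPATIBILITY_LEVEL only arrived in SQL Server 2008). Changing it does not touch the data itself; it changes how certain T-SQL constructs are interpreted, so existing queries and procedures should be retested. A sketch with a hypothetical database name:

EXEC sp_dbcmptlevel 'MyDatabase', 90;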
View 5 Replies
Aug 5, 2015
I have a user who needs access to views like dbo.viewnameabc1, dbo.viewnameabc2 and so on (dbo.viewnameabc*), and any time a new view matching that pattern is created, the user should already have permission to select from it.
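There is no built-in wildcard grant on object names, but if the views can be grouped into their own schema, a schema-level grant covers both existing and future views in it. A sketch (the schema name abc and the principal ReportUser are hypothetical):

CREATE SCHEMA abc AUTHORIZATION dbo;
GO
GRANT SELECT ON SCHEMA::abc TO ReportUser;

-- if the same user also creates the views, they additionally need:
GRANT CREATE VIEW TO ReportUser;
GRANT ALTER ON SCHEMA::abc TO ReportUser;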
View 3 Replies
Oct 9, 2015
I am trying to use an indexed view to allow for aggregations to be generated more quickly in my test data warehouse. The Fact Table I am creating the indexed view on is a partitioned clustered columnstore index.
I have created a view with the following code:
ALTER view dbo.FactView
with schemabinding
as
select local_date_key, meter_key, unit_key, read_type_key, sum(isnull(read_value,0)) as [s_read_value], sum(isnull(cost,0)) as [s_cost]
, sum(isnull(easy_target_value,0)) as [s_easy_target_value], sum(isnull(hard_target_value,0)) as [s_hard_target_value]
, sum(isnull(read_value,0)) as [a_read_value], sum(isnull(temperature,0)) as [a_temp], sum(isnull(co2,0)) as [s_co2]
, sum(isnull(easy_target_co2,0)) as [s_easy_target_co2]
, sum(isnull(hard_target_co2,0)) as [s_hard_target_co2], sum(isnull(temp1,0)) as [a_temp1], sum(isnull(temp2,0)) as [a_temp2]
, sum(isnull(volume,0)) as [s_volume], count_big(*) as [freq]
from dbo.FactConsumptionPart
group by local_date_key, read_type_key, meter_key, unit_key
I then created an index on the view as follows:
create unique clustered index IDX_FV on factview (local_date_key, read_type_key, meter_key, unit_key)
I then followed this up by running some large calculations that required the aggregation functionality on the main fact table, grouping by the clustered index columns and only returning averages and sums that are available in the view - but the plan still uses the underlying table to perform the aggregations, rather than the view I have created. Running an equivalent query directly against the indexed view takes 75% less time than using the fact table. I thought the expected behaviour in SQL Server Enterprise or Developer edition (I am using Developer edition) was that the query against the fact table would be matched to the indexed view. What might I be missing, for the query not to use the indexed view?
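Even on Enterprise/Developer edition, automatic indexed-view matching is a cost-based decision and is not guaranteed, particularly against a partitioned clustered columnstore fact table where the columnstore plan may look cheaper to the optimizer. One thing worth trying is referencing the view directly with the NOEXPAND hint, which forces the view's clustered index to be used - a sketch using the view's own columns:

SELECT local_date_key, meter_key, SUM(s_read_value) AS total_read_value
FROM dbo.FactView WITH (NOEXPAND)
GROUP BY local_date_key, meter_key;

Comparing this plan with the fact-table plan should show whether the view is being ignored for costing reasons or because the outer query doesn't match the view definition exactly.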
View 1 Replies
Apr 24, 2015
Why am I getting the message "A valid table name is required for in, out, or format options"?
I used the syntax from a tutorial about the bcp utility. I am trying to create a format file for flat file import and export.
My server instance is "stat-hpsqlexpress"
The database name is "STATRLO"
Owner is "dbo"
Table name is "PM-allactivity-emaillog_042315"
The bcp command I am trying to run is:
bcp STATRLO.dbo.PM-allactivity-emaillog_042315 format nul -c -t, -f C:databaseActivity_c.fmt -S stat-hpsqlexpress - T
Microsoft Windows [Version 6.1.7601]
Copyright (c) 2009 Microsoft Corporation. All rights reserved.
SQL Server Version:
Microsoft SQL Server 2012 (SP1) - 11.0.3153.0 (X64)
Jul 22 2014 15:26:36
Copyright (c) Microsoft Corporation
Business Intelligence Edition (64-bit) on Windows NT 6.1 <X64> (Build 7601: Service Pack 1)
Yes I know the instance says sqlexpress...it was upgraded.
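One likely culprit (an assumption, not a certainty) is the hyphens in the table name, which bcp does not accept unquoted, hence "A valid table name is required". Quoting the three-part name with the table bracketed, together with the -q switch, usually gets around it - a sketch with a hypothetical format-file path:

bcp "STATRLO.dbo.[PM-allactivity-emaillog_042315]" format nul -c -t, -q -f C:\temp\Activity_c.fmt -S stat-hpsqlexpress -T

Note also that in the posted command there is a space in "- T"; the trusted-connection switch has to be written as -T with no space, otherwise bcp misreads the argument list.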
View 3 Replies
Jun 14, 2015
In the notification tab of job properties, I can see three options in the drop down list at the right hand side.
When the job succeeds
When the job fails
When the job completes
In which system table are these options saved? And is it possible to add a custom option to that dropdown list?
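The notification settings are stored per job in msdb.dbo.sysjobs, in the notify_level_email, notify_level_eventlog, notify_level_netsend and notify_level_page columns. A quick look:

SELECT name,
       notify_level_email,      -- 0 = never, 1 = on success, 2 = on failure, 3 = on completion
       notify_level_eventlog
FROM msdb.dbo.sysjobs;

As far as I know the dropdown itself is fixed by SSMS/SQL Agent, so a genuinely custom option can't be added there; custom behaviour usually goes into an extra job step or an alert instead.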
View 2 Replies
Jun 10, 2015
Here is my table:
My question is: how can I insert a row for each unique TemplateId? So let's say I have TemplateIds like 2, 5, 6, 7... for each unique TemplateId, how can I insert one more row?
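The intent isn't fully clear without the table, but assuming the goal is simply to add one extra row per distinct TemplateId to the same (hypothetical) dbo.Templates table, a rough sketch:

INSERT INTO dbo.Templates (TemplateId)
SELECT DISTINCT TemplateId
FROM dbo.Templates;

Any other columns the real table has would need to be supplied (or defaulted) in the same INSERT.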
View 0 Replies
Jun 21, 2015
Previously the same records existed in both the table with the primary key and the table with the foreign key. We have found that 7 records were lost from the primary key table, but the same records still exist in the foreign key table.
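A sketch for listing the orphaned rows, with hypothetical table and column names (ChildTable holds the foreign key, ParentTable the primary key):

SELECT c.*
FROM dbo.ChildTable AS c
LEFT JOIN dbo.ParentTable AS p ON p.Id = c.ParentId
WHERE p.Id IS NULL;

If rows like these exist while a foreign key constraint is in place, that suggests the constraint was disabled, created WITH NOCHECK, or bypassed when the rows were removed, which is worth investigating alongside recovering the 7 rows.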
View 3 Replies
Dec 23, 2013
We have two tables with names X and Y.
X has a,b columns. And Y has c,d columns.
I want to update b column in X table with the values from d column in Y table on condition X.a=Y.c.
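A sketch of the UPDATE-from-join form of that statement, using the names given above:

UPDATE x
SET x.b = y.d
FROM dbo.X AS x
INNER JOIN dbo.Y AS y ON x.a = y.c;

If several Y rows can match one X row on a = c, the value that ends up in b is not deterministic, so it's worth confirming the join is one-to-one first.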
View 3 Replies
Jul 29, 2014
table1
id value
1 11
2 12
3 13
4 14
table2
id1 value1
1 21
2 22
1 31
2 32
I need output as follows:
id value id1 value1
1 11 1 21
2 12 2 22
3 13 null null
4 14 null null
1 11 1 31
2 12 2 32
3 13 null null
4 14 null null
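Reading the desired output, each repetition of an id1 in table2 forms a separate "batch" (21/22, then 31/32), with table1 left-joined to every batch. A sketch of one way to produce that, using ROW_NUMBER to number the batches (the ORDER BY inside it is an assumption about which occurrence comes first):

;WITH t2 AS
(
    SELECT id1, value1,
           ROW_NUMBER() OVER (PARTITION BY id1 ORDER BY value1) AS batch
    FROM table2
)
SELECT t1.id, t1.value, t2.id1, t2.value1
FROM (SELECT DISTINCT batch FROM t2) AS b
CROSS JOIN table1 AS t1
LEFT JOIN t2 ON t2.id1 = t1.id AND t2.batch = b.batch
ORDER BY b.batch, t1.id;

With the sample data this returns the eight rows shown above, in the same order.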
View 9 Replies
Jan 26, 2012
If I have a table of 1,000,000 rows, how do I find out what size this table is on disk?
And how do I find out the size of all tables on disc?
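For a single table, sp_spaceused reports reserved, data and index sizes; for every table, a query over sys.dm_db_partition_stats gives the same picture. A sketch (the table name is hypothetical):

EXEC sp_spaceused N'dbo.MyTable';

SELECT s.name AS schema_name,
       t.name AS table_name,
       SUM(ps.reserved_page_count) * 8 / 1024 AS reserved_mb
FROM sys.dm_db_partition_stats AS ps
JOIN sys.tables  AS t ON t.object_id = ps.object_id
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
GROUP BY s.name, t.name
ORDER BY reserved_mb DESC;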
View 2 Replies
Mar 22, 2007
Hello,
I wonder how and if this can be achieved:
In a tabular report I have one or more columns that need to be repeated on every new physical page when printed.
Viewing the report in the ReportViewer control allows such columns to be fixed using the "FixedHeader" switch, letting the user conveniently scroll the report's content while always keeping the fixed columns in sight. This is perfect. However, when switching to the Print Layout view or when printing the report, I would like these fixed columns to be printed at the start of the table on every new page that is generated.
E.g. I have a report with a huge number of columns that need to be shown. When printed, the columns need at least 6 pages' width. It would be very convenient if I could repeat e.g. the first column (containing some identifying information) on each of these 6 pages. It wouldn't hurt if e.g. 7 pages were generated because of the repeated column(s).
Any help is appreciated, thanks a lot!
Frank
View 7 Replies
Aug 3, 2015
I need to output a sproc into a new physical table, so the column definitions match the output.
Select Into DbName.NewTableName
Followed by an
Insert Into DbName.NewTableName
From (SprocNameHere),
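SELECT ... INTO can't take a stored procedure as its source directly, and INSERT ... EXEC needs the table to exist first. One pattern that creates the table from the proc's result set in a single step is SELECT INTO over OPENROWSET - a sketch, assuming 'Ad Hoc Distributed Queries' is enabled and the proc returns one result set (server, database and proc names are placeholders taken from the question):

SELECT *
INTO dbo.NewTableName
FROM OPENROWSET('SQLNCLI',
                'Server=(local);Trusted_Connection=yes;',
                'EXEC DbName.dbo.SprocNameHere') AS src;

On SQL Server 2012 and later, sys.dm_exec_describe_first_result_set is another way to get the column definitions and script a CREATE TABLE, after which a plain INSERT ... EXEC fills it.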
View 9 Replies
Aug 14, 2007
Hi all,
This follows on from a query I had a few days back (and for which I was promptly flamed! However, I've got skin like a rhinoceros, so here goes...)
I have a table - ProjectSite - that pulls information from two tables (Project and Site). This table contains data regarding which sites are part of which project.
I now want a means of reporting dates against this. The problem is that each project has bespoke milestone dates, so I can't just create columns in ProjectSite. The only solution I can see is to pull each project (and there are quite a few) and its corresponding sites into a new table, and then I can create my bespoke columns.
Does this sound like the best viable option, or can anyone suggest another means of doing this?
TIA,
SamuelT
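One alternative worth considering, sketched below with hypothetical column types, is to store the milestones as rows rather than as bespoke columns, so each project can have any number of differently named milestones without new tables or schema changes:

CREATE TABLE dbo.ProjectSiteMilestone
(
    ProjectId     INT           NOT NULL,
    SiteId        INT           NOT NULL,
    MilestoneName NVARCHAR(100) NOT NULL,
    MilestoneDate DATETIME      NULL,
    CONSTRAINT PK_ProjectSiteMilestone
        PRIMARY KEY (ProjectId, SiteId, MilestoneName)
);

Reports can then pivot the milestone names into columns per project where a columnar layout is needed.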
View 1 Replies
Jan 12, 2015
I have a scenario where I need to add 4 more IDs after each existing ID; below is an example:
ID  Workload  Units
1   EXO       3
7   SPO       4
15  LYO       10
Desired output should be as follows:
ID  Workload  Units
1   EXO       3
2
3
4
5
7   SPO       4
8
9
10
11
15  LYO       10
16
17
18
19
I am not worried about other attributes in the same table.
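A sketch of one way to generate those filler rows, using a CROSS APPLY over the four offsets (the table name dbo.WorkloadTable is hypothetical; the other attributes are left NULL, as stated above):

SELECT t.ID, t.Workload, t.Units
FROM dbo.WorkloadTable AS t
UNION ALL
SELECT t.ID + n.offset, NULL, NULL
FROM dbo.WorkloadTable AS t
CROSS APPLY (VALUES (1), (2), (3), (4)) AS n(offset)
ORDER BY ID;

With IDs 1, 7 and 15 this produces 1-5, 7-11 and 15-19, matching the desired output; wrapping the SELECT in an INSERT would persist the extra rows instead of just returning them.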
View 6 Replies
Apr 10, 2015
I have one environment where we get a number of changes every day. It takes time to take a backup of the database and then apply the script, so I am thinking of automating it as follows:
1. SQL Script (Hot-fix) will put in Shared folder accessible by SQL server.
2. I am writing a SP which will have parameters. Database name, SQL script, HF number.
The SP will first take a COPY_ONLY, compressed backup of the database (all DBs are 2008 R2). It should then record the SQL hotfix script in a table as a single row and apply that script to the database using sp_executesql. The part I'm stuck on is getting the SQL script into that row.
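A sketch of one way to pull the whole script file into an NVARCHAR(MAX) value that can be stored in a row and executed (the UNC path and logging table are hypothetical; the file must be readable by the SQL Server service account, and SINGLE_NCLOB expects a Unicode file - use SINGLE_CLOB with VARCHAR(MAX) for ANSI files):

DECLARE @sql NVARCHAR(MAX);

SELECT @sql = BulkColumn
FROM OPENROWSET(BULK N'\\FileServer\Hotfixes\HF_1234.sql', SINGLE_NCLOB) AS src;

INSERT INTO dbo.HotfixLog (HFNumber, ScriptText, AppliedOn)   -- hypothetical logging table
VALUES ('HF_1234', @sql, GETDATE());

EXEC sys.sp_executesql @sql;

One caveat: sp_executesql runs the text as a single batch, so any GO separators in the hotfix script would have to be split out first.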
View 2 Replies
Jul 21, 2015
We have a database on a 2005 box, which we need to keep in sync with one on a 2014 box (until we can turn off the one on 2005). The 2005 database is still being updated with changes that must be applied to the 2014 database, given the nature of the data (medical documents) we need to ensure updates are applied to the 2014 database in very near real time (these changes are - for example - statuses, not the documents themselves).
Cunning plan #1, ugly - I'm not at all a fan of triggers - but use an AFTER UPDATE trigger to run an sp on the remote box via a linked server in this format, with a SQL Server login for the linked server that has permission to EXEC the remote proc.
CREATE TRIGGER [dbo].[SourceUpdate] ON [dbo].[SourceTable]
AFTER UPDATE
AS
SET XACT_ABORT ON;
SET NOCOUNT ON;
IF UPDATE(ColumnName)
[Code] ....
However, while the sp can be run against the linked server as a standalone query OK, when running it in a trigger it's throwing
OLE DB provider "SQLNCLI" for linked server "WIBBLE" returned message "The transaction manager has disabled its support for remote/network transactions.".
Msg 7391, Level 16, State 2, Procedure TheAfterUpdateTrigger, Line 19
The operation could not be performed because OLE DB provider "SQLNCLI" for linked server "WIBBLE" was unable to begin a distributed transaction.
Is it actually possible to call a proc on a remote box via a trigger, and if so, what additional hoops need to be jumped through (like I said, it runs OK when called via SSMS)?
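One way to sidestep the distributed transaction entirely (just a sketch - the queue table and its columns are hypothetical, while the trigger, table and column names come from the snippet above) is to have the trigger write only to a local queue table, and let a SQL Agent job or Service Broker push those rows to the 2014 box outside the trigger's transaction:

CREATE TABLE dbo.SourceUpdateQueue
(
    QueueId   INT IDENTITY(1,1) PRIMARY KEY,
    SourceId  INT      NOT NULL,
    QueuedOn  DATETIME NOT NULL DEFAULT GETDATE(),
    Processed BIT      NOT NULL DEFAULT 0
);
GO
ALTER TRIGGER [dbo].[SourceUpdate] ON [dbo].[SourceTable]
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    IF UPDATE(ColumnName)
        INSERT INTO dbo.SourceUpdateQueue (SourceId)
        SELECT i.ID              -- i.ID assumed to be the key column
        FROM inserted AS i;
END;

If both instances were 2008 or later, setting the linked server option 'remote proc transaction promotion' to false would be another route, but since the trigger lives on the 2005 instance that option isn't available here.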
View 3 Replies
Jun 19, 2014
I found loads of things but nothing seems to work...
I'm trying to get the XML data from a link (the XML is inside the page) into a table, but I can't find anything that works.
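SQL Server has no built-in way to fetch a URL directly (short of OLE Automation or CLR), so the usual pattern is to download the XML to a file first (e.g. with PowerShell or SSIS) and then shred it with the xml data type methods. A rough sketch - the file path, target table and element names are all assumptions about what the feed looks like:

DECLARE @x XML;

SELECT @x = CAST(BulkColumn AS XML)
FROM OPENROWSET(BULK N'C:\temp\feed.xml', SINGLE_BLOB) AS src;

INSERT INTO dbo.XmlItems (Title, Link)
SELECT n.value('(title/text())[1]', 'NVARCHAR(200)'),
       n.value('(link/text())[1]',  'NVARCHAR(400)')
FROM @x.nodes('/rss/channel/item') AS t(n);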
View 9 Replies
Jan 8, 2015
I am trying to bcp a log file into a table. Fields are "tab" separated and rows are "crlf" separated.
The rows of the file are uneven, in the sense, one row might have 35 tabs and the next one might have 10.
The table i am trying to insert to has 40 columns (maximum no.of columns i calculated from the log file).
I tried both these commands
BCP.EXE TEST.dbo.LOG IN E:\syslog.TXT -c -e error.txt -T -S
BCP.EXE TEST.dbo.LOG IN E:\syslog.TXT -c -t -r -e error.txt -T -S
(I also tried using a format file from table)
When I run this, the table is loaded with data, but not in the intended way. This is what I see in the table:
If the 1st line in the text file has 35 columns and the row ends there, the 1st row in the table has correct data up to the 35th column, but instead of moving to the next table row for the next line in the file, bcp carries on filling the remaining 5 columns of the table row before it starts a new row. I think it's not honouring the row delimiter.
How do I force it to move to the next line?
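That matches how bcp handles terminators with -c: it keeps consuming input until it has seen the expected number of field terminators for a row, so ragged lines spill over into the next row's columns. One workaround to sketch (the staging table name and path are assumptions) is to load each raw line into a single-column staging table and split it on tabs in T-SQL:

CREATE TABLE dbo.LogStaging (RawLine NVARCHAR(MAX));

BULK INSERT dbo.LogStaging
FROM 'E:\syslog.txt'
WITH (ROWTERMINATOR = '\r\n',
      FIELDTERMINATOR = '\0');   -- a terminator that never occurs, so the whole line lands in one column

From the staging table the tab-separated values in each line can be split out with a splitter function and inserted into the 40-column table. The alternative is a format file whose last field is terminated by \r\n, but that still assumes a fixed number of fields per line.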
View 4 Replies