I am trying to send a CSV file with 15000 records via Database Mail in SQL Server 2014. The problem is that when I open my email, the CSV only contains 209 records. I have tried the same thing in SQL Server 2012 and it works as expected - it sends the 15000 records in the CSV.
I have tested this on several SQL Servers with the 2014 edition on them, and I have the same issue on all of them. The query breaks off at a different point on each server - for example, one of them breaks off at 209 records as I said above, another one at 307. The last record always gets truncated at the same place. The CSV attachment size is about 64 KB, which is well below the 4 MB limit I've configured in the Database Mail Maximum File Size (Bytes) parameter.
What I am basically doing is creating a job that executes a stored procedure and sends the results as a CSV attachment in an email. The stored procedure is something like:
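(The actual procedure and send step aren't shown here; for context, the send side is a plain sp_send_dbmail call with the query results attached as a file - a sketch with made-up profile, recipient and procedure names:)

EXEC msdb.dbo.sp_send_dbmail
     @profile_name                = N'MailProfile',
     @recipients                  = N'someone@example.com',
     @subject                     = N'Daily export',
     @query                       = N'EXEC dbo.usp_ExportRecords;',
     @attach_query_result_as_file = 1,
     @query_attachment_filename   = N'export.csv',
     @query_result_separator      = N',',
     @query_result_no_padding     = 1,
     @query_result_width          = 32767;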
I'm trying to do an unattended upgrade of 2014 RTM to 2014 SP1.
It's my first attempt at an upgrade configuration file, and it's failing with a missing registry entry for the Database Engine service and the Replication service.
Error in summary.txt is:
The registry key SOFTWARE\Microsoft\Microsoft SQL Server\MSSQL12.MSSQLSERVER2495\Setup is missing
That's a valid error, as the registry only has an entry for:
I have configured Windows Failover Clustering 2012 on 4 of my test nodes.
I am trying to add another node into this cluster, but it's not happening. I am not even able to start the cluster service in services.msc.
After installing Windows Failover Clustering, when I go to the C:\Windows\Cluster folder, I am unable to find the CLUSDB, CLUSDB.1.container, CLUSDB.2.container and CLUSDB.blf files in the folder.
These files are very much present on the other nodes where cluster service is running.
I tried copying these files manually to the server where they are missing, but still no luck.
I'm learning how to use audits in SQL Server 2014, but when I opened the Security folder there was no Audits folder inside. I did a bit of research and found out that auditing is a feature in all editions of SQL Server from 2012 onwards, so I should have the Audits folder in order to create a new audit. There don't seem to be many posts like this online, because the only posts I found about missing Audits folders were about SQL Server 2008 (which doesn't include auditing in Standard Edition).
My question is: how can I insert a row for each unique TemplateId? So let's say I have TemplateIds like 2, 5, 6, 7... For each unique TemplateId, how can I insert one more row?
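For illustration, a minimal sketch of the kind of statement I mean (the table and column names here are made up):

INSERT INTO dbo.TemplateRows (TemplateId, RowLabel)
SELECT DISTINCT t.TemplateId, 'extra row'
FROM dbo.TemplateRows AS t
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.TemplateRows AS x
                  WHERE x.TemplateId = t.TemplateId
                    AND x.RowLabel = 'extra row');   -- guard so re-running doesn't duplicate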
Previously the same records existed in both the table with the primary key and the table with the foreign key. We have found that 7 records were lost from the primary key table, but the same records still exist in the foreign key table.
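For illustration, the usual diagnostic for this situation is a query that lists child rows whose parent is missing (hypothetical table and column names):

SELECT c.*
FROM dbo.ChildTable AS c            -- the table holding the foreign key
LEFT JOIN dbo.ParentTable AS p      -- the table holding the primary key
       ON p.Id = c.ParentId
WHERE p.Id IS NULL;                 -- the parent row no longer exists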
I am using SSIS 2014 and installed the adapter for the SharePoint List source and destination, but when I refresh the toolbox I don't see them. Is there a way to manually add them?
I also have a RESOURCES table of phrases (for translation purposes) similar to this:
res_id         res_lang   res_phrase
AccessDenied   en         Access Denied
For some rows in the resources table I do not have all language codes present, so I am missing some translations for a given res_id. My question is: what query can I use to determine the RESOURCE.RES_IDs for which I do not have a translation?
For example, I might have a de, en, cz translation for a phrase but not a pl translation, and I need to identify those rows so that I can obtain translations for the missing RESOURCE rows.
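One common way to do this (shown only as a sketch - the list of required languages is an assumption) is to build every (res_id, language) pair and keep the pairs with no matching row:

;WITH Langs AS (
    SELECT res_lang FROM (VALUES ('en'), ('de'), ('cz'), ('pl')) AS l(res_lang)
), Ids AS (
    SELECT DISTINCT res_id FROM dbo.RESOURCES
)
SELECT i.res_id, l.res_lang AS missing_lang
FROM Ids AS i
CROSS JOIN Langs AS l
WHERE NOT EXISTS (SELECT 1
                  FROM dbo.RESOURCES AS r
                  WHERE r.res_id = i.res_id
                    AND r.res_lang = l.res_lang);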
I have a query that uses the following fields from MSmerge_history:
MSmerge_history.start_time, MSmerge_history.runstatus and MSmerge_history.duration
Below is the query that I am using:
SELECT
MSmerge_agents.subscriber_name AS SubscriberName,
MSmerge_history.start_time AS SyncTime,
MSmerge_history.runstatus AS SyncStatusID,
MSmerge_history.comments AS Comments,
MSmerge_history.duration AS Duration
FROM distribution.dbo.MSmerge_agents MSmerge_agents INNER JOIN distribution.dbo.MSmerge_history MSmerge_history
ON MSmerge_agents.id = MSmerge_history.agent_id
WHERE MSmerge_history.runstatus IN (2, 6) AND publisher_db = DB_NAME()
AND MSmerge_agents.subscriber_name + CONVERT(nvarchar, MSmerge_history.start_time) NOT IN
(SELECT SubscriberName + CONVERT(nvarchar, SyncTime) FROM SyncActivities)
My query runs fine under SQL Server 2000, but when I run it in SQL Server 2005 it doesn't work any more. Looking at the MSmerge_history table under SQL Server 2005, these fields have been removed. Does anyone know where I can access those fields? Are they in another table?
I have one environment where we get a number of changes every day. It takes time to take a backup of the database and then apply the script, so I am thinking of automating it as follows:
1. The SQL script (hot-fix) will be put in a shared folder accessible by the SQL Server.
2. I am writing a stored procedure which will take parameters: database name, SQL script, HF number.
The SP will first take a COPY_ONLY, compressed backup of the database (all DBs are 2008 R2). It will read the SQL HF script, stored in one row of a table, and apply that script to the database using sp_executesql. I am not able to get that SQL script into the row.
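A rough sketch of what I have in mind (the dbo.HotfixScripts table and its columns are made up; the point is the COPY_ONLY backup followed by running the stored script text):

CREATE PROCEDURE dbo.ApplyHotfix
    @DbName   sysname,
    @HFNumber int
AS
BEGIN
    -- COPY_ONLY, compressed backup first
    DECLARE @BackupFile nvarchar(260) =
        N'\\SharedFolder\Backups\' + @DbName + N'_HF'
        + CAST(@HFNumber AS nvarchar(10)) + N'.bak';

    BACKUP DATABASE @DbName
        TO DISK = @BackupFile
        WITH COPY_ONLY, COMPRESSION, INIT;

    -- read the hot-fix script (stored in a single row) and run it in the target database
    DECLARE @Script nvarchar(max);
    SELECT @Script = ScriptText
    FROM dbo.HotfixScripts
    WHERE HFNumber = @HFNumber;

    -- note: sp_executesql cannot run scripts that contain GO batch separators
    DECLARE @Sql nvarchar(max) =
        N'USE ' + QUOTENAME(@DbName) + N'; EXEC sp_executesql @Script;';
    EXEC sp_executesql @Sql, N'@Script nvarchar(max)', @Script = @Script;
END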
We have a database on a 2005 box which we need to keep in sync with one on a 2014 box (until we can turn off the one on 2005). The 2005 database is still being updated with changes that must be applied to the 2014 database. Given the nature of the data (medical documents), we need to ensure updates are applied to the 2014 database in very near real time (these changes are - for example - statuses, not the documents themselves).
Cunning plan #1, ugly - I'm not at all a fan of triggers - is to use an AFTER UPDATE trigger to run an SP on the remote box via a linked server in this format, with a SQL Server login for the linked server that has permissions to EXEC the remote proc.
CREATE TRIGGER [dbo].[SourceUpdate] ON [dbo].[SourceTable]
AFTER UPDATE
AS
SET XACT_ABORT ON;
SET NOCOUNT ON;
IF UPDATE(ColumnName)
[Code] ....
However, while the SP runs OK against the linked server as a standalone query, when it runs in the trigger it throws:
OLE DB provider "SQLNCLI" for linked server "WIBBLE" returned message "The transaction manager has disabled its support for remote/network transactions.".
Msg 7391, Level 16, State 2, Procedure TheAfterUpdateTrigger, Line 19
The operation could not be performed because OLE DB provider "SQLNCLI" for linked server "WIBBLE" was unable to begin a distributed transaction.
Is it actually possible to call a proc on a remote box via a trigger, and if so, what additional hoops need to be jumped through? (Like I said, it runs OK when called via SSMS.)
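For reference, this is roughly the shape of what the trigger does (every name here is a placeholder, not the real code):

CREATE TRIGGER dbo.SourceUpdate_Sketch ON dbo.SourceTable
AFTER UPDATE
AS
BEGIN
    SET XACT_ABORT ON;
    SET NOCOUNT ON;
    IF UPDATE(ColumnName)
    BEGIN
        DECLARE @Id int;
        SELECT TOP (1) @Id = Id FROM inserted;   -- single-row assumption, for the sketch only
        -- four-part name through the linked server; inside a trigger this call runs
        -- in the firing transaction, which is why it tries to become a distributed one
        EXEC WIBBLE.RemoteDb.dbo.UpdateStatus @Id = @Id;
    END
END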
When I run this, the table is loaded with data, but not in the intended way. This is what I get in the table:
If the 1st line in the text file has 35 columns and the row ends after it, then in the table the 1st row has correct info up to the 35th column, but instead of going to the next row for the next line in the file, it continues to fill the next 5 columns in the table before it goes to the next row. I think it's not picking up the row delimiter.
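The load statement itself isn't shown above, but if it is something like BULK INSERT, the relevant setting is the row terminator - a sketch with a made-up path and table:

BULK INSERT dbo.TargetTable
FROM 'C:\Data\input.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n'    -- or '0x0a' if the file ends lines with a bare line feed
);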
I have a SELECT statement joining table1 to table2. Table1 may have 0, 1, or many corresponding rows in table2. I need to count the corresponding rows in table2. Table2 also has a Boolean column, and I need to count the number of rows where it is true. So I need the total number of matching rows and the count of those that are set to true. This is an example of what I have so far. I had to add each selected column to the GROUP BY to make it work, but I do not know why. Is there some other way this should be set up?
SELECT c.CarId, c.CarName, c.CarColor, COUNT(t.TrailerId) AS trailerCount, (add count of boolean, say where t.TrailerFull is true)
FROM Car c
LEFT JOIN Trailer t ON t.CarId = c.CarId
GROUP BY c.CarId, c.CarName, c.CarColor
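For reference, the usual shape for this kind of query (assuming TrailerFull is a bit column): the GROUP BY has to list every selected column that isn't aggregated, which is why adding them made it work, and the conditional count can be done with SUM over a CASE:

SELECT c.CarId, c.CarName, c.CarColor,
       COUNT(t.TrailerId)                                  AS trailerCount,
       SUM(CASE WHEN t.TrailerFull = 1 THEN 1 ELSE 0 END)  AS fullTrailerCount
FROM Car c
LEFT JOIN Trailer t ON t.CarId = c.CarId
GROUP BY c.CarId, c.CarName, c.CarColor;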
I was trying to create a table using the temporal table feature introduced in SQL Server 2016, but I am getting this error while trying to create it: "Cannot enable compression for object 'Department_History'. Only SQL Server Enterprise Edition supports compression."
I have installed SQL Server 2016 CTP 2.0 on my system: Microsoft SQL Server 2016 (CTP2.0) - 13.0.200.172 (Intel X86) May 21 2015 11:16:44 Copyright (c) Microsoft Corporation Express Edition on Windows NT 6.2 <X86> (Build 9200: )
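For reference, a minimal sketch of the temporal-table syntax being attempted (column names are made up); the error text suggests the auto-created history table is getting compression, which Express Edition doesn't support:

CREATE TABLE dbo.Department
(
    DeptId       int IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,
    DeptName     nvarchar(50) NOT NULL,
    SysStartTime datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
    SysEndTime   datetime2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (SysStartTime, SysEndTime)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.Department_History));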
I have a table with some rows and columns, and what I want is to show the sum of a particular column in the last row. This is my code:
SELECT DISTINCT Cluster.ClusterName, Gruppe.GruppeName, Arbeitspaket.ArbeitspaketName, BMWProjekt, AnzahlAP, Abgerechnet, InBearbeitung, Billanz
FROM Bestellung
INNER JOIN Cluster ON Bestellung.Cluster = Cluster.rowid
INNER JOIN Arbeitspaket ON Bestellung.Arbeitspaket = Arbeitspaket.rowid
INNER JOIN Gruppe ON Bestellung.Gruppe = Gruppe.rowid
WHERE Projekt ="EA-284-Nxx" AND AnzahlAP <> 0 AND Abgerechnet is 1 AND InBearbeitung is NULL AND Billanz is NULL;
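For what it's worth, in T-SQL the usual way to get a sum row appended at the end is GROUP BY ... WITH ROLLUP (or a UNION ALL with a SUM). A simplified sketch that pretends the joined result above is a single source:

SELECT ISNULL(ArbeitspaketName, N'Total') AS ArbeitspaketName,
       SUM(AnzahlAP)                      AS AnzahlAP
FROM (
    -- the joined query from above would go here
    SELECT Arbeitspaket.ArbeitspaketName, Bestellung.AnzahlAP
    FROM Bestellung
    INNER JOIN Arbeitspaket ON Bestellung.Arbeitspaket = Arbeitspaket.rowid
) AS src
GROUP BY ROLLUP (ArbeitspaketName);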
Recently, I partitioned one of my largest tables into multiple monthly filegroups. The current month's data is attached to my "Active" table. The older records are kept in the "Historical" table. I need an efficient way to pull records when I have a date range that can span both tables.
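One common approach for this (a sketch, assuming both tables share the same schema and have a date column such as RecordDate): a UNION ALL view lets one date-range query cover both tables, and CHECK constraints on the date column let the optimizer skip the table that can't contain the requested range:

CREATE VIEW dbo.AllRecords
AS
SELECT * FROM dbo.ActiveRecords
UNION ALL
SELECT * FROM dbo.HistoricalRecords;
GO
-- a single range query can then span both tables
SELECT *
FROM dbo.AllRecords
WHERE RecordDate >= '20150601' AND RecordDate < '20150801';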
I have a stored procedure that attempts to perform a WHERE NOT EXISTS check to insert new records. If the table is empty, the procedure loads the table. However, an insert does not occur when a change to one or more source fields occurs against an existing record. The following is my code:
I expected that when one of the source values of any field in the second WHERE clause changes, the procedure would insert a new record. Why is this not happening? One other note: I am not 'allowed' to use MERGE.
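For reference, the general shape of the pattern I'm describing (no MERGE), with made-up table and column names - the NOT EXISTS compares every tracked column, so a row is inserted when the key is new or any of those values differs:

INSERT INTO dbo.Target (BusinessKey, Col1, Col2)
SELECT s.BusinessKey, s.Col1, s.Col2
FROM dbo.Source AS s
WHERE NOT EXISTS (
    SELECT 1
    FROM dbo.Target AS t
    WHERE t.BusinessKey = s.BusinessKey
      AND ISNULL(t.Col1, '') = ISNULL(s.Col1, '')   -- ISNULL so NULLs compare as equal
      AND ISNULL(t.Col2, '') = ISNULL(s.Col2, '')
);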
I have a T-SQL Statement Task that re-creates a table every time the package runs. It doesn't return an error, but it doesn't drop the table either; the data is appended every time. The code works fine in a SQL Server query window.
IF OBJECT_ID (N'M020_Vendor', N'U') IS NOT NULL
DROP TABLE M020_Vendor
GO
SET ANSI_NULLS ON
GO
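One thing worth noting: GO is a batch separator understood by client tools such as SSMS rather than a T-SQL statement, so it may not behave the same way inside the task. A single-batch version of the drop would look like this (sketch):

IF OBJECT_ID(N'M020_Vendor', N'U') IS NOT NULL
    DROP TABLE M020_Vendor;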
I save table sizes and record counts every day and check them from time to time, using:
... insert into @t exec sp_msforeachtable 'exec sp_spaceused ''?''' ...
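(For reference, the fuller shape of that capture, with an assumed column layout for the table variable:)

DECLARE @t TABLE (
    [name]       sysname,
    [rows]       varchar(50),
    [reserved]   varchar(50),
    [data]       varchar(50),
    [index_size] varchar(50),
    [unused]     varchar(50)
);

INSERT INTO @t
    EXEC sp_msforeachtable 'EXEC sp_spaceused ''?''';

SELECT [name], [rows], [reserved], [data]
FROM @t
ORDER BY [name];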
But today I saw a sudden increase in the size of one table: about 128 MB in a day (average growth for this table was 4 or 5 MB a day). This growth was for only 4222 records, while yesterday, for more records (about 7000), we had only 2 MB of growth!
This is the table's information now:
sp_spaceused 'Table1'
Result:
name     rows        reserved      data
Table1   1021319     460328 KB     283104 KB
I tried to guess the reason. I copied these new records to another table, but the result was even stranger: in the new table the size of these records was < 1 MB. I then copied all the records to another table; the size was 148 MB (while it is 283 MB in my real database).