SQL 2012 :: Is There Any Functional Impact By Altering Stored Procedure?
Mar 31, 2014
I have a stored procedure, and I have altered it by adding some logic. How can I test whether there is any functional impact from altering that stored procedure? How can I prove to the team that the modifications don't impact any existing functionality?
I have some columns of data in SQL Server that are of NVARCHAR(420) format but they are dates. The dates are in DD/MM/YY format. I want to be able to convert them to our accounting system format, which is YYYYMMDD. I know the format is strange, but it will make things easier in the long run if all of the dates are the same when working between the 2 different databases. Basically, I need to take a look at the year portion (with a SUBSTRING function maybe) to see if it is greater than 50 (there will not be any dates that are less than 1950) and if it is, concatenate 19 with it (ex. 65 = 1965). Then, concatenate the month and day from the rest to form the date we need in NUMERIC(8). So, a date of January 17, 2003 (currently in the format of 17/01/03) would become 20030117. In VB, the function I would write is something like the following:

/*
Dim sCurrentDate as String
Dim sMon as String
Dim sDay as String
Dim sYear as String
Dim sNewDate as String

sCurrentDate = "17/01/03"
sMon = Mid(sCurrentDate, 4, 2)
sDay = Mid(sCurrentDate, 1, 2)
sYear = Mid(sCurrentDate, 7, 2)

If sYear < 50 Then
    sYear = "20" & sYear
ElseIf sYear > 50 Then
    sYear = "19" & sYear
End If

sNewDate = sYear & sMon & sDay
*/

I was thinking of doing this in a stored procedure but am really rusty with SQL (it's been since college). The datatype would end up being NUMERIC(8). How I would write it, if I knew how to write it, would be: grab the column name prior to the procedure, create a temp column, format the values, place them into the temp column, delete the old column, and then rename the temp column to the name of the column that I grabbed at the beginning of the procedure. Most likely this is the only way to do it, but I have no idea how to go about it.
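A minimal T-SQL sketch of the conversion itself, assuming a hypothetical table dbo.MyTable with an NVARCHAR column DateText holding the DD/MM/YY strings and a NUMERIC(8) column DateNum to receive the result (the names are only illustrative):

-- Sketch: convert 'DD/MM/YY' text to a NUMERIC(8) value in YYYYMMDD form.
-- Years 50-99 are treated as 19xx, everything else as 20xx, per the post.
UPDATE dbo.MyTable
SET DateNum = CONVERT(NUMERIC(8),
        CASE WHEN SUBSTRING(DateText, 7, 2) >= '50'
             THEN '19' ELSE '20' END
        + SUBSTRING(DateText, 7, 2)    -- year
        + SUBSTRING(DateText, 4, 2)    -- month
        + SUBSTRING(DateText, 1, 2));  -- day

For '17/01/03' this yields 20030117, so the round trip of add-temp-column / drop / rename may not even be needed if the NUMERIC(8) column can simply be added alongside the text column.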
We are using SQL Server 2005 to develop a simple SP. We started by including an output parameter which would report back the identity of the record being inserted or updated. We have since been trying to drop and recreate the SP without the output parameter, or to alter the SP with the same outcome in mind. Neither has succeeded, as confirmed by inspecting the sys.objects and sys.parameters tables. What might we be missing? We are using the Developer Edition, which may or may not be adequate to the task. Or maybe other versions of SQL Server are more robust and would let us succeed? Please advise. Thank you.
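A minimal sketch of the drop-and-recreate approach, assuming a hypothetical procedure dbo.usp_SaveRecord and table dbo.MyTable (names and body are only illustrative); after recreating, sys.parameters should show only the remaining parameter:

IF OBJECT_ID('dbo.usp_SaveRecord', 'P') IS NOT NULL
    DROP PROCEDURE dbo.usp_SaveRecord;
GO
CREATE PROCEDURE dbo.usp_SaveRecord
    @Name VARCHAR(50)            -- output parameter intentionally omitted
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.MyTable (Name) VALUES (@Name);
    SELECT SCOPE_IDENTITY() AS NewId;   -- return the identity as a result set instead
END
GO
SELECT name FROM sys.parameters WHERE object_id = OBJECT_ID('dbo.usp_SaveRecord');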
I have a stored procedure, and in it I will be calling another stored procedure. Based on a parameter value I will get the name of the stored procedure to be executed. How do I execute a dynamically named stored procedure from within a stored procedure?
at present it is like EXECUTE usp_print_list_full @ID, @TNumber, @ErrMsg OUTPUT
I want to do like EXECUTE @SpName @ID, @TNumber, @ErrMsg OUTPUT
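T-SQL does allow EXECUTE to take a variable holding the procedure name, so the desired form works directly. A minimal sketch, assuming the parameter types (the INT types and sample values are assumptions):

DECLARE @SpName sysname = N'usp_print_list_full';   -- chosen based on the parameter value
DECLARE @ID INT = 1, @TNumber INT = 100, @ErrMsg VARCHAR(200);

-- EXECUTE accepts a variable that holds the procedure name
EXECUTE @SpName @ID, @TNumber, @ErrMsg OUTPUT;

This assumes all candidate procedures share the same parameter signature; otherwise dynamic SQL via sp_executesql would be needed.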
What is the impact on users of dropping an index on a table while it is in use? I will recreate the index afterwards. The table is used constantly by three processes/users at all times.
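If the end goal is just to rebuild the index, one hedged alternative to drop-then-create is to recreate it in place, which avoids the window where no index exists (table and index names below are hypothetical; ONLINE = ON requires an edition that supports online index operations):

-- Recreate the index in one step instead of DROP INDEX + CREATE INDEX
CREATE NONCLUSTERED INDEX IX_MyTable_Col1
    ON dbo.MyTable (Col1)
    WITH (DROP_EXISTING = ON, ONLINE = ON);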
I am upgrading my application's SQL Server from 2008 R2 to 2012.
As discussed in the URL below, I am able to see the identity jump after I upgrade and the server is restarted.
Since I cannot afford this, and at the moment I do not have time to create a sequence with NOCACHE and test it again, I have to go ahead and add trace flag 272 as a startup parameter, as this is the only solution I can implement and even roll back without much hassle.
[URL] ....
From what I have learned, this flag will disable the new IDENTITY behaviour implemented in SQL Server 2012 and make it work the way it did in SQL Server 2008 R2.
But I want to know whether implementing this flag would impact any other feature or the overall performance of SQL Server (other than the performance of IDENTITY generation).
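For reference, a sketch of the two approaches being weighed: the trace flag is added server-wide as the -T272 startup parameter (via SQL Server Configuration Manager), while the per-object alternative is a sequence created with NO CACHE (the sequence and table names below are hypothetical):

-- Per-object alternative to the server-wide -T272 startup parameter:
-- a NO CACHE sequence does not pre-allocate values, so none are skipped
-- after an unexpected restart.
CREATE SEQUENCE dbo.MySeq
    AS INT
    START WITH 1
    INCREMENT BY 1
    NO CACHE;

-- Example use in place of an IDENTITY column default:
-- ALTER TABLE dbo.MyTable ADD CONSTRAINT DF_MyTable_Id
--     DEFAULT (NEXT VALUE FOR dbo.MySeq) FOR Id;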
I have been investigating the number of active/inactive connections to a certain database server, and I have stumbled across an application which seems not to be clearing its database connections. For one instance of a client there were more than 70 SQL connections, which resulted from closing and reopening a single screen in the culprit app. Once the application was closed, all of the connections were recycled, but it's evident that within the application itself it is not correctly reusing already existing open connections.
I have raised the point with the main programmer that we need to investigate further how the application is managing (or not managing) its ADO.NET connections to SQL.
I am starting by doing some reading here URL... and I was hoping to get some more information about the possible impact of excessive SQL connections on the SQL Server itself. Our organization is quite lucky in that our SQL Servers are over-specced given their workload; bearing that in mind, I would like to dig a bit deeper and get some stats if I can to highlight the scope of the issue to management and the programmers. Our SQL Server peaks at 6500 processes, and a good 70% of those are due to this application's mismanagement of its SQL connections.
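A sketch of a DMV query that can quantify the problem by counting current sessions per application and host (standard DMVs; grouping by program/host/login is just one way to slice it):

-- Count current sessions per application / host / login to show
-- which programs hold the most connections.
SELECT s.program_name,
       s.host_name,
       s.login_name,
       COUNT(*) AS session_count
FROM sys.dm_exec_sessions AS s
WHERE s.is_user_process = 1
GROUP BY s.program_name, s.host_name, s.login_name
ORDER BY session_count DESC;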
How do I alter all objects in a database? I want to find out whether there are any syntax errors in my database after restoring it from SQL 2008 to 2012. I could create them as tests and drop them, but I am trying to find a way to alter the procedures, views and functions in place.
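One hedged approach, rather than altering each object, is to re-parse every non-schema-bound module with sp_refreshsqlmodule and catch the ones that fail. A rough sketch (note that procedures benefit from deferred name resolution, so a missing table inside a procedure body may still slip through):

-- Re-parse every procedure, view, function and trigger in the database.
-- sp_refreshsqlmodule raises an error for any module that no longer compiles.
DECLARE @name NVARCHAR(517);
DECLARE c CURSOR LOCAL FAST_FORWARD FOR
    SELECT QUOTENAME(SCHEMA_NAME(o.schema_id)) + '.' + QUOTENAME(o.name)
    FROM sys.sql_modules AS m
    JOIN sys.objects AS o ON o.object_id = m.object_id;
OPEN c;
FETCH NEXT FROM c INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        EXEC sys.sp_refreshsqlmodule @name;
    END TRY
    BEGIN CATCH
        PRINT @name + ' -> ' + ERROR_MESSAGE();   -- module with a problem
    END CATCH;
    FETCH NEXT FROM c INTO @name;
END
CLOSE c;
DEALLOCATE c;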
On SQL 2012 (64bit) I have a CLR stored procedure that calls another, T-SQL stored procedure.
The CLR procedure passes a sizeable amount of data via a user-defined table type, i.e. a table-valued parameter. It passes about 12,000 rows with 3 columns each.
For some reason the call of the procedure is very slow. I mean just the call, not the procedure itself.
I changed the procedure to do nothing (return 1 in the first line).
So with all parameters set, going from

command.ExecuteNonQuery()

to

CREATE PROC usp_Proc1
    @myTable myTable READONLY
AS
BEGIN
    RETURN 1
END

takes 8 seconds. I measured all the other steps (creating the data table in CLR, creating the SQL parameter, adding it to the command, executing the stored procedure) and all of them work fine and very fast.
When I trace the procedure call in SQL Profiler, I get a line like this for each row of the data table (12,000 of them):
SP:StmtCompleted -- Encrypted Text.
As I said, it is not the procedure or the creation of the data table that takes so long; it really is only the passing of the data table to the procedure.
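For reference, a minimal T-SQL sketch of the table type and do-nothing procedure involved in the timing test; the names match the call above, but the three columns are an assumption, since the post only gives the row and column counts:

-- Hypothetical 3-column table type matching the description above
CREATE TYPE dbo.myTable AS TABLE
(
    Col1 INT,
    Col2 INT,
    Col3 NVARCHAR(50)
);
GO
CREATE PROC dbo.usp_Proc1
    @myTable dbo.myTable READONLY
AS
BEGIN
    RETURN 1;   -- do-nothing body used for the timing test
END
GO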
If you drop / rename a table and then recreate the table with the SAME NAME, what impact does it have on stored procedures that use these tables? From a system perspective, do you have to rebuild / recompile ALL the stored procedures that use this table?
I had a discussion with someone who said that this is a good idea, since the IDs of the tables change in sysobjects and, from a SQL Server query-plan perspective, this needs to be done...
Question 2
If you TRUNCATE a table as part of a BEGIN TRANSACTION, what happens if an error occurs? Will it roll back? The theory is that it won't, because TRUNCATE doesn't use the log, whereas DELETE FROM does use the SQL logs.
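A quick way to settle the TRUNCATE question is to try it: TRUNCATE TABLE is minimally logged (it logs the page deallocations rather than each row), but it is still fully transactional and does roll back. A small demonstration using a throwaway table:

-- Demonstration: TRUNCATE TABLE inside a transaction can be rolled back.
CREATE TABLE dbo.TruncateTest (Id INT);
INSERT INTO dbo.TruncateTest VALUES (1), (2), (3);

BEGIN TRANSACTION;
    TRUNCATE TABLE dbo.TruncateTest;
    SELECT COUNT(*) AS DuringTran FROM dbo.TruncateTest;   -- 0
ROLLBACK TRANSACTION;

SELECT COUNT(*) AS AfterRollback FROM dbo.TruncateTest;    -- 3: the rows are back
DROP TABLE dbo.TruncateTest;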
I was just looking at an SSIS package someone else set up, and I went into one of the Execute SQL tasks: it is calling a stored procedure to truncate a table. There are a lot of places where tables are truncated within this SSIS package, but in each one the 'database', 'owner', and 'table' names are hard-coded.
Like this: exec dbo.uspTruncateTable 'dbname', 'dbo', 'tblname'
Of course there are different names, just changed for an example.
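For what it's worth, a hedged sketch of what such a truncate helper might look like on the T-SQL side (the procedure name mirrors the call above, but the body is only an illustration; the hard-coded values in the SSIS task would then become package variables mapped to these parameters):

CREATE PROCEDURE dbo.uspTruncateTable
    @DbName sysname,
    @Owner  sysname,
    @Table  sysname
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @sql NVARCHAR(400);
    -- QUOTENAME guards against malformed or malicious object names
    SET @sql = N'TRUNCATE TABLE ' + QUOTENAME(@DbName) + N'.'
             + QUOTENAME(@Owner) + N'.' + QUOTENAME(@Table) + N';';
    EXEC sys.sp_executesql @sql;
END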
I have a database user dsrpReader that can execute stored procedures in one database; it's the only thing that this user can do. Works great except for the below stored procedure.
AS BEGIN -- SET NOCOUNT ON added to prevent extra result sets from
[Code] ....
If I run the above as an administrative user (windows login), I get N rows of information back (N > 0). If I run it as an unprivileged user (see beginning of post), I get 0 rows back and no error messages.
Adding 'with execute as owner' solves the issue, but I'm not sure of the implications. Am I opening up the database to attacks (or even the complete server)?
If so, how should I continue?
In an attempt to solve the issue I have given the user dsrpReader permissions on information_schema.columns, but without success. This was just a try; what I actually want is to set up a dedicated user with limited permissions that I can use in WITH EXECUTE AS 'limiteduser'.
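A hedged sketch of that dedicated-user approach (the user name limiteduser comes from the post; the procedure name and the specific grants are assumptions; in particular the guess here is that the 0-rows symptom is metadata visibility, since INFORMATION_SCHEMA views only show objects the caller has some permission on):

-- Dedicated execution-context user with no login of its own
CREATE USER limiteduser WITHOUT LOGIN;

-- Assumption: the context user needs to see object metadata to return rows
GRANT VIEW DEFINITION TO limiteduser;

-- Hypothetical procedure showing the EXECUTE AS clause in place
CREATE PROCEDURE dbo.usp_ListColumns
WITH EXECUTE AS 'limiteduser'
AS
BEGIN
    SET NOCOUNT ON;
    SELECT TABLE_NAME, COLUMN_NAME
    FROM INFORMATION_SCHEMA.COLUMNS;
END
GO
GRANT EXECUTE ON dbo.usp_ListColumns TO dsrpReader;

This keeps the blast radius much smaller than WITH EXECUTE AS OWNER, since the impersonated principal only holds the permissions you explicitly grant it.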
I am logging into a SQL instance to run the following query:
DECLARE @ReturnCode int;
EXECUTE @ReturnCode = [master].dbo.xp_create_subdir N'sharemasterFULL';
IF @ReturnCode <> 0
    RAISERROR('Error creating directory.', 16, 1);
The share in which the folder is to be created has my account added with full permissions to create files. However this command fails unless I add the SQL Service account user with rights to the folder also.
Is this expected behaviour? Is this something specific to extended stored procedures?
We need to create a PDF file from SQL Server, preferably from a stored procedure. The application will call the stored procedure and it should generate the PDF. From my research it appears this can be done using various external tools with licensing costs. But is it possible to do this within the SQL Server database without additional costs? I read that this can be done with SSRS in SQL Server, but I am not sure if it is a good solution and whether it involves additional licensing.
but in this case, I would want to pass the values from a couple of cells in the worksheet. Do I have to use ADO (so this isn't a SQL Server question at all?)
I've been tasked with creating a stored procedure which will be executed after a user has input one or more parameters into some search fields. So they could enter their 'order_reference' on its own or combine it with 'addressline1' and so on.
What would be the most proficient way of achieving this?
I had initially looked at using IF, TRY ie:
IF @SearchField= 'order_reference' BEGIN TRY select data from mytables END TRY
However I'm not sure this is the most efficient way to handle this.
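One common alternative to branching with IF per field is a single catch-all WHERE clause plus a recompile hint, so each combination of supplied values still gets a sensible plan. A hedged sketch, assuming hypothetical parameters and a hypothetical dbo.Orders table:

CREATE PROCEDURE dbo.usp_SearchOrders
    @order_reference VARCHAR(50)  = NULL,
    @addressline1    VARCHAR(100) = NULL
AS
BEGIN
    SET NOCOUNT ON;
    SELECT o.*
    FROM dbo.Orders AS o
    WHERE (@order_reference IS NULL OR o.order_reference = @order_reference)
      AND (@addressline1    IS NULL OR o.addressline1    = @addressline1)
    OPTION (RECOMPILE);   -- re-optimise per call so unused predicates are pruned
END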
To get the results of a stored proc into a table you always had to know the structure of the output and create the table upfront.
CREATE TABLE #Tmp ( [Key] INT NOT NULL, Data VARCHAR(100) );
INSERT INTO #Tmp EXEC dbo.MyProc
Does SQL 2012 (or SQL 2014) have any features / new tricks to let me discover the table structure of a stored procedure's output, i.e. treat it as a table?
EXEC dbo.MyProc INTO #NewTmp or SELECT * INTO #NewTmp FROM ( EXEC dbo.MyProc )
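SQL Server 2012 did add a way to discover a procedure's result shape up front: sp_describe_first_result_set and the DMV sys.dm_exec_describe_first_result_set. A hedged sketch (they only describe the first result set, and they cannot cope with procedures whose output shape depends on runtime branching or dynamic SQL):

-- Describe the columns of the first result set of dbo.MyProc
EXEC sp_describe_first_result_set
     @tsql = N'EXEC dbo.MyProc';

-- Or as a queryable DMV, e.g. to generate a matching CREATE TABLE statement
SELECT name, system_type_name, is_nullable
FROM sys.dm_exec_describe_first_result_set(N'EXEC dbo.MyProc', NULL, 0)
ORDER BY column_ordinal;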
We need to convert a list of SPs into SSIS packages. Most of the SPs do the steps below:
Mainly, our stored procedures compare the present date to a past date, compare the emp IDs between the files, do some joins, and update tables.
Is it possible to pass an entire WHERE clause to a stored procedure? From the app, I'll generate a WHERE clause like the one below:
where orderID = '123' and orderCidy='London'
I created a stored procedure like the one below, but got an error saying "An expression of non-boolean type specified in a context where a condition is expected, near 'END'".
CREATE PROCEDURE getorder @mywhere VARCHAR(100) AS BEGIN SET NOCOUNT ON; select * from order @mywhere END
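A variable cannot be spliced into a static statement like that; the usual (hedged) workaround is dynamic SQL via sp_executesql, with the caveat that concatenating a WHERE clause built in the app is an obvious SQL-injection surface, so the text must come from a trusted source:

CREATE PROCEDURE getorder
    @mywhere NVARCHAR(500)
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @sql NVARCHAR(1000);
    -- The WHERE text is appended as-is, so it must be trusted / validated
    SET @sql = N'SELECT * FROM dbo.[order] ' + @mywhere + N';';
    EXEC sys.sp_executesql @sql;
END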
BEGIN TRY
    DECLARE @param2 INT;
    BEGIN TRANSACTION;
    EXEC proc2 @param2;
    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;
END CATCH
I haven't had an opportunity to do this before. I have a nested stored procedure setup where both the inner and the outer procedure insert values into different tables. To maintain atomicity, I want to be able to roll back everything if an error occurs in either the inner or the outer stored procedure.
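A hedged sketch of one common pattern: let the outer procedure own the transaction, turn XACT_ABORT on so any error dooms the whole transaction, and have the inner procedure let errors bubble up rather than swallow them (procedure and table names are hypothetical apart from proc2, which appears above):

CREATE PROCEDURE dbo.proc2     -- inner: just does its insert and lets errors bubble up
    @param2 INT
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.InnerTable (Val) VALUES (@param2);
END
GO
CREATE PROCEDURE dbo.proc1     -- outer: owns the transaction
    @param1 INT, @param2 INT
AS
BEGIN
    SET NOCOUNT ON;
    SET XACT_ABORT ON;         -- any error dooms and rolls back the whole transaction
    BEGIN TRY
        BEGIN TRANSACTION;
        INSERT INTO dbo.OuterTable (Val) VALUES (@param1);
        EXEC dbo.proc2 @param2;
        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;   -- undoes both inserts
        THROW;                      -- surface the error to the caller (SQL 2012+)
    END CATCH;
END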
Is there any way to compare the stored procedures of two similar databases (A & B)? I need to find the stored procedures in database B that differ from those in database A.
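Short of a schema-compare tool, a hedged sketch that compares module definitions by procedure name, assuming both databases (named A and B here as placeholders) sit on the same instance:

-- Procedures that exist in both databases but whose definitions differ,
-- plus procedures that exist in only one of them.
SELECT COALESCE(a.name, b.name) AS proc_name,
       CASE
           WHEN a.name IS NULL THEN 'Only in B'
           WHEN b.name IS NULL THEN 'Only in A'
           ELSE 'Definition differs'
       END AS difference
FROM (SELECT p.name, m.definition
      FROM A.sys.procedures AS p
      JOIN A.sys.sql_modules AS m ON m.object_id = p.object_id) AS a
FULL OUTER JOIN
     (SELECT p.name, m.definition
      FROM B.sys.procedures AS p
      JOIN B.sys.sql_modules AS m ON m.object_id = p.object_id) AS b
  ON a.name = b.name
WHERE a.name IS NULL
   OR b.name IS NULL
   OR a.definition <> b.definition;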
I have a stored procedure that calls several views that rely on each other. In the past these views used to go parallel and use up all 100% of the CPU (12 cores), and now when the same stored procedure runs it only uses 8% of the CPU (1 core). This extends the time spent on the query from roughly 10-15 sec to 2-3min. I'm not quite sure why this is happening.
Are there some obvious things to look at when optimizing views to utilize all cores/threads? Also, it doesn't matter whether I set Cost Threshold for Parallelism to 1, 5 or 50; it is always the same, and I have Max Degree of Parallelism set to 0 as well, which should mean all cores are used when available.
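A hedged first step is to confirm what the instance-level settings actually are in effect and whether the cached plan records a reason for refusing parallelism (the NonParallelPlanReason attribute appears in the showplan XML from SQL 2012 onwards):

-- Current instance-level parallelism settings
SELECT name, value_in_use
FROM sys.configurations
WHERE name IN ('max degree of parallelism', 'cost threshold for parallelism');

-- Look for a recorded reason the optimizer refused a parallel plan
SELECT TOP (20)
       qs.total_worker_time, qp.query_plan
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
WHERE CAST(qp.query_plan AS NVARCHAR(MAX)) LIKE '%NonParallelPlanReason%'
ORDER BY qs.total_worker_time DESC;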
I have a table with the list of all TableNames in the database. I would like to query that table and find any tables used in any stored procedure in that DB.
Select * from dbo.MyTableList where Table_Name in ( Select Name From sys.procedures Where OBJECT_DEFINITION(object_id) LIKE '%MY_TABLE_NAME%' Order by name )
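Note that the IN subquery above compares table names to procedure names, so it will not return what is wanted (and ORDER BY is not allowed inside an IN subquery). A hedged corrected sketch that correlates each table name against the procedure definitions:

-- Tables from the list that appear in at least one procedure definition.
-- Note: plain LIKE matching can mis-hit when one table name is a substring
-- of another; sys.sql_expression_dependencies is a stricter alternative.
SELECT t.Table_Name
FROM dbo.MyTableList AS t
WHERE EXISTS
(
    SELECT 1
    FROM sys.procedures AS p
    WHERE OBJECT_DEFINITION(p.object_id) LIKE '%' + t.Table_Name + '%'
)
ORDER BY t.Table_Name;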
Any way to have a process run that will not write its changes to the transaction log? I have a process that runs every three hours and has a huge impact on the transaction log (it becomes larger than the database itself). We do hourly backups of the transaction log and normally it is reasonably sized but when this process runs, it gets HUGE.
The process takes source data, massages it and writes it to summary tables. It is not something we need to track as we can recreate the summary tables if needed and it has no impact on the source tables.
Everything is driven through a stored procedure. Is there a way to run a stored procedure and tell it that nothing it does should be written to the transaction log?
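There is no per-procedure switch to bypass the transaction log; every data modification is logged. The usual hedged compromises are to run the process under the BULK_LOGGED (or SIMPLE, if point-in-time restore is not required) recovery model and to favour minimally logged patterns such as SELECT ... INTO or INSERT ... WITH (TABLOCK) into the summary tables. A rough sketch with hypothetical database, table and backup names:

-- Assumption: the summary tables can be rebuilt from source, so minimal logging is acceptable.
ALTER DATABASE MyDb SET RECOVERY BULK_LOGGED;   -- point-in-time restore is limited for
                                                -- log backups that cover this window
-- Rebuild the summary with a minimally logged pattern
TRUNCATE TABLE dbo.SummaryTable;
INSERT INTO dbo.SummaryTable WITH (TABLOCK) (CustomerId, Total)
SELECT CustomerId, SUM(Amount)
FROM dbo.SourceTable
GROUP BY CustomerId;

ALTER DATABASE MyDb SET RECOVERY FULL;
-- Take a log backup afterwards to restore the point-in-time restore chain
BACKUP LOG MyDb TO DISK = N'C:\Backups\MyDb_log.trn';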