Random SQL Errors In Application During A High-load Almost 'batch' Process. Is This Expected?
Jan 4, 2008
Setup is a common VB.NET application with a SQL Server back end and 5-10 users. Occasionally, from another VB.NET app, I'll need to run a process to upload and insert a handful of records. The records are somewhat large, with 100+ fields, and each requires a handful of SELECT and UPDATE statements, making the server 'busy' for several minutes.
Independently, the main application and the upload application work just fine. Problems appear only when users are active in the main application while the 'upload' process is running. It's almost as if they 'bump' into each other and the server throws errors in either app, even on simple SQL statements. The same tables are being queried, but it doesn't feel like a concurrency issue.
Are these problems expected when running under a somewhat high load?
Any ideas on how to make these guys work together without them fighting?
Shouldn't SQL be able to handle the statements even during the barrage of update queries and requests?
What happens in situations with hundreds or even thousands of users accessing the database simultaneously?
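If it helps, one quick way to tell whether the two applications are actually blocking each other is to look for blocked sessions while the upload is running. A minimal diagnostic sketch (generic, nothing here is specific to your schema):

-- Any row returned means that SPID is waiting on the SPID listed in 'blocked'
SELECT spid, blocked, waittype, lastwaittype, hostname, program_name, cmd
FROM master..sysprocesses
WHERE blocked <> 0

If this returns rows during the upload, the errors are more likely lock timeouts or deadlocks than anything random.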
Dear list, I'm designing a package that uses Microsoft's preplog.exe to prepare web log files to be imported into SQL Server.
What I'm trying to do is convert this cmd that works into an Execute Process Task: D:\SSIS Process\Prepweblog\ProcessLoad>preplog ex.log > out.log. The above DOS cmd works 100%.
However, when I use the Execute Process Task I get this error: [Execute Process Task] Error: In Executing "D:\SSIS Process\Prepweblog\ProcessLoad\preplog.exe" "" at "D:\SSIS Process\Prepweblog\ProcessLoad", The process exit code was "-1" while the expected was "0".
There are two package variables: User::gsPreplogInput = ex.log and User::gsPreplogOutput = out.log.
How do I use the Execute Process Task? I am trying to unzip a file using the freeware PZUnzip.exe. I tried placing the entire command in a batch file and specifying the working directory as the location of the batch file, but the task fails with the error:
SSIS package "IngramWeeklyPOS.dtsx" starting.
Error: 0xC0029151 at Unzip download file, Execute Process Task: In Executing "C:\ETL\POSData\IngramWeekly\Unzip.bat" "" at "C:\ETL\POSData\IngramWeekly", The process exit code was "1" while the expected was "0".
Then I tried to specify the exe directly in the Executable property and the arguments as the location of the zip file and the directory to unzip the files into, but this time it fails with the following message:
SSIS package "IngramWeeklyPOS.dtsx" starting.
Error: 0xC002F304 at Unzip download file, Execute Process Task: An error occurred with the following error message: "%1 is not a valid Win32 application".
The command in the batch file, when run from the command line, works perfectly and unzips the file, so there is absolutely no problem with the command itself. I believe it is just the setup of the variables in the Execute Process Task editor under Process. Any input on resolving this will be much appreciated.
I am designing a utility which will keep two similar databases in sync. In other words, copying the new data from db1 to db2 and updating the old data from db1 to db2.
For this I am making use of the 'tablediff' utility, which, when provided with the server name, database, and table info, will generate a .sql file that can be used to keep the target table in sync with the source table.
I am using the Execute Process Task and the process parameters I am providing are:
The customer.bat file will have the following code: tablediff -sourceserver "LV-SQL5" -sourcedatabase "TC_CTI" -sourcetable "CUSTOMER_1" -destinationserver "LV-SQL2" -destinationdatabase "TC_CTI" -destinationtable "CUSTOMER" -f "c:\SQL_bat_Files\sql5\TC_CTI\sql_files\customer1"
The .sql file will be generated at: C:\SQL_bat_Files\sql5\TC_CTI\sql_files\customer1.
The problem: The Execute Process Task is working fine, i.e., the tables are being compared correctly and the .sql file is being generated as desired. But the task as such is reporting failure with the following error:
[Execute Process Task] Error: In Executing "C:SQL_bat_FilesSQL5TC_CTIpackage_occurrence.bat" "" at "C:Program Files (x86)Microsoft SQL Server90COM", The process exit code was "2" while the expected was "0". ]
Some of you may suggest just setting ForceExecutionResult = Success (in fact this is what I am doing now just to get the program working), but this is not what I desire.
Here I will describe my problem. 1. We are loading a large amount of data on a background thread which is started in the Application_Start event in the global.aspx.cs file. The data is later cached for subsequent requests to improve performance. 2. Now when we put the application on a web farm/garden, it is not able to load the application. 3. We are sending requests to the servers through a router kind of application. 4. This application is working fine in a single-server environment.
I originally wrote a post here regarding some info on setting up a cluster. Upon further analysis of the problem with our system, I noted that at particular times we have tremendous amounts of UPDATE, INSERT, DELETE, etc., transactions hitting our database.
I thought originally SQL Clustering could solve this problem, but the time and upkeep that will be required to maintain such a configuration might not be feasible and more importantly it may not even fix the problem.
Next week I plan on doing some more specific performance monitoring of the database during normal business activity, but my initial suspicion is that there is a tremendous amount of I/O processing due to the high transaction load, which is slowing down the application.
I was wondering what you have done to alleviate such problems? One of the solutions I have come up with is to create a master/slave SQL Server design where the slave handles most of the database transactions and then updates the master DB during a low-load period of the day. How does this sound? Any other ideas would be greatly appreciated...
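One way to firm up the I/O suspicion before redesigning anything: if the server is SQL Server 2005 or later, the file-stats DMV shows cumulative I/O stalls per database file. A hedged diagnostic sketch, not a full monitoring solution:

-- Cumulative reads/writes and I/O stall times per file since the last restart
SELECT DB_NAME(database_id) AS database_name, file_id,
       num_of_reads, num_of_writes,
       io_stall_read_ms, io_stall_write_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL)
ORDER BY io_stall_read_ms + io_stall_write_ms DESC

High stall times on the log file in particular would support the theory that the transaction load is I/O-bound.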
Can anyone tell me why I'm getting these first few errors? It seems to be preventing my SQL Agent from successfully firing off my SSIS package:
01/04/2006 09:53:48,,Warning,[396] An idle CPU condition has not been defined - OnIdle job schedules will have no effect
01/04/2006 09:53:48,,Warning,[260] Unable to start mail session (reason: No mail profile defined)
01/04/2006 09:53:48,,Information,[129] SQLSERVERAGENT starting under Windows NT service control
01/04/2006 09:53:48,,Error,[364] The Messenger service has not been started - NetSend notifications will not be sent
I have a problem running a batch file. The problem is that when I run the batch file, the command prompts the user for an input, but I have all the output of the bat file going into a log file. So when I run the bat file, the process just sits there until I hit the 'y' key or input something manually. This is a problem because this batch file is running on the UAT server as a job and there is no one there to provide input once the job is running. The command in the batch only requires an input once a month.
For example, if I run launch_scrt.bat, I want to pass parameters such as 'y' or 'n' to avoid the manual input once the job is running. Any ideas?
First noticed it when I went from JIT compiling of my .NET app to pre-compiled via the command prompt, although I can't see that that's got anything to do with it...
My database is around 2 GB and I'm using Service Broker.
If I drop a new version of my app into production, sqlserver.exe goes nuts and eats up the cpu.
Tonight, I stopped IIS and SQL Server, then dropped a new version in. I hadn't re-started IIS at this point... As soon as I started SQL Server, off it went again - eating up the CPU. I quickly launched a trace via the Profiler - no activity showing.
So, I started IIS after a couple of mins (still with the high cpu) and then just waited. After about 10 mins, SQL Server settled down and everything was normal again.
What is it doing when I start the SQL Server service?
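Profiler can miss background work (startup recovery, Service Broker activation, and so on), so one way to see what the engine itself is doing during the spike is to query the active requests directly. A diagnostic sketch for SQL Server 2005+:

-- Everything currently executing, including background tasks a default trace may not show
SELECT session_id, status, command, wait_type, cpu_time, total_elapsed_time
FROM sys.dm_exec_requests
ORDER BY cpu_time DESC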
I have tried asking the same question in other forums. All I get is links.
Please help me. I have the following requirement:
I have the following tables:
Theater - TheaterId, TheaterName, Revenues, LocationId, StateId
State - StateId, StateName
Location - LocationId, LocationName, StateId
I want to generate reports that will tell me the revenue generated for each theater in each location in a state. I want to run a batch process which will loop through the 3 tables, passing the location and state IDs as parameters one by one. I want each report to be generated as a PDF and stored in a location. How do I do this?
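One piece of this that is straightforward is generating the parameter pairs to loop over: a join across the three tables yields one row per theater, carrying the location and state IDs to pass into each report run. A sketch using the schema above:

-- One row per theater, with the LocationId/StateId parameters for its report
SELECT s.StateId, s.StateName, l.LocationId, l.LocationName, t.TheaterId, t.TheaterName, t.Revenues
FROM State AS s
JOIN Location AS l ON l.StateId = s.StateId
JOIN Theater AS t ON t.LocationId = l.LocationId

For the PDF generation itself, Reporting Services data-driven subscriptions can consume a query like this and render each report to a file share, but that part depends on your reporting setup and edition.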
How do I view the errors that occurred during job execution (other than viewing the job history)? Are there any files available for checking the errors? Thanks in advance.
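Apart from the job history views in Management Studio, the same information can be queried straight out of msdb, which makes it easy to filter for failures. A sketch:

-- run_status 0 = failed, 1 = succeeded; 'message' holds the step's error text
SELECT j.name AS job_name, h.step_name, h.run_status, h.run_date, h.run_time, h.message
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs AS j ON j.job_id = h.job_id
WHERE h.run_status = 0
ORDER BY h.run_date DESC, h.run_time DESC

As for files: SQL Server Agent writes its own log (SQLAGENT.OUT in the instance's LOG directory), and each job step can be pointed at an output file via the step's Advanced page.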
Hello everyone, I have around 20 reports in an ASP web application which connects to a SQL Server 2000 DB, executes stored procedures based on input parameters, and returns the data in a nice tabular format.
The data which is used in these reports actually originates from a 3rd-party accounting application called Exchequer. I have written a VB application (I call it the extractor) which extracts data from Exchequer and dumps the same into the SQL Server DB every hour. The running time for the extractor is an average of 10 minutes. During these 10 minutes, while the extractor seems to run happily, my ASP web application, which queries the same DB that the extractor application is updating, becomes dead slow. Is there any way I can get the extractor to be nice to SQL Server and not take up all its resources, so that the ASP web application users do not have to contend with a very, very slow application during those times?
I am using a DSN to connect to the DB from the server that runs the web application as well as the other server which runs the extractor. Connection pooling has been enabled on both (using the ODBC Administrator). The Detach Database dialog gives me a list of open connections to the DB. I have been monitoring the same and I have noted 10-15 open connections at most times, even during the execution of the extractor. All connection objects in the ASP as well as VB applications are closed and then set to nothing.
This system has been in use since 2002. My data file has grown to 450 MB and my transaction log is close to 2 GB. Can the transaction log be a problem? For some reason, the size of the transaction log does not go down even after a complete DB backup is done. Once a complete DB backup is done, doesn't the transaction log lose its significance and can be actually deleted? Anyway, this is another post I'm doing today to the group.
In the extractor program:
1) I create a temporary table
2) I create an empty recordset out of the table
3) I loop through the Exchequer records using Exchequer's APIs, adding records into the recordset of the temporary table as I go along
4) I do an UpdateBatch of the recordset intermittently
5) I open an SQL transaction
6) I delete all records from the main table
7) I run an INSERT INTO main_table SELECT * FROM #temp_table
8) I commit the transaction
I hope that the information is sufficient. Thanks, Sam
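On the transaction log question: a full database backup does not truncate or shrink the log. In the FULL recovery model, only log backups free up space inside the log, and shrinking the physical file is a separate step. A minimal sketch (the database name, backup path, and logical file name are placeholders):

-- Back up the log so its space can be reused...
BACKUP LOG MyDatabase TO DISK = 'D:\Backups\MyDatabase_log.bak'
GO
-- ...then optionally shrink the physical file (target size in MB; use the logical log file name)
DBCC SHRINKFILE (MyDatabase_log, 500)

If point-in-time recovery is not needed, switching the database to the SIMPLE recovery model stops the log from growing this way in the first place.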
I have an SSIS package where the last three tasks in the control flow are: stopping SSAS, then "on success" executing a batch file that copies a bunch of files to a remote server using the robocopy command, then "on success" starting SSAS. I tested all three tasks individually and they all work fine. The problem is between the second-last task and the last task: as soon as the batch file is launched, the second-last task completes and control moves on to the last task; it does not wait until the actual robocopy job is finished. This causes a problem in the robocopy process. Thanks.
I have an "execute process" task which executes a perl script. When I run the task, it shows a prompt with the message "the publisher can not be verified". It gives the option to continue the task or to cancel.
The problem for me is that I want to schedule this package to run automatically, and I don't want the automated process to show this dialog. I need to override the security setting and have SSIS execute my batch scripts without prompting. Assuming I don't want to rewrite the Perl code into a script task, is there a way to accomplish this?
Trying to run a SSIS package from a SQL job, and the package itself has a step that calls a process task that runs a batch file. The syntax in the process task I have is the following:
Executable: c:\windows\system32\cmd.exe
Arguments: /C e:\SungardPTA\encryptfile.bat
Working Directory: e:\sungardpta
I keep getting the following in my log:
PackageStart,MIMKEIMC11N,MI\trustserviceadmin,PTADailyTransactionExtract,{46F7381F-B345-47DC-BFC0-17CCF02A935A},{F82C7944-D28C-4F70-8CB7-F0BD7ED748D2},1/29/2008 1:59:11 PM,1/29/2008 1:59:11 PM,0,0x,Beginning of package execution.
OnError,MIMKEIMC11N,MI\trustserviceadmin,EncryptFiles,{FCF5B653-CC05-4183-981B-F5EF4906DD09},{F82C7944-D28C-4F70-8CB7-F0BD7ED748D2},1/29/2008 1:59:12 PM,1/29/2008 1:59:12 PM,-1073573551,0x,In Executing "c:\windows\system32\cmd.exe" "/C e:\SungardPTA\encryptfile.bat" at "e:\sungardpta", The process exit code was "1" while the expected was "0".
OnError,MIMKEIMC11N,MI\trustserviceadmin,PTADailyTransactionExtract,{46F7381F-B345-47DC-BFC0-17CCF02A935A},{F82C7944-D28C-4F70-8CB7-F0BD7ED748D2},1/29/2008 1:59:12 PM,1/29/2008 1:59:12 PM,-1073573551,0x,In Executing "c:\windows\system32\cmd.exe" "/C e:\SungardPTA\encryptfile.bat" at "e:\sungardpta", The process exit code was "1" while the expected was "0".
OnTaskFailed,MIMKEIMC11N,MI\trustserviceadmin,EncryptFiles,{FCF5B653-CC05-4183-981B-F5EF4906DD09},{F82C7944-D28C-4F70-8CB7-F0BD7ED748D2},1/29/2008 1:59:12 PM,1/29/2008 1:59:12 PM,0,0x,(null)
PackageEnd,MIMKEIMC11N,MI\trustserviceadmin,PTADailyTransactionExtract,{46F7381F-B345-47DC-BFC0-17CCF02A935A},{F82C7944-D28C-4F70-8CB7-F0BD7ED748D2},1/29/2008 1:59:12 PM,1/29/2008 1:59:12 PM,1,0x,End of package execution.
Can a batch file that resides on another server be executed from a different machine? I have a batch file that resides on a server that I would like to run using SQL 2005 Integration Services. Is there anything I can do that would allow me to remotely execute this batch file and have it run in that environment?
BATCH FILE:
cd C:\Trandev
otrun -at OTRecogn.att -DINPUT_FILE=%1 -tl 1 -cs dv -lg mylog -I

C:\Trandev represents the remote environment.
I have tried mapping the remote machine to a network drive on my local machine and using that drive to execute the batch file in an Execute Process Task, but it does not work.
SSIS:
I have a FOR EACH loop grabbing files and writing fileName to a variable that is passed to the Process Task as an argument through an expression (%1 in the batch file above). The Working Directory is a mapped network drive. The Executable is also the network drive plus the batch file name.
Any help would be appreciated.
My computer is an HP Compaq dc7100, 512 MB RAM, Windows XP.
I have a remote batch file on machine B that I need to execute using an 'Execute Process Task' control from a package on machine A. The batch file uses PGP software and encrypts a file sitting on machine B itself. The reason my batch file is sitting on machine B is that the PGP software is on machine B.
If I execute the batch file by itself from machine B, the script runs fine. I refer to the same batch file via a UNC path from my package on machine A, but that does not work, since the 'Working directory' is still machine A. I cannot set machine B's folder as the working dir because it does not accept a UNC path. So I say, OK, let me map that UNC location as drive 'Z:'. But if I do so, I will be running the process on machine A, and the batch file will look for the PGP software on machine A, and hence fail.
I have tried third-party remote batch execution tools (PsExec) but have not had success, not because of SSIS limitations, but simply because the PGP executable, when run through the PsExec tool, does not identify the location of the public keys on machine B and hence gives an encryption failure.
How do I get the remote batch file to execute such that it runs within its own environment? Is there a better remote execution tool I can try, or are there any other features of SSIS I can use to get around this issue? I need the results of the batch file and hence do not want to make it an asynchronous process.
I am new to this, but have scoured the web and not found an answer to my question...
I have an execute process task that runs a simple batch file. When this batch file completes with an ERRORLEVEL greater than 0, I would like the task to fail. I thought this simply meant setting the "FailTaskIfReturnCodeIsNotSuccessValue" property to true, and setting the SuccessValue to 0. However, this does not appear to work.
Even with a simple batch file that forces the error code to 1, the task still completes "successfully".
When I deploy the cube, which is sitting on my PC (local), the following 4 errors come up:
Error 1: The data source 'AdventureWorksDW' contains an ImpersonationMode that is not supported for processing operations.
Error 2: Errors in the high-level relational engine. A connection could not be made to the data source with the DataSourceID of 'Adventure Works DW', Name of 'AdventureWorksDW'.
Error 3: Errors in the OLAP storage engine: An error occurred while the dimension with the ID of 'Customer', Name of 'Customer' was being processed.
Error 4: Errors in the OLAP storage engine: An error occurred while the 'Customer Alternate Key' attribute of the 'Customer' dimension from the 'Analysis Services Tutorial' database was being processed.
Today I ran into the error (see title of this post) while trying to execute a batch file using SSIS. If anyone else runs into this problem I want to point them to my post on the issue, as I did not easily find a solution when doing the search myself: http://blog.lyalin.com/2008/02/1-is-not-valid-win32-application-ssis.html
Summary (so you don't need to visit my blog):
I was trying to execute a totally empty placeholder batch file, and this produces the "%1 is not a valid win32 application" error when using the Execute Process Task. Simply add something to your placeholder file (like a DIR command) and you can continue running your package without error.
I hope this saves someone time in the future.
Comments? If anything in my post is inaccurate I hope you guys post some corrections.
Hello, I am looking for some advice. I have a process to import flat text files. We are importing data from five vendors. These files are separated by vendor, with each vendor having their own directory. The general file layouts are different for each vendor. Each vendor may have up to five different types of files to be imported that are in their directory (sales, inventory, transactions, etc.). The sales file for Vendor 1 has a different layout than the sales file for Vendor 2. Each vendor may have multiple instances of each type (store 1 inventory, store 2 inventory, store 1 sales, store 2 sales, etc.). There could be up to five hundred files (of the five different types) in a given vendor's directory. I am using an import package. This package has five different data flows (.dtsx files). Each of these data flows has connection managers that connect to the specific types of files (sales, inventory, transactions, etc.) for that vendor.
My current plan is to have the data flow (.dtsx file) parse the store name (from the file name) for each of the file types. It would load/process each of the available file formats (sales, inventory, transactions, etc.) for that store in that vendor folder. We want the process to load all data files from a given store. It would then move on to the next available store (same vendor). I would like to set the process to run multithreaded so that I am loading as many of the stores for that vendor as possible (there could be over a hundred stores for each vendor) at the same time. How do I get each data flow to run multiple instances (instance1 for vendor(x).dtsx, instance2 for vendor(x).dtsx, etc.) for maximum vendor(x) input? What is the best way/process/design to track each store name so that each process (instance1, instance2, etc.) is loading a distinct store? The files will be moved to a history folder as they are processed. Should I have a different process that gets each store name initially and then saves that information to a SQL table? The data flow would load the next available store name from the SQL table query and let SQL lock that store.
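For the 'distinct store' question at the end: if the store names are staged in a SQL table, each instance can atomically claim the next unclaimed store so no two instances grab the same one. A hedged sketch; the table and column names here are hypothetical:

-- Each worker claims one pending store; READPAST skips rows other workers have locked
DECLARE @WorkerId varchar(50)
SET @WorkerId = 'instance1'

UPDATE TOP (1) dbo.StoreQueue WITH (ROWLOCK, UPDLOCK, READPAST)
SET Status = 'InProgress', ClaimedBy = @WorkerId, ClaimedAt = GETDATE()
OUTPUT inserted.StoreName
WHERE Status = 'Pending'

Each data flow instance would run this first, process the returned store's files, then mark the row 'Done' (or delete it).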
Please help. We have no idea what is wrong. The error message occurs at the end of the load: "Error at Destination for Row number 6218607. Errors encountered so far in this task: 1. SqlDumpExceptionHandler: Process 11 generated fatal exception c0000005 EXCEPTION_ACCESS_VIOLATION. SQL Server is terminating this process."
I installed SQL Server Express Advanced today, and decided to install the toolkit as well. When I open Business Intelligence Development Studio, I get the "Package Load Failure" for the 'ReportDesignerPackage' and 'DataWarehouse VSIntegration layer' packages. I can't seem to find any recent blogs or forum responses that address this issue, and the older ones (most are from 2005) haven't solved the problem. Do I have to reinstall everything???
Is it possible to program part of the data load process within SSIS? The source is a flat file (.txt and .dat) and the destination is SQL Server 2005. Not all fields of the file map from the source to the destination table, and data is needed from other tables that are in the database.
I cannot find any information on this error. It occurs on packages that are writing to the same table using a SQL Server destination. I suppose it would be a good exercise in error handling, but I'd rather avoid it.
In my SSIS package I am trying to import a .csv flat file with about 170,000 rows of data and 75 columns (I know this is rather large). I would like to create a SSIS package that loads ALL of the rows of data into a table. This table has its fields defined (such as money, float, varchar, datetime, etc.). I want the data to load to this table even if there is a conversion error converting any field. When the data is loaded to this table, any fields that couldn't be converted should be set to null. Of the 75 columns of data, there are MANY columns that could be invalid.
For example, if the flat file contains the value "00/00/00" for my "PaidDate" field, I would want all of the other fields to be populated and the value for this particular record's "PaidDate" field to be null.
It would also be nice if I could trap any of the fields that caused a problem and log them along with the row.
I know that SSIS supports the "Redirect Row" method for handling data, but in my case, I don't want to redirect it. I want to continue to load it, just with a null value.
Is there an easy way of doing this (i.e., perhaps by using the Advanced Editor for the Flat File Source and setting something in the Input/Output columns)? I really don't want to create a Derived Column or Data Conversion transformation for every field that needs to be converted (because there are 75 columns to maintain this for).
I know if I import this file to an Access 2003 database, it imports the data fine and logs any conversion errors to an "_ConversionErrors" table. I am basically looking for this same behavior in SSIS, without having to maintain 75 columns.
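One 2005-friendly pattern for this (a sketch, not the only way): land every column as plain varchar in a staging table via the Flat File Source, so nothing can fail on load, then do the conversions in T-SQL with validity checks so bad values become NULL. The table and column names below are hypothetical:

-- Convert from an all-varchar staging table; values that fail the check become NULL
INSERT INTO dbo.Payments (PaidDate, PaidAmount)
SELECT
    CASE WHEN ISDATE(s.PaidDate) = 1 THEN CAST(s.PaidDate AS datetime) ELSE NULL END,
    CASE WHEN ISNUMERIC(s.PaidAmount) = 1 THEN CAST(s.PaidAmount AS money) ELSE NULL END
FROM dbo.Payments_Staging AS s

A value like '00/00/00' fails ISDATE and lands as NULL, while the rest of the row loads normally; the staging rows also remain available if you want to log which fields were bad.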
Imports System.Data.SqlClient
Imports System.Data
Imports Microsoft.SqlServer

Partial Class _Default
    Inherits System.Web.UI.Page

    Protected Sub Button1_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles Button1.Click
    End Sub

    Protected Sub TextBox3_TextChanged(ByVal sender As Object, ByVal e As System.EventArgs) Handles TextBox3.TextChanged
    End Sub

    Protected Sub SqlDataSource1_Selecting(ByVal sender As Object, ByVal e As System.Web.UI.WebControls.SqlDataSourceSelectingEventArgs) Handles SqlDataSource1.Selecting
    End Sub

    Protected Sub Save_Button1_Click(ByVal sender As Object, ByVal e As System.EventArgs) Handles Save_Button1.Click
        Dim conn As New SqlConnection()
        ' Note: this string literal is missing its enclosing quotes, which is what produces the compile errors below
        conn.ConnectionString = Data Source=REDSTONE;Persist Security Info=True;User ID=******;Password=******;Unicode=True
    End Sub
End Class
End of statement expected. C:\Documents and Settings\kasireddy\My Documents\Visual Studio 2005\WebSites\Bank\CustomerInfo.aspx.vb
'System.Data' is a namespace and cannot be used as an expression. C:\Documents and Settings\kasireddy\My Documents\Visual Studio 2005\WebSites\Bank\CustomerInfo.aspx.vb
I have an Execute Process Task within my package that executes a BCP command which outputs the result set of a query to a file on the network share. It works fine most of the time; however, sometimes, for unknown reasons, the following error message gets logged in my log table: In Executing "c:\Program Files\Microsoft SQL Server\90\Tools\Binn\bcp.exe" "Select Comments, SoldToCustomerNbr, ProductGroupingCode, ProductGroupingName, RevSumCategoryCode, RevSumCategoryName, ValidFromDate, DTSCollectPct, DTSPrepaidPct, DTSPickUpPct, DCCollectPct, DCPrepaidPct, DCPickUpPct From ShipmentTypeCustomerBlend" queryout \\xxx\LOG\ShipmentTypeCustomerBlendLog_060719201440.txt -c -t" " -SDummyServer -T -e"d:\SSIS Error Logs\Job Execution\BcpErrors.log" at "", The process exit code was "1" while the expected was "0".
The above error was captured from the System::ErrorDescription variable, by the error event handler, that was attached to the Execute Process Task. This error does not help me to debug the issue.
On running the statement below from the command prompt, I get the actual error message, which is the expected behavior:
This error message indicates that either the network path \\xxx\LOG is not available for the output file creation, or the file \\xxx\LOG\ShipmentTypeCustomerBlendLog_060719201440.txt could not be created for some reason.
I've tried to capture the error message from the StandardErrorVariable and the StandardOutputVariable properties of the Execute Process Task, but in vain.
Is this a bug ? If so, is there a way to get the actual error message from the task ?
I have an Execute Process Task that kicks off gzip to uncompress files within a For Each loop. We get a LOT of bad files, which causes gzip to throw an unexpected EOF error. This gets bubbled up into SSIS as a Win32 unhandled exception error, which then throws up the VS JIT Debugger interface. I know what these errors are and do not want to debug. Is there any way that I can simply ignore the exception and just throw it away?
UPDATE: Using Visual Studio 2012 and SSDT 11.1.50512.0
I am using this code to load a dacpac and get the version number from it:
using (DacPackage dacpac = DacPackage.Load(ADacpacFile))
{
    retVal = dacpac.Version.ToString();
}
After I install the application on a test machine, this code executes properly and returns the version number on the first run of the application. I can execute the code multiple times while running and retrieve the version number.
Once I exit the application and then reload it, the above code fails with the nondescript error message "Could not load package from 'filename.dacpac'." The stack trace shows: at Microsoft.SqlServer.Dac.DacPackage.Load(String filename, DacSchemaModelStorageType modelStorageType, FileAccess packageAccess).
I have tried rebooting. I have tried manually copying over the dacpac files from the install source. I get the same error every time. The only thing that works is uninstalling the application and reinstalling it. Then it works for the first execution and fails again afterwards.
On my development box, it works every time without error. Only when I deploy the application via the installer does this happen. I have tried two different boxes (both Win7 64-bit) with the same result. The DLLs that are being installed with the application are:
Hello, I currently have a transactional log reader agent failing with the below error:
The process could not execute 'sp_replcmds'
Error: 14151, Severity: 18, State: 1
SQL Server Assertion: File: <logscan.cpp>, line=2223 Failed Assertion = 'm_noOfScAlloc == 0'. Stack Signature for the dump is 0x24642FE5
Error: 3624, Severity: 20, State: 1
SQL Server Assertion: File: <logscan.cpp>, line=1985 Failed Assertion = 'startLSN >= m_curLSN'. Stack Signature for the dump is 0xD7150BD4
Now, I understand that SP4 is supposed to fix a similar issue. SP4 has been installed and the errors keep happening. I do notice that the hotfix mentions different line numbers than the above errors. Does anyone know if this is a new bug? If not, can someone explain the fixes to me? Thanks.
I have a stored procedure. In the SP I am using a cursor to load data from a parent table to several child tables.
I have attached the script with this message.
My problem is how to use a direct SELECT and INSERT (or load) to speed up the process instead of the cursor.
USE [IconicMarketing]
GO
/****** Object: StoredProcedure [dbo].[SP_DMS_INVENTORY] Script Date: 3/6/2015 3:34:03 PM ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
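Since the attached script is not reproduced here, the following is only a generic sketch of the usual cursor replacement: one set-based INSERT...SELECT per child table instead of row-by-row processing. All table and column names are hypothetical:

-- Load each child table in a single statement; repeat per child table
INSERT INTO dbo.ChildInventory (ParentId, ItemCode, Quantity)
SELECT p.ParentId, p.ItemCode, p.Quantity
FROM dbo.ParentInventory AS p
WHERE NOT EXISTS (SELECT 1 FROM dbo.ChildInventory AS c WHERE c.ParentId = p.ParentId)

The NOT EXISTS guard keeps the load re-runnable; an UPDATE with a join (or MERGE on SQL 2008+) covers the 'update existing rows' half.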