I am having a problem importing data into SQL 7 from any type of source. I go through the whole import process with no problem, but when I click the Finish button to start the import, nothing at all happens. Enterprise Manager and DTS just hang and I have to use Ctrl+Alt+Delete to end the program. Can anyone give me any suggestions as to what might be happening? Big thanks in advance; I've been working on this for days.
I have a VoIP phone system using SQL Server on the back end. I am trying to get a trigger to fire and email me when a certain number has been dialed.
Create Trigger trg_Emergency_Calls on dbo.CallDetailRecord for Insert
AS
IF @@ROWCOUNT = 0 RETURN  -- no rows affected, exit
IF (SELECT finalcalledpartynumber FROM inserted) = '95593684'
BEGIN
    --RAISERROR ('Call Stored Procedure Here',16,10)
    EXEC WEB_SRVR03.master.dbo.sp_SMTPMail @body = 'This is a test Email'
END
Return
GO
The problem is that the email SP I have to execute is on another server, and it has to be that way. The trigger runs each time, but if the IF statement is true the trigger hangs and never completes. Watching the other server (WEB_SRVR03), there is never a request to execute sp_SMTPMail. I have been trying to troubleshoot this with Profiler, but I never see any locks or anything else that would point to a problem. Also, the insert statement that caused the trigger to fire never finishes, so the record isn't written to the db. If anyone has any suggestions I would appreciate it. Thanks
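One pattern that is often suggested for this situation (a sketch only; the queue table, its columns, and the Agent job are invented here, not part of the original setup) is to keep the linked-server call out of the trigger entirely: the trigger just records the event in a local queue table, and a SQL Agent job drains the queue and makes the remote sp_SMTPMail call, so the original INSERT never waits on WEB_SRVR03.

-- Hypothetical queue table
CREATE TABLE dbo.EmailQueue (
    QueueID  int IDENTITY(1,1) PRIMARY KEY,
    Body     varchar(500) NOT NULL,
    QueuedAt datetime NOT NULL DEFAULT GETDATE(),
    Sent     bit NOT NULL DEFAULT 0
)
GO

ALTER TRIGGER trg_Emergency_Calls ON dbo.CallDetailRecord FOR INSERT
AS
IF @@ROWCOUNT = 0 RETURN
-- Only a local insert happens inside the triggering transaction
INSERT INTO dbo.EmailQueue (Body)
SELECT 'Emergency number 95593684 was dialed'
FROM inserted
WHERE finalcalledpartynumber = '95593684'
GO

-- A SQL Agent job step running every minute or so then sends the queued mails
DECLARE @id int, @text varchar(500)
SELECT @id = MIN(QueueID) FROM dbo.EmailQueue WHERE Sent = 0
WHILE @id IS NOT NULL
BEGIN
    SELECT @text = Body FROM dbo.EmailQueue WHERE QueueID = @id
    EXEC WEB_SRVR03.master.dbo.sp_SMTPMail @body = @text
    UPDATE dbo.EmailQueue SET Sent = 1 WHERE QueueID = @id
    SELECT @id = MIN(QueueID) FROM dbo.EmailQueue WHERE Sent = 0 AND QueueID > @id
END

That way a slow or unreachable linked server can never block the CallDetailRecord insert itself.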
I want to finish the execution of a DTS package from an ActiveX task. If a condition is OK, the package should continue as normal. If not, I want to finish the package without any error, just without executing the next tasks. Do you have any idea?
I have a stored procedure which ideally should run when a customer logs in. The procedure checks the available stock and creates a temp table with the information, which allows many other queries in the site to run a lot faster (due to no joins). The query has taken as much as 30 seconds to run at login (lots of records and a half-dozen joins), causing a timeout for the web application.
I want the procedure to run as it is, but I don't want the login method to be dependent on the process. I tried cmd.BeginExecuteNonQuery() in .NET (cmd = SqlCommand), which doesn't do what I want; it just lets me run heaps of queries at the same time.
Can anyone help me get this procedure to run without holding up the web application? I'm not sure whether I need to do this in .NET, or whether I can get .NET to run a batch file or something, but someone must have had a similar problem. Please help.
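One common workaround (a sketch only; the job name, database name, and dbo.RebuildStockCache are placeholders, not names from the original post) is to wrap the slow procedure in a SQL Agent job and have the login code simply start the job. msdb.dbo.sp_start_job returns as soon as the job has been requested, so the web request is not held up while the data is rebuilt.

-- One-time setup: a job whose single step runs the slow procedure.
EXEC msdb.dbo.sp_add_job @job_name = N'Rebuild stock cache'
EXEC msdb.dbo.sp_add_jobstep
    @job_name = N'Rebuild stock cache',
    @step_name = N'Run procedure',
    @subsystem = N'TSQL',
    @database_name = N'MyShopDB',
    @command = N'EXEC dbo.RebuildStockCache'
EXEC msdb.dbo.sp_add_jobserver @job_name = N'Rebuild stock cache'

-- Called from the login path: fire and forget, returns immediately.
EXEC msdb.dbo.sp_start_job @job_name = N'Rebuild stock cache'

One caveat: because the job runs on its own session, a session-scoped #temp table built by the procedure would vanish when the job ends, so the cached data would need to go into a permanent staging table instead.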
I have a scheduled job that runs daily at 4 AM. The job imports data from client-side text files and puts the data in our SQL table; it takes around an hour. The problem is that it doesn't stop after the import completes. After an hour I can see the data has been imported into my SQL table, so that's fine, but the job keeps running. I tried watching that job's SPID in Activity Monitor in SQL 2005, but after an hour I can't even see the SPID, yet the job still runs. It's weird. When I then stop the job manually, it stops and reports that the job completed successfully. That step is the last step and it uses a Windows cmd command. My understanding is that the job step doesn't realize it has finished. What should I do here? Any ideas are appreciated. We run the same job on other servers and it's fine.
No error, but Export to Excel does not finish. When the report has 2 pages with about 500 rows in total, exporting to Excel is not a problem. If it has 100 pages and 5,000 rows, the export to Excel never ends; it does not return any error, but the process does not end either. What might the problem be?
I have a database A that includes five tables and has more than 1,500,000 rows. There is a replica database of A. At first there is no data in either DB. When I finish inserting data into A, the replica DB still seems to be working; the log file size keeps changing. How can I tell whether replication has finished or not? How long will it take to replicate 1,500,000 rows of data?
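If this is transactional replication, a rough way to watch progress (a sketch; the publisher, publication, and subscriber names are placeholders, and sp_replmonitorsubscriptionpendingcmds requires SQL 2005 or later) is to check the latency counters on the publisher and the commands still waiting in the distribution database:

-- On the publisher: replicated transaction counts, throughput, and latency per published database.
EXEC sp_replcounters

-- On the distributor: commands not yet delivered to the subscriber.
EXEC distribution.dbo.sp_replmonitorsubscriptionpendingcmds
     @publisher = N'MyPublisher',
     @publisher_db = N'A',
     @publication = N'MyPublication',
     @subscriber = N'MySubscriber',
     @subscriber_db = N'A_replica',
     @subscription_type = 0   -- 0 = push, 1 = pull

When the pending command count drops to zero and stays there, the initial load has been delivered; how long that takes depends mostly on row size, network speed, and the distribution agent settings, so there is no fixed answer for 1,500,000 rows.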
I have been working with this for about a month now, with no similar problems to date. Today I am trying to introduce 4 configuration flags that control whether optional ETL stage feeds are executed. I did this by adding a do-nothing script component. The precedence constraint is used, and it checks the Boolean variable flag. The first package executes fine, but it never returns from there. The precedence constraint has nothing fancy on it either. It simply does not run any more of the package, make any more conditional checks, or run the common completion tasks. It just seems to think it is done.
The optional stages all fire Execute Package tasks. One thing that might be tripping it up is that I attempt to run one package twice, each time with different parent package variable values so that it uses a different destination database for each run. Should that not be OK to do?
Here's my dilemma: I want to run a stored procedure that starts another stored procedure running, but does not wait for that stored procedure to complete execution.
The calling stored procedure should return immediately and leave the other procedure to finish running in the background. Is there any way to do this?
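On SQL Server 2005 and later, one way to get this purely in T-SQL (a sketch under assumptions: the queue, service, and procedure names are invented, and Service Broker must be enabled on the database) is internal activation. The outer procedure just SENDs a message and returns; Service Broker then runs the activation procedure, and the long-running work, on its own background session.

-- One-time setup (requires ALTER DATABASE ... SET ENABLE_BROKER). All names are placeholders.
CREATE QUEUE dbo.LongTaskQueue;
CREATE SERVICE LongTaskService ON QUEUE dbo.LongTaskQueue ([DEFAULT]);
GO

-- Activation procedure: Service Broker starts this on its own session when a message arrives.
CREATE PROCEDURE dbo.LongTaskActivation
AS
BEGIN
    DECLARE @handle uniqueidentifier, @msgtype sysname;

    RECEIVE TOP (1)
        @handle  = conversation_handle,
        @msgtype = message_type_name
    FROM dbo.LongTaskQueue;

    IF @msgtype = N'DEFAULT'
    BEGIN
        EXEC dbo.MyLongRunningProc;   -- placeholder for the real procedure
        END CONVERSATION @handle;
    END
    ELSE IF @msgtype = N'http://schemas.microsoft.com/SQL/ServiceBroker/EndDialog'
        END CONVERSATION @handle;     -- clean up the returning end-of-dialog message
END
GO

ALTER QUEUE dbo.LongTaskQueue
WITH ACTIVATION (
    STATUS = ON,
    PROCEDURE_NAME = dbo.LongTaskActivation,
    MAX_QUEUE_READERS = 1,
    EXECUTE AS OWNER);
GO

-- The caller just drops a message on the queue and returns immediately.
CREATE PROCEDURE dbo.StartLongTask
AS
BEGIN
    DECLARE @handle uniqueidentifier;
    BEGIN DIALOG CONVERSATION @handle
        FROM SERVICE LongTaskService
        TO SERVICE 'LongTaskService'
        WITH ENCRYPTION = OFF;
    SEND ON CONVERSATION @handle ('go');
END
GO

A simpler alternative, if Service Broker is overkill, is to put the long-running procedure in a SQL Agent job and call msdb.dbo.sp_start_job from the outer procedure, which also returns without waiting.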
We recently upgraded our SQL version from SQL 2008 R2 to SQL 2014. As such, the compatibility mode changed to SQL 2014 (120).
We have several queries that used to run fine that now take forever to bring back results. There are no errors (which surprised me); they just take way too long now. Plus they seem to be causing high I/O and CPU.
If I change the compatibility level back to SQL 2008, these queries run fine.
The query with the SQL 2008 compatibility level finishes in 2 minutes; with the SQL 2014 compatibility level it finishes in 3 hours 22 minutes.
Same exact query, same server; the only thing that changed was the compatibility level.
What do I look for in the queries that could be causing this? (They look fine, but obviously I'm missing something here.)
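When only the compatibility level changes between 110 and 120, the usual suspect is the new cardinality estimator that level 120 enables. One way to test that theory query by query, without flipping the whole database back (a sketch: the query shown is a placeholder, and QUERYTRACEON normally requires sysadmin), is to force the legacy estimator with trace flag 9481 and compare the plan and runtime:

-- Runs under compatibility level 120 but with the pre-2014 cardinality estimator.
SELECT o.OrderID, o.OrderDate, c.CustomerName          -- placeholder query
FROM dbo.Orders o
JOIN dbo.Customers c ON c.CustomerID = o.CustomerID
WHERE o.OrderDate >= '20150101'
OPTION (QUERYTRACEON 9481);

If the slow queries speed up with the hint, comparing the two actual execution plans usually points at the join or estimate that the new estimator gets wrong.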
I've set up the option to mail the error to a person. When the option is on, I get the error message by mail but the package does not finish (e.g. the failing task does not turn red and the output window never says anything about the error); if I set the option off, the task fails as expected.
I have this query I need for a report. Originally it was 4 queries to be used in Crystal Reports. Now I want to create the same report with SSRS, so I combined all the queries into one in order not to use subreports. [URL].....
Tempdb fills up to nearly 90 GB. I am running SQL Server on a local box, so I am sure there is no other traffic. Here is the query:
SELECT AdHaupt.NSprache_ID
    ,AdHaupt.mengentext AS mengentextHaupt
    ,AdHaupt.Einzelpreis
    ,AdHaupt.Anzeigebezeichnung
    ,AdHaupt.Gesamtpreis
[Code] ...
I ran it with TOP 10 as well, just to see if it would finish at all, but it never did (it has been running for an hour now).
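While the query is running, it can help to see which part of tempdb is actually growing: user objects, internal objects (sort/hash/spool work tables), or the version store. A sketch using the standard DMVs (SQL 2005 and later):

-- Overall tempdb usage by category, in MB.
SELECT
    SUM(user_object_reserved_page_count)     * 8 / 1024.0 AS user_objects_mb,
    SUM(internal_object_reserved_page_count) * 8 / 1024.0 AS internal_objects_mb,
    SUM(version_store_reserved_page_count)   * 8 / 1024.0 AS version_store_mb,
    SUM(unallocated_extent_page_count)       * 8 / 1024.0 AS free_mb
FROM tempdb.sys.dm_db_file_space_usage;

-- tempdb usage per running task, to confirm it is this query's sorts/spools doing the allocating.
SELECT session_id,
       SUM(internal_objects_alloc_page_count) * 8 / 1024.0 AS internal_alloc_mb
FROM tempdb.sys.dm_db_task_space_usage
GROUP BY session_id
ORDER BY internal_alloc_mb DESC;

If almost all of the 90 GB shows up as internal objects for that session, the execution plan is spooling or sorting a huge intermediate result, which usually points at a missing join predicate or a badly estimated join in the combined query.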
I'm running my SSIS packages from a scheduler (a Windows service) that I wrote in .NET. For logging the SSIS events I only use the SSIS log provider for SQL Server. Nevertheless, the package run start and stop events (and execution failures) are still logged in the Windows event log. Is there a way to stop SSIS from writing these event-log entries?
Hi all, nothing happens when I finish my checkout process. I expect the data to be saved to the Order and OrderItem tables in my SQL database, but no data is found in the Order and OrderItem tables and no error message is displayed during the operation! Below is my checkout.aspx.vb code; the whole file is around 138 lines, and I have included lines 1~64. I suspect lines 35-48 have the problem. Can somebody help me? Many thanks.
1  Imports System
2  Imports System.Data.SqlClient
3  Imports SW.Commerce
4  Partial Class CheckOut
5      Inherits System.Web.UI.Page
6      Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
7          If Not Page.IsPostBack Then
8              If Profile.Cart Is Nothing Then
9                  NoCartlabel.Visible = True
10                 Wizard1.Visible = False
11             End If
12             If User.Identity.IsAuthenticated Then
13                 Wizard1.ActiveStepIndex = 1
14             Else
15                 Wizard1.ActiveStepIndex = 0
16             End If
17         End If
18     End Sub
19     Sub chkUseProfileAddress_CheckedChanged(ByVal sender As Object, ByVal e As System.EventArgs)
20
21         ' fill the delivery address from the profile, but only if it’s empty
22         ' we don’t want to overwrite the values
23
24         If chkUseProfileAddress.Checked AndAlso txtName.Text.Trim() = "" Then
25             txtName.Text = Profile.Name
26             txtAddress.Text = Profile.Address
27             txtcity.Text = Profile.City
28             txtCountry.Text = Profile.Country
29         End If
30     End Sub
31     Sub Wizard1_FinishButtonClick(ByVal sender As Object, ByVal e As System.Web.UI.WebControls.WizardNavigationEventArgs)
32
33         ' Insert the order and order lines into the database
34
35         Dim conn As SqlConnection = Nothing
36         Dim trans As SqlTransaction = Nothing
37         Dim cmd As SqlCommand
38         Try
39             conn = New SqlConnection(ConfigurationManager.ConnectionStrings("swshop").connectionstring)
40             conn.Open()
41             trans = conn.BeginTransaction
42             cmd = New SqlCommand()
43             cmd.Connection = conn
44             cmd.Transaction = trans
45
46             ' set the order details
47
48             cmd.CommandText = "INSERT INTO Order(MemberName, OrderDate, Name, Address1, Address2, Country, Total) VALUES (@MemberName, @OrderDate, @Name, @Address, @city, @Country, @Total)"
49             cmd.Parameters.Add("@MemberName", Data.SqlDbType.VarChar, 50)
50             cmd.Parameters.Add("@OrderDate", Data.SqlDbType.DateTime)
51             cmd.Parameters.Add("@Name", Data.SqlDbType.VarChar, 50)
52             cmd.Parameters.Add("@Address", Data.SqlDbType.VarChar, 255)
53             cmd.Parameters.Add("@City", Data.SqlDbType.VarChar, 15)
54             cmd.Parameters.Add("@Country", Data.SqlDbType.VarChar, 50)
55             cmd.Parameters.Add("@Total", Data.SqlDbType.Money)
56             cmd.Parameters("@MemberName").Value = User.Identity.Name
57             cmd.Parameters("@OrderDate").Value = DateTime.Now()
58             cmd.Parameters("@Name").Value = CType(Wizard1.FindControl("txtName"), TextBox).Text
59             cmd.Parameters("@Address").Value = CType(Wizard1.FindControl("txtAddress"), TextBox).Text
60             cmd.Parameters("@City").Value = CType(Wizard1.FindControl("txtCity"), TextBox).Text
61             cmd.Parameters("@Country").Value = CType(Wizard1.FindControl("txtCountry"), TextBox).Text
62             cmd.Parameters("@Total").Value = Profile.Cart.Total
63             Dim OrderID As Integer
64             OrderID = Convert.ToInt32(cmd.ExecuteScalar())
I am installing SQLServer2005_SSMSEE on my notebook with Vista Business, and near the end of the installation the process is cancelled by an error. Is it necessary to install some application first? SQL Server Express is installed by default on the notebook.
Thanks a lot, and excuse my bad English (Spanish is much better for me).
I searched and all I found is questions, not answers.
Maybe it's a silly question, but I really can't find any documentation or posts about this.
I have a scheduled job in SQL Agent that executes an SSIS package every minute. As a result, my Application log in Event Viewer gets filled very quickly.
I tried using the "/REPORTING E" option with no luck.
I tried enabling logging on the package and then selecting only the OnError event, also with no luck.
I'm setting up a maintenance plan to back up my SQL databases at night, but before the backup starts I have created a SQL Server Agent job that runs a DOS batch (.bat) file using CmdExec. The problem is that I need this process to finish before the rest of the tasks in the maintenance plan run. It seems like the SQL Server Agent job task tells the batch file to start running (and it runs fine) but then just continues to the next task and does not wait for it to finish.
Is there any way I can configure this to wait before proceeding?
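One way to get strict ordering (a sketch; every name, path, and backup command below is a placeholder) is to put the batch file and the backup into a single Agent job as consecutive steps, since a job step does not start until the previous step has returned:

-- Chain the steps inside one job so the backup step cannot start
-- until the CmdExec step has finished.
EXEC msdb.dbo.sp_add_job @job_name = N'Nightly prep and backup'

EXEC msdb.dbo.sp_add_jobstep
    @job_name = N'Nightly prep and backup',
    @step_name = N'Run prep batch file',
    @subsystem = N'CmdExec',
    @command = N'cmd.exe /c "D:\Jobs\prep.bat"',
    @on_success_action = 3      -- 3 = go to the next step

EXEC msdb.dbo.sp_add_jobstep
    @job_name = N'Nightly prep and backup',
    @step_name = N'Backup databases',
    @subsystem = N'TSQL',
    @database_name = N'master',
    @command = N'BACKUP DATABASE MyDb TO DISK = N''D:\Backups\MyDb.bak'''

EXEC msdb.dbo.sp_add_jobserver @job_name = N'Nightly prep and backup'

Also worth checking: if the batch file itself launches another program with "start" and then exits, CmdExec will report the step finished as soon as the batch returns, which looks exactly like "not waiting".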
I was wondering if anyone knew of a way to disable the following: the SQLISPackage start/finish events sent to the Windows event log every time an SSIS package is executed and completed.
I need to execute a DTS package that has a couple of steps, one of which is a process task that simply calls an .exe file I made that sends an email to warn the user.
What happens is that the process task launches my .exe file but doesn't wait for it to complete; it fires the next task and then closes.
Is there any way to make a "while statement" that waits until my .exe application finishes?
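If the DTS process task cannot be made to wait, one workaround (a sketch; the path is a placeholder, and xp_cmdshell may need to be enabled and have rights to run the program) is to launch the exe from an Execute SQL task via xp_cmdshell, which does not return until the launched command has exited:

-- xp_cmdshell blocks until the command finishes, so the following DTS step
-- cannot start early. It returns 0 on success and 1 on failure.
DECLARE @rc int
EXEC @rc = master.dbo.xp_cmdshell 'D:\Tools\SendWarningMail.exe'
IF @rc <> 0
    RAISERROR ('SendWarningMail.exe did not complete successfully.', 16, 1)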
I've built an SSIS package which generates a file from a legacy system and then downloads the file into a designated folder on the server. I need the file watcher task to wait for the file to completely finish loading before it reports completion. Currently, as soon as the file is created, the WMI step finishes.
I posted a message about this yesterday but I have more info..
I have a server (SQL 7.0, SP2) where, from anywhere I connect to it through EM, EM hangs. I can drill down into the server and even get into the Security drop-down, but if I hit Databases (or Management, etc.) to drill down, it hangs.
I ran Profiler on this server and another server. When I click the + by the Databases folder, it hangs and I get a "TSQL:BatchStarting: exec sp_MSdbuseraccess N'db', N'%'" but never a "TSQL:BatchCompleted" for that statement. (On other, working servers I do get both the starting and completed events.)
Drilling into the Security tab runs the query exec sp_MSDBUserpriv, etc., and that one works.
Does anyone have any info on exec sp_MSdbuseraccess N'db', N'%'?
I can't find anything in BOL, the Microsoft KB, TechNet, etc. Please help, as there are certain things I can't do without EM. Thanks!!!
I have been trying all day to install SQL 7.0 on an NT 4.0 SP6a server. It gets to the point where it wants to load MDAC and just goes out to lunch. No errors are generated and the rest of the system does not appear to be affected. However, the only way to end the install is to end task it. Has anyone run into this?
Some of my jobs hang at an FTP step for several hours until someone kills the FTP (it's a Perl process) on the box. Is there a way to find out how long a particular step of a job has been executing, and to somehow detect that it's hanging at that step? If we know this, I want to kill the Perl (FTP) process on the box in an automated way whenever the job hangs. Any ideas welcome! Thanks. :-)
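On SQL 2005 and later, the Agent activity tables can show which jobs are still running, how long the current run has been going, and roughly which step they are on; a sketch (on SQL 2000, EXEC msdb.dbo.sp_help_job @execution_status = 1 gives a similar, less detailed picture):

-- Jobs currently running, elapsed minutes, and an approximation of the current step
-- (last_executed_step_id is the last step that finished, so the running step is roughly +1).
SELECT  j.name AS job_name,
        ja.start_execution_date,
        DATEDIFF(minute, ja.start_execution_date, GETDATE()) AS minutes_running,
        ISNULL(ja.last_executed_step_id, 0) + 1              AS probable_current_step
FROM msdb.dbo.sysjobactivity ja
JOIN msdb.dbo.sysjobs j ON j.job_id = ja.job_id
WHERE ja.start_execution_date IS NOT NULL
  AND ja.stop_execution_date IS NULL
  AND ja.session_id = (SELECT MAX(session_id) FROM msdb.dbo.syssessions);

A monitoring job could run this every few minutes and, when minutes_running for the FTP job passes a threshold, trigger whatever script kills the Perl process on the box.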
Hi all - I'm having a problem with a DTS job that I haven't run across before. The function of the job itself is pretty basic: just moving data into a SQL table using a few lookups and mappings. The job starts fine, gets to a status of 5000 records processed, and just sits there. It doesn't go into a 'not responding' state, nor does it produce any error information in the error file I set up. Does anyone have any thoughts on what might be happening, or how to identify the specific record it may be hanging on? Help! This one is driving me nuts.
I'm running a pretty simple DTS package just to pump some data into a table. Nothing fancy, just a couple of lookups and a couple of columns being copied over. I get to 5000 records processed and the DTS package just spins its wheels. It does not go into a 'not responding' state. There are no indexes on the table and the db size is plenty big, so it is not reindexing or resizing the db at this point. Any ideas on how I can debug this one would be greatly appreciated.
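Before digging into the data itself, it is worth ruling out that the loading connection is simply blocked, for example by one of the lookup queries reading the same table that is being loaded. A quick sketch using the standard system views:

-- Any row with a non-zero "blocked" value is waiting on the SPID listed in that column.
SELECT spid, blocked, waittype, waitresource, cmd, last_batch
FROM master.dbo.sysprocesses
WHERE blocked <> 0;

-- sp_who2 shows the same picture with login and host names, which makes it
-- easier to spot the DTS connections among the sessions.
EXEC sp_who2;

If the DTS SPID shows up as blocked by another of the package's own connections, splitting the lookups onto a separate connection or committing in smaller batches is usually the next thing to try.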
I've run DBCC CHECK commands on a 5 GB database which has 2 GB of unallocated space, with no problems reported for the tables or the database. However, when I try to run DBCC SHRINKFILE (on the DATA file) from Query Analyzer, the command just runs indefinitely with no response.
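One approach that sometimes gets a stalled shrink moving (a sketch; the logical file name and target sizes are placeholders) is to shrink in small steps instead of asking for all of the free space back at once, so each command has a bounded amount of page movement to do:

-- Find the logical file names and current sizes (size is reported in 8 KB pages).
EXEC sp_helpfile;

-- Shrink the data file a couple of hundred MB at a time, lowering the target each pass.
DBCC SHRINKFILE (MyDb_Data, 4800);   -- target size in MB
DBCC SHRINKFILE (MyDb_Data, 4600);
DBCC SHRINKFILE (MyDb_Data, 4400);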
I am having problems with my DTS package hanging when it is run by the SQL Server Agent through a scheduled job. If I execute the job manually it runs fine, and if I run the package manually it runs fine. Of course there is no way to actually tell what is hung. Any ideas?
Hi everyone, I had to do a restore of a database from tape. I have the tape set to eject when it's complete; the tape ejects and the dialog box looks like the restore finished fine. But when I look at the database after the restore, it is still in the loading phase. So the restore never really took, or it looks like it hangs. Do you know why, or how to fix this problem? Thanks in advance.
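If no further backups need to be applied, a database stuck in "loading" can usually be brought online by running the recovery step explicitly; a minimal sketch (the database name is a placeholder, using SQL 7.0/2000 RESTORE syntax):

-- Recovers the database without restoring anything further and takes it
-- out of the loading state, assuming no more backups are to be applied.
RESTORE DATABASE MyDb WITH RECOVERY

A database is left in the loading state when the restore ran WITH NORECOVERY (or the final recovery phase never ran), which matches the symptom of the restore "finishing" but the database never coming online.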
My client is using SQL Server 6.5 on NT 4.0. They have recently begun to have their SQL Server sessions freeze intermittently. I have determined that only users who are accessing one particular database are freezing, which leads me to suspect a locking problem. What is the best way to pin down the locking problem (i.e. event logs, sp_lock, ...)?
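For narrowing down a blocking chain from the T-SQL side (a sketch; SPID 57 below is a placeholder for whatever blocker actually turns up), the system table and procedures that ship with 6.5 are usually enough:

-- Find blocked sessions and their blockers: a non-zero "blocked" value means
-- that SPID is waiting on the SPID listed in the blocked column.
SELECT spid, blocked, cmd, hostname, program_name
FROM master.dbo.sysprocesses
WHERE blocked <> 0

-- Then look at what the head blocker holds and what it last submitted.
EXEC sp_lock 57
DBCC INPUTBUFFER (57)

Running the first query a few times while users are frozen shows whether it is always the same head blocker (and which application it belongs to), which is normally the fastest way to find the offending transaction.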
I've done some searching, asking of friends, and searching every log file and event file I can think of. So now I'm coming here.
Recently I moved some of our databases from an NT4 box running SQL 7 to an Advanced Server 2000 box running SQL 2000. The web server is still on an NT4 box. It seems that about three times a day ASP-type files will hang on the web server. This server hosts different sites, and all ASP-type files will stop, even the ones that hit the old SQL 7 server.
Right now I have moved the web server back to pointing at the SQL 7 machine and things are going fine.
Can anyone offer me a direction to start looking? Why does it work fine against the old setup but not the new one? Is there an issue with NT4 and its IIS trying to talk to Advanced Server 2000 with SQL 2000?
I've written a recursive CTE to update a table's hierarchy data with a lineage that the CTE builds. This works and executes over 80,000 records in my local test environment in about 45 seconds. However, when I try to execute it in a "production" environment, it hangs and never completes the update (well, at least it takes longer than 5 minutes and I'm too impatient to wait). The CTE itself still runs fine and returns data if I do a SELECT *, but not when I run the update. Here's my CTE; any ideas? Is there anything in the database settings, or in field requirements, that would disallow this update?
WITH MyLineage (ID, Lineage, ParentID) AS
(
    SELECT ID,
           CAST('.' + convert(nvarchar(max), ID) + '.' AS nVarchar(max)) AS Lin,
           ParentID
    FROM Equipment
    WHERE OrgID = 2 AND ParentID IS NULL
    UNION ALL
    SELECT E.ID,
           CAST(L.Lineage + convert(nvarchar(max), E.ID) + '.' AS nVarchar(max)) AS Lin,
           E.ParentID
    FROM Equipment E
    INNER JOIN MyLineage L ON L.ID = E.ParentID
    WHERE L.ID = E.ParentID
)
UPDATE E
SET E.Lineage = L.Lineage
FROM Equipment E
INNER JOIN MyLineage L ON E.ID = L.ID
If I try to do the update like this:
UPDATE Equipment SET Lineage=(SELECT Lineage FROM MyLineage WHERE MyLineage.ID=Equipment.ID)
it will hang in my local test environment as well. It seems like the other SQL 2005 server instance is converting the join update into a nested-select update, or something. I don't know what's going on; help?
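When the update hangs on one instance but not the other, it helps to first establish whether the production session is actually blocked or just grinding through a worse plan. A sketch using the standard SQL 2005 DMVs, run from a second connection while the UPDATE appears hung:

-- Shows whether the session is blocked, what it is waiting on, and the statement it is running.
SELECT r.session_id,
       r.status,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       r.cpu_time,
       r.logical_reads,
       t.text AS running_sql
FROM sys.dm_exec_requests r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) t
WHERE r.session_id > 50;

If blocking_session_id is 0 and the logical reads keep climbing, the next step is usually to compare the actual execution plans between the two instances and make sure statistics on Equipment are up to date on the production server.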