I have a stored procedure that uses CONTAINSTABLE, and I want to make it a little more dynamic, so I was going to add a parameter that consists of the column names that need to be searched. But when I add a variable I get an error saying incorrect syntax.
Can you not use a variable as a column list? I have a variable for the search criteria and it works fine.
Here is my syntax:
CONTAINSTABLE([tablename], @columnlist, @srch)
I have been looking online and can't seem to find anything that says I can or cannot use a variable.
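As far as I can tell, the column list argument of CONTAINSTABLE has to be written literally (a single column, a parenthesised list, or *); only the search condition can be passed as a variable, which matches what you are seeing. A hedged sketch of the usual workaround, building the statement dynamically (the table and column names below are made up for illustration):

DECLARE @columnlist nvarchar(200)
DECLARE @srch nvarchar(200)
DECLARE @sql nvarchar(2000)

SET @columnlist = N'(ProductName, [Description])'   -- column list arrives as text
SET @srch = N'"widget*"'

-- Only the column list is concatenated in; the search string stays a real parameter
SET @sql = N'SELECT [KEY], RANK FROM CONTAINSTABLE(dbo.Products, '
         + @columnlist + N', @srch)'

EXEC sp_executesql @sql, N'@srch nvarchar(200)', @srch = @srch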
I have an integer variable which I had to cast as a string in order for it to show as an integer in the SQL expression (which is itself a string), so that it would INSERT as an integer. Read that five times fast!
All the variables beginning with rc are Int32 variables loaded using the Row Count task, and they are being inserted into an integer field in the table. Is this whack or what?
I was trying to cast it using DT_I4, but I couldn't find any samples, nor could I get it to work. The above is successful!
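For reference, the cast being described is usually done inside the SSIS expression itself; a hedged one-line sketch (the variable and table/column names are made up, only the (DT_WSTR, n) cast is the point):

"INSERT INTO dbo.AuditCounts (RowsLoaded) VALUES (" + (DT_WSTR, 12) @[User::rcRowsLoaded] + ")"

The Int32 value is cast to a short Unicode string so it can be concatenated into the SQL text, and it still lands in the integer column as a number when the statement runs.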
I have a client program that writes 10 records per second to a SQL Server database. I want to compute the CPU usage and memory usage for the whole program, or the CPU and memory usage for the insert statement in the program.
Hello. When I look at the Server Dashboard in SQL Server 2005 Management Studio, I see that my user databases and the msdb database account for only a very small percentage of the CPU Usage (%) and Logical IO Performed (%) pie charts.
90% of the total CPU usage is shown for Adhoc Queries. What exactly does this mean in the dashboard? If the application were using more, would that show up at the database level or not?
Sincerely, this dashboard is good. If anyone is watching it daily, please share your experiences here.
I'm working on an SSIS package that uses a VB.NET script to grab some XML from a web service (I'd explain why I'm not using a Web Service Task here, but I'd just get angry). I then want to assign the XML string to a package variable, which gets sent along to a Data Flow Task containing an XML Source that points at said variable. When I copy the XML string into the variable value in the script and do a QuickWatch on the variable (as in Dts.Variables("MyXML").Value), it looks as though the new value has been copied to the variable, but when I step out of that task and look at the Package Explorer, the variable still has its original value.
I think the problem is that the Data Flow's XML Source has a lock on the variable, so the script task isn't affecting it. Does anyone have experience with this kind of problem, or know a workaround?
I have a SQL Task that updates running totals for a record inserted by a Data Flow Task. The package runs without error, but the running totals are not calculated for the inserted row. I suspect that the inserted record is not committed until the package completes, so the SQL Task is seeing the previous record as the current one. Here is the code in the SQL Task:
DECLARE @DV INT;
SET @DV = (SELECT MAX(DateValue) FROM tblTG);
DECLARE @PV INT;
SET @PV = @DV - 1;
I've not been successful in passing an SSIS global variable to a declared parameter, but is it possible to do this:
DECLARE @DV INT;
SET @DV = ?;
DECLARE @PV INT;
SET @PV = @DV - 1;
I have almost 50 references to these parameters in the query so a substitution would be helpful.
I'm new to SSIS, but I have been programming in SQL and ASP.NET for several years. In Visual Studio 2005 Team Edition I've created an SSIS package that imports data from a flat file into the database. The original process worked, but it did not check the creation date of the import file. I've been asked to add logic that will check that date and verify that it's more recent than a value stored in the database before the import process executes.
Here are the task steps.
[Execute SQL Task] - Run a stored procedure that checks to see if the import is running. If so, stop execution. Otherwise, proceed to the next step.
[Execute SQL Task] - Log an entry to a table indicating that the import has started.
[Script Task] - Get the create date for the current flat file via the reference provided in the file connection manager. Assign that date to a global value (FileCreateDate) and pass it to the next step. This works.
[Execute SQL Task] - Compare this file date with the last file create date in the database. This is where the process breaks. This step depends on 2 variables defined at a global level. The first is FileCreateDate, which gets set in step 3. The second is a global variable named IsNewFile. That variable needs to be set in this step based on what the stored procedure this step calls finds out on the database. Precedence constraints direct behavior to the next proper node according to the TRUE/FALSE setting of IsNewFile.
If IsNewFile is FALSE, direct the process to a step that enters a log entry to a table and conclude execution of the SSIS.
If IsNewFile is TRUE, proceed with the import. There are 5 other subsequent steps that follow this decision, but since those work they are not relevant to this post. Here is the stored procedure that Step 4 is calling. You can see that I experimented with using and not using the OUTPUT option. I really don't care if it returns the value as an OUTPUT or as a field in a recordset. All I care about is getting that value back from the stored procedure so this node in the decision tree can point the flow in the correct direction.
The SSIS package passes the FileCreateDate parameter to this procedure, which then compares that parameter with the date saved in tbl_ImportFileCreateDate.
If the date is newer (or if there is no date), it updates the field in that table and returns a TRUE IsNewFile bit value in a recordset.
Otherwise it returns a FALSE value in the IsNewFile column.
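The procedure header isn't shown in the post; from the parameters described above and the EXEC calls further down, it presumably looks something like this hedged sketch (the data types and lengths are guesses), with the body that follows:

CREATE PROCEDURE dbo.p_CheckImportFileCreateDate
    @ProcessName varchar(50),
    @FileCreateDate datetime,
    @IsNewFile bit OUTPUT
AS
DECLARE @CreateDateInTable datetime   -- holds the date currently stored for this process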
SELECT @CreateDateInTable = FileCreateDate FROM tbl_ImportFileCreateDate WHERE ProcessName = @ProcessName
IF EXISTS (SELECT ProcessName FROM tbl_ImportFileCreateDate WHERE ProcessName = @ProcessName)
BEGIN
-- The process exists in tbl_ImportFileCreateDate. Compare the create dates.
IF (@FileCreateDate > @CreateDateInTable)
BEGIN
-- This is a newer file date. Update the table and set @IsNewFile to TRUE.
UPDATE tbl_ImportFileCreateDate
SET FileCreateDate = @FileCreateDate
WHERE ProcessName = @ProcessName
SET @IsNewFile = 1
END
ELSE
BEGIN
-- The file date is the same or older.
SET @IsNewFile = 0
END
END
ELSE
BEGIN
-- This is a new process for tbl_ImportFileCreateDate. Add a record to that table and set @IsNewFile to TRUE.
INSERT INTO tbl_ImportFileCreateDate (ProcessName, FileCreateDate)
VALUES (@ProcessName, @FileCreateDate)
SET @IsNewFile = 1
END
SELECT @IsNewFile
The relevant global variables in the package are defined as follows (Name : Scope : Data Type : Value):
FileCreateDate : (Package Name) : DateTime : 1/1/2000
IsNewFile : (Package Name) : Boolean : False
Setting the properties in the "Execute SQL Task Editor" has been the difficult part of this. Here are the settings.
General:
Name = Compare Last File Create Date
Description = Compares the create date of the current file with a value in tbl_ImportFileCreateDate.
TimeOut = 0
CodePage = 1252
ResultSet = None
ConnectionType = OLE DB
Connection = MyServerDataBase
SQLSourceType = Direct input
IsQueryStoredProcedure = False
BypassPrepare = True
I tried several SQL statements, suspecting it's a syntax issue. All of these failed, but with different error messages. These are the 2 most recent attempts, based on posts I was able to locate.
SQLStatement = exec ? = dbo.p_CheckImportFileCreateDate 'GL Account Import', ?, ? output
SQLStatement = exec p_CheckImportFileCreateDate 'GL Account Import', ?, ? output
Parameter Mapping:
Variable Name = User::FileCreateDate, Direction = Input, DataType = DATE, Parameter Name = 0, Parameter Size = -1
Variable Name = User::IsNewFile, Direction = Output, DataType = BYTE, Parameter Name = 1, Parameter Size = -1
Result Set is empty. Expressions is empty.
When I run this in debug mode with this SQL statement ... exec ? = dbo.p_CheckImportFileCreateDate 'GL Account Import', ?, ? output ... the following error message appears.
SSIS package "MyPackage.dtsx" starting. Information: 0x4004300A at Import data from flat file to tbl_GLImport, DTS.Pipeline: Validation phase is beginning.
Error: 0xC002F210 at Compare Last File Create Date, Execute SQL Task: Executing the query "exec ? = dbo.p_CheckImportFileCreateDate 'GL Account Import', ?, ? output" failed with the following error: "No value given for one or more required parameters.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
Task failed: Compare Last File Create Date
Warning: 0x80019002 at GLImport: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (1) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "MyPackage.dtsx" finished: Failure.
When the above is run tbl_ImportFileCreateDate does not get updated, so it's failing at some point when calling the procedure.
When I run this in debug mode with this SQL statement ... exec p_CheckImportFileCreateDate 'GL Account Import', ?, ? output ... the tbl_ImportFileCreateDate table gets updated. So I know that data piece is working, but then it fails with the following message.
SSIS package "MyPackage.dtsx" starting. Information: 0x4004300A at Import data from flat file to tbl_GLImport, DTS.Pipeline: Validation phase is beginning.
Error: 0xC001F009 at GLImport: The type of the value being assigned to variable "User::IsNewFile" differs from the current variable type. Variables may not change type during execution. Variable types are strict, except for variables of type Object.
Error: 0xC002F210 at Compare Last File Create Date, Execute SQL Task: Executing the query "exec p_CheckImportFileCreateDate 'GL Account Import', ?, ? output" failed with the following error: "The type of the value being assigned to variable "User::IsNewFile" differs from the current variable type. Variables may not change type during execution. Variable types are strict, except for variables of type Object. ". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly. Task failed: Compare Last File Create Date
Warning: 0x80019002 at GLImport: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (3) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "MyPackage.dtsx" finished: Failure.
The IsNewFile global variable is scoped at the package level and has a Boolean data type, and the Output parameter in the stored procedure is defined as a Bit. So what gives?
The "Possible Failure Reasons" message is so generic that it's been useless to me. And I've been unable to find any examples online that explain how to do what I'm attempting. This would seem to be a very common task. My suspicion is that one or more of the settings in that Execute SQL Task node is bad. Or that there is some cryptic, undocumented reason that this is failing.
I am in the middle of taking course 2073B - Programming a Microsoft SQL Server 2000 Database. I noticed that in Module 9: Implementing User-Defined Functions, exercise 2, page 25, step 2 is not returning the correct answer.
SELECT EmployeeID, Name, Title, MgrEmployeeID FROM dbo.fn_FindReports(2)
It returns rows for manager IDs 2 and 5, and I think it should return only the results for manager ID 2. The query results for step 1 are correct, but not for step 2.
Somewhere in the code I think it should compare the InEmployeeID with the previous InEmployeeID and then add a counter. If the two InEmployeeID values are not the same, then reset the counter. Then maybe add an IF statement or a CASE statement. Can you help with the logic? Thanks!
Here is the code of the function in the book:
/*
** fn_FindReports.sql
**
** This multi-statement table-valued user-defined
** function takes an EmployeeID number as its parameter
** and provides information about all employees who
** report to that person.
*/
USE ClassNorthwind
GO

/*
** As a multi-statement table-valued user-defined
** function it starts with the function name,
** input parameter definition and defines the output
** table.
*/
CREATE FUNCTION fn_FindReports (@InEmployeeID char(5))
RETURNS @reports TABLE
(
    EmployeeID char(5) PRIMARY KEY,
    Name nvarchar(40) NOT NULL,
    Title nvarchar(30),
    MgrEmployeeID int,
    processed tinyint DEFAULT 0
)
-- Returns a result set that lists all the employees who
-- report to a given employee directly or indirectly
AS
BEGIN
    DECLARE @RowsAdded int

    -- Initialize @reports with direct reports of the given employee
    INSERT @reports
    SELECT EmployeeID, Name = FirstName + ' ' + LastName, Title, ReportsTo, 0
    FROM EMPLOYEES
    WHERE ReportsTo = @InEmployeeID

    SET @RowsAdded = @@rowcount

    -- While new employees were added in the previous iteration
    WHILE @RowsAdded > 0
    BEGIN
        -- Mark all employee records whose direct reports are going to be
        -- found in this iteration
        UPDATE @reports
        SET processed = 1
        WHERE processed = 0

        -- Insert employees who report to employees marked 1
        INSERT @reports
        SELECT e.EmployeeID, Name = FirstName + ' ' + LastName, e.Title, e.ReportsTo, 0
        FROM employees e, @reports r
        WHERE e.ReportsTo = r.EmployeeID AND r.processed = 1

        SET @RowsAdded = @@rowcount

        -- Mark all employee records whose direct reports have been
        -- found in this iteration
        UPDATE @reports
        SET processed = 2
        WHERE processed = 1
    END

    RETURN   -- Provides the value of @reports as the result
END
GO
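If the aim is to see only the people who report directly to employee 2, note that the function is written to return indirect reports as well (that is presumably why rows with MgrEmployeeID = 5 show up: employee 5 reports to 2, so employee 5's own reports are included too). A hedged way to keep just the direct reports, rather than changing the function, is to filter its output:

SELECT EmployeeID, Name, Title, MgrEmployeeID
FROM dbo.fn_FindReports(2)
WHERE MgrEmployeeID = 2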
insert into #t(branchnumber) values (005)
insert into #t(branchnumber) values (090)
insert into #t(branchnumber) values (115)
insert into #t(branchnumber) values (210)
insert into #t(branchnumber) values (216)
[code]....
I have a parameter which should take multiple values and pass them to the code that I use. For this I created a parameter, and temporarily, for testing, I am passing some values into it. Using dynamic SQL I am converting the multiple values into multiple records as rows in another variable (called @QUERY). My question is: how do I insert the values from the variable into a table (table variable, temp table, or CTE)? Or is there any way to parse the multiple values into a table, so that if we pass multiple values into a parameter, they go into a table as rows?
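A hedged sketch of one way to do the parsing directly, without building a dynamic statement: on SQL Server 2005 and later the list can be turned into rows with the XML nodes() trick and inserted into a table variable (the names below are made up, and the sample list reuses the branch numbers from above):

DECLARE @List varchar(1000)
SET @List = '005,090,115,210,216'

DECLARE @Values TABLE (Item varchar(100))
DECLARE @x xml
SET @x = '<i>' + REPLACE(@List, ',', '</i><i>') + '</i>'

INSERT INTO @Values (Item)
SELECT n.v.value('.', 'varchar(100)')
FROM @x.nodes('/i') AS n(v)

SELECT Item FROM @Values   -- one row per value passed in the parameter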
Hello, I run a DTS package to copy data from Progress to SQL Server. How can I match/convert the date values so they can be compared? The field p-date has the format 'mm-dd-year'; it has the value 10/09/2001. The field s-date is in varchar format; it has the value 2001-10-09. How can I use them in a WHERE condition (SELECT ... WHERE p-date = s-date)?
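A hedged sketch of the comparison, with placeholder table names since the real ones aren't in the post: convert both varchar values to datetime with explicit style numbers before comparing (style 101 reads mm/dd/yyyy values such as 10/09/2001, style 120 reads yyyy-mm-dd values such as 2001-10-09):

SELECT p.*
FROM ProgressData p
JOIN SqlData s
  ON CONVERT(datetime, p.[p-date], 101) = CONVERT(datetime, s.[s-date], 120)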
Hello! I'm using SQL Server 2000. I have a variable which contains the name of another variable scoped in my stored procedure. I need to get the value of that other variable, namely:
CREATE TABLE #myTable (value VARCHAR(20))
INSERT INTO #myTable VALUES ('@operation')
SELECT TOP 1 @parameterValue = value FROM #myTable
-- Now @parameterValue is assigned the string '@operation'
-- Here I need some way to retrieve the value of the @operation variable I declared before
-- (in fact another process writes into the table I retrieved the value from), in this case 'DIS'
DROP TABLE #myTable
I've tried several ways but haven't succeeded yet! Please tell me there's a way to solve my problem!
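As far as I know, T-SQL cannot resolve a local variable from its name held in a string at run time, so here is a hedged sketch of a workaround: keep a small name/value lookup of the variables you declare and read from that instead (everything outside the #myTable lines is made up for illustration):

DECLARE @operation varchar(20)
SET @operation = 'DIS'

-- name/value lookup for the variables declared in this procedure
DECLARE @vars TABLE (name varchar(30), value varchar(20))
INSERT INTO @vars (name, value) VALUES ('@operation', @operation)

CREATE TABLE #myTable (value VARCHAR(20))
INSERT INTO #myTable VALUES ('@operation')

DECLARE @parameterValue varchar(20)
DECLARE @resolved varchar(20)

SELECT TOP 1 @parameterValue = value FROM #myTable                  -- the string '@operation'
SELECT @resolved = value FROM @vars WHERE name = @parameterValue    -- 'DIS'

DROP TABLE #myTable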
I have an SSIS package that creates a CSV file based on an Execute SQL Task.
The SQL is incredibly simple: SELECT singlecolumn FROM table.
In the properties of the Execute SQL Task I specify the result set as "full result set". When I run it I get this error:
The type of the value being assigned to variable "User::CSVoutput" differs from the current variable type.
Variables may not change type during execution. Variable types are strict, except for variables of type Object.
If I change the result set to "single row" then I only get the first row from the DB (the same happens if I choose "none"). If I choose XML, then I get an error that the result is not XML. What result set type do I need to choose in order to get all the rows into the CSV? The variable I am populating is of type String and is named User::CSVoutput.
I need to provide some information on how many transactions/requests SQL 7 can handle. I checked the white papers and testimonials, but I don't see any actual numbers. We have a clustered SQL 7 environment sitting on some Compaq 6400s using 4 CPUs. Our database size is only about 3.5 GB and we are using IIS 3. Does anyone know where to get this information?
Hi guys, I have a SQL Server; it's a new server with 4 processors and 4 GB of RAM. SQL Server is pinning the CPUs at 100% and I can't figure out why. I'm at MDAC 2.7 with the hotfix. I don't know what else to look at.
Hi. I'd like to know if I can use BCP to transfer the whole database to a device or file and then back to the server (I mean, using only one server) with a new character set/sort order configuration, or whether I have to do it table by table. How do I use BCP to perform this task? What commands and parameters should I use?
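As far as I know BCP itself only moves one table (or one query result) at a time, so the whole-database transfer ends up being one out/in pair per table, and character mode (-c) is generally the safer format when the character set or sort order is changing. A hedged sketch that generates the export command for every user table in the current database (the server name and path are placeholders; -c is character format, -S the server, -T a trusted connection, with -U and -P used instead of -T for SQL logins; a matching "in" list for the reload can be generated the same way):

SELECT 'bcp ' + DB_NAME() + '.dbo.' + name
     + ' out C:\export\' + name + '.dat -c -S MYSERVER -T'
FROM sysobjects
WHERE type = 'U'     -- user tables only
ORDER BY name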
I have a SQL Server 2000 machine, dual processor with hyperthreading and 2 GB of RAM. The machine is only being used as a SQL Server.
Normally the server runs at about 15-20% usage. I have now noticed a problem where SQL Server suddenly jumps to 100%; this can happen after a week or a couple of days. It requires a server reboot to fix the problem; stopping and restarting the SQL Server service will not work.
I don't know if this is part of the problem, but I have noticed that after a day or so the processor usage climbs by about 10%; if I stop the SQL service and restart it, the processor usage drops by that 10%.
Also, I have set the memory usage to 1.5 GB, but it takes a day or two for SQL Server to consume this amount of memory. I don't know if this has anything to do with it.
I am new to MSSQL. I am using it to store large amounts of data on a daily basis, which I import from a CSV file; at the rate I am going it should be about 1 GB of data a month. I noticed that as I add data to MSSQL, my RAM usage climbs by the size of the data. Is there something I have done wrong in the setup?
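For what it's worth, SQL Server keeps the data pages it has read cached in memory by design, so RAM usage climbing roughly in step with the data you load is expected behaviour rather than a setup mistake; it only gives memory back under pressure or when capped. If the machine needs memory for other things, a hedged sketch of setting a cap (the 1024 MB figure is just an example):

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
GO
EXEC sp_configure 'max server memory (MB)', 1024
RECONFIGURE
GO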
Hi all. When I click a web page (executing some stored procedure that generally takes less than 2 minutes), the CPU usage goes to 100% and it takes a long time to run. I already posted in a forum before (SQL Server 2005 running slow); I don't know whether these problems are related. If I restart the server then it runs as usual. What could the problem be? The server is Windows 2003 with SQL Server 2005. Is it because of a memory leak or some other reason?
Hi. I am having a real issue with CPU usage by SQL Server, and it is not related to a poor query.
I have a client's database that I am currently investigating some issues with. After I perform a standard task using the application, and the results have been returned to the application, the CPU usage remains at 100%. Even once the application has been completely closed down the CPU usage remains at 100%. Nothing else is happening.
I am at a complete loss as to how to proceed with investigating this issue (I have been looking at it for over a week using the SQL Server tools and Performance Monitor, and eliminating various other possibilities).
I downloaded Process Explorer, and looking at the threads for sqlservr.exe there is one in particular that is consuming all of the CPU time: MSVCRT.DLL.
I am running SQL Server 2000 SP4 on the following machine: Windows 2000 SP4, Pentium 4 3 GHz (note this is seen as 2 processors, so the reported CPU usage is 50%), 1 GB memory. I also have about 20 GB of free disk space.
One other thing: the page faults reported on the Performance tab of Process Explorer exceed 3 million. This is after running the application for around 10 minutes.
Please can anybody suggest anything at all that might help? I'm sorry there is not too much information here, but I have not been able to find out anything useful! Many thanks in advance.
Paul
Hi guys, I've got an odd SQL string that I need to produce. It is most probably simple to construct, but with it being hot in our office I simply can't get my head around it!
It's based around an online emailing facility whereby multiple hotels can be emailed via a single application. Users within the application have access rights to email only specific hotels.
The tables are laid out like this (irrelevant columns left out)...
CampaignID, CampaignName, CampaignHotelIDs
1, Test Campaign, "1,4,5,7,9"
2, Test Campaign2, "1,2"
UserID, UserName, UserHotelIDAccess
1, Test User, "1,6,7"
2, Test User, "2,7"
Now on the stats page I want to give users access to view ONLY sent campaigns which they have access to view. I was considering the IN SQL statement to achieve something like this...
WHERE CampaignHotelIDs IN UserHotelIDAccess
...but that doesn't want to work. Can anyone give me any ideas to get this working within just a single SQL query?
Cheers, @sh
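A hedged sketch of one way to get the matching logic working (the Campaigns and Users table names are guesses, and the split loop is only illustrative): break the user's UserHotelIDAccess list into rows, then keep campaigns whose CampaignHotelIDs list contains at least one of those IDs.

DECLARE @UserID int
SET @UserID = 1

DECLARE @Access varchar(1000)
SELECT @Access = UserHotelIDAccess FROM Users WHERE UserID = @UserID

-- split the comma-separated access list into one hotel ID per row
DECLARE @UserHotels TABLE (HotelID varchar(10))
WHILE LEN(@Access) > 0
BEGIN
    IF CHARINDEX(',', @Access) > 0
    BEGIN
        INSERT INTO @UserHotels VALUES (LEFT(@Access, CHARINDEX(',', @Access) - 1))
        SET @Access = SUBSTRING(@Access, CHARINDEX(',', @Access) + 1, LEN(@Access))
    END
    ELSE
    BEGIN
        INSERT INTO @UserHotels VALUES (@Access)
        SET @Access = ''
    END
END

-- a campaign matches if any of the user's hotel IDs appears in its CampaignHotelIDs list
SELECT DISTINCT c.CampaignID, c.CampaignName
FROM Campaigns c
JOIN @UserHotels u
  ON ',' + c.CampaignHotelIDs + ',' LIKE '%,' + u.HotelID + ',%'

It isn't the single statement asked for, but wrapping the split in a user-defined function (or storing one hotel ID per row in a separate table in the first place) would get it down to one query.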
I have two SQL Server 2005 instances on one server. We noticed that CPU usage is high on the server. Is there any way to know what percentage each SQL Server instance is taking?
For example: total usage is 60%, with Instance 1 at 40% and Instance 2 at 20%.
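Splitting the machine's CPU between the two instances is usually easiest from Performance Monitor (the Process: % Processor Time counter, one entry per sqlservr.exe). Within each instance, a hedged sketch like the one below (standard SQL Server 2005 DMVs) shows which statements are burning the most CPU, which is often what the number is really needed for:

SELECT TOP 10
    qs.total_worker_time / 1000 AS total_cpu_ms,
    qs.execution_count,
    SUBSTRING(st.text, qs.statement_start_offset / 2 + 1,
        (CASE qs.statement_end_offset WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset END
         - qs.statement_start_offset) / 2 + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC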
Hi all, I want to write a backup application for SQL Server. I have read the VDI specifications. I want to know whether I can use VDI just to freeze and thaw, without taking the snapshot and the backup, or whether there is any other way to do the same thing.
Good day to all. I'm new here, so I don't know if this is the right forum to post my problem in. I have a web application written using C# .NET 2005 (with AJAX). The application has a module that uploads data from an Excel file to the SQL Server 2005 database (by the way, I'm using SQL Server 2005 Express Edition); the app can upload more than 10,000 records from an Excel file. Everything was OK until it was deployed in a test environment: during a run-through of the system, the application encountered an error, after which we could not log in to the system anymore. I restarted the server (web and SQL Server on one machine running Windows XP), and then I could log in to the system again. While tracing where the problem came from, I noticed that the memory usage of sqlservr.exe increases every time the app connects to the server. I already fixed some code to close objects that might have caused the high memory usage, then I ran sp_who in Management Studio and there were still connections from the app AWAITING COMMAND. I then manually killed (using KILL spid) the connections left open by the application, but the memory usage of sqlservr did not decrease. Is there a way to release the memory used by sqlservr.exe? In ASP.NET? I have a hint that this has been causing the error. Thanks a lot.
Hi, it seems that every day SQL Server 2000 has some kind of memory leak; the memory usage creeps above 150,000 approximately three times per day. Is this normal? It starts at about 13,000.
Is there any way that I can monitor what is causing the memory usage to be so high and maybe rectify it?
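A hedged starting point for the monitoring on SQL Server 2000: the SQLServer:Memory Manager counters in Performance Monitor show total and target server memory over time, and the commands below show whether a cap is configured and break the current usage down by consumer (growth toward the configured limit is normal buffer-pool caching rather than a leak):

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
GO
EXEC sp_configure 'max server memory (MB)'   -- current cap, if any
DBCC MEMORYSTATUS                            -- memory broken down by consumer
GO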