How To Pass Large Queries Into Variables In SSIS?
Nov 26, 2007
Hi,
I want to assign the query below to a variable:
if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[<POS_DATE>]') and OBJECTPROPERTY(id, N'IsUserTable') = 1)
DROP TABLE [dbo].[<POS_DATE>]
GO
SELECT * INTO [dbo].[<POS_DATE>]
FROM SG_POS_Template
WHERE 1 = 0;
GO
where [<POS_DATE>] is a placeholder whose value will be assigned dynamically. Could anybody please help me out?
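A minimal T-SQL sketch of one way to handle the dynamic table name (an illustration, not the poster's package): build the statement as a string, QUOTENAME the table name, and run it with sp_executesql. The same string could equally be assembled in an SSIS expression and assigned to a String variable feeding an Execute SQL Task. Note that GO is a batch separator for the client tools only and cannot appear inside the string; the sample table name is made up.

DECLARE @tableName sysname
DECLARE @sql nvarchar(max)

SET @tableName = N'POS_20071126'   -- hypothetical run-time value for <POS_DATE>

SET @sql = N'IF OBJECT_ID(N''[dbo].' + QUOTENAME(@tableName) + N''', N''U'') IS NOT NULL
    DROP TABLE [dbo].' + QUOTENAME(@tableName) + N';
SELECT * INTO [dbo].' + QUOTENAME(@tableName) + N'
FROM dbo.SG_POS_Template
WHERE 1 = 0;'

EXEC sp_executesql @sql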
View 5 Replies
Jan 14, 2008
Can anyone please help me with how to pass the database connection through variables to SSIS, and with calling SSIS from Visual Studio .NET?
Regards
Ravi
View 1 Replies
View Related
Jun 1, 2006
Hi
Does anybody know how to pass values from ASP.NET to SSIS package variables?
I currently have an SSIS package for monitoring a Windows service, and for that
I have to pass the server IP address, user name, password, and service name as parameters.
I would like to pass these parameters from an interface at run time.
Please help with this problem.
Regards
Deepu M.I
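A hedged sketch of one way to wire this up without leaving T-SQL (the same dtexec-from-xp_cmdshell pattern that appears later in this thread): the ASP.NET page calls a stored procedure with the four values, and the procedure pushes each one into a package variable. The package path and variable names are made-up placeholders, and xp_cmdshell must be enabled on the server.

CREATE PROCEDURE dbo.usp_RunServiceMonitorPackage
    @ServerIP    varchar(50),
    @UserName    varchar(50),
    @Password    varchar(50),
    @ServiceName varchar(100)
AS
BEGIN
    SET NOCOUNT ON
    DECLARE @cmd varchar(2000)
    -- build the dtexec call, overriding one package variable per /set switch
    SET @cmd = 'dtexec /f "d:\packages\MonitorService.dtsx"'
             + ' /set \Package.Variables[User::ServerIP].Properties[Value];"'    + @ServerIP    + '"'
             + ' /set \Package.Variables[User::UserName].Properties[Value];"'    + @UserName    + '"'
             + ' /set \Package.Variables[User::Password].Properties[Value];"'    + @Password    + '"'
             + ' /set \Package.Variables[User::ServiceName].Properties[Value];"' + @ServiceName + '"'
    EXEC master..xp_cmdshell @cmd
END

The ASP.NET page would then simply execute this procedure (e.g. through a SqlCommand) with the four values collected from the interface.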
View 1 Replies
View Related
Nov 30, 2007
I'm trying to understand how to use SSIS variables within SQL commands and which Data Flow toolbox objects I can use for this. My simple package has a OLE DB Source object that returns the rows in a table. For the next operation, the output from the OLE DB Source is input into a Lookup data flow transformation where there is a select query that returns another value from a different table...
select bla from myTable where dbname = db_name()
The output is then passed on to an OLE DB Destination.
The above lookup query only works because, as it happens, I can get a unique and correct result back filtered by the current database name. However, due to other required changes, I now have to introduce more WHERE clauses. I have all the necessary clause information in SSIS variables (iterated via a For Each loop). I would like to do something like...
select bla from myTable where dbname = ? and bla_type = ?
However, there isn't a facility for Parameters in the Lookup object's 'use results of a SQL query'. I can see that I can use parameters in the OLE DB Source object by changing the 'data access mode' to SQL Query. However, it doesn't seem valid to join two OLE DB Source objects together in the data flow? I can't see any other suitable objects in the Data Flow toolbox. Ideally, I would be able to use Parameters in the Lookup object or some other object that would do the same job.
Thanks,
Clive
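One hedged workaround (a sketch, not a tested package): since the OLE DB Source does accept parameters in SQL-command mode, the lookup can be folded into the source query instead, with the two SSIS variables mapped to the ? markers. The source table name below is an assumption, and the CROSS JOIN relies on the filtered lookup still returning exactly one row, as it does today.

SELECT s.*, l.bla
FROM dbo.SourceTable AS s            -- hypothetical name of the table behind the existing OLE DB Source
CROSS JOIN (SELECT bla
            FROM myTable
            WHERE dbname   = ?       -- parameter 0: database-name variable
              AND bla_type = ?       -- parameter 1: type variable from the For Each loop
           ) AS l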
View 6 Replies
View Related
Aug 8, 2013
How do I pass variables from a parent package to a child package, and from the child back to the parent? Is this possible in SSIS 2012? I need this only in SSIS 2012 ...
View 6 Replies
View Related
Feb 27, 2008
I'm new to SSIS, but have been programming in SQL and ASP.Net for several years. In Visual Studio 2005 Team Edition I've created an SSIS package that imports data from a flat file into the database. The original process worked, but did not check the creation date of the import file. I've been asked to add logic that will check that date and verify that it's more recent than a value stored in the database before the import process executes.
Here are the task steps.
[Execute SQL Task] - Run a stored procedure that checks to see if the import is running. If so, stop execution. Otherwise, proceed to the next step.
[Execute SQL Task] - Log an entry to a table indicating that the import has started.
[Script Task] - Get the create date for the current flat file via the reference provided in the file connection manager. Assign that date to a global value (FileCreateDate) and pass it to the next step. This works.
[Execute SQL Task] - Compare this file date with the last file create date in the database. This is where the process breaks. This step depends on 2 variables defined at a global level. The first is FileCreateDate, which gets set in step 3. The second is a global variable named IsNewFile. That variable needs to be set in this step based on what the stored procedure this step calls finds out on the database. Precedence constraints direct behavior to the next proper node according to the TRUE/FALSE setting of IsNewFile.
If IsNewFile is FALSE, direct the process to a step that enters a log entry to a table and conclude execution of the SSIS.
If IsNewFile is TRUE, proceed with the import. There are 5 other subsequent steps that follow this decision, but since those work they are not relevant to this post.
Here is the stored procedure that Step 4 is calling. You can see that I experimented with using and not using the OUTPUT option. I really don't care if it returns the value as an OUTPUT or as a field in a recordset. All I care about is getting that value back from the stored procedure so this node in the decision tree can point the flow in the correct direction.
CREATE PROCEDURE [dbo].[p_CheckImportFileCreateDate]
/*
The SSIS package passes the FileCreateDate parameter to this procedure, which then compares that parameter with the date saved in tbl_ImportFileCreateDate.
If the date is newer (or if there is no date), it updates the field in that table and returns a TRUE IsNewFile bit value in a recordset.
Otherwise it returns a FALSE value in the IsNewFile column.
Example:
exec p_CheckImportFileCreateDate 'GL Account Import', '2/27/2008 9:24 AM', 0
*/
@ProcessName varchar(50)
, @FileCreateDate datetime
, @IsNewFile bit OUTPUT
AS
SET NOCOUNT ON
--DECLARE @IsNewFile bit
DECLARE @CreateDateInTable datetime
SELECT @CreateDateInTable = FileCreateDate FROM tbl_ImportFileCreateDate WHERE ProcessName = @ProcessName
IF EXISTS (SELECT ProcessName FROM tbl_ImportFileCreateDate WHERE ProcessName = @ProcessName)
BEGIN
-- The process exists in tbl_ImportFileCreateDate. Compare the create dates.
IF (@FileCreateDate > @CreateDateInTable)
BEGIN
-- This is a newer file date. Update the table and set @IsNewFile to TRUE.
UPDATE tbl_ImportFileCreateDate
SET FileCreateDate = @FileCreateDate
WHERE ProcessName = @ProcessName
SET @IsNewFile = 1
END
ELSE
BEGIN
-- The file date is the same or older.
SET @IsNewFile = 0
END
END
ELSE
BEGIN
-- This is a new process for tbl_ImportFileCreateDate. Add a record to that table and set @IsNewFile to TRUE.
INSERT INTO tbl_ImportFileCreateDate (ProcessName, FileCreateDate)
VALUES (@ProcessName, @FileCreateDate)
SET @IsNewFile = 1
END
SELECT @IsNewFile
The relevant Global Variables in the package are defined as follows:
Name : Scope : Data Type : Value
FileCreateDate : (Package Name) : DateTime : 1/1/2000
IsNewFile : (Package Name) : Boolean : False
Setting the properties in the "Execute SQL Task Editor" has been the difficult part of this. Here are the settings.
General
Name = Compare Last File Create Date
Description = Compares the create date of the current file with a value in tbl_ImportFileCreateDate.
TimeOut = 0
CodePage = 1252
ResultSet = None
ConnectionType = OLE DB
Connection = MyServerDataBase
SQLSourceType = Direct input
IsQueryStoredProcedure = False
BypassPrepare = True
I tried several SQL statements, suspecting it's a syntax issue. All of these failed, but with different error messages. These are the 2 most recent attempts based on posts I was able to locate.
SQLStatement = exec ? = dbo.p_CheckImportFileCreateDate 'GL Account Import', ?, ? output
SQLStatement = exec p_CheckImportFileCreateDate 'GL Account Import', ?, ? output
Parameter Mapping
Variable Name = User::FileCreateDate, Direction = Input, DataType = DATE, Parameter Name = 0, Parameter Size = -1
Variable Name = User::IsNewFile, Direction = Output, DataType = BYTE, Parameter Name = 1, Parameter Size = -1
Result Set is empty.
Expressions is empty.
When I run this in debug mode with this SQL statement ...
exec ? = dbo.p_CheckImportFileCreateDate 'GL Account Import', ?, ? output
... the following error message appears.
SSIS package "MyPackage.dtsx" starting.
Information: 0x4004300A at Import data from flat file to tbl_GLImport, DTS.Pipeline: Validation phase is beginning.
Error: 0xC002F210 at Compare Last File Create Date, Execute SQL Task: Executing the query "exec ? = dbo.p_CheckImportFileCreateDate 'GL Account Import', ?, ? output" failed with the following error: "No value given for one or more required parameters.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
Task failed: Compare Last File Create Date
Warning: 0x80019002 at GLImport: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (1) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "MyPackage.dtsx" finished: Failure.
When the above is run tbl_ImportFileCreateDate does not get updated, so it's failing at some point when calling the procedure.
When I run this in debug mode with this SQL statement ...
exec p_CheckImportFileCreateDate 'GL Account Import', ?, ? output
... the tbl_ImportFileCreateDate table gets updated. So I know that data piece is working, but then it fails with the following message.
SSIS package "MyPackage.dtsx" starting.
Information: 0x4004300A at Import data from flat file to tbl_GLImport, DTS.Pipeline: Validation phase is beginning.
Error: 0xC001F009 at GLImport: The type of the value being assigned to variable "User::IsNewFile" differs from the current variable type. Variables may not change type during execution. Variable types are strict, except for variables of type Object.
Error: 0xC002F210 at Compare Last File Create Date, Execute SQL Task: Executing the query "exec p_CheckImportFileCreateDate 'GL Account Import', ?, ? output" failed with the following error: "The type of the value being assigned to variable "User::IsNewFile" differs from the current variable type. Variables may not change type during execution. Variable types are strict, except for variables of type Object.
". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
Task failed: Compare Last File Create Date
Warning: 0x80019002 at GLImport: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the number of errors raised (3) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "MyPackage.dtsx" finished: Failure.
The IsNewFile global variable is scoped at the package level and has a Boolean data type, and the Output parameter in the stored procedure is defined as a Bit. So what gives?
The "Possible Failure Reasons" message is so generic that it's been useless to me. And I've been unable to find any examples online that explain how to do what I'm attempting. This would seem to be a very common task. My suspicion is that one or more of the settings in that Execute SQL Task node is bad. Or that there is some cryptic, undocumented reason that this is failing.
Thanks for your help.
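For what it is worth, one workaround often suggested for this bit/Boolean mapping trouble (a hedged sketch, not a verified fix for this exact package) is to drop the OUTPUT mapping entirely, set ResultSet to "Single row", and let the procedure's final SELECT carry the flag back, leaving only the date as a mapped input parameter:

DECLARE @IsNewFile bit
EXEC dbo.p_CheckImportFileCreateDate
     'GL Account Import',        -- @ProcessName
     ?,                          -- @FileCreateDate, mapped to User::FileCreateDate (Input)
     @IsNewFile OUTPUT           -- keep the OUTPUT local to this batch
SELECT CAST(@IsNewFile AS int) AS IsNewFile   -- Result Set tab: map column "IsNewFile" (or index 0) to a variable

If the Boolean variable still rejects the value, a common fallback is to change User::IsNewFile to Int32 and test it as 0/1 in the precedence constraint expressions.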
View 5 Replies
View Related
Jul 31, 2006
Hi
We already use the following query statements for the source and lookup in Oracle DataStage Server. There is parameter mapping in the lookup SQL statement. How can I achieve the following query statements in SSIS?
Query 1: (source View Query)
SELECT
V_RDP_GOLD_PRICE.GDR_PRODUCT_ID, V_RDP_GOLD_PRICE.ASSET_TYPE, V_RDP_GOLD_PRICE.PREFERENCE_SEQ, V_RDP_GOLD_PRICE.RDP_PRICE_SOURCE, TO_CHAR(V_RDP_GOLD_PRICE.PRICE_DATE_TIME,'YYYY-MM-DD HH24:MI:SS'), TO_CHAR(V_RDP_GOLD_PRICE.REPORT_DATE,'YYYY-MM-DD HH24:MI:SS'), V_RDP_GOLD_PRICE.SOURCE_SYSTEM_ID
FROM
V_RDP_GOLD_PRICE V_RDP_GOLD_PRICE
WHERE
REPORT_DATE = (select max(report_date) from V_RDP_GOLD_PRICE where source_system_id = 'RM' )
Query 2: (look up )
SELECT
GDR_PRODUCT_ID,
TO_CHAR(MAX(PRICE_DATE_TIME),'YYYY-MM-DD HH24:MI:SS') ,
TO_CHAR(REPORT_DATE,'YYYY-MM-DD HH24:MI:SS')
FROM
V_RDP_GOLD_PRICE
where
GDR_PRODUCT_ID = :1 and
report_date = TO_DATE(:2,'YYYY-MM-DD HH24:MI:SS') AND
PRICE_DATE_TIME BETWEEN (TO_DATE(:2,'YYYY-MM-DD HH24:MI:SS') - 7) AND TO_DATE(:2,'YYYY-MM-DD HH24:MI:SS')
GROUP BY GDR_PRODUCT_ID, TO_CHAR(REPORT_DATE,'YYYY-MM-DD HH24:MI:SS')
Could anyone please give a sample control flow and show how to pass the parameters?
Thanks & regards
Jeyakumar.M
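A hedged sketch of the lookup rewritten with SSIS-style parameter markers, assuming the parameterized query runs in an OLE DB Source (for example feeding a Merge Join) rather than the Lookup transformation, which does not accept parameters in its query, and assuming the Oracle provider behind the connection accepts ? markers. OLE DB parameters are positional, so the repeated :2 bind variable becomes separate ? markers that are each mapped to the same date variable:

SELECT GDR_PRODUCT_ID,
       TO_CHAR(MAX(PRICE_DATE_TIME),'YYYY-MM-DD HH24:MI:SS'),
       TO_CHAR(REPORT_DATE,'YYYY-MM-DD HH24:MI:SS')
FROM V_RDP_GOLD_PRICE
WHERE GDR_PRODUCT_ID = ?                                                -- :1
  AND report_date = TO_DATE(?,'YYYY-MM-DD HH24:MI:SS')                  -- :2
  AND PRICE_DATE_TIME BETWEEN (TO_DATE(?,'YYYY-MM-DD HH24:MI:SS') - 7)  -- :2 again
                          AND TO_DATE(?,'YYYY-MM-DD HH24:MI:SS')        -- :2 again
GROUP BY GDR_PRODUCT_ID, TO_CHAR(REPORT_DATE,'YYYY-MM-DD HH24:MI:SS')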
View 1 Replies
View Related
Aug 8, 2007
Hello,
I posted about this issue previously but had no success, so I thought I would explain everything properly this time in a new post. Thanks.
I have created a stored procedure which passes variables to the ssis package and then executes the package.
The two variables inside the ssis package are @FileName and @ConnectionPath
As you can see from the below stored procedure, xp_cmdshell is used to execute the package.
If only the first variable is used in the package and the @ConnectionPath value is hardcoded inside the package, then the package runs fine.
Problem is in this particular call as you see below because @ConnectionPath is included.
The output of print is:
dtexec /f d:sysapplCEMSSISImportsTradesBaseProfiles2.dtsx /set Package.Variables[User::FileName].Properties[Value];"d:ApplDataCEMWorkingTempprofiles.csv"
/set Package.Variables[User::ConnectionPath].Properties[Value];"Data Source=servername1instancename1, 2025;Initial Catalog=CounterpartyExposure;Provider=SQLNCLI.1;Integrated Security=SSPI;Auto Translate=False;"
Error is:
Error: 2007-08-08 08:46:01.29
Code: 0xC0202009
Source: BaseProfiles2 Connection manager "CounterpartyExposure"
Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft OLE DB Provider for ODBC Drivers" Hresult: 0x80004005 Description: "[Microsoft][ODBC Driver Manager] Data source name not found and no default driver specified".
End Error
If only the printed output is run in Query Analyzer, then the error is:
The identifier that starts with 'Data Source=gblond088sjyMSQL_curves_DEV1, 2025;Initial Catalog=CounterpartyExposure;Provider=SQLNCLI.1;Integrated Security=SSPI' is too long. Maximum length is 128.
/*********************************************************************************
uspCEMTradeExecutePackage2 'd:sysapplCEMSSISImportsTradesStaticMappingOverride.dtsx',
'StaticMappingOverride.csv',
'Data Source=servername1instancename1, 2025;Initial Catalog=CounterpartyExposure;Provider=SQLNCLI.1;Integrated Security=SSPI;Auto Translate=False;'
*********************************************************************************/
ALTER procedure [dbo].[uspCEMTradeExecutePackage2]
@FullPackagePath varchar(1000),
@FileName varchar(500),
@ConnectionPath varchar(1000)
as
declare @returncode int
declare @cmd varchar(1000)
declare @FilePath varchar(1000)
declare @FullFilePath varchar(1000)
set @FilePath = 'd:ApplDataCEMWorkingTemp'
set @FullFilePath = @FilePath + @FileName
print ' ----------- ' + @FileName
set @cmd = 'dtexec /f ' + @FullPackagePath + ' /set Package.Variables[User::FileName].Properties[Value];"' + @FullFilePath + '"'
set @cmd = 'dtexec /f ' + @FullPackagePath +
' /set Package.Variables[User::FileName].Properties[Value];"' + @FullFilePath + '"
/set Package.Variables[User::ConnectionPath].Properties[Value];"' + @ConnectionPath + '"'
print @cmd
set nocount on
begin try
exec @returncode = master..xp_cmdshell @cmd
end try
begin catch
exec @LastGoodVersionSP
DECLARE @msg nvarchar(200)
SET @msg = ('Error during execute package')
EXECUTE uspErrorReporter @msg
end catch
set nocount off
View 3 Replies
View Related
Oct 20, 2000
Good morning one and all,
Does anyone know whether, when a SQL pass-through query is executed in Access (against a SQL Server database), Access will wait until that call is completed before running other steps (in a macro, say)? I don't think it does, but would appreciate a more definitive answer.
TIA for any and all help
Gurmi
View 6 Replies
View Related
Mar 2, 2005
I am trying to pass variables into a statement that grabs data from a table in one database and puts it into the same table in another database, based on an evaluation like a WHERE clause, but it's not working. What am I doing wrong?
Here is the code:
if exists (select * from dbo.sysobjects where id = object_id(N'[dbo].[ClientComment]')
and OBJECTPROPERTY(id, N'IsUserTable') = 1)
Declare @BDFR varchar(20), @BDTO varchar(20), @EQID varchar(20), @TABLEDESC varchar(20), @DBO varchar(20)
set @TABLEDESC = 'ClientComment'
set @DBO = '.dbo.'
set @BDFR = 'Commander' + @DBO + @TABLEDESC
set @BDTO = 'Test_Commander'+ @DBO + @TABLEDESC
set @EQID = '80_300_113'
insert into @BDTO
select from @BDFR where Eqid = @EQID
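For reference, T-SQL cannot take a table name from a variable directly; the INSERT has to be assembled as a string and run as dynamic SQL, and the SELECT needs its column list (or *). A minimal sketch along the lines of the posted code:

DECLARE @BDFR varchar(200), @BDTO varchar(200), @EQID varchar(20), @TABLEDESC varchar(50), @DBO varchar(20), @sql nvarchar(1000)
SET @TABLEDESC = 'ClientComment'
SET @DBO = '.dbo.'
SET @BDFR = 'Commander' + @DBO + @TABLEDESC
SET @BDTO = 'Test_Commander' + @DBO + @TABLEDESC
SET @EQID = '80_300_113'

-- table names are concatenated into the statement; the Eqid value stays a real parameter
SET @sql = N'INSERT INTO ' + @BDTO + N' SELECT * FROM ' + @BDFR + N' WHERE Eqid = @EQID'
EXEC sp_executesql @sql, N'@EQID varchar(20)', @EQID = @EQID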
View 5 Replies
View Related
Jan 26, 2004
Fairly new to triggers and stored procedures and so far I haven't had to pass any variables or even call an SP from a trigger. The problem I have is to pass a 'callid' variable from an insert trigger to a stored procedure.
The SP is to delete any rows that already exist that contain the passed 'callid' and then insert the new inserted data (using variables again hopefully).
Is this at all possible and if so what is the syntax for passing the variable from the trigger and then reading it into the stored procedure ?
TIA
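It is possible. Below is a minimal sketch with hypothetical names (dbo.Calls as the table carrying the trigger, dbo.CallLog as the table the procedure maintains, and callid/calltext as columns), since the post does not give the schema; the trigger reads the inserted pseudo-table and passes each callid to the procedure:

CREATE PROCEDURE dbo.usp_RefreshCallRows
    @callid int
AS
BEGIN
    SET NOCOUNT ON
    DELETE FROM dbo.CallLog WHERE callid = @callid                  -- remove rows already stored for this call
    INSERT INTO dbo.CallLog (callid, calltext)
    SELECT callid, calltext FROM dbo.Calls WHERE callid = @callid   -- re-insert the newly inserted data
END
GO

CREATE TRIGGER trg_Calls_Insert ON dbo.Calls
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON
    DECLARE @callid int
    -- inserted can hold several rows, so walk the distinct callids it contains
    DECLARE c CURSOR LOCAL FAST_FORWARD FOR SELECT DISTINCT callid FROM inserted
    OPEN c
    FETCH NEXT FROM c INTO @callid
    WHILE @@FETCH_STATUS = 0
    BEGIN
        EXEC dbo.usp_RefreshCallRows @callid = @callid
        FETCH NEXT FROM c INTO @callid
    END
    CLOSE c
    DEALLOCATE c
END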
View 14 Replies
View Related
Aug 17, 2006
I have a project that uses GUID's througout and I'm completely stumped.
1) I create a "batch" GUID to batch the records I'm about to process.
2) I call a web service on a remote machine, and reserve the batch records by inserting the batch GUID into a string ---works fine
3)I call another web service that returns the rows that I just reserved as XML objects and insert into a string variable
4)I need to use the "batch"GUID variable which is typed as a string (DT_WSTR) as an added column so in a Data Flow Task I do the following:
a) use the XML string variable as the source of a XML Source Task -- works (now that I'm passing custom objects and not a dataset -- curious as to why I can't consume a dataset but thats a different question)
This is where things get tricky:
I've tried to add the BatchGUID as a derived column, as a datatype of DT_WSTR (unicode string) and convert it to uniqueidentifier in the Data Conversion task --- error, cannot convert unicode to uniqueidentifier (I know I can in C# and SQL Server so why not here).
I've tried to CAST the BatchGUID as a uniqueidentifier and pass that to the datasource -- again conversion error.
I've tried using the type Object and Casting to anything and that doesn't work either.
I've tried to pass the unicode all the way to the SQL Server Destination -- and insert into UniqueIdentifier field... again no go.
All help would be appreciated, at this point I can't see any way of using a UniqueIdentifier as a key field, and maintaining it through the package...
Is this a bug?
Oh and if you want to have some real fun, try returning the type of UniqueIdentifier as an output parameter using a ADO.NET connection.
Thanks!
View 4 Replies
View Related
Feb 29, 2008
I've got the following piece of ASP code, which updates a table through the standard edit/update command field options.
<asp:SqlDataSource ID="SqlDataSource1" runat="server"
    ConnectionString="<%$ ConnectionStrings:CAT_SYSTEMConnectionString %>"
    OldValuesParameterFormatString="original_{0}"
    SelectCommand="sp_diplomaViewQualifications"
    UpdateCommand="UPDATE PsnQualifications SET quantity=@quantity WHERE rowstatus=1 and qualId=@original_qualId"
    DeleteCommand="UPDATE PsnQualifications SET rowStatus=0, lastUpdateOn=getdate(), lastUpdateBy=@createdBy WHERE rowstatus=1 and qualId=@original_qualId"
    SelectCommandType="StoredProcedure">
    <UpdateParameters>
        <asp:Parameter Name="original_qualId" />
        <asp:Parameter Name="quantity" />
    </UpdateParameters>
    <DeleteParameters>
        <asp:Parameter Name="original_qualId" />
        <asp:Parameter Name="createdBy" DefaultValue="TEST" />
    </DeleteParameters>
    <SelectParameters>
        <asp:Parameter Name="masterKey" />
    </SelectParameters>
</asp:SqlDataSource>
My problem is that I need to pass the contents of the VB variable 'createdBy' into the DeleteParameters option. This is defined in the code as:
Dim createdBy As String = getUserLoginName(Me.Page)
How do I pass this into the update and/or delete part of the command field on the ASP page?
View 3 Replies
View Related
Apr 6, 2000
Is there a way that I can prevent people from running pass-through queries on my SQL Server - without removing linked server access? I simply want to make sure that the processing occurs on the client's machine, not on my server....
Thanks!
Dean
View 1 Replies
View Related
Jul 28, 2004
Hi All,
I'm currently in the middle of building quite a large CMS using ASP.NET and MSSQL2K and have begun to question whether the number of queries I am using to build one page is too many.
For one page (View Forum) I am getting all of the templates and checking access, then pulling a list of threads, getting the first and last posts, then user info for the first and last posts... anyway, to view 10 threads on the page the number of queries comes to about 54 and the page takes 0.064 seconds to load.
My question is: is this too many queries to be running for a single page load? All queries are using stored procedures.
Thanks Guys.
View 3 Replies
View Related
Nov 30, 2004
Hey Guys:
--- Not sure if this should be moved to webforms forum, or if it belongs here ---
Alright, I have been dealing with this issue for a few days now, and have found a few solutions but they all seem to throw different errors so I figured I'd ask here.
What I am trying to do is have a web form where users enter data, have that data passed across forms, and then have it displayed and inserted into a database on another form. The first form has an asp:RangeValidator control built dynamically, so I cannot simply take off the tags and use the old style.
Eventually the user will be directed to a PayPal form and, upon successful completion, be redirected to the page with the insert command within it, but for now, passing it to a second page for review and then inserting it will work.
I am not sure how to accomplish this; a tutorial or a code example would be great!! I have thought about panels, creating public objects, etc., but all the solutions I have found have one issue or another when I attempt to create them.
I'm using asp.net 1.1, VB.net and SQL server.
Thanks,
Brian Sierakowski
View 1 Replies
View Related
Sep 13, 2004
I want to pass command-line values into a stored procedure run through osql.
The command-line values are the values for the insert stored procedure to write to a table.
Is it possible to do this?
View 1 Replies
View Related
Jan 15, 2008
Dear All!
I use a "Execute SQL Task" to call a Stored Procedure. My Stored Procedure have an input parameter with type: ntext. And in "Execute SQL Task", I set variable in Parameter Mapping as following:
Variable Name: User::xmlDocument (this variable is a String to store xml data)
Direction: Input
DataType: NVARCHAR
ParameterName: 0
When the length of "User::xmlDocument" is too large an error occurs; otherwise the "Execute SQL Task" runs successfully.
So, can you show me how to pass large data into a stored procedure using the "Execute SQL Task"?
I am looking forward to hearing from you
Thanks
View 20 Replies
View Related
Jul 23, 2005
Is there any easy way to pass parameters (dynamically) to pass-through queries when working with MS Access as a front end for SQL Server? Thanks.
View 1 Replies
View Related
Jul 26, 2007
I'm working with a table with about 60 million records. This monster is growing every minute of the day as well, by 200,000 - 300,000 records/day. It's 11 columns wide, and has one index on a datetime column. My task is to create some custom reports based on three of these columns, including the datetime one.
The problem is response time. Any query executed on this table takes forever--anywhere between 30 seconds and 4 minutes. Queries such as this one below, as simple as it is, can take a minute or more:
select
count(dt_date) as Searches
from
SearchRecords
where
datediff(day,getdate(),dt_date)=0
As the table gets larger and larger, the response time is going to get worse and worse. Long story short, what are my options for getting query times down to just a few seconds with a table this big? So far the best I can come up with is to index any other appropriate columns (of which there is one for sure, maybe two).
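The biggest win for the sample query is likely to make the filter sargable: DATEDIFF is currently applied to the dt_date column, which prevents SQL Server from seeking the existing index on it. A hedged rewrite of that one query with the same logic, comparing the bare column against a precomputed day range:

DECLARE @today datetime, @tomorrow datetime
SET @today = DATEADD(day, DATEDIFF(day, 0, GETDATE()), 0)   -- midnight this morning
SET @tomorrow = DATEADD(day, 1, @today)

SELECT COUNT(dt_date) AS Searches
FROM SearchRecords
WHERE dt_date >= @today
  AND dt_date < @tomorrow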
View 6 Replies
View Related
May 31, 2007
Hi,
I currently have a large table (35 million rows, over 80GB). I have one varchar(max) column on the table that is used in the fulltext index.
To query the complete index is fast, for example:
SELECT 'ipod', COUNT(*)
FROM CONTAINSTABLE(MyDB.dbo.Contents, [Body], 'ipod') CT
This took 70 seconds (which I can live with). However, I seldom run queries like this, most are more like:
SELECT 'ipod', COUNT(*)
FROM CONTAINSTABLE(MyDB.dbo.Contents, [Body], 'ipod') CT
JOIN Pages ITP ON ITP.PageID = CT.[Key]
JOIN Feeds ITF ON ITP.IPID = ITF.IPID
JOIN Buyers ITB ON ITB.IBID = ITF.IBID
WHERE ITB.ID IN (1342,246)
These queries are much slower (this example took 17 minutes). I understand that FT searches the index and returns all rows that match the query to SQL. SQL then performs the joins and counts only the correct results. (Correct me if I'm wrong here).
One solution I've seen to this to put data or "tags" into the FT column - so my Body column would become something like:
'{ID:1342}' + [Body]
That sounds like a very good idea. I could then change the 2nd query above to be:
SELECT 'ipod', COUNT(*)
FROM CONTAINSTABLE(MyDB.dbo.Contents, [Body], '("ID:1342" OR "ID:246") AND "ipod"') CT
That all works well until I want to select 1000 different IDs, because the FT query becomes very long and complex. Also, I'm only including one column (ID) in this example, but I have about 7 or 8 columns that I would need to include in these "tags". Querying multiple columns becomes very complex quickly, and no doubt I will reach a query limit at some point.
If anyone has any other suggestions on the above I'd love to hear them. Another thought I'm having is to partition the table. I can find very little online about how FT behaves on partitioned tables - I fear it behaves exactly the same. What I'd like to think is that I could partition the table on an ID, say 100 per partition or something, and then fulltext would only search the relevant partitions. If it behaves like this it may work. If no-one knows then I'll give it a go, but this will take me a while due to the table size - so I'm hoping one of you clever lot know!
Many thanks for any advice.
Simon
View 2 Replies
View Related
Jan 27, 2005
Hi,
I'm writing an Access pass-through query against a SQL Server back end and I need some advice on passing parameters. Currently I use VBA to substitute the literal values for the parameters prior to passing the query to SQL Server. However, I am going through a loop thousands of times with different literals for these parameters, which causes the server's cache to fill up. In Oracle, there is a way to use bind variables for the parameters so that only one copy of the query is cached.
Does anyone know how I can do this in SQL Server?
For instance, I have 20,000 employees and I'm pulling info by SS#:
Select * from EmpTable where SS_number = [SSN]
Is there a way I can pass this query to SQL Server and then pass the value of [SSN] as I loop through the dataset?
Thanks.
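SQL Server's closest equivalent to Oracle bind variables is sp_executesql: the statement text stays constant, so one plan is cached and reused while only the parameter value changes on each pass through the loop. A minimal sketch (the varchar(11) type for SS_number is an assumption):

EXEC sp_executesql
     N'SELECT * FROM dbo.EmpTable WHERE SS_number = @SSN',   -- constant statement text, cached once
     N'@SSN varchar(11)',                                    -- parameter definition
     @SSN = '123-45-6789'                                    -- only this value changes per iteration

The VBA loop would then only substitute the literal after @SSN = rather than rewriting the whole statement.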
View 5 Replies
View Related
Jul 4, 2007
Hi,
We are running SQL Server 2005 Ent Edition with SP2 on a Windows 2003 Ent. Server SP2 with Intel E6600 Dual core CPU and 4GB of RAM. We have an C# application which perform a large number of calculation that run in a loop. The application first load transactions that needs to be updated and then goes to each one of the rows, query another table get some values and update the transaction.
I have set a limit of 2GB of RAM for SQL server and when I run the application, it performs 5 records update (the process described above) per second. After roughly 10,000 records, the application slows down to about 1 record per second. I have tried to examine the activity monitor however I can't find anything that might indicate what's causing this.
I have read that there are some known issues with Hyper-Threaded CPUs; however, since my CPU is dual-core, I do not know if the issue applies to those CPUs too, and I have no way to disable one core in the BIOS.
The only thing that I have noticed is that if I change the Max Degree of Parallelism when the server slows down (i.e. from 0 to 1 and then back to 0), the server speeds up for another 10,000 record updates and then slows down. Does anyone have an idea of what's causing it? What does the property change do that makes the server speed up again?
If there is no solution for this problem, does anyone know if there is a stored procedure or anything else than can be used programmatically to speed up the server when it slows down? (This is not the optimal solution however I will use it as a workaround)
Any advice will be greatly appreciated.
Thanks,
Joe
View 3 Replies
View Related
Mar 30, 2008
I have a test page where I'm using SqlConnection and SqlCommand to update a simple database table (decrease a number).
I'm trying to figure out how to make a number in the database table to decrease by 1 each time a button is being pressed. I know how to update the number by whatever I want to, but I have no idea what the correct syntax is for putting variables inside the query etc. Like "count -1" for instance.
The database table is called "friday" and the one and only column is called "Tickets".
Here's the code behind the button:

protected void Button1_Click(object sender, EventArgs e)
{
    SqlConnection conn;
    conn = new SqlConnection("Data Source=***;Initial Catalog=***;Persist Security Info=True;User ID=***;Password=***");
    conn.Open();
    int count = -1;
    SqlCommand cmd = new SqlCommand("select Tickets from friday", conn);
    count = (int)cmd.ExecuteScalar();
    if (count > 0)
    {
        string updateString = @"
            update friday
            set Tickets = 500";   // <------ Here I want to set Tickets like "current count - 1"
        SqlCommand cmd2 = new SqlCommand(updateString);
        cmd2.Connection = conn;
        cmd2.ExecuteNonQuery();
    }
    else
    {
    }
    conn.Close();
}
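For the decrement itself, the arithmetic can be done inside the UPDATE statement, so no value needs to be read first. A minimal sketch of the SQL the button handler could send with ExecuteNonQuery (the WHERE clause keeps Tickets from going negative):

UPDATE friday
SET Tickets = Tickets - 1   -- subtract one from the current value in a single atomic statement
WHERE Tickets > 0

ExecuteNonQuery returns the number of rows affected, so a result of 0 would mean the event is sold out.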
View 3 Replies
View Related
May 20, 2008
Hello,
We have created several Table Valued User Defined Functions in a Production SQL Server 2005 DB that are returning large (tens of thousands of) rows obtained through a web service. Our code is based on the MSDN article Extending SQL Server Reporting Services with SQL CLR Table-Valued Functions .
What we have found in our implementations of variations of this code on three seperate servers is that as the rowset grows, the length of time required to return the rows grows exponentially. With 10 columns, we have maxed out at approximately 2 500 rows. Once our rowset hit that size, no rows were being returned and the queries were timing out.
Here is a chart comparing the time elapsed to the rows returned at that time for a sample trial i ran:
Sec / Actual Rows Returned
0 0
10 237
20 447
30 481
40 585
50 655
60 725
70 793
80 860
90 940
100 1013
110 1081
120 1115
130 1151
140 1217
150 1250
160 1325
170 1325
180 1430
190 1467
200 1502
210 1539
220 1574
230 1610
240 1645
250 1679
260 1715
270 1750
280 1787
290 1822
300 1857
310 1892
320 1923
330 1956
340 1988
350 1988
360 2022
370 2060
380 2094
390 2094
400 2130
410 2160
420 2209
430 2237
440 2237
450 2274
460 2274
470 2308
480 2342
490 2380
500 2380
510 2418
520 2418
530 2451
540 2480
550 2493
560 2531
570 2566
It took 570 seconds (just over 9 1/2 minutes to return 2566 rows).
The minute breakdown during my trial is as follows:
1 = 655 (+ 655)
2 = 1081 (+ 426)
3 = 1325 (+244)
4 = 1610 (+285)
5 = 1822 (+212)
6 = 1988 (+166)
7 = 2160 (+172)
8 = 2308 (+148)
9 = 2451 (+143)
As you can tell, except for a few discrepancies to the resulting row count at minutes 4 and 7 (I will attribute these to timing as the results grid in SQL Management Studio was being updated once every 5 seconds or so), as time went on, fewer and fewer rows were being returned in a given time period. This was a "successful" run as the entire rowset was returned but on more than several occasions, we have reached the limit and have had 0 new rows per minute towards the end of execution.
Allow me to explain the code in further detail:
[SqlFunction(FillRowMethodName = "FillListItem")]
public static IEnumerable DiscoverListItems(...)
{
ArrayList listItems = new ArrayList();
SPToSQLService service = new SPToSQLService();
[...]
DataSet itemQueryResult = service.DoItemQuery(...); // This is a synchronous call returning a DataSet from the Web Service
//Load the DS to the ArrayList
return listItems;
}
public static void FillListItem(object obj, out string col1, out string col2, out string col3, ...)
{
ArrayList item = (ArrayList) obj;
col1 = item.Count > 0 ? (string) item[0] : "";
col2 = item.Count > 0 ? (string) item[1] : "";
col3 = item.Count > 0 ? (string) item[2] : "";
[...]
}
As you will notice, the web service is called, and the DataSet is loaded into an ArrayList object (containing ArrayList objects) before the main ArrayList is returned by the UDF method. There are 237 rows returned within 10 seconds, which leads me to believe that all of this has occurred within 10 seconds: the DiscoverListItems method has executed completely and the ArrayList is now being iterated through by the code calling the FillListItem method. I believe that this code is causing the result set to be returned at a decreasing rate. I know that the DiscoverListItems code is only being executed once and that the web service is only being called once.
Now alot of my larger queries ( > 20 000 rows) have timed out because of this behaviour, and my workaround was to customize my web service to page the data in reasonable chunks and call my UDF's in a loop using T-SQL. This means calling the Web Service up to 50 times per query in order to return the result set.
Surely someone else who has used table-valued UDFs has come across this problem. I would appreciate some feedback from someone in the know as to whether I'm doing something wrong in my code, or how to optimize SQL Server properly to allow for better performance with CLR functions.
Thanks,
Dragan Radovic
View 7 Replies
View Related
Aug 18, 2015
I pull data from SQL Server through a query, and I want to pass a region parameter to the Power Pivot connection query so that I can automatically pull the required region's data. The parameter should pick up its value from an Excel range. Also, how can I control this through VBA?
View 4 Replies
View Related
Mar 30, 2015
Our monitoring tool shows that our production system is periodically experiencing a large paging rate - up to 800 memory pages/sec. How can I find out which particular queries, stored procedures, or processes initiate this?
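As a caveat, Memory\Pages/sec is an operating-system counter for hard page faults across the whole box, so the cause may not be SQL Server at all. If it is, one hedged starting point is the SQL Server 2005+ plan-cache DMVs, ranking cached statements by how much they read:

SELECT TOP (20)
       qs.total_physical_reads,
       qs.total_logical_reads,
       qs.execution_count,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                        WHEN -1 THEN DATALENGTH(st.text)
                        ELSE qs.statement_end_offset
                   END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_physical_reads DESC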
View 3 Replies
View Related
Feb 22, 2006
I am trying to get the contents of Excel files dynamically and dump them into the SQL database using SSIS. Through a WMI Event Watcher, I can detect when one or more Excel files are dumped in a particular folder, and using a ForEach Loop Container I am able to take all the filenames and pass them through variables. But at the same time, in the Data Flow, I have to pass each sheet of an Excel file to the Excel Source control and export the data to my SQL database using an OLE DB Destination.
For that I need to get the names of each sheets in an Excel File and pass it to the Excel Source Control through variables. But when I give Data Access Mode as "Table name or view name variable" and provide the variable name in that, then it is giving an error message as "A destination table name has not been provided".
And at the same time, since I was not able to provide a static filename (as I am passing it through variables), when I tried to map the columns in the OLE DB Destination, it did not allow me to map the columns.
So all these things I should do at Run-time using Variables in SSIS. I don't want to hard-code any filenames or Sheet names. If any one of you have a solution, please share with me.
Thanks & Regards,
Prakash Srinivasan
View 3 Replies
View Related
Mar 9, 2007
I have an Execute SQL Task that returns a Full Rowset from a SQL Server table and assigns it to a variable objRecs. I connect that to a foreach container with an ADO enumerator using objRecs variable and Rows in first table mode. I defined variables and mapped them to the columns.
I tested this by placing a Script task inside the foreach container and displaying the variables in a messagebox.
Now, for each row, I want to write a record to an MS Access table and then update a column back in the original SQL Server table where I retreived data in the Execute SQL task (i have the primary key). If I drop a Data Flow Task inside my foreach container, how do I pass the variables as input to an OLE DB Destination on the Data Flow?
Also, how would I update the original source table where source.id = objRecs.id?
Thank you for your assistance. I have spent the day trying to figure this out (and thought it would be simple), but I am just not getting SSIS. Sorry if this has been covered.
Thanks,
Steve
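For the second half (updating the original SQL Server row), one hedged option is an OLE DB Command transformation inside the Data Flow, or an Execute SQL Task back in the control flow, running a parameterized statement; with OLE DB the marker is ?, mapped by ordinal to the id column or variable. The table and column names below are assumptions:

UPDATE dbo.SourceTable    -- hypothetical name of the original SQL Server table
SET exported = 1          -- hypothetical status column to set per processed row
WHERE id = ?              -- maps to the id of the current row from objRecs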
View 17 Replies
View Related
Jul 6, 2000
Hello,
I am facing a huge problem in my SQL Server database using Access as a front end. The main problem is trying to execute queries ("views"), since they reside on SQL Server now, and using variables or parameters in reports and forms to filter on these queries.
Ex.
How can the following be implemented using the same query, but in SQL Server?
Access
------
SELECT MAT_Charts.YYYYMM
FROM MAT_Charts
WHERE ((([Area_Code] & "-" & [GROUP_CODE])=[Reports]![MAT_Chart_C1].[MAT_Key]))
GROUP BY MAT_Charts.YYYYMM;
It is specifically this statement in which I am interested:
[GROUP_CODE])=[Reports]![MAT_Chart_C1].[MAT_Key]))
Thank you very much for your concern.
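One hedged way to reproduce this on SQL Server (a sketch; the procedure name and parameter length are assumptions) is to turn the saved query into a parameterized stored procedure and pass the form's MAT_Key value from Access instead of referencing the form inside the SQL:

CREATE PROCEDURE dbo.usp_MAT_Charts_ByKey
    @MAT_Key varchar(50)          -- assumed length; match the real key
AS
BEGIN
    SET NOCOUNT ON
    SELECT YYYYMM
    FROM MAT_Charts
    WHERE Area_Code + '-' + GROUP_CODE = @MAT_Key   -- T-SQL uses + instead of Access's & for concatenation
    GROUP BY YYYYMM
END

Access would then call it with a pass-through query whose text is built in VBA, for example EXEC dbo.usp_MAT_Charts_ByKey '<value of Reports!MAT_Chart_C1.MAT_Key>'.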
View 2 Replies
View Related
Jul 12, 2006
I'd like to pass a parameter value from an ASP.NET application to an SSIS package. My SSIS package pulls data from SQL and loads it into a flat file based on the parameter value.
Is this possible?
View 3 Replies
View Related
Feb 25, 2007
I have a SSIS package that imports (appends) data to a SQL table from an ODBC source. This package is run from within a VS project.
I'd like to pass a parameter to this package to limit the records it imports. I have seen some discussion of this but I need a detailed step-by-step or a good working example. Specifically:
1. How to get the parameter to the package to begin with. The VB function that runs the SSIS only passes parameters to the stored procedure that executes the SSIS.
2. How to refer to the parameter value in the SQL statement of the data reader object in the SSIS package. Instead of saying 'Select * from table1' it needs to say 'Select * from table1 WHERE field1 = @passedparam'
View 5 Replies
View Related