Slowness In SSIS Package And The Importance Of Disk Maintenance

Sep 22, 2006

While repeatedly testing a package that deletes from and inserts into several tables over the course of several days, my package, which originally took 45 minutes to load 1700 XML files, began to take over 6 hours. It turned out to be an I/O bottleneck: the Avg. Disk Queue Length was around 200 and I was incurring many PAGEIOLATCH_EX waits. My dev machine uses a single local disk with no RAID, so I had no hardware options there, but I ran the maintenance wizard to rebuild indexes and statistics, defragmented the hard drive, and got back to my original 45-minute load time. I guess I'll have to put a maintenance plan together to do this nightly.
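A minimal T-SQL sketch of the kind of check and nightly maintenance described above, assuming SQL Server 2005; the staging table name is a hypothetical placeholder:

-- Confirm PAGEIOLATCH pressure before and after maintenance
SELECT wait_type, waiting_tasks_count, wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type LIKE 'PAGEIOLATCH%'
ORDER BY wait_time_ms DESC

-- Rebuild indexes and refresh statistics on the table the package loads
ALTER INDEX ALL ON dbo.XmlStaging REBUILD          -- dbo.XmlStaging is a placeholder name
UPDATE STATISTICS dbo.XmlStaging WITH FULLSCAN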

-Kory

View 1 Replies



Running Out Of Disk Space During SSIS Package Execution

Oct 3, 2006

Hi all,

I'm running out of disk space when running an SSIS package. Is there any way to select where temp files are saved during package execution?

View 8 Replies View Related

Is It Possible To Move My SQL 2000 Database (On C Disk) To Another Disk (D Disk)?

Dec 28, 2006

Hello, all.
I am new to SQL 2000. I installed the SQL 2000 database on the C disk, but now I find the free space on C is getting smaller, so I want to move my database (both data and structure) from the C disk to the D disk, which has plenty of space.
Is it possible to do this?
If it can be done, do I need to change my ASP.NET program source code (for example, my Crystal Reports connection string)?
Thanks in advance!
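A minimal sketch of one common way to do this in SQL 2000, using detach/attach; the database name and file paths here are hypothetical, so substitute your own. The ASP.NET/Crystal Reports connection string normally does not need to change, because the server name and database name stay the same.

-- 1. Detach the database
EXEC sp_detach_db 'MyDatabase'
-- 2. Move MyDatabase.mdf and MyDatabase_log.ldf from C:\ to D:\ in Windows
-- 3. Reattach from the new location
EXEC sp_attach_db 'MyDatabase',
     'D:\SQLData\MyDatabase.mdf',
     'D:\SQLData\MyDatabase_log.ldf'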
 
 
 
      

View 1 Replies View Related

Job Running SSIS Package Keeps Failing But The SSIS Package By Itself Runs Perfectly Fine

Aug 30, 2006

Hey, I have a few jobs that call SSIS packages. If I run an SSIS package on its own, it runs fine, but if I run the job that calls that package, it fails. Can someone help me troubleshoot this issue? None of my jobs that call an SSIS package work; all of them fail.
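This is usually a permissions difference: in BIDS the package runs under your own account, but the job step runs under the SQL Server Agent service account unless a proxy is assigned. A hedged sketch of creating an SSIS proxy on SQL Server 2005; the credential, account, and password are hypothetical:

-- Server-level credential mapped to a Windows account that can run the package
CREATE CREDENTIAL SSISRunner WITH IDENTITY = 'DOMAIN\SSISUser', SECRET = 'StrongPassword'
GO
USE msdb
GO
EXEC dbo.sp_add_proxy @proxy_name = 'SSISProxy', @credential_name = 'SSISRunner'
EXEC dbo.sp_grant_proxy_to_subsystem @proxy_name = 'SSISProxy', @subsystem_id = 11   -- 11 = SSIS package execution
-- Then set "Run as" on the job step to SSISProxy and re-run the job.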

Thank you

Tej

View 7 Replies View Related

Subquery -> Slowness

Feb 24, 2006

I have some queries that involve subqueries to select the appropriate record from a bunch of candidates. For example the following, which selects the most recent transaction for a given customer:

CREATE TABLE #Customer (CustID int, OtherInfo int)
INSERT #Customer SELECT 1,7
INSERT #Customer SELECT 2,8
INSERT #Customer SELECT 3,9

CREATE TABLE #Event (CustID int, EventID int, EventAmt int, EventDt smalldatetime)
INSERT #Event SELECT 1,1,1000,'20060224'
INSERT #Event SELECT 2,1,2000,'20060224'
INSERT #Event SELECT 3,2,3000,'20060224'
INSERT #Event SELECT 3,1,5000,'20060225'

SELECT c.CustID, c.OtherInfo, e.EventAmt, e.EventDt
FROM #Customer c
JOIN #Event e ON c.CustID = e.CustID
WHERE EventDt = (SELECT MAX(EventDt) FROM #Event WHERE CustID = c.CustID)
ORDER BY c.CustID

Over millions of customers and events, this takes forever. Creating a temp table which identifies the appropriate dates, and then joining to that, speeds things up considerably -- the following two queries together take a lot less time than the one above.

CREATE TABLE #Temp (CustID int, EventDt smalldatetime)
INSERT #Temp SELECT CustID, MAX(EventDt) FROM #Event GROUP BY CustID

SELECT c.CustID, c.OtherInfo, e.EventAmt, e.EventDt
FROM #Customer c
JOIN #Temp t ON t.CustID = c.CustID
JOIN #Event e ON c.CustID = e.CustID AND t.EventDt = e.EventDt

This seems pretty simple, and I would assume the query planner should be able to figure it out without assistance. Is there a better way to formulate the original query? BTW, as far as I can tell the indexes are appropriate for these queries; however, the subqueries seem to be done one at a time.
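A hedged sketch of folding that pre-aggregation into the original statement as a derived table, which often gives the optimizer the same shortcut without a separate temp-table step (actual behavior still depends on the indexes on #Event):

SELECT c.CustID, c.OtherInfo, e.EventAmt, e.EventDt
FROM #Customer c
JOIN (SELECT CustID, MAX(EventDt) AS EventDt
      FROM #Event
      GROUP BY CustID) t ON t.CustID = c.CustID
JOIN #Event e ON e.CustID = c.CustID AND e.EventDt = t.EventDt
ORDER BY c.CustID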

View 2 Replies View Related

SQL Server Slowness Problem

Sep 22, 2006

Can any expert tell me why the query I captured in Profiler, in the form below,

declare @P1 int
set @P1=180150002
declare @P2 int
set @P2=1
declare @P3 int
set @P3=16388
declare @P4 int
set @P4=0
exec sp_cursoropen @P1 output, N'SELECT .... ', @P2 output, @P3 output, @P4 output
select @P1, @P2, @P3, @P4

takes so much longer than when it is run in Query Analyzer? I tested the SELECT query from the form above in Query Analyzer and it took only 26 seconds, whereas Profiler shows it taking 3 minutes to run. My production server currently faces this slowness problem and hits timeouts whenever the query comes in as an RPC from the application.



-Tension

View 2 Replies View Related

Insert/ Update/ Delete Slowness.

Jul 10, 2006

SQL2K sp4

Howdy all. I opened a 200 MB file in Query Analyzer that is full of INSERTs, UPDATEs, and DELETEs. I tried just to parse it, and killed it after 18 hours. There is no blocking. All of the appropriate indexes exist; I even removed them and retried just in case. The box is plenty powerful for this task. Does anyone have any ideas?
I've tried several times with no luck. At the top of the file is SET IMPLICIT_TRANSACTIONS ON, and then every 10,000 statements there is a COMMIT WORK. I've tried lowering the number of statements per commit with no luck. This works fine on smaller files (3-20 MB).
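One hedged thing to try: loading a 200 MB script into the Query Analyzer editor is itself expensive, so running the file from the command line with osql streams it instead of holding it all in the editor. The server name and paths are hypothetical:

osql -S MyServer -E -i C:\Scripts\BigLoad.sql -o C:\Scripts\BigLoad.out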

View 1 Replies View Related

Slowness On Multi Processor Boxes.

Aug 16, 2006

SQL2K
SP4

Howdy all. I have a query with bizarre results in Query Analyzer.

Box 1: 4 x 3.0 GHz processors, 4 GB of RAM. Results never come back.
Box 2: 8 x 3.0 GHz processors, 16 GB of RAM. Results never come back.

Both of the above boxes are extremely underutilized. The absolute maximum CPU usage is 25%. Box 1 had 1 GB of free RAM and Box 2 had 7 GB free.

Box 3: 1 x 3.0 GHz processor, 1 GB of RAM. Results in 15 seconds.
Box 4: all the same as Box 3.

I took a backup of the DB and restored it from box to box to box, so I know everything is identical.

I once had a deadlock issue where I had to use the MAXDOP hint, and I tried that here, but it didn't help.
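For reference, a hedged sketch of the two usual ways to cap parallelism on SQL 2000 when a parallel plan misbehaves on multi-processor boxes; the query and table below are placeholders:

-- Per query: hint on the problem statement
SELECT CustID, SUM(Amount)
FROM dbo.SomeLargeTable            -- hypothetical table
GROUP BY CustID
OPTION (MAXDOP 1)

-- Server-wide: limit the degree of parallelism for all queries
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'max degree of parallelism', 4
RECONFIGURE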

All ideas are appreciated.

TIA, cfr

View 1 Replies View Related

Count Slowness Using CROSS APPLY

May 13, 2008



Hello,

I am doing a report that uses paging, and in order to optimize it I used ROW_NUMBER() so it returns x rows per page. To compute the number of pages needed, I have to count the total number of rows, which gets very slow because I'm using a CROSS APPLY with a table-valued function. Is there any way to get the total number of rows processed by ROW_NUMBER() so I don't need a separate COUNT?
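One hedged option, assuming SQL Server 2005 and hypothetical object names: compute the total with a windowed COUNT(*) OVER () in the same statement as ROW_NUMBER(), so the page and the row count share a single pass over the expensive CROSS APPLY:

SELECT *
FROM (
    SELECT ROW_NUMBER() OVER (ORDER BY t.SomeKey) AS RowNum,
           COUNT(*) OVER ()                       AS TotalRows,  -- total without a second query
           t.SomeKey, f.SomeValue
    FROM dbo.SomeTable t
    CROSS APPLY dbo.SomeTVF(t.SomeKey) AS f       -- hypothetical table-valued function
) AS paged
WHERE RowNum BETWEEN 21 AND 30                    -- page 3 at 10 rows per page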

Thanks in advance !

View 3 Replies View Related

Database Optimization Maintenance Plan Package.

Jan 29, 2008

This is how I have arranged my database optimization maintenance plan package:

1. Backup database
2. Check database integrity
3. Reorganize Index
4. Clean up old files
5. Update statistics
6. Rebuild Index.

Are these steps in the right order? Are there any steps that are unnecessary and should be taken out?
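One hedged observation a quick sketch makes clear: a full index rebuild already refreshes the statistics on the rebuilt indexes with the equivalent of a full scan, so an Update Statistics step placed before Rebuild Index (steps 5 and 6 above) ends up recomputing the index statistics twice. For example, in SQL 2005:

ALTER INDEX ALL ON dbo.SomeTable REBUILD                -- hypothetical table; also rebuilds index statistics
UPDATE STATISTICS dbo.SomeTable WITH FULLSCAN, COLUMNS  -- only column statistics still need a separate pass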

View 1 Replies View Related

SSIS Error For Insufficient Disk Space

Jan 10, 2008



Hello,
I am testing my SSIS package, but I got a disk space issue (the C disk is over 100 GB):
Error: Date Time
Code: 0xC004704A
Source: xxxxDTS.Pipeline
Description: The buffer manager cannot extend the file "C:DTSxxxF.tmp" to length xxxxxx. There was insufficient disk space.
End Error
Error: Date Time
Code: 0x80070070
Source: xxxxDTS.Pipeline
Description: There is not enough space on the disk.
etc....

How can I solve this problem?
Is there any way to use a different path for the .tmp files?

Thank,
any help will be very appreciated.

View 7 Replies View Related

SSIS ERROR : Overflowed The Disk I/O Buffer, DTS_E_PRIMEOUTPUTFAILED

Nov 1, 2007

Hi,
I get the following error while my SSIS package is executing, uploading data from flat files to SQL Server 2005. The error goes away when I change my SSIS package's connection manager to read Unicode data files.
Is there a smart way to figure out which flat files contain Unicode data and which do not?

Thanks,
Vinod


Information: 0x402090DC at Upload EP Data, DAT File Reader [1]: The processing of file "C:Data2EP05PF2000002_070412_002921.dat" has started.
Information: 0x4004300C at Upload EP Data, DTS.Pipeline: Execute phase is beginning.
Error: 0xC020209C at Upload EP Data, DAT File Reader [1]: The column data for column "SYSTEM_LOCKED" overflowed the disk I/O buffer.
Error: 0xC0202091 at Upload EP Data, DAT File Reader [1]: An error occurred while skipping data rows.
Error: 0xC0047038 at Upload EP Data, DTS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "DAT File Reader" (1) returned error code 0xC0202091. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
Error: 0xC0047021 at Upload EP Data, DTS.Pipeline: SSIS Error Code DTS_E_THREADFAILED. Thread "SourceThread0" has exited with error code 0xC0047038. There may be error messages posted before this with more information on why the thread has exited.
Error: 0xC0047039 at Upload EP Data, DTS.Pipeline: SSIS Error Code DTS_E_THREADCANCELLED. Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown. There may be error messages posted before this with more information on why the thread was cancelled.
Error: 0xC0047021 at Upload EP Data, DTS.Pipeline: SSIS Error Code DTS_E_THREADFAILED. Thread "WorkThread0" has exited with error code 0xC0047039. There may be error messages posted before this with more information on why the thread has exited.
Information: 0x40043008 at Upload EP Data, DTS.Pipeline: Post Execute phase is beginning.
Information: 0x402090DD at Upload EP Data, DAT File Reader [1]: The processing of file "C:Data2EP05PF2000002_070412_002921.dat" has ended.

View 13 Replies View Related

SSIS Maintenance Plan

Jul 12, 2007

Hey, I'm currently in charge of setting up maintenance plans on 100-plus servers. What I would like to do is build an SSIS project that would create the maintenance plans on these servers. Is this possible? I would like to do it this way so the maintenance plans are not run from a centralized package but as local jobs. If anybody knows how to do this or has any ideas, they would be greatly appreciated.

-Kyle

View 3 Replies View Related

SID Importance

Mar 12, 2001

Hi,
I am using SQL 7.0 with SP2. I just started as a SQL DBA. I have a question: what is the importance of SIDs? When we map SQL logins to database user IDs, how should we take SIDs into account?
Please suggest a good article or any other pointers...
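The SID is what ties a database user to its server login; if a database is restored onto another server, the names can match while the SIDs do not, leaving orphaned users. A hedged sketch for SQL 7.0/2000, with a hypothetical user name:

-- List database users whose SID no longer matches a server login
EXEC sp_change_users_login 'Report'

-- Re-link an orphaned user to the login of the same name
EXEC sp_change_users_login 'Update_One', 'AppUser', 'AppUser'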

Thanks!

View 1 Replies View Related

Error While Executing SSIS Package From Other SSIS Package

Jan 10, 2007

Hi,

In our project we have two SSIS packages.

There is an Execute Package task in the first package that calls the execution of the second package.

I am continuously receiving the error "Failed to decrypt protected XML node "PackagePassword" with error 0x8009000B "Key not valid for use in specified state.". You may not be authorized to access this information. This error occurs when there is a cryptographic error. Verify that the correct key is available."

As we run the first package from a job, the job runs successfully but logs the above error.

The protection level of second package is set to "EncryptSensitiveWithUserKey"

Can anybody please suggest how to handle it?

View 4 Replies View Related

Does Anyone Use SSIS For Database Schema Maintenance?

Mar 1, 2007

We currently use SSIS to build DTS packages in which we store changes to our database schema, as well as scripts that need to be run upon each release. This works well for small sets of changes that never need to be updated, or for architectures with only one database.

We store each of the changes included in the package in separate files, which are tracked using version control. It is growing time consuming to maintain parity between those files and what is in the SSIS package.

Furthermore, we have been unable to discover an easy way to load a file's contents into a package SQL Task without opening the file and copy-pasting the contents into a new SQL Task.

ANY information at all would be extremely appreciated!

View 1 Replies View Related

Calculating COUNTER Physical Disk: AVG. DISK QUEUE LENGTH

Sep 10, 2007

If I return the Average, Minimum, and Maximum values for the counter Physical Disk: Avg. Disk Queue Length, and those values are 10, 0, and 87 respectively, which value do I use to compute the Avg. Disk Queue Length for a 4-disk array (RAID 10): Average, Minimum, or Maximum? The disk (LUN) is on a SAN.
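A hedged worked example of the usual rule of thumb, assuming the Average value is the one to trend (Min and Max just show the extremes) and that the counter is divided by the number of spindles:

SELECT 10.0 / 4 AS AvgQueueLengthPerSpindle   -- = 2.5 outstanding I/Os per spindle in the array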

View 1 Replies View Related

Order By Importance

Apr 27, 2007

Hi, I have an association rules mining model and I want to order the output by importance. Here is the SELECT statement:

SELECT

[RelatedOrder].[Order Line]

From

[RelatedOrder]

NATURAL PREDICTION JOIN

(SELECT (SELECT 888 AS [Product ID]) AS [Order Line]) AS t



thanx

View 3 Replies View Related

Maintenance Plan Tasks In SSIS(Sql 2005)

Oct 10, 2006

I am finding it difficult to set the expression property "SelectedDatabases" on the Check Database Integrity task in SSIS.

I keep getting error:

TITLE: Expression Builder
------------------------------

Expression cannot be evaluated.


ADDITIONAL INFORMATION:

The data type of variable "User::varDataset" is not supported in an expression.

Reading the variable "User::varDataset" failed with error code 0xC00470D0.

(Microsoft.DataTransformationServices.Controls)

I need the flexibility to control the task behavior at runtime through variables.

Any suggestions, or even a different approach, would be helpful.

cheers

aigbor

View 1 Replies View Related

Get Total Disk Size And Free Disk Space

Nov 13, 2007

-- Initialize Control Mechanism
DECLARE @Drive TINYINT,
        @SQL VARCHAR(100)

SET @Drive = 97

-- Setup Staging Area
DECLARE @Drives TABLE
(
    Drive CHAR(1),
    Info VARCHAR(80)
)

WHILE @Drive <= 122
BEGIN
    SET @SQL = 'EXEC XP_CMDSHELL ''fsutil volume diskfree ' + CHAR(@Drive) + ':'''

    INSERT @Drives
    (
        Info
    )
    EXEC (@SQL)

    UPDATE @Drives
    SET Drive = CHAR(@Drive)
    WHERE Drive IS NULL

    SET @Drive = @Drive + 1
END

-- Show the expected output
SELECT Drive,
       SUM(CASE WHEN Info LIKE 'Total # of bytes : %' THEN CAST(REPLACE(SUBSTRING(Info, 32, 48), CHAR(13), '') AS BIGINT) ELSE CAST(0 AS BIGINT) END) AS TotalBytes,
       SUM(CASE WHEN Info LIKE 'Total # of free bytes : %' THEN CAST(REPLACE(SUBSTRING(Info, 32, 48), CHAR(13), '') AS BIGINT) ELSE CAST(0 AS BIGINT) END) AS FreeBytes,
       SUM(CASE WHEN Info LIKE 'Total # of avail free bytes : %' THEN CAST(REPLACE(SUBSTRING(Info, 32, 48), CHAR(13), '') AS BIGINT) ELSE CAST(0 AS BIGINT) END) AS AvailFreeBytes
FROM (
    SELECT Drive,
           Info
    FROM @Drives
    WHERE Info LIKE 'Total # of %'
) AS d
GROUP BY Drive
ORDER BY Drive

E 12°55'05.25"
N 56°04'39.16"

View 16 Replies View Related

Should The Quorum Disk Be A Physical Disk Or Majority Node Set?

Nov 15, 2006

Hello,

I am trying to set up a test cluster and am having an issue. When I try to create a physical disk resource, it takes both drive E: and drive Q: and doesn't separate them into two physical disk resources. This means that when I try to associate the quorum disk, it links to the physical disk resource containing both E and Q. Then, when I try to install SQL 2005, I get the warning about installing SQL on the quorum disk. Am I missing something? Is there a way to separate E and Q onto two physical disk resources so I can specifically associate the quorum with Q and SQL with E, or should I be setting the quorum to a Majority Node Set? Thanks in advance.

John

View 4 Replies View Related

Importance Of Sqlctr.h File

Sep 30, 2004

Hi all -- there is a file in the C:\Program Files\Microsoft SQL Server\MSSQL\Binn directory called sqlctr.h which contains a lot of counter parameters. Could anyone tell me what its importance is, and whether we can change any of the parameters to gain performance?
Thanks in advance..

// This file is generated by the description file processor.
// Please do not edit.

#define BUFMGR_OBJECT 0
#define BUF_RESERVED_PAGE_COUNT 2
#define BUF_CHECKPOINT_WRITES 4
#define BUF_AWE_LOOKUP_MAPS 6
#define BUF_BLOCK_WRITES 8
#define BUF_COMMITTED_PAGE_COUNT 10
#define BUF_AWE_UNMAP_CALLS 12
#define BUF_TARGET_PAGE_COUNT 14
#define BUF_AWE_UNMAP_PAGES 16
#define BUF_CACHE_RATIO_BASE 18
#define BUF_FREELIST_STALLS 20
#define BUF_HASHED_PAGE_COUNT 22
#define BUF_LIFE_EXPECTANCY 24
#define BUF_CACHE_HIT_RATIO 26
#define BUF_AWE_WRITE_MAPS 28
#define BUF_PAGE_REQUESTS 30
#define BUF_STOLEN_PAGE_COUNT 32
#define BUF_BLOCK_READS 34
#define BUF_NUM_FREE_BUFFERS 36
#define BUF_LAZY_WRITES 38
#define BUF_READAHEAD_PAGES 40
#define BUF_AWE_STOLEN_MAPS 42
#define BUF_PROCCACHE_SIZE 44
#define BUFPART_OBJECT 46
#define BUFPART_NUM_FREE_BUFFERS 48
#define BUFPART_FREE_BUFFERS_USED 50
#define BUFPART_FREE_BUFFERS_EMPTY 52
#define GENERAL_OBJECT 54
#define GO_LOGINS 56
#define GO_LOGOUTS 58
#define GO_USER_CONNECTIONS 60
#define LOCKS_OBJECT 62
#define LCK_TOTAL_WAITTIME 64
#define LCK_NUM_WAITS 66
#define LCK_AVERAGE_WAITTIME_BASE 68
#define LCK_NUM_DEADLOCKS 70
#define LCK_NUM_TIMEOUTS 72
#define LCK_NUM_REQUESTS 74
#define LCK_AVERAGE_WAITTIME 76
#define DBMGR_OBJECT 78
#define DB_REPLTRANS 80
#define DB_DBCC_SCANRATE 82
#define DB_REPLCOUNT 84
#define DB_LOG_SIZE 86
#define DB_LOG_TRUNCS 88
#define DB_LOG_USED_PERCENT 90
#define DB_LOG_SHRINKS 92
#define DB_BULK_KILOBYTES 94
#define DB_FLUSH_WAIT_TIME 96
#define DB_ACT_XTRAN 98
#define DB_LOGCACHE_READS 100
#define DB_FLUSH_WAITS 102
#define DB_BCK_DB_THROUGHPUT 104
#define DB_DBCC_MOVERATE 106
#define DB_LOG_GROWTHS 108
#define DB_TOTAL_XTRAN 110
#define DB_LOGCACHE_BASE 112
#define DB_BYTES_FLUSHED 114
#define DB_LOG_USED 116
#define DB_LOGCACHE_RATIO 118
#define DB_DATA_SIZE 120
#define DB_BULK_ROWS 122
#define DB_FLUSHES 124
#define LATCH_OBJECT 126
#define LATCH_TOTAL_WAIT_NP 128
#define LATCH_WAITS_NP 130
#define LATCH_AVG_WAIT_NP 132
#define LATCH_AVG_WAIT_BASE 134
#define ACCESS_METHODS_OBJECT 136
#define AM_EXTENTS_ALLOCATED 138
#define AM_WORKTABLES_CREATED 140
#define AM_GHOSTED_SKIPS 142
#define AM_FULL_SCAN 144
#define AM_PAGES_ALLOCATED 146
#define AM_PAGE_SPLITS 148
#define AM_SINGLE_PAGE_ALLOCS 150
#define AM_EXTENTS_DEALLOCATED 152
#define AM_PROBE_SCAN 154
#define AM_FREESPACE_PAGES 156
#define AM_WORKTABLES_FROM_CACHE_BASE 158
#define AM_LOCKESCALATIONS 160
#define AM_PAGE_DEALLOCS 162
#define AM_WORKTABLES_FROM_CACHE 164
#define AM_INDEX_SEARCHES 166
#define AM_FREESPACE_SCANS 168
#define AM_FORWARDED_RECS 170
#define AM_WORKFILES_CREATED 172
#define AM_SCAN_REPOSITION 174
#define AM_RANGE_SCAN 176
#define SQL_OBJECT 178
#define SQL_AUTOPARAM_REQ 180
#define SQL_BATCH_REQ 182
#define SQL_RECOMPILES 184
#define SQL_AUTOPARAM_UNSAFE 186
#define SQL_COMPILES 188
#define SQL_AUTOPARAM_FAIL 190
#define SQL_AUTOPARAM_SAFE 192
#define CACHE_OBJECT 194
#define CACHE_USE_COUNT 196
#define CACHE_HIT_RATIO_BASE 198
#define CACHE_OBJECT_COUNT 200
#define CACHE_HIT_RATIO 202
#define CACHE_PGS_IN_USE 204
#define MEMORY_OBJECT 206
#define MEMORY_MEMGRANT_MAXIMUM 208
#define MEMORY_CONNECTION_MEMORY 210
#define MEMORY_MEMGRANT_WAITERS 212
#define MEMORY_MEMGRANT_OUTSTANDING 214
#define MEMORY_SQL_CACHE_MEMORY 216
#define MEMORY_OPTIMIZER_MEMORY 218
#define MEMORY_LOCKS 220
#define MEMORY_SERVER_MEMORY 222
#define MEMORY_LOCKOWNERS_ALLOCATED 224
#define MEMORY_LOCK_MEMORY 226
#define MEMORY_LOCKS_ALLOCATED 228
#define MEMORY_SERVER_MEMORY_TARGET 230
#define MEMORY_LOCKOWNERS 232
#define MEMORY_MEMGRANT_ACQUIRES 234
#define USER_QUERY_OBJECT 236
#define QUERY_INSTANCE 238
#define REPLICATION_AGENT_OBJECT 240
#define RUNNING_INSTANCE 242
#define MERGE_AGENT_OBJECT 244
#define MERGE_CONFLICTS_INSTANCE 246
#define UPLOAD_INSTANCE 248
#define DOWNLOAD_INSTANCE 250
#define LOGREADER_AGENT_OBJECT 252
#define LOGREADER_LATENCY_INSTANCE 254
#define LOGREADER_TRANSACTIONS_INSTANCE 256
#define LOGREADER_COMMANDS_INSTANCE 258
#define DISTRIBUTION_AGENT_OBJECT 260
#define DISTRIBUTION_TRANS_INSTANCE 262
#define DISTRIBUTION_LATENCY_INSTANCE 264
#define DISTRIBUTION_COMMANDS_INSTANCE 266
#define SNAPSHOT_AGENT_OBJECT 268
#define SNAPSHOT_TRANSACTIONS_BCPED 270
#define SNAPSHOT_COMMANDS_BCPED 272
#define BACKUP_DEV_OBJECT 274
#define DB_BCK_DEV_THROUGHPUT 276
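For context, a hedged note: this header is generated for the performance-counter DLL and only describes counter offsets, so editing it is not a performance tuning lever. The live values behind these symbols can be viewed from T-SQL on SQL 2000, for example:

SELECT object_name, counter_name, instance_name, cntr_value
FROM master.dbo.sysperfinfo
WHERE object_name LIKE '%Buffer Manager%'   -- e.g. the BUFMGR_* counters above
ORDER BY counter_name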

View 3 Replies View Related

Explaining Importance To End Users

Jul 17, 2007

Hello Experts at Microsoft.



I am thinking of an easy way to explain importance to marketers without going into the math. This is what I came up with so far. Does this sound correct to you guys?



Reasoning:

IMPORTANCE = log(Improvement)

Improvement = P(X & Y) / (P(X) * P(Y))

Improvement = (probability the 2 products are sold together) / (random chance the 2 products are sold together)

If the probability the 2 products are sold together equals the random chance they are sold together, then Improvement = 1, and log(1) = 0.



IMPORTANCE SCORE    MEANING
-2 to -1            10 to 100 times less likely than random chance
-1 to  0            1 to 10 times less likely than random chance
 0 to  1            1 to 10 times more likely than random chance
 1 to  2            10 to 100 times more likely than random chance
 2 to  3            100 to 1,000 times more likely than random chance
 3 to  4            1,000 to 10,000 times more likely than random chance
 4 to  5            10,000 to 100,000 times more likely than random chance
 5 to  6            100,000 to 1,000,000 times more likely than random chance
 6 to  7            1,000,000 to 10,000,000 times more likely than random chance
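A hedged worked example of the simplified formula above (this is the lift-based approximation being proposed, not necessarily the exact formula Analysis Services uses): if product X appears in 40% of baskets, product Y in 25%, and both together in 20%, then Improvement = 0.20 / (0.40 * 0.25) = 2, and the score is log10(2), roughly 0.30, i.e. about twice as likely as random chance.

DECLARE @PxY float, @Px float, @Py float
SELECT @PxY = 0.20, @Px = 0.40, @Py = 0.25
SELECT @PxY / (@Px * @Py)        AS Improvement,   -- 2.0
       LOG10(@PxY / (@Px * @Py)) AS Importance     -- ~0.301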

View 1 Replies View Related

Association Rules - Importance

Feb 14, 2008


I understand Mr. MacLennan's explanation provided at http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=282651&SiteID=1 and appreciate the time he took to explain how importance works. However, like the user with username "sang", I also ran the data in BI 2005 and got the same results listed by the aforementioned user. I did this using the following data:







donut   muffin
y       y        (15 rows of y, y)
n       y        (5 rows of n, y)
etc.

The rule muffin -> donut has an importance of -0.105302438, which is not the same as Mr. MacLennan's result. I tried switching the roles of a and b in a -> b and using different bases for the logarithms. I don't get the result of -0.105302438 with any of these. I also tried to calculate importance for a small data set of my own and can't reproduce the results using Mr. MacLennan's explanation with that data set either. Any thoughts on the discrepancy?

View 5 Replies View Related

Importance Of Triggers In Sql Server

Feb 21, 2008



Why do we use triggers in SQL Server 2005?

What is their importance?

Could someone please give me some samples and solutions?
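A minimal hedged sample of the most common use, auditing data changes; the Orders table is hypothetical. Triggers matter because they fire inside the same transaction as the INSERT/UPDATE/DELETE, so the rule cannot be bypassed by any application.

-- Hypothetical audit table
CREATE TABLE dbo.OrderAudit
(
    AuditID    int IDENTITY(1,1) PRIMARY KEY,
    OrderID    int,
    ChangeType varchar(10),
    ChangedAt  datetime DEFAULT GETDATE()
)
GO
-- Trigger on a hypothetical dbo.Orders table (assumed to have an OrderID column)
CREATE TRIGGER dbo.trg_Orders_Audit
ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    INSERT dbo.OrderAudit (OrderID, ChangeType)
    SELECT i.OrderID,
           CASE WHEN d.OrderID IS NULL THEN 'INSERT' ELSE 'UPDATE' END
    FROM inserted i
    LEFT JOIN deleted d ON d.OrderID = i.OrderID
    UNION ALL
    SELECT d.OrderID, 'DELETE'
    FROM deleted d
    LEFT JOIN inserted i ON i.OrderID = d.OrderID
    WHERE i.OrderID IS NULL;
END
GO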

View 1 Replies View Related

The Mean Of Using Association With Importance And Probability

Apr 12, 2007

Hi,
I have an exercise using association data mining.
My database has 350 records; I use 90 records for mining, and it produces some rules, from which I choose the top ones by MSOLAP_NODE_SCORE. But when I use a SELECT statement to check a rule, I get 1 record that matches it and 5 records that do not.
For example:
rule: A=a, B=b -> C=c
select * from <my_table> where A='a' and B='b' and C='c';  ==> 1 record returned
select * from <my_table> where A='a' and B='b' and C<>'c'; ==> 5 records returned
C has 3 values: c1, c2, c.
In the second statement, the non-matching rows have 2 with C=c1 and 3 with C=c2.

I don't understand how these rules work. I want to choose the best rules to represent my database. How should I use importance and probability to pick the best rules? For a database of 90 records, or of 350 records, which values should I use for MINIMUM_PROBABILITY, MINIMUM_SUPPORT, and MINIMUM_IMPORTANCE? And when choosing rules, should I rank them by importance or by probability?

Thanks for your help

View 4 Replies View Related

Attribute 'importance' Factor

Jan 31, 2008

Is there a way to explicitly assign 'weights' or 'importance' factors to attributes and have them considered by the association rules and decision trees algorithms during training? I would like to do so without preprocessing the data (in any case, I can't think of a way to assign weights through preprocessing to boolean attributes like 'smoker').

thanks

View 3 Replies View Related

SSIS Errors Relating To Maintenance Plan Creation

Apr 2, 2006

I've been searching everywhere for a solution to this problem and no answers exist anywhere. When I try to create a new maintenance plan I get the following error. I've been told it may be related to SSIS, but nobody has a solution. How do I fix this issue so I can create a maintenance plan?



Exception has been thrown by the target of an invocation. (mscorlib)
ADDITIONAL INFORMATION:
An OLE DB error 0x80004005 (Client unable to establish connection) occurred while enumerating packages. A SQL statement was issued
and failed.

View 1 Replies View Related

Association Algorithm - Importance Of A Rule

Mar 6, 2006

Can anyone tell me how Business Intelligence Development Studio calculates the importance of a rule? I can't find the formula. I know some formulas, but the result in SQL Server is completely different.

Thanks!

View 12 Replies View Related

Importance Of The New SQL Server Windows Groups

Oct 19, 2005

Those of you who have installed SQL Server 2005 may have noticed that the installation creates several new Windows groups on the server.  Do not underestimate the importance of these groups.

View 3 Replies View Related

Maintenance Jobs Failing SSIS Subsystem Failed To Load

May 1, 2008

Environment: SQL Server 2005 Enterprise Edition x64, 3-server cluster: two active servers with separate instances and one passive server. SQL Server was installed on the two active servers.

Problem: When I fail over either of my instances to the passive server in the cluster, my maintenance jobs fail to run and there are error messages in the application event log: "SSIS Subsystem failed to load". I am guessing that not all of the needed components are installed on the passive server? Is that a close guess? If so, exactly what components are missing, and do you have to have another license to install them?

Thanks in advance for any advice.

View 2 Replies View Related

Unable To Edit Or Create SSIS Packages Or Maintenance Plans

Oct 30, 2007

Hi,

I am running SQL 2005 Enterprise with SP2. In the past I have been able to successfully create SSIS packages using the BIDS and also Maintenance Plans from within Management Studio.

Today I tried to edit a maintenance plan and got the following error message:-

TITLE: Microsoft SQL Server Management Studio
------------------------------
Retrieving the COM class factory for component with CLSID {0BE35203-8F91-11CE-9DE3-00AA004BB851} failed due to the following error: 80040154. (Microsoft.DataWarehouse)
------------------------------
BUTTONS:
OK
------------------------------

This error also occurs if I try to create a new Maintenance Plan.

I then tried opening an existing SSIS package in BIDS and got the following error:-

------------------------------------------------------------------------------------------
Microsoft Visual Studio is unable to load this document
Index (zero based) must be greater than or equal to zero and less than the size of the argument list.
------------------------------------------------------------------------------------------

These errors only occur on my PC so I know that the problem is with my PC. I have totally uninstalled SQL 2005 components from my PC and reinstalled them with no luck.

To my knowledge, I haven't installed any other programs since creating these packages, although there's a possibility my PC may have had some patches applied through SMS.

I am running Windows 2000 Professional.

Can anyone help me resolve this problem?

Thanks.

Martin

View 6 Replies View Related

Disk Crash Of Disk That Contains The Paging File.

Feb 20, 2001

Hello,

this is my configuration :

1) 3 disks in RAID5 that hold the SQL data
2) 1 disk in RAID0 that holds the only paging file.

What will happen to the SQL data (DB) when the disk that holds the paging file crashes?

Kindest regards,
Luc.

View 1 Replies View Related






