Backups Causing Connectivity Issues

Sep 21, 2004

I'm maintaining a database with an automated backup process.

Periodically, it does a "BACKUP DATABASE" to a local file and then does a file copy across the network to another system. The file is 2.7 GB, and while it is being copied it causes network problems; specifically, ADO connections to the database get broken.

Does anyone have experience with this kind of issue?
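
For reference, one option that gets discussed for this kind of setup is to skip the intermediate local file and back up straight to the remote share, so the data only crosses the network once instead of as a backup plus a separate 2.7 GB copy. A minimal sketch (the database name and UNC path are placeholders, and the SQL Server service account needs write permission on the share):

BACKUP DATABASE MyDatabase
TO DISK = '\\RemoteServer\Backups\MyDatabase.bak'
WITH INIT

If the separate copy has to stay, the other common workarounds are throttling the copy with a tool that supports a bandwidth limit, or scheduling it outside the hours when the ADO connections matter.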

View 8 Replies



DB Engine :: Will Transaction Log Backups Not Free Up Log During Full Backups

Nov 15, 2015

The space allocated to the Log in question is 180 GB. During this time period I was running TLog backups every 5 minutes, yet the log continued to chew through to 80 GB used, even after the process was complete and a final TLog backup had been taken. It continued to stay very large until the Full backup was complete -- or something else that I'm unaware of completed. Like every other DBA I typically take a TLog backup to shrink the log, but what appeared to be the case here was the Full completed and it released the used log space. All said, will Transaction Log backups not free up the log during Full backups?
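
For reference, this matches the behavior I understand to be documented: log backups can still be taken while a full backup is running, but the truncation they would normally allow is deferred until the full backup finishes. A quick way to confirm what is holding the log (a sketch; run it while the full backup is in progress, and substitute the real database name):

SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'YourDatabase'   -- expect ACTIVE_BACKUP_OR_RESTORE during the full backup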

View 3 Replies View Related

Error 16955, What Is Causing This?

Sep 19, 2001

View 1 Replies View Related

BDE/SQL Causing GENERAL_SQL_ERROR And Crashing App

Aug 27, 2007

Dear Folks:

I am currently engaged in finding the reason why a certain SQL client application (now running under XP SP2) fails when it attempts to query a SQL database. This "failure" happens after the nth repetition of the same operation. The application can run anywhere between 2 seconds and 2 hours before it displays a GPF dialog window and is terminated by the user. The application was developed more than 10 years ago using Borland C++ 4.52 (I am using 5.01) and was deployed as a 16-bit Windows app. Since then, they (the client) have apparently had some success running it under Win95/Win98 and 32-bit OSs like W2K and XP.

At this time, I (think I) know that a specific query operation (running through the KDBF framework) returns a 13059 BDE code before the returned object is accessed, and the kernel then reports a NULL handle error. Inside the KDBF framework, the query function translates to a DbiQExecDirect(,,) call into the BDE. I have looked at the code and cannot find anything wrong, nor would I expect to; the application worked at one time. But clearly something has changed in the environment of both the test system I have and the one installed at a customer site. I have modified just about every modifiable parameter I can think of. The system in question is an IBM-branded machine running XP Pro with SP2 applied. I do not know whether the app ran correctly under XP SP1, however.

Any suggestions would be greatly appreciated. (I know... stop using BDE.)

Thank you
JRC

View 1 Replies View Related

NOT FOR REPLICATION Causing PK Errors

Nov 30, 2007

Hi all

I have set up transactional replication from 2005 to 2000. After running a few transactions I disabled the replication and pointed one of our IIS servers to the 2000 database to do a quick functional test and validate replication. We are getting "Cannot insert duplicate value in object XXXX" errors from violations of the PK constraint. I was able to figure out the cause: this was happening for tables with identity columns which have been set NOT FOR REPLICATION. The IDENT_CURRENT values are different from the MAX value in the table, so doing a DBCC CHECKIDENT with RESEED seemed to fix the issue. Now I am running the DBCC command on ALL tables that have been used for replication and have NFR set to true. It's taking a while as the tables are big.
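
For reference, a sketch of generating those DBCC CHECKIDENT commands for every user table that has an identity column (this form works on both 2000 and 2005; review the generated statements before running them):

SELECT 'DBCC CHECKIDENT (''' + o.name + ''', RESEED)'
FROM sysobjects o
    INNER JOIN syscolumns c ON c.id = o.id
WHERE o.xtype = 'U'
  AND COLUMNPROPERTY(o.id, c.name, 'IsIdentity') = 1
ORDER BY o.name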

Now I am wondering if anyone has faced a similar issue? Is this a known issue? I have googled and found nothing; no one seems to be complaining about this. I believe most people have used replication for reporting, where they are just querying the databases. We are testing replication as a fallback scenario.

Opinions welcome..
thanks,
Don

View 8 Replies View Related

Big Tables Causing Issues

Mar 3, 2008

I have a very rubbish connection!! And I have been having problems with "Protocol error in TDS stream" when trying to save a copy locally through a DTS. Basically I have a table with about 11,000 rows, but this table had 28 columns. I've now normalised it to a certain extent and have 2 tables: one of 11 columns, mostly ints, and one of 18 columns, but they will both be in excess of 11,000 rows.

So, on to the question ... Will these 2 tables likely ease my network issues? The application works fine on 1 big table. It's just copying to my local machine that's always been the issue since the table hit 10,000 rows.

If only I knew what I was doing ... Thanks in advance for any guidance.

View 3 Replies View Related

Autonumbering Causing Deadlocks.

Jul 20, 2005

Gents,

I have come into a system that uses a secondary table to generate (for want of a better word) identities, e.g.

create table myidents
(
    name sysname not null,
    ident int not null
)

create procedure getnextident @table sysname, @ident int output
as
begin
    if not exists (select top 1 1 from myidents where name = @table)
        insert into myidents values (@table, 0)

    update myidents
    set @ident = ident = ident + 1
    where name = @table
end

Now, (ignoring for now the use of reserved words) the problem is that this is called frequently, from other procedures. Trouble is that the calling procedures call it from within a transaction. We now have a wickedly hot spot on this table, with frequent deadlocks.

Is there any relatively quick fix for this? Some locking hints or whatever? Or do we need to go and recode, moving this kind of thing outside the transaction (which are all rather too long for my liking), and even consider using identity columns as a replacement?

Thanks
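
For reference, one commonly suggested quick change is to take an update-intent, held lock on the key row at the top of the procedure, so two transactions cannot both read it under shared locks and then deadlock converting them. A sketch of that variant (same logic as above, just with hints):

create procedure getnextident @table sysname, @ident int output
as
begin
    -- updlock/holdlock: serialize callers on this key row up front instead of
    -- letting two shared locks turn into a conversion deadlock
    if not exists (select 1 from myidents with (updlock, holdlock) where name = @table)
        insert into myidents values (@table, 0)

    update myidents
    set @ident = ident = ident + 1
    where name = @table
end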

View 4 Replies View Related

Database For BCM Causing Errors??

Feb 28, 2007

Hi All,

I tried to create a new project in BCM and I get an error indicating that there is already a project with that name. There is no project with that name in my list of projects! However, I think during one of the frequent installs and uninstalls before I was able to get a somewhat stable BCM installed, I might have created a project with that name prior to one of the many crashes (it still hangs and has to be killed in Task Manager periodically). I suspect I might have a rogue database that was created during one of the reinstalls, which included installing and uninstalling Small Business Accounting, which I have since also uninstalled. I have tried to use Management Studio Express to look at the tables in the databases, of which there are two, "mssmallbusiness" and "contctmgr 16022007", in order to find an instance of a record with the name in question; no joy yet. Do I need the "mssmallbusiness" database to run BCM, and if not can I delete it? I suspect that the mssmallbusiness database could be interfering somehow with the "contctmgr 16022007".

Since installing BCM, my Outlook has slowed down considerably and often hangs on launch. I would abandon BCM, but I think it's a really cool tool and I would like to install Small Business Accounting too! My machine is a Centrino Duo with 1 GB RAM and plenty of HD space.

I know this message is somewhat convoluted, but if you can make sense of my issues, please help.

Regards,

EdincoT

View 1 Replies View Related

End Conversation Causing Blocks

Apr 9, 2007

Hi



I was trying to clean up some conversations in Service Broker and caused a lot of blocking that I seem unable to kill. There was one conversation that I was not able to end, so I wanted to restart the SQL service, but I can't even restart it. I get the following in the Event Viewer:



Timeout occurred while waiting for latch: class 'SERVICE_BROKER_TRANSMISSION_INIT', id 00000001A2B03540, type 2, Task 0x0000000000C2EDA8 : 0, waittime 5400, flags 0xa, owning task 0x00000002DEBCA5C8. Continuing to wait.



Has anyone come across this?
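
For reference, when a conversation cannot be ended normally, the usual last resort is END CONVERSATION ... WITH CLEANUP, which removes the local endpoint without notifying the other side. A sketch (the state filter is only an example; pick the stuck conversation_handle from sys.conversation_endpoints yourself):

DECLARE @handle uniqueidentifier

SELECT TOP (1) @handle = conversation_handle
FROM sys.conversation_endpoints
WHERE state_desc = 'DISCONNECTED_INBOUND'   -- example filter only

END CONVERSATION @handle WITH CLEANUP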



thanks



Paul

View 1 Replies View Related

DateTime Param For SP Causing BIG Headache...!!

Aug 8, 2006

I've got a stored procedure and one of the parameters is a DateTime. But no matter what I do to the string that's passed into the form for that field, it doesn't like the format. Here's my code:

SqlConnection conn = new SqlConnection(KPFData.getConnectionString());
SqlCommand cmd = new SqlCommand("KPFSearchName", conn);
cmd.CommandType = CommandType.StoredProcedure;

SqlParameter param = cmd.Parameters.Add("@DOB", SqlDbType.SmallDateTime);
param.Direction = dir;
param.Value = txtDOB.Text;

// also have tried this:

param.Value = Convert.ToDateTime(txtDOB.Text);

// and

param.Value = Convert.ToDateTime(txtDOB.Text).ToShortDateString();

No matter what I do I always get a formatting error - either I can't convert the string to a DateTime, or the SqlParameter is in the incorrect format, or something along those lines. I've spent a couple of hours on this and am hoping someone can point out my obvious mistake here...??

Thanks for your help!!

eddie
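
For reference, one quick way to split the problem in half is to call the procedure directly with an unambiguous ISO 8601 literal; if that works, the issue is in how the page text is being converted, not in the procedure or the parameter type. A sketch (the date value is just an example, and it assumes @DOB is the only required parameter):

EXEC KPFSearchName @DOB = '1980-05-23T00:00:00'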

View 5 Replies View Related

SQL Query Assistance. MAX Causing Issue

May 23, 2007

I got some help on here before with building my query. I thought it was working fine, but it turns out that when there are multiple records for a column type, it only grabs the first one. I need to get all records. Is there an alternative to MAX? I needed to structure it like this because I had to return each row as a column, and this was the way suggested before.
My query:

SELECT TOP (100) PERCENT PRODUCT_NUMBER, PRODUCT_NAME,
       MAX(CASE WHEN ColumnName = 'Federal Specification Number' THEN TheValue ELSE NULL END) AS [Federal Specification Number]
FROM (SELECT dbo.PRODUCT_FEATURE_VALUES.PRODUCT_ID AS ProductID,
             dbo.SHARED_FEATURE_VALUES.FEATURE_TEXT_VALUE AS TheValue,
             dbo.SHARED_FEATURE_TYPES.FEATURE_TYPE AS ColumnName,
             dbo.PRODUCTS.PRODUCT_NUMBER, dbo.PRODUCTS.PRODUCT_NAME
      FROM dbo.PRODUCT_FEATURE_VALUES
           INNER JOIN dbo.SHARED_FEATURE_TYPES
               ON dbo.PRODUCT_FEATURE_VALUES.FEATURE_TYPE_ID = dbo.SHARED_FEATURE_TYPES.FEATURE_TYPE_ID
           INNER JOIN dbo.SHARED_FEATURE_VALUES
               ON dbo.PRODUCT_FEATURE_VALUES.FEATURE_VALUE_ID = dbo.SHARED_FEATURE_VALUES.FEATURE_VALUE_ID
           INNER JOIN dbo.PRODUCTS
               ON dbo.PRODUCT_FEATURE_VALUES.PRODUCT_ID = dbo.PRODUCTS.PRODUCT_ID
      UNION
      SELECT dbo.EXTENDED_ATTRIBUTE_VALUES.PRODUCT_ID AS ProductID,
             ISNULL(dbo.EXTENDED_ATTRIBUTE_VALUES.SMALL_TEXT_VALUE, dbo.EXTENDED_ATTRIBUTE_VALUES.LARGE_TEXT_VALUE) AS TheValue,
             dbo.EXTENDED_ATTRIBUTES.COLUMN_NAME AS ColumnName,
             PRODUCTS_1.PRODUCT_NUMBER, PRODUCTS_1.PRODUCT_NAME
      FROM dbo.EXTENDED_ATTRIBUTE_VALUES
           INNER JOIN dbo.EXTENDED_ATTRIBUTES
               ON dbo.EXTENDED_ATTRIBUTE_VALUES.EXT_ATT_ID = dbo.EXTENDED_ATTRIBUTES.EXT_ATT_ID
           INNER JOIN dbo.PRODUCTS AS PRODUCTS_1
               ON dbo.EXTENDED_ATTRIBUTE_VALUES.PRODUCT_ID = PRODUCTS_1.PRODUCT_ID) AS t1
WHERE PRODUCT_NUMBER = '02083'
GROUP BY PRODUCT_NUMBER, PRODUCT_NAME
ORDER BY PRODUCT_NUMBER
 
This returns:
Product_Number    Product_Name                                 Federal Specification Number 
02083                   Di-Electric Grease, 10.5 Wt Oz          FDZ-CFR-21-178.3570
There is another record for Federal Specification Number that I need to return as well. If I change MAX to MIN, it gets the other record. Is there any way I can get both?
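
For reference, one alternative to collapsing the values with MAX or MIN is to number the matching rows per product with ROW_NUMBER and pivot each occurrence into its own column. A simplified, self-contained demo of the idea (the table variable and the second value are made up; the same pattern can be applied to the t1 derived table in the query above):

DECLARE @t TABLE (ProductNumber varchar(10), ColumnName varchar(50), TheValue varchar(50))
INSERT INTO @t VALUES ('02083', 'Federal Specification Number', 'FDZ-CFR-21-178.3570')
INSERT INTO @t VALUES ('02083', 'Federal Specification Number', 'FDZ-CFR-21-178.3571')

SELECT ProductNumber,
       MAX(CASE WHEN rn = 1 THEN TheValue END) AS [Federal Specification Number 1],
       MAX(CASE WHEN rn = 2 THEN TheValue END) AS [Federal Specification Number 2]
FROM (SELECT ProductNumber, TheValue,
             ROW_NUMBER() OVER (PARTITION BY ProductNumber ORDER BY TheValue) AS rn
      FROM @t
      WHERE ColumnName = 'Federal Specification Number') AS numbered
GROUP BY ProductNumber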

View 15 Replies View Related

InnerJoins Causing No Rows To Be Returned

Jun 11, 2008

I have my SQLDataSource configured as shown in the picture. I ran the Execute Query and input an ID I know is in the database, and it returned nothing. I ran into this problem on another part of my site and got it working by using 2 SQLDataSources, but I'm trying to keep the amount of code down.
 
http://junk.icore-studios.com/junk/Codeissues/Innerjoins.JPG
 
Any ideas on why it would be behaving this way?

View 2 Replies View Related

Deletes Causing Tran Log To Fill

Aug 2, 2000

I seem to be having a problem on all of my SQL servers. When I or a developer attempts to delete from a table, I get a "log file for database is full" error. I truncate the log, try again, and get the same error. It doesn't seem to matter how much is being deleted or how big the table is. This is very strange and very frustrating.

Thanks
David
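
For reference, when a single big delete keeps filling the log on 7.0/2000, the usual workaround is to delete in batches and back up (or truncate) the log between batches so no one transaction has to fit in it. A sketch with a made-up table name and criteria:

SET ROWCOUNT 10000
WHILE 1 = 1
BEGIN
    DELETE FROM dbo.BigTable WHERE archived = 1   -- placeholder table and criteria
    IF @@ROWCOUNT = 0 BREAK
    BACKUP LOG MyDb WITH TRUNCATE_ONLY             -- or a proper log backup to disk
END
SET ROWCOUNT 0

If even a tiny delete fails immediately, it is also worth checking that the log file can actually grow (autogrow on, disk not full) and that no long-running open transaction is pinning the log.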

View 5 Replies View Related

CASE Statement Causing Too Many Rows To Appear

Jul 5, 2007

Here's my basic syntax:


Code:

select
dma.dma_market_area,
count(fc.contactid) as Number_Of_QCs,
(case when fc.ctca_currenttier = 1 then count(fc.contactid) end) as P1,
(case when fc.ctca_currenttier = 2 then count(fc.contactid) end) as P2,
(case when fc.ctca_currenttier = 3 then count(fc.contactid) end) as P3,
(case when fc.ctca_currenttier = 4 then count(fc.contactid) end) as c1,
(case when fc.ctca_currenttier = 5 then count(fc.contactid) end) as c2,
(case when fc.ctca_currenttier = 6 then count(fc.contactid) end) as c3



from ...



And then I get output like this:

dma_market_area          Number_Of_QCs  P1    P2    P3    c1    c2    c3
ALBANY-SCHENECTADY-TROY  6              6     NULL  NULL  NULL  NULL  NULL
ALBANY-SCHENECTADY-TROY  1              NULL  1     NULL  NULL  NULL  NULL
ALBANY-SCHENECTADY-TROY  1              NULL  NULL  NULL  1     NULL  NULL
ALBANY-SCHENECTADY-TROY  1              NULL  NULL  NULL  NULL  NULL  1


How can I get these all to group into one row, based on the first column?
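
For reference, a sketch of the usual fix: move the per-tier test inside a SUM so each tier becomes a conditional count, and group only by the market area (the FROM clause is left elided here just as in the original):

select
    dma.dma_market_area,
    count(fc.contactid) as Number_Of_QCs,
    sum(case when fc.ctca_currenttier = 1 then 1 else 0 end) as P1,
    sum(case when fc.ctca_currenttier = 2 then 1 else 0 end) as P2,
    sum(case when fc.ctca_currenttier = 3 then 1 else 0 end) as P3,
    sum(case when fc.ctca_currenttier = 4 then 1 else 0 end) as c1,
    sum(case when fc.ctca_currenttier = 5 then 1 else 0 end) as c2,
    sum(case when fc.ctca_currenttier = 6 then 1 else 0 end) as c3
from ...
group by dma.dma_market_area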

View 1 Replies View Related

Check Constraint Causing Problems

Apr 17, 2006

I have been having problems inserting data with dates.

If I do not use a constraint then the data is inserted without a problem but if the constraint is added I get this error.
Code:

"The conversion of a char data type to a datetime data type resulted in an out-of-range datetime value"



What is the problem here?

I have the logins set to British English
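
For reference, with the login language set to British English, literals like '2006-04-17' (and anything day/month ambiguous) are parsed with day and month swapped for the datetime type, which is the usual cause of this out-of-range error when a constraint forces the conversion. The formats below are interpreted the same way regardless of language or SET DATEFORMAT; a small sketch with made-up table and column names:

INSERT INTO dbo.Bookings (BookingDate) VALUES ('20060417')             -- yyyymmdd, always safe
INSERT INTO dbo.Bookings (BookingDate) VALUES ('2006-04-17T00:00:00')  -- ISO 8601 with time, always safe
-- whereas '17/04/2006' or '2006-04-17' depend on the session's language/DATEFORMAT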

View 3 Replies View Related

Replicating 3 Databases Causing Problems?!

Aug 8, 2004

We are trying to deal with replication in a legacy design involving 2 SQL servers each taking INSERTS from about 100 call centre client PCs. In each case a client logs into either SQL Server and upon each INSERT is handed a unique Call_ID to use when inserting additional information in relation to that specific call.

Each of the two databases are subsequently being replicated into a third database where reports are being pulled.

The problem is that to prevent each database giving the same Call_ID to a client we have setup SQL 1 to use a Call_ID starting with 1 and incremented by 2 (i.e odd numbers!). SQL 2 starts with 0 and increments by 2, (even numbers). These ‘increment’ rules are built into the table schema and seem to be causing a problem when we try to replicate into the third database as the two initial schemas are not considered identical.

The first database to be replicated will work and the second will fail. We get messages saying it is due to unique values.

I thought we may be able to have identical schemas by changing the ID field to a fixed 12 digit number and prefix it with a 10xxxxxxxxxx on one server and 11xxxxxxxxxx on the other. The 10 and 11 would be held in a table with the value being pulled based upon server name.

Server   Value
SQL1     10
SQL2     11

Hence we would be able to extract the value and prefix the ID with it.

Has anyone come across the reason as to why the first replication will work but the second will always fail? And would this mod solve the problem?

Moreover I suspect that our design is fundamentally flawed and that we need to have two servers handling a single database? This single database would then more easily be replicated to the reports database.

Thanks for any input!

Paul

View 1 Replies View Related

Too Many Writes At Once Causing Time Outs?

Feb 18, 2004

Hi everyone! I'm new to this forum and I suspect I'll be using this forum frequently. Good stuff.

Although this question may appear to be Web-related, I think the problem is with what I'm doing with the database. Please read on.

I'm trying to implement a page tracking solution using ASP and SQL 2000. It basically writes a new record to a table every time a user visits a page on the site. It appeared to work fine at first, then I've increasingly been getting time out errors on my pages -- all pointing to the include file that fires the database write.

Here's the code that's referenced on every page:

Set Conn = Server.CreateObject("ADODB.Connection")
Conn.Open "dsn=x;uid=y;pwd=z;"

Set objRecordset1= Server.CreateObject("ADODB.Recordset")
objRecordset1.Open "SELECT * FROM table",Conn,1,2
objRecordset1.AddNew
objRecordset1.Fields("PAGE") = Left(request.servervariables("SCRIPT_NAME"),100)
objRecordset1.Fields("QUERY_STRING") = Left(request.servervariables("QUERY_STRING"),100)
objRecordset1.Fields("DATE") = Date()
objRecordset1.Fields("TIME") = Time()
objRecordset1.Fields("PLATFORM") = Left(request.servervariables("HTTP_USER_AGENT"),100)
objRecordset1.Fields("REFERRER") = Left(request.servervariables("HTTP_REFERER"),100)
objRecordset1.Fields("USER_IP") = Left(request.servervariables("REMOTE_ADDR"),20)
If Request.Cookies("TEST")("ID")<>"" Then
objRecordset1.Fields("VISITOR_ID") = Request.Cookies("TEST")("ID")
End If
objRecordset1.Update

Conn.Close
Set Conn=Nothing
%>

After taking out the reference to the above code everything speeds back up. So, I know the performance hit and time out issues have to do with the code above.

Is it the simultaneous write to the table, the constant opening and closing of the recordset, the cursor type, the lock type – or combination of things?

HELP!! Thanks!

David
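
For reference, opening "SELECT * FROM table" as an updatable recordset on every page view is much heavier than a plain parameterized INSERT. On the database side that could be wrapped in a small stored procedure (a sketch; the procedure, table and column names here are placeholders, not the real schema) and called from the page with Conn.Execute or an ADODB.Command instead of Recordset.AddNew/Update:

CREATE PROCEDURE dbo.LogPageView
    @Page        varchar(100),
    @QueryString varchar(100),
    @Platform    varchar(100),
    @Referrer    varchar(100),
    @UserIP      varchar(20),
    @VisitorID   varchar(50) = NULL
AS
    INSERT INTO dbo.PageViews
        ([PAGE], QUERY_STRING, [DATE], [TIME], PLATFORM, REFERRER, USER_IP, VISITOR_ID)
    VALUES
        (@Page, @QueryString, GETDATE(), GETDATE(), @Platform, @Referrer, @UserIP, @VisitorID)

That removes the full-table SELECT, the server-side cursor and the AddNew/Update round trips from every page hit, which is the usual suspect when a tracking include starts timing out under load.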

View 3 Replies View Related

Simple Query Causing Frustration

May 27, 2008

Hi, I am a new learner and user of SQL Server 2005 and am having some major frustration trying to write a simple query.

I have two tables, 1) Ticket_Purchase, 2) Flight.

The Ticket_Purchase table has these columns: Ticket_Purchase_Number(PK), Flight_Number(FK), Date_Purchase_Made, Ticket_Price, Class_of_Ticket, Passenger_ID

The Flight table has these columns: Flight_Number(PK), Flight_Date, Flight_Departure_Time, Flight_Arrival_Time, Flight_Origin, Flight_Destination

I am trying to create a query that will tell me: On which flight were the most first class tickets sold?

There are only two types of classes; 'E' for economy and 'F' for First Class.

So far I am able to get a list of all the first class tickets for each flight and can visually see which flight has the most first class tickets by counting them manually on the generated report, but I am totally confused about how to pull just the single flight with the most first class tickets sold. I wonder if this requires something more like a join or a nested subquery?

The SQL I wrote for the above is:

Select Class_of_Ticket, Flight_Number
From Ticket_Purchase
Where Class_of_Ticket = ('F')
Order By Flight_Number;


And it produces:

Class_of_Ticket Flight_Number
--------------- -------------
F 1
F 1
F 1
F 2
F 2
F 3
F 3
F 3
F 3
F 4
F 4
F 4
F 4
F 4
F 4
F 4
F 4
F 4
F 4
F 4
F 4
F 4
F 4
F 5
F 5
F 6
F 6
F 6
F 7
F 7
F 8
F 8
F 8
F 9
F 9
F 9
F 9
F 9

(38 row(s) affected)



Rather I would like it to produce:
First_Class_Seats_Purchased   Flight_Number
---------------------------   -------------
14                            4


I hope I didn't make this too confusing to understand, as I am still learning the syntax and 'lingo' of how to communicate this stuff verbally.

Thank you for any help you could offer. It would be much appreciated.

Edit: the query report I pasted from SQL should have the flight number directly under the column header. For some reason the space between Class and Flight_number is being eliminated in the post.
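
For what it's worth, a minimal sketch of one way to get just that single row, using the table and column names above (add WITH TIES after TOP (1) if two flights could tie for the most seats):

Select TOP (1) Count(*) As First_Class_Seats_Purchased, Flight_Number
From Ticket_Purchase
Where Class_of_Ticket = 'F'
Group By Flight_Number
Order By Count(*) Desc;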

View 5 Replies View Related

Group By Alias Causing Problem...

May 30, 2008

Hi,

I am getting an odd result while executing the query below.

SELECT COUNT(DISTINCT(D.image_id)),
(CASE WHEN D.stage_id IN (SELECT SS_STAGE.stage_id FROM SS_STAGE WHERE SS_STAGE.STAGE_ID = D.STAGE_ID )
THEN (SELECT SS_STAGE.STAGE_ID FROM SS_STAGE WHERE SS_STAGE.STAGE_ID = D.STAGE_ID)
ELSE D.stage_id
END) stage_id
FROM deadline D, OCCURRENCE O
WHERE O.image_id = D.image_id
AND (D.APPROVED_STAGE IS NULL OR D.CONFLICT = 1)
AND D.LOGON = 'pbitest2'
AND O.delete_ind = ' '
GROUP BY stage_id


My actual requirement is to group by using the alias name.
This query is instead getting its results by grouping on the STAGE_ID column from the DEADLINE table!!!

Please help me on this...Thanks in Advance.


Sudheer
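
For reference, a sketch of the usual workaround for grouping by an alias: compute the expression in a derived table and group on it in the outer query (joins and filters copied from the query above):

SELECT COUNT(DISTINCT(image_id)), stage_id
FROM (SELECT D.image_id,
             (CASE WHEN D.stage_id IN (SELECT SS_STAGE.stage_id FROM SS_STAGE WHERE SS_STAGE.STAGE_ID = D.STAGE_ID)
                   THEN (SELECT SS_STAGE.STAGE_ID FROM SS_STAGE WHERE SS_STAGE.STAGE_ID = D.STAGE_ID)
                   ELSE D.stage_id
              END) AS stage_id
      FROM deadline D, OCCURRENCE O
      WHERE O.image_id = D.image_id
        AND (D.APPROVED_STAGE IS NULL OR D.CONFLICT = 1)
        AND D.LOGON = 'pbitest2'
        AND O.delete_ind = ' ') AS x
GROUP BY stage_id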

View 5 Replies View Related

Attach DB Causing Cachestore Flush

Mar 16, 2007

I have detached a SQL Server 2005 database from one server and attached it to another SQL Server 2005 instance, and I now get the following in the error log and in the Event Viewer every 10-20 minutes or so.

SQL Server has encountered 1 occurrence(s) of cachestore flush for the 'Object Plans' cachestore (part of plan cache) due to some database maintenance or reconfigure operations.

2007-03-16 12:37:14.64 spid17s SQL Server has encountered 1 occurrence(s) of cachestore flush for the 'SQL Plans' cachestore (part of plan cache) due to some database maintenance or reconfigure operations.

2007-03-16 12:37:14.64 spid17s SQL Server has encountered 1 occurrence(s) of cachestore flush for the 'Bound Trees' cachestore (part of plan cache) due to some database maintenance or reconfigure operations.

Starting up database 'DBName'

It appears under different SPIDs (18, 20, 24, ...) and consistently shows the four messages in a row.
Full-text indexing is running on both servers, but I don't know if this is the cause of the error.

I would greatly appreciate any help to get rid of this as I have trawled the net and not found anything of use.

Thank You.
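
For reference, repeated "Starting up database" messages together with cachestore flushes every few minutes usually mean the attached database came across with AUTO_CLOSE enabled; each close/reopen cycle logs those messages and flushes parts of the plan cache. A quick check-and-fix sketch (substitute the real database name):

SELECT name, is_auto_close_on FROM sys.databases WHERE name = 'DBName'

ALTER DATABASE [DBName] SET AUTO_CLOSE OFF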

View 8 Replies View Related

Cursor Causing Infinite Loop

Dec 13, 2007

Hi, I have a cursor in a stored procedure. The problem is that it stays pointing at the first row, causing an infinite loop on it.
How can I stop this and make it go through all rows? Here is my code.

Declare @CountTSCourtesy int
Declare @WaiterName nvarchar(100), @CursorRestaurantName nvarchar (100)
Declare waiter_cursor CURSOR FOR

SELECT new_waiteridname, new_restaurantname
FROM dbo.FilteredNew_CommentCard
Where new_dateofvisit between @FromDate and @ToDate and new_restaurantname = @Restaurant
Open waiter_cursor
FETCH NEXT FROM waiter_cursor
into @WaiterName,@CursorRestaurantName
While @@FETCH_STATUS=0

BEGIN
Exec WaitersCountExCourtesy @WaiterName,@CursorRestaurantName

END
Close waiter_cursor
Deallocate waiter_cursor
END
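
For reference, the loop above never advances because there is no FETCH inside the WHILE body, so @@FETCH_STATUS stays 0 and the same first row is processed forever. A sketch of the corrected loop portion:

While @@FETCH_STATUS=0
BEGIN
    Exec WaitersCountExCourtesy @WaiterName,@CursorRestaurantName

    -- advance the cursor, otherwise @@FETCH_STATUS never changes
    FETCH NEXT FROM waiter_cursor
    into @WaiterName,@CursorRestaurantName
END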


Thanks in advance...

View 1 Replies View Related

Sql Server 2000 Causing TCP/IP To Crash?

Nov 23, 2005

I installed SQL Server 2000 on a Windows 2003 machine and everything appeared to be fine. It has SQL Server SP4 and Windows 2003 SP1 installed. The problem occurs whenever I use Enterprise Manager or Query Analyzer for more than a few minutes. After a while, all internet connectivity on the server machine is broken. I cannot connect to it through Enterprise Manager, and on the machine itself, nothing related to the internet works at all. There are no errors in the Windows log or the SQL log. Disabling and enabling the Ethernet adapter fixes connectivity, until the next time I use Enterprise Manager for a while. What is going on?

The machine is a Dell PowerEdge SC420 with a Broadcom NetXtreme Gigabit adapter and Windows 2003 Enterprise Edition.

View 2 Replies View Related

Timestamps Causing Write Conflicts

Jul 20, 2005

I have an Access XP ADE application connected to a SQL Server 7.0 SP4 database. I have created a timestamp column in the main table. Unfortunately, I am now getting persistent write conflict errors.

The order of operations is:

1. The application starts and loads the recordset into the form using a stored procedure.
2. I modify a field and press a save button which uses me.dirty=false to force a save.
3. The field is saved to the database. Using Profiler I can observe the modified field being saved. As I would expect, the update statement is using the primary key and the timestamp column value. For the sake of this discussion let's assume the value of the timestamp is 5ad9.
4. Without navigating off the record, I alter the same field (or a different field) and press save again, and a write conflict will appear. Using Profiler I can see the update statement that is attempting to update the record. The update statement is using the previous value (5ad9) of the timestamp column.

I thought that the timestamp column value is incremented each time the record is updated. The ADE application does not appear to be recognizing the new timestamp value.

Any help or advice you could give would be appreciated.

Thanks
George

View 3 Replies View Related

Alter Column Causing Log To Fill

Jul 23, 2005

I'm trying to simply change a column definition from NULL to NOT NULL. It's a multi-million row table. I've already checked to make sure there are no nulls in any rows, and a default has been created for the column. My log is set to autogrow, and as the ALTER COLUMN colname char(6) NOT NULL runs, the log begins to grow. If I use NO CHECK, BOL says the optimizer won't consider the change. How can I change the nullability of a column that currently contains no nulls without using up extreme amounts of log space?

Danny

View 1 Replies View Related

Googlebot And MSN Bot Causing Sql Login Errors

Aug 31, 2007

Hi there

I am running SQL 2005. I recently changed my SQL login account for security reasons. The site is connecting to the database fine, but in my server logs all the search engine bots are causing SQL login errors. It's like they are still caching the old account.

Any ideas?
Thanks
Clinton

View 1 Replies View Related

Fuzzy Grouping Causing Minidump

Oct 18, 2007

I was running a Fuzzy Grouping task on SQL Server Enterprise Edition SP1 without any issues. I then applied SP2 and now that same Fuzzy Grouping is causing a minidump and terminating the process.

First, does anybody know anything about this kind of issue?

Second, I tried to run the minidump file in Visual Studio, but I cannot actually run it because I keep getting the following message:


Debugging information for 'DtsDebugHost.exe' cannot be found or does not match. No symbols loaded.

Finally, I did obtain a random error on the server itself that displayed the GUID: 58FC39EB-9DBD-4EA7-B7B4-9404CC6ACFAB.

This GUID appears to be tied to a Dr. Watson error but, again, I cannot figure out what process is breaking.

Can somebody please help?

View 1 Replies View Related

Basic Package Is Causing An Error

Aug 9, 2006

I'm getting the following error message on a basic copy from a datareader (using an ODBC datasource) to a sqlnativeclient. There are no transformations or anything. Don't know what is going on. Any insights are appreciated.

[SQL Server Destination [361]] Error: An OLE DB error has occurred. Error code: 0x80040E14. An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E14 Description: "Could not bulk load because SSIS file mapping object 'GlobalDTSQLIMPORT ' could not be opened. Operating system error code 2(The system cannot find the file specified.). Make sure you are accessing a local server via Windows security.".

View 10 Replies View Related

People With Same Last Name Causing Problems (Adam You There)

Dec 22, 2006

I run a report which displays transcripts for members: one page per person with their name, the courses they've taken, the date of the course and their score. The problem I am having is that when the report runs and comes across the same last name for many people (say, Hanna and Billy Alexander), it only gives me the first "Alexander" it comes across and puts all the information for the other Alexanders in that first transcript. How do I separate out the people whose last name is the same, say "Smith" or "Jones"?

I have a DISTINCT clause on my query, and when I run it I can see the other people with the same last name and their unique courses and scores (each of them has a unique member ID). It's when I actually run the report that it groups the information (courses, dates, scores, etc.) for the people with the same last name all under the first one it comes across. Hope that made sense.

Thx,

Billy

View 4 Replies View Related

What's In SSIS Causing Duplicate Records?

Jan 7, 2008

I've a dtsx package which runs nightly to do the following:

1. select data from a SQL replicated table
2. do some lookups (Lookup, Derived Column, Multicast, Conditional Split, etc.)
3. insert into another SQL table on another server using "Table or view - fast load", rows per batch = 10000, maximum insert commit size = 10000, and "redirect row" on error output on destination to an error log text file.
Once in a while, I find duplicate records in the error log; these rows cannot be inserted into the destination table due to the primary key constraint. For example, transaction_id=111000 appears twice in the error log, but it is a unique key in the source table.


My questions:
1. What could be the cause of duplicated rows during ETL in SSIS? I've asked this before and have spent so much time researching, but still could not find the reason. This link is from my previous post:

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=452319&SiteID=1


2. For a daily extract with millions of rows, what would be the best settings for rows per batch, maximum insert commit size, etc.? I've read some posts on this forum and decided to use 10000 for both, but once in a while there's just one duplicate row that causes the whole batch of 10000 rows not to be committed.

Thanks for any feedbacks.

-Ash

View 32 Replies View Related

Cascading Parameters Causing Slowdown

Feb 1, 2008

Hi,

I am currently working on a report with 3 cascading parameters. These three parameters depend on datasets whose data is retrieved from a large table with SELECT DISTINCT. As the table grows larger, selecting values for these parameters causes postbacks and feels slow from the user's perspective. I am looking for a way to reduce postbacks. Can anyone suggest a way to retrieve the dataset all at once and filter it without causing a postback? Any other way to improve the performance would also be greatly appreciated.

Thanks,
NL

View 7 Replies View Related

Default Column Causing Problems!

Jan 4, 2007

Hi everybody,

I am migrating a table called Vendors from SQL 2005 to a flat file,

but it ends with an error message saying that the default column is causing a problem.

the table is as follows,


CREATE TABLE VENDORS
(
RECORDTYPE CHAR(5) DEFAULT 'VNDRS' NULL,
SETID CHAR(5) NOT NULL,
VENDORID CHAR(10) NOT NULL,
VENDORNAMESHORT CHAR(14) NOT NULL,
VENDORNAMESEQNUM INT NULL,
NAME1 CHAR(40) NOT NULL,
NAME2 CHAR(40) NULL,
REMITVENDOR CHAR(10) NULL,
CUSTSETID CHAR(5) NULL,
CUSTID CHAR(15) NULL,
ENTEREDBY CHAR(8) NULL,
ARNUM CHAR(15) NULL,
OLDVENDORID CHAR(15) NULL,
WTHDSW CHAR(1) NOT NULL,
VATSW CHAR(1) NOT NULL,
NAME1AC CHAR(40) NULL,
NAME2AC CHAR(40) NULL,
PRIMARYVENDOR CHAR(10) NULL,
LASTACTIVITYDT DATETIME NULL,
HUBZONE CHAR(1) NOT NULL,
EEOCERTIFDT DATETIME NULL,
VENDORAFFILIATE CHAR(5) NULL
)

Any idea what I need to do?
Please help out.

Thanks and Regards,
sg





View 6 Replies View Related

Table Joins Causing Duplicates?

May 15, 2015

I have a table with call data (ContactID, Queues Entered, Call Status, Date & Time Stamps etc). Each entry relating to a contact ID goes onto a new row. The first row for a contact is the date and time it is created. It then captures the queue (or queues) it enters before it is answered. Finally, it captures when the call is released (Completed).

I'm trying to link all this data into one single row per contact ID to make it easier to report on.

I started off by using DISTINCT to pull back all of the Contact ID's. I then used a Left Join to pull back the date and time of creation. I created a further Left Join to pull back the first queue that it entered and so on.

When I did this, I started getting duplicates. This is because some calls enter more than one queue.

How can I do this so that there is only one ContactID per row? Also, for the queue, is there anything I can do to ensure it pulls back the first queue the call enters? (These are time stamped.) Subsequently, I would then need to add the second and third queues it enters in other columns. (A call can enter a maximum of 3 queues.)
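
For reference, a common way to flatten this kind of data is to number each contact's queue rows by timestamp with ROW_NUMBER and then pivot the first three into columns with conditional aggregation, instead of stacking LEFT JOINs. A sketch with made-up table and column names (adjust to the real ones):

SELECT ContactID,
       MAX(CASE WHEN qn = 1 THEN QueueName END) AS Queue1,
       MAX(CASE WHEN qn = 2 THEN QueueName END) AS Queue2,
       MAX(CASE WHEN qn = 3 THEN QueueName END) AS Queue3
FROM (SELECT ContactID, QueueName,
             ROW_NUMBER() OVER (PARTITION BY ContactID ORDER BY EventTime) AS qn
      FROM dbo.CallData                    -- placeholder table/column names
      WHERE QueueName IS NOT NULL) AS q
GROUP BY ContactID

The creation and release timestamps can be added the same way (more MAX(CASE ...) columns keyed off the row type), which keeps everything to one row per ContactID without the joins that were producing the duplicates.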

View 8 Replies View Related

Postback Causing Database Transactions To Replay

Dec 7, 2006

Hey,
 I hope someone can quickly tell me what I am obviously missing for this weird problem. 
To give a general picture, I have an ASP.net webpage that allows users to select values from several dropdown menus and click an add button which formats and concatenates the items together into a listbox. After the listbox has been populated the users have the option to save the items via a save button.
The save button parses each item in the listbox to basically decode the concatenated values and subsequently inserts them into a table residing on a backend MSSQL 2005 database.
PROBLEM:
In the process of testing the application, I noted this strange behavior. If I use the webpage to insert the values, then go to the table where the values are stored and delete the rows, upon a refresh of the web page the same actions seem to get replayed and the items are inserted into the table again.
Naturally, what I'd really like is for the page to refresh and show that the items are no longer there, not the other way around.
If the code that performed the insert were residing in a component that was set for postback, I'd expect this type of behavior, but it's in the Save button's on_click event. I have tried practically everything in an effort to target the problem but am not having much luck with it.
Is this behavior normal and expected in ASP.net, or has anyone heard of anything similar? I have never encountered this type of problem before and was hoping someone could provide some clues for resolving it. If more information is required I'd be happy to supply it. Hopefully, there's a simple explanation that I am simply unaware of, since I haven't experienced anything like this before.
Anybody got any ideas???
Thanks.

View 6 Replies View Related






