SqlDataReader Reads From Second Row, Skipping The First Row
Jun 20, 2007
Hello,
I'm using SqlDataReader to read my data, and whenever I loop through the reader it starts from the second row. Why is that?
Here is my code:

while (reader.Read())
{
    hinfo.Name = reader["_name"].ToString();
    hi.Add(hinfo);
}
I looked at the database and I have two rows, but it's reading only the second row, skipping the first row.
View 2 Replies
Mar 25, 2008
Hi!
I was assigned to solve performance problems for an application. I fired up SQL Server Profiler and started a trace, and downloaded SQL Server Trace Analyzer (a trial version, so it's very limited). What I found is that one stored procedure generates almost 400,000 reads every time it's used, and it's used every time the user wants to see his orders. I've tried to translate the T-SQL from Swedish to English; it looks something like this:
select top 100
o.orderid,
o.name,
o.latestdeldate,
os.name as OrderStatus,
os.orderstatusID,
p.placeID,
p.name as place,
p.address,
p.city,
a.name as worktype,
noOfActions=(select count(*) from actions a where a.order_orderid=o.orderid),
noOfServiceObjects = (select count(*) from Serviceobject s, Actions a where s.Place_PlaceID = o.Place_PlaceID and a.order_orderid = o.orderid and a.Serviceobject_serviceobjectid = s.serviceobjectid),
...
...
...
It has 8 select count(*) subqueries in the select list, and then 2 more select count(*) in the where clause.
I know it's very difficult for you to come up with a solution, but do you know a better way than using select count(*) everywhere? The counts are used to show different status flags on the website.
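One pattern that often helps here is to compute all the per-order counts once in a grouped derived table and join it in, instead of re-running a correlated select count(*) for every output row and column. A rough sketch for the noOfActions column only, assuming the table behind the alias o is called orders (the real name isn't shown above):

select top 100
    o.orderid,
    o.name,
    isnull(ac.noOfActions, 0) as noOfActions
from orders o
left join (select order_orderid, count(*) as noOfActions
           from actions
           group by order_orderid) ac on ac.order_orderid = o.orderid

The same derived-table idea extends to the other counts; each one becomes a grouped join instead of a per-row subquery.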
/Magnus
Jesus saves. But Gretzky slaps in the rebound.
View 19 Replies
View Related
Mar 13, 2001
Does anyone know how to skip the first and last row of a text file while importing using DTS?
ThankX in advance.
View 1 Replies
View Related
Aug 1, 2001
If I'm doing dirty reads and someone updates a record while I'm trying to read it, is it possible to read both the old and new records, thereby retrieving two records?
View 2 Replies
View Related
Oct 30, 2006
How can you find the reads and writes per second of your hard drives in SQL? I am reading my SQL book and it says that your average disk should have 125 or fewer I/Os. It gave the formula, but as mentioned I don't know how to find the reads and writes.
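The per-drive numbers the book is talking about usually come from Performance Monitor (the PhysicalDisk counters Disk Reads/sec and Disk Writes/sec) rather than from inside SQL Server. If you want to see what SQL Server itself is doing per database file, one rough sketch on SQL Server 2000 uses fn_virtualfilestats and samples it twice; the database name, file id 1 and the 10-second interval are placeholders:

declare @db int, @r1 bigint, @w1 bigint, @r2 bigint, @w2 bigint
set @db = db_id('MyDatabase')                      -- placeholder database name
select @r1 = NumberReads, @w1 = NumberWrites
from ::fn_virtualfilestats(@db, 1)                 -- file id 1 = primary data file
waitfor delay '00:00:10'
select @r2 = NumberReads, @w2 = NumberWrites
from ::fn_virtualfilestats(@db, 1)
select (@r2 - @r1) / 10.0 as ReadsPerSec,
       (@w2 - @w1) / 10.0 as WritesPerSec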
View 4 Replies
View Related
May 1, 2008
server: QAT on clustering server ----> 23 seconds
----------------------------------------------------
SS 2000 developer edition SP4
win NT 5.2 (3790) SP4
MeM 7935 MB
processors 4
root directory C:\program files...
use a fixed memory size 640 MB
reserve physical memory for sql server
minimum query memory 1024 kb
use all available processors
minimum query plan threshold for considering 5
PROFILER READS = 5234
server: MILLER ----> 3 seconds
----------------------------------------------------
SS 2000 developer edition no service pack
win NT 5.2 (3790) SP4
MeM 2047 MB
processors 4
root directory f:\MSSQL$INAQAT
dynamically configure sql server memory
use all available processors
minimum query plan threshold for considering 5
PROFILER READS = 598
----------------------------------------------------
To make the story short: I've got an application that hits only one database, called RECORDS, and I'm getting different durations when running the application: 23 and 3 seconds.
Same database, same objects and same application.
SERVER QAT is our staging server, which means lots of databases.
SERVER MILLER is just a server I assembled, which means just one database (RECORDS).
Not sure if it's the clustering server or the reads that is causing the issue. If it's the reads, what is causing them? Do you think it's how the memory is configured? Will the experts please stand up?
View 20 Replies
View Related
Jul 18, 2006
So I'm at a dead-end looking for the reason behind the following behavior. Just to make sure no one misses it, the 'behavior' is the difference in the number of reads between using sp_executesql and not.
The following statements are executed against a SQL 2000 database that contains >1,000,000 records in the act_item table. They are run using Query Analyzer, and the Duration and Reads come from SQL Profiler.
SQL 1:
exec sp_executesql N'update act_item set Priority = @Priority where activity_code = @activity_code', N'@activity_code nvarchar(40),@Priority int', @activity_code = N'46DF335F-68F7-493F-B55E-5F9BC6CEBC69', @Priority = 0
Reads: ~22000
Duration: 250-350 ms
SQL 2:
DECLARE @Priority int
DECLARE @Activity_Code char(36)
SET @Priority = 0
SET @Activity_Code = '46DF335F-68F7-493F-B55E-5F9BC6CEBC69'
update act_item set Priority = @Priority where activity_code = @activity_code
Reads: ~160
Duration: 0 ms
Random information:
Activity_code is an indexed field on the table, although it is not the primary key. There are a total of four indexes on the table, none of which include the priority as one of the fields.
There are two triggers on the table, neither of which is executed for this SQL statement (there is an IF UPDATE(fieldname) surrounding the code in the trigger)
There are no foreign relationships
I checked (using perfmon) to see if a compilation/recompilation was happening. No it's not.
Any suggestions as to avenues that could be examined would be appreciated.
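One avenue worth checking, offered as a guess rather than a diagnosis: in SQL 1 the parameter is declared as nvarchar(40), while SQL 2 declares it as char(36). If the activity_code column is a non-Unicode type, comparing it to an nvarchar parameter forces an implicit conversion that can turn the index seek into a scan, which would line up with the difference in reads. A sketch of the same sp_executesql call with the parameter typed to match the column:

exec sp_executesql
    N'update act_item set Priority = @Priority where activity_code = @activity_code',
    N'@activity_code char(36), @Priority int',
    @activity_code = '46DF335F-68F7-493F-B55E-5F9BC6CEBC69',
    @Priority = 0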
TIA
View 3 Replies
View Related
Oct 21, 2004
Hello:
I have one table and it contains a column named ID Number and a column named Date. I have a Do While statement that runs a SQL select statement a few times, based on the number of records with the same ID Number. During the Do While statement the information is copied into another table and deleted from the old table. After I look at the results, I see that on the second Do While loop the data was not selected and the Select statement did not run... so the old value of varValue is used again... Any reason why?
Here is a code snippet of what is going on:
Do While varCount < varRecordCount
conSqlConnect.Open()
cmdSelect = New SqlCommand ("Select * From temp_records_1 where [id number]=@idnumber and date<@date", conSqlConnect)
cmdSelect.Parameters.Add( "@accountnumber", "10000" )
cmdSelect.Parameters.Add( "@date", dtnow )
dtrdatareader = cmdSelect.ExecuteReader()
While dtrdatareader.Read()
If IsDbNull(dtrdatareader("value")) = false Then
varValue = dtrdatareader("value")
End If
End While
dtrdatareader.Close()
conSqlConnect.Close()
'#####The information above is copied to another table here
'#####The record where the information was received is deleted.
varCount = varCount + 1
Loop
Any ideas?
View 2 Replies
View Related
Mar 13, 2000
Hello somebody,
I submitted an update query on a table of 80 million rows over the weekend. When I returned on Monday, the transaction was still running. I thought something had gone wrong and cancelled the transaction. It was taking a long time to roll back the transaction, so I recycled SQL Server assuming it would recover faster. Now I realise that it is going to take a lot of time either way, and SQL Server is not going to be up until the database is recovered completely.
Can anybody suggest anything to speed up this process or skip it? I don't know how long it is going to take to roll back a transaction which ran for more than 70 hours.
Thanx in advance.
View 1 Replies
View Related
Apr 19, 2004
I was attempting to use BCP today via xp_cmdshell. I have never done anything with BCP before, so it was very enlightening. However, I ran across a problem that maybe someone could help explain to me a little more.
I am using the "queryout" option, and when I run it, the error I get is that you "can't skip fields except for on inserts" or something like that.
The reason I was trying to use bcp is the ability to dynamically generate a filename, i.e. filename = 04182004 (the date), because in the file name argument I can use a variable. Make sense?
Since I apparently can't ignore fields, I am thinking of taking all of the information I need daily and putting it into a separate table; then I can use xp_cmdshell to run a bcp that creates a file with the date as the filename, and I won't be ignoring any fields because I have just put the information I need in the new table. Am I making sense? Does this sound like an appropriate thing to do?
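For what it's worth, the dynamic file name part on its own can be done by building the bcp command as a string and shelling out with xp_cmdshell. A rough sketch; the table, export path, server name and the trusted-connection switch (-T) are placeholders for whatever your environment needs:

declare @fname varchar(50), @cmd varchar(500)
set @fname = replace(convert(varchar(10), getdate(), 101), '/', '') + '.txt'   -- e.g. 04182004.txt
set @cmd = 'bcp "select * from MyDb..DailyExtract" queryout "C:\exports\' + @fname + '" -c -T -S MyServer'
exec master..xp_cmdshell @cmd

Copying the needed columns into a separate staging table first, as you describe, and pointing the queryout at that table is a reasonable way to sidestep the skipped-fields error.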
View 3 Replies
View Related
Oct 25, 2007
Hello
I am trying to execute a stored proc using an Execute SQL Task.
I am aware that if any errors occur, I have handled them sufficiently.
All I want is that whenever there are any errors in the stored proc, this Execute SQL Task should not fail and should go on to the next task in the control flow.
How can I do this?
View 4 Replies
View Related
Apr 23, 2008
Hi,
There is an option in SSIS to skip one or more header rows, but there isn't anything to skip one or more footer rows.
Example:
header bla bla
1;"Joe";24;"New York"
2;"John";54;"Washington"
3;"Phil";36;"San Francisco"
footer bla bla
I skip the first record in the source definition, so I have 4 records left. How do I skip the fourth (last) record? The value contains some statistics, so I can't look for a special value. Is there a way to skip the last record with a script component?
Joost
View 5 Replies
View Related
May 31, 2006
I have a set of triggers that log the history of changes to a table - i.e. I record inserts, updates, deletes (pretty standard audit stuff I suppose). I want to also log reads on that data. If I were using sprocs for reading data, this would be relatively painless, but I am using an O/R mapper to handle my data access, which writes dynamic sql at runtime (and I don't want to use sprocs with it) and then sends it down to the DB. Is there a way I can intercept reads and log them to the same table I am logging other actions? I know very little about the new capabilities of SQL Server 2005, but I would think I could somehow, maybe via the new CLR capabilities or similar, get access to these types of events within the database? Anyone? I know I could always do this higher up in the application layers, but I would like to keep all of this at the database level if possible....Thanks,
View 1 Replies
View Related
Jan 17, 2002
SQL 6.5 - 5.5 Gig
NT
Hello,
Throughout the day our Document Management application generates high bursts of physical page reads when users query the database.
What SQL configuration parameter(s) should I check/modify to ensure that the database is performing at its optimum during these bursts?
Thank You in advance.
View 1 Replies
View Related
Jul 21, 2000
Is there a way to get a total count of all SELECT, UPDATE, DELETE and INSERT statements against a SQL Server 6.5 database during a 12-hour period? I'm thinking maybe someone knows of software that reads the log or monitors the server... I've been looking at Performance Monitor and, although it has good information, it doesn't capture DML.
FYI - it's for capacity planning.
TIA,
Mike
View 1 Replies
View Related
Aug 24, 2007
I'm trying to insert all the rows from a table to a new table.
(insert A select * from AA)
The Reads column in Profiler shows a really high value (10253548).
First I created a unique clustered index and the reads showed (3258445); then I created a nonclustered index expecting to have lower reads. Instead the reads show (10253548).
I read that creating indexes helps reduce reads, but it's not happening.
Any ideas what is going on?
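One thing to keep in mind: indexes can reduce the reads needed to find rows on the source side of a query, but on the target of an INSERT every nonclustered index is an extra structure that has to be maintained, so adding indexes to A generally increases the I/O of the load rather than lowering it. A quick way to see the statement's own I/O, independent of Profiler, is a sketch like:

set statistics io on
insert A select * from AA
set statistics io off
-- compare the output with and without the extra nonclustered index on A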
=============================
http://www.sqlserverstudy.com
View 6 Replies
View Related
Mar 5, 2008
Guys,
Is there any way to track which tables have the most reads and writes in a database of 400 tables?
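If the instance is SQL Server 2005 or later (an assumption, the post doesn't say), one low-effort option is the index usage DMV, which accumulates per-index seeks, scans, lookups and updates since the last restart. A sketch that rolls it up per table:

select object_name(s.object_id) as table_name,
       sum(s.user_seeks + s.user_scans + s.user_lookups) as reads,
       sum(s.user_updates) as writes
from sys.dm_db_index_usage_stats s
where s.database_id = db_id()
group by s.object_id
order by reads desc

Note the counters reset when the service restarts, so capture them over a representative period.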
Thanks
View 9 Replies
View Related
Jul 27, 2007
Hi,
Can any of you explain what the "Reads" column in Profiler exactly means? I'm not comfortable with the explanation given in BOL.
"The number of read operations on the logical disk that are performed by the server on behalf of the event. These read operations include all reads from tables and buffers during the statement's execution"
For the same procedure with the same parameters, if the server is not loaded much the Reads are in the few hundreds, but when there are more than 1000 concurrent users, why does it go into the millions? What other parameters affect these reads? And how can I reduce them?
Environment: SQL Server 2005 64-bit Enterprise Edition on Windows Server 2003 R2 Server x64 Enterprise Edition SP2
Thanks in Advance.
Regards
Babu
View 4 Replies
View Related
Aug 28, 2006
Hi,
I have been seeing a basic scenario where a write transaction appears to unexpectedly lock out reading.
The database has isolation set to "READ COMMITTED".
The scenario is:
1.) Start a transaction (for doing a write)
2.) Do a read before the transaction (for doing the write) is committed (e.g. sqlCommand2.ExecuteReader()).
--> the code will appear to lock-up (then time out).
I see the same behavior if I step through the "write" code with the debugger (to a point after the transaction is started, but before it is committed), and run a "SELECT * FROM" type query from Microsoft SqlServer Management Studio.
Following is a code sample that demonstrates the issue.
Thoughts on how to resolve the issue (to let me do "read committed" reading of the database table)?
Thanks!
Andy
Module Transaction
Sub Main()
Dim exception1 As Exception
Try
' Create/Open Database Connection
Dim sqlConnection1 As New System.Data.SqlClient.SqlConnection("Server=GRB-AB;Database=Transaction;Trusted_Connection=True;")
sqlConnection1.Open()
' Start transaction
Dim sqlTransaction1 As System.Data.SqlClient.SqlTransaction = sqlConnection1.BeginTransaction()
' Set Parent record
Dim sqlCommand1 As New System.Data.SqlClient.SqlCommand("INSERT INTO Parent (Name) VALUES ('ParentValue');", sqlConnection1)
sqlCommand1.Transaction = sqlTransaction1
sqlCommand1.ExecuteNonQuery()
' Get Id from parent record (note: this code assumes the table was empty when this program starts)
sqlCommand1 = New System.Data.SqlClient.SqlCommand("SELECT Id FROM Parent;", sqlConnection1)
sqlCommand1.Transaction = sqlTransaction1
Dim parentId As Integer = CType(sqlCommand1.ExecuteScalar(), Integer)
'
' Do reading test to test concurrently reading table being written to
'
' Create/Open Database Connection for reading test
Dim sqlConnection2 As New System.Data.SqlClient.SqlConnection("Server=GRB-AB;Database=Transaction;Trusted_Connection=True;")
sqlConnection2.Open()
Dim sqlCommand2 As New System.Data.SqlClient.SqlCommand("SELECT Id FROM Parent;", sqlConnection2)
sqlCommand2.ExecuteReader()
Dim i As Integer
While (sqlCommand2.ExecuteReader.Read = True) ' <===== LOCKS UP HERE **************
i = i + 1
End While
'
' End reading test
'
' Set child record
sqlCommand1 = New System.Data.SqlClient.SqlCommand( _
"INSERT INTO Child (Name, ParentId) VALUES ('ChildValue', " & parentId.ToString & ");", sqlConnection1)
sqlCommand1.Transaction = sqlTransaction1
sqlCommand1.ExecuteScalar()
' Either 1.) commit transaction OR 2.) rollback transaction
Dim test As Boolean = False
If test = False Then
sqlTransaction1.Commit()
Else
sqlTransaction1.Rollback()
End If
sqlConnection1.Close()
sqlConnection2.Close()
Catch ex As Exception
exception1 = ex
End Try
End Sub
End Module
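Under locking READ COMMITTED this is the expected behaviour: the SELECT on sqlConnection2 queues behind the exclusive lock held by the uncommitted INSERT on sqlConnection1. If the goal is for readers to see the last committed version instead of waiting, one option on SQL Server 2005 is row-versioned read committed; a sketch, assuming the database really is named Transaction and that nothing else is connected when you run it:

ALTER DATABASE [Transaction] SET READ_COMMITTED_SNAPSHOT ON

Alternatively, if the second SELECT is only meant to observe the in-flight work, running it on sqlConnection1 inside the same transaction would already see the uncommitted row without blocking.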
View 1 Replies
View Related
Sep 19, 2006
I have written the same stored proc in T-SQL and SQLCLR; it basically takes an input XML and returns an XML document. In SQL Profiler, I am getting a Reads value about five times higher for the CLR version. Does anyone have any idea why the CLR is doing more reads than T-SQL? Thanks in advance.
View 5 Replies
View Related
Jan 10, 2008
Hi, I have the code below. I need to skip the first row in the datatable as it has the headers. This works now, but my gridview gets the header row inserted as a record.

Private Shared Sub InsertData(ByVal sourceTable As System.Data.DataTable, ByVal destConnection As SqlConnection)
    ' old method: Lots of INSERT statements
    ' first, create the insert command that we will call over and over:
    destConnection.Open()
    Using ins As New SqlCommand("INSERT INTO [tblAppointmentDisposition] ([contactdate], [dnbnumber], [prospectname], [businessofficer], [phonemeeting], [followupcalldate2], [phonemeetingappt], [followupcalldate3], [appointmentdate], [appointmentlocation], [appointmentkept], [applicationgenerated], [applicationgenerated2], [applicationgenerated3], [comments], [newaccount], [futureopportunity]) VALUES (@contactdate, @dnbnumber, @prospectname, @businessofficer, @phonemeeting, @followupcalldate2, @phonemeetingappt, @followupcalldate3, @appointmentdate, @appointmentlocation, @appointmentkept, @applicationgenerated, @applicationgenerated2, @applicationgenerated3, @comments, @newaccount, @futureopportunity)", destConnection)
        ins.CommandType = CommandType.Text
        ins.Parameters.Add("@contactdate", SqlDbType.Text)
        ins.Parameters.Add("@dnbnumber", SqlDbType.Text)
        ins.Parameters.Add("@prospectname", SqlDbType.Text)
        ins.Parameters.Add("@businessofficer", SqlDbType.NVarChar)
        ins.Parameters.Add("@phonemeeting", SqlDbType.Text)
        ins.Parameters.Add("@followupcalldate2", SqlDbType.Text)
        ins.Parameters.Add("@phonemeetingappt", SqlDbType.Text)
        ins.Parameters.Add("@followupcalldate3", SqlDbType.Text)
        ins.Parameters.Add("@appointmentdate", SqlDbType.Text)
        ins.Parameters.Add("@appointmentlocation", SqlDbType.Text)
        ins.Parameters.Add("@appointmentkept", SqlDbType.Text)
        ins.Parameters.Add("@applicationgenerated", SqlDbType.Text)
        ins.Parameters.Add("@applicationgenerated2", SqlDbType.Text)
        ins.Parameters.Add("@applicationgenerated3", SqlDbType.Text)
        ins.Parameters.Add("@comments", SqlDbType.Text)
        ins.Parameters.Add("@newaccount", SqlDbType.Text)
        ins.Parameters.Add("@futureopportunity", SqlDbType.Text)
        ' and now, do the work:
        For Each r As DataRow In sourceTable.Rows
            For i As Integer = 0 To 16
                ins.Parameters(i).Value = r(i)
            Next
            ins.ExecuteNonQuery()
            'If System.Threading.Interlocked.Increment(rowscopied) Mod 10000 = 0 Then
            '    Console.WriteLine("-- copied {0} rows.", rowscopied)
            'End If
        Next
    End Using
    destConnection.Close()
End Sub
View 3 Replies
View Related
Jan 10, 2008
I have this code. It works, but inserts the header row into the gridview. I need to avoid the first row.

Protected Sub excelimport(ByVal dataSrc As SqlDataSource, ByVal fileName As String)
    Dim intFileNameLength As Integer
    Dim strFileNamePath As String
    Dim strFileNameOnly As String
    Dim strpath As String
    If Not (uploadfile.PostedFile Is Nothing) Then
        strFileNamePath = uploadfile.PostedFile.FileName
        intFileNameLength = InStr(1, StrReverse(strFileNamePath), "\")
        strFileNameOnly = Mid(strFileNamePath, (Len(strFileNamePath) - intFileNameLength) + 2)
        'If File.Exists(paths & strFileNameOnly) Then
        '    lblMessage.Text = "Image of Similar name already Exist,Choose other name"
        'Else
        If uploadfile.PostedFile.ContentLength > 40000 Then
            lblmessage.Text = "The Size of file is greater than 4 MB"
        ElseIf strFileNameOnly = "" Then
            Exit Sub
        Else
            'strfilename = uploadfile.FileName.Substring(0, (InStr(uploadfile.FileName, ".") - 1))
            strFileNameOnly = fileName & ".csv"
            strpath = "/sites/marketing/apps/disposition/content/excel/" '& strFileNameOnly
            uploadfile.PostedFile.SaveAs(Server.MapPath(strpath) & strFileNameOnly)
            'lblmessage.Text = "File Upload Success."
            'Session("Img") = strFileNameOnly
            Dim strConn As String = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & (Server.MapPath(strpath)) & ";Extended Properties=""Text;HDR=No;FMT=Delimited"""
            '"Provider=Microsoft.Jet.OLEDB.4.0;" & _
            '"Data Source=" & "/" & strFileNameOnly & ";" & _
            '"Extended Properties=Excel 8.0;"
            Dim conn As New OleDb.OleDbConnection(strConn)
            Dim myData As New OleDbDataAdapter("SELECT * FROM " & strFileNameOnly, conn)
            Dim myDatatable As New System.Data.DataTable
            Dim mySqlConnection = New SqlConnection(ConfigurationManager.ConnectionStrings("BDOConnectionString").ToString())
            ''You must use the $ after the object you reference in the spreadsheet
            myData.Fill(myDatatable)
            InsertData(myDatatable, mySqlConnection)
            'System.IO.File.Delete(Server.MapPath(strpath))
            GridView1.DataBind()
            upload.Visible = False
        End If
    End If
    ' GridView1.DataSource = myDataset.Tables(0).DefaultView
    ' GridView1.DataBind()
End Sub

Private Shared Sub InsertData(ByVal sourceTable As System.Data.DataTable, ByVal destConnection As SqlConnection)
    ' old method: Lots of INSERT statements
    ' first, create the insert command that we will call over and over:
    destConnection.Open()
    Using ins As New SqlCommand("INSERT INTO [tblAppointmentDisposition] ([contactdate], [dnbnumber], [prospectname], [businessofficer], [phonemeeting], [followupcalldate2], [phonemeetingappt], [followupcalldate3], [appointmentdate], [appointmentlocation], [appointmentkept], [applicationgenerated], [applicationgenerated2], [applicationgenerated3], [comments], [newaccount], [futureopportunity]) VALUES (@contactdate, @dnbnumber, @prospectname, @businessofficer, @phonemeeting, @followupcalldate2, @phonemeetingappt, @followupcalldate3, @appointmentdate, @appointmentlocation, @appointmentkept, @applicationgenerated, @applicationgenerated2, @applicationgenerated3, @comments, @newaccount, @futureopportunity)", destConnection)
        ins.CommandType = CommandType.Text
        ins.Parameters.Add("@contactdate", SqlDbType.Text)
        ins.Parameters.Add("@dnbnumber", SqlDbType.Text)
        ins.Parameters.Add("@prospectname", SqlDbType.Text)
        ins.Parameters.Add("@businessofficer", SqlDbType.NVarChar)
        ins.Parameters.Add("@phonemeeting", SqlDbType.Text)
        ins.Parameters.Add("@followupcalldate2", SqlDbType.Text)
        ins.Parameters.Add("@phonemeetingappt", SqlDbType.Text)
        ins.Parameters.Add("@followupcalldate3", SqlDbType.Text)
        ins.Parameters.Add("@appointmentdate", SqlDbType.Text)
        ins.Parameters.Add("@appointmentlocation", SqlDbType.Text)
        ins.Parameters.Add("@appointmentkept", SqlDbType.Text)
        ins.Parameters.Add("@applicationgenerated", SqlDbType.Text)
        ins.Parameters.Add("@applicationgenerated2", SqlDbType.Text)
        ins.Parameters.Add("@applicationgenerated3", SqlDbType.Text)
        ins.Parameters.Add("@comments", SqlDbType.Text)
        ins.Parameters.Add("@newaccount", SqlDbType.Text)
        ins.Parameters.Add("@futureopportunity", SqlDbType.Text)
        ' and now, do the work:
        For Each r As DataRow In sourceTable.Rows
            If sourceTable.Rows.IndexOf(sourceTable.Rows(0)) Then
                'do nothing
            Else
                For i As Integer = 0 To 16
                    ins.Parameters(i).Value = r(i)
                Next
                ins.ExecuteNonQuery()
                'If System.Threading.Interlocked.Increment(rowscopied) Mod 10000 = 0 Then
                '    Console.WriteLine("-- copied {0} rows.", rowscopied)
                'End If
            End If
        Next
    End Using
    destConnection.Close()
End Sub
View 5 Replies
View Related
Sep 20, 2013
I am looking for the best practice when passing a parameter to a stored procedure that is not needed. For example, sometimes the users will want to list only a certain state; other times the user wants all states. How can I make the SP ignore the where clause if users want all states?
CREATE PROCEDURE usp_Example
@State nvarchar(2)
AS
SELECT FirstName,
LastName,
State
FROM SomeTable
WHERE State = @State;
GO
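A common pattern is to give the parameter a NULL default meaning "all states" and let the WHERE clause short-circuit on it; a sketch based on the procedure above:

CREATE PROCEDURE usp_Example
    @State nvarchar(2) = NULL      -- NULL (or omitted) means all states
AS
SELECT FirstName,
       LastName,
       State
FROM SomeTable
WHERE (@State IS NULL OR State = @State);
GO

-- usage
EXEC usp_Example;                  -- all states
EXEC usp_Example @State = N'WA';   -- one state

On larger tables, adding OPTION (RECOMPILE) to the SELECT can help the optimizer build a good plan for each case rather than one shared plan.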
View 1 Replies
View Related
Jul 23, 2005
Hello All,
Does the BCP utility enable you to selectively import rows from a flat file to a table?
For example: the first column in my flat file contains a record type - 1, 2..7. I only need to import types 1, 2, & 3.
Can this be specified in the .fmt file?
Thanks in advance
hharry
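As far as I know the .fmt file can skip columns but not filter rows by value, so one workaround is to bulk load the whole file into a staging table and filter with plain T-SQL. A sketch; the staging and target table names, file path and terminators are invented for illustration:

BULK INSERT StagingImport
FROM 'C:\data\feed.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

INSERT INTO TargetTable (rec_type, col1, col2)
SELECT rec_type, col1, col2
FROM StagingImport
WHERE rec_type IN (1, 2, 3)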
View 4 Replies
View Related
Jun 2, 2006
Suppose you have a database with huge tables and transactional replication (push subscriptions).
Now my question:
1. If I have to initialize a snapshot but I would like to do it without the snapshot agent, what methods are available?
2. Usually the distribution agent will request an initial snapshot. How can I tell it that I would like to use an alternative method and that the distribution agent should NOT request a snapshot?
3. Any suggestions about a good practice for materializing huge tables without using the distribution agent (e.g. "switch off" replication, bcp the table out of the primary site and bcp it into the target site, "start" the distribution agent so that it doesn't request a snapshot)?
Regards
Nobsay
View 6 Replies
View Related
May 15, 2007
Hi,
How do I skip my 12 header rows from an XLS input source?
(That is, before the Excel driver reads the specified source (by default, 8 rows) to guess at the data type of each column.)
thx,
f.sor
View 3 Replies
View Related
May 20, 2008
I am importing a flat file and cannot seem to deal with an issue that seems quite simple.
The files have a header row with column names, and those rows start with '#'.
However, sometimes this header row will also be present in the middle of the file.
The source tries to parse this row and fails.
Is there any way to tell the flat file source to skip rows that start with a particular character, like comment rows?
View 5 Replies
View Related
Jul 24, 2007
Hi All,
I have an Excel workbook with many sheets; in each sheet the first row has to be skipped, the second row contains the column information, and thereafter are the records.
The Excel Source in SSIS just gives one option: check if the first row has column names.
But the first row for me is junk (a link to the parent or first sheet) and has to be skipped, and the second row has the column info.
How can this be accomplished .... any suggestions would be of great help!!!
Sample:

Main
id    desc      price    date
1     apple     1.0      1/1/1900
2     banana    2.0      1/1/2000
Main in the first row is actually a hyperlink ... once we click this it takes us to the first sheet in the workbook which has all sheet names as contents.
I am stuck and not sure how to skip this!!
View 5 Replies
View Related
Feb 7, 2006
OK. We know there is the "header rows to skip" option and it works great.
I've got the file that has a "footer". Here is an example:
.
PSC
filename=table1
records=0000000000525
ldbname=db1
timestamp=2006/02/07-16:25:00
numformat=44,46
dateformat=mdy-1910
map=NO-MAP
cpstream=ISO8859-1
.
0000260611
It's ALWAYS the last 12 rows.
Is there a way to split at this point and put the 12 rows in a different location? The task is twofold - I don't need these control rows in my data, and I need the value of "records" to verify the number of rows loaded.
UPDATED: After some testing I found out that the Flat File source does not see that footer at all. This is good and bad - I do want to load this metadata into some other tables.
Dima.
View 7 Replies
View Related
Dec 22, 2000
Hi Everybody,
One of my friends asked me, "How do we reduce a query's logical and scan reads in SQL Server?"
I really don't know how to answer him.
Can anybody explain me regarding this.
thanks,
Srini
View 2 Replies
View Related
May 5, 2015
Just migrated an application from Oracle to SQL and we are seeing a lot of deadlocking and blocking. I did notice that the app seems to be passing an isolation level of repeatable read. Attached is a .doc of one of the deadlocks; is there a way to avoid these in the repeatable read isolation level? This example is a select on two tables, using NCIs that cover the where clause, and an insert doing just a clustered index insert. Is the answer simply to get rid of repeatable read if it's not needed (I guess I have to check with the vendor on that), or is there a way to get this not to deadlock while using repeatable read?
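If the vendor is open to a change, one option on SQL Server 2005 and later is to get similar read consistency from row versioning instead of shared locks, which removes the reader/writer lock ordering behind this kind of deadlock. A sketch, with the database and table names invented for illustration; note that SNAPSHOT raises update-conflict errors instead of blocking, so it is not a drop-in swap for every workload:

ALTER DATABASE MyAppDb SET ALLOW_SNAPSHOT_ISOLATION ON
GO
-- in the reading session, instead of REPEATABLE READ:
SET TRANSACTION ISOLATION LEVEL SNAPSHOT
BEGIN TRAN
SELECT OrderID, Status FROM dbo.Orders WHERE CustomerID = 42
COMMIT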
View 2 Replies
View Related
May 2, 2007
Hi,
I'm trying to figure out why my SQL Server is flatlined on the CPU. I'm doing a trace and can't help but notice this, with crazy high reads. I'm not sure what this is; it doesn't look good to me, although maybe it's nothing. Any info is much appreciated.
Thanks again!
mike123
Event Class/ TextData/ApplicationName/ LoginName/ CPU/ Reads/ Writes/ Duration
Audit Logout   .Net SqlClient Data Provider   loginName   3764129784 3146 156
View 3 Replies
View Related
Jul 23, 2005
I am running a profiler trace against a database and noticed that the reads column always shows 0. When running the same trace against another machine I get back values in the reads column. I took a query that profiler reported as having 0 reads and ran it in Query Analyzer with STATISTICS IO on, and confirmed that there are in fact reads:
Table 'tt_cawardalloc'. Scan count 1, logical reads 8, physical reads 0, read-ahead reads 1.
Table 'tt_clineitem'. Scan count 10, logical reads 125208, physical reads 1540, read-ahead reads 2995.
Table 'tt_contractitem'. Scan count 32, logical reads 676, physical reads 0, read-ahead reads 0.
Table 'tt_contract2'. Scan count 3, logical reads 121, physical reads 4, read-ahead reads 0.
I am on SQL 2000 SP3a. Any help appreciated. Thanks!
View 8 Replies
View Related