SQL 7 High Page Faults/sec Compared To 6.5?

Aug 11, 1999

I am running an application on one NT Server against both SQL Server 6.5 SP3 and SQL Server 7.0 SP1.

The application is a 'data migration' type application - i.e. a heavy insert and update workload - against many tables (50+) with many different SQL statements.

The SQL 7 server is configured with dynamic ('floating') memory.

On SQL 7 I am experiencing very high page faults/second for the sqlservr process, sometimes peaking at over 1,000. I was under the impression that any number greater than 10 indicates a problem with system performance.

The same application, same data, same NT configuration, etc. against SQL 6.5 does not page fault. SQL Server 6.5 also completes the work faster than 7.
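For reference, the dynamic memory settings on the SQL 7 box can be checked from Query Analyzer; a minimal sketch (it only displays the current values, nothing is changed here):

-- Sketch: list the SQL Server 7.0 memory options (no changes made).
-- 'show advanced options' must be on for the memory options to be visible.
EXEC sp_configure 'show advanced options', 1
RECONFIGURE WITH OVERRIDE
GO
EXEC sp_configure 'min server memory (MB)'   -- 0 under dynamic ('floating') memory
EXEC sp_configure 'max server memory (MB)'   -- 2147483647 under dynamic memory
EXEC sp_configure 'set working set size'     -- 0 unless memory has been fixed
GO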

Could anyone help me understand what's going on?

Thanks in advance.




Perfmon - Process:page Faults/sec

Jun 13, 2000

My question is: my Perfmon counter reads anywhere between 0 and 1200, with the average being around 250 page faults/second.
My concern is that my memory max size is too large. I have 4 GB of RAM, SQL reports usage at 2.9 GB, and my max memory size is 3.9 GB. Should I set SQL Server to use a fixed memory size of 2.9 GB?
Thanks in advance

Pete Karhatsu

Copied from SWYNK's article:

Process: Page Faults/sec
If this value is greater than 0 then the SQL Server process is producing soft page faults and, as a result, CPU overhead. Try setting the working set size value as close as possible to SQL Server's memory allocation.
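A minimal sketch of what that advice might look like on SQL Server 7.0/2000, assuming roughly 2.9 GB is the target (the values are illustrative; 'set working set size' only applies when min and max are equal and needs a restart to take effect):

-- Sketch only: fix SQL Server memory at ~2.9 GB and reserve the working set.
EXEC sp_configure 'show advanced options', 1
RECONFIGURE WITH OVERRIDE
GO
EXEC sp_configure 'min server memory (MB)', 2900
EXEC sp_configure 'max server memory (MB)', 2900
EXEC sp_configure 'set working set size', 1   -- lock the allocation in physical RAM
RECONFIGURE WITH OVERRIDE
GO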


MS SQL 2005, .NET, Logins/sec And Page Faults

Jul 25, 2006

Hi, all. We have a couple of pathological SQL servers that have lots and lots of page faults per second, up to 4000. Our client programs are written in C#/.NET 1.1 and utilize connection pooling.

Some of the client programs seem to log in hundreds of times per second, as reported by Perfmon -> SQLServer:General Statistics -> Logins/sec. Stopping the client programs reduces that number significantly. We've done code reviews of the client programs and they look OK. Monitoring .NET connections and pools does not show anything suspicious. We're currently rewriting the clients to use one DB connection instead of the pools, but that takes some time and may introduce bugs.

Does anyone know why we have these problems and/or why Logins/sec is so high? I'm thinking "bugs in the .NET client", but really have no idea... One thought I had was that the page faults reported for sqlservr.exe are related to memory-mapped IO and can therefore be ignored. Right or wrong?

Any thoughts/pointers/ideas, even wild guesses, are most welcome.

Bjørn

PS: The server memory is fixed at 1.5 GB out of 2 GB physical RAM; the clients run on the same machine and use TCP/IP comm. (I know...) The host itself is not paging.
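For reference, the login rate can also be sampled from inside SQL Server 2005 itself; a sketch (the counter is cumulative, so take two samples a known interval apart to turn it into a per-second rate):

-- Sketch: read the General Statistics counters that back perfmon's Logins/sec.
SELECT [object_name], counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE [object_name] LIKE '%General Statistics%'
  AND counter_name IN ('Logins/sec', 'Logouts/sec', 'User Connections')

If the rate stays high even though pooling is enabled, one guess is that the pool is being bypassed (e.g. connection strings that differ slightly between calls), but that is only a guess.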


Whats A Memory Page Faults

Jul 20, 2005

Hi all. Dorky question, but I am still relatively new to the world of MS database servers so bear with me. I am monitoring the page fault rate on a server and it runs at 100% almost all of the time. Can someone help me understand what that means?


Report Running Very Slow Compared To Query Analyzer - High TimeDataRetrieval

Jun 26, 2007

Hi,



I have a report in SQL Reporting Services 2005 which calls a stored proc, and the report takes a very long time to run and sometimes returns zero records. But when I run the stored proc in Query Analyzer it takes about 4 seconds!



I have checked the execution log on the RS server using the SQL below:






Code Snippet

use ReportServer

Select * from ExecutionLog with (nolock) order by TimeStart DESC



It shows that I have a large amount of time for the data retrieval (601309 ms, about 10 mins) and it does not return any records, most likely because of a query timeout:



TimeDataRetrieval: 601309
TimeProcessing: 2227
TimeRendering: 3
Source: 1
Status: rsSuccess
ByteCount: 4916
RowCount: 0



The weird thing is that when I run it in Query Analyzer, I get about 400 records in 4 seconds!



I don't understand what RS is doing to take up so much time retrieving the data.



The report is very simple - it basically returns the records straight out into a table.



The only thing I somewhat suspected was a parameter data type conflict between RS and SQL, specifically with dates. I have a start and end date parameter in the report; I tried specifying them as date and as string to see if it made any difference, but it didn't.
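For what it's worth, one cause often suggested for "fast in Query Analyzer, slow from Reporting Services" is parameter sniffing inside the stored proc. A sketch of the usual local-variable workaround; the procedure, table, and column names here are made up for illustration:

-- Sketch only: copy the report parameters into local variables so the plan
-- is not built around one sniffed date range. All names are hypothetical.
CREATE PROCEDURE dbo.usp_ReportOrders
    @StartDate DATETIME,
    @EndDate   DATETIME
AS
BEGIN
    DECLARE @Start DATETIME, @End DATETIME
    SELECT @Start = @StartDate, @End = @EndDate

    SELECT OrderID, CustomerID, OrderDate, TotalDue
    FROM dbo.Orders
    WHERE OrderDate BETWEEN @Start AND @End
END
GO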



Any help would be greatly appreciated.


SQL Server 2000 Cluster - Excessive Page Faults

Jul 31, 2007

We are running SQL Server 2000 Enterprise Edition on a 2-node cluster with an IIS/ASP.NET front end hosting 150-200 active connections. There is a SVCHOST process running under the LOCAL SERVICE account - hosting the Remote Registry service - that is using only 4,200K but is page faulting 200-500 times per second. I realize this process is used for failover, but the page fault rate seems excessive. Any thoughts on this?

The servers are running Windows Server 2003 with 4 processors and 4gb RAM.


High Page Reads

Jan 17, 2002

SQL 6.5 - 5.5 Gig
NT

Hello,

Throughout the day our document management application generates high bursts of physical page reads when users query the database.

What SQL configuration parameter(s) should I check/modify to ensure that the database performs at its optimum during these bursts?

Thank You in advance.


High Page File Usage

May 3, 2007

Just built a new DB server, using SQL 2005 Standard 64-bit on Windows Server 2003 64-bit. The hardware includes 16 GB of memory, dual 4-core procs and many spindles; however we only have a 2 GB page file on C:. SQL is set to use a maximum of 12 GB of memory, which is way more than it should need.

The problem is, we are occasionally seeing very high page file usage. If Task Manager is correct (which it really can't be), I am using between 10 and 13.8 GB (as listed in PF Usage) - but I only have a 2 GB page file! Occasionally, when simply copying a file from one partition to another, PerfMon shows Pages/sec jump to 10,000. We also see some very high disk queueing. This system isn't even in production yet.

My main questions are two: 1) some documentation says that with SQL 2005 64-bit running on Windows 2003 64-bit I may not even need a page file. Is this correct? 2) What would be causing high usage of the swap file with 16 GB of installed memory?

Any help would be appreciated. Thanks in advance!
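For reference, this is roughly how the 12 GB cap is set from T-SQL (a sketch; whether "lock pages in memory" or a bigger page file is also needed is a separate question):

-- Sketch: cap the SQL Server 2005 (x64) buffer pool at 12 GB.
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
GO
EXEC sp_configure 'max server memory (MB)', 12288
RECONFIGURE
GO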



MikeE


Id (identity) Increments On Faults.

Jul 20, 2005

Hi, when I e.g. manually add entries to a table and cancel the insert, MS SQL increments the counter on the ID anyway. Is there a way to avoid this behavior?

Regards,
Anders
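A quick sketch of the behavior in question (the table name is invented): identity values consumed by a cancelled or rolled-back insert are not handed back, so the usual choices are to accept the gaps or to reseed.

-- Sketch: the identity value is consumed even though the insert is rolled back.
CREATE TABLE dbo.DemoIdentity (ID INT IDENTITY(1,1), Name VARCHAR(50))
GO
BEGIN TRAN
INSERT INTO dbo.DemoIdentity (Name) VALUES ('will be cancelled')
ROLLBACK TRAN                           -- the row disappears, the ID does not

INSERT INTO dbo.DemoIdentity (Name) VALUES ('kept')
SELECT ID, Name FROM dbo.DemoIdentity   -- the kept row has ID = 2, not 1

-- Optional: reseed if the gap matters (use with care on a live table).
-- DBCC CHECKIDENT ('dbo.DemoIdentity', RESEED, 1)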


High Availability To High Protection Without Reconfiguring Mirroring

Apr 23, 2007

Hi,

Is there a way to configure mirroring to go from High Availability to High Protection without having to reconfigure database mirroring? Using the interface in Management Studio, I can change the configuration option to High Performance but not to High Protection, despite both High Availability and High Protection being synchronous.

If not, what are the recommended steps to reconfigure the mirror once it has already been configured? Is it just like initially setting up the mirror, or are there any shortcuts I could take? If I stop the mirroring and remove the witness, will the High Protection option be available?
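For reference, I believe the T-SQL involved looks roughly like this (a sketch, not verified against the GUI; MyMirroredDb is a placeholder name):

-- Sketch: move an existing session to High Protection (synchronous,
-- no automatic failover). Run on the principal.
ALTER DATABASE MyMirroredDb SET WITNESS OFF          -- remove the witness
ALTER DATABASE MyMirroredDb SET PARTNER SAFETY FULL  -- synchronous commit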

Thanks,
J.


High Safety Changed To High Performance After Failover?

Mar 6, 2008

Hi There

I realise this is a stupid question but I cannot really find any confirmation of this in BOL.


If you are running High Safety with automatic failover, when failover occurs does this automatically change to High Performance mode? Since for failover to occur something has happened to the primary, it will be impossible to commit transactions on the new primary and the mirror synchronously, since one of them is no longer available.


So am I correct in assuming that automatic failover also automatically changes the mode to High Performance for that session?
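One way to see what mode the session is actually running in after the failover would presumably be the mirroring catalog view; a sketch:

-- Sketch: show the current mirroring role, safety level, and state.
SELECT DB_NAME(database_id)          AS database_name,
       mirroring_role_desc,
       mirroring_safety_level_desc,  -- FULL = high safety, OFF = high performance
       mirroring_state_desc,
       mirroring_witness_state_desc
FROM sys.database_mirroring
WHERE mirroring_guid IS NOT NULL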

Thanx


Sql 2000 Compared 2...

Apr 2, 2004

Does anyone know reasons why SQL Server 2000 is a better choice for DB creation than, say, Oracle or MS Access?



THANK YOU


SSIS Compared To DTS?

Dec 7, 2007

There is no doubt that SSIS has been designed to build on the lessons learned from DTS but in one respect I think that there may have been a backward step and that is ease of use. A big attraction (a hallmark even) of DTS was that you could configure a package to move data between data sources and destinations in minutes.

When you first start using SSIS this is far from true. I tended to get a lot of errors (many of them not very helpful) relating to stored procedure parameters, truncation of data, etc. Looking around this forum I think that I am not alone. This is partly a learning curve issue, but I think that the learning curve is much steeper for SSIS.
This is not a plea to go back to DTS, we must move on, but I would like to think that there will be an emphasis on improving ease of use for SSIS going forward.

For the record, two general improvements that I can see in SSIS (over DTS) are:

1. A more logical structure.
2. The ability to easily set any Package Property, not just Variables at run time.


Reporting Services :: All Record Are Displaying On One Page - How To Display Page By Page

Nov 11, 2015

I have created a report but all the records are displaying on one page, and I need to find a solution to display the records page by page. I created the same report without the group and the records do display page by page.


Two Strings To Be Compared Have Different Collation.

May 10, 2006

I've written a stored proc which passes in an SqlString parameter and compares it with an SqlString read from an SqlDataReader.

I get the following exception:

System.Data.SqlTypes.SqlTypeException: Two strings to be compared have different collation.

Any ideas on collation inside CLR stored procs?
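(One workaround sometimes suggested is to make the collation explicit on the T-SQL side, so both strings arrive in the CLR code with the same collation; a sketch, with the table, column, and collation names being illustrative only.)

-- Sketch: force one collation on the column the SqlDataReader returns so it
-- matches the collation of the SqlString parameter it is compared against.
SELECT CustomerName COLLATE SQL_Latin1_General_CP1_CI_AS AS CustomerName
FROM dbo.Customers   -- hypothetical table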

Thanks.


WHILE Loop Speed Compared To Cursors?

Jul 23, 2005

Working on some new code, I'm coming across WHILE loops used instead of cursors. I was curious if anyone had any stats on how the speed of doing this compares to the speed of a cursor. I typically avoid cursors for performance sake, but I'm not sure how this avoids the speed hit of a cursor, since it's doing essentially the same thing. Many thanks.
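For reference, the WHILE-loop pattern in question usually looks something like the sketch below (table and key names are invented); it is still row-by-row processing, so the general expectation is that it performs in the same ballpark as a cursor rather than like a set-based statement.

-- Sketch: iterate a table one row at a time with a WHILE loop instead of a
-- cursor. dbo.Items / ItemID are hypothetical names.
DECLARE @ItemID INT

SELECT @ItemID = MIN(ItemID) FROM dbo.Items

WHILE @ItemID IS NOT NULL
BEGIN
    -- ... per-row work goes here ...
    PRINT @ItemID

    SELECT @ItemID = MIN(ItemID) FROM dbo.Items WHERE ItemID > @ItemID
END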


Data Access Very Slow In .net As Compared To VB

Jan 9, 2006

 

Hi,

I have migrated my app from VB to VB.Net. A 3-tier app with remoting and COM+.

I am experiencing wait times about 3 times longer than in the VB app.

I am using the DataAdapter.Fill method to fill the DataTable.

I have tried:

Using a DataReader (made things worse)

Using BeginLoadData and EndLoadData

Creating a DataSet and calling Fill with the DataSet so that the round trip to the middle tier to fetch the SQL is saved.

But I now feel that, whatever is done, the problem is with the Fill method itself.

Is there any alternative?

Please suggest. It is one of the most important things; if it cannot be resolved it may lead to scrapping the idea of upgrading to .NET.

Shri


Log Sent Rate Is Low As Compared To Log Bytes Flushed/sec

Jun 12, 2007

Hi,



We have asynchronous database mirroring on SQL Server 2005 SP2 Enterprise Edition / Windows 2000 Advanced Server. We noticed that the log send rate is quite low (average 1.3 MB/sec) in most cases, whereas "Log Bytes Flushed/sec" is high (1.4 MB/sec); as a result the log send queue keeps increasing and finally takes all the transaction log space. Our disk queue length is always in the range of 0.01, and the principal and mirror servers are on the same local LAN.



I tried on a low-end server and a high-end server, and in both cases the log send rate is approx 1.3 MB/sec (maximum 4 MB/sec).



Is there any limitation on the log send rate?

How can we improve the log send rate? Since both servers are on the local LAN, network bandwidth does not seem to be an issue.
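To put numbers on the send rate and the queue from inside SQL Server, the mirroring perfmon counters can be sampled with a query like this sketch (the per-second counters are cumulative, so two samples are needed for a rate):

-- Sketch: snapshot the SQLServer:Database Mirroring counters per database.
SELECT instance_name AS database_name,
       counter_name,
       cntr_value
FROM sys.dm_os_performance_counters
WHERE [object_name] LIKE '%Database Mirroring%'
  AND counter_name IN ('Log Bytes Sent/sec', 'Log Send Queue KB',
                       'Log Bytes Received/sec')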



Any help is greatly appreciated.



Thanks,

Ramesh


VERY Slow Generate Scripts On SQL 2005 Compared To

Jun 10, 2008

I was using SQL 2000; the database contains 500+ tables and 3000+ stored procedures.
I moved to SQL 2005 and found a problem generating scripts (right-click the database -> Tasks -> Generate Scripts).
I need to generate the table relations... it is very, very slow compared to SQL 2000, which was done in about 30 seconds to a few minutes.
I have already tried many ways, including setting options to false, which I thought would speed it up a lot, but it is still very slow.

Average generate script time with SQL 2005 (SP2): 70-90 minutes.
Average generate script time with SQL 2000 (SP4): 2-3 minutes.

Can anyone tell me why? Thanks in advance.


MSSQL ODBC Difference Compared To MySQL

Mar 6, 2007

The source for this problem can be found at http://www.wellytop.com/SQLProblem.zip

This test creates two threads each with a database connection and uses transactions to insert values into the same table.
The objective of this test is to check that a thread cannot read the results from a pending transaction on a different thread.
In effect this checks dirty reads do not happen and transaction locking.

The test runs correctly and displays "PASSED" with MySQL indicating the transaction and threading worked.
When running with MSSQL Express 2005 it reports a deadlock error during a transaction.
It's not really possible to re-run the transaction, and I would like MS SQL to operate similarly to MySQL, i.e. MySQL waits for the other transaction to finish before the next transaction can operate on those table rows. I'd like to use MSSQL, but I am wondering why this error doesn't happen with MySQL and so have, for the moment, chosen MySQL as my preferred database solution.
I have experimented with transaction isolation levels and this doesn't seem to solve the problem.

I've tested this with a fresh install of Windows XP SP2 and no firewall turned on.

To run this test with MSSQL Express 2005, use the ODBC Data Source Administrator (odbcad32.exe) to create a data source named MyExpressTest and attach it to an empty database that has been created with the default values. Enable the #define MSSQL in the code; otherwise it tests with MySQL.


To run this test with MySQL (to show how this test should work) use the ODBC Data Source Administrator (odbcad32.exe) to create a data source named mySQLNewTest and attach this to an empty database that has been created with the default values. Comment out the #define MSSQL to switch to MySQL mode.


VERY Slow Generate Scripts On SQL 2005 Compared To SQL 2000

Aug 1, 2007

I was using SQL 2000; the database contains 500+ tables and 3000+ stored procedures.
I moved to SQL 2005 and found a problem generating scripts (right-click the database -> Tasks -> Generate Scripts).
I need to generate the table relations... it is very, very slow compared to SQL 2000, which was done in about 30 seconds to a few minutes.
I have already tried many ways, including setting options to false, which I thought would speed it up a lot, but it is still very slow.

Average generate script time with SQL 2005 (SP2): 70-90 minutes.
Average generate script time with SQL 2000 (SP4): 2-3 minutes.

Can anyone tell me why? Thanks in advance.


Transact SQL :: Text Data Type Cannot Be Compared With Distinct

Oct 9, 2015

The field is not listed as text in any of the databases; it is a varchar(255), and that can be changed if that is what causes the issue.

But here is my syntax, which produces the error Msg 421, Level 16, State 1, Procedure, Line 2:

The text data type cannot be selected as DISTINCT because it is not comparable.

DECLARE @c NVARCHAR(MAX)
WITH c1 AS (
SELECT [abcd] AS table_name
FROM [intranet].[dbo].[soccerfieldinfo]
where [abcd] IS NOT NULL
), c2 AS (
SELECT Row_Number() OVER (ORDER BY table_name) AS r

[Code] ....
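If the column really is text or ntext somewhere along the way (Msg 421 only fires for those types), one common workaround is to cast it before applying DISTINCT; a sketch against the table named above, simplified to the first CTE:

-- Sketch: casting to varchar(max) makes the column comparable, so DISTINCT
-- (and the ROW_NUMBER ordering) can be applied to it.
SELECT DISTINCT CAST([abcd] AS VARCHAR(MAX)) AS table_name
FROM [intranet].[dbo].[soccerfieldinfo]
WHERE [abcd] IS NOT NULL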


Stored Procedures On SQL 2000 Compared To Dotnet's Datatable.select

Jul 20, 2005

Hi. Has anyone ever compared the performance of calling a DataTable's Select method with a stored procedure doing the same thing?

My point is: is dataRows = DataTable.Select(filter) better, or passing parameters to a stored procedure? The DataTable holds about 500-700 rows at any given time. Whichever approach I select, the business logic will go into the respective layers.

With .NET in the picture, what would be a good approach - have the data in a DataTable and filter the data, or call stored procedures, which has been the convention? Can someone please suggest?


Adding Logins Via SSEUtil Compared To SQL Server Management Studio

Feb 7, 2007

Hello all,

I am currently in the process of setting up a SQL Server Express installation that comes packaged with an application I have written. My problem is that I want to use SQL Server user management (not just Windows users), which works fine if I set the users up manually. I started writing a script that I have SSEUtil execute once the application is fully installed (a step in my installation script), which sets up the users, passwords, etc. The script is similar to the following:

USE [DBName]
GO

EXEC sp_DropUser 'user1'
EXEC sp_DropUser 'user2'
EXEC sp_DropUser 'user3'
EXEC sp_DropUser 'user4'
GO

USE [master]
GO

EXEC sp_DropLogin 'user1'
EXEC sp_DropLogin 'user2'
EXEC sp_DropLogin 'user3'
EXEC sp_DropLogin 'user4'
GO

CREATE LOGIN user1 WITH Password = 'user1', CHECK_EXPIRATION=OFF, CHECK_POLICY=OFF
CREATE LOGIN user2 WITH Password = 'user2', CHECK_EXPIRATION=OFF, CHECK_POLICY=OFF
CREATE LOGIN user3 WITH Password = 'user3', CHECK_EXPIRATION=OFF, CHECK_POLICY=OFF
CREATE LOGIN user4 WITH Password = 'user4', CHECK_EXPIRATION=OFF, CHECK_POLICY=OFF
GO

USE [DBName]
GO

EXEC sp_AddUser 'user1'
EXEC sp_AddUser 'user2'
EXEC sp_AddUser 'user3'
EXEC sp_AddUser 'user4'
GO

ALTER USER user1 WITH DEFAULT_SCHEMA = MySchema
ALTER USER user2 WITH DEFAULT_SCHEMA = MySchema
ALTER USER user3 WITH DEFAULT_SCHEMA = MySchema
ALTER USER user4 WITH DEFAULT_SCHEMA = MySchema
GO

REVOKE ALTER,DELETE,INSERT,SELECT,UPDATE ON MySchema.Table1 FROM MyRole
REVOKE ALTER,DELETE,INSERT,SELECT,UPDATE ON MySchema.Table2 FROM MyRole
REVOKE ALTER,DELETE,INSERT,SELECT,UPDATE ON MySchema.Table3 FROM MyRole
REVOKE ALTER,DELETE,INSERT,SELECT,UPDATE ON MySchema.Table4 FROM MyRole
GO

EXEC sp_DropRole 'MyRole'
EXEC sp_AddRole 'MyRole'
GO

GRANT ALTER,DELETE,INSERT,SELECT,UPDATE ON MySchema.Table1 TO MyRole
GRANT ALTER,DELETE,INSERT,SELECT,UPDATE ON MySchema.Table2 TO MyRole
GRANT ALTER,DELETE,INSERT,SELECT,UPDATE ON MySchema.Table3 TO MyRole
GRANT ALTER,DELETE,INSERT,SELECT,UPDATE ON MySchema.Table4 TO MyRole
GO

EXEC sp_AddRoleMember 'MyRole','user1'
EXEC sp_AddRoleMember 'MyRole','user2'
EXEC sp_AddRoleMember 'MyRole','user3'
EXEC sp_AddRoleMember 'MyRole','user4'
GO

Now if I run this script from within SQL Server Management Studio it executes perfectly. The logins are added, the role is added, each user is added to the database and assigned to the role, and the schema is set correctly on each user.

Then when I try to run the exact same script from the SSEUtil application (SSEUTIL -s PCNAME\Instance -run USERS.SQL), it processes everything except the logins.

This is frustrating, as it means that to install for a client I would need to either get them to open Management Studio and run the script from there, or go to site just to set up users.

Am I on the right track? Or is there another way to automate the adding of Logins?

Thanks in advance,

DSXC


Data Transfer From Lotus Notes Very Slow Compared To SQL 2000 DTS

Sep 7, 2006

To extract data from an ODBC source, try the following:

Add an ADO.Net Connection Manager.
Edit the Connection Manager editor and select the ODBC Data Provider
Configure the Connection Manager to use your DSN or connection string
Add a Data Flow Task to your package.
Add a Data Reader Source adapter to your data flow
Edit the Data Reader source adapter to use the ADO.Net connection manager that you added.
Edit the Data Reader source to query for the data you wish to extract.

hth

Donald

Using the steps outlined above as described by Donald Farmer in another post on this forum, I have created an SSIS package which retrieves data from Lotus Notes 6.55. The DSN referenced by the ADO.Net Connection Manager connects to Lotus Notes via the NotesSQL ODBC driver 3.02g.

When I execute the dataflow, data is transferred from Lotus Notes, but the data transfer rate is extremely slow compared to SQL 2000 DTS. In SQL 2000 DTS, we can retrieve just under half a million records from Lotus Notes in about 13 minutes. Utilizing the same DSN on the same machine, SQL 2005 SSIS completes the transfer in about 57 minutes.

Is there anything that can be done to improve the performance in SSIS to retrieve data from Lotus Notes via ADO.Net ODBC?

Thanks!


Power Pivot :: Find Percent Of One Of Last Occurrence Compared To All Last Occurrences (DAX)

Jun 3, 2015

I have a fact REVENUE table of accounts; each account can have multiple instances. And I have an aggregate that summarizes the latest occurrence of revenue for each account (in a chosen period):

AccountLastRevenue:=SUMX(
    VALUES('scd_FactAccountRevenue'[Account]),
    CALCULATE(
        SUM('scd_FactAccountRevenue'[Revenue]),
        LASTDATE('scd_FactAccountRevenue'[Revenue_Date])
    )
)

How can I find the percentage of one of the latest accounts compared to all accounts? Assuming I have connected Dim_Time (Y-Q-M-D) to Revenue_Date, how can I find the percentage of one month compared to all the months in the quarter (and so on up the hierarchy)?

scd_FactAccountRevenue:

powerpivot :


ADO.NET Returns Different Column Value When Compared To View Results In SQL 2005 Management Studio

Feb 7, 2007

I have a complex view in my sql 2005 database.
The view returns a column that could be null (as the result of a left outer join).
The column that is returned is an integer.
Everything works fine if I run the view from SQL 2005 Management Studio.
My column value is always null if I use ADO.NET's SqlAdapter to return a DataTable.
Has anybody seen this behaviour before?
Any help appreciated.
Regards,
Paul.


Error Description Differs When Logged With Redirect Rows Compared With Debug Mode

Jan 18, 2007

Hi,

Can anyone please tell me how to get the complete error description? For example, when I don't redirect rows for errors in the OLE DB Source, I get a detailed error message with the column name, such as:

[RCheck [385]] Error: There was an error with input column "CHECK_STATUS" (456) on input "OLE DB Destination Input" (398). The column status returned was: "The value could not be converted because of a potential loss of data.".


But when I set Redirect Row for errors and use the Script Component to log them into a table with ErrorDescription based on ErrorColumnID, it only gives me this:

The data value cannot be converted for reasons other than sign mismatch or data overflow.



Thanks

Sat


I Need To Pass Data Entered /created On The First Page To The Next Page And Populate The Next Page With Some Data From The Fir

Nov 28, 2006

Hello, I have a project that uses a large number of MS data access pages created in Access 2003 and runs on MS SQL 2005.

When I am on, let's say, my client data access page (the first page in a series) and I have completed the fields in the DAP, I direct my users to the next step of the registration process by means of a hyperlink to another data access page in the same web, but in a linked or sometimes different table.


I need to pass data entered/created on the first page to the next page and populate the next page with some data from the first page/table (like staying on the client name and ID when I go to the next page).


I also need the first data access page to open and display a blank or new record, not an existing record. I will also be looking to create a drop-down box as a record selector.


Any pointers in the right direction would be appreciated.
I am somewhat new to data access pages, so a walkthrough would be nice, but anything you've got is welcome. Thanks, Peter…


Can A Calc'd Query Column Be Compared Against A Multi Value Variable Without A Nested Query?

Nov 15, 2007

Do I need to nest a query in RS if I want a calculated column to be compared against a multi-value variable? It looks like coding WHERE calcd name IN (@variable) violates SQL syntax. My SELECT looked like:

SELECT ... ,CASE enddate WHEN null then 1 else 0 END calcd name
FROM...
WHERE ... and calcd name in (@variable)
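As far as I know the column alias cannot be referenced in the same query's WHERE clause, but wrapping the calculation in a derived table (or a CTE) avoids a separate nested dataset. Also note that CASE enddate WHEN NULL never matches, since NULL is not equal to anything; CASE WHEN enddate IS NULL is the form that works. A sketch with made-up table and column names:

-- Sketch: expose the calculated column through a derived table so the
-- multi-value report parameter can filter on it. Names are hypothetical.
SELECT t.OrderID, t.calcd_name
FROM (
    SELECT OrderID,
           CASE WHEN EndDate IS NULL THEN 1 ELSE 0 END AS calcd_name
    FROM dbo.Orders
) AS t
WHERE t.calcd_name IN (@variable)   -- @variable supplied by Reporting Services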


T-SQL (SS2K8) :: Varchar Datatype Field Will Ignore Leading Zeros When Compared With Numeric Datatype?

Jan 28, 2015

I need to know if a varchar datatype field will ignore leading zeros when compared with a numeric datatype.

create table #temp
(
code varchar(4) null,
id int not null
)
insert into #temp

[Code] .....


Expression After Table Object Forced To Display On The Last Page Even Though There Is Still Space At The Bottom Of The First Page.

Oct 16, 2006

The following objects are placed on the report body of the Report pane of SQL Server 2005 Reporting Services:

<textbox: expression1>
<textbox: expression2>

<table:table1 with at least 30 columns and 30 expressions>

<textbox: string1> - considered as the Title in the Footer section of the report

<texbox:string2> <textbox:expression3>
<textbox:string3> <textbox:expression4>
<textbox:string4> <textbox:expression5>
<textbox:expression6>

I can't find any explanation for why string1 and string2 of the footer section of my report are displayed separately from expression3, which is aligned with them, and from the rest of the objects, which end up on the second page.

The expected design is that all footer items should be displayed together, regardless of whether they are placed on the first page or on the last page.

As a workaround, I converted string1 into an expression (added = and enclosed the string in double quotes). As a result, all of the items in the footer section are now placed together on the last page of the report.

I also remember an issue I encountered before where the footer items were placed together on the first page, which still had space at the bottom, but expression6 was forced to display (alone) on the last page of my report.

I can't find any discussion related to this; I wish somebody could give me an idea why RS behaves like this.

Thanks in advance


Fit An Entire Table On The Same Page Without Page Breaks - Excel Export In A Single Sheet

Feb 14, 2008

Fit an entire table on the same page without page breaks, so the Excel export stays in a single sheet.

My table has a group to order my dates.

I need to have the entire table on the same page; I don't care about blank space at the end of the page.

I can't use a page break because I need the Excel export in a single sheet.
I have tested every kind of page break - each one gives you a different sheet in the Excel export.

I need something like this

page 1

Zone
1
2
3
4
5
6
7
7
8
9
..

page 2

Zone
1
2
3
4
5
6
7
7
8
9
..

but a single sheet in the Excel export.







