ReportViewer 2008 / 9.0 PDF Export/Render Compression Question
Apr 15, 2008
I print 4x1 inch labels, anywhere from a few to a few thousand. It's all done from an ASP.NET page with a 9.0 ReportViewer in local processing mode.
108 labels = 37 MB or so... This is totally unacceptable for something that, when zipped, becomes 1.3 MB.
I thought version 9.0 was supposed to have PDF compression... Is there some kind of setting I'm not seeing any documentation for somewhere? How do I enable it?
This better not just be a server processing mode feature like the print button again... I've already taken plenty of heat from the bosses just for that alone. Try explaining the difference between a local report and a server report to your boss, and then try to come up with a real-world explanation as to why he can print some reports in some programs but is forced to export and print them from Adobe in other programs... please don't let me die, have mercy.
I want to export files such as .csv, .xml, and Word from the ReportViewer in Visual Studio 2008, but when I run it as an ASP.NET application on .NET Framework 2.0 it can export only Excel and PDF files. In a Business Intelligence project in Visual Studio 2005 I can export all the formats I want. How can I export these formats from the ReportViewer in a Visual Studio 2008 web application?
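As far as I know, the 9.0 ReportViewer in local processing mode only ships with the Excel and PDF rendering extensions, so the missing formats are a limitation of local mode rather than a setting. A small diagnostic sketch (ReportViewer1 is assumed to be the control on the page) that lists what the control actually offers:

foreach (Microsoft.Reporting.WebForms.RenderingExtension ext in
         ReportViewer1.LocalReport.ListRenderingExtensions())
{
    // writes something like "Excel (visible: True)" for each extension local mode offers
    Response.Write(ext.Name + " (visible: " + ext.Visible + ")<br />");
}

To get CSV, XML or Word you would normally publish the report to a report server and run the viewer in remote mode, where the server's full set of rendering extensions is available.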
I'm trying to take the output of the Render method and bind it to the ReportViewer control. First of all, is this possible? If so, what is the best way to render the report (format: XML, CSV, NULL, etc.), and then how do I 'bind' the result to the control?
I'm providing some code to demonstrate how I'm rendering the report:

...
string format = "XML";
string devInfo = @"<DeviceInfo><Toolbar>False</Toolbar></DeviceInfo>";
ReportExecutionService rs = new ReportExecutionService();
ExecutionInfo execInfo = new ExecutionInfo();
ExecutionHeader execHeader = new ExecutionHeader();
byte[] result = null;
string extension, encoding, mimeType;   // out parameters for Render
Warning[] warnings;
string[] streamIDs;

rs.ExecutionHeaderValue = execHeader;
execInfo = rs.LoadReport(path, null);   // path = the report's path on the server
string SessionId = rs.ExecutionHeaderValue.ExecutionID;
result = rs.Render(format, devInfo, out extension, out encoding, out mimeType, out warnings, out streamIDs);
Thanks, any help would be greatly appreciated. Peter
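For what it's worth, the ReportViewer control cannot be handed the byte array that Render returns; it renders a report itself from a report path (in remote mode, via ServerReport.ReportServerUrl and ServerReport.ReportPath). If the goal is simply to deliver the rendered output, a minimal sketch, assuming this code runs in an ASP.NET page and reusing the variables above, is to stream the bytes to the browser instead:

Response.Clear();
Response.ContentType = mimeType;                    // as returned by Render above
Response.AddHeader("Content-Disposition", "attachment; filename=report." + extension);
Response.BinaryWrite(result);                       // the rendered bytes
Response.End();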
Our clients working with the Firefox browser on a Mac are unable to use the Multi-Value parameter drop down lists that the Report Viewer control generates. Of course I realize that the multi-select dropdown lists are not really dropdown option lists using the standard HTML select tag, but are rather tables within div tags with cells that contain spans, inputs, and labels.
Originally the report viewer displayed these lists in the wrong position within Firefox on any platform (Mac or PC). Furthermore, there were other visibility problems with those lists that made it virtually impossible to select a checkbox within the list. Fortunately, Microsoft fixed this problem with the latest version of Report Viewer, which we downloaded from the following link: http://207.46.19.190/downloads/details.aspx?FamilyID=cc96c246-61e5-4d9e-bb5f-416d75a1b9ef&displaylang=en
So currently we have SQL Reporting Services Report Viewer 9.0.21022.8 installed on our web server. And the dropdown lists do appear as expected, and they work properly in Firefox on a PC.
But, when the control is rendered in Firefox on a Mac, the list is not scrollable. The scroll bar that should appear on the right-hand side of the dropdown list, which would enable users to select values toward the bottom of the list, does not appear. That scrollbar is missing in Firefox.
This is likely related to a Firefox rendering issue with the overflow:auto style. There are numerous entries on the web that indicate Firefox for Mac has a problem with overflow:auto. For example:
http://www.webdeveloper.com/forum/archive/index.php/t-96958.html
http://www.daniweb.com/forums/thread44144.html
http://iamthewalr.us/blog/2007/04/20/firefox-on-the-mac/#comment-2321
http://www.errorforum.com/mozilla-firefox-error/3503-will-float-mac-firefox-scrollbars-floating-pop-up-windows.html
https://bugzilla.mozilla.org/show_bug.cgi?query_format=specific&order=relevance+desc&bug_status=__open__&id=187435
That being the case, it seems that there should be some workaround to address this, either via a style or through some alternate control. Or perhaps there is a property that we can apply to the ReportViewer control that I'm unaware of which addresses this.
If you know of a workaround, or can suggest an alternate approach that we could implement quickly, please respond. Thanks.
I found this method that will remove export options from a server report when called through the ReportViewer. I like this one because you don't need a web reference to the SQL server to loop through each rendering extension. How do you call this method if, for example, I wanted to remove the XML format from a particular report? I have tried ReflectivelySetVisibilityFalse("XML"), but no luck. Any clues?
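In case it helps, the reflection-based versions of that method that circulate on the forums generally look something like the sketch below. It has to run after ServerReport is pointed at the report (so ListRenderingExtensions() returns the server's list), and the name passed in must match what ListRenderingExtensions() reports ("XML" for the XML renderer). The private field name is version-dependent, so treat "m_isVisible" as an assumption:

using System;
using System.Reflection;
using Microsoft.Reporting.WebForms;   // or Microsoft.Reporting.WinForms for a desktop app

private void ReflectivelySetVisibilityFalse(ReportViewer viewer, string extensionName)
{
    foreach (RenderingExtension extension in viewer.ServerReport.ListRenderingExtensions())
    {
        if (string.Equals(extension.Name, extensionName, StringComparison.OrdinalIgnoreCase))
        {
            // Visible is read-only, so flip its private backing field (name is an assumption)
            FieldInfo field = extension.GetType().GetField("m_isVisible",
                BindingFlags.NonPublic | BindingFlags.Instance);
            if (field != null)
            {
                field.SetValue(extension, false);
            }
        }
    }
}

// e.g. ReflectivelySetVisibilityFalse(ReportViewer1, "XML");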
I have a VS2005 C# WinForms application that reads an SSRS request from a table and, using the ReportViewer control, produces a report and then exports it in one of a number of formats via a specified path to a share on another server. This normally works without issue; however, today I have had three instances of invalid or blank PDFs being produced. A sample error from the Acrobat viewer is "There was an error processing the page. There was a problem reading this document (109)."
The software versions are as follows:
Host server: Windows Server 2003 with SP1.
SQL 2005 with SP1.
Acrobat Reader: Version 7.0
After deleting the PDF file and resetting the processed flag to unprocessed, the report was run again, and this time a perfectly readable PDF file was produced. As neither the source data nor the report definition file was updated during this time period, why it works at one time but not another is currently inexplicable. I have run the report manually with the same input parameters using Internet Explorer and exported it successfully to another location.
Any ideas as to what is going on?
A fix to the WinForms application would be to delete any existing file before exporting a new one.
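A minimal sketch of that delete-before-export fix; reportViewer and outputPath stand in for the application's own objects, and ServerReport is an assumption (LocalReport exposes the same Render signature):

string mimeType, encoding, extension;
string[] streamIds;
Microsoft.Reporting.WinForms.Warning[] warnings;

byte[] bytes = reportViewer.ServerReport.Render(
    "PDF", null, out mimeType, out encoding, out extension, out streamIds, out warnings);

if (System.IO.File.Exists(outputPath))
{
    System.IO.File.Delete(outputPath);            // remove any stale or partially written file
}
System.IO.File.WriteAllBytes(outputPath, bytes);  // write the freshly rendered PDF in one pass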
As I recall, this was a known issue that was meant to be resolved with a future release of IE7. Is there any update on this one, or a known workaround? Thanks
I have installed the Microsoft Report Viewer 2008 Redistributable Package and added the ReportViewer component to my toolbox.
While viewing the ReportViewer in my solution, I was getting the error given below.
Note: InnerException = {"The report definition is not valid. Details: The report definition has an invalid target namespace 'http://schemas.microsoft.com/sqlserver/reporting/2007/01/reportdefinition' which cannot be upgraded."}
I have an application built using VS 2008 (3.5 framework). I built a series of reports (using VS 2005, sigh) and am now trying to render them using a Windows Form and the ReportViewer control in the VS 2008 app. However, I always get the error message "The source of the report definition has not been specified" when I execute the line
m_reportViewer.RefreshReport();
When I try to view the very same report in a VS 2005 app it renders without error. I can also view the report using Report Manager and preview it in the VS 2005 app I use to build it. Any suggestions?
Thx
Helen -- Helen Warn, PhD Agile Software Inc. www.agile-soft.com
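"The source of the report definition has not been specified" generally just means the control was refreshed before being told which report to run. A minimal sketch for a server-side report (the server URL and report path below are placeholders); for a local .rdlc you would set m_reportViewer.LocalReport.ReportPath and its data sources instead:

m_reportViewer.ProcessingMode = Microsoft.Reporting.WinForms.ProcessingMode.Remote;
m_reportViewer.ServerReport.ReportServerUrl = new Uri("http://myserver/reportserver");
m_reportViewer.ServerReport.ReportPath = "/MyFolder/MyReport";
m_reportViewer.RefreshReport();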
I have a request to export some table data to Excel, and the "notes" column (varchar(255)) contains multiple lines separated by CR/LF. When I export to Excel, the first record with a CR/LF messes up the column alignment, throwing off the format from that point on. How can I export to Excel so that it preserves these CR/LFs? Or, if that isn't possible, how can I remove these characters so that Excel can handle it?
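If the query feeding the export can be changed, one hedged option is to strip the embedded line breaks there, so Excel only ever sees single-line values; table and column names below are placeholders:

SELECT  col1,
        col2,
        -- drop CR and turn LF into a space so each value stays on one line
        REPLACE(REPLACE(notes, CHAR(13), ''), CHAR(10), ' ') AS notes
FROM    dbo.MyTable;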
And I always know the root ID from the first record in the dummy "table" (generated with a common table expression), in this case ID 1, but from here, how do I process this for any level of depth?
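Assuming a standard parent/child layout (the table and column names below are placeholders), the usual way to walk the hierarchy to any depth from a known root is a recursive common table expression:

;WITH tree AS
(
    SELECT  t.ID, t.ParentID, 0 AS Depth
    FROM    dbo.MyTable AS t
    WHERE   t.ID = 1                          -- the known root ID

    UNION ALL

    SELECT  c.ID, c.ParentID, tree.Depth + 1  -- follow children, one level per recursion
    FROM    dbo.MyTable AS c
    INNER JOIN tree ON c.ParentID = tree.ID
)
SELECT ID, ParentID, Depth
FROM   tree
OPTION (MAXRECURSION 0);                      -- allow arbitrary depth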
I am using the statement below to export a table from SQL Server 2008 to Excel 2010:

INSERT INTO OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'Excel 8.0;Database=C:\ExportXLS.xlsx;',
    'SELECT * FROM [employees$]')
SELECT name, id, [group], agency FROM dbo.employees;

I am getting the error below:
OLE DB provider 'Microsoft.Jet.OLEDB.4.0' cannot be used for distributed queries because the provider is configured to run in single-threaded apartment mode.
The changes below have also been made:

sp_configure 'show advanced options', 1;
GO
RECONFIGURE;
GO
sp_configure 'Ad Hoc Distributed Queries', 1;
GO
RECONFIGURE;
GO
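A commonly suggested workaround, offered here only as a hedged sketch: the Jet 4.0 provider is the one raising the STA error and in any case cannot write .xlsx files, so use the ACE 12.0 provider with the 'Excel 12.0 Xml' connection string instead. This assumes the Access Database Engine (ACE) redistributable is installed on the SQL Server machine, the provider is allowed for ad hoc distributed queries, and C:\ExportXLS.xlsx already exists with an "employees" sheet whose header row matches the columns:

INSERT INTO OPENROWSET('Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0 Xml;Database=C:\ExportXLS.xlsx;',
    'SELECT * FROM [employees$]')
SELECT name, id, [group], agency
FROM   dbo.employees;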
Is there any way to bulk export / import TDE certs? I've got a bunch of databases that need to be moved to another system. Just about every database is using TDE, and I was wondering if there was a way to move these certs in a bulk fashion. I've got SQL and PowerShell scripts to back up and restore multiple databases, but they won't do me any good without the certs.
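One hedged approach for the export half: generate a BACKUP CERTIFICATE statement for every certificate in master that is currently protecting a database encryption key, then run the generated statements in master and copy the .cer/.pvk files along with the backups. Paths and the password are placeholders; review the output before executing it:

SELECT DISTINCT
       'BACKUP CERTIFICATE ' + QUOTENAME(c.name)
     + ' TO FILE = ''C:\CertBackup\' + c.name + '.cer'''
     + ' WITH PRIVATE KEY (FILE = ''C:\CertBackup\' + c.name + '.pvk'','
     + ' ENCRYPTION BY PASSWORD = ''<StrongPasswordHere>'');'
FROM   master.sys.certificates AS c
       INNER JOIN sys.dm_database_encryption_keys AS dek
               ON dek.encryptor_thumbprint = c.thumbprint;

On the destination instance, the matching CREATE CERTIFICATE ... FROM FILE ... WITH PRIVATE KEY statements restore the certificates before the databases are restored.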
One of my current responsibilities is to export data to 3rd-party vendors. Each export can contain many CSV files. The exports are all different in terms of what data is being sent.
The way I currently have it set up is that each file that needs to be created is a view. An SSIS package gets the data from the view, writes it to CSV, and then SFTPs it to the 3rd-party vendor. This seemed like a good idea at first because the columns are static but the calculations might change, so all I have to do is ALTER VIEW and I don't have to change anything in the package.
Is there a better way of doing this? I was curious to see what other people are doing. What makes it challenging is that all the exports are so different. If they were similar, I could have created generic views that cover all the exports instead of each export having its own view. Eventually I'm going to have hundreds of views.
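For illustration, the pattern described above looks something like this (names are made up); the appeal is that a calculation change is just an ALTER VIEW, and the SSIS package that reads the view is untouched:

CREATE VIEW dbo.Export_VendorA_Orders
AS
SELECT  o.OrderID,
        o.OrderDate,
        o.Quantity * o.UnitPrice AS ExtendedPrice   -- calculation can change without touching SSIS
FROM    dbo.Orders AS o;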
I have two tables: the first has only one field, a numerator; the second is (for example) sales information.
I need to export (using an SP) the sales information with a unique numerator per line. So I need to get the value from the first table (the numerator), increment the numerator by 1 for every line, and then update the first table, as I need to keep it current for the next run.
On the next run I need to get the updated numerator from the first table again, and so on...
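A hedged sketch of one way to do this inside the export procedure: reserve a block of numbers from the numerator table with a single UPDATE (which both reads and advances it atomically), then hand the numbers out with ROW_NUMBER(). Table and column names are placeholders:

DECLARE @first bigint, @rows bigint;

SELECT @rows = COUNT(*) FROM dbo.Sales;

UPDATE  dbo.Numerator                        -- assumed single-row counter table
SET     @first = CurrentValue + 1,           -- first number of the reserved block
        CurrentValue = CurrentValue + @rows; -- leaves the table ready for the next run

SELECT  @first - 1 + ROW_NUMBER() OVER (ORDER BY s.SaleID) AS LineNumber,
        s.*
FROM    dbo.Sales AS s;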
I am not able to export more than 10 lakh (1 million) records to an Excel sheet (2007 .xlsx). I get a success message, but not all of the data gets copied to the sheet. I have tried the Import and Export Wizard in SQL Server and also copying and pasting directly into Excel.
I have a table which has a few columns with numeric values. I need to export the output to a CSV file as a report with column headers. I have used the bcp command; here the column name and the column header name are the same. The bcp query I have used is below
We have recently upgraded to SP1 of SSRS 2008. As a result, when we export a blank report to CSV, we now get a line of commas below the headings. Has anyone found a way to not include the commas?

New SSRS output:
Portfolio_Reference,Portfolio_Name,R,TR,TD,TC,D,

Old SSRS output:
We're migrating from SSRS 2005 to SSRS 2008. We use iTextSharp (a PDF processing library) to merge different PDF outputs into one large PDF file. This was working brilliantly in SSRS 2005, but with SSRS 2008, iTextSharp *can* open and parse the PDF, yet it sees the content section of the PDF pages as blank, so the resulting composite PDF ends up being a series of blank pages. Acrobat 9 and a couple of other PDF tools (I tested Linux Evince and Windows GIMP) seem to open the SSRS 2008-generated PDF fine, but the Java version of iText and the .NET version (iTextSharp) see nothing but blanks. So, I realize this points to a problem with iText, but I'm curious whether others are merging PDFs output from SSRS 2008, and what PDF tools/libraries they are using.
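For reference, the usual iTextSharp page-copy merge looks like the sketch below (4.x-era API; file names are placeholders, and this is not necessarily how the poster's merge is written):

using System.IO;
using iTextSharp.text;
using iTextSharp.text.pdf;

static void MergePdfs(string[] inputFiles, string outputFile)
{
    using (FileStream output = new FileStream(outputFile, FileMode.Create))
    {
        Document document = new Document();
        PdfCopy copy = new PdfCopy(document, output);
        document.Open();
        foreach (string file in inputFiles)
        {
            PdfReader reader = new PdfReader(file);
            for (int page = 1; page <= reader.NumberOfPages; page++)
            {
                // GetImportedPage reads each page's content stream; this is the step that
                // reportedly comes back empty for SSRS 2008 output with older iText builds
                copy.AddPage(copy.GetImportedPage(reader, page));
            }
            reader.Close();
        }
        document.Close();
    }
}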
I have a request from the customer to export a report to Word as a read-only document. I am restricted to out-of-the-box SSRS 2008, so I cannot write any custom export functionality.
The customer has the ability to export as PDF, and I suggested that, but before I tell them there's no easy solution, I wanted to get input from other Reporting Services folks.
Not seeing the Review Data Type Mapping Screen in SQL Server Import and Export Wizard?
Is there only a certain version where that screen shows up?
I am trying to import data from an MS Access application to SQL Server. All of the connections are good, but some of the data isn't, and if I let it migrate using this tool it crashes on the bad data and no data migrates at all. The Review Data Type Mapping screen would allow me to bypass the records in error and load the rest; however, I can't do that if I cannot see the screen.
I have been wanting to compress my database, but I am not really sure how this is done. I was looking in Enterprise Manager, and if you right-click on the database and go to All Tasks, there is an option to shrink the database. Is this the way you would compress your database, or are there other ways of doing this?
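For what it's worth, the Enterprise Manager "Shrink Database" task corresponds to the command below; it releases unused space in the files back to the operating system rather than compressing the stored data itself. The database name and the 10 percent target free space are placeholders:

DBCC SHRINKDATABASE (MyDatabase, 10);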
Can anyone tell me whether there is any data compression in SQL 6.5? I have concerns about network traffic and was wondering if data compression is a function of SQL 6.5, or if the compression would have to be coded into the actual database.
My application sends and retrieves large amounts of data to and from the database over the internet. Is there any way or method to compress the query before submitting it to the database server? I'd really appreciate any advice and comments.
We have run some tests on our application. The average message is about 2.5 MB, and messages are sent once every 30 minutes. That is about 3.5 GB per month for one site. We already have three sites that will be sending these messages. This will be a VERY high load on the WAN channel and will cost us a LOT of money.
Does SQL Server replication implement any kind of compression? It seems to me that this would be very helpful for congested WAN links and costly merge replication.
Hi, I was looking for column compression functionality in SQL Server Compact and it seems that it doesn't exist (maybe I'm wrong?). I wonder if the SQL Server team plans to implement column compression and, if so, when we can expect it? Thank you for your help!
I have a database which is 72 GB and is backed up every night as part of the maintenance plan. I have plenty of storage space, and the server that runs the database is fairly powerful (quad-processor 3.2 GHz, 64-bit, 48 GB RAM) and is part of an active-passive cluster. The database backup is also copied to a SAN location.
My issue is with the size of the backup file. As part of the disaster recovery plan, I need to copy this database backup file across the network to a remote site, so that in the event of a disaster at the main site, business can continue at the remote site after restoring the backup. However, the backup file is so big that I cannot copy it across the network in time for the next morning. I have tried using WinRAR and have managed to achieve a file about 20% of its original size, but it takes 2 hours to produce.
Is there any recommended reading for this type of issue? Log shipping / mirroring has been investigated and will be part of the DR model, but the 'powers that be' insist on having a full copy performed to the remote site.
Any suggestions? Thanks in advance guys n gals :-)
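One hedged option, assuming the instance is (or can be moved to) an edition that supports native backup compression (SQL Server 2008 Enterprise, or Standard from 2008 R2 onwards; the post doesn't say which version is in use): compress the backup as it is written, which avoids the separate two-hour WinRAR step, then copy the much smaller file to the remote site as before. Names and the path are placeholders:

BACKUP DATABASE MyDatabase
TO DISK = N'D:\Backups\MyDatabase_compressed.bak'
WITH COMPRESSION, INIT, STATS = 10;   -- INIT overwrites the previous file; STATS reports progress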