Monday, June 27, 2011

Pentaho Report Bursting with Pentaho Data Integration

Originally posted on the Pentaho Evaluation Sandbox

Report bursting is the process of sending personalized, formatted results derived from one or more queries to multiple destinations. Destinations can be file systems, email distribution lists, network printers or even FTP hosts, allowing for a wide range of distribution methods. Usually, the end result displays information pertinent to the recipient or location, so each recipient only sees their own data. Below is a brief example of how Pentaho Report Bursting can be achieved with Pentaho Data Integration 4.2. By leveraging Pentaho Data Integration's new Pentaho Reporting Output step, one can create a simple task that executes and renders multiple reports from a single Pentaho Report template. This is a truly powerful example of how Pentaho Data Integration can be used for more than just ETL.
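For readers curious about what the Reporting Output step is doing under the covers, the same bursting pattern can be sketched with the Pentaho Reporting engine's Java SDK: load the PRPT template, set a parameter per recipient, and render one output file per row. The snippet below is only an illustration of that idea, not the step's actual implementation; the template path, output location, customer list and the "CUSTOMER" parameter name are hypothetical placeholders.

// A minimal bursting sketch using the Pentaho Reporting engine's Java SDK.
// The PRPT path, output directory, customer list and the "CUSTOMER" parameter
// name are assumptions for illustration only.
import java.io.File;

import org.pentaho.reporting.engine.classic.core.ClassicEngineBoot;
import org.pentaho.reporting.engine.classic.core.MasterReport;
import org.pentaho.reporting.engine.classic.core.modules.output.pageable.pdf.PdfReportUtil;
import org.pentaho.reporting.libraries.resourceloader.Resource;
import org.pentaho.reporting.libraries.resourceloader.ResourceManager;

public class ReportBurster {
  public static void main(String[] args) throws Exception {
    // Boot the reporting engine once per JVM.
    ClassicEngineBoot.getInstance().start();

    ResourceManager manager = new ResourceManager();
    manager.registerDefaults();

    // In PDI, the Reporting Output step receives one row per report to render;
    // here a simple array stands in for that incoming stream of rows.
    String[] customers = { "Alpha Corp", "Beta Ltd", "Gamma Inc" };

    for (String customer : customers) {
      // Reload the template so each run starts from a clean report definition.
      Resource resource = manager.createDirectly(
          new File("burst_template.prpt"), MasterReport.class);
      MasterReport report = (MasterReport) resource.getResource();

      // Personalize the report: the template is assumed to filter its query
      // on a "CUSTOMER" parameter.
      report.getParameterValues().put("CUSTOMER", customer);

      // Render one PDF per recipient, mirroring what the step does per row.
      PdfReportUtil.createPDF(report, "/tmp/" + customer + ".pdf");
    }
  }
}

In the actual sample, the per-recipient query, parameter mapping and output file name are all configured in the transformation itself rather than in Java code.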

Special thanks to Wayne Johnson, Senior Sales Engineer for providing the sample and setup document.

How To document and sample here

3 comments:

planetzenhead said...

Hello. This is an excellent example. However, I'm having a little trouble getting the Reporting Output step to actually generate a PDF file or any output. The transformation runs without any errors, but no file is created where expected. Is there something needed to connect the Reporting Output step to the BI Server? Thanks in advance.

Michael Tarallo said...

Hello Planetzenhead - this is most likely because the SampleData data source that the PRPT file is configured for is not running. The sample is configured to use the default SampleData connection that is automatically started with the Pentaho BI Server. If you start the Pentaho BI Server, it will start the embedded in-memory Hypersonic database where the sample data resides. If you open the PRPT with the Pentaho Report Designer, you will see that the data source connection should be configured for JDBC - Hypersonic, localhost - or possibly JNDI SampleData; either way, the database must be started to use the default sample. I will update this on the blog entry. Thanks for testing.

Michael Tarallo said...

I have updated the notes here: http://sandbox.pentaho.com/2011/06/bursting-pentaho-reports-with-pentaho-data-integration-4-2/