Channel: Pentaho Community Forums

Transformation Parameters

Hi,
I'm a beginner in PDI.
I cannot figure out how to use transformation parameters in Pentaho Reporting. I have created a transformation and defined the parameters in the Transformation properties dialog. Now I need to use these parameters in Pentaho Reporting, but somehow they are not linked to the data. Can anyone help me with this?

not able to read data from a flat file

Hi All,

I have a requirement where I have to use a flat file with a single column and about one million rows as a lookup file. I am using the 'Merge Join' step to perform this lookup.

My stream data comes from a 'Get Rows from Result' step, and the transformation is set to run for each record fetched from it.

Now the problem is that, for some inexplicable reason, the data is not being read from the lookup file, and the job gets stuck without processing even a single record. When I try to copy the data from the lookup file into another file, the copy gets suspended after 19,323 records. I have tried this in different ways, but none have worked.

I could not add any screenshots or attachments for this issue. Any help is highly appreciated.

Uploading Metadata to Solution folder: Bundle missing required domain-id property

I get the following error when trying to upload my metadata.xmi file to my solution folder in Pentaho 5.1 CE. I'm using PME to create the metadata model and exporting it as an .xmi file. I am then using the BA Server 5.1 CE console to upload the .xmi file into the solution folder. The error I get is:

Code:

Import File    Level    Message
/home/BEC/metadata.xmi    ERROR    org.pentaho.platform.plugin.services.importer.PlatformImportException: Bundle missing required domain-id property

How might I be able to resolve this? How do I set the domain-id property?

EDIT:
I tried to publish the metadata to the solution directory directly from PME instead of uploading it. PME notifies me that the file already exists and asks if I wish to replace it; I agree and receive a success message. However, when going into the solution folder through the PUC and downloading it, the folder appears to contain no metadata file. I made sure that the domain name is the same as that of my solution folder, so what might I be doing incorrectly?

Table input step blocking system

Hi,

I'm trying to get data from a MySQL DB table. I've configured the connection and successfully tested it.
The problem is that if I try to preview, or send the results of the query to a file or anything else, my CPU goes to 100%, a message saying "Operation in progress..." just sits there, and nothing happens.
When running Spoon I see a few errors:

Code:

(SWT:18756): GLib-GObject-WARNING **: /build/buildd/glib2.0-2.38.1/./gobject/gclosure.c:697: unable to remove uninstalled invalidation notifier: 0x7fe7ba748be0 (0x7fe7fde9f540)


Attempting to load ESAPI.properties via file I/O.
Attempting to load ESAPI.properties as resource file via file I/O.
Not found in 'org.owasp.esapi.resources' directory or file not readable: /opt/Pentaho/design-tools/data-integration/ESAPI.properties
Not found in SystemResource Directory/resourceDirectory: .esapi/ESAPI.properties
Found in 'user.home' directory: /root/esapi/ESAPI.properties
Loaded 'ESAPI.properties' properties file
SecurityConfiguration for Validator.ConfigurationFile not found in ESAPI.properties. Using default: validation.properties
Attempting to load validation.properties via file I/O.
Attempting to load validation.properties as resource file via file I/O.
Not found in 'org.owasp.esapi.resources' directory or file not readable: /opt/Pentaho/design-tools/data-integration/validation.properties
Not found in SystemResource Directory/resourceDirectory: .esapi/validation.properties
Not found in 'user.home' (/root) directory: /root/esapi/validation.properties
Loading validation.properties via file I/O failed.
Attempting to load validation.properties via the classpath.
validation.properties could not be loaded by any means. fail. Exception was: java.lang.IllegalArgumentException: Failed to load ESAPI.properties as a classloader resource.

(SWT:18756): GLib-GObject-WARNING **: /build/buildd/glib2.0-2.38.1/./gobject/gclosure.c:697: unable to remove uninstalled invalidation notifier: 0x7fe7ba748be0 (0x7fe7fc932c20)

(SWT:18756): GLib-GObject-CRITICAL **: g_closure_add_invalidate_notifier: assertion 'closure->n_inotifiers < CLOSURE_MAX_N_INOTIFIERS' failed

I've copied ESAPI.properties to /root/esapi/, /home/<user>/esapi and /opt/Pentaho/design-tools/data-integration/, but nothing seems to do the trick.
I'm running Linux Mint 16 and OpenJDK 7.

Can anyone help me?

Thank you

Dashboard pings

Hi,

We developed a dashboard with many queries and parameters. It takes around 30 seconds to load, but what we noticed by looking at the Chrome development tools is that the plugin is sending lots of pings to the server, and most of the 30 seconds are spent waiting for the ping responses.


Is this normal? Is there a way to avoid this?

Thanks!

Migrating CDE dashboards from Pentaho 4.x to 5.x

Thanks to Pedro Vale, head of the Webdetails group, for writing this note in the Redmine wiki.
Migrating dashboards

  1. Upload existing dashboards to PUC
  2. If you're using widgets, upload the contents of /cde/widgets to /public/cde/widgets
  3. For each widget, edit it and save its settings. This will update the widget metadata and make it consistent with 5.x
  4. Restart the server or call the CDE cache refresh url (/pentaho/plugin/pentaho-cdf-dd/renderer/refresh).
  5. For each dashboard using widgets, remove and re-add the widget.
  6. In the datasources section, please update your Mondrian datasource references - simply selecting them again from the drop-down will make them work. Alternatively, you can enable the legacy fallback solution described below.




Ensuring mondrian datasource references continue working

Step 1: Enabling Thomas Morgner's legacy fallback solution (note: disabled by default)

1 - open /tomcat/webapps/pentaho/WEB-INF/classes/classic-engine.properties
2 - add the following line

mondrian.spi.CatalogLocator=org.pentaho.reporting.engine.classic.extensions.datasources.mondrian.LegacyCatalogLocator

Step 2: Create and populate a mappings properties file

1 - (if it does not exist) create /tomcat/webapps/pentaho/WEB-INF/classes/mondrian-schema-mapping.properties
2 - add the mapping for the intended Mondrian catalog; the new 5.0 syntax for Mondrian catalogs must abide by the rule:
mondrian:/

Two mapping examples:

Example 1:
In /plugin-samples/cda/cdafiles/mondrian-jndi.cda you can see this datasource definition (the XML markup was lost when this page was scraped; only the element values survive, shown here with their presumed element names):

Code:

Jndi:    SampleData
Catalog: SteelWheels
Cube:    SteelWheelsSales

(see the Catalog value)

To handle this case, we would set in mondrian-schema-mapping.properties:
SteelWheels=mondrian:/SteelWheels


Example 2:
In /plugin-samples/pentaho-cdf-dd/cde_samples1.cda you can see this datasource definition (again, the XML markup was lost in scraping; only the element values survive, shown with their presumed element names):

Code:

Catalog: ../steel-wheels/analysis/steelwheels.mondrian.xml
Jndi:    SampleData

(see the Catalog value)

To handle this case, we would set in mondrian-schema-mapping.properties:
steelwheels.mondrian.xml=mondrian:/SteelWheels
(disregard the path; the mapping is done with the filename)


Can CDE read cookie data?

I'm looking for a means of reading values from a cookie set by our authentication platform, to pick up the user's username, which I'd then like to use as a parameter for other components.

I've tried using the following script to create a test cookie, but it's not showing up when I inspect the resources to see if CDE actually created it.

function setCookie(cname, cvalue, exdays) {
    // Build an expiry date `exdays` days from now
    var d = new Date();
    d.setTime(d.getTime() + (exdays * 24 * 60 * 60 * 1000));
    var expires = "expires=" + d.toUTCString(); // toGMTString() is deprecated
    document.cookie = cname + "=" + cvalue + "; " + expires;
}

I don't need to create a cookie per se; I only really need to read one that already exists. Does anyone have any hints on how to accomplish this?
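For reading an existing cookie, a small parser over document.cookie is usually enough. A minimal sketch (the cookie name "username" and the idea of calling this from a dashboard function are illustrative assumptions, not CDE specifics):

```javascript
// Parse a cookie string (the format of document.cookie) and return the
// value for the given name, or null if it is not present.
function getCookieValue(cookieString, name) {
  var parts = cookieString.split(';');
  for (var i = 0; i < parts.length; i++) {
    var pair = parts[i].trim().split('=');
    if (pair[0] === name) {
      // Re-join in case the value itself contained '='
      return decodeURIComponent(pair.slice(1).join('='));
    }
  }
  return null;
}

// In the browser you would call it against the live cookie jar:
// var user = getCookieValue(document.cookie, 'username');
```

The function is pure (it takes the cookie string as input), so it can be tested outside the browser and then wired to document.cookie in the dashboard.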

CDE Table

Hello,

When I make a table, the titles (which I can choose) appear across the top, and the data selected from the database comes under them. Like this:

Title Title Title Title
Data Data Data Data


Now I want it this way:
Title Data
Title Data
Title Data
Title Data


where the title can be chosen in the table properties and the data is selected from the database (with a query).

Is this possible? And if so, how?
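One common workaround is to transpose the query result in the table component's postFetch hook so that each column becomes a [title, value] row. A hedged sketch (the two-column layout and the way the titles are obtained are assumptions):

```javascript
// Turn a one-row resultset with N columns into N rows of [title, value],
// suitable for a two-column table (first column: title, second: data).
function transposeSingleRow(columnTitles, row) {
  return columnTitles.map(function (title, i) {
    return [title, row[i]];
  });
}

// In a CDE table component's postFetch you might then do something like:
// cd.resultset = transposeSingleRow(myChosenTitles, cd.resultset[0]);
```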

Is the dashboard provided by the Ivy Pentaho Demo Server open source or not?

Hi,

Is the dashboard provided by the Ivy Pentaho Demo Server open source or not?

How can we use it with our Pentaho Community BI Edition 4.8?

Thanks.


Moving the .pentaho folder on a windows 7 machine

Hi,

I've recently started using PDI (Kettle) and it works well (starts) until I change the network my machine is connected to. My work machine usually sets its HOME folder to a network drive F:\, where Kettle creates the .pentaho and .kettle directories, and the application runs correctly. If I disconnect, Kettle fails to start: I only get the splash screen, and then it fails to load. I suspect Kettle is attempting to access the F:\ drive, which no longer exists.

I have managed to change the .kettle directory by setting the KETTLE_HOME environment variable, but it seems Kettle keeps trying to connect to F:\ to create the .pentaho directory. I also tried setting the PENTAHO_INSTALLED_LICENSE_PATH environment variable, but this does not move the folder.

Does anyone know how to change this .pentaho folder?

Most appreciated.
Paul




Some of my machine details

Windows 7 64bit
JRE 1.7

Kettle
Build Version 5.1.0
Commit ID: 0
Build date: June 19, 2014

How to use an array parameter

Hi folks,

How can I pass an array of data (or simply a CSV string) to Pentaho Report Designer, and then access a specific position of the array?
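One common route is to pass the values as a single CSV string parameter and split it wherever scripting is available, for example in a scriptable datasource or a PDI JavaScript step. A minimal sketch (the function name and trimming behavior are assumptions):

```javascript
// Split a CSV string parameter into an array and read one position.
// Returns null when the index is out of range.
function csvElement(csv, index) {
  var values = csv.split(',').map(function (v) { return v.trim(); });
  return (index >= 0 && index < values.length) ? values[index] : null;
}
```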

Is there any way to know the status of the server?

Hi Forum,

Is there a command to check the status of the Pentaho server on Linux-like operating systems?

Something like this:
sh ctlscript.sh status postgresql

Thank you.

Error on viewing report created using metadata source

Environment: Pentaho BA Server 5.1 CE, Pentaho Metadata Editor 5.1 CE, Pentaho Report Designer 5.1 CE, CentOS 6, MySQL DB.

Steps so far:
1) Created a new folder '/public/BEC' using the PUC
2) Created a metadata model and published to the server successfully with the 'Domain Name' as 'BEC'
3) Exported the same metadata.xmi file locally, and used it as a data source in PRD to create a report
4) Published the report successfully to '/public/BEC'

While opening the report in the PUC, I get the message 'Report Validation Failed'. The error log is provided at http://www.speedyshare.com/AEK37/download/catalina.out (sorry about the external link; the log was too long to post here as text). What might be the cause of this? Is it truly a bug, or am I doing something incorrectly?

In short,
Code:

06:07:10,696 ERROR [SimplePmdDataFactory] error org.pentaho.platform.api.repository2.unified.UnifiedRepositoryException: PentahoMetadataDomainRepository.ERROR_0005 - Error retrieving domain with id [BEC] - org.xml.sax.SAXParseException; Premature end of file.



06:07:10,698 ERROR [AbstractReportProcessor] 2072081900: Report processing failed.
org.pentaho.reporting.engine.classic.core.ReportDataFactoryException: Failed to parse query




06:07:10,699 ERROR [SimpleReportingComponent] [execute] Component execution failed.
org.pentaho.reporting.engine.classic.core.ReportDataFactoryException: Failed to parse query

How do I dynamically create parallel execution threads/pipelines?

I want to dynamically create a set of parallel execution "pipelines" based on the rows produced by a previous step. In this particular case, I want to extract rows of data from a database in "slices" of 10,000 rows. If there are 80,000 rows, then I would create 8 slices.

I can do this "manually" in a transformation by creating 8 different flows (each flow is just a "Table Input" step with a static query that filters on row numbers that is connected to a "MySQL Bulk Loader" step) in the same transformation. When the transformation is launched, all 8 flows execute in parallel.

However, the total number of rows in the source database can vary, so I don't want to have to manually modify the transformation each time. What I want is a step that produces a series of rows with the start and end row numbers in separate fields, which would then "spawn" X flows to perform the extraction in parallel.
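The slice boundaries themselves are straightforward to compute up front, for example in a PDI JavaScript step; the hard part is spawning the flows. A sketch of the arithmetic (the 10,000-row slice size matches the example above; field names would be up to you):

```javascript
// Compute [start, end] row ranges for slicing `total` rows into chunks
// of at most `size` rows (1-based, inclusive bounds). Each returned
// range would become one row feeding a parallel extraction flow.
function sliceRanges(total, size) {
  var ranges = [];
  for (var start = 1; start <= total; start += size) {
    ranges.push([start, Math.min(start + size - 1, total)]);
  }
  return ranges;
}
```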

I have already created a transformation that does this for each row sequentially, but that defeats the purpose of breaking up the data extraction into multiple slices. I've looked at parallel job execution, but the examples all contain a static set of transformations (i.e. you already set up the transformations/job entries in the job execution).

I would attach a picture of my "manual" parallel execution test, but for some reason I can't upload files (sorry).

Thanks.

Michael

Mondrian Cube on latest live data.

Hi,
I am working on a dashboard with bar charts and some filters.
For that, I have used the following tools:

  1. Pentaho BI Server 4.8.0-stable - for developing the dashboard using CDE.
  2. Pentaho Schema Workbench 3.5.0 - for developing the Mondrian OLAP cube.
  3. CDE - for developing the dashboard with bar charts and select components (used for filters).

After publishing the OLAP cube, the dashboard only shows data up to the last publish time.
But I want the dashboard to work on live, current data using MDX and the cube (for performance reasons); unfortunately, I am unable to do that.
Please give me some guidance so that I can develop a cube that has up-to-date data, or any other option that makes live data available to the dashboard, other than using SQL.

Converting to Date format

Hi all,

add_eee = '2014-07-19-16.10.05.880834'
aaa_eee = '2014-07-26-02.35.19.842187'

I just need to read the date part... can anyone help me with reading just the date?
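Since both strings have a fixed 'yyyy-MM-dd-...' prefix, the date part is simply the first ten characters. In a PDI "Modified Java Script Value" step that could look like the following sketch (the function name is an assumption; you would then convert the result to a Date with a yyyy-MM-dd mask):

```javascript
// Extract the date portion (yyyy-MM-dd) from a timestamp string such as
// '2014-07-19-16.10.05.880834'.
function extractDatePart(ts) {
  return ts.slice(0, 10);
}
```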

Thank You

Drilldown link on Number column in PRPT

Hi,


I need help implementing a drilldown on a number column that opens a new .prpt in the same window, with the corresponding value passed as a parameter.
I need a summary report (product-wise price) with a hyperlink on the number column;
on click, it should open a new report with detail columns (product | months | price) for that particular product.

Summary view, with a hyperlink on the Price column:

Code:

ProductLine    Price
Classic Cars   360
Motorcycles    210

(clicking Price=360 above should open the details for product=Classic Cars)

Drilldown output:

Code:

ProductLine    Months    Price
Classic Cars   Jan       150
Classic Cars   Feb       110
Classic Cars   Mar       100


Let me know whether the drilldown function should be used, or a hyperlink with a parameter.
I understand we have to create two .prpt files; can we see the effect of the drilldown in the Report Designer tool itself?


Thanks

CCC Pie Chart - slice_innerRadiusEx option unavailable

Hi all,
I've been exploring the Pentaho solution and have just started developing a dashboard in CDE using some CCC charts, and I've hit a block. Here's the problem I'm having:

I want to make a pie chart (donut); however, the option used to transform the pie into a donut, "slice_innerRadiusEx", seems to be unavailable: when I access the advanced properties of the pie chart, I can't find it anywhere.
I know this is the option I'm looking for because I found it here: http://www.webdetails.pt/ctools/ccc.html
I managed to make a donut with the Protovis component (http://mbostock.github.io/protovis/docs/wedge.html), but I lose some other features that are native to CCC charts, and I don't want to lose those.

Can anyone tell me why this feature is unavailable in the CCC pie chart, or is there another way to code "slice_innerRadiusEx: '50%'," into my CCC pie chart? I would really appreciate some help on this.

Additional note: I'm using the 5.1.0.0.752 release.

Thanks in advance

Pentaho PDI Report Bursting

Hi,

I am facing a scenario where I have to email a report to different email addresses based on location, so I want to implement report bursting. Can somebody please guide me on this?

Thanks