Channel: Pentaho Community Forums
Viewing all 16689 articles
Browse latest View live

Created XMI but cannot see datasource in Ad Hoc Report Designer

So I have created an XMI file and placed it in a solutions folder, but when I go to create a new Ad Hoc report on my BI server, nothing shows up. I assume there is more to publishing an XMI file than just placing it in a solutions folder. Does anyone know how to do this through the web services or the Pentaho Java API? I do not want to use PME for this. Any information would be a great help.

X-axis using logarithmic scale

Hi All,

I am wondering if there is a way to change the scale of the x-axis to be logarithmic instead of linear; it appears to be doable in protovis:
var yScale = pv.Scale.log(0,popCountArray.length).range(0,height);
I have a dot chart whose x-axis ticks I would like to be 0, 100, 10000, 1000000, 10000000
instead of 0, 2000000, 4000000, 6000000, 8000000, 10000000.

If it is doable, where and how can I do it?
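For reference, CCC/protovis normally handle this via pv.Scale.log, as in the snippet above. Purely to illustrate what a base-10 log axis does to tick values and positions, here is a self-contained sketch (the helper names are made up, and note a true log scale cannot include 0, so the domain starts at 1):

```javascript
// Map a value onto [rangeMin, rangeMax] proportionally to its logarithm.
function logScale(min, max, rangeMin, rangeMax) {
  var logMin = Math.log(min), logMax = Math.log(max);
  return function (v) {
    return rangeMin + ((Math.log(v) - logMin) / (logMax - logMin)) * (rangeMax - rangeMin);
  };
}

// Ticks at successive powers of 10 between min and max.
function logTicks(min, max) {
  var ticks = [];
  for (var p = Math.ceil(Math.log(min) / Math.LN10); Math.pow(10, p) <= max; p++) {
    ticks.push(Math.pow(10, p));
  }
  return ticks;
}
```

With min 1 and max 10000000 this yields the power-of-10 ticks 1, 10, 100, ..., 10000000 rather than evenly spaced linear ticks.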

Thanks.

Is there a way to filter the data in the postFetch process?

Here is the story:

I have a metric dot chart that displays asset risk data for a given year; the x-axis is the consequence in $, and the y-axis is the probability of failure for a given asset risk.

In the dashboard, I created a drop-down list for year selection. Every time the year is changed, the data source listens for the year parameter change and fetches data using SQL over JNDI, and then the chart is refreshed.

Everything works pretty well, except we would like to improve the performance of this chart by preventing it from going through the data source every time the year parameter is changed. Instead, we are hoping to achieve the following:

When the chart is loaded, we preload asset risk data for ALL years. Now, when the parameter changes, instead of refetching the data from the data source, is it possible to do the data filtering in the postFetch event by calling setData?

Say I have the super dataset as:
Year RiskName Consequence Probability Of Failure
2011 Risk1 $50 .5
2011 Risk2 $10 .3
2011 Risk3 $54 .2
2012 Risk1 $150 .4
2012 Risk2 $13 .7
2012 Risk3 $56 .8
2012 Risk4 $156 .6
When the dashboard is loaded, we filter the data based on the selected year and display the chart accordingly. If we first select year 2011, then only risks for 2011 will be displayed, i.e., there will be 3 dots on the chart representing risk data.
Then when we change the year to 2012, with some filtering logic plus a drop-down event listener defined, the chart will immediately display the 4 risk records for 2012 without re-querying, since all the data has already been loaded.


The questions are:
1. Where do we save this preloaded super dataset?
2. How do we allow the chart to filter the preloaded data based on the parameter?
3. Can we prevent the chart from re-querying every time the parameter is changed?

Is it doable? Is there anything missing? Am I oversimplifying the logic here?
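One possible shape for the filtering step, assuming a CDA-style resultset whose rows are [Year, RiskName, Consequence, ProbabilityOfFailure] (the component wiring in the comment is a hypothetical sketch, not tested against any particular CDF version):

```javascript
var superDataset = null; // preloaded rows for ALL years, saved on first fetch

// Pure filtering step: keep only the rows for the selected year.
function filterByYear(rows, year) {
  return rows.filter(function (row) {
    return row[0] === year; // column 0 holds the year
  });
}

// Roughly how it could be wired into a chart component:
// postFetch: function (data) {
//   if (superDataset === null) superDataset = data.resultset; // save once
//   data.resultset = filterByYear(superDataset,
//       Dashboards.getParameterValue('yearParam')); // 'yearParam' is hypothetical
//   return data;
// }
```

With the sample data above, filtering on 2011 keeps 3 rows and filtering on 2012 keeps 4, matching the dot counts described.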

Thank you for the help!

Regards

Zheng

Schema Workbench: problem assigning roles to a virtual cube

I do not know if it is possible to assign roles to a virtual cube;
Workbench does not let me choose virtual cubes. Is this possible?
thanks

How to create a Heat Grid chart using Pentaho BI Server 4.8?

Hi, can anyone suggest how to create a Heat Grid chart using Pentaho BI Server 4.8?

Order by on bar chart without touching the query in CDE

Hi forum,

Is there any functionality available within CDE to order the bars on a bar chart without touching the SQL query?
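One common trick is to reorder the rows client-side in the component's postFetch hook, so the SQL stays untouched. A minimal sketch, assuming rows shaped [category, value] (the wiring comment is hypothetical):

```javascript
// Return a copy of the resultset sorted by the measure column.
function sortByMeasure(rows, valueIndex, descending) {
  return rows.slice().sort(function (a, b) {
    return descending ? b[valueIndex] - a[valueIndex]
                      : a[valueIndex] - b[valueIndex];
  });
}

// Hypothetical wiring on the bar chart component:
// postFetch: function (data) {
//   data.resultset = sortByMeasure(data.resultset, 1, true); // tallest bar first
//   return data;
// }
```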


Thanks,
Sadakar

Clustering execution process

click action on table component?

I found that this might work:

function(a){
    // Parameter_N is my parameter I can use for the data source, right?
    Dashboards.fireChange('Parameter_N', a.series);
}

My task is like this:

this is the table

a b

num 3
rate 5%

and below is a line chart trying to use the parameter passed from table cell I click

When I click column a's [num], I want my SQL to turn out as follows:

select
date,
[num]
from table a

When I click column a's [rate], I want my SQL to turn out as follows:

select
date,
[rate]
from table a

How can I make this work? Thank you!
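A sketch of how the clicked label could drive the query: fire the label into the parameter from the table's clickAction (as in the snippet above), then map it to a whitelisted column name when building the SQL. The helper, mapping, and table name below are hypothetical:

```javascript
// Whitelist of clickable labels -> column names, so the parameter
// can never inject arbitrary SQL.
var allowedColumns = { num: "num", rate: "rate" };

function buildQuery(clickedLabel) {
  var column = allowedColumns[clickedLabel];
  if (!column) throw new Error("unexpected label: " + clickedLabel);
  return "select date, " + column + " from table_a";
}

// clickAction on the table component, roughly:
// function (e) {
//   Dashboards.fireChange('Parameter_N', e.series); // the clicked row label
// }
```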




Measure, Dimension names not displaying Polish characters

Hi,

I'm struggling with an issue where Polish characters in dimension or measure names/captions are not displayed properly in either JPivot or Saiku.
I've tried changing the Mondrian schema encoding to UTF-8 and to Central European (Windows-1250), but with no luck.
Any suggestions would be highly appreciated.

thanks,
Pawel J.

Create Custom Pentaho CDE Template

Hi Sir

Can anyone tell me whether it is possible to create our own custom template for a dashboard? If so, how do I create a custom template in CDE? Is there any documentation, video, or blog that would help me create a custom template?

Thanks and Regards
Sumit Bansal

Problem in launching Pentaho Schema workbench

Hi All,

I have downloaded the Pentaho Schema workbench 3.2.1 CE.

I unzipped the folder and tried running the workbench.sh file.
Unfortunately it doesn't launch.
Below is the error description.
Please help, as I'm new to Schema Workbench.
Thanks in advance.


sripada@sripadaR:~/schema-workbench$ ./workbench.sh
/home/sripada/schema-workbench
DEBUG: Using JAVA_HOME
DEBUG: _PENTAHO_JAVA_HOME=/usr/lib/jvm/jdk1.6.0_38
DEBUG: _PENTAHO_JAVA=/usr/lib/jvm/jdk1.6.0_38/bin/java
10:21:31,647 ERROR [Workbench] main
java.lang.NoClassDefFoundError: org/scannotation/AnnotationDB
at org.pentaho.di.core.plugins.BasePluginType.findAnnotatedClassFiles(BasePluginType.java:237)
at org.pentaho.di.core.plugins.BasePluginType.registerPluginJars(BasePluginType.java:500)
at org.pentaho.di.core.plugins.BasePluginType.searchPlugins(BasePluginType.java:115)
at org.pentaho.di.core.plugins.PluginRegistry.init(PluginRegistry.java:420)
at org.pentaho.di.core.KettleEnvironment.init(KettleEnvironment.java:121)
at org.pentaho.di.core.KettleEnvironment.init(KettleEnvironment.java:68)
at mondrian.gui.Workbench.getDbMeta(Workbench.java:1225)
at mondrian.gui.Workbench.loadDatabaseMeta(Workbench.java:193)
at mondrian.gui.Workbench.<init>(Workbench.java:132)
at mondrian.gui.Workbench.main(Workbench.java:2094)
Caused by: java.lang.ClassNotFoundException: org.scannotation.AnnotationDB
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
... 10 more


Thanks,
Malibu

Pentaho BI Server 4.8 on Postgres 9.x and Apache Tomcat

Hi all,

I am looking for a manual/tutorial on how to set up BI Server 4.8 with a Postgres database and an Apache Tomcat server.
So far I have found this one: http://www.prashantraju.com/projects/pentaho/ which is a really great starter, but it seems to differ in some aspects from 4.8.

Does anybody here know if there is a good tutorial out there available?
Thanks for any hint!

Bobse

Mailing CDE dashboard - need help

Hi All,

I have developed dashboards using CDE in Pentaho BI Server 4.5 CE with the help of information available on the Pentaho Forums.
Now I need your help with the next big thing.

My requirement is to mail the same dashboard report on a daily basis by scheduling it on the server.
I don't know how to achieve this.
Please help me with this.
It will be a great help.

Thanks in advance.
Malibu

Weka Error Message

Using the GUI (Explorer):
When I try to save a model, I get the following error message:

Load Failed
java.io.FileNotFoundException: C:\Program Files\Weka-3-7\Model1.model (Access is denied)

What does this mean?

Bob M

Can I send parameters to Interactive Report ?

Hi,
I want to create a CDF dashboard that will display data from a table that has been filtered, and I will send those filters to an Interactive
Report as parameters. Can I send those parameters from the dashboard to the Interactive Report?
Thank you.

calculate value from old and new record during update

Hi all,

I have the following kind of target dataset; ID1 and ID2 are the keys.

ID1 , ID2 , Value
100 , 10 , 90
100 , 20 , 95
100 , 30 , 92


I need to update this dataset using an import dataset, e.g.:
ID1 , ID2 , Value
300 , 30 , 90
100 , 20 , -15

The first record of the import dataset needs to be inserted; no problems there. Using Table output will do the trick.
But the second record is special: instead of a normal Update, I want the following to happen:
In the target dataset, find the record matching the keys: 100 , 20 , 95
Now I want to add the imported value to the target value (95 + -15) and update the dataset.

The resulting record would be: 100 , 20 , 80

The end result:
ID1 , ID2 , Value
100 , 10 , 90
100 , 20 , 80
100 , 30 , 92
300 , 30 , 90

I don't see how to achieve this with Update or Insert/Update; those will result in 100 , 20 , -15.
Any thoughts about this?

Thanks in advance,
Herbert.
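In PDI this usually means looking up the existing value first (e.g. with a Database lookup step), adding the imported value (Calculator), then writing back with Insert/Update. Purely to pin down the intended arithmetic, a sketch with rows as [ID1, ID2, Value] (the helper name is made up):

```javascript
// Merge import rows into the target: rows matching on both keys get the
// imported value ADDED to the existing value; unmatched rows are inserted.
function mergeImport(target, imports) {
  var result = target.map(function (r) { return r.slice(); });
  imports.forEach(function (imp) {
    var match = null;
    for (var i = 0; i < result.length; i++) {
      if (result[i][0] === imp[0] && result[i][1] === imp[1]) { match = result[i]; break; }
    }
    if (match) {
      match[2] += imp[2]; // e.g. 95 + -15 = 80
    } else {
      result.push(imp.slice()); // new key combination: plain insert
    }
  });
  return result;
}
```

With the data above, (300, 30, 90) is inserted and (100, 20) becomes 80, matching the stated end result.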

Send output data in batches to REST api

Hi,

I want to send data over a REST API in batches. E.g., 100k records are imported from a particular table and some transformation is done on them; now I want to post that data to a REST API, each call carrying a batch of 1k records. Is there any support for batching output rows? Once the record count matches the batch size, a REST call is made, the record count resets to 0, and the process continues.

100k [??] 1k x 100
DB ------> Transformation ----> Data splitter ----------> REST api

Another possible way to achieve the same is to open an input connection to the DB, read 1k records, transform them, and post them to the service, repeating until all data is read. That would use a loop in the job, but loops are only useful for small numbers (as per some posts on this forum). We may have 1 million records, and looping over such data can cause a memory overflow error, so is there any other way to implement this approach?

Thanks.
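The batching logic itself (count rows, flush at the batch size, reset, flush the remainder at the end) is small. A hedged sketch, where `send` stands in for whatever makes the actual REST call; in PDI this kind of logic might live in a Modified Java Script Value or User Defined Java Class step:

```javascript
// Hand each chunk of at most batchSize rows to `send`, so each REST call
// carries one batch rather than the full 100k-row stream.
function sendInBatches(rows, batchSize, send) {
  var batch = [];
  rows.forEach(function (row) {
    batch.push(row);
    if (batch.length === batchSize) {
      send(batch);
      batch = []; // reset the counter, as described above
    }
  });
  if (batch.length > 0) send(batch); // flush the final partial batch
}
```

For example, 2500 rows with a batch size of 1000 would produce three calls: two of 1000 rows and one of 500.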

Hadoop Online Training

Hi,
UNICOM is conducting a Hadoop online training batch starting 31st August 2013. This training will help you understand the Big Data Hadoop ecosystem and provides hands-on training.
For more details you can drop an email to contact@unicomlearning.com or give us a call at +91 080 4202 3134.
Regards
Training Team Unicom
Visit us at www.unicomlearning.com

Big Data (Hadoop) Developer Online Training Call Us +919000444287

Big Data (Hadoop) Developer Online Training
Understanding Big Data

  • Introduction/Installation - Hadoop Custom VM(Single Node)
  • Understanding Big Data
  • 3V (Volume-Variety-Velocity) Characteristics
  • Structured and Unstructured Data
  • Application and use cases of Big Data
  • Limitations of Traditional Large Scale Systems
  • How a distributed way of computing is superior (cost and scale)

HDFS

  • HDFS Overview and Architecture
  • Data Replication
  • Safe Mode
  • Name Node
  • Checkpoint Node
  • Backup Node
  • Configuration Files

HDFS Data Flows

  • Read
  • Write

HDFS Commands

  • File System
  • Administrative

Advanced HDFS Features

  • HDFS Federation
  • HDFS High Availability

MapReduce Overview

  • Functional Programming Paradigms
  • Input and Output Formats
  • Hadoop Data Types

MapReduce Overview

  • Input Splits
  • Shuffling
  • Sorting
  • Partitioning
  • Configuration Files
  • Distributed Cache

MR Algorithm and Data Flow

  • WordCount

MapReduce Architecture

  • Legacy MR
  • Next Generation MapReduce (aka YARN/MRV2)

Developing and Deploying MR Programs

  • Standalone Mode
  • Hadoop Streaming

Optimizers

  • Combiners
  • JVM Reuse
  • Compression

MR Best Practice and Debugging
Fundamental MR Algorithms (Non-Graph)

  • Student Database
  • Max Temperature

Higher Level Abstractions for MapReduce – 1

  • Pig Introduction
  • Pig Latin Language Constructs
  • Pig User Defined Functions
  • Pig Use Cases

Higher Level Abstractions for MapReduce - 2

  • Hive - Introduction
  • Hive QL
  • Hive User Defined Functions
  • Hive Use Cases

NOSQL Databases
NoSQL Concepts

  • Review of RDBMS
  • Need for NOSQL
  • Brewer's CAP Theorem
  • ACID vs BASE

Different Types of NoSQL Databases

  • Key Value
  • Columnar
  • Document
  • Graph
  • Columnar Databases

Hadoop Ecosystem

  • HBASE vs Cassandra
  • HBASE Architecture
  • HBASE Data Modeling
  • HBASE Commands
  • HBASE Coprocessors - Endpoints
  • HBASE Coprocessors - Observers
  • SQOOP
  • Flume & OOZIE
