Channel: Pentaho Community Forums

pentaho user console cohort analysis

Hi all,

I am using Pentaho User Console 4.8, which talks to a MySQL DB through the Pentaho metadata layer.

Sometimes users need to do cohort analysis. An example of the cohorts they are experimenting with is the number of units each partner has sold: 0-9, 10-49, 50-249 and 250+ map to low, med, high and vhigh respectively. These mappings are prototype analyses, so it doesn't make sense to keep chopping and changing database columns.
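
(To pin down that mapping, here is a throwaway sketch in plain Java, nothing Pentaho-specific; the same rule could just as well become a derived SQL column, a lookup CSV like the one mentioned in solution 2 below, or a calculated measure.)

Code:

public class CohortMapper {
    // Prototype bucketing rule from above: units sold -> cohort label.
    static String cohort(int unitsSold) {
        if (unitsSold < 10)  return "low";    // 0-9
        if (unitsSold < 50)  return "med";    // 10-49
        if (unitsSold < 250) return "high";   // 50-249
        return "vhigh";                       // 250+
    }

    public static void main(String[] args) {
        System.out.println(cohort(42)); // prints "med"
    }
}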

Solution 1: the current approach is to export all the necessary data to a CSV and process it in Excel. It takes 30 minutes to stream all the uncompressed data to the browser and download it as a CSV. Run straight in MySQL, the INTO OUTFILE query takes 30 s, plus 30 s to zip and 10 s to download. Is it possible to replicate this INTO OUTFILE-and-download-compressed process in PUC?

Solution 2: is it possible to upload a new data source (a CSV with the cohort mapping data) that is joinable to the current table?

Solution 3: a calculated measure in PUC?

Thanks for your help! I'd be happy with any other viable solution too.
Ben

Unreadable values on bar chart

Hi all,
In a stacked bar chart, how can you avoid a value not being displayed when it is proportionally "insignificant" compared to the others?
This is the case:

bar chart.jpg

The values are 130, 35, 50 and 5, but the last one (5) is not displayed.

Thanks for the support.

How to export CDE dashboard to pdf without using PRD

Hello!


How can I export a CDE dashboard to PDF without using PRD?

Carte: can I get parameters from URL?

Can I use a parameter from Carte's URL in a Job? Something like this:

http://localhost:8080/kettle/startJob/?name=myjob&xml=Y&testvar=filename.txt

I want to do this because I have a job that transforms an input file, but I want to change that filename dynamically, and creating a new XML file for each input file makes little sense.

I've tried many things and I couldn't find a solution :-(

Thanks in advance.

scaling filter for class variable

Hi,

Is there a filter I could wrap around my classifier that would transform my class variable into a certain range, and then reverse-transform the predicted value back to the original scale?
For example, I would like to apply a log function to my class variable, but some of the values are negative. Is there a filter that would figure out the minimum value to add to the class variable to make it positive, apply the log transform, and then reverse the process after prediction? If I have to do it myself, what is the best way to do this? Saving the min value as an attribute and then referencing it later?
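
(For reference, a minimal plain-Weka sketch of doing this by hand, assuming a numeric class loaded from an ARFF file; the file name and the LinearRegression base learner are placeholders. The shift value would need to be kept alongside the model and reused on any new data.)

Code:

import weka.classifiers.Classifier;
import weka.classifiers.functions.LinearRegression;
import weka.core.Instance;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class LogClassSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder data set with a numeric class in the last column.
        Instances data = DataSource.read("mydata.arff");
        data.setClassIndex(data.numAttributes() - 1);

        // 1. Find the minimum class value and a shift that makes every value positive.
        double min = Double.POSITIVE_INFINITY;
        for (int i = 0; i < data.numInstances(); i++) {
            min = Math.min(min, data.instance(i).classValue());
        }
        double shift = (min <= 0) ? 1.0 - min : 0.0;

        // 2. Copy the data and replace the class with log(value + shift).
        Instances transformed = new Instances(data);
        for (int i = 0; i < transformed.numInstances(); i++) {
            Instance inst = transformed.instance(i);
            inst.setClassValue(Math.log(inst.classValue() + shift));
        }

        // 3. Train on the log-scaled class (placeholder learner).
        Classifier model = new LinearRegression();
        model.buildClassifier(transformed);

        // 4. Predict and undo the transform: exp(prediction) - shift.
        double raw = model.classifyInstance(transformed.instance(0));
        double prediction = Math.exp(raw) - shift;
        System.out.println("Back-transformed prediction: " + prediction);
    }
}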

Thanks,
Sunil

Best practice tip

Hi all,
I'm about to start a new project using an old version of Pentaho EE: v4.6.


I need a tip to start in the right direction, so let me describe the work I'm about to begin.


I have to build a near real-time BI solution that takes a JMS queue as input, consuming about 1,000 records per minute; I then have to aggregate the input data in a time-based window (e.g. 3-5 minutes).


Now the question: in your opinion, is it better to store the data in an intermediate DB and then read it back for visualization, or to display the data directly on the dashboard by creating a report with PDI as the data source and refreshing the dashboard every minute?

THANKS

How to migrate dashboards from different servers?

Hi, I wonder if anyone has ever read an article on migrating dashboards from one server to another. What would be the best way to do this migration? Today I'm working with Pentaho in development and production environments. I did a test by just copying the folder, but I had to restart the service to be able to see the changes. I would appreciate the community's help!

Dashboards.context.user no longer works

I'm using CTools and Pentaho 4.8. I have the following error: Dashboards.context.user in JavaScript returns null. Why? How can I fix this?
This error occurs (I think, but I'm not sure) after I enable some of the links in the image (link to http://localhost:8080/pentaho/Publish?publish=now).

What setting should I modify or set?

Thanks in advance !
Ettore

Executing a field as a user defined java expression

So, dumb question time.

Say that I've built up a string in a field that's a valid Java expression. For example, I've built up a field to contain the string

Code:

(someField == null || someField.isEmpty() || someField.startsWith("8872")) ? otherField.concat(someOtherField.substring(Integer.parseInt(startValue), Integer.parseInt(endValue))) : someField.concat(aDifferentField)
and now I want to execute that as an expression in a User Defined Java Expression step. (Yes, someField, someOtherField, otherField, aDifferentField, "8872", startValue and endValue are all arbitrary fields that are unknown until run time.) Is there a way to do that?
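
(For reference: as far as I know, the User Defined Java Expression step compiles its expression when the transformation initialises, so a field-supplied expression would not be evaluated as code there. One possible route, sketched below under that assumption, is to drive Janino, the expression library PDI ships with, yourself, for example from a User Defined Java Class step; the field names and values are the hypothetical ones from the example above.)

Code:

import org.codehaus.janino.ExpressionEvaluator;

public class DynamicExpressionSketch {
    public static void main(String[] args) throws Exception {
        // The expression arrives as data, e.g. read from an upstream field.
        String expression =
            "(someField == null || someField.isEmpty()) "
            + "? otherField.concat(someOtherField) : someField";

        // Declare the variables the expression may reference, with their types.
        ExpressionEvaluator ee = new ExpressionEvaluator();
        ee.setParameters(
            new String[] {"someField", "otherField", "someOtherField"},
            new Class[] {String.class, String.class, String.class});
        ee.setExpressionType(String.class);
        ee.cook(expression); // compiles the expression text at runtime

        // Evaluate it against one row's values.
        Object result = ee.evaluate(new Object[] {"", "other", "More"});
        System.out.println(result); // -> "otherMore"
    }
}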

Iframe error

Hi,

I'm trying to use:

<iframe src="http://localhost:8080/pentaho/content/pentaho-cdf-dd/Render?solution=NOLA&path=&file=nola.wcdf&userid=public&password=public" width="100%" height="500" frameborder="0" scrolling="no"></iframe>

But I'm getting this error:

Failed
No class registered for id pentaho-cdf-dd
Server Version: Pentaho Platform Core 5.0.1-stable.-1

I'm upgrading to version 5, and this works in my previous 4.5 version.

Tks
Emannuel

Table Component - values/numbers appear below bar

Hi All,

I am using the Table Component of CDE. Everything is fine except that the values/numbers show below the bars.
Please see the attached screen capture.

table_component.jpg

Please suggest which property I should use to place the numbers on top of the bars.

I am using Pentaho CE 5.1.

Thanks,
Prakash

adding new datasource

I have a problem with Pentaho Community business analytics: whenever I try to create a new data source over MySQL (JDBC), the test succeeds, but when I create it, it says: "Connection is not a valid connection. Are you sure you want to save this connection?".

I already put the JDBC driver in the right folder.
How do I solve it?

classification full java example

Hi,

I am a Java developer. I would like to classify some texts using Naive Bayes: train the algorithm and then just classify my text.
I have spent some days trying to do it with the Weka project, but I cannot find a full example of how to do it. I found many examples, but the documentation is incomplete.

I would like to do something like this:

String[] inputText = {"hey, buy this from me!", "do you want to buy?", "I have a party tonight!"};
String[] inputClasses = {"spam","spam","no spam"};

TextClassifier classifier = new TextClassifier();
classifier.learn(inputText, inputClasses);

String testText = "you want to buy from me?";
String result = classifier.classify(testText);

Do you know of a really complete example of how to solve a simple problem like that?
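
(For what it's worth, here is a rough self-contained sketch of how that pattern maps onto Weka's real API, using FilteredClassifier to wrap a StringToWordVector filter around NaiveBayesMultinomial; the class names are standard Weka, the toy data is the example above.)

Code:

import java.util.ArrayList;
import java.util.Arrays;

import weka.classifiers.bayes.NaiveBayesMultinomial;
import weka.classifiers.meta.FilteredClassifier;
import weka.core.Attribute;
import weka.core.DenseInstance;
import weka.core.Instances;
import weka.core.Utils;
import weka.filters.unsupervised.attribute.StringToWordVector;

public class TextClassifierSketch {
    public static void main(String[] args) throws Exception {
        String[] inputText = {"hey, buy this from me!", "do you want to buy?", "I have a party tonight!"};
        String[] inputClasses = {"spam", "spam", "no spam"};

        // Data set with one string attribute (the raw text) and a nominal class.
        ArrayList<Attribute> attrs = new ArrayList<Attribute>();
        attrs.add(new Attribute("text", (ArrayList<String>) null));        // string attribute
        attrs.add(new Attribute("label", Arrays.asList("spam", "no spam")));
        Instances train = new Instances("texts", attrs, inputText.length);
        train.setClassIndex(1);

        for (int i = 0; i < inputText.length; i++) {
            double[] vals = new double[2];
            vals[0] = train.attribute(0).addStringValue(inputText[i]);
            vals[1] = train.attribute(1).indexOfValue(inputClasses[i]);
            train.add(new DenseInstance(1.0, vals));
        }

        // StringToWordVector turns text into word features; FilteredClassifier
        // applies the same filter consistently at training and prediction time.
        FilteredClassifier classifier = new FilteredClassifier();
        classifier.setFilter(new StringToWordVector());
        classifier.setClassifier(new NaiveBayesMultinomial());
        classifier.buildClassifier(train);

        // Classify a new text: one-row data set with the same structure, class missing.
        Instances test = new Instances(train, 0);
        double[] vals = new double[2];
        vals[0] = test.attribute(0).addStringValue("you want to buy from me?");
        vals[1] = Utils.missingValue();
        test.add(new DenseInstance(1.0, vals));

        double predicted = classifier.classifyInstance(test.instance(0));
        System.out.println(train.classAttribute().value((int) predicted));
    }
}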

Thanks for help

Insert problem with primary key in PSQL

Hi!!
I am having trouble inserting data into a PostgreSQL database due to primary key issues. I am using an "Execute SQL script" step and I would like to skip inserting a row if it already exists in the table (or ignore the error). I have tried some things but they do not work:

1. Using a Table Output step with "ignore errors" selected and the database fields defined. The job executes many times and therefore runs this step many times, and the previous information is erased (even though I have not selected the truncate table option).

2. The following script:

//Script here
var sqlDIM_LOC= "INSERT INTO dim_loc ( ccaa, province, municipality, localization, road, way)"+
"SELECT '"+ccaa+"','"+province+"','"+municipality+"','"+localization+"','"+road+"',"+way+" "+
"WHERE NOT EXISTS (SELECT localization,road,way FROM dim_loc WHERE "+
"localization='"+localization+"' AND "+
"road='"+road"' AND "+
"way="+way+")";

Kettle says there is a missing ';'. Even so, I have tried the statement in psql with some data: it executes, but returns an error due to the primary key, so I suppose it would return an error in Kettle too.
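
(For comparison, here is a minimal JDBC sketch of the same insert-if-not-exists statement written with a PreparedStatement, which avoids the quoting and string-concatenation pitfalls; the connection URL, credentials and sample values are hypothetical, the table and column names come from the script above, and way is assumed to be numeric as in the unquoted concatenation above.)

Code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class DimLocInsertSketch {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:postgresql://localhost:5432/dwh"; // hypothetical database
        try (Connection con = DriverManager.getConnection(url, "user", "password")) {
            String sql =
                "INSERT INTO dim_loc (ccaa, province, municipality, localization, road, way) " +
                "SELECT ?, ?, ?, ?, ?, ? " +
                "WHERE NOT EXISTS (SELECT 1 FROM dim_loc " +
                "WHERE localization = ? AND road = ? AND way = ?)";
            try (PreparedStatement ps = con.prepareStatement(sql)) {
                ps.setString(1, "Madrid");   // ccaa (sample values)
                ps.setString(2, "Madrid");   // province
                ps.setString(3, "Madrid");   // municipality
                ps.setString(4, "Centro");   // localization
                ps.setString(5, "Gran Via"); // road
                ps.setInt(6, 1);             // way
                ps.setString(7, "Centro");   // NOT EXISTS: localization
                ps.setString(8, "Gran Via"); // NOT EXISTS: road
                ps.setInt(9, 1);             // NOT EXISTS: way
                ps.executeUpdate();          // inserts 0 or 1 rows
            }
        }
    }
}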

Any idea?

General error, I can't update dimension

Hi all,
I cannot update my PostgreSQL dimension table; I'm getting the following error:

2014/03/07 16:15:10 - DIM_CONTAS - Dispatching started for transformation [DIM_CONTAS]
2014/03/07 16:16:14 - DIM_CONTAS - DIM_CONTAS
2014/03/07 16:16:14 - DIM_CONTAS - DIM_CONTAS
2014/03/07 16:16:20 - Spoon - The transformation has finished!!
2014/03/07 16:16:20 - DIM_CONTAS - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Errors detected!
2014/03/07 16:16:24 - DIM_CONTAS - ERROR (version 4.4.0-stable, build 17588 from 2012-11-21 16.02.21 by buildguy) : Errors detected!

It seems like the error is in the Insert/Update step. I have a column whose update flag I set to N and it doesn't work; if all the columns are set to Y or N it still doesn't work.

Where am I going wrong?

Any help please.
Thank you

Is it possible to distribute rows to parallel transformations in a job?

I have a job that calls a transformation to get a list of table names whose data I want to extract to text files, and I am trying to execute 5 simultaneous exports at a time. My first transformation gets the list of table names, adds a repeating sequence of 1-5 and copies the rows to the result. The next steps in the job are 5 transformations set to run in parallel, each with a CASE step that evaluates for 1, 2, 3, 4 or 5 and then executes a Table Input step with output to a Text File. The problem I am having is that the Table Input step always runs after the CASE step, no matter what. It's the strangest thing. I have all 5 of these transformations set to run for every input row.

That got me to thinking... there has to be an easier way to distribute this original list of table names to a set number of transformation copies to do my exports. Partitioning maybe? Not sure... Any advice?

problem with kettle-engine.jar file.

Hi All,


I have created a job in Kettle and am trying to integrate it with my existing Java application, but things are not going well with the kettle-engine jar file.
When I put kettle-engine.jar on the classpath or in the lib folder, some strange messages show up in the console/log when I compile my application.
Please find error.log attached.

If I don't use kettle-engine.jar, the build flows through the further stages properly but reports that the classes in the package org.pentaho.di.trans.* are not found; please find error2.log attached.

I have tried two different versions of the library, kettle-engine.jar (version 4.4) and kettle-engine-5.0.1-stable.jar (5.0.1 stable), but still end up with the same error.

Is there any other way we can avoid this error?
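
(For reference, the usual way to launch a .kjb from Java with the PDI 5.x API looks roughly like the sketch below; the file path is hypothetical. Note that embedding PDI generally needs the full set of jars from data-integration/lib on the classpath, not kettle-engine.jar on its own, which may be related to the errors here, though that is only an assumption.)

Code:

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.job.Job;
import org.pentaho.di.job.JobMeta;

public class RunKettleJob {
    public static void main(String[] args) throws Exception {
        // Initialise the Kettle environment (plugins, variables, ...).
        KettleEnvironment.init();

        // Load the job definition from a .kjb file (hypothetical path, no repository).
        JobMeta jobMeta = new JobMeta("/opt/Pentaho/jobs/my_job.kjb", null);

        // Run the job and wait for it to finish.
        Job job = new Job(null, jobMeta);
        job.start();
        job.waitUntilFinished();

        if (job.getErrors() > 0) {
            System.err.println("Job finished with errors");
        }
    }
}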

Regards,
S.A. Mateen

Can't load jobs/transformations via the command line on the server after moving the files

So in the data-integration directory, I have a projects/tutorial/ directory that contains all my jobs & transformations.

I moved them from my machine to our remote server, which will house the jobs & transformations and run them via cron against our databases.

The installation etc. are all in the same directory layout (/opt/Pentaho/data-integration/, then projects/tutorial/).

I tried using Kitchen (the server doesn't have a GUI) to run the jobs like this, and also tried importing, to no avail:

./kitchen.sh -dir=/opt/Pentaho/data-integration/projects/tutorial/ -job=/opt/Pentaho/data-integration/projects/tutorial/Main\ Job.kjb

./import.sh -filedir=projects/tutorial/ -dir=projects/Tutorial/

None of these work. Since I can't use Spoon on the server, I'm stuck as to what I can do; I tried reading the docs and they don't really help either. What am I doing wrong?

BatchUpdateException

Hi all,

I have a transformation that reads a CSV file, but when the final step executes to insert the data into a database, the transformation fails with the attached log file.



I executed one of the SQL statements from the insert step manually and it ran without any errors.

Why am I getting these errors? Can someone explain to me what this means?


Thanks,

Ron

Date to milliseconds

Hi ,

I am reading data from a CSV and want to convert a date (dd-MM-yyyy HH:mm:ss) into milliseconds (e.g. 1394235596333).

As discussed in an earlier thread (http://forums.pentaho.com/showthread...e-to-Timestamp), I converted the date into an Integer so that it converts to milliseconds.

When I preview this step, I see the value printed as dd-MM-yyyy HH:mm:ss1394235596333.

Any idea how to convert the date to milliseconds?
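
(For reference, a standalone sketch of what "date to milliseconds" means here, outside PDI: parse the dd-MM-yyyy HH:mm:ss text and take the epoch time in milliseconds. The sample value is made up; inside a transformation the equivalent is usually a metadata change of the Date field to an Integer, which is presumably what the earlier thread describes.)

Code:

import java.text.SimpleDateFormat;
import java.util.Date;

public class DateToMillis {
    public static void main(String[] args) throws Exception {
        SimpleDateFormat fmt = new SimpleDateFormat("dd-MM-yyyy HH:mm:ss");
        Date d = fmt.parse("08-03-2014 00:19:56"); // sample input value
        long millis = d.getTime();                 // ms since the epoch (timezone-dependent)
        System.out.println(millis);
    }
}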

Have attached the files for reference.

TIA,
Madhu

