Channel: Pentaho Community Forums
Viewing all 16689 articles

Why doesn't the plugin icon image show up?

The plugin JAR file, the icon image file, and the plugin XML files are in the plugins/steps folder, but in Kettle the image doesn't show up. What could possibly be wrong here? Note: all the other plugins are working properly except the new one.

thanks

Pentaho User Console partially loaded

Hello,

The Pentaho User Console stops loading after I log in. A couple of days ago it loaded normally, but it doesn't anymore, although it's still a fresh install.
I tried to read some log files, but I don't know exactly what I should be reading!
I downloaded the 6.1 version, but it's still the same thing.

Is this a common problem?

And is it normal that it takes the server almost four minutes to load ?

Thank you.

2016-08-20_00-29-55.jpg
Attached Images

Visualization for MySQL

I have MySQL in my application and am currently using FusionCharts. I read on some forums that Pentaho offers flexible visualization (with drag and drop).
Can someone on this forum advise me which product I should download from the Community Edition to run visualizations against a MySQL database?

Thanks in Anticipation

Doubt with MDX Query in Pentaho

Hi All,


I'm new to Pentaho and I'm having a lot of problems with MDX. I want to create a dashboard in Pentaho with 'DATE_FROM' and 'DATE_TO' parameters.
As I have no MDX knowledge, I installed Saiku in my BI Server and use it to create the MDX query.


This is a simple example of a query for all dates:


WITH
SET [~COLUMNS] AS
{[view.all].[view].Members}
SET [~ROWS] AS
{[time.all].[date].Members}
SELECT
NON EMPTY CrossJoin([~COLUMNS], {[Measures].[sessions]}) ON COLUMNS,
NON EMPTY [~ROWS] ON ROWS
FROM [VisionGeneral]


But as I want to introduce 'DATE_FROM' and 'DATE_TO' as parameters, I don't know how to do it. For example, I tried to set it in Saiku and the result was the following.


WITH
SET [~COLUMNS] AS
{[view.all].[view].Members}
SET [~ROWS] AS
{[time.all].[2015].[01].[20150101], [time.all].[2015].[01].[20150102], [time.all].[2015].[01].[20150103], [time.all].[2015].[01].[20150104], [time.all].[2015].[01].[20150105], [time.all].[2015].[01].[20150106], [time.all].[2015].[01].[20150107], [time.all].[2015].[01].[20150108]}
SELECT
NON EMPTY CrossJoin([~COLUMNS], {[Measures].[sessions]}) ON COLUMNS,
NON EMPTY [~ROWS] ON ROWS
FROM [VisionGeneral]


But with this I can't introduce the two parameters that I want.
In the last example of the MDX query, DATE_FROM=20150101 and DATE_TO=20150108.


I used to work with SQL, where this situation was very simple to solve with a WHERE clause using BETWEEN DATE_FROM AND DATE_TO, but in MDX it is very difficult for me to find the solution.
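For what it's worth, the closest MDX analogue of SQL's BETWEEN is the range operator `:`, which returns all members between two members of the same level, inclusive. A hedged sketch (assuming the member names follow the pattern above, and that the dashboard substitutes ${DATE_FROM}/${DATE_TO} as plain text before Mondrian parses the query):

```
WITH
SET [~COLUMNS] AS
    {[view.all].[view].Members}
SET [~ROWS] AS
    -- ':' returns all members between the two endpoints, inclusive
    {StrToMember("[time.all].[2015].[01].[${DATE_FROM}]") :
     StrToMember("[time.all].[2015].[01].[${DATE_TO}]")}
SELECT
NON EMPTY CrossJoin([~COLUMNS], {[Measures].[sessions]}) ON COLUMNS,
NON EMPTY [~ROWS] ON ROWS
FROM [VisionGeneral]
```

Note that both endpoints must sit at the same level; if the range can cross months, the year/month parts of the member path would need to become parameters too.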


I hope somebody can help me, because I have spent a lot of time searching the internet without finding a solution, and I really need it for my final degree project.


Thanks in advance and regards

CCC treemap root category legendclick

Hi all,

I have a CCC treemap chart in my dashboard and observed that when legendClickMode is set to true, clicking on the rootCategoryLabel in the legend toggles the entire chart off as expected; however, I am not able to click the legend again to toggle it back.

Meaning, once the chart becomes invisible after I click the root category label in the legend, there is no way I can make the chart visible again. I see this behavior on http://community.pentaho.com/ctools/ccc/#section=charts as well.

Can someone please help me to either disable the root category all together or override this functionality?

Thanks.

Anti Join

Hello,

is there a way to perform an Anti Join in PDI?
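For reference, an anti join keeps the rows of one stream that have no match in the other. PDI has no single "Anti Join" step, but the usual pattern (a sketch — table and field names below are placeholders) is a Merge Join step with join type LEFT OUTER on the key (both inputs sorted on that key), followed by a Filter Rows step keeping rows where the right-hand key field is null — the stream equivalent of this SQL:

```sql
-- Keep rows of table_a that have no matching key in table_b.
SELECT a.*
FROM table_a a
LEFT JOIN table_b b
  ON a.key_col = b.key_col
WHERE b.key_col IS NULL;
```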

thanks in advance

Doubt using an Excel template containing a subgroup with the Excel Writer

Hi ALL,

I am trying to get data from a database and put it into an Excel file; for this I am using Pentaho Data Integration (Kettle). I am using an Excel template file which has a subgroup in it, and I want the Excel Writer to maintain the subgroup when it writes to the Excel file. Please tell me how to achieve this subgrouping in the output Excel file. I kindly request your help.


Thanks in advance.

Data Integration for eBay API using data integration transformations

Team,

I have created new Kettle transformations to get data from the eBay API. They can be used by small, midsize, or large merchants who do business with eBay and want to integrate eBay data into their database.

I have examples for:


1) Trading API
2) Finding API
3) Shopping API

The README.md has all the information.

https://github.com/shopatlemo/Pentaho_ebay_Dev_API

Karthik

Setting up CDE 6 to access image files?

Hi guys. I'm a newbie to CDE. I'm having unexplained problems adding an image (logo) to a CDE dashboard after trying different options for the past several days. Googling doesn't really show me a clear and consistent way to set up and display the image properly; the behavior I see on the CDE dashboard is very erratic.

Since I am not familiar with CSS yet, I'm trying to use "HTML" or "Add Image" instead. When using HTML, here is the value that I used:

<a>
<img src="http://ip_address:8080/pentaho/api/repos/%3Ahome%3Ausername%3ADashboard_Test1.wcdf/images/logo_xxx_Philippines.png" width="160" height="160" style="vertical-align:middle"/>
</a>

Do I really have to put in the full HTTP link?

If I move this finalized dashboard to the Public folder, I suppose the above HTTP link would have to change to point correctly to the PNG file. The same goes for changing the wcdf filename.

As for the "logo_xxx_Philippines.png" file, where do I copy it to in the biserver folder? If I use the browse/upload function, the PNG file doesn't show up in the directory list. I have already tried copying the PNG file to several other folders (in both the home and public folders) with "images" folder names.


If I use the "Add Image" function, I don't know how to reference the URL or which biserver folder to copy the logo PNG file to.

Can I use something like this?
<a>
<img src="/images/logo_xxx_Philippines.png" width="160" height="160" style="vertical-align:middle"/>
</a>

As for the other objects in CDE, I don't have any problem. I think CDE should have a properly documented way of explaining how to add an image file for use in CDE.


By the way, after deleting the PNG file from wherever I had copied it, the logo image still shows on preview, which seems odd to me. Even overwriting the same PNG filename with another image doesn't display the new image. Is this a cache problem needing a refresh?

Please advise.

Error while dynamically configuring the offset for ADD SEQUENCE step

officeDev.kjb, officepart1.ktr, officepart2.ktr

I have created two transformations, officepart1.ktr and officepart2.ktr, in officeDev.kjb. In officepart1, I set up the variable by getting max_id from the table.
In officepart2, using an Add Sequence step, I add the surrogate key, where 'start at value' is ${ID}. But when I run the job, it shows an error at the Add Sequence step:
Counter value for start could not be parsed, original value is [${ID}] which becomes [ 0]. Error: For input string: " 0".
Can someone tell me the reason behind this?
Attached Files

REST services

I'm not really sure if this question goes here or in the general topics but here goes:

Is it possible to view/refresh a report, with or without parameters, using REST services?
I'm asking here because I've searched for quite a bit now and can only find outdated or partial documentation on the subject.
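For what it's worth, recent BI Server versions expose reports through the repos REST API: a generatedContent call renders a .prpt, with report parameters passed as ordinary query-string arguments. A sketch — the server URL, repository path, and the "Region" parameter name below are assumptions, not real values; output-target selects the rendered format:

```shell
# Hypothetical server and report path -- adjust to your installation.
BASE="http://localhost:8080/pentaho"
# Repository paths use ':' as the path separator in the repos API.
REPORT=":home:admin:sales.prpt"
# Report parameters (here a hypothetical "Region") go in the query string.
URL="$BASE/api/repos/$REPORT/generatedContent?output-target=table/html;page-mode=page&Region=EMEA"
echo "$URL"
# Render it (the API requires authentication):
# curl -s -u admin:password "$URL" -o report.html
```

Re-requesting the same URL effectively refreshes the report with the current parameter values.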

Customize CST interface: Community Startup Tabs

Hello ;
I would like to customize my CST Launcher interface. Any ideas or examples of how to do it?

Spoon malfunction

Hi everybody,

I designed a transformation using Spoon. Okay.
I started the transformation; something did not work and it hung up. Not okay, but a Spoon relaunch should help, and I would accept that.
However, I cannot restart Spoon.
Only a reboot helps.
Spoon will not restart from a new command line (including set-pentaho-env).
Spoon will not restart after logging out, logging back on, and opening all the windows again.

Is there anybody out there who knows a trick to recover (delete a file, reset a path, or whatever it might be)?

System: Windows 7
JRE: 1.7
spoondebug:
14:34:35,542 ERROR [BlueprintContainerImpl] Unable to start blueprint container for bundle pentaho-big-data-impl-shim-hdfs
org.osgi.service.blueprint.container.ComponentDefinitionException: Error when instantiating bean .component-1 of class com.pentaho.big.data.bundles.impl.shim.common.ShimBridgingClassloader
...
Caused by: java.lang.NoClassDefFoundError: org/pentaho/di/core/hadoop/HadoopConfigurationListener

Idea: could this mess be related to a memory shortage? There are only a few (6-7) GB left.

Hoping for an answer
Martin vls

Problem with JSON

Hi,
I'm trying to read, from a CSV file, a column that contains an entire JSON document... but even though I've set the length of the string to 9999 characters, Kettle reads only the first ~300 characters.

Is there some option that I am forgetting to set before reading it?
This is not my first transformation... I've used Kettle for a couple of years and have never experienced anything similar.

Thank you so much.

Execute the same JOB, multiple sources, trying for high queries per second - Help

I have a question. This is more of a question about how Pentaho works, before I go out and build some new jobs and processes to address my problem. ANY HELP IS APPRECIATED.

I have a JOB that grabs data, sends that data to results, and then passes those parameters/variables via "execute for each row" to another JOB that executes a REST API query for each row, retrieving the results and then writing them down. Everything works fine, no errors, and the data is accurate. The way my JOB works right now, I send one "line" result at a time to the job, it spits out the results, then executes the next "line", each loop taking about as long as the query itself, 200-300 milliseconds. I have measured the REST API query to average 200 to 300 milliseconds, and since the source of the API is not under my control, I cannot change or improve the performance of the API. The destination of this API can handle lots of queries per second — in fact it handles 1400-3600 per second right now — but the results per query are always returned in 200 to 300 milliseconds.

SO... my 2k-line source took 27 minutes. Not good, since I really need to process about 1 million lines a day. This is a question of Pentaho design: how do I deal with this problem and speed up my process?

The 1st thing I was thinking is that I could actually BUILD more JOBs with the same capabilities and split up the source to be able to query more times per second. If I build 2 JOBs, split the source, and execute across the two JOBs, I'd cut my time in half, etc.

What I really want is to keep that one unique JOB that works fine, but execute that same exact JOB as fast as I can against the source for more efficiency. I don't know whether I can invoke the same JOB from multiple sources at the same time and keep data continuity.

All of this to ask: can I execute the same JOB, with passed-through parameters/variables, multiple times at the same time from multiple sources, without a data integrity/loss issue? If I can't, how have others dealt with similar problems?

Replacing a string value with another string value

hi everyone,

I am a beginner in Pentaho, but I have experience in Oracle (SQL). Anyway, I was trying to replace the values of a column with other values, or trim part of a string value.

I tried many things that Pentaho ignores, such as replace(), regexp_replace(), decode(), case, substr and other SQL statements.

Please help me out :)
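In PDI itself, the dedicated "Replace in string" step usually covers this without any code. If you prefer code, the Modified Java Script Value step accepts plain JavaScript; a minimal sketch follows, where the field name `status` and the patterns are hypothetical examples:

```javascript
// Sketch for a Modified Java Script Value step; in PDI, "status" would
// be an input field of the row rather than a literal.
var status = "ACT_USER_01";

// Replace a literal substring (Oracle REPLACE equivalent):
var replaced = status.replace("ACT", "ACTIVE");

// Regex replace (Oracle REGEXP_REPLACE equivalent) -- strip a numeric suffix:
var stripped = status.replace(/_[0-9]+$/, "");

// Trim part of the string (Oracle SUBSTR equivalent) -- keep the first 3 chars:
var prefix = status.substring(0, 3);
```

Each new variable becomes an output field when added to the step's Fields grid.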

Create conditional calculated measure

Dear all,


I am trying to build a particular chart... I need to filter on measures (beyond just selecting "Greater than"/"Less than" some constant). That is why I am trying to do it through an MDX expression.


I created an example where I have 8 rows and, for each one, I have columns QLIQi and QREFi. I want to show in one chart the sum of (QLIQ-QREF)i where QLIQi > QREFi.


I tried to create a calculated measure using something like this, but it does not work:



IIf ([QLIQ] > [QREF], [QLIQ]-[QREF], 0)


Could anyone please advise me on what my mistake is?

Thanks.
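One common cause, offered as an assumption since the schema isn't shown: inside a calculated member, measures must be referenced through the [Measures] dimension, i.e. [Measures].[QLIQ] rather than [QLIQ]. A hedged sketch ([YourCube] is a placeholder for the actual cube name):

```
WITH MEMBER [Measures].[QDIFF] AS
    IIf([Measures].[QLIQ] > [Measures].[QREF],
        [Measures].[QLIQ] - [Measures].[QREF],
        0)
SELECT {[Measures].[QDIFF]} ON COLUMNS
FROM [YourCube]
```

Note that a plain IIf is evaluated in the current cell context; summing (QLIQ-QREF) only over the rows where the condition holds would additionally need something like Sum(Filter(...), ...) over the row set.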

Why use Pentaho Data Integration instead of pl/pgsql?

Dear community,

We're building a new DWH from scratch. The technology decision has not been made yet. C-level tends toward doing the transformation from the staging area to the DWH layer directly in the Postgres DB via pl/pgsql, to keep it as lean as possible. As a Business Intelligence developer who has worked with PDI (Kettle) before, I tend to prefer PDI, as I think it would be easier to scale and easier to develop, maintain, and deploy.

Now the question is: what other reasons are there, from your point of view, to use PDI instead of pl/pgsql?

Parallelization
Performance (has to be benchmarked anyway)
Maintenance and independence in the future (is it easy to find a new pl/pgsql dev?)
...

Thanks for your input.

Charting by passing 2 parameters from CDE dashboard

Hi Guys,

Newbie here. I'm trying to pass 2 parameters from the dashboard to the query for creating a chart (using MDX over mondrianJndi). Passing 1 parameter always works easily, but I've heard that passing 2 parameters doesn't work (or is tricky).

I'm not sure if this is the right thread to ask this, but can somebody help me with my query?

***** This single parameter passing works *****
WITH
SET [~ROWS_CaseStatus] AS
{Descendants([CaseStatus].[All Case Statuses] ,[CaseStatus].[Status])}
SELECT
NON EMPTY {[Measures].[CaseCount]} ON COLUMNS,
NON EMPTY {[~ROWS_CaseStatus]} ON ROWS
FROM [SummaryOfCasesReported]
where ${regionalOfficeParam}


***** While trying to pass two parameters doesn't work *****
WITH
SET [yearSelect] AS
'StrToSet("${yearParam}", [PeriodOfDateReported])'
SELECT
NON EMPTY({[Measures].[CaseCount]}) ON COLUMNS,
NON EMPTY({Descendants([CaseStatus].[All Case Statuses] ,[CaseStatus].[Status])}) ON ROWS
from [SummaryOfCasesReported]
where crossjoin({[RegionalOffice].[${regionalOfficeParam}]}, {[yearSelect]})


***** or this *****
SELECT
NON EMPTY({[Measures].[CaseCount]}) ON COLUMNS,
NON EMPTY({Descendants([CaseStatus].[All Case Statuses] ,[CaseStatus].[Status])}) ON ROWS
from [SummaryOfCasesReported]
where {[RegionalOffice].[${regionalOfficeParam}]} * {[PeriodOfDateReported].[${yearParam}]}



Do note that "yearParam" is a 4-digit integer for the year (YYYY). The value comes from the CALENDAR_YEAR field (integer) of my DIM_DATE table. The FACT_cases table (SummaryOfCasesReported) has a date_key (integer) that links to the DIM_DATE table.

While the "regionalOfficeParam" is varchar.

Does anyone have an idea on how to make this work?

Thanks in advance.
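Two parameters generally can work; the usual stumbling block is that after ${...} substitution the literal text must still parse as valid member references. One hedged pattern (assuming CDE substitutes the parameters as plain text, and that members like [PeriodOfDateReported].[2015] exist at the substituted values) builds each member with StrToMember and crossjoins them in the WHERE clause:

```
SELECT
NON EMPTY {[Measures].[CaseCount]} ON COLUMNS,
NON EMPTY {Descendants([CaseStatus].[All Case Statuses], [CaseStatus].[Status])} ON ROWS
FROM [SummaryOfCasesReported]
WHERE CrossJoin(
    -- each StrToMember turns the substituted text into a single member
    {StrToMember("[RegionalOffice].[${regionalOfficeParam}]")},
    {StrToMember("[PeriodOfDateReported].[${yearParam}]")}
)
```

If a query like this still fails, inspecting the query after substitution (e.g. in the CDA previewer) usually shows which member reference is malformed.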

Kettle Java API data integration

Hi Experts,


I've a problem with the following code.
It generates the XML in the required place, but the contents of the XML print like this (below); it does not show the table fields, even though the DB has a record.
How do I fetch the fields for the required table (in this example, eperson)?

If I set the output fields (setOutputFields) to an XMLField, it shows the correct XML, but the table name changes dynamically.
How do I get the table fields dynamically?

Thanks in advance!

XML:

<?xml version="1.0" encoding="UTF-8"?><ROWS>
</ROWS>

code:



package com.htc.ites.configapi;

import java.io.IOException;
import java.io.UnsupportedEncodingException;
import java.util.ArrayList;
import java.util.List;

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.database.DatabaseMeta;
import org.pentaho.di.core.exception.KettleException;
import org.pentaho.di.core.logging.LogChannel;
import org.pentaho.di.core.logging.LogChannelInterface;
import org.pentaho.di.core.logging.LogLevel;
import org.pentaho.di.core.util.EnvUtil;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransHopMeta;
import org.pentaho.di.trans.TransMeta;
import org.pentaho.di.trans.step.StepMeta;
import org.pentaho.di.trans.steps.tableinput.TableInputMeta;
import org.pentaho.di.trans.steps.xmloutput.XMLOutputMeta;

public class XmlGenerate {

    public static void main(String[] args) throws KettleException, UnsupportedEncodingException, IOException {
        TransMeta transMeta = buildCopyTable();
    }

    private static TransMeta buildCopyTable() throws KettleException, UnsupportedEncodingException, IOException {

        EnvUtil.environmentInit();
        KettleEnvironment.init();

        // 1. Create a new transformation.
        TransMeta transMeta = new TransMeta();
        transMeta.setName("Simple");

        // 2. Database connection. Note: the connection must be named, otherwise
        // findDatabase("Postgres") below returns null. Host, database name, port
        // and credentials also need to be set for a real connection.
        DatabaseMeta databaseMeta = new DatabaseMeta();
        databaseMeta.setName("Postgres");
        databaseMeta.setDatabaseType("PostgreSQL");
        List<DatabaseMeta> databaseMetas = new ArrayList<DatabaseMeta>();
        databaseMetas.add(databaseMeta);
        transMeta.setDatabases(databaseMetas);

        // Table input step: fetch rows using a query.
        DatabaseMeta sourceDBInfo = transMeta.findDatabase("Postgres");
        TableInputMeta tableInputMeta = new TableInputMeta();
        tableInputMeta.setDatabaseMeta(sourceDBInfo);
        String selectQuery = "SELECT * FROM eperson";
        tableInputMeta.setSQL(selectQuery);

        StepMeta fromStep = new StepMeta("input", tableInputMeta);
        fromStep.setLocation(150, 100);
        fromStep.setDraw(true);
        fromStep.setDescription("Reads information from table [ eperson ] on database [" + sourceDBInfo + "]");
        fromStep.setStepID("1");
        transMeta.addStep(fromStep);

        // XML output step.
        // Attempt to discover the fields dynamically (does not work as-is):
        // StepMetaInterface metaInterface = fromStep.getStepMetaInterface();
        // RowMetaInterface rowMetaInterface = metaInterface.getTableFields();
        // rowMetaInterface.getFieldNames();

        XMLOutputMeta xmloutputMeta = new XMLOutputMeta();
        xmloutputMeta.setMainElement("ROWS");
        xmloutputMeta.setRepeatElement("row");

        /* Hard-coding the output fields produces correct XML, but the table
           name changes dynamically, so the fields must be discovered at runtime:
        org.pentaho.di.trans.steps.xmloutput.XMLField[] xf = new org.pentaho.di.trans.steps.xmloutput.XMLField[1];
        xf[0] = new org.pentaho.di.trans.steps.xmloutput.XMLField();
        xf[0].setFieldName("eperson_id");
        xmloutputMeta.setOutputFields(xf); */
        xmloutputMeta.setFileName("D:\\person.xml");

        StepMeta toStep = new StepMeta("output", xmloutputMeta);
        transMeta.addStep(toStep);

        // 3. Transformation hop.
        TransHopMeta thm = new TransHopMeta(fromStep, toStep);
        transMeta.addTransHop(thm);

        // Logging.
        LogChannelInterface writeToLog = new LogChannel("sample");
        writeToLog.setLogLevel(LogLevel.DETAILED);

        // Execute the transformation.
        Trans trans = new Trans();
        trans.setLog(writeToLog);
        trans.setTransMeta(transMeta);
        trans.execute(new String[] {});

        return transMeta;
    }
}
Attached Files