Channel: Pentaho Community Forums
Viewing all 16689 articles

CDE: Change the CCC onclick?

Hi guys,

I'm pretty new to the whole Pentaho Suite.
I am developing a Dashboard with three different Line Charts.

I have three columns (headers in HTML with text in them) which are clickable and, for example, change their background color.
When I click on one of these columns, a different line chart should be displayed.
Just like a switch between the different line chart components.

Is this possible without using iFrames?
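One iframe-free pattern is to map each clickable column to a chart component and drive visibility through a dashboard parameter. A minimal sketch of the mapping, where every id and component name is a placeholder of my own invention:

```javascript
// Hypothetical mapping from a clicked column's id to the line chart that
// should be shown; replace the keys and values with your real names.
var chartForColumn = {
  colSales:  'lineChartSales',
  colCosts:  'lineChartCosts',
  colProfit: 'lineChartProfit'
};

function chartToShow(columnId) {
  return chartForColumn[columnId] || null;  // null = no chart for this column
}

// In CDE itself, each header's click handler could then set a parameter and
// let the charts listen to it, roughly (not runnable outside CDF):
//   $('#colSales').click(function () {
//     Dashboards.fireChange('selectedChart', chartToShow('colSales'));
//   });
```

The charts would each listen to that parameter and show or hide themselves, but treat that wiring as an assumption rather than a tested recipe.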

Best regards,

THaupt.

Real time reporting

Hi,

My current project is about real-time reporting, and I am honestly lost. What are the steps? Is there a difference between real-time and periodical reporting in terms of the process?

Please, let me know if you have any ideas.

Thank you in advance.

Fields not available in report when data integration source uses a JSON Input task

PDI 6.1
PRD 6.1

I have a MongoDB database that I'm trying to query and transform to create a report. It seems that any fields I add to the stream through the JSON Input task aren't visible to the report.

Here's a walkthrough of a basic example:
- Have a collection with nested documents
- First MongoDB Input task retrieves OID field and a second field containing the JSON of the nested documents
- Send output to JSON Input task
- Parse JSON in stream to extract additional fields
- Send output to select task
- Remove JSON fields from stream (to clean it up a bit)

Now in PRD
- New report, data integration as data source
- choose transformation created above
- preview query --> receive error
- in console (after changing .bat to use java instead of javaw) - see error about "couldn't find field x in row"
- the fields it always complains about are always the ones that were added to the stream by the JSON Input task

I've been able to avoid this in some cases by not using the JSON Input task and always making sure the MongoDB Input task contains the fields, but this nested document scenario is causing me some grief. I'm currently going to explore using a javascript task to evaluate the json and output the fields I need - and initial tests indicate that fields added to the stream by the javascript task do show up in the report. Wonder if anybody has encountered this - if there's something I'm missing on that particular JSON Input task. It would make my life easier, and the transformation easier to maintain, if I could do all of this in the JSON Input task and not some scripting workaround.
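For what it's worth, the JavaScript-step workaround mentioned above can be sketched like this; the field name `nested_json` and the extracted keys are placeholders, not the actual schema:

```javascript
// Parse the nested-document JSON carried in an incoming stream field and
// return the values that should become new stream fields.
function extractFields(nestedJson) {
  var doc = JSON.parse(nestedJson);
  return {
    item_name:  doc.name  !== undefined ? doc.name  : null,
    item_value: doc.value !== undefined ? doc.value : null
  };
}

// Inside a Modified Java Script Value step this would look more like:
//   var parsed = JSON.parse(nested_json);
//   var item_name = parsed.name;   // declare item_name as an output field
```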

Thanks for sharing any ideas or information you can. This forum has been an amazing resource so far and has helped me work through most issues until now.

Cheers!

Use a variable for Database Name

Reading multiple Excel sheets to one textfile

I have an Excel workbook with department figures from each department on different sheets. I want to read all figures from each sheet and write them to rows in a text file, with the sheet name (department name) as the first column. Is this possible? When I set up an Excel Input and add the selected sheets to the import, I only get one of them, and I also can't figure out how to attach the sheet name as a column in the output text file. All sheets have the same layout; they differ only by sheet name. The output text file should look like this:

Sheetname1;Sheet1:A1;Sheet1:B1
Sheetname1;Sheet1:A2;Sheet1:B2
Sheetname2;Sheet1:A1;Sheet1:B1
Sheetname2;Sheet1:A2;Sheet1:B2
etc.
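If memory serves, the Excel input step can emit the sheet name itself through its additional output fields, though that is worth verifying. Either way, the desired output shape can be sketched as follows, where the workbook object stands in for whatever the input step delivers:

```javascript
// Prefix every row with the name of the sheet it came from and join with ';'.
function rowsWithSheetName(workbook) {
  var out = [];
  Object.keys(workbook).forEach(function (sheet) {
    workbook[sheet].forEach(function (row) {
      out.push([sheet].concat(row).join(';'));
    });
  });
  return out;
}
```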

Component not refreshed in time

Dear Community:

I am using Pentaho CE 6.0 and CDE.

I have a select component with months and a simple variable holding the selected month.
Based on the simple variable month, i refresh a table showing sales data of that month.
The table component has listeners: month and parameters: month, month. The datasource under the table component has also parameters: month, month string.
Whenever my select box changes, my table changes nicely with the selected month data. So far, so good.

However, based on the selected month, I want to hold the quarter in a variable and refresh the sales data of that quarter.
I did this by creating a new simple parameter quarter and a post-change function in the select component for month, doing:

Code:

function f() {
  // 'quarters' is assumed to be an array defined elsewhere, indexed by month number
  Dashboards.setParameter('quarter', quarters[Dashboards.getParameterValue('month') - 1]);
}

Again I created a table component listening to the month variable (if it listens to the quarter variable, it doesn't do anything), with the quarter variable as a parameter and an underlying datasource that also takes quarter as a parameter.

The problem is that my table component updates one click too late. For example: the default selection is March (Q4), and it shows Q4. Clicking on December (Q3), it still shows Q4. Then clicking on November (Q3), it shows Q3.

I find this strange since I also show the content of my quarter variable to the screen, using a text component and a function:

Code:

function() {
  return Dashboards.getParam("quarter");
}

and here the quarter changes nicely whenever a month is selected.
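A common cause of such a one-click lag is that setParameter alone does not notify listeners, while fireChange does. A sketch of that alternative; the calendar-quarter mapping below is an assumption and would need to match the fiscal quarters used above:

```javascript
// Hypothetical month-to-quarter lookup (calendar quarters, months 1-12).
function quarterForMonth(month) {
  return 'Q' + Math.ceil(month / 3);
}

// In the select component's post-change, firing the change notifies the
// quarter listeners immediately instead of waiting for the next click:
//   function f() {
//     var m = parseInt(Dashboards.getParameterValue('month'), 10);
//     Dashboards.fireChange('quarter', quarterForMonth(m));
//   }
```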

Can somebody please guide me on how to solve this?

Dashboard Designer - can it be used outside of PUC?

Can Dashboard Designer be used outside of PUC? I'm using Pentaho EE with CDE and other plugins. I was hoping to call it via an API like we can do with the Analyzer. http://host:port/pentaho/api/repos/xdash/editor does bring up the designer, but the file repository folder list is not available, so we can't insert files or Analyzer reports into it.

RTL

Hi
Does Pentaho Reporting now support RTL? I need to report some Hebrew text.

Thanks

Issue with DBF Input

People, please help.


I'm migrating a database that is stored in DBF files (yes, people still use .dbf). At some point in processing, Kettle terminates the process and returns an error saying it cannot convert a string to a number. The input field is originally numeric; the value causing the error is an exception.

Kettle shows that the value it tried to convert to a number was [$2.55]. Yes, with a dollar sign in front. Analyzing the DBF "inside", I have not found any trace of something that looks like this; however, when I exported the DBF to text, the record with the error appeared. After correcting it, everything works: I located the offending line in the DBF, deleted it, and the transformation ran.

That handmade solution worked. The problem is that I am reading hundreds of these DBF files. So there's the question: how do I process the data passing through the stream before writing it to the destination database? I tried the solution mentioned by Matt in this post, but unfortunately for me it did not work.

The xDbase step does not allow me to change the field type. I would even force it to be a string, but I can't. Putting a Select values step after it to change the metadata to a string doesn't work either: the error occurs before the record leaves the xDbase step and reaches the next one.
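If the field could be brought into the stream as a string, a JavaScript step might clean it before the numeric conversion. A sketch, assuming the stray dollar sign is the only kind of anomaly:

```javascript
// Strip everything except digits, the decimal point and a minus sign, then
// convert; '$2.55' becomes 2.55.
function cleanNumber(raw) {
  return parseFloat(String(raw).replace(/[^0-9.\-]/g, ''));
}
```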

So that's my problem.


Does anyone have an idea? Thank you so much, guys.


Hugs!

Problems with replacing a word

I have one data file in which regions are named (for instance, Quebec), and another file in which Canada is divided into other regions (north, east, central, south and west). I need to combine these two data files and replace the named areas with the geographical position they have in Canada.

What I figured I would do is use the "Replace in string" step. So I specify the fields as follows:
replace.jpg

The output is fine, I guess, but instead of replacing the terms in one stream field, it creates a separate stream field for each replacement. I want it all in one field, however.

replace 2.jpg

As I need to replace a few more terms, I will end up with 5 separate stream fields. I think I will have to merge the fields into one after this step? Is my train of thought right here, or is there another, better way?
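If I remember the step correctly, leaving the "Out stream field" column empty makes "Replace in string" update the in-stream field in place, so several rules can target the same field; that detail is worth verifying. The intended result, with placeholder region names, is simply chained replacements over one value:

```javascript
// Apply every replacement to the same string so the result stays in one field.
var replacements = [
  ['Quebec',  'east'],
  ['Ontario', 'central']
];

function replaceAll(value) {
  return replacements.reduce(function (acc, pair) {
    return acc.split(pair[0]).join(pair[1]);  // replace all occurrences
  }, value);
}
```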

Thanks.

open source Business Intelligence

Popularly regarded as an open source Business Intelligence package, Pentaho has phenomenal ETL, analysis, metadata and reporting capabilities. This BI tool helps customers realize the benefits of big data while offering a cost-effective, agile and productive cloud delivery model.

How to troubleshoot a kitchen job

I am new to Pentaho and am scheduling a job using the Kitchen batch file. One of the jobs failed this morning, and in the logs all I see is

child index = 11, logging object : org.pentaho.di.core.logging.LoggingObject@21956a72 parent=d304120a-4bf8-4671-ae2b-473ee4150411

I know which step it failed at from the previous line, but I don't know how to interpret this error. Is there a way I can find out what happened? I did try to run it in Spoon, and it was successful.

Thanks

Big Data Keeps You Fit and Healthy

Big Data has proved its worth by being used in human health care facilities. It is one of the ways Analytics has made the planet a safer home to live in. Once large numbers of engineers and scientists started working harder, and even smarter, on Big Data, it came to be used in almost all commodities of life to make life much easier. Applications of this technology are assisting almost every industry in every field to grow more competent and gratifying.

The absolute uri: http://www.tonbeller.com/jpivot cannot be resolved

Hi, I've been trying to run a web application in NetBeans, using Mondrian 3.3.0.14703 source files and Apache Tomcat 8.0.27. The project runs normally, but when I try to access any of the menus in the first picture, an error message shows up. Does anybody have any idea how to proceed? Any help will be much appreciated, and thanks for your attention.

Setting fields (key value pairs) as variable name(s) and value(s) in a ktr

Hi guys,

I have a string that contains a series of key/value pairs that need to be set as variables for the next transformation.

The easy bit is to split the field into rows and then split the resulting field into variable/value fields.

What I haven't been able to figure out is how to set these up as variables.

I'd like to be able to reuse this transformation to run other transformations so I can't really "map" the variables in advance.

The downstream transformations may have one variable or 20. This transformation should be able to run both.

I'm not very good with Java so I'd rather not use a java step.
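No Java needed: the parsing half can be plain JavaScript. A sketch, assuming ; separates pairs and = separates key from value; the setVariable call shown in the comment is the Kettle script-step function, but check its scope argument against your version:

```javascript
// Turn "key=value;key=value" into an array of [key, value] pairs.
function parsePairs(s) {
  return s.split(';').map(function (pair) {
    var i = pair.indexOf('=');
    return [pair.slice(0, i), pair.slice(i + 1)];
  });
}

// Inside a Modified Java Script Value step, something like:
//   parsePairs(input_field).forEach(function (p) {
//     setVariable(p[0], p[1], 'r');  // 'r' = root job scope (assumption)
//   });
```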

Any help appreciated.

Thanks

D

Value mapper problem/merging tables problem

I have two different CSV files that I have to connect to each other. Both files contain area descriptions, but in one they are described by province or city names, and in the other by their location in one of the wind directions (north, east, etc.); this second file also allocates manager names to those wind directions (which manager is responsible for which region). So I thought I should use the Value Mapper step and assign the region names in such a way that they would be equal in both, and thereby merge the tables. Unfortunately, it does not work as I hoped. Some of the regions seem to be missing after this step; it is almost as if the mapper is not working properly. I am missing some manager names that I would expect to appear. Is there another function to achieve what I want, or is there an additional step I need to perform?
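The intended join can be sketched as a lookup that also reports unmatched regions, which is usually where the missing managers hide; all names below are placeholders:

```javascript
// Look each region up in the mapping derived from the second file and
// collect the ones that find no manager, so gaps become visible.
function joinRegions(rows, regionToManager) {
  var unmatched = [];
  var joined = rows.map(function (row) {
    var manager = regionToManager[row.region];
    if (manager === undefined) { unmatched.push(row.region); }
    return { region: row.region, manager: manager || null };
  });
  return { joined: joined, unmatched: unmatched };
}
```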

Thanks again for all of your time.

Login issue, please help

Hi all

I am trying to log in to

http://localhost:8080/pentaho/



these were the instructions given:
  1. To start the web interface for the Pentaho BI server, you should open your browser and navigate to the address http://localhost:8080/.
  2. After you launch the browser interface, you should click on Login as an Evaluator. Use the Administrator account to log in. The default page of the platform
  3. When you are finished, you should close the browser or log out. Then you should execute the stop procedure: go to the folder C:\Program Files\Pentaho\biserver-ce and double-click the stop-pentaho.bat file. A command window will open showing some executing commands. Then both command windows will close.

I am unable to log in. I installed the Pentaho BI Server (biserver-ce-5.3.0.0-213.zip) and Pivot4J. Please help.

admin/admin and joe/admin are not working?

login.jpg

Using an xaction to call a Kettle job which contains many transformations

Hello everyone,

I want to use an xaction to call a Kettle job, and this job contains many transformations. But when I run this xaction on BI Server 6.0, an exception occurred.
QQ截图20160509103124.jpg

About my Kettle job and transformations:

1. I use the Kettle variable ${Internal.Entry.Current.Directory} (I also tried ${Internal.Job.Filename.Directory}) to specify the path of the transformations in this job.
2. I upload these files into the same directory on the BI server.
3. It works fine when I use Spoon, but it seems that ${Internal.Entry.Current.Directory} or ${Internal.Job.Filename.Directory} was not replaced correctly when running on the BI server.

below is the log:
2016-05-09 10:30:02,082 INFO [org.pentaho.di] 2016/05/09 10:30:02 - 数据导出 - 开始项[获取参数并设置为变量]
2016-05-09 10:30:02,086 INFO [org.pentaho.di] 2016/05/09 10:30:02 - 获取参数并设置为变量 - Loading transformation from XML file [/home/张三/获取参数并设置为变量.ktr]
2016-05-09 10:30:02,089 ERROR [org.pentaho.di] 2016/05/09 10:30:02 - 数据导出 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : org.pentaho.di.core.exception.KettleException:
2016/05/09 10:30:02 - 数据导出 - Unexpected error during transformation metadata load
2016/05/09 10:30:02 - 数据导出 -
2016/05/09 10:30:02 - 数据导出 - Unable to read file [file:///G:/home/张三/获取参数并设置为变量.ktr]
2016/05/09 10:30:02 - 数据导出 - Could not read from "file:///G:/home/张三/获取参数并设置为变量.ktr" because it is not a file.
2016/05/09 10:30:02 - 数据导出 -
2016/05/09 10:30:02 - 数据导出 -
2016/05/09 10:30:02 - 数据导出 - at org.pentaho.di.job.entries.trans.JobEntryTrans.getTransMeta(JobEntryTrans.java:1260)
2016/05/09 10:30:02 - 数据导出 - at org.pentaho.di.job.entries.trans.JobEntryTrans.execute(JobEntryTrans.java:651)
2016/05/09 10:30:02 - 数据导出 - at org.pentaho.di.job.Job.execute(Job.java:730)
2016/05/09 10:30:02 - 数据导出 - at org.pentaho.di.job.Job.execute(Job.java:873)
2016/05/09 10:30:02 - 数据导出 - at org.pentaho.di.job.Job.execute(Job.java:873)
2016/05/09 10:30:02 - 数据导出 - at org.pentaho.di.job.Job.execute(Job.java:546)
2016/05/09 10:30:02 - 数据导出 - at org.pentaho.di.job.Job.run(Job.java:435)

Can anyone help me? Thanks :)

How to join CSV files (without using DB)?

Hi all!

I have 4 CSV files.
The first file has columns: A,B,C,D
The second file has columns A,B,C,E
The third file has columns A,B,C,F
The fourth file has columns A,B,C,G

I need to create a new output file which has columns A,B,C,D,E,F,G

Need a little help as to how to do it.
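In PDI this is typically a chain of sorted-input Merge join steps keyed on A, B and C; the intended result can be sketched as follows, with illustrative rows:

```javascript
// Merge rows from several files on their shared A,B,C key, accumulating the
// extra columns (D, E, F, G) into one record per key.
function keyOf(row) { return [row.A, row.B, row.C].join('|'); }

function mergeFiles(files) {
  var byKey = {};
  files.forEach(function (rows) {
    rows.forEach(function (row) {
      var k = keyOf(row);
      byKey[k] = Object.assign(byKey[k] || {}, row);
    });
  });
  return Object.keys(byKey).map(function (k) { return byKey[k]; });
}
```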

Thanks in advance.

Processing a GZIP/Deflate XML file from HTTP Post

Hello Community,

I need some help with a process I'm designing that makes an HTTP POST to a web service and receives a gzip file in response. Here you can see the request:

1.jpg

I have been doing some tests using the Postman Chrome extension, which understands and decompresses the file:

3.jpg

However, the HTTP Post step does not understand the answer (see the document field):

2.jpg

In the last step I try to identify the different fields in the XML. I did a test by downloading the XML from Postman and identifying the fields from there, and it worked. However, I need to find a way to do it from Kettle without Postman. Do you know a way to decompress the file before capturing the data stream?

Thanks in advance!

Martin

