Channel: Pentaho Community Forums

Problem sorting members

Hi,
I have a problem when sorting members by description using ordinalColumn or ordinalExpression.
Sorting mostly works, but when accented characters occur in the description, the order is wrong.
I have set the collation to utf8_slovak_ci in the database, and a plain SQL ORDER BY test against the DB sorts correctly.
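
For illustration, this is the kind of query that sorts correctly when run directly against the database (a sketch; the table and column names are made up, the collation is the one from my setup). Whether the same expression helps inside ordinalExpression presumably depends on Mondrian doing the ordering on the SQL side rather than in memory:

Code:

-- Sketch: forcing the Slovak collation explicitly in the ORDER BY.
-- dim_member / member_description are hypothetical names.
SELECT member_description
FROM dim_member
ORDER BY member_description COLLATE utf8_slovak_ci;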


Could you please let me know the solution?


Thanks in advance

Abort low priority 'Dashboards.fireChange' calls

Hi Experts,
I'm developing BI dashboards using the Pentaho CE / CDE tools, and I've defined several KPI / graph components that load at different priority levels.
For every selection change in the dashboard, a 'login_check' component loads first (priority=1) to verify that the login session is still valid, and the other components load afterwards (priority>10). I'm using an overlay to pop up the message "You are no longer logged in" when the login session is not valid. The pop-up works fine as expected, but the other components still load (right after the 'login_check' component) and fail with a 'data not found' error. What I really want is to abort all of these low-priority 'Dashboards.fireChange' calls once the 'login_check' component responds with a failed authentication.
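
One possible shape for this (a sketch, untested; 'sessionIsValid' and the result field are invented names): CDF skips a component whose preExecution callback returns false, so login_check can record its outcome on a shared flag that every low-priority component checks:

Code:

// Sketch: record the login_check outcome on a shared flag.
// 'this.result.authenticated' is a hypothetical result field.
Dashboards.sessionIsValid = true;
login_check.postExecution = function() {
    Dashboards.sessionIsValid = !!(this.result && this.result.authenticated);
};

// In each low-priority component: returning false from preExecution
// makes CDF skip the update, so the 'data not found' errors never fire.
someKpiComponent.preExecution = function() {
    return Dashboards.sessionIsValid;
};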

Thanks!!

-Suresh.

'Channel is not opened' error in SFTP

Hi,

I am trying to fetch some files using SFTP, but the job throws the error below. I can run sftp manually without any problem, but through Pentaho it fails. Any suggestion on how to fix this?

Thanks in advance for your support.
Poulomi

2014/08/19 21:25:00 - Count < 1 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : Error getting files from SFTP :
2014/08/19 21:25:00 - Count < 1 - com.jcraft.jsch.JSchException: channel is not opened.
2014/08/19 21:25:00 - Count < 1 - channel is not opened.
2014/08/19 21:25:00 - Count < 1 - ERROR (version 5.0.1-stable, build 1 from 2013-11-15_16-08-58 by buildguy) : org.pentaho.di.core.exception.KettleJobException:
2014/08/19 21:25:00 - Count < 1 - com.jcraft.jsch.JSchException: channel is not opened.
2014/08/19 21:25:00 - Count < 1 - channel is not opened.
2014/08/19 21:25:00 - Count < 1 -
2014/08/19 21:25:00 - Count < 1 - at org.pentaho.di.job.entries.sftp.SFTPClient.login(SFTPClient.java:165)
2014/08/19 21:25:00 - Count < 1 - at org.pentaho.di.job.entries.sftp.JobEntrySFTP.execute(JobEntrySFTP.java:632)
2014/08/19 21:25:00 - Count < 1 - at org.pentaho.di.job.Job.execute(Job.java:678)
2014/08/19 21:25:00 - Count < 1 - at org.pentaho.di.job.Job.execute(Job.java:815)
2014/08/19 21:25:00 - Count < 1 - at org.pentaho.di.job.Job.execute(Job.java:815)
2014/08/19 21:25:00 - Count < 1 - at org.pentaho.di.job.Job.execute(Job.java:562)
2014/08/19 21:25:00 - Count < 1 - at org.pentaho.di.job.entries.job.JobEntryJobRunner.run(JobEntryJobRunner.java:73)
2014/08/19 21:25:00 - Count < 1 - at java.lang.Thread.run(Thread.java:682)
2014/08/19 21:25:00 - Count < 1 - Caused by: com.jcraft.jsch.JSchException: channel is not opened.
2014/08/19 21:25:00 - Count < 1 - at com.jcraft.jsch.Channel.sendChannelOpen(Channel.java:670)
2014/08/19 21:25:00 - Count < 1 - at com.jcraft.jsch.Channel.connect(Channel.java:151)
2014/08/19 21:25:00 - Count < 1 - at com.jcraft.jsch.Channel.connect(Channel.java:145)
2014/08/19 21:25:00 - Count < 1 - at org.pentaho.di.job.entries.sftp.SFTPClient.login(SFTPClient.java:162)
2014/08/19 21:25:00 - Count < 1 - ... 7 more

[CDF] After updating CDF, all templates are gone

Hello,

I'm working with Pentaho 5.0 (I think this also happens with Pentaho 4.8) and CDF. After updating/installing CDF, all template files matching the pattern:
template-dashboard-XXX.html
under
\pentaho-solutions\system\pentaho-cdf

are gone. Likewise, any JS libraries added under \pentaho-solutions\system\pentaho-cdf\js are erased.

Where should I put my templates? Is there any way to customize a dashboard (creating templates, each with its own CSS/JS) without losing the files when updating CDF?

Regards.

SQL fields not visible in the Data pane though works at runtime

I have often seen that whenever I prepare a SQL query in the "JDBC Data Source", I can preview the data perfectly. But back on the main screen, those fields do not show up in the Data pane.

As a result I have to add the field mapping manually instead of picking it from the drop-down; once added manually, it works perfectly at run time.

I was wondering whether there is a trick to make those fields visible, or whether this is a known issue.

Thanks for your help in advance.

Add Input box in Dashboard CDE

I need a suggestion about adding an input box to a dashboard, into which I can enter IDs for fetching results.

For example, suppose I have 10 user_ids and want to fetch all the user information:
select * from user where id IN(1122,1134,1234,3455,3442,6775)
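
The usual CDE pattern for this (a sketch; the parameter name user_ids is made up) is a simple parameter bound to a TextInput component, referenced from the SQL query through CDA's ${...} substitution:

Code:

-- Sketch: 'user_ids' is a CDE simple parameter fed by a TextInput
-- component; CDA substitutes ${user_ids} before running the query.
-- Note: splicing a raw comma-separated string into IN (...) invites
-- SQL injection; CDA's String Array parameter type is safer.
SELECT * FROM user WHERE id IN (${user_ids})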

Ways to performance/stress test a dashboard?

So... I am trying to figure out how to stress test a dashboard from the client's point of view. Something like: with 1 user, I get 1-2 s of loading time; with 10 simultaneous users, 1.5-3 s; with 100 users... (you get the idea).
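
A proper tool such as Apache JMeter can drive this kind of test; the crudest version I can think of is a shell loop (a sketch; the URL and credentials are placeholders for your own server):

Code:

#!/bin/sh
# Sketch: fire N concurrent requests at the dashboard and print each
# request's wall-clock time. Point URL at the dashboard's render URL.
URL="http://localhost:8080/pentaho"
N=10
i=0
while [ "$i" -lt "$N" ]; do
    curl -s -o /dev/null -w "%{time_total}s\n" -u admin:password "$URL" &
    i=$((i + 1))
done
wait   # let all N requests finish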

Any ideas?

Internet Explorer 9

I can't display the User Console or the Admin Console properly in IE9. Is this a compatibility problem between PUC release 5.x and IE9?

Data lost between steps

Hi,

I am facing a strange issue of losing data between transformation steps when rows are pushed at high speed. The details: we have a requirement to find duplicates using a custom algorithm. Since the number of records can be large (several million), we temporarily write all the records to a Lucene index as documents. After applying the algorithm, the duplicate sets are found and written to another Lucene index. The complete record set to be pushed to the next step is then available in the Lucene index as documents, and the data is pushed by calling the putRow method in a loop.

Because the data is pushed at high speed, we found that the rows are not completely received in the next step. The loss of data is consistent, but the records and fields lost are random. When a Thread.sleep of 50-100 milliseconds is added, the next step receives all the data. The big concern is that the loss of data is not captured or reported in any way. My questions are:


1. Does calling putRow guarantee that the record is delivered to the next step?


2. If not, how can I make sure the data is received properly in the subsequent steps?


Any help in this regard is highly appreciated. If required, I can provide the code.
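
For reference, this is the shape of the loop as I understand the stock Kettle 5.x step API expects it (a sketch; fetchNextRowFromLuceneIndex and the 'data' field stand in for our real code). As far as I know, putRow blocks while the downstream row set is full and only guarantees delivery when it is called from the step's own processRow() thread, with setOutputDone() called strictly after the last row; pushing rows from a separate worker thread can race with step shutdown, which would match the symptom that a Thread.sleep hides the loss:

Code:

// Sketch of a step's processRow(); hypothetical names are marked.
public boolean processRow(StepMetaInterface smi, StepDataInterface sdi)
    throws KettleException {
    Object[] row = fetchNextRowFromLuceneIndex(); // hypothetical helper
    if (row == null) {
        setOutputDone(); // no more rows: signal the downstream steps
        return false;    // stop the engine from calling processRow()
    }
    // putRow blocks when the output buffer is full, so rows are not
    // dropped as long as this thread is the only producer.
    putRow(data.outputRowMeta, row); // 'data' = this step's data class
    return true;
}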


Thanks in advance


Regards
Anbarasan

Publish cube to create analyzer report

Good afternoon everyone!
I currently have an issue with a project I am working on. I am trying to publish a cube so that its data source can later be used in an Analyzer report. The schema looks fine and there are no errors. I am using the correct login information and I can clearly see the schema being published onto the server. When I download directly from the BI analytics page, I can see that the folder contains the appropriate XML.

However, I cannot see the data source when trying to create an analyzer report. I am able to see the data source when trying to create a new analysis view...

Is there something that I could be doing that is causing the data source not to show up?

Side note: this project is a beta version of an existing project, so the schema has been updated to point to different tables. Everything else is almost entirely the same.

What am I doing wrong here?

Read data from HTTP POST

I am having an issue in Pentaho reading a JSON file that is returned by an HTTP POST with raw parameters. It reads the HTML output instead of the JSON string. Can anyone please help me solve this issue?

Linux setup for multiple users

I'm trying to set up PDI 5.1 on a CentOS Linux server that several developers have access to, and I want a single installed copy of PDI.

Some of the steps I have taken are:
1. as root, unzip to /opt/data-integration
2. create a spoon file in /usr/bin with this inside it:

#!/bin/sh
export KETTLE_HOME="/opt/data-integration"

cd $KETTLE_HOME

./spoon.sh $*

3. chmod -R +r /opt/data-integration -- to make it readable by all
4. chmod 755 /usr/bin/spoon -- the file created above

When I try to run spoon from my own home directory, I get an error about it trying to create files inside /opt/data-integration instead of my user's home directory. How do I fix this?

Error:

Unable to create default kettle.properties file : /opt/data-integration/.kettle/ kettle.properties
[Ljava.lang.StackTraceElement;@3472f4f1
org.pentaho.di.core.exception.KettleException:
Unable to read file '/opt/data-integration/.kettle/kettle.properties'
/opt/data-integration/.kettle/kettle.properties (No such file or directory)

at org.pentaho.di.core.util.EnvUtil.readProperties(EnvUtil.java:61)
at org.pentaho.di.core.util.EnvUtil.environmentInit(EnvUtil.java:89)
at org.pentaho.di.core.KettleClientEnvironment.init(KettleClientEnvironment.java:76)
at org.pentaho.di.core.KettleEnvironment.init(KettleEnvironment.java:91)
at org.pentaho.di.core.KettleEnvironment.init(KettleEnvironment.java:69)
at org.pentaho.di.ui.spoon.Spoon$1.call(Spoon.java:569)
at org.pentaho.di.ui.spoon.Spoon$1.call(Spoon.java:562)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.FileNotFoundException: /opt/data-integration/.kettle/kettle.properties (No such file or directory)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:146)
at java.io.FileInputStream.<init>(FileInputStream.java:101)
at org.pentaho.di.core.util.EnvUtil.readProperties(EnvUtil.java:58)
... 10 more
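
A wrapper that should avoid this (a sketch; it relies on KETTLE_HOME being the directory where PDI creates .kettle/kettle.properties, which is what the error above points at): don't export KETTLE_HOME at all, so it falls back to each user's own $HOME:

Code:

#!/bin/sh
# Sketch of a fix: leave KETTLE_HOME unset so .kettle/kettle.properties
# is created under each user's $HOME instead of the read-only install.
cd /opt/data-integration
./spoon.sh "$@"   # "$@" also preserves arguments containing spaces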

Executing a Python script from Pentaho Kettle

Hi,
I have a python script that I would like to execute from Pentaho Kettle. I tried wrapping the call in a shell script as follows:

#!/bin/sh
KETTLE_HOME='/Users/Desktop//'
echo ${KETTLE_HOME}'/scripts/extract_jira_issue.py'
python ${KETTLE_HOME}'/scripts/extract_jira_issue.py'


When I run this shell script from the command line, it executes the Python script without a problem, but when I call it from a Kettle Shell job entry, I get the following error:

2014/08/19 16:35:33 - extract_jira_issue - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : (stderr) Traceback (most recent call last):
2014/08/19 16:35:33 - extract_jira_issue - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : (stderr) File "/Users/Desktop/scripts/extract_jira_issue.py", line 6, in <module>
2014/08/19 16:35:33 - extract_jira_issue - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : (stderr) from jira.client import JIRA
2014/08/19 16:35:33 - extract_jira_issue - ERROR (version 5.1.0.0, build 1 from 2014-06-19_19-02-57 by buildguy) : (stderr) ImportError: No module named jira.client
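
One workaround that may help (a sketch, assuming the cause is that Kettle's non-login shell lacks the environment of your interactive terminal; the interpreter and site-packages paths below are placeholders for wherever the 'jira' package actually lives):

Code:

#!/bin/sh
# Sketch: make the python environment explicit instead of relying on
# whatever environment Spoon was launched with.
export PYTHONPATH="$HOME/Library/Python/2.7/lib/python/site-packages:$PYTHONPATH"
exec /usr/bin/python "$HOME/Desktop/scripts/extract_jira_issue.py"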

Any help is appreciated. My use case really requires python scripts to be executed from Pentaho kettle.

Thanks,
SBT

Pentaho Community Client List

Hi All,

There is no place that gives us a list of customers who use Pentaho Community. Let's tag our clients and support the Pentaho community, so that new clients will feel free to step into it.

Although the community edition is secure and reliable, there are customers who don't know its value and hesitate to adopt the community version.

Please add yourself if you are a customer, with supporting material if applicable.

Thanks,
Sivasakthi.

setRoleName to OlapStatement

I need to set the current role name on an OlapStatement, but this class has no API for it.
Only OlapConnection has a setRoleName method.
In my application, however, a single OlapConnection instance is shared by all users, so I would like to set the role on the OlapStatement when a user requests data via an MDX query.
How can I do this?
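
Since olap4j keeps the role on the connection rather than on the statement, one approach (a sketch; the per-request connection and the null reset are my assumptions) is to stop sharing a single connection: take one per request, e.g. from a pool, set the role, run the query, and reset afterwards:

Code:

import java.sql.SQLException;
import org.olap4j.CellSet;
import org.olap4j.OlapConnection;
import org.olap4j.OlapStatement;

// Sketch: run an MDX query under a per-user role; the connection is
// assumed to come from a pool instead of being shared by all users.
class RoleScopedQuery {
    static CellSet queryAsRole(OlapConnection conn, String roleName, String mdx)
            throws SQLException {
        try {
            conn.setRoleName(roleName); // scope the role to this request
            OlapStatement stmt = conn.createStatement();
            return stmt.executeOlapQuery(mdx);
        } finally {
            conn.setRoleName(null);     // null should restore the default role
        }
    }
}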

Help needed designing a star for a sales process with different product types

I am working on a dimensional model for a university. The business process I have picked up is sales/revenue. I have been reading chapters of different books, and although I think I have a fair understanding of facts and dimensions, I am having a tough time putting the sales process down on paper. The sales process at the school is similar to other businesses: students are the customers and the product is the "courses" they take. However, there are also other product types, and I don't know how to fit them in. For example, a student pays an application fee, a late fee, or a transcript request fee that is not associated with any course. How do I fit these different revenue streams into my star?
What I have done so far is like this


Code:

Sales_FACT
====
Date_Key_FK
Product_Key_FK
Campus_Key_FK
Student_Key_FK
ChargeCredit_SKU
Amount


Product_Key
------
Product_Key_PK
SectionID
AcademicYear
AcademicTerm
AcademicSession
CourseCode
CourseName
ProductType???

Now, for certain types of products (e.g. a transcript request fee) I do not have the course name, code, year, term, or session, and I am struggling with how this will work.
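
One common pattern (my assumption, taken from standard dimensional-modeling practice rather than from this project) is to keep the single product dimension and give service-type products explicit 'Not Applicable' values for the course attributes, as in:

Code:

-- Sketch: a fee-type product row with N/A course attributes.
-- Table name and key value are hypothetical.
INSERT INTO Product_Dim
    (Product_Key_PK, ProductType, CourseCode, CourseName,
     AcademicYear, AcademicTerm, AcademicSession, SectionID)
VALUES
    (901, 'Transcript Request Fee', 'N/A', 'Not Applicable',
     'N/A', 'N/A', 'N/A', 'N/A');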


Does anyone have any input on this? Any helpful material or schema examples would be greatly appreciated. I have not been able to find what a star schema looks like for a company that sells services as well as products.


Thanks,

Kettle 5.1 CE Round-Robin

Hello everyone,
I use Kettle CE 5.1 to export database data to Excel. My transformation has a Table input step and an Excel output step, with the Excel output step running in multiple copies and the data movement mode set to Round-Robin. I find that every copy receives an equal share of the rows, but the data itself does not come through as expected.
[Attached image: excel.jpg]

Add previous value into field if field is empty

Hi All,
I have a problem adding missing values based on certain conditions.
Here is my problem:
I have some log files to analyze, but first I have to add some missing information to the stream, otherwise I can't filter it. I will explain with an example:
The file contains 3 IDs that specify the communication channel:
I will name them Master_ID, Sub_ID1, Sub_ID2. Sometimes the Master_ID is missing in the log file, but if Sub_ID1 and Sub_ID2 match, the Master_ID stays the same until a new Master_ID arrives with the same Sub_ID1 and Sub_ID2.
In the example below, rows 1, 2 and 6 should match, and I want to insert the value 12 into Master_ID; rows 3, 4 and 5 match in the same way. Row 7 tells me that a new communication starts for this Sub_ID1 and Sub_ID2, as a new Master_ID arrives.

Row  Master_ID  Sub_ID1  Sub_ID2
  1      12        34       35
  2                34       35
  3      11        23       24
  4      11        23       24
  5                23       24
  6                34       35
  7      13        34       35
  8                34       35



The other thing is that I have quite a huge amount of data to analyze, 78 million lines, and thus I want to avoid sorting.
Any idea how to solve this?
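
The lookup I have in mind is a single pass in file order, with no sorting (a sketch, e.g. for a User Defined Java Class step; the method signature and field handling are simplified):

Code:

import java.util.HashMap;
import java.util.Map;

// Sketch: remember the last Master_ID seen per (Sub_ID1, Sub_ID2) pair
// and fill it in whenever Master_ID is empty. Memory grows with the
// number of distinct sub-ID pairs, not with the 78 million rows.
Map<String, String> lastMaster = new HashMap<String, String>();

String fillMasterId(String masterId, String subId1, String subId2) {
    String key = subId1 + "|" + subId2;
    if (masterId != null && !masterId.isEmpty()) {
        lastMaster.put(key, masterId); // row 7: a new communication starts
        return masterId;
    }
    return lastMaster.get(key);        // rows 2, 5, 6, 8: reuse the last one
}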

Best regards
Martin

HTTP POST response is truncated

Hi,

I am using an HTTP Post step to get a catalog of products from a supplier's website.
The XML I receive in my output fields is truncated at line 10165. I have tested my request with a SOAP client, and there the response was complete and the XML was valid.

It seems to me to be a buffer size problem, but I have no idea how to handle this issue.

Could someone help me?

Thanks

Parameters in Kettle?

Hi Forum,
I'm a newbie to Kettle and need to use parameters in Kettle jobs/transformations to make them dynamic.

How do I use parameters in Kettle?

For example, how can I pick up an Excel file dynamically from my hard drive?
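
One usual pattern (a sketch; the parameter and file names are made up): declare a named parameter such as INPUT_FILE in the transformation settings, reference it as ${INPUT_FILE} in the Excel input step's filename field, and pass the value when launching:

Code:

# Sketch: run the transformation with Pan, supplying the file path.
# ${INPUT_FILE} in the Excel input step resolves to this value.
./pan.sh -file=/path/to/read_excel.ktr "-param:INPUT_FILE=/data/sales.xls"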

:-)

Sadakar
BI developer.