Channel: Pentaho Community Forums

java heap overflow

Hi,

When I run a report in Report Designer, after a while I get the message: heap limit exceeded.
The report consists of a main report and a subreport.
The result should be 290 pages.
In Jasper the same report is displayed in 19 seconds.
In Crystal the same report is displayed in 15 seconds.

For me this is a knock-out criterion for Pentaho Reporting.
Is there a way to solve this problem?
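
For reference, a common first step with heap errors in PRD is to raise the JVM heap limit in the launcher script (report-designer.sh or report-designer.bat). A minimal sketch, assuming the stock script shipped with PRD; the exact java line and variable names vary by version:

Code:

# report-designer.sh: give the designer a larger maximum heap via -Xmx
"$_PENTAHO_JAVA" -Xmx2048m -jar "$DIR/launcher.jar" "$@"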

CCC Line Chart multichart - individual min/max

Quickie: can it be done? Individual min and max values for each chart in a multi-chart?
For this example I have:
2016-02-01_14h30_11.jpg
Charts 3 and 6 show clearly that 1, 2, 4 and 5 are following the max values of 3 and 6... but I wanted each of them to have its own. Doable?
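
For reference, CCC has a domain-scope option for small multiples that matches what is being asked here. A sketch, set as an option (or extension point) on the CDE chart component; the option name is per the CCC v2 docs, so treat it as an assumption for other chart versions:

Code:

// give each small chart its own value-axis domain instead of a shared one
orthoAxisDomainScope: 'cell'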

IUserRoleListService get all users never called

[MOVED TO DEVELOPER SECTION]
Hi all,
I am working on integrating Pentaho's authentication mechanism with my enterprise one.
Following the Pentaho guidelines, I have built a CustomAuthenticationProvider that queries my user database to perform authentication.

The main services I have dealt with are:
the UserDetailsService interface
the IUserRoleListService interface

The second one is responsible for all the functionality related to users, roles, and user-role mappings. But there is a problem:
while role-related methods like
public List<String> getAllRoles() or public List<String> getRolesForUser(ITenant tenant, String username)

are correctly invoked (returning all the Klopotek roles), the same does not happen with the user-related methods, such as
public List<String> getAllUsers()

which are never invoked.

I thought these methods were responsible for loading the user list, together with all the information displayed on the Users & Roles page of the Pentaho console.
But what I get as the final result is that only the default users are listed: Admin, Pat, Suzy, Tiffany.

Where am I wrong? Is some configuration setting missing in the Spring Security context?
Many thanks.
Regards.
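
For context, a minimal sketch of the user-side methods (the UserDao interface is a hypothetical stand-in for the poster's enterprise database access; the IUserRoleListService method set is as of Pentaho 5.x/6.x and should be checked against your platform jars):

Code:

import java.util.List;
import org.pentaho.platform.api.engine.IUserRoleListService;
import org.pentaho.platform.api.mt.ITenant;

public class EnterpriseUserRoleListService implements IUserRoleListService {

    /** Hypothetical DAO over the enterprise user database. */
    public interface UserDao {
        List<String> allUserNames();
        List<String> allRoleNames();
        List<String> roleNamesForUser(String user);
        List<String> userNamesInRole(String role);
    }

    private final UserDao dao;

    public EnterpriseUserRoleListService(UserDao dao) {
        this.dao = dao;
    }

    @Override public List<String> getAllRoles() { return dao.allRoleNames(); }
    @Override public List<String> getAllRoles(ITenant tenant) { return getAllRoles(); }
    @Override public List<String> getSystemRoles() { return dao.allRoleNames(); }
    @Override public List<String> getAllUsers() { return dao.allUserNames(); }
    @Override public List<String> getAllUsers(ITenant tenant) { return getAllUsers(); }
    @Override public List<String> getRolesForUser(ITenant tenant, String username) { return dao.roleNamesForUser(username); }
    @Override public List<String> getUsersInRole(ITenant tenant, String role) { return dao.userNamesInRole(role); }
}

As far as I know, the Users & Roles page in the PUC Administration perspective is backed by the Jackrabbit-based IUserRoleDao services rather than IUserRoleListService, which would explain why only the default Jackrabbit users appear there even when getAllUsers() is wired correctly.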

PMR failure in v6.0.1

Hi,

I'm trying to integrate my Cloudera VM with PDI, which is installed in the Cloudera VM. I set up a new cluster and the cluster test passed successfully.

However, when testing a PMR job, the job passes the Hadoop Copy step and loads the data into HDFS, but then gets stuck at the PMR step for around 15 minutes and fails with an unknown-host error. I am not sure what the issue is.

I'd like to know if I missed any configuration step.

Thanks


2016/02/01 13:06:01 - PMR-Wordcount - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : Invalid host name: local host is: (unknown); destination host is: "clouderamanager.cdh5.test":8032; java.net.UnknownHostException; For more details see: http://wiki.apache.org/hadoop/UnknownHost
2016/02/01 13:06:01 - PMR-Wordcount - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : java.net.UnknownHostException: Invalid host name: local host is: (unknown); destination host is: "clouderamanager.cdh5.test":8032; java.net.UnknownHostException; For more details see: http://wiki.apache.org/hadoop/UnknownHost
2016/02/01 13:06:01 - PMR-Wordcount - at sun.reflect.GeneratedConstructorAccessor104.newInstance(Unknown Source)
2016/02/01 13:06:01 - PMR-Wordcount - at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2016/02/01 13:06:01 - PMR-Wordcount - at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:743)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.ipc.Client$Connection.<init>(Client.java:402)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.ipc.Client.getConnection(Client.java:1511)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.ipc.Client.call(Client.java:1438)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.ipc.Client.call(Client.java:1399)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
2016/02/01 13:06:01 - PMR-Wordcount - at com.sun.proxy.$Proxy98.getNewApplication(Unknown Source)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getNewApplication(ApplicationClientProtocolPBClientImpl.java:217)
2016/02/01 13:06:01 - PMR-Wordcount - at sun.reflect.GeneratedMethodAccessor83.invoke(Unknown Source)
2016/02/01 13:06:01 - PMR-Wordcount - at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2016/02/01 13:06:01 - PMR-Wordcount - at java.lang.reflect.Method.invoke(Method.java:497)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
2016/02/01 13:06:01 - PMR-Wordcount - at com.sun.proxy.$Proxy99.getNewApplication(Unknown Source)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getNewApplication(YarnClientImpl.java:206)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:214)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.mapred.ResourceMgrDelegate.getNewJobID(ResourceMgrDelegate.java:187)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.mapred.YARNRunner.getNewJobID(YARNRunner.java:231)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:446)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1306)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1303)
2016/02/01 13:06:01 - PMR-Wordcount - at java.security.AccessController.doPrivileged(Native Method)
2016/02/01 13:06:01 - PMR-Wordcount - at javax.security.auth.Subject.doAs(Subject.java:422)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
2016/02/01 13:06:01 - PMR-Wordcount - at org.apache.hadoop.mapreduce.Job.submit(Job.java:1303)
2016/02/01 13:06:01 - PMR-Wordcount - at org.pentaho.hadoop.shim.cdh54.HadoopShim.submitJob(HadoopShim.java:122)
2016/02/01 13:06:01 - PMR-Wordcount - at org.pentaho.hadoop.shim.cdh54.delegating.DelegatingHadoopShim.submitJob(DelegatingHadoopShim.java:112)
2016/02/01 13:06:01 - PMR-Wordcount - at org.pentaho.di.job.entries.hadooptransjobexecutor.JobEntryHadoopTransJobExecutor.execute(JobEntryHadoopTransJobExecutor.java:869)
2016/02/01 13:06:01 - PMR-Wordcount - at org.pentaho.di.job.Job.execute(Job.java:730)
2016/02/01 13:06:01 - PMR-Wordcount - at org.pentaho.di.job.Job.execute(Job.java:873)
2016/02/01 13:06:01 - PMR-Wordcount - at org.pentaho.di.job.Job.execute(Job.java:873)
2016/02/01 13:06:01 - PMR-Wordcount - at org.pentaho.di.job.Job.execute(Job.java:546)
2016/02/01 13:06:01 - PMR-Wordcount - at org.pentaho.di.job.Job.run(Job.java:435)
2016/02/01 13:06:01 - PMR-Wordcount - Caused by: java.net.UnknownHostException
2016/02/01 13:06:01 - PMR-Wordcount - ... 32 more
2016/02/01 13:06:01 - test_pmr - Starting entry [Failure]
2016/02/01 13:06:01 - Failure - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by buildguy) : Aborting job.
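
For reference, the "local host is: (unknown)" part of the error usually means the machine running PDI cannot resolve its own hostname, and "clouderamanager.cdh5.test" must be resolvable from that machine too (see the UnknownHost wiki page linked in the log). A sketch of the usual fix, with placeholder addresses:

Code:

# /etc/hosts on the machine running PDI (IP addresses are placeholders)
127.0.0.1        localhost
192.168.56.101   clouderamanager.cdh5.test   clouderamanager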

Dealing with Time differences in Timezones

Hi All,

I am selecting data from the previous day and setting the dates in two columns,
PREV_DATE and CURR_DATE (Date data type).
My DB values are:
01-02-2016 00:00:00 and 02-02-2016 00:00:00

When I preview the data in the Table Input step, I get the dates as

2016/02/01 00:00:00.000000000 and 2016/02/02 00:00:00.000000000

When I preview the data in the Execution Results tab, I get

2016/01/31 18:30:00.000000000 and 2016/02/01 18:30:00.000000000

There is a difference of 5h 30m, which is the time difference between India and Sydney; the Indian time zone is UTC+5:30.

(Note: for production, the Kettle files will be moved to a server in Sydney, but for now the DB machines and Kettle machines used for development are in India. The files are taken from the production environment for new changes.)

I am using Pentaho 5.1.0 GA.

Can anybody help solve this issue? I have attached screenshots.

Thank You.

(Sorry for the heading, It was meant to be "Time Zone Issues" . I am not able to edit it after posting)
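
For reference, a discrepancy like this usually traces back to the JVM's default time zone. One common way to make runs reproducible across environments (a sketch, assuming the stock spoon.sh, which honours PENTAHO_DI_JAVA_OPTIONS) is to pin the time zone Kettle runs in:

Code:

# spoon.sh / kitchen.sh: force a fixed JVM time zone (example: Indian Standard Time)
export PENTAHO_DI_JAVA_OPTIONS="-Xmx1024m -Duser.timezone=Asia/Kolkata"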

How to extract an Xaction variable inside Pentaho CDE

Hello Folks,

I have created an xaction which returns the following output when I run it:

username=admin
userroles=[Administrator, Authenticated]
Region=Asia

Now I want to get this "Region" variable inside a Pentaho CDE dashboard.

Can anyone help me extract the "Region" variable?

Note: I know that ${env::username} inside the query object will return the user name, but this won't work for the Region variable.
Moreover, I want to extract the variable using JavaScript (in the onload event of the HTML body tag).

We are using Pentaho 6.0.

Thanks
Naimish
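
For reference, one way to do this from the dashboard, as a sketch: run the xaction through the repository REST API and parse its text output in JavaScript. The xaction path, the plain-text output format, and the parameter name are all assumptions:

Code:

// onload (or a CDE pre-execution hook): fetch the xaction output and parse "Region"
var xactionPath = ':public:myfolder:region.xaction'; // hypothetical location
$.get('/pentaho/api/repos/' + xactionPath + '/generatedContent', function(out) {
    var m = /Region=(\S+)/.exec(out);
    // hand the value to a (hypothetical) dashboard parameter
    Dashboards.fireChange('regionParam', m ? m[1] : null);
});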

No driver class name is available

I'm trying to connect the Saiku plugin to my ODBC data source. When I go to New > Saiku Analytics, this exception appears in the Tomcat logs: "Unable to create data source for connection [connectionName]. No driver class name is available." What do I need to do?
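
For reference, that message means the connection definition carries no JDBC driver class. Assuming the connection is set up as a "Generic database" entry, the usual fix is to fill in the driver fields explicitly; note that the JDBC-ODBC bridge class below only exists on Java 7 and earlier:

Code:

Custom connection URL:    jdbc:odbc:MyDsnName
Custom driver class name: sun.jdbc.odbc.JdbcOdbcDriver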

Change the default folder directory to save a report

Hello,

How can the default folder (/home/"user_name") for saving a report into the BA repository be changed?
Usually each user has a folder, named after the user, under the Home directory /home/"user_name".
I want to save all reports in one folder (e.g. the public folder) and set that folder as the default when a user clicks Save / Save As.

Can anyone help?
Thank you

PRD 3.9.1 : Excel preview, problem with column size

Hi!

I use PRD 3.9.1 (I know it's an old version, but we have a 3.7 User Console in production). When I run my report with Excel output, I get narrow columns whose sizes do not correspond to the sizes I set in my report.
With PRD 3.7 the preview was correct: Excel kept the sizes I had set in the report.
Since migrating to PRD 3.9.1 I have had this problem.

Thanks for your reply.

Change column header

Hi, I have a table component populated from an MDX query:

WITH
SET [~ROWS] AS
{[Regime Ricovero].[Degenza ordinaria]}
SELECT
NON EMPTY {[Measures].[Numero di Ricoveri], [Measures].[Dimessi 0-1 Giorno], [Measures].[Dimessi > di 1 Giorno], [Measures].[Numero di Uscite verso l'Esterno], [Measures].[Peso Medio DRG], [Measures].[Giornate di Degenza], [Measures].[Degenza Media], [Measures].[Occupazione Media (Percentuale)], [Measures].[Indice di Rotazione], [Measures].[Presenti Medi Giornaliari]} ON COLUMNS,
NON EMPTY [~ROWS] ON ROWS
FROM [Cubo Virtuale Report Dati Attività]

rsz_1screenshot_from_2016-02-02_115606.jpg

I would like to change the header of the first column.

I tried "Column Headers" in the advanced properties, but it returns an error.
Could someone help me?
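
For reference, the table component's header override is an array with one entry per result-set column; an incomplete list may be what triggers the error. A sketch (header texts and the trailing entries are placeholders):

Code:

// CDE table component > Advanced Properties > Column Headers (colHeaders)
colHeaders: ['Regime Ricovero', 'Numero di Ricoveri', 'Dimessi 0-1 Giorno'
             /* ...one entry per remaining measure... */]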

API returns wrong roles

I've set up a custom LDAP/JDBC configuration following this guide (https://help.pentaho.com/Documentati...P0/150/010/050) and the login works OK. The admin panel shows the proper roles and users, BUT the API returns the default roles, not the custom ones.

As I've said, I followed the official guide, and it should be configured properly since login works with the LDAP users. Where should I look?

Error code 28 while generating report

Hi all!

All of a sudden a report doesn't work in my environment anymore.
When I generate it in the BI Server it says it failed at the query.
When I generate the report in PRD it gives me this log:

Code:

org.pentaho.reporting.engine.classic.core.ReportDataFactoryException: Failed at query:
CALL sp_sel_SA_RptMarginOverview(${YEAR}, ${FromWeek}, ${ToWEEK}, ${BUSINESSUNIT})
        at org.pentaho.reporting.engine.classic.core.modules.misc.datafactory.sql.SimpleSQLReportDataFactory.queryData(SimpleSQLReportDataFactory.java:214)
        at org.pentaho.reporting.engine.classic.core.modules.misc.datafactory.sql.SQLReportDataFactory.queryData(SQLReportDataFactory.java:162)
        at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStaticInternal(CompoundDataFactory.java:205)
        at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStatic(CompoundDataFactory.java:182)
        at org.pentaho.reporting.engine.classic.core.cache.CachingDataFactory.queryInternal(CachingDataFactory.java:505)
        at org.pentaho.reporting.engine.classic.core.cache.CachingDataFactory.queryStatic(CachingDataFactory.java:181)
        at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStaticInternal(CompoundDataFactory.java:200)
        at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStatic(CompoundDataFactory.java:182)
        at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryData(CompoundDataFactory.java:69)
        at org.pentaho.reporting.engine.classic.core.states.datarow.DefaultFlowController.performQueryData(DefaultFlowController.java:296)
        at org.pentaho.reporting.engine.classic.core.states.datarow.DefaultFlowController.performSubReportQuery(DefaultFlowController.java:374)
        at org.pentaho.reporting.engine.classic.core.states.process.ProcessState.initializeForSubreport(ProcessState.java:599)
        at org.pentaho.reporting.engine.classic.core.states.process.EndSubReportHandler.commit(EndSubReportHandler.java:59)
        at org.pentaho.reporting.engine.classic.core.states.process.ProcessState.commit(ProcessState.java:1063)
        at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.processPrepareLevels(AbstractReportProcessor.java:437)
        at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.performStructuralPreprocessing(AbstractReportProcessor.java:621)
        at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.prepareReportProcessing(AbstractReportProcessor.java:509)
        at org.pentaho.reporting.engine.classic.core.modules.output.pageable.graphics.PrintReportProcessor.getNumberOfPages(PrintReportProcessor.java:78)
        at org.pentaho.reporting.engine.classic.core.modules.gui.base.PreviewPane$RepaginationRunnable.run(PreviewPane.java:271)
        at org.pentaho.reporting.engine.classic.core.util.Worker.run(Worker.java:174)
Caused by: java.sql.SQLException: Error writing file 'C:\WINDOWS\TEMP\MY283.tmp' (Errcode: 28)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1073)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3597)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3529)
        at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1990)
        at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2625)
        at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2119)
        at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:2281)
        at org.pentaho.reporting.engine.classic.core.modules.misc.datafactory.sql.SimpleSQLReportDataFactory.parametrizeAndQuery(SimpleSQLReportDataFactory.java:381)
        at org.pentaho.reporting.engine.classic.core.modules.misc.datafactory.sql.SimpleSQLReportDataFactory.queryData(SimpleSQLReportDataFactory.java:209)
        ... 19 more

When I execute the stored procedure in MySQL Workbench it works fine.
Can anyone help me?

Greetz
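
For what it's worth, MySQL's "Errcode: 28" is the operating system's "No space left on device": the server could not write its temporary file under C:\WINDOWS\TEMP while materializing the result set. The code can be confirmed with the perror utility that ships with MySQL:

Code:

$ perror 28
OS error code  28:  No space left on device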

How to open a report from PRD 6.0 in PRD 5.4? Or publish from PRD 6.0 to the BI Server?

Hi!

I have some reports made in PRD 6.0 and I want to upload them to BI Server 5.4.

Must I upload them from PRD 5.4? Will uploading them from PRD 6.0 work? How can I edit them in PRD 5.4?

Thanks in advance!

Insert/update too slow for dimension table load...

Hi everyone,


The Insert/Update step is running too slow: it took 1 hour and 7 minutes to load 8,331 records into an existing target table of 142,000 records.
My commit size is set to 10,000. The database is Redshift. I can't create an index on the lookup column;
my update is based on the primary-key column of the table, and it has a sort key.

All the steps before this one completed (active='finished') in 13.2 seconds, so I know it's not slow because of the previous steps.

Is there anything I am missing? Thanks for the help.


Thanks,
Raji.
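
Not from the thread, but the usual Redshift advice (a sketch; table and column names are assumed) is to avoid row-by-row Insert/Update entirely: bulk-load the batch into a staging table, then merge with set-based SQL in a single job entry, since Redshift handles one big statement far better than thousands of small ones:

Code:

-- Redshift "upsert" pattern: merge staged rows into the target dimension
BEGIN;
DELETE FROM dim_target
USING stage_target s
WHERE dim_target.pk_id = s.pk_id;
INSERT INTO dim_target SELECT * FROM stage_target;
COMMIT;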

reading from zipped csv files

I have a project where I need to load zipped CSV files into a DB. I was delighted to see that the Text File Input step can read from a zipped file; however, it cannot handle newlines inside fields. So now I am using CSV Input, which can handle the line breaks but cannot read from zipped files. So I am reworking the whole thing to use the Unzip step and then process the extracted file, which makes the process much more convoluted than it needs to be. The files are not gzipped, so I cannot use the GZIP CSV Input step.

Is there something I am missing here? Why can I not read directly from a zipped CSV file? I am using 5.3; do I need to upgrade?

Thanks,
Jason
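
For reference, the zip support in Text File Input goes through Apache Commons VFS, so a file inside an archive can also be addressed directly with a VFS URL (the path below is a placeholder). As far as I know, CSV Input deliberately bypasses VFS and reads with NIO for speed, which is why it cannot do the same:

Code:

zip:file:///data/incoming/batch.zip!/export.csv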

Job Fails When Logging Level set to "Row Level (very detailed)"

Hi All,

When I run a job with the logging level set to "Basic", the job runs flawlessly.

If I run the job again with the logging level set to "Row Level (very detailed)", the job fails.

It doesn't matter what the failure is when it fails. I say this because if I want the job to succeed in every case, I can just set it to Basic logging; I know the job itself is fine.

What I want to know is: if the job succeeds when logging is set to "Basic", why would changing the level of logging cause the job itself to fail?

Is this a bug or by design?

Has anyone seen this before, or can anyone explain if this is FAD (functions as designed)?

Thanks!


Remove line chart padding

BI Server - performance, uptime and such...

So, I noticed that things are starting to take quite a while to execute on the BI Server...
I decided to check top, and while I reloaded a dashboard, this came up:

PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
31244 x 20 0 2916m 2,4g 10m S 102,1 60,9 638:53.83 java

Which made me wonder about two things:
1- (serious) could that server uptime be a performance problem?
2- (not so much) how the hell did it get to 102.1% CPU?! O_o

Erroneous output log in process action of a Pentaho Data Integration job (xaction)

Hi friends.

I have a problem with the execution of an action sequence (xaction). This xaction simply calls a job which, depending on the logged-in user, reads a file and generates reports; if an error occurs, it is displayed on the user console. The process works perfectly, but when two users run it at the same time and one of them gets an error, the error is shown on both users' consoles.

What can I do to stop this from happening?

Thanks.