Channel: Pentaho Community Forums

Prepare MySQL BA Repository Database

Hi everyone!


I am just starting to use Pentaho BI Server 5.0.1.
I have done the MySQL configuration for Pentaho, following these instructions: http://anonymousbi.wordpress.com/201...llation-guide/ and
http://infocenter.pentaho.com/help/i...epository.html.


Login to the application works fine, but when I try to create a new user, I get a "Principal does not exist" exception. It seems that the application does not create a principal for the new user.


I tried to debug the code. The strange thing is that it passes the createUser function of AbstractJcrBackedUserRoleDao.java normally, but fails verification in the checkValidEntry method (ACLTemplate.java).


Also, I can see that the user is not being created in the MySQL users table.




Caused by: org.springframework.extensions.jcr.JcrSystemException: Repository access exception; nested exception is javax.jcr.security.AccessControlException: Principal mysqlDBTestUser-/pentaho/tenant0 does not exist.
at org.springframework.extensions.jcr.SessionFactoryUtils.translateException(SessionFactoryUtils.java:223)
at org.springframework.extensions.jcr.JcrAccessor.convertJcrAccessException(JcrAccessor.java:58)
at org.springframework.extensions.jcr.JcrTemplate.execute(JcrTemplate.java:94)
at org.springframework.extensions.jcr.JcrTemplate.execute(JcrTemplate.java:115)
at org.pentaho.platform.security.userroledao.jackrabbit.JcrUserRoleDao.createUser(JcrUserRoleDao.java:136)
... 90 more
Caused by: javax.jcr.security.AccessControlException: Principal mysqlDBTestUser-/pentaho/tenant0 does not exist.
at org.apache.jackrabbit.core.security.authorization.acl.ACLTemplate.checkValidEntry(ACLTemplate.java:323)
at org.apache.jackrabbit.core.security.authorization.acl.ACLTemplate.addEntry(ACLTemplate.java:385)
at org.apache.jackrabbit.core.security.authorization.AbstractACLTemplate.addAccessControlEntry(AbstractACLTemplate.java:152)
at org.pentaho.platform.repository2.unified.jcr.JcrRepositoryFileAclUtils.internalUpdateAcl(JcrRepositoryFileAclUtils.java:241)
at org.pentaho.platform.repository2.unified.jcr.JcrRepositoryFileAclUtils.createAcl(JcrRepositoryFileAclUtils.java:171)
at org.pentaho.platform.security.userroledao.jackrabbit.AbstractJcrBackedUserRoleDao.internalCreateFolder(AbstractJcrBackedUserRoleDao.java:752)
at org.pentaho.platform.security.userroledao.jackrabbit.AbstractJcrBackedUserRoleDao.createUserHomeFolder(AbstractJcrBackedUserRoleDao.java:736)
at org.pentaho.platform.security.userroledao.jackrabbit.AbstractJcrBackedUserRoleDao.createUser(AbstractJcrBackedUserRoleDao.java:342)
at org.pentaho.platform.security.userroledao.jackrabbit.JcrUserRoleDao$4.doInJcr(JcrUserRoleDao.java:139)
at org.springframework.extensions.jcr.JcrTemplate.execute(JcrTemplate.java:89)
... 92 more


If somebody has experience with this, any help would be very welcome.
Thanks in advance

Problems sharing PRD reports in user console.

We are creating reports in PRD and publishing them to BI Server 4.8. These reports are viewable in the User Console (by the user that published the report), but not shareable. In the User Console, you can right-click on a report name and you'll see the "share" function in the context menu, but when we select this function, we get an error: "could not get properties". Thus we cannot choose users to share the report with.

Any suggestions for fixing this?

Thx,
RR

Saiku Analytics Drill Through on Cell Return Empty Result

I created my own Mondrian schema using Schema Workbench and published it to Pentaho CE 4.8. The schema worked well when I created a new analysis using Saiku, but when I used drill-through on a cell, it always returned an empty result without any error message. Any clue what may be going wrong and how to debug it?

When I tried the SteelwheelsSales schema, the drill-through function worked.

By the way, I am using Pentaho 4.8 CE on Ubuntu with Saiku plugin 2.7.

Here is the schema:

<Schema name="My Schema">
<Dimension type="TimeDimension" visible="true" highCardinality="false" name="Date">
<Hierarchy name="Months" visible="true" hasAll="true" primaryKey="date_key">
<Table name="dim_date_en_us">
</Table>
<Level name="Year" visible="true" column="year4" type="Numeric" uniqueMembers="true" levelType="TimeYears" hideMemberIf="Never">
</Level>
<Level name="Quarter" visible="true" column="quarter_number" ordinalColumn="quarter_number" type="Numeric" uniqueMembers="false" levelType="TimeQuarters" hideMemberIf="Never" captionColumn="quarter_name">
</Level>
<Level name="Month" visible="true" column="month_number" ordinalColumn="month_number" type="Numeric" uniqueMembers="false" levelType="TimeMonths" hideMemberIf="Never" captionColumn="month_name">
</Level>
</Hierarchy>
<Hierarchy name="Weeks" visible="true" hasAll="true" primaryKey="date_key">
<Table name="dim_date_en_us">
</Table>
<Level name="Year" visible="true" column="year4" type="String" uniqueMembers="true" levelType="TimeYears" hideMemberIf="Never">
</Level>
<Level name="Week" visible="true" column="week_in_year" type="Numeric" uniqueMembers="false" levelType="TimeWeeks" hideMemberIf="Never">
</Level>
</Hierarchy>
</Dimension>
<Dimension type="StandardDimension" visible="true" highCardinality="false" name="Channel">
<Hierarchy name="Channel" visible="true" hasAll="true" primaryKey="channel_id">
<Table name="dim_channel">
</Table>
<Level name="Parent Channel" visible="true" column="parent_channel_id" type="Numeric" uniqueMembers="true" levelType="Regular" hideMemberIf="Never" captionColumn="parent_channel_name">
</Level>
<Level name="Channel" visible="true" column="channel_id" type="Numeric" uniqueMembers="false" levelType="Regular" hideMemberIf="Never" captionColumn="name">
</Level>
</Hierarchy>
</Dimension>
<Dimension type="StandardDimension" visible="true" highCardinality="false" name="CLC_change">
<Hierarchy visible="true" hasAll="true" primaryKey="CLC_change_key">
<Table name="dim_CLC_change">
</Table>
<Level name="CLC_change" visible="true" column="CLC_change_key" type="Numeric" uniqueMembers="true" levelType="Regular" hideMemberIf="Never" captionColumn="name">
</Level>
</Hierarchy>
</Dimension>
<Dimension type="StandardDimension" visible="true" highCardinality="false" name="Cancel_reason">
<Hierarchy visible="true" hasAll="true" primaryKey="cancel_reason_key">
<Table name="dim_cancel_reason">
</Table>
<Level name="Cancel_reason" visible="true" column="cancel_reason_key" type="Numeric" uniqueMembers="true" levelType="Regular" hideMemberIf="Never" captionColumn="cancel_reason">
</Level>
</Hierarchy>
</Dimension>
<Dimension type="StandardDimension" visible="true" highCardinality="false" name="Location">
<Hierarchy visible="true" hasAll="true" primaryKey="account_id">
<Table name="dim_customer">
</Table>
<Level name="Country" visible="true" column="country" type="String" uniqueMembers="true" levelType="Regular" hideMemberIf="Never" captionColumn="country">
</Level>
<Level name="State" visible="true" column="state" type="String" uniqueMembers="false" levelType="Regular" hideMemberIf="Never" captionColumn="state">
</Level>
<Level name="City" visible="true" column="city" type="String" uniqueMembers="false" levelType="Regular" hideMemberIf="Never" captionColumn="city">
</Level>
</Hierarchy>
</Dimension>
<Dimension type="StandardDimension" visible="true" highCardinality="false" name="Signup_source">
<Hierarchy visible="true" hasAll="true" primaryKey="signup_source_key">
<Table name="dim_signup_source">
</Table>
<Level name="Signup_source" visible="true" column="signup_source_key" type="Numeric" uniqueMembers="true" levelType="Regular" hideMemberIf="Never" captionColumn="signup_source">
</Level>
</Hierarchy>
</Dimension>
<Cube name="My Cube" caption="My Cube" visible="true" cache="true" enabled="true">
<Table name="warehouse">
</Table>
<DimensionUsage source="Date" name="Date" caption="Date" visible="true" foreignKey="date_key" highCardinality="false">
</DimensionUsage>
<DimensionUsage source="Channel" name="Channel" visible="true" foreignKey="channel_id" highCardinality="false">
</DimensionUsage>
<Dimension type="StandardDimension" visible="true" highCardinality="false" name="CLC">
<Hierarchy visible="true" hasAll="true" allMemberName="All">
<Level name="CLC" visible="true" column="CLC" type="String" uniqueMembers="true" levelType="Regular" hideMemberIf="Never">
</Level>
</Hierarchy>
</Dimension>
<DimensionUsage source="Date" name="Start_date" caption="StartDate" visible="true" foreignKey="start_date_key" highCardinality="false">
</DimensionUsage>
<DimensionUsage source="CLC_change" name="CLC_change" visible="true" foreignKey="CLC_change_key" highCardinality="false">
</DimensionUsage>
<DimensionUsage source="Date" name="Cancel_date" caption="CancelDate" visible="true" foreignKey="cancel_date_key" highCardinality="false">
</DimensionUsage>
<DimensionUsage source="Cancel_reason" name="Cancel_reason" visible="true" foreignKey="cancel_reason_key" highCardinality="false">
</DimensionUsage>
<DimensionUsage source="Location" name="Location" visible="true" foreignKey="account_id" highCardinality="false">
</DimensionUsage>
<DimensionUsage source="Signup_source" name="Signup_source" visible="true" foreignKey="signup_source_key" highCardinality="false">
</DimensionUsage>
<Measure name="MOU" column="mou" datatype="Integer" aggregator="sum" visible="true">
</Measure>
<Measure name="amountDueCents" column="amount_due" datatype="Integer" aggregator="sum" visible="false">
</Measure>
<Measure name="mrc_amount" column="mrc_amount" datatype="Integer" aggregator="sum" visible="false">
</Measure>
<Measure name="usage_amount" column="usage_amount" datatype="Integer" aggregator="sum" visible="false">
</Measure>
<Measure name="other_amount" column="other_amount" datatype="Integer" aggregator="sum" visible="false">
</Measure>
<Measure name="CNT" column="cnt" datatype="Integer" aggregator="sum" visible="true">
</Measure>
<Measure name="payment_amount" column="payment_amount" datatype="Integer" aggregator="sum" visible="false">
</Measure>
<CalculatedMember name="billed" formatString="#,###.00" formula="([Measures].[mrc_amount]+[Measures].[usage_amount]+[Measures].[other_amount])/100.0" dimension="Measures" visible="true">
</CalculatedMember>
<CalculatedMember name="AmountDue" formatString="#,###.00" formula="([Measures].[amountDueCents])/100.0" dimension="Measures" visible="true">
</CalculatedMember>
<CalculatedMember name="receipts" formatString="#,###.00" formula="([Measures].[payment_amount])/100.0" dimension="Measures" visible="true">
</CalculatedMember>
<CalculatedMember name="MRC" formatString="#,###.00" formula="([Measures].[mrc_amount])/100.0" dimension="Measures" visible="true">
</CalculatedMember>
</Cube>
</Schema>
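One way to debug this (a hedged suggestion, not from the original thread): enable Mondrian's SQL logging so you can see the SQL generated for the drill-through and run it against the database by hand. Assuming the standard BI server layout, add something like the following to tomcat/webapps/pentaho/WEB-INF/classes/log4j.xml:

Code:

<!-- Log the SQL that Mondrian generates (drill-through included);
     "mondrian.sql" is Mondrian's standard SQL logger category. -->
<category name="mondrian.sql">
  <priority value="DEBUG"/>
</category>

If the logged SQL returns rows when run directly but Saiku still shows nothing, the problem is more likely in the plugin or the schema mapping than in the data.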

Where does the export file go from dashboard?

Hi,
I am new to Pentaho.
I added an export button to a dashboard and configured it as described in
http://forums.pentaho.com/showthread...on-work-in-CDE
When I click on the button, nothing happens.
In the log, I can see the following entries.
Please help.
Thanks,
unary

2014/01/23 20:12:56 - DataAccess.0 - Finished reading query, closing connection.

2014/01/23 20:12:56 - DataAccess.0 - Finished processing (I=3, O=0, R=0, W=3, U=0, E=0)
2014/01/23 20:12:56 - Export.0 - Finished processing (I=0, O=4, R=3, W=3, U=0, E=0)

Hadoop Big Data Online Training & Certification

Hadoop Big Data Online Training by SunItLabs. We provide excellent Hadoop Big Data training by real-time IT industry experts. Our training methodology is very unique, and our course content covers all the in-depth critical scenarios. We have completed more than 200 Hadoop Big Data batches through our online training program. Our classes cover all the real-time scenarios and are completely hands-on in each and every session.

Please call us for the demo classes; we have regular batches and weekend batches.

Contact Number: India: +91 903-092-8000

Email: info@sunitlabs.com

Web: http://sunitlabs.com/hadoop-online-training/

Dynamic Schema Processing

  • Dynamic Schema Processing - for passing session variables and database values based on the logged-in user to the cube definition

    How can we achieve this in Pentaho 5.0.2?
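For context, Mondrian's hook for this is the mondrian.spi.DynamicSchemaProcessor interface, which receives the schema text before it is parsed. A minimal sketch (the %USER% placeholder and the fallback value are assumptions for illustration):

Code:

import java.io.BufferedReader;
import java.io.InputStreamReader;

import mondrian.olap.Util;
import mondrian.spi.DynamicSchemaProcessor;

public class SessionSchemaProcessor implements DynamicSchemaProcessor {
    public String processSchema(String schemaUrl, Util.PropertyList connectInfo)
            throws Exception {
        // Read the schema file; Mondrian resolves the URL through Apache VFS.
        StringBuilder schema = new StringBuilder();
        BufferedReader in = new BufferedReader(
                new InputStreamReader(Util.readVirtualFile(schemaUrl)));
        String line;
        while ((line = in.readLine()) != null) {
            schema.append(line).append('\n');
        }
        in.close();
        // Substitute a per-session value into the schema before parsing.
        String user = connectInfo.get("JDBCUser", "unknown");
        return schema.toString().replace("%USER%", user);
    }
}

The processor is then referenced from the datasource definition, e.g. DynamicSchemaProcessor=com.example.SessionSchemaProcessor in the Mondrian connect string.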


    Thanks
    Kavitha S

Rows duplicating on "change number of copies to start" in table output step

I have the below transformation.

sshot.jpg

There are 60 million rows to be loaded. The "change number of copies to start" property of the Table Output step is set to 2 in an attempt to increase throughput.
The 60 million rows need to go into both tables; the columns differ between them.
The transformation was test-run with 100 rows. The tables were loaded with 100 rows each, but there were duplicates in them.
How can it be ensured that only unique rows get inserted into the tables?

Chrome error while generating the CDE dashboard!

Hi Pmalves,

We are using the latest version of CDE. Everything seemed perfectly fine until we started getting the attached error in a couple of our dashboards that use vertical and horizontal bars to view the entire data. It seems to be a Chrome browser issue, but is there anything that could be done from the dashboard settings to avoid it? Please help. IMG_22012014_184702.jpg


Any help would be highly appreciated.

Best Regards,
Ramya

Error in cube

Hi all,
I was able to create an analysis report, but now I am getting the error "An error occurred while rendering Pivot.jsp. Please see the log for details."
Please see the attached log file.

Thank you in advance.

PentahoLog.txt

How to increase database throughput

I have a transformation which loads data in the range of 60 - 80 million rows.
When run individually, it gives a throughput of more than 7000 rows/sec. The transformation has to be run for about 70 source tables, each holding data in the range of 60 - 80 million rows. When run in parallel (6 transformations at once), the throughput falls to less than 1000 rows/sec.

Below are the settings that are currently used:

- In database connection defaultRowPrefetch = 200
- In transformation properties 'Nr of rows in rowset' is 50000
- In pan.sh JAVAMAXMEM="2048"
- In server/data-integration-server/start-pentaho.sh, CATALINA_OPTS="-Xmx2048m
- The JDBC driver used is ojdbc6.jar

There are no indexes on the target table. The database is Oracle. The server has 4 cores.
Please advise on what else can be done to increase performance.
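Not an answer from the thread, but for orientation: defaultRowPrefetch only helps reads; for inserts, the Table Output options "Use batch update for inserts" and a larger commit size usually matter more. At the JDBC level they approximate something like this sketch (connection URL, table, and interval are placeholder assumptions):

Code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchInsertSketch {
    public static void main(String[] args) throws SQLException {
        // Placeholder connection details -- substitute your own.
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/orcl", "user", "pass")) {
            con.setAutoCommit(false); // commit in batches, not per row
            try (PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO target_table (id, payload) VALUES (?, ?)")) {
                for (int i = 1; i <= 1000000; i++) {
                    ps.setInt(1, i);
                    ps.setString(2, "row-" + i);
                    ps.addBatch();
                    if (i % 5000 == 0) { // interval is a guess; tune it
                        ps.executeBatch();
                        con.commit();
                    }
                }
                ps.executeBatch(); // flush the remainder
                con.commit();
            }
        }
    }
}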

CDA datasource with more parameters

Hello,
I created a report that looks like a dashboard in order to be able to send it via mail. I use a graph and tables in PRD 3.9.

In the first preview I get all values. Then, if I change the value of 1 parameter (of 4), all values disappear except in the table where only 1 parameter is used. I experimented: it doesn't matter how many subreports I have; all subreports with only 1 parameter work, while subreports with 2 or more parameters return no result. If I switch back to the same parameter value as at the beginning (when it worked), the report is still without values, except for the tables with 1 parameter.

I'm using an inherited data source for the reports and not adding a new CDA datasource for each subreport. My parameters have different defaults: 1 without a default, 1 fixed value, and 2 dates (the first day of the month and yesterday) -- but setting these as fixed values doesn't change the behaviour...

What should I watch out for in PRD when using multiple parameters with a CDA datasource?

Thanks a lot for your help!
timfu83

Time series forecasting OutOfMemoryError when used programmatically

Hi, I don't really know if I am missing something here, but when I use Weka for time series forecasting through its graphical frontend, it works nicely and fast. When I try to do the same programmatically, however, I get the following error:

Code:

Transforming input data...
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at weka.classifiers.functions.GaussianProcesses.buildClassifier(GaussianProcesses.java:384)
    at weka.classifiers.timeseries.WekaForecaster$SingleTargetForecaster.buildForecaster(WekaForecaster.java:956)
    at weka.classifiers.timeseries.WekaForecaster.buildForecaster(WekaForecaster.java:1053)
    at TimeSeriesExample.main(TimeSeriesExample.java:48)

I am following this simple example: http://martinbmadsen.dk/weka-time-se...d-values-date/ , and yes, it works perfectly with the wine file, but it seems my file is too big, which is weird since, as I said, it works fine through the Weka frontend. My file has approximately 15000 records and only two data columns. I have also modified the memory parameters of the JVM.
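One thing worth double-checking (a hedged guess): the Weka GUI gets its heap size from its own launcher settings, which do not apply when you run your own main class; the limit has to go on your own java invocation, for example:

Code:

# Heap size and paths are assumptions -- adjust to your setup.
java -Xmx4g -cp weka.jar:. TimeSeriesExample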

I have attached my arff data file.

Thank you very much for your help!

passing the Kettle managed DB connection object to the Java API

We have come across a requirement to call a Java API from Kettle and pass the Kettle-managed DB connection to the Java API, so that transaction management is done in a single place, in Kettle. We are able to call the Java API but have no handle on how to share the same transaction.

Is there any way to achieve this?
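There is no confirmed answer here, but a hedged sketch of one approach: if the Java call happens inside a step that can reach the Kettle-managed connection (e.g. a User Defined Java Class step), Kettle's org.pentaho.di.core.database.Database wrapper exposes its underlying java.sql.Connection, which can be handed to the external API so both sides share one transaction:

Code:

import java.sql.Connection;

import org.pentaho.di.core.database.Database;

public class SharedTransactionSketch {

    /** Placeholder for the external Java API being called. */
    public interface ExternalApi {
        void doWork(Connection con) throws Exception;
    }

    // Hand Kettle's underlying JDBC connection to external code so it
    // participates in the same transaction; commit/rollback then stays
    // with Kettle (e.g. "Make the transformation database transactional").
    public static void callApi(Database kettleDb, ExternalApi api) throws Exception {
        Connection jdbc = kettleDb.getConnection();
        api.doWork(jdbc);
    }
}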

Pros and cons of using enterprise repository to store jobs and transformations in PDI 5.0

Hi,

I am planning to use the enterprise repository to store jobs and transformations, as well as to schedule them using the schedule option in Spoon in PDI 5.0. Could you please explain the pros and cons of using the schedule option in Spoon and of storing jobs in the enterprise repository?


Thanks and Regards,
Fabeena Mary

Map not plotting places in Pentaho CDE

Hi

I am using the CDE map component to plot places based on the user's selection, using an SQL query, and the places were plotted accordingly. I was using the input format (city, value), e.g. ("Delhi, Connaught Place"), and had given the center latitude and longitude of India, and everything was working fine. But suddenly, from 15 Jan 2014, everything stopped working: it is not plotting places on the map. I also checked the sample dashboards under the CDE references; except for the "Markers based on lat/Lon" dashboard, the other dashboards are not plotting places on the map either.

Can anyone tell me how to give the input format for the map component, using an SQL query, to plot the places?
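As a hedged illustration of the two-column (place, value) result set described above (table and column names are made up):

Code:

-- One row per place; the first column is geocoded, the second is the measure.
SELECT CONCAT(city, ', ', area) AS place,
       SUM(sales_value)         AS value
FROM   sales
GROUP  BY city, area;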

command line "java weka.Run" doesn't work

Hello all,

I need to use Weka from the command line. Installing packages from the command line works fine, following this documentation:
http://weka.wikispaces.com/How+do+I+...age+manager%3F

But when I try to use the installed package, with:
java weka.Run weka.classifiers.functions.LibLINEAR (for example)

I get this error: Error: Could not find or load main class weka.Run

I am using the Linux version weka-3-7-10/
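That error is the JVM saying it cannot find the weka.Run class at all, which almost always means weka.jar is not on the classpath; something like this usually fixes it (the path is an assumption):

Code:

java -cp /path/to/weka-3-7-10/weka.jar weka.Run weka.classifiers.functions.LibLINEAR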

Can you help, please?

Thank you for your time

Error running job with ssh from PUC

Hi everybody!

I have a job which has to connect to a host to kill some processes. When I run it on my computer with Kettle, I have no problem; the job runs correctly. The problem comes when I want to execute this job from the Pentaho User Console. The job is in a database repository.
So I created an .xaction, as I do for my other jobs. But when I execute it by double-clicking, I get this error in catalina.out:
Quote:

PDI encountered an error while loading the entry [SSH_supp_sqlsrv2_process.0]

An unexpected error occurred while trying to load the transformation information

PDI encountered an error while reading a transformation from the repository

Error loading plugin:
com/trilead/ssh2/ProxyData
It seems to be a problem with a proxy? But I don't use any proxy parameter in my job.
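A hedged guess rather than a confirmed fix: "Error loading plugin: com/trilead/ssh2/ProxyData" looks like a missing class rather than a proxy setting, i.e. the Trilead SSH2 library that the SSH job entry uses is not on the BI server's classpath. Copying it from the PDI client may be worth trying (jar name and paths are assumptions):

Code:

# libext/ on PDI 4.x; lib/ on 5.x -- adjust to your install
cp data-integration/libext/trilead-ssh2-*.jar \
   biserver-ce/tomcat/webapps/pentaho/WEB-INF/lib/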

I have set up private/public SSH keys between the host where the PUC is installed and the host on which I have to kill the processes, so that no password has to be entered.

Here is my job in Kettle (sorry, I'm French, so it's written in... French):


Thanks in advance! ;)

CDA Scheduler Error

Dear all,

When I start up the BI server, the following error is logged in my pentaho.log and catalina.out:

3:42:17,492 ERROR [SolutionEngine] db90aee9-84c6-11e3-b0ab-6533129347f4:SOLUTION-ENGINE:scheduler.xaction: Action Sequence execution failed, see details below
| Error Time: Friday, January 24, 2014 1:42:17 PM MMT
| Session ID: scheduler.xaction
| Instance Id: db90aee9-84c6-11e3-b0ab-6533129347f4
| Action Sequence: scheduler.xaction
| Execution Stack:
EXECUTING ACTION: Scheduler (org.pentaho.platform.engine.services.solution.PojoComponent)
| Action Class: org.pentaho.platform.engine.services.solution.PojoComponent
| Action Desc: Scheduler
| Loop Index (1-based): 0
Stack Trace:org.pentaho.platform.api.engine.ActionExecutionException: RuntimeContext.ERROR_0017 - Action failed to execute
at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeComponent(RuntimeContext.java:1325)
at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeAction(RuntimeContext.java:1262)
at org.pentaho.platform.engine.services.runtime.RuntimeContext.performActions(RuntimeContext.java:1161)
at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeLoop(RuntimeContext.java:1105)
at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeSequence(RuntimeContext.java:987)
at org.pentaho.platform.engine.services.runtime.RuntimeContext.executeSequence(RuntimeContext.java:897)
at org.pentaho.platform.engine.services.solution.SolutionEngine.executeInternal(SolutionEngine.java:399)
at org.pentaho.platform.engine.services.solution.SolutionEngine.execute(SolutionEngine.java:317)
at org.pentaho.platform.engine.services.solution.SolutionEngine.execute(SolutionEngine.java:193)
at org.pentaho.platform.engine.services.BaseRequestHandler.handleActionRequest(BaseRequestHandler.java:159)
at org.pentaho.platform.scheduler.QuartzExecute.execute(QuartzExecute.java:198)
at org.quartz.core.JobRunShell.run(JobRunShell.java:203)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:520)

Although pamval said it does not affect system behavior in
http://forums.pentaho.com/showthread...xection-failed
http://forums.pentaho.com/showthread...2337-CDA-error

could you please tell me how to solve this error?
Or is this a bug in Pentaho?

I would be very grateful for any reply.
Thank you.

to define query execution time

Hi,

In the Table Input step, if a query takes a long time to retrieve data from the table, is there any way to define a query execution time limit so that the flow continues without interruption?
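For reference, the underlying JDBC API does expose a per-statement timeout; whether and how a given Kettle step or driver surfaces it is another matter, so treat this as a sketch of the mechanism only (connection details are placeholders):

Code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLTimeoutException;
import java.sql.Statement;

public class QueryTimeoutSketch {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/orcl", "user", "pass");
             Statement st = con.createStatement()) {
            st.setQueryTimeout(60); // give up after 60 seconds
            try (ResultSet rs = st.executeQuery("SELECT * FROM big_table")) {
                while (rs.next()) {
                    // process rows
                }
            } catch (SQLTimeoutException e) {
                // let the flow continue instead of failing hard
                System.err.println("Query timed out: " + e.getMessage());
            }
        }
    }
}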

Thanks in advance.

Regards,
Poulomi

Authentication in User Console in PENTAHO 5.0.1 CE

Hi!
I'm using Microsoft Active Directory as the security provider and I need to configure my Pentaho server to authenticate users through my AD.
I have been reading the documentation for Pentaho 5.0 CE and... "From the User Console Home menu, click Administration, then select Authentication from the left. The Authentication interface appears"... and here is the problem! In my console, inside the Administration menu I have only three options:
  • Users and Roles
  • Mail Server
  • Settings

Is it possible that the Authentication option is available in the Enterprise Edition exclusively?
Why don't I see this option?
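For what it's worth: the Administration > Authentication screen is an Enterprise Edition feature; in CE the provider is switched in the configuration files instead. A minimal sketch, assuming the standard 5.0.x CE layout:

Code:

# pentaho-solutions/system/security.properties
# Switch from the default "jackrabbit" provider to LDAP/AD; the LDAP
# details then go in applicationContext-security-ldap.properties.
provider=ldap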
Thanks a lot for your time!

Susana