Channel: Pentaho Community Forums

What options do I have to set session variables?

Hello,

I want to set session variables in order to filter the queries when a user accesses a report (PRD) or a dashboard (CDE) through a URL from an external page.

I have seen the Startup Rule Engine. If I access with the credentials attached to the URL, would the variable values be set and would the dashboard use them in the queries properly? Would it make access very slow? Is there a better way to do this?
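
For reference, a minimal sketch of the plain URL-parameter alternative (independent of the Startup Rule Engine): declare a report/dashboard parameter and reference it in the data source query. The parameter name "region" and the table/column names below are made-up illustrations, and the exact URL syntax for passing the value (for example &paramregion=EMEA for a CDE dashboard, or &region=EMEA for a PRD report parameter) varies by version.

Code:

-- Illustrative only: a CDA/PRD SQL query consuming a parameter named "region"
-- that was passed on the URL from the external page.
SELECT customer, SUM(amount) AS total
FROM   sales
WHERE  region = ${region}
GROUP BY customer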

Thank you.

Mondrian/Saiku large data performance

Hi, I have a problem displaying a large body of data from a cube in Pentaho BI Server 5.3 using Saiku.
I used the Aggregation Designer to optimize the cube and use aggregate tables, but it barely improved anything.
I can run a simple 1:1 crossing without trouble (~500,000 records in 15 seconds), but as soon as I add one more level to the crossing, Saiku cannot return the result.
My server has 16 processors and 32 GB of memory; Pentaho is configured to use 25 GB of it.
Can anyone help me with this problem?
Thanks.

mondrian.properties

Quote:

mondrian.rolap.aggregates.Use=true
mondrian.rolap.aggregates.Read=true
mondrian.result.limit=20000000
mondrian.rolap.queryTimeout=3600
mondrian.trace.level=0
mondrian.query.limit=40
mondrian.rolap.LargeDimensionThreshold=100
mondrian.rolap.star.disableCaching=false
mondrian.rolap.generate.formatted.sql=false
mondrian.olap.case.sensitive=true
mondrian.expCache.enable=true
mondrian.native.crossjoin.enable=true
mondrian.native.topcount.enable=true
mondrian.native.filter.enable=true
mondrian.native.nonempty.enable=true
mondrian.rolap.groupingsets.enable=true
mondrian.rolap.maxConstraints=1000
mondrian.rolap.evaluate.MaxEvalDepth=10
mondrian.rolap.ignoreInvalidMembers=true
mondrian.rolap.ignoreInvalidMembersDuringQuery=true
mondrian.rolap.iterationLimit=500000
mondrian.native.unsupported.alert=WARN
mondrian.rolap.compareSiblingsByOrderKey=true
mondrian.olap.NullDenominatorProducesNull=true
mondrian.native.ExpandNonNative=true
mondrian.olap.elements.NeedDimensionPrefix=true
mondrian.spi.dataSourceResolverClass=org.pentaho.platform.web.servlet.PentahoDataSourceResolver
mondrian.rolap.maxQueryThreads=20
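
The properties above already enable aggregate reading and use, so one thing worth checking is whether the generated aggregate tables actually collapse the fact table to the levels the crossing groups by; if they keep the original grain, Mondrian gains little from them. Below is a minimal sketch of the shape such a table usually has, including the fact-count column Mondrian expects; every table and column name here is an illustrative assumption.

Code:

-- Illustrative only: an aggregate table helps when it is much smaller than the
-- fact table and pre-aggregates exactly the levels being queried.
CREATE TABLE agg_sales_by_month_product AS
SELECT time_month_id,
       product_id,
       SUM(amount) AS amount_sum,
       COUNT(*)    AS fact_count
FROM   fact_sales
GROUP BY time_month_id, product_id;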

Use variables from previous results as input to a query with the Cassandra Input step

I would like to use Kettle to run some transformations against our Cassandra databases. Kettle includes a plugin to read from and write to Cassandra using CQL.
I am trying to read records from a column family using a CQL query whose filters depend on records from a previous step, using a query like this:

SELECT * FROM snpsearch WHERE idLine1 =? AND idLine2 = ? AND partid = 0 LIMIT 10;

The previous step generates output like this:
    ID1  ID2
#1   10   11
#2   11   12

But when I execute the transformation I get the following error:

Code:

2015/07/16 16:24:57 - Get Individuals Pairs.0 - SQL query : SELECT IND1.ID AS ID1,IND2.ID AS ID2 FROM INDIVIDUALS IND1, INDIVIDUALS IND2
2015/07/16 16:24:57 - Get Individuals Pairs.0 - WHERE IND1.ID<IND2.ID AND
2015/07/16 16:24:57 - Get Individuals Pairs.0 - IND1.EXPERIMENT_ID IN ( SELECT ID FROM EXPERIMENTS    WHERE EXPERIMENTS.PROJECT_ID= 135
2015/07/16 16:24:57 - Get Individuals Pairs.0 -                            )
2015/07/16 16:24:57 - Get Individuals Pairs.0 - AND IND2.EXPERIMENT_ID IN (
2015/07/16 16:24:57 - Get Individuals Pairs.0 -                              SELECT ID FROM EXPERIMENTS    WHERE EXPERIMENTS.PROJECT_ID= 135
2015/07/16 16:24:57 - Get Individuals Pairs.0 -                            );
2015/07/16 16:24:57 - Get Individuals Pairs.0 - Signaling 'output done' to 1 output rowsets.
2015/07/16 16:24:57 - Get Individuals Pairs.0 - Finished reading query, closing connection.
2015/07/16 16:24:57 - Cassandra Input.0 - Connecting to Cassandra node at 172.31.7.241 : 9160 using keyspace snpaware ...
2015/07/16 16:24:57 - Cassandra Input.0 - Using connection options: cqlVersion=3.0.1
2015/07/16 16:24:57 - SNPAWARE@GTDEV - Connection to database closed!
2015/07/16 16:24:57 - Get Individuals Pairs.0 - Finished processing (I=3, O=0, R=0, W=3, U=0, E=0)
2015/07/16 16:25:31 - Cassandra Input.0 - Getting meta data for column family snpsearch
2015/07/16 16:25:32 - Cassandra Input.0 - Executing query SELECT * FROM snpsearch WHERE idLine1 =? AND idLine2 = ? AND partid = 0 LIMIT 10;  ...
2015/07/16 16:26:03 - Cassandra Input.0 - Closing connection ...
2015/07/16 16:26:03 - Cassandra Input.0 - ERROR (version 5.4.0.1-130, build 1 from 2015-06-14_12-34-55 by buildguy) : Unexpected error
2015/07/16 16:26:03 - Cassandra Input.0 - ERROR (version 5.4.0.1-130, build 1 from 2015-06-14_12-34-55 by buildguy) : org.pentaho.di.core.exception.KettleException:
2015/07/16 16:26:03 - Cassandra Input.0 - null
2015/07/16 16:26:03 - Cassandra Input.0 -  at java.lang.Thread.run (Thread.java:745)
2015/07/16 16:26:03 - Cassandra Input.0 -  at org.pentaho.di.trans.step.RunThread.run (RunThread.java:62)
2015/07/16 16:26:03 - Cassandra Input.0 -  at org.pentaho.di.trans.steps.cassandrainput.CassandraInput.processRow (CassandraInput.java:223)
2015/07/16 16:26:03 - Cassandra Input.0 -  at org.pentaho.di.trans.steps.cassandrainput.CassandraInput.initQuery (CassandraInput.java:322)
2015/07/16 16:26:03 - Cassandra Input.0 -  at org.pentaho.cassandra.legacy.LegacyCQLRowHandler.newRowQuery (LegacyCQLRowHandler.java:285)
2015/07/16 16:26:03 - Cassandra Input.0 -  at org.apache.cassandra.thrift.Cassandra$Client.execute_cql3_query (Cassandra.java:1678)
2015/07/16 16:26:03 - Cassandra Input.0 -  at org.apache.cassandra.thrift.Cassandra$Client.recv_execute_cql3_query (Cassandra.java:1693)
2015/07/16 16:26:03 - Cassandra Input.0 -  at org.apache.thrift.TServiceClient.receiveBase (TServiceClient.java:78)
2015/07/16 16:26:03 - Cassandra Input.0 -  at org.apache.cassandra.thrift.Cassandra$execute_cql3_query_result.read (Cassandra.java:48924)
2015/07/16 16:26:03 - Cassandra Input.0 -  at org.apache.cassandra.thrift.Cassandra$execute_cql3_query_result$execute_cql3_query_resultStandardScheme.read (Cassandra.java:49009)
2015/07/16 16:26:03 - Cassandra Input.0 -  at org.apache.cassandra.thrift.Cassandra$execute_cql3_query_result$execute_cql3_query_resultStandardScheme.read (Cassandra.java:49032)
2015/07/16 16:26:03 - Cassandra Input.0 -
2015/07/16 16:26:03 - Cassandra Input.0 -    at org.pentaho.di.trans.steps.cassandrainput.CassandraInput.initQuery(CassandraInput.java:344)
2015/07/16 16:26:03 - Cassandra Input.0 -    at org.pentaho.di.trans.steps.cassandrainput.CassandraInput.processRow(CassandraInput.java:223)
2015/07/16 16:26:03 - Cassandra Input.0 -    at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2015/07/16 16:26:03 - Cassandra Input.0 -    at java.lang.Thread.run(Thread.java:745)
2015/07/16 16:26:03 - Cassandra Input.0 - Caused by: InvalidRequestException(why:Invalid amount of bind variables)

It looks like the step is not getting the parameters from the previous result. Is it possible to do this, or can the Cassandra plugin only use variables, and not parameters that come from previous steps?
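
One possible workaround, assuming the step only performs Kettle variable substitution and does not bind '?' placeholders (which the 'Invalid amount of bind variables' error suggests), is to run this transformation once per incoming row from a parent job entry ("Execute every input row"), pass ID1 and ID2 in as named parameters, and substitute them into the CQL:

Code:

-- Sketch only: ${ID1} and ${ID2} are assumed to arrive as named parameters set
-- by a parent job that executes the transformation once per row.
SELECT * FROM snpsearch
WHERE idLine1 = ${ID1}
  AND idLine2 = ${ID2}
  AND partid = 0
LIMIT 10;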

More information with screenshots (I could not upload them to this post) is at: http://stackoverflow.com/questions/3...ry-with-kettle

Someone put an emoji in a text field and broke things. How can I fix this?

Hi there--

Well, after about a million emails (literally close to 1 million) I guess someone put an emoji in a free-form text field, and it has broken my Pentaho ETL process.

So simple, yet so maddening.


Essentially, our ticket form has several free form fields ... the body text/ message itself, for instance.

Anyway, somebody put in the emoji/symbol [it does not render here] .... EDIT: Irony of ironies, the little-man emoji renders in the free-form text editor here but not in the post. It's this character here:

http://apps.timwhitlock.info/unicode/inspect?s=%F0%9F%91%A4



And I got this error message in Pentaho when doing a SQL update with that string ...
Quote:

Error inserting/ updating row --- Incorrect string value: '\xF0\x9F\x91\xA4 ...' for column 'custom_fields' at row 1


I believe this is the UTF-8 encoding of the emoji/symbol.



Anyway ... what gives here? Surely, this cannot be the first time someone put in, or attempted to put in, an emoji somewhere after 1 million tickets. The platform in question here is Zendesk, and Zendesk renders the emoji in its JSON file as the person.

Is this a problem with Zendesk not 'cleansing' this symbol, or whatever the technical term is? How would I test this with other symbols? Should I reach out to them? (their help desk can be a bit slow and non-technical at times).

Secondly ... how can I 'cleanse' this symbol in the data in Pentaho so the process can continue? Frankly, I don't care whether the symbol is deleted, or rendered as UTF-8, or whatever.

I am a bit confused on what actual data is being transmitted/ read/ converted in Pentaho.

Again, it shows up as the tiny-man symbol in the original data. Spoon references the UTF-8 bytes in its error (and why would it not be able to insert those bytes as a string in MySQL?), and when I simply use Spoon to create a JSON file from the data, it seems to render the symbol as a question mark (?), which is perfectly fine. I suppose I could add an intermediary step (take the online JSON file, produce a new JSON file, which would convert all the garbage to ?, then read that new JSON file). It would be a tad annoying, but I guess it's one possible solution.
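
For what it is worth, the bytes \xF0\x9F\x91\xA4 in the error are the 4-byte UTF-8 encoding of U+1F464, and MySQL's legacy 3-byte "utf8" character set cannot store 4-byte characters, which is exactly what produces "Incorrect string value". A minimal sketch of the schema-side fix, assuming the target table is called "tickets" (the column name comes from the error message; the table name and column type are assumptions):

Code:

-- Sketch only: convert the affected column to utf8mb4 so 4-byte characters
-- such as U+1F464 can be stored; "tickets" and TEXT are assumptions.
ALTER TABLE tickets
  MODIFY custom_fields TEXT
  CHARACTER SET utf8mb4
  COLLATE utf8mb4_unicode_ci;

If changing the schema is not an option, stripping or replacing the 4-byte characters in an earlier step, much like the question-mark substitution described above, achieves the same effect.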

Thoughts?




enable / disable features depending on the user

Is it possible to enable/disable features in a dashboard depending on the logged-in user?
In particular, I have a CDE dashboard with several links that I only want to enable for power users.
thanks

cpk.session.username = UNDEFINED in Startup Rule Engine sample

Edit: Sorry, it's cpk.session.roles, not username. Running log.kjb at login, cpk.session.roles is undefined in the log.

Hello,

I have just installed the Startup Rule Engine on Pentaho BI Server 5.3. With the samples I get cpk.session.roles = UNDEFINED, but cpk.session.username, cpk.solution.system.dir and cpk.webapp.dir are set.


Sparkl is up to date.


Pentaho Update step

Guys,

I receive a file with about 120K records and use the Update step to look up values in the DB and then perform the updates. Unfortunately, I am finding that some records were skipped and their updates were not performed. Any ideas why? The database is MySQL.

PDI 5.4 and Oracle Java 8 Compatibility

Hello, we recently upgraded from PDI Community 5.2 to PDI Community 5.4. The documentation says PDI is certified with Oracle Java 7. Does anyone know if Oracle Java 8 can be used with PDI 5.4? Before we upgrade our servers I thought I would ask this question on the forum. Currently we are running PDI 5.4 on 64-bit Linux with Oracle Java SE 1.7.0_45 (64-bit edition). When running large data loads we are seeing errors like "Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x00000003d0480000, 1170735104, 0) failed; error='Cannot allocate memory' (errno=12)". It looks like this is a bug in Java 1.7.0_45 per http://stackoverflow.com/questions/2...va-application. So my initial thought was to upgrade to the latest Java 1.7 build, but now I am wondering whether we can use Java 8 with PDI 5.4. Please share your experience or thoughts on pairing PDI 5.4 with Oracle Java 8. Thanks.

Any plans to leave SourceForge?

Today I tried to use the Marketplace to install SaikuChartsPlus and it failed. A brief investigation showed that the SourceForge download is not available, and it seems like most of the SF.net site was down.
I did a quick Google search on the subject and discovered that good, big projects like VideoLAN and GIMP are leaving SF.net and building their own distribution networks, because SourceForge keeps bundling downloads with malware and adware to generate revenue.
So I am wondering whether the Pentaho project is aware of these issues and has any plans to address them.

Rita

Fresh install. Dozens of exceptions on startup

biserver-ce-5.4.0.1-130
jdk-8u51-windows-x64.exe

Upon startup, it says

C:\Tools\biserver-ce> start-pentaho.bat
DEBUG: Using JAVA_HOME
DEBUG: _PENTAHO_JAVA_HOME=c:\Program Files\Java\jre1.8.0_51
DEBUG: _PENTAHO_JAVA=c:\Program Files\Java\jre1.8.0_51\bin\java.exe
Using CATALINA_BASE: "C:\Tools\biserver-ce\tomcat"
Using CATALINA_HOME: "C:\Tools\biserver-ce\tomcat"
Using CATALINA_TMPDIR: "C:\Tools\biserver-ce\tomcat\temp"
Using JRE_HOME: "c:\Program Files\Java\jre1.8.0_51"
Using CLASSPATH: "C:\Tools\biserver-ce\tomcat\bin\bootstrap.jar"


In the Tomcat log:

[Server@30ee2816]: To close normally, connect and execute SHUTDOWN SQL
[Server@30ee2816]: From command line, use [Ctrl]+[C] to abort abruptly
22:24:38,863 ERROR [ProxyWeavingHook] There was a serious error trying to weave the class org.apache.aries.blueprint.container.BlueprintContainerImpl. See the associated exception for more information.
java.lang.IllegalArgumentException

22:24:38,872 ERROR [ProxyWeavingHook] There was a serious error trying to weave the class org.apache.felix.cm.impl.ConfigurationManager. See the associated exception for more information.
java.lang.IllegalArgumentException

ERROR: Bundle org.apache.felix.configadmin [12] Error starting file:/C:/Tools/biserver-ce/pentaho-solutions/system/osgi/core_bundles/org.apache.felix.configadmin-1.8.0.jar (org.osgi.framework.BundleException: Activator start error in bundle org.apache.felix.configadmin [12].)
java.lang.ClassFormatError: Weaving hook failed.

22:24:39,068 ERROR [Logger] misc-class org.pentaho.platform.plugin.services.pluginmgr.DefaultPluginManager: PluginManager.ERROR_0011 - Failed to register plugin cda
org.springframework.beans.factory.BeanDefinitionStoreException: Unexpected exception parsing XML document from file [C:\Tools\biserver-ce\pentaho-solutions\system\cda\plugin.spring.xml]; nested exception is java.lang.IllegalStateException: Context namespace element 'annotation-config' and its parser class [org.springframework.context.annotation.AnnotationConfigBeanDefinitionParser] are only available on JDK 1.5 and higher

I have the JDK installed.

22:24:48,982 ERROR [CoreBeanFactory] Spring definition file does not exist. There should be a <plugin_name>.spring.xml file on the classpath
22:24:48,995 ERROR [CdeEngine] Error initializing CdeEngine: [ java.lang.NullPointerException ] - null

Seems completely borked. Any ideas?

REST Client not working with API

I am using the REST Client to create logins in Pentaho in bulk, using this API: "http://localhost:8080/pentaho/api/userroledao/createUser"

I am taking input from Excel, reshaping it into JSON format in a JavaScript step, and in the next step I am using the REST Client; at that step it is not working. Can anyone help me?

Thanks in Advance

PDI 5.3.0 CE - Can't save new/existing mappings to repo after error.

Hi,

We have a MySQL repo that had been working OK until it threw a 'java.lang.ArrayIndexOutOfBoundsException' error. Now I can't save any new jobs/transformations. Also, I cannot save any existing job/transformation.

This is preventing me from completing the upgrade from 5.1.

Please Help!


The detailed message :-

java.lang.reflect.InvocationTargetException
at org.eclipse.jface.operation.ModalContext.run(ModalContext.java:350)
at org.eclipse.jface.dialogs.ProgressMonitorDialog.run(ProgressMonitorDialog.java:495)
at org.pentaho.di.ui.spoon.dialog.SaveProgressDialog.open(SaveProgressDialog.java:81)
at org.pentaho.di.ui.spoon.Spoon.saveToRepository(Spoon.java:5177)
at org.pentaho.di.ui.spoon.Spoon.saveFileAs(Spoon.java:5266)
at org.pentaho.di.ui.spoon.Spoon.saveFileAs(Spoon.java:5234)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)
at org.pentaho.ui.xul.jface.tags.JfaceMenuitem.access$100(JfaceMenuitem.java:43)
at org.pentaho.ui.xul.jface.tags.JfaceMenuitem$1.run(JfaceMenuitem.java:106)
at org.eclipse.jface.action.Action.runWithEvent(Action.java:498)
at org.eclipse.jface.action.ActionContributionItem.handleWidgetSelection(ActionContributionItem.java:545)
at org.eclipse.jface.action.ActionContributionItem.access$2(ActionContributionItem.java:490)
at org.eclipse.jface.action.ActionContributionItem$5.handleEvent(ActionContributionItem.java:402)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1316)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7979)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9310)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:654)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
Caused by: java.lang.ArrayIndexOutOfBoundsException
at java.lang.System.arraycopy(Native Method)
at org.pentaho.di.repository.kdr.KettleDatabaseRepositoryCreationHelper.getAndCopyStepTypeIds(KettleDatabaseRepositoryCreationHelper.java:3037)
at org.pentaho.di.repository.kdr.KettleDatabaseRepositoryCreationHelper.loadPluginsIds(KettleDatabaseRepositoryCreationHelper.java:3053)
at org.pentaho.di.repository.kdr.KettleDatabaseRepositoryCreationHelper.updateStepTypes(KettleDatabaseRepositoryCreationHelper.java:2990)
at org.pentaho.di.repository.kdr.KettleDatabaseRepository.updateStepTypes(KettleDatabaseRepository.java:1476)
at org.pentaho.di.repository.kdr.delegates.KettleDatabaseRepositoryTransDelegate.saveTransformation(KettleDatabaseRepositoryTransDelegate.java:264)
at org.pentaho.di.repository.kdr.KettleDatabaseRepository.save(KettleDatabaseRepository.java:407)
at org.pentaho.di.repository.kdr.KettleDatabaseRepository.save(KettleDatabaseRepository.java:387)
at org.pentaho.di.repository.AbstractRepository.save(AbstractRepository.java:126)
at org.pentaho.di.ui.spoon.dialog.SaveProgressDialog$1.run(SaveProgressDialog.java:70)
at org.eclipse.jface.operation.ModalContext$ModalContextThread.run(ModalContext.java:113)

How to map values from one CSV to another?

I have one CSV file containing article data which is structured as
follows (simplified):


Artnr;EAN;Artkern;
"111";"1234";"Some String"


Now I would like to map this data into a structure like below
(again simplified) and output it to a new CSV file:


sku,_something_a,description,something_b,...
"1234",,"Some String",,


I've created a template CSV file holding only the header of the
target structure. Then I've added an input step reading this file.
Next I have added a second input step which reads the CSV file
holding the actual data. I have tried joins, selects, etc., but here
I am lost (I am pretty new to Kettle).


Which steps are necessary to merge these structures together into a new CSV file?

pentaho configuration

Hi friends,
I've been trying for 2 days to open a Saiku view in a new tab when clicking a button that I made in index.jsp (Pentaho 5.0.1).
I can open reports and dashboards, but not Saiku views.
I have been trying a lot of things and I wonder why this doesn't work:
onclick=top.mantle_openTab('Saiku','Saiku','content/saiku-ui/index.html?biplugin5=true&dimension_prefetch=true#query/open//public/test.saiku')

these also don't work:
onclick=top.mantle_openTab('Saiku','Saiku','content/saiku-ui/index.html?biplugin5=true&dimension_prefetch=true#query/open/%2Fpublic%2Ftest.saiku')
onclick=top.mantle_openTab('Saiku','Saiku','content/saiku-ui/index.html?biplugin5=true&dimension_prefetch=true#query/open/%3Apublic%3Atest.saiku')

This works, but it does not open in a new tab, and I really need it to open in a new tab:
onclick="window.location.href=('../../content/saiku-ui/index.html?biplugin5=true&dimension_prefetch=true&mode=edit#query/open//public/test.saiku')"

Thanks for your help; any suggestion will be rewarded with my eternal gratitude :D

performance question

Hello, a couple of silly questions.

A Mondrian schema has been created, which obviously allows MDX queries to be written against it. As part of fetching the data, the MDX command is interpreted and SQL retrieves the data from the underlying tables. Therefore, is it just as efficient, if not more efficient, to write a query in SQL and get the underlying data directly?

Is there an IDE to generate the schema, or must one hand-craft the XML?

Thanks

Pie Chart in pentaho report designer

I am trying to create a pie chart to show the number of employees active and terminated per company.

Company A, employee 1, active
Company A, employee 2, terminated
Company B, employee 1, active
Company B, employee 2, active

I keep getting no data available.

I am unsure how to edit the chart to display the information I need.
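
If it helps, a pie chart dataset generally wants one label column and one numeric value column, so pre-aggregating the counts in the query is often the simplest route. A minimal sketch, assuming a table employees(company, status) and a report parameter "company"; all names are illustrative:

Code:

-- Sketch only: one row per slice (status), with the count as the slice value.
SELECT status,
       COUNT(*) AS employees
FROM   employees
WHERE  company = ${company}
GROUP BY status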


Thanks!!
(Attached image: Screen Shot 2015-07-17 at 4.10.54 PM.jpg)

Copy Weka strings into R package RWeka

Is it possible to take the strings that Weka generates when you modify the settings for Weka's classifiers and paste them into R using the RWeka package? Currently, Weka only generates source code for Java.
Any solution? Thanks.

The server sent HTTP status code 401: Unauthorized

Hi,
I wonder what the solution is for "The server sent HTTP status code 401: Unauthorized" in Pentaho. OS: RHEL 6.6

Thanks

Ujjwal Rana

Create a user defined log file

Hi,

How can we create a user-defined log file containing the information we require? I have attached one log file here for reference.

Ex:
The log file should contain the following information:
1. How many source records were processed?
2. How many source records were written to the target file?
3. The log file name should be <Transformation Name>.log


Thanks,
Madhu

Time frame analysis

Hi,

I would like to analyse time frames / activities, e.g.:
start / end / activity
01jan2015 / 02jan2015 / activity1
02jan2015 / 03jan2015 / activity2
01jan2015 / 02jan2015 / activity2
03jan2015 / 06jan2015 / activity1

I would like to know, e.g., how many activity1 rows have been active at any given time, and at which times/dates which activities are active, etc. (as a heatmap or graph). How can such an analysis be performed?
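
One way to get "how many of each activity were active on each day" is sketched below, under the assumption that the rows live in a table activity_log(start_date, end_date, activity) and that a one-row-per-day calendar table dates(d) is available; all names are illustrative. The result can then feed a heatmap with the day on one axis and the activity on the other.

Code:

-- Sketch only: join each interval to the days it covers, then count per day
-- and activity (adjust the <= / < boundary handling to taste).
SELECT d.d        AS day,
       a.activity,
       COUNT(*)   AS active_count
FROM   dates d
JOIN   activity_log a
  ON   d.d >= a.start_date
 AND   d.d <  a.end_date
GROUP BY d.d, a.activity
ORDER BY d.d, a.activity;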

Thank you!
Mike