Channel: Pentaho Community Forums

Problem with export to SQL of a Date field

Hi to all.
I have a problem exporting a CSV file to a database table.
The problem is a field with the format 20140326T194828.
I added a Calculator step to convert it to the format yyyy/MM/dd HH:mm:ss before the table output step, but the output step fails with an error.
Can you help me?
I attach the transformation.
Thanks a lot
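For anyone hitting the same thing: a value like 20140326T194828 needs the conversion mask yyyyMMdd'T'HHmmss (the literal T must be quoted), which in PDI would typically go in the date format mask of a Select Values or Calculator step. A minimal sketch in plain Java of the same conversion the transformation is attempting — the class and method names are just for illustration:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class DateConvert {
    // Parse the compact ISO-like timestamp and re-format it.
    // The 'T' literal must be quoted in the input pattern.
    public static String convert(String input) throws ParseException {
        SimpleDateFormat in = new SimpleDateFormat("yyyyMMdd'T'HHmmss");
        SimpleDateFormat out = new SimpleDateFormat("yyyy/MM/dd HH:mm:ss");
        Date d = in.parse(input);
        return out.format(d);
    }

    public static void main(String[] args) throws ParseException {
        System.out.println(convert("20140326T194828")); // 2014/03/26 19:48:28
    }
}
```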

GiorgioProva.ktr

Can't access remote Pentaho BI Server

I uploaded my "pentaho biserver-ce-6.0.1.0-386" to my hosting and set the required variables, but I keep getting errors like this:
04-Mar-2016 11:14:27.427 SEVERE [localhost-startStop-1] org.apache.tomcat.util.descriptor.web.SecurityConstraint.findUncoveredHttpMethods For security constraints with URL pattern [/jsp/*] only the HTTP methods [POST GET] are covered. All other methods are uncovered.
I couldn't find a guide or tutorial for running the BI Server on remote hosting; on my local PC it works perfectly.
I modified web.xml for my domain name; my Pentaho BI Server is under the directory "demo-odoo.ecommlean.com/pentaho".
I attach my logs and web.xml: https://www.dropbox.com/sh/75wexsqxf...bU9uuY2sa?dl=0


eco.web08 ~ # cat /var/www/demo-odoo.ecommlean.com/htdocs/pentaho/biserver-ce/tomcat/logs/pentaho.log
2016-03-04 11:13:21,975 INFO [org.pentaho.platform.osgi.KarafInstance]
*******************************************************************************
*** Karaf Instance Number: 1 at /var/www/demo-odoo.ecommlean.com/htdocs/pentaho/biserver-ce/pentaho-solutions/system/karaf//data1 ***
*** Karaf Port:8801 ***
*** OSGI Service Port:9050 ***
*** JMX RMI Registry Port:11098 ***
*** RMI Server Port:44444 ***
*******************************************************************************
2016-03-04 11:13:29,897 INFO [org.pentaho.di] 2016/03/04 11:13:29 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016-03-04 11:13:40,657 ERROR [org.pentaho.di.osgi.KarafLifecycleListener] The Kettle Karaf Lifycycle Listener failed to execute properly. Releasing lifecycle hold, but some services may be unavailable.
2016-03-04 11:13:51,859 ERROR [org.apache.felix.configadmin.1.8.0] [[org.osgi.service.cm.ConfigurationAdmin]] Cannot use configuration org.pentaho.requirejs for [org.osgi.service.cm.ManagedService, id=562, bundle=187/mvn:pentaho/pentaho-requirejs-osgi-manager/6.0.1.0-386]: No visibility to configuration bound to mvn:pentaho/pentaho-server-bundle/6.0.1.0-386



netstat -an | grep -w -e 8801 -e 9050 -e 11098 -e 44444
tcp6 0 0 :::11098 :::* LISTEN
tcp6 0 0 :::44444 :::* LISTEN
tcp6 0 0 127.0.1.1:44444 127.0.0.1:47797 ESTABLISHED
tcp6 0 0 127.0.0.1:47797 127.0.1.1:44444 ESTABLISHED

Two java folders up

Hi everyone,
I just installed Pentaho BA 6.0.1 (the latest version) on Ubuntu. After installation I was able to see the Pentaho User Console, until the system went down. Once the system was restarted, the Tomcat service would not start, saying "THE REQUESTED RESOURCE IS NOT AVAILABLE", and later I observed that the debug output says "FOUND JAVA TWO FOLDERS UP", and the console no longer comes up. After that I removed all the Java versions from the system, uninstalled Pentaho as well, and tried to install again; this time, too, the console worked after installation but the same error appeared after restarting the system, this time with an additional debug message, "FOUND PENTAHO LICENSE TWO FOLDERS UP". I tried many rounds of trial and error by searching Google but couldn't fix it.

Can anybody help me fix this?


Thanks and Regards
Mallikarjun

ReservedCharacters returned a response status of 404 Not Found

Hello everybody,

I have a problem when I publish a schema; this error happens:

DEBUG: Using JAVA_HOME
DEBUG: _PENTAHO_JAVA_HOME=C:\Program Files\Java\jdk1.7.0_79
DEBUG: _PENTAHO_JAVA=C:\Program Files\Java\jdk1.7.0_79\bin\java
09:14:46,094 INFO [MondrianProperties] Mondrian: properties loaded from 'file:/C:/Users/gabrielgomes/.schemaWorkbench/mondrian.properties'
09:14:46,095 INFO [MondrianProperties] Mondrian: loaded 0 system properties
09:14:46,202 INFO [StandardFileSystemManager] Using "C:\Users\GABRIE~1\AppData\Local\Temp\vfs_cache" as temporary files store.
09:14:48,823 INFO [RolapUtil] Mondrian: JDBC driver oracle.jdbc.driver.OracleDriver loaded successfully
09:14:48,824 INFO [RolapUtil] Mondrian: JDBC driver com.mysql.jdbc.Driver loaded successfully
09:14:48,845 INFO [StandardFileSystemManager] Using "C:\Users\GABRIE~1\AppData\Local\Temp\vfs_cache" as temporary files store.
Mar 04, 2016 9:40:53 AM org.pentaho.mondrian.publish.workbench.PublishUtil fetchReservedChars
ADVERTÊNCIA: Reserved character call failed: GET http://localhost:8080/pentaho/api/re...rvedCharacters returned a response status of 404 Not Found

I am trying to find this REPO but I cannot find it.

Can somebody help me, please?

Kettle Repository Issues

Hello,

I'm running Kettle CE 6.0.1 on an AWS Windows Server 2012 instance. I set up a repository in an Amazon Postgres managed instance (AWS RDS).

Quite often I run into some of these problems:

- Clicking "open referenced object" to open a transformation from a job either does nothing or selects another already-open job
- When copying a transformation from one repository to another, the database connection disappears or just stops working.

The issue seems to arise especially when "cloning" jobs or transformations (Save As...) and is resolved by exporting the repository to a file, deleting and recreating the database, and reimporting the repository from the file.

Also, can you please advise on the best strategies to clone transformations and/or copy them between repositories (we have a development and a production repository)?

Thanks in advance!

Alex

How to make a job use transformations from inside a repository?

Is there a way that I can make a job use transformations from inside a repository? Right now, I can only choose files that are not in the repository.

My ultimate goal is to schedule a job to run transformations at a certain day/time, so I have to start with one transformation outputting to another transformation, one at a time.
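For what it's worth: when Spoon is connected to a repository, the Transformation job entry offers a "Specify by name and directory" option, so the entry points at a repository transformation instead of a file, and a repository job can then be run headlessly with Kitchen. A sketch — the repository name, credentials, directory, and job name below are placeholders:

```
kitchen.sh -rep=MyRepo -user=admin -pass=password -dir=/jobs -job=nightly_load -level=Basic
```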

Scheduling Jobs

1. This is the document that I found: http://wiki.pentaho.com/display/EAI/Start
2. What I am trying to do is schedule a job to run once a day (or week; it does not matter) without anyone actually being logged into Pentaho. Is it possible for the job to run without anyone being logged in?
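Yes — one common approach, sketched here under the assumption that PDI is installed in /opt/pdi-ce and the job is saved to a file: drive Kitchen from cron, which needs no one logged in (the BA server's own scheduler likewise runs scheduled jobs server-side without an interactive session):

```
# crontab entry: run the job every day at 02:00, appending output to a log
0 2 * * * /opt/pdi-ce/kitchen.sh -file=/opt/jobs/daily.kjb >> /var/log/pdi/daily.log 2>&1
```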

Capability Manager: What does it provide?

Hi,

It looks like the Capability Manager (Tools -> Capability Manager) should expose additional functionality that is not otherwise available. I am not sure what it is supposed to provide, as I get an error every time I check a box next to a capability. Does anyone have insight into how this is intended to be used?

Thanks,
Gabe

Shell entry does not execute script

Hi,

I have a Shell entry with this script

Code:

@echo on
rem "cd P:" alone does not switch the current drive in cmd.exe; use cd /d
cd /d P:\
Set BIRT_HOME=P:\server
call "%BIRT_HOME%\ReportEngine\genReport.bat" -f pdf -o "P:\pentaho\output\reports\client\copacker\cpacker.pdf" -F "P:\server\tomcat\webapps\birt\copacker_asn.rptdesign"

When I run this script from a .bat file on the command line, everything works fine. If I run the same script from PDI with the Shell entry, it simply steps over the entry, reports it as executed, and continues — but it does not actually run the script.

Anybody got an idea what that might be?
Thanks

Bobse

PDI kettle.properties file location in Ubuntu and Spoon startup problems

Hi Forum,
I used to execute transformations and jobs on the Windows platform.

Recently, I have been involved in setting up the environment on Linux.

I have the environment setup below:

OS :
Distributor ID: Ubuntu
Description: Ubuntu 14.04.3 LTS
Release: 14.04
Codename: trusty

JAVA Version :
java version "1.8.0_72"
Java(TM) SE Runtime Environment (build 1.8.0_72-b15)
Java HotSpot(TM) 64-Bit Server VM (build 25.72-b15, mixed mode)


JAVA location :
/usr/bin/java


JAVA_HOME, JRE_HOME, PATH are as below
/usr/lib/jvm/java-8-oracle
/usr/lib/jvm/java-8-oracle/jre
/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/usr/lib/jvm/java-8-oracle/bin:/usr/lib/jvm/java-8-oracle/db/bin:/usr/lib/jvm/java-8-oracle/jre/bin

Start Pentaho Spoon:
/opt/pdi-ce# ./spoon.sh (or sh spoon.sh), and I am stuck with the warning message below.
Does this mean my PDI started, and how do I check whether my PDI instance is running or not?


java.lang.UnsatisfiedLinkError: Could not load SWT library. Reasons:
no swt-pi-gtk-4332 in java.library.path
no swt-pi-gtk in java.library.path
/root/.swt/lib/linux/x86_64/libswt-pi-gtk-4332.so: libgtk-x11-2.0.so.0: cannot open shared object file: No such file or directory
Can't load library: /root/.swt/lib/linux/x86_64/libswt-pi-gtk.so
at org.eclipse.swt.internal.Library.loadLibrary(Unknown Source)
at org.eclipse.swt.internal.Library.loadLibrary(Unknown Source)
at org.eclipse.swt.internal.gtk.OS.<clinit>(Unknown Source)
at org.eclipse.swt.internal.Converter.wcsToMbcs(Unknown Source)
at org.eclipse.swt.internal.Converter.wcsToMbcs(Unknown Source)
at org.eclipse.swt.widgets.Display.<clinit>(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:610)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
root@bi:/opt/pdi-ce#


Also, where is the .kettle properties file located in Ubuntu?
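Regarding the SWT error above: the last cause in the trace (libgtk-x11-2.0.so.0: cannot open shared object file) means the GTK2 runtime that Spoon's SWT GUI links against is missing; Pan and Kitchen do not need a GUI, which is why they can still run. On Ubuntu 14.04 the usual fix (package name assumed for this release) is:

```
sudo apt-get install libgtk2.0-0
```

As for kettle.properties: by default PDI looks in $HOME/.kettle/kettle.properties (so /root/.kettle here, since you are running as root), unless KETTLE_HOME points elsewhere.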


I have created a repository with mysql to store trans and jobs on the same Ubuntu


I assumed that Kettle is running even though I see the above warning message, and I tried executing a sample .ktr from the command line, as well as executing a repository transformation from the command line.
When I run a file-based .ktr from the command line I get the message below:
root@nutrisafrabi:/opt/pdi-ce# sh pan.sh -file="/opt/test.ktr"
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed i n 8.0
11:02:46,724 INFO [KarafInstance]
*******************************************************************************
*** Karaf Instance Number: 1 at /opt/pdi-ce/./system/karaf//data1 ***
*** Karaf Port:8801 ***
*** OSGI Service Port:9050 ***
*******************************************************************************
Mar 05, 2016 11:02:47 AM org.apache.karaf.main.Main$KarafLockCallback lockAquired
INFO: Lock acquired. Setting startlevel to 100
2016/03/05 11:02:49 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2016-03-05 11:02:51.786:INFO:oejs.Server:jetty-8.1.15.v20140411
2016-03-05 11:02:52.076:INFO:oejs.AbstractConnector:Started NIOSocketConnectorWrapper@0.0.0.0:9050
log4j:ERROR Could not parse url [file:/opt/pdi-ce/./system/osgi/log4j.xml].
java.io.FileNotFoundException: /opt/pdi-ce/./system/osgi/log4j.xml (No such file or directory)
at java.io.FileInputStream.open0(Native Method)
at java.io.FileInputStream.open(FileInputStream.java:195)
at java.io.FileInputStream.<init>(FileInputStream.java:138)
at java.io.FileInputStream.<init>(FileInputStream.java:93)
............................
............................(because of the length of the post I am not pasting the whole log)

-------------------
----------------
Mar 05, 2016 11:02:53 AM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register
INFO: Registered blueprint namespace handler for http://cxf.apache.org/clustering
Mar 05, 2016 11:02:53 AM org.pentaho.caching.impl.PentahoCacheManagerFactory$RegistrationHandler$1 onSuccess
INFO: New Caching Service registered
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/pdi-ce/launcher/../lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/i mpl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/pdi-ce/plugins/pentaho-big-data-plugin/lib/slf4j-log4j12-1. 7.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Mar 05, 2016 11:02:54 AM org.apache.cxf.endpoint.ServerImpl initDestination
INFO: Setting the server's publish address to be /lineage
2016/03/05 11:02:55 - Pan - Start of run.
2016/03/05 11:02:55 - test - Dispatching started for transformation [test]
2016/03/05 11:02:55 - Table output.0 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.2 5 by buildguy) : Error initializing step [Table output]
2016/03/05 11:02:55 - Table output.0 - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.2 5 by buildguy) : java.lang.NullPointerException
2016/03/05 11:02:55 - Table output.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.init (TableOutput.java:474)
2016/03/05 11:02:55 - Table output.0 - at org.pentaho.di.trans.step.StepInitThread.run(StepInitTh read.java:69)
2016/03/05 11:02:55 - Table output.0 - at java.lang.Thread.run(Thread.java:745)
2016/03/05 11:02:55 - test - ERROR (version 6.0.1.0-386, build 1 from 2015-12-03 11.37.25 by build guy) : Step [Table output.0] failed to initialize!
Unable to prepare and initialize this transformation


Can anybody help me here?

Thank you in advance.
Kind Regards,
Sadakar

Widows and orphans not working properly

Hello,

I have detected some problems with widows and orphans:

- If I set widows (2) and orphans (2) in the Details body, widows work fine (the footer is never alone on a new page), but with orphans, under some conditions (some detail rows on the first page and some on the second), I get blank space on the second page between the details header (repeat set to true) and the detail rows.

- I have a report with a details header, details, and a subreport in the group footer. In this situation I set widows on the group and it doesn't work (the subreport can end up alone on the second page).

Any ideas?

Using JNDI connections for the solution repository

Hello,

I am trying to use a JNDI connection for the solution repository.
I found a good document here:
https://support.pentaho.com/hc/en-us...e_the_solution

I modified the hibernate file (postgresql.hibernate.cfg.xml) and the jackrabbit file (repository.xml) as described.
I removed the specified credentials, URL, and driver and replaced them with the JNDI names:
- <property name="hibernate.connection.datasource">java:comp/env/jdbc/hibernate</property> and
- <param name="url" value="java:comp/env/jdbc/jackrabbit"/>

The JNDI name has been defined in the Tomcat context.xml file.
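For reference, the java:comp/env/jdbc/* names have to match Resource entries visible to the pentaho webapp, e.g. in tomcat/webapps/pentaho/META-INF/context.xml. A minimal sketch, assuming PostgreSQL with placeholder hosts and credentials (note that pool attribute names differ by Tomcat version: older DBCP uses maxActive/maxIdle, newer Tomcats use maxTotal):

```xml
<Context>
  <Resource name="jdbc/hibernate" auth="Container" type="javax.sql.DataSource"
            driverClassName="org.postgresql.Driver"
            url="jdbc:postgresql://localhost:5432/hibernate"
            username="hibuser" password="password"
            maxActive="20" maxIdle="5" validationQuery="select 1"/>
  <Resource name="jdbc/jackrabbit" auth="Container" type="javax.sql.DataSource"
            driverClassName="org.postgresql.Driver"
            url="jdbc:postgresql://localhost:5432/jackrabbit"
            username="jcr_user" password="password"
            maxActive="20" maxIdle="5" validationQuery="select 1"/>
</Context>
```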

But I am getting an error (the error and the log file are in the attached files).

Can anyone help, please? I am using Pentaho 6.0.1.


Thanks

catalina.2016-03-06.log

error-JNDI.jpg

How to Secure Passwords

Hello,

In the context.xml and repository.xml files, the usernames and passwords are written in clear text, as they are in some SQL scripts for creating the Hibernate, Quartz, and Jackrabbit databases (create_jcr_postgresql, create_quartz_postgresql, create_repository_postgresql).

Is it possible to encrypt the passwords, or to avoid writing them in clear text?

Does anyone know?
Thanks in advance

MongoDB Input unwind

Mate - I am processing JSON which is hierarchical in nature. It is survey information. One survey can have many questions, one question can have many answers, and one answer can have many plugins.

Source data:-
Code:

{
    "_id": {"$oid": "5673417f677aff45e70001c5"},
    "title": "Survey 01",
    "questions": [
    {
        "_id": {"$oid": "56734183677aff45e70001c6"},
        "body": "Which browser do you use? Any plugin to mention?",
        "answers": [
            {"_id": {"$oid": "56734183677aff45e70001c7"},
            "body": "Google Chrome",
            "plugins": [
                {
                "_id": {"$oid": "56ce0767988ceb49c5000002"},
                "name": "Ad blocker",               
                "browser_plugins_id": {"$oid": "566886c040041260650003cc"}
                },
                {
                "_id": {"$oid": "56ce076c988ceb49c5000003"},
                "name": "Chrome password manager",
                "browser_alert_id": {"$oid": "5644fa0a5241491f9abe0200"}
                }]
            },
            {
            "_id": {"$oid": "56ce07b2988ceb49c5000006"},
            "body": "Internet Explorer 11"
            },
            {
            "_id": {"$oid": "56ce07b4988ceb49c5000007"},
            "body": "Mozilla Firefox"
            }]
    },
    {
        "_id": {"$oid": "56ce079b988ceb49c5000004"},
        "body": "Which website you surf most?",
        "answers": [
            {
            "_id": {"$oid": "56ce079b988ceb49c5000005"},
            "title": "Facebook"
            }]
    }],
    "status": "active"
}

I need to produce normalized data, broken down at the plugin level, so I need 5 records in total (4 answers, one of which has 2 plugins). I am attaching the output (survey.xls) for reference.

I have a working solution, but it is not very clean. I am reading everything from MongoDB in a single row with the help of array indexes: I read the questions ($.questions[0]._id), then the answers for each question ($.questions[0].answers[0]._id), and so on. I then normalize this data and filter out the null questions, answers, etc.

Instead of that, I want to use something like $unwind so that I can get this format directly out of the MongoDB Input step, and it should handle any number of questions or answers.
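If the MongoDB Input step in your PDI version has the "Query is aggregation pipeline" option, an aggregation pipeline can do the flattening server-side. A sketch using the field names from the sample document; preserveNullAndEmptyArrays needs MongoDB 3.2+ (which matches your server) and keeps answers that have no plugins:

```
[
  { "$unwind": "$questions" },
  { "$unwind": "$questions.answers" },
  { "$unwind": { "path": "$questions.answers.plugins",
                 "preserveNullAndEmptyArrays": true } }
]
```

For the sample document this yields 5 documents: one per plugin for the Chrome answer, and one for each plugin-less answer.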

Any help will be much appreciated.

env:- PDI 5.4.0.1-131 CE, Windows 10, Java build 1.8.0_25-b18, MongoDB 3.2.3

Regards,
Ritesh

MDX Using Member_Key

These are the steps I've performed leading up to this question.

  • Create a star schema in MS SQL
  • Import the schema and define joins in the Pentaho site's data source wizard importer thingie
  • Export this connection and load the .xml in Schema Workbench


I'm trying to create a query which uses ordering, but I can't seem to use any of the intrinsic member properties defined here: https://msdn.microsoft.com/en-us/lib...or=-2147217396
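Two hedged suggestions (the level, column, and measure names below are made up): Mondrian does not necessarily expose every MSDN intrinsic property, and the usual Mondrian way to sort members by a key column is to declare it on the Level in the schema via ordinalColumn, after which members sort by that column by default:

```xml
<Level name="Month" column="month_name" ordinalColumn="month_of_year"
       type="String" uniqueMembers="false"/>
```

Alternatively, an explicit sort inside the query itself avoids member properties entirely, e.g. Order([Product].[Name].Members, [Measures].[Sales], BDESC).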

Simple query, no ordering:
rdENM.jpg

After adding Order to the query:
RSyyD.jpg


Any thoughts on what I might be doing wrong?

org.pentaho.di.core.exception.KettleException: com.sun.xml.ws.client.ClientTransportException

Hi!
I am trying to connect to my Pentaho repository using Pentaho BI Enterprise Edition. My username has been added to the repository and given the admin privilege. However, every time I click connect, I receive the following error. My username and password are correct, and a coworker is able to log in from my system.

org.pentaho.di.core.exception.KettleException: com.sun.xml.ws.client.ClientTransportException: The server sent HTTP status code 401: Unauthorized

Please help.

Multiple instances of BI Server 6.0

Hey guys!
I want to run multiple instances of the Pentaho server on a single machine for testing my data. I have followed:
https://herwinrayen.wordpress.com/tag/bi-configuration/

But I think it's for an older version of Pentaho; it does not work, and I get an exception. Please check the attached screenshot.

Can anyone help me, please?

Is there an extra option in a table component equivalent to bAutoWidth?

Hello,

I have a table whose columns could change in the future. I want all of them to have the same width. I can't use the column width property because the number of columns could change, so I use a JS function. The problem is that if I resize the window, the column widths are set automatically again.

I know that you can set autoWidth to false in the DataTables definition, but I don't know how to do this in CDE. Is there any extra option for this?

Thank you.

Is the sum of the latest PDI wrong? (MD5 and SHA1 of biserver-ce-6.0.1.0-386.zip)

Hello,
I downloaded biserver-ce-6.0.1.0-386.zip from SourceForge.
However, its checksum is different from the one published on the site.
Is my downloaded file broken?
Can anyone help me understand this?


The details are described below.


The checksums from SourceForge:
SHA1: 8250b87df550c99d625da4a73906f5a740574afb
MD5: 02f102d3b16d5eb0caa06ce7b4c263da


$ openssl sha1 biserver-ce-6.0.1.0-386.zip
SHA1(biserver-ce-6.0.1.0-386.zip)= 1ebe876b4949eb2e161ac19c30fafb4037eeb98b


$ md5sum -b biserver-ce-6.0.1.0-386.zip
40dc3c562b8b382c7e0c0c8eae550245 *biserver-ce-6.0.1.0-386.zip


# unzip -t biserver-ce-6.0.1.0-386.zip
:
:
No errors detected in compressed data of biserver-ce-6.0.1.0-386.zip.


# ls -l
-rw-rw-r-- 1 root root 973793445 Mar 1 18:31 biserver-ce-6.0.1.0-386.zip


Thank you.
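A note on interpreting this: unzip -t reporting no errors means the archive itself is intact, so a mismatch more likely means the published sums correspond to a different build or mirror copy than the one you fetched; comparing the byte size shown by ls -l against the site is another quick check. To rule out a mistake in the tools, the digest can be recomputed independently; a minimal sketch in Java (the class name is just for illustration, and for a real file you would feed MessageDigest.update() in chunks):

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class Sha1Sum {
    // Hex-encode the SHA-1 digest of a byte array.
    public static String sha1Hex(byte[] data) throws NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("SHA-1");
        StringBuilder sb = new StringBuilder();
        for (byte b : md.digest(data)) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws NoSuchAlgorithmException {
        // Standard test vector: SHA-1("abc")
        System.out.println(sha1Hex("abc".getBytes())); // a9993e364706816aba3e25717850c26c9cd0d89d
    }
}
```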

Can't make BI Server CE work on Mac OS X 10.11

I've managed to use Data Integration and Schema Workbench, but I need to publish from Schema Workbench in order to finish a university assignment; to do that, I need the BI Server working, but I couldn't manage it.

I'm trying to connect to MySQL, so I ran the .sql files inside the data/mysql5 folder and then start-pentaho.sh, but despite it saying that Tomcat is running, I can't publish from Schema Workbench and can't access the page localhost:8080/pentaho in my browser. I tried searching on Google but didn't find anything useful, and most of the tutorials for Mac OS X were really old and probably outdated.

Any help is appreciated. I'm a complete newbie with Pentaho.

