Channel: Pentaho Community Forums

[BI Server 5.2.0] libgconf-2.so.4: cannot open shared object file: No such file or directory

Hi all,

we have had a Pentaho BI Server running for a while now (4 months) on an acceptance Red Hat server, and last week we shut it down to copy the files to a similar server.
After we copied the files, we tried to start the server, but unfortunately we got this error:
Code:

Exception in thread "main" java.lang.UnsatisfiedLinkError: /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/lib/amd64/libnio.so: libgconf-2.so.4: cannot open shared object file: No such file or directory
We can't seem to get the server up and running again. As far as we know, nothing new has been installed. Is anyone familiar with this issue?

Java version:
Code:

java version "1.7.0_85"
OpenJDK Runtime Environment (rhel-2.6.1.3.0.1.el6_7-x86_64 u85-b01)
OpenJDK 64-Bit Server VM (build 24.85-b03, mixed mode)
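
For anyone hitting the same thing, a hedged diagnosis: the JVM's libnio.so is present, but one of its own dependencies, libgconf-2.so.4, is missing on the new server (the old server presumably had it installed as a dependency of some other package). Assuming a stock RHEL box, the terminal commands below should confirm and fix it; GConf2 is the package that provides this library on Red Hat systems:
Code:

# list the shared-library dependencies and spot the one marked "not found"
ldd /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.85.x86_64/jre/lib/amd64/libnio.so
# confirm which package provides the library, then install it
yum provides '*/libgconf-2.so.4'
yum install GConf2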


Issues with POI while reading Excel files

Hi guys,

I am having various issues reading Excel files with POI 3.9 and Spoon 5.3.0.
I saw that quite a few people fixed this kind of issue by upgrading the POI libraries to 3.12, so I downloaded Spoon 6 and there everything works fine.
I know it was a bit risky, but I tried to copy the POI 3.12 jar files from Spoon 6 into my Spoon 5.3 installation (I can't upgrade Spoon at the moment); unfortunately it didn't work. The transformation runs fine, but I still have issues with POI.

Could anyone help me, please? I have read about people who did something similar, so maybe I just missed a step.

Thanks
Alex
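
A hedged note in case someone tries the same jar swap: copying only poi-*.jar is usually not enough, because POI 3.12 has to be matched with its same-version OOXML jars and a compatible xmlbeans, and the old 3.9 jars must be removed from data-integration/lib so both versions aren't on the classpath at once. The file names below are what I would expect in a Spoon 6 lib folder; the exact versions (especially xmlbeans) are an assumption, so check your own installation:
Code:

poi-3.12.jar
poi-ooxml-3.12.jar
poi-ooxml-schemas-3.12.jar
xmlbeans-2.6.0.jar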

HTTP step with PAC

Hi,

we are using the HTTP step in a job to get a CSV file from a web server. However, from one of our environments we have to go through a proxy to be able to connect.

Does anyone know whether it is possible to use a PAC (proxy auto-config) file with this step? Can we point the step at the PAC file's URL somewhere?

Thank you and best regards,
Martin
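
As far as I know the HTTP step has no PAC support of its own, so this is only a hedged workaround sketch: let the JVM consult the operating system's proxy configuration (which may be driven by the PAC at OS level), read out which proxy it would pick for your URL, and then configure that host/port statically, either in the step's proxy fields (if your step version exposes them) or via the JVM's http.proxyHost/http.proxyPort options. A minimal probe in plain Java:
Code:

import java.net.Proxy;
import java.net.ProxySelector;
import java.net.URI;
import java.util.List;

public class ProxyProbe {
    public static void main(String[] args) throws Exception {
        // Ask the JVM to consult the OS proxy settings (PAC-driven on some platforms).
        System.setProperty("java.net.useSystemProxies", "true");
        // Placeholder URL -- use the CSV location your HTTP step fetches.
        List<Proxy> proxies = ProxySelector.getDefault()
                .select(new URI("http://example.com/data.csv"));
        for (Proxy p : proxies) {
            System.out.println(p); // e.g. "HTTP @ proxy.example.com:8080" or "DIRECT"
        }
    }
}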

Call db procedure / input parameter

Hello guys,
Problably it is goin to be really easy question but I don't know how to put input parameters into my Procedure defined in database.
For example I have procedure with only 1 parameter (eg. number) and in pentaho I would like to run it with "hardcoding" parameter.
I looked at Call DB Step but it does not work correctly.
I would like to run procedure with parameter = 1.
analogously I will create 5 other transformation with parameter 2, 3, 4, 5 and 6.
Then I will run it as JOB.
That's all.
Is it possible?:)
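
For what it's worth, the usual PDI pattern for a hard-coded value is a Generate Rows step producing a single row with an Integer field set to 1, with that field mapped as the procedure's input parameter in the Call DB Procedure step; and rather than six near-identical transformations, one Data Grid step with rows 1 through 6 would call the procedure once per row. Underneath, the step issues a JDBC call along these lines (URL, credentials and procedure name are placeholders):
Code:

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;

public class CallProcedure {
    public static void main(String[] args) throws Exception {
        // Placeholder JDBC URL and credentials -- the matching driver jar must be on the classpath.
        Connection con = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password");
        try {
            // Placeholder procedure name with a single numeric IN parameter.
            CallableStatement cs = con.prepareCall("{call my_proc(?)}");
            cs.setInt(1, 1); // the hard-coded parameter value
            cs.execute();
            cs.close();
        } finally {
            con.close();
        }
    }
}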

REGEX Check.

Hi,

I want to put in a check for spaces and tabs: if a record contains a space it is treated as valid, and if it contains a tab it should error out. All of this needs to be done with a regular expression.

Please help!

Thanks,
Nishank
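
A hedged sketch of the expression itself: a record is valid as long as it contains no tab character, so ^[^\t]*$ matches valid records (spaces pass, because only the tab is excluded from the character class). In PDI the pattern can go into a Regex Evaluation step, or a Filter Rows step with the REGEXP operator, routing non-matching rows to error handling. The plain-Java check below just demonstrates the pattern's behaviour:
Code:

import java.util.regex.Pattern;

public class TabCheck {
    // Valid records contain no tab character; spaces are fine.
    private static final Pattern NO_TAB = Pattern.compile("^[^\\t]*$");

    static boolean isValid(String record) {
        return NO_TAB.matcher(record).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValid("value with spaces")); // true  -> valid record
        System.out.println(isValid("value\twith tab"));   // false -> route to error
    }
}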

CDE OLAP Wizard with Mondrian Dynamic Schema Processor not working

hi,
I'm trying to use CDE OLAP wizard on pentaho CE 6.0 with a mondrian xml schema that use a dynamic schema processor class.
The schema works fine with saiku 3.7, but with the cde olap wizard I get a mondrian error.

I try to debug my DSP class, but isn't called from CDE.

Is there any configuration parameter to set CDE working with a custom DSP class?
may the custom class jar are to be deployed in a folder different that "/biserver-ce/tomcat/webapps/pentaho/WEB-INF/lib" ?

thank for every reply.
bye
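
One hedged thing to check: Mondrian only invokes a DSP when the connection being used actually names it in its connect string; if the CDE OLAP wizard builds its own connection from the catalog, the DynamicSchemaProcessor property may be missing there even though Saiku's datasource carries it. For reference, the relevant Mondrian connect-string properties look like this (JDBC URL, catalog path and class name are placeholders):
Code:

jdbc:mondrian:Jdbc=jdbc:mysql://localhost/mydb;Catalog=mondrian:/MySchema;DynamicSchemaProcessor=com.example.MyDSP;UseContentChecksum=true

Setting UseContentChecksum=true also keeps Mondrian from serving a cached schema that the DSP never processed.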

pass db fatal connection error to syslog message

Hello.

I'm running a "Check Db connections" step and notice that the output shows the details, for example:

"FATAL: no pg_hba.conf entry for host..."

I would like to pass this output (the reason for the failure) into a syslog event.

How can this be done?

Many thanks!
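
For what it's worth, PDI ships a "Send information using Syslog" job entry (under Utility, if I remember the category correctly) that can forward a message to a syslog server; the harder part is getting the failure reason out of Check DB connections, since the job entry itself only reports success or failure. If you end up scripting it instead, a minimal plain-Java UDP sender in RFC 3164 style would look like this (the syslog host is a placeholder):
Code:

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

public class SyslogSender {
    public static void main(String[] args) throws Exception {
        // The failure reason to forward (e.g. scraped from the job log).
        String reason = "FATAL: no pg_hba.conf entry for host...";
        // PRI <11> = facility 1 (user-level) * 8 + severity 3 (error), per RFC 3164.
        String message = "<11>pdi-job: db check failed: " + reason;
        byte[] payload = message.getBytes(StandardCharsets.UTF_8);
        DatagramSocket socket = new DatagramSocket();
        try {
            // 514/udp is the conventional syslog port; the host name is hypothetical.
            socket.send(new DatagramPacket(payload, payload.length,
                    InetAddress.getByName("syslog.example.com"), 514));
        } finally {
            socket.close();
        }
    }
}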

JOBS: change a parameter in the call from the terminal

Hi everyone!
I'm currently using Pentaho Kettle to load data from Hive into MySQL.
The problem is that I make the changes through Spoon's GUI, but now I need to do it from the terminal.
I was looking at Kitchen and how it runs jobs, but I need to set some parameters from the terminal before running the job (not through an interface as usual), and I can't find anything about that for Kitchen.
So, does anyone know of a guide or tutorial for changing a JOB parameter from the terminal?
Thank you so much.
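
For reference, Kitchen does accept named parameters on the command line via -param, so no GUI is needed. A hedged example (the job path and parameter name are made up):
Code:

./kitchen.sh -file=/opt/etl/load_hive_to_mysql.kjb -level=Basic "-param:TARGET_TABLE=sales_fact"

The job has to declare TARGET_TABLE on its Parameters tab (job properties) for the value to be picked up; inside the job and its transformations it is then available as ${TARGET_TABLE}.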

What is the latest stable version of biserver-ce?

User Console hang on creating reports

Hi everyone,

Upon hitting 'Finish' when creating a report through the User Console, the User Console displays a perpetual loading dialog: "Just a few moments please. (Creating/Updating data source...)". I'm running the User Console in Google Chrome under Mac OS X version 10.10.3. Here are the contents of pentaho.log (although I'm not sure this will help):

Code:

2015-11-12 10:24:34,937 ERROR [org.pentaho.platform.util.logging.Logger] Error: Pentaho
2015-11-12 10:24:34,939 ERROR [org.pentaho.platform.util.logging.Logger] misc-org.pentaho.platform.engine.core.system.PentahoSystem: PentahoSystem.ERROR_0015 - Error while trying to execute shutdown sequence for org.pentaho.platform.plugin.services.pluginmgr.PluginAdapter
java.lang.IllegalStateException: Service already unregistered.
        at org.apache.felix.framework.ServiceRegistrationImpl.unregister(ServiceRegistrationImpl.java:124)
        at org.pentaho.platform.engine.core.system.objfac.OSGIRuntimeObjectFactory$OSGIPentahoObjectRegistration.remove(OSGIRuntimeObjectFactory.java:181)
        at org.pentaho.platform.plugin.services.pluginmgr.PentahoSystemPluginManager.unloadPlugins(PentahoSystemPluginManager.java:225)
        at org.pentaho.platform.plugin.services.pluginmgr.PentahoSystemPluginManager.unloadAllPlugins(PentahoSystemPluginManager.java:917)
        at org.pentaho.platform.plugin.services.pluginmgr.PluginAdapter.shutdown(PluginAdapter.java:47)
        at org.pentaho.platform.engine.core.system.PentahoSystem.shutdown(PentahoSystem.java:1063)
        at org.pentaho.platform.web.http.context.SolutionContextListener.contextDestroyed(SolutionContextListener.java:262)
        at org.apache.catalina.core.StandardContext.listenerStop(StandardContext.java:4776)
        at org.apache.catalina.core.StandardContext.stopInternal(StandardContext.java:5390)
        at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:232)
        at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1424)
        at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1413)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
2015-11-12 10:24:34,939 ERROR [org.pentaho.platform.util.logging.Logger] Error end:
2015-11-12 10:24:34,941 WARN  [org.pentaho.platform.web.http.context.PentahoSolutionSpringApplicationContext] Exception thrown from ApplicationListener handling ContextClosedEvent
java.lang.IllegalStateException: Service already unregistered.
        at org.apache.felix.framework.ServiceRegistrationImpl.unregister(ServiceRegistrationImpl.java:124)
        at org.pentaho.platform.engine.core.system.objfac.OSGIRuntimeObjectFactory$OSGIPentahoObjectRegistration.remove(OSGIRuntimeObjectFactory.java:181)
        at org.pentaho.platform.engine.core.system.objfac.spring.PublishedBeanRegistry$1.onApplicationEvent(PublishedBeanRegistry.java:125)
        at org.pentaho.platform.engine.security.event.OrderedApplicationEventMulticaster$2.run(OrderedApplicationEventMulticaster.java:87)
        at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:49)
        at org.pentaho.platform.engine.security.event.OrderedApplicationEventMulticaster.multicastEvent(OrderedApplicationEventMulticaster.java:85)
        at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:334)
        at org.springframework.context.support.AbstractApplicationContext.doClose(AbstractApplicationContext.java:1051)
        at org.springframework.context.support.AbstractApplicationContext.close(AbstractApplicationContext.java:1012)
        at org.springframework.web.context.ContextLoader.closeWebApplicationContext(ContextLoader.java:586)
        at org.springframework.web.context.ContextLoaderListener.contextDestroyed(ContextLoaderListener.java:143)
        at org.apache.catalina.core.StandardContext.listenerStop(StandardContext.java:4776)
        at org.apache.catalina.core.StandardContext.stopInternal(StandardContext.java:5390)
        at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:232)
        at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1424)
        at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1413)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

If anyone knows what's going on here, please let me know. I can provide more information about the environment if necessary.

Using Metadata in a transformation

I know I can get column/stream names using the Utility > Metadata structure of a stream step.

Where I'm having a conceptual failure is how I can use that information.

I want to be able to define certain columns as 'single response' or 'text' in one file.

Then use that information on my main input file and process the columns I have identified as 'single response' in one sub-transformation and those I've identified as 'text' in a separate transformation.

I have multiple files in the same format, but each has its own processing needs based on the column/field name.

For example all the files have FIELDA, FIELDB, FIELDC - but in some of the files FIELDA is 'single response' and in others it's 'text'.

I want to be able to process each file with the same transformation, in conjunction with a controlling file that identifies FIELDA as 'single response' or 'text' for that particular file.
I'm having trouble visualizing how I can tell Kettle how to process based on the field name.

Any help would be appreciated.
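
One hedged way to picture it: read the controlling file into a field-name-to-type mapping, join it against the output of the Metadata structure of a stream step with a Stream Lookup, and then let a Switch/Case step route each column to the 'single response' or 'text' sub-transformation based on the looked-up type. The plain-Java sketch below (all field names and types hypothetical) illustrates only the routing idea, not PDI code:
Code:

import java.util.HashMap;
import java.util.Map;

public class ColumnRouter {
    public static void main(String[] args) {
        // Parsed from the controlling file: field name -> handling type for THIS file.
        Map<String, String> columnTypes = new HashMap<String, String>();
        columnTypes.put("FIELDA", "single response");
        columnTypes.put("FIELDB", "text");
        columnTypes.put("FIELDC", "text");

        for (Map.Entry<String, String> e : columnTypes.entrySet()) {
            // This branch is what the Switch/Case step would do on the looked-up type.
            if ("single response".equals(e.getValue())) {
                System.out.println(e.getKey() + " -> single-response sub-transformation");
            } else {
                System.out.println(e.getKey() + " -> text sub-transformation");
            }
        }
    }
}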

processing the same records over and over

I have a job using arinpu to aroutput, but it keeps processing the same records over and over. How do I stop it at the last record of the input?

how to run the same transformation in parallel using different jobs in PDI

I have a common transformation that logs the run-log details and is called at the job level. The same transformation is called in two different jobs, but when the jobs are executed in parallel only one of them succeeds. So how can I run the same transformation in parallel from different jobs?

wait for parallel job to complete

Hi, in a job I am calling three jobs: two run in parallel, and the third should wait until the first and second jobs complete. But currently, as soon as any one of the jobs completes, the third job runs. I tried using the Wait For, Dummy and Success steps, but no luck. Please assist. Thanks

Unable to process a bunch of Excel workbook format files.

Hi folks,
I'm using Spoon 5.2. When I use the "Text file input" step to attach multiple files (Excel worksheets) from a directory and click the "Show file content" button on the step's File tab, it shows me junk values.
Please help me with this.
Thanks :-)

SQL Server Connection in Schema Workbench

Hi, I'm a newbie with Pentaho. I want to create a cube in Schema Workbench with a SQL Server connection. I connect to SQL Server via ODBC, but Schema Workbench doesn't detect the tables. I have made a cube before, but with a MySQL connection, not SQL Server. Can someone tell me how to create a connection to SQL Server? I tried connecting via JDBC but it didn't work.

Thank you.
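
For what it's worth, Schema Workbench tends to work much better with a JDBC driver jar dropped into its drivers/ folder than with ODBC. Two commonly used combinations (host and database names are placeholders):
Code:

# Microsoft JDBC driver (sqljdbc4.jar)
Driver class: com.microsoft.sqlserver.jdbc.SQLServerDriver
URL:          jdbc:sqlserver://myhost:1433;databaseName=mydb

# jTDS driver (jtds-1.3.x.jar)
Driver class: net.sourceforge.jtds.jdbc.Driver
URL:          jdbc:jtds:sqlserver://myhost:1433/mydb

If the connection opens but no tables show up, also check that the login's default schema is right and that it has permission to browse the catalog.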

[BI Server 5.4.0] How to send a report by automatic mail?

Hi everybody,

I'm new and not yet used to the BI Server's functionality. I apologize in advance for my poor grasp of this problem and of JavaScript.

I've created a report with CDE; a simple report with no particular problem. Now I want to send this report by mail every hour, as an automatic task.

I've seen there is some kind of task scheduler using the calendar section, where the mail server parameters can be specified, but I can't find a way to make it work.

There is no problem with my mail parameters: using the test button, I received the confirmation without any issue, but no report arrives from the automatic task.

Is there another way to send the report in a common format like PDF at specific hours?

I've tried this tutorial, http://joshid.github.io/blog/2014/10...ho-pdf-export/ , but there is no way to make it work...

Pentaho Big Data plugin CDH5: NameNode HA aware or not?

Hi,

I can connect to Hadoop for file exchanges without problems, as long as we provide the static FQDN of the node that is currently the ACTIVE NameNode, in a Hadoop setup with one ACTIVE NN and the other on STANDBY.
Problems begin when the ACTIVE NN role switches to another node.
CDH 5.3 provides automatic ACTIVE/STANDBY NameNode lookup by putting extra "dfs.nameservices" configuration details in hdfs-site.xml, like:

<property>
  <name>dfs.nameservices</name>
  <value>nameservice1</value>
</property>
<property>
  <name>dfs.ha.namenodes.nameservice1</name>
  <value>namenode75,namenode56</value>
</property>

This way you can conveniently address the nameservice as the generic NN address, while the final resolution to the ACTIVE NN is done for you in the background.

But when I provide these extra config details (hdfs-site.xml) to Spoon and restart, nothing changes. Only the ACTIVE NN can be addressed, by giving the FQDN of that specific server.

So, is the Pentaho CDH 5.3 plugin NameNode HA aware or not? Or am I missing something?

(It certainly does not look like it, since here, http://wiki.pentaho.com/display/BAD/...Configurations , it says that hdfs-site.xml is not in scope at all for Hadoop connections.)

Pentaho version: PDI Enterprise 5.3.3
Hadoop version: CDH 5.3
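
In case it helps others, a hedged pointer from memory of the shim layout: Spoon does not pick up an arbitrary hdfs-site.xml; the cluster's client configuration files have to sit inside the active Big Data shim folder, and HDFS is then addressed through the logical nameservice instead of a NameNode FQDN:
Code:

# copy the cluster client configs into the CDH 5.3 shim (path relative to data-integration)
cp /etc/hadoop/conf/core-site.xml /etc/hadoop/conf/hdfs-site.xml \
   plugins/pentaho-big-data-plugin/hadoop-configurations/cdh53/
# then use the nameservice in HDFS URLs instead of a fixed host:
#   hdfs://nameservice1/path/to/file

After restarting Spoon, failover to the ACTIVE NameNode should be handled by the Hadoop client libraries bundled with the shim.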

How to load multiple XMLs into an Oracle database using Pentaho.

We are using the XML Input Stream step as the source and Table Output as the destination to load a single XML file into an Oracle database, but we want to load multiple XMLs at a time. Can someone help? How do we loop the transformation to load multiple XMLs?
Or is there an alternative way, e.g. a job, to bulk-load the XMLs into the database?

How to compare two files using the File Compare job?

Hi, we are trying to compare two XMLs using the File Compare job entry in Pentaho. But when we configure it (i.e. provide the file paths, etc.) and try to execute it, it fails.
How do we see the output result of the File Compare job entry?