LeastMedSq Query

Hello,

A while ago, I used the LeastMedSq method to do some basic linear regression on a data set.
Since then, a colleague has attempted to verify my work using a spreadsheet and a basic understanding of what least median squared error is.

The verification did not get the same result as WEKA. I cannot work out how to see what WEKA is doing exactly when it uses LeastMedSq for classification, but it doesn't seem to be what we understood by least median squared error. This is not something particular to our data set - even on the most basic test data sets, we do not get a match when we try to manually minimise the median squared error of the data set.
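
To be concrete, our manual check boils down to the following (a minimal sketch of what we understand by least median squared error; the brute-force search over candidate lines is only for illustration):

Code:

import java.util.Arrays;

public class MedianSquaredError {

    // Median of the squared residuals of y = a*x + b over the data set
    static double medianSquaredResidual(double[] x, double[] y, double a, double b) {
        double[] sq = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            double r = y[i] - (a * x[i] + b);
            sq[i] = r * r;
        }
        Arrays.sort(sq);
        int n = sq.length;
        return (n % 2 == 1) ? sq[n / 2] : (sq[n / 2 - 1] + sq[n / 2]) / 2.0;
    }

    public static void main(String[] args) {
        double[] x = {1, 2, 3, 4, 5};
        double[] y = {1.1, 1.9, 3.2, 3.9, 50.0}; // last point is an outlier

        // Try a grid of candidate lines; the one with the smallest median
        // squared residual is what we take to be the least median squares fit.
        double bestA = 0, bestB = 0, best = Double.MAX_VALUE;
        for (double a = -5; a <= 5; a += 0.01) {
            for (double b = -5; b <= 5; b += 0.01) {
                double m = medianSquaredResidual(x, y, a, b);
                if (m < best) { best = m; bestA = a; bestB = b; }
            }
        }
        System.out.printf("slope=%.2f intercept=%.2f median squared residual=%.4f%n",
                bestA, bestB, best);
    }
}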

Could you please advise? Is there an easy way to see what WEKA understands by least median squared error?

Thank you.

Menu Option for Sparkl and CRS

I installed Sparkl and the Community Repository Synchronizer from the Marketplace in biserver-ce 5.2 and I can't see the menu options to open these plugins, but I can see the Saiku and Ivy plugins.

Embedding a Cost Function in the Algorithm

In a direct mailing project the info is:
Cost of delivery: 6.5
Value of product: 41.9
No response is coded as 0
A positive response is coded as 1

I am trying to create a cost function matrix and am a bit confused how it should look.
To my understanding the cost matrix should look like this (I only enter the costs, not the profit):
TP=6.5 FP=6.5
FN=41.9 TN=0
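
In other words, with that matrix the total campaign cost I would expect Weka to minimise is:

total cost = 6.5*TP + 6.5*FP + 41.9*FN + 0*TN

(the delivery cost for everyone we mail, plus the lost product value for the responders we skip).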

However, Weka seems to expect the matrix laid out like this (see the attached screenshot, Snap 2015-02-03 at 14.06.00.png):
TP FP
FN TN

What am I missing and where do I go wrong?

Thank you,
Tamir

Starting Pentaho bi-ce 5.2 as service using systemd

Dear All,

I'm trying to create a service that automatically starts the pentaho-bi-ce 5.2 on boot.

I'm running on Fedora 20

Has anybody done this? I can't find any threads about it.

I have created a .service file that looks like this:

Code:

[Unit]
Description=Pentho 5.2 Bi-server-CE
After=syslog.target
After=network.target
After=mysqld.service

[Service]
Type=simple
User=root
Group=root
Environment=JAVA_HOME=/usr/java/latest
ExecStart=/opt/pentaho52/biserver-ce/start-pentaho.sh
ExecStartPost=/bin/echo pentaho...end of unitfile

TimeoutSec=300

[Install]
WantedBy=graphical.target

The service works both when started manually and on boot, and its output looks the same as when I start the script directly, but the server still does not come up. The output of the service looks like this:

Code:

[root@localhost system]# systemctl status pentaho52bice.service
pentaho52bice.service - Pentho 5.2 Bi-server-CE
  Loaded: loaded (/lib/systemd/system/pentaho52bice.service; enabled)
  Active: inactive (dead) since tis 2015-02-03 17:11:07 CET; 40min ago
  Process: 4474 ExecStartPost=/bin/echo pentaho...end of unitfile (code=exited, status=0/SUCCESS)
  Process: 4473 ExecStart=/opt/pentaho52/biserver-ce/start-pentaho.sh (code=exited, status=0/SUCCESS)
 Main PID: 4473 (code=exited, status=0/SUCCESS)
  CGroup: /system.slice/pentaho52bice.service

feb 03 17:11:07 localhost.localdomain echo[4474]: pentaho...end of unitfile
feb 03 17:11:07 localhost.localdomain systemd[1]: Started Pentho 5.2 Bi-server-CE.
feb 03 17:11:07 localhost.localdomain start-pentaho.sh[4473]: DEBUG: Using JAVA_HOME
feb 03 17:11:07 localhost.localdomain start-pentaho.sh[4473]: DEBUG: _PENTAHO_JAVA_HOME=/usr/java/latest
feb 03 17:11:07 localhost.localdomain start-pentaho.sh[4473]: DEBUG: _PENTAHO_JAVA=/usr/java/latest/bin/java
feb 03 17:11:07 localhost.localdomain start-pentaho.sh[4473]: Using CATALINA_BASE:  /opt/pentaho52/biserver-ce/tomcat
feb 03 17:11:07 localhost.localdomain start-pentaho.sh[4473]: Using CATALINA_HOME:  /opt/pentaho52/biserver-ce/tomcat
feb 03 17:11:07 localhost.localdomain start-pentaho.sh[4473]: Using CATALINA_TMPDIR: /opt/pentaho52/biserver-ce/tomcat/temp
feb 03 17:11:07 localhost.localdomain start-pentaho.sh[4473]: Using JRE_HOME:        /usr/java/latest
feb 03 17:11:07 localhost.localdomain start-pentaho.sh[4473]: Using CLASSPATH:      /opt/pentaho52/biserver-ce/tomcat/bin/bootstrap.jar

Does anybody have an idea what the problem could be?
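
One thing I'm wondering about, since the status output shows the main PID exiting with SUCCESS straight away: start-pentaho.sh only launches Tomcat in the background and then returns, so maybe Type=simple is the wrong service type and systemd tears the whole cgroup down again once the script exits. Would something like this be the right direction (untested sketch)?

Code:

[Unit]
Description=Pentaho 5.2 BI Server CE
After=network.target mysqld.service

[Service]
# start-pentaho.sh backgrounds Tomcat and returns, which is what Type=forking expects
Type=forking
User=root
Group=root
Environment=JAVA_HOME=/usr/java/latest
ExecStart=/opt/pentaho52/biserver-ce/start-pentaho.sh
ExecStop=/opt/pentaho52/biserver-ce/stop-pentaho.sh
TimeoutSec=300

[Install]
WantedBy=graphical.target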

Thanks


Bernhard

Migrate Data Sources

Hello,

I want to migrate a Pentaho server to a new OS. I installed the software, but when I move the PRPT files over, they don't work.

I think I need to move the data sources as well. How can I do that?

Regards.

Any Embedded BI Platform for Online Users?

Assume I have an online web site with lots of users, and I'd like to create some dashboards/charts for the users of the site. For example, when a user logs into his account, he can see a few dashboards/charts/tables about his account activity, status, health, etc. I would like the same flexibility in these dashboards/charts/tables that we can achieve in a regular BI suite.

Any good suggestions for this? Or do I have to develop it in Perl or PHP directly?

Thanks.

Cut strings based on values in an Excel file. Metadata Injection?

I am trying to figure out how to cut strings based on column definitions that are stored in an Excel file.

We get dozens of text input files from vendors. Each file has a different file layout (the files are fixed length; not delimited; and the definition of the content is in an Excel file). For example, an Excel file will say that for input file X, the AccountNumber field starts in column 10 and ends in column 19. For input file Y, the AccountNumber field is in a different position.

Since each input file has a couple of hundred columns, it is extremely painful to type the starting and ending positions for every field into a Strings Cut step, especially since there are dozens of files.

(It's actually worse than that. Each file has five to ten record sub-types, which are marked in the first three or four characters of the record. Each record sub-type contains a different set of fields. It's not hard to split these out into their own streams... but I don't want to hand-enter 6000 different column definitions.)


I have looked at metadata injection. The videos at http://wiki.pentaho.com/display/EAI/...data+Injection (one is by Matt) and the documentation don't make it quite clear WHAT things can actually be "injected". Can I inject the values of InStreamField, OutStreamField, CutFrom, and CutTo that I see in the Strings Cut step?


Can I inject a series of Name, Type, Format, Position, and Length values that are on the Fields tab of the Text File Input step?


Please let me know if there is more documentation on this somewhere.
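
In case it helps to explain what I'm after: the fallback I've been considering is to join the start/end positions from the Excel sheet onto each data row and then cut the field in a Modified Java Script Value step, roughly like this (the field names are made up for the example):

Code:

// "line" holds the raw fixed-length record; "acct_start" and "acct_end"
// are the 1-based positions read from the Excel definition file
var s = line + "";                   // make sure we have a plain JavaScript string
var start = parseInt(acct_start);
var end = parseInt(acct_end);
var accountNumber = s.substring(start - 1, end);

But that still means one script per field, which is why I was hoping metadata injection covers the Strings Cut or Text File Input steps.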

Thanks.

Issue with Row denormaliser

I'm starting out with Pentaho Data Integration. I'm trying to update my database with data from XML files. However, the output of the Get Data From XML step is not in the format I need, so I use a Row Denormaliser step to correct it. But the result still comes out on separate rows (screenshot lost). My transformation is that simple (screenshot lost). Is there a step I can use to merge these rows? I tried Merge Rows (diff), Join Rows and Merge Join; none of these steps worked.
Thanks a lot! Sorry for my English :)

CDE CDF.js performance

Hello all,

I'm facing a problem with a CDE/CDF dashboard: it always loads CDF.js?v=ff573f52f47d2c8e8672b1494748e8f8 and cdf-blueprint-script-includes.js?v=ac6b3ee58285bbd92b6274d565b7f320 with a different version parameter on each load, and each load takes more than 30s.
How can I cache those files, and where can I find the code that generates the CDF.js?v= version parameter dynamically?

Thank you in advance.

Where can I find v5.2 REST client libraries or WADL?

running jobs from eclipse

Hello, I am trying to run a job from Eclipse, using it as a test case. When I run the job, I get the following error message:

org.pentaho.di.core.exception.KettleXMLException:
The specified file '"filename".kjb' does not contain transformation XML.
at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2671)
at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2628)
at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2605)
at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2585)
at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2550)
at org.pentaho.di.trans.TransMeta.<init>(TransMeta.java:2513)
at test.poc_fx_job.executeTransformation(filename.java:31)
at test.poc_fx_job.main(filename.java:23)



I have run and tested my transformations before using the same method and they all ran.
Please advise.
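
From the stack trace it looks like my .kjb is being loaded with TransMeta, which expects transformation XML; for a job I believe the job classes are needed instead. A minimal sketch of what I think that would look like (the file path is a placeholder):

Code:

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.job.Job;
import org.pentaho.di.job.JobMeta;

public class RunJob {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();                                      // initialise the Kettle engine
        JobMeta jobMeta = new JobMeta("/path/to/filename.kjb", null);  // null = no repository
        Job job = new Job(null, jobMeta);
        job.start();
        job.waitUntilFinished();
        if (job.getErrors() > 0) {
            System.out.println("Job finished with errors");
        }
    }
}

Is that the right way to do it, or is there something else I'm missing?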

Bullet Chart Labels Overlap

Is there an extension point to handle the possibility of Bullet Chart Rule Labels overlapping? I know I can do something like "bulletRuleLabel_textAngle: -0.5" so that they are angled, but is there anything like "bulletRuleLabel_textOverlappedLabelsMode: 'hide'"? I've tried this amongst other nuances but no luck so far.

Thanks,
Matt

Extract Date From File Name - Example Given in Kettle

Hi Forum,

I'm exploring JavaScript-based transformations using the examples provided with Kettle (5.2.0 Kettle samples).

I've taken "Javascript -extract date from file name.ktr" from the provided samples. When I run the provided one there is no issue at all, but when I rebuild the same transformation myself it shows me the errors below.

The code in the JavaScript step is:

CASE-1 :
var dat = DIR.rightstr(16);

Error :

2015/02/04 11:26:36 - Modified Java Script Value.0 - TypeError: Cannot find function rightstr in object C:\temp\LIM_kettle\ST18.PROTOKL.TENOPROD_2006_07_21_00_09. (script#6)


CASE-2 :

// C:\temp\LIM_kettle\ST18.PROTOKL.TENOPROD_2006_07_21_00_09

var dat = DIR.Clone().rightstr(16).str2dat("yyyy_MM_dd_HH_mm");


In this case I'm not able to get the variables in the JavaScript step, and the error message is as follows:

TypeError: Cannot find function Clone in object test value test value test value test value test value test value test value test value test value test value. (script#4)


Observations:
I have added a "Get Rows" step, given it a field named DIR with a default value, and connected it to the JavaScript step. I can find DIR.getString() in the example provided with Kettle, but when I do the same in my rebuilt transformation it does not show up.

Dear experts, could you suggest where I'm making a mistake?
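
In case it matters, this is the plain (non-compatibility-mode) version I have been trying as an alternative; I'm not sure it is the intended approach:

Code:

// With the "Compatibility mode?" box unticked, DIR arrives as a plain
// string, so the Value methods (Clone, rightstr, str2dat) are not available.
var dirStr  = DIR + "";                               // force a plain JavaScript string
var dateStr = dirStr.substr(dirStr.length - 16);      // last 16 characters: 2006_07_21_00_09
var dat     = str2date(dateStr, "yyyy_MM_dd_HH_mm");  // Kettle's str2date transform function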

Below is a screenshot of the sample taken from Kettle:

string.jpg


Thank you in Advance :-)

Kettle beginner ...

User Defined Java Class

Hello

I am trying to use a User Defined Java Class step in my Kettle .ktr file which references an internal jar that sits in the lib folder of the Pentaho installation. The referenced jar is a logger class that creates rolling log files in the project directory, which in this case is the default Kettle installation directory. All works fine when I run the transformation from Spoon. But when I execute the same transformation through the Java API, the transformation completes fine yet I don't see any logs generated. The Java code isn't aware of the Kettle installation, since we reference the Kettle packages directly through Maven (kettle-core, kettle-engine and kettle-db). The transformation executes without any errors, but the log files that the User Defined Java Class is supposed to create never appear.

Is there a way, when triggering Kettle jobs through Java code, to specify the default project path and tell the Kettle engine to create the log files in a specific directory?
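
To make the question more concrete, this is roughly what I imagine on the Java side, assuming the logger inside the User Defined Java Class could be pointed at a directory via a variable (the variable name is made up):

Code:

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunWithLogDir {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();
        TransMeta transMeta = new TransMeta("/path/to/transformation.ktr");
        Trans trans = new Trans(transMeta);
        // Hypothetical variable the UDJC / logger would resolve for its output path
        trans.setVariable("LOG_OUTPUT_DIR", "/var/log/myproject");
        trans.execute(null);
        trans.waitUntilFinished();
    }
}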

Please help, I have been stuck on this issue for the last two days.

Thanks

transformation works on Windows but not on Linux

Hi all!

I have a transformation that reads from a flat file and transforms it into XML. I got a new requirement the other day that the creation timestamp found in the filename shall end up in the XML as well. So I extract the string from the filename (step "Get CreatedTS", see attached transformation), cut away the constant prefix "tradein_" (step "Format CreatedTS") and convert the field to a Timestamp using the format mask "yyyyMMddHHmm" (step "String to Timestamp"). Finally, I put it into the XML (step "XML TradeIn").

All works fine on my computer running Windows 8, but when I run it with the very same input file on a Linux server (to be honest, I don't know the exact version or even the distribution, but I will find out if it matters) I get the following error:
Code:

org.pentaho.di.core.exception.KettleValueException: CreatedTS Timestamp : couldn't convert string [201409021302] to a timestamp, expecting format [yyyy-mm-dd hh:mm:ss.ffffff]
Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]


        at org.pentaho.di.core.row.value.ValueMetaTimestamp.convertStringToTimestamp(ValueMetaTimestamp.java:205)
        at org.pentaho.di.core.row.value.ValueMetaTimestamp.convertData(ValueMetaTimestamp.java:353)
        at org.pentaho.di.trans.steps.selectvalues.SelectValues.metadataValues(SelectValues.java:342)
        at org.pentaho.di.trans.steps.selectvalues.SelectValues.processRow(SelectValues.java:386)
        at org.pentaho.di.trans.step.RunThread.run(RunThread.java:60)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
        at java.sql.Timestamp.valueOf(Timestamp.java:202)
        at org.pentaho.di.core.row.value.ValueMetaTimestamp.convertStringToTimestamp(ValueMetaTimestamp.java:203)

So it seems the format mask is not applied on Linux for some reason?

Can anyone help here? I'd really like to understand this problem, but don't know where to start. If you know a nice workaround, fine as well.
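
The workaround I'm considering, if the Timestamp conversion mask really is ignored on that Linux box, is to parse the string myself in a Modified Java Script Value step before the XML step (a sketch, assuming the incoming field is called CreatedTS):

Code:

// Parse the 12-digit string (e.g. 201409021302) explicitly instead of
// relying on the timestamp mask in the Select Values step
var createdDate = str2date(CreatedTS + "", "yyyyMMddHHmm");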

Pentaho BI server 5.2 with mysql EmbeddedQuartzSystemListener.ERROR_0007_SQLERROR

Hello All,

I have configured Pentaho 5.2 CE with MySQL by following the steps below:

https://anonymousbi.wordpress.com/20...llation-guide/
and
https://interestingittips.wordpress....on-5-1-to-5-2/

I have also removed the biserver-ce/tomcat/webapps/pentaho/WEB-INF/lib/pentaho-hadoop-hive-jdbc-shim-5.2.0.0-209.jar file and swapped in the MySQL drivers, but it is still showing me the error below:

16:50:01,670 ERROR [EmbeddedQuartzSystemListener] EmbeddedQuartzSystemListener.ERROR_0007_SQLERROR
org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (socket creation error)

The full startup log is below:

Feb 04, 2015 4:49:37 PM org.apache.catalina.core.AprLifecycleListener init
INFO: The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on
the java.library.path: C:\Program Files\Java\jdk1.7.0_45\bin;C:\Windows\Sun\Java\bin;C:\Windows\system32;C:\Windows;C:\Progr
am Files\Java\jre7\bin;D:\oracle\product\10.2.0\client_1\bin;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windo
ws\System32\WindowsPowerShell\v1.0\;C:\Program Files\TortoiseSVN\bin;C:\instantclient-basic-windows.x64-11.2.0.3.0\instantcli
ent_11_2;c:\Program Files\7-Zip;C:\Program Files\WinRAR;D:\DATA_DIR\SCRIPT;D:\SELF\PENTAHO\pdi-ce-4.3.0-stable-Kettle\data-in
tegration;C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\ManagementStudio\;C:\Program Files (x86)\Microsoft SQL S
erver\110\Tools\Binn\;C:\Program Files\Microsoft SQL Server\110\Tools\Binn\;C:\Program Files (x86)\Microsoft SQL Server\110\D
TS\Binn\;D:\SELF\My-SQL\mysql-5.6.10-win32\mysql-5.6.10-win32;C:\Program Files\Microsoft SQL Server\110\DTS\Binn\;C:\Program
Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies\;C:\Program Files\7-Zip;C:\Program Files (x86)\Microso
ft SQL Server\100\Tools\Binn\;C:\Program Files\Microsoft SQL Server\100\Tools\Binn\;C:\Program Files\Microsoft SQL Server\100
\DTS\Binn\;C:\Program Files\TortoiseGit\bin;C:\Program Files\Java\jdk1.7.0_45\bin;d:\Program Installed\vertica\bin;C:\Program
Files\TortoiseHg\;D:\Program Installed\Vagrant\bin;D:\UBUNTU\Nitesh\apache-ant-1.9.4\bin;C:\Program Files\FERRO Software\Ftp
Use;C:\Program Files (x86)\Git\cmd;.
Feb 04, 2015 4:49:37 PM org.apache.coyote.http11.Http11Protocol init
INFO: Initializing Coyote HTTP/1.1 on http-8080
Feb 04, 2015 4:49:37 PM org.apache.catalina.startup.Catalina load
INFO: Initialization processed in 581 ms
Feb 04, 2015 4:49:37 PM org.apache.catalina.core.StandardService start
INFO: Starting service Catalina
Feb 04, 2015 4:49:37 PM org.apache.catalina.core.StandardEngine start
INFO: Starting Servlet Engine: Apache Tomcat/6.0.41
Feb 04, 2015 4:49:37 PM org.apache.catalina.startup.HostConfig deployDescriptor
INFO: Deploying configuration descriptor pentaho.xml
16:50:01,670 ERROR [EmbeddedQuartzSystemListener] EmbeddedQuartzSystemListener.ERROR_0007_SQLERROR
org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (socket creation error)
at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1549)
at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1388)
at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044)
at org.pentaho.platform.scheduler2.quartz.EmbeddedQuartzSystemListener.verifyQuartzIsConfigured(EmbeddedQuartzSystemL
istener.java:158)
at org.pentaho.platform.scheduler2.quartz.EmbeddedQuartzSystemListener.startup(EmbeddedQuartzSystemListener.java:100)

at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:398)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:389)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:368)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:389)
at org.pentaho.platform.engine.core.system.PentahoSystem.access$000(PentahoSystem.java:77)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:326)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:323)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:368)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:323)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:294)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:207)
at org.pentaho.platform.web.http.context.SolutionContextListener.contextInitialized(SolutionContextListener.java:135)

at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4210)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4709)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:675)
at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:601)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: java.sql.SQLException: socket creation error
at org.hsqldb.jdbc.Util.sqlException(Unknown Source)
at org.hsqldb.jdbc.jdbcConnection.<init>(Unknown Source)
at org.hsqldb.jdbcDriver.getConnection(Unknown Source)
at org.hsqldb.jdbcDriver.connect(Unknown Source)
at org.apache.commons.dbcp.DriverConnectionFactory.createConnection(DriverConnectionFactory.java:38)
at org.apache.commons.dbcp.PoolableConnectionFactory.makeObject(PoolableConnectionFactory.java:582)
at org.apache.commons.dbcp.BasicDataSource.validateConnectionFactory(BasicDataSource.java:1556)
at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1545)
... 40 more
16:50:01,686 ERROR [Logger] Error: Pentaho
16:50:01,687 ERROR [Logger] misc-org.pentaho.platform.engine.core.system.PentahoSystem: org.pentaho.platform.api.engine.Penta
hoSystemException: PentahoSystem.ERROR_0014 - Error while trying to execute startup sequence for org.pentaho.platform.schedul
er2.quartz.EmbeddedQuartzSystemListener
org.pentaho.platform.api.engine.PentahoSystemException: org.pentaho.platform.api.engine.PentahoSystemException: PentahoSystem
.ERROR_0014 - Error while trying to execute startup sequence for org.pentaho.platform.scheduler2.quartz.EmbeddedQuartzSystemL
istener
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:331)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:294)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:207)
at org.pentaho.platform.web.http.context.SolutionContextListener.contextInitialized(SolutionContextListener.java:135)

at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4210)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4709)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:583)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:675)
at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:601)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:822)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: org.pentaho.platform.api.engine.PentahoSystemException: PentahoSystem.ERROR_0014 - Error while trying to execute s
tartup sequence for org.pentaho.platform.scheduler2.quartz.EmbeddedQuartzSystemListener
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:407)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:389)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:368)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:389)
at org.pentaho.platform.engine.core.system.PentahoSystem.access$000(PentahoSystem.java:77)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:326)
at org.pentaho.platform.engine.core.system.PentahoSystem$1.call(PentahoSystem.java:323)
at org.pentaho.platform.engine.core.system.PentahoSystem.runAsSystem(PentahoSystem.java:368)
at org.pentaho.platform.engine.core.system.PentahoSystem.notifySystemListenersOfStartup(PentahoSystem.java:323)
... 27 more
Caused by: org.pentaho.platform.api.engine.PentahoSystemException: PentahoSystem.ERROR_0014 - Error while trying to execute s
tartup sequence for org.pentaho.platform.scheduler2.quartz.EmbeddedQuartzSystemListener
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:399)
... 35 more
16:50:01,694 ERROR [Logger] Error end:
Pentaho BI Platform server failed to properly initialize. The system will not be available for requests. (Pentaho Open Source
BA Server 5.2.0.0-209) Fully Qualified Server Url = http://localhost:8080/pentaho/, Solution Path = D:\PENTAHO-CE\pentaho5.2
\biserver-ce-5.2.0.0-209\biserver-ce\pentaho-solutions
Feb 04, 2015 4:50:01 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory pentaho-style
Feb 04, 2015 4:50:02 PM org.apache.catalina.core.NamingContextListener addResource
WARNING: Failed to register in JMX: javax.naming.NamingException: Could not load resource factory class [Root exception is ja
va.lang.ClassNotFoundException: org.apache.commons.dbcp.BasicDataSourceFactory]
Feb 04, 2015 4:50:02 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory ROOT
Feb 04, 2015 4:50:02 PM org.apache.catalina.core.NamingContextListener addResource
WARNING: Failed to register in JMX: javax.naming.NamingException: Could not load resource factory class [Root exception is ja
va.lang.ClassNotFoundException: org.apache.commons.dbcp.BasicDataSourceFactory]
Feb 04, 2015 4:50:02 PM org.apache.catalina.startup.HostConfig deployDirectory
INFO: Deploying web application directory sw-style
Feb 04, 2015 4:50:02 PM org.apache.catalina.core.NamingContextListener addResource
WARNING: Failed to register in JMX: javax.naming.NamingException: Could not load resource factory class [Root exception is ja
va.lang.ClassNotFoundException: org.apache.commons.dbcp.BasicDataSourceFactory]
Feb 04, 2015 4:50:02 PM org.apache.coyote.http11.Http11Protocol start
INFO: Starting Coyote HTTP/1.1 on http-8080
Feb 04, 2015 4:50:02 PM org.apache.jk.common.ChannelSocket init
INFO: JK: ajp13 listening on /0.0.0.0:8009
Feb 04, 2015 4:50:02 PM org.apache.jk.server.JkMain start
INFO: Jk running ID=0 time=0/22 config=null
Feb 04, 2015 4:50:02 PM org.apache.catalina.startup.Catalina start
INFO: Server startup in 24410 ms

Please assist me in resolving this issue.
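
For what it's worth, the "socket creation error" coming from org.hsqldb makes me suspect that the Quartz datasource is still pointing at the default HSQLDB instance. The jdbc/Quartz resource in tomcat/webapps/pentaho/META-INF/context.xml is what I think needs to reference MySQL, roughly like this (URL, username and password are placeholders for my setup):

Code:

<Resource name="jdbc/Quartz" auth="Container" type="javax.sql.DataSource"
          factory="org.apache.commons.dbcp.BasicDataSourceFactory"
          maxActive="20" maxIdle="5" maxWait="10000"
          username="pentaho_user" password="password"
          driverClassName="com.mysql.jdbc.Driver"
          url="jdbc:mysql://localhost:3306/quartz"
          validationQuery="SELECT 1"/>

Is that the right place to look, or is there another file I have missed?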

Thanks in Advance !!

JDBC connection - Crate.io

Hi Everyone,

I'm trying to connect to a Crate.io database. They say the database uses a standard JDBC 4 driver.
More information about the driver can be found at:
https://crate.io/docs/projects/crate-jdbc/stable/

Currently it just hangs when I click "Test" on my database connection.
Please help

Kind Regards,

Extreme Newbie - Did I configure installation wrong? Unable to run sakila demo

Hi, I am just beginning to learn Pentaho via the Wiley publication "Pentaho Kettle Solutions". I am a statistician by training and I know R, Python, Fortran, a little Pascal, a little SQL and C...so I am not afraid of programming languages. But, I am not BI solutions experienced, by any means. That being said, I imagine this request for help will not include all the information you need to possibly help me. I will respond quickly with any new information you request.


OK, so I installed MySQL on my desktop from installer 1.4 (MySQL 5.6, I believe). Then I *thought* I installed Pentaho 5.2.0.0 as instructed in the book. Following Chapter 4, I loaded (connected to?) the sakila_dwh data with Spoon and then opened the load_dim_date.ktr transformation. When I ran the transformation, it failed with the errors below.

I might be misunderstanding, but it looks like it's having trouble writing to the output table via MySQL, is that right? Specifically with some batch update? Other errors may cascade from there, but I have no idea. Can anyone help me get started with this? Did I configure something wrong?

Code:

2015/02/04 10:07:37 - Version checker - OK
2015/02/04 10:08:02 - Spoon - Transformation opened.
2015/02/04 10:08:02 - Spoon - Launching transformation [load_dim_date]...
2015/02/04 10:08:02 - Spoon - Started the transformation execution.
2015/02/04 10:08:03 - load_dim_date - Dispatching started for transformation [load_dim_date]
2015/02/04 10:08:03 - Load dim_date.0 - Connected to database [sakila_dwh] (commit=10000)
2015/02/04 10:08:03 - Generate 10 years.0 - Finished processing (I=0, O=0, R=0, W=3660, U=0, E=0)
2015/02/04 10:08:03 - Calculate Dimension Attributes.0 - Optimization level not specified.  Using default of 9.
2015/02/04 10:08:03 - Calculate Dimension Attributes.1 - Optimization level not specified.  Using default of 9.
2015/02/04 10:08:03 - Calculate Dimension Attributes.2 - Optimization level not specified.  Using default of 9.
2015/02/04 10:08:03 - Calculate Dimension Attributes.3 - Optimization level not specified.  Using default of 9.
2015/02/04 10:08:03 - Day Sequence.0 - Finished processing (I=0, O=0, R=3660, W=3660, U=0, E=0)
2015/02/04 10:08:06 - Calculate Dimension Attributes.0 - Finished processing (I=0, O=0, R=915, W=915, U=0, E=0)
2015/02/04 10:08:06 - Calculate Dimension Attributes.3 - Finished processing (I=0, O=0, R=915, W=915, U=0, E=0)
2015/02/04 10:08:06 - Calculate Dimension Attributes.2 - Finished processing (I=0, O=0, R=915, W=915, U=0, E=0)
2015/02/04 10:08:06 - Calculate Dimension Attributes.1 - Finished processing (I=0, O=0, R=915, W=915, U=0, E=0)
2015/02/04 10:08:08 - Load dim_date.0 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Unexpected batch update error committing the database connection.
2015/02/04 10:08:08 - Load dim_date.0 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : org.pentaho.di.core.exception.KettleDatabaseBatchException:
2015/02/04 10:08:08 - Load dim_date.0 - Error updating batch
2015/02/04 10:08:08 - Load dim_date.0 - Unknown column 'date_value' in 'field list'
2015/02/04 10:08:08 - Load dim_date.0 -
2015/02/04 10:08:08 - Load dim_date.0 -              at org.pentaho.di.core.database.Database.createKettleDatabaseBatchException(Database.java:1377)
2015/02/04 10:08:08 - Load dim_date.0 -              at org.pentaho.di.core.database.Database.emptyAndCommit(Database.java:1366)
2015/02/04 10:08:08 - Load dim_date.0 -              at org.pentaho.di.trans.steps.tableoutput.TableOutput.dispose(TableOutput.java:571)
2015/02/04 10:08:08 - Load dim_date.0 -              at org.pentaho.di.trans.step.RunThread.run(RunThread.java:96)
2015/02/04 10:08:08 - Load dim_date.0 -              at java.lang.Thread.run(Unknown Source)
2015/02/04 10:08:08 - Load dim_date.0 - Caused by: java.sql.BatchUpdateException: Unknown column 'date_value' in 'field list'
2015/02/04 10:08:08 - Load dim_date.0 -              at com.mysql.jdbc.PreparedStatement.executeBatchSerially(PreparedStatement.java:1815)
2015/02/04 10:08:08 - Load dim_date.0 -              at com.mysql.jdbc.PreparedStatement.executeBatch(PreparedStatement.java:1277)
2015/02/04 10:08:08 - Load dim_date.0 -              at org.pentaho.di.core.database.Database.emptyAndCommit(Database.java:1353)
2015/02/04 10:08:08 - Load dim_date.0 -              ... 3 more
2015/02/04 10:08:08 - Load dim_date.0 - Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'date_value' in 'field list'
2015/02/04 10:08:08 - Load dim_date.0 -              at sun.reflect.GeneratedConstructorAccessor31.newInstance(Unknown Source)
2015/02/04 10:08:08 - Load dim_date.0 -              at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
2015/02/04 10:08:08 - Load dim_date.0 -              at java.lang.reflect.Constructor.newInstance(Unknown Source)
2015/02/04 10:08:08 - Load dim_date.0 -              at com.mysql.jdbc.Util.handleNewInstance(Util.java:377)
2015/02/04 10:08:08 - Load dim_date.0 -              at com.mysql.jdbc.Util.getInstance(Util.java:360)
2015/02/04 10:08:08 - Load dim_date.0 -              at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:978)
2015/02/04 10:08:08 - Load dim_date.0 -              at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3887)
2015/02/04 10:08:08 - Load dim_date.0 -              at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3823)
2015/02/04 10:08:08 - Load dim_date.0 -              at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2435)
2015/02/04 10:08:08 - Load dim_date.0 -              at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2582)
2015/02/04 10:08:08 - Load dim_date.0 -              at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2530)
2015/02/04 10:08:08 - Load dim_date.0 -              at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1907)
2015/02/04 10:08:08 - Load dim_date.0 -              at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2141)
2015/02/04 10:08:08 - Load dim_date.0 -              at com.mysql.jdbc.PreparedStatement.executeBatchSerially(PreparedStatement.java:1773)
2015/02/04 10:08:08 - Load dim_date.0 -              ... 5 more

2015/02/04 10:08:08 - Load dim_date.0 - Finished processing (I=0, O=0, R=3660, W=0, U=0, E=1)
2015/02/04 10:08:08 - load_dim_date - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Errors detected!
2015/02/04 10:08:08 - Spoon - The transformation has finished!!
2015/02/04 10:08:08 - load_dim_date - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Errors detected!
2015/02/04 10:08:08 - load_dim_date - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Errors detected!

2015/02/04 10:08:08 - load_dim_date - Transformation detected one or more steps with errors.
2015/02/04 10:08:08 - load_dim_date - Transformation is killing the other steps!
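
For completeness, the next thing I plan to check, given the "Unknown column 'date_value'" message, is whether my dim_date table actually has that column (assuming the target table is sakila_dwh.dim_date, as created from the book's script):

Code:

-- Compare the target table's columns with the fields written by the Table Output step
USE sakila_dwh;
DESCRIBE dim_date;   -- is there a date_value column?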

Dynamically generate SOAP request

Hi,

I am facing a challenge to generate SOAP calls based on the records of an MS Excel file.

The file has a pre-defined set of attributes, but how it is populated determines how the SOAP envelope needs to be built. In other words, I may need to read two or three records to build one SOAP call.

The file looks like this:

ID EmployeeNum EmailType Address
1 001 Personal aa at aa dot com
1 Work bb at bb dot com


In this case it means that I need to read two records to build one SOAP request.
With one record per line it is really easy to map columns to XML elements using the Modified Java Script Value step, but in this case I really need some ideas on how to do it.

So the XML should look like this:


<?xml version="1.0" encoding="UTF-8"?>
<employee employeenum="001">
<email>
<personal_email>aa at aa dot com</personal_email>
<work_email>bb at bb dot com</work_email>
</email>
</employee>

Thanks in advance.

Martin
