Please, how do I enable remote DB connections in PDI? Thanks.
↧
PDI and Remote DB Connections
↧
Get File Names - Empty Rows
Use the "Get File Names" step with a wildcard, followed by the "Detect empty stream" step. If no files are found, "Detect empty stream" emits one extra row, which you can use to identify whether a file has arrived or not. Try it; it works fine.
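The idea above can be sketched outside PDI too. A minimal Python sketch of the "one marker row when nothing matched" behavior (the function name and row shape are made up for illustration, not PDI API):

```python
import glob

def file_rows(pattern):
    """Yield one row per matched file; if nothing matched,
    yield a single marker row (mimicking 'Detect empty stream')."""
    files = glob.glob(pattern)
    if not files:
        # no files arrived: emit exactly one marker row
        yield {"filename": None, "empty_marker": True}
        return
    for f in files:
        yield {"filename": f, "empty_marker": False}
```

Downstream logic can then branch on the marker field the same way a "Filter rows" step would branch on the extra row.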
↧
↧
How do I connect PDI to read SSAS cubes from Analysis Services?
Hey, maybe it's bizarre, but this comes up in multiple BI systems where you want to prevent redundant calculations.
Anyway, let's say I want to pull info via an MDX/Mondrian query from some existing Microsoft SSAS cubes into Pentaho PDI.
Um... how do I configure that connection? Is it even possible out of the box, without heavy customization?
I have all the proper connection credentials and know the connection string; I'm just not sure whether it works in Pentaho PDI. SSIS, of course, has a method for it.
↧
PLUG - Pentaho London Usergroup - Machine Intelligence and Apache Beam
There are only three weeks to go until the next Pentaho London Usergroup (PLUG).
The agenda this month blows everything out of the park, even if I say so myself!
Not only do we have Ken Wood travelling from America to present the Plugin Machine Intelligence work from Pentaho Labs, but we ALSO have Matt Casters presenting the Kettle Beam work. If you've not seen this yet, you'll be amazed!
Please register here, and don't forget you must sign up on the Skills Matter website too.
https://www.meetup.com/Pentaho-Londo...nts/256773962/
Finally, one request: could I ask everyone to share and help promote this event? The more networks and places where we engage, the better!
See you there!
↧
Database Views vs. Metadata Editor
Hi fellow developers!
Should we still use database views for data connections?
Doesn't the Metadata Editor essentially give a user interface for creating views?
I don't have a problem creating views in pgAdmin, but I feel like I'm duplicating the effort in the Pentaho Metadata Editor.
What is best practice?
Thanks in advance!
↧
↧
How to join constant columns from another step?
Is it possible to do something like a "Stream Value Lookup" which adds a handful of fields to a stream, but without looking up keys?
Basically, I can achieve this behavior by adding a dummy "key" to the main stream, then using this key to look up the fields to be joined into every row. But it would be nicer if there were no need to add this dummy key, and the fields were just joined into each row without any key/lookup logic.
Example:
Let's say a main stream has fields A, B, C and we get 2 rows coming in: (1,2,3) and (4,5,6).
A B C
1 2 3
4 5 6
A different step looks up additional fields D, E from some separate source (a DB or whatever) and joins this single row (x, y) onto every row. The result is shown below.
A B C D E
1 2 3 x y
4 5 6 x y
This is basically a cartesian product, but I'm not sure whether "Join Rows (cartesian product)" is suitable for large amounts of data, or whether it needs to receive all rows before doing the join.
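The behavior being asked for can be sketched generically (a Python sketch with illustrative names, not PDI API): fetch the constant fields once, then append them to every incoming row with no key at all.

```python
def join_constants(rows, constants):
    """Append the same extra fields to every incoming row.
    No lookup key: this is a cartesian join against a one-row stream."""
    for row in rows:
        yield row + constants

# the single (x, y) row would come from the separate lookup/db step
result = list(join_constants([(1, 2, 3), (4, 5, 6)], ("x", "y")))
# result == [(1, 2, 3, 'x', 'y'), (4, 5, 6, 'x', 'y')]
```

Note that this streams row by row: only the one-row side has to be fully read up front, so the main stream needs no buffering, which is also why a cartesian join against a single-row input stays cheap regardless of main-stream size.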
↧
Add a variable or parameter to the subject in a Mail step
Hi,
I need to automatically write last week's date in a job, because I'm sending a file with last week's content.
So every Monday I send an email with a file, and in the subject I want to write "Inform for the week 14-01-2019" (in this case the email would be sent on 21-01-2019).
Could you help me?
If this is not possible, would it be possible to write the date automatically into the name of the Excel file, using a variable or parameter?
Thanks in advance.
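The date arithmetic itself is simple. A Python sketch (the function name is made up) that produces the Monday of the previous week, matching the example above:

```python
from datetime import date, timedelta

def last_week_label(send_date):
    """Build the subject string for the week before send_date."""
    # Monday of the week containing send_date (weekday() == 0 is Monday)
    this_monday = send_date - timedelta(days=send_date.weekday())
    # Monday of the previous week
    last_monday = this_monday - timedelta(days=7)
    return "Inform for the week " + last_monday.strftime("%d-%m-%Y")

last_week_label(date(2019, 1, 21))  # -> 'Inform for the week 14-01-2019'
```

In PDI you would typically compute such a string in a transformation, push it into a variable with "Set Variables", and then, as far as I know, reference it as ${SOME_VARIABLE} in the Mail entry's subject field, since job entry fields generally resolve Kettle variables.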
↧
Trigger a Table Input without the use of dummy parameters?
In this example a transformation starts with a Table Input step. However, I only want to execute this select query in certain situations, so I create a step before it that does some logic and sends a dummy value to the Table Input stream IF the query should be executed. The Table Input step is then configured with "Insert data from step" and "Execute for each row". Great! However, the transformation crashes because the dummy value is not used as a parameter in the SQL query! Why is this usage strictly enforced?
The workaround was to add dummy logic for the dummy parameter (its value always set to a static '1') inside the SQL, for example:
...
WHERE 1 = ?
Is there some way to get the same behavior without polluting the SQL with this dummy clutter?
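The `WHERE 1 = ?` trick can be reproduced in any parameterized SQL client. A sqlite3 sketch of why it is harmless: the dummy parameter is consumed by an always-true predicate, so the query behaves as if it had no WHERE clause at all.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(1,), (2,)])

# the dummy value (always 1) satisfies the placeholder requirement,
# and '1 = 1' is always true, so no rows are filtered out
rows = conn.execute("SELECT a FROM t WHERE 1 = ? ORDER BY a", (1,)).fetchall()
# rows == [(1,), (2,)]
```

That is exactly the shape PDI forces here: "Execute for each row" hands each incoming row's fields to the query as positional parameters, and the step insists every incoming field be bound to a placeholder.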
↧
SPARK AEL execution through Pan
Hi,
I can run a PDI transformation using the AEL Spark engine through Spoon, but I need to run this transformation using Pan, and I don't see any option to tell Pan to use the AEL Spark engine.
Can anyone help me here, please?
Suresh
↧
↧
REST client keeps sending "Transfer-Encoding: chunked"
How do I remove the "Transfer-Encoding: chunked" header from the REST Client step's transmission?
Normally this header should only be added when Content-Length is unknown, but even when I add Content-Length, Pentaho keeps sending chunked.
Is there any trick to manipulate the headers Kettle sends?
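For background on what the header implies: with chunked transfer encoding (RFC 7230) each piece of the body is prefixed by its size in hex and the stream ends with a zero-length chunk, so the receiver never needs Content-Length. A minimal sketch of the wire format:

```python
def chunk_encode(body: bytes, chunk_size: int = 8) -> bytes:
    """Encode a body using HTTP/1.1 chunked transfer encoding."""
    out = bytearray()
    for i in range(0, len(body), chunk_size):
        piece = body[i:i + chunk_size]
        # each chunk: <hex length>\r\n<data>\r\n
        out += f"{len(piece):X}\r\n".encode("ascii") + piece + b"\r\n"
    out += b"0\r\n\r\n"  # terminating zero-length chunk
    return bytes(out)

chunk_encode(b"hello world")
# b'8\r\nhello wo\r\n3\r\nrld\r\n0\r\n\r\n'
```

Whether the REST Client step can be switched to a fixed Content-Length likely depends on the underlying HTTP client library Kettle bundles, so this sketch only illustrates what "chunked" means on the wire, not a Kettle setting.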
↧
[First init PDI8.2] [Linux] Creating configuration won't terminate
Hello,
I'm using the latest version of CentOS. I just downloaded PDI 8.2 and tried to launch Kitchen from the command line (file repository).
It seems it wants to configure some things, but it never finishes and blocks at the same step every time I launch it.
I'm then stuck and must kill the processes (from another shell session) to get the shell back.
Any clue, anyone?
Code:
$ ./kitchen.sh -rep=repository -dir=/ -trans=Catalogue_Offres_Activables.kjb
OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
16:12:52,459 INFO [KarafBoot] Checking to see if org.pentaho.clean.karaf.cache is enabled
16:12:52,594 INFO [KarafInstance]
*******************************************************************************
*** Karaf Instance Number: 2 at /home/dd/pdi82/./system/karaf/caches/k ***
*** itchen/data-2 ***
*** FastBin Provider Port:52902 ***
*** Karaf Port:8803 ***
*** OSGI Service Port:9052 ***
*******************************************************************************
Jan 24, 2019 4:12:53 PM org.apache.karaf.main.Main launch
INFO: Installing and starting initial bundles
Jan 24, 2019 4:12:54 PM org.apache.karaf.main.Main launch
INFO: All initial bundles installed and set to start
Jan 24, 2019 4:12:54 PM org.apache.karaf.main.Main$KarafLockCallback lockAquired
INFO: Lock acquired. Setting startlevel to 100
2019/01/24 16:12:55 - Kitchen - Start of run.
2019/01/24 16:12:55 - RepositoriesMeta - Reading repositories XML file: /home/dd/.kettle/repositories.xml
ERROR: Kitchen can't continue because the job couldn't be loaded.
Creating configuration from org.apache.karaf.command.acl.bundle.cfg
Creating configuration from org.pentaho.caching-default.cfg
Creating configuration from org.apache.felix.fileinstall-deploy.cfg
Creating configuration from org.apache.karaf.command.acl.config.cfg
Creating configuration from jmx.acl.cfg
Creating configuration from org.apache.karaf.command.acl.scope_bundle.cfg
Creating configuration from org.apache.karaf.webconsole.cfg
Creating configuration from mondrian.cfg
Creating configuration from org.apache.karaf.features.obr.cfg
Creating configuration from jmx.acl.osgi.compendium.cm.cfg
Creating configuration from org.ops4j.pax.url.mvn.cfg
Creating configuration from org.apache.karaf.jaas.cfg
Creating configuration from org.apache.karaf.features.cfg
Creating configuration from org.ops4j.pax.logging.cfg
Creating configuration from org.apache.aries.rsa.provider.fastbin.cfg
Creating configuration from org.apache.karaf.command.acl.feature.cfg
Creating configuration from org.apache.karaf.kar.cfg
Creating configuration from org.apache.karaf.shell.cfg
Creating configuration from org.apache.activemq.server-default.cfg
Creating configuration from org.apache.karaf.log.cfg
Creating configuration from jmx.acl.org.apache.karaf.config.cfg
Creating configuration from org.pentaho.pdi.engine.spark.cfg
Creating configuration from org.apache.karaf.management.cfg
Creating configuration from org.apache.karaf.command.acl.jaas.cfg
Creating configuration from org.ops4j.pax.web.cfg
Creating configuration from org.apache.karaf.command.acl.kar.cfg
Creating configuration from org.apache.karaf.command.acl.system.cfg
Creating configuration from jmx.acl.org.apache.karaf.bundle.cfg
Creating configuration from org.apache.karaf.features.repos.cfg
Creating configuration from jmx.acl.java.lang.Memory.cfg
Creating configuration from org.apache.activemq.webconsole.cfg
Creating configuration from org.apache.karaf.command.acl.shell.cfg
Creating configuration from jmx.acl.org.apache.karaf.security.jmx.cfg
Creating configuration from org.pentaho.features.cfg
Thanks
↧
Browser Files page never ends loading files
Hi everyone.
Could someone help me? I get an error in the BI Server when trying to browse files: the spinning wheel never stops, but only when the folder has files in it!
I inspected the web page but only found the error shown in the code below.
Another fix I found was to modify characters in the browser.js file (changing "|" to "%7C"), but that had already been changed.
Code:
Uncaught TypeError: Cannot read property 'prop' of undefined at child.reformatResponse (mantle/browser/js/browser.js:617)
at mantle/browser/js/browser.js:636
at Object.success (mantle/browser/js/browser.js:655)
at j (eval at <anonymous> (content/common-ui/resources/web/jquery/jquery.min.js:2), <anonymous>:2:27244)
at Object.fireWith [as resolveWith] (eval at <anonymous> (content/common-ui/resources/web/jquery/jquery.min.js:2), <anonymous>:2:28057)
at x (eval at <anonymous> (content/common-ui/resources/web/jquery/jquery.min.js:2), <anonymous>:4:21843)
at XMLHttpRequest.b (eval at <anonymous> (content/common-ui/resources/web/jquery/jquery.min.js:2), <anonymous>:4:25897)
↧
DB Repository to File Repo
I am migrating from a DB Repo to a file based Repo.
While all the transformations and jobs are migrating properly, I have not yet figured out how to do the following.
Currently (DB Repo)
When creating a transformation, you have all of your database connections available on the left-hand side, which lets you quickly configure what you need.
File-Based Repo
Only the connections in use (for that job or transformation) appear to be listed; when creating new transformations or jobs this is a pain.
Question:
How do I export all of the database connections into individual .kdb files? (I have 50-60 unique DB connections.)
Thank you
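One hedged approach (untested against a real repository; the XML element names below are assumptions based on typical Kettle exports): export the DB repository to a single XML file from Spoon, then split each `<connection>` element into its own .kdb file, since a .kdb file is essentially a stand-alone `<connection>` XML document.

```python
import os
import xml.etree.ElementTree as ET

def split_connections(repo_export_xml: str, out_dir: str) -> list:
    """Write each <connection> element of a repository export to its
    own <name>.kdb file; returns the list of file names written.
    Assumes <connection> elements with a <name> child, which may
    differ between Kettle versions."""
    written = []
    root = ET.parse(repo_export_xml).getroot()
    for conn in root.iter("connection"):
        name = conn.findtext("name")
        if not name:
            continue
        path = os.path.join(out_dir, name + ".kdb")
        ET.ElementTree(conn).write(path, encoding="UTF-8", xml_declaration=True)
        written.append(name + ".kdb")
    return written
```

Dropping the resulting .kdb files into the relevant directory should make the connections reusable, but do verify one exported file against a .kdb that Spoon itself wrote before trusting the batch.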
↧
↧
differences between kettle 8.0.0.0-28 and 8.2.0.0-342 ?
After looking at https://help.pentaho.com/Documentation/8.2/Whats_New to see what is new in 8.2, I still have some issues.
I have a number of DbUnit tests that execute Pentaho jobs and transformations via Java code and then compare job/transformation outputs against known inputs.
These 121-odd tests ran clean against 8.0.0.0-28, but after upgrading to 8.2.0.0-342 I have a number of failures.
I was wondering if somebody else has seen similar behavior in the 8.2 release.
Any pointers would be appreciated.
Thanks.
↧
Unknown type 245 in column 0 of 1 in binary-encoded
Hi,
PDI 6.1, MySQL database.
We are getting the error "Unknown type 245 in column 0 of 1 in binary-encoded"; could you please help with this?
We get it when trying to load JSON data using a Table Input step.
Thanks
↧
PDI (8.2) Status SVG file not found
I have problem with status page on my Carte servers and on Pentaho Data Integration server.
When I try to access the status page, everything is broken because the static web resources (CSS, SVG) cannot be loaded.
I have the following use cases.
Case 1) Through the Carte status page, it seems it's trying to access inaccessible content. I tried changing to "kettle/content..." and "kettle/pentaho/content..." but it still failed.
Case 2) Through the PDI status page I have a custom URL (instead of "/pentaho" I need to use "/pentaho-di"), and the resources keep pointing to the wrong location. If I adjust the URL manually, the resource is there.
Where can I change those resource locations?
↧
Launch Spoon on ubuntu 18.04
Hi All,
I have installed Pentaho on an Ubuntu server running on GCP. I followed all the steps to install it and the installation went fine. I ran the job from the terminal using Kitchen and it works fine, but when I try to open Spoon I get the error below.
Please share your comments if anyone has had the same issue.
OpenJDK 64-Bit Server VM warning: Ignoring option MaxPermSize; support was removed in 8.0
log4j:WARN Continuable parsing error 45 and column 76
log4j:WARN Element type "rollingPolicy" must be declared.
log4j:WARN Continuable parsing error 52 and column 14
log4j:WARN The content of element type "appender" must match "(errorHandler?,param*,layout?,filter*,appender-ref*)".
log4j:WARN Please set a rolling policy for the RollingFileAppender named 'pdi-execution-appender'
org.eclipse.swt.SWTError: No more handles [gtk_init_check() failed]
    at org.eclipse.swt.SWT.error(Unknown Source)
    at org.eclipse.swt.widgets.Display.createDisplay(Unknown Source)
    at org.eclipse.swt.widgets.Display.create(Unknown Source)
    at org.eclipse.swt.graphics.Device.<init>(Unknown Source)
    at org.eclipse.swt.widgets.Display.<init>(Unknown Source)
    at org.eclipse.swt.widgets.Display.<init>(Unknown Source)
    at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:664)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:564)
    at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
↧
↧
Access of external jar using udjc step
Hi all,
Has anyone accessed external JARs using the UDJC step?
I am currently using Pentaho version 7.
I have placed the JAR file in the Pentaho lib folder, and I also checked in the launcher.properties file that the classpath is mapped to the lib folder.
After all these checks there is still an issue.
Can anyone please help me out?
↧
UDJC step throwing a compilation error
Hi All,
I have written Java code for encrypting data. It currently throws the error: "no applicable constructor/method found for actual parameters [byte], [byte]. Candidates are byte[] org.springframework.security.crypto.util.EncodingUtils.concatenate(byte[]...)".
Can anyone please help?
I have already imported the following in the code:
import org.springframework.security.crypto.encrypt.BytesEncryptor;
import org.springframework.security.crypto.codec.Hex;
import org.springframework.security.crypto.encrypt.Encryptors;
import org.springframework.security.crypto.keygen.BytesKeyGenerator;
import org.springframework.security.crypto.keygen.KeyGenerators;
import org.springframework.security.crypto.util.EncodingUtils;
import java.io.UnsupportedEncodingException;
import org.owasp.esapi.ESAPI;
↧
PDI 8.2.0.0-342 Microsoft Excel input step - Browse doesn't "see" xlsx files
I've been using PDI for quite some time, but this is the first time I've run across this issue. I'm trying to load an Excel (xlsx) file using the Microsoft Excel Input step, yet when I browse to the folder containing my xlsx files, nothing shows up in the step's browse window. It will see xls files but not xlsx files. If I change the browse filter to "all files" so that I can see the xlsx files, add one, and then try to get the sheet names, I get the following error:
org.pentaho.di.core.exception.KettleException:
jxl.read.biff.BiffException: Unable to recognize OLE stream
Unable to recognize OLE stream
at org.pentaho.di.trans.steps.excelinput.jxl.XLSWorkbook.<init>(XLSWorkbook.java:54)
at org.pentaho.di.trans.steps.excelinput.WorkbookFactory.getWorkbook(WorkbookFactory.java:39)
at org.pentaho.di.ui.trans.steps.excelinput.ExcelInputDialog.getSheets(ExcelInputDialog.java:1922)
at org.pentaho.di.ui.trans.steps.excelinput.ExcelInputDialog$17.widgetSelected(ExcelInputDialog.java:1184)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.trans.steps.excelinput.ExcelInputDialog.open(ExcelInputDialog.java:1214)
at org.pentaho.di.ui.spoon.delegates.SpoonStepsDelegate.editStep(SpoonStepsDelegate.java:120)
at org.pentaho.di.ui.spoon.Spoon.editStep(Spoon.java:8662)
at org.pentaho.di.ui.spoon.trans.TransGraph.editStep(TransGraph.java:3293)
at org.pentaho.di.ui.spoon.trans.TransGraph.mouseDoubleClick(TransGraph.java:785)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1381)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7817)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9179)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:707)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
Caused by: jxl.read.biff.BiffException: Unable to recognize OLE stream
at jxl.read.biff.CompoundFile.<init>(CompoundFile.java:116)
at jxl.read.biff.File.<init>(File.java:127)
at jxl.Workbook.getWorkbook(Workbook.java:268)
at org.pentaho.di.trans.steps.excelinput.jxl.XLSWorkbook.<init>(XLSWorkbook.java:52)
... 29 more
I have Office 2016 installed on my computer.
Thanks.
↧