Channel: Pentaho Community Forums

ERROR KarafLifecycleListener Error in Blueprint Watcher

Hi,

I occasionally get the error ERROR [KarafLifecycleListener] Error in Blueprint Watcher when running PDI 6.1.0.1. I have tried some of the suggested fixes, such as clearing the Karaf cache and editing the Karaf .cfg file, but the error still appears at random.

Has anyone else seen this in this version of PDI and found a way around it? I have attached the log.
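For reference, "clearing the cache" here means deleting the Karaf cache directory while Spoon/Kitchen is stopped. A minimal sketch, assuming the usual data-integration/system/karaf/caches location (verify the path in your install):

Code:

import java.io.IOException;
import java.nio.file.*;
import java.util.Comparator;
import java.util.stream.Stream;

// Deletes the Karaf cache directory; run only while PDI is stopped.
// The relative path is an assumption -- adjust it to your install.
public class ClearKarafCache {
    public static void main(String[] args) throws IOException {
        Path cache = Paths.get("data-integration", "system", "karaf", "caches");
        if (Files.exists(cache)) {
            try (Stream<Path> paths = Files.walk(cache)) {
                // Reverse order deletes files before their parent directories.
                paths.sorted(Comparator.reverseOrder())
                     .forEach(p -> p.toFile().delete());
            }
        }
    }
}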

Kind regards,

Gary

Error message: Couldn't find field X in row

When fields are missing in the Select Values step, for example, the error message only names the first missing field. Why doesn't it show ALL the missing fields? This forces me to run a transformation ten times to track down ten missing fields. Since I use PDI every day, this annoys me quite a lot.

Error while connecting to MSSQL DB

Dear All,

Now that I am able to launch Spoon.bat after installing Java 8 (thanks to JohanHammink for your help in another post), I am stuck establishing a DB connection to MSSQL.

I also restarted my machine, as suggested in one of the posts, but that did not help. Please let me know of anything that can resolve this. Thanks in advance.

Tanisha

The full error is below:

Error connecting to database [DBConnection] : org.pentaho.di.core.exception.KettleDatabaseException:
Error occurred while trying to connect to the database

Driver class 'net.sourceforge.jtds.jdbc.Driver' could not be found, make sure the 'MS SQL Server' driver (jar file) is installed.
net.sourceforge.jtds.jdbc.Driver

at org.pentaho.di.core.database.Database.normalConnect(Database.java:472)
at org.pentaho.di.core.database.Database.connect(Database.java:370)
at org.pentaho.di.core.database.Database.connect(Database.java:341)
at org.pentaho.di.core.database.Database.connect(Database.java:331)
at org.pentaho.di.core.database.DatabaseFactory.getConnectionTestReport(DatabaseFactory.java:80)
at org.pentaho.di.core.database.DatabaseMeta.testConnection(DatabaseMeta.java:2795)
at org.pentaho.ui.database.event.DataHandler.testDatabaseConnection(DataHandler.java:598)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:313)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:157)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:141)
at org.pentaho.ui.xul.swt.tags.SwtButton.access$500(SwtButton.java:43)
at org.pentaho.ui.xul.swt.tags.SwtButton$4.widgetSelected(SwtButton.java:137)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.eclipse.jface.window.Window.runEventLoop(Window.java:820)
at org.eclipse.jface.window.Window.open(Window.java:796)
at org.pentaho.di.ui.xul.KettleDialog.show(KettleDialog.java:80)
at org.pentaho.di.ui.xul.KettleDialog.show(KettleDialog.java:47)
at org.pentaho.di.ui.core.database.dialog.XulDatabaseDialog.open(XulDatabaseDialog.java:116)
at org.pentaho.di.ui.core.database.dialog.DatabaseDialog.open(DatabaseDialog.java:60)
at org.pentaho.di.ui.trans.step.BaseStepDialog.showDbDialogUnlessCancelledOrValid(BaseStepDialog.java:779)
at org.pentaho.di.ui.trans.step.BaseStepDialog$AddConnectionListener.widgetSelected(BaseStepDialog.java:1374)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.trans.steps.tableinput.TableInputDialog.open(TableInputDialog.java:436)
at org.pentaho.di.ui.spoon.delegates.SpoonStepsDelegate.editStep(SpoonStepsDelegate.java:127)
at org.pentaho.di.ui.spoon.Spoon.editStep(Spoon.java:8789)
at org.pentaho.di.ui.spoon.trans.TransGraph.editStep(TransGraph.java:3179)
at org.pentaho.di.ui.spoon.trans.TransGraph.mouseDoubleClick(TransGraph.java:775)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1359)
at org.pentaho.di.ui.spoon.Spoon.waitForDispose(Spoon.java:7990)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:9290)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:685)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Driver class 'net.sourceforge.jtds.jdbc.Driver' could not be found, make sure the 'MS SQL Server' driver (jar file) is installed.
net.sourceforge.jtds.jdbc.Driver


at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:515)
at org.pentaho.di.core.database.Database.normalConnect(Database.java:456)
... 55 more
Caused by: java.lang.ClassNotFoundException: net.sourceforge.jtds.jdbc.Driver
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:490)
... 56 more


Hostname: GNM8102
Port: 1433
Database name: Comcast
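The root cause is the ClassNotFoundException at the bottom of the trace: the jTDS jar is not on Spoon's classpath. Copying the jTDS jar (e.g. jtds-1.3.x.jar; the version is an assumption) into data-integration/lib and restarting Spoon is the usual fix. A minimal check that the driver class is visible:

Code:

// Verifies the jTDS driver class can be loaded from the classpath, e.g.:
//   java -cp .:jtds-1.3.1.jar DriverCheck
public class DriverCheck {
    public static void main(String[] args) {
        try {
            Class.forName("net.sourceforge.jtds.jdbc.Driver");
            System.out.println("jTDS driver found");
        } catch (ClassNotFoundException e) {
            System.out.println("jTDS driver missing: " + e);
        }
    }
}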

Metadata Injection MongoDB document load...

Hi...

I've been reading up on and playing with Metadata Injection on PDI 7.0 Community Edition. The examples I've seen (e.g. https://help.pentaho.com/Documentati...data_Injection and the sample at %PDI%\data-integration\samples\transformations\meta-inject) both hard-code the possible fields somewhere (e.g. the "Fields" step in use_metainject_step.ktr from the second example).

I'm trying to do something a little simpler. Can I use Metadata Injection in PDI (7.0 minimum?) to dynamically load into MongoDB any flat file that meets certain criteria, for example pipe-delimited with a header row? That is, I'd like the same PDI transformation to load a flat file with 10 columns as well as another with 500 columns, assuming both are pipe-delimited with a header row. My criteria are simple (see the sketch after this list):
  1. Parse the variable number of field names from row 1 and set every datatype to text/string.
  2. Load rows 2+ as text/string into the columns created above. (I don't care about numbers, dates, etc.)
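Parsing the header row (step 1) is the piece you script around the injection; a minimal sketch of it, assuming a pipe-delimited file (the file name is hypothetical):

Code:

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.*;

// Reads the header row of a pipe-delimited file and emits the (name, type)
// pairs you would inject into the template transformation's field list.
public class HeaderFields {
    public static void main(String[] args) throws IOException {
        Path file = Paths.get("input.psv"); // hypothetical input file
        try (BufferedReader r = Files.newBufferedReader(file)) {
            String header = r.readLine();
            for (String name : header.split("\\|", -1)) {
                // Every column is injected as String, per the requirement.
                System.out.println(name.trim() + " -> String");
            }
        }
    }
}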

Here's the problem I'm trying to solve. For years, people have been asking the makers of MongoDB to add a command-line parameter to mongoimport that would allow any delimiter, as opposed to only supporting JSON, CSV, and tab-delimited input. That request has fallen on deaf ears. I'm hoping that PDI 7.0+ with Metadata Injection can solve this problem, as converting the input flat files to a different format (e.g. tab-delimited) is not an option for us (we don't change incoming data, and we're talking about 100k+ files).

I'm not asking anyone to do this work for me; I just want to know whether it is possible and whether anyone has done it. It sounds simple enough, but I haven't found such an example online, especially since PDI 7.0 and its engine-wide support for Metadata Injection are so new. Thanks!

Postgres field transformation changed to lower case

I created the PostgreSQL query below:

select b.idm_id as goal_IDMID, a.goal_year::text, a.goal_number as goal_no, a.goal_type,
       a.goal as goal_content, a.progress as goal_progress, a.created_at, a.updated_at
from kmdata.ext_goal_and_progress a
join kmdata.user_identifiers b on a.user_id = b.user_id
order by b.idm_id, a.goal_year, a.goal_type, a.goal_number

The first field alias is goal_IDMID, but when I run Preview it comes back as goal_idmid; no clue why the case changed to lower. Any idea how to get around this issue? I need the final field to be goal_IDMID, not goal_idmid.
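For what it's worth, PostgreSQL folds unquoted identifiers to lowercase; double-quoting the alias preserves its case:

Code:

-- Double-quoted aliases keep their exact case in PostgreSQL.
select b.idm_id as "goal_IDMID", a.goal_year::text, ...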

Thanks
Greg

Python option in script engine in Super Script not found

Hello friends,

I need help: I have the SuperScript step and the Python jar files, but Python does not appear in the step's Script Engines dropdown. Am I missing some setup or initialization step? Please help; this is kind of urgent.
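If SuperScript discovers engines through the standard JSR-223 ScriptEngineManager (an assumption; check the plugin's documentation), Python only appears when a Python engine jar, such as the Jython standalone jar, is on PDI's classpath (e.g. in data-integration/lib). A quick way to list what the JVM actually sees:

Code:

import javax.script.ScriptEngineFactory;
import javax.script.ScriptEngineManager;

// Prints every JSR-223 script engine visible on the current classpath.
public class ListEngines {
    public static void main(String[] args) {
        for (ScriptEngineFactory f : new ScriptEngineManager().getEngineFactories()) {
            System.out.println(f.getEngineName() + " -> " + f.getNames());
        }
    }
}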

Compatibility Report designer from 3.8 to 6.1

Hello there. I updated the BI server from 4.3 (I think) to 6.1, and PRD from 3.8 to 6.1 as well.

I opened all the reports designed with 3.8 in version 6.1 and published them to the 6.1 server. Most of them work fine, but some have a problem: the page stays blank and keeps loading.

I read something about layout issues, so I redesigned one report from scratch and published it, with the same error (this report includes a sub-report; I don't know if that matters).

What could the error or bug be? In Report Designer the reports work fine; once on the server, I can't see them.

[6.1 to 7.0] Upgrade - CE

Hello,

1) I would like to upgrade my BI server 6.1 CE to BI server 7.0.25 CE. How can I do that?
Is there an upgrade script, or do I need to install from scratch?

2) I can only find the 7.0.25 CE version; I can't find a 7.0.1 CE version.

Thank you

CDA wrong MDX results

Hi,

I'm having a strange issue with CDA. When I execute an MDX query through CDA, some rows come back with different results than Saiku, which returns the correct values. I'm clearing the cache first and using exactly the same parameters.

As this is a production server, I don't have MDX or SQL logging enabled, and I haven't restarted it either.

Any clues?

Regards

Remote R Script Execution on MS SQL 2016

Hey everyone,

I'm new to the Pentaho platform and am currently working on my master's thesis, for which I would like to use Pentaho as my BI tool.
I would like to know whether it is possible to execute remote R scripts on MS SQL Server 2016 from Pentaho and get the results back for use in PDI. The basic idea is to push the R code down to the database and transfer only the results of the R script back to PDI. Within SQL Server 2016 it is possible to generate T-SQL stored procedures containing R code. I saw that it is possible to execute database stored procedures with PDI, but I'm not sure whether that works with T-SQL procedures that return R results.

I appreciate any help.
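For what it's worth, SQL Server 2016 runs R through sp_execute_external_script, which a stored procedure can wrap so the R output comes back as an ordinary result set. A hedged T-SQL sketch (the procedure name and columns are made up, and the instance needs 'external scripts enabled'):

Code:

-- The R data.frame is returned as a result set, which PDI's Table Input
-- step should then be able to read via: EXEC dbo.RunRScript
CREATE PROCEDURE dbo.RunRScript
AS
BEGIN
    EXEC sp_execute_external_script
        @language = N'R',
        @script   = N'OutputDataSet <- data.frame(x = 1:3, y = (1:3)^2)'
    WITH RESULT SETS ((x INT, y INT));
END;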

nested if statement in the formula step

Hi,

I'm trying to implement an "if and" condition in the Formula step in Pentaho Data Integration.
Basically, if Region is "Americas - North America" and IHC Product is not null, I want to set a new field to "Other US"; otherwise leave it blank.
My formula is as follows:
if(and[REGION]="Americas -North America";and([IHC Product]=""));"Other US;"")
It looks like Pentaho does not recognize null as "". I also tried [IHC Product] = null, but it does not recognize the null either.
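Two things stand out: the parentheses in the formula above are unbalanced, and libformula (the engine behind the Formula step) has no null literal; ISBLANK() is the usual null test. A corrected version, assuming standard libformula functions:

Code:

IF(AND([REGION]="Americas - North America";NOT(ISBLANK([IHC Product])));"Other US";"")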

Any ideas how to get this working?

BI Server Start Error - Unable to render Home Directory

Hi,

I am using Pentaho 5.0.1 CE. I just restarted my UAT server and suddenly got the error below. Can you please tell me how I can quickly fix this issue?

2017-01-05 23:15:22,945 ERROR [org.springframework.extensions.jcr.JcrSessionFactory] Error registering nodetypes
2017-01-05 23:15:37,205 ERROR [org.pentaho.platform.util.logging.Logger] Error: Pentaho
2017-01-05 23:15:37,205 ERROR [org.pentaho.platform.util.logging.Logger] misc-org.pentaho.platform.engine.services.audit.AuditSQLEntry: Table 'hibernate.pro_audit' doesn't exist
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'hibernate.pro_audit' doesn't exist
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
at com.mysql.jdbc.Util.getInstance(Util.java:386)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1052)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3597)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3529)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1990)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2151)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2625)
at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2119)
at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2415)
at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2333)
at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2318)
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
at org.pentaho.platform.engine.services.audit.AuditSQLEntry.auditAll(AuditSQLEntry.java:138)
at org.pentaho.platform.engine.core.audit.AuditEntry.auditAll(AuditEntry.java:60)
at org.pentaho.platform.engine.core.audit.AuditEntry.auditJobDuration(AuditEntry.java:49)
at org.pentaho.platform.engine.core.audit.AuditHelper.audit(AuditHelper.java:91)
at org.pentaho.platform.engine.services.solution.SolutionEngine.execute(SolutionEngine.java:266)
at org.pentaho.platform.engine.services.solution.SolutionEngine.execute(SolutionEngine.java:184)
at org.pentaho.platform.engine.core.system.PentahoSystem.globalStartup(PentahoSystem.java:868)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:825)
at org.pentaho.platform.engine.core.system.PentahoSystem$2.call(PentahoSystem.java:822)
at org.pentaho.platform.engine.security.SecurityHelper.runAsSystem(SecurityHelper.java:333)
at org.pentaho.platform.engine.core.system.PentahoSystem.globalStartup(PentahoSystem.java:822)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:282)
at org.pentaho.platform.engine.core.system.PentahoSystem.init(PentahoSystem.java:182)
at org.pentaho.platform.web.http.context.SolutionContextListener.contextInitialized(SolutionContextListener.java:136)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4206)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4705)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:601)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:675)
at org.apache.catalina.startup.HostConfig.deployDescriptors(HostConfig.java:601)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:502)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:840)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
2017-01-05 23:15:37,205 ERROR [org.pentaho.platform.util.logging.Logger] Error end:
2017-01-05 23:15:37,205 ERROR [org.pentaho.platform.engine.services.solution.SolutionEngine] Error Start: Pentaho Pentaho Platform Core 5.0.1-stable.-1
2017-01-05 23:15:37,205 ERROR [org.pentaho.platform.engine.services.solution.SolutionEngine] c41c2183-d36e-11e6-8ce1-7c5cf86a154e:SOLUTION-ENGINE:/public/bi-developers/Secure/global-department-list.xaction: com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: Can't call rollback when autocommit=true
org.pentaho.platform.api.engine.AuditException: com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: Can't call rollback when autocommit=true
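The failing piece is the audit logger: the PRO_AUDIT table is missing from the hibernate database. A quick way to confirm, before recreating it from the SQL scripts shipped with the server (which script creates it varies by version, so that part is an assumption):

Code:

-- Confirms whether the audit table exists in the hibernate schema.
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'hibernate' AND table_name = 'pro_audit';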

array of dates on x axis

Hi everyone!

I'm using Pentaho Report Designer 5.0.1.
I'm trying to create a line chart with a range of days, from the current date to the current date + 30 days, on the x axis.

Any idea of how this can be done?
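One way is to generate the 31 dates yourself (e.g. in a scriptable or table data source, which is an assumption about your report setup) and feed them to the chart as the category column. A minimal sketch of the date generation:

Code:

import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Calendar;
import java.util.List;

// Builds the dates from today through today + 30 as yyyy-MM-dd strings.
public class DateRange {
    public static void main(String[] args) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd");
        Calendar c = Calendar.getInstance();
        List<String> days = new ArrayList<String>();
        for (int i = 0; i <= 30; i++) {
            days.add(fmt.format(c.getTime()));
            c.add(Calendar.DAY_OF_MONTH, 1);
        }
        System.out.println(days);
    }
}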

I'll be grateful for any input.

Thanks!

kitchen hang; connection pool deadlock

On 6.0.1, I ran into http://jira.pentaho.com/browse/PDI-14882.
On 6.1.0.1, Kettle just hangs when executing via Kitchen. I used jstack to extract stack traces (attached).

The stuck threads all look like this:

Code:

2017-01-05 13:39:17
Full thread dump Java HotSpot(TM) 64-Bit Server VM (25.112-b15 mixed mode):


"init of testresult.0 (Thread-58)" #87 prio=5 os_prio=0 tid=0x0000000021ce3000 nid=0x3600 in Object.wait() [0x0000000029b4f000]
  java.lang.Thread.State: WAITING (on object monitor)
        at java.lang.Object.wait(Native Method)
        at java.lang.Object.wait(Unknown Source)
        at org.apache.commons.pool.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:1115)
        - locked <0x00000000e94c6a98> (a org.apache.commons.pool.impl.GenericObjectPool$Latch)
        at org.apache.commons.dbcp.PoolingDataSource.getConnection(PoolingDataSource.java:106)
        at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044)
        at org.pentaho.di.core.database.ConnectionPoolUtil.getConnection(ConnectionPoolUtil.java:101)
        at org.pentaho.di.core.database.ConnectionPoolUtil.getConnection(ConnectionPoolUtil.java:87)
        at org.pentaho.di.core.database.Database.normalConnect(Database.java:439)
        at org.pentaho.di.core.database.Database.connect(Database.java:364)
        - locked <0x00000000e94c45a0> (a org.pentaho.di.core.database.Database)
        at org.pentaho.di.core.database.Database.connect(Database.java:335)
        at org.pentaho.di.trans.steps.tableinput.TableInput.init(TableInput.java:330)
        at org.pentaho.di.trans.step.StepInitThread.run(StepInitThread.java:69)
        at java.lang.Thread.run(Unknown Source)

This is with the Microsoft SQL Server JDBC driver, and it happens every time the job runs. Any ideas?
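The dump shows every step-init thread parked inside GenericObjectPool.borrowObject: the connection pool is exhausted and configured to block, so step initialization deadlocks once more steps need connections than the pool's maximum. A standalone sketch of that state, assuming commons-pool 1.6 (the library in the trace):

Code:

import org.apache.commons.pool.BasePoolableObjectFactory;
import org.apache.commons.pool.impl.GenericObjectPool;

// With maxActive=1 and WHEN_EXHAUSTED_BLOCK, a second borrowObject() waits
// forever unless the first object is returned -- the state in the jstack dump.
public class PoolHang {
    public static void main(String[] args) throws Exception {
        GenericObjectPool<Object> pool = new GenericObjectPool<Object>(
                new BasePoolableObjectFactory<Object>() {
                    @Override
                    public Object makeObject() {
                        return new Object();
                    }
                });
        pool.setMaxActive(1);
        pool.setWhenExhaustedAction(GenericObjectPool.WHEN_EXHAUSTED_BLOCK);
        Object first = pool.borrowObject();
        Object second = pool.borrowObject(); // blocks here, like the trace
    }
}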


Accessing the User Console is very slow

As the title says, accessing the User Console is very slow.
If I restart Pentaho, it becomes responsive again,
but after a period of time the problem comes back.

use batch update for inserts issue with postgres 9.6

Hi All,

I am facing a weird issue with the "Use batch update for inserts" option of the Table Output step with a commit size of 5000 (I tried changing the commit size, but it made no difference).
I am using PostgreSQL as the database server, version 9.6.1.
The transformation simply reads data from a Postgres database and loads it into a PostgreSQL database.
The same transformation works fine on version 9.4.5; the issue only appears on 9.6.
When I disable the batch insert option, the transformation works fine on 9.6,
but with the batch option disabled, performance drops noticeably.
Please share your ideas or workarounds if anyone has hit a similar issue.
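For isolating this outside PDI: the batch option corresponds to plain JDBC batching, roughly as below (URL, credentials, and table are placeholders). If this sketch also misbehaves against 9.6 with your driver jar, the problem sits in the driver or server rather than in Kettle:

Code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Re-creates what "use batch update for inserts" does: addBatch() up to the
// commit size, then executeBatch() and commit.
public class BatchInsertTest {
    public static void main(String[] args) throws SQLException {
        try (Connection con = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/testdb", "user", "pass")) {
            con.setAutoCommit(false);
            try (PreparedStatement ps =
                     con.prepareStatement("INSERT INTO t (col) VALUES (?)")) {
                for (int i = 0; i < 5000; i++) { // commit size
                    ps.setInt(1, i);
                    ps.addBatch();
                }
                ps.executeBatch();
            }
            con.commit();
        }
    }
}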

Thanks in advance.

Regards,
Bhanu.

CST plugin problem rendering a particular dashboard based on user

Hi ,

I have configured the CST plugin to render a particular dashboard for particular users. It works, but when a user first logs in, they land on the Pentaho User Console and are only then redirected to their dashboard, which sometimes takes more than 3-4 minutes.

Is there an alternative method or configuration change so that users are redirected directly to the dashboard instead of going through the User Console first? Below is the configuration I have in cst/resources/config.xml:

<rule match="USER" pattern="false" value="user1">
  <tab title="sample Dashboard" fullScreen="true" tooltip="A CDE sample"><![CDATA[api/repos%3Apublic%3ASS%3ADashboard%3Awelcome_dashboard.wcdf/generatedContent]]></tab>
</rule>

ElasticSearch Bulk step

How can I insert data into an Elasticsearch server as nested documents using the PDI Elasticsearch Bulk Insert step?
Elasticsearch version: 1.5.2
Pentaho 6.0
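The step's field mapping is flat, so one common workaround (hedged: it depends on your step version offering a JSON/document field option) is to build the whole nested document as a JSON string in an upstream step and index that single field. The shape of the string you would build, with made-up field names:

Code:

// Nested document built as a plain JSON string; an upstream step (e.g. a
// User Defined Java Class) would emit this into one field for the bulk step.
public class NestedDoc {
    public static void main(String[] args) {
        String json = "{\"order_id\":42,"
                    + "\"customer\":{\"name\":\"Ada\",\"city\":\"Turin\"}}";
        System.out.println(json);
    }
}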

Fixed Measure - descendant from a dimension

I have to produce a report with some measures / dimensions like the following:


+------------+----------+----------------+-----------+--------------+
| date       | province | active_clients | no. sales | total_stores |
+------------+----------+----------------+-----------+--------------+
|            | AAAA     |            500 |       100 |           80 |
| 2017-01-04 | BBBB     |            275 |       200 |           88 |
|            | CCCC     |            999 |       203 |           89 |
+------------+----------+----------------+-----------+--------------+


Both measures, active_clients and no. sales, are calculated from fact tables.
The last one, total_stores, is calculated from the store dimension alone: it is not related to the date dimension and does not depend on that day's facts (it is the total number of stores we have, whether or not they sold anything that day).

I've searched online and can't find a way to model this with Pentaho Schema Workbench.

Can anyone help me, please?
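One Mondrian pattern that may fit (a sketch under assumptions: I don't know your schema, and the table, column, and cube names below are made up) is a second cube whose fact table is the store dimension itself, with a count measure, combined with the sales cube in a VirtualCube so that total_stores is independent of the date dimension:

Code:

<!-- Dimension-as-fact cube providing the fixed measure. -->
<Cube name="Stores">
  <Table name="dim_store"/>
  <Measure name="total_stores" column="store_id" aggregator="count"/>
</Cube>

<!-- Combine it with the real sales cube. -->
<VirtualCube name="SalesAndStores">
  <VirtualCubeDimension cubeName="Sales" name="Province"/>
  <VirtualCubeMeasure cubeName="Sales" name="[Measures].[active_clients]"/>
  <VirtualCubeMeasure cubeName="Stores" name="[Measures].[total_stores]"/>
</VirtualCube>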


Thank you,

National characters not displayed when report is generated as PDF

Hello, good folks,

I have a problem that I think has been discussed many times in the past, but so far I have not seen any solution to it (and the threads I found are mostly several years old). We are running a Pentaho CE 5.1 server with some reports. If we generate them as HTML or Excel, everything is fine, but when they are exported as PDF, the national characters (č, ž, é, ů and so on) are either left out completely, with no blank space in their place, as if they were typos (observed by a colleague on Windows 7), or strange characters are printed instead (observed by me on Linux).

Somewhere on StackOverflow I found a suggestion that the fonts have to be installed on the server for the PDF to render correctly. So we tried this, but to no avail.

Everything in the report configuration is left at default values, and the classic-engine.properties file on the server is also unchanged. We also run a Pentaho 5.0.1 instance where this doesn't happen. Both servers run some version of Debian, but there may be some differences in their configuration.

Please, if you have an idea what could be wrong, can you point me in the right direction?
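If anyone wants to compare configurations: the reporting engine's PDF output module has font-embedding and encoding settings in classic-engine.properties. The key names below are from memory and the values are assumptions, so verify them against your engine version before relying on this:

Code:

# Embed fonts into the generated PDF and use a Unicode-capable encoding.
org.pentaho.reporting.engine.classic.core.modules.output.pageable.pdf.EmbedFonts=true
org.pentaho.reporting.engine.classic.core.modules.output.pageable.pdf.Encoding=Identity-H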