Channel: Pentaho Community Forums

Zip format with size =0

Hi experts,

I am using PDI CE 6.0, Java 1.7 and a MySQL database.
Currently I am able to zip all files, including files with size = 0, but I don't want those.

I am facing an issue with the Zip step: I want to zip a folder but skip the files whose size is 0 and zip only those with size > 0. Could you please help me with how I can achieve this?
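What I was thinking of trying (just a sketch, I don't know whether the Zip step supports this directly) is to build the file list first and drop the zero-byte entries before zipping, for example a "Get File Names" step followed by a scripting or Filter Rows step on its size field:

Code:

// Sketch for a "Modified Java Script Value" step placed after "Get File Names".
// It assumes the incoming rows carry the numeric "size" field that Get File Names
// produces, and emits a keep_file flag (declared as a new String output field).
// A Filter Rows step on keep_file = "Y" could then pass only non-empty files on to
// "Set files in result" and the Zip step.
var keep_file;
if (size > 0) {
    keep_file = "Y";   // file has content, keep it
} else {
    keep_file = "N";   // zero-byte file, skip it
}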

(Attached image: zip.jpg)

thanks,
santhi

PostgreSQL Bulk Loader

I want to use the PostgreSQL Bulk Loader. Can you help me?
What should I put in "Path to the psql Client" and "DB Name Override"?

PRD query scripting

I'm working on my first report in PRD (v6.0) and can't seem to find much information on how to use query scripting. I can get a basic report to work (by entering the query in the "Static Query" tab), but our goal is to use parameters. The idea is to present several user options first and, based on the answers, build the specific query. But even without getting into the additional user options, I'm having trouble getting anything to display on the report.

So I started by creating a new JDBC data source, and in the "Query Scripting" tab I wrote code such as this...

function computeQuery (query, queryName, dataRow) {
    query = "SELECT fullName, totLoadCount, totLoadGP....FROM tblX....WHERE blah blah blah";
    return query;
}

When I hit the preview button at the bottom, it shows my data. As of now, I am not using parameters, but once I get things working, I'd like to adjust the query in code (by changing around the WHERE clauses). For example, if the user wants to exclude inactive employees, then in the code I would add something like this...

if (dataRow.get("prmExcludeActiveEmpsOption") == "yes") {
    query += " AND isActive = 1";
}
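Putting the static query and the parameter check together, I expect the whole script would look roughly like this (the table, column and parameter names are just the placeholders from above, and I'm assuming parameters are read from the dataRow argument via dataRow.get("name")):

Code:

// Rough sketch only: adjust the table, column and parameter names to the real schema.
function computeQuery (query, queryName, dataRow) {
    var sql = "SELECT fullName, totLoadCount, totLoadGP FROM tblX WHERE 1 = 1";

    // Report parameters are assumed to be exposed on the dataRow argument.
    var excludeOption = dataRow.get("prmExcludeActiveEmpsOption");
    if (excludeOption != null && excludeOption == "yes") {
        sql += " AND isActive = 1";
    }

    return sql;
}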

But my problem is that, without even using any parameters, I can't get any of the fields in my query to show up on the report. I dragged a text field onto the report and in the attributes panel for "field" I typed "fullName" (the first field in my select statement), yet nothing shows up when running the report. So why does the preview work fine while I'm not able to get these fields to display on the report? I have done a lot of searching on the internet for help with PRD, but it seems there is very little. If anyone knows where I can find additional information on PRD, that would be much appreciated as well.

Thanks in advance for any help.
Wes

Using resource bundles with variable names

This is how I use the resource-fields at the moment. In the attribute 'field' I have the field name from the SQL query and in the attribute 'resource-identifier' I construct the resource-bundle file name as follows: '=CONCATENATE("/opt/pentaho/ktr/reports/translations/";[PRM_DBNAME])'. This works just fine.

Now I want to capitalize the first character of the translated text and I'm having trouble finding a solution for that. I tried using the ResourceBundleLookup function to do the translation and then use that later on to capitalize the first character. This works if I put in a fixed file name for the resource bundle, but I need to build that file name dynamically with the database name as last node. However this does not seem to work with the ResourceBundleLookup function.
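The capitalization part itself would presumably be something along the lines of '=CONCATENATE(UPPER(MID([translatedText];1;1));MID([translatedText];2;LEN([translatedText])))' (assuming the UPPER, MID and LEN text functions are available in the report's formula language), where [translatedText] stands for whatever field or lookup result holds the translation; the part I cannot get working is feeding a dynamically built bundle name into the lookup.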

Any suggestions how I could solve this?

Thanks!

Drilldown in Saiku widget CDE

Hello, I have a query built in Saiku and saved in a .saiku file. I assigned this query to the Saiku widget in CDE and it displays correctly. The problem I find is that I cannot drill down from there to other levels of the hierarchy, whereas from the original query in Saiku I can. Is there any way to drill down from the Saiku widget in CDE?
Thanks and regards.

HTTP Status 404

Mining the metadata repository tables

I want to understand the metadata repository tables and their purpose [in the database/XML].
Is there any documentation available for this?

The purpose is to create a dashboard for performance/error display.

Groovy script in Pentaho Report

Hi all, I am a newbie with Pentaho in general. I am trying to consume a REST web service from a report in Report Designer, without using PDI, through this Groovy script:

Code:

@Grab('org.codehaus.groovy.modules.http-builder:http-builder:0.7')
import groovyx.net.http.RESTClient
import static groovyx.net.http.ContentType.*
import org.pentaho.reporting.engine.classic.core.util.TypedTableModel;


def ARIS = new RESTClient('https://LOCALHOST:8898/svc/')
def resp = ARIS.post(
    path: 'DBService.SchGetCommCodesFromUser/',
    requestContentType: JSON,   // JSON constant comes from the static ContentType import
    body: [6])


assert resp.status == 200
assert resp.headers.Status
assert (resp.data.result instanceof List)


String[] columnNames = new String[3];
columnNames[0] = "SKILL";
columnNames[1] = "CODE_ID";
columnNames[2] = "CODEDESCRIP";


Class[] columnTypes = new Class[3];
columnTypes[0] = Integer.class;
columnTypes[1] = Integer.class;
columnTypes[2] = String.class;


TypedTableModel model = new TypedTableModel(columnNames, columnTypes);
resp?.data.result[0].each { res ->
  model.addRow( [res.SKILL, res.CODE_ID, res.CODEDESCRIP] as Object[] )
}
return model;

and I am having this error:


[ 783093] ERROR - org.pentaho.reporting.designer.core.util.exceptions.UncaughtExceptionsModel - Unexpected Error encountered:
java.lang.NoClassDefFoundError: org/apache/ivy/core/report/ResolveReport

and this is the stack trace:

at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Unknown Source)
at java.lang.Class.privateGetPublicMethods(Unknown Source)
at java.lang.Class.getMethods(Unknown Source)
at java.beans.Introspector.getPublicDeclaredMethods(Unknown Source)
at java.beans.Introspector.getTargetMethodInfo(Unknown Source)
at java.beans.Introspector.getBeanInfo(Unknown Source)
at java.beans.Introspector.getBeanInfo(Unknown Source)
at groovy.lang.MetaClassImpl$15.run(MetaClassImpl.java:3289)
at java.security.AccessController.doPrivileged(Native Method)
at groovy.lang.MetaClassImpl.addProperties(MetaClassImpl.java:3287)
at groovy.lang.MetaClassImpl.initialize(MetaClassImpl.java:3263)
at org.codehaus.groovy.reflection.ClassInfo.getMetaClassUnderLock(ClassInfo.java:254)
at org.codehaus.groovy.reflection.ClassInfo.getMetaClass(ClassInfo.java:285)
at groovy.grape.GrapeIvy.$getStaticMetaClass(GrapeIvy.groovy)
at groovy.grape.GrapeIvy.<init>(GrapeIvy.groovy:81)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at java.lang.Class.newInstance(Unknown Source)
at groovy.grape.Grape.getInstance(Grape.java:121)
at groovy.grape.Grape.grab(Grape.java:159)
at groovy.grape.GrabAnnotationTransformation.visit(GrabAnnotationTransformation.java:378)
at org.codehaus.groovy.transform.ASTTransformationVisitor$3.call(ASTTransformationVisitor.java:321)
at org.codehaus.groovy.control.CompilationUnit.applyToSourceUnits(CompilationUnit.java:931)
at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:593)
at org.codehaus.groovy.control.CompilationUnit.processPhaseOperations(CompilationUnit.java:569)
at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:546)
at groovy.lang.GroovyClassLoader.doParseClass(GroovyClassLoader.java:298)
at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:268)
at groovy.lang.GroovyShell.parseClass(GroovyShell.java:688)
at groovy.lang.GroovyShell.parse(GroovyShell.java:700)
at groovy.lang.GroovyShell.evaluate(GroovyShell.java:584)
at groovy.lang.GroovyShell.evaluate(GroovyShell.java:623)
at groovy.lang.GroovyShell.evaluate(GroovyShell.java:604)
at org.codehaus.groovy.bsf.GroovyEngine.eval(GroovyEngine.java:95)
at org.apache.bsf.BSFManager$5.run(BSFManager.java:445)
at java.security.AccessController.doPrivileged(Native Method)
at org.apache.bsf.BSFManager.eval(BSFManager.java:442)
at org.pentaho.reporting.engine.classic.extensions.datasources.scriptable.ScriptableDataFactory.queryData(ScriptableDataFactory.java:156)
at org.pentaho.reporting.engine.classic.core.AbstractDataFactory.queryDesignTimeStructure(AbstractDataFactory.java:83)
at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryDesignTimeStructStaticInternal(CompoundDataFactory.java:143)
at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryDesignTimeStructureStatic(CompoundDataFactory.java:127)
at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryDesignTimeStructure(CompoundDataFactory.java:182)
at org.pentaho.reporting.engine.classic.core.cache.CachingDataFactory.queryDesignTimeStructureInternal(CachingDataFactory.java:455)
at org.pentaho.reporting.engine.classic.core.cache.CachingDataFactory.queryDesignTimeStructure(CachingDataFactory.java:326)
at org.pentaho.reporting.engine.classic.core.designtime.DesignTimeDataSchemaModel.queryReportData(DesignTimeDataSchemaModel.java:192)
at org.pentaho.reporting.engine.classic.core.designtime.DesignTimeDataSchemaModel.buildDataSchema(DesignTimeDataSchemaModel.java:138)
at org.pentaho.reporting.engine.classic.core.designtime.DesignTimeDataSchemaModel.ensureDataSchemaValid(DesignTimeDataSchemaModel.java:98)
at org.pentaho.reporting.engine.classic.core.designtime.DesignTimeDataSchemaModel.getDataSchema(DesignTimeDataSchemaModel.java:89)
at org.pentaho.reporting.designer.core.model.data.QueryMetaDataActorImpl.retrieve(QueryMetaDataActorImpl.java:33)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at akka.actor.TypedActor$MethodCall.apply(TypedActor.scala:147)
at akka.actor.TypedActor$TypedActor$$anonfun$receive$1$$anonfun$applyOrElse$2.apply(TypedActor.scala:311)
at akka.actor.TypedActor$TypedActor.withContext(TypedActor.scala:299)
at akka.actor.TypedActor$TypedActor$$anonfun$receive$1.applyOrElse(TypedActor.scala:306)
at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
at akka.actor.TypedActor$TypedActor.aroundReceive(TypedActor.scala:246)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.ClassNotFoundException: org.apache.ivy.core.report.ResolveReport
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
... 71 more


I am using Pentaho 6.0.1.386.

Any ideas?

Thanks in advance.

Esteban

How can Kettle/PDI run a report/PRPT on another server?

Hello all

Using the sample "Pentaho Reporting Output Example.ktr" as a guide, I've been trying different syntax to retrieve the PRPT directly from another server in my network; we've separated the PDI and BI Servers onto their own (virtual) servers for performance reasons. So from a KTR on MyPDIServer we need to reach a PRPT on MyReportServer.

In the Pentaho Reporting File Output step, setting the Report Destination File value to something like "http://MyReportServer:8080/pentaho/biserver-ce/pentaho-solutions/MyApplications/MyReport.prpt" appears to reach the report but returns an error about "Unable to parse the document: ResourceKey..." If I change the URL I get a "file not found" error which leads me to think that the current value is correct. Other than the fact that it doesn't work, of course.

Copying the PRPT file onto the PDI server does work; however, it's not good practice to host copies of the report that would need to be kept in sync.

I'm sure that I can open up a share to the containing folder but this is not optimal from a network security perspective.

Has anyone run reports across servers? Is there any more information I can provide?

Thanks in advance for any assistance
- Russell

Need a certain field's count as an addable field itself in Interactive Reports

The report is structured roughly as follows:
I have
(1) the organization type, then
(2) the number of forms filed by all of the organizations of that type, then
(3) the tier of the form (there are two tiers in total), then
(4) the number of requests made in the forms, then
(5) the dollar amount being doled out to the organizations of that organization type for all of those forms.

NOW, in Pentaho's Interactive Reports, I wish to do the following:
There is a field that is a unique identifier for each organization. Since there are many organizations in one organization type, I want a field that is a count of the number of organizations corresponding to that specific tier and organization type. I wish to insert this field between fields (2) and (3). Please note that this is a distinct count of the number of organizations that correspond to that organization type and tier of form.

Mondrian large dimensions pre filter

Hi there,


I'm having a problem with the following situation: we have 3 dimensions in a cube with many members (around 10k).

When we use the 3 dimensions together in a Saiku view (Pentaho CE 5.4), even though we filter on some members of one of the dimensions, Mondrian requests all possible combinations of all dimension members from the DB, presumably for caching purposes. The query itself returns quickly from the DB, but we then hit a Mondrian timeout because it keeps working through all the combinations.

The question is: is it possible to tell Mondrian to work only with the members that are actually being filtered on?


mondrian.properties has not been changed.


Regards!

Export all transforms individually from repository

I am trying to export each individual transformation and job from the repository to my local machine. Currently I've only been able to do a mass export, which gives me the repository back as one large XML file that would then need further parsing to break it into smaller chunks of XML.

Ideally I would like a transformation/job that takes the repository connection as input and spits out the individual transformations, still within their folder structure, on the other end, whether that is a single tool within Pentaho or a full job that separates the XML files and brings them back to my local machine.

Does anyone have any ideas on how to accomplish this? I've played with "Export repository to XML file" and was able to get it to break things down to just the public and home folders, but inside those were still large XML files.

Pentaho crosstab

Hi All
I am still new to this, so please assist: I need to create a crosstab, and it should display like the layout below. Please help.
SERVICE TYPE     0-1KG   1-2KGS   2-5KGS   5-10KGS   10-20KGS   20-30KGS   30-40KGS   40-50KGS   50KGS OR MORE
Economy
  201503            18        3        2         0          0          0          0          0               0
  SUBTOTAL          18        3        3         0          0          0          0          0               0
ONX Central
  201503             0        0        0         0          0          0          0          0               0
  SUBTOTAL           9        4       10         0          0          1          1          0               0
ONX Regional
  201503             0        0        0         0          0          0          0          0               0
  SUBTOTAL           0        0        3         0          0          0          0          0               0
ONX Township
  201503             0        0        0         0          0          0          0          0               0
  SUBTOTAL           0        0        0         0          0          0          0          0               0
Road
  201503           566     2815     7689        10          0          0          0          0               0
  SUBTOTAL       17460    31598    97551      1310         17          8         15          2               4
Same Day
  201503             1        1        0         0          0          0          0          0               0
  SUBTOTAL           7       43        5         0          0          0          1          0               1
International
  201503             0        0        0         0          0          0          0          0               0
  SUBTOTAL           0        0        0         0          0          0          0          0               0
Unknown

Data integration Issue while running a report

I am facing the following issue in Data Integration while running a report. Can someone please help me with it? Please suggest how I can fix or debug it.

2016/03/31 16:50:21 - Generate Reports.0 - Caught Kettle Exception: Check your configuration
2016/03/31 16:50:21 - Generate Reports.0 -
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processReport(PentahoReportingOutput.java:260)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processRow(PentahoReportingOutput.java:114)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2016/03/31 16:50:21 - Generate Reports.0 - at java.lang.Thread.run(Unknown Source)
2016/03/31 16:50:21 - Generate Reports.0 - Caused by: org.pentaho.reporting.engine.classic.core.ReportDataFactoryException: Caught Kettle Exception: Check your configuration
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.KettleDataFactory.queryData(KettleDataFactory.java:118)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStaticInternal(CompoundDataFactory.java:205)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStatic(CompoundDataFactory.java:182)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.cache.CachingDataFactory.queryInternal(CachingDataFactory.java:505)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.cache.CachingDataFactory.queryStatic(CachingDataFactory.java:181)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStaticInternal(CompoundDataFactory.java:200)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryStatic(CompoundDataFactory.java:182)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.CompoundDataFactory.queryData(CompoundDataFactory.java:69)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.states.datarow.DefaultFlowController.performQueryData(DefaultFlowController.java:296)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.states.datarow.DefaultFlowController.performSubReportQuery(DefaultFlowController.java:374)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.states.process.ProcessState.initializeForSubreport(ProcessState.java:596)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.states.process.EndSubReportHandler.commit(EndSubReportHandler.java:59)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.states.process.ProcessState.commit(ProcessState.java:1059)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.processPrepareLevels(AbstractReportProcessor.java:437)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.performStructuralPreprocessing(AbstractReportProcessor.java:621)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.prepareReportProcessing(AbstractReportProcessor.java:509)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.layout.output.AbstractReportProcessor.processReport(AbstractReportProcessor.java:1713)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.modules.output.table.csv.CSVReportUtil.createCSV(CSVReportUtil.java:74)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.modules.output.table.csv.CSVReportUtil.createCSV(CSVReportUtil.java:120)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.core.modules.output.table.csv.CSVReportUtil.createCSV(CSVReportUtil.java:90)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.di.trans.steps.pentahoreporting.PentahoReportingOutput.processReport(PentahoReportingOutput.java:229)
2016/03/31 16:50:21 - Generate Reports.0 - ... 3 more
2016/03/31 16:50:21 - Generate Reports.0 - Caused by: org.pentaho.di.core.exception.KettleException:
2016/03/31 16:50:21 - Generate Reports.0 - We failed to initialize at least one step. Execution can not begin!
2016/03/31 16:50:21 - Generate Reports.0 -
2016/03/31 16:50:21 - Generate Reports.0 -
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.di.trans.Trans.prepareExecution(Trans.java:1149)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.AbstractKettleTransformationProducer.prepareTransformation(AbstractKettleTransformationProducer.java:327)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.AbstractKettleTransformationProducer.performQueryOnTransformation(AbstractKettleTransformationProducer.java:294)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.AbstractKettleTransformationProducer.performQuery(AbstractKettleTransformationProducer.java:270)
2016/03/31 16:50:21 - Generate Reports.0 - at org.pentaho.reporting.engine.classic.extensions.datasources.kettle.KettleDataFactory.queryData(KettleDataFactory.java:110)
2016/03/31 16:50:21 - Generate Reports.0 - ... 23 more
2016/03/31 16:50:21 - Generate Reports.0 - Finished processing (I=0, O=0, R=1, W=0, U=0, E=1)
2016/03/31 16:50:21 - run_prpt_csv - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Errors detected!
2016/03/31 16:50:21 - run_prpt_csv - Transformation detected one or more steps with errors.
2016/03/31 16:50:21 - run_prpt_csv - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Errors detected!
2016/03/31 16:50:21 - run_prpt_csv - Transformation is killing the other steps!
2016/03/31 16:50:21 - GL APPL - Finished job entry [GL_APPL] (result=[false])
2016/03/31 16:50:21 - GL APPL - Finished job entry [Set Environment Parameters] (result=[false])
2016/03/31 16:50:21 - GL APPL - Job execution finished
2016/03/31 16:50:21 - Spoon - Job has ended.


Thanks,
Megha

Query and output printing in spoon job

Hi

Can the logging mechanism be changed for Pentaho Data Integration (Spoon) jobs? Can we enable detailed logs where the query and its output are printed? Please tell me how you debug these reports.

Thanks,
Megha

Move multiple PDFs securely

Hi all,

I am currently developing a job that has to copy multiple PDF files to another location. At first I thought this would be easy, but then my boss came and said it wasn't good enough.
Here is my setup:

I have a Table Input step which collects all the document paths I need. These paths are put into the result with "Set filenames in result".
Then I have a job which moves all these files to a local folder. The next step zips all these files and puts the archive on an FTP server.

On the other side, a job picks up this ZIP file and unzips it. The client can then use the PDFs I send him.

Pretty easy right?

My boss just doesn't think this is secure enough. He would like me to lock the ZIP file with a (static) password, so that my client can use this password to unlock the zip.
He also suggested, if that isn't possible, locking every individual PDF with a (static) password and then sending the individual PDFs to my client over FTP.
I tried Google and I tried command-line programs, but nothing seems to work.

Does anyone have a suggestion for how I can accomplish this? Am I missing an option in Spoon?
I am using Pentaho 5.4 and my client also uses Pentaho 5.4.

Thanks in advance!

${Internal.Job.Repository.Directory} is empty in 6.0 version

I configured a job in the repository with the internal variable ${Internal.Job.Repository.Directory} to determine the current directory.

It works in the old version (5.2) but not in 6.0.

Even when I use the new internal variable ${Internal.Entry.Current.Directory}, it is empty too.

Does anyone know the cause or how to resolve it?

Thanks!

CCC2 stacked bar chart values do not show the complete text

Hello,
I am creating a dashboard with the stacked bar chart component.
I want to show the values (valuesMask: "{category} ({value.percent})") inside the bars, but the complete text is not shown even with the valuesOverflow property set to 'show'. I don't want to make the bars any larger.
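For reference, this is roughly what my value-label settings look like in the chart definition (just the properties mentioned above, shown as a sketch):

Code:

// Sketch of the CCC2 value-label options referred to above; these go into the
// stacked bar chart component's chart definition in CDE. "show" keeps labels
// visible even when they overflow the bar.
var chartOptions = {
    valuesVisible:  true,
    valuesMask:     "{category} ({value.percent})",
    valuesOverflow: "show"
};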

I want to show the complete text inside the bar as multi-line text.

Please, could you guys help me with this?

Thanks in advance...

Mukesh

New transformation

Can't access JIRA

So, I tried accessing jira.pentaho.com, and at first it showed up fine. I logged in... and now, for some weird reason, it loads community.pentaho.com, even though the browser shows the address jira.pentaho.com. Any ideas?