Channel: Pentaho Community Forums

Report per user role embedded on the home page of PUC

Hi Guys,

I want to display a pre-scheduled report (.html) on the home page of the PUC.

The difficulty is that each role should see its own report, so index.jsp would need to be modified to point to /home/$userRole/report.html, or to a folder on the file system, e.g. /tmp/$userRole/report.html.

Question: is there a way to read the user's role and then refer to the right report?

Thank you in advance.

Best regards,
Matthias
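A minimal sketch of the path-mapping part, assuming the /home/$userRole/report.html layout from the question (how the role itself is obtained in index.jsp, e.g. from the platform's security context, is left out here):

```javascript
// Map a user role to its role-specific report path.
// The /home/<role>/report.html layout comes from the question above;
// the "default" fallback is a hypothetical choice.
function reportPathForRole(role) {
  if (!role) {
    return "/home/default/report.html";
  }
  return "/home/" + encodeURIComponent(role) + "/report.html";
}
```

Once the role is known, index.jsp could embed the resulting path, e.g. in an iframe.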

How to create performance cards similar to CDE Showcase example

Hello,
for our marketing data we need to build performance comparison cards much like the ones shown in the CDE Marketing Campaign Analysis example (see the attachment or this URL): http://www.webdetails.pt/pentaho/plu...&password=demo

Is this something that can be achieved in CDE? I don't really have any idea which component could help me here. I was thinking of doing some dirty JavaScript coding in a freeform/query component: generate the HTML and attach it to a selector.

any help is greatly appreciated.
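A sketch of the freeform/query-component idea mentioned above: build the card markup from a query row and inject it into a placeholder. The field names, CSS classes, and percentage layout are assumptions, not taken from the showcase:

```javascript
// Build one performance-comparison card as an HTML string.
// current/previous are the two period values being compared.
function buildCard(label, current, previous) {
  var change = previous === 0 ? 0 : ((current - previous) / previous) * 100;
  var direction = change >= 0 ? "up" : "down"; // hypothetical CSS classes
  return '<div class="perf-card">' +
           '<div class="perf-label">' + label + '</div>' +
           '<div class="perf-value">' + current + '</div>' +
           '<div class="perf-change ' + direction + '">' + change.toFixed(1) + '%</div>' +
         '</div>';
}
```

In CDE, a query component's postFetch (or a freeform component) could map the result rows through buildCard and assign the concatenated HTML to the target container's innerHTML.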

Attached image: 5-21-2015 3-15-34 PM.jpg

How to display dynamic images in CDE dashboard?

Hi,

Is there any way to display dynamic images (changing the path of an image from the dataset, something like the content-field report element in PRD) in P5.3 CE?
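One possible workaround in CDE (a sketch, not a built-in feature): have the datasource return the image path as a column and render it yourself, e.g. from a query component's postFetch. The helper below only builds the tag; the container and column index would be yours:

```javascript
// Turn an image path from the dataset into an <img> tag.
// Quotes are escaped so the src attribute stays well-formed.
function imageTagFor(path) {
  var safe = String(path).replace(/"/g, "&quot;");
  return '<img src="' + safe + '" alt=""/>';
}
```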

Log error lines in the logging table ONLY when the transformation got failed

Hi All,


I have logging tables enabled for transformations and jobs.
I have also enabled the LOG_FIELD column.
I need the log field to be filled only when the transformation fails.
I also need only the error lines to be logged in the log field.
Is there any way to log only the error lines into the LOG_FIELD column of the logging table?

Please help!

Edit Schedule in Pentaho User Console

When I try to edit an existing schedule in the Pentaho User Console, I get the error below:
"not-null property references a null or transient value: org.pentaho.platform.repository.subscription.Subscription.title"
How can I resolve this?

How to fetch only required XML child node value based on conditions

Hi,

How can I extract a particular block of XML only when certain conditions are satisfied?
Any suggestions are welcome.

I have attached a sample XML input. I need to read the state block and get the value of name alone, but only if the state block contains a <current/> tag.

The required output is in CSV.
(Sample XML input and expected output attached.)
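Since the attachment isn't visible here, assume the XML looks roughly like the fragment below (element names are guesses). With that shape, the loop XPath of a "Get data from XML" step can use a predicate so that only state blocks containing a <current/> child are read:

```xml
<!-- hypothetical sample input -->
<states>
  <state>
    <name>California</name>
    <current/>
  </state>
  <state>
    <name>Nevada</name>
  </state>
</states>
```

A loop XPath of /states/state[current] then matches only the first block, and a field with XPath name returns "California".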

Replace In String not working for me

Hi guys,

First time posting. I searched all over, but can't find a solution for this seemingly very simple issue.

I have some fields of text that sometimes contain an ampersand character (&). I want to do a find / replace on this character and replace it with the word "and" wherever it occurs.

I'm attempting to use the Replace In String function to do this, but it seems that it's just not doing anything when I run my transform.

I have the following set up in the config:

In Stream Field: Bank Name
Out Stream Field: Bank Name
Use RegEx: N
Search: &
Replace with: and
Set empty string?: N
Replace with field:
Whole Word: N
Case Sensitive: N

A sample value that I'm trying to perform this action on is: Branch Banking & Trust Company

For this sample value, I want the output to be: Branch Banking and Trust Company

Can anyone help?
Thank you
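If the Replace In String step keeps silently doing nothing, one workaround (not an explanation of why the step itself isn't firing) is a Modified Java Script Value step; the core replacement in plain JavaScript is just:

```javascript
// Replace every "&" with the word "and".
// split/join sidesteps any regex-escaping concerns for the & character.
function replaceAmpersand(value) {
  return String(value).split("&").join("and");
}
```

Also worth double-checking: that the step is actually wired into the stream, and that a preview shows the field passing through it.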

Check a checkbox by default in Pentaho Report Designer?

We have set up a check box parameter in Pentaho Report Designer. When we launch the report, the check box is unchecked. What we want is to set the default value of that check box to checked, so that we don't have to check it each time we launch the report.

Pentaho Mapreduce to Parse Mainframe Data

Is there a Pentaho tool that can transform mainframe data into HDFS? If yes, please point me to the API documentation.

Modified Java Script Value scheduling problem

Hello all,

I'm working with Kettle 4.4, building a job that executes a transformation containing a Modified Java Script Value step for some calculations. When I run the job manually it works fine, but when the job is scheduled it doesn't work at all. I've tried a lot of changes with no result.

Has anyone faced the same problem when scheduling a Modified Java Script Value?
Does anyone have an idea how to solve it?

Thank you in advance.

Key fields don't show on data source wizard

Hello,

I really need help with this: I'm trying to create a new data source using the data source wizard.
It shows no key fields for my tables. I don't know why, and this is my first time using the tool.

Please let me know if you have any idea how to solve this.

Thank you

An alternative for Tableau Packaged Workbooks?

Hi,

We are looking for an alternative to Tableau Packaged Workbooks, meaning the ability to package a dashboard with its dataset and make it available for offline viewing and interactive analysis.
Moreover, we need to be able to generate these 'workbooks' for 300+ end users, with every end user getting the same dashboard but, of course, only his or her particular (filtered) data set.

Is this in any way possible within the Pentaho solution? Should I be looking at CTools? A warm thanks for any tip in the right direction!

Sandra

Oracle Bulk Loading missing type information for floating point numbers

Hi,

we are currently building some jobs that use the Oracle Bulk Loader for maximum insert performance.
One issue we ran into is that, for fields with the Number format, no type information is appended to the loader's control file.

I also checked the Java class on GitHub that writes the control file and confirmed that this really does not happen in the source code.

https://github.com/pentaho/pentaho-k...ader.java#L277
Method: public String getControlFileContents(...)
Line: 277

This leads to the problem that we aren't able to import a field with the correct precision/scale.
In the table we have a column defined as NUMBER(10,2).

Loading through the bulk loader fails because of the missing type information in the control file.

I know we could also use the Table Output step for this...

I'd like to know if anybody else has this problem, or whether we are doing something wrong.

Otherwise we could create a JIRA bug for this.

Thanks!

Best regards,
Ewatch
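For reference, this is roughly what an SQL*Loader control file entry with explicit numeric type information looks like; everything except the NUMBER(10,2) column from the post is hypothetical:

```
LOAD DATA
INFILE 'data.dat'
INTO TABLE my_table
FIELDS TERMINATED BY ','
(
  id     INTEGER EXTERNAL,
  amount DECIMAL EXTERNAL  -- the NUMBER(10,2) column needs this type hint
)
```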

MDX cumulative measure

Hi all, maybe someone can help me with this MDX query. I need to create a general cumulative measure like the one in the picture (highlighted in yellow). I tried YTD(), but that doesn't work because it accumulates only within the year.
I'd appreciate your help.
Best regards.

Attached image: tabla_acumulados_pentaho.png
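For the record, a common MDX pattern for an all-time running total (unlike YTD, it crosses year boundaries) is to sum from the first member of the level up to the current member. The dimension, level, measure, and cube names below are placeholders:

```
WITH MEMBER [Measures].[Cumulative Sales] AS
  Sum(
    { [Time].[Month].Members.Item(0) : [Time].CurrentMember },
    [Measures].[Sales]
  )
SELECT { [Measures].[Sales], [Measures].[Cumulative Sales] } ON COLUMNS,
       [Time].[Month].Members ON ROWS
FROM [SalesCube]
```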

Help - Exporting tables to Excel

Hi.

I am using a script to export a dashboard table to Excel. I know there is a component called "Export Button", but it exports only the raw result of the query; what I need is the data exported as it appears in the table (colors, formats, etc.).

So I am using the following javascript command:
Code:

var tableToExcel = (function () {
  var uri = 'data:application/vnd.ms-excel;base64,',
      template = '<html xmlns:o="urn:schemas-microsoft-com:office:office" xmlns:x="urn:schemas-microsoft-com:office:excel" xmlns="http://www.w3.org/TR/REC-html40"><head><!--[if gte mso 9]><xml><x:ExcelWorkbook><x:ExcelWorksheets><x:ExcelWorksheet><x:Name>{worksheet}</x:Name><x:WorksheetOptions><x:DisplayGridlines/></x:WorksheetOptions></x:ExcelWorksheet></x:ExcelWorksheets></x:ExcelWorkbook></xml><![endif]--></head><body><table>{table}</table></body></html>',
      // base64-encode a (possibly non-ASCII) string
      base64 = function (s) { return window.btoa(unescape(encodeURIComponent(s))); },
      // fill the {placeholders} in the template from the ctx object
      format = function (s, c) { return s.replace(/{(\w+)}/g, function (m, p) { return c[p]; }); };
  return function (table, name) {
    if (!table.nodeType) table = document.getElementById(table);
    var ctx = { worksheet: name || 'Worksheet', table: table.innerHTML };
    window.location.href = uri + base64(format(template, ctx));
  };
})();

And also a button with the command:
Code:

function () {
  tableToExcel('pSalesTable', 'Table');
}

And it works!
The problem is that only the data currently displayed on the screen ("Displaying 1 to 10 of 100 records") is sent to Excel; the remaining rows are on the other pages of the table and are not exported.

Anyone had this problem?

Parameter to query URL

Hello,

Is it possible to pass a parameter to a datasource property, such as the URL or the username, in CDF? (In my case it's a sql over jdbc datasource.)

Thank you

Thibaud Grosroyat

Copying Recent files Uploaded in SFTP to another folder

I created a job which uploads 4 files to the SFTP server daily. The files are the result of a transformation.
Every day 4 new files are created, and they keep accumulating in one folder. The requirement is to have another
folder which contains only the files uploaded on the current day.

Group By Multiple Levels Same Hierarchy

Hello!

I have an issue here; I think it's simple, but I can't find an answer, so any idea would be a big help:


I have a 2-level hierarchy (Type and Severity). In the OLAP tool (Saiku in my case), if I put both levels on the rows it works fine, as in the example below:


Type 1 | Severity 1 | 10
| Severity 2 | 20
| Severity 3 | 30


Type 2 | Severity 2 | 10
| Severity 3 | 40


The problem is: if I remove the Type level, it does not group by the Severity text; it keeps the members separated by their parents, as in this example:


Severity 1 | 10
Severity 2 | 20
Severity 3 | 30
Severity 2 | 10
Severity 3 | 40



I believe the right result should be:


Severity 1 | 10
Severity 2 | 30
Severity 3 | 70



The dimension code from Schema Workbench follows. What could be wrong?


Thanks!!


<Dimension type="StandardDimension" visible="true" highCardinality="false" name="Problem">
  <Hierarchy name="Hierarchy" visible="true" hasAll="true" allMemberName="All Problem" allMemberCaption="All Problem" primaryKey="id">
    <Table name="dim_problem" schema="problem"/>
    <Level name="Type" visible="true" column="type" type="String" uniqueMembers="true" levelType="Regular" hideMemberIf="Never"/>
    <Level name="Severity" visible="true" column="severity" type="String" internalType="String" uniqueMembers="false" levelType="Regular" hideMemberIf="Never"/>
  </Hierarchy>
</Dimension>
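For what it's worth: with uniqueMembers="false", Mondrian identifies each Severity member by its parent Type, which is why the rows stay split once the Type level is removed. A sketch of one common workaround, reusing the same table and columns, is to expose Severity as a dimension of its own so it aggregates independently:

```xml
<Dimension type="StandardDimension" visible="true" name="Severity">
  <Hierarchy name="Severity" visible="true" hasAll="true" allMemberName="All Severity" primaryKey="id">
    <Table name="dim_problem" schema="problem"/>
    <Level name="Severity" visible="true" column="severity" type="String" uniqueMembers="true" levelType="Regular" hideMemberIf="Never"/>
  </Hierarchy>
</Dimension>
```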

Persistent shared object file problem

Hi!

I have an almost empty job that does not use a shared objects file, but even so, when I open it in Spoon it shows a ton of shared DB connections. If I save the job, these connections become part of the KJB file and it's a mess.

Attached screenshot: Captura de pantalla 2015-05-23 a las 11.31.29.png

Code:

<?xml version="1.0" encoding="UTF-8"?><job>
  <name>test</name>
    <description/>
    <extended_description/>
    <job_version/>
    <job_status>0</job_status>
  <directory>&#x2f;</directory>
  <created_user>-</created_user>
  <created_date>2014&#x2f;10&#x2f;21 18&#x3a;24&#x3a;38.191</created_date>
  <modified_user>-</modified_user>
  <modified_date>2014&#x2f;10&#x2f;21 18&#x3a;24&#x3a;38.191</modified_date>
    <parameters>
    </parameters>
    <slaveservers>
    </slaveservers>
<job-log-table><connection/>
<schema/>
<table/>
<size_limit_lines/>
<interval/>
<timeout_days/>
<field><id>ID_JOB</id><enabled>Y</enabled><name>ID_JOB</name></field><field><id>CHANNEL_ID</id><enabled>Y</enabled><name>CHANNEL_ID</name></field><field><id>JOBNAME</id><enabled>Y</enabled><name>JOBNAME</name></field><field><id>STATUS</id><enabled>Y</enabled><name>STATUS</name></field><field><id>LINES_READ</id><enabled>Y</enabled><name>LINES_READ</name></field><field><id>LINES_WRITTEN</id><enabled>Y</enabled><name>LINES_WRITTEN</name></field><field><id>LINES_UPDATED</id><enabled>Y</enabled><name>LINES_UPDATED</name></field><field><id>LINES_INPUT</id><enabled>Y</enabled><name>LINES_INPUT</name></field><field><id>LINES_OUTPUT</id><enabled>Y</enabled><name>LINES_OUTPUT</name></field><field><id>LINES_REJECTED</id><enabled>Y</enabled><name>LINES_REJECTED</name></field><field><id>ERRORS</id><enabled>Y</enabled><name>ERRORS</name></field><field><id>STARTDATE</id><enabled>Y</enabled><name>STARTDATE</name></field><field><id>ENDDATE</id><enabled>Y</enabled><name>ENDDATE</name></field><field><id>LOGDATE</id><enabled>Y</enabled><name>LOGDATE</name></field><field><id>DEPDATE</id><enabled>Y</enabled><name>DEPDATE</name></field><field><id>REPLAYDATE</id><enabled>Y</enabled><name>REPLAYDATE</name></field><field><id>LOG_FIELD</id><enabled>Y</enabled><name>LOG_FIELD</name></field><field><id>EXECUTING_SERVER</id><enabled>N</enabled><name>EXECUTING_SERVER</name></field><field><id>EXECUTING_USER</id><enabled>N</enabled><name>EXECUTING_USER</name></field><field><id>START_JOB_ENTRY</id><enabled>N</enabled><name>START_JOB_ENTRY</name></field><field><id>CLIENT</id><enabled>N</enabled><name>CLIENT</name></field></job-log-table>
<jobentry-log-table><connection/>
<schema/>
<table/>
<timeout_days/>
<field><id>ID_BATCH</id><enabled>Y</enabled><name>ID_BATCH</name></field><field><id>CHANNEL_ID</id><enabled>Y</enabled><name>CHANNEL_ID</name></field><field><id>LOG_DATE</id><enabled>Y</enabled><name>LOG_DATE</name></field><field><id>JOBNAME</id><enabled>Y</enabled><name>TRANSNAME</name></field><field><id>JOBENTRYNAME</id><enabled>Y</enabled><name>STEPNAME</name></field><field><id>LINES_READ</id><enabled>Y</enabled><name>LINES_READ</name></field><field><id>LINES_WRITTEN</id><enabled>Y</enabled><name>LINES_WRITTEN</name></field><field><id>LINES_UPDATED</id><enabled>Y</enabled><name>LINES_UPDATED</name></field><field><id>LINES_INPUT</id><enabled>Y</enabled><name>LINES_INPUT</name></field><field><id>LINES_OUTPUT</id><enabled>Y</enabled><name>LINES_OUTPUT</name></field><field><id>LINES_REJECTED</id><enabled>Y</enabled><name>LINES_REJECTED</name></field><field><id>ERRORS</id><enabled>Y</enabled><name>ERRORS</name></field><field><id>RESULT</id><enabled>Y</enabled><name>RESULT</name></field><field><id>NR_RESULT_ROWS</id><enabled>Y</enabled><name>NR_RESULT_ROWS</name></field><field><id>NR_RESULT_FILES</id><enabled>Y</enabled><name>NR_RESULT_FILES</name></field><field><id>LOG_FIELD</id><enabled>N</enabled><name>LOG_FIELD</name></field><field><id>COPY_NR</id><enabled>N</enabled><name>COPY_NR</name></field></jobentry-log-table>
<channel-log-table><connection/>
<schema/>
<table/>
<timeout_days/>
<field><id>ID_BATCH</id><enabled>Y</enabled><name>ID_BATCH</name></field><field><id>CHANNEL_ID</id><enabled>Y</enabled><name>CHANNEL_ID</name></field><field><id>LOG_DATE</id><enabled>Y</enabled><name>LOG_DATE</name></field><field><id>LOGGING_OBJECT_TYPE</id><enabled>Y</enabled><name>LOGGING_OBJECT_TYPE</name></field><field><id>OBJECT_NAME</id><enabled>Y</enabled><name>OBJECT_NAME</name></field><field><id>OBJECT_COPY</id><enabled>Y</enabled><name>OBJECT_COPY</name></field><field><id>REPOSITORY_DIRECTORY</id><enabled>Y</enabled><name>REPOSITORY_DIRECTORY</name></field><field><id>FILENAME</id><enabled>Y</enabled><name>FILENAME</name></field><field><id>OBJECT_ID</id><enabled>Y</enabled><name>OBJECT_ID</name></field><field><id>OBJECT_REVISION</id><enabled>Y</enabled><name>OBJECT_REVISION</name></field><field><id>PARENT_CHANNEL_ID</id><enabled>Y</enabled><name>PARENT_CHANNEL_ID</name></field><field><id>ROOT_CHANNEL_ID</id><enabled>Y</enabled><name>ROOT_CHANNEL_ID</name></field></channel-log-table>
  <pass_batchid>N</pass_batchid>
  <shared_objects_file/>
  <entries>
    <entry>
      <name>START</name>
      <description/>
      <type>SPECIAL</type>
      <start>Y</start>
      <dummy>N</dummy>
      <repeat>N</repeat>
      <schedulerType>0</schedulerType>
      <intervalSeconds>0</intervalSeconds>
      <intervalMinutes>60</intervalMinutes>
      <hour>12</hour>
      <minutes>0</minutes>
      <weekDay>1</weekDay>
      <DayOfMonth>1</DayOfMonth>
      <parallel>N</parallel>
      <draw>Y</draw>
      <nr>0</nr>
      <xloc>53</xloc>
      <yloc>33</yloc>
      </entry>
  </entries>
  <hops>
  </hops>
  <notepads>
  </notepads>


</job>

I'm using Spoon 5.0.1.

Thanks for your help!

How to fetch all the tables from a database and convert them to CSV format in one go

Hi,

I successfully connected to the DB and I am able to view all the tables. But how can I convert a required group of tables to CSV in one go?

