Channel: Pentaho Community Forums
Viewing all 16689 articles

Date format MM/dd/yyyy not retaining leading zero

Hello,
I am trying out Kettle - Spoon General Availability Release - 5.4.0.1-130
I have a CSV file input feeding a Text file output, where the output separator is a vertical bar.
For output fields of type Date, regardless of whether I specify a format of MM/dd/yyyy or M/d/yyyy, a value such as January 1, 2013 is written as 1/1/2013, not 01/01/2013.
Is there another parameter I should set to retain the leading zeros in date values?
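In plain Java, the SimpleDateFormat patterns that Kettle date masks follow do pad with zeros when you use MM/dd, so if MM/dd/yyyy still produces 1/1/2013 the mask is probably not being applied to that field at all (for instance, because the column already reaches the output step as a String). A quick check of the mask behaviour itself:

```java
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.GregorianCalendar;

public class DateMaskDemo {
    // Format a date with the same kind of mask used in the Text file output step
    public static String format(String mask, Calendar cal) {
        return new SimpleDateFormat(mask).format(cal.getTime());
    }

    public static void main(String[] args) {
        Calendar jan1 = new GregorianCalendar(2013, Calendar.JANUARY, 1);
        System.out.println(format("MM/dd/yyyy", jan1)); // 01/01/2013 (padded)
        System.out.println(format("M/d/yyyy", jan1));   // 1/1/2013 (not padded)
    }
}
```

If the CSV input step parsed the column as a String, the output mask is ignored; setting the field type to Date (with a parse format) in the input step, or converting with a Select values step, may be what is missing.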

Unable to break page at the end of each iteration inside "Details"

Hello,

I have a master report with a subreport inside "Report Footer". This subreport has some fields and subreports inside "Details". What I want is to start each iteration on a new page. I have tried using pagebreak-before and pagebreak-after on every band of the last subreport, but the page doesn't break. Any advice?

Thank you.

How to start subsequent multi-thread job

I have a current ETL with 5 threads, each one running a SQL stored procedure, as in the following picture:
5thread.jpg

My question: since one thread might take much longer than the others (say, hours), the whole ETL job waits for that thread to finish before starting another run, and we have thousands of jobs.
Is it possible to kick off a new thread, keeping the total at 5, before one of the current 5 threads has finished?

Thank you very much.
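What you describe is, in plain Java terms, a fixed-size worker pool fed from a queue: a new job starts the moment any of the 5 workers frees up, instead of waiting for the whole batch of 5 to finish. A minimal sketch of those semantics (not Kettle API; the names are illustrative):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class FixedPoolDemo {
    // Run nJobs tasks on a pool of poolSize workers; a queued task starts
    // as soon as any worker becomes free, so the pool stays at poolSize
    // busy threads until the queue drains.
    public static int runAll(int nJobs, int poolSize) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(poolSize);
        AtomicInteger done = new AtomicInteger();
        for (int i = 0; i < nJobs; i++) {
            pool.execute(done::incrementAndGet);
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        return done.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runAll(1000, 5)); // prints 1000
    }
}
```

Inside a single transformation, running a step with 5 copies gives similar queue semantics, since incoming rows are handed to whichever copy is free; at the job level you may need a different design, such as feeding the job list as rows into a transformation that launches them.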
Pentaho BI Export Import Issue

Hi,

I am using the Export/Import utility to export Pentaho BI content from Windows. When I use the Chinese language and export a file, the file name is changed to arbitrary characters or appears in Chinese; the name is also shown in Chinese characters in exportManifest.xml.
When I import that file again, it is displayed in English and looks normal.

Please help: what should I do? Is this an encoding issue?

OutOfMemory: Java heap space

Hello Everyone,

I am using Spoon (Kettle CE stable 4.4.0).
I have an Excel input with more than 20,000 rows.
While trying to retrieve the sheet name, Spoon gives me the attached error.
Has anyone encountered something similar?
As I understand from the error, it is a memory issue.
Is there any way we can get around it?

Will be waiting for your response.
Thanks a lot in advance.
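The usual first fix for "Java heap space" errors in Spoon is to raise the JVM maximum heap in the launch script. In Kettle 4.x this lives in spoon.sh / Spoon.bat; the exact line differs between versions, so treat this as a sketch:

```shell
# In spoon.sh (Spoon.bat on Windows), find the line that sets the JVM options
# and raise the -Xmx value, e.g. from something like:
#   OPT="-Xmx512m ..."
# to:
OPT="-Xmx2048m ..."
```

Note also that reading .xls files holds the whole workbook in memory, so for very large inputs converting the data to CSV first may sidestep the problem entirely.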

One more time about anonymous access to ..8080/pentaho/Xmla (CE 5.4)

Hi all.

I've installed Pentaho CE 5.4 for access to a PostgreSQL ROLAP cube.
I can access it from Excel via the XMLA Connect driver: the driver makes a POST request with the Authorization header filled with the credentials I entered in the Excel dialog.

But I need to access the cube via web controls. I'm currently using the Kendo UI pivot table (HTML5 version).
I've configured its dataSource with the following code (JavaScript):

..kendoPivotGrid({
    ...
    read: {
        url: "http://192.168.15.54:8080/pentaho/Xmla",
        dataType: 'text',
        type: "POST",
        beforeSend: function(req) {
            req.setRequestHeader('Authorization', "Basic " + btoa("anonymous:anonymous"));
        }
    }

Well, that is not enough: the Kendo dataSource first makes an OPTIONS request to http://192.168.15.54:8080/pentaho/Xmla without my Authorization header, and I get this result:

XMLHttpRequest cannot load http://192.168.15.54:8080/pentaho/Xmla. The request was redirected to 'http://192.168.15.54:8080/pentaho/Login;jsessionid=9DE4DE4EE9EB9827D20313E1FEB659FC', which is disallowed for cross-origin requests that require preflight

The Kendo demo, on the other hand:
a) makes the same OPTIONS request to their own sample server
b) gets the right answer, 200 OK (this is the step where I get the error shown above)
c) makes a POST request to fetch the XMLA OLAP data
d) receives the OLAP data and displays it

I played with anonymous access on my server. I found that applicationContext-spring-security.xml contains the relevant entries: I was able to switch anonymous access to the docs directory on and off. But an entry like \A/Xmla/.*\Z=Anonymous,Authenticated does not work: authentication is still required for the http://192.168.15.54:8080/pentaho/Xmla URL, both in a browser and for the JavaScript OPTIONS request.
Can I enable anonymous access to the /Xmla address, and only for OPTIONS requests? I'm not sure Kendo lets me attach an authentication cookie to the OPTIONS request (from the answer to a preceding auth request with userid/password params to the pentaho/Login URL).
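The error message points at CORS preflight rather than (only) authentication: the browser's OPTIONS request carries no Authorization header by design, and the server redirects it to the login page, which preflight forbids. One hedged option is to enable Tomcat's built-in CORS filter in tomcat/webapps/pentaho/WEB-INF/web.xml, so preflights are answered before Spring Security sees them. This requires Tomcat 7.0.41 or newer (Tomcat 6 does not ship this filter; check your tomcat/RELEASE-NOTES), the origin below is a placeholder, and filter ordering matters, so treat this as a sketch, not a tested config:

```xml
<filter>
  <filter-name>CorsFilter</filter-name>
  <filter-class>org.apache.catalina.filters.CorsFilter</filter-class>
  <init-param>
    <param-name>cors.allowed.origins</param-name>
    <param-value>http://my-app-host</param-value>
  </init-param>
  <init-param>
    <param-name>cors.allowed.methods</param-name>
    <param-value>GET,POST,OPTIONS</param-value>
  </init-param>
  <init-param>
    <param-name>cors.allowed.headers</param-name>
    <param-value>Authorization,Content-Type</param-value>
  </init-param>
</filter>
<filter-mapping>
  <filter-name>CorsFilter</filter-name>
  <url-pattern>/Xmla</url-pattern>
</filter-mapping>
```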

Dynamic schema processor

I have a working DSP which looks up the username in a DB table. What I also want is to retrieve the current roles for the user. Does anyone know how to retrieve the roles?


Code:

package com.company;

import java.io.InputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// ADDED FOR SESSION VARS
import org.pentaho.platform.api.engine.IPentahoSession;
import org.pentaho.platform.engine.core.system.PentahoSessionHolder;
import org.pentaho.platform.api.engine.IUserRoleListService;

import mondrian.olap.Util;
import mondrian.spi.DynamicSchemaProcessor;
import mondrian.spi.impl.FilterDynamicSchemaProcessor;

public class SubscriptionCountry extends FilterDynamicSchemaProcessor implements DynamicSchemaProcessor {
    private static final String BBDD_HOST = "xxx";
    private static final String BBDD_PORT = "xxx";
    private static final String BBDD_NAME = "xxx";
    private static final String BBDD_USER = "xxx";
    private static final String BBDD_PASSWORD = "xxx";

    @Override
    public String filter(String schemaUrl, Util.PropertyList connectInfo, InputStream stream) throws Exception {
        String schema = super.filter(schemaUrl, connectInfo, stream);

        Connection c = null;
        String country_list = "";

        try {
            // Get the Pentaho session and read the user name from it
            IPentahoSession session = PentahoSessionHolder.getSession();
            String userName = session.getName();

            // Bind the user name as a parameter instead of concatenating it
            // into the SQL string ("a_table" is a placeholder table name)
            String query_countries = "SELECT country_list FROM a_table WHERE username = ?";

            Class.forName("org.postgresql.Driver");
            c = DriverManager.getConnection("jdbc:postgresql://" + BBDD_HOST + ":" + BBDD_PORT + "/" + BBDD_NAME, BBDD_USER, BBDD_PASSWORD);
            PreparedStatement st = c.prepareStatement(query_countries);
            st.setString(1, userName);
            ResultSet rs = st.executeQuery();
            if (rs.next()) {
                country_list = rs.getString("country_list");
            }
            rs.close();
            st.close();
        } finally {
            // Always release the connection, even if the lookup fails
            if (c != null) {
                c.close();
            }
        }

        return schema.replaceAll("%COUNTRY_LIST%", country_list);
    }
}

launch next entries in parallel

Hi ,

Initially I loaded my data without the "Launch next entries in parallel" option; the job took 1 hr 2 min to complete. A few days later I googled for performance fixes and applied a few changes; several sources also recommended "Launch next entries in parallel", but the job still takes the same time: 1 hr 2 min.

I'm trying to understand the logic behind "Launch next entries in parallel". Should it help in my case, and if so, what am I missing? Please advise.

Launch next entirs in parallel.png

Connection - Advanced

Hello, I don't understand how "the SQL statements to execute after connecting" work. When I go to Manage Data Sources, there is my Oracle JDBC connection, and on the Advanced tab there is an option I have no idea how to use. I've tried writing a few queries there, but it did nothing (and yes, the connection test succeeded). Could you give me an example of use, please?
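From what I understand of this option, the statements are executed once on each newly opened connection, typically to set session state; they return no visible result set, which is why a plain SELECT typed there appears to do nothing. A hedged Oracle example (the schema name is illustrative):

```sql
-- Run after each connection is opened: set session defaults
ALTER SESSION SET NLS_DATE_FORMAT = 'YYYY-MM-DD';
ALTER SESSION SET CURRENT_SCHEMA = MY_SCHEMA;
```

A quick way to verify it works is to set something observable, such as CURRENT_SCHEMA, and then query a table by its unqualified name.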

Create Text Files with Predetermined Number of Records

I have an ETL stream output that may have hundreds or thousands of records.

I want to write the stream to a text file, but I want to limit each file to 500 records.

I tried a couple of solutions, but none produced the desired output with respect to file size.

Has anyone performed a similar ETL, or can anyone give some pointers on how best to implement this?

Thank you
Ray
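For what it's worth, recent PDI versions have a "Split every ... rows" option on the Text file output step's File tab, which may already do exactly this. The underlying logic is just integer division on a row counter; a minimal Java sketch (class and method names are mine):

```java
import java.util.ArrayList;
import java.util.List;

public class RecordSplitter {
    // Which output file (0-based) a given 0-based record number lands in
    public static int fileIndex(int recordNumber, int recordsPerFile) {
        return recordNumber / recordsPerFile;
    }

    // Split records into chunks of at most recordsPerFile entries each;
    // each inner list corresponds to one output file
    public static List<List<String>> split(List<String> records, int recordsPerFile) {
        List<List<String>> files = new ArrayList<>();
        for (int i = 0; i < records.size(); i++) {
            if (fileIndex(i, recordsPerFile) == files.size()) {
                files.add(new ArrayList<>());
            }
            files.get(files.size() - 1).add(records.get(i));
        }
        return files;
    }
}
```

With 1200 records and recordsPerFile = 500 this yields three files of 500, 500, and 200 records.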

Database Join error for function embedded on SELECT clause.

Hello All,

I am trying to use a Database Join step to invoke PostgreSQL for finding the number of days in the month before a given date, using the following SELECT statement:

SELECT
DATE_PART('days', DATE_TRUNC('month', timestamp ?)
- (DATE_TRUNC('month', timestamp ?)
- '1 MONTH'::INTERVAL) ) as billdays

I have already tried this in PostgreSQL's pgAdmin III (with the appropriate changes for the ? parameter replacement) and it runs fine.

When I run the SQL above in my transformation in Spoon, however, I get the following error:

2015/09/25 15:17:32 - org.pentaho.di.trans.steps.databasejoin.DatabaseJoinMeta@6e93b440 - ERROR (version 5.4.0.1-130, build 1 from 2015-06-14_12-34-55 by buildguy) : Ocorreu um erro de banco de dados:
2015/09/25 15:17:32 - org.pentaho.di.trans.steps.databasejoin.DatabaseJoinMeta@6e93b440 - Couldn't get field info from [SELECT
2015/09/25 15:17:32 - org.pentaho.di.trans.steps.databasejoin.DatabaseJoinMeta@6e93b440 - DATE_PART('days', DATE_TRUNC('month', timestamp ?)
2015/09/25 15:17:32 - org.pentaho.di.trans.steps.databasejoin.DatabaseJoinMeta@6e93b440 - - (DATE_TRUNC('month', timestamp ?)
2015/09/25 15:17:32 - org.pentaho.di.trans.steps.databasejoin.DatabaseJoinMeta@6e93b440 - - '1 MONTH'::INTERVAL) ) as billdays]
2015/09/25 15:17:32 - org.pentaho.di.trans.steps.databasejoin.DatabaseJoinMeta@6e93b440 -
2015/09/25 15:17:32 - org.pentaho.di.trans.steps.databasejoin.DatabaseJoinMeta@6e93b440 - ERROR: syntax error at or near "$1"
2015/09/25 15:17:32 - org.pentaho.di.trans.steps.databasejoin.DatabaseJoinMeta@6e93b440 - Posição: 58

The 58th position falls precisely at the first ? used for parameter replacement.

What am I doing wrong? Is there a smarter way of determining the number of days in the month referred to by a date?

Thank you!

The step configuration screen looks like below:

Captura de tela de 2015-09-25 15:19:34.jpg
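Two hedged observations. First, the immediate SQL error is likely that timestamp ? uses PostgreSQL's typed-literal syntax (as in timestamp '2013-01-01'), which accepts only a literal string, not a bind parameter; rewriting each occurrence as CAST(? AS timestamp) usually resolves the "syntax error at or near $1". Second, if the goal is simply "days in the month before a given date", that can be computed without a database round-trip; a Java sketch using java.time (Java 8+):

```java
import java.time.LocalDate;
import java.time.YearMonth;

public class BillDays {
    // Number of days in the month immediately before the given date's month
    public static int daysInPreviousMonth(LocalDate date) {
        return YearMonth.from(date).minusMonths(1).lengthOfMonth();
    }

    public static void main(String[] args) {
        System.out.println(daysInPreviousMonth(LocalDate.of(2013, 3, 15))); // 28 (Feb 2013)
    }
}
```

YearMonth handles month lengths and leap years itself, so February 2012 correctly yields 29.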

executeTrans always returns HTTP Status 200

How can I get the status of a transformation called using executeTrans? I would like to run transformations using the DI web service instead of using Pan but every request appears to be successful (HTTP Status 200). Is there a separate service to check the status or a better way to do this? We are using Pan currently but the JVM startup is very slow. I plan to call the web services via curl if I can get the transformation status.
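Carte and the DI server expose a status service alongside executeTrans; if I recall the endpoints correctly (worth verifying against your version's documentation), something like this reports the transformation's state after the 200 comes back. Host, port, credentials, paths, and the transformation name below are all placeholders:

```shell
# Kick off the transformation (returns 200 even if the transformation later fails)
curl -u cluster:cluster "http://localhost:8081/kettle/executeTrans/?trans=/path/to/my.ktr"

# Poll the status service and inspect the XML for Finished / Stopped (with errors)
curl -u cluster:cluster "http://localhost:8081/kettle/transStatus/?name=my_transformation&xml=Y"
```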

Job executor and Single threader

Hi,

I have a transformation and a sub-transformation, where the sub-transformation is executed via Single Threader with batch size = 1. The entire transformation takes less than 1 minute to process 3000 records.

If I replace the Single Threader step with Job Executor with row size = 1, the entire transformation slows down and takes 15 minutes to process.

What is the reason behind this?

Pentaho web service PUT call is giving (HTTP Status 405 - Method Not Allowed) error

I am new to both Pentaho and web services. I need to do a PutMethod call to Pentaho with HttpClient to upload an updated metadata file to its repository. I am able to get the file using GetMethod, but when I call PutMethod it gives me "HTTP Status 405 - Method Not Allowed".


My code is as follows


import java.io.File;

import org.apache.commons.httpclient.*;
import org.apache.commons.httpclient.methods.GetMethod;
import org.apache.commons.httpclient.methods.PutMethod;
import org.apache.commons.httpclient.methods.multipart.FilePart;
import org.apache.commons.httpclient.methods.multipart.MultipartRequestEntity;
import org.apache.commons.httpclient.methods.multipart.Part;
import org.apache.commons.httpclient.methods.multipart.StringPart;

// url, urlParams, logger, client and GenerateParam are members of the surrounding class
public void PutRequest(String context, String data, String filePath, String userName, String tenantName) {

    File file = new File(filePath);

    urlParams = GenerateParam(userName, tenantName);

    try {
        logger.debug("PUT : " + url.toExternalForm() + "/" + context + "/" + data + urlParams);
        logger.debug("File Path :: " + file.getAbsolutePath());

        PutMethod putMethod = new PutMethod(url.toExternalForm() + "/" + context + "/" + data + urlParams);
        putMethod.setDoAuthentication(true);

        Part[] parts = { new StringPart("domainId", data), new StringPart("overwrite", "true"),
                new FilePart("metadataFile", file, "application/octet-stream", "utf-8") };

        putMethod.setRequestEntity(new MultipartRequestEntity(parts, putMethod.getParams()));

        int status = client.executeMethod(putMethod);
        logger.debug("PUT status : " + status);
        logger.debug("Response : " + putMethod.getResponseBodyAsString());
        logger.debug("File : " + file.getName());
        putMethod.releaseConnection();
        // file.delete();
    } catch (Exception e) {
        e.printStackTrace();
    }
}

Please let me know what is wrong with this code.

Ivy and resolve task

Hi,
each time I run "ant dist" on the pentaho-kettle project, it takes a couple of hours before Ivy has resolved all the needed jars.
Maybe there's something I can do or configure better.

Questions:
- instead of writing in "cache" folder each time, can I move all jar somewhere and build a local repository?
- if I already have a maven repo, can I instruct ivy to look inside it?

If some of you have some hints, I would be grateful.
Thank you in advance.
Nic
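On the second question: Ivy can indeed be pointed at an existing Maven repository via a resolver. A hedged ivysettings.xml sketch (the elements are standard Ivy settings syntax, but check how the pentaho-kettle build wires in its own ivysettings file before overriding anything):

```xml
<!-- ivysettings.xml sketch: try the local Maven repo first, then Central -->
<ivysettings>
  <settings defaultResolver="chain"/>
  <resolvers>
    <chain name="chain">
      <!-- reuse an existing local Maven repository -->
      <ibiblio name="local-m2" m2compatible="true"
               root="file://${user.home}/.m2/repository"/>
      <!-- fall back to Maven Central -->
      <ibiblio name="central" m2compatible="true"/>
    </chain>
  </resolvers>
</ivysettings>
```

For the first question, Ivy's cache directory is itself a reusable local repository; artifacts already in the cache should not be re-downloaded, so a multi-hour resolve may indicate the cache is being cleaned between builds.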

Where does "transformation batch ID" come from?

Hi,
I am using 'Get System Info' to retrieve 'transformation batch ID' into a field name which goes onto the database (as ETL_PROC_SK).
The doco suggests the batch ID is 'a unique number, increased by one for each run of a transformation'.
Later steps, using completely different technology, are using this value as a 'where was I up to' so are incorporating a where clause into their retrieval such as WHERE ETL_PROC_SK > maximum-value-found-last-time-I-ran.


This seems to have been working OK but I now suspect it may be by accident.
Q1. The source and target DB connections are parameterised. If successive runs go to different connections, will the number still just go up by one?


Q2. So far we are running from different 'instances' of Pentaho, i.e. two developers each running it from their desktop. Doing it this way, will numbers be reused? (There is no repository; we are just running from the file system.)


Q3. Before I stuff everything up could somebody suggest a 'best practice' guide to setting up the site before we go too far?


Thank You


JC

Pentaho BI Export Import Issue with File Name

Hi,

I am using the Export/Import utility to export one file from the BI Console window. I have a file named लेबल.saiku. When I export it, the download is offered as लेबल.saiku.zip, but when I open that zip, the file inside has become %E0%A4%B2%E0%A5%87%E0%A4%AC%E0%A4%B2.saiku.

Please help: what should I do? Is this an encoding issue?


Exporting Transformation Output

I have an Excel input file as the source and an Excel output file as the target. I want to export that output into Excel file format, so how do I export it to an Excel file? Please help me with this.
Thanks.

How to Integrate Pentaho CDE dashboard with Adempiere ERP

Hi,

After doing lots of research, I have not been able to find a proper document or approach for integrating a CDE dashboard with a web application (Adempiere). If anyone has done this task, please share your ideas and thoughts; it would benefit other Pentaho users. Eagerly waiting for your responses.

Regards,

Abdur Rahmaan

Parallelism in Weka


