Channel: Pentaho Community Forums
Viewing all 16689 articles

Data

Hi,

I am getting some values as question marks, as shown in the image below. How can I replace them with the correct values? I have already set the encoding to "UTF-8" when reading from the Text File Input step.

Attached image: Test1.PNG
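As a general observation on this symptom (not a diagnosis of this particular file): question marks usually appear when the bytes in the file do not match the encoding configured on the step, so the decoder emits the replacement character U+FFFD. A minimal Java sketch of the effect, using an illustrative Latin-1 byte:

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class EncodingDemo {
    // Decode a byte sequence with a chosen charset; a wrong choice yields
    // the replacement character U+FFFD (often rendered as '?').
    static String decode(byte[] bytes, Charset cs) {
        return new String(bytes, cs);
    }

    public static void main(String[] args) {
        byte[] latin1Eacute = {(byte) 0xE9};  // 'é' encoded in ISO-8859-1
        System.out.println(decode(latin1Eacute, StandardCharsets.ISO_8859_1)); // é
        System.out.println(decode(latin1Eacute, StandardCharsets.UTF_8));      // U+FFFD
    }
}
```

So it is worth checking what encoding the file actually uses (e.g. with a hex viewer) rather than what the step is told it uses.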

Get Mail Pop - Target Directory name from Database

Hi

I am fetching my emails and storing them in a target directory. I have a requirement to create a new target folder for each new email, based on a name stored in a database.

Can anyone provide some help on this?

Thank you

Showing the selected parameter to display

Hi everybody, I have one question: how can I get the selected value (parameter) of a drop-down list filter and show it in the Pentaho report header?

XML in Pentaho

Hi Friends,

Need your help on the below scenario:

I have a Pentaho job that contains 3 transformations. Given a table name, my requirement is to find out which transformation uses that table. This is to be done using Pentaho DI itself, not at the DB level.

I hope the above scenario is clear; please let me know if anybody needs more clarification.

Thanks in Advance
Sandeep

Pentaho objects migration to a different server

Where can I find the tool below for installation?

Download the pentaho-platform-migrator-5.0.0-dist.zip file from the Customer Support Portal on the computer where the new version of the server has been installed.

(http://infocenter.pentaho.com/help/i...roduction.html).

I am using Pentaho Version 5.4 and want to migrate all objects from Dev to QA or Prod.

Error message while running a transformation with pan.sh

I am getting the attached error message while trying to run pan.sh as follows:

./pan.sh -file="/opt/pentaho/data-integration/transformations/Transformation_load_commodity_network_service_dim.ktr" -level=Minimal >> /opt/pentaho/data-integration/log/trans.log

Please help.

Thanks,
Raji.
Attached image: error message screenshot

Hadoop File or Text File steps?

Hi,

I am trying to load data from a SOURCE database to a TARGET database, following the two approaches below. As per my observation, the Text File steps take less time to process the data. For both approaches I am using CSV (comma-separated values) files.

Table Input -> Text file Output then Text file input -> Table output (took 13 sec for 7000 records)
Table Input -> Hadoop file Output then Hadoopfile input -> Table output(took 20 sec for 7000 records)

I know the Hadoop file steps will not be very useful for such a simple data-loading process, but I am looking for the best approach. In general, shouldn't the Hadoop file steps load data faster than the Text File steps?

That is not what is happening in my case.

Thank you

Boolean Level for Hierarchy

Hi!

I have a schema and I have created a hierarchy with one level. The type of the level is Boolean. All is working, and in JPivot I can see the values 'true' and 'false' for that level.

However, I need to translate true/false into Spanish, for example si/no.

Can anybody tell me how to translate a Boolean level into another language?

thanks in advance!

Numerous warnings about cfgbuilder configuration at startup

Hi,

I'm trying the latest community version (6.0.0) of Spoon on Windows 7, and I am getting around 150 cfgbuilder warnings on every run. The warning messages are not present when I run Enterprise Edition 5.4. Is there something wrong with my configuration, and if so, how can I silence them? Otherwise, is this an issue other people are seeing?

Thanks,

Phil

--- begin error info ----

2015/12/22 15:08:48 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2015/12/22 15:08:48 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2015/12/22 15:08:48 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
2015/12/22 15:08:48 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
.
.
.
2015/12/22 15:08:55 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp

row hash md5, sha1

Can someone explain why PDI gives me a different hash than the database does (PostgreSQL, MSSQL, etc.) for the exact same row of data? I would expect the same data to yield the same hash no matter where it is calculated. Please explain why they differ.

A second question would be: should they be the same?
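A common reason for such mismatches, offered as a general observation rather than a diagnosis of this poster's setup: the two sides may hash different byte strings. Field separators, NULL handling, trailing spaces, number formatting, and character encoding all change the input bytes before MD5 or SHA-1 ever runs. A small Java sketch showing how the concatenation convention alone changes the digest:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class RowHashDemo {
    // Hex-encoded MD5 of a string's UTF-8 bytes.
    static String md5Hex(String input) {
        try {
            byte[] digest = MessageDigest.getInstance("MD5")
                    .digest(input.getBytes(StandardCharsets.UTF_8));
            StringBuilder sb = new StringBuilder();
            for (byte b : digest) sb.append(String.format("%02x", b));
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        // The same two field values, concatenated with and without a
        // separator, hash to completely different digests.
        System.out.println(md5Hex("john" + "|" + "2015"));
        System.out.println(md5Hex("john" + "2015"));
    }
}
```

So the hashes should match only if both sides agree byte-for-byte on how the row is serialized before hashing.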

Invoking Carte addJob REST service gives 302

I am trying to evaluate Carte and add a job I created in the Spoon designer to a local file repository I also created.

I created the job_configuration xml to add the job and wrote a test program.

The response, however, is always 302. Any idea how I can get past that to actually call the service?

I have read that 302 means a redirect, so I should grab the 'Location' header and make another call. In this case the Location header is again "http://localhost:8181/kettle/addJob", and re-calling results in a further HTTP 302 response.
I seem to be stuck in a loop now.

import java.io.File;
import java.util.ArrayList;
import java.util.List;

import org.apache.commons.httpclient.Credentials;
import org.apache.commons.httpclient.HttpClient;
import org.apache.commons.httpclient.UsernamePasswordCredentials;
import org.apache.commons.httpclient.auth.AuthPolicy;
import org.apache.commons.httpclient.auth.AuthScope;
import org.apache.commons.httpclient.auth.BasicScheme;
import org.apache.commons.httpclient.methods.PostMethod;
import org.apache.commons.io.FileUtils;

String fileName = "C:\\format.xml";
String url = "http://localhost:8181/kettle/addJob";
HttpClient httpClient = new HttpClient();

// Prefer Basic authentication.
List<String> authPrefs = new ArrayList<String>();
authPrefs.add(AuthPolicy.BASIC);
httpClient.getParams().setParameter(AuthPolicy.AUTH_SCHEME_PRIORITY, authPrefs);

Credentials creds = new UsernamePasswordCredentials("cluster", "cluster");
AuthScope scope = new AuthScope("localhost", AuthScope.ANY_PORT);
httpClient.getState().setCredentials(scope, creds);
AuthPolicy.registerAuthScheme("Basic", BasicScheme.class);

// POST the job configuration XML.
PostMethod post = new PostMethod(url);
String xml = FileUtils.readFileToString(new File(fileName));
post.setRequestBody(xml);

int result = httpClient.executeMethod(post);
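One thing worth ruling out, as a guess rather than a confirmed diagnosis: the server may be redirecting unauthenticated POSTs, and commons-httpclient 3.x does not automatically follow redirects for entity-enclosing methods like POST. Sending the Authorization header preemptively avoids the challenge round-trip entirely. Building the header value is just Base64 over user:password, sketched here with plain JDK classes (the "cluster"/"cluster" credentials are taken from the snippet above):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class BasicAuthDemo {
    // Value for a preemptive "Authorization" request header.
    static String basicAuth(String user, String password) {
        String token = Base64.getEncoder()
                .encodeToString((user + ":" + password).getBytes(StandardCharsets.UTF_8));
        return "Basic " + token;
    }

    public static void main(String[] args) {
        // e.g. post.setRequestHeader("Authorization", basicAuth("cluster", "cluster"));
        System.out.println(basicAuth("cluster", "cluster"));
    }
}
```

If I recall correctly, commons-httpclient 3.x can also do this via httpClient.getParams().setAuthenticationPreemptive(true), which would make the manual header unnecessary.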

How to show pentaho report on PHP webpage sending parameters?

Hello my friends! I simply need to show, in a PHP web page, a report I've created in Pentaho, which is running on Tomcat. I need to send the semester number as a parameter, to display data based on what the user selected in a drop-down list. Simply that; what is the easiest way to accomplish it? Thank you very much. I've looked everywhere, but I can't seem to find any info on PHP and Pentaho.

How to get unsampled data in Google analytics

Hi,

I have an ETL job which is fetching data from Google Analytics.

When I run it now for the date 22-10-2015, it shows Contains sampled data = 'No', Total results found: 201160.
When I run it for the date 21-10-2015, it shows:

  • Total results found: 165815
  • Contains sampled data: Yes


What is the difference here? Does the data become sampled after one day?
I also found that if I run it for 22-10-2015 later on, it will show Contains sampled data = 'Yes'.

Please help.

Json Path is missing

Hi,

I'm working with MongoDB data as input.
I use JSON Input to parse data coming from MongoDB. The documents have this structure:
[{
House: "15H451",
Speaker: "John",
Members : [
{
Name: "John",
Sexe: "M",
Age: "68",
Role: "Father"
},
{
Name: "Elvis",
Sexe: "M",
Age: "45",
Ill: "Yes"
},
{
Name: "Phill",
Sexe: "M",
Age: "25",
Hability: "No"
}
]
},
{ ... }
{ ... }]

As you can see, some attributes do not always exist, and JSON Input throws the error "No data has been found for the path [$.Members]!" (in this case, the Members attribute is not defined). Sometimes it is the "Members" attribute itself that does not exist.

How can I solve this, please?

How to write a now() function in the WHERE clause of MDX queries

Hi,

This may be a basic question :) but I have not been able to find a solution for it.

I want to retrieve all records from the last 30 days (current date minus 30 days), but I am not able to express that in the MDX query.
My sample query is:
WITH
SET [~FILTER] AS
{[Created_Date.Created_Hir].[Created_On].Members}
SET [~ROWS] AS
{[Sales Order Attributes SO.Sales_order].[Sales Order ID].Members}
SELECT
NON EMPTY {[Measures].[CONT_AMT_GROSS]} ON COLUMNS,
NON EMPTY [~ROWS] ON ROWS
FROM [SALES_ORDER]
WHERE [~FILTER]

Currently it is fetching all the records.

Thanks.

Design transformation based on XSD

Hi,

I only have an XSD schema definition, no XML data. How can I import XSDs for the input/output steps?

Thanks a lot in advance,

Michel

Missing some spoon plugins at runtime (KarafLifecycleListener Error)

Hi everyone.

I am using open-source Pentaho distribution from github.com (version 6.1-SNAPSHOT).

In Spoon some steps are missing (e.g. there is no MongoDB input/output step listed), and I can't add a data service to a step (there are no errors; the option just isn't in the list).

I have reinstalled everything (removed .kettle and .pentaho directories as well as whole source and distribution) but it didn't help.

This is what I get at spoon startup:

Code:

16:05:50,304 INFO  [KarafInstance]
*******************************************************************************
*** Karaf Instance Number: 1 at ~/pentaho-kettle/dist/./system/karaf//data1 ***
*** Karaf Port:8801                                                         ***
*** OSGI Service Port:9050                                                  ***
*******************************************************************************
Dec 23, 2015 4:05:51 PM org.apache.karaf.main.Main$KarafLockCallback lockAquired
INFO: Lock acquired. Setting startlevel to 100
2015/12/23 16:05:53 - cfgbuilder - Warning: The configuration parameter [org] is not supported by the default configuration builder for scheme: sftp
Dec 23, 2015 4:05:58 PM org.pentaho.caching.impl.PentahoCacheManagerFactory$RegistrationHandler$1 onSuccess
INFO: New Caching Service registered
16:06:04,009 ERROR [KarafLifecycleListener] The Kettle Karaf Lifycycle Listener failed to execute properly. Releasing lifecycle hold, but some services may be unavailable.

I suspect that the following has something to do with it, since the missing plugins reside somewhere under the karaf/ directory:
Code:

ERROR [KarafLifecycleListener] The Kettle Karaf Lifycycle Listener failed to execute properly. Releasing lifecycle hold, but some services may be unavailable.

It was working just fine week ago.

I will be grateful for any hints.

Greetings.

EDIT:
I am using Ubuntu 15.04

Bug in Text File input at linux?

I develop with Spoon on Windows, but the transformation runs on a Linux system.
I used the 'Text file input' step with 4 fields. My original file has only the 2 fields that I want to read, so I set fields 3 and 4 to a default value.

example.txt
12,34
56,78
90,00

default values for fields 3 and 4: ABC and XYZ

result on the Windows client:
12,34,ABC,XYZ
56,78,ABC,XYZ
90,00,ABC,XYZ

result on Linux:
12,34
56,78
90,00

Is this a bug, or did I misunderstand the Text File Input step?

JSON input with nested structure with same property name at different levels

Hi All
I'm wondering if I have found a bug in parsing JSON data. Here is my sample:
Code:

[
    {
        "productId": 329,
        "name": "Multi Image Enhanced",
        "productGroupId": 65,
        "productGroupName": "Prints",
        "SKU": "110000",
        "SKU_Category": "SHEETPRODUCTS",
        "finishingPackageId": 315,
        "packageTypeGlobalId": "2306a5bd-1008-4241-a114-74c5020d9d48",
        "finishingPackageGlobalId": "5ebaa4c2-cc6b-4f95-a358-81957106b313",
        "stylePackages": [{
            "stylePackageId": 2727,
            "stylePackageGlobalId": "91cc7186-a927-45c0-8552-0fa2c97cdfbf",
            "name": "duet black"
        },
        {
            "stylePackageId": 2728,
            "stylePackageGlobalId": "d2d91a44-3496-4891-825e-ed94cc73bd68",
            "name": "duet white"
        }]
    },
    {
        "productId": 338,
        "name": "Multi Image Enhanced",
        "productGroupId": 65,
        "productGroupName": "Prints",
        "SKU": "80000",
        "SKU_Category": "SHEETPRODUCTS",
        "finishingPackageId": 324,
        "packageTypeGlobalId": "2306a5bd-1008-4241-a114-74c5020d9d48",
        "finishingPackageGlobalId": "e6a4b2c3-b676-4ef0-9edb-155851c6bced",
        "stylePackages": [{
            "stylePackageId": 2727,
            "stylePackageGlobalId": "91cc7186-a927-45c0-8552-0fa2c97cdfbf",
            "name": "duet black"
        },
        {
            "stylePackageId": 2728,
            "stylePackageGlobalId": "d2d91a44-3496-4891-825e-ed94cc73bd68",
            "name": "duet white"
        },
        {
            "stylePackageId": 2729,
            "stylePackageGlobalId": "31bf8147-c44d-4804-a45b-c1ff80ff999a",
            "name": "slated black"
        },
        {
            "stylePackageId": 2730,
            "stylePackageGlobalId": "e8468e13-9ded-496b-a927-c34d38044e0b",
            "name": "slated white"
        }]
    }]

It has a nested structure, and one of the properties is named "name".
Here is how I'm reading it; I get an error when trying to preview the data. Once the "productName" line is removed, everything is back to normal. I know PDI does not handle nested JSON data, but I thought that if $.. is specified, the parser would know which level of "name" is meant.
Attached image: Json.jpg
Any help would be appreciated.

Thanks

Working with the Kafka Consumer and Producer Steps in Kettle

The other day a partner asked how to work with the Kafka Marketplace plugins for Kettle contributed by Ruckus Wireless. I decided to kick the tires and get the steps up and running. I first started off by downloading Kafka; you can find it here: http://kafka.apache.org/. I downloaded version 0.9 of Kafka. I happened to have a Cloudera Quickstart VM [...]
