Channel: Pentaho Community Forums
Viewing all 16689 articles

UNRESOLVED DEPENDENCIES - pentaho-hadoop-shims-hadoop-20 + -mapred - V 5.3.0.0-213

Hi everyone,
I tried to download and build the big-data-plugin from GitHub, to learn something about the libraries used for Hive and so on.
But when I build the project, I get the following error message:

Code:

[ivy:resolve]  ==== pentaho-ivy: tried
[ivy:resolve]    http://ivy-nexus.pentaho.org/content/groups/omni/pentaho/pentaho-hadoop-shims-hadoop-20-mapred/5.3.0.0-213/pentaho-hadoop-shims-hadoop-20-mapred-5.3.0.0-213.ivy.xml
[ivy:resolve]  ==== pentaho-mvn: tried
[ivy:resolve]    http://ivy-nexus.pentaho.org/content/groups/omni/pentaho/pentaho-hadoop-shims-hadoop-20-mapred/5.3.0.0-213/pentaho-hadoop-shims-hadoop-20-mapred-5.3.0.0-213.pom
[ivy:resolve]    -- artifact pentaho#pentaho-hadoop-shims-hadoop-20-mapred;5.3.0.0-213!pentaho-hadoop-shims-hadoop-20-mapred.jar:
[ivy:resolve]    http://ivy-nexus.pentaho.org/content/groups/omni/pentaho/pentaho-hadoop-shims-hadoop-20-mapred/5.3.0.0-213/pentaho-hadoop-shims-hadoop-20-mapred-5.3.0.0-213.jar
[ivy:resolve]  ==== public-maven: tried
[ivy:resolve]    http://ivy-nexus.pentaho.org/content/groups/omni/pentaho/pentaho-hadoop-shims-hadoop-20-mapred/5.3.0.0-213/pentaho-hadoop-shims-hadoop-20-mapred-5.3.0.0-213.pom
[ivy:resolve]    -- artifact pentaho#pentaho-hadoop-shims-hadoop-20-mapred;5.3.0.0-213!pentaho-hadoop-shims-hadoop-20-mapred.jar:
[ivy:resolve]    http://ivy-nexus.pentaho.org/content/groups/omni/pentaho/pentaho-hadoop-shims-hadoop-20-mapred/5.3.0.0-213/pentaho-hadoop-shims-hadoop-20-mapred-5.3.0.0-213.jar
[ivy:resolve]          ::::::::::::::::::::::::::::::::::::::::::::::
[ivy:resolve]          ::          UNRESOLVED DEPENDENCIES        ::
[ivy:resolve]          ::::::::::::::::::::::::::::::::::::::::::::::
[ivy:resolve]          :: pentaho#pentaho-hadoop-shims-hadoop-20;5.3.0.0-213: not found
[ivy:resolve]          :: pentaho#pentaho-hadoop-shims-hadoop-20-mapred;5.3.0.0-213: not found

Code-Source:
- https://github.com/pentaho/big-data-...tags/5.3.0.3-R

Can someone please give me a hint? :)

regards
Stefan

Out of memory - Java heap space -- hard to diagnose what is wrong.

I got an error: java.lang.OutOfMemoryError: Java heap space.

I have a job that fires another job that runs in a loop; that is, the master job keeps cycling the child job until the last JSON page is fetched and parsed.

I'm not sure if it's this "loop" that keeps compounding the memory usage every time it runs (the process threw this error after the 78th loop).



I'm no Java expert, but I've heard "heap space" refers to variables or object instances ... perhaps?

Anyway, I have maybe 12 environment variables, but those just get reset every loop, not accumulated, so that can't be it. Believe it or not, there aren't even any JavaScript variables in the looped job; there are field names ... I'm not sure.

How do I diagnose this problem?

Even if we're not talking about variables, each loop deals with at most 1,000 rows within a JSON document. None of the row fields are large by any means, although the JSON document itself contains full-body email text (for 1,000 emails). This full-body email text is not extracted or parsed, but it must be read while scanning for the other metadata tags, of course. I'm not sure if someone put the King James Bible in there, but would that really throw this heap-space error? EDIT: just checked, and the size of the particular JSON document it failed on is no larger than any other (< 5 MB total).

Also, strangely enough, the transformation step that threw this error was a JSON Input parse of the 1,000-record document that only looks for the very last bit of info ($.record_count) at the very end of the document, not the heavy-lifting 1,000-record parser.

I can do more analysis, but is there a way to view the current heap space/usage at a given time in Pentaho? How would I go about solving this, or assessing what part of the architecture is causing the problem? Yes, I can throw more memory at it and fix it for now, but I'm interested in the cause nonetheless.
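On viewing heap usage: you can attach jvisualvm or jconsole (shipped with the JDK) to the running Spoon/Kitchen process to watch the heap live, and raise the cap via the -Xmx JVM option in the launch script. As a quick in-process probe, the standard Runtime API reports current usage; a minimal standalone sketch (the class name HeapProbe is made up, and the same calls could be dropped into a User Defined Java Class step):

```java
public class HeapProbe {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // totalMemory() is the heap currently allocated; maxMemory() reflects -Xmx
        long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
        long maxMb  = rt.maxMemory() / (1024 * 1024);
        // An OutOfMemoryError means used heap approached the max cap
        System.out.println("Heap used: " + usedMb + " MB of " + maxMb + " MB max");
    }
}
```

If usage climbs steadily across loop iterations and never falls back after a child-job run, that points at something accumulating per iteration rather than at any single large document.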

Kettle Plugin development

I am still trying to figure out an effective way to develop a new Kettle plugin:

1. create a dialog class with the UI elements, such as check boxes and combo boxes;

2. create a data class from the base class;

3. create a meta class from the base class;

4. create a step class with the processRow() method.

Here are my questions:

1. If the user selects a value from the combo box, how can I access it from the step class?
2. What should we put in the meta class?
3. What is the purpose of the data class?

Thanks; can anyone shed some light on these?
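In outline: the dialog writes the user's selection into the meta class, which Kettle serializes with the transformation; in the step's processRow() you cast the StepMetaInterface argument back to your meta class and call its getter. The data class holds transient per-execution state (row metadata, counters, open connections) that must not be serialized. A minimal sketch of that split, with hypothetical names (MyStepMeta, getSelectedMode) and without the Kettle jars, so it is only an illustration of the pattern:

```java
// Sketch of the Kettle meta/data split (no Kettle dependencies; names are made up).
class MyStepMeta {
    // Persisted with the transformation XML; the dialog's OK button writes here.
    private String selectedMode = "append";
    String getSelectedMode() { return selectedMode; }
    void setSelectedMode(String m) { selectedMode = m; }
}

class MyStepData {
    // Transient runtime state; a fresh instance is created for each execution.
    int rowsProcessed;
}

public class MyStep {
    public static void main(String[] args) {
        MyStepMeta meta = new MyStepMeta();   // in PDI, the dialog populates this
        meta.setSelectedMode("overwrite");
        MyStepData data = new MyStepData();
        // In a real step, processRow(smi, sdi) would cast smi/sdi back to these types.
        System.out.println("mode=" + meta.getSelectedMode()
                + ", rows=" + data.rowsProcessed);
    }
}
```

In a real plugin the cast happens at the top of processRow(): `MyStepMeta meta = (MyStepMeta) smi;` — that is how the combo-box value reaches the step logic.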

Execute SQL statement does not escape special character "--"

Hi there--

I'm using 'Execute SQL statement' to do a custom log table update (the out-of-the-box Pentaho logging is not exactly what I need).

I'm using 'Execute SQL statement' as opposed to 'Update' because the former allows me to update based on a dynamic environment variable ${LOG_ID}. Actually, I guess I could use 'Update' with JavaScript plus an 'Add constants' step feeding a field to do the same thing, but that might be a little harder for other people to understand.


Nevertheless, I'm passing log_text as a parameter to update a SQL field.

Many characters like quotes are escaped, but the MySQL line-comment syntax "--" is not. It causes the rest of the line to be dropped from the SQL statement, cutting off text in the parameter itself, but also cutting out the commas between it and any other parameters or SQL text on the line, often creating an invalid statement.

The Pentaho log text does occasionally contain the sequence "-->" (some sort of arrow), which obviously contains the MySQL "--" comment marker.


How can I fix this myself?

Should I simply add another JavaScript step that takes the field log_text and replaces "--" with something else? It's not an important sequence as far as log_text goes.

Actually, it's quite bizarre: at least in the SQL client I'm using, the sequence "--", when contained within quotes, does not start a comment. Yet my Pentaho log looks like it was cut off at the first instance of "--", and I have quoted strings turned on. I also just checked INSTR() for "--" across all log text captured so far: 0 every time.
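If you do go the replacement route, a small sketch of the idea follows (shown in Java; the equivalent one-liner works in a Modified Java Script Value step). The helper name sanitizeForSql is hypothetical; inserting a space breaks the "--" token without making the log noticeably less readable:

```java
public class LogSanitizer {
    // Replace the MySQL line-comment token so it cannot truncate the statement.
    // "- -" keeps the text legible while breaking the "--" sequence.
    static String sanitizeForSql(String logText) {
        return logText.replace("--", "- -");
    }

    public static void main(String[] args) {
        // prints: step done - -> next step
        System.out.println(sanitizeForSql("step done --> next step"));
    }
}
```

Note that the more robust fix, where available, is a parameterized/prepared statement, since comment tokens inside a bound parameter are never interpreted as SQL.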

What are the plans to migrate Pentaho BI to Java 8?

Hello, everybody!

As Java 1.7 is entering its end-of-life stage, our company is outlawing its usage and forcing migration to Java 8. It seems to me that the Pentaho BI Server is incompatible with Java 8: I see a lot of plugin registration exceptions on server startup. We were using Pentaho BI for our dashboarding and BI needs, but without Java 8 support it is effectively becoming useless to us.
That said, I wonder if there are any plans to add Java 8 support to the Community Edition, and when that can be expected to happen. Maybe there are some workarounds for this issue?

I see a PDI Jira regarding Java 8, but there is nothing I can find for Pentaho BI Server.

Any info will be highly appreciated!

Rita

Error Variable?

I have an odd scenario where I am required to call a stored procedure in a MySQL 5.6 database, but the availability of that procedure is unpredictable. I would like to trap the error received when the attempt to call the stored procedure fails (using the 'Execute SQL statements' step) and put the error message into a variable for later use/evaluation. Is this possible?

Thanks!

data in downloaded pdf file didnt appear

Hello,

I have a problem with a PDF generated with PRD. All the data appears when I print the PDF file, but when I download and save it, some of the data in the PDF is missing. Why does this happen? I already checked that none of the elements overlap.

I appreciate all helps, thanks!

Create Data Audit Report from Flat Files

Hi all,


I have a number of flat files (CSV and TXT) and I want to create an Excel data audit report detailing audit results such as total records, field types, formats, min/max lengths, nulls, etc.


Does anyone have any good examples they can share or advice on the best way of doing this through Pentaho (version 5.3.0)?


I'm also looking to create data validation for these files too.
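In PDI itself this kind of profiling is typically a CSV/Text file input step feeding a Group By (or Memory Group By) step with aggregations such as count, min, max and "number of null values", written out with a Microsoft Excel Output step. For intuition, here is a minimal standalone sketch of the per-field statistics such a report collects; the sample rows and the FieldAudit name are made up for illustration:

```java
import java.util.ArrayList;
import java.util.List;

public class FieldAudit {
    // Returns one "header: maxLen=<n> nulls=<n>" summary line per column
    // for simple comma-separated rows (no quoting handled in this sketch).
    static List<String> audit(String[] rows) {
        String[] headers = rows[0].split(",", -1);
        int[] maxLen = new int[headers.length];
        int[] nulls  = new int[headers.length];
        for (int r = 1; r < rows.length; r++) {
            String[] f = rows[r].split(",", -1);  // -1 keeps trailing empty fields
            for (int c = 0; c < headers.length; c++) {
                if (f[c].isEmpty()) nulls[c]++;
                maxLen[c] = Math.max(maxLen[c], f[c].length());
            }
        }
        List<String> out = new ArrayList<>();
        for (int c = 0; c < headers.length; c++)
            out.add(headers[c] + ": maxLen=" + maxLen[c] + " nulls=" + nulls[c]);
        return out;
    }

    public static void main(String[] args) {
        String[] sample = { "id,name,city", "1,Dan,", "2,,London" };
        for (String line : audit(sample)) System.out.println(line);
    }
}
```

The same counters extend naturally to type detection and format checks, which is also where the data-validation rules would hook in.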


Many thanks for your time,
Dan.

Align csv columns

hi

I have used the Row Denormaliser to convert rows to columns, but the output CSV is not correctly formatted.
Please help me; I am new to this tool.

PFA: I have attached a screenshot of the actual output, but I need it in the expected output format.

thanks & Regards,
Hemavathi
Attached Images

How to split HTML table rows

Hi

I need to split the HTML table rows. The table is created by fetching data from XMLs. I need a separate table for each XML; right now all the data gets appended to a single table.

Thanks
Appurv
Attached Images

Move XML Join output's to a single Text Output File

Hi Experts,
I have to move two XML Join outputs with different structures to a single text file output.
The two outputs from XML Join are two HTML tables with different numbers of columns. I tried using the Append Streams step, but I am not getting proper output.

Can anybody help?

Regards,
Vivek

Unparseable Date...couldn't parse field errors

Hello - I am new to Pentaho and have some questions.
We receive an encrypted .csv file through SFTP from an outside organization, and the job is set up to decrypt and parse the data; when the job completes successfully, it places the decrypted file in a folder.

The job has yet to run successfully and returns the following errors almost immediately - this is just a short clip of the errors:

Error converting line

Couldn't parse field [Date] with value [Delivery,Retail], format [MM/dd/yyyy] on data row [13780].

BLDG_OCCU_DATE String : couldn't convert string [Delivery,Retail] to a date using format [MM/dd/yyyy] on offset location 0
Unparseable date: "Delivery,Retail"


I read this to mean that the file sent to us has formatting errors. When I returned the file with this message, I was told by the outside organization that it is a problem with our decryption. I do not see any indication of that in these messages; however, as I said, I am new to this.
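One way to see that this points at the data rather than the decryption: if decryption had failed, the field would contain binary garbage, not the readable text "Delivery,Retail". The error means that on row 13780 the value that landed in the [Date] column is simply not a date, which usually indicates an unescaped comma shifting the fields on that row. The same failure can be reproduced standalone with the JDK's own date parser:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;

public class DateCheck {
    public static void main(String[] args) {
        SimpleDateFormat fmt = new SimpleDateFormat("MM/dd/yyyy");
        try {
            // The value PDI found where it expected a MM/dd/yyyy date
            fmt.parse("Delivery,Retail");
        } catch (ParseException e) {
            // Same diagnosis PDI reports: readable text, just not a date
            System.out.println("Unparseable date: \"Delivery,Retail\"");
        }
    }
}
```

A useful check is to open the decrypted file at row 13780 and count the commas: a stray comma inside an unquoted text field earlier in the row would push "Delivery,Retail" into the date column.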

Thank you for any assistance.

Text Colour formula not working when report is exported to Excel

Hi all,

I have a report where one column has its text colour set to red or green by a very simple formula:
Code:

=IF([PROFIT]>=0;"#009933";"red")
This works fine when I run the report as HTML or PDF. However, when I export the report to Excel (or Excel 2007), the entire column of text comes out in a single colour. It looks like the formula only runs against the first row of the report when exported to Excel. This happens when run from both the designer and the BI Server User Console.

Any suggestions on how I can make this work in excel?

Thanks in advance,
Nathan.

Pentaho Report Designer : 5.3.0.0-213
Pentaho User Console : 5.3.0.0.213

Aggregate table column not recognized

I have a fact table named FT_Sales and an aggregate table named agg_trmcscsb_ft_sales with 16 columns. Mondrian recognizes 14 of the columns in that aggregate table but not the other two. I am following the same naming convention for all the columns: ${Table_Name}_{Level_Name}

The Pentaho error is as follows.

[mondrian.rolap.aggmatcher.AggTableManager] Recognizer.checkUnusedColumns: Candidate aggregate table 'agg_trmcscsb_ft_sales' for fact table 'FT_Sales' has a column 'Dim_Market_EPOS_Market_Name' with unknown usage.

Please give me some suggestions for solving this problem.
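Mondrian's default recognizer matches aggregate-table columns purely by naming rules, and any column it cannot classify produces exactly this "unknown usage" warning. One possible workaround, assuming the column really is not needed by the cube, is to declare the aggregate table explicitly in the cube's schema and tell Mondrian to ignore it. The element names below are standard Mondrian 3.x schema elements, but the fact_count column name and the omitted mappings are placeholders you would fill in from your own schema:

```xml
<Table name="FT_Sales">
  <AggName name="agg_trmcscsb_ft_sales">
    <!-- "fact_count" is the conventional name; use your table's actual column -->
    <AggFactCount column="fact_count"/>
    <AggIgnoreColumn column="Dim_Market_EPOS_Market_Name"/>
    <!-- plus AggMeasure / AggLevel mappings for the remaining columns -->
  </AggName>
</Table>
```

An explicit AggName declaration bypasses the name-pattern recognizer entirely, so it also makes the matching independent of the ${Table_Name}_{Level_Name} convention.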

Thanks

URL of CGG for PRD not load

Hi.

I have a problem with the URL of a chart in a dashboard; I believe the problem is in how I passed the parameters. The URL follows (parameter names are in bold):
http://localhost:8080/pentaho/plugin...password&parampTabOrigem=CadCliente&parampUnidsPie=[Unidade.Unidade].[Todas_Un]&parampUnidsRows=[CadCliente].[2014].Children&parampWherePie=CrossJoin({[periodo].[2014]},{[Atividade].[Todas_At]})

These parameters are in the MDX query used by the chart. Follows the query with the appropriate parameters:
with
member [Measures].[ComCompras] as 'Sum([${pTabOrigem}].CurrentMember.Children, IIf(([Measures].[Valor] > 0), 1, 0))'
member [Measures].[Cadastros] as 'Count([${pTabOrigem}].CurrentMember.Children)'
select {Hierarchize({${pUnidsPie}})} ON COLUMNS,
CrossJoin({[Measures].[Cadastros], [Measures].[ComCompras]},{${pUnidsRows}}) ON ROWS
from [Vendas]
${pWherePie}

Am I sending the parameters correctly?

---EDITED---

Oops, I forgot to say what happens... :o
When I run it in a browser (Mozilla or Chrome) nothing happens; the page is blank.
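One thing to check before the parameter names themselves: characters such as [, ], { and } are not legal unencoded in a URL query string, and some servlet containers reject or mangle them silently, which can produce exactly a blank page. Each MDX parameter value should be percent-encoded before the CGG URL is assembled. A small sketch of what that looks like for one of your values:

```java
import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

public class ParamEncode {
    public static void main(String[] args) throws UnsupportedEncodingException {
        String pUnidsPie = "[Unidade.Unidade].[Todas_Un]";
        // Brackets become %5B / %5D, making the value safe in a query string
        String encoded = URLEncoder.encode(pUnidsPie, "UTF-8");
        System.out.println("parampUnidsPie=" + encoded);
        // prints: parampUnidsPie=%5BUnidade.Unidade%5D.%5BTodas_Un%5D
    }
}
```

If the dashboard builds the URL in JavaScript, encodeURIComponent() does the same job on the browser side.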

Display time in Eastern Time

In my reports I need to display the time in Eastern Time.
Is it possible?
How do I do that?
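At the JVM level this is purely a formatting concern: keep the stored timestamp unchanged and set the formatter's time zone when rendering. In PRD the report-side mechanism varies by version (locale/time-zone configuration or a formatting expression), so treat this standalone Java equivalent as the underlying idea; America/New_York switches between EST and EDT automatically, unlike a fixed "EST" offset:

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class EasternTime {
    public static void main(String[] args) {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss z");
        // Render in US Eastern Time regardless of the server's default zone
        fmt.setTimeZone(TimeZone.getTimeZone("America/New_York"));
        System.out.println(fmt.format(new Date()));
    }
}
```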
thanks

multiselect result as a parameter for a PDI datasource

I have a multiselect selector, and I want to pass the selection as a parameter to a PDI datasource.
Currently, I get an error because PDI receives an object; in the error message I see something like (Ljava.lang.Object;@79b6f80d).

Is there a way to accomplish what I need?
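That (Ljava.lang.Object;@79b6f80d text is just Java's default toString() for an Object[] array: the multiselect arrives as an array, and somewhere it gets stringified as-is instead of being flattened. The usual workaround is to join the selection into a single delimited string before it reaches the PDI parameter (for example in a dashboard pre-execution function) and split it again inside the transformation. The Java-side effect, sketched with made-up values:

```java
public class MultiselectParam {
    public static void main(String[] args) {
        Object[] selection = { "A", "B", "C" };
        // Default array toString: type tag plus hash, e.g. [Ljava.lang.Object;@1b6d3586
        System.out.println(selection.toString());
        // What the PDI parameter actually needs: one delimited string
        String joined = String.join(",", "A", "B", "C");
        System.out.println(joined); // prints: A,B,C
    }
}
```

Inside the transformation, a Split field to rows step (or an IN (...) clause built from the string) turns the joined value back into individual selections.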
thanks

Connect LINKED MS SQL Server

I currently have a linked server set up within Server Objects on SQL Server 2008; I used SSMS to set it up. How do I connect Pentaho Spoon to the database on this server? Attached is a screenshot from SSMS. Any help is greatly appreciated!
Attached Images

Jira 2296

User folder is created case sensitive, on first login

When a user logs on to the PUC for the first time, a folder named after the username is created (e.g. /home/user1). Unfortunately, the folder name is created case-sensitively.

Example:

1. Login --> User types in "User1". This results in the folder "/home/User1".

2. Login --> User types in "user1". This results in a new folder "/home/user1".

The same user now has multiple folders.

Does anyone have an idea which setting can correct this? Thank you in advance.

BTW: We are using LDAP authentication (OpenLDAP).