Channel: Pentaho Community Forums

libext dir missing in PDI 5.0.1

Hi,

I downloaded the Community Edition of PDI 5.0.1 and unzipped it on my PC. It looks like the libext folder under the data-integration directory is missing.
Any ideas where I should copy my database JAR files so that I can connect to various databases?

Thanks

Andrew

Error HTTP Status 500 after MarketPlace Install

I can't create any dashboards after installing through the Marketplace on BI Server 4.8. CDE throws an HTTP 500 error, and CDB, CDC and CDV all show "Could not load dashboard: null".

Any help would be appreciated.

Mike

Increase MySQL output to 80K rows/second in Pentaho Data Integration

Salesforce access using OAuth credentials

Hi,

I need to access Salesforce data using OAuth credentials [the token just isn't available]. I was able to make a change to SalesforceConnection.java (which I'm happy to share), but a supported solution would be even better.
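For context, the token exchange the connector would need to perform first looks roughly like the sketch below: the standard Salesforce OAuth 2.0 username-password flow against the login.salesforce.com token endpoint. The consumer key/secret and credentials are placeholders, and wiring the resulting access token into SalesforceConnection.java is exactly the part that currently needs the local patch.

Code:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class SalesforceOAuthLogin {
    public static void main(String[] args) throws Exception {
        // OAuth 2.0 username-password flow against the standard Salesforce token endpoint.
        // All credentials below are placeholders.
        String body = "grant_type=password"
                + "&client_id=" + URLEncoder.encode("YOUR_CONSUMER_KEY", "UTF-8")
                + "&client_secret=" + URLEncoder.encode("YOUR_CONSUMER_SECRET", "UTF-8")
                + "&username=" + URLEncoder.encode("user@example.com", "UTF-8")
                + "&password=" + URLEncoder.encode("passwordPlusSecurityToken", "UTF-8");

        HttpURLConnection conn = (HttpURLConnection)
                new URL("https://login.salesforce.com/services/oauth2/token").openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        try (OutputStream os = conn.getOutputStream()) {
            os.write(body.getBytes("UTF-8"));
        }

        StringBuilder resp = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            for (String line; (line = in.readLine()) != null; ) {
                resp.append(line);
            }
        }

        // The JSON response contains "access_token" and "instance_url"; a real
        // integration would parse it with a JSON library instead of printing it.
        System.out.println(resp);
    }
}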

Thanks,

Ranjan Bagchi

Relational DenseInstance Generation

I'm having some trouble working out the Java code to produce a relational DenseInstance. The documentation states that "If an attribute is nominal (or a string or relational), the stored value is the index of the corresponding nominal (or string or relational) value in the attribute's definition.", but I can't get my head around what that actually means, or how it translates to Java code.

Using code from the DenseInstance main function I am able to generate a non-relational instance.

An example of the kind of instance I want in .arff format is:

Code:

@relation testing


@attribute set {a,b}
@attribute bag relational
    @attribute x numeric
    @attribute y numeric
    @attribute z numeric
@end bag
@attribute class {same,diff}


@data
a,"1,1,1\n1,1,1\n2,2,2\n2,2,2\n6,6,6\n4,4,4",same
a,"1,2,2\n1,6,8\n3,2,6\n2,1,3\n0,6,8\n9,0,5",diff
a,"2,2,2\n4,4,4\n7,7,7\n8,8,8\n3,3,3\n4,4,4",same
b,"2,7,5\n3,4,2\n6,7,3\n9,7,8\n0,2,2\n7,6,2",diff
b,"4,5,3\n6,4,7\n6,4,9\n8,6,4\n8,4,6\n8,4,2",diff
a,"5,3,9\n0,1,6\n8,2,4\n0,6,4\n8,1,1\n0,8,2",diff
b,"4,4,4\n8,8,8\n0,0,0\n6,6,6\n2,2,2\n9,9,9",same

I'm aware that I can just import an .arff file, and have it automatically generated, but I need to be able to generate instances from within Java for this project.
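For reference, here is a minimal sketch of how this could look (assuming Weka 3.7+; only the first bag from the ARFF above is hard-coded as an example). The relation-valued attribute is built from a header-only Instances object, each bag is an Instances built against that header, and Attribute.addRelation() returns the index that gets stored in the outer DenseInstance, which is exactly what the documentation means by "the stored value is the index".

Code:

import java.util.ArrayList;
import java.util.Arrays;

import weka.core.Attribute;
import weka.core.DenseInstance;
import weka.core.Instances;

public class RelationalInstanceDemo {
    public static void main(String[] args) {
        // Header for the inner "bag" relation (x, y, z numeric).
        ArrayList<Attribute> bagAtts = new ArrayList<Attribute>();
        bagAtts.add(new Attribute("x"));
        bagAtts.add(new Attribute("y"));
        bagAtts.add(new Attribute("z"));
        Instances bagHeader = new Instances("bag", bagAtts, 0);

        // Outer attributes: nominal "set", relation-valued "bag", nominal "class".
        ArrayList<Attribute> atts = new ArrayList<Attribute>();
        atts.add(new Attribute("set", new ArrayList<String>(Arrays.asList("a", "b"))));
        atts.add(new Attribute("bag", bagHeader));
        atts.add(new Attribute("class", new ArrayList<String>(Arrays.asList("same", "diff"))));

        Instances data = new Instances("testing", atts, 0);
        data.setClassIndex(data.numAttributes() - 1);

        // Build one bag: each row of the quoted ARFF string becomes an inner DenseInstance.
        Instances bag = new Instances(bagHeader, 0);
        double[][] rows = { {1, 1, 1}, {1, 1, 1}, {2, 2, 2}, {2, 2, 2}, {6, 6, 6}, {4, 4, 4} };
        for (double[] row : rows) {
            bag.add(new DenseInstance(1.0, row));
        }

        // The outer instance stores indices: the nominal value index, the index
        // returned by addRelation(), and the class value index.
        double[] vals = new double[data.numAttributes()];
        vals[0] = data.attribute(0).indexOfValue("a");
        vals[1] = data.attribute(1).addRelation(bag);
        vals[2] = data.attribute(2).indexOfValue("same");
        data.add(new DenseInstance(1.0, vals));

        System.out.println(data);
    }
}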

Any help in getting Java code to generate this is much appreciated.

Process EDI files

Hi everyone.

I'm creating a Kettle transformation to parse an SAP HR IDoc file (not XML). I already have a Java class with the logic to search for and extract all the person data I need. I would like to read the file and create some groups, the way I do in Java with regular expressions.

Below is the raw version of a received IDoc:

Code:


EDI_DC40_U8000000000000757798700 3012  HRMD_A07                                                    HRMD_A                                          SAPDOD    LS  ADMCLNT800                                                                                          CAIDM    LS  IDMCLNT800                                                                                          20130914201937                                                                                                                20130914201900     
E2PLOGI001                    80000000000007577980000010000000201P 00000010 I                                                       
E2PITYP001                    80000000000007577980000020000010301P 000000100000    1800010199991231
E2P0000001                    800000000000075779800000300000204000000100000      999912312002010100020030507WEISSANJA                  01  31   
E2PITYP001                    80000000000007577980000040000010301P 000000100001    1800010199991231
E2P0001002                    800000000000075779800000500000404000000100001      999912312002010100020030507WEISSANJA                  2000200 1GC200              0002    G1            500013575000521450016575                BOND JAMES                    Herr James Bond                        S 200 1000                                                                           
E2PITYP001                    80000000000007577980000060000010301P 000000100002    1800010199991231X
E2P0002001                    800000000000075779800000700000604000000100002      999912311967052200020030507WEISSANJA                            Bond                                              James                                                                                                                        001119670522                              GB    D  0000000000    08AA232132B          00000000                                                                                                                                                                                                19670522BOND                    JAMES                    DEBond                                                                                                                    James                                                                                                                                                                                                                                         
E2Q0002002                    800000000000075779800000800000705000000100002      9999123119670522000                                                                                                                                                                                                                                                                                                                                                 
E2PITYP001                    80000000000007577980000090000010301P 000000100003    1800010199991231
E2P0003001                    800000000000075779800001000000904000000100003      999912311800010100020030507WEISSANJA                  X000000002002010100000000 000000000000000000000000000000002002010100000000  20030507162740                                      2002010100000000   
E2PITYP001                    80000000000007577980000110000010301P 0000001000061  1800010199991231
E2P0006003                    8000000000000757798000012000011040000001000061      999912312002010100020030507WEISSANJA                  1                            1213 Test House Road                                                            TW18 8HD  GB              0                                        000000        0                                                                                                                                                                                                                                                  0                                        1213 Test House Road                                                                                                                                                                                                                                                                                                                                                                                                                   
E2PITYP001                    80000000000007577980000130000010301P 000000100007    1800010199991231
E2P0007001                    800000000000075779800001400001304000000100007      999912312002010100020030507WEISSANJA                  STD-WK  0100.00 169.00 39.00  8.00  5.00  2028.00  0.00  0.00  0.00  0.00  0.00  0.00  0.00    0.00             
E2PITYP001                    80000000000007577980000150000010301P 00000010010500011800010199991231
E2P0105002                    8000000000000757798000016000015040000001001050001  999912312002010100020030507WEISSANJA                  0001VELAPPUS                                                                                                                                                                                                                                                                         
E2PITYP001                    80000000000007577980000170000010301P 00000010010500101800010199991231
E2P0105002                    8000000000000757798000018000017040000001001050010  999912312000010100020000925BONIN                      0010VELAPPUS                      James.Bond@ides.com                                                                                                                                                                                                                                 
E2PITYP001                    80000000000007577980000190000010301P 000000100302    1800010199991231
E2P0302001                    800000000000075779800002000001904000000100302      200201012002010100020030507WEISSANJA                  01     
E2PITYP001                    80000000000007577980000210000010301P 000000101001A2091800010199991231
E2P1001001                    80000000000007577980000220000210480001P 000000101001A2091  2002010199991231CP0000602300020030507WEISSANJA      00000000CP00006023                                    0.00                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             
E2P1001001                    80000000000007577980000230000210480001P 000000101001A2091  2002010199991231CP0001295000020030507WEISSANJA      00000000CP00012950                                    0.00                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             
E2PITYP001                    80000000000007577980000240000010301P 000000101001B0081800010199991231
E2P1001001                    80000000000007577980000250000240480001P 000000101001B0081  2002010199991231S 5000521400020030507WEISSANJA      00000000S 50005214                                    100.00

Here is how the file is organized:

idoc-sap.jpg

I know how to extract information from lines, but I have to find a way to relate the first column (the segment name) to the rest of the line and look for the information I want row by row.
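As a starting point for that row-by-row logic, here is a rough sketch of the grouping idea in plain Java; the file name is a placeholder, and splitting the segment name off at the first space is an assumption rather than real SAP segment metadata.

Code:

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class IdocSegmentGrouper {
    public static void main(String[] args) throws Exception {
        // Map of segment name (E2PLOGI001, E2P0002001, ...) to the raw records of that type.
        Map<String, List<String>> segments = new LinkedHashMap<String, List<String>>();

        try (BufferedReader reader = new BufferedReader(new FileReader("idoc.txt"))) { // placeholder file name
            for (String line; (line = reader.readLine()) != null; ) {
                if (line.trim().isEmpty()) {
                    continue;
                }
                // The segment name is the first whitespace-delimited token of each record;
                // everything after it is treated as the fixed-width payload.
                int cut = line.indexOf(' ');
                String segment = (cut > 0) ? line.substring(0, cut) : line.trim();
                String payload = (cut > 0) ? line.substring(cut).trim() : "";

                List<String> records = segments.get(segment);
                if (records == null) {
                    records = new ArrayList<String>();
                    segments.put(segment, records);
                }
                records.add(payload);
            }
        }

        for (Map.Entry<String, List<String>> e : segments.entrySet()) {
            System.out.println(e.getKey() + " -> " + e.getValue().size() + " record(s)");
        }
    }
}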

Any clue will be very helpful, as I'm pretty new to Kettle design.

Thanks.

Log out user from PUC instead of opening login dialog inside PUC on session timeout

Hi,

By default, the PUC opens a login dialog when the session times out. Based on our design requirement, I need the system to log the user out of the PUC and take them to the login page instead. Can someone advise how to get this done?

Thank you.

Pentaho 5.0.1 - User Management Issue

We have configured Pentaho 5.0.1 with a MySQL database.

In Pentaho 4.8, when we create a user it is stored in the hibernate database, which has USER and USER_SETTING tables.

But we cannot find any such data in the Pentaho 5.0.1 hibernate database.

Could someone tell me how users are managed in Pentaho 5.0.1?

Regards,
Chintan

Where is user data stored in Pentaho BI 5.0.1?

Hi,

We installed Pentaho BI 5.0.1 on our Oracle Linux server.
We connected it to MySQL successfully.
Now when we create a user as Administrator, the user is created but I cannot find it in any MySQL database table.
In Pentaho 4.8 it was stored in the "hibernate" database, in the table named "USER", but it is not found in 5.0.1.

So where can we find it? Any solution?

Thanks in advance !!!

Load external html menu in dashboard

Hello,

I am trying to load an external HTML file that contains my menu for all my dashboards, so I only have to edit ONE file to update all dashboard pages with any new or changed menu bar content.

I have tried using PHP and JavaScript, however neither seems to work.

For PHP, I have tried this in a CDE Row > HTML element: <?php include "location of file"; ?>

The above code works in a document saved as HTML, but I cannot get it working in Pentaho.

Any ideas how I can solve this issue?

Thanks

Facing fatal error: unable to create Java Virtual Machine

Hi,

I have a 32-bit Windows 2003 server installed on a virtual machine, with the following configuration:
1. JDK 1.7.0_51; -Xmx2048m
2. 4 GB of physical memory
3. 2 CPUs

I needed to assign more memory to Data Integration, so in Spoon.bat I changed the configuration to:
PENTAHO_DI_JAVA_OPTIONS="-Xmx2048m" "-XX:MaxPermSize=2048m"
but after running Spoon.bat a dialog popped up with a "Fatal Error: unable to create Java Virtual Machine" message.

What is the problem, and how can I solve it?

Thanks in advance,

P.S. I should mention that I have done this on Windows 7, even with higher values, and everything works fine.
P.S. One more thing: the maximum value that works here is 800m.

[DI 4.2.0] Call DB Procedure ISSUE

Hi guys,

I'm trying to call a DB procedure using the "Call DB Procedure" step (strange, huh? :D)

The fact is: the procedure starts to run but, because it needs about 8-9 hours to finish, Kitchen doesn't wait long enough, so the DB sees no one listening and decides to stop the procedure...

The strange thing is that in the log I DON'T SEE ANY error... this is the log:

INFO 11-02 18:54:22,536 - Kitchen - Start of run.
INFO 11-02 18:54:22,554 - RepositoriesMeta - Reading repositories XML file: /home/user/.kettle/repositories.xml
INFO 11-02 18:54:26,281 - MAIN_JOB_MASTER_REPORT - Start of job execution
INFO 11-02 18:54:26,287 - MAIN_JOB_MASTER_REPORT - Starting entry [TRF_CALL_DB_PROCEDURE]
INFO 11-02 18:54:26,295 - TRF_CALL_DB_PROCEDURE - Loading transformation from repository [TRF_CALL_DB_PROCEDURE] in directory [/MASTER_REPORT/TRASF]
INFO 11-02 18:54:28,705 - TRF_CALL_DB_PROCEDURE - Dispatching started for transformation [TRF_CALL_DB_PROCEDURE]


I know whether the procedure is running or not because I can see it in the DB process list.

QUESTION: is there any timeout parameter that I can set?

N.B. Obviously I launch that job from the terminal :D

Thanks, Dev.


------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

SOLVED: OK guys, there was a bad "keep alive" setting on the firewall!!

ReportValidationFailed on BI server

Hi,

I have published a report on the BI server and it says "Report validation failed". I am using prd-ce-5.0.1-stable and biserver-ce-5.0.1-stable. The catalina.out log details are as follows:

Code:

Caused by: java.sql.SQLException: Unable to load Hive Server 2 JDBC driver for the currently active Hadoop configuration
    at org.apache.hive.jdbc.HiveDriver.getActiveDriver(HiveDriver.java:107)
    at org.apache.hive.jdbc.HiveDriver.callWithActiveDriver(HiveDriver.java:121)
    at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:132)
    at org.pentaho.reporting.engine.classic.core.modules.misc.datafactory.sql.DriverConnectionProvider.createConnection(DriverConnectionProvider.java:138)
    at org.pentaho.reporting.engine.classic.core.modules.misc.datafactory.sql.SimpleSQLReportDataFactory.getConnection(SimpleSQLReportDataFactory.java:144)
    at org.pentaho.reporting.engine.classic.core.modules.misc.datafactory.sql.SimpleSQLReportDataFactory.queryData(SimpleSQLReportDataFactory.java:195)
    ... 84 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.GeneratedMethodAccessor116.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hive.jdbc.HiveDriver.getActiveDriver(HiveDriver.java:105)
    ... 89 more
Caused by: java.lang.RuntimeException: Unable to load JDBC driver of type: hive2
    at org.pentaho.hadoop.shim.common.CommonHadoopShim.getJdbcDriver(CommonHadoopShim.java:109)
    ... 93 more
Caused by: java.lang.Exception: JDBC driver of type 'hive2' not supported
    at org.pentaho.hadoop.shim.common.CommonHadoopShim.getJdbcDriver(CommonHadoopShim.java:104)
    ... 93 more

I copied the Hadoop shim jars and the other hsql and h2 jars from prd/lib to biserver/tomcat/lib, but no luck. Any clues? My report includes a bar chart (although I think that matters least). Thanks in advance.

Pentaho Community has a new home

New Pentaho Community Websites

http://community.pentaho.com

No more under construction messages. No more 2008-ish style websites. The Pentaho Community now has new, entirely redesigned websites.

So visit us at http://community.pentaho.com
New visuals

We chose to adopt a visual layout that complements the main pentaho.com website; the formal blue and white tone of the enterprise websites has been replaced by a more informal green / gray color scheme.
Same good-ol' projects

Kettle Homepage

All the big projects had a face-lift. That's the case with:





And a few new ones

Language Packs

We added some of the more recent projects that we felt deserved a bit more of a highlight. That is the case for:



Sparkl Website

Contribute

Our UX team did a great job designing the websites, obviously basing them on the old ones. And as you may guess, that's not an easy task given the highly technical content in them.

What I mean is that, despite our extensive reviews (I'm lying, they were not that extensive :p), you may find some outdated or even wrong information. If that happens to be the case, let us know!

Or, better than letting us know, why not submit a fix? Yeah, I guess it won't surprise you much to learn that the websites are entirely stored on GitHub. At least we can't be accused of not trying to achieve some consistency in the way we engage with the community :)

Cheers!

Logging in when using Kettle API

We have embedded Kettle within our web application and execute transformations via a Quartz scheduler. This works like a charm. What we are unable to find is the output or result of the transformations that have been executed. We are using the Kettle 4.3.0 stable API.
I have seen some examples using LogWriter, but without much success. Can someone provide some examples of how to output the Result data into a log file?
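For what it's worth, here is a minimal sketch of pulling the run summary out of the Result object after a transformation finishes and writing it to a plain log file, assuming the Kettle 4.x API; the .ktr path and log file name are placeholders.

Code:

import java.io.FileWriter;
import java.io.PrintWriter;

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.core.Result;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class TransResultLogger {
    public static void main(String[] args) throws Exception {
        KettleEnvironment.init();                                  // initialize Kettle once per JVM

        TransMeta meta = new TransMeta("my_transformation.ktr");   // placeholder path
        Trans trans = new Trans(meta);
        trans.execute(null);
        trans.waitUntilFinished();

        Result result = trans.getResult();                         // summary of the finished run
        try (PrintWriter out = new PrintWriter(new FileWriter("trans_result.log", true))) {
            out.println("Transformation: " + meta.getName());
            out.println("Errors        : " + result.getNrErrors());
            out.println("Lines read    : " + result.getNrLinesRead());
            out.println("Lines written : " + result.getNrLinesWritten());
            out.println("Success       : " + (result.getNrErrors() == 0));
        }
    }
}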

Timeline and Widescreen in CDE

Hello forum,
I have two simple questions about CDE.

1. Does anybody know a chart type or an add-in for a timeline, like this:
timeline.jpg

2. CDE uses the Blueprint CSS framework.
Is it possible to change the span count (for example 50 instead of 24)?
I have to create a widescreen dashboard, and with CDE the maximum width is 960px. But I need more :)

Thank you for your help.
;)

Pentaho Reporting

How do I save the parameter filter values selected by the end user in the BI server?

Calling a job against Carte via URL

Does anyone know where the Carte config file should reside when using YAJSW? I see five config files (e.g. carte-config-{port}.xml and carte-config-master-{port}.xml) listed under the PWD directory, but since I am running Carte as a Windows service it's not clear to me whether any of those are even visible to the YAJSW wrapper.

Thus far I have not been able to successfully call a job via a URL (using a DB repo) as described here: http://wiki.pentaho.com/display/EAI/Carte+Configuration but I can call a transformation as described here: http://wiki.pentaho.com/display/EAI/...r+web+services. When I attempt the call to runJob, Carte basically complains that it cannot find the repo.

My goal is to move away from having to use a job wrapper for every main job when using Kitchen (e.g. "C:/data-integration/kitchen.bat" /rep:MyRepo /job:"My Job" /dir:"/path/to/my/job" /user:user /pass:pass) and selecting the Carte server for it to run on within the wrapper job (see pic). This is the error I receive:
Quote:

ERROR Unexpected error running job:
org.pentaho.di.core.exception.KettleException: Repository required. at
org.pentaho.di.www.RunJobServlet.loadJob(RunJobServlet.java:156) at
org.pentaho.di.www.RunJobServlet.doGet(RunJobServlet.java:65) at
javax.servlet.http.HttpServlet.service(HttpServlet.java:707) at
javax.servlet.http.HttpServlet.service(HttpServlet.java:820) at
.....
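For reference, once the repository is actually visible to Carte, the runJob request itself is just an authenticated HTTP GET; below is a minimal sketch from Java. The host, port, credentials and job path are all placeholders, and the parameter names follow the Carte wiki page linked above.

Code:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

import javax.xml.bind.DatatypeConverter;

public class CarteRunJobCall {
    public static void main(String[] args) throws Exception {
        // Placeholder host, port and repository job path.
        String jobPath = URLEncoder.encode("/path/to/my/job", "UTF-8");
        URL url = new URL("http://localhost:8081/kettle/runJob/?job=" + jobPath + "&level=Basic");

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        // Carte's default credentials are cluster/cluster unless pwd/kettle.pwd was changed.
        String auth = DatatypeConverter.printBase64Binary("cluster:cluster".getBytes("UTF-8"));
        conn.setRequestProperty("Authorization", "Basic " + auth);

        System.out.println("HTTP " + conn.getResponseCode());
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            for (String line; (line = in.readLine()) != null; ) {
                System.out.println(line);   // Carte replies with a small status document
            }
        }
    }
}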

Kettle Validation Questions

Hi. I'm a newbie with Pentaho but an experienced ETL/BI professional. I'm attempting to do a simple validation.
My transformations are:
  1. A simple two-step transformation to load the CSV input file into a MySQL RAW table (all varchars) (it works)
  2. A second transformation to validate the input data types. In this case I'm casting the input. It looks like this:


trans2.jpg

The input is cast from the RAW (all-varchar) table into the corresponding data types, like this:

SELECT ID_RAW, CAST( CONCAT( '20', SUBSTRING( trim( FECHAVTA ) , 7, 2 ) , '-',
SUBSTRING( trim( FECHAVTA ) , 4, 2 ) , '-',
SUBSTRING( trim( FECHAVTA ) , 1, 2 ) ) AS DATE ) AS FECHAVTA,
cast(trim( ID_CLIENTE ) as signed) AS ID_CLIENTE,
cast(trim( ID_PAISVTA ) as signed) AS ID_PAISVTA,
cast(trim( NROSUCVTA ) as signed) AS NROSUCVTA,
cast(trim( ID_ARTICULO) as signed) AS ID_ARTICULO,
cast(trim( CANTIDAD ) as decimal(5)) AS CANTIDAD,
cast(trim( PRECIOESPECIAL ) as decimal(7,2)) AS PRECIOESPECIAL
FROM rawprueba1


OK then: the validation step doesn't work well when I check the "Validate data type" box for every field's validation. With a correct file it treats the whole input file as being in error, in spite of the fact that I cast the input for all of the fields. At the same time, if I uncheck the box, all input is accepted, even though I entered some numeric data as alpha characters, and the resulting input contains null values for those fields (and I had marked "Only numeric data is expected").

In short: what am I doing wrong? Kettle seems intuitive, but in the end it is full of tricks.

Can anybody help me please?

Thanks in advance,

Marcelo

How to process new data as input to a standardised model

Hi,

I used Weka to generate a SimpleLogistic model.

I embedded this in my existing PHP code, with the formula:

whereby s is the output from Weka.

I checked the "standardize" option when building the model with Weka.

Now I want to run this formula on a new instance and see how well it classifies. Do I need to modify the instance in some way? Standardize it? But how, since I am not sure how Weka did it?
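If it helps frame the question: "standardize" normally means each attribute is shifted and scaled by the training set's mean and standard deviation before the logistic formula is applied, so a new instance has to go through the same shift and scale using the training statistics. Below is a rough sketch of that idea; all the numbers are placeholders rather than values taken from any actual Weka model.

Code:

public class StandardizeAndScore {
    public static void main(String[] args) {
        // Per-attribute training means and standard deviations (placeholders; these
        // would come from the data the model was built on).
        double[] mean   = {4.2, 17.9, 0.35};
        double[] stdDev = {1.1,  6.3, 0.12};
        double[] coef   = {0.8, -0.05, 2.1};   // model coefficients (placeholders)
        double intercept = -1.4;

        double[] x = {5.0, 20.0, 0.4};         // new, unstandardized instance

        // Standardize each attribute, then apply the logistic model: p = 1 / (1 + e^-s).
        double s = intercept;
        for (int i = 0; i < x.length; i++) {
            double z = (x[i] - mean[i]) / stdDev[i];
            s += coef[i] * z;
        }
        double p = 1.0 / (1.0 + Math.exp(-s));
        System.out.println("score s = " + s + ", probability = " + p);
    }
}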