Channel: Pentaho Community Forums

Dynamically reading table columns

Hi Friends,
I have one requirement as below.
CREATE TABLE accounts(
id integer constraint accounts_id_pk primary key,
name varchar(20),
createtime timestamp without time zone
);

This accounts table exists in both my source DB and target DB with the same structure. I have to load the target table so that:
- all new source ids are inserted,
- existing source ids are updated,
- if an id is deleted from the source, the same id must be deleted from the target table as well.

The real challenge is that I want the script to read the columns dynamically and perform the inserts/updates/deletes based on the primary key column.
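To make the requirement concrete, this is the logic I have in mind, written as a plain JDBC sketch (the connection URLs and credentials are placeholders, and in the real script the column list would come from the table metadata instead of being hard-coded):

Code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.HashSet;
import java.util.Set;

public class SyncAccounts {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details for the source and target databases.
        try (Connection src = DriverManager.getConnection("jdbc:postgresql://source-host/db", "user", "pass");
             Connection tgt = DriverManager.getConnection("jdbc:postgresql://target-host/db", "user", "pass")) {

            // 1) Collect every primary key currently present in the source table.
            Set<Integer> sourceIds = new HashSet<>();
            try (ResultSet rs = src.createStatement().executeQuery("SELECT id FROM accounts")) {
                while (rs.next()) {
                    sourceIds.add(rs.getInt(1));
                }
            }

            // 2) Delete target rows whose id no longer exists in the source.
            try (ResultSet rs = tgt.createStatement().executeQuery("SELECT id FROM accounts");
                 PreparedStatement del = tgt.prepareStatement("DELETE FROM accounts WHERE id = ?")) {
                while (rs.next()) {
                    int id = rs.getInt(1);
                    if (!sourceIds.contains(id)) {
                        del.setInt(1, id);
                        del.executeUpdate();
                    }
                }
            }

            // 3) For every source row: UPDATE the target row by id, and if no row was
            //    affected, INSERT it instead. The column names would be discovered
            //    dynamically (e.g. from DatabaseMetaData) rather than hard-coded.
        }
    }
}

In PDI terms I suppose this maps to an insert/update plus a delete of the missing keys, but I would like it to work for any table without listing the columns by hand.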
Please let me know if you want more info.
Thanks in advance.
Sashikanta

HTTP Status 404 - /pentaho/Home

Hello, I'm sorry if I'm posting in the wrong category. I am new to the forum.
biserver-ce version: biserver-ce-6.0.1.0-386
I have problems starting Pentaho. A few days ago it started properly. While trying to solve another problem, I modified the file context.xml in biserver-ce/tomcat/conf.
I only commented out this line:


<Manager pathname="" />

That did not solve my problem, and when I restarted Pentaho and browsed to localhost:8080 I got this error:

HTTP Status 404 - /pentaho/Home


type Status report


message /pentaho/Home


description The requested resource is not available.
Apache Tomcat/8.0.24

I have uncommented the line and restarted Pentaho, but the error persists.
Could someone kindly help me? Thank you all.

Delete rows from HBase using Pentaho Kettle

Is it possible to delete rows from an HBase database using PDI?
I was not able to find any step for it.
Can it be done using a User Defined Java Class step?
If yes, can anyone provide a sample, please?
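For reference, the raw HBase client calls that such a step would have to wrap are, as far as I understand, roughly the following (the ZooKeeper host, table name, and row key are placeholders; I have not verified this from inside PDI):

Code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Delete;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseDeleteExample {
    public static void main(String[] args) throws Exception {
        // Standard HBase client configuration; the quorum host is a placeholder.
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "zk-host");

        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("my_table"))) {
            // Delete the whole row identified by this row key.
            table.delete(new Delete(Bytes.toBytes("row-key-to-delete")));
        }
    }
}

So the question is really how to feed the row keys from the incoming stream into something like this.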

Thanks ,
Shalini

Query Timeout in Saiku - Where to change it?

Hi all. I am trying to find where I can change the Saiku query timeout, but I've had no success.

Sometimes I have to press the "play" button two or three times in order to retrieve data from the cube, but that wouldn't be necessary if I had a 1-2 minute timeout instead of 30 seconds.

Could someone help to find out how to change this property?

Thanks in advance!

Null If step returns error: java.lang.Double cannot be cast to [B

I am importing a text file with a numeric column that I defined as Number (14).

Some records in the file have the value "0" (zero, without quotes) and I would like to turn this value into NULL, so I used the "Null If" step, but when I try to run a preview it returns this error:

java.lang.RuntimeException: Unable to verify if [host_document Number (14) <binary-string>] is null or not because of an error: java.lang.ClassCastException: java.lang.Double cannot be cast to [B

I'm having the same problem with fields of type String, but in that case the value in the file can be "-" (dash) or "" (empty string). The error message for the String case is identical.

PS: sorry for my English.

Sample file:
Code:

1;10395200  ;DL99999;DIV LOGRADOUROS E LOCALIDADES          ;0            ;19860220;19931129
1;10395400  ;MG05393;-                                      ;0            ;19860331;19931129
1;10395700  ;MG05439;MURIAE                                  ;0            ;19860716;19931129

V7.0 REST Client doesn't send Header parameters

Hi,

Issue: the parameters defined in the Header section of the REST Client step are not sent to the API I'm using. This works in v6.1 but not in 7.0. Same settings, same computer.

Can anyone else reproduce this, and does anyone have a fix for it?

I cannot investigate the receiving side further as it's a company's API; I only get a "Missing authentication headers" message.

Thanks!

Potential causes of all jobs (10+ unrelated) -- all suddenly running much slower?

Hi there-

I've had various PDI scripts running since early 2015. They are unrelated except that they are all run by a master job in PDI (version 5.4.0.1), on the same remote computer, and against the same SQL Server data warehouse.

For the last two nights, they've all been running about 20-30x slower.

What are some potential tests I can do to determine what is causing this across the board?

Moreover, at least for now (around noon, as opposed to midnight when these scripts typically run alongside other processes), they are only taking about 2x as long instead of 30x, which makes it harder to find the source.


I can test the SQL database but I have a hunch that isn't it.

What else could be going on here? The remote computer is shared, but I find it highly unlikely that a change was made after Friday night (it ran just fine then).

Maybe it was connectivity issues (the database is on the same network as the PDI ETL machine), but I'm not sure. Maybe I need more granularity in the performance of each step.

Making a REST API call with an authorization header

Hi all,
I'm trying to make an API call to a service which requires an authorization header, but I'm having a hard time figuring it out. Perhaps it's a syntax issue?
I'm on version 6.1 of PDI community edition.

My transformation is simple:
  1. I generate a row which provides the URL, my parameter (-k), and my authorization header
  2. The REST Client step takes the URL from the incoming rows, and uses the headers and parameters passed in as well

Screen Shot 2016-12-19 at 2.55.48 PM.png

I'm not doing anything with the XML output just yet. The result of this call is a 401, so it's letting me know I'm unauthorized.
I've attached my work sample, with the actual URL and credentials swapped out. I did validate that all of these were correct (opened the URL in a browser) and verified that my username and password were passed through properly. I was also able to make a curl call with them and got the correct response.

So how do I properly use the authentication headers? I've tried
  • As a parameter
  • As a header
  • with and without quotes
  • Including -H "Authorization: basic <username:password>"

But still not getting through.
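One thing I'm not sure about: my understanding is that the header value has to be the Base64-encoded credentials, not the raw user:password string, i.e. something built like this (plain Java sketch, credentials are placeholders):

Code:

import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class BasicAuthHeader {
    public static void main(String[] args) {
        String user = "myUser";        // placeholder username
        String pass = "myPassword";    // placeholder password

        // The Authorization header carries "Basic " followed by base64(user:password).
        String token = Base64.getEncoder()
                .encodeToString((user + ":" + pass).getBytes(StandardCharsets.UTF_8));

        System.out.println("Authorization: Basic " + token);
    }
}

Is that the value the REST Client step expects in the header field, or does the step encode the credentials itself?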

Thanks for your help and clarifying questions!

Spoon and AWS VPC

Hi,

Does anyone know if there is a thread on these forums (or a tutorial elsewhere) that explains how to access an AWS VPC Redshift environment using PDI (Spoon) run from your desktop?

Thanks in advance!

Getting error [SQL0913] "Row or object *** in *** type *FILE in use" on AS400

Hi All,
I am using Pentaho 6.0 to insert/update an AS400 database (using the native JDBC driver).

The transformation fans out to multiple table outputs in parallel:

Get XML Data ------------------> Table1 (Insert/Update)
             ------------------> Table2 (Insert/Update)
             ------------------> Table3 (Insert/Update)
             ------------------> Table4 (Insert/Update)
             ------------------> Table5 (Insert/Update)

The "Make the transformation database transactional" option is selected in the transformation properties.

Because the transformation is database transactional, if any insert/update on any table fails, then nothing should be committed to any table; only if all inserts/updates succeed should the next file be read.
The Pentaho job reads a file and the transformation performs the AS400 operations. But sometimes I get the error below and the transformation aborts.

Caused by: java.sql.SQLException: [SQL0913] Row or object ***** in ***** type *FILE in use.
at com.ibm.as400.access.JDError.throwSQLException(JDError.java:650)
at com.ibm.as400.access.JDError.throwSQLException(JDError.java:621)
at com.ibm.as400.access.AS400JDBCStatement.commonExecute(AS400JDBCStatement.java:914)
at com.ibm.as400.access.AS400JDBCPreparedStatement.executeUpdate(AS400JDBCPreparedStatement.java:1189)
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
at org.pentaho.di.core.database.Database.insertRow(Database.java:1233)

I need help to fix this issue.

Thanks in advance
Raj.

How to pass a JNDI connection into Pentaho Kettle via Java?

Do I need to add any additional jars? Are there any samples of passing a JNDI connection?
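My rough understanding so far, which may well be wrong: Kettle resolves JNDI names through simple-jndi, pointed at a directory containing a jdbc.properties file, and a Java caller would do something like the sketch below (the directory path, transformation file, and connection entries are placeholders; I am not sure whether setting the system property is enough or whether KETTLE_JNDI_ROOT is also required):

Code:

import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunWithJndi {
    public static void main(String[] args) throws Exception {
        // Point simple-jndi at the folder that holds jdbc.properties, e.g. with entries like:
        //   mydb/type=javax.sql.DataSource
        //   mydb/driver=org.postgresql.Driver
        //   mydb/url=jdbc:postgresql://host:5432/dbname
        //   mydb/user=dbuser
        //   mydb/password=dbpass
        System.setProperty("org.osjava.sj.root", "/path/to/simple-jndi");

        KettleEnvironment.init();  // initializes the Kettle environment

        TransMeta meta = new TransMeta("my_transformation.ktr");
        Trans trans = new Trans(meta);
        trans.execute(null);
        trans.waitUntilFinished();
    }
}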

CDE Dashboard using MDX and a parameterized cube

Is it possible to use a cube name set via a parameter, like:
SELECT
something ON COLUMNS,
something_else ON ROWS
FROM [{S{param_cube}}]

I've tried it, but it is not working yet. Maybe some parameter definition isn't correct, but as far as I can tell everything is defined correctly.

Pentaho 7.0 Community Startup Tabs

I'm trying to use Community Startup Tabs so that when a user logs in, it redirects them to another page. I've modified the config.xml this way:

Code:

<?xml version="1.0"?>
<cstConfig>
  <rule match="USER" pattern="false" value="educa">
    <tab title="Bienvenida" order="1" fullScreen="true" tooltip="Bienvenida"><![CDATA[api/repos/%3Apublic%3Aaula_upv%3Abienvenida%3Aindex.wcdf/generatedContent]]></tab>
  </rule>
  <rule match="USER" pattern="false" value="asistic">
    <tab title="Bienvenida" order="1" fullScreen="true" tooltip="Bienvenida"><![CDATA[api/repos/%3Apublic%3Aaula_upv%3Abienvenida%3Aindex.wcdf/generatedContent]]></tab>
  </rule>
</cstConfig>

But it doesn't redirect the user anywhere, and this appears in the console:

HTML Code:

"GET localhost:8080/pentaho/plugin/cst/api/readConfigOnce 500 (Error Interno del Servidor)"
The tabs.js (the JavaScript that redirects the user) has the following code. Can anybody help me? In Pentaho 6 I didn't have this problem and nothing else is different, so I don't know whether something has changed in the new version.

Code:

var fetchAndOpenTabs = function(){
    var tabs;
    $.ajax({
      type: "GET",
      url: CONTEXT_PATH + "plugin/cst/api/readConfigOnce",
      dataType: "json",
      success: function(data){
        if(data.resultset){
          var mode = $.unique($.map(data.resultset, function(row){
            return row[4];
          }));
          //console.log("CST: " + JSON.stringify(mode));
          if (mode[0] == "launcher"){
            window.top.mantle_openTab('CST', 'Community Startup Tabs',  CONTEXT_PATH + "plugin/cst/api/launcher");
          } else {
            var tabsCount = data.queryInfo.totalRows;
            $.each(data.resultset, function(i, tab){
              if(tab[2]){
                if(tabsCount == 1){
                  window.location.href = tab[3];
                } else {
                  window.open(tab[3], tab[1]);
                }
              } else {
                window.top.mantle_openTab(tab[0], tab[1], tab[3]);
              }
            });
          }
          $($(".pentaho-tab-deck-panel iframe")[0]).load(function(){
            setTimeout(function(){
              $($(".pentaho-tab-bar .pentaho-tabWidget")[0]).mouseup();
            }, 1000);
          });
        }
      }
    });
  };

pentaho-server 7.0

Hi All,

I am confused. I'm trying to learn Pentaho Community Edition. I started from the http://community.pentaho.com/ site and downloaded pentaho-server 7.0.25 from https://sourceforge.net/projects/pentaho/. My plan is to set up a Pentaho server and load data from other sources into it. Basically I would like to build a data warehouse from scratch. I have some experience with Kettle. I have a lot of questions:

1. Does pentaho-server-ce-7.0.0.0-25.zip contain all the tools needed for building a DWH (I am not talking about the DB backend, JBoss, etc.)?
2. Maybe someone knows where I can get an installation tutorial for it?
3. Does pentaho-server-ce-7.0.0.0-25.zip contain Mondrian?
4. MySQL vs PostgreSQL for ROLAP? (I will use CentOS 6.x) Any thoughts?
5. As I understand it, https://help.pentaho.com/Documentation/7.0/0F0 will not work for the CE version?


Please help me to learn Pentaho. Thanks in advance!

How to load a multi-level XML file into CSV?

I want to load XML file data into CSV. When I load the data I get duplicate records, and I was also getting a cross join.

How to translate MDSCHEMA_KPIS for CalculatedMembers with KendoUI?

KendoUI contacts Mondrian to present a Pivot Grid. In the schema XML, a cube is defined with a CalculatedMember. KendoUI translates this into KPIs in the Fields list.

But when a user clicks on the KPI Field to see available sub-fields, Mondrian produces an error:

ERROR XMLAParser:113 - Method : XMLAResponseParser.parseFaultyResponse(String xmlaResponse),
Mondrian Error : The Mondrian XML: No enum constant mondrian.xmla.RowsetDefinition.MDSCHEMA_KPIS
ERROR XmlaServlet:324 - Errors when handling XML/A message
mondrian.xmla.XmlaException: Mondrian Error:XMLA SOAP Body processing error
at mondrian.xmla.impl.DefaultXmlaServlet.handleSoapBody(DefaultXmlaServlet.java:511)
at mondrian.xmla.XmlaServlet.doPost(XmlaServlet.java:318)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:808)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1669)
at com.utel.stinga.spa.analytics.server.XmlaRequestFilter.doFilter(XmlaRequestFilter.java:119)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1652)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:585)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.eclipse.jetty.server.Server.handle(Server.java:499)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:310)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)
at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:540)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalArgumentException:
No enum constant mondrian.xmla.RowsetDefinition.MDSCHEMA_KPIS
at java.lang.Enum.valueOf(Enum.java:236)
at mondrian.xmla.RowsetDefinition.valueOf(RowsetDefinition.java:54)
at mondrian.xmla.XmlaHandler.discover(XmlaHandler.java:2857)
at mondrian.xmla.XmlaHandler.process(XmlaHandler.java:671)
at mondrian.xmla.impl.DefaultXmlaServlet.handleSoapBody(DefaultXmlaServlet.java:507)
... 20 more

Why does Mondrian not support MDSCHEMA_KPIS? What is the correct keyword to use in order to populate the PivotGrid with KPIs from Mondrian's CalculatedMembers?

This is the current request from KendoUI which causes the problem:

<Envelope xmlns="http://schemas.xmlsoap.org/soap/envelope/"><Header/>
<Body><Discover xmlns="urn:schemas-microsoft-com:xml-analysis">
<RequestType>MDSCHEMA_KPIS</RequestType><Restrictions><RestrictionList>
<CATALOG_NAME>LTE</CATALOG_NAME><CUBE_NAME>LTE_10000</CUBE_NAME>
</RestrictionList></Restrictions><Properties><PropertyList>
<Catalog>LTE</Catalog></PropertyList></Properties></Discover></Body>
</Envelope>

Data Service Parameters in DI

Hi, I am new to Pentaho DI. I have a transformation whose output step serves as a Data Service. I use this data service from a non-Pentaho tool via a JDBC connection. I want to get the username and password that the non-Pentaho tool sends to this data service for the connection.
Waiting for your earliest response.

Java out of memory error

Hi,
I'm receiving this error at the end of my transformation (insert step).

my spoon.bat has:

if "%PENTAHO_DI_JAVA_OPTIONS%"=="" set PENTAHO_DI_JAVA_OPTIONS="-Xms512m" "-Xmx1300m" "-XX:MaxPermSize=512m"

Is there a solution, or is Pentaho a ****?

How can I get Marketplace in Pentaho 7.0?

Hi, I'm using the Business Analytics Platform version 7.0 from http://community.pentaho.com/

But in this system there is only a link for the marketplace...
http://www.pentaho.com/marketplace/

I downloaded a previous Marketplace version and put it in the path below:
\pentaho-server\pentaho-solutions\system
And now I can see the Marketplace option in the menu, but it doesn't work; it shows a different error depending on which old version I used :(

I want to install Saiku to work with cubes.

On the website there is only "Saiku Chart Plus", but that is not the option I'm looking for.

Also, after downloading the plugin, how does the installation work?

Can anyone please help me and tell me the correct way to install the Marketplace?

Thanks

Mail component in Pentaho Data Integration - [Error]

Hi everyone, what configuration should I use to run the Mail component on Windows 10? The following error appears:

Error:
2016/12/20 12:52:43 - Mail - ERROR (version 7.0.0.0-25, build 1 from 2016-11-05 15.35.36 by buildguy) : Problem while sending message: javax.mail.MessagingException: Could not connect to SMTP host: smtp.gmail.com, port: 465;
2016/12/20 12:52:43 - Mail - nested exception is:
2016/12/20 12:52:43 - Mail - java.net.ConnectException: Connection timed out (Connection timed out)
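For reference, my understanding is that the Mail step is doing the JavaMail equivalent of roughly the following (addresses and credentials are placeholders, and this assumes the javax.mail jar is on the classpath); if even this standalone test times out from the same machine, I suppose the problem is the network/firewall to port 465 rather than the step configuration:

Code:

import java.util.Properties;
import javax.mail.Authenticator;
import javax.mail.Message;
import javax.mail.PasswordAuthentication;
import javax.mail.Session;
import javax.mail.Transport;
import javax.mail.internet.InternetAddress;
import javax.mail.internet.MimeMessage;

public class GmailSmtpsTest {
    public static void main(String[] args) throws Exception {
        final String user = "me@gmail.com";      // placeholder account
        final String pass = "app-password";      // placeholder password

        Properties props = new Properties();
        props.put("mail.smtp.host", "smtp.gmail.com");
        props.put("mail.smtp.port", "465");
        props.put("mail.smtp.auth", "true");
        props.put("mail.smtp.ssl.enable", "true");   // implicit SSL on port 465

        Session session = Session.getInstance(props, new Authenticator() {
            @Override
            protected PasswordAuthentication getPasswordAuthentication() {
                return new PasswordAuthentication(user, pass);
            }
        });

        MimeMessage msg = new MimeMessage(session);
        msg.setFrom(new InternetAddress(user));
        msg.setRecipients(Message.RecipientType.TO, InternetAddress.parse("someone@example.com"));
        msg.setSubject("SMTP connectivity test");
        msg.setText("If this arrives, port 465 to smtp.gmail.com is reachable.");

        Transport.send(msg);
    }
}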